WorldWideScience

Sample records for current treatment algorithms

  1. Current algorithm for the surgical treatment of facial pain

    Directory of Open Access Journals (Sweden)

    Munawar Naureen

    2007-07-01

    Full Text Available Background Facial pain may be divided into several distinct categories, each requiring a specific treatment approach. In some cases, however, such categorization is difficult and treatment is ineffective. We reviewed our extensive clinical experience and designed an algorithmic approach to the treatment of medically intractable facial pain that can be treated through surgical intervention. Methods Our treatment algorithm is based on taking into account underlying pathological processes, the anatomical distribution of pain, pain characteristics, the patient's age and medical condition, associated medical problems, the history of previous surgical interventions, and, in some cases, the results of psychological evaluation. The treatment modalities involved in this algorithm include diagnostic blocks, peripheral denervation procedures, craniotomy for microvascular decompression of cranial nerves, percutaneous rhizotomies using radiofrequency ablation, glycerol injection, balloon compression, peripheral nerve stimulation procedures, stereotactic radiosurgery, percutaneous trigeminal tractotomy, and motor cortex stimulation. We recommend that some patients not receive surgery at all, but rather be referred for other medical or psychological treatment. Results Our algorithmic approach was used in more than 100 consecutive patients with medically intractable facial pain. Clinical evaluations and diagnostic workups were followed in each case by the systematic choice of the appropriate intervention. The algorithm has proved easy to follow, and the recommendations include the identification of the optimal surgery for each patient with other options reserved for failures or recurrences. 
Our overall success rate in eliminating facial pain presently reaches 96%, which is higher than that observed in most clinical series reported to date. Conclusion This treatment algorithm for intractable facial pain appears to be effective for patients with a wide variety

  2. [Algorithms for treatment of complex hand injuries].

    Science.gov (United States)

    Pillukat, T; Prommersberger, K-J

    2011-07-01

    The primary treatment strongly influences the course and prognosis of hand injuries. Complex injuries that compromise functional recovery are especially challenging. Despite an apparently unlimited number of injury patterns, it is possible to develop strategies that facilitate a standardized approach to operative treatment. Here, algorithms can serve as important guidelines for a rational approach. The following algorithms have proven themselves in our own experience with the treatment of complex injuries of the hand. They were modified according to the current literature and cover prehospital care, emergency room management, basic strategy in general, and, in detail, reconstruction of bone and joints, vessels, nerves and tendons, and soft tissue coverage. Algorithms facilitate the treatment of severe hand injuries: by applying simple yes/no decisions, complex injury patterns are split into distinct partial problems that can be managed step by step.

  3. Principles of a new treatment algorithm in multiple sclerosis

    DEFF Research Database (Denmark)

    Hartung, Hans-Peter; Montalban, Xavier; Sorensen, Per Soelberg

    2011-01-01

    We are entering a new era in the management of patients with multiple sclerosis (MS). The first oral treatment (fingolimod) has now gained US FDA approval, addressing an unmet need for patients with MS who wish to avoid parenteral administration. A second agent (cladribine) is currently being...... considered for approval. With the arrival of these oral agents, a key question is where they may fit into the existing MS treatment algorithm. This article aims to help answer this question by analyzing the trial data for the new oral therapies, as well as for existing MS treatments, by applying practical...... clinical experience, and through consideration of our increased understanding of how to define treatment success in MS. This article also provides a speculative look at what the treatment algorithm may look like in 5 years, with the availability of new data, greater experience and, potentially, other novel...

  4. Treatment Algorithm for Ameloblastoma

    Directory of Open Access Journals (Sweden)

    Madhumati Singh

    2014-01-01

    Full Text Available Ameloblastoma is the second most common benign odontogenic tumour (Shafer et al. 2006), which constitutes 1–3% of all cysts and tumours of the jaw, with locally aggressive behaviour, a high recurrence rate, and malignant potential (Chaine et al. 2009). Various treatment algorithms for ameloblastoma have been reported; however, a universally accepted approach remains unsettled and controversial (Chaine et al. 2009). The treatment algorithm to be chosen depends on size (Escande et al. 2009; Sampson and Pogrel 1999), anatomical location (Feinberg and Steinberg 1996), histologic variant (Philipsen and Reichart 1998), and anatomical involvement (Jackson et al. 1996). In this paper, various such treatment modalities, which include enucleation and peripheral osteotomy, partial maxillectomy, segmental resection with fibula graft reconstruction, and radical resection with rib graft reconstruction, and their recurrence rates are reviewed with a study of five cases.

  5. Treatment Algorithms Based on Tumor Molecular Profiling: The Essence of Precision Medicine Trials.

    Science.gov (United States)

    Le Tourneau, Christophe; Kamal, Maud; Tsimberidou, Apostolia-Maria; Bedard, Philippe; Pierron, Gaëlle; Callens, Céline; Rouleau, Etienne; Vincent-Salomon, Anne; Servant, Nicolas; Alt, Marie; Rouzier, Roman; Paoletti, Xavier; Delattre, Olivier; Bièche, Ivan

    2016-04-01

    With the advent of high-throughput molecular technologies, several precision medicine (PM) studies are currently ongoing that include molecular screening programs and PM clinical trials. Molecular profiling programs establish the molecular profile of patients' tumors with the aim to guide therapy based on identified molecular alterations. The aim of prospective PM clinical trials is to assess the clinical utility of tumor molecular profiling and to determine whether treatment selection based on molecular alterations produces superior outcomes compared with unselected treatment. These trials use treatment algorithms to assign patients to specific targeted therapies based on tumor molecular alterations. These algorithms should be governed by fixed rules to ensure standardization and reproducibility. Here, we summarize key molecular, biological, and technical criteria that, in our view, should be addressed when establishing treatment algorithms based on tumor molecular profiling for PM trials. © The Author 2015. Published by Oxford University Press.
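The requirement that such treatment algorithms be governed by fixed rules can be illustrated with a deliberately simplified sketch; the alteration-therapy pairs below and their priority ordering are hypothetical, not taken from any trial protocol:

```python
# Hypothetical fixed-rule table mapping molecular alterations to therapy
# classes, in a fixed priority order. Entirely illustrative: real PM-trial
# algorithms rank alterations by level of evidence and handle many more cases.
RULES = [
    ("BRAF_V600E", "BRAF inhibitor"),
    ("ERBB2_amplification", "anti-HER2 therapy"),
    ("PIK3CA_mutation", "PI3K inhibitor"),
]

def assign_treatment(alterations):
    """Return the therapy of the first matching rule; the fixed order makes
    the assignment deterministic and reproducible across patients."""
    for alteration, therapy in RULES:
        if alteration in alterations:
            return therapy
    return "unselected (standard of care)"

# A tumour with two alterations: the fixed rule order resolves the ambiguity
print(assign_treatment({"PIK3CA_mutation", "ERBB2_amplification"}))
# → anti-HER2 therapy
```

Encoding the rules as an ordered table, rather than leaving the choice to the treating physician, is what makes the assignment standardized and reproducible in the sense the abstract describes.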

  6. Esophageal achalasia: current diagnosis and treatment.

    Science.gov (United States)

    Schlottmann, Francisco; Patti, Marco G

    2018-05-27

    Esophageal achalasia is a primary esophageal motility disorder of unknown origin, characterized by lack of peristalsis and by incomplete or absent relaxation of the lower esophageal sphincter in response to swallowing. The goal of treatment is to eliminate the functional obstruction at the level of the gastroesophageal junction. Areas covered: This comprehensive review will evaluate the current literature, illustrating the diagnostic evaluation and providing an evidence-based treatment algorithm for this disease. Expert commentary: Today we have three very effective therapeutic modalities to treat patients with achalasia - pneumatic dilatation, per-oral endoscopic myotomy and laparoscopic Heller myotomy with fundoplication. Treatment should be tailored to the individual patient, in centers where a multidisciplinary approach is available. Esophageal resection should be considered as a last resort for patients who have failed prior therapeutic attempts.

  7. Fast voxel and polygon ray-tracing algorithms in intensity modulated radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Fox, Christopher; Romeijn, H. Edwin; Dempsey, James F.

    2006-01-01

    We present work on combining three algorithms to improve ray-tracing efficiency in radiation therapy dose computation. The three algorithms include: an improved point-in-polygon algorithm, an incremental voxel ray-tracing algorithm, and stereographic projection of beamlets for voxel truncation. The point-in-polygon and incremental voxel ray-tracing algorithms have been used in computer graphics and nuclear medicine applications, while the stereographic projection algorithm was developed by our group. These algorithms demonstrate significant improvements over the current standard algorithms in the peer-reviewed literature, i.e., the polygon and voxel ray-tracing algorithms of Siddon for voxel classification (point-in-polygon testing) and dose computation, respectively, and radius testing for voxel truncation. The presented polygon ray-tracing technique was tested on 10 intensity modulated radiation therapy (IMRT) treatment planning cases that required the classification of between 0.58 and 2.0 million voxels on a 2.5 mm isotropic dose grid into 1-4 targets and 5-14 structures represented as extruded polygons (a.k.a. Siddon prisms). Incremental voxel ray tracing and voxel truncation employing virtual stereographic projection were tested on the same IMRT treatment planning cases, where voxel dose was required for 230-2400 beamlets using a finite-size pencil-beam algorithm. A 100- to 360-fold CPU time improvement over Siddon's method was observed for the polygon ray-tracing algorithm performing classification of voxels for target and structure membership. A 2.6- to 3.1-fold reduction in CPU time over current algorithms was found for the implementation of incremental ray tracing. Additionally, voxel truncation via stereographic projection was observed to be 11-25 times faster than the radius-testing beamlet extent approach and was further improved 1.7-2.0 fold through point classification using the method of translation over the cross-product technique.
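The voxel classification step that the authors accelerate can be illustrated with the standard ray-crossing point-in-polygon test; this is the generic textbook version, not the improved algorithm of the paper:

```python
def point_in_polygon(px, py, vertices):
    """Ray-crossing (even-odd) test: cast a horizontal ray from (px, py)
    and count edge crossings; an odd count means the point is inside."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > py) != (y2 > py):          # edge straddles the ray's line
            # x-coordinate where the edge crosses y = py
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
print(point_in_polygon(2.0, 2.0, square))   # → True
print(point_in_polygon(5.0, 2.0, square))   # → False
```

In a treatment planning context this test runs once per voxel per structure contour, which is why even constant-factor improvements to it translate into large savings over millions of voxels.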

  8. Output Current Ripple Reduction Algorithms for Home Energy Storage Systems

    Directory of Open Access Journals (Sweden)

    Jin-Hyuk Park

    2013-10-01

    Full Text Available This paper proposes an output current ripple reduction algorithm using a proportional-integral (PI) controller for an energy storage system (ESS). In single-phase systems, the DC link of the DC/AC inverter carries a second-order harmonic at twice the grid frequency, caused by pulsation of the DC-link voltage. The output current of the DC/DC converter therefore has a ripple component that follows the DC-link voltage ripple. This second-order harmonic adversely affects the battery lifetime. The proposed algorithm has the advantage of reducing the second-order harmonic of the output current in a variable-frequency system. The proposed algorithm is verified by PSIM simulation and by experiment with a 3 kW ESS model.
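As a rough illustration of the control approach (not the paper's PSIM model), the sketch below applies a discrete PI controller to regulate an output current against a 100 Hz disturbance, i.e. twice a 50 Hz grid frequency; the plant, gains, and amplitudes are invented for the example:

```python
import math

def pi_step(error, integ, kp, ki, dt):
    """One update of a discrete PI controller: u = Kp*e + Ki*integral(e)."""
    integ += error * dt
    return kp * error + ki * integ, integ

# Toy closed loop: drive an output current toward 10 A despite a 100 Hz
# disturbance (twice a 50 Hz grid frequency). Plant and gains are invented.
dt, kp, ki = 1e-4, 2.0, 400.0
i_out, integ = 0.0, 0.0
for n in range(40000):                      # simulate 4 s
    t = n * dt
    ripple = 0.5 * math.sin(2 * math.pi * 100 * t)
    u, integ = pi_step(10.0 - i_out, integ, kp, ki, dt)
    i_out += dt * (u - i_out + ripple)      # first-order plant + disturbance

print(abs(10.0 - i_out) < 0.5)   # → True: the DC error is regulated away
```

The integral term removes the steady-state DC error; attenuating the 100 Hz component itself is the harder part that the paper's algorithm addresses specifically.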

  9. An algorithmic approach for the treatment of severe uncontrolled asthma

    Science.gov (United States)

    Zervas, Eleftherios; Samitas, Konstantinos; Papaioannou, Andriana I.; Bakakos, Petros; Loukides, Stelios; Gaga, Mina

    2018-01-01

    A small subgroup of patients with asthma suffers from severe disease that is either partially controlled or uncontrolled despite intensive, guideline-based treatment. These patients have significantly impaired quality of life and, although they constitute only a small subgroup of asthma patients, they are responsible for more than half of asthma-related healthcare costs. Here, we review the definition of severe asthma and present all therapeutic options currently available for these severe asthma patients. Moreover, we suggest a specific algorithmic treatment approach for the management of severe, difficult-to-treat asthma based on specific phenotype characteristics and biomarkers. The diagnosis and management of severe asthma require specialised experience, time and effort to comprehend the needs and expectations of each individual patient and to incorporate those, as well as his/her specific phenotype characteristics, into the management planning. Although some new treatment options are currently available for these patients, there is still a need for further research into severe asthma and yet more treatment options. PMID:29531957

  10. Fast treatment plan modification with an over-relaxed Cimmino algorithm

    International Nuclear Information System (INIS)

    Wu Chuan; Jeraj, Robert; Lu Weiguo; Mackie, Thomas R.

    2004-01-01

    A method to quickly modify a treatment plan in adaptive radiotherapy was proposed and studied. The method is based on a Cimmino-type algorithm from linear programming. Its fast convergence speed is achieved by over-relaxing the algorithm's relaxation parameter from its sufficient convergence range of (0, 2) to (0, ∞). The algorithm parameters are selected so that the over-relaxed Cimmino (ORC) algorithm can effectively approximate an unconstrained re-optimization process in adaptive radiotherapy. To demonstrate the effectiveness and flexibility of the proposed method in adaptive radiotherapy, two scenarios with different organ motion/deformation of one nasopharyngeal case were presented, with comparisons made between this method and the re-optimization method. In both scenarios, the ORC-modified treatment plans have dose distributions similar to those given by the re-optimized treatment plans. Using the ORC algorithm, a treatment plan modification is completed at least three times faster than with the re-optimization procedure.
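The underlying Cimmino iteration can be sketched for a small linear system Ax = b: each step simultaneously projects the current iterate onto every hyperplane and averages the projection steps, scaled by the relaxation parameter λ. The example below uses λ = 1.8, inside the classical (0, 2) convergence range; the paper's ORC method deliberately pushes λ beyond 2:

```python
def cimmino(A, b, lam=1.8, iters=500):
    """Cimmino's simultaneous-projection iteration for A x = b: move x by
    the weighted average of its projections onto every hyperplane
    a_i . x = b_i, scaled by the relaxation parameter lam. Classical theory
    guarantees convergence for 0 < lam < 2; the ORC method over-relaxes
    beyond that range to speed convergence."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    w = 1.0 / m                                   # equal weights summing to 1
    norm2 = [sum(a * a for a in row) for row in A]
    for _ in range(iters):
        # projection of x onto hyperplane i: x + (b_i - a_i.x)/||a_i||^2 * a_i
        coeffs = [(b[i] - sum(A[i][j] * x[j] for j in range(n))) / norm2[i]
                  for i in range(m)]
        x = [x[j] + lam * w * sum(coeffs[i] * A[i][j] for i in range(m))
             for j in range(n)]
    return x

A = [[2.0, 1.0], [1.0, 3.0]]
b = [5.0, 10.0]                                   # exact solution (1, 3)
x = cimmino(A, b)
print(all(abs(xi - si) < 1e-3 for xi, si in zip(x, [1.0, 3.0])))  # → True
```

Because every constraint is processed simultaneously and independently, the iteration parallelizes naturally, one reason Cimmino-type methods are attractive for large treatment-planning systems.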

  11. Reliability and safety of a new upper cervical spine injury treatment algorithm

    Directory of Open Access Journals (Sweden)

    Andrei Fernandes Joaquim

    Full Text Available In the present study, we evaluated the reliability and safety of a new upper cervical spine injury treatment algorithm to help in the selection of the best treatment modality for these injuries. Methods Thirty cases, previously treated according to the new algorithm, were presented to four spine surgeons who were questioned about their personal suggestion for treatment, and the treatment suggested according to the application of the algorithm. After four weeks, the same questions were asked again to evaluate reliability (intra- and inter-observer) using the Kappa index. Results The reliability of the treatment suggested by applying the algorithm was superior to the reliability of the surgeons’ personal suggestion for treatment. When applying the upper cervical spine injury treatment algorithm, agreement with the treatment actually performed was obtained in more than 89% of the cases. Conclusion The system is safe and reliable for treating traumatic upper cervical spine injuries. The algorithm can be used to help surgeons in the decision between conservative and surgical treatment of these injuries.
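The Kappa index used for the intra- and inter-observer comparison can be computed as in the following sketch; the two rating lists are hypothetical, not the study's data:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two ratings,
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(ratings_a)
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p_exp = sum(count_a[c] * count_b[c]
                for c in set(count_a) | set(count_b)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Two hypothetical readings of ten cases (S = surgical, C = conservative)
first  = ["S", "S", "C", "C", "S", "C", "S", "S", "C", "C"]
second = ["S", "S", "C", "C", "S", "C", "S", "C", "C", "C"]
print(round(cohens_kappa(first, second), 2))   # → 0.8
```

Kappa corrects raw percentage agreement for the agreement expected by chance, which is why it is preferred over simple agreement when comparing observers.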

  12. Introducing New Biosimilars Into Current Treatment Algorithms

    Directory of Open Access Journals (Sweden)

    John D. Isaacs

    2016-07-01

    Full Text Available Three biosimilar products are now licensed for the treatment of rheumatic diseases in Europe. The European Medicines Agency (EMA requires that similarity between a biosimilar and its reference product is demonstrated using a rigorous, stepwise process that includes extensive physicochemical and biological analytical testing, non-clinical pharmacology, clinical evaluations, and pharmacovigilance plans. Each step is highly sensitive to any differences between products and progressively reduces any uncertainty over similarity; all steps must be satisfied to demonstrate biosimilarity. The US Food and Drug Administration (FDA requires a similar stringent biosimilar development process. The etanercept biosimilar SB4 (Benepali®, recently approved for the treatment of rheumatoid arthritis, psoriatic arthritis, axial spondyloarthritis (ankylosing spondylitis, non-radiographic axial spondyloarthritis, and plaque psoriasis, is herein used to demonstrate the detailed analytical characterisation and clinical testing that are required by the EMA before biosimilars are approved for use. A comprehensive characterisation study involving >55 physicochemical and >25 biological assays demonstrated that SB4 has highly similar structural, physicochemical, and biological quality attributes to reference etanercept. A Phase I study demonstrated pharmacokinetic equivalence between SB4 and reference etanercept in healthy male subjects. Furthermore, a Phase III, randomised, controlled trial performed in patients with moderate-to-severe rheumatoid arthritis despite treatment with methotrexate (MTX showed that SB4 was equivalent to etanercept in terms of efficacy, safety, and immunogenicity. In conclusion, the biosimilar development process performed according to EMA or FDA guidelines is highly rigorous and comprehensive. Biosimilars such as SB4 are now available in clinical practice and are likely to improve access, reduce costs, and ultimately, improve health outcomes.

  13. Automatic J–A Model Parameter Tuning Algorithm for High Accuracy Inrush Current Simulation

    Directory of Open Access Journals (Sweden)

    Xishan Wen

    2017-04-01

    Full Text Available Inrush current simulation plays an important role in many tasks of the power system, such as power transformer protection. However, the accuracy of the inrush current simulation can hardly be ensured. In this paper, a Jiles–Atherton (J–A theory based model is proposed to simulate the inrush current of power transformers. The characteristics of the inrush current curve are analyzed and results show that the entire inrush current curve can be well featured by the crest value of the first two cycles. With comprehensive consideration of both of the features of the inrush current curve and the J–A parameters, an automatic J–A parameter estimation algorithm is proposed. The proposed algorithm can obtain more reasonable J–A parameters, which improve the accuracy of simulation. Experimental results have verified the efficiency of the proposed algorithm.
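The paper's observation that the entire inrush curve is well featured by the crest values of the first two cycles suggests a simple feature-extraction step, sketched below on a synthetic waveform. The decaying-offset signal model and all numbers are illustrative only, not the J–A simulation itself:

```python
import math

def first_two_crests(samples, samples_per_cycle):
    """Feature-extraction step: peak magnitude of each of the first
    two cycles of a sampled inrush-current waveform."""
    c1 = max(abs(s) for s in samples[:samples_per_cycle])
    c2 = max(abs(s) for s in samples[samples_per_cycle:2 * samples_per_cycle])
    return c1, c2

fs, f = 5000, 50                 # sample rate (Hz), grid frequency (Hz)
spc = fs // f                    # samples per cycle
# Synthetic inrush-like waveform: decaying DC offset plus a grid-frequency term
wave = [8.0 * math.exp(-3 * n / fs) + 2.0 * math.cos(2 * math.pi * f * n / fs)
        for n in range(2 * spc)]
c1, c2 = first_two_crests(wave, spc)
print(c1 > c2)   # → True: the crest decays from the first to the second cycle
```

In the paper's scheme, crest values like these would serve as the fitting targets that the automatic J–A parameter estimation tries to reproduce.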

  14. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
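For a simple SSD field, a TG-114-style independent MU check reduces to dividing the prescribed dose by the product of the reference dose rate and the relevant beam factors. The sketch below uses invented factor values; a clinical implementation would interpolate every factor from commissioned beam data:

```python
def verify_mu_ssd(dose_cgy, ref_dose_rate, output_factor, pdd_percent):
    """Independent MU check for a simple SSD field:
    MU = D / (D'_ref * S * PDD/100), where D is the prescribed dose,
    D'_ref the reference dose rate (cGy/MU), S the combined output
    factor, and PDD the percentage depth dose at the calculation depth."""
    return dose_cgy / (ref_dose_rate * output_factor * pdd_percent / 100.0)

# Invented values: 200 cGy prescribed, 1 cGy/MU at reference conditions,
# combined output factor 0.98, PDD 67% at the prescription depth
mu = verify_mu_ssd(200.0, 1.0, 0.98, 67.0)
print(round(mu, 1))   # → 304.6
```

A QA spreadsheet of the kind described would compare this independent MU against the TPS value and flag discrepancies beyond an action level.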

  15. Analysis of Radiation Treatment Planning by Dose Calculation and Optimization Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae Sup; Yoon, In Ha; Lee, Woo Seok; Baek, Geum Mun [Dept. of Radiation Oncology, Asan Medical Center, Seoul (Korea, Republic of)

    2012-09-15

    This study analyzes the effectiveness of radiation treatment planning by dose calculation and optimization algorithm, considers actual treatment planning, and suggests the best treatment planning protocol. The treatment planning system used was Eclipse 10.0 (Varian, USA). PBC (Pencil Beam Convolution) and AAA (Anisotropic Analytical Algorithm) were applied for dose calculation; DVO (Dose Volume Optimizer 10.0.28) was used as the optimization algorithm for Intensity Modulated Radiation Therapy (IMRT), and PRO II (Progressive Resolution Optimizer V 8.9.17) and PRO III (Progressive Resolution Optimizer V 10.0.28) were used as optimization algorithms for VMAT. A phantom for the experiment was created virtually in the treatment planning system, 30x30x30 cm in size, with homogeneous density (HU: 0) and heterogeneous density with an inserted air-equivalent material (HU: -1,000). Clinical treatment planning was then carried out on the basis of the general planning features analyzed with the phantom plans. In the homogeneous density phantom, PBC and AAA both showed 65.2% PDD (6 MV, 10 cm). In the heterogeneous density phantom, they showed similar PDD values before reaching the low-density material but different dose curves in the air region; PDD at 10 cm was 75% and 73%, respectively, after penetrating the phantom. For a 3D treatment plan with the same MU, AAA showed lower dose in the area including the lung. For a 2D POP treatment plan at 15 MV of the cervical vertebral region including the trachea and lung, the Conformity Index (ICRU 62) was 0.95 with PBC and 0.93 with AAA. The DVO DVH and the dose calculation DVH showed equal values in the IMRT treatment plan, but the AAA calculation showed a dose deficit compared with the DVO result, which was satisfactory. VMAT treatment plans optimized with PRO II gave satisfactory results, but the lower-density area showed a dose deficit in the dose calculations; with PRO III, the dose calculation results were similar when optimized once more under the same conditions. In this study, we do not judge the rightness of the dose

  16. Analysis of Radiation Treatment Planning by Dose Calculation and Optimization Algorithm

    International Nuclear Information System (INIS)

    Kim, Dae Sup; Yoon, In Ha; Lee, Woo Seok; Baek, Geum Mun

    2012-01-01

    This study analyzes the effectiveness of radiation treatment planning by dose calculation and optimization algorithm, considers actual treatment planning, and suggests the best treatment planning protocol. The treatment planning system used was Eclipse 10.0 (Varian, USA). PBC (Pencil Beam Convolution) and AAA (Anisotropic Analytical Algorithm) were applied for dose calculation; DVO (Dose Volume Optimizer 10.0.28) was used as the optimization algorithm for Intensity Modulated Radiation Therapy (IMRT), and PRO II (Progressive Resolution Optimizer V 8.9.17) and PRO III (Progressive Resolution Optimizer V 10.0.28) were used as optimization algorithms for VMAT. A phantom for the experiment was created virtually in the treatment planning system, 30x30x30 cm in size, with homogeneous density (HU: 0) and heterogeneous density with an inserted air-equivalent material (HU: -1,000). Clinical treatment planning was then carried out on the basis of the general planning features analyzed with the phantom plans. In the homogeneous density phantom, PBC and AAA both showed 65.2% PDD (6 MV, 10 cm). In the heterogeneous density phantom, they showed similar PDD values before reaching the low-density material but different dose curves in the air region; PDD at 10 cm was 75% and 73%, respectively, after penetrating the phantom. For a 3D treatment plan with the same MU, AAA showed lower dose in the area including the lung. For a 2D POP treatment plan at 15 MV of the cervical vertebral region including the trachea and lung, the Conformity Index (ICRU 62) was 0.95 with PBC and 0.93 with AAA. The DVO DVH and the dose calculation DVH showed equal values in the IMRT treatment plan, but the AAA calculation showed a dose deficit compared with the DVO result, which was satisfactory. VMAT treatment plans optimized with PRO II gave satisfactory results, but the lower-density area showed a dose deficit in the dose calculations; with PRO III, the dose calculation results were similar when optimized once more under the same conditions. In this study, we do not judge the rightness of the dose

  17. A comparison between anisotropic analytical and multigrid superposition dose calculation algorithms in radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Wu, Vincent W.C.; Tse, Teddy K.H.; Ho, Cola L.M.; Yeung, Eric C.Y.

    2013-01-01

    Monte Carlo (MC) simulation is currently the most accurate dose calculation algorithm in radiotherapy planning but requires relatively long processing time. Faster model-based algorithms such as the anisotropic analytical algorithm (AAA) of the Eclipse treatment planning system and multigrid superposition (MGS) of the XiO treatment planning system are 2 commonly used algorithms. This study compared AAA and MGS against MC, as the gold standard, on brain, nasopharynx, lung, and prostate cancer patients. Computed tomography of 6 patients of each cancer type was used. The same hypothetical treatment plan using the same machine and treatment prescription was computed for each case by each planning system using its respective dose calculation algorithm. The doses at reference points including (1) soft tissues only, (2) bones only, (3) air cavities only, (4) the soft tissue-bone boundary (Soft/Bone), (5) the soft tissue-air boundary (Soft/Air), and (6) the bone-air boundary (Bone/Air) were measured and compared using the mean absolute percentage error (MAPE), a function of the percentage dose deviations from MC. In addition, the computation time of each treatment plan was recorded and compared. The MAPEs of MGS were significantly lower than those of AAA in all types of cancers (p<0.001). With regard to body density combinations, the MAPE of AAA ranged from 1.8% (soft tissue) to 4.9% (Bone/Air), whereas that of MGS ranged from 1.6% (air cavities) to 2.9% (Soft/Bone). The MAPEs of MGS (2.6%±2.1) were significantly lower than those of AAA (3.7%±2.5) in all tissue density combinations (p<0.001). The mean computation time of AAA for all treatment plans was significantly lower than that of MGS (p<0.001). Both the AAA and MGS algorithms demonstrated dose deviations of less than 4.0% in most clinical cases, and their performance was better in homogeneous tissues than at tissue boundaries. In general, MGS demonstrated relatively smaller dose deviations than AAA but required longer computation time.
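The comparison metric itself is straightforward; below is a minimal sketch of the mean absolute percentage error against a Monte Carlo reference, with invented dose values:

```python
def mape(reference, measured):
    """Mean absolute percentage error of measured doses against the
    Monte Carlo reference doses, the comparison metric of the study."""
    errors = [abs((m - r) / r) for r, m in zip(reference, measured)]
    return 100.0 * sum(errors) / len(errors)

mc  = [200.0, 180.0, 150.0]      # hypothetical MC reference doses (cGy)
aaa = [204.0, 174.0, 153.0]      # hypothetical AAA doses at the same points
print(round(mape(mc, aaa), 2))   # → 2.44
```

Because each deviation is normalized by the reference dose, MAPE lets dose points of very different magnitudes (e.g. soft tissue versus air cavity) be pooled into one summary number.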

  18. Estimating the chance of success in IVF treatment using a ranking algorithm.

    Science.gov (United States)

    Güvenir, H Altay; Misirli, Gizem; Dilbaz, Serdar; Ozdegirmenci, Ozlem; Demir, Berfu; Dilbaz, Berna

    2015-09-01

    In medicine, estimating the chance of success of a treatment is important in deciding whether to begin it. This paper focuses on the domain of in vitro fertilization (IVF), where estimating the outcome of a treatment is crucial in the decision to proceed, for both the clinicians and the infertile couples. IVF treatment is a stressful and costly process for couples who want to have a baby. If the initial evaluation indicates a low chance of pregnancy, the couple may decide not to start IVF treatment. The aim of this study is twofold: firstly, to develop a technique that can be used to estimate the chance of success for a couple who wants to have a baby, and secondly, to determine the attributes, and their particular values, affecting the outcome of IVF treatment. We propose a new technique, called success estimation using a ranking algorithm (SERA), for estimating the success of a treatment using a ranking-based algorithm. The particular ranking algorithm used here is RIMARC. The performance of the new algorithm is compared with two well-known algorithms that assign class probabilities to query instances, the Naïve Bayes classifier and Random Forest. The comparison is done in terms of area under the ROC curve, accuracy and execution time, using tenfold stratified cross-validation. The results indicate that the proposed SERA algorithm has the potential to be used successfully to estimate the probability of success in medical treatment.
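The area under the ROC curve used to compare SERA against the baseline classifiers can be computed directly from the rank-sum formulation, as in this sketch with hypothetical scores:

```python
def roc_auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney) formulation: the probability
    that a randomly chosen positive case is scored above a randomly
    chosen negative one, counting ties as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical success estimates for six IVF cycles (1 = pregnancy achieved)
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.6, 0.3, 0.2]
print(round(roc_auc(labels, scores), 3))   # → 0.889
```

AUC is a natural evaluation metric for a ranking algorithm such as RIMARC, since it depends only on the ordering of the scores, not on their calibration as probabilities.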

  19. [Long-acting insulins in the treatment of type 2 diabetes and their position in the current treatment algorithm].

    Science.gov (United States)

    Haluzík, Martin

    Insulin therapy has for many years been an inseparable part of the treatment of patients with type 2 diabetes, in particular those with longer diabetes duration. Current national and international guidelines list insulin treatment as a possible second-choice therapy in patients with unsatisfactory glucose control on metformin monotherapy. In reality, insulin therapy is often initiated later than it optimally should be. The reasons include, among others, patients' and sometimes also physicians' fear of the side effects of insulin. Even though the options for antidiabetic treatment have been broadened by the addition of novel groups of antidiabetics with good efficacy and low risk of hypoglycemia, long-acting insulin therapy still remains the most effective way of decreasing fasting hyperglycemia, with the effect lasting further throughout the day. In this paper we summarize the current knowledge concerning long-acting insulins available on the Czech market or expected to become available in the near future. We discuss the differences among available long-acting insulins and their clinical consequences with respect to the selection of a particular insulin for a particular patient. Key words: biosimilar insulins - body weight - diabetes mellitus - hypoglycemia - long-acting insulin.

  20. A non-linear algorithm for current signal filtering and peak detection in SiPM

    International Nuclear Information System (INIS)

    Putignano, M; Intermite, A; Welsch, C P

    2012-01-01

    Read-out of Silicon Photomultipliers is commonly achieved by means of charge integration, a method particularly susceptible to after-pulsing noise and not efficient for low-level light signals. Current signal monitoring, characterized by easier electronic implementation and intrinsically faster than charge integration, is also more suitable for low-level light signals and can potentially result in much reduced after-pulsing noise effects. However, its use is to date limited by the need to develop a suitable read-out algorithm for signal analysis and filtering, able to achieve current peak detection and measurement with the needed precision and accuracy. In this paper we present an original algorithm, based on a piecewise linear-fitting approach, to filter the noise of the current signal and hence efficiently identify and measure current peaks. The proposed algorithm is then compared with the optimal linear filtering algorithm for time-encoded peak detection, based on a moving average routine, and assessed in terms of accuracy, precision, and peak detection efficiency, demonstrating improvements of 1-2 orders of magnitude in all these quality factors.
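The moving-average baseline that the piecewise linear-fitting algorithm is compared against can be sketched as follows; the synthetic trace, pulse shape, and threshold are invented for illustration:

```python
import random

def moving_average(signal, window):
    """Baseline linear filter the paper compares against: a centred
    moving average that smooths noise before peak detection."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        seg = signal[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def find_peaks(signal, threshold):
    """Indices of local maxima above the threshold in the filtered trace."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]]

random.seed(0)
# Synthetic trace: small uniform noise plus two triangular current pulses
trace = [0.05 * (random.random() - 0.5) for _ in range(200)]
for centre in (60, 140):
    for k in range(-5, 6):
        trace[centre + k] += 1.0 - abs(k) / 6.0

peaks = find_peaks(moving_average(trace, 5), 0.5)
print(peaks)   # → [60, 140]
```

A piecewise linear fit, as in the paper, replaces the fixed averaging window with line segments fitted to the data, which preserves the sharp pulse edges that a moving average smears out.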

  21. Treatment Algorithm of a Hypertension Specialist

    Czech Academy of Sciences Publication Activity Database

    Peleška, Jan; Anger, Z.; Buchtela, David; Tomečková, Marie; Veselý, Arnošt; Zvárová, Jana

    2007-01-01

    Vol. 25, Suppl. 2 (2007), S383-S383 ISSN 0952-1178. [European Meeting on Hypertension /17./. 15.06.2007-19.06.2007, Milan] R&D Projects: GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords: treatment algorithm for hypertension * hypertension specialist Subject RIV: FA - Cardiovascular Diseases incl. Cardiothoracic Surgery

  22. Mycoplasma genitalium infections: current treatment options and resistance issues

    Directory of Open Access Journals (Sweden)

    Sethi S

    2017-09-01

Full Text Available Sunil Sethi, Kamran Zaman, Neha Jain Department of Medical Microbiology, Postgraduate Institute of Medical Education and Research, Chandigarh, India Abstract: Mycoplasma genitalium is one of the important causes of non-gonococcal urethritis. Its rising incidence and emerging antimicrobial resistance are major concerns. Poor clinical outcomes with doxycycline therapy led to the use of azithromycin as the primary drug of choice. The single-dose azithromycin regimen was, over time, replaced by an extended regimen after studies showed better clinical cure rates and a lower risk of resistance development. However, emerging macrolide resistance, whether due to transmission of resistant strains or to drug pressure, has further complicated the management of this infection. Drug resistance and treatment failures remain genuine issues in M. genitalium infection. At present, the emergence of multidrug-resistant (MDR) M. genitalium strains is an alarming sign for treatment, with an associated public health impact due to its complications. Newer drugs such as pristinamycin, solithromycin, sitafloxacin, and others offer hope for clinical cure but need further clinical trials to optimize therapeutic dosing schedules and formulate appropriate treatment regimens. Rampant and inappropriate use of these newer drugs will sabotage future attempts to manage MDR strains. There is currently a need to formulate diagnostic algorithms and etiology-based treatment regimens rather than relying on the syndromic approach, preferably using combination therapy instead of monotherapy. Awareness of the current guidelines and recommended treatment regimens among clinicians and local practitioners is of utmost importance. Antimicrobial resistance testing and global surveillance are required to assess the efficacy of current treatment regimens and to guide future research into the early detection and management of MDR M. genitalium infections.

  3. Effectiveness of multiple sclerosis treatment with current immunomodulatory drugs.

    Science.gov (United States)

    Milo, Ron

    2015-04-01

Multiple sclerosis (MS) is a chronic inflammatory disease of the CNS of putative autoimmune origin, characterized by neurologic dysfunction disseminated in space and time due to demyelination and axonal loss that results in progressive disability. Recent advances in understanding the immune pathogenesis of the disease have led to the introduction of numerous effective immunomodulatory drugs with diverse mechanisms of action, modes of administration and risk-benefit profiles. This makes treatment selection more complex, albeit more promising. The epidemiology, clinical features, pathogenesis and diagnosis of the disease are discussed. The mode of action and main characteristics of current immunomodulatory drugs for MS, and their place in the therapeutic algorithm of the disease based on evidence from clinical trials, are described. Speculation on new paradigms, treatment goals and outcome measures aimed at improving the landscape of MS treatment is presented. Multiple disease-, drug- and patient-related factors should be taken into consideration when selecting the appropriate drug and treatment strategy for each patient, thus paving the road for personalized medicine in MS.

  4. Outcome of a 4-step treatment algorithm for depressed inpatients

    NARCIS (Netherlands)

    Birkenhäger, T.K.; Broek, W.W. van den; Moleman, P.; Bruijn, J.A.

    2006-01-01

    Objective: The aim of this study was to examine the efficacy and the feasibility of a 4-step treatment algorithm for inpatients with major depressive disorder. Method: Depressed inpatients, meeting DSM-IV criteria for major depressive disorder, were enrolled in the algorithm that consisted of

  5. Cost-effectiveness of lanreotide Autogel in treatment algorithms of acromegaly

    NARCIS (Netherlands)

    Biermasz, Nienke R.; Roelfsema, Ferdinand; Pereira, Alberto M.; Romijn, Johannes A.

    2009-01-01

    The introduction of effective pharmacological treatments has changed the management of acromegaly. However, chronic, life-long treatment with somatostatin analogues and/or growth hormone receptor antagonists is very expensive. We estimated the costs of treatment algorithms to control acromegaly from

  6. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
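The cutpoint method credited above with an order-of-magnitude speedup over sequential sampling can be sketched as follows. This is a minimal Python illustration under the standard formulation of the method, not the authors' code: a table of precomputed pointers into the CDF lets each draw jump close to the answer and finish with a short forward scan.

```python
import numpy as np

def build_cutpoints(pdf, m=None):
    """Precompute the cumulative distribution and a cutpoint table:
    for each of m equal subintervals of [0, 1), store the first index
    whose CDF value exceeds the subinterval's lower edge."""
    pdf = np.asarray(pdf, dtype=float)
    cdf = np.cumsum(pdf / pdf.sum())
    cdf[-1] = 1.0                       # guard against floating-point drift
    m = m or len(pdf)
    cut = np.searchsorted(cdf, np.arange(m) / m, side="right")
    return cdf, cut, m

def sample(cdf, cut, m, u):
    """Draw one index for a uniform deviate u using the cutpoint table."""
    i = cut[int(u * m)]          # jump close to the answer
    while cdf[i] < u:            # short forward scan finishes the search
        i += 1
    return i

cdf, cut, m = build_cutpoints([0.1, 0.2, 0.3, 0.4])
```

With m comparable to the number of bins, the expected scan length per draw is O(1), versus O(n) for a sequential CDF search.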

  7. A leaf sequencing algorithm to enlarge treatment field length in IMRT

    International Nuclear Information System (INIS)

    Xia Ping; Hwang, Andrew B.; Verhey, Lynn J.

    2002-01-01

With MLC-based IMRT, the maximum usable field size is often smaller than the maximum field size for conventional treatments, due to the constraints on the overtravel distances of MLC leaves and/or jaws. Using a new leaf sequencing algorithm, the usable IMRT field length (perpendicular to the MLC motion) can, in most cases, be made equal to the full length of the MLC field without violating the upper jaw overtravel limit. For any given intensity pattern, a criterion was proposed to assess whether the pattern can be delivered without violating the jaw position constraints. If the criterion is met, the new algorithm considers the jaw position constraints during segmentation for the step-and-shoot delivery method. The strategy employed by the algorithm is to connect the intensity elements outside the jaw overtravel limits with those inside them. Several methods were used to establish these connections during segmentation by modifying a previously published algorithm (the areal algorithm), including changing the intensity level, alternating the leaf-sequencing direction, or limiting the segment field size. The algorithm was tested with 1000 random intensity patterns with dimensions of 21x27 cm2, 800 intensity patterns with higher intensity outside the jaw overtravel limit, and three different types of clinical treatment plans that were undeliverable using a segmentation method from a commercial treatment planning system. The new algorithm achieved a success rate of 100% with these test patterns. For the 1000 random patterns, the new algorithm yields a similar average number of segments (36.9±2.9) compared with 36.6±1.3 for the areal algorithm. For the 800 patterns with higher intensities outside the jaw overtravel limits, the new algorithm results in a 25% increase in the average number of segments compared to the areal algorithm. However, the areal algorithm fails to create deliverable segments for 90% of these

  8. Technical Note: Improving the VMERGE treatment planning algorithm for rotational radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Gaddy, Melissa R., E-mail: mrgaddy@ncsu.edu; Papp, Dávid, E-mail: dpapp@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, North Carolina 27695-8205 (United States)

    2016-07-15

Purpose: The authors revisit the VMERGE treatment planning algorithm by Craft et al. [“Multicriteria VMAT optimization,” Med. Phys. 39, 686–696 (2012)] for arc therapy planning and propose two changes to the method, aimed at improving the achieved trade-off between treatment time and plan quality at little additional planning time cost, while retaining other desirable properties of the original algorithm. Methods: The original VMERGE algorithm first computes an “ideal,” high-quality but also highly time-consuming treatment plan that irradiates the patient from all possible angles on a fine angular grid with a highly modulated beam, and then makes this plan deliverable within a practical treatment time by an iterative fluence map merging and sequencing algorithm. We propose two changes to this method. First, we regularize the ideal plan obtained in the first step by adding an explicit constraint on treatment time. Second, we propose a different merging criterion that consists of identifying and merging adjacent maps whose merging results in the least degradation of radiation dose. Results: The effect of both suggested modifications is evaluated individually and jointly on clinical prostate and paraspinal cases. Details of the two cases are reported. Conclusions: In the authors’ computational study, both proposed modifications, especially the regularization, yield noticeably better treatment plans than the original VMERGE method for the same treatment times. The resulting plans match the quality of 20-beam step-and-shoot IMRT plans with a delivery time of approximately 2 min.
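The greedy merging step can be sketched as below. Note the assumption: the paper scores a merge by the resulting dose degradation, whereas this illustration uses the L2 difference between adjacent fluence maps as a cheap stand-in proxy for that cost.

```python
import numpy as np

def merge_step(maps):
    """One greedy merge: find the adjacent pair of fluence maps whose
    merging (here: averaging) is cheapest by an L2 proxy for dose
    degradation, and replace the pair with its average."""
    costs = [np.linalg.norm(maps[i] - maps[i + 1]) for i in range(len(maps) - 1)]
    k = int(np.argmin(costs))
    merged = 0.5 * (maps[k] + maps[k + 1])
    return maps[:k] + [merged] + maps[k + 2:]

# Four 2x2 fluence maps; the middle two are nearly identical and merge first.
maps = [np.array([[1.0, 0.0], [0.0, 1.0]]),
        np.array([[0.5, 0.5], [0.5, 0.5]]),
        np.array([[0.5, 0.5], [0.5, 0.6]]),
        np.array([[0.0, 1.0], [1.0, 0.0]])]
maps = merge_step(maps)   # now 3 maps; maps[1] is the average of the old pair
```

Repeating this step shrinks the number of control points, trading modulation (plan quality) for delivery time, which is exactly the trade-off the merging criterion is meant to manage.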

  9. Texas Medication Algorithm Project: development and feasibility testing of a treatment algorithm for patients with bipolar disorder.

    Science.gov (United States)

    Suppes, T; Swann, A C; Dennehy, E B; Habermacher, E D; Mason, M; Crismon, M L; Toprac, M G; Rush, A J; Shon, S P; Altshuler, K Z

    2001-06-01

Use of treatment guidelines for treatment of major psychiatric illnesses has increased in recent years. The Texas Medication Algorithm Project (TMAP) was developed to study the feasibility and process of developing and implementing guidelines for bipolar disorder, major depressive disorder, and schizophrenia in the public mental health system of Texas. This article describes the consensus process used to develop the first set of TMAP algorithms for the Bipolar Disorder Module (Phase 1) and the trial testing the feasibility of their implementation in inpatient and outpatient psychiatric settings across Texas (Phase 2). The feasibility trial answered core questions regarding implementation of treatment guidelines for bipolar disorder. A total of 69 patients were treated with the original algorithms for bipolar disorder developed in Phase 1 of TMAP. Results showed that physicians accepted the guidelines, followed recommendations to see patients at certain intervals, and utilized sequenced treatment steps differentially over the course of treatment. While improvements in clinical symptoms (24-item Brief Psychiatric Rating Scale) were observed over the course of enrollment in the trial, these conclusions are limited by the fact that physician volunteers were utilized for both treatment and ratings, and there was no control group. Results from Phases 1 and 2 indicate that it is possible to develop and implement a treatment guideline for patients with a history of mania in public mental health clinics in Texas. TMAP Phase 3, a recently completed larger and controlled trial assessing the clinical and economic impact of treatment guidelines and patient and family education in the public mental health system of Texas, improves upon this methodology.

  10. A new column-generation-based algorithm for VMAT treatment plan optimization

    International Nuclear Information System (INIS)

    Peng Fei; Epelman, Marina A; Romeijn, H Edwin; Jia Xun; Gu Xuejun; Jiang, Steve B

    2012-01-01

    We study the treatment plan optimization problem for volumetric modulated arc therapy (VMAT). We propose a new column-generation-based algorithm that takes into account bounds on the gantry speed and dose rate, as well as an upper bound on the rate of change of the gantry speed, in addition to MLC constraints. The algorithm iteratively adds one aperture at each control point along the treatment arc. In each iteration, a restricted problem optimizing intensities at previously selected apertures is solved, and its solution is used to formulate a pricing problem, which selects an aperture at another control point that is compatible with previously selected apertures and leads to the largest rate of improvement in the objective function value of the restricted problem. Once a complete set of apertures is obtained, their intensities are optimized and the gantry speeds and dose rates are adjusted to minimize treatment time while satisfying all machine restrictions. Comparisons of treatment plans obtained by our algorithm to idealized IMRT plans of 177 beams on five clinical prostate cancer cases demonstrate high quality with respect to clinical dose–volume criteria. For all cases, our algorithm yields treatment plans that can be delivered in around 2 min. Implementation on a graphic processing unit enables us to finish the optimization of a VMAT plan in 25–55 s. (paper)

  11. Evidence-based algorithm for heparin dosing before cardiopulmonary bypass. Part 1: Development of the algorithm.

    Science.gov (United States)

    McKinney, Mark C; Riley, Jeffrey B

    2007-12-01

The incidence of heparin resistance during adult cardiac surgery with cardiopulmonary bypass has been reported at 15%-20%. The consistent use of a clinical decision-making algorithm may increase the consistency of patient care and likely reduce the total required heparin dose and other problems associated with heparin dosing. After a directed survey of practicing perfusionists regarding treatment of heparin resistance and a literature search for high-level evidence regarding the diagnosis and treatment of heparin resistance, an evidence-based decision-making algorithm was constructed. The face validity of the algorithm's decisive steps and logic was confirmed by a second survey of practicing perfusionists. The algorithm begins with a review of the patient history to identify predictors of heparin resistance. The definition of heparin resistance contained in the algorithm is an activated clotting time below the target value despite a 450 IU/kg heparin loading dose. Based on the literature, the treatment for heparin resistance used in the algorithm is antithrombin III supplementation. The algorithm appears to be valid and is supported by high-level evidence and clinician opinion. The next step is a randomized clinical trial in humans to test the clinical procedure guideline algorithm against current standard clinical practice.
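The decision flow described above might be sketched as follows. Only the 450 IU/kg loading-dose threshold comes from the abstract; the function name, the parameterized ACT target (the abstract does not state its value), and the return structure are assumptions of this sketch, not the published algorithm.

```python
def heparin_resistance_plan(act_seconds, act_target, heparin_dose_iu_per_kg,
                            history_predicts_resistance):
    """Sketch of the decision flow: review history for predictors, test
    the heparin-resistance definition, and choose a treatment branch.
    The ACT target is a parameter because the abstract omits its value."""
    if history_predicts_resistance:
        note = "pre-bypass risk identified from patient history"
    else:
        note = "no predictors identified"
    # Definition in the text: ACT below target despite a 450 IU/kg loading dose.
    resistant = act_seconds < act_target and heparin_dose_iu_per_kg > 450
    action = ("antithrombin III supplement" if resistant
              else "proceed per standard protocol")
    return {"heparin_resistant": resistant, "action": action, "note": note}
```

Encoding the branch points this explicitly is what makes the face-validity survey possible: each step can be reviewed in isolation.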

  12. Treatment of giant cell tumor of bone: Current concepts.

    Science.gov (United States)

    Puri, Ajay; Agarwal, Manish

    2007-04-01

Giant cell tumor (GCT) of bone though one of the commonest bone tumors encountered by an orthopedic surgeon continues to intrigue treating surgeons. Usually benign, they are locally aggressive and may occasionally undergo malignant transformation. The surgeon needs to strike a balance during treatment between reducing the incidence of local recurrence while preserving maximal function. Differing opinions pertaining to the use of adjuvants for extension of curettage, the relative role of bone graft or cement to pack the defect and the management of recurrent lesions are some of the issues that offer topics for eternal debate. Current literature suggests that intralesional curettage strikes the best balance between controlling disease and preserving optimum function in the majority of the cases though there may be occasions where the extent of the disease mandates resection to ensure adequate disease clearance. An accompanying treatment algorithm helps outline the management strategy in GCT.

  13. Treatment of giant cell tumor of bone: Current concepts

    Directory of Open Access Journals (Sweden)

    Puri Ajay

    2007-01-01

Full Text Available Giant cell tumor (GCT) of bone though one of the commonest bone tumors encountered by an orthopedic surgeon continues to intrigue treating surgeons. Usually benign, they are locally aggressive and may occasionally undergo malignant transformation. The surgeon needs to strike a balance during treatment between reducing the incidence of local recurrence while preserving maximal function. Differing opinions pertaining to the use of adjuvants for extension of curettage, the relative role of bone graft or cement to pack the defect and the management of recurrent lesions are some of the issues that offer topics for eternal debate. Current literature suggests that intralesional curettage strikes the best balance between controlling disease and preserving optimum function in the majority of the cases though there may be occasions where the extent of the disease mandates resection to ensure adequate disease clearance. An accompanying treatment algorithm helps outline the management strategy in GCT.

  14. Development of an integrative algorithm for the treatment of various stages of full-thickness burns of the first commissure of the hand.

    Science.gov (United States)

Yuste, Valentin; Delgado, Julio; Agullo, Alberto; Sampietro, Jose Manuel

    2017-06-01

    Burns of the first commissure of the hand can evolve into an adduction contracture of the thumb. We decided to conduct a review of the existing literature on the treatment of full-thickness burns of the first commissure in order to develop a treatment algorithm that integrates the various currently available procedures. A search of the existing literature was conducted, focusing on the treatment of a burn of the first commissure in its chronic and acute phases. A total of 29 relevant articles were selected; 24 focused exclusively on the chronic contracture stage, while 3 focused exclusively on the acute burn stage, and 2 articles studied both stages. A therapeutic algorithm for full-thickness burns of the first commissure of the hand was developed. With this algorithm we sought to relate each degree and stage of the burn with a treatment. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.

  15. Evaluation of focused ultrasound algorithms: Issues for reducing pre-focal heating and treatment time.

    Science.gov (United States)

    Yiannakou, Marinos; Trimikliniotis, Michael; Yiallouras, Christos; Damianou, Christakis

    2016-02-01

Due to heating in the pre-focal field, the delay between successive movements in high-intensity focused ultrasound (HIFU) is sometimes as long as 60 s, resulting in treatment times on the order of 2-3 h. Because there is generally a requirement to reduce treatment time, we were motivated to explore alternative transducer motion algorithms in order to reduce pre-focal heating and treatment time. A 1 MHz single-element transducer with 4 cm diameter and 10 cm focal length was used. A simulation model was developed that estimates the temperature, thermal dose and lesion development in the pre-focal field. The simulated temperature history, combined with the motion algorithms, produced thermal maps in the pre-focal region. A polyacrylamide gel phantom was used to evaluate the pre-focal heating induced by each motion algorithm and to assess the accuracy of the simulation model. Three of the six algorithms, having successive steps close to each other, exhibited severe heating in the pre-focal field. Minimal heating was produced with the algorithms having successive steps apart from each other (square, square spiral and random). The latter three algorithms were improved further (at a small cost in time), thus eliminating pre-focal heating completely and substantially reducing treatment time compared with traditional algorithms. Out of the six algorithms, three succeeded in eliminating pre-focal heating completely. Because these three algorithms required no delay between successive movements (except in the last part of the motion), the treatment time was reduced by 93%. It should therefore be possible, in the future, to achieve focused ultrasound treatment times shorter than 30 min. The rate of ablated volume achieved with one of the proposed algorithms was 71 cm(3)/h. The intention of this pilot study was to demonstrate that the navigation algorithms play the most important role in reducing pre-focal heating. By evaluating in
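Of the low-heating patterns named above, the square spiral is easy to make concrete. The generator below is a generic grid spiral, not the authors' implementation; the point is that successive sonication sites move steadily away from recently heated tissue instead of clustering.

```python
def square_spiral(n):
    """Generate n grid points along a square spiral starting at (0, 0).
    Successive sites spread over the grid rather than clustering, which
    is the idea behind the low-heating motion patterns."""
    x = y = 0
    dx, dy = 1, 0                     # initial heading: +x
    step, taken, leg = 1, 0, 0
    points = [(0, 0)]
    while len(points) < n:
        x, y = x + dx, y + dy
        points.append((x, y))
        taken += 1
        if taken == step:
            taken = 0
            dx, dy = -dy, dx          # turn left
            leg += 1
            if leg == 2:              # leg length grows every two turns
                leg = 0
                step += 1
    return points

path = square_spiral(10)
# path: (0,0) → (1,0) → (1,1) → (0,1) → (-1,1) → (-1,0) → ...
```

Scaling the grid spacing to the sonication pitch turns this index path into transducer coordinates.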

  16. [New calculation algorithms in brachytherapy for iridium 192 treatments].

    Science.gov (United States)

    Robert, C; Dumas, I; Martinetti, F; Chargari, C; Haie-Meder, C; Lefkopoulos, D

    2018-05-18

Since 1995, brachytherapy dosimetry protocols have followed the methodology recommended by Task Group 43. This methodology, which has the advantage of being fast, is based on several approximations that are not always valid under clinical conditions. Model-based dose calculation algorithms have recently emerged in treatment planning systems and are considered a major evolution, allowing consideration of the patient's finite dimensions, tissue heterogeneities and the presence of high atomic number materials in applicators. In 2012, a report from the American Association of Physicists in Medicine Radiation Therapy Task Group 186 reviewed these models and made recommendations for their clinical implementation. This review focuses on the use of model-based dose calculation algorithms in the context of iridium 192 treatments. After a description of these algorithms and their clinical implementation, the main questions raised by these new methods are summarized. Considerations regarding the choice of the medium used for dose specification and the recommended methodology for assigning material characteristics are described in particular. In the last part, recent concrete examples from the literature illustrate the capabilities of these new algorithms on clinical cases. Copyright © 2018 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
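For contrast with the model-based algorithms discussed above, the TG-43 point-source approximation they are replacing can be sketched as follows. The anisotropy factor is omitted for brevity, and the radial dose function table below is illustrative (plausible magnitudes for Ir-192, not consensus data).

```python
import numpy as np

def tg43_point_dose_rate(r_cm, S_k, Lambda, g_table):
    """TG-43 point-source approximation: dose rate(r) = S_k * Lambda *
    (r0/r)^2 * g(r), with r0 = 1 cm and the radial dose function g
    linearly interpolated from a lookup table."""
    r0 = 1.0
    radii, g_vals = g_table
    g = np.interp(r_cm, radii, g_vals)
    return S_k * Lambda * (r0 / r_cm) ** 2 * g

# Illustrative radial dose function table (real tables come from
# consensus datasets for each source model).
g_table = (np.array([0.5, 1.0, 2.0, 5.0, 10.0]),
           np.array([0.99, 1.00, 1.00, 0.97, 0.90]))

# S_k = 40000 U and Lambda ~ 1.11 cGy/(h*U): typical magnitudes for HDR Ir-192.
d_at_1cm = tg43_point_dose_rate(1.0, 40000.0, 1.11, g_table)  # cGy/h
d_at_2cm = tg43_point_dose_rate(2.0, 40000.0, 1.11, g_table)
```

Everything here is a water-medium lookup: this speed is exactly what the formalism buys, and the absence of any patient geometry is what the model-based algorithms address.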

  17. An Algorithm for Surface Current Retrieval from X-band Marine Radar Images

    Directory of Open Access Journals (Sweden)

    Chengxi Shen

    2015-06-01

    Full Text Available In this paper, a novel current inversion algorithm from X-band marine radar images is proposed. The routine, for which deep water is assumed, begins with 3-D FFT of the radar image sequence, followed by the extraction of the dispersion shell from the 3-D image spectrum. Next, the dispersion shell is converted to a polar current shell (PCS using a polar coordinate transformation. After removing outliers along each radial direction of the PCS, a robust sinusoidal curve fitting is applied to the data points along each circumferential direction of the PCS. The angle corresponding to the maximum of the estimated sinusoid function is determined to be the current direction, and the amplitude of this sinusoidal function is the current speed. For validation, the algorithm is tested against both simulated radar images and field data collected by a vertically-polarized X-band system and ground-truthed with measurements from an acoustic Doppler current profiler (ADCP. From the field data, it is observed that when the current speed is less than 0.5 m/s, the root mean square differences between the radar-derived and the ADCP-measured current speed and direction are 7.3 cm/s and 32.7°, respectively. The results indicate that the proposed procedure, unlike most existing current inversion schemes, is not susceptible to high current speeds and circumvents the need to consider aliasing. Meanwhile, the relatively low computational cost makes it an excellent choice in practical marine applications.
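The circumferential sinusoid fit at the heart of the method can be posed as a linear least-squares problem. The sketch below uses synthetic PCS samples and omits the paper's outlier removal along radial directions; it illustrates the fit, not the authors' code.

```python
import numpy as np

def fit_current(theta, shift):
    """Fit shift(theta) = a*cos(theta) + b*sin(theta) by least squares.
    The current speed is the amplitude sqrt(a^2 + b^2) (in the shift's
    velocity units) and the direction is the angle of the maximum."""
    A = np.column_stack([np.cos(theta), np.sin(theta)])
    (a, b), *_ = np.linalg.lstsq(A, shift, rcond=None)
    speed = np.hypot(a, b)
    direction = np.arctan2(b, a)        # angle where the sinusoid peaks
    return speed, direction

# Synthetic PCS samples: 0.4 m/s current toward 60 degrees, plus noise.
rng = np.random.default_rng(1)
theta = np.linspace(0, 2 * np.pi, 72, endpoint=False)
shift = 0.4 * np.cos(theta - np.deg2rad(60)) + rng.normal(0, 0.02, theta.size)
speed, direction = fit_current(theta, shift)
```

Because the fit is linear in (a, b), it has a closed-form solution and no aliasing ambiguity, consistent with the low computational cost reported above.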

  18. Pressure ulcers: Current understanding and newer modalities of treatment

    Directory of Open Access Journals (Sweden)

    Surajit Bhattacharya

    2015-01-01

Full Text Available This article reviews the mechanism, symptoms, causes, severity, diagnosis, prevention and present recommendations for surgical as well as non-surgical management of pressure ulcers. Particular focus has been placed on the current understandings and the newer modalities for the treatment of pressure ulcers. The paper also covers the role of nutrition and pressure-release devices such as cushions and mattresses as part of the treatment algorithm for preventing these wounds and speeding their healing. Pressure ulcers develop primarily from pressure and shear; are progressive in nature and most frequently found in bedridden, chair bound or immobile people. They often develop in people who have been hospitalised for a long time generally for a different problem and increase the overall time as well as cost of hospitalisation that have detrimental effects on patient's quality of life. Loss of sensation compounds the problem manifold, and failure of reactive hyperaemia cycle of the pressure prone area remains the most important aetiopathology. Pressure ulcers are largely preventable in nature, and their management depends on their severity. The available literature about severity of pressure ulcers, their classification and medical care protocols have been described in this paper. The present treatment options include various approaches of cleaning the wound, debridement, optimised dressings, role of antibiotics and reconstructive surgery. The newer treatment options such as negative pressure wound therapy, hyperbaric oxygen therapy, cell therapy have been discussed, and the advantages and disadvantages of current and newer methods have also been described.

  19. Pressure ulcers: Current understanding and newer modalities of treatment

    Science.gov (United States)

    Bhattacharya, Surajit; Mishra, R. K.

    2015-01-01

This article reviews the mechanism, symptoms, causes, severity, diagnosis, prevention and present recommendations for surgical as well as non-surgical management of pressure ulcers. Particular focus has been placed on the current understandings and the newer modalities for the treatment of pressure ulcers. The paper also covers the role of nutrition and pressure-release devices such as cushions and mattresses as part of the treatment algorithm for preventing these wounds and speeding their healing. Pressure ulcers develop primarily from pressure and shear; are progressive in nature and most frequently found in bedridden, chair bound or immobile people. They often develop in people who have been hospitalised for a long time generally for a different problem and increase the overall time as well as cost of hospitalisation that have detrimental effects on patient's quality of life. Loss of sensation compounds the problem manifold, and failure of reactive hyperaemia cycle of the pressure prone area remains the most important aetiopathology. Pressure ulcers are largely preventable in nature, and their management depends on their severity. The available literature about severity of pressure ulcers, their classification and medical care protocols have been described in this paper. The present treatment options include various approaches of cleaning the wound, debridement, optimised dressings, role of antibiotics and reconstructive surgery. The newer treatment options such as negative pressure wound therapy, hyperbaric oxygen therapy, cell therapy have been discussed, and the advantages and disadvantages of current and newer methods have also been described. PMID:25991879

  20. An optimisation algorithm for determination of treatment margins around moving and deformable targets

    International Nuclear Information System (INIS)

    Redpath, Anthony Thomas; Muren, Ludvig Paul

    2005-01-01

Purpose: Determining treatment margins for inter-fractional motion of moving and deformable clinical target volumes (CTVs) remains a major challenge. This paper describes and applies an optimisation algorithm designed to derive such margins. Material and methods: The algorithm works by expanding the CTV, as determined from a pre-treatment or planning scan, to enclose the CTV positions observed during treatment. CTV positions during treatment may be obtained using, for example, repeat CT scanning and/or repeat electronic portal imaging (EPI). The algorithm can be applied both to individual patients and to a set of patients. The margins derived minimise the excess volume outside the envelope that encloses all observed CTV positions (the CTV envelope). Initially, margins are set such that the envelope is more than adequately covered when the planning CTV is expanded. The algorithm then uses an iterative method in which the margins are sampled randomly and either increased or decreased randomly. The algorithm was tested on a set of 19 bladder cancer patients who underwent weekly repeat CT scanning and EPI throughout their treatment course. Results: From repeated runs on individual patients, the algorithm produces margins within a range of ±2 mm that lie among the best results found with an exhaustive search, and that agree within 3 mm with margins determined by a manual approach on the same data. The algorithm can be used to determine margins covering any specified geometrical uncertainty, and allows reduced margins to be determined by relaxing the coverage criteria, for example by disregarding extreme CTV positions, an arbitrarily selected volume fraction of the CTV envelope, and/or patients with extreme geometrical uncertainties. Conclusion: An optimisation approach to margin determination is found to give reproducible results within the accuracy required. The major advantage of this algorithm is that it is completely empirical, and it is
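The iterative random sampling of margins can be sketched in a deliberately simplified one-margin-per-axis form. The coverage test (cover the largest observed displacement per axis), the perturbation size, and the iteration count below are choices of this sketch, not the paper's; the real algorithm works on 3-D CTV envelopes, not axis-aligned shifts.

```python
import numpy as np

def optimise_margins(displacements, n_iter=2000, seed=0):
    """Random-search sketch of the margin optimisation: start with
    margins that more than cover all observed CTV displacements along
    each axis, then randomly perturb them, keeping any change that
    shrinks the total margin while still covering every displacement."""
    rng = np.random.default_rng(seed)
    disp = np.abs(np.asarray(displacements, dtype=float))  # |shift| per fraction
    need = disp.max(axis=0)              # minimal covering margins per axis
    margins = need + 5.0                 # generous starting point (mm)
    for _ in range(n_iter):
        trial = margins + rng.uniform(-1.0, 1.0, margins.shape)
        if np.all(trial >= need) and trial.sum() < margins.sum():
            margins = trial              # accept: still covers, smaller overall
    return margins

# Observed per-fraction CTV shifts (mm) along left-right, ant-post, cran-caud.
shifts = [[1.0, 4.0, 2.0], [2.5, 6.0, 1.0], [0.5, 3.0, 3.5]]
m = optimise_margins(shifts)   # converges toward [2.5, 6.0, 3.5] mm
```

Relaxed-coverage variants drop out naturally: replacing the per-axis max with, say, a percentile of the observed shifts reproduces the "disregard extreme CTV positions" option described above.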

  1. Implementation of pencil kernel and depth penetration algorithms for treatment planning of proton beams

    International Nuclear Information System (INIS)

    Russell, K.R.; Saxner, M.; Ahnesjoe, A.; Montelius, A.; Grusell, E.; Dahlgren, C.V.

    2000-01-01

The implementation of two algorithms for calculating dose distributions for radiation therapy treatment planning of intermediate-energy proton beams is described. A pencil kernel algorithm and a depth penetration algorithm have been incorporated into a commercial three-dimensional treatment planning system (Helax-TMS, Helax AB, Sweden) to allow conformal planning techniques using irregularly shaped fields, proton range modulation, range modification and dose calculation for non-coplanar beams. The pencil kernel algorithm is developed from the Fermi-Eyges formalism and Molière multiple-scattering theory with range straggling corrections applied. The depth penetration algorithm is based on the energy loss in the continuous slowing down approximation, with simple correction factors applied to the beam penumbra region, and has been implemented for fast, interactive treatment planning. Modelling of the effects of air gaps and of range modifying device thickness and position is implicit in both algorithms. Measured and calculated dose values are compared for a therapeutic proton beam in both homogeneous and heterogeneous phantoms of varying complexity. Both algorithms model the beam penumbra as a function of depth in a homogeneous phantom with acceptable accuracy. Results show that the pencil kernel algorithm is required for modelling the dose perturbation effects from scattering in heterogeneous media. (author)

  2. [Current treatment of hepatic trauma].

    Science.gov (United States)

    Silvio-Estaba, Leonardo; Madrazo-González, Zoilo; Ramos-Rubio, Emilio

    2008-05-01

    The diagnostic and therapeutic approach to liver trauma (and, by extension, abdominal trauma) has evolved remarkably in recent decades. Non-surgical treatment of the vast majority of liver injuries is currently supported by the accumulated experience and by the optimal results in contemporary series. Non-surgical treatment of liver injuries is considered to have a current success rate of 83-100%, with an associated morbidity of 5-42%. The haemodynamic stability of the patient determines the applicability of non-surgical treatment. Arteriography with angioembolisation constitutes a key technical tool in the context of liver trauma. Patients with haemodynamic instability need an urgent operation and can benefit from abdominal packing techniques, damage control and post-operative arteriography. The present review attempts to contribute to the current, comprehensive and practical management of liver trauma.

  3. Validation of Varian's AAA algorithm with focus on lung treatments

    International Nuclear Information System (INIS)

    Roende, Heidi S.; Hoffmann, Lone

    2009-01-01

    The objective of this study was to examine the accuracy of the Anisotropic Analytical Algorithm (AAA). The AAA algorithm was tested for a variety of field configurations in homogeneous and inhomogeneous media (lung geometry) and was also compared with the current Pencil Beam Convolution (PBC) algorithm. Materials and methods. Two-dimensional (2D) dose distributions were measured for a variety of field configurations in solid water with a 2D array of ion chambers. The dose distributions of patient-specific treatment plans in selected transversal slices were measured in a Thorax lung phantom with Gafchromic dosimetry films. A Farmer ion chamber was used to check point doses in the Thorax phantom. The 2D dose distributions were evaluated with a gamma criterion of 3% in dose and 3 mm distance to agreement (DTA) for the 2D array measurements and for the film measurements. Results. For AAA, all fields tested in homogeneous media fulfilled the criterion, except asymmetric fields with wedges and intensity-modulated plans, where deviations of 5 and 4%, respectively, were seen. Overall, the measured and calculated 2D dose distributions for AAA in the Thorax phantom showed good agreement - both for 6 and 15 MV photons. More than 80% of the points in the high-dose regions met the gamma criterion, although it failed in low-dose and steep-gradient regions. For the PBC algorithm only 30-70% of the points met the gamma criterion. Conclusion. The AAA algorithm has been shown to be superior to the PBC algorithm in heterogeneous media, especially for 15 MV. For most treatment plans the deviations in the lung and the mediastinum regions are below 3%. However, the algorithm may underestimate the dose to the spinal cord by up to 7%.
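
    The 3%/3 mm gamma criterion used above combines a dose-difference and a distance-to-agreement test per point. A minimal 1D sketch, with a hypothetical step-profile example (real evaluations are 2D/3D and interpolate the reference distribution):

```python
import math

def gamma_index_1d(meas_pos, meas_dose, calc_pos, calc_dose,
                   dose_tol=0.03, dta_mm=3.0):
    """1D gamma evaluation (global 3%/3 mm by default): for each measured
    point, search the calculated profile for the minimum combined
    dose-difference / distance-to-agreement metric; a point passes if
    gamma <= 1."""
    d_max = max(calc_dose)                     # global dose normalisation
    gammas = []
    for xm, dm in zip(meas_pos, meas_dose):
        g = min(math.sqrt(((dc - dm) / (dose_tol * d_max)) ** 2
                          + ((xc - xm) / dta_mm) ** 2)
                for xc, dc in zip(calc_pos, calc_dose))
        gammas.append(g)
    return gammas

# hypothetical field-edge profile: measured edge shifted by 1 mm vs. calculated
xs = [i * 0.5 for i in range(41)]                     # 0..20 mm grid
calc = [100.0 if 5 <= x <= 15 else 20.0 for x in xs]
meas = [100.0 if 6 <= x <= 16 else 20.0 for x in xs]
g = gamma_index_1d(xs, meas, xs, calc)
pass_rate = sum(1 for v in g if v <= 1.0) / len(g)
print(f"gamma pass rate: {pass_rate:.0%}")
```

    A 1 mm spatial shift passes comfortably under a 3 mm DTA; shifting the edge by more than 3 mm would start to fail points in the penumbra.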

  4. Algorithm of physical rehabilitation of athletes in polyclinic stage of treatment of osteochondrosis

    Directory of Open Access Journals (Sweden)

    E. V. MAKAROVA

    2014-12-01

    Full Text Available Purpose: to develop an algorithm that improves the physical rehabilitation of athletes with osteochondrosis at the polyclinic stage of treatment. Material: analysis of the scientific literature and Internet sources. Results: the research analyzes the traditional recovery of athletes with spinal osteochondrosis at the polyclinic stage of treatment. Practical and clinical study showed that the etiopathogenesis of injuries and diseases of the musculoskeletal system is similar across athletes of different specialties. Consequently, the recovery of sports working capacity during rehabilitation treatment does not differ from that of ordinary patients. We propose an algorithm of physical rehabilitation aimed at the fast recovery of athletes' health and their return to athletic training, with therapeutic exercises using specific devices such as a balancing disk and the preventive device "Osan". Conclusions: it was established that the physical rehabilitation of athletes at the polyclinic stage of treatment is similar to the rehabilitation of non-athlete patients, and an improved algorithm for the physical rehabilitation of athletes with osteochondrosis at the polyclinic stage of treatment is presented.

  5. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    Science.gov (United States)

    Chiou, Jin-Chern

    1990-01-01

    Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. To minimize constraint violations during the time integration process, penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference scheme to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was efficiently carried out. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices, from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
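
    The Schur-complement solve at the heart of the parallel algorithm can be sketched on a tiny arrowhead system. The matrix sizes and values are illustrative, and the CG here is unpreconditioned; the structure (eliminate body unknowns, solve the reduced system, back-substitute) is the point:

```python
def cg(matvec, b, tol=1e-10, max_iter=200):
    """Plain conjugate gradient for a symmetric positive-definite operator."""
    x = [0.0] * len(b)
    r = b[:]
    p = r[:]
    rs = sum(v * v for v in r)
    for _ in range(max_iter):
        ap = matvec(p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, ap)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Arrowhead system [[A, B], [B^T, C]]: eliminating the body unknowns gives
# the Schur complement S = C - B^T A^-1 B on the coupling unknowns.
A = [4.0, 5.0]                      # diagonal of A (body equations)
B = [[1.0], [2.0]]                  # coupling to one constraint unknown
C = [[3.0]]
f = [6.0, 9.0]                      # body right-hand side
g = [3.0]

def schur_matvec(y):
    # S y = C y - B^T A^-1 (B y), never forming S explicitly
    By = [sum(B[i][j] * y[j] for j in range(len(y))) for i in range(len(A))]
    AinvBy = [By[i] / A[i] for i in range(len(A))]
    return [sum(C[i][j] * y[j] for j in range(len(y)))
            - sum(B[k][i] * AinvBy[k] for k in range(len(A)))
            for i in range(len(g))]

rhs = [g[i] - sum(B[k][i] * f[k] / A[k] for k in range(len(A)))
       for i in range(len(g))]
lam = cg(schur_matvec, rhs)                       # coupling unknowns
x = [(f[i] - sum(B[i][j] * lam[j] for j in range(len(lam)))) / A[i]
     for i in range(len(A))]                      # back-substituted body DOFs
print(lam, x)
```

    Because A is (block) diagonal, both A^-1 applications and the back-substitution parallelize trivially over bodies, which is what the thesis exploits.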

  6. Orthognathic Surgery in Craniofacial Microsomia: Treatment Algorithm

    Science.gov (United States)

    Valladares, Salvador; Torrealba, Ramón; Nuñez, Marcelo; Uribe, Francisca

    2015-01-01

    Summary: Craniofacial microsomia is a broad term that covers a variety of craniofacial malformation conditions that are caused by alterations in the derivatives of the first and second pharyngeal arches. In general terms, diverse therapeutic alternatives are proposed according to the growth stage and the severity of the alteration. When craniofacial growth has concluded, conventional orthognathic surgery (Le Fort I osteotomy, bilateral sagittal split osteotomy, and genioplasty) provides good alternatives for MI and MIIA type cases. Reconstruction of the mandibular ramus and temporomandibular joint before orthognathic surgery is the indicated treatment for cases MIIB and MIII. The goal of this article is to establish a surgical treatment algorithm for orthognathic surgery on patients with craniofacial microsomia, analyzing the points that allow the ideal treatment for each patient to be chosen. PMID:25674375

  7. A DVH-guided IMRT optimization algorithm for automatic treatment planning and adaptive radiotherapy replanning

    International Nuclear Information System (INIS)

    Zarepisheh, Masoud; Li, Nan; Long, Troy; Romeijn, H. Edwin; Tian, Zhen; Jia, Xun; Jiang, Steve B.

    2014-01-01

    Purpose: To develop a novel algorithm that incorporates prior treatment knowledge into intensity modulated radiation therapy optimization to facilitate automatic treatment planning and adaptive radiotherapy (ART) replanning. Methods: The algorithm automatically creates a treatment plan guided by the DVH curves of a reference plan that contains information on the clinician-approved dose-volume trade-offs among different targets/organs and among different portions of a DVH curve for an organ. In ART, the reference plan is the initial plan for the same patient, while for automatic treatment planning the reference plan is selected from a library of clinically approved and delivered plans of previously treated patients with similar medical conditions and geometry. The proposed algorithm employs a voxel-based optimization model and navigates the large voxel-based Pareto surface. The voxel weights are iteratively adjusted to approach a plan that is similar to the reference plan in terms of the DVHs. If the reference plan is feasible but not Pareto optimal, the algorithm generates a Pareto optimal plan with DVHs better than the reference ones. If the reference plan is too restricting for the new geometry, the algorithm generates a Pareto plan with DVHs close to the reference ones. In both cases, the new plans have DVH trade-offs similar to those of the reference plans. Results: The algorithm was tested using three patient cases and found to be able to automatically adjust the voxel-weighting factors in order to generate a Pareto plan with DVH trade-offs similar to those of the reference plan. The algorithm has also been implemented on a GPU for high efficiency. Conclusions: A novel prior-knowledge-based optimization algorithm has been developed that automatically adjusts the voxel weights and generates a clinically optimal plan with high efficiency. 
It is found that the new algorithm can significantly improve the plan quality and planning efficiency in ART replanning and automatic treatment planning.
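
    The DVH-guided voxel-weight idea can be caricatured in a few lines. The update rule, step size and data below are hypothetical simplifications of the paper's Pareto-surface navigation, kept only to show the mechanism:

```python
def dvh(doses, bins):
    """Cumulative DVH: fraction of voxels receiving at least each bin dose."""
    n = len(doses)
    return [sum(1 for d in doses if d >= b) / n for b in bins]

def update_voxel_weights(weights, doses, ref_doses, step=0.1):
    """One iteration of the DVH-guided idea (simplified): voxels sitting
    higher in the current DVH than the corresponding portion of the
    reference DVH get their penalty weight increased, pushing the next
    optimization pass towards the reference trade-off; colder portions are
    relaxed."""
    cur = sorted(doses)
    ref = sorted(ref_doses)
    order = sorted(range(len(doses)), key=lambda i: doses[i])
    new = weights[:]
    for rank, i in enumerate(order):
        if cur[rank] > ref[rank]:            # this DVH portion is too hot
            new[i] *= (1.0 + step)
        elif cur[rank] < ref[rank]:
            new[i] *= (1.0 - step)
    return new

# hypothetical organ-at-risk doses (Gy) vs. a reference plan's doses
current = [10.0, 22.0, 35.0, 41.0]
reference = [12.0, 20.0, 30.0, 38.0]
w = update_voxel_weights([1.0] * 4, current, reference)
print(w)
```

    In the real algorithm this weight update alternates with a full fluence optimization; here only the weight step is shown.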

  8. Eddy current testing probe optimization using a parallel genetic algorithm

    Directory of Open Access Journals (Sweden)

    Dolapchiev Ivaylo

    2008-01-01

    Full Text Available This paper uses a purpose-developed parallel version of Michalewicz's Genocop III genetic algorithm (GA) to optimize the coil geometry of an eddy current non-destructive testing probe (ECTP). The electromagnetic field is computed using the FEMM 2D finite element code. The aim of this optimization was to determine the coil dimensions and positions that improve ECTP sensitivity to the physical properties of the tested devices.
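
    A minimal sketch of the GA machinery involved (not Genocop III itself, which additionally handles nonlinear constraints via repair, and not parallel): the "sensitivity" objective below is a cheap hypothetical stand-in for the costly FEMM field evaluation, which in the paper's setting would be farmed out across processes per individual:

```python
import random

def evolve(fitness, bounds, pop_size=30, generations=60, seed=7):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation clamped to the search box, with two-elite survival."""
    rng = random.Random(seed)
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    pop = [[rng.uniform(l, h) for l, h in zip(lo, hi)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        nxt = scored[:2]                               # elitism
        while len(nxt) < pop_size:
            a, b = (max(rng.sample(scored, 3), key=fitness) for _ in range(2))
            child = [ai + rng.random() * (bi - ai) for ai, bi in zip(a, b)]
            if rng.random() < 0.2:                     # mutation
                i = rng.randrange(len(child))
                child[i] = min(hi[i], max(lo[i], child[i] + rng.gauss(0, 0.1)))
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# stand-in "probe sensitivity", peaking at coil radius 4 mm, height 2 mm
def sensitivity(x):
    r, h = x
    return -((r - 4.0) ** 2 + (h - 2.0) ** 2)

best = evolve(sensitivity, bounds=[(1.0, 10.0), (0.5, 5.0)])
print(best)
```

    Since each fitness evaluation is independent, the expensive finite-element calls parallelize naturally, which is the motivation for the parallel version used in the paper.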

  9. Diagnosis and treatment of acute ankle injuries: development of an evidence-based algorithm

    Directory of Open Access Journals (Sweden)

    Hans Polzer

    2012-01-01

    Full Text Available Acute ankle injuries are among the most common injuries in emergency departments. However, a standardized examination and evidence-based treatment are lacking. Therefore, the aim of this study was to systematically search the current literature, classify the evidence, and develop an algorithm for the diagnosis and treatment of acute ankle injuries. We systematically searched PubMed and the Cochrane Database for randomized controlled trials, meta-analyses, systematic reviews or, if applicable, observational studies and classified them according to their level of evidence. According to the currently available literature, the following recommendations are given. The Ottawa Ankle/Foot Rule should be applied in order to rule out fractures. Physical examination is sufficient for diagnosing injuries to the lateral ligament complex. Classification into stable and unstable injuries is applicable and of clinical importance. The squeeze, crossed-leg and external rotation tests are indicative of injuries to the syndesmosis. Magnetic resonance imaging is recommended to verify such injuries. Stable ankle sprains have a good prognosis, while for unstable ankle sprains conservative treatment is at least as effective as operative treatment without carrying the possible complications. Early functional treatment leads to the fastest recovery and the lowest rate of re-injury. Supervised rehabilitation reduces residual symptoms and re-injuries. Taking these recommendations into account, we here present an applicable and evidence-based, step-by-step decision pathway for the diagnosis and treatment of acute ankle injuries, which can be implemented in any emergency department or doctor's practice. It provides quality assurance for the patient and confidence for the attending physician.
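
    The first step of such a pathway, the Ottawa Ankle Rule, is mechanical enough to express directly. This is a paraphrase of the published rule for the ankle series (the foot series has an analogous form), not code from the study:

```python
def ottawa_ankle_xray_indicated(malleolar_pain, tender_lat_malleolus,
                                tender_med_malleolus, cannot_bear_weight):
    """Paraphrase of the published Ottawa Ankle Rule: an ankle radiograph
    is indicated only for malleolar-zone pain combined with bone
    tenderness at the posterior edge/tip of either malleolus, or inability
    to bear weight both immediately and in the emergency department
    (four steps)."""
    return malleolar_pain and (tender_lat_malleolus or tender_med_malleolus
                               or cannot_bear_weight)

# example: malleolar pain, no bony tenderness, able to walk -> no X-ray needed
print(ottawa_ankle_xray_indicated(True, False, False, False))
```

    Encoding the rule this way makes the high sensitivity / limited specificity trade-off explicit: any single positive secondary finding triggers imaging.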

  10. Does videothoracoscopy improve clinical outcomes when implemented as part of a pleural empyema treatment algorithm?

    Directory of Open Access Journals (Sweden)

    Ricardo Mingarini Terra

    Full Text Available OBJECTIVE: We aimed to evaluate whether the inclusion of videothoracoscopy in a pleural empyema treatment algorithm would change the clinical outcome of such patients. METHODS: This was a quality-improvement study. We conducted a retrospective review of patients who underwent pleural decortication for pleural empyema at our institution from 2002 to 2008. With the old algorithm (January 2002 to September 2005), open decortication was the procedure of choice, and videothoracoscopy was only performed in certain sporadic mid-stage cases. With the new algorithm (October 2005 to December 2008), videothoracoscopy became the first-line treatment option, whereas open decortication was only performed in patients with a thick pleural peel (>2 cm) observed by chest scan. The patients were divided into an old algorithm (n = 93) and a new algorithm (n = 113) group and compared. The main outcome variables assessed included treatment failure (pleural space reintervention or death up to 60 days after medical discharge) and the occurrence of complications. RESULTS: Videothoracoscopy and open decortication were performed in 13 and 80 patients from the old algorithm group and in 81 and 32 patients from the new algorithm group, respectively (p<0.01). The patients in the new algorithm group were older (41 ± 1 vs. 46.3 ± 16.7 years, p = 0.014) and had higher Charlson Comorbidity Index scores [0 (0-3) vs. 2 (0-4), p = 0.032]. The occurrence of treatment failure was similar in both groups (19.35% vs. 24.77%, p = 0.35), although the complication rate was lower in the new algorithm group (48.3% vs. 33.6%, p = 0.04). CONCLUSIONS: The wider use of videothoracoscopy in pleural empyema treatment was associated with fewer complications and unaltered rates of mortality and reoperation, even though more severely ill patients were subjected to videothoracoscopic surgery.

  11. Robust low frequency current ripple elimination algorithm for grid-connected fuel cell systems with power balancing technique

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong-Soo; Choe, Gyu-Yeong; Lee, Byoung-Kuk [School of Information and Communication Engineering, Sungkyunkwan University, 300 Cheoncheon-dong, Jangan-gu, Suwon, Gyeonggi-do 440-746 (Korea, Republic of); Kang, Hyun-Soo [R and D Center, Advanced Drive Technology (ADT) Company, 689-26 Geumjeong-dong, Gunpo-si, Gyeonggi-do 435-862 (Korea, Republic of)

    2011-05-15

    The low-frequency current ripple in grid-connected fuel cell systems is generated by dc-ac inverter operation, which produces a 60 Hz fundamental component, and has harmful effects on the fuel cell stack itself, such as slowing cathode surface responses, increasing fuel consumption by more than 10%, creating oxygen starvation, reducing the operating lifetime, and incurring nuisance tripping such as in overload situations. For these reasons, low-frequency current ripple makes the fuel cell system unstable and shortens the lifetime of the fuel cell stack. This paper presents a fast and robust control algorithm to eliminate low-frequency current ripple in grid-connected fuel cell systems. Compared with conventional methods, in the proposed control algorithm the dc link voltage controller is shifted from the dc-dc converter to the dc-ac inverter, so that the dc-ac inverter handles dc link voltage control and output current control simultaneously with the help of a power balancing technique. The results indicate that the proposed algorithm can not only completely eliminate the current ripple but also significantly reduce the overshoot or undershoot during transient states without any extra hardware. The validity of the proposed algorithm is verified by computer simulations and by experiments with a 1 kW laboratory prototype. (author)
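
    Where the ripple comes from can be seen from the instantaneous power: the product of two 60 Hz sinusoids pulsates at twice the line frequency, and without a power balancing element that pulsation is drawn straight from the stack. The numbers below are hypothetical and the model ignores converter dynamics entirely:

```python
import math

F_LINE = 60.0                       # grid frequency (Hz)
V_AC, I_AC = 230.0, 6.0             # ac output amplitudes (hypothetical)
V_DC = 400.0                        # dc-link voltage (hypothetical)

def inverter_output_power(t):
    """Instantaneous single-phase output power: the product of sinusoidal
    voltage and current contains a constant term plus a pulsation at twice
    the line frequency, the source of the low-frequency ripple."""
    w = 2 * math.pi * F_LINE
    return V_AC * math.sin(w * t) * I_AC * math.sin(w * t)

avg_p = V_AC * I_AC / 2             # analytic average of the product above

ts = [i / (F_LINE * 200) for i in range(400)]            # two line cycles
i_unbalanced = [inverter_output_power(t) / V_DC for t in ts]  # ripple reaches stack
i_balanced = avg_p / V_DC           # dc-link storage buffers the pulsation
ripple = (max(i_unbalanced) - min(i_unbalanced)) / i_balanced
print(f"stack current ripple without power balancing: {ripple:.1f} p.u.")
```

    With unity power factor the pulsation swings between zero and twice the average, i.e. a 2 p.u. peak-to-peak ripple; the power balancing technique's job is to make the stack see only the constant term.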

  12. Control algorithms based on the active and non-active currents for a UPQC without series transformers

    OpenAIRE

    Correa Monteiro, Luis Fernando; Aredes, Mauricio; Pinto, J. G.; Exposto, Bruno; Afonso, João L.

    2016-01-01

    This study presents control algorithms for a new unified power quality conditioner (UPQC) without the series transformers that are frequently used to insert the series converter of the UPQC between the power supply and the load. The behaviour of the proposed UPQC is evaluated in the presence of voltage imbalances, as well as under non-sinusoidal voltage and current conditions. The presented algorithms derive from the concepts involving the active and non-active currents, together w...

  13. New management algorithms in multiple sclerosis

    DEFF Research Database (Denmark)

    Sorensen, Per Soelberg

    2014-01-01

    PURPOSE OF REVIEW: Our current treatment algorithms include only IFN-β and glatiramer as available first-line disease-modifying drugs and natalizumab and fingolimod as second-line therapies. Today, 10 drugs have been approved in Europe and nine in the United States, making the choice of therapy more complex. The purpose of the review has been to work out new management algorithms for treatment of relapsing-remitting multiple sclerosis including new oral therapies and therapeutic monoclonal antibodies. RECENT FINDINGS: Recent large placebo-controlled trials in relapsing-remitting multiple sclerosis...

  14. A Proposed Algorithm for Improved Recognition and Treatment of the Depression/Anxiety Spectrum in Primary Care

    Science.gov (United States)

    Ballenger, James C.; Davidson, Jonathan R. T.; Lecrubier, Yves; Nutt, David J.

    2001-01-01

    The International Consensus Group on Depression and Anxiety has held 7 meetings over the last 3 years that focused on depression and specific anxiety disorders. During the course of the meeting series, a number of common themes have developed. At the last meeting of the Consensus Group, we reviewed these areas of commonality across the spectrum of depression and anxiety disorders. With the aim of improving the recognition and management of depression and anxiety in the primary care setting, we developed an algorithm that is presented in this article. We attempted to balance currently available scientific knowledge about the treatment of these disorders and to reformat it to provide an acceptable algorithm that meets the practical aspects of recognizing and treating these disorders in primary care. PMID:15014615

  15. Use of Monte Carlo computation in benchmarking radiotherapy treatment planning system algorithms

    International Nuclear Information System (INIS)

    Lewis, R.D.; Ryde, S.J.S.; Seaby, A.W.; Hancock, D.A.; Evans, C.J.

    2000-01-01

    Radiotherapy treatments are becoming more complex, often requiring the dose to be calculated in three dimensions and sometimes involving the application of non-coplanar beams. The ability of treatment planning systems to accurately calculate dose under a range of these and other irradiation conditions requires evaluation. Practical assessment of such arrangements can be problematical, especially when a heterogeneous medium is used. This work describes the use of Monte Carlo computation as a benchmarking tool to assess the dose distribution of external photon beam plans obtained in a simple heterogeneous phantom by several commercially available 3D and 2D treatment planning system algorithms. For comparison, practical measurements were undertaken using film dosimetry. The dose distributions were calculated for a variety of irradiation conditions designed to show the effects of surface obliquity, inhomogeneities and missing tissue above tangential beams. The results show maximum dose differences of 47% between some planning algorithms and film at a point 1 mm below a tangentially irradiated surface. Overall, the dose distribution obtained from film was most faithfully reproduced by the Monte Carlo N-Particle results illustrating the potential of Monte Carlo computation in evaluating treatment planning system algorithms. (author)
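
    The benchmarking idea can be illustrated with a toy Monte Carlo that scores only primary-photon interactions in a homogeneous slab, checked against the analytic attenuation law. A water-like attenuation coefficient of about 0.07 cm⁻¹ (roughly a 1 MeV photon) is assumed; real benchmarking codes such as MCNP, as used in the paper, also transport scattered photons and secondary electrons:

```python
import random, math

def mc_depth_dose(n_photons, mu, depth_cm, n_bins, seed=3):
    """Toy Monte Carlo: mono-energetic photons enter a slab and deposit
    all their energy at the first interaction point, whose depth is
    sampled from the exponential attenuation law with coefficient mu."""
    rng = random.Random(seed)
    bin_w = depth_cm / n_bins
    dose = [0.0] * n_bins
    for _ in range(n_photons):
        z = -math.log(rng.random()) / mu       # sampled free path (cm)
        if z < depth_cm:
            dose[int(z / bin_w)] += 1.0
    return [d / n_photons for d in dose]       # fraction interacting per bin

# assumed water-like attenuation coefficient, 20 cm phantom, 2 cm bins
d = mc_depth_dose(200_000, mu=0.07, depth_cm=20.0, n_bins=10)
print([round(v, 3) for v in d])
```

    The first bin should approach the analytic value 1 − exp(−0.14) ≈ 0.13, and the profile decays geometrically with depth, a simple self-check before trusting the code as a benchmark.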

  16. How Effective Is Algorithm-Guided Treatment for Depressed Inpatients? Results from the Randomized Controlled Multicenter German Algorithm Project 3 Trial.

    Science.gov (United States)

    Adli, Mazda; Wiethoff, Katja; Baghai, Thomas C; Fisher, Robert; Seemüller, Florian; Laakmann, Gregor; Brieger, Peter; Cordes, Joachim; Malevani, Jaroslav; Laux, Gerd; Hauth, Iris; Möller, Hans-Jürgen; Kronmüller, Klaus-Thomas; Smolka, Michael N; Schlattmann, Peter; Berger, Maximilian; Ricken, Roland; Stamm, Thomas J; Heinz, Andreas; Bauer, Michael

    2017-09-01

    Treatment algorithms are considered key to improving outcomes by enhancing the quality of care. This is the first randomized controlled study to evaluate the clinical effect of algorithm-guided treatment in inpatients with major depressive disorder. Inpatients aged 18 to 70 years with major depressive disorder from 10 German psychiatric departments were randomized to 5 different treatment arms (from 2000 to 2005), 3 of which were standardized stepwise drug treatment algorithms (ALGO). The fourth arm proposed medications and provided less specific recommendations based on a computerized documentation and expert system (CDES); the fifth arm received treatment as usual (TAU). ALGO included 3 different second-step strategies: lithium augmentation (ALGO LA), antidepressant dose-escalation (ALGO DE), and switch to a different antidepressant (ALGO SW). Time to remission (21-item Hamilton Depression Rating Scale ≤9) was the primary outcome. Time to remission was significantly shorter for ALGO DE (n=91) compared with both TAU (n=84) (HR=1.67; P=.014) and CDES (n=79) (HR=1.59; P=.031), and for ALGO SW (n=89) compared with both TAU (HR=1.64; P=.018) and CDES (HR=1.56; P=.038). For both ALGO LA (n=86) and ALGO DE, fewer antidepressant medications were needed to achieve remission than for CDES or TAU (P…). Algorithm-guided treatment is associated with shorter times and fewer medication changes to achieve remission in depressed inpatients than treatment as usual or computerized medication choice guidance. © The Author 2017. Published by Oxford University Press on behalf of CINP.

  17. Implementation of Human Trafficking Education and Treatment Algorithm in the Emergency Department.

    Science.gov (United States)

    Egyud, Amber; Stephens, Kimberly; Swanson-Bierman, Brenda; DiCuccio, Marge; Whiteman, Kimberly

    2017-11-01

    Health care professionals have not been successful in recognizing or rescuing victims of human trafficking. The purpose of this project was to implement a screening system and treatment algorithm in the emergency department to improve the identification and rescue of victims of human trafficking. The lack of recognition by health care professionals is related to inadequate education and training tools and confusion with other forms of violence such as trauma and sexual assault. A multidisciplinary team was formed to assess the evidence related to human trafficking and make recommendations for practice. After receiving education, staff completed a survey about knowledge gained from the training. An algorithm for identification and treatment of sex trafficking victims was implemented and included a 2-pronged identification approach: (1) medical red flags created by a risk-assessment tool embedded in the electronic health record and (2) a silent notification process. Outcome measures were the number of victims who were identified either by the medical red flags or by silent notification and who were offered and accepted intervention. Survey results indicated that 75% of participants reported that the education improved their competence level. The results demonstrated that an education and treatment algorithm may be an effective strategy to improve recognition. One patient was identified as an actual victim of human trafficking; the remaining patients reported other forms of abuse. Education and a treatment algorithm were effective strategies to improve recognition and rescue of human trafficking victims and increase identification of other forms of abuse. Copyright © 2017 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.

  18. HOW DOES FINGOLIMOD (GILENYA® FIT IN THE TREATMENT ALGORITHM FOR HIGHLY ACTIVE RELAPSING-REMITTING MULTIPLE SCLEROSIS?

    Directory of Open Access Journals (Sweden)

    Franz eFazekas

    2013-05-01

    Full Text Available Multiple sclerosis (MS is a neurological disorder characterised by inflammatory demyelination and neurodegeneration in the central nervous system (CNS. Until recently, disease modifying treatment was based on agents requiring parenteral delivery, thus limiting long-term compliance. Basic treatments such as beta-interferon provide only moderate efficacy, and although therapies for second-line treatment and highly active MS are more effective, they are associated with potentially severe side effects. Fingolimod (Gilenya® is the first oral treatment of MS and has recently been approved as single disease-modifying therapy in highly active relapsing-remitting multiple sclerosis (RRMS for adult patients with high disease activity despite basic treatment (beta-interferon and for treatment-naïve patients with rapidly evolving severe RRMS. At a scientific meeting that took place in Vienna on November 18th, 2011, experts from 10 Central and Eastern European countries discussed the clinical benefits and potential risks of fingolimod for MS, suggested how the new therapy fits within the current treatment algorithm and provided expert opinion for the selection and management of patients.

  19. Comparison of two heterogeneity correction algorithms in pituitary gland treatments with intensity-modulated radiation therapy

    International Nuclear Information System (INIS)

    Albino, Lucas D.; Santos, Gabriela R.; Ribeiro, Victor A.B.; Rodrigues, Laura N.; Weltman, Eduardo; Braga, Henrique F.

    2013-01-01

    The accuracy of the dose calculated by a treatment planning system is directly related to the chosen algorithm. Nowadays, several dose calculation algorithms are commercially available; they differ in calculation time and accuracy, especially when individual tissue densities are taken into account. The aim of this study was to compare two different calculation algorithms from iPlan® (BrainLAB) in the treatment of pituitary gland tumors with intensity-modulated radiation therapy (IMRT). These tumors are located in a region of tissues with variable electronic density. The deviations from the plan with no heterogeneity correction were evaluated. For initial validation of the data entered into the planning system, an IMRT plan was simulated in an anthropomorphic phantom and the dose distribution was measured with a radiochromic film. Gamma analysis was performed on the film, comparing it with dose distributions calculated with the X-ray Voxel Monte Carlo (XVMC) algorithm and pencil beam convolution (PBC). Next, 33 patient plans, initially calculated with the PBC algorithm, were recalculated with the XVMC algorithm. The treatment volume and organ-at-risk dose-volume histograms were compared. No relevant differences were found in the dose-volume histograms between XVMC and PBC. However, differences were obtained when comparing each plan with the plan without heterogeneity correction. (author)

  20. Russian guidelines for the management of COPD: algorithm of pharmacologic treatment

    Directory of Open Access Journals (Sweden)

    Aisanov Z

    2018-01-01

    Full Text Available Zaurbek Aisanov,1 Sergey Avdeev,2 Vladimir Arkhipov,3 Andrey Belevskiy,1 Alexander Chuchalin,1 Igor Leshchenko,4 Svetlana Ovcharenko,5 Evgeny Shmelev,6 Marc Miravitlles7 1Department of Pulmonology, N.I. Pirogov Russian State National Research Medical University, Healthcare Ministry of Russia, 2Clinical Department, Federal Pulmonology Research Institute, Federal Medical and Biological Agency of Russia, 3Clinical Pharmacology Department, RUDN University, 4Department of Phthisiology, Pulmonology and Thoracic Surgery, Ural State Medical University, Healthcare Ministry of Russia, Ekaterinburg, 5Internal Medicine Department No.1, I.M. Sechenov First Moscow State Medical University, Healthcare Ministry of Russia, 6Department of Differential Diagnostics, Federal Central Research Institute of Tuberculosis, Moscow, Russia; 7Pneumology Department, University Hospital Vall d’Hebron, Ciber de Enfermedades Respiratorias (CIBERES, Barcelona, Spain Abstract: The high prevalence of COPD together with its high level of misdiagnosis and late diagnosis dictate the necessity for the development and implementation of clinical practice guidelines (CPGs in order to improve the management of this disease. High-quality, evidence-based international CPGs need to be adapted to the particular situation of each country or region. A new version of the Russian Respiratory Society guidelines released at the end of 2016 was based on the proposal by Global Initiative for Obstructive Lung Disease but adapted to the characteristics of the Russian health system and included an algorithm of pharmacologic treatment of COPD. The proposed algorithm had to comply with the requirements of the Russian Ministry of Health to be included into the unified electronic rubricator, which required a balance between the level of information and the simplicity of the graphic design. This was achieved by: exclusion of the initial diagnostic process, grouping together the common pharmacologic and

  1. Current treatment of low grade astrocytoma

    DEFF Research Database (Denmark)

    Pedersen, Christina Louise; Romner, Bertil

    2013-01-01

    Through a comprehensive review of the current literature, the present article investigates several aspects of low grade astrocytomas (LGA), including prognostic factors, treatment strategies and follow-up regimes. LGA are in general relatively slow-growing primary brain tumours, but they have a variable course of disease. The current literature seems to support the idea that treatment with radical tumour resection, where possible, yields a better long-term outcome for patients with LGA. However, adjuvant therapy is often necessary. Administering early postoperative radiotherapy to patients with partially ... effective in discriminating between tumour progression and radiation necrosis. The research into biomarkers is currently limited with regard to their applications in LGA diagnostics, and therefore further studies including larger patient populations are needed.

  2. Treatment Algorithms in Systemic Lupus Erythematosus.

    Science.gov (United States)

    Muangchan, Chayawee; van Vollenhoven, Ronald F; Bernatsky, Sasha R; Smith, C Douglas; Hudson, Marie; Inanç, Murat; Rothfield, Naomi F; Nash, Peter T; Furie, Richard A; Senécal, Jean-Luc; Chandran, Vinod; Burgos-Vargas, Ruben; Ramsey-Goldman, Rosalind; Pope, Janet E

    2015-09-01

    To establish agreement on systemic lupus erythematosus (SLE) treatment. SLE experts (n = 69) were e-mailed scenarios and indicated preferred treatments. Algorithms were constructed and agreement determined (≥50% respondents indicating ≥70% agreement). Initially, 54% (n = 37) responded suggesting treatment for scenarios; 13 experts rated agreement with scenarios. Fourteen of 16 scenarios had agreement as follows: discoid lupus: first-line therapy was topical agents and hydroxychloroquine and/or glucocorticoids then azathioprine and subsequently mycophenolate (mofetil); uncomplicated cutaneous vasculitis: initial treatment was glucocorticoids ± hydroxychloroquine ± methotrexate, followed by azathioprine or mycophenolate and then cyclophosphamide; arthritis: initial therapy was hydroxychloroquine and/or glucocorticoids, then methotrexate and subsequently rituximab; pericarditis: first-line therapy was nonsteroidal antiinflammatory drugs, then glucocorticoids with/without hydroxychloroquine, then azathioprine, mycophenolate, or methotrexate and finally belimumab or rituximab, and/or a pericardial window; interstitial lung disease/alveolitis: induction was glucocorticoids and mycophenolate or cyclophosphamide, then rituximab or intravenous gamma globulin (IVIG), and maintenance followed with azathioprine or mycophenolate; pulmonary hypertension: glucocorticoids and mycophenolate or cyclophosphamide and an endothelin receptor antagonist were initial therapies, subsequent treatments were phosphodiesterase-5 inhibitors and then prostanoids and rituximab; antiphospholipid antibody syndrome: standard anticoagulation with/without hydroxychloroquine, then a thrombin inhibitor for venous thrombosis, versus adding aspirin or platelet inhibition drugs for arterial events; mononeuritis multiplex and central nervous system vasculitis: first-line therapy was glucocorticoids and cyclophosphamide followed by maintenance with azathioprine or mycophenolate, and

  3. Evaluation of a Machine-Learning Algorithm for Treatment Planning in Prostate Low-Dose-Rate Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nicolae, Alexandru [Department of Physics, Ryerson University, Toronto, Ontario (Canada); Department of Medical Physics, Odette Cancer Center, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Morton, Gerard; Chung, Hans; Loblaw, Andrew [Department of Radiation Oncology, Odette Cancer Center, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Jain, Suneil; Mitchell, Darren [Department of Clinical Oncology, The Northern Ireland Cancer Centre, Belfast City Hospital, Antrim, Northern Ireland (United Kingdom); Lu, Lin [Department of Radiation Therapy, Odette Cancer Center, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Helou, Joelle; Al-Hanaqta, Motasem [Department of Radiation Oncology, Odette Cancer Center, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Heath, Emily [Department of Physics, Carleton University, Ottawa, Ontario (Canada); Ravi, Ananth, E-mail: ananth.ravi@sunnybrook.ca [Department of Medical Physics, Odette Cancer Center, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada)

    2017-03-15

Purpose: This work presents the application of a machine learning (ML) algorithm to automatically generate high-quality, prostate low-dose-rate (LDR) brachytherapy treatment plans. The ML algorithm can mimic characteristics of preoperative treatment plans deemed clinically acceptable by brachytherapists. The planning efficiency, dosimetry, and quality (as assessed by experts) of preoperative plans generated with an ML planning approach were retrospectively evaluated in this study. Methods and Materials: Preimplantation and postimplantation treatment plans were extracted from 100 high-quality LDR treatments and stored within a training database. The ML training algorithm matches similar features from a new LDR case to those within the training database to rapidly obtain an initial seed distribution; plans were then further fine-tuned using stochastic optimization. Preimplantation treatment plans generated by the ML algorithm were compared with brachytherapist (BT) treatment plans in terms of planning time (Wilcoxon rank sum, α = 0.05) and dosimetry (1-way analysis of variance, α = 0.05). Qualitative preimplantation plan quality was evaluated by expert LDR radiation oncologists using a Likert scale questionnaire. Results: The average planning time for the ML approach was 0.84 ± 0.57 minutes, compared with 17.88 ± 8.76 minutes for the expert planner (P=.020). Preimplantation plans were dosimetrically equivalent to the BT plans; the average prostate V150% was 4% lower for ML plans (P=.002), although the difference was not clinically significant. Respondents ranked the ML-generated plans as equivalent to expert BT treatment plans in terms of target coverage, normal tissue avoidance, implant confidence, and the need for plan modifications. Respondents had difficulty differentiating between plans generated by a human and those generated by the ML algorithm. Conclusions: Prostate LDR preimplantation treatment plans that have equivalent quality to plans created
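The two-step planning strategy described above (feature matching against a training database to seed an initial plan, then stochastic fine-tuning) can be sketched in miniature. The feature normalisation, nearest-neighbour retrieval, and toy objective below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def retrieve_initial_plan(case_features, train_features, train_plans):
    """Return the stored seed distribution of the training case that is
    nearest to the new case in per-feature-normalised feature space."""
    X = np.asarray(train_features, dtype=float)
    q = np.asarray(case_features, dtype=float)
    scale = X.std(axis=0) + 1e-9            # guard against zero-variance features
    d = np.linalg.norm((X - q) / scale, axis=1)
    return np.array(train_plans[int(np.argmin(d))], dtype=float)

def fine_tune(plan, objective, n_iter=2000, step=0.5, decay=0.999, seed=0):
    """Stochastic fine-tuning: keep random Gaussian perturbations that
    lower the objective, with a slowly shrinking step size."""
    rng = np.random.default_rng(seed)
    best = np.asarray(plan, dtype=float).copy()
    best_val = objective(best)
    for _ in range(n_iter):
        cand = best + rng.normal(0.0, step, size=best.shape)
        val = objective(cand)
        if val < best_val:
            best, best_val = cand, val
        step *= decay
    return best, best_val
```

A real system would use dosimetric features and a dose-based objective; the skeleton (retrieve, then locally optimise) is the same.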

  4. Evaluation of a Machine-Learning Algorithm for Treatment Planning in Prostate Low-Dose-Rate Brachytherapy

    International Nuclear Information System (INIS)

    Nicolae, Alexandru; Morton, Gerard; Chung, Hans; Loblaw, Andrew; Jain, Suneil; Mitchell, Darren; Lu, Lin; Helou, Joelle; Al-Hanaqta, Motasem; Heath, Emily; Ravi, Ananth

    2017-01-01

Purpose: This work presents the application of a machine learning (ML) algorithm to automatically generate high-quality, prostate low-dose-rate (LDR) brachytherapy treatment plans. The ML algorithm can mimic characteristics of preoperative treatment plans deemed clinically acceptable by brachytherapists. The planning efficiency, dosimetry, and quality (as assessed by experts) of preoperative plans generated with an ML planning approach were retrospectively evaluated in this study. Methods and Materials: Preimplantation and postimplantation treatment plans were extracted from 100 high-quality LDR treatments and stored within a training database. The ML training algorithm matches similar features from a new LDR case to those within the training database to rapidly obtain an initial seed distribution; plans were then further fine-tuned using stochastic optimization. Preimplantation treatment plans generated by the ML algorithm were compared with brachytherapist (BT) treatment plans in terms of planning time (Wilcoxon rank sum, α = 0.05) and dosimetry (1-way analysis of variance, α = 0.05). Qualitative preimplantation plan quality was evaluated by expert LDR radiation oncologists using a Likert scale questionnaire. Results: The average planning time for the ML approach was 0.84 ± 0.57 minutes, compared with 17.88 ± 8.76 minutes for the expert planner (P=.020). Preimplantation plans were dosimetrically equivalent to the BT plans; the average prostate V150% was 4% lower for ML plans (P=.002), although the difference was not clinically significant. Respondents ranked the ML-generated plans as equivalent to expert BT treatment plans in terms of target coverage, normal tissue avoidance, implant confidence, and the need for plan modifications. Respondents had difficulty differentiating between plans generated by a human and those generated by the ML algorithm. Conclusions: Prostate LDR preimplantation treatment plans that have equivalent quality to plans created

  5. Multi-objective optimization of horizontal axis tidal current turbines using metaheuristic algorithms

    International Nuclear Information System (INIS)

    Tahani, Mojtaba; Babayan, Narek; Astaraei, Fatemeh Razi; Moghadam, Ali

    2015-01-01

Highlights: • The performance of four different metaheuristic optimization algorithms was studied. • Power coefficient and produced torque on the stationary blade were selected as objective functions. • Chord and twist distributions were selected as decision variables. • All optimization algorithms were combined with blade element momentum theory. • The best Pareto front was obtained by the multi-objective flower pollination algorithm for HATCTs. - Abstract: The performance of horizontal axis tidal current turbines (HATCT) strongly depends on their geometry, so optimum performance is achieved through optimized geometry. In this research study, multi-objective optimization of the HATCT is carried out using four different multi-objective optimization algorithms, and their performance is evaluated in combination with blade element momentum (BEM) theory. The selected algorithms are the second version of the non-dominated sorting genetic algorithm (NSGA-II), multi-objective particle swarm optimization (MOPSO), multi-objective cuckoo search (MOCS) and the multi-objective flower pollination algorithm (MOFPA). The power coefficient and the torque produced on the stationary blade are selected as objective functions, and the chord and twist distributions along the blade span are selected as decision variables. The algorithms are combined with BEM theory for the purpose of achieving the best Pareto front, and the obtained Pareto fronts are compared with each other. Different sets of experiments are carried out considering different numbers of iterations, population sizes and tip speed ratios. The Pareto fronts achieved by MOFPA and NSGA-II have better quality in comparison to MOCS and MOPSO, but a detailed comparison between the first fronts of MOFPA and NSGA-II indicated that the MOFPA algorithm can obtain the best Pareto front and can maximize the power coefficient up to 4.3% and the
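The Pareto fronts being compared above are sets of non-dominated solutions. A minimal sketch of non-dominated filtering for two maximised objectives (e.g. power coefficient and torque); the brute-force O(n²) scan is illustrative, not the filtering used inside any of the named algorithms:

```python
def pareto_front(points):
    """Return the non-dominated subset of `points`, assuming every
    objective is to be maximised. A point is dominated if some other
    point is >= in all objectives and > in at least one."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] >= p[k] for k in range(len(p))) and
            any(q[k] > p[k] for k in range(len(p)))
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front
```

For example, `pareto_front([(1, 5), (2, 4), (3, 3), (2, 2), (1, 1)])` keeps the first three points and discards the two dominated ones.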

  6. An algorithm for the evaluation and treatment of sacroiliac joint dysfunction.

    Science.gov (United States)

    Carlson, Samuel W; Magee, Sean; Carlson, Walter O

    2014-11-01

Approximately 90 percent of adults experience an episode of low back pain (LBP) in their lifetime. Sacroiliac joint (SIJ) dysfunction has been shown to cause approximately 13-30 percent of LBP in the adult population. SIJ fusion is becoming an increasingly popular treatment alternative for SIJ dysfunction. This paper presents a literature-based algorithm to assist the clinician in the evaluation and treatment of patients with suspected SIJ dysfunction.

  7. Development of transmission dose estimation algorithm for in vivo dosimetry in high energy radiation treatment

    International Nuclear Information System (INIS)

    Yun, Hyong Geun; Shin, Kyo Chul; Hun, Soon Nyung; Woo, Hong Gyun; Ha, Sung Whan; Lee, Hyoung Koo

    2004-01-01

In vivo dosimetry is very important for quality assurance in high energy radiation treatment. Measurement of transmission dose is a new method of in vivo dosimetry which is noninvasive and easy to perform daily. The aim of this study was to develop a tumor dose estimation algorithm using measured transmission dose for open radiation fields. For basic beam data, transmission dose was measured for various square field sizes (FS), phantom thicknesses (Tp), and phantom-chamber distances (PCD) with an acrylic phantom for 6 MV and 10 MV X-rays. Source-to-chamber distance (SCD) was set to 150 cm. Measurements were conducted with a 0.6 cc Farmer-type ion chamber. A transmission dose estimation algorithm was developed by regression analysis of the measured basic beam data. The accuracy of the algorithm was tested with a flat solid phantom of various thicknesses in various settings of rectangular fields and various PCDs. In the developed algorithm, transmission dose is modeled as a quadratic function of log(A/P) (where A/P is the area-perimeter ratio), and the coefficients of the quadratic function are in turn modeled as cubic functions of PCD. The developed algorithm could estimate the radiation dose with errors within ±0.5% for open square fields and within ±1.0% for open elongated radiation fields. The developed algorithm could accurately estimate the transmission dose in open radiation fields for various treatment settings of high energy radiation treatment. (author)
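The functional form described, a quadratic in log(A/P) whose three coefficients are each cubic polynomials in PCD, expands into a 12-term model that can be fitted as a single linear least-squares problem. A minimal sketch; the synthetic data and helper names are illustrative, not the authors' measured beam data:

```python
import numpy as np

def design_matrix(logAP, pcd):
    """Columns pcd**m * logAP**n for n in 0..2, m in 0..3: a quadratic in
    log(A/P) whose three coefficients are each cubic polynomials in PCD."""
    logAP, pcd = np.asarray(logAP, float), np.asarray(pcd, float)
    return np.stack([pcd**m * logAP**n for n in range(3) for m in range(4)], axis=1)

def fit(logAP, pcd, dose):
    """Solve for all 12 coefficients in one linear least-squares problem."""
    coeffs, *_ = np.linalg.lstsq(design_matrix(logAP, pcd), dose, rcond=None)
    return coeffs

def predict(coeffs, logAP, pcd):
    """Estimate transmission dose for a new (log(A/P), PCD) setting."""
    return design_matrix(np.atleast_1d(logAP), np.atleast_1d(pcd)) @ coeffs
```

Because the model is linear in its coefficients, the nested "quadratic with cubic coefficients" structure needs no iterative fitting.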

  8. The Impact of a Line Probe Assay Based Diagnostic Algorithm on Time to Treatment Initiation and Treatment Outcomes for Multidrug Resistant TB Patients in Arkhangelsk Region, Russia.

    Science.gov (United States)

    Eliseev, Platon; Balantcev, Grigory; Nikishova, Elena; Gaida, Anastasia; Bogdanova, Elena; Enarson, Donald; Ornstein, Tara; Detjen, Anne; Dacombe, Russell; Gospodarevskaya, Elena; Phillips, Patrick P J; Mann, Gillian; Squire, Stephen Bertel; Mariandyshev, Andrei

    2016-01-01

In the Arkhangelsk region of Northern Russia, multidrug-resistant (MDR) tuberculosis (TB) rates in new cases are amongst the highest in the world. In 2014, MDR-TB rates reached 31.7% among new cases and 56.9% among retreatment cases. The development of new diagnostic tools allows for faster detection of both TB and MDR-TB and should lead to reduced transmission by earlier initiation of anti-TB therapy. The PROVE-IT (Policy Relevant Outcomes from Validating Evidence on Impact) Russia study aimed to assess the impact of implementing line probe assay (LPA) as part of an LPA-based diagnostic algorithm for patients with presumptive MDR-TB, focusing on time from the first care-seeking visit to the initiation of MDR-TB treatment, rather than diagnostic accuracy, as the primary outcome, and to assess treatment outcomes. We hypothesized that the implementation of LPA would result in faster time to treatment initiation and better treatment outcomes. A culture-based diagnostic algorithm used prior to LPA implementation was compared to an LPA-based algorithm that replaced BacTAlert and Löwenstein-Jensen (LJ) culture for drug sensitivity testing. A total of 295 MDR-TB patients were included in the study: 163 diagnosed with the culture-based algorithm and 132 with the LPA-based algorithm. Among smear-positive patients, the implementation of the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 50 and 66 days compared to the culture-based algorithm (BacTAlert and LJ, respectively), and a median decrease in time to MDR-TB treatment initiation of 78 days was also observed when compared to the culture-based algorithm (LJ), together with faster time to MDR diagnosis, earlier treatment initiation, and better treatment outcomes for patients with MDR-TB. These findings also highlight the need for further improvements within the health system to reduce both patient and diagnostic delays to truly optimize the impact of new, rapid diagnostics.

  9. THERAPEUTIC EYELID HYGIENE IN THE ALGORITHMS OF PREVENTION AND TREATMENT OF OCULAR SURFACE DISEASES. PART II

    Directory of Open Access Journals (Sweden)

    V. N. Trubilin

    2016-01-01

The second part of the article focuses on the prevention and treatment of the most common clinical situations in which therapeutic eyelid hygiene is applicable: seborrheic blepharitis; staphylococcal blepharitis; allergic blepharitis; hordeolum (stye) and chalazion; and prevention of keratoconjunctival xerosis (during the preoperative and postoperative period, while wearing contact lenses, in computer vision syndrome, and in remission after acute inflammation of the conjunctiva and cornea). An algorithm for therapeutic eyelid care is given, along with the basic mechanisms of action of this procedure. Until recently, the treatment of dry eye syndrome relied on tear substitute therapy. Ten or fifteen years ago, 2–3 tear substitutes were available on the domestic market; currently there are dozens of different forms of artificial tears, and hundreds on the western pharmaceutical market. The rapid development of new tear substitute forms is no accident: it reflects the increasing number of patients suffering from disorders of tear film stability, which, according to different sources, affect up to 40–60% of the adult population. It should be noted that the primary cause of dry eye syndrome in 85–95% of patients is meibomian gland dysfunction, so tear substitutes are symptomatic therapy that does not solve the problem at the pathogenic level. For this reason, therapeutic eyelid hygiene (warm compresses + self-massage) is an important component of the treatment of this group of patients. Objective evidence of the relevance and effectiveness of therapeutic eyelid care, in our opinion, is the rapid development of the pharmaceutical market in this area: a large number of new gels, lotions, wipes and other eyelid hygiene products appear every year. Clear algorithms that include therapeutic eyelid hygiene (timing, the indications for the use of certain hygiene products) is an actual

  10. Spatial-time-state fusion algorithm for defect detection through eddy current pulsed thermography

    Science.gov (United States)

    Xiao, Xiang; Gao, Bin; Woo, Wai Lok; Tian, Gui Yun; Xiao, Xiao Ting

    2018-05-01

Eddy Current Pulsed Thermography (ECPT) has received extensive attention due to its high sensitivity in detecting surface and subsurface cracks. However, unsupervised detection, in which defects must be identified without any prior knowledge, remains a difficult challenge. This paper presents a spatial-time-state feature fusion algorithm to obtain a full profile of the defects by directional scanning. The proposed method performs feature extraction using independent component analysis (ICA) together with automatic feature selection embedding a genetic algorithm. Finally, the optimal feature of each step is fused to reconstruct the defects by applying the common orthogonal basis extraction (COBE) method. Experiments have been conducted to validate the study and verify the efficacy of the proposed method for blind defect detection.
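The ICA step named above can be illustrated with a numpy-only sketch of deflationary FastICA with the tanh nonlinearity, applied to a matrix whose rows are signals (for ECPT, each row might be a pixel's time profile). This is generic textbook FastICA, not the authors' spatial-time-state pipeline:

```python
import numpy as np

def fastica(X, n_components, n_iter=200, seed=0):
    """Minimal deflationary FastICA (tanh nonlinearity); rows of X are signals."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten via eigendecomposition of the covariance matrix
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(1.0 / np.sqrt(d + 1e-12)) @ E.T) @ X
    W = np.zeros((n_components, Z.shape[0]))
    for i in range(n_components):
        w = rng.normal(size=Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(Z.T @ w)
            # Fixed-point update: E{Z g(w'Z)} - E{g'(w'Z)} w
            w_new = (Z @ g) / Z.shape[1] - (1.0 - g**2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)   # deflation (Gram-Schmidt)
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-8
            w = w_new
            if converged:
                break
        W[i] = w
    return W @ Z      # estimated independent components
```

Recovered components are unordered and sign-ambiguous, which is why downstream steps (here, feature selection and COBE fusion) are needed to pick out defect-related components.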

  11. Contemporary Management of Mandibular Fracture Nonunion-A Retrospective Review and Treatment Algorithm.

    Science.gov (United States)

    Ostrander, Benjamin T; Wang, Howard D; Cusano, Alessandro; Manson, Paul N; Nam, Arthur J; Dorafshar, Amir H

    2018-02-06

    Nonunion is an uncommon complication after mandibular fractures. The purpose of this investigation was to compare outcomes of patients with mandibular fracture nonunion who were treated with a 1- versus 2-stage approach and propose a pragmatic treatment algorithm for surgical management based on preoperative characteristics. The authors conducted a retrospective study consisting of patients who presented to 2 level 1 trauma centers for the management of mandibular fracture nonunion over a 10-year period. The primary predictor variable was 1- versus 2-stage treatment. Outcomes were examined to propose a treatment algorithm. Eighteen patients were included in the study. The sample's mean age was 44.0 ± 19.3 years and most were men (88.9%). Mandibular angle and body accounted for 77.8% of cases. A single-stage approach was used in 13 patients (72.2%). Bone grafts or vascularized bone flaps were required in 13 patients (72.2%). Patients who required 2-stage treatments had intraoral soft tissue defects. Mean length of follow-up was 13.3 ± 20.4 months. All patients achieved bony union, with complications occurring in 5 patients (27.8%). The authors' 10-year experience was used to formulate a treatment algorithm based on bony defect size and soft tissue status, which can be used to inform optimal surgical management. Nonunion of mandibular fractures is an infrequent and complex condition requiring careful and deliberate surgical management. A single-stage approach is appropriate in most cases and does not negatively affect outcomes. Bony defect size and soft tissue status are essential parameters for determining the approach and timing of reconstruction. Copyright © 2018 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
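The decision logic reported above (intraoral soft-tissue status drives 1- versus 2-stage treatment; bony defect size drives the choice of reconstruction) can be sketched as a small function. The 5 cm flap threshold is a hypothetical placeholder for illustration, not a value from the paper:

```python
def nonunion_approach(intraoral_soft_tissue_defect, bony_defect_cm):
    """Staging and reconstruction choice from the abstract's two parameters.
    The 5 cm vascularized-flap threshold is a hypothetical placeholder."""
    stages = 2 if intraoral_soft_tissue_defect else 1
    if bony_defect_cm >= 5.0:
        graft = "vascularized bone flap"
    elif bony_defect_cm > 0.0:
        graft = "bone graft"
    else:
        graft = "none"
    return stages, graft
```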

  12. Study on hybrid multi-objective optimization algorithm for inverse treatment planning of radiation therapy

    International Nuclear Information System (INIS)

    Li Guoli; Song Gang; Wu Yican

    2007-01-01

Inverse treatment planning for radiation therapy is a multi-objective optimization process. A hybrid multi-objective optimization algorithm is studied that combines simulated annealing (SA) and a genetic algorithm (GA). Test functions are used to analyze the efficiency of the algorithms. The hybrid multi-objective SA algorithm, whose displacement step is based on the evolutionary operators of the GA (crossover and mutation), is implemented in inverse planning of external beam radiation therapy using two kinds of objective functions, namely an average-dose-distribution-based objective and a hybrid dose-volume-constraints-based objective. The test calculations demonstrate that excellent convergence speed can be achieved. (authors)
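The hybrid move described, simulated annealing whose displacement uses GA-style crossover and mutation, can be sketched on a scalarised objective. The population handling and acceptance details below are illustrative assumptions, not the authors' implementation:

```python
import math, random

def hybrid_sa(objective, pop, t0=1.0, cooling=0.995, n_iter=3000, sigma=0.1, seed=0):
    """Simulated annealing whose displacement step is a GA-style move:
    uniform crossover of two population members followed by Gaussian
    mutation; the child replaces the worse parent under Metropolis."""
    rng = random.Random(seed)
    pop = [list(p) for p in pop]
    costs = [objective(p) for p in pop]
    t = t0
    for _ in range(n_iter):
        i, j = rng.sample(range(len(pop)), 2)
        child = [pop[i][k] if rng.random() < 0.5 else pop[j][k]
                 for k in range(len(pop[i]))]                      # crossover
        child = [x + rng.gauss(0.0, sigma) for x in child]         # mutation
        c = objective(child)
        worse = i if costs[i] >= costs[j] else j
        if c < costs[worse] or rng.random() < math.exp(-(c - costs[worse]) / t):
            pop[worse], costs[worse] = child, c                    # Metropolis
        t *= cooling
    best = min(range(len(pop)), key=costs.__getitem__)
    return pop[best], costs[best]
```

In a planning context the decision vector would be beamlet weights and the objective a weighted dose or dose-volume penalty; here any scalar cost function works.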

  13. Current review and a simplified "five-point management algorithm" for keratoconus

    Directory of Open Access Journals (Sweden)

    Rohit Shetty

    2015-01-01

Keratoconus is a slowly progressive, noninflammatory ectatic corneal disease characterized by changes in corneal collagen structure and organization. Though the etiology remains unknown, novel techniques are continuously emerging for the diagnosis and management of the disease. Demographic parameters are known to affect the rate of progression of the disease. Common methods of vision correction for keratoconus range from spectacles and rigid gas-permeable contact lenses to other specialized lenses such as piggyback, Rose K or Boston scleral lenses. Corneal collagen cross-linking is effective in stabilizing the progression of the disease. Intracorneal ring segments can improve vision by flattening the cornea in patients with mild to moderate keratoconus. Topography-guided custom ablation treatment improves the quality of vision by correcting the refractive error and improving the contact lens fit. In advanced keratoconus with corneal scarring, lamellar or full-thickness penetrating keratoplasty is the treatment of choice. With such a wide spectrum of alternatives available, it is necessary to choose the best possible treatment option for each patient. Based on a brief review of the literature and our own studies, we have designed a five-point management algorithm for the treatment of keratoconus.

  14. Validation of an algorithm-based definition of treatment resistance in patients with schizophrenia.

    Science.gov (United States)

    Ajnakina, Olesya; Horsdal, Henriette Thisted; Lally, John; MacCabe, James H; Murray, Robin M; Gasse, Christiane; Wimberley, Theresa

    2018-02-19

Large-scale pharmacoepidemiological research on treatment resistance relies on accurate identification of people with treatment-resistant schizophrenia (TRS) based on data that are retrievable from administrative registers. This is usually approached by operationalising clinical treatment guidelines using prescription and hospital admission information. We examined the accuracy of an algorithm-based definition of TRS based on clozapine prescription and/or meeting algorithm-based eligibility criteria for clozapine against a gold standard definition using case notes. We additionally validated a definition based entirely on clozapine prescription. 139 schizophrenia patients aged 18–65 years were followed for a mean of 5 years after first presentation to psychiatric services in south London, UK. The diagnostic accuracy of the algorithm-based measure against the gold standard was measured with sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). A total of 45 (32.4%) schizophrenia patients met the criteria for the gold standard definition of TRS; applying the algorithm-based definition to the same cohort led to 44 (31.7%) patients fulfilling criteria for TRS, with sensitivity, specificity, PPV and NPV of 62.2%, 83.0%, 63.6% and 82.1%, respectively. The definition based on lifetime clozapine prescription had sensitivity, specificity, PPV and NPV of 40.0%, 94.7%, 78.3% and 76.7%, respectively. Although a perfect definition of TRS cannot be derived from available prescription and hospital registers, these results indicate that researchers can confidently use registries to identify individuals with TRS for research and clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.
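The four accuracy figures reported for the algorithm-based definition follow from a 2x2 confusion matrix. The counts below (TP=28, FP=16, FN=17, TN=78) are our inference from the reported totals (45 gold-standard TRS cases, 44 algorithm-positive, cohort of 139) and are consistent with all four reported percentages:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives / all gold-positive
        "specificity": tn / (tn + fp),   # true negatives / all gold-negative
        "ppv": tp / (tp + fp),           # precision of a positive call
        "npv": tn / (tn + fn),           # precision of a negative call
    }
```

With the inferred counts this reproduces 62.2%, 83.0%, 63.6% and 82.1% exactly.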

  15. SU-E-J-218: Evaluation of CT Images Created Using a New Metal Artifact Reduction Reconstruction Algorithm for Radiation Therapy Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Niemkiewicz, J; Palmiotti, A; Miner, M; Stunja, L; Bergene, J [Lehigh Valley Health Network, Allentown, PA (United States)

    2014-06-01

Purpose: Metal in patients creates streak artifacts in CT images. When used for radiation treatment planning, these artifacts make it difficult to identify internal structures and affect radiation dose calculations, which depend on HU numbers for inhomogeneity correction. This work quantitatively evaluates a new metal artifact reduction (MAR) CT image reconstruction algorithm (GE Healthcare CT-0521-04.13-EN-US DOC1381483) when metal is present. Methods: A Gammex Model 467 Tissue Characterization phantom was used. CT images were taken of this phantom on a GE Optima580RT CT scanner with and without steel and titanium plugs using both the standard and MAR reconstruction algorithms. HU values were compared pixel by pixel to determine if the MAR algorithm altered the HUs of normal tissues when no metal is present, and to evaluate the effect of using the MAR algorithm when metal is present. Also, CT images of patients with internal metal objects using standard and MAR reconstruction algorithms were compared. Results: Comparing the standard and MAR reconstructed images of the phantom without metal, 95.0% of pixels were within ±35 HU and 98.0% of pixels were within ±85 HU. Also, the MAR reconstruction algorithm showed significant improvement in maintaining HUs of non-metallic regions in the images taken of the phantom with metal. HU Gamma analysis (2%, 2mm) of metal vs. non-metal phantom imaging using standard reconstruction resulted in an 84.8% pass rate compared to 96.6% for the MAR reconstructed images. CT images of patients with metal show significant artifact reduction when reconstructed with the MAR algorithm. Conclusion: CT imaging using the MAR reconstruction algorithm provides improved visualization of internal anatomy and more accurate HUs when metal is present compared to the standard reconstruction algorithm. MAR reconstructed CT images provide qualitative and quantitative improvements over current reconstruction algorithms, thus improving radiation
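The pixel-by-pixel HU comparison reported above ("95.0% of pixels within ±35 HU") reduces to counting pixels whose difference falls inside a tolerance. A minimal numpy sketch; the function name and toy arrays are illustrative:

```python
import numpy as np

def fraction_within(hu_a, hu_b, tol):
    """Fraction of pixels whose HU difference between two reconstructions
    of the same scan lies within ±tol."""
    diff = np.abs(np.asarray(hu_a, float) - np.asarray(hu_b, float))
    return float((diff <= tol).mean())
```

Running it at both tolerances used in the study (35 and 85 HU) on a pair of registered reconstructions gives the two reported percentages directly.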

  16. SU-E-J-218: Evaluation of CT Images Created Using a New Metal Artifact Reduction Reconstruction Algorithm for Radiation Therapy Treatment Planning

    International Nuclear Information System (INIS)

    Niemkiewicz, J; Palmiotti, A; Miner, M; Stunja, L; Bergene, J

    2014-01-01

Purpose: Metal in patients creates streak artifacts in CT images. When used for radiation treatment planning, these artifacts make it difficult to identify internal structures and affect radiation dose calculations, which depend on HU numbers for inhomogeneity correction. This work quantitatively evaluates a new metal artifact reduction (MAR) CT image reconstruction algorithm (GE Healthcare CT-0521-04.13-EN-US DOC1381483) when metal is present. Methods: A Gammex Model 467 Tissue Characterization phantom was used. CT images were taken of this phantom on a GE Optima580RT CT scanner with and without steel and titanium plugs using both the standard and MAR reconstruction algorithms. HU values were compared pixel by pixel to determine if the MAR algorithm altered the HUs of normal tissues when no metal is present, and to evaluate the effect of using the MAR algorithm when metal is present. Also, CT images of patients with internal metal objects using standard and MAR reconstruction algorithms were compared. Results: Comparing the standard and MAR reconstructed images of the phantom without metal, 95.0% of pixels were within ±35 HU and 98.0% of pixels were within ±85 HU. Also, the MAR reconstruction algorithm showed significant improvement in maintaining HUs of non-metallic regions in the images taken of the phantom with metal. HU Gamma analysis (2%, 2mm) of metal vs. non-metal phantom imaging using standard reconstruction resulted in an 84.8% pass rate compared to 96.6% for the MAR reconstructed images. CT images of patients with metal show significant artifact reduction when reconstructed with the MAR algorithm. Conclusion: CT imaging using the MAR reconstruction algorithm provides improved visualization of internal anatomy and more accurate HUs when metal is present compared to the standard reconstruction algorithm. MAR reconstructed CT images provide qualitative and quantitative improvements over current reconstruction algorithms, thus improving radiation

  17. Evaluation of a treatment-based classification algorithm for low back pain: a cross-sectional study.

    Science.gov (United States)

    Stanton, Tasha R; Fritz, Julie M; Hancock, Mark J; Latimer, Jane; Maher, Christopher G; Wand, Benedict M; Parent, Eric C

    2011-04-01

    Several studies have investigated criteria for classifying patients with low back pain (LBP) into treatment-based subgroups. A comprehensive algorithm was created to translate these criteria into a clinical decision-making guide. This study investigated the translation of the individual subgroup criteria into a comprehensive algorithm by studying the prevalence of patients meeting the criteria for each treatment subgroup and the reliability of the classification. This was a cross-sectional, observational study. Two hundred fifty patients with acute or subacute LBP were recruited from the United States and Australia to participate in the study. Trained physical therapists performed standardized assessments on all participants. The researchers used these findings to classify participants into subgroups. Thirty-one participants were reassessed to determine interrater reliability of the algorithm decision. Based on individual subgroup criteria, 25.2% (95% confidence interval [CI]=19.8%-30.6%) of the participants did not meet the criteria for any subgroup, 49.6% (95% CI=43.4%-55.8%) of the participants met the criteria for only one subgroup, and 25.2% (95% CI=19.8%-30.6%) of the participants met the criteria for more than one subgroup. The most common combination of subgroups was manipulation + specific exercise (68.4% of the participants who met the criteria for 2 subgroups). Reliability of the algorithm decision was moderate (kappa=0.52, 95% CI=0.27-0.77, percentage of agreement=67%). Due to a relatively small patient sample, reliability estimates are somewhat imprecise. These findings provide important clinical data to guide future research and revisions to the algorithm. The finding that 25% of the participants met the criteria for more than one subgroup has important implications for the sequencing of treatments in the algorithm. Likewise, the finding that 25% of the participants did not meet the criteria for any subgroup provides important information regarding
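Interrater reliability of the algorithm decision was reported as Cohen's kappa (0.52, with 67% raw agreement). A minimal sketch of the statistic from two raters' category assignments; the data in the usage check are toy values, not the study's ratings:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical classifications:
    observed agreement corrected for chance agreement."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n            # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2.get(k, 0) for k in c1) / n**2       # chance agreement
    return (po - pe) / (1 - pe)
```

Note how 67% raw agreement can still yield only moderate kappa once chance agreement across the subgroup categories is subtracted.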

  18. TREATMENT FOR ACUTE/SUBACUTE MUSCULOSKELETAL PAIN BY USING AN ALGORITHM FOR STEPWISE CHOICE OF ANALGESIC DRUGS AND FOR MONITORING THEIR EFFICACY: PRELIMINARY DATA OF THE ANALGESIC TREATMENT USING SYSTEMIC ALGORITHM (ATUSA) PROGRAM

    Directory of Open Access Journals (Sweden)

    N. V. Gontarenko

    2016-01-01

To optimize treatment for musculoskeletal pain (MSP) is a topical medical and social problem. A meeting of experts was held in Moscow in June 2015 to discuss the possibility of forming an interdisciplinary approach and elaborating a unified MSP treatment algorithm based on the comprehensive, pathogenetically justified use of different classes of medicines. The Analgesic Treatment Using a Systemic Algorithm (ATUSA) trial is a retrospective observational study of the effectiveness of this approach in clinical practice. Objective: to investigate the efficiency of combination treatment for MSP in real clinical practice. Patients and methods. The study group consisted of 3304 patients (54.3% women and 45.7% men; mean age 48.9±14.6 years) with osteoarthritis, nonspecific back pain, or rheumatic juxta-articular soft tissue pathology who had visited their doctors for acute/subacute MSP. Treatment was performed in accordance with the following algorithm: the first appointment was a nonsteroidal anti-inflammatory drug (NSAID) such as aceclofenac; in case of contraindications, paracetamol and/or tramadol + a topical NSAID; and, when indicated, muscle relaxants. Therapeutic efficiency was monitored every 7 days (a total of 4 visits); at each visit, therapy could be changed: switching to another NSAID, local administration of glucocorticoids (GC), or adding antidepressants or anticonvulsants. The dynamics of pain (a 0–10 pain intensity numeric rating scale), the number of patients in whom MSP had resolved completely, and treatment satisfaction were taken into account to assess the results of treatment. Results. The first appointment in 97.5% of the patients was an NSAID, mainly aceclofenac (93.7%), which was used in combination with a muscle relaxant in 67.7%. By Visit 4, MSP had decreased from 6.9±1.5 to 2.2±1.3 scores. MSP was completely resolved in 77.0% of the patients. The vast majority (88.4%) of the patients rated their
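The first-visit step of the algorithm described above can be sketched as a small decision function. This is a deliberate simplification for illustration: drug names come from the abstract, but reducing clinical indications and contraindications to booleans is our assumption:

```python
def first_step(nsaid_contraindicated, muscle_relaxant_indicated):
    """First-visit prescription per the ATUSA stepwise algorithm (sketch).
    Clinical judgments are reduced to booleans for illustration only."""
    if nsaid_contraindicated:
        drugs = ["paracetamol and/or tramadol", "topical NSAID"]
    else:
        drugs = ["aceclofenac (oral NSAID)"]
    if muscle_relaxant_indicated:
        drugs.append("muscle relaxant")
    return drugs
```

Subsequent visits (NSAID switch, local GC, antidepressants or anticonvulsants) would extend the same decision structure with the week-by-week pain score as input.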

  19. An improved fast and elitist multi-objective genetic algorithm-ANSGA-II for multi-objective optimization of inverse radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Cao Ruifen; Li Guoli; Song Gang; Zhao Pan; Lin Hui; Wu Aidong; Huang Chenyu; Wu Yican

    2007-01-01

    Objective: To provide a fast and effective multi-objective optimization algorithm for an inverse radiotherapy treatment planning system. Methods: The Non-dominated Sorting Genetic Algorithm NSGA-II is a representative multi-objective evolutionary optimization algorithm that outperforms the others. The paper proposes ANSGA-II, which retains the advantages of NSGA-II and uses adaptive crossover and mutation to improve its flexibility; according to the characteristics of inverse radiotherapy treatment planning, prior knowledge is used to generate the individuals of each generation during optimization, which speeds convergence and improves efficiency. Results: An example optimizing the average dose on a CT slice, including PTV, OAR and NT, shows that the algorithm can find satisfactory solutions within several minutes. Conclusions: The algorithm offers a clinical inverse radiotherapy treatment planning system an additional choice of optimization algorithm. (authors)
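    The fast non-dominated sorting at the core of NSGA-II, together with one possible adaptive mutation schedule, can be sketched as follows (the paper's exact adaptation rule is not given, so the schedule below is an illustrative assumption):

    ```python
    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (minimisation)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated_fronts(objs):
        """NSGA-II fast non-dominated sorting: split a population
        (list of objective vectors) into successive Pareto fronts."""
        n = len(objs)
        dominated_by = [[] for _ in range(n)]  # indices dominated by i
        counts = [0] * n                       # how many solutions dominate i
        fronts = [[]]
        for i in range(n):
            for j in range(n):
                if dominates(objs[i], objs[j]):
                    dominated_by[i].append(j)
                elif dominates(objs[j], objs[i]):
                    counts[i] += 1
            if counts[i] == 0:
                fronts[0].append(i)
        k = 0
        while fronts[k]:
            nxt = []
            for i in fronts[k]:
                for j in dominated_by[i]:
                    counts[j] -= 1
                    if counts[j] == 0:
                        nxt.append(j)
            k += 1
            fronts.append(nxt)
        return fronts[:-1]

    def adaptive_mutation_rate(gen, max_gen, p_max=0.2, p_min=0.01):
        """One simple 'adaptive' schedule: mutate aggressively early in the
        run, conservatively late (an assumption, not the paper's rule)."""
        return p_max - (p_max - p_min) * gen / max_gen
    ```

    In a planning context each objective vector would hold, for example, PTV dose deviation and OAR dose; rank-0 solutions form the trade-off surface offered to the planner.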

  20. ALGORITHM FOR TREATMENT OF PATIENTS WITH MESIAL OCCLUSION USING PROPRIETARY ORTHODONTIC DEVICE.

    Science.gov (United States)

    Flis, P; Filonenko, V; Doroshenko, N

    2017-10-01

    Early elimination of orthodontic disorders of the dentoalveolar apparatus is the dominant concept in treatment. The aim was to present a treatment algorithm for sagittal anomalies, Class III in particular, in the transitional bite period using the proposed design of an individual orthodontic device with a movable ramp. The treatment algorithm consisted of several blocks: motivation, establishment of etiological factors, creation of the plan and treatment tactics based on careful diagnosis, the stages of the active period of treatment, and patient management in the retention period. Anthropometric measurements of maxillary and mandibular models were performed to determine the degree of dental arch development. The length of the dental arches was determined on the models by the Nance method in combination with the Huckaba method, and the sagittal dimensions by Mirgasizov's method. The leading role in the patients' examination was given to lateral cephalogram analysis using the Sassouni Plus method. The proposed orthodontic appliance consists of a plastic base, a vestibular arch, retaining clasps and a ramp, which is connected to the base by two torsion springs. To demonstrate the effectiveness of the proposed construction, an example of the treatment of patient Y. at the age of 6 years 9 months is presented. After treatment, positive morphological, functional and aesthetic changes were established. The proposed orthodontic appliance with a movable ramp allows orthodontic treatment to be started at an early age, increases its effectiveness, and reduces the number of complications. The expediency of stage-by-stage treatment is confirmed by the positive results of this method. To achieve stable results, it is important to individualize their prognosis as early as the planning stage of orthodontic treatment.

  1. Alzheimer’s Disease: Background, Current and Future Treatments

    OpenAIRE

    Evelyn Chou

    2014-01-01

    Alzheimer’s disease is a currently incurable neurodegenerative disorder, and its treatment poses a major challenge. Proposed causes of Alzheimer’s disease include the cholinergic, amyloid and tau hypotheses. Current therapies aim to correct neurotransmitter imbalance; these include cholinesterase inhibitors and N-methyl-D-aspartate receptor antagonists. However, current therapeutics have been unable to halt disease progression. The future of Alzheimer’s disease tre...

  2. Treatment Algorithm for the Hypertension Specialist after the Milan Meeting in 2007

    Czech Academy of Sciences Publication Activity Database

    Peleška, Jan; Anger, Z.; Buchtela, David; Tomečková, Marie; Veselý, Arnošt; Zvárová, Jana

    2007-01-01

    Roč. 30 (2007), s. 374-374 ISSN 1420-4096. [Central European Meeting on Hypertension and Cardiovascular Disease Prevention. 11.10.2007-13.10.2007, Kraków] R&D Projects: GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : hypertension specialist * treatment algorithm for the hypertension * treatment after the Milan Meeting in 2007 Subject RIV: FA - Cardiovascular Diseases incl. Cardiothoracic Surgery

  3. The International College of Neuro-Psychopharmacology (CINP) Treatment Guidelines for Bipolar Disorder in Adults (CINP-BD-2017), Part 2: Review, Grading of the Evidence, and a Precise Algorithm

    Science.gov (United States)

    Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Young, Allan; Blier, Pierre; Kasper, Siegfried; Moeller, Hans Jurgen

    2017-01-01

    Abstract Background: The current paper includes a systematic search of the literature, a detailed presentation of the results, and a grading of treatment options in terms of efficacy and tolerability/safety. Material and Methods: The PRISMA method was used in the literature search with the combination of the words ‘bipolar,’ ‘manic,’ ‘mania,’ ‘manic depression,’ and ‘manic depressive’ with ‘randomized,’ and ‘algorithms’ with ‘mania,’ ‘manic,’ ‘bipolar,’ ‘manic-depressive,’ or ‘manic depression.’ Relevant web pages and review articles were also reviewed. Results: The current report is based on the analysis of 57 guideline papers and 531 published papers related to RCTs, reviews, posthoc, or meta-analysis papers to March 25, 2016. The specific treatment options for acute mania, mixed episodes, acute bipolar depression, maintenance phase, psychotic and mixed features, anxiety, and rapid cycling were evaluated with regard to efficacy. Existing treatment guidelines were also reviewed. Finally, tables reflecting efficacy and recommendation levels were created that led to the development of a precise algorithm that still has to prove its feasibility in everyday clinical practice. Conclusions: A systematic literature search was conducted on the pharmacological treatment of bipolar disorder to identify all relevant randomized controlled trials pertaining to all aspects of bipolar disorder, and the data were graded according to a predetermined method to develop a precise treatment algorithm for management of the various phases of bipolar disorder. It is important to note that some of the recommendations in the treatment algorithm were based on secondary outcome data from posthoc analyses. PMID:27816941

  4. Clinical implications of the anisotropic analytical algorithm for IMRT treatment planning and verification

    International Nuclear Information System (INIS)

    Bragg, Christopher M.; Wingate, Katrina; Conway, John

    2008-01-01

    Purpose: To determine the implications of the use of the Anisotropic Analytical Algorithm (AAA) for the production and dosimetric verification of IMRT plans for treatments of the prostate, parotid, nasopharynx and lung. Methods: 72 IMRT treatment plans produced using the Pencil Beam Convolution (PBC) algorithm were recalculated using the AAA and the dose distributions compared. Twenty-four of the plans were delivered to inhomogeneous phantoms and verification measurements made using a pinpoint ionisation chamber. The agreement between the AAA and measurement was determined. Results: Small differences were seen in the prostate plans, with the AAA predicting slightly lower minimum PTV doses. In the parotid plans, there were small increases in the lens and contralateral parotid doses, while the nasopharyngeal plans revealed a reduction in the volume of the PTV covered by the 95% isodose (the V95%) when the AAA was used. Large changes were seen in the lung plans, the AAA predicting reductions in the minimum PTV dose and large reductions in the V95%. The AAA also predicted small increases in the mean dose to the normal lung and the V20. In the verification measurements, all AAA calculations were within 3% or 3.5 mm distance to agreement of the measured doses. Conclusions: The AAA should be used in preference to the PBC algorithm for treatments involving low-density tissue, but this may necessitate re-evaluation of plan acceptability criteria. Improvements to the Multi-Resolution Dose Calculation algorithm used in the inverse planning are required to reduce the convergence error in the presence of lung tissue. There was excellent agreement between the AAA and verification measurements for all sites

  5. Upper cervical injuries: Clinical results using a new treatment algorithm

    Directory of Open Access Journals (Sweden)

    Andrei F Joaquim

    2015-01-01

    Full Text Available Introduction: Upper cervical injuries (UCI) have a wide range of radiological and clinical presentations due to the unique, complex bony, ligamentous and vascular anatomy. We recently proposed a rational approach in an attempt to unify prior classification systems and guide treatment. In this paper, we evaluate the clinical results of our algorithm for UCI treatment. Materials and Methods: A prospective cohort series of patients with UCI was performed. The primary outcome was the ASIA Impairment Scale (AIS) grade. Surgical treatment was proposed based on our protocol: ligamentous injuries (abnormal alignment, perched or locked facets, increased atlanto-dens interval) were treated surgically. Bone fractures without ligamentous injuries were treated with a rigid cervical orthosis, with the exception of fractures at the base of the dens with risk factors for non-union. Results: Twenty-three patients initially treated conservatively had some follow-up (mean of 171 days, range 60 to 436 days). All of them were neurologically intact. None of the patients developed a new neurological deficit. Fifteen patients were initially treated surgically (mean follow-up of 140 days, range 60 to 270 days). In the surgical group, preoperatively, 11 (73.3%) patients were AIS E, 2 (13.3%) AIS C and 2 (13.3%) AIS D. At the final follow-up, the American Spinal Injury Association (ASIA) grades were: 13 (86.6%) AIS E and 2 (13.3%) AIS D. None of the patients had neurological worsening during follow-up. Conclusions: This prospective cohort suggests that our UCI treatment algorithm can be safely used. Further prospective studies with longer follow-up are necessary to establish its clinical validity and safety.

  6. [Surgical treatment of gynecomastia: an algorithm].

    Science.gov (United States)

    Wolter, A; Scholz, T; Diedrichson, J; Liebau, J

    2013-04-01

    Gynecomastia is a persistent benign uni- or bilateral enlargement of the male breast ranging from small to excessive findings with marked skin redundancy. In this paper we introduce an algorithm to facilitate the selection of the appropriate surgical technique according to the presented morphological aspects. The records of 118 patients (217 breasts) with gynecomastia from 01/2009 to 08/2012 were retrospectively reviewed. The authors conducted three different surgical techniques depending on four severity grades. The outcome parameters complication rate, patient satisfaction with the aesthetic result, nipple sensitivity and the need to re-operate were observed and related to the employed technique. In 167 (77%) breasts with moderate breast enlargement without skin redundancy (Grade I-IIa by Simon's classification) a subcutaneous semicircular periareolar mastectomy was performed in combination with water-jet assisted liposuction. In 40 (18%) breasts with skin redundancy (Grade IIb) a circumferential mastopexy was performed additionally. An inferior pedicled mammaplasty was used in 10 (5%) severe cases (Grade III). Complication rate was 4.1%. Surgical corrections were necessary in 17 breasts (7.8%). The patient survey revealed a high satisfaction level: 88% of the patients rated the aesthetic results as "very good" or "good", nipple sensitivity was rated as "very good" or "good" by 83%. Surgical treatment of gynecomastia should ensure minimal scarring while respecting the aesthetic unit. The selection of the appropriate surgical method depends on the severity grade, the presence of skin redundancy and the volume of the male breast glandular tissue. The presented algorithm rarely leads to complications, is simple to perform and shows a high satisfaction rate and a preservation of the nipple sensitivity. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Current Treatment of Chronic Lymphocytic Leukemia.

    Science.gov (United States)

    Jamroziak, Krzysztof; Puła, Bartosz; Walewski, Jan

    2017-01-01

    A number of new treatment options have recently emerged for chronic lymphocytic leukemia (CLL) patients, including the Bruton's tyrosine kinase (BTK) inhibitor ibrutinib, the phosphatidylinositol-3-kinase (PI3K) delta isoform inhibitor idelalisib combined with rituximab, the Bcl-2 antagonist venetoclax, and the new anti-CD20 antibodies obinutuzumab and ofatumumab. Most of these agents are already included in treatment algorithms defined by international practice guidelines, but more clinical investigations are needed to answer still remaining questions. Ibrutinib was proven as a primary choice for patients with the TP53 gene deletion/mutation, who otherwise have no active treatment available. Idelalisib with rituximab is also an active therapy, but due to increased risk of serious infections, its use in first-line treatment is limited to patients for whom ibrutinib is not an option. A new indication for ibrutinib was recently approved for older patients with comorbidities, as an alternative to the already existing indication for chlorambucil with obinutuzumab. The use of kinase inhibitors is already well established in recurrent/refractory disease. Immunochemotherapy with fludarabine, cyclophosphamide, and rituximab (FCR) remains a major first-line option for many CLL patients who are without the TP53 gene deletion/mutation and have no significant comorbidities or history of infections, and it is particularly effective in patients with favorable features including mutated IGHV status. There are a number of issues regarding novel therapies for CLL that need further investigation, such as the optimum duration of treatment with kinase inhibitors, appropriate sequencing of novel agents, mechanisms of resistance to inhibitors and response to class switching after treatment failure, along with the potential role of combinations of targeted agents.

  8. Quantification of the influence of the choice of the algorithm and planning system on the calculation of a treatment plan

    International Nuclear Information System (INIS)

    Moral, F. del; Ramos, A.; Salgado, M.; Andrade, B; Munoz, V.

    2010-01-01

    In this work, an analysis of the influence of the choice of algorithm or planning system on the calculation of the same treatment plan is presented. For this purpose, specific software has been developed for comparing plans for a series of IMRT cases of prostate and head-and-neck cancer calculated using the convolution, superposition and fast superposition algorithms implemented in the XiO 4.40 planning system (CMS). It has also been used to compare the same treatment plan for lung pathology calculated in XiO with the mentioned algorithms and calculated in the Plan 4.1 planning system (Brainlab) using its pencil beam algorithm. Differences in dose among the treatment plans have been quantified using a set of metrics. The recommendation that the dosimetrist choose the algorithm carefully has been numerically confirmed. (Author).
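    The abstract does not name its metric set; as an illustration only, a minimal pair of generic plan-comparison metrics (per-voxel dose difference relative to the prescription, and coverage at a dose threshold such as the V95%) might look like:

    ```python
    def dose_metrics(plan_a, plan_b, prescription):
        """Compare two dose grids given as flat lists over the same voxel
        geometry. Hypothetical generic metrics, not the paper's actual set:
        returns (mean, max) voxel difference as a percent of prescription."""
        diffs = [a - b for a, b in zip(plan_a, plan_b)]
        mean_diff_pct = 100.0 * sum(diffs) / (len(diffs) * prescription)
        max_diff_pct = 100.0 * max(abs(d) for d in diffs) / prescription
        return mean_diff_pct, max_diff_pct

    def coverage(plan, mask, threshold):
        """Fraction of voxels inside a structure mask receiving at least
        `threshold` dose (the quantity behind DVH points like V95%)."""
        inside = [d for d, m in zip(plan, mask) if m]
        return sum(1 for d in inside if d >= threshold) / len(inside)
    ```

    Metrics of this kind make the algorithm-to-algorithm differences the paper describes directly comparable across plans and anatomical sites.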

  9. Parallel algorithms

    CERN Document Server

    Casanova, Henri; Robert, Yves

    2008-01-01

    ""…The authors of the present book, who have extensive credentials in both research and instruction in the area of parallelism, present a sound, principled treatment of parallel algorithms. … This book is very well written and extremely well designed from an instructional point of view. … The authors have created an instructive and fascinating text. The book will serve researchers as well as instructors who need a solid, readable text for a course on parallelism in computing. Indeed, for anyone who wants an understandable text from which to acquire a current, rigorous, and broad vi

  10. Current and emerging treatment options for myopic choroidal neovascularization

    Directory of Open Access Journals (Sweden)

    El Matri L

    2015-04-01

    Full Text Available Leila El Matri, Ahmed Chebil, Fedra Kort Department B of Ophthalmology, Hedi Rais Institute of Ophthalmology, Faculty of Medicine of Tunis, University of El Manar, Tunis, Tunisia Abstract: Choroidal neovascularization (CNV) is the main cause of visual impairment in highly myopic patients younger than 50 years of age. There are different treatments for myopic CNV (mCNV), with 5- to 10-year outcomes now available. Chorioretinal atrophy is still the most important determinant of visual outcome. The purpose of this study is to provide an overview of the current treatments for mCNV, including laser, surgical management, verteporfin photodynamic therapy, and mainly anti-vascular endothelial growth factor therapy. Emerging treatment options are also discussed. Keywords: myopia, choroidal neovascularization, current treatment, emerging treatment

  11. Unified treatment algorithm for the management of crotaline snakebite in the United States: results of an evidence-informed consensus workshop

    Directory of Open Access Journals (Sweden)

    Kerns William P

    2011-02-01

    Full Text Available Abstract Background Envenomation by crotaline snakes (rattlesnake, cottonmouth, copperhead) is a complex, potentially lethal condition affecting thousands of people in the United States each year. Treatment of crotaline envenomation is not standardized, and significant variation in practice exists. Methods A geographically diverse panel of experts was convened for the purpose of deriving an evidence-informed unified treatment algorithm. Research staff analyzed the extant medical literature and performed targeted analyses of existing databases to inform specific clinical decisions. A trained external facilitator used modified Delphi and structured consensus methodology to achieve consensus on the final treatment algorithm. Results A unified treatment algorithm was produced and endorsed by all nine expert panel members. This algorithm provides guidance about clinical and laboratory observations, indications for and dosing of antivenom, adjunctive therapies, post-stabilization care, and management of complications from envenomation and therapy. Conclusions Clinical manifestations and ideal treatment of crotaline snakebite differ greatly and can result in severe complications. Using a modified Delphi method, we provide evidence-informed treatment guidelines in an attempt to reduce variation in care and possibly improve clinical outcomes.

  12. Current options for the treatment of pathological scarring.

    Science.gov (United States)

    Poetschke, Julian; Gauglitz, Gerd G

    2016-05-01

    Scarring is the consequence of surgery, trauma or various skin diseases. Apart from fresh, immature scars, which transform into mature scars over the course of wound healing and do not require further treatment, there are linear hypertrophic scars, widespread hypertrophic scars, keloids and atrophic scars. Symptoms like pruritus and pain, stigmatization, as well as functional and aesthetic impairments that are very disturbing for the affected patients, can be the basis for the desire for treatment. Today, a multitude of options for the treatment and prevention of scars exists. Topical agents based on silicone or onion extract, intralesional injections of crystalline glucocorticoids (oftentimes in combination with cryotherapy) or 5-fluorouracil, as well as ablative and nonablative laser treatment are used. Current guidelines summarize the multitude of available treatment options and the currently available data for the treating physicians, allowing them to make clear therapy recommendations for every single scar type. Relieving patients of their discomfort and doing justice to their aesthetic demands is thus possible. Apart from scar prevention becoming more and more important, the increased use of modern laser treatment options constitutes a key point in clinical scar treatment. At the same time, attention is turned to evaluating current therapeutic options with the help of contemporary study designs so as to gradually improve the level of evidence in scar treatment. © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  13. A pragmatic evidence-based clinical management algorithm for burning mouth syndrome.

    Science.gov (United States)

    Kim, Yohanan; Yoo, Timothy; Han, Peter; Liu, Yuan; Inman, Jared C

    2018-04-01

    Burning mouth syndrome is a poorly understood disease process with no current standard of treatment. The goal of this article is to provide an evidence-based, practical, clinical algorithm as a guideline for the treatment of burning mouth syndrome. Using available evidence and clinical experience, a multi-step management algorithm was developed. A retrospective cohort study was then performed, following STROBE statement guidelines, comparing outcomes of patients who were managed using the algorithm and those who were managed without. Forty-seven patients were included in the study, with 21 (45%) managed using the algorithm and 26 (55%) managed without. The mean age overall was 60.4±16.5 years, and most patients (39, 83%) were female. Cohorts showed no statistical difference in age, sex, overall follow-up time, dysgeusia, geographic tongue, or psychiatric disorder; xerostomia, however, was significantly different, skewed toward the algorithm group. Significantly more non-algorithm patients did not continue care (69% vs. 29%, p=0.001). The odds ratio of not continuing care for the non-algorithm group compared to the algorithm group was 5.6 [1.6, 19.8]. Improvement in pain was significantly more likely in the algorithm group (p=0.001), with an odds ratio of 27.5 [3.1, 242.0]. We present a basic clinical management algorithm for burning mouth syndrome which may increase the likelihood of pain improvement and patient follow-up. Key words: Burning mouth syndrome, burning tongue, glossodynia, oral pain, oral burning, therapy, treatment.

  14. Current constrained voltage scaled reconstruction (CCVSR) algorithm for MR-EIT and its performance with different probing current patterns

    International Nuclear Information System (INIS)

    Birguel, Oezlem; Eyueboglu, B Murat; Ider, Y Ziya

    2003-01-01

    Conventional injected-current electrical impedance tomography (EIT) and magnetic resonance imaging (MRI) techniques can be combined to reconstruct high resolution true conductivity images. The magnetic flux density distribution generated by the internal current density distribution is extracted from MR phase images. This information is used to form a fine detailed conductivity image using an Ohm's law based update equation. The reconstructed conductivity image is assumed to differ from the true image by a scale factor. EIT surface potential measurements are then used to scale the reconstructed image in order to find the true conductivity values. This process is iterated until a stopping criterion is met. Several simulations are carried out for opposite and cosine current injection patterns to select the best current injection pattern for a 2D thorax model. The contrast resolution and accuracy of the proposed algorithm are also studied. In all simulation studies, realistic noise models for voltage and magnetic flux density measurements are used. It is shown that, in contrast to the conventional EIT techniques, the proposed method has the capability of reconstructing conductivity images with uniform and high spatial resolution. The spatial resolution is limited by the larger element size of the finite element mesh and twice the magnetic resonance image pixel size
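    The scaling step described above, reconstructing conductivity up to a multiplicative factor from the MR data and then fixing the scale with EIT surface potential measurements, can be illustrated with a deliberately simplified 1-D toy model (series resistors under a known current; the function names and geometry are illustrative assumptions, not the paper's implementation):

    ```python
    def boundary_voltage(sigma, current=1.0, dx=1.0, area=1.0):
        """Total voltage across a 1-D rod of piecewise conductivities carrying
        a known current: series resistors via Ohm's law, V = I * sum(dx/(s*A))."""
        return current * sum(dx / (s * area) for s in sigma)

    def ccvsr_scale(sigma_relative, v_measured, current=1.0):
        """Scale a relative conductivity profile so its predicted boundary
        voltage matches the measured one (toy version of the scaling step).
        Since V scales as 1/sigma, the correction factor is v_pred/v_meas."""
        v_predicted = boundary_voltage(sigma_relative, current)
        k = v_predicted / v_measured
        return [k * s for s in sigma_relative]

    # Toy check: true profile, with an "MR" reconstruction known only up to
    # an arbitrary scale factor of 0.25
    sigma_true = [1.0, 2.0, 4.0]
    sigma_rel = [0.25 * s for s in sigma_true]
    v_meas = boundary_voltage(sigma_true)
    sigma_scaled = ccvsr_scale(sigma_rel, v_meas)  # recovers sigma_true
    ```

    In the actual 2D/3D algorithm the forward solve is a finite-element computation and the scale is refined iteratively, but the role of the surface potential measurement is the same: it pins down the absolute conductivity level that the MR-derived data alone cannot provide.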

  15. [Current Treatment of Stable Angina].

    Science.gov (United States)

    Toggweiler, Stefan; Jamshidi, Peiman; Cuculi, Florim

    2015-06-17

    Current therapy for stable angina includes surgical and percutaneous revascularization, which have improved tremendously over the last decades. Smoking cessation and regular exercise are the cornerstones of prevention of further cardiovascular events. Medical treatment includes treatment of cardiovascular risk factors and antithrombotic management, which can be a challenge in some patients. Owing to the fact that coronary revascularization is readily accessible these days in many industrialized countries, the importance of antianginal therapy has decreased over the past years. This article presents a contemporary overview of the management of patients with stable angina in the year 2015.

  16. Data analysis algorithms for flaw sizing based on eddy current rotating probe examination of steam generator tubes

    International Nuclear Information System (INIS)

    Bakhtiari, S.; Elmer, T.W.

    2009-01-01

    Computer-aided data analysis tools can help improve the efficiency and reliability of flaw sizing based on nondestructive examination data. They can further help produce more consistent results, which is important both for in-service inspection applications and for engineering assessments associated with steam generator tube integrity. Results of recent investigations at Argonne on the development of various algorithms for sizing of flaws in steam generator tubes based on eddy current rotating probe data are presented. The research was carried out as part of the activities under the International Steam Generator Tube Integrity Program (ISG-TIP) sponsored by the U.S. Nuclear Regulatory Commission. A computer-aided data analysis tool has been developed for off-line processing of eddy current inspection data. The main objectives of the work have been to a) allow all data processing stages to be performed under the same user interface, b) simplify modification and testing of signal processing and data analysis scripts, and c) allow independent evaluation of viable flaw sizing algorithms. The focus of most recent studies at Argonne has been on the processing of data acquired with the +Point probe, which is one of the more widely used eddy current rotating probes for steam generator tube examinations in the U.S. The probe employs a directional surface-riding differential coil, which helps reduce the influence of tubing artifacts and in turn helps improve the signal-to-noise ratio. Various algorithms developed under the MATLAB environment for the conversion, segmentation, calibration, and analysis of data have been consolidated within a single user interface. Data acquired with a number of standard eddy current test instruments are automatically recognized and converted to a standard format for further processing. 
Because of its modular structure, the graphical user interface allows user-developed routines to be easily incorporated, modified, and tested independent of the

  17. Combination treatment of neuropathic pain

    DEFF Research Database (Denmark)

    Holbech, Jakob Vormstrup; Jung, Anne; Jonsson, Torsten

    2017-01-01

    BACKGROUND: Current Danish treatment algorithms for pharmacological treatment of neuropathic pain (NeP) are tricyclic antidepressants (TCA), gabapentin and pregabalin as first-line treatment for the most common NeP conditions. Many patients have insufficient pain relief on monotherapy, but combin...

  18. Optimization of stereotactic body radiotherapy treatment planning using a multicriteria optimization algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ghandour, Sarah; Cosinschi, Adrien; Mazouni, Zohra; Pachoud, Marc; Matzinger, Oscar [Riviera-Chablais Hospital, Vevey (Switzerland). Cancer Center, Radiotherapy Dept.

    2016-07-01

    To provide high-quality and efficient dosimetric planning for various types of stereotactic body radiotherapy (SBRT) for tumor treatment using a multicriteria optimization (MCO) technique fine-tuned with direct machine parameter optimization (DMPO). Eighteen patients with lung (n = 11), liver (n = 5) or adrenal cell cancer (n = 2) were treated using SBRT in our clinic between December 2014 and June 2015. Plans were generated using the RayStation™ Treatment Planning System (TPS) with the VMAT technique. Optimal deliverable SBRT plans were first generated using an MCO algorithm to find a well-balanced tradeoff between tumor control and normal tissue sparing in an efficient treatment planning time. The deliverable plan was then post-processed using the MCO solution as the starting point for the DMPO algorithm to improve the dose gradient around the planning target volume (PTV) while maintaining the clinician's priorities. The dosimetric quality of the plans was evaluated using dose-volume histogram (DVH) parameters, which account for target coverage and the sparing of healthy tissue, as well as the CI100 and CI50 conformity indexes. Using a combination of the MCO and DMPO algorithms showed that the treatment plans were clinically optimal and conformed to all organ-at-risk dose-volume constraints reported in the literature, with a computation time of approximately one hour. The coverage of the PTV (D99% and D95%) and sparing of organs at risk (OAR) were similar between the MCO and MCO + DMPO plans, with no significant differences (p > 0.05) for all the SBRT plans. The average CI100 and CI50 values using MCO + DMPO were significantly better than those with MCO alone (p < 0.05). The MCO technique allows for convergence on an optimal solution for SBRT within an efficient planning time. The combination of the MCO and DMPO techniques yields a better dose gradient, especially for lung tumors.
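    Conformity index conventions vary between papers; one common RTOG-style form, where CI100 uses the prescription isodose and CI50 half of it (a lower CI50 indicating a steeper dose fall-off outside the target), can be sketched as follows. The definition below is an assumption for illustration, not necessarily the one used in this study:

    ```python
    def conformity_index(dose, ptv_mask, threshold):
        """Volume enclosed by an isodose surface divided by the PTV volume,
        on a voxelized dose grid (flat list) with a parallel boolean PTV mask.
        RTOG-style definition; CI conventions differ between publications."""
        v_iso = sum(1 for d in dose if d >= threshold)
        v_ptv = sum(1 for m in ptv_mask if m)
        return v_iso / v_ptv

    # CI100 evaluates the prescription isodose; CI50 evaluates half the
    # prescription and so captures the intermediate-dose spill.
    dose = [60, 60, 58, 30, 30, 10]
    ptv = [True, True, True, False, False, False]
    ci100 = conformity_index(dose, ptv, 57.0)   # prescription ~57 Gy here
    ci50 = conformity_index(dose, ptv, 28.5)
    ```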

  19. Algorithm for evaluating the effectiveness of a high-rise development project based on current yield

    Science.gov (United States)

    Soboleva, Elena

    2018-03-01

    The article addresses the operational evaluation of development project efficiency in high-rise construction under the current economic conditions in Russia. The author touches on the following issues: problems of implementing development projects, the influence of the quality of operational evaluation of high-rise construction projects on overall efficiency, assessment of the influence of a project's external environment on the effectiveness of project activities under crisis conditions, and the quality of project management. The article proposes an algorithm and a methodological approach to managing developer project efficiency based on operational evaluation of current yield. The methodology for calculating the current efficiency of a high-rise construction development project has been updated.

  20. Current treatments for radiation retinopathy

    Energy Technology Data Exchange (ETDEWEB)

    Giuliari, Gian Paolo; Simpson, E. Rand (Princess Margaret Hospital, Univ. of Toronto, Dept. of Ophthalmology and Vision Sciences, Toronto (Canada)), e-mail: gpgiuliari@gmail.com; Sadaka, Ama (Schepens Eye Research Inst., Boston, MA (United States)); Hinkle, David M. (Massachusetts Eye Research and Surgery Institution, Cambridge, MA (United States))

    2011-01-15

    Background. To review the currently available therapeutic modalities for radiation retinopathy (RR), including newer investigational interventions directed towards specific aspects of the pathophysiology of this refractory complication. Methods. A review of the literature encompassing the pathogenesis of RR and the current therapeutic modalities available was performed. Results. RR is a chronic and progressive condition that results from exposure to any source of radiation. It might be secondary to radiation treatment of intraocular tumors such as choroidal melanomas, retinoblastomas, and choroidal metastases, or from unavoidable exposure to excessive radiation from the treatment of extraocular tumors like cephalic, nasopharyngeal, orbital, and paranasal malignancies. After the results of the Collaborative Ocular Melanoma Study, most choroidal melanomas are being treated with plaque brachytherapy, thereby increasing the incidence of this radiation complication. RR has been reported to occur in as many as 60% of eyes treated with plaque radiation, with higher rates associated with larger tumors. Initially, the condition manifests as a radiation vasculopathy clinically seen as microaneurysms and telangiectasia, with later development of retinal hard exudates and hemorrhages, macular edema, neovascularization and tractional retinal detachment. Regrettably, the management of these eyes remains limited. Photodynamic therapy, laser photocoagulation, oral pentoxifylline and hyperbaric oxygen have been attempted as treatment modalities with inconclusive results. Intravitreal injections of anti-vascular endothelial growth factor agents such as bevacizumab, ranibizumab and pegaptanib sodium have been used recently, also with variable results. Discussion. RR is a common vision-threatening complication following radiation therapy. The available therapeutic options are limited and show unsatisfactory results. 
Further large investigative studies are required for developing

  1. Cost-effectiveness of collaborative care including PST and an antidepressant treatment algorithm for the treatment of major depressive disorder in primary care; a randomised clinical trial

    Directory of Open Access Journals (Sweden)

    Beekman Aartjan TF

    2007-03-01

    Full Text Available Abstract Background Depressive disorder is currently one of the most burdensome disorders worldwide. Evidence-based treatments for depressive disorder are already available, but they are used insufficiently and with less positive results than possible. Earlier research in the USA has shown good results in the treatment of depressive disorder with a collaborative care approach combining Problem Solving Treatment and an antidepressant treatment algorithm, and research in the UK has also shown good results with Problem Solving Treatment. These treatment strategies may work very well in the Netherlands as well, even though health care systems differ between countries. Methods/design This study is a two-armed randomised clinical trial, with randomisation at patient level. The aim of the trial is to evaluate the treatment of depressive disorder in primary care in the Netherlands by means of an adapted collaborative care framework, including contracting and adherence-improving strategies, combined with Problem Solving Treatment and antidepressant medication according to a treatment algorithm. Forty general practices will be randomised to either the intervention group or the control group. Patients diagnosed with moderate to severe depression, based on DSM-IV criteria, will be included and stratified according to comorbid chronic physical illness. Patients in the intervention group will receive treatment based on the collaborative care approach, and patients in the control group will receive care as usual. Baseline and follow-up measurements (3, 6, 9 and 12 months) are assessed using questionnaires and an interview. The primary outcome measure is severity of depressive symptoms, according to the PHQ9. Secondary outcome measures are remission as measured with the PHQ9 and the IDS-SR, and cost-effectiveness measured with the TiC-P, the EQ-5D and the SF-36. Discussion In this study, an American model to enhance care for patients with a

  2. A Genetic Algorithm Approach to the Optimization of a Radioactive Waste Treatment System

    International Nuclear Information System (INIS)

    Yang, Yeongjin; Lee, Kunjai; Koh, Y.; Mun, J.H.; Kim, H.S.

    1998-01-01

    This study concerns the application of goal programming and genetic algorithm techniques to the analysis of management and operational problems in a radioactive waste treatment system (RWTS). A typical RWTS is modeled and solved by goal programming and by a genetic algorithm to study and resolve the effects of conflicting objectives such as cost, limits on radioactivity released to the environment, equipment utilization, and total treatable radioactive waste volume before discharge and disposal. The developed model is validated and verified using actual data obtained from the RWTS at Kyoto University in Japan. The solutions from goal programming and the genetic algorithm identify the optimal operating point, maximizing the total treatable radioactive waste volume while minimizing the released radioactivity of the liquid waste, even under restricted resources. A comparison of the two methods shows very similar results. (author)
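
    The constrained search described above can be illustrated with a toy genetic algorithm. The plant data below (three waste streams, per-volume activities, throughput caps, release limit) are invented for illustration, not the Kyoto University figures; the activity limit is handled with a penalty term in the fitness function.

```python
import random

random.seed(1)

# Hypothetical plant data (NOT the Kyoto University RWTS figures): three waste
# streams with per-m^3 activity (arbitrary units), throughput caps (m^3), and
# a limit on total activity released to the environment.
ACTIVITY = [0.8, 0.5, 1.2]
CAP = [40.0, 60.0, 30.0]
LIMIT = 60.0

def fitness(x):
    """Treated volume, heavily penalized when the release limit is exceeded."""
    released = sum(a * v for a, v in zip(ACTIVITY, x))
    return sum(x) - 1000.0 * max(0.0, released - LIMIT)

def crossover(p1, p2):
    # Uniform crossover: each stream volume comes from one of the two parents.
    return [random.choice(pair) for pair in zip(p1, p2)]

def mutate(x, rate=0.2):
    # Gaussian perturbation, clamped to the feasible box [0, cap].
    return [min(c, max(0.0, v + random.gauss(0.0, 2.0))) if random.random() < rate else v
            for v, c in zip(x, CAP)]

def run_ga(pop_size=60, generations=200):
    pop = [[random.uniform(0.0, c) for c in CAP] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 4]                  # elitism: keep best quarter
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = run_ga()
```

    The penalty weight makes any noticeable violation of the release limit dominate the volume reward, so surviving individuals are (near-)feasible.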

  3. The Texas Medication Algorithm Project (TMAP) schizophrenia algorithms.

    Science.gov (United States)

    Miller, A L; Chiles, J A; Chiles, J K; Crismon, M L; Rush, A J; Shon, S P

    1999-10-01

    In the Texas Medication Algorithm Project (TMAP), detailed guidelines for medication management of schizophrenia and related disorders, bipolar disorders, and major depressive disorders have been developed and implemented. This article describes the algorithms developed for medication treatment of schizophrenia and related disorders. The guidelines recommend a sequence of medications and discuss dosing, duration, and switch-over tactics. They also specify response criteria at each stage of the algorithm for both positive and negative symptoms. The rationale and evidence for each aspect of the algorithms are presented.

  4. EDITORIAL: International Workshop on Current Topics in Monte Carlo Treatment Planning

    Science.gov (United States)

    Verhaegen, Frank; Seuntjens, Jan

    2005-03-01

    The use of Monte Carlo particle transport simulations in radiotherapy was pioneered in the early nineteen-seventies, but it was not until the eighties that they gained recognition as an essential research tool for radiation dosimetry, health physics and, later on, radiation therapy treatment planning. Since the mid-nineties, there has been a boom in the number of workers using MC techniques in radiotherapy and in the quantity of papers published on the subject. Research and applications of MC techniques in radiotherapy span a very wide range, from fundamental studies of cross sections and development of particle transport algorithms to clinical evaluation of treatment plans for a variety of radiotherapy modalities. The International Workshop on Current Topics in Monte Carlo Treatment Planning took place at Montreal General Hospital, which is part of McGill University, halfway up Mount Royal on Montreal Island. It was held from 3-5 May, 2004, right after the freezing winter had lost its grip on Canada. About 120 workers attended the Workshop, representing 18 countries. Most of the pioneers in the field were present, but also a large group of young scientists. In a very full programme, 41 long papers were presented (of which 12 were invited) and 20 posters were on display during the whole meeting. The topics covered included the latest developments in MC algorithms, statistical issues, source modelling and MC treatment planning for photon, electron and proton treatments. The final day was entirely devoted to clinical implementation issues. Monte Carlo radiotherapy treatment planning has only now made a slow entrée into the clinical environment, taking considerably longer than envisaged ten years ago. Of the twenty-five papers in this dedicated special issue, about a quarter deal with this topic, with probably many more studies to follow in the near future. If anything, we hope the Workshop served as an accelerator for more clinical evaluation of MC applications. The

  5. Improved automated lumen contour detection by novel multifrequency processing algorithm with current intravascular ultrasound system.

    Science.gov (United States)

    Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro

    2013-02-01

    The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™), referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings than those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm²) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm²) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.

  6. Current Challenges in Cancer Treatment.

    Science.gov (United States)

    Zugazagoitia, Jon; Guedes, Cristiano; Ponce, Santiago; Ferrer, Irene; Molina-Pinelo, Sonia; Paz-Ares, Luis

    2016-07-01

    In this review, we highlight the current concepts and discuss some of the current challenges and future prospects in cancer therapy. We frequently use the example of lung cancer. We conducted a nonsystematic PubMed search, selecting the most comprehensive and relevant research articles, clinical trials, translational papers, and review articles on precision oncology and immuno-oncology. Papers were prioritized and selected based on their originality and potential clinical applicability. Two major revolutions have changed cancer treatment paradigms in the past few years: targeting actionable alterations in oncogene-driven cancers and immuno-oncology. Important challenges remain in both fields of cancer therapy. On the one hand, druggable genomic alterations are diverse and represent only small subsets of patients in certain tumor types, which limits testing their clinical impact in biomarker-driven clinical trials. Next-generation sequencing technologies are increasingly being implemented for molecular prescreening in clinical research, but issues regarding clinical interpretation of large genomic data make their wide clinical use difficult. Further, dealing with tumor heterogeneity and acquired resistance is probably the main limitation for the success of precision oncology. On the other hand, long-term survival benefits with immune checkpoint inhibitors (anti-programmed cell death protein-1/programmed cell death ligand-1 [PD-1/PD-L1] and anti-cytotoxic T lymphocyte antigen-4 monoclonal antibodies) are restricted to a minority of patients, and no robustly validated predictive markers are yet available that could help us recognize these subsets and optimize treatment delivery and selection. To achieve long-term survival benefits, drug combinations targeting several molecular alterations or cancer hallmarks might be needed. This will probably be one of the most challenging but promising precision cancer treatment strategies in the future. Targeting single molecular

  7. Current treatment approaches in patients with ankylosing spondylitis

    Directory of Open Access Journals (Sweden)

    Bilal Elbey

    2015-03-01

    Full Text Available Ankylosing spondylitis (AS) is a chronic, inflammatory, rheumatic disease that mainly affects the sacroiliac joints and spine. AS occurs more often in males and typically begins in the second or third decade. The mainstay of therapy in AS is nonsteroidal anti-inflammatory drugs, which reduce inflammation and pain. There is not enough evidence to prove the effect of disease-modifying antirheumatic drugs (DMARDs) in AS treatment, and the use of DMARDs may not be sufficient to improve symptoms. Currently, TNF blockers such as golimumab, etanercept, adalimumab and infliximab have shown promising results in the treatment of AS. TNF blockers improve the clinical signs and symptoms as well as patients' physical function and quality of life. This manuscript focuses on current pharmacological treatments in patients with ankylosing spondylitis.

  8. Algorithm for the treatment of type 2 diabetes: a position statement of Brazilian Diabetes Society

    OpenAIRE

    Lerario, Antonio C; Chacra, Antonio R; Pimazoni-Netto, Augusto; Malerbi, Domingos; Gross, Jorge L; Oliveira, José EP; Gomes, Marilia B; Santos, Raul D; Fonseca, Reine MC; Betti, Roberto; Raduan, Roberto

    2010-01-01

    Abstract The Brazilian Diabetes Society is starting an innovative project of quantitative assessment of medical arguments and implementing a new way of elaborating SBD Position Statements. The final aim of this particular project is to propose a new Brazilian algorithm for the treatment of type 2 diabetes, based on the opinions of endocrinologists surveyed in a poll conducted on the Brazilian Diabetes Society website regarding the latest algorithm proposed by American Diabetes Associatio...

  9. The psychopharmacology algorithm project at the Harvard South Shore Program: an algorithm for acute mania.

    Science.gov (United States)

    Mohammad, Othman; Osser, David N

    2014-01-01

    This new algorithm for the pharmacotherapy of acute mania was developed by the Psychopharmacology Algorithm Project at the Harvard South Shore Program. The authors conducted a literature search in PubMed and reviewed key studies, other algorithms and guidelines, and their references. Treatments were prioritized based on three main considerations: (1) effectiveness in treating the current episode, (2) preventing potential relapses to depression, and (3) minimizing side effects over the short and long term. The algorithm presupposes that clinicians have made an accurate diagnosis, decided how to manage contributing medical causes (including substance misuse), discontinued antidepressants, and considered the patient's childbearing potential. We propose different algorithms for mixed and nonmixed mania. Patients with mixed mania may be treated first with a second-generation antipsychotic, of which the first choice is quetiapine because of its greater efficacy for depressive symptoms and episodes in bipolar disorder. Valproate and then either lithium or carbamazepine may be added. For nonmixed mania, lithium is the first-line recommendation. A second-generation antipsychotic can be added. Again, quetiapine is favored, but if quetiapine is unacceptable, risperidone is the next choice. Olanzapine is not considered a first-line treatment due to its long-term side effects, but it could be second-line. If the patient, whether mixed or nonmixed, is still refractory to the above medications, then depending on what has already been tried, consider carbamazepine, haloperidol, olanzapine, risperidone, and valproate first tier; aripiprazole, asenapine, and ziprasidone second tier; and clozapine third tier (because of its weaker evidence base and greater side effects). Electroconvulsive therapy may be considered at any point in the algorithm if the patient has a history of positive response or is intolerant of medications.

  10. Genital pain: algorithm for management

    OpenAIRE

    Calixte, Nahomy; Brahmbhatt, Jamin; Parekattil, Sijo

    2017-01-01

    Chronic testicular pain, although becoming very common in our patient population, poses a challenge to the physician, the patient and his family. The pathogenesis of chronic orchialgia (CO) is not well understood. The objective of this paper is to review the current literature on chronic testicular pain and its management and to propose an algorithm for its treatment. Abstracts, original papers and review articles were reviewed during a literature search using words such as testicular pain, CO,...

  11. Expert Opinion on the Management of Lennox–Gastaut Syndrome: Treatment Algorithms and Practical Considerations

    Directory of Open Access Journals (Sweden)

    J. Helen Cross

    2017-09-01

    Full Text Available Lennox–Gastaut syndrome (LGS is a severe epileptic and developmental encephalopathy that is associated with a high rate of morbidity and mortality. It is characterized by multiple seizure types, abnormal electroencephalographic features, and intellectual disability. Although intellectual disability and associated behavioral problems are characteristic of LGS, they are not necessarily present at its outset and are therefore not part of its diagnostic criteria. LGS is typically treated with a variety of pharmacological and non-pharmacological therapies, often in combination. Management and treatment decisions can be challenging, due to the multiple seizure types and comorbidities associated with the condition. A panel of five epileptologists met to discuss consensus recommendations for LGS management, based on the latest available evidence from literature review and clinical experience. Treatment algorithms were formulated. Current evidence favors the continued use of sodium valproate (VPA as the first-line treatment for patients with newly diagnosed de novo LGS. If VPA is ineffective alone, evidence supports lamotrigine, or subsequently rufinamide, as adjunctive therapy. If seizure control remains inadequate, the choice of next adjunctive antiepileptic drug (AED should be discussed with the patient/parent/caregiver/clinical team, as current evidence is limited. Non-pharmacological therapies, including resective surgery, the ketogenic diet, vagus nerve stimulation, and callosotomy, should be considered for use alongside AED therapy from the outset of treatment. For patients with LGS that has evolved from another type of epilepsy who are already being treated with an AED other than VPA, VPA therapy should be considered if not trialed previously. Thereafter, the approach for a de novo patient should be followed. Where possible, no more than two AEDs should be used concomitantly. Patients with established LGS should undergo review by a neurologist

  12. Current and emerging treatment options for nasopharyngeal carcinoma

    Directory of Open Access Journals (Sweden)

    Spratt DE

    2012-10-01

    Full Text Available Daniel E Spratt, Nancy Lee; Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, NY, USA. Abstract: In this article, we focus on current and emerging treatments in nasopharyngeal cancer (NPC). A detailed evolution of the current standard of care, and new techniques and treatment options, will be reviewed. Intergroup 0099 established the role for chemoradiotherapy (chemo-RT) in the treatment of nasopharyngeal carcinoma. Multiple randomized Phase III trials have shown the benefit of chemo-RT; however, none of these studies utilized the modern radiotherapy (RT) technique of intensity-modulated radiation therapy (IMRT). IMRT can deliver high doses of radiation to the target structures while sparing adjacent bystander healthy tissues, and has now become the preferred RT treatment modality. Chemotherapy has also seen a shifting paradigm, from induction and/or adjuvant chemotherapy combined with RT alone to the investigation of concurrent chemo-RT. New treatment options, including targeted monoclonal antibodies and small molecule tyrosine kinase inhibitors, are being studied in NPC. These new biologic therapies have promising in vitro activity for NPC, and emerging clinical studies are beginning to define their role. RT continues to expand its capabilities, and particle therapy, specifically intensity-modulated proton therapy (IMPT), has shown impressive dosimetric efficacy in silico. Adaptive RT attempts to reduce toxicity while maintaining treatment efficacy, and the clinical results are still in their infancy. Lastly, Epstein–Barr virus (EBV) DNA has recently been studied for prediction of tumor response, and its use as a biomarker is increasingly promising to aid in early detection as well as to supplement the current staging system. RT with or without chemotherapy remains the standard of care for nasopharyngeal carcinoma. Advances in RT technique, timing of chemotherapy, biologically

  13. MO-FG-CAMPUS-TeP2-01: A Graph Form ADMM Algorithm for Constrained Quadratic Radiation Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X; Belcher, AH; Wiersma, R [The University of Chicago, Chicago, IL (United States)

    2016-06-15

    Purpose: In radiation therapy optimization, constraints can be either hard constraints, which must be satisfied, or soft constraints, which are included but need not be satisfied exactly. Currently, voxel dose constraints are viewed as soft constraints, included as part of the objective function, and approximated as an unconstrained problem. However, in some treatment planning cases the constraints should be specified as hard constraints and solved by constrained optimization. The goal of this work is to present a computationally efficient graph-form alternating direction method of multipliers (ADMM) algorithm for constrained quadratic treatment planning optimization and to compare it with several commonly used algorithms/toolboxes. Method: ADMM can be viewed as an attempt to blend the benefits of dual decomposition and augmented Lagrangian methods for constrained optimization. Various proximal operators applicable to quadratic IMRT constrained optimization were first constructed, and the problem was formulated in the graph form of ADMM. A pre-iteration operation for the projection of a point onto a graph was also proposed to further accelerate the computation. Result: The graph-form ADMM algorithm was tested on the Common Optimization for Radiation Therapy (CORT) dataset, including TG119, prostate, liver, and head & neck cases. Both unconstrained and constrained optimization problems were formulated for comparison purposes. All optimizations were solved by LBFGS, IPOPT, the Matlab built-in toolbox, CVX (implementing SeDuMi) and Mosek solvers. For unconstrained optimization, LBFGS performed best and was 3–5 times faster than graph-form ADMM. However, for constrained optimization, graph-form ADMM was 8–100 times faster than the other solvers. Conclusion: A graph-form ADMM can be applied to constrained quadratic IMRT optimization. It is more computationally efficient than several other commercial and noncommercial optimizers and it also
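
    As a minimal illustration of the ADMM splitting the abstract describes, the sketch below solves a box-constrained quadratic program with a diagonal Hessian, so the x-update has a closed form. It is a toy stand-in under those stated assumptions, not the paper's graph-form implementation or the CORT data.

```python
# Box-constrained quadratic program solved by ADMM:
#   minimize 0.5 * sum(q[i]*x[i]^2) - sum(b[i]*x[i])  subject to  lo <= x <= hi
# Splitting x = z, with z carrying the box constraint; a diagonal Q keeps the
# x-update elementwise and closed-form.

def admm_box_qp(q, b, lo, hi, rho=1.0, iters=500):
    n = len(q)
    x = [0.0] * n
    z = [0.0] * n
    y = [0.0] * n          # scaled dual variable
    for _ in range(iters):
        # x-update: minimize smooth quadratic + (rho/2)*||x - z + y||^2
        x = [(b[i] + rho * (z[i] - y[i])) / (q[i] + rho) for i in range(n)]
        # z-update: project x + y onto the box [lo, hi]
        z = [min(hi[i], max(lo[i], x[i] + y[i])) for i in range(n)]
        # dual update: accumulate the primal residual x - z
        y = [y[i] + x[i] - z[i] for i in range(n)]
    return z

sol = admm_box_qp(q=[1.0, 2.0, 4.0], b=[5.0, -3.0, 2.0],
                  lo=[0.0, 0.0, 0.0], hi=[2.0, 2.0, 2.0])
```

    For this separable problem the optimum is simply each unconstrained minimizer b[i]/q[i] clipped to the box, which makes the iteration easy to check.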

  14. Algorithms for Computing the Magnetic Field, Vector Potential, and Field Derivatives for Circular Current Loops in Cylindrical Coordinates

    Energy Technology Data Exchange (ETDEWEB)

    Walstrom, Peter Lowell [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-24

    A numerical algorithm for computing the field components Br and Bz and their r and z derivatives with open boundaries in cylindrical coordinates for circular current loops is described. An algorithm for computing the vector potential is also described. For the convenience of the reader, derivations of the final expressions from their defining integrals are given in detail, since these derivations (especially for the field derivatives) are not all easily found in textbooks. Numerical calculations are based on evaluation of complete elliptic integrals using the Bulirsch algorithm cel. Since cel can evaluate complete elliptic integrals of a fairly general type, in some cases the elliptic integrals can be evaluated without first reducing them to forms containing standard Legendre forms. The algorithms avoid the numerical difficulties that many of the textbook solutions have for points near the axis because of explicit factors of 1/r or 1/r² in some of the expressions.
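
    The standard textbook expressions behind such a calculation can be sketched as follows: Br and Bz of a circular loop evaluated from the complete elliptic integrals K and E, here computed with an arithmetic-geometric-mean routine as a simple stand-in for the Bulirsch cel routine the report uses (and without the report's derivative formulas).

```python
import math

MU0 = 4.0e-7 * math.pi   # vacuum permeability, T*m/A

def ellipke(m, tol=1e-12):
    """Complete elliptic integrals K(m), E(m) (parameter m = k^2) via the
    arithmetic-geometric mean; a stand-in for Bulirsch's cel routine."""
    a, b, c = 1.0, math.sqrt(1.0 - m), math.sqrt(m)
    csum, pow2 = 0.5 * c * c, 0.5          # accumulates sum 2^(n-1) * c_n^2
    while c > tol:
        a, b, c = 0.5 * (a + b), math.sqrt(a * b), 0.5 * (a - b)
        pow2 *= 2.0
        csum += pow2 * c * c
    K = math.pi / (2.0 * a)
    return K, K * (1.0 - csum)

def loop_field(R, I, r, z):
    """(Br, Bz) of a circular loop of radius R carrying current I, at (r, z)."""
    m = 4.0 * R * r / ((R + r) ** 2 + z ** 2)
    K, E = ellipke(m)
    pre = MU0 * I / (2.0 * math.pi * math.sqrt((R + r) ** 2 + z ** 2))
    den = (R - r) ** 2 + z ** 2
    Bz = pre * (K + (R * R - r * r - z * z) / den * E)
    # Br carries an explicit z/r factor; on the axis it vanishes by symmetry.
    Br = 0.0 if r == 0.0 else pre * (z / r) * (-K + (R * R + r * r + z * z) / den * E)
    return Br, Bz
```

    On the axis (r = 0, m = 0) the expression collapses exactly to the familiar closed form Bz = μ0·I·R²/(2(R² + z²)^(3/2)), which makes a convenient check.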

  15. Numerical algorithm for laser treatment of powder layer with variable thickness

    Science.gov (United States)

    Soboleva, Polina; Knyazeva, Anna

    2017-12-01

    A two-dimensional model of laser treatment of a powder layer on a substrate is proposed in this paper. The model takes into account the shrinkage of the powder layer due to the laser treatment. Three simplified variants of the model were studied. First, the influence of the optical properties of the powder layer on the maximal temperature was investigated. Second, a two-dimensional model for a given thickness of the powder layer was studied, which showed a practically uniform temperature distribution across the thin powder layer. Then, a numerical algorithm was developed to calculate the temperature field for an area of variable size. The impact of the optical properties of the powder material on the character of the temperature distribution was investigated numerically.
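
    A minimal sketch of the kind of explicit finite-difference scheme such thermal models rest on, reduced to one dimension with invented material parameters; it is not the authors' two-dimensional shrinkage model.

```python
import math

# 1-D explicit finite-difference sketch of laser heating of a powder layer:
#   dT/dt = alpha * d2T/dx2 + S(x),  zero-flux (insulated) boundaries.
# All parameters (diffusivity, absorbed flux, layer depth) are illustrative.

def heat_step(T, alpha, dx, dt, source):
    n = len(T)
    Tn = T[:]
    for i in range(n):
        left = T[i - 1] if i > 0 else T[1]          # mirror node at x = 0
        right = T[i + 1] if i < n - 1 else T[n - 2]  # mirror node at x = L
        Tn[i] = T[i] + alpha * dt / dx ** 2 * (left - 2 * T[i] + right) + dt * source[i]
    return Tn

n, depth = 50, 1e-3                   # 50 nodes over a 1 mm layer
dx = depth / (n - 1)
alpha = 1e-7                          # m^2/s, illustrative diffusivity
dt = 0.4 * dx ** 2 / alpha            # below the explicit stability limit of 0.5
T = [300.0] * n                       # initial temperature, K
# Absorbed laser power density, concentrated near the irradiated surface:
beam = [1e5 * math.exp(-(i * dx / 1e-4) ** 2) for i in range(n)]

for _ in range(200):
    T = heat_step(T, alpha, dx, dt, beam)
```

    With the time step held under the stability bound and a nonnegative source, the scheme respects a discrete minimum principle: the surface heats strongly while the far side of the layer stays near its initial temperature.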

  16. Unveiling the development of intracranial injury using dynamic brain EIT: an evaluation of current reconstruction algorithms.

    Science.gov (United States)

    Li, Haoting; Chen, Rongqing; Xu, Canhua; Liu, Benyuan; Tang, Mengxing; Yang, Lin; Dong, Xiuzhen; Fu, Feng

    2017-08-21

    Dynamic brain electrical impedance tomography (EIT) is a promising technique for continuously monitoring the development of cerebral injury. While many reconstruction algorithms are available for brain EIT, few studies have compared their performance in the context of dynamic brain monitoring. To address this problem, we develop a framework for evaluating current algorithms on their ability to correctly identify small intracranial conductivity changes. First, a 3D simulation head phantom with realistic layered structure and impedance distribution is developed. Next, several reconstruction algorithms, such as back projection (BP), damped least-squares (DLS), Bayesian, split Bregman (SB) and GREIT, are introduced. We investigate their temporal response, noise performance, and location and shape error with respect to different noise levels on the simulation phantom. The results show that the SB algorithm demonstrates superior performance in reducing image error. To further improve the location accuracy, we optimize SB by incorporating brain structure-based conductivity distribution priors, in which differences between the conductivities of different brain tissues and the inhomogeneous conductivity distribution of the skull are considered. We compare this novel algorithm (called SB-IBCD) with SB and DLS using anatomically correct head-shaped phantoms with spatially varying skull conductivity. Main results and significance: SB-IBCD was the most effective in unveiling small intracranial conductivity changes, reducing the image error by an average of 30.0% compared to DLS.
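
    The damped least-squares (DLS) step mentioned above amounts to Tikhonov regularization of the linearized inverse problem, x = (JᵀJ + λI)⁻¹Jᵀb. A tiny self-contained sketch on an invented 3×2 sensitivity matrix (a real EIT Jacobian would be far larger and ill-conditioned):

```python
# Damped least-squares (Tikhonov) reconstruction step, solved with plain
# Gaussian elimination on a tiny illustrative system (not a real EIT Jacobian).

def solve(A, y):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def damped_least_squares(J, b, lam):
    """x = (J^T J + lam*I)^-1 J^T b : larger lam damps (shrinks) the update."""
    m, n = len(J), len(J[0])
    JtJ = [[sum(J[k][i] * J[k][j] for k in range(m)) + (lam if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Jtb = [sum(J[k][i] * b[k] for k in range(m)) for i in range(n)]
    return solve(JtJ, Jtb)

# Overdetermined toy example: consistent measurements b = J @ [2, 3].
J = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [2.0, 3.0, 5.0]
x = damped_least_squares(J, b, lam=1e-6)
```

    Increasing the damping parameter trades data fit for stability, shrinking the reconstructed conductivity update toward zero.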

  17. A Refined Self-Tuning Filter-Based Instantaneous Power Theory Algorithm for Indirect Current Controlled Three-Level Inverter-Based Shunt Active Power Filters under Non-sinusoidal Source Voltage Conditions

    Directory of Open Access Journals (Sweden)

    Yap Hoon

    2017-02-01

    Full Text Available In this paper, a refined reference current generation algorithm based on instantaneous power (pq) theory is proposed for operation of an indirect current controlled (ICC) three-level neutral-point diode clamped (NPC) inverter-based shunt active power filter (SAPF) under non-sinusoidal source voltage conditions. The SAPF is recognized as one of the most effective solutions to current harmonics due to its flexibility in dealing with various power system conditions. As for its controller, pq theory has been widely applied to generate the desired reference current due to its simple implementation features. However, the conventional dependency on a self-tuning filter (STF) in generating the reference current has significantly limited the mitigation performance of the SAPF. Besides, the conventional STF-based pq theory algorithm still possesses needless features which increase computational complexity. Furthermore, the conventional algorithm is mostly designed to suit operation of direct current controlled (DCC) SAPFs, which are incapable of handling switching ripple problems, thereby leading to inefficient mitigation performance. Therefore, three main improvements are performed: replacement of the STF with a mathematical fundamental real power identifier, removal of redundant features, and generation of a sinusoidal reference current. To validate the effectiveness and feasibility of the proposed algorithm, simulation work in MATLAB-Simulink and laboratory tests utilizing a TMS320F28335 digital signal processor (DSP) are performed. Both simulation and experimental findings demonstrate the superiority of the proposed algorithm over the conventional algorithm.
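
    The pq-theory quantities at the heart of such algorithms can be sketched as follows, using the power-invariant Clarke transform; the fundamental-power identifier, filtering, and the inverter control loop of the paper are omitted, so this is only the underlying arithmetic.

```python
import math

SQ23, SQ12 = math.sqrt(2.0 / 3.0), math.sqrt(0.5)

def clarke(a, b, c):
    """Power-invariant Clarke (abc -> alpha-beta) transform; the zero-sequence
    component is ignored, as is usual for three-wire systems."""
    alpha = SQ23 * (a - 0.5 * b - 0.5 * c)
    beta = SQ12 * (b - c)
    return alpha, beta

def pq(va, vb, vc, ia, ib, ic):
    """Instantaneous real power p and imaginary power q of pq theory."""
    v_alpha, v_beta = clarke(va, vb, vc)
    i_alpha, i_beta = clarke(ia, ib, ic)
    p = v_alpha * i_alpha + v_beta * i_beta   # instantaneous real power
    q = v_beta * i_alpha - v_alpha * i_beta   # instantaneous imaginary power
    return p, q
```

    For a balanced sinusoidal set at unity power factor, p is constant (3/2 for unit amplitudes with this scaling) and q is zero; a purely reactive current gives the reverse, which is what lets the controller isolate the component to compensate.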

  18. TITRATION: A Randomized Study to Assess 2 Treatment Algorithms with New Insulin Glargine 300 units/mL.

    Science.gov (United States)

    Yale, Jean-François; Berard, Lori; Groleau, Mélanie; Javadi, Pasha; Stewart, John; Harris, Stewart B

    2017-10-01

    It was uncertain whether an algorithm that involves increasing insulin dosages by 1 unit/day might cause more hypoglycemia with the longer-acting insulin glargine 300 units/mL (GLA-300). The objective of this study was to compare the safety and efficacy of 2 titration algorithms, INSIGHT and EDITION, for GLA-300 in people with uncontrolled type 2 diabetes mellitus, mainly in a primary care setting. This was a 12-week, open-label, randomized, multicentre pilot study. Participants were randomly assigned to 1 of 2 algorithms: they either increased their dosage by 1 unit/day (INSIGHT, n=108), or the dose was adjusted by the investigator at least once weekly, but no more often than every 3 days (EDITION, n=104). The target fasting self-monitored blood glucose was in the range of 4.4 to 5.6 mmol/L. The percentages of participants reaching the primary endpoint of fasting self-monitored blood glucose ≤5.6 mmol/L without nocturnal hypoglycemia were 19.4% (INSIGHT) and 18.3% (EDITION). At week 12, 26.9% (INSIGHT) and 28.8% (EDITION) of participants achieved a glycated hemoglobin value of ≤7%. No differences in the incidence of hypoglycemia of any category were noted between algorithms. Participants in both arms of the study were much more satisfied with their new treatment, as assessed by the Diabetes Treatment Satisfaction Questionnaire. Most health-care professionals (86%) preferred the INSIGHT over the EDITION algorithm. The frequency of adverse events was similar between algorithms. A patient-driven titration algorithm of 1 unit/day with GLA-300 is effective, comparable to the previously tested EDITION algorithm, and preferred by health-care professionals. Copyright © 2017 Diabetes Canada. Published by Elsevier Inc. All rights reserved.
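
    The two dose-adjustment schedules can be contrasted with a deterministic toy simulation. The linear dose-response below is invented purely for illustration; it is not a clinical model and says nothing about the trial's actual outcomes.

```python
# Toy comparison of the two titration schedules on a made-up, deterministic
# dose-response: fasting glucose falls 0.05 mmol/L per added unit of insulin.
# Purely illustrative -- not a clinical model.

def fasting_glucose(dose, baseline=9.0, slope=0.05):
    return max(3.0, baseline - slope * dose)

def titrate(days, step, interval, target=5.6, start_dose=10):
    """Raise the dose by `step` units every `interval` days while the fasting
    glucose is still above target."""
    dose = start_dose
    for day in range(1, days + 1):
        if day % interval == 0 and fasting_glucose(dose) > target:
            dose += step
    return dose, fasting_glucose(dose)

insight = titrate(days=84, step=1, interval=1)    # patient-driven: +1 U/day
edition = titrate(days=84, step=4, interval=7)    # investigator-driven: weekly
```

    Under these invented numbers the daily +1 unit schedule reaches the glucose target within the 12 weeks, while the coarser weekly schedule is still short of it; the point is only to show the two stepping policies side by side.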

  19. Algorithms for optimizing drug therapy

    Directory of Open Access Journals (Sweden)

    Martin Lene

    2004-07-01

    Full Text Available Abstract Background Drug therapy has become increasingly efficient, with more drugs available for treatment of an ever-growing number of conditions. Yet drug use is reported to be suboptimal in several respects, such as dosage, patient adherence and outcome of therapy. The aim of the current study was to investigate the possibility of optimizing drug therapy using computer programs available on the Internet. Methods One hundred and ten officially endorsed text documents, published between 1996 and 2004 and containing guidelines for drug therapy in 246 disorders, were analyzed with regard to information about patient-, disease- and drug-related factors and the relationships between these factors. This information was used to construct algorithms for identifying optimum treatment in each of the studied disorders. These algorithms were categorized in order to define as few models as possible that could still accommodate the identified factors and the relationships between them. The resulting program prototypes were implemented in HTML (user interface) and JavaScript (program logic). Results Three types of algorithms were sufficient for the intended purpose. The simplest type is a list of factors, each of which implies that the particular patient should or should not receive treatment. This is adequate in situations where only one treatment exists. The second type, a more elaborate model, is required when treatment can be provided using drugs from different pharmacological classes and the selection of drug class depends on patient characteristics. An easily implemented set of if-then statements was able to manage the identified information in such instances. The third type was needed in the few situations where the selection and dosage of drugs depended on the degree to which one or more patient-specific factors were present. In these cases the implementation of an established decision model based on fuzzy sets was required. Computer programs
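
    The second algorithm type, a set of if-then statements selecting a drug class from patient characteristics, can be sketched as follows. The disorder, drug classes and rules are hypothetical, invented for illustration only; they are not taken from the 110 guideline documents and are not medical advice.

```python
# Sketch of the "if-then" algorithm type: select a drug class from patient
# characteristics.  All rules below are HYPOTHETICAL illustrations.

def select_antihypertensive(patient):
    """Return a drug class for a hypothetical hypertension guideline.
    Rules are evaluated top-down; the first match wins."""
    if patient.get("pregnant"):
        return "methyldopa"
    if patient.get("asthma"):                  # hypothetical: avoid beta-blockers
        return "calcium channel blocker"
    if patient.get("diabetes") or patient.get("ckd"):
        return "ACE inhibitor"
    if patient.get("age", 0) >= 55:
        return "calcium channel blocker"
    return "ACE inhibitor"                     # hypothetical default
```

    Ordering the rules encodes their priority, which is exactly what makes this model easy to derive mechanically from a guideline text.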

  20. A quality and efficiency analysis of the IMFASTTM segmentation algorithm in head and neck 'step and shoot' IMRT treatments

    International Nuclear Information System (INIS)

    Potter, Larry D.; Chang, Sha X.; Cullip, Timothy J.; Siochi, Alfredo C.

    2002-01-01

    The performance of segmentation algorithms used in IMFAST for 'step and shoot' IMRT treatment delivery is evaluated for three head and neck clinical treatments of different optimization objectives. The segmentation uses the intensity maps generated by the in-house TPS PLANUNC using the index-dose minimization algorithm. The dose optimization objectives include PTV dose uniformity and dose volume histogram-specified critical structure sparing. The optimized continuous intensity maps were truncated into five and ten intensity levels and exported to IMFAST for MLC segment optimization. The MLC segments were imported back to PLUNC for dose optimization quality calculation. The five basic segmentation algorithms included in IMFAST were evaluated alone and in combination with either tongue-and-groove/match-line correction or fluence correction or both. Two criteria were used in the evaluation: treatment efficiency, represented by the total number of MLC segments, and optimization quality, represented by a clinically relevant optimization quality factor. We found that the treatment efficiency depends first on the number of intensity levels used in the intensity map and second on the segmentation technique used. The standard optimal segmentation with fluence correction is a consistently good performer for all treatment plans studied. All segmentation techniques evaluated produced treatments with similar dose optimization quality values, especially when ten-level intensity maps are used.
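
    The first step above, truncating a continuous intensity map into a fixed number of discrete levels, can be sketched as below. This is an illustration of the general idea, not the IMFAST implementation; the function and variable names are invented.

    ```python
    # Toy quantization of a continuous intensity map into integer levels.
    # Each beamlet intensity is mapped to the nearest of `levels` uniform
    # steps between zero and the map's peak value.

    def quantize_levels(intensity_map, levels):
        """Map each beamlet intensity to an integer level in 0..levels."""
        peak = max(max(row) for row in intensity_map)
        return [[round(v / peak * levels) for v in row] for row in intensity_map]

    raw = [[0.0, 0.33, 0.8], [1.0, 0.55, 0.12]]
    print(quantize_levels(raw, 5))   # [[0, 2, 4], [5, 3, 1]]
    ```

    Fewer levels mean fewer deliverable MLC segments but a coarser approximation of the optimized fluence, which is the efficiency/quality trade-off the study measures.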

  1. A novel current mode controller for a static compensator utilizing Goertzel algorithm to mitigate voltage sags

    International Nuclear Information System (INIS)

    Najafi, E.; Yatim, A.H.M.

    2011-01-01

    Research highlights: → We proposed a new current control method for STATCOM. → The current control method maintains a fixed switching frequency. → It also produces fewer harmonics compared to the conventional hysteresis method. → A new voltage dip (sag) detection method was used in the STATCOM. → The control method can mitigate voltage sag in each phase separately. -- Abstract: The static compensator (STATCOM) has been widely proposed for power quality and network stability improvement. It is easily connected in parallel to the electric network and has many advantages for electrical grids. It can improve network stability, power factor and power transfer rating, and can mitigate disturbances such as sags and swells. Most STATCOM controllers are voltage controllers based on the balanced d-q transform. However, these are not complete solutions for network disturbances, since most disturbances in electrical networks are single-phase and cannot be handled by the conventional controllers. Voltage mode controllers are also not capable of responding fast enough to the changes expected of a network system. This paper proposes a new current mode controller to overcome these problems. The approach uses a fixed-frequency current controller to maintain voltage levels during voltage sags (dips). The approach is also simple and can easily be implemented digitally. It has superior performance over conventional methods in terms of harmonic reduction in the STATCOM output current. Another important factor in STATCOM effectiveness for sag mitigation is its sag detection method. This paper also introduces a new sag detection method based on the Goertzel algorithm, which is both effective and simple for practical applications. The simulation results presented illustrate the superiority of the proposed controller and sag detection algorithm for use in the STATCOM.
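
    The Goertzel algorithm referenced above is a single-bin DFT: a second-order recursive filter that extracts the magnitude of one frequency component, which is what makes it attractive for cheap fundamental-voltage tracking during sag detection. A minimal sketch (parameter names are illustrative; the paper's exact formulation may differ):

    ```python
    import math

    # Standard Goertzel recursion: one real multiply-accumulate per sample,
    # then the bin magnitude is recovered from the last two filter states.

    def goertzel(samples, sample_rate, target_freq):
        """Return the DFT magnitude of `samples` at `target_freq` (Hz)."""
        n = len(samples)
        k = round(n * target_freq / sample_rate)   # nearest DFT bin
        w = 2.0 * math.pi * k / n
        coeff = 2.0 * math.cos(w)
        s_prev, s_prev2 = 0.0, 0.0
        for x in samples:
            s = x + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        power = s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
        return math.sqrt(power)

    fs, f = 1000, 50                                 # 50 Hz fundamental, 1 kHz sampling
    t = [i / fs for i in range(200)]                 # 200 samples = 10 cycles
    signal = [math.sin(2 * math.pi * f * ti) for ti in t]
    print(goertzel(signal, fs, f))                   # N/2 = 100 for a unit sine
    ```

    A sag would show up as a drop in this magnitude tracked per phase, which is why the method can act on each phase separately.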

  2. Approaches to drug therapy for COPD in Russia: a proposed therapeutic algorithm

    Directory of Open Access Journals (Sweden)

    Zykov KA

    2017-04-01

    Full Text Available Kirill A Zykov,1 Svetlana I Ovcharenko2 1Laboratory of Pulmonology, Moscow State University of Medicine and Dentistry named after A.I. Evdokimov, 2I.M. Sechenov First Moscow State Medical University, Moscow, Russia Abstract: Until recently, there have been few clinical algorithms for the management of patients with COPD. Current evidence-based clinical management guidelines can appear to be complex, and they lack clear step-by-step instructions. For these reasons, we chose to create a simple and practical clinical algorithm for the management of patients with COPD, which would be applicable to real-world clinical practice, which was based on clinical symptoms and spirometric parameters, and which would take into account the pathophysiological heterogeneity of COPD. This optimized algorithm has two main fields, one for nonspecialist treatment by primary care and general physicians and the other for treatment by specialized pulmonologists. Patients with COPD are treated with long-acting bronchodilators and short-acting drugs on a demand basis. If the forced expiratory volume in one second (FEV1) is ≥50% of predicted and symptoms are mild, treatment with a single long-acting muscarinic antagonist or long-acting beta-agonist is proposed. When FEV1 is <50% of predicted and/or the COPD assessment test score is ≥10, the use of combined bronchodilators is advised. If there is no response to treatment after three months, referral to a pulmonary specialist is recommended for pathophysiological endotyping: 1) eosinophilic endotype with peripheral blood or sputum eosinophilia >3%; 2) neutrophilic endotype with peripheral blood neutrophilia >60% or green sputum; or 3) pauci-granulocytic endotype. It is hoped that this simple, optimized, step-by-step algorithm will help to individualize the treatment of COPD in real-world clinical practice. This algorithm has yet to be evaluated prospectively or by comparison with other COPD management algorithms, including
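
    The first-line branching logic described above can be rendered as a short sketch. Thresholds follow the abstract (FEV1 ≥50% predicted, CAT score ≥10); this illustrates the algorithm's structure only and is not clinical advice.

    ```python
    # Hedged sketch of the nonspecialist (primary-care) field of the algorithm.
    # Function and parameter names are invented for illustration.

    def copd_first_line(fev1_pct_pred, cat_score, symptoms_mild):
        """Return the proposed first-line bronchodilator strategy."""
        if fev1_pct_pred >= 50 and symptoms_mild:
            return "single LAMA or LABA"
        if fev1_pct_pred < 50 or cat_score >= 10:
            return "combined bronchodilators"
        return "single LAMA or LABA"

    print(copd_first_line(60, 5, True))    # single LAMA or LABA
    print(copd_first_line(40, 12, False))  # combined bronchodilators
    ```

    In the full algorithm, non-response after three months would escalate to the specialist field for endotyping, which this sketch omits.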

  3. Composite Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2014-01-01

    Full Text Available Differential search algorithm (DS) is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement which is used by an organism to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search schemes, namely “DS/rand/1,” “DS/rand/2,” “DS/current to rand/1,” and “DS/current to rand/2,” to search the new space and enhance the convergence rate for the global optimization problem. In order to verify the performance of the different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed schemes perform better than, or at least comparably to, the original algorithm when considering the quality of the solution obtained. However, these schemes still cannot achieve the best solution for all functions. In order to further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS) is proposed in this paper. This new algorithm combines three of the proposed search schemes, “DS/rand/1,” “DS/rand/2,” and “DS/current to rand/1,” with three control parameters, using a random method to generate the offspring. Experimental results show that CDS has a faster convergence rate and better search ability on the 23 benchmark functions.
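
    The "/rand/1" naming mirrors the familiar differential-evolution mutation notation: perturb one randomly chosen individual by the scaled difference of two others. A hedged sketch in that style (the paper's DS-specific Brownian-walk scale factor is not reproduced here):

    ```python
    import random

    # DE-style illustration of a "rand/1" trial-vector scheme: pick three
    # distinct individuals r1, r2, r3 and form r1 + F * (r2 - r3).

    def rand_1(population, scale):
        """Generate one trial vector from three distinct random individuals."""
        r1, r2, r3 = random.sample(population, 3)
        return [a + scale * (b - c) for a, b, c in zip(r1, r2, r3)]

    random.seed(1)
    pop = [[0.0, 0.0], [1.0, 2.0], [3.0, 1.0], [2.0, 4.0]]
    trial = rand_1(pop, scale=0.5)
    print(trial)
    ```

    The composite CDS variant would draw among several such schemes at random for each offspring, trading exploitation of one scheme against the diversity of the ensemble.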

  4. A new algorithm for hip fracture surgery

    DEFF Research Database (Denmark)

    Palm, Henrik; Krasheninnikoff, Michael; Holck, Kim

    2012-01-01

    Background and purpose Treatment of hip fracture patients is controversial. We implemented a new operative and supervision algorithm (the Hvidovre algorithm) for surgical treatment of all hip fractures, primarily based on our own previously published results. Methods 2,000 consecutive patients over 50...... years of age who were admitted and operated on because of a hip fracture were prospectively included. 1,000 of these patients were included after implementation of the algorithm. Demographic parameters, hospital treatment, and reoperations within the first postoperative year were assessed from patient...... by reoperations was reduced from 24% of total hospitalization before the algorithm was introduced to 18% after it was introduced. Interpretation It is possible to implement an algorithm for treatment of all hip fracture patients in a large teaching hospital. In our case, the Hvidovre algorithm both raised...

  5. SU-G-201-09: Evaluation of a Novel Machine-Learning Algorithm for Permanent Prostate Brachytherapy Treatment Planning

    International Nuclear Information System (INIS)

    Nicolae, A; Lu, L; Morton, G; Chung, H; Helou, J; Al Hanaqta, M; Loblaw, A; Ravi, A; Heath, E

    2016-01-01

    Purpose: A novel, automated algorithm for permanent prostate brachytherapy (PPB) treatment planning has been developed. The novel approach uses machine-learning (ML), a form of artificial intelligence, to substantially decrease planning time while simultaneously retaining the clinical intuition of plans created by radiation oncologists. This study seeks to compare the ML algorithm against expert-planned PPB plans to evaluate the equivalency of dosimetric and clinical plan quality. Methods: Plan features were computed from historical high-quality PPB treatments (N = 100) and stored in a relational database (RDB). The ML algorithm matched new PPB features to a highly similar case in the RDB; this initial plan configuration was then further optimized using a stochastic search algorithm. PPB pre-plans (N = 30) generated using the ML algorithm were compared to plan variants created by an expert dosimetrist (RT) and a radiation oncologist (MD). Planning time and pre-plan dosimetry were evaluated using a one-way Student’s t-test and ANOVA, respectively (significance level = 0.05). Clinical implant quality was evaluated by expert PPB radiation oncologists as part of a qualitative study. Results: Average planning time was 0.44 ± 0.42 min compared to 17.88 ± 8.76 min for the ML algorithm and RT, respectively, a significant advantage [t(9), p = 0.01]. A post-hoc ANOVA [F(2,87) = 6.59, p = 0.002] using Tukey-Kramer criteria showed a significantly lower mean prostate V150% for the ML plans (52.9%) compared to the RT (57.3%) and MD (56.2%) plans. Preliminary qualitative study results indicate comparable clinical implant quality between RT and ML plans with a trend towards preference for ML plans. Conclusion: PPB pre-treatment plans highly comparable to those of an expert radiation oncologist can be created using a novel ML planning model. The use of an ML-based planning approach is expected to translate into improved PPB accessibility and plan uniformity.

  6. SU-G-201-09: Evaluation of a Novel Machine-Learning Algorithm for Permanent Prostate Brachytherapy Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Nicolae, A [Odette Cancer Centre, Sunnybrook Health Sciences Centre, Toronto, ON (Canada); Department of Physics, Ryerson University, Toronto, ON (Canada); Lu, L; Morton, G; Chung, H; Helou, J; Al Hanaqta, M; Loblaw, A; Ravi, A [Odette Cancer Centre, Sunnybrook Health Sciences Centre, Toronto, ON (Canada); Heath, E [Carleton Laboratory for Radiotherapy Physics, Carleton University, Ottawa, ON, CA (Canada)

    2016-06-15

    Purpose: A novel, automated algorithm for permanent prostate brachytherapy (PPB) treatment planning has been developed. The novel approach uses machine-learning (ML), a form of artificial intelligence, to substantially decrease planning time while simultaneously retaining the clinical intuition of plans created by radiation oncologists. This study seeks to compare the ML algorithm against expert-planned PPB plans to evaluate the equivalency of dosimetric and clinical plan quality. Methods: Plan features were computed from historical high-quality PPB treatments (N = 100) and stored in a relational database (RDB). The ML algorithm matched new PPB features to a highly similar case in the RDB; this initial plan configuration was then further optimized using a stochastic search algorithm. PPB pre-plans (N = 30) generated using the ML algorithm were compared to plan variants created by an expert dosimetrist (RT) and a radiation oncologist (MD). Planning time and pre-plan dosimetry were evaluated using a one-way Student’s t-test and ANOVA, respectively (significance level = 0.05). Clinical implant quality was evaluated by expert PPB radiation oncologists as part of a qualitative study. Results: Average planning time was 0.44 ± 0.42 min compared to 17.88 ± 8.76 min for the ML algorithm and RT, respectively, a significant advantage [t(9), p = 0.01]. A post-hoc ANOVA [F(2,87) = 6.59, p = 0.002] using Tukey-Kramer criteria showed a significantly lower mean prostate V150% for the ML plans (52.9%) compared to the RT (57.3%) and MD (56.2%) plans. Preliminary qualitative study results indicate comparable clinical implant quality between RT and ML plans with a trend towards preference for ML plans. Conclusion: PPB pre-treatment plans highly comparable to those of an expert radiation oncologist can be created using a novel ML planning model. The use of an ML-based planning approach is expected to translate into improved PPB accessibility and plan uniformity.
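
    The retrieval step described in both records above, matching a new case's plan features to the most similar prior case in a database, can be sketched as a nearest-neighbour lookup. The feature names and the Euclidean metric are assumptions for illustration; the paper's RDB schema and similarity measure are not public.

    ```python
    import math

    # Toy case-retrieval step: return the stored plan whose feature vector
    # is closest (Euclidean distance) to the new case's features.

    def most_similar_plan(new_features, plan_db):
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(plan_db, key=lambda rec: dist(rec["features"], new_features))

    db = [
        {"id": "case-012", "features": [38.0, 1.1, 0.62]},   # e.g. volume, aspect, density
        {"id": "case-047", "features": [52.5, 0.9, 0.55]},
        {"id": "case-088", "features": [41.2, 1.0, 0.60]},
    ]
    print(most_similar_plan([40.0, 1.05, 0.61], db)["id"])   # case-088
    ```

    The matched plan then only seeds the stochastic search; it is the subsequent optimization that adapts the retrieved configuration to the new anatomy.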

  7. Conventional treatment planning optimization using simulated annealing

    International Nuclear Information System (INIS)

    Morrill, S.M.; Langer, M.; Lane, R.G.

    1995-01-01

    Purpose: Simulated annealing (SA) allows for the implementation of realistic biological and clinical cost functions into treatment plan optimization. However, a drawback to the clinical implementation of SA optimization is that large numbers of beams appear in the final solution, some with insignificant weights, preventing the delivery of these optimized plans using conventional (limited to a few coplanar beams) radiation therapy. A preliminary study suggested two promising algorithms for restricting the number of beam weights. The purpose of this investigation was to compare these two algorithms using our current SA algorithm, with the aim of producing an algorithm to allow clinically useful radiation therapy treatment planning optimization. Method: Our current SA algorithm, Variable Stepsize Generalized Simulated Annealing (VSGSA), was modified with two algorithms to restrict the number of beam weights in the final solution. The first algorithm selected combinations of a fixed number of beams from the complete solution space at each iterative step of the optimization process. The second reduced the allowed number of beams by a factor of two at periodic steps during the optimization process until only the specified number of beams remained. Results of optimization of beam weights and angles using these algorithms were compared using a standard cadre of abdominal cases. The solution space was defined as a set of 36 custom-shaped open and wedge-filtered fields at 10 deg. increments with a constant target volume margin of 1.2 cm. For each case a clinically accepted cost function, the minimum tumor dose, was maximized subject to a set of normal tissue binary dose-volume constraints. For this study, the optimized plan was restricted to four (4) fields suitable for delivery with conventional therapy equipment.
Results: The table gives the mean value of the minimum target dose obtained for each algorithm averaged over 5 different runs and the comparable manual treatment
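
    The second restriction scheme, periodically halving the allowed beam set during annealing, can be sketched with a toy objective. The quadratic cost below stands in for the clinical minimum-tumor-dose objective, which is not reproduced here; all names and parameters are illustrative.

    ```python
    import math, random

    # Toy SA over beam weights: every 1000 steps, drop the weakest half of the
    # still-active beams until only `target_beams` remain (second scheme above).

    def anneal(n_beams, target_beams, steps=4000, seed=0):
        rng = random.Random(seed)
        active = list(range(n_beams))
        weights = [1.0] * n_beams
        ideal = [math.sin(i + 1) ** 2 for i in range(n_beams)]   # stand-in objective
        cost = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, ideal))
        current, temp = cost(weights), 1.0
        for step in range(1, steps + 1):
            if step % 1000 == 0 and len(active) > target_beams:
                keep = max(target_beams, len(active) // 2)
                active.sort(key=lambda i: weights[i], reverse=True)
                for i in active[keep:]:
                    weights[i] = 0.0                              # retire weak beams
                active = active[:keep]
                current = cost(weights)
            i = rng.choice(active)
            old = weights[i]
            weights[i] = max(0.0, old + rng.gauss(0, 0.1))
            new = cost(weights)
            if new < current or rng.random() < math.exp((current - new) / temp):
                current = new                                     # accept the move
            else:
                weights[i] = old                                  # reject and revert
            temp *= 0.999
        return weights, current

    weights, final_cost = anneal(n_beams=8, target_beams=4)
    print(sum(1 for w in weights if w > 0))   # at most 4 active beams remain
    ```

    The first scheme would instead restart the search over fixed-size beam combinations; both force the final solution to be deliverable on conventional equipment.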

  8. Evaluation of an electron Monte Carlo dose calculation algorithm for treatment planning.

    Science.gov (United States)

    Chamberland, Eve; Beaulieu, Luc; Lachance, Bernard

    2015-05-08

    The purpose of this study is to evaluate the accuracy of the electron Monte Carlo (eMC) dose calculation algorithm included in a commercial treatment planning system and compare its performance against an electron pencil beam algorithm. Several tests were performed to explore the system's behavior in simple geometries and in configurations encountered in clinical practice. The first series of tests were executed in a homogeneous water phantom, where experimental measurements and eMC-calculated dose distributions were compared for various combinations of energy and applicator. More specifically, we compared beam profiles and depth-dose curves at different source-to-surface distances (SSDs) and gantry angles, by using dose difference and distance to agreement. Also, we compared output factors, we studied the effects of algorithm input parameters, which are the random number generator seed as well as the calculation grid size, and we performed a calculation time evaluation. Three different inhomogeneous solid phantoms were built, using high- and low-density material inserts, to simulate clinically relevant heterogeneity conditions: a small air cylinder within a homogeneous phantom, a lung phantom, and a chest wall phantom. We also used an anthropomorphic phantom to perform comparison of eMC calculations to measurements. Finally, we proceeded with an evaluation of the eMC algorithm on a clinical case of nose cancer. In all mentioned cases, measurements, carried out by means of XV-2 films, radiographic films or EBT2 Gafchromic films, were used to compare eMC calculations with dose distributions obtained from an electron pencil beam algorithm. eMC calculations in the water phantom were accurate. Discrepancies for depth-dose curves and beam profiles were under 2.5% and 2 mm. Dose calculations with eMC for the small air cylinder and the lung phantom agreed within 2% and 4%, respectively. eMC calculations for the chest wall phantom and the anthropomorphic phantom also
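
    The dose-difference / distance-to-agreement comparison mentioned above can be sketched in 1D: a calculated point passes if some measured sample within the DTA tolerance agrees with it within the dose tolerance. This composite criterion is a simplification for illustration; the study's exact evaluation protocol is not reproduced.

    ```python
    # Toy 1D dose-difference / DTA check against a list of measured samples.
    # dd is a fractional dose tolerance (2.5%), dta a distance tolerance in mm.

    def point_passes(calc_pos, calc_dose, measured, dd=0.025, dta=2.0):
        """True if a measured sample within `dta` mm agrees within `dd`."""
        return any(
            abs(calc_pos - m_pos) <= dta and abs(calc_dose - m_dose) <= dd * m_dose
            for m_pos, m_dose in measured
        )

    # measured depth-dose samples: (depth_mm, dose_Gy)
    measured = [(0.0, 0.80), (1.0, 0.95), (2.0, 1.00), (3.0, 0.97), (4.0, 0.90)]
    print(point_passes(2.0, 1.01, measured))   # True: 1% off at the same depth
    print(point_passes(2.0, 1.20, measured))   # False: off by >2.5% everywhere nearby
    ```

    Applying this test over a whole profile gives a pass rate, which is how "under 2.5% and 2 mm" agreement statements are typically quantified.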

  9. [Therapy of fatigue in multiple sclerosis : A treatment algorithm].

    Science.gov (United States)

    Veauthier, C; Paul, F

    2016-12-01

    Fatigue is one of the most frequent symptoms of multiple sclerosis (MS) and one of the main reasons for underemployment and early retirement. The mechanisms of MS-related fatigue are unknown, but comorbid disorders play a major role. Anemia, diabetes, side effects of medication and depression should be ruled out. Moreover, excessive daytime sleepiness (EDS) should be differentiated from fatigue. No approved drug therapy for MS fatigue is currently available. Presentation of current treatment strategies with a particular focus on secondary fatigue due to sleep disorders. A review of the literature was carried out. All MS patients suffering from fatigue should be questioned with respect to EDS and, if necessary, sleep medicine investigations should be carried out; however, pure fatigue without accompanying EDS can also be caused by a sleep disorder. Medications, particularly freely available antihistamines, can also increase fatigue. Furthermore, anemia, iron deficiency, diabetes and hypothyroidism should be excluded. Self-assessment questionnaires show an overlap between depression and fatigue. Several studies have shown that cognitive behavioral therapy and various psychotherapeutic measures, such as vertigo training, progressive exercise training and individualized physiotherapy, as well as fatigue management interventions, can lead to a significant improvement of MS-related fatigue. There is currently no medication suitable for the treatment of fatigue, with the exception of fampridine for the treatment of motor functions and motor fatigue.

  10. Current approaches in atrial fibrillation treatment

    Directory of Open Access Journals (Sweden)

    Cenk Sarı

    2014-09-01

    Full Text Available Atrial fibrillation (AF) is the most common sustained arrhythmia encountered in clinical practice. Its incidence increases with age. AF is classified into subtypes according to its duration and/or the ability to restore sinus rhythm. Initially, patients should be evaluated for rhythm or rate control to choose the appropriate treatment. The second stage of the strategy is to evaluate the feasibility of anticoagulation therapy. Recently, owing to progress in rhythm control and anticoagulation therapy, both the American and the European guidelines have been revised. These developments are reflected in the newly published guidelines. In this article, the current changes in the management of AF are discussed.

  11. Stator current harmonics evolution by neural network method based on CFE/SS algorithm for ACEC generator of Rey Power Plant

    International Nuclear Information System (INIS)

    Soleymani, S.; Ranjbar, A.M.; Mirabedini, H.

    2001-01-01

    One method for on-line fault diagnosis in synchronous generators is stator current harmonic analysis. An artificial neural network is therefore used in this paper to evaluate stator current harmonics at different loads. The training set for the artificial neural network is prepared by generator modeling using the finite element method and a state space model. Many points from the generator capability curve are used to complete this set. The artificial neural network used in this paper is a perceptron network with a single hidden layer, eight hidden neurons and the back-propagation algorithm. Results indicate that the trained artificial neural network can identify stator current harmonics for an arbitrary load from the capability curve. The error is less than 10% in comparison with values obtained directly from the CFE-SS algorithm. The rating parameters of the modeled generator are 43,950 kVA, 11 kV, 3000 rpm, 50 Hz and a power factor of 0.8.
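
    The network family described above, a single-hidden-layer perceptron trained by back-propagation, can be sketched as below. The architecture matches the abstract (one hidden layer, eight hidden neurons); the training data is a toy logical-OR mapping standing in for the load-point-to-harmonic mapping, and all hyperparameters are illustrative.

    ```python
    import math, random

    # Minimal online back-propagation for a 1-hidden-layer sigmoid perceptron.
    # Weight rows carry a trailing bias weight; inputs get a fixed 1.0 appended.

    def train(samples, n_hidden=8, epochs=2000, lr=0.5, seed=0):
        rng = random.Random(seed)
        n_in = len(samples[0][0])
        sig = lambda z: 1.0 / (1.0 + math.exp(-z))
        w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hidden)]
        w2 = [rng.uniform(-1, 1) for _ in range(n_hidden + 1)]
        for _ in range(epochs):
            for x, y in samples:
                xb = x + [1.0]
                h = [sig(sum(w * v for w, v in zip(row, xb))) for row in w1]
                hb = h + [1.0]
                out = sig(sum(w * v for w, v in zip(w2, hb)))
                d_out = (out - y) * out * (1.0 - out)      # squared-error gradient
                for j in range(n_hidden):                  # hidden-layer update
                    d_h = d_out * w2[j] * h[j] * (1.0 - h[j])
                    for i in range(n_in + 1):
                        w1[j][i] -= lr * d_h * xb[i]
                for j in range(n_hidden + 1):              # output-layer update
                    w2[j] -= lr * d_out * hb[j]
        def predict(x):
            h = [sig(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w1]
            return sig(sum(w * v for w, v in zip(w2, h + [1.0])))
        return predict

    data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.0, 1.0], 1.0)]
    net = train(data)
    print([round(net(x)) for x, _ in data])   # [0, 1, 1, 1]
    ```

    In the paper's setting the inputs would be operating points sampled from the capability curve and the outputs the harmonic amplitudes computed by the CFE-SS model.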

  12. Current treatment options for meningioma.

    Science.gov (United States)

    Apra, Caroline; Peyre, Matthieu; Kalamarides, Michel

    2018-03-01

    With an annual incidence of 5/100,000, meningioma is the most frequent primary tumor of the central nervous system. Risk factors include radiotherapy and hormone intake. Most meningiomas are grade I benign tumors, but up to 15% are atypical and 2% anaplastic according to the WHO 2016 histological criteria. Areas covered: This review details the current standard therapy based on international guidelines and recent literature, and describes new approaches developed to treat refractory cases. First-line treatments are observation and surgery, but adjuvant radiotherapy/radiosurgery is discussed for atypical meningiomas and indicated for anaplastic meningiomas. The most problematic cases include skull base meningiomas that enclose vasculo-nervous structures and surgery- and radiation-refractory tumors that present with significant morbidity and mortality. The treatment of recurrent tumors is based on radiotherapy and repeated surgery. Systemic therapies are not effective in general, but several clinical trials are ongoing. Expert commentary: Molecular characterization of the tumors, based on genetic mutations such as NF2, SMO, TERT and TRAF7 and on the methylation profile, is developing, completing the histological classification and giving new insights into prognosis and treatment options.

  13. Current algorithms for computed electron beam dose planning

    International Nuclear Information System (INIS)

    Brahme, A.

    1985-01-01

    Two- and sometimes three-dimensional computer algorithms for electron beam irradiation are capable of taking all irregularities of the body cross-section and the properties of the various tissues into account. This is achieved by dividing the incoming broad beams into a number of narrow pencil beams, the penetration of which can be described by essentially one-dimensional formalisms. The constituent pencil beams are most often described by Gaussian, experimentally or theoretically derived distributions. The accuracy of different dose planning algorithms is discussed in some detail based on their ability to take the different physical interaction processes of high energy electrons into account. It is shown that those programs that take the deviations from the simple Gaussian model into account give the best agreement with experimental results. With such programs a dosimetric relative accuracy of about 5% is generally achieved except in the most complex inhomogeneity configurations. Finally, the present limitations and possible future developments of electron dose planning are discussed. (orig.)
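
    The pencil-beam decomposition described above can be illustrated in one lateral dimension: a broad beam's profile at a given depth is the superposition of narrow Gaussian pencils. The width parameter and spacing below are assumptions for illustration; real kernels vary with depth, energy and multiple scattering.

    ```python
    import math

    # Toy broad-beam profile as a sum of unit-weight Gaussian pencil beams.

    def broad_beam_profile(x, pencil_positions, sigma=2.0):
        """Dose at lateral position x (mm) from Gaussian pencils at the given positions."""
        norm = 1.0 / (sigma * math.sqrt(2 * math.pi))
        return sum(norm * math.exp(-0.5 * ((x - xp) / sigma) ** 2)
                   for xp in pencil_positions)

    pencils = [p * 0.5 for p in range(-40, 41)]       # 0.5 mm spacing, 40 mm field
    centre = broad_beam_profile(0.0, pencils)
    edge = broad_beam_profile(20.0, pencils)          # field edge: roughly half the central dose
    print(edge / centre)
    ```

    Deviations from this simple Gaussian model, which the better-performing programs account for, show up precisely in such penumbra regions and near inhomogeneities.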

  14. Balance control of grid currents for UPQC under unbalanced loads based on matching-ratio compensation algorithm

    DEFF Research Database (Denmark)

    Zhao, Xiaojun; Zhang, Chunjiang; Chai, Xiuhui

    2018-01-01

    In three-phase four-wire systems, unbalanced loads can cause grid currents to be unbalanced, and this may cause the neutral point potential on the grid side to shift. The neutral point potential shift will worsen the control precision as well as the performance of the three-phase four-wire unified power quality conditioner (UPQC), and it also leads to unbalanced three-phase output voltage, even causing damage to electric equipment. To deal with unbalanced loads, this paper proposes a matching-ratio compensation algorithm (MCA) for the fundamental active component of load currents ... fluctuations, and elaborates the interaction between unbalanced grid currents and DC bus voltage fluctuations; two control strategies of UPQC under the three-phase stationary coordinate frame based on the MCA are given, and finally, the feasibility and effectiveness of the proposed control strategy are verified ...

  15. Challenges of implementing fibromyalgia treatment guidelines in current clinical practice.

    Science.gov (United States)

    Arnold, Lesley M; Clauw, Daniel J

    2017-09-01

    The current diagnostic and treatment pathway for patients with fibromyalgia (FM) is lengthy, complex, and characterized by multiple physician visits with an average 2-year wait until diagnosis. It is clear that effective identification and appropriate treatment of FM remain a challenge in current clinical practice. Ideally, FM management involves a multidisciplinary approach with the preferable patient pathway originating in primary care but supported by a range of health care providers, including referral to specialist care when necessary. After the publication of individual clinical studies, high-quality reviews, and meta-analyses, recently published FM treatment guidelines have transitioned from an expert consensus to an evidence-based approach. Evidence-based guidelines provide a framework for ensuring early diagnosis and timely adoption of appropriate treatment. However, for successful outcomes, FM treatments must adopt a more holistic approach, which addresses more than just pain. Impact on the associated symptoms of fatigue and cognitive problems, sleep and mood disturbances, and lowered functional status is also important in judging the success of FM therapy. Recently published guidelines recommend the adoption of a symptom-based approach to guide pharmacologic treatment. Emerging treatment options for FM may be best differentiated on the basis of their effect on comorbid symptoms that are often associated with pain (e.g. sleep disturbance, mood, fatigue). The current review discusses the most recently published Canadian guidelines and the implications of the recent European League Against Rheumatism (EULAR) recommendations, with a focus on the challenges of implementing these guidelines in current clinical practice.

  16. A botulinum toxin A treatment algorithm for de novo management of torticollis and laterocollis

    Science.gov (United States)

    Kupsch, Andreas; Müngersdorf, Martina; Paus, Sebastian; Stenner, Andrea; Jost, Wolfgang

    2011-01-01

    Objectives Few studies have investigated the injection patterns for botulinum toxin type A for the treatment of heterogeneous forms of cervical dystonia (CD). This large, prospective, open-label, multicentre study aimed to evaluate the effectiveness and safety of 500 U botulinum toxin A for the initial treatment according to a standardised algorithm of the two most frequent forms of CD, predominantly torticollis and laterocollis. Design Patients (aged ≥18 years) with CD not previously treated with botulinum neurotoxin therapy were given one treatment with 500 U Dysport, according to a defined intramuscular injection algorithm based on clinical assessment of direction of head deviation, occurrence of shoulder elevation, occurrence of tremor (all evaluated using the Tsui rating scale) and hypertrophy of the sternocleidomastoid muscle. Results In this study, 516 patients were enrolled, the majority of whom (95.0%) completed treatment. Most patients had torticollis (78.1%). At week 4, mean Tsui scores had significantly decreased by −4.01, −3.76 and −4.09 points in the total, torticollis and laterocollis populations, respectively. Symptom improvement was equally effective between groups. Tsui scores remained significantly below baseline at week 12 in both groups. Treatment was well tolerated; the most frequent adverse events were muscular weakness (13.8%), dysphagia (9.9%) and neck pain (6.6%). Conclusions Dysport 500 U is effective and well tolerated for the de novo management of a range of heterogeneous forms of CD, when using a standardised regimen that allows tailored dosing based on individual symptom assessment. Clinical trials information (NCT00447772; clinicaltrials.gov) PMID:22021883

  17. PREVALENCE OF MULTIPLE ADDICTIONS AND CURRENT TREATMENT BY DRUG TREATMENT CENTRES IN DURBAN, SOUTH AFRICA

    Directory of Open Access Journals (Sweden)

    Keen, Helen

    2015-09-01

    Full Text Available Substance-use disorders (SUD) cause severe problems both globally and locally. Research suggests that multiple addictions create a more complex illness. This study investigated whether in-patients admitted for SUD at three drug treatment centres in Durban, South Africa had other, undiagnosed addictions. It utilised a three-phase concurrent mixed-methods design and initially screened for gambling and sex addiction. Results showed that, of the sample of 123 participants, 54% had either a sex or a gambling addiction, and 24% had both; current treatment programmes neither assessed for nor treated these addictions. Recommendations include suggestions to update current assessment and treatment approaches and the need to train professional staff at drug treatment centres.

  18. The multi-objective optimization of the horizontal-axis marine current turbine based on NSGA-II algorithm

    International Nuclear Information System (INIS)

    Zhu, G J; Guo, P C; Luo, X Q; Feng, J J

    2012-01-01

    The present paper describes a hydrodynamic optimization technique for a horizontal-axis marine current turbine. The pitch angle distribution is important to marine current turbine performance. In this paper, the pitch angle distribution curve is parameterized by four control points using the Bezier curve method. The coordinates of the four control points are chosen as optimization variables, and the sample space is structured according to the Box-Behnken experimental design method (BBD). The power capture coefficient and axial thrust coefficient at the design tip-speed ratio are then obtained for all elements of the sample space by CFD numerical simulation. The power capture coefficient and axial thrust are chosen as objective functions, and quadratic polynomial regression equations are constructed to fit the relationship between the optimization variables and each objective function according to the response surface model. With the obtained quadratic polynomial regression equations as the performance prediction model, the marine current turbine is optimized using the NSGA-II multi-objective genetic algorithm, which finally yields an improved marine current turbine.
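
    The four-control-point Bezier parameterization of the pitch-angle distribution can be sketched as below. The control-point values are invented for illustration; in the study their coordinates are the design variables the optimizer manipulates.

    ```python
    # Cubic Bezier curve via the Bernstein basis: four scalar control points
    # define the pitch angle as a smooth function of normalized blade span.

    def bezier(t, p0, p1, p2, p3):
        """Cubic Bezier value at t in [0, 1]."""
        u = 1.0 - t
        return u**3 * p0 + 3 * u**2 * t * p1 + 3 * u * t**2 * p2 + t**3 * p3

    # hypothetical pitch angles (degrees) sampled along r/R
    ctrl = (25.0, 14.0, 6.0, 2.0)
    curve = [bezier(i / 10, *ctrl) for i in range(11)]
    print(curve[0], curve[-1])   # endpoints interpolate the first/last control points
    ```

    Parameterizing the whole distribution with four numbers keeps the design space small enough for the BBD sampling and response-surface fit while still producing smooth pitch profiles.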

  19. GTV-based prescription in SBRT for lung lesions using advanced dose calculation algorithms

    International Nuclear Information System (INIS)

    Lacornerie, Thomas; Lisbona, Albert; Mirabel, Xavier; Lartigau, Eric; Reynaert, Nick

    2014-01-01

    The aim of the current study was to investigate how dose is prescribed to lung lesions during SBRT when using advanced dose calculation algorithms that take into account electron transport (type B algorithms). As type A algorithms do not model secondary electron transport, they overestimate the dose to lung lesions. Type B algorithms are more accurate, but no consensus has yet been reached regarding dose prescription. The positive clinical results obtained using type A algorithms should be used as a starting point. In the current work a dose-calculation experiment was performed, comparing different prescription methods. Three cases with three different sizes of peripheral lung lesions were planned on three different treatment platforms. For each individual case, 60 Gy to the PTV was prescribed using a type A algorithm and the dose distribution was recalculated using a type B algorithm in order to evaluate the impact of secondary electron transport. Secondly, for each case a type B algorithm was used to prescribe 48 Gy to the PTV, and the resulting doses to the GTV were analyzed. Finally, prescriptions based on specific GTV dose volumes were evaluated. When using a type A algorithm to prescribe the same dose to the PTV, the differences in median GTV dose among platforms and cases were always less than 10% of the prescription dose. Prescription to the PTV based on type B algorithms led to greater variability of the median GTV dose among cases and among platforms (24% and 28%, respectively). However, when 54 Gy was prescribed as the median GTV dose using a type B algorithm, the variability observed was minimal. Normalizing the prescription dose to the median GTV dose for lung lesions avoids variability among different cases and treatment platforms of SBRT when type B algorithms are used to calculate the dose. The combination of using a type A algorithm to optimize a homogeneous dose in the PTV and using a type B algorithm to prescribe the
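
    The normalization the authors recommend (prescribing to the median GTV dose) amounts to a simple rescaling of the dose grid. The sketch below is illustrative: the function name and the dose values are invented, and only the 54 Gy default is taken from the text:

```python
import numpy as np

def renormalize_to_gtv_median(dose, gtv_mask, presc_median=54.0):
    """Scale a dose array so the median GTV dose equals the prescription.

    dose: dose array in Gy; gtv_mask: boolean array of GTV voxels;
    presc_median: prescribed median GTV dose in Gy.
    """
    median_gtv = float(np.median(dose[gtv_mask]))
    return dose * (presc_median / median_gtv)

# Toy grid: GTV voxels currently receive a median of 60 Gy
dose = np.array([58.0, 60.0, 62.0, 20.0, 10.0])
gtv = np.array([True, True, True, False, False])
rescaled = renormalize_to_gtv_median(dose, gtv, presc_median=54.0)
print(float(np.median(rescaled[gtv])))  # 54.0
```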

  20. Current treatment paradigms in rheumatoid arthritis.

    Science.gov (United States)

    Fries, J F

    2000-06-01

    Rheumatoid arthritis (RA) has traditionally been treated using the pyramid approach, in which non-steroidal anti-inflammatory drugs (NSAIDs) are the first-line treatment and disease-modifying anti-rheumatic drugs (DMARDs) are introduced relatively late in the disease. This approach is no longer valid. Previously regarded as a benign disease, RA is now recognized as causing substantial morbidity and mortality, as do the NSAIDs used in treatment. DMARDs are more effective in controlling the pain and disability of RA than NSAIDs, and are often no more toxic. The current treatment paradigm emphasizes early, consistent use of DMARDs. A 'sawtooth' strategy of DMARD use has been proposed, in which a rising but low level of disability triggers a change in therapy. Determining the most clinically useful DMARD combinations and the optimal sequence of DMARD use requires effectiveness studies, Bayesian approaches and analyses of long-term outcomes. Such approaches will allow optimization of multiple drug therapies in RA, and should substantially improve the long-term outcome for many patients.

  1. Knowledge-based radiation therapy (KBRT) treatment planning versus planning by experts: validation of a KBRT algorithm for prostate cancer treatment planning

    International Nuclear Information System (INIS)

    Nwankwo, Obioma; Mekdash, Hana; Sihono, Dwi Seno Kuncoro; Wenz, Frederik; Glatting, Gerhard

    2015-01-01

    A knowledge-based radiation therapy (KBRT) treatment planning algorithm was recently developed. The purpose of this work was to investigate how plans generated with the objective KBRT approach compare to those that rely on the judgment of an experienced planner. Thirty volumetric modulated arc therapy plans were randomly selected from a database of prostate plans generated by experienced planners (expert plans). The anatomical data (CT scan and delineation of organs) of these patients and the KBRT algorithm were given to a novice with no prior treatment planning experience. The inexperienced planner used the knowledge-based algorithm to predict the doses that the OARs receive based on their proximity to the treated volume. The population-based OAR constraints were changed to the predicted doses, and a KBRT plan was subsequently generated. The KBRT and expert plans were compared for achieved target coverage and OAR sparing. Target coverage was compared using the Uniformity Index (UI), while 5 dose-volume points (D10, D30, D50, D70 and D90) were used to compare doses to the OARs (bladder and rectum). The Wilcoxon matched-pairs signed-rank test was used to check for significant differences (p < 0.05) between both datasets. The KBRT and expert plans achieved mean UI values of 1.10 ± 0.03 and 1.10 ± 0.04, respectively; the Wilcoxon test showed no statistically significant difference between these results. The D90, D70, D50, D30 and D10 values of the two planning strategies and the Wilcoxon test results suggest that the KBRT plans achieved a significantly lower bladder dose (at D30), while the expert plans achieved a significantly lower rectal dose (at D10 and D30). The results of this study show that the KBRT treatment planning approach is a promising method to objectively incorporate patient anatomical variations in radiotherapy treatment planning.
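
    The statistical comparison above can be reproduced in miniature. The sketch below is an illustrative, exact implementation of the Wilcoxon matched-pairs signed-rank test (in practice scipy.stats.wilcoxon would be the production choice); the paired sample data are invented:

```python
from itertools import product

def wilcoxon_signed_rank(x, y):
    """Exact two-sided Wilcoxon matched-pairs signed-rank test.

    Zero differences are dropped, tied |differences| receive average ranks,
    and the p-value is computed by enumerating all 2^n sign patterns under
    the null, so it is only practical for small n.
    """
    d = [a - b for a, b in zip(x, y) if a != b]
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        for k in range(i, j + 1):           # average of 1-based ranks i+1..j+1
            ranks[order[k]] = (i + j + 2) / 2.0
        i = j + 1
    total = sum(ranks)
    w_plus = sum(r for r, v in zip(ranks, d) if v > 0)
    w_obs = min(w_plus, total - w_plus)
    hits = 0
    for signs in product((0, 1), repeat=n):  # all equally likely sign patterns
        w = sum(r for b, r in zip(signs, ranks) if b)
        if min(w, total - w) <= w_obs + 1e-12:
            hits += 1
    return w_plus, hits / 2 ** n

w, p = wilcoxon_signed_rank([141, 138, 152, 144, 147, 139, 150, 145],
                            [142, 139, 153, 145, 148, 140, 151, 146])
print(w, p)  # 0 0.0078125 (a uniform +1 shift gives the smallest two-sided p for n=8)
```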

  2. Current and emerging somatic treatment strategies in psychotic major depression.

    Science.gov (United States)

    Dannon, Pinhas N; Lowengrub, Katherine; Gonopolski, Yehudit; Kotler, Moshe

    2006-01-01

    Psychotic major depressive disorder (MDD) is a mood disorder characterized by severe affective and neurovegetative symptoms together with the presence of delusions and/or hallucinations. It is a common disorder seen in a quarter of consecutively admitted depressed patients and is often associated with severe symptomatology, increased suicide risk, poor acute response to antidepressants and poor acute and long-term treatment outcome. It is possible that poor response in psychotic depression is caused by the fact that we have yet to identify the most efficacious treatment protocol for psychotic MDD. Multiple studies have shown that modifications in the treatment paradigm may increase treatment efficacy in psychotic MDD. It has been generally accepted that, during the acute treatment phase, antidepressant-antipsychotic drug combination therapy is more effective than either treatment alone, although this strategy has recently been challenged. The question of the optimal duration of pharmacotherapy in order to prevent relapse and improve long-term (i.e., 5-year) outcome is a focus of current investigation. This article will review currently recommended treatment strategies for the acute, continuation and maintenance phases of therapy. In particular, it will address the role of newer-generation antidepressants, the role of second-generation antipsychotics, the use of mood stabilizers and indications for electroconvulsive therapy. Other possible treatment strategies such as transcranial magnetic stimulation, vagus nerve stimulation, deep-brain stimulation and glucocorticoid receptor antagonists will be discussed. Current recommendations for the prevention of relapse and improvement of long-term outcome will be reviewed.

  3. The current situation of treatment systems for alcoholism in Korea.

    Science.gov (United States)

    Kim, Jee Wook; Lee, Boung Chul; Kang, Tae-Cheon; Choi, Ihn-Geun

    2013-02-01

    Alcoholism is becoming one of the most serious issues in Korea. The purpose of this review article was to assess the present status of the treatment system for alcoholism in Korea compared to that of the United States and to suggest a direction for its development in Korea. Current modalities of alcoholism treatment in Korea, including withdrawal treatment, pharmacotherapy, and psychosocial treatment, are available according to Korean evidence-based treatment guidelines. Benzodiazepines and supportive care, including vitamin and nutritional support, are mainly used to treat alcohol withdrawal in Korea. Naltrexone and acamprosate are the drugs of first choice to treat chronic alcoholism. Psychosocial treatment methods such as individual psychotherapy, group psychotherapy, family therapy, cognitive behavior therapy, cue exposure therapy, 12-step facilitation therapy, self-help group therapy, and community-based treatment have been carried out to treat chronic alcoholism in Korea. However, the current alcohol treatment system in Korea is not integrated, compared to that of the United States. To establish an effective treatment system, it is important to set up an independent governmental administration on alcohol abuse, to secure experts on alcoholism, and to provide outpatient alcoholism treatment programs and facilities in an open system, including some form of continuing care.

  4. Current Treatments of Bruxism.

    Science.gov (United States)

    Guaita, Marc; Högl, Birgit

    2016-02-01

    Despite numerous case reports, the evidence for treatment of bruxism is still low. Different treatment modalities (behavioral techniques, intraoral devices, medications, and contingent electrical stimulation) have been applied. A clinical evaluation is needed to differentiate between awake bruxism and sleep bruxism and rule out any medical disorder or medication that could be behind its appearance (secondary bruxism). A polysomnography is required only in a few cases of sleep bruxism, mostly when sleep comorbidities are present. Counselling with regard to sleep hygiene, sleep habit modification, and relaxation techniques has been suggested as the first step in the therapeutic intervention, and is generally considered not harmful, despite low evidence of any efficacy. Occlusal splints are successful in the prevention of dental damage and grinding sounds associated with sleep bruxism, but their effects on reducing bruxism electromyographic (EMG) events are transient. In patients with psychiatric and sleep comorbidities, the acute use of clonazepam at night has been reported to improve sleep bruxism, but in the absence of double-blind randomized trials, its use in general clinical practice cannot be recommended. Severe secondary bruxism interfering with speaking, chewing, or swallowing has been reported in patients with neurological disorders such as in cranial dystonia; in these patients, injections of botulinum toxin in the masticatory muscles may decrease bruxism for up to 1-5 months and improve pain and mandibular functions. Long-term studies in larger and better specified samples of patients with bruxism, comparing the effects of different therapeutic modalities on bruxism EMG activity, progression of dental wear, and orofacial pain are current gaps of knowledge and preclude the development of severity-based treatment guidelines.

  5. Control algorithm for the inverter fed induction motor drive with DC current feedback loop based on principles of the vector control

    Energy Technology Data Exchange (ETDEWEB)

    Vuckovic, V.; Vukosavic, S. (Electrical Engineering Inst. Nikola Tesla, Viktora Igoa 3, Belgrade, 11000 (Yugoslavia))

    1992-01-01

    This paper presents a control algorithm for VSI-fed induction motor drives based on converter DC-link current feedback. It is shown that speed and flux can be controlled quite satisfactorily over a wide speed and load range for simpler drives. The base commands of both the inverter voltage and frequency are proportional to the reference speed, but each is further modified by signals derived from the DC current sensor. The algorithm is based on equations well known from vector control theory, and aims to maintain constant rotor flux and proportionality between the electrical torque, the slip frequency and the active component of the stator current. In this way, the problems of slip compensation, IR compensation and correction of the U/f characteristic are solved at the same time. Analytical considerations and computer simulations of the proposed control structure are in close agreement with experimental results measured on a prototype drive.
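
    The scheme can be condensed to a toy update rule: base voltage and frequency proportional to the reference speed, each corrected by the DC-link current measurement. The sketch below is illustrative, with hypothetical gains rather than the paper's derived coefficients:

```python
def vf_command(w_ref, i_dc, k_v=0.8, k_slip=0.05, k_ir=0.4):
    """One update of a V/f command with slip and IR compensation (sketch).

    w_ref: reference speed (electrical rad/s); i_dc: DC-link current (A),
    used as a proxy for the active stator current.
    Returns (stator voltage command, inverter frequency command).
    """
    f_cmd = w_ref + k_slip * i_dc       # slip compensation: frequency rises with load
    v_cmd = k_v * f_cmd + k_ir * i_dc   # constant V/f plus IR-drop voltage boost
    return v_cmd, f_cmd

v0, f0 = vf_command(100.0, 0.0)   # no load: plain proportional V/f
v1, f1 = vf_command(100.0, 10.0)  # loaded: both commands are boosted
print(f0, f1)  # 100.0 100.5
```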

  6. Transcranial direct current stimulation as a treatment for auditory hallucinations.

    Directory of Open Access Journals (Sweden)

    Sanne Koops

    2015-03-01

    Full Text Available Auditory hallucinations (AH) are a symptom of several psychiatric disorders, such as schizophrenia. In a significant minority of patients, AH are resistant to antipsychotic medication. Alternative treatment options for this medication-resistant group are scarce and most of them focus on coping with the hallucinations. Finding an alternative treatment that can diminish AH is of great importance. Transcranial direct current stimulation (tDCS) is a safe and non-invasive technique that is able to directly influence cortical excitability through the application of very low electric currents. A 1-2 mA direct current is applied between two surface electrodes, one serving as the anode and the other as the cathode. Cortical excitability is increased in the vicinity of the anode and reduced near the cathode. The technique, which has only a few transient side effects and is cheap and portable, is increasingly being explored as a treatment for neurological and psychiatric symptoms. It has shown efficacy on symptoms of depression, bipolar disorder, schizophrenia, Alzheimer’s disease, Parkinson’s disease, epilepsy and stroke. However, the application of tDCS as a treatment for AH is relatively new. This article provides an overview of the current knowledge in this field and provides guidelines for future research.

  7. Treatment Algorithm for Patients with Non-arthritic Hip Pain, Suspect for an Intraarticular Pathology

    DEFF Research Database (Denmark)

    Jørgensen, Rasmus Wejnold; Dippmann, Christian; Dahl, L

    2016-01-01

    BACKGROUND: The number of patients referred with longstanding, non-arthritic hip pain is increasing, as are the treatment options. Left untreated, hip dysplasia, acetabular retroversion and femoroacetabular impingement (FAI) may lead to osteoarthritis (OA). Finding the right treatment option...... associated with acetabular retroversion described in the literature were the crossover sign, the posterior wall sign and the ischial spine sign, while Wiberg's lateral center-edge angle (CE-angle) together with Lequesne's acetabular index indicates hip dysplasia. A Tönnis index >2 indicates osteoarthritis...... for the right patient can be challenging in patients with non-arthritic hip pain. PURPOSE: The purpose of this study was to categorize the radiographic findings seen in patients with longstanding hip pain, suspect for an intraarticular pathology, and provide a treatment algorithm allocating a specific treatment...

  8. Diagnosis and management of transthyretin familial amyloid polyneuropathy in Japan: red-flag symptom clusters and treatment algorithm.

    Science.gov (United States)

    Sekijima, Yoshiki; Ueda, Mitsuharu; Koike, Haruki; Misawa, Sonoko; Ishii, Tomonori; Ando, Yukio

    2018-01-17

    Hereditary ATTR (ATTRm) amyloidosis (also called transthyretin-type familial amyloid polyneuropathy [ATTR-FAP]) is an autosomal-dominant, adult-onset, rare systemic disorder predominantly characterized by irreversible, progressive, and persistent peripheral nerve damage. TTR gene mutations (e.g. replacement of valine with methionine at position 30 [Val30Met (p.Val50Met)]) lead to destabilization and dissociation of TTR tetramers into variant TTR monomers, which form amyloid fibrils that deposit in peripheral nerves and various organs, giving rise to peripheral and autonomic neuropathy and several non-disease-specific symptoms. Phenotypic and genetic variability and non-disease-specific symptoms often delay diagnosis and lead to misdiagnosis. Red-flag symptom clusters simplify diagnosis globally. However, in Japan, the types of TTR variants, age of onset, penetrance, and clinical symptoms of Val30Met are more varied than in other countries. Hence, development of a Japan-specific red-flag symptom cluster is warranted. Presence of progressive peripheral sensory-motor polyneuropathy and ≥1 red-flag sign/symptom (e.g. family history, autonomic dysfunction, cardiac involvement, carpal tunnel syndrome, gastrointestinal disturbances, unexplained weight loss, and immunotherapy resistance) suggests ATTR-FAP. Outside of Japan, pharmacotherapeutic options are first-line therapy. However, because of positive outcomes (better life expectancy and higher survival rates) with living donor transplant in Japan, liver transplantation remains first-line treatment there, necessitating a Japan-specific treatment algorithm. Herein, we present a consolidated review of the ATTR-FAP Val30Met landscape in Japan and summarize findings from a medical advisory board meeting held in Tokyo on 18th August 2016, at which a Japan-specific ATTR-FAP red-flag symptom cluster and treatment algorithm were developed. Besides liver transplantation, a TTR-stabilizing agent (e.g. tafamidis) is a treatment option. Early

  9. [A diagnostic algorithm and treatment procedure in disordered vital functions in newborns admitted to a resuscitation ward].

    Science.gov (United States)

    Ostreĭkov, I F; Podkopaev, V N; Moiseev, D B; Karpysheva, E V; Markova, L A; Sizov, S V

    1997-01-01

    Total mortality in the neonatal intensive care wards of the Tushino Pediatric Hospital decreased 2.5-fold in 1996 and is now 7.6%. These results are due to a complex of measures, one of which was the development and introduction of an algorithm for the diagnosis and treatment of newborns admitted to intensive care wards. The algorithm facilitates the work of the staff, helps diagnose diseases earlier and, hence, allows timely, scientifically based therapy.

  10. Acceleration feedback of a current-following synchronized control algorithm for telescope elevation axis

    Science.gov (United States)

    Tang, Tao; Zhang, Tong; Du, Jun-Feng; Ren, Ge; Tian, Jing

    2016-11-01

    This paper proposes a dual-motor configuration to enhance the closed-loop performance of a telescope control system. Two identical motors are mounted on either side of a U-type frame to drive the telescope elevation axis, instead of the single-motor drive usually used in a classical design. This new configuration allows each motor to be half the size used in the former design, and it provides other advantages as well. A master-slave current control mode is employed to synchronize the two motors, and acceleration feedback control is utilized to further enhance the servo performance. Extensive experiments validate the effectiveness of the proposed control algorithm in synchronization, disturbance attenuation and low-velocity tracking.
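
    The master-slave synchronization idea can be sketched as follows; the PI velocity loop and all gains are hypothetical stand-ins for the paper's controller:

```python
class MasterSlaveDrive:
    """Master-slave current sharing for a dual-motor axis (sketch).

    The master's velocity loop (a simple PI here) produces a current
    command that the slave copies, so the two motors on either side of
    the U-type frame contribute equal torque instead of fighting each other.
    """

    def __init__(self, kp=2.0, ki=0.5, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integ = 0.0

    def step(self, vel_ref, vel_meas):
        err = vel_ref - vel_meas
        self.integ += err * self.dt
        i_master = self.kp * err + self.ki * self.integ
        return i_master, i_master  # slave tracks the master's current command

drive = MasterSlaveDrive()
i_m, i_s = drive.step(vel_ref=1.0, vel_meas=0.0)
print(i_m == i_s)  # True: both motors receive the same torque command
```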

  11. Induced current magnetic resonance electrical impedance tomography of brain tissues based on the J-substitution algorithm: a simulation study

    International Nuclear Information System (INIS)

    Liu Yang; Zhu Shanan; He Bin

    2009-01-01

    We have investigated induced current magnetic resonance electrical impedance tomography (IC-MREIT) by means of computer simulations. The J-substitution algorithm was implemented to solve the IC-MREIT reconstruction problem. By providing physical insight into the charge accumulating on the interfaces, the convergence characteristics of the reconstruction algorithm were analyzed. The simulation results conducted on different objects were well correlated with the proposed theoretical analysis. The feasibility of IC-MREIT to reconstruct the conductivity distribution of head-brain tissues was also examined in computer simulations using a multi-compartment realistic head model. The present simulation results suggest that IC-MREIT may have the potential to become a useful conductivity imaging technique.

  12. Treatment for primary hypothyroidism: current approaches and future possibilities

    Science.gov (United States)

    Chakera, Ali J; Pearce, Simon HS; Vaidya, Bijay

    2012-01-01

    Primary hypothyroidism is the most common endocrine disease. Although the diagnosis and treatment of hypothyroidism is often considered simple, there are large numbers of people with this condition who are suboptimally treated. Even in those people with hypothyroidism who are biochemically euthyroid on levothyroxine replacement there is a significant proportion who report poorer quality of life. This review explores the historical and current treatment options for hypothyroidism, reasons for and potential solutions to suboptimal treatment, and future possibilities in the treatment of hypothyroidism. PMID:22291465

  13. Principles of a new treatment algorithm in multiple sclerosis

    DEFF Research Database (Denmark)

    Hartung, Hans-Peter; Montalban, Xavier; Sorensen, Per Soelberg

    2011-01-01

    We are entering a new era in the management of patients with multiple sclerosis (MS). The first oral treatment (fingolimod) has now gained US FDA approval, addressing an unmet need for patients with MS who wish to avoid parenteral administration. A second agent (cladribine) is currently being...

  14. Current calibration, treatment, and treatment planning techniques among institutions participating in the Children's Oncology Group

    International Nuclear Information System (INIS)

    Urie, Marcia; FitzGerald, T.J.; Followill, David; Laurie, Fran; Marcus, Robert; Michalski, Jeff

    2003-01-01

    Purpose: To report current technology implementation, radiation therapy physics and treatment planning practices, and results of treatment planning exercises among 261 institutions belonging to the Children's Oncology Group (COG). Methods and Materials: The Radiation Therapy Committee of the newly formed COG mandated that each institution demonstrate basic physics and treatment planning abilities by satisfactorily completing a questionnaire and four treatment planning exercises designed by the Quality Assurance Review Center. The planning cases are (1) a maxillary sinus target volume (for two-dimensional planning), (2) a Hodgkin's disease mantle field (for irregular-field and off-axis dose calculations), (3) a central axis blocked case, and (4) a craniospinal irradiation case. The questionnaire and treatment plans were submitted (as of 1/30/02) by 243 institutions and completed satisfactorily by 233. Data from this questionnaire and analyses of the treatment plans with monitor unit calculations are presented. Results: Of the 243 clinics responding, 54% use multileaf collimators routinely, 94% use asymmetric jaws routinely, and 13% use dynamic wedges. Nearly all institutions calibrate their linear accelerators following American Association of Physicists in Medicine protocols, currently 16% with TG-51 and 81% with TG-21 protocol. Treatment planning systems are relied on very heavily for all calculations, including monitor units. Techniques and results of each of the treatment planning exercises are presented. Conclusions: Together, these data provide a unique compilation of current (2001) radiation therapy practices in institutions treating pediatric patients. Overall, the COG facilities have the equipment and the personnel to perform high-quality radiation therapy. With ongoing quality assurance review, radiation therapy compliance with COG protocols should be high

  15. WE-AB-209-06: Dynamic Collimator Trajectory Algorithm for Use in VMAT Treatment Deliveries

    Energy Technology Data Exchange (ETDEWEB)

    MacDonald, L [Department of Medical Physics, Dalhousie University, Halifax, Nova Scotia, CA (Canada); Thomas, C; Syme, A [Department of Medical Physics, Dalhousie University, Halifax, Nova Scotia, CA (Canada); Department of Radiation Oncology, Dalhousie University, Halifax, Nova Scotia (Canada); Medical Physics, Nova Scotia Cancer Centre, Halifax, Nova Scotia (Canada)

    2016-06-15

    Purpose: To develop advanced dynamic collimator positioning algorithms for optimal beam’s-eye-view (BEV) fitting of targets in VMAT procedures, including multiple metastases stereotactic radiosurgery procedures. Methods: A trajectory algorithm was developed, which can dynamically modify the angle of the collimator as a function of VMAT control point to provide optimized collimation of target volume(s). Central to this algorithm is a concept denoted “whitespace”, defined as area within the jaw-defined BEV field, outside of the PTV, and not shielded by the MLC when fit to the PTV. Calculating whitespace at all collimator angles and every control point, a two-dimensional topographical map depicting the tightness-of-fit of the MLC was generated. A variety of novel searching algorithms identified a number of candidate trajectories of continuous collimator motion. Ranking these candidate trajectories according to their accrued whitespace value produced an optimal solution for navigation of this map. Results: All trajectories were normalized to minimum possible (i.e. calculated without consideration of collimator motion constraints) accrued whitespace. On an acoustic neuroma case, a random walk algorithm generated a trajectory with 151% whitespace; random walk including a mandatory anchor point improved this to 148%; gradient search produced a trajectory with 137%; and bi-directional gradient search generated a trajectory with 130% whitespace. For comparison, a fixed collimator angle of 30° and 330° accumulated 272% and 228% of whitespace, respectively. The algorithm was tested on a clinical case with two metastases (single isocentre) and identified collimator angles that allow for simultaneous irradiation of the PTVs while minimizing normal tissue irradiation. Conclusion: Dynamic collimator trajectories have the potential to improve VMAT deliveries through increased efficiency and reduced normal tissue dose, especially in treatment of multiple cranial metastases
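
    The trajectory search over the whitespace map can be illustrated with a small sketch. The paper evaluates random-walk and gradient searches; the dynamic program below is an alternative, illustrative formulation that is globally optimal for a discretized map under a maximum per-control-point angle change (all names and the toy map are hypothetical):

```python
def best_trajectory(whitespace, max_step=1):
    """Minimum-accrued-whitespace collimator trajectory via dynamic programming.

    whitespace[t][a]: whitespace value at control point t, discrete angle
    index a; max_step bounds the angle-index change between consecutive
    control points (the motion constraint).
    Returns (angle index per control point, total accrued whitespace).
    """
    T, A = len(whitespace), len(whitespace[0])
    cost = list(whitespace[0])   # best accrued whitespace ending at each angle
    back = []                    # backpointers for path recovery
    for t in range(1, T):
        new_cost, ptr = [], []
        for a in range(A):
            lo, hi = max(0, a - max_step), min(A - 1, a + max_step)
            prev = min(range(lo, hi + 1), key=lambda p: cost[p])
            new_cost.append(cost[prev] + whitespace[t][a])
            ptr.append(prev)
        back.append(ptr)
        cost = new_cost
    a = min(range(A), key=lambda i: cost[i])
    total, path = cost[a], [a]
    for ptr in reversed(back):
        a = ptr[a]
        path.append(a)
    path.reverse()
    return path, total

# A toy 3-control-point map where the middle angle is always the tightest fit
wmap = [[5, 1, 5], [5, 1, 5], [5, 1, 5]]
print(best_trajectory(wmap))  # ([1, 1, 1], 3)
```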

  16. WE-AB-209-06: Dynamic Collimator Trajectory Algorithm for Use in VMAT Treatment Deliveries

    International Nuclear Information System (INIS)

    MacDonald, L; Thomas, C; Syme, A

    2016-01-01

    Purpose: To develop advanced dynamic collimator positioning algorithms for optimal beam’s-eye-view (BEV) fitting of targets in VMAT procedures, including multiple metastases stereotactic radiosurgery procedures. Methods: A trajectory algorithm was developed, which can dynamically modify the angle of the collimator as a function of VMAT control point to provide optimized collimation of target volume(s). Central to this algorithm is a concept denoted “whitespace”, defined as area within the jaw-defined BEV field, outside of the PTV, and not shielded by the MLC when fit to the PTV. Calculating whitespace at all collimator angles and every control point, a two-dimensional topographical map depicting the tightness-of-fit of the MLC was generated. A variety of novel searching algorithms identified a number of candidate trajectories of continuous collimator motion. Ranking these candidate trajectories according to their accrued whitespace value produced an optimal solution for navigation of this map. Results: All trajectories were normalized to minimum possible (i.e. calculated without consideration of collimator motion constraints) accrued whitespace. On an acoustic neuroma case, a random walk algorithm generated a trajectory with 151% whitespace; random walk including a mandatory anchor point improved this to 148%; gradient search produced a trajectory with 137%; and bi-directional gradient search generated a trajectory with 130% whitespace. For comparison, a fixed collimator angle of 30° and 330° accumulated 272% and 228% of whitespace, respectively. The algorithm was tested on a clinical case with two metastases (single isocentre) and identified collimator angles that allow for simultaneous irradiation of the PTVs while minimizing normal tissue irradiation. Conclusion: Dynamic collimator trajectories have the potential to improve VMAT deliveries through increased efficiency and reduced normal tissue dose, especially in treatment of multiple cranial metastases

  17. MREIT experiments with 200 µA injected currents: a feasibility study using two reconstruction algorithms, SMM and harmonic BZ

    International Nuclear Information System (INIS)

    Arpinar, V E; Muftuler, L T; Hamamura, M J; Degirmenci, E

    2012-01-01

    Magnetic resonance electrical impedance tomography (MREIT) is a technique that produces images of conductivity in tissues and phantoms. In this technique, electrical currents are applied to an object, the resulting magnetic flux density is measured using magnetic resonance imaging (MRI), and the conductivity distribution is reconstructed from these MRI data. Currently, the technique is used in research environments, primarily for studying phantoms and animals. In order to translate MREIT to clinical applications, strict safety standards need to be established, especially for safe current limits. However, there are currently no standards for safe current limits specific to MREIT. Until such standards are established, human MREIT applications need to conform to existing electrical safety standards in medical instrumentation, such as IEC601. This protocol limits patient auxiliary currents to 100 µA for low frequencies. However, published MREIT studies have utilized currents 10–400 times larger than this limit, bringing into question whether the clinical applications of MREIT are attainable under current standards. In this study, we investigated the feasibility of MREIT to accurately reconstruct the relative conductivity of a simple agarose phantom using 200 µA total injected current, and tested the performance of two MREIT reconstruction algorithms: the iterative sensitivity matrix method (SMM) of Ider and Birgul (1998 Elektrik 6 215–25) with Tikhonov regularization, and the harmonic BZ method proposed by Oh et al (2003 Magn. Reson. Med. 50 875–8). The reconstruction techniques were tested at both 200 µA and 5 mA injected currents to investigate their noise sensitivity under low- and high-current conditions. It should be noted that a 200 µA total injected current into a cylindrical phantom generates only 14.7 µA of current in the imaging slice. Similarly, a 5 mA total injected current results in 367 µA in the imaging slice. Total acquisition
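
    As a quick sanity check of the quoted currents: the fraction of the total injected current that flows through the imaging slice is fixed by the phantom geometry, so the two protocols should give the same ratio.

```python
# Fraction of total injected current reaching the imaging slice,
# using the values quoted in the abstract for the cylindrical phantom.
frac_low = 14.7 / 200.0      # 200 µA protocol -> 14.7 µA in slice
frac_high = 367.0 / 5000.0   # 5 mA protocol  -> 367 µA in slice
print(round(frac_low, 4), round(frac_high, 4))  # 0.0735 0.0734, both ~7.3%
```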

  18. Selection and determination of beam weights based on genetic algorithms for conformal radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Xingen Wu; Zunliang Wang

    2000-01-01

    A genetic algorithm has been used to optimize the selection of beam weights for external beam three-dimensional conformal radiotherapy treatment planning. A fitness function is defined which includes a difference term, achieving a least-squares fit to the doses at preselected points in the planning target volume, and a penalty term constraining the maximum allowable doses delivered to critical organs. The trade-off between dose uniformity within the target volume and the dose constraints on the critical structures can be adjusted by varying the beam-weight variables in the fitness function. A floating-point encoding scheme and several operators (uniform crossover, arithmetical crossover, geometrical crossover, Gaussian mutation and uniform mutation) were used to evolve the population. Three different cases were used to verify the correctness of the algorithm, and quality assessments based on dose-volume histograms and three-dimensional dose distributions are given. The results indicate that the genetic algorithm presented here has considerable potential. (author)
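
    The fitness function described (a least-squares fit to target-point doses plus a critical-organ penalty) can be sketched directly; the array names, sizes and penalty weight below are invented for illustration:

```python
import numpy as np

def fitness(weights, D_target, d_presc, D_oar, d_max, penalty=10.0):
    """Fitness for beam-weight selection (lower is better).

    weights: candidate beam weights (n_beams,)
    D_target: dose per unit weight at target points (n_pts, n_beams)
    d_presc: prescribed dose at the target points (Gy)
    D_oar: dose per unit weight at critical-organ points
    d_max: maximum allowable critical-organ dose; penalty: constraint weight
    """
    dose_t = D_target @ weights            # dose at target points
    dose_o = D_oar @ weights               # dose at critical-organ points
    lsq = float(np.sum((dose_t - d_presc) ** 2))
    overdose = np.clip(dose_o - d_max, 0.0, None)
    return lsq + penalty * float(np.sum(overdose ** 2))

# Two beams, two target points, one critical-organ point (all illustrative)
D_target = np.array([[1.0, 0.5], [0.5, 1.0]])
D_oar = np.array([[0.2, 0.2]])
good = fitness(np.array([40.0, 40.0]), D_target, 60.0, D_oar, 20.0)
print(good)  # 0.0: both target points hit 60 Gy and the organ stays at 16 Gy
```

    A GA would minimize this score (or, equivalently, maximize its negative) while the crossover and mutation operators listed above explore the weight space.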

  19. Remote Sensing of Cloud Top Height from SEVIRI: Analysis of Ten Current Retrieval Algorithms

    Science.gov (United States)

    Hamann, U.; Walther, A.; Baum, B.; Bennartz, R.; Bugliaro, L.; Derrien, M.; Francis, P. N.; Heidinger, A.; Joro, S.; Kniffka, A.; hide

    2014-01-01

    The role of clouds remains the largest uncertainty in climate projections. They influence solar and thermal radiative transfer and the earth's water cycle. Therefore, there is an urgent need for accurate cloud observations to validate climate models and to monitor climate change. Passive satellite imagers measuring radiation at visible to thermal infrared (IR) wavelengths provide a wealth of information on cloud properties. Among others, the cloud top height (CTH) - a crucial parameter to estimate the thermal cloud radiative forcing - can be retrieved. In this paper we investigate the skill of ten current retrieval algorithms to estimate the CTH using observations from the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) onboard Meteosat Second Generation (MSG). In the first part we compare ten SEVIRI cloud top pressure (CTP) data sets with each other. The SEVIRI algorithms catch the latitudinal variation of the CTP in a similar way. The agreement is better in the extratropics than in the tropics. In the tropics multi-layer clouds and thin cirrus layers complicate the CTP retrieval, whereas a good agreement among the algorithms is found for trade wind cumulus, marine stratocumulus and the optically thick cores of the deep convective system. In the second part of the paper the SEVIRI retrievals are compared to CTH observations from the Cloud-Aerosol LIdar with Orthogonal Polarization (CALIOP) and Cloud Profiling Radar (CPR) instruments. It is important to note that the different measurement techniques cause differences in the retrieved CTH data. SEVIRI measures a radiatively effective CTH, while the CTH of the active instruments is derived from the return time of the emitted radar or lidar signal. Therefore, some systematic differences are expected. 
On average the CTHs detected by the SEVIRI algorithms are 1.0 to 2.5 kilometers lower than CALIOP observations, and the correlation coefficients between the SEVIRI and the CALIOP data sets range between 0.77 and 0

  20. FRAMEWORK FOR COMPARING SEGMENTATION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    G. Sithole

    2015-05-01

    Full Text Available The notion of a ‘Best’ segmentation does not exist. A segmentation algorithm is chosen based on the features it yields, the properties of the segments (point sets) it generates, and the complexity of its algorithm. The segmentation is then assessed based on a variety of metrics such as homogeneity, heterogeneity, fragmentation, etc. Even after an algorithm is chosen, its performance is still uncertain because the landscape/scenarios represented in a point cloud have a strong influence on the eventual segmentation. Thus selecting an appropriate segmentation algorithm is a process of trial and error. Automating the selection of segmentation algorithms and their parameters first requires methods to evaluate segmentations. Three common approaches for evaluating segmentation algorithms are ‘goodness methods’, ‘discrepancy methods’ and ‘benchmarks’. Benchmarks are considered the most comprehensive method of evaluation. In this paper, shortcomings in current benchmark methods are identified and a framework is proposed that permits both a visual and numerical evaluation of segmentations for different algorithms, algorithm parameters and evaluation metrics. The concept of the framework is demonstrated on a real point cloud. Current results are promising and suggest that it can be used to predict the performance of segmentation algorithms.

  1. Therapeutic eyelids hygiene in the algorithms of prevention and treatment of ocular surface diseases

    Directory of Open Access Journals (Sweden)

    V. N. Trubilin

    2016-01-01

    Full Text Available When acute inflammation in the anterior eye segment has been resolved, ophthalmologists often face a situation in which the signs of acute inflammation are absent, yet patients still complain of persistent discomfort, which causes dissatisfaction with the treatment. These complaints are typically caused by disturbances of tear production. It is no accident that a new group of diseases has been recognized: the diseases of the ocular surface. The ocular surface is a complex biological system, including the epithelium of the conjunctiva, cornea and limbus, as well as the eyelid margin and the meibomian gland ducts. Pathological processes in the conjunctiva, cornea and eyelids are linked with tear production. Ophthalmologists prescribe tear substitutes, providing short-term relief to patients. However, given that the lipid component of the tear film plays the key role in preserving its stability, eyelid hygiene is the basis for the treatment of dry eye associated with ocular surface diseases. Eyelid hygiene ensures the normal functioning of the glands, restores metabolic processes in the skin and supports the formation of a complete tear film. Protection of the eyelids, especially the marginal edge, from aggressive environmental agents, infections and parasites is the basis for the prevention and treatment of blepharitis and dry eye syndrome. The most common clinical situations and the algorithms for their treatment and prevention are discussed in the article: dysfunction of the meibomian glands; demodectic blepharitis; seborrheic blepharitis; staphylococcal blepharitis; allergic blepharitis; hordeolum (stye) and chalazion. The prevention of keratoconjunctival xerosis (in the pre- and postoperative period, caused by contact lenses, computer vision syndrome, or in remission after acute conjunctival and corneal inflammation) is also presented. The first part of the article presents the treatment and prevention algorithms for dysfunction of the meibomian glands, as well as

  2. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm opti

  3. A Review of Closed-Loop Algorithms for Glycemic Control in the Treatment of Type 1 Diabetes

    Directory of Open Access Journals (Sweden)

    Joseph El Youssef

    2009-03-01

    Full Text Available With the discovery of insulin came a deeper understanding of therapeutic options for one of the most devastating chronic diseases of the modern era, diabetes mellitus. The use of insulin in the treatment of diabetes, especially in those with severe insulin deficiency (type 1 diabetes), with multiple injections or continuous subcutaneous infusion, has been largely successful, but the risk for short-term and long-term complications remains substantial. Insulin treatment decisions are based on the patient’s knowledge of meal size, exercise plans and the intermittent knowledge of blood glucose values. As such, these are open-loop methods that require human input. The idea of closed-loop control of diabetes treatment is quite different: automated control of a device that delivers insulin (and possibly glucagon or other medications), based on continuous or very frequent glucose measurements. Closed-loop insulin control for type 1 diabetes is not new but is far from optimized. The goal of such a system is to avoid short-term complications (hypoglycemia) and long-term complications (diseases of the eyes, kidneys, nerves and cardiovascular system) by mimicking the normal insulin secretion pattern of the pancreatic beta cell. A control system for automated diabetes treatment consists of three major components: (1) a glucose sensing device that serves as the afferent limb of the system; (2) an automated control unit that uses algorithms to acquire sensor input and generate treatment outputs; and (3) a drug delivery device (primarily for delivery of insulin), which serves as the system’s efferent limb. There are several major issues that highlight the difficulty of interacting with the complex unknowns of the biological world. For example, development of accurate continuous glucose monitors is crucial; the state of the art in 2009 is that such devices sometimes experience drift and are intended only to supplement information received from standard
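The automated control unit in component (2) is often realized as a feedback controller. A minimal sketch, assuming a simple PID law with the output clamped to a non-negative infusion rate (since insulin can be infused but not withdrawn); this is illustrative only and not a clinical algorithm:

```python
class PIDController:
    """Toy discrete PID producing a non-negative insulin infusion rate."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint      # target glucose, e.g. 100 mg/dL
        self.integral = 0.0
        self.prev_error = None

    def update(self, glucose, dt=1.0):
        error = glucose - self.setpoint           # positive when above target
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(0.0, u)                        # infusion cannot be negative
```

The derivative term is what lets the controller back off when glucose is already falling; in the second update below the glucose is still above target, yet the rapid fall drives the commanded infusion to zero.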

  4. SU-F-SPS-04: Dosimetric Evaluation of the Dose Calculation Accuracy of Different Algorithms for Two Different Treatment Techniques During Whole Breast Irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Pacaci, P; Cebe, M; Mabhouti, H; Codel, G; Serin, E; Sanli, E; Kucukmorkoc, E; Doyuran, M; Kucuk, N; Canoglu, D; Altinok, A; Acar, H; Caglar Ozkok, H [Medipol University, Istanbul, Istanbul (Turkey)

    2016-06-15

    Purpose: In this study, a dosimetric comparison of the field-in-field (FIF) and intensity modulated radiation therapy (IMRT) techniques used for whole breast radiotherapy (WBRT) was made. The dosimetric accuracy of the treatment planning system (TPS) in predicting PTV and OAR doses was also investigated for the Anisotropic Analytical Algorithm (AAA) and Acuros XB (AXB) algorithms. Methods: Two treatment plans for left-sided breast cancer were generated for a Rando phantom. FIF and IMRT plans were compared for doses in the PTV and in OAR volumes including the ipsilateral lung, heart, left anterior descending coronary artery, contralateral lung and contralateral breast. PTV and OAR doses and homogeneity and conformality indexes were compared between the two techniques. The accuracy of the TPS dose calculation algorithms was tested by comparing PTV and OAR doses measured by thermoluminescent dosimetry with the doses calculated by the TPS using AAA and AXB for both techniques. Results: IMRT plans had better conformality and homogeneity indexes than the FIF technique and spared OARs better than FIF. While both algorithms overestimated PTV doses, they underestimated all OAR doses. For the IMRT plan PTV doses, an overestimation of up to 2.5% was seen with the AAA algorithm, but it decreased to 1.8% when the AXB algorithm was used. Based on the results of the anthropomorphic measurements for OAR doses, underestimation greater than 7% is possible with the AAA. The results from the AXB are much better than those from the AAA algorithm; however, underestimations of 4.8% were found at some points even for AXB. For the FIF plan, a similar trend was seen for PTV and OAR doses with both algorithms. Conclusion: When using the Eclipse TPS for breast cancer, the AXB algorithm should be used instead of the AAA algorithm, bearing in mind that the AXB may still underestimate all OAR doses.

  5. Current developments in bovine mastitis treatment and control.

    Science.gov (United States)

    Wager, L A; Linquist, W E; Hayes, G L; Britten, A M; Whitehead, R G; Webster, D E; Barnes, F D

    1978-01-01

    Mastitis in its complexity has managed to forestall all efforts at eradication in spite of years of research, antibiotics and practical control measures. This minisymposium will touch on seven topics relevant to the treatment and control of this economically important disease.

  6. Current Diagnosis, Treatment and Etiology of Status Epilepticus

    Directory of Open Access Journals (Sweden)

    Çetin Kürşad Akpınar

    2014-03-01

    Full Text Available Status epilepticus (SE) is a medical emergency that causes significant morbidity and mortality and requires prompt diagnosis and treatment. Although SE can be divided into two subgroups, convulsive and nonconvulsive, the treatment principles are generally similar. Treatment should be prompt and the underlying cause should be corrected. Although intravenous lorazepam is the first-line treatment due to a lower risk of relapse, diazepam becomes the first choice where lorazepam is not available, as in our country. Even though intravenous benzodiazepines stop seizures, an intravenous antiepileptic drug (phenytoin, etc.) should be administered at a loading dose. Patients with refractory status epilepticus should be supported with respect to vital, respiratory, metabolic and hemodynamic functions and followed up in an intensive care unit with monitoring of cerebral electrical activity. The most common cause in the etiology is the cessation of antiepileptic drugs. The aim of SE treatment is to stop seizures and to prevent complications and recurrence. In this paper, the current diagnosis, treatment and etiology of SE are reviewed.

  7. Breast cancer treatment: historical review and current approaches

    International Nuclear Information System (INIS)

    Kulakowski, A.

    1994-01-01

    The evolution and development of opinions on the diagnosis and treatment of breast cancer from Galen to the present time is presented. The concept of breast cancer as a local disease has been replaced by the understanding of its systemic character. Against this background, the methods of surgical treatment are described, from the early supraradical operations to the present conservative approaches. The ''milestones'' in the diagnosis and treatment of breast cancer over the last 40 years are presented. Current methods of breast cancer management include correct diagnosis (clinical examination, mammography, ultrasound, fine needle aspiration biopsy), TNM staging, adequate loco-regional therapy, systemic therapy, rehabilitation, reconstruction and careful follow-up. (author)

  8. Algorithm FIRE-Feynman Integral REduction

    International Nuclear Information System (INIS)

    Smirnov, A.V.

    2008-01-01

    The recently developed algorithm FIRE performs the reduction of Feynman integrals to master integrals. It is based on a number of strategies, such as applying the Laporta algorithm, the s-bases algorithm, region-bases and integrating explicitly over loop momenta when possible. Currently it is being used in complicated three-loop calculations.

  9. Golden Sine Algorithm: A Novel Math-Inspired Algorithm

    Directory of Open Access Journals (Sweden)

    TANYILDIZI, E.

    2017-05-01

    Full Text Available In this study, the Golden Sine Algorithm (Gold-SA) is presented as a new metaheuristic method for solving optimization problems. Gold-SA has been developed as a new population-based search algorithm. This math-based algorithm is inspired by the sine function. In the algorithm, as many random individuals as the number of search agents are created, with a uniform distribution in each dimension. The Gold-SA operator searches for a better solution in each iteration by trying to bring the current situation closer to the target value. The solution space is narrowed by the golden section so that only the areas that are expected to give good results are scanned, instead of the whole solution space. In the tests performed, Gold-SA obtains better results than other population-based methods. In addition, Gold-SA has fewer algorithm-dependent parameters and operators than other metaheuristic methods, and it provides faster convergence.
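The sine-driven move toward the best-so-far solution, with golden-section coefficients, can be sketched as below. This is a simplified reading of the method, not the published algorithm: the update form and the fixed golden-section points `x1`, `x2` are assumptions, and the original additionally adapts the section interval during the run.

```python
import math
import random

random.seed(0)

def sphere(x):
    """Simple benchmark objective (minimum 0 at the origin)."""
    return sum(v * v for v in x)

def gold_sa(f, dim=5, agents=20, iters=300, lo=-10.0, hi=10.0):
    tau = (math.sqrt(5.0) - 1.0) / 2.0           # golden ratio conjugate
    a, b = -math.pi, math.pi
    x1 = a * tau + b * (1.0 - tau)               # golden-section points of [a, b]
    x2 = a * (1.0 - tau) + b * tau
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(agents)]
    best = min(pop, key=f)[:]
    for _ in range(iters):
        for p in pop:
            r1 = random.uniform(0.0, 2.0 * math.pi)
            r2 = random.uniform(0.0, math.pi)
            for d in range(dim):                 # sine-driven step toward best
                p[d] = (p[d] * abs(math.sin(r1))
                        - r2 * math.sin(r1) * abs(x1 * best[d] - x2 * p[d]))
            if f(p) < f(best):                   # keep the best-so-far solution
                best = p[:]
    return best

best = gold_sa(sphere)
```

Because `best` is only ever replaced on strict improvement, the returned solution is never worse than the best member of the initial population.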

  10. A novel algorithm for discrimination between inrush current and internal faults in power transformer differential protection based on discrete wavelet transform

    Energy Technology Data Exchange (ETDEWEB)

    Eldin, A.A. Hossam; Refaey, M.A. [Electrical Engineering Department, Alexandria University, Alexandria (Egypt)

    2011-01-15

    This paper proposes a novel methodology for transformer differential protection, based on wave-shape recognition of a discriminating criterion extracted from the instantaneous differential currents. The discrete wavelet transform has been applied to the differential currents due to internal faults and to inrush currents. The diagnosis criterion is based on the median absolute deviation (MAD) of the wavelet coefficients over a specified frequency band. The proposed algorithm is examined using various simulated inrush and internal fault current cases on a power transformer that has been modeled using the electromagnetic transients program EMTDC. The results of the evaluation study show that the proposed wavelet-based differential protection scheme can discriminate internal faults from inrush currents. (author)
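The MAD-over-wavelet-coefficients criterion can be illustrated with a one-level Haar transform. The wavelet choice, the threshold, and the synthetic waveforms below are assumptions for illustration, not the paper's EMTDC cases:

```python
import math

def haar_detail(signal):
    """Detail (high-frequency) coefficients of a one-level Haar DWT."""
    s = math.sqrt(2.0)
    return [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]

def mad(xs):
    """Median absolute deviation: a robust measure of coefficient spread."""
    med = sorted(xs)[len(xs) // 2]
    return sorted(abs(x - med) for x in xs)[len(xs) // 2]

def classify(diff_current, threshold=1.0):
    """Illustrative criterion: widely spread detail coefficients -> internal fault."""
    return ("internal fault" if mad(haar_detail(diff_current)) > threshold
            else "inrush")

# Synthetic differential currents (illustrative only): a smooth waveform versus
# one carrying a strong alternating high-frequency transient.
smooth = [math.sin(2 * math.pi * k / 64) for k in range(64)]
fault_like = [v + 5.0 * math.sin(math.pi * k / 2) for k, v in enumerate(smooth)]
```

The smooth waveform yields small, tightly clustered detail coefficients (low MAD), while the transient-laden one yields detail coefficients that swing between large positive and negative values (high MAD), which is the kind of separation the paper's criterion exploits.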

  11. Differences in absorbed doses at organs at risk and the planning target volume (PTV) in lung treatments using two different calculation algorithms

    International Nuclear Information System (INIS)

    Uruena Llinares, A.; Santos Rubio, A.; Luis Simon, F. J.; Sanchez Carmona, G.; Herrador Cordoba, M.

    2006-01-01

    The objective of this paper is to compare, in thirty lung cancer treatments, the absorbed doses at organs at risk and target volumes obtained with the two calculation algorithms of our Oncentra MasterPlan treatment planning system: Pencil Beam and Collapsed Cone. For this we use a set of measured indicators (D1 and D99 of the tumor volume, V20 of the lung, the homogeneity index defined as (D5-D95)/D_prescribed, and others). Analysing the data with descriptive statistics and applying the non-parametric Wilcoxon signed-rank test, we find that the Pencil Beam algorithm underestimates the dose in the zone of the PTV that includes regions of low density, as well as the maximum dose values in the spinal cord. We therefore conclude that in treatments in which the spinal cord dose is near the maximum permissible limit, or in which the PTV includes pulmonary tissue, the Collapsed Cone algorithm must be used systematically; in any case, the trade-off between calculation time and precision should be weighed for both algorithms. (Authors)
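The indicators used above (D1, D99, V20 and the homogeneity index (D5-D95)/D_prescribed) can be computed directly from a sampled dose distribution. The sketch below uses one common percentile convention; TPS implementations may differ in interpolation details:

```python
def dose_at_volume(doses, pct):
    """D_pct: the minimum dose received by the hottest pct% of the volume."""
    ranked = sorted(doses, reverse=True)
    k = max(1, round(len(ranked) * pct / 100.0))
    return ranked[k - 1]

def v_dose(doses, threshold):
    """V_threshold: percentage of the volume receiving at least `threshold` Gy."""
    return 100.0 * sum(d >= threshold for d in doses) / len(doses)

def homogeneity_index(doses, prescribed):
    """(D5 - D95) / D_prescribed: smaller values mean a more homogeneous PTV dose."""
    return (dose_at_volume(doses, 5) - dose_at_volume(doses, 95)) / prescribed
```

With `doses` holding one value per voxel of the structure (PTV, lung, cord), these three functions reproduce the whole indicator set the comparison relies on.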

  12. An algorithm for identifying the best current friend in a social network

    Directory of Open Access Journals (Sweden)

    Francisco Javier Moreno

    2015-05-01

    Full Text Available A research field in the area of social networks (SNs) is the identification of certain types of users and groups. To facilitate this process, a SN is usually represented by a graph. The centrality measures, which identify the most important vertices in a graph according to some criterion, are common tools for analyzing a graph. One of these measures is the PageRank (a measure originally designed to rank web pages). Informally, in the context of a SN, the PageRank of a user i represents the probability that another user of the SN is viewing the page of i after a considerable time of navigation in the SN. In this paper, we define a new type of user in a SN: the best current friend. The idea is to identify, among the friends of a user i, the friend k that would generate the highest decrease in the PageRank of i if k stopped being his/her friend. This may be useful to identify the users/customers whose friendship/relationship should be a priority to keep. We provide formal definitions, algorithms and some experiments on this subject. Our experiments showed that the best current friend of a user is not necessarily the one who has the highest PageRank in the SN, nor the one who has the most friends.
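The definition can be sketched directly: compute the PageRank of i, then recompute it with each friendship edge removed in turn and keep the friend whose removal causes the largest drop. The toy graph and parameter values are illustrative, not the paper's experimental setup:

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank on an undirected friendship graph
    (adj maps each user to the set of his/her friends)."""
    n = len(adj)
    pr = {v: 1.0 / n for v in adj}
    for _ in range(iters):
        pr = {v: (1.0 - damping) / n
                 + damping * sum(pr[u] / len(adj[u]) for u in adj if v in adj[u])
              for v in adj}
    return pr

def best_current_friend(adj, i):
    """The friend k of i whose removal causes the largest drop in i's PageRank."""
    base = pagerank(adj)[i]
    best, biggest_drop = None, float("-inf")
    for k in adj[i]:
        mod = {v: set(ns) for v, ns in adj.items()}  # copy, then cut the edge i-k
        mod[i].discard(k)
        mod[k].discard(i)
        drop = base - pagerank(mod)[i]
        if drop > biggest_drop:
            biggest_drop, best = drop, k
    return best

# Toy network: "b" is a hub (friends a, d, e); "c" is a leaf whose only friend is "a".
friends = {"a": {"b", "c"}, "b": {"a", "d", "e"}, "c": {"a"},
           "d": {"b"}, "e": {"b"}}
```

On this toy graph the hub "b" has the higher PageRank, yet "c" is a's best current friend, because the leaf "c" passes all of its rank to "a" while the hub splits its rank three ways. That mirrors the abstract's finding that the best current friend need not be the highest-PageRank friend.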

  13. Algorithms for Computing the Magnetic Field, Vector Potential, and Field Derivatives for a Thin Solenoid with Uniform Current Density

    Energy Technology Data Exchange (ETDEWEB)

    Walstrom, Peter Lowell [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-07

    A numerical algorithm for computing the field components Br and Bz and their r and z derivatives, with open boundaries in cylindrical coordinates, for radially thin solenoids with uniform current density is described in this note. An algorithm for computing the vector potential Aθ is also described. For the convenience of the reader, derivations of the final expressions from their defining integrals are given in detail, since their derivations are not all easily found in textbooks. Numerical calculations are based on the evaluation of complete elliptic integrals using the Bulirsch algorithm cel. The (apparently) new feature of the algorithms described in this note applies to cases where the field point is outside of the bore of the solenoid and the field-point radius approaches the solenoid radius. Since the elliptic integrals of the third kind normally used in computing Bz and Aθ become infinite in this region of parameter space, fields for points with the axial coordinate z outside of the ends of the solenoid and near the solenoid radius are treated by use of elliptic integrals of the third kind of modified argument, derived by use of an addition theorem. The algorithms also avoid the numerical difficulties that the textbook solutions have for points near the axis, which arise from explicit factors of 1/r or 1/r2 in some of the expressions.
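A useful sanity check for any such elliptic-integral implementation is the on-axis limit (r = 0), where the field of a radially thin solenoid has a standard closed form. The sketch below is that textbook formula, not the note's algorithm:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability [T*m/A]

def bz_on_axis(z, radius, length, turns, current):
    """Axial field B_z(z) on the axis of a radially thin solenoid centered at z=0.

    Standard closed form: B_z = (mu0*n*I/2) * (cos(theta_1) + cos(theta_2)),
    with the cosines written out via the distances to the two end faces.
    """
    n = turns / length                            # turns per unit length
    zp, zm = length / 2.0 + z, length / 2.0 - z   # distances to the end faces
    return (MU0 * n * current / 2.0) * (zp / math.hypot(radius, zp)
                                        + zm / math.hypot(radius, zm))
```

In the long-solenoid limit this reduces to mu0*n*I at the center and to half that value at an end face, which gives two easy regression checks for the full off-axis code.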

  14. Evaluation of margining algorithms in commercial treatment planning systems

    International Nuclear Information System (INIS)

    Pooler, Alistair M.; Mayles, Helen M.; Naismith, Olivia F.; Sage, John P.; Dearnaley, David P.

    2008-01-01

    Introduction: During commissioning of the Pinnacle (Philips) treatment planning system (TPS) the margining algorithm was investigated and was found to produce larger PTVs than Plato (Nucletron) for identical GTVs. Subsequent comparison of PTV volumes resulting from the QA outlining exercise for the CHHIP (Conventional or Hypofractionated High Dose IMRT for Prostate Ca.) trial confirmed that there were differences in the TPSs' margining algorithms. Margining and the clinical impact of the different PTVs in seven different planning and virtual simulation systems (Pinnacle, Plato, Prosoma (MedCom), Eclipse (7.3 and 7.5) (Varian), MasterPlan (Nucletron), Xio (CMS) and Advantage Windows (AW) (GE)) are investigated, and a simple test for 3D margining consistency is proposed. Methods: Using each TPS, two different sets of prostate GTVs on 2.5 mm and 5 mm slices were margined according to the CHHIP protocol to produce PTV3 (prostate + 5 mm/0 mm post), PTV2 (PTV3 + 5 mm) and PTV1 (prostate and seminal vesicles + 10 mm). GTVs and PTVs were imported into Pinnacle for volume calculation. DVHs for 5 mm slice plans, created using the smallest PTVs, were recalculated on the largest PTV dataset and vice versa. Since adding a margin of 50 mm to a structure should give the same result as adding five margins of 10 mm, this was tested for each TPS (consistency test) using an octahedron as the GTV and CT datasets with 2.5 mm and 5 mm slices. Results: The CHHIP PTV3 and PTV1 volumes had a standard deviation, across the seven systems, of 5%, and PTV2 (margined twice) 9%, on the 5 mm slices. For 2.5 mm slices the standard deviations were 4% and 6%. The ratio of the Pinnacle and the Eclipse 7.3 PTV2 volumes was 1.25. Rectal doses were significantly increased when encompassing Pinnacle PTVs (V50 = 42.8%), compared to Eclipse 7.3 PTVs (V50 = 36.4%). Conversely, fields that adequately treated an Eclipse 7.3 PTV2 were inadequate for a Pinnacle PTV2.
AW and Plato PTV volumes were the most consistent
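The consistency test described above (one 50 mm margin versus five successive 10 mm margins) can be reproduced in miniature with distance-transform margining on a lattice. The sketch below is an illustration of the principle on a 2D unit grid (one 15-voxel margin versus three 5-voxel margins), not any TPS's algorithm:

```python
import math

def add_margin(points, r):
    """Distance-transform margining on a unit grid: every lattice point within
    Euclidean distance r of the input point set."""
    R = int(math.ceil(r))
    offsets = [(dx, dy) for dx in range(-R, R + 1) for dy in range(-R, R + 1)
               if dx * dx + dy * dy <= r * r]
    return {(x + dx, y + dy) for (x, y) in points for dx, dy in offsets}

gtv = {(0, 0)}
single = add_margin(gtv, 15)        # one 15-voxel margin
triple = gtv
for _ in range(3):
    triple = add_margin(triple, 5)  # three successive 5-voxel margins
```

By the triangle inequality the sequential result is always contained in the single large margin, but on a discrete grid it can be a strict subset (for example the point (14, 5) lies within distance 15 of the origin but is not a sum of three lattice steps of length at most 5), which is exactly the kind of inconsistency the proposed test exposes.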

  15. Current strategies for the treatment of aneurysmal bone cysts

    Directory of Open Access Journals (Sweden)

    Panagiotis Tsagozis

    2015-12-01

    Full Text Available Aneurysmal bone cysts are benign bone tumors that usually present in childhood and early adulthood. They usually manifest as expansile osteolytic lesions with a varying potential to be locally aggressive. Since their first description in 1942, a variety of treatment methods has been proposed. Traditionally, these tumors were treated with open surgery; either intralesional surgical procedures or en bloc excisions have been described. Furthermore, a variety of chemical or physical adjuvants has been utilized in order to reduce the risk of local recurrence after excision. Currently, there is a shift to more minimally invasive procedures in order to avoid the complications of open surgical excision. Good results have been reported with percutaneous surgery or with the use of embolization. Recently, sclerotherapy has emerged as a promising treatment, showing effective consolidation of the lesions and functional results that appear to be superior to those of open surgery. Lastly, non-invasive treatment, such as pharmaceutical intervention with denosumab or bisphosphonates, has been reported to be effective in the management of the disease. Radiotherapy has also been shown to confer good local control, either alone or in conjunction with other treatment modalities, but is associated with serious adverse effects. Here, we review the current literature on the methods of treatment of aneurysmal bone cysts. The indications for each type of treatment, the reported outcomes of the interventions, and the potential complications are systematically presented. Our review aims to increase awareness of the different treatment modalities and to facilitate decision-making for each individual patient.

  16. Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Surafel Luleseged Tilahun

    2012-01-01

    Full Text Available The firefly algorithm is one of the new metaheuristic algorithms for optimization problems. The algorithm is inspired by the flashing behavior of fireflies. In the algorithm, randomly generated solutions are considered fireflies, and brightness is assigned depending on their performance on the objective function. One of the rules used to construct the algorithm is that a firefly will be attracted to a brighter firefly, and if there is no brighter firefly, it will move randomly. In this paper we modify this random movement of the brightest firefly by generating random directions in order to determine a direction in which the brightness increases. If no such direction is generated, it remains in its current position. Furthermore, the assignment of attractiveness is modified in such a way that the effect of the objective function is magnified. The simulation results show that the modified firefly algorithm performs better than the standard one in finding the best solution, with smaller CPU time.
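The modification described (the brightest firefly samples random directions and moves only if its brightness improves, otherwise staying put) can be sketched as follows. The parameter values, the sphere objective, and the second (attractiveness) modification being omitted are all simplifying assumptions:

```python
import math
import random

random.seed(3)

def sphere(x):
    return sum(v * v for v in x)

def move_toward(xi, xj, beta0=1.0, gamma=0.01, alpha=0.1):
    """Standard firefly move: xi is attracted toward the brighter xj."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)          # attractiveness decays with distance
    return [a + beta * (b - a) + alpha * random.uniform(-0.5, 0.5)
            for a, b in zip(xi, xj)]

def move_brightest(x, f, alpha=0.1, m=8):
    """Modified step: try m random directions; accept only an improving one."""
    best = x
    for _ in range(m):
        cand = [a + alpha * random.uniform(-0.5, 0.5) for a in x]
        if f(cand) < f(best):
            best = cand
    return best                                   # stays put if nothing improves

def firefly(f, dim=4, n=15, iters=100, lo=-5.0, hi=5.0):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        pop.sort(key=f)                           # pop[0] is the brightest (min f)
        new = [move_brightest(pop[0], f)]
        for i in range(1, n):
            xi = pop[i]
            for j in range(i):                    # attracted by every brighter firefly
                xi = move_toward(xi, pop[j])
            new.append(xi)
        pop = new
    return min(pop, key=f)

best = firefly(sphere)
```

Because the brightest firefly only accepts improving moves, the best objective value is non-increasing over iterations, which is the stability gain of the modification over a blind random step.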

  17. Angina pectoris: current therapy and future treatment options.

    Science.gov (United States)

    Parikh, Raj; Kadowitz, Philip J

    2014-02-01

    Angina pectoris is the consequence of an imbalance between the demand for and the supply of blood to the heart. Angina manifests itself as chest pain or discomfort and is a common complaint of patients in the hospital and in the clinic. There are, in fact, roughly half a million new cases of angina per year. Chest pain, while having many etiologies, is generally considered to be most lethal when related to a cardiac cause. In this review, the authors outline the current medical and surgical therapies used in the management of angina. Highlights of the various clinical trials that have assisted in the investigation of these therapies are also summarized. The authors then provide a focused review of the novel therapy options for angina that are currently being explored. From new medical treatments to revised surgical techniques to the discovery of stem cell therapy, many innovative options are being investigated for the treatment of angina.

  18. Ambulance Clinical Triage for Acute Stroke Treatment: Paramedic Triage Algorithm for Large Vessel Occlusion.

    Science.gov (United States)

    Zhao, Henry; Pesavento, Lauren; Coote, Skye; Rodrigues, Edrich; Salvaris, Patrick; Smith, Karen; Bernard, Stephen; Stephenson, Michael; Churilov, Leonid; Yassi, Nawaf; Davis, Stephen M; Campbell, Bruce C V

    2018-04-01

    Clinical triage scales for prehospital recognition of large vessel occlusion (LVO) are limited by low specificity when applied by paramedics. We created the 3-step ambulance clinical triage for acute stroke treatment (ACT-FAST) as the first algorithmic LVO identification tool, designed to improve specificity by recognizing only severe clinical syndromes and optimizing paramedic usability and reliability. The ACT-FAST algorithm consists of (1) unilateral arm drift to stretcher <10 seconds, (2) severe language deficit (if right arm is weak) or gaze deviation/hemineglect assessed by simple shoulder tap test (if left arm is weak), and (3) eligibility and stroke mimic screen. ACT-FAST examination steps were retrospectively validated, and then prospectively validated by paramedics transporting culturally and linguistically diverse patients with suspected stroke in the emergency department, for the identification of internal carotid or proximal middle cerebral artery occlusion. The diagnostic performance of the full ACT-FAST algorithm was then validated for patients accepted for thrombectomy. In retrospective (n=565) and prospective paramedic (n=104) validation, ACT-FAST displayed higher overall accuracy and specificity, when compared with existing LVO triage scales. Agreement of ACT-FAST between paramedics and doctors was excellent (κ=0.91; 95% confidence interval, 0.79-1.0). The full ACT-FAST algorithm (n=60) assessed by paramedics showed high overall accuracy (91.7%), sensitivity (85.7%), specificity (93.5%), and positive predictive value (80%) for recognition of endovascular-eligible LVO. The 3-step ACT-FAST algorithm shows higher specificity and reliability than existing scales for clinical LVO recognition, despite requiring just 2 examination steps. The inclusion of an eligibility step allowed recognition of endovascular-eligible patients with high accuracy. Using a sequential algorithmic approach eliminates scoring confusion and reduces assessment time. Future
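The three examination steps can be written out as a small decision function. This is a paraphrase of the flow as described in the abstract, with invented parameter names; it is an illustration, not a validated clinical tool:

```python
def act_fast(weak_arm_side, severe_language_deficit,
             gaze_deviation_or_neglect, eligible_no_mimic):
    """Sketch of the 3-step ACT-FAST flow described in the abstract.

    weak_arm_side: "left", "right", or None (no unilateral arm drift to the
    stretcher in <10 s). Step 2 depends on which arm is weak; step 3 is the
    eligibility and stroke-mimic screen. Returns True for suspected
    endovascular-eligible LVO.
    """
    if weak_arm_side is None:                  # step 1: severe arm deficit required
        return False
    if weak_arm_side == "right":               # step 2a: right arm weak -> language
        if not severe_language_deficit:
            return False
    else:                                      # step 2b: left arm weak -> gaze/neglect
        if not gaze_deviation_or_neglect:
            return False
    return eligible_no_mimic                   # step 3: eligibility and mimic screen
```

The sequential structure is the point: each step can only rule out, so there is no score to sum and no threshold to remember, which is what the abstract credits for the reduced assessment time and scoring confusion.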

  19. Oligometastatic non-small-cell lung cancer: current treatment strategies

    Directory of Open Access Journals (Sweden)

    Richard PJ

    2016-11-01

    Full Text Available Patrick J Richard, Ramesh Rengan (Department of Radiation Oncology, University of Washington, Seattle, WA, USA) Abstract: The oligometastatic disease theory was initially described in 1995 by Hellman and Weichselbaum. Since then, much work has been performed to investigate its existence in many solid tumors. This has led to subclassifications of stage IV cancer, which could redefine our treatment approaches and the therapeutic outcomes for this historically “incurable” entity. With a high incidence of stage IV disease, non-small-cell lung cancer (NSCLC) remains a difficult cancer to treat and cure. Recent work has proven the existence of an oligometastatic state in NSCLC, in terms of properly selecting patients who may benefit from aggressive therapy and experience long-term overall survival. This review discusses the current treatment approaches used in oligometastatic NSCLC and provides the evidence and rationale for each approach. The prognostic factors of many trials are discussed, which can be used to properly select patients for aggressive treatment regimens. Future advances in both molecular profiling of NSCLC to find targetable mutations and investigating patient selection may increase the number of patients diagnosed with oligometastatic NSCLC. As this disease entity increases, it is of utmost importance for oncologists treating NSCLC to be aware of the current treatment strategies that exist and the potential advantages/disadvantages of each. Keywords: oligometastatic, non-small-cell lung cancer, oligoprogressive, treatment

  20. SURGICAL TREATMENT OF HEMORRHOIDS: A CRITICAL APPRAISAL OF THE CURRENT OPTIONS

    Science.gov (United States)

    CERATO, Marlise Mello; CERATO, Nilo Luiz; PASSOS, Patrícia; TREIGUE, Alberto; DAMIN, Daniel C.

    2014-01-01

    Introduction Surgical treatment of hemorrhoids is still a dilemma. New techniques have been developed leading to a lower rate of postoperative pain; however, they are associated with a greater likelihood of recurrence. Aim To review current indications as well as the results and complications of the main techniques currently used in the surgical treatment of hemorrhoidal disease. Methods A systematic search of the published data on the options for treatment of hemorrhoids up to December 2012 was conducted using Medline/PubMed, Cochrane, and UpToDate. Results Currently available surgical treatment options include procedure for prolapse and hemorrhoids (PPH), transanal hemorrhoidal dearterialization (THD), and conventional hemorrhoidectomy techniques. Excisional techniques showed similar results regarding pain, time to return to normal activities, and complication rates. PPH and THD were associated with less postoperative pain and lower complication rates; however, both had higher postoperative recurrence rates. Conclusion Conventional surgical techniques yield better long-term results. Despite good results in the immediate postoperative period, PPH and THD have not shown consistent long-term favorable results. PMID:24676303

  1. SU-E-T-202: Impact of Monte Carlo Dose Calculation Algorithm On Prostate SBRT Treatments

    Energy Technology Data Exchange (ETDEWEB)

    Venencia, C; Garrigo, E; Cardenas, J; Castro Pena, P [Instituto de Radioterapia - Fundacion Marie Curie, Cordoba (Argentina)

    2014-06-01

    Purpose: The purpose of this work was to quantify the dosimetric impact of using a Monte Carlo algorithm on SBRT prostate treatments previously calculated with a pencil beam dose calculation algorithm. Methods: A 6MV photon beam produced by a Novalis TX (BrainLAB-Varian) linear accelerator equipped with HDMLC was used. Treatment plans were done using 9 fields with iPlan v4.5 (BrainLAB) and dynamic IMRT modality. The institutional SBRT protocol uses a total dose to the prostate of 40Gy in 5 fractions, every other day. Dose calculation is done by pencil beam (2mm dose resolution) with heterogeneity correction and dose volume constraints (UCLA): for the PTV, D95%=40Gy and D98%>39.2Gy; Rectum V20Gy<50%, V32Gy<20%, V36Gy<10% and V40Gy<5%; Bladder V20Gy<40% and V40Gy<10%; femoral heads V16Gy<5%; penile bulb V25Gy<3cc; urethra and the overlap region between PTV and PRV Rectum Dmax<42Gy. 10 SBRT treatment plans were selected and recalculated using Monte Carlo with 2mm spatial resolution and a mean variance of 2%. DVH comparisons between plans were done. Results: The average differences in the PTV dose constraints were within 2%. However, 3 plans had differences higher than 3%, did not meet the D98% criterion (>39.2Gy), and should have been renormalized. Dose volume constraint differences for rectum, bladder, femoral heads and penile bulb were less than 2% and within tolerances. The urethra region and the overlap between PTV and PRV Rectum showed a dose increment in all plans. The average difference for the urethra region was 2.1% with a maximum of 7.8%, and for the overlap region 2.5% with a maximum of 8.7%. Conclusion: Monte Carlo dose calculation on dynamic IMRT treatments can affect plan normalization. The dose increment in the critical urethra region and in the region overlapping the PTV could have clinical consequences, which need to be studied. The use of a Monte Carlo dose calculation algorithm is limited because inverse planning dose optimization uses only pencil beam.
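The plan evaluation in this record reduces to checking dose-volume constraints of the form V(dose) < limit against each structure's DVH. A minimal sketch of such a check (not from the paper; the DVH points are hypothetical and read without interpolation):

```python
def volume_at_dose(dvh, dose_gy):
    """Fractional volume receiving at least `dose_gy`, read from the nearest
    tabulated DVH point at or above that dose (no interpolation; sketch only).
    `dvh` is a list of (dose_gy, fractional_volume) pairs, dose ascending."""
    for d, v in dvh:
        if d >= dose_gy:
            return v
    return 0.0

def check_constraints(dvh, constraints):
    """`constraints` maps a dose level (Gy) to the maximum allowed fractional
    volume; returns which constraints the DVH satisfies."""
    return {dose: volume_at_dose(dvh, dose) <= vmax
            for dose, vmax in constraints.items()}

# Hypothetical rectum DVH checked against the protocol's rectum limits
# (V20Gy<50%, V32Gy<20%, V36Gy<10%, V40Gy<5%).
rectum_dvh = [(0, 1.00), (10, 0.70), (20, 0.45), (32, 0.15), (36, 0.08), (40, 0.04)]
rectum_limits = {20: 0.50, 32: 0.20, 36: 0.10, 40: 0.05}
print(check_constraints(rectum_dvh, rectum_limits))
```

A real TPS would interpolate the cumulative DVH and handle absolute-volume limits (e.g. the penile bulb's 3cc) separately; this sketch shows only the relative-volume case.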

  2. Current treatments for patients with Alzheimer disease.

    Science.gov (United States)

    Osborn, Gerald G; Saunders, Amanda Vaughn

    2010-09-01

    There is neither proven effective prevention for Alzheimer disease nor a cure for patients with this disorder. Nevertheless, a spectrum of biopsychosocial therapeutic measures is available for slowing progression of the illness and enhancing quality of life for patients. These measures include a range of educational, psychological, social, and behavioral interventions that remain fundamental to effective care. Also available are a number of pharmacologic treatments, including prescription medications approved by the US Food and Drug Administration for Alzheimer disease, "off-label" uses of medications to manage target symptoms, and controversial complementary therapies. Physicians must make the earliest possible diagnosis to use these treatments most effectively. Physicians' goals should be to educate patients and their caregivers, to plan long-term care options, to maximally manage concurrent illnesses, to slow and ameliorate the most disabling symptoms, and to preserve effective functioning for as long as possible. The authors review the various current treatments for patients with Alzheimer disease.

  3. Evaluation of dose calculation algorithms using the treatment planning system XiO with tissue heterogeneity correction turned on

    International Nuclear Information System (INIS)

    Fairbanks, Leandro R.; Barbi, Gustavo L.; Silva, Wiliam T.; Reis, Eduardo G.F.; Borges, Leandro F.; Bertucci, Edenyse C.; Maciel, Marina F.; Amaral, Leonardo L.

    2011-01-01

    Since the cross-section for various radiation interactions depends on tissue material, the presence of heterogeneities affects the final dose delivered. This paper aims to analyze how different treatment planning algorithms (Fast Fourier Transform, Convolution, Superposition, Fast Superposition and Clarkson) perform when heterogeneity corrections are used. To that end, a Farmer-type ionization chamber was positioned reproducibly (during CT acquisition as well as irradiation) inside several phantoms made of aluminum, bone, cork and solid water slabs. The percent difference between the measured dose and that calculated by the various algorithms was less than 5%. The Convolution method shows better results for high-density materials (difference ∼1%), whereas the Superposition algorithm is more accurate for low densities (around 1.1%). (author)

  4. Digital image processing an algorithmic approach with Matlab

    CERN Document Server

    Qidwai, Uvais

    2009-01-01

    Introduction to Image Processing and the MATLAB Environment: Introduction; Digital Image Definitions: Theoretical Account; Image Properties; MATLAB; Algorithmic Account; MATLAB Code. Image Acquisition, Types, and File I/O: Image Acquisition; Image Types and File I/O; Basics of Color Images; Other Color Spaces; Algorithmic Account; MATLAB Code. Image Arithmetic: Introduction; Operator Basics; Theoretical Treatment; Algorithmic Treatment; Coding Examples. Affine and Logical Operations, Distortions, and Noise in Images: Introduction; Affine Operations; Logical Operators; Noise in Images; Distortions in Images; Algorithmic Account

  5. Using genetic algorithms to optimise current and future health planning - the example of ambulance locations

    Directory of Open Access Journals (Sweden)

    Suzuki Hiroshi

    2010-01-01

    Full Text Available Abstract Background Ambulance response time is a crucial factor in patient survival. The number of emergency (EMS) cases requiring an ambulance is increasing due to changes in population demographics, which is lengthening ambulance response times to the emergency scene. This paper predicts EMS cases for 5-year intervals from 2020 to 2050 by correlating current EMS cases with demographic factors at the level of the census area and predicted population changes. It then applies a modified grouping genetic algorithm to compare current and future optimal locations and numbers of ambulances. Sets of potential locations were evaluated in terms of the (current and predicted) EMS case distances to those locations. Results Future EMS demand was predicted to increase by 2030 using the model (R2 = 0.71). The optimal locations of ambulances based on future EMS cases were compared with current locations and with optimal locations modelled on current EMS case data. Optimising ambulance station locations reduced the average response time by 57 seconds. Current and predicted future EMS demand at modelled locations were calculated and compared. Conclusions The reallocation of ambulances to optimal locations improved response times and could contribute to higher survival rates from life-threatening medical events. Modelling EMS case 'demand' over census areas allows the data to be correlated with population characteristics and optimal 'supply' locations to be identified. Comparing current and future optimal scenarios allows more nuanced planning decisions to be made. This is a generic methodology that could be used to provide evidence in support of public health planning and decision making.
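The station-location step can be illustrated with a toy genetic algorithm: choose k station sites from candidate locations so as to minimise the total case-to-nearest-station distance. This is a loose reconstruction under stated assumptions (Manhattan distances, made-up coordinates, simple elitist GA), not the study's grouping GA implementation:

```python
# Toy GA for the station-location subproblem; all data and settings are
# illustrative assumptions, not the study's.
import random

def total_distance(stations, cases):
    """Sum over cases of Manhattan distance to the nearest station."""
    return sum(min(abs(cx - sx) + abs(cy - sy) for sx, sy in stations)
               for cx, cy in cases)

def optimise_stations(candidates, cases, k, generations=100, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(candidates, k) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda s: total_distance(s, cases))  # fitness = total distance
        pop = pop[:pop_size // 2]                          # keep the fitter half
        while len(pop) < pop_size:
            a, b = rng.sample(pop[:pop_size // 2], 2)
            # crossover: draw k distinct sites from the union of both parents
            child = rng.sample(list(set(a) | set(b)), k)
            if rng.random() < 0.2:                         # mutation: swap one site
                site = rng.choice(candidates)
                if site not in child:
                    child[rng.randrange(k)] = site
            pop.append(child)
    return min(pop, key=lambda s: total_distance(s, cases))
```

Run on a 5x5 candidate grid with two clusters of cases, the best pair of sites splits to cover both clusters; the study's version additionally weights cases by predicted census-area demand.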

  6. Sportsman hernia; the review of current diagnosis and treatment modalities.

    Science.gov (United States)

    Paksoy, Melih; Sekmen, Ümit

    2016-01-01

    Groin pain is an important clinical entity that may affect a sportsman's active career. Sportsman's hernia is a chronic low abdominal and groin pain syndrome. Open or laparoscopic surgical treatment may be chosen in case of conservative treatment failure. Studies on sportsman's hernia, which is challenging in both diagnosis and treatment, are ongoing in many centers. We reviewed the treatment results of 37 patients diagnosed with and treated for sportsman's hernia at our hospital between 2011 and 2014, in light of the current literature.

  7. Current trends in endodontic practice: emergency treatments and technological armamentarium.

    Science.gov (United States)

    Lee, Michelle; Winkler, Johnathon; Hartwell, Gary; Stewart, Jeffrey; Caine, Rufus

    2009-01-01

    The current clinical practice of endodontics includes the utilization of a variety of new technological advances and materials. The last comprehensive survey comparing treatment modalities used in endodontic practices was conducted in 1990. The purpose of the current survey was to determine the frequency with which these new endodontic technologies and materials are being used in endodontic practices today. An e-mail questionnaire was sent to the 636 active diplomates of the American Board of Endodontics with current e-mail addresses. Two hundred thirty-two diplomates responded, for a response rate of 35%. Calcium hydroxide was found to be the most frequently used intracanal medicament for all cases diagnosed with necrotic pulps. Ibuprofen was the most frequently prescribed medication for pain, and penicillin was the most frequently prescribed antibiotic when an active infection was present. Eighty-two percent of the respondents are still incorporating hand files in some fashion during the cleansing and shaping phase of treatment. Lateral condensation and continuous wave were the most common methods used for obturation. Digital radiography was reported as being used by 72.5% of the respondents, whereas 45.3% reported using the microscope in greater than 75% of patient treatments. Ultrasonics was used by 97.8% of the respondents. It appears from the results that new endodontic technology is currently being used in the endodontic offices of those who responded to the survey.

  8. New and current preventive treatment options in actinic keratosis.

    Science.gov (United States)

    Arenberger, P; Arenbergerova, M

    2017-09-01

    Actinic keratosis (AK) is a characteristic skin lesion arising on skin areas of subjects with mainly phototype I and phototype II, or with specific genetic factors, who are exposed to prolonged ultraviolet radiation. AK may be considered a precursor of in situ squamous cell carcinoma (SCC), a type of non-melanoma skin cancer (NMSC). However, it is still not possible to predict which AK lesions will develop into SCC. Early treatment of AK is therefore recommended. Despite the increasing number of patients with AK developing into SCC, to date there is still no clearly established therapeutic strategy for AK. Current treatment consists of a multitude of topical lesion-directed or field-directed therapies or a combination of both. Recently, orally administered nicotinamide has been shown to significantly reduce rates of new NMSC and AK in high-risk patients. This study aims to provide an update on the most relevant information about AK and to provide an insight into current and new treatment options. © 2017 European Academy of Dermatology and Venereology.

  9. The indications and timing for operative management of spinal epidural abscess: literature review and treatment algorithm.

    Science.gov (United States)

    Tuchman, Alexander; Pham, Martin; Hsieh, Patrick C

    2014-08-01

    Delayed or inappropriate treatment of spinal epidural abscess (SEA) can lead to serious morbidity or death. It is a rare event with significant variation in its causes, anatomical locations, and rate of progression. Traditionally the treatment of choice has involved emergency surgical evacuation and a prolonged course of antibiotics tailored to the offending pathogen. Recent publications have advocated antibiotic treatment without surgical decompression in select patient populations. Clearly defining those patients who can be safely treated in this manner remains a work in evolution. The authors review the current literature concerning the treatment and outcome of SEA to make recommendations concerning which populations can be safely triaged to nonoperative management and the optimal timing of surgery. A PubMed database search was performed using a combination of search terms and Medical Subject Headings to identify clinical studies reporting on the treatment and outcome of SEA. The literature review revealed 28 original case series containing at least 30 patients and reporting on treatment and outcome. All cohorts were deemed Class III evidence, and in all but two the data were obtained retrospectively. Based on the conclusions of these studies, along with selected smaller studies and review articles, the authors present an evidence-based algorithm for selecting patients who may be safe candidates for nonoperative management. Patients who are unable to undergo an operation, who have had a complete spinal cord injury for more than 48 hours with low clinical or radiographic concern for an ascending lesion, or who are neurologically stable and lack risk factors for failure of medical management may be initially treated with antibiotics alone and close clinical monitoring. If initial medical management is to be undertaken, the patient should be made aware that delayed neurological deterioration may not fully resolve even after prompt surgical treatment. Patients deemed good surgical

  10. Narcolepsy: current treatment options and future approaches

    Directory of Open Access Journals (Sweden)

    Michel Billiard

    2008-06-01

    Full Text Available Michel Billiard, Department of Neurology, Gui de Chauliac Hospital, Montpellier, France. Abstract: The management of narcolepsy is presently at a turning point. Three main avenues are considered in this review: 1) Two tendencies characterize the conventional treatment of narcolepsy. Modafinil has replaced methylphenidate and amphetamine as the first-line treatment of excessive daytime sleepiness (EDS) and sleep attacks, based on randomized, double-blind, placebo-controlled clinical trials of modafinil, but on no direct comparison of modafinil versus traditional stimulants. For cataplexy, sleep paralysis, and hypnagogic hallucinations, new antidepressants tend to replace tricyclic antidepressants and selective serotonin reuptake inhibitors (SSRIs) in spite of a lack of randomized, double-blind, placebo-controlled clinical trials of these compounds; 2) The conventional treatment of narcolepsy is now challenged by sodium oxybate, the sodium salt of gammahydroxybutyrate, based on a series of randomized, double-blind, placebo-controlled clinical trials and a long-term open-label study. This treatment has fairly good efficacy and is active against all symptoms of narcolepsy. Careful titration up to an adequate level is essential both to obtain positive results and to avoid adverse effects; 3) A series of new treatments are currently being tested, either in animal models or in humans. They include novel stimulant and anticataplectic drugs, endocrine therapy, and, more attractively, totally new approaches based on the present state of knowledge of the pathophysiology of narcolepsy with cataplexy, hypocretin-based therapies, and immunotherapy. Keywords: narcolepsy, treatment, conventional drugs, modafinil, sodium oxybate, future treatments

  11. Estimation of parameters for the electrostatic discharge current equation with real human discharge events reference using genetic algorithms

    International Nuclear Information System (INIS)

    Katsivelis, P S; Gonos, I F; Stathopulos, I A

    2010-01-01

    Thorough study of the electrostatic discharge (ESD) current equation shows that it may be different from the equation proposed in the IEC 61000-4-2 Standard. This problem is dealt with in this paper. Using a 2.5 GHz digital oscilloscope and a 50 Ω Pellegrini target as the measuring system, and a dc power supply to provide a charging voltage of 2 kVdc, a series of measurements were performed, so real human-to-metal ESD current waveforms were recorded. Treating the average waveform as a reference, a genetic algorithm (GA) was applied to the equation of the IEC 61000-4-2 Standard for the ESD current, in order to achieve its best fitting to the data set. Four different error norms were used for the GA applications. The best result of the applications of each of them was saved and compared to the others. Thus, a very satisfactory modification of the Standard's equation is presented, which is closer to the real ESD current waveform
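The fitting task in this record can be illustrated with a small real-coded GA. The two-term Heidler-function shape used below is the form commonly cited for the IEC 61000-4-2 discharge current, but all numeric values (parameter bounds, GA settings, the synthetic reference) are illustrative placeholders, not the Standard's nominal parameters and not the authors' implementation:

```python
# Sketch: GA fit of a two-term Heidler model to a reference ESD waveform.
import math
import random

def heidler(t, i_peak, tau_rise, tau_fall, n=1.8):
    """One Heidler term with the usual peak-correction factor k."""
    k = math.exp(-(tau_rise / tau_fall) * (n * tau_fall / tau_rise) ** (1 / n))
    x = (t / tau_rise) ** n
    return (i_peak / k) * x / (1 + x) * math.exp(-t / tau_fall)

def esd_current(t, p):
    """Sum of two Heidler terms; p = (i1, t1, t2, i2, t3, t4), t in ns."""
    i1, t1, t2, i2, t3, t4 = p
    return heidler(t, i1, t1, t2) + heidler(t, i2, t3, t4)

def fit_ga(ts, ref, bounds, generations=60, pop=24, seed=2):
    """Real-coded GA minimising squared error against a reference waveform."""
    rng = random.Random(seed)
    def err(p):
        return sum((esd_current(t, p) - r) ** 2 for t, r in zip(ts, ref))
    popn = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(generations):
        popn.sort(key=err)            # elitist selection: fitter half survives
        popn = popn[:pop // 2]
        while len(popn) < pop:
            a, b = rng.sample(popn[:10], 2)
            child = [(u + v) / 2 for u, v in zip(a, b)]   # blend crossover
            j = rng.randrange(len(bounds))                # mutate one gene
            lo, hi = bounds[j]
            child[j] = min(hi, max(lo, child[j] + rng.gauss(0, 0.05 * (hi - lo))))
            popn.append(child)
    return min(popn, key=err)
```

In the paper the reference is the averaged measured human-to-metal waveform and four different error norms were compared; this sketch uses only the squared-error norm.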

  12. Automated treatment planning for a dedicated multi-source intracranial radiosurgery treatment unit using projected gradient and grassfire algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Ghobadi, Kimia; Ghaffari, Hamid R.; Aleman, Dionne M.; Jaffray, David A.; Ruschin, Mark [Department of Mechanical and Industrial Engineering, University of Toronto, 5 King' s College Road, Toronto, Ontario M5S 3G8 (Canada); Department of Radiation Oncology, University of Toronto, Radiation Medicine Program, Princess Margaret Hospital, 610 University Avenue, Toronto, Ontario M5G 2M9 (Canada)

    2012-06-15

    Purpose: The purpose of this work is to develop a framework to the inverse problem for radiosurgery treatment planning on the Gamma Knife{sup Registered-Sign} Perfexion Trade-Mark-Sign (PFX) for intracranial targets. Methods: The approach taken in the present study consists of two parts. First, a hybrid grassfire and sphere-packing algorithm is used to obtain shot positions (isocenters) based on the geometry of the target to be treated. For the selected isocenters, a sector duration optimization (SDO) model is used to optimize the duration of radiation delivery from each collimator size from each individual source bank. The SDO model is solved using a projected gradient algorithm. This approach has been retrospectively tested on seven manually planned clinical cases (comprising 11 lesions) including acoustic neuromas and brain metastases. Results: In terms of conformity and organ-at-risk (OAR) sparing, the quality of plans achieved with the inverse planning approach were, on average, improved compared to the manually generated plans. The mean difference in conformity index between inverse and forward plans was -0.12 (range: -0.27 to +0.03) and +0.08 (range: 0.00-0.17) for classic and Paddick definitions, respectively, favoring the inverse plans. The mean difference in volume receiving the prescribed dose (V{sub 100}) between forward and inverse plans was 0.2% (range: -2.4% to +2.0%). After plan renormalization for equivalent coverage (i.e., V{sub 100}), the mean difference in dose to 1 mm{sup 3} of brainstem between forward and inverse plans was -0.24 Gy (range: -2.40 to +2.02 Gy) favoring the inverse plans. Beam-on time varied with the number of isocenters but for the most optimal plans was on average 33 min longer than manual plans (range: -17 to +91 min) when normalized to a calibration dose rate of 3.5 Gy/min. In terms of algorithm performance, the isocenter selection for all the presented plans was performed in less than 3 s, while the SDO was performed in an

  13. Automated treatment planning for a dedicated multi-source intracranial radiosurgery treatment unit using projected gradient and grassfire algorithms.

    Science.gov (United States)

    Ghobadi, Kimia; Ghaffari, Hamid R; Aleman, Dionne M; Jaffray, David A; Ruschin, Mark

    2012-06-01

    The purpose of this work is to develop a framework to the inverse problem for radiosurgery treatment planning on the Gamma Knife(®) Perfexion™ (PFX) for intracranial targets. The approach taken in the present study consists of two parts. First, a hybrid grassfire and sphere-packing algorithm is used to obtain shot positions (isocenters) based on the geometry of the target to be treated. For the selected isocenters, a sector duration optimization (SDO) model is used to optimize the duration of radiation delivery from each collimator size from each individual source bank. The SDO model is solved using a projected gradient algorithm. This approach has been retrospectively tested on seven manually planned clinical cases (comprising 11 lesions) including acoustic neuromas and brain metastases. In terms of conformity and organ-at-risk (OAR) sparing, the quality of plans achieved with the inverse planning approach were, on average, improved compared to the manually generated plans. The mean difference in conformity index between inverse and forward plans was -0.12 (range: -0.27 to +0.03) and +0.08 (range: 0.00-0.17) for classic and Paddick definitions, respectively, favoring the inverse plans. The mean difference in volume receiving the prescribed dose (V(100)) between forward and inverse plans was 0.2% (range: -2.4% to +2.0%). After plan renormalization for equivalent coverage (i.e., V(100)), the mean difference in dose to 1 mm(3) of brainstem between forward and inverse plans was -0.24 Gy (range: -2.40 to +2.02 Gy) favoring the inverse plans. Beam-on time varied with the number of isocenters but for the most optimal plans was on average 33 min longer than manual plans (range: -17 to +91 min) when normalized to a calibration dose rate of 3.5 Gy/min. In terms of algorithm performance, the isocenter selection for all the presented plans was performed in less than 3 s, while the SDO was performed in an average of 215 min. PFX inverse planning can be performed using

  14. Automated treatment planning for a dedicated multi-source intracranial radiosurgery treatment unit using projected gradient and grassfire algorithms

    International Nuclear Information System (INIS)

    Ghobadi, Kimia; Ghaffari, Hamid R.; Aleman, Dionne M.; Jaffray, David A.; Ruschin, Mark

    2012-01-01

    Purpose: The purpose of this work is to develop a framework to the inverse problem for radiosurgery treatment planning on the Gamma Knife ® Perfexion™ (PFX) for intracranial targets. Methods: The approach taken in the present study consists of two parts. First, a hybrid grassfire and sphere-packing algorithm is used to obtain shot positions (isocenters) based on the geometry of the target to be treated. For the selected isocenters, a sector duration optimization (SDO) model is used to optimize the duration of radiation delivery from each collimator size from each individual source bank. The SDO model is solved using a projected gradient algorithm. This approach has been retrospectively tested on seven manually planned clinical cases (comprising 11 lesions) including acoustic neuromas and brain metastases. Results: In terms of conformity and organ-at-risk (OAR) sparing, the quality of plans achieved with the inverse planning approach were, on average, improved compared to the manually generated plans. The mean difference in conformity index between inverse and forward plans was −0.12 (range: −0.27 to +0.03) and +0.08 (range: 0.00–0.17) for classic and Paddick definitions, respectively, favoring the inverse plans. The mean difference in volume receiving the prescribed dose (V 100 ) between forward and inverse plans was 0.2% (range: −2.4% to +2.0%). After plan renormalization for equivalent coverage (i.e., V 100 ), the mean difference in dose to 1 mm 3 of brainstem between forward and inverse plans was −0.24 Gy (range: −2.40 to +2.02 Gy) favoring the inverse plans. Beam-on time varied with the number of isocenters but for the most optimal plans was on average 33 min longer than manual plans (range: −17 to +91 min) when normalized to a calibration dose rate of 3.5 Gy/min. In terms of algorithm performance, the isocenter selection for all the presented plans was performed in less than 3 s, while the SDO was performed in an average of 215 min
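The SDO step in these three records is a constrained optimization: beam-on durations must stay nonnegative. A minimal sketch of a projected gradient method for a least-squares stand-in of that problem (the dose matrix and target below are toy values, not PFX data):

```python
# Projected gradient for min ||A x - d||^2 subject to x >= 0.
def projected_gradient(A, d, steps=300, lr=0.05):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        # residual r = A x - d
        r = [sum(A[i][j] * x[j] for j in range(n)) - d[i] for i in range(m)]
        # gradient of 0.5 * ||r||^2 is A^T r
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step, then projection onto the feasible set {x >= 0}
        x = [max(0.0, x[j] - lr * g[j]) for j in range(n)]
    return x

# Toy problem: three voxels, two "sectors"; the nonnegative least-squares
# solution here is x = [1, 1].
A = [[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]]
d = [1.0, 2.0, 2.0]
x = projected_gradient(A, d)
```

The projection onto the nonnegative orthant is just a componentwise clamp, which is what makes projected gradient attractive for duration variables; the actual SDO model also penalizes OAR dose and conformity terms.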

  15. The development and current status of 131I treatment for hyperthyroidism

    International Nuclear Information System (INIS)

    Wang Chunmei; Wang Xuemei

    2010-01-01

    Hyperthyroidism is an autoimmune disease in which excessive amounts of thyroid hormones circulate in the blood. The treatments for hyperthyroidism mainly include antithyroid drugs, 131 I treatment, and surgery. 131 I has been verified as an effective, safe, and simple method to treat hyperthyroidism in adults and children. Current research on 131 I treatment mainly concerns remaining problems in the 131 I treatment of hyperthyroidism and its long-term safety. (authors)

  16. Narcolepsy: current treatment options and future approaches

    Science.gov (United States)

    Billiard, Michel

    2008-01-01

    The management of narcolepsy is presently at a turning point. Three main avenues are considered in this review: 1) Two tendencies characterize the conventional treatment of narcolepsy. Modafinil has replaced methylphenidate and amphetamine as the first-line treatment of excessive daytime sleepiness (EDS) and sleep attacks, based on randomized, double blind, placebo-controlled clinical trials of modafinil, but on no direct comparison of modafinil versus traditional stimulants. For cataplexy, sleep paralysis, and hypnagogic hallucinations, new antidepressants tend to replace tricyclic antidepressants and selective serotonin reuptake inhibitors (SSRIs) in spite of a lack of randomized, double blind, placebo-controlled clinical trials of these compounds; 2) The conventional treatment of narcolepsy is now challenged by sodium oxybate, the sodium salt of gammahydroxybutyrate, based on a series of randomized, double-blind, placebo-controlled clinical trials and a long-term open label study. This treatment has a fairly good efficacy and is active on all symptoms of narcolepsy. Careful titration up to an adequate level is essential both to obtain positive results and avoid adverse effects; 3) A series of new treatments are currently being tested, either in animal models or in humans, They include novel stimulant and anticataplectic drugs, endocrine therapy, and, more attractively, totally new approaches based on the present state of knowledge of the pathophysiology of narcolepsy with cataplexy, hypocretine-based therapies, and immunotherapy. PMID:18830438

  17. Algorithm for the treatment of type 2 diabetes: a position statement of Brazilian Diabetes Society.

    Science.gov (United States)

    Lerario, Antonio C; Chacra, Antonio R; Pimazoni-Netto, Augusto; Malerbi, Domingos; Gross, Jorge L; Oliveira, José Ep; Gomes, Marilia B; Santos, Raul D; Fonseca, Reine Mc; Betti, Roberto; Raduan, Roberto

    2010-06-08

    The Brazilian Diabetes Society is starting an innovative project of quantitative assessment of medical arguments and of implementing a new way of elaborating SBD Position Statements. The final aim of this particular project is to propose a new Brazilian algorithm for the treatment of type 2 diabetes, based on the opinions of endocrinologists surveyed in a poll conducted on the Brazilian Diabetes Society website regarding the latest algorithm proposed by the American Diabetes Association/European Association for the Study of Diabetes, published in January 2009. An additional source used as a basis for the new algorithm was an assessment of the acceptability of controversial arguments published in the international literature, through a panel of renowned Brazilian specialists. Thirty controversial arguments in diabetes were selected with their respective references; each argument was assessed and scored according to its acceptability level and the personal conviction of each member of the evaluation panel. This methodology was adapted from a similar approach adopted in the recent position statement by the American College of Cardiology on coronary revascularization, in which not only cardiologists but also specialists in other related areas took part.

  18. Algorithm for the treatment of type 2 diabetes: a position statement of Brazilian Diabetes Society

    Directory of Open Access Journals (Sweden)

    Lerario Antonio C

    2010-06-01

    Full Text Available Abstract The Brazilian Diabetes Society is starting an innovative project of quantitative assessment of medical arguments and of implementing a new way of elaborating SBD Position Statements. The final aim of this particular project is to propose a new Brazilian algorithm for the treatment of type 2 diabetes, based on the opinions of endocrinologists surveyed in a poll conducted on the Brazilian Diabetes Society website regarding the latest algorithm proposed by the American Diabetes Association/European Association for the Study of Diabetes, published in January 2009. An additional source used as a basis for the new algorithm was an assessment of the acceptability of controversial arguments published in the international literature, through a panel of renowned Brazilian specialists. Thirty controversial arguments in diabetes were selected with their respective references; each argument was assessed and scored according to its acceptability level and the personal conviction of each member of the evaluation panel. This methodology was adapted from a similar approach adopted in the recent position statement by the American College of Cardiology on coronary revascularization, in which not only cardiologists but also specialists in other related areas took part.

  19. The Psychopharmacology Algorithm Project at the Harvard South Shore Program: An Algorithm for Generalized Anxiety Disorder.

    Science.gov (United States)

    Abejuela, Harmony Raylen; Osser, David N

    2016-01-01

    This revision of previous algorithms for the pharmacotherapy of generalized anxiety disorder was developed by the Psychopharmacology Algorithm Project at the Harvard South Shore Program. Algorithms from 1999 and 2010 and associated references were reevaluated. Newer studies and reviews published from 2008-14 were obtained from PubMed and analyzed with a focus on their potential to justify changes in the recommendations. Exceptions to the main algorithm for special patient populations, such as women of childbearing potential, pregnant women, the elderly, and those with common medical and psychiatric comorbidities, were considered. Selective serotonin reuptake inhibitors (SSRIs) are still the basic first-line medication. Early alternatives include duloxetine, buspirone, hydroxyzine, pregabalin, or bupropion, in that order. If response is inadequate, then the second recommendation is to try a different SSRI. Additional alternatives now include benzodiazepines, venlafaxine, kava, and agomelatine. If the response to the second SSRI is unsatisfactory, then the recommendation is to try a serotonin-norepinephrine reuptake inhibitor (SNRI). Other alternatives to SSRIs and SNRIs for treatment-resistant or treatment-intolerant patients include tricyclic antidepressants, second-generation antipsychotics, and valproate. This revision of the GAD algorithm responds to issues raised by new treatments under development (such as pregabalin) and organizes the evidence systematically for practical clinical application.
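For illustration only, and not as clinical guidance, the stepwise sequence summarized above can be restated as a simple lookup; the step labels paraphrase the abstract and omit the early alternatives and special-population exceptions:

```python
# Schematic restatement of the GAD pharmacotherapy sequence described in the
# abstract. Illustrative only; not clinical guidance.
GAD_STEPS = [
    "SSRI (first-line)",
    "different SSRI",
    "SNRI",
    "other alternatives (TCA, second-generation antipsychotic, valproate)",
]

def next_step(failed_adequate_trials):
    """Next recommended trial after a given number of adequate failed trials."""
    i = min(failed_adequate_trials, len(GAD_STEPS) - 1)
    return GAD_STEPS[i]
```

Encoding a published algorithm this way makes the branch points explicit, which is essentially what the Psychopharmacology Algorithm Project's flowcharts do in graphical form.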

  20. Routing algorithms in networks-on-chip

    CERN Document Server

    Daneshtalab, Masoud

    2014-01-01

    This book provides a single-source reference to routing algorithms for Networks-on-Chip (NoCs), as well as in-depth discussions of advanced solutions applied to current and next generation, many core NoC-based Systems-on-Chip (SoCs). After a basic introduction to the NoC design paradigm and architectures, routing algorithms for NoC architectures are presented and discussed at all abstraction levels, from the algorithmic level to actual implementation. Coverage emphasizes the role played by the routing algorithm and is organized around key problems affecting current and next generation, many-core SoCs. A selection of routing algorithms is included, specifically designed to address key issues faced by designers in the ultra-deep sub-micron (UDSM) era, including performance improvement, power, energy, and thermal issues, fault tolerance and reliability. Provides a comprehensive overview of routing algorithms for Networks-on-Chip and NoC-based, manycore systems; Describe...

  1. Pancreatic Cysts: Current Concepts of Pathogenesis, Diagnosis and Diagnostic and Treatment Approach

    Directory of Open Access Journals (Sweden)

    V.M. Ratchik

    2014-09-01

    Full Text Available The relevance of pancreatic cyst treatment is determined by the increasing incidence of pancreatitis, a considerable number of complications, and a high mortality rate. In recent decades there has been steady growth in destructive forms of pancreatitis and, correspondingly, in the number of pancreatic cysts. Pancreatic cysts cause various complications in 18–68 % of cases (suppuration, perforation, bleeding, internal and external fistulas, malignant transformation), which account for a high mortality rate of 9.2–53 %. The nature and extent of surgery depend on the etiology, the presence or absence of a connection between the cyst and the ductal system, and the presence of complications. Surgical treatment remains the method of choice for pancreatic cysts, and minimally invasive procedures have become widely used. The high prevalence of cystic lesions of the pancreas and the difficulty of choosing the optimal method of treatment call for a rational diagnostic and therapeutic algorithm that is convenient for clinical practice. Dissatisfaction with treatment results and a large number of complications motivate the search for a new, so-called gold standard for the treatment of these patients and for determining the proper place of minimally invasive and open surgical techniques.

  2. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    Science.gov (United States)

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  3. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    Science.gov (United States)

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
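
    One of the agreement-study designs mentioned in these records can be illustrated with Bland-Altman 95% limits of agreement between two algorithms' paired measurements. A minimal sketch; the measurement values below are hypothetical, not from the paper:

```python
import statistics

def limits_of_agreement(a, b):
    """Bland-Altman 95% limits of agreement between two algorithms' measurements.

    `a` and `b` are paired QIB measurements of the same cases by two algorithms
    (hypothetical data here). Returns (lower limit, bias, upper limit)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)            # mean of paired differences
    sd = statistics.stdev(diffs)             # sample SD of the differences
    return bias - 1.96 * sd, bias, bias + 1.96 * sd

alg1 = [10.2, 11.5, 9.8, 12.0, 10.9]         # hypothetical volumes, algorithm 1
alg2 = [10.0, 11.9, 9.5, 12.4, 11.0]         # hypothetical volumes, algorithm 2
lower, bias, upper = limits_of_agreement(alg1, alg2)
# If the limits are narrow relative to the clinically tolerable difference,
# the two algorithms may be considered interchangeable for that QIB.
```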

  4. Impact of respiratory-correlated CT sorting algorithms on the choice of margin definition for free-breathing lung radiotherapy treatments.

    Science.gov (United States)

    Thengumpallil, Sheeba; Germond, Jean-François; Bourhis, Jean; Bochud, François; Moeckli, Raphaël

    2016-06-01

    To investigate the impact of the Toshiba phase- and amplitude-sorting algorithms on margin strategies for free-breathing lung radiotherapy treatments in the presence of breathing variations. A 4D CT of a sphere inside a dynamic thorax phantom was acquired and reconstructed with both the phase- and amplitude-sorting algorithms. The phantom was moved by reproducing amplitude variations, frequency variations, and a mix of the two. Artefact analysis was performed for Mid-Ventilation and ITV-based strategies on the images reconstructed by each sorting algorithm. Target volume deviation was assessed by comparing the target volume acquired during irregular motion to the volume acquired during regular motion. The amplitude-sorting algorithm showed reduced artefacts for amplitude-only variations, and the phase-sorting algorithm for frequency-only variations; for combined amplitude and frequency variations, both algorithms performed similarly. Most of the artefacts were blurring and incomplete structures. We found larger artefacts and volume differences for the Mid-Ventilation strategy than for the ITV strategy, resulting in a higher relative difference of the surface distortion value, ranging from a maximum of 14.6% to a minimum of 4.1%. Amplitude sorting is superior to phase sorting in reducing motion artefacts for amplitude variations, while phase sorting is superior for frequency variations. A proper choice of 4D CT sorting algorithm is important in order to reduce motion artefacts, especially if the Mid-Ventilation strategy is used. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
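
    The distinction between the two sorting strategies can be shown on a toy breathing trace. This is a simplified illustration of phase versus amplitude binning in general, not the Toshiba implementation; the trace and bin counts are hypothetical:

```python
import math

def phase_bins(trace, period_samples, n_bins=10):
    """Phase sorting: bin each sample by its position within an assumed fixed
    breathing period, regardless of amplitude."""
    return [int((i % period_samples) / period_samples * n_bins)
            for i in range(len(trace))]

def amplitude_bins(trace, n_bins=10):
    """Amplitude sorting: bin each sample by its displacement between the
    global minimum and maximum, regardless of timing."""
    lo, hi = min(trace), max(trace)
    return [min(int((x - lo) / (hi - lo) * n_bins), n_bins - 1) for x in trace]

# Hypothetical trace: regular frequency (20 samples/cycle), drifting amplitude.
trace = [math.sin(2 * math.pi * i / 20) * (1 + 0.3 * (i // 20))
         for i in range(60)]
p, a = phase_bins(trace, 20), amplitude_bins(trace)
# With amplitude drift, samples at the same phase land in different amplitude
# bins, which is why the two sorting algorithms reconstruct different images.
```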

  5. The evolution of brachytherapy treatment planning

    International Nuclear Information System (INIS)

    Rivard, Mark J.; Venselaar, Jack L. M.; Beaulieu, Luc

    2009-01-01

    Brachytherapy is a mature treatment modality that has benefited from technological advances. Treatment planning has advanced from simple lookup tables to complex, computer-based dose-calculation algorithms. The current approach is based on the AAPM TG-43 formalism, with recent advances in acquiring single-source dose distributions. However, this formalism has clinically relevant limitations for calculating patient dose. New dose-calculation algorithms are being developed based on Monte Carlo methods, collapsed-cone superposition, and solving the linear Boltzmann transport equation. In addition to offering improved dose-calculation tools, future planning systems and brachytherapy treatment planning will account for material heterogeneities, scatter conditions, radiobiology, and image guidance. The AAPM, ESTRO, and other professional societies are working to coordinate clinical integration of these advancements. This Vision 20/20 article provides insight into these endeavors.
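
    The TG-43 formalism mentioned above reduces, in its 1-D point-source form, to D(r) = S_K · Λ · (r0/r)² · g(r) · φ_an(r) with r0 = 1 cm. A minimal sketch; the tabulated g(r) and φ_an(r) values below are hypothetical sample data, not parameters for any real source:

```python
# 1-D TG-43 point-source dose-rate equation:
#   D(r) = S_K * Lambda * (r0 / r)^2 * g(r) * phi_an(r),  r0 = 1 cm.
# g(r) and phi_an(r) below are hypothetical sample values for illustration.

def interp(x, xs, ys):
    """Piecewise-linear interpolation, clamped at the table ends."""
    if x <= xs[0]:
        return ys[0]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return ys[-1]

R = [0.5, 1.0, 2.0, 3.0, 5.0]          # radial distances (cm)
G = [1.04, 1.00, 0.92, 0.84, 0.68]     # hypothetical radial dose function g(r)
PHI = [0.97, 0.96, 0.95, 0.94, 0.93]   # hypothetical anisotropy factor phi_an(r)

def dose_rate(r_cm, S_K=40.0, Lambda=1.10):
    """Dose rate (cGy/h) at r_cm for air-kerma strength S_K (U) and dose-rate
    constant Lambda (cGy/(h*U)); point-source approximation, r0 = 1 cm."""
    return S_K * Lambda * (1.0 / r_cm) ** 2 * interp(r_cm, R, G) * interp(r_cm, R, PHI)

d1 = dose_rate(1.0)  # at r0 the geometry and g(r) terms are unity
```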

  6. Role of cytokines and treatment algorithms in retinopathy of prematurity.

    Science.gov (United States)

    Hartnett, M Elizabeth

    2017-05-01

    Currently, severe retinopathy of prematurity (ROP) is diagnosed by clinical evaluation rather than a laboratory test. Laser is still considered standard care. However, anti-vascular endothelial growth factor (VEGF) agents are being used, and there are questions about whether and when to use them, what dose and type of agent should be considered, and which agent may be most beneficial in specific cases. Also unclear are the effects of laser or anti-VEGF treatment on severe ROP, refractive outcomes, and infant development. This article reviews recent studies related to these questions and other trials for severe ROP. Imaging studies identify biomarkers of risk (plus disease, stage 3 ROP, and ROP in zone I). Intravitreal bevacizumab and ranibizumab have been reported effective in treating aggressive posterior ROP in small series. Recurrences and effects on myopia vary among studies. Use of anti-VEGF agents affects cytokines in the infant blood and reduces systemic VEGF for up to 2 months, raising potential safety concerns. The effects of treatment vary based on infant size and are not directly comparable. The evidence level for most studies is not high. Studies support experimental evidence that inhibiting VEGF reduces stage 3 ROP and peripheral avascular retina. Ongoing large-scale clinical trials may provide clarity on the best treatments for severe ROP. Current guidelines hold for screening and treatment of type 1 ROP.

  7. Current Perspectives on the Pedagogical Value of Algorithm Visualization

    DEFF Research Database (Denmark)

    Demetriadis, Stavros N.; Papadopoulos, Pantelis M.

    2004-01-01

    …and presenting significant research results concerning their pedagogical efficiency. Available studies indicate that it is not the quality of the graphical display ("what the students see") but students' engagement in active learning situations with algorithm visualization systems ("what the students do…"). As for the software, it seems that "low-tech, low-fidelity" AV construction systems may be quite adequate for supporting students' engagement in essential learning activities.

  8. Fluid-structure-coupling algorithm

    International Nuclear Information System (INIS)

    McMaster, W.H.; Gong, E.Y.; Landram, C.S.; Quinones, D.F.

    1980-01-01

    A fluid-structure-interaction algorithm has been developed and incorporated into the two-dimensional code PELE-IC. This code combines an Eulerian incompressible fluid algorithm with a Lagrangian finite element shell algorithm and incorporates the treatment of complex free surfaces. The fluid, structure, and coupling algorithms have been verified by the calculation of solved problems from the literature and from air and steam blowdown experiments. The code has been used to calculate loads and structural response from air blowdown and from the oscillatory condensation of steam bubbles in water suppression pools typical of boiling water reactors. The techniques developed here have been extended to three dimensions and implemented in the computer code PELE-3D.

  9. Fluid structure coupling algorithm

    International Nuclear Information System (INIS)

    McMaster, W.H.; Gong, E.Y.; Landram, C.S.; Quinones, D.F.

    1980-01-01

    A fluid-structure-interaction algorithm has been developed and incorporated into the two-dimensional code PELE-IC. This code combines an Eulerian incompressible fluid algorithm with a Lagrangian finite element shell algorithm and incorporates the treatment of complex free surfaces. The fluid, structure, and coupling algorithms have been verified by the calculation of solved problems from the literature and from air and steam blowdown experiments. The code has been used to calculate loads and structural response from air blowdown and from the oscillatory condensation of steam bubbles in water suppression pools typical of boiling water reactors. The techniques developed have been extended to three dimensions and implemented in the computer code PELE-3D.

  10. A current view of the diagnostics and treatment of phenylketonuria in Slovakia

    Directory of Open Access Journals (Sweden)

    Ürge Oto

    2015-12-01

    Full Text Available An overview of the diagnostics and treatment of phenylketonuria in Slovakia is presented in this paper. The nature of the disease, its incidence and prevalence in Slovakia, its genetic characteristics, current laboratory diagnostics, and treatment options are defined. A new method of phenylketonuria screening in Slovakia, which since 1995 has brought substantial improvement in early detection of the disease and shortened the time to definitive diagnosis, is presented, as well as the importance of tandem mass spectrometry (MS/MS) introduced in the diagnosis of inherited metabolic disorders. The current state of phenylketonuria treatment, focusing on low-protein dietary treatment and supplementation with amino acid mixtures, is analysed. The use of sapropterin, enzyme replacement therapy, large neutral amino acid supplementation and gene therapy are also discussed.

  11. Programming Deep Brain Stimulation for Parkinson's Disease: The Toronto Western Hospital Algorithms.

    Science.gov (United States)

    Picillo, Marina; Lozano, Andres M; Kou, Nancy; Puppi Munhoz, Renato; Fasano, Alfonso

    2016-01-01

    Deep brain stimulation (DBS) is an established and effective treatment for Parkinson's disease (PD). After surgery, a number of extensive programming sessions are performed to define the optimal stimulation parameters. These sessions rely mainly on the neurologist's experience. As a result, patients often undergo inconsistent and inefficient stimulation changes, as well as unnecessary visits. We reviewed the literature on initial and follow-up DBS programming procedures and integrated our current practice at Toronto Western Hospital (TWH) to develop standardized DBS programming protocols. We propose four algorithms, including the initial programming and specific algorithms tailored to symptoms experienced by patients following DBS: speech disturbances, stimulation-induced dyskinesia and gait impairment. We conducted a literature search of PubMed from inception to July 2014 with the keywords "deep brain stimulation", "festination", "freezing", "initial programming", "Parkinson's disease", "postural instability", "speech disturbances", and "stimulation induced dyskinesia". Seventy papers were considered for this review. Based on the literature review and our experience at TWH, we refined four algorithms for: (1) the initial programming stage, and the management of symptoms following DBS, particularly addressing (2) speech disturbances, (3) stimulation-induced dyskinesia, and (4) gait impairment. We propose four algorithms tailored to an individualized approach to managing symptoms associated with DBS and disease progression in patients with PD. We encourage established as well as new DBS centers to test the clinical usefulness of these algorithms in supplementing the current standards of care. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Current status of radiation treatment of water and wastewater

    International Nuclear Information System (INIS)

    Pikaev, A.K.

    1997-01-01

    This is a brief review of the current status of radiation treatment of surface water, groundwater, wastewaters, and sewage sludges. Sources of ionizing radiation, and combination radiation methods for purification are described in some detail. Special attention is paid to pilot and industrial facilities. (author)

  13. On the dosimetric impact of inhomogeneity management in the Acuros XB algorithm for breast treatment

    International Nuclear Information System (INIS)

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Cozzi, Luca

    2011-01-01

    A new algorithm for photon dose calculation, Acuros XB, has recently been introduced in the Varian Eclipse treatment planning system, allowing, similarly to classic Monte Carlo methods, accurate modelling of dose deposition in media. The aim of the present study was to assess its behaviour in clinical cases. Datasets from ten breast patients scanned under different breathing conditions (free breathing and deep inspiration) were used to calculate dose plans with a simple two-tangential-field setting, using Acuros XB (in its versions 10 and 11) and the Anisotropic Analytical Algorithm (AAA) for a 6 MV beam. Acuros XB calculations were performed as dose-to-medium distributions. This feature was investigated to appraise the capability of the algorithm to distinguish between different elemental compositions in the human body: lobular vs. adipose tissue in the breast, and lower (deep inspiration) vs. higher (free breathing) densities in the lung. The analysis of the two breast structures, with densities compatible with muscle and with adipose tissue, showed an average difference in dose calculation between Acuros XB and AAA of 1.6% for the muscle tissue (the lobular breast), with AAA predicting a higher dose than Acuros XB, while the difference for adipose tissue was negligible. From histograms of the dose-difference plans between AAA and Acuros XB (version 10), the dose to the lung portion inside the tangential fields presented an average difference of 0.5% in the free-breathing condition, increasing to 1.5% for the deep-inspiration cases, with AAA predicting higher doses than Acuros XB. In lung tissue, significant differences were also found between Acuros XB versions 10 and 11 for lower-density lung. Acuros XB, unlike AAA, is capable of distinguishing between the different elemental compositions of the body, suggesting the possibility of further improving the accuracy of dose plans computed for actual patient treatments.

  14. Application of the inverse estimation method of current distribution from magnetic fields using genetic algorithm to beam profile measurement

    International Nuclear Information System (INIS)

    Kishimoto, M.; Sakasai, K.; Ara, K.

    1994-01-01

    In this paper, a new type of non-invasive beam profile monitor for intense ion accelerators using high-temperature superconductors is described. We regard the inverse estimation of the beam profile from magnetic fields as an optimum allocation problem of currents over the cross-section of the beam vacuum pipe, and we applied a genetic algorithm to solve this optimization problem. A computer simulation was carried out to verify the effectiveness of this inverse estimation method of the beam profile. (author)
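
    The allocation idea can be sketched as a toy 2-D genetic algorithm: unit line currents are switched on or off at candidate grid points so that the predicted field magnitudes best match the sensor readings. This is a heavily simplified illustration under assumed geometry (constants dropped, 1/r field magnitude), not the authors' implementation:

```python
import math
import random

random.seed(1)  # fixed seed for reproducibility of this sketch

# Hypothetical geometry: candidate current positions on a grid inside the pipe,
# field sensors on a surrounding circle (arbitrary units).
CANDIDATES = [(x * 0.3, y * 0.3) for x in range(-2, 3) for y in range(-2, 3)]
SENSORS = [(1.5 * math.cos(2 * math.pi * k / 16),
            1.5 * math.sin(2 * math.pi * k / 16)) for k in range(16)]

def field(alloc):
    """Field magnitude at each sensor from unit line currents at the allocated
    candidate points (2-D Biot-Savart magnitude ~ 1/r, constants dropped)."""
    out = []
    for sx, sy in SENSORS:
        b = 0.0
        for on, (cx, cy) in zip(alloc, CANDIDATES):
            if on:
                b += 1.0 / math.hypot(sx - cx, sy - cy)
        out.append(b)
    return out

def fitness(alloc, measured):
    """Negative sum of squared errors between measured and predicted fields."""
    return -sum((m - p) ** 2 for m, p in zip(measured, field(alloc)))

def evolve(measured, pop_size=40, gens=60, p_mut=0.05):
    n = len(CANDIDATES)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda a: fitness(a, measured), reverse=True)
        elite = pop[: pop_size // 2]                 # elitist selection
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)
            child = a[:cut] + b[cut:]                # one-point crossover
            child = [g ^ (random.random() < p_mut) for g in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda a: fitness(a, measured))

true_alloc = [1 if i in (6, 12, 18) else 0 for i in range(len(CANDIDATES))]
best = evolve(field(true_alloc))  # recover allocation from "measured" fields
```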

  15. Prepatellar and olecranon bursitis: literature review and development of a treatment algorithm.

    Science.gov (United States)

    Baumbach, Sebastian F; Lobo, Christopher M; Badyine, Ilias; Mutschler, Wolf; Kanz, Karl-Georg

    2014-03-01

    Olecranon bursitis and prepatellar bursitis are common entities, with a minimum annual incidence of 10/100,000, predominantly affecting male patients (80 %) aged 40-60 years. Approximately one-third of cases are septic (SB) and two-thirds are non-septic (NSB), with substantial international variation in treatment regimens. The aim of the study was the development of a literature-based treatment algorithm for prepatellar and olecranon bursitis. Following a systematic review of PubMed, the Cochrane Library, textbooks of emergency medicine and surgery, and a manual reference search, 52 relevant papers were identified. The initial differentiation between SB and NSB was based on clinical presentation, bursal aspirate, and blood sampling analysis. Physical findings suggesting SB were fever >37.8 °C, a prebursal temperature difference greater than 2.2 °C, and skin lesions. Relevant findings in the bursal aspirate were purulent appearance, a reduced fluid-to-serum glucose ratio, leukocyte count >3,000 cells/μl, polymorphonuclear cells >50 %, positive Gram staining, and positive culture. General treatment measures for SB and NSB consist of bursal aspiration, NSAIDs, and PRICE. For patients with confirmed NSB and high athletic or occupational demands, intrabursal steroid injection may be performed. In the case of SB, antibiotic therapy should be initiated. Surgical treatment, i.e., incision, drainage, or bursectomy, should be restricted to severe, refractory, or chronic/recurrent cases. The available evidence did not support the central European concept of immediate bursectomy in cases of SB. A conservative treatment regimen should be pursued, following bursal aspirate-based differentiation between SB and NSB.
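
    The SB-suggestive findings listed in this record can be encoded as a simple screening rule. This is an illustrative simplification only, not clinical guidance: the abstract's thresholds are used where given, the glucose-ratio criterion is omitted because its threshold is not fully specified here, and treating any single positive finding as SB-suggestive is an assumption of the sketch:

```python
# Illustrative sketch only (not clinical guidance). Thresholds from the abstract;
# treating any one positive finding as suggestive of septic bursitis (SB) is a
# simplification of the aspirate-based differentiation described in the paper.

def suggests_septic(temp_c, prebursal_delta_c, skin_lesions,
                    purulent_aspirate, wbc_per_ul, pmn_fraction):
    findings = [
        temp_c > 37.8,             # fever
        prebursal_delta_c > 2.2,   # prebursal temperature difference (deg C)
        skin_lesions,              # skin lesions present
        purulent_aspirate,         # purulent bursal aspirate
        wbc_per_ul > 3000,         # bursal fluid leukocyte count
        pmn_fraction > 0.50,       # polymorphonuclear cell fraction
    ]
    return any(findings)

# A febrile patient with purulent aspirate would be managed as SB
# (aspiration plus antibiotics) rather than NSB (aspiration, NSAIDs, PRICE).
septic = suggests_septic(38.5, 1.0, False, True, 2000, 0.3)
```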

  16. Current status in diabetic macular edema treatments

    Institute of Scientific and Technical Information of China (English)

    Pedro Romero-Aroca

    2013-01-01

    Diabetes is a serious chronic condition which increases the risk of cardiovascular disease, kidney failure and nerve damage leading to amputation. Among its ocular complications, diabetic macular edema is the leading cause of blindness among adults in industrialized countries. Today, blindness from diabetic macular edema is largely preventable with timely detection and appropriate interventional therapy. Treatment should include optimized control of glycemia, arterial tension, lipids and renal status. Laser photocoagulation is currently restricted to focal macular edema in some countries, but given the high cost of intravitreal drugs, laser treatment for focal and diffuse diabetic macular edema (DME) can remain valid as the gold standard in many countries. The intravitreal anti-vascular endothelial growth factor drugs (ranibizumab and bevacizumab) are indicated in the treatment of all types of DME, but the correct administration protocol remains to be defined by the different retina scientific societies. Corticosteroids have a place for diffuse DME in pseudophakic patients, but their complications restrict their use in some patients. Finally, the vitreomacular interface plays an important role, and its exploration is mandatory in all DME patients.

  17. Current and future medical treatments for patients with acromegaly.

    Science.gov (United States)

    Maffezzoni, Filippo; Formenti, Anna Maria; Mazziotti, Gherardo; Frara, Stefano; Giustina, Andrea

    2016-08-01

    Acromegaly is a relatively rare condition of growth hormone (GH) excess associated with significant morbidity and, when left untreated, high mortality. Therapy for acromegaly is targeted at decreasing GH and insulin-like growth factor 1 levels, ameliorating patients' symptoms and decreasing any local compressive effects of the pituitary adenoma. The therapeutic options for acromegaly include surgery, medical therapies (such as dopamine agonists, somatostatin receptor ligands and the GH receptor antagonist pegvisomant) and radiotherapy. However, despite all these treatment options, approximately 50% of patients are not adequately controlled. In this paper, the authors discuss: 1) the efficacy and safety of current medical therapy; 2) the efficacy and safety of the new multireceptor-targeted somatostatin ligand pasireotide; 3) medical treatments currently under clinical investigation (oral octreotide, ITF2984, ATL1103); and 4) preliminary data on the use of new injectable and transdermal/transmucosal formulations of octreotide. This expert opinion supports the need for new therapeutic agents and modalities for patients with acromegaly.

  18. Clostridium difficile infection: current, forgotten and emerging treatment options.

    Science.gov (United States)

    Drekonja, Dimitri M

    2014-09-01

    Clostridium difficile infection (CDI) has increased in incidence and severity, and is now among the most common nosocomial infections. Several agents are available for the initial treatment of CDI, some of which are rarely used, and none of which is clearly superior for initial clinical cure. Fidaxomicin appears to offer a benefit in terms of preventing recurrent disease, although the cost-benefit ratio is debated. Recurrent CDI is a major challenge, occurring after 15-30% of initial episodes. The treatment of recurrent CDI is difficult, with sparse evidence available to support any particular agent. Fecal microbiota therapy, also known as 'stool transplantation', appears to be highly effective, although availability is currently limited, and the regulatory environment is in flux. Synthetic stool products and an orally available fecal microbiota therapy product are both under investigation, which may address the problem of availability. As with most infectious diseases, an effective vaccine would be a welcome addition to our armamentarium, but none is currently available.

  19. Ragweed-induced allergic rhinoconjunctivitis: current and emerging treatment options

    Directory of Open Access Journals (Sweden)

    Ihler F

    2015-02-01

    Full Text Available Friedrich Ihler, Martin Canis; Department of Otorhinolaryngology, University Medical Center Göttingen, Göttingen, Germany. Abstract: Ragweed (Ambrosia spp.) is an annually flowering plant whose pollen bears high allergenic potential. Ragweed-induced allergic rhinoconjunctivitis has long been seen as a major immunologic condition in North America, with high exposure and sensitization rates in the general population. The invasive occurrence of ragweed (A. artemisiifolia) poses an increasing challenge to public health in Europe and Asia as well. Possible explanations for its worldwide spread are climate change and urbanization, as well as pollen transport over long distances by globalized traffic and winds. Due to the increasing disease burden worldwide, and to the lack of a current and comprehensive overview, this study aims to review the current and emerging treatment options for ragweed-induced rhinoconjunctivitis. Sound clinical evidence exists for the symptomatic treatment of ragweed-induced allergic rhinoconjunctivitis with oral third-generation H1-antihistamines and leukotriene antagonists. Topical application of glucocorticoids has also been efficient in randomized controlled clinical trials. Combined approaches employing multiple agents are common. The mainstay of causal treatment to date, especially in North America, is subcutaneous immunotherapy focused on the major allergen, Amb a 1. Beyond this, growing evidence from several geographical regions documents the benefit of sublingual immunotherapy. Future treatment options promise more specific symptomatic treatment and fewer side effects during causal therapy. Novel antihistamines for symptomatic treatment are aimed at the histamine H3-receptor. New adjuvants with toll-like receptor 4 activity, or application of the monoclonal anti-immunoglobulin E antibody omalizumab, are expected to enhance conventional immunotherapy. An approach targeting toll-like receptor 9 by…

  20. Reforming Dutch substance abuse treatment services.

    Science.gov (United States)

    Schippers, Gerard M; Schramade, Mark; Walburg, Jan A

    2002-01-01

    The Dutch substance abuse treatment system is in the middle of a major reorganization. The goal is to improve outcomes by redesigning all major primary treatment processes and by implementing a system of regular monitoring and feedback of clinical outcome data. The new program includes implementing standardized psychosocial behavior-oriented treatment modalities and a stepped-care patient placement algorithm in a core-shell organizational model. This article outlines the new program and presents its objectives, developmental stages, and current status.

  1. A comparison of three optimization algorithms for intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Pflugfelder, D.; Wilkens, J.J.; Nill, S.; Oelfke, U.

    2008-01-01

    In intensity-modulated treatment techniques, the modulation of each treatment field is obtained using an optimization algorithm. Multiple optimization algorithms have been proposed in the literature, e.g. steepest descent, conjugate gradient, and quasi-Newton methods, to name a few. The standard optimization algorithm in our in-house inverse planning tool KonRad is a quasi-Newton algorithm. Although this algorithm yields good results, it also has some drawbacks. We therefore implemented an improved optimization algorithm based on the limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) routine. In this paper the improved optimization algorithm is described. To compare the two algorithms, several treatment plans were optimized using both, including photon (IMRT) as well as proton (IMPT) intensity-modulated therapy plans. To present the results in a larger context, the widely used conjugate gradient algorithm was also included in this comparison. On average, the improved optimization algorithm was six times faster to reach the same objective function value. It did not only accelerate the optimization, however: due to its faster convergence, the improved algorithm usually terminates the optimization process at a lower objective function value. The average observed improvement in the objective function value was 37%. This improvement is clearly visible in the corresponding dose-volume histograms. The benefit of the improved optimization algorithm is particularly pronounced in proton therapy plans. The conjugate gradient algorithm ranked between the other two algorithms, with an average speedup factor of two and an average improvement of the objective function value of 30%. (orig.)
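
    The kind of convergence difference the paper measures can be illustrated on a toy quadratic objective f(x) = ½xᵀAx − bᵀx (the least-squares form used in inverse planning). The sketch below compares steepest descent with conjugate gradient rather than L-BFGS, which would be longer to implement; the matrix is hypothetical, not a real dose-influence matrix:

```python
# Toy comparison: steepest descent vs conjugate gradient on a quadratic
# objective f(x) = 1/2 x^T A x - b^T x with SPD matrix A (hypothetical data).

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def solve(A, b, method, tol=1e-8, max_iter=1000):
    """Minimize f by iterating from x = 0; returns (solution, iterations)."""
    x = [0.0] * len(b)
    r = [bi - ri for bi, ri in zip(b, matvec(A, x))]    # residual = -gradient
    d = r[:]
    for it in range(1, max_iter + 1):
        Ad = matvec(A, d)
        alpha = dot(r, d) / dot(d, Ad)                   # exact line search
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r_new = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        if dot(r_new, r_new) ** 0.5 < tol:
            return x, it
        if method == "cg":                               # conjugate direction
            beta = dot(r_new, r_new) / dot(r, r)         # Fletcher-Reeves
            d = [rn + beta * di for rn, di in zip(r_new, d)]
        else:                                            # steepest descent
            d = r_new[:]
        r = r_new
    return x, max_iter

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]  # SPD test matrix
b = [1.0, 2.0, 3.0]
x_cg, it_cg = solve(A, b, "cg")
x_sd, it_sd = solve(A, b, "sd")
# CG reaches the tolerance in at most n (= 3) iterations on a quadratic;
# steepest descent needs many more, mirroring the ranking reported above.
```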

  2. Agency and Algorithms

    Directory of Open Access Journals (Sweden)

    Hanns Holger Rutz

    2016-11-01

    Full Text Available Although the concept of algorithms was established a long time ago, their current topicality indicates a shift in the discourse. Classical definitions based on logic seem inadequate to describe their aesthetic capabilities. New approaches stress their involvement in material practices as well as their incompleteness. Algorithmic aesthetics can no longer be tied to the static analysis of programs, but must take into account the dynamic and experimental nature of coding practices. It is suggested that the aesthetic objects thus produced articulate something that could be called algorithmicity or the space of algorithmic agency. This is the space or the medium – following Luhmann's form/medium distinction – where human and machine undergo mutual incursions. In the resulting coupled "extimate" writing process, human initiative and algorithmic speculation can no longer be clearly divided out. An observation of the defining aspects of such a medium is attempted by drawing a trajectory across a number of sound pieces. The operation of exchange between form and medium I call reconfiguration, and it is indicated by this trajectory.

  3. Inclusive Flavour Tagging Algorithm

    International Nuclear Information System (INIS)

    Likhomanenko, Tatiana; Derkach, Denis; Rogozhnikov, Alex

    2016-01-01

    Identifying the production flavour of neutral B mesons is one of the most important components needed in the study of time-dependent CP violation. The harsh environment of the Large Hadron Collider makes it particularly hard to succeed in this task. We present an inclusive flavour-tagging algorithm as an upgrade of the algorithms currently used by the LHCb experiment. Specifically, a probabilistic model which efficiently combines information from reconstructed vertices and tracks using machine learning is proposed. The algorithm does not use information about the underlying physics process. It reduces the dependence on the performance of lower-level identification capacities and thus increases the overall performance. The proposed inclusive flavour-tagging algorithm is applicable to tagging the flavour of B mesons in any proton-proton experiment. (paper)
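
    One basic ingredient of such tagging, combining per-track probabilities into a single event-level tag, can be sketched under an independence assumption. The per-track values below are hypothetical classifier outputs, not LHCb data, and real taggers calibrate and weight these inputs rather than combining them naively:

```python
# Minimal sketch: combine independent per-track tag probabilities into one
# event-level probability. Inputs are hypothetical classifier outputs.

def combine(track_probs):
    """Each entry is P(track indicates the B contained a b quark).
    Returns the combined event-level P(b), assuming independent tracks."""
    p_b, p_bbar = 1.0, 1.0
    for p in track_probs:
        p_b *= p            # likelihood of the b hypothesis
        p_bbar *= 1.0 - p   # likelihood of the b-bar hypothesis
    return p_b / (p_b + p_bbar)

p = combine([0.6, 0.7, 0.55])
# Three weakly b-favouring tracks combine into a stronger b-favouring tag.
```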

  4. Current and future treatment options in osteoporosis.

    LENUS (Irish Health Repository)

    Brewer, Linda

    2012-02-01

    PURPOSE: The incidence of osteoporosis-related fractures will increase substantially over the coming decades as the population ages globally. This has important economic and public health implications, contributing substantially to morbidity and excess mortality in this population. METHODS: When prescribing for older patients, the effectiveness profile of drugs needs to be balanced against their tolerability in individual patients. RESULTS: Currently we have good anti-fracture data to support the use of many available anti-resorptive and anabolic drugs, including bisphosphonates, strontium ranelate and recombinant human parathyroid hormone. We also have evidence demonstrating the importance of calcium and vitamin D repletion in these patients. However, in recent years our understanding of normal bone physiology and of the mechanisms underlying the development of osteoporosis has significantly advanced, and this has led to the development of new therapies. Novel agents, particularly denosumab, but also inhibitors of cathepsin K and anabolic agents that act on Wnt signalling, will increase the therapeutic options for clinicians in the coming years. CONCLUSION: This review discusses the evidence supporting the use of currently available treatment options for osteoporosis and potential future advances in drug therapy. Particular consideration should be given when prescribing for older patients who have issues with compliance or tolerance, and for those with co-morbidities or levels of frailty that may restrict the choice of therapy. Understanding the evidence for the benefit and possible harm of osteoporosis treatments is critical to appropriate management of this patient population.

  5. Programming Deep Brain Stimulation for Tremor and Dystonia: The Toronto Western Hospital Algorithms.

    Science.gov (United States)

    Picillo, Marina; Lozano, Andres M; Kou, Nancy; Munhoz, Renato Puppi; Fasano, Alfonso

    2016-01-01

    Deep brain stimulation (DBS) is an effective treatment for essential tremor (ET) and dystonia. After surgery, a number of extensive programming sessions are performed, relying mainly on the neurologist's personal experience, as no programming guidelines have been established so far beyond recommendations from groups of experts. Moreover, less information is available for the management of DBS in ET and dystonia than in Parkinson's disease. Our aim is to review the literature on initial and follow-up DBS programming procedures for ET and dystonia and to integrate the results with our current practice at Toronto Western Hospital (TWH) to develop standardized DBS programming protocols. We conducted a literature search of PubMed from inception to July 2014 with the keywords "balance", "bradykinesia", "deep brain stimulation", "dysarthria", "dystonia", "gait disturbances", "initial programming", "loss of benefit", "micrographia", "speech", "speech difficulties" and "tremor". Seventy-six papers were considered for this review. Based on the literature review and our experience at TWH, we refined three algorithms for the management of ET: (1) initial programming, (2) management of balance and speech issues and (3) loss of stimulation benefit. We also devised algorithms for the management of dystonia: (1) initial programming and (2) management of stimulation-induced hypokinesia (shuffling gait, micrographia and speech impairment). We propose five algorithms tailored to an individualized approach to managing ET and dystonia patients with DBS. We encourage established as well as new DBS centers to apply these algorithms alongside current standards of care and to test their clinical usefulness. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Current challenges and emerging drug delivery strategies for the treatment of psoriasis.

    Science.gov (United States)

    Hoffman, Melissa B; Hill, Dane; Feldman, Steven R

    2016-10-01

    Psoriasis is a common skin disorder associated with physical, social, psychological and financial burden. Over the past two decades, advances in our understanding of pathogenesis and increased appreciation for the multifaceted burden of psoriasis have led to new treatment development and better patient outcomes. Yet, surveys demonstrate that many psoriasis patients are either undertreated or dissatisfied with treatment. There are many barriers that need to be overcome to optimize patient outcomes and satisfaction. This review covers the current challenges associated with each major psoriasis treatment strategy (topical, phototherapy, oral medications and biologics). It also reviews the challenges associated with the psychosocial aspects of the disease and how they affect treatment outcomes. Patient adherence, inconvenience, high costs, and drug toxicities are all discussed. We then review emerging drug delivery strategies in topical, oral, and biologic therapy. By outlining current treatment challenges and emerging drug delivery strategies, we hope to highlight the deficits in psoriasis treatment and strategies for overcoming them. Regardless of disease severity, clinicians should use a patient-centered approach. In all cases, we need to balance patients' psychosocial needs, treatment costs, convenience, and effectiveness with patients' preferences in order to optimize treatment outcomes.

  7. BACK PAIN ASSOCIATED WITH OSTEOPOROSIS — TREATMENT PATTERNS, APPROACHES TO THERAPY

    Directory of Open Access Journals (Sweden)

    N. A. Shostak

    2014-07-01

    Full Text Available The article highlights current approaches to the diagnosis and treatment of back pain associated with osteoporosis. An algorithm for the management of patients with vertebral compression fractures complicated by pain is presented, and the main approaches to the drug treatment of back pain and osteoporosis are described.

  8. MREIT experiments with 200 µA injected currents: a feasibility study using two reconstruction algorithms, SMM and harmonic B(Z).

    Science.gov (United States)

    Arpinar, V E; Hamamura, M J; Degirmenci, E; Muftuler, L T

    2012-07-07

    Magnetic resonance electrical impedance tomography (MREIT) is a technique that produces images of conductivity in tissues and phantoms. In this technique, electrical currents are applied to an object, the resulting magnetic flux density is measured using magnetic resonance imaging (MRI), and the conductivity distribution is reconstructed from these MRI data. Currently, the technique is used in research environments, primarily for studying phantoms and animals. In order to translate MREIT to clinical applications, strict safety standards need to be established, especially for safe current limits. However, there are currently no standards for safe current limits specific to MREIT. Until such standards are established, human MREIT applications need to conform to existing electrical safety standards in medical instrumentation, such as IEC601. This protocol limits patient auxiliary currents to 100 µA for low frequencies. However, published MREIT studies have utilized currents 10-400 times larger than this limit, bringing into question whether the clinical applications of MREIT are attainable under current standards. In this study, we investigated the feasibility of MREIT to accurately reconstruct the relative conductivity of a simple agarose phantom using 200 µA total injected current and tested the performance of two MREIT reconstruction algorithms: the iterative sensitivity matrix method (SMM) by Ider and Birgul (1998 Elektrik 6 215-25) with Tikhonov regularization, and the harmonic B(Z) method proposed by Oh et al (2003 Magn. Reson. Med. 50 875-8). The reconstruction techniques were tested at both 200 µA and 5 mA injected currents to investigate their noise sensitivity at low and high current conditions. It should be noted that 200 µA total injected current into a cylindrical phantom generates only 14.7 µA of current in the imaging slice. Similarly, 5 mA total injected current results in 367 µA in the imaging slice.
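
    The harmonic B(Z) family of algorithms is built around the Laplacian of the measured Bz component, which vanishes wherever the conductivity is homogeneous; the conductivity update is driven by its nonzero residual. A minimal finite-difference sketch of that core quantity (illustrative only, not the authors' implementation):

```python
import numpy as np

def laplacian2d(bz, h):
    """Five-point finite-difference Laplacian of a measured Bz slice on a
    uniform grid with spacing h; boundary values are left at zero."""
    lap = np.zeros_like(bz)
    lap[1:-1, 1:-1] = (bz[1:-1, 2:] + bz[1:-1, :-2]
                       + bz[2:, 1:-1] + bz[:-2, 1:-1]
                       - 4.0 * bz[1:-1, 1:-1]) / h**2
    return lap

# A harmonic field such as x^2 - y^2 has zero Laplacian in the interior,
# mimicking Bz over a region of homogeneous conductivity
x, y = np.meshgrid(np.linspace(-1, 1, 33), np.linspace(-1, 1, 33))
lap = laplacian2d(x**2 - y**2, h=2 / 32)
```

    In the full algorithm this residual, combined across injection patterns, feeds an iterative update of the conductivity map.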

  9. Current approaches and future directions in the treatment of leprosy

    Directory of Open Access Journals (Sweden)

    Worobec SM

    2012-08-01

    Full Text Available Sophie M Worobec, Department of Dermatology, College of Medicine, University of Illinois at Chicago, Chicago, Illinois, USA. Abstract: This review surveys current treatments and future treatment trends in leprosy from a clinical perspective. The World Health Organization provides a multidrug treatment regimen that targets the Mycobacterium leprae bacillus which causes leprosy. Several investigational drugs are available for the treatment of drug-resistant M. leprae. Future directions in leprosy treatment will focus on: the molecular signaling mechanisms M. leprae uses to avoid triggering an immune response; prospective studies of the side effects experienced during multiple-drug therapy; recognition of relapse rates after completion of designated treatments; combating multidrug resistance; vaccine development; development of new diagnostic tests; and the implications of the recent discovery of a genetically distinct leprosy-causing bacillus, Mycobacterium lepromatosis. Keywords: epidemiology, leprosy, Hansen’s disease, multidrug resistance, multidrug therapy

  10. Formulating adaptive radiation therapy (ART) treatment planning into a closed-loop control framework

    International Nuclear Information System (INIS)

    Zerda, Adam de la; Armbruster, Benjamin; Xing Lei

    2007-01-01

    While ART has been studied for years, the specific quantitative implementation details have not. In order for this new scheme of radiation therapy (RT) to reach its potential, an effective ART treatment planning strategy capable of taking into account the dose delivery history and the patient's on-treatment geometric model must be in place. This paper performs a theoretical study of dynamic closed-loop control algorithms for ART and compares their utility with data from phantom and clinical cases. We developed two classes of algorithms: those Adapting to Changing Geometry and those Adapting to Geometry and Delivered Dose. The former class takes into account organ deformations found just before treatment. The latter class optimizes the dose distribution accumulated over the entire course of treatment by adapting at each fraction, not only to the information just before treatment about organ deformations but also to the dose delivery history. We showcase two algorithms in the class of those Adapting to Geometry and Delivered Dose. A comparison of the approaches indicates that certain closed-loop ART algorithms may significantly improve the current practice. We anticipate that improvements in imaging, dose verification and reporting will further increase the importance of adaptive algorithms
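
    The closed-loop idea of adapting to delivered-dose history can be illustrated with a one-line re-planning rule: at each fraction, re-prescribe from the remaining dose budget so that accumulated delivery errors are corrected over the remaining fractions. This is a toy sketch, not one of the paper's algorithms; the function name and numbers are hypothetical.

```python
def adapt_fraction_dose(prescribed_total, delivered_so_far, fractions_left):
    """Closed-loop re-planning sketch: spread the undelivered dose evenly
    over the remaining fractions, so past under- or over-delivery
    (e.g. from geometric changes) is compensated in later fractions."""
    if fractions_left <= 0:
        raise ValueError("no fractions remaining")
    return (prescribed_total - delivered_so_far) / fractions_left

# 60 Gy prescribed in 30 fractions; after 10 fractions only 19 Gy of the
# nominal 20 Gy was actually delivered, so the next fraction is raised
d = adapt_fraction_dose(60.0, 19.0, 20.0)  # 2.05 Gy instead of 2.0 Gy
```

    The algorithms in the paper additionally adapt the spatial dose distribution to the on-treatment geometric model, not just the scalar fraction size.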

  11. Misophonia: current perspectives

    Directory of Open Access Journals (Sweden)

    Cavanna AE

    2015-08-01

    Full Text Available Andrea E Cavanna,1–3 Stefano Seri3,4 1Department of Neuropsychiatry, Birmingham and Solihull Mental Health NHS Foundation Trust and University of Birmingham, Birmingham, 2Sobell Department of Motor Neuroscience and Movement Disorders, Institute of Neurology, University College London, London, 3School of Life and Health Sciences, Aston Brain Centre, Wellcome Trust Laboratory for MEG Studies, Aston University, 4Children’s Epilepsy Surgery Programme, The Birmingham Children’s Hospital NHS Foundation Trust, Birmingham, UK. Abstract: Misophonia is characterized by a negative reaction to a sound with a specific pattern and meaning to a given individual. In this paper, we review the clinical features of this relatively common yet underinvestigated condition, with a focus on co-occurring neurodevelopmental disorders. Currently available data on the putative pathophysiology of the condition can inform our understanding and guide the diagnostic process and treatment approach. Tinnitus retraining therapy and cognitive behavior therapy have been proposed as the most effective treatment strategies for reducing symptoms; however, current treatment algorithms should be validated in large population studies. At the present stage, competing paradigms see misophonia as a physiological state potentially inducible in any subject, an idiopathic condition (which can present with comorbid psychiatric disorders), or a symptomatic manifestation of an underlying psychiatric disorder. Agreement on the use of standardized diagnostic criteria would be an important step forward in terms of both clinical practice and scientific inquiry. Areas for future research include phenomenology, epidemiology, modulating factors, neurophysiological underpinnings, and treatment trials. Keywords: misophonia, selective sound sensitivity syndrome, hyperacusis, neurodevelopmental disorders, Tourette syndrome, obsessive-compulsive spectrum

  12. Fault Diagnosis System of Induction Motors Based on Neural Network and Genetic Algorithm Using Stator Current Signals

    Directory of Open Access Journals (Sweden)

    Tian Han

    2006-01-01

    Full Text Available This paper proposes an online fault diagnosis system for induction motors through the combination of discrete wavelet transform (DWT), feature extraction, genetic algorithm (GA), and artificial neural network (ANN) techniques. The wavelet transform improves the signal-to-noise ratio during preprocessing. Features are extracted from the motor stator current, reducing data transfers and enabling online application. The GA is used to select the most significant features from the whole feature database and to optimize the ANN structure parameters. The optimized ANN is trained and tested with the selected features of the measured stator current data. The combination of these techniques reduces the learning time and increases the diagnosis accuracy. The efficiency of the proposed system is demonstrated on motor faults of electrical and mechanical origin in induction motors. The test results indicate that the proposed system is promising for real-time application.
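
    The GA-based feature-selection step can be sketched with bit-mask individuals, elitism, single-point crossover and bit-flip mutation. The fitness function below is a toy stand-in for the ANN validation accuracy used in the paper; all names and numbers are illustrative.

```python
import random

def ga_select(n_feats, fitness, pop=20, gens=40, seed=1):
    """Toy genetic algorithm for feature selection: each individual is a
    bit-mask over features; elitism keeps the better half, children are
    built by single-point crossover plus one-bit mutation."""
    rng = random.Random(seed)
    P = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness, reverse=True)
        elite = P[:pop // 2]                    # survivors
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_feats)     # single-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_feats)] ^= 1  # bit-flip mutation
            children.append(child)
        P = elite + children
    return max(P, key=fitness)

# Toy fitness standing in for ANN accuracy: features 0, 2 and 5 are
# "informative"; every selected feature carries a small cost
def fitness(mask):
    return sum(mask[i] for i in (0, 2, 5)) - 0.2 * sum(mask)

best = ga_select(8, fitness)
```

    With elitism the best mask found is never lost, so the selected subset's fitness is non-decreasing across generations.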

  13. Metastatic breast cancer: do current treatments improve quality of life? A prospective study

    Directory of Open Access Journals (Sweden)

    Fernanda Amado

    Full Text Available CONTEXT AND OBJECTIVE: In metastatic breast cancer cases, the currently available therapeutic approaches provide minimal improvement in survival. As such, quality of life (QOL) becomes one of the main objectives of treatment. It is not known whether current treatments derived from trials improve QOL. The aim was to evaluate changes in QOL among metastatic breast cancer patients receiving treatment derived from trials. DESIGN AND SETTING: Prospective observational QOL survey in a tertiary cancer center. METHODS: To evaluate the influence of current treatments on patients' QOL, the Medical Outcomes Study Short Form-36 (SF-36) and the Beck Depression Inventory (BDI) were applied on three occasions: before starting treatment and at the 6th and 12th weeks, to consecutive metastatic breast cancer patients over a one-year period. RESULTS: We found an improvement in QOL in the sample evaluated (n = 40), expressed by changes in the overall SF-36 score (p = 0.002) and the BDI (p = 0.004). Taken individually, the SF-36 components Pain, Social Functioning and Mental Health also improved significantly. Patients with worse initial performance status and secondary symptoms displayed greater improvement than those with better initial performance status and asymptomatic disease (p < 0.001). Patients who received more than one type of therapy showed larger gains than those given only one type (p = 0.038). CONCLUSIONS: In our environment, current metastatic breast cancer treatments can improve QOL, especially among symptomatic patients and those with low performance status.

  14. Decoding algorithm for vortex communications receiver

    Science.gov (United States)

    Kupferman, Judy; Arnon, Shlomi

    2018-01-01

    Vortex light beams can provide a tremendous alphabet for encoding information. We derive a symbol decoding algorithm for a direct detection matrix detector vortex beam receiver using Laguerre Gauss (LG) modes, and develop a mathematical model of symbol error rate (SER) for this receiver. We compare SER as a function of signal to noise ratio (SNR) for our algorithm and for the Pearson correlation algorithm. To our knowledge, this is the first comprehensive treatment of a decoding algorithm of a matrix detector for an LG receiver.
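
    A Pearson-correlation decoder of the kind compared in the paper can be sketched by matching the measured detector pattern against stored LG-mode intensity templates and picking the best-correlated symbol. The 1D sine-squared "templates" below are illustrative stand-ins, not actual LG mode profiles.

```python
import numpy as np

def decode_symbol(measurement, templates):
    """Return the index of the template whose Pearson correlation with
    the measured intensity pattern is largest."""
    def pearson(a, b):
        a = a.ravel() - a.mean()
        b = b.ravel() - b.mean()
        return float(a @ b / np.sqrt((a @ a) * (b @ b)))
    return max(range(len(templates)),
               key=lambda k: pearson(measurement, templates[k]))

# Toy 1D "detector" patterns standing in for LG intensity profiles
rng = np.random.default_rng(0)
templates = [np.sin(np.linspace(0, (k + 1) * np.pi, 64)) ** 2
             for k in range(4)]
noisy = templates[2] + 0.1 * rng.standard_normal(64)
symbol = decode_symbol(noisy, templates)
```

    The SER model in the paper then quantifies how often such a maximum-correlation decision picks the wrong symbol as a function of SNR.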

  15. The Retina Algorithm

    CERN Multimedia

    CERN. Geneva; PUNZI, Giovanni

    2015-01-01

    Charged-particle reconstruction is one of the most demanding computational tasks in HEP, and it is becoming increasingly important to perform it in real time. We envision that HEP would greatly benefit from achieving the long-term goal of making track reconstruction happen transparently as part of the detector readout ("detector-embedded tracking"). We describe here a track-reconstruction approach based on a massively parallel pattern-recognition algorithm, inspired by studies of the processing of visual images by the brain as it happens in nature (the 'RETINA algorithm'). It turns out that high-quality tracking in large HEP detectors is possible with very small latencies when this algorithm is implemented in specialized processors based on current state-of-the-art, high-speed/high-bandwidth digital devices.

  16. Natural speech algorithm applied to baseline interview data can predict which patients will respond to psilocybin for treatment-resistant depression.

    Science.gov (United States)

    Carrillo, Facundo; Sigman, Mariano; Fernández Slezak, Diego; Ashton, Philip; Fitzgerald, Lily; Stroud, Jack; Nutt, David J; Carhart-Harris, Robin L

    2018-04-01

    Natural speech analytics has seen some improvements over recent years, and this has opened a window for objective and quantitative diagnosis in psychiatry. Here, we used a machine learning algorithm applied to natural speech to ask whether language properties measured before psilocybin treatment for treatment-resistant depression can predict for which patients it will be effective and for which it will not. A baseline autobiographical memory interview was conducted and transcribed. Patients with treatment-resistant depression received 2 doses of psilocybin, 10 mg and 25 mg, 7 days apart. Psychological support was provided before, during and after all dosing sessions. Quantitative speech measures were applied to the interview data from 17 patients and 18 untreated age-matched healthy control subjects. A machine learning algorithm was used to classify between controls and patients and to predict treatment response. Speech analytics and machine learning successfully differentiated depressed patients from healthy controls and identified treatment responders from non-responders with 85% accuracy (75% precision). Automatic natural language analysis was used to predict effective response to treatment with psilocybin, suggesting that these tools offer a highly cost-effective facility for screening individuals for treatment suitability and sensitivity. The sample size was small and replication is required to strengthen inferences on these results. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Current Treatment Options in Challenging Oral Diseases: Burning Mouth Syndrome

    OpenAIRE

    Bilgen Erdoğan; Murat Yılmaz

    2012-01-01

    Burning mouth syndrome is a chronic condition characterized by burning pain without any signs of an oral mucosal pathology; it usually affects postmenopausal women. The burning sensation is often accompanied by dysgeusia and xerostomia. The pathogenesis of the disease is unknown, and an effective treatment option for most patients has not yet been defined. The aim of this review is to present current pharmacological and psychological treatments for burning mouth syndrome.

  18. Current Treatment Options in Challenging Oral Diseases: Burning Mouth Syndrome

    Directory of Open Access Journals (Sweden)

    Bilgen Erdoğan

    2012-12-01

    Full Text Available Burning mouth syndrome is a chronic condition characterized by burning pain without any signs of an oral mucosal pathology; it usually affects postmenopausal women. The burning sensation is often accompanied by dysgeusia and xerostomia. The pathogenesis of the disease is unknown, and an effective treatment option for most patients has not yet been defined. The aim of this review is to present current pharmacological and psychological treatments for burning mouth syndrome.

  19. An enhanced block matching algorithm for fast elastic registration in adaptive radiotherapy

    International Nuclear Information System (INIS)

    Malsch, U; Thieke, C; Huber, P E; Bendl, R

    2006-01-01

    Image registration has many medical applications in diagnosis, therapy planning and therapy. Especially for time-adaptive radiotherapy, an efficient and accurate elastic registration of images acquired for treatment planning, and at the time of the actual treatment, is highly desirable. Therefore, we developed a fully automatic and fast block matching algorithm which identifies a set of anatomical landmarks in a 3D CT dataset and relocates them in another CT dataset by maximization of local correlation coefficients in the frequency domain. To transform the complete dataset, a smooth interpolation between the landmarks is calculated by modified thin-plate splines with local impact. The concept of the algorithm allows separate processing of image discontinuities such as temporally changing air cavities in the intestinal tract or rectum. The result is a fully transformed 3D planning dataset (planning CT as well as delineations of tumour and organs at risk) mapped to a verification CT, allowing evaluation and, if necessary, changes of the treatment plan based on the current patient anatomy without time-consuming manual re-contouring. Typically the total calculation time is less than 5 min, which allows the use of the registration tool between acquiring the verification images and delivering the dose fraction for online corrections. We present verifications of the algorithm for five patient datasets with different tumour locations (prostate, paraspinal and head-and-neck) by comparing the results with manually selected landmarks, visual assessment and consistency testing. It turns out that the mean error of the registration is better than the voxel resolution (2 × 2 × 3 mm³). In conclusion, we present an algorithm for fully automatic elastic image registration that is precise and fast enough for online corrections in an adaptive fractionated radiation treatment course
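
    The per-landmark matching step, maximizing a local correlation in the frequency domain, can be sketched with an FFT-based cross-correlation that recovers the translation between two image patches. This is an illustrative rigid-shift simplification of the block matching described above; the names are hypothetical.

```python
import numpy as np

def find_shift(fixed, moving):
    """Recover the integer translation of `fixed` relative to `moving`
    by locating the peak of their cross-correlation, evaluated in the
    frequency domain via the FFT correlation theorem."""
    F = np.fft.fft2(fixed)
    M = np.fft.fft2(moving)
    corr = np.real(np.fft.ifft2(F * np.conj(M)))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap circular shifts into the signed range
    if dy > fixed.shape[0] // 2:
        dy -= fixed.shape[0]
    if dx > fixed.shape[1] // 2:
        dx -= fixed.shape[1]
    return dy, dx

img = np.zeros((32, 32))
img[8:12, 10:15] = 1.0                         # a small "landmark" block
shifted = np.roll(np.roll(img, 3, axis=0), -2, axis=1)
dy, dx = find_shift(shifted, img)              # recovers (3, -2)
```

    Run independently for many small blocks around anatomical landmarks, the same peak search yields the sparse displacement field that the thin-plate-spline interpolation then smooths.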

  20. Development and validation of an algorithm for laser application in wound treatment

    Directory of Open Access Journals (Sweden)

    Diequison Rite da Cunha

    2017-12-01

    Full Text Available ABSTRACT Objective: To develop and validate an algorithm for laser wound therapy. Method: Methodological study and literature review. For the development of the algorithm, a review was performed in the Health Sciences databases covering the past ten years. The algorithm was evaluated by 24 participants: nurses, physiotherapists, and physicians. For data analysis, the Cronbach’s alpha coefficient and the chi-square test of independence were used. The significance level of the statistical tests was set at 5% (p<0.05). Results: The professionals’ responses regarding the ease of reading the algorithm were: 41.70%, great; 41.70%, good; 16.70%, regular. With regard to the algorithm being sufficient for supporting decisions related to wound evaluation and wound cleaning, 87.5% said yes to both questions. Regarding the participants’ opinion that the algorithm contained enough information to support their decision on the choice of laser parameters, 91.7% said yes. The questionnaire showed reliability by the Cronbach’s alpha coefficient (α = 0.962). Conclusion: The developed and validated algorithm proved reliable for wound evaluation, wound cleaning, and the use of laser therapy in wounds.
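
    The reliability figure reported here (Cronbach's alpha = 0.962) comes from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A dependency-free sketch; the data passed in are illustrative, not the study's questionnaire scores.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    `items` is one list of respondent scores per questionnaire item."""
    k = len(items)
    def var(xs):  # population variance, as in the usual alpha formula
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(row) for row in zip(*items)]  # per-respondent totals
    item_var = sum(var(col) for col in items)
    return k / (k - 1) * (1 - item_var / var(totals))

# Three perfectly parallel items -> maximal internal consistency
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
```

    Values above roughly 0.9, such as the 0.962 reported, are conventionally read as excellent internal consistency.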

  1. Exercise after breast cancer treatment: current perspectives

    Directory of Open Access Journals (Sweden)

    Dieli-Conwright CM

    2015-10-01

    Full Text Available Christina M Dieli-Conwright, Breanna Z Orozco Division of Biokinesiology and Physical Therapy, Women's Health and Exercise Laboratory, University of Southern California, Los Angeles, CA, USA Abstract: Over the past 2 decades, great strides have been made in the field of exercise-oncology research, particularly with breast cancer. This area of research is particularly important since there are >2.8 million breast cancer survivors who are in need of interventions that can offset treatment-related side effects. Noticeable reductions in physical fitness (ie, cardiopulmonary fitness and muscular strength), negative changes in body composition (ie, increase in body mass, decrease in lean body mass, and increase in fat mass), increased fatigue, depression, or anxiety are some of the common side effects of cancer treatments that negatively impact overall quality of life and increase the risk for the development of comorbidities. Exercise plays a vital role in improving cardiopulmonary function, psychological well-being, muscular strength, and endurance in breast cancer survivors, and thus should be considered as a key factor of lifestyle intervention to reverse negative treatment-related side effects. The purpose of this review is to address current perspectives on the benefits of aerobic and resistance exercise after breast cancer treatments. This review focuses on the well-established benefits of exercise on physical and emotional well-being, bone health, lymphedema management, and the postulated benefits of exercise on risk reduction for recurrence of breast cancer. Keywords: breast cancer, exercise, physical well-being

  2. Current and Under Development Treatment Modalities of Psoriasis: A Review.

    Science.gov (United States)

    Albaghdadi, Abdullah

    2017-01-01

    Psoriasis is a chronic and complex autoimmune inflammatory skin disease that affects over 125 million people worldwide. It can appear at any age, although children are less commonly affected than adults. It is characterized by distinct erythematous plaques covered with conspicuous silvery scales that appear in different areas of the skin. Knowledge of the pathophysiology, especially the pathogenesis, of psoriasis has progressed significantly in the recent decade. Advances in molecular knowledge lead to a better understanding of the disease, thus informing the development of efficient treatment modalities. However, even with the availability of various treatment options, most of the efficient treatment modalities are costly. The expense of health care places a major financial burden on patients as well as on health care systems. Thus, it is important to review the available current treatment options, and those under development, in terms of efficacy, safety and cost, to assist in selecting the most appropriate treatment for psoriasis patients. The literature was searched using the keywords psoriasis, topical treatment, systemic treatment, biologics and phototherapies, on the Embase, Medline, Jstor, Cochrane and Merck Index databases. Lifestyle choices such as smoking, alcohol consumption, obesity and stress are recognised as risk factors and triggers associated with psoriasis. Psoriasis poses a psycho-social and economic burden on affected patients that sometimes leads to depression, reduced social interaction and suicidal tendencies. Depending on the type, severity and extent of the disease, comorbidities, patient preference, efficacy and safety profile, numerous treatment modalities and therapeutic agents are available, including topical, systemic, biologic and phototherapeutic treatments. However, it was found that among all the currently available treatments for psoriasis, biologic agents and phototherapeutic modalities are

  3. Power of automated algorithms for combining time-line follow-back and urine drug screening test results in stimulant-abuse clinical trials.

    Science.gov (United States)

    Oden, Neal L; VanVeldhuisen, Paul C; Wakim, Paul G; Trivedi, Madhukar H; Somoza, Eugene; Lewis, Daniel

    2011-09-01

    In clinical trials of treatment for stimulant abuse, researchers commonly record both Time-Line Follow-Back (TLFB) self-reports and urine drug screen (UDS) results. To compare the power of self-report, qualitative (use vs. no use) UDS assessment, and various algorithms to generate self-report-UDS composite measures to detect treatment differences via t-test in simulated clinical trial data. We performed Monte Carlo simulations patterned in part on real data to model self-report reliability, UDS errors, dropout, informatively missing UDS reports, incomplete adherence to a urine donation schedule, temporal correlation of drug use, number of days in the study period, number of patients per arm, and distribution of drug-use probabilities. Investigated algorithms include maximum likelihood and Bayesian estimates, self-report alone, UDS alone, and several simple modifications of self-report (referred to here as ELCON algorithms) which eliminate perceived contradictions between it and UDS. Among the algorithms investigated, simple ELCON algorithms gave rise to the most powerful t-tests to detect mean group differences in stimulant drug use. Further investigation is needed to determine if simple, naïve procedures such as the ELCON algorithms are optimal for comparing clinical study treatment arms. But researchers who currently require an automated algorithm in scenarios similar to those simulated for combining TLFB and UDS to test group differences in stimulant use should consider one of the ELCON algorithms. This analysis continues a line of inquiry which could determine how best to measure outpatient stimulant use in clinical trials (NIDA. NIDA Monograph-57: Self-Report Methods of Estimating Drug Abuse: Meeting Current Challenges to Validity. NTIS PB 88248083. Bethesda, MD: National Institutes of Health, 1985; NIDA. NIDA Research Monograph 73: Urine Testing for Drugs of Abuse. NTIS PB 89151971. Bethesda, MD: National Institutes of Health, 1987; NIDA. NIDA Research
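
    The contradiction-eliminating (ELCON) idea can be sketched in a few lines: start from the daily self-report and let a positive urine screen override a self-reported abstinent day. This is an illustrative simplification of the algorithms investigated in the paper; the function name and data are hypothetical.

```python
def elcon(self_report, uds, uds_days):
    """ELCON-style composite sketch: `self_report` is a daily record
    (1 = use, 0 = no use); a positive UDS (1) collected on `uds_days[i]`
    overrides a self-reported abstinent day, eliminating the perceived
    contradiction between the two measures."""
    composite = list(self_report)
    for day, positive in zip(uds_days, uds):
        if positive:
            composite[day] = 1
    return composite

# One week of self-report with two urine screens, on days 1 and 4;
# the positive screen on day 1 overrides the self-reported abstinence
week = elcon([0, 0, 1, 0, 0, 0, 0], uds=[1, 0], uds_days=[1, 4])
```

    Group differences in stimulant use would then be tested (e.g. via t-test) on such composite series rather than on either measure alone.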

  4. Artifact removal algorithms for stroke detection using a multistatic MIST beamforming algorithm.

    Science.gov (United States)

    Ricci, E; Di Domenico, S; Cianca, E; Rossi, T

    2015-01-01

    Microwave imaging (MWI) has recently been shown to be a promising imaging modality for low-complexity, low-cost and fast brain imaging tools, which could play a fundamental role in efficiently managing emergencies related to stroke and hemorrhages. This paper focuses on the UWB radar imaging approach and in particular on the processing algorithms for the backscattered signals. Assuming the use of the multistatic version of the MIST (Microwave Imaging Space-Time) beamforming algorithm, developed by Hagness et al. for the early detection of breast cancer, the paper proposes and compares two artifact removal algorithms. Artifact removal is an essential step in any UWB radar imaging system, and the artifact removal algorithms considered to date have been shown not to be effective in the specific scenario of brain imaging. First, the paper proposes modifications of a known artifact removal algorithm; these modifications are shown to be effective in achieving good localization accuracy and fewer false positives. The main contribution, however, is the proposal of an artifact removal algorithm based on statistical methods, which achieves even better performance with much lower computational complexity.
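
    A classic baseline for UWB radar artifact removal, of the kind the proposed algorithms improve on, is average subtraction: the early-time skin/skull reflection is nearly identical across channels, so subtracting the cross-channel mean suppresses it while largely preserving a channel-specific target response. A sketch with synthetic signals (all names and numbers are illustrative):

```python
import numpy as np

def remove_artifact(signals):
    """Average-subtraction artifact removal: subtract the cross-channel
    mean so that the response common to all antennas (the early skin
    reflection) is cancelled."""
    return signals - signals.mean(axis=0, keepdims=True)

t = np.linspace(0, 1, 200)
artifact = np.exp(-((t - 0.1) / 0.02) ** 2)      # common early reflection
target = 0.2 * np.exp(-((t - 0.6) / 0.02) ** 2)  # channel-specific echo
signals = np.tile(artifact, (8, 1))              # 8 identical channels
signals[3] += target                             # target seen by one antenna
clean = remove_artifact(signals)                 # artifact gone, target kept
```

    The paper's observation is that such simple removal is not effective in the brain-imaging scenario, which motivates its statistical approach.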

  5. Plasma current profile during current reversal in a tokamak

    International Nuclear Information System (INIS)

    Huang Jianguo; Yang Xuanzong; Zheng Shaobai; Feng Chunhua; Zhang Houxian; Wang Long

    1999-01-01

    Alternating current operation with one full cycle and a current level of 2.5 kA has been achieved in the CT-6B tokamak. The poloidal magnetic field in the plasma is measured with two internal magnetic probes in repeated discharges. The current distribution is reconstructed with an inversion algorithm. The inverse current first appears on the weak-field side. The existence of magnetic surfaces and rotational transform provides particle confinement in the current reversal phase
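
    The probe-data inversion can be illustrated in the cylindrically symmetric limit, where Ampère's law gives the enclosed current directly from the measured poloidal field, I(r) = 2πr·Bθ(r)/μ0, and the current density follows by differentiation. This is a simplified sketch under that symmetry assumption, not the reconstruction algorithm actually used on CT-6B.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

def current_profile(r, b_theta):
    """Invert poloidal-field probe data under cylindrical symmetry:
    enclosed current from Ampere's law, then current density from its
    radial derivative, j = (dI/dr) / (2*pi*r)."""
    enclosed = 2 * np.pi * r * b_theta / MU0   # I(r), in A
    j = np.gradient(enclosed, r) / (2 * np.pi * r)  # A/m^2
    return enclosed, j

# Consistency check: uniform density j0 inside radius a produces the
# linear field B_theta = mu0 * j0 * r / 2, which should invert back to j0
a, j0 = 0.1, 1e6
r = np.linspace(0.01, a, 50)
b = MU0 * j0 * r / 2
I_enc, j = current_profile(r, b)
```

    During current reversal the real profile is hollow and asymmetric, which is why the experiment needs internal probes and a full inversion rather than this symmetric shortcut.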

  6. Pulmonary arterial hypertension: tailoring treatment to risk in the current era

    Directory of Open Access Journals (Sweden)

    Sean Gaine

    2017-12-01

    Full Text Available Recent advances in the treatment of pulmonary arterial hypertension (PAH have led to improved patient outcomes. Multiple PAH therapies are now available and optimising the use of these drugs in clinical practice is vital. In this review, we discuss the management of PAH patients in the context of current treatment guidelines and supporting clinical evidence. In clinical practice, considerable emphasis is placed on the importance of making treatment decisions guided by each patient's risk status, which should be assessed using multiple prognostic parameters. As PAH is a progressive disease, regular assessments are essential to ensure that any change in risk is detected in a timely manner and treatment is adjusted accordingly. With the availability of therapies that target three different pathogenic pathways, combination therapy is now the standard of care. For most patients, this involves dual combination therapy with agents targeting the endothelin and nitric oxide pathways. Therapies targeting the prostacyclin pathway should be added for patients receiving dual combination therapy who do not achieve a low-risk status. There is also a need for a holistic approach to treatment beyond pharmacological therapies. Implementation of all these approaches will ensure that PAH patients receive maximal benefit from currently available therapies.

  7. A review of the current treatment methods for posthaemorrhagic hydrocephalus of infants

    Directory of Open Access Journals (Sweden)

    Sparrow Owen

    2009-01-01

    Full Text Available Abstract Posthaemorrhagic hydrocephalus (PHH) is a major problem for premature infants, generally requiring lifelong care. It results from small blood clots inducing scarring within CSF channels, impeding CSF circulation. Transforming growth factor-beta is released into CSF, and cytokines stimulate deposition of extracellular matrix proteins which potentially obstruct CSF pathways. Prolonged raised pressure and free radical damage lead to poor neurodevelopmental outcomes. The most common treatment involves permanent ventricular shunting with all its risks and consequences. This is a review of the current evidence for the treatment and prevention of PHH and shunt dependency. The Cochrane Central Register of Controlled Trials (CENTRAL), The Cochrane Library and PubMed (from 1966 to August 2008) were searched. Trials using random or quasi-random patient allocation for any intervention were considered in infants less than 12 months old with PHH. Thirteen trials were identified, although speculative interventions were also evaluated. The literature confirms that lumbar punctures, diuretic drugs and intraventricular fibrinolytic therapy can have significant adverse effects and fail to prevent shunt dependence, death or disability. There is no evidence that postnatal phenobarbital administration prevents intraventricular haemorrhage (IVH). Subcutaneous reservoirs and external drains have not been tested in randomized controlled trials, but can be useful as a temporising measure. Drainage, irrigation and fibrinolytic therapy, as a way of removing blood to inhibit progressive deposition of matrix proteins, permanent hydrocephalus and shunt dependency, are invasive and experimental. Studies of ventriculo-subgaleal shunts show potential as a temporary method of CSF diversion, but have high infection rates. At present no clinical intervention has been shown to reduce shunt surgery in these infants. A ventricular shunt is not advisable in the early phase after

  8. Gastric Cancer: Current Status of Diagnosis and Treatment

    International Nuclear Information System (INIS)

    Takahashi, Tsunehiro; Saikawa, Yoshiro; Kitagawa, Yuko

    2013-01-01

    Gastric cancer is the second leading cause of death from malignant disease worldwide and is most frequently discovered in advanced stages. Because curative surgery is regarded as the only option for cure, early detection of resectable gastric cancer is extremely important for good patient outcomes. Therefore, noninvasive diagnostic modalities such as evolutionary endoscopy and positron emission tomography are utilized as screening tools for gastric cancer. To date, early gastric cancer is treated using minimally invasive methods such as endoscopic treatment and laparoscopic surgery, while advanced cancer requires multimodality treatment including chemotherapy, radiotherapy, and surgery. Based on the results of large clinical trials, surgery with extended lymphadenectomy cannot be recommended as a standard therapy for advanced gastric cancer. Recent clinical trials have shown survival benefits of adjuvant chemotherapy after curative resection compared with surgery alone. In addition, recent advances in molecular targeted agents are expected to play an important role as one of the modalities for advanced gastric cancer. In this review, we summarize the current status of diagnostic technology and treatment for gastric cancer

  9. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    Science.gov (United States)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems is being done to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  10. Lymphatic malformations: a proposed management algorithm.

    LENUS (Irish Health Repository)

    Oosthuizen, J C

    2012-02-01

    OBJECTIVE: The aim of this study was to develop a management algorithm for cervicofacial lymphatic malformations, based on the authors' experience in managing these lesions as well as current literature on the subject. STUDY DESIGN AND METHODS: A retrospective medical record review of all the patients treated for lymphatic malformations at our institution during a 10-year period (1998-2008) was performed. DATA COLLECTED: age at diagnosis, location and type of lesion, radiologic investigation performed, presenting symptoms, treatment modality used, complications and results achieved. RESULTS: 14 patients were identified: eight (57%) male and six (43%) female. There was an equal distribution between the left and right sides. The majority (71%) of cases were diagnosed within the first year of life. The majority of lesions were located in the suprahyoid region. The predominant reason for referral was an asymptomatic mass in 7 cases (50%), followed by airway compromise (36%) and dysphagia (14%). Management options employed included observation, OK-432 injection, surgical excision and laser therapy. In 5 cases (36%) a combination of these was used. CONCLUSION: Historically, surgical excision has been the management option of choice for lymphatic malformations. However, due to the associated morbidity and high complication rate, this is increasingly being questioned. Recent advances in sclerotherapy, e.g. OK-432 injection, have also shown significant promise. Based on experience in managing these lesions as well as current literature, the authors of this paper have developed an algorithm for the management of cervicofacial lymphatic malformations.

  11. Evaluation of the performance of existing non-laboratory based cardiovascular risk assessment algorithms

    Science.gov (United States)

    2013-01-01

    Background The high burden and rising incidence of cardiovascular disease (CVD) in resource constrained countries necessitates implementation of robust and pragmatic primary and secondary prevention strategies. Many current CVD management guidelines recommend absolute cardiovascular (CV) risk assessment as a clinically sound guide to preventive and treatment strategies. Development of non-laboratory based cardiovascular risk assessment algorithms enable absolute risk assessment in resource constrained countries. The objective of this review is to evaluate the performance of existing non-laboratory based CV risk assessment algorithms using the benchmarks for clinically useful CV risk assessment algorithms outlined by Cooney and colleagues. Methods A literature search to identify non-laboratory based risk prediction algorithms was performed in MEDLINE, CINAHL, Ovid Premier Nursing Journals Plus, and PubMed databases. The identified algorithms were evaluated using the benchmarks for clinically useful cardiovascular risk assessment algorithms outlined by Cooney and colleagues. Results Five non-laboratory based CV risk assessment algorithms were identified. The Gaziano and Framingham algorithms met the criteria for appropriateness of statistical methods used to derive the algorithms and endpoints. The Swedish Consultation, Framingham and Gaziano algorithms demonstrated good discrimination in derivation datasets. Only the Gaziano algorithm was externally validated where it had optimal discrimination. The Gaziano and WHO algorithms had chart formats which made them simple and user friendly for clinical application. Conclusion Both the Gaziano and Framingham non-laboratory based algorithms met most of the criteria outlined by Cooney and colleagues. External validation of the algorithms in diverse samples is needed to ascertain their performance and applicability to different populations and to enhance clinicians’ confidence in them. PMID:24373202

  12. An Algorithm For Climate-Quality Atmospheric Profiling Continuity From EOS Aqua To Suomi-NPP

    Science.gov (United States)

    Moncet, J. L.

    2015-12-01

    We will present results from an algorithm that is being developed to produce climate-quality atmospheric profiling earth system data records (ESDRs) for application to hyperspectral sounding instrument data from Suomi-NPP, EOS Aqua, and other spacecraft. The current focus is on data from the S-NPP Cross-track Infrared Sounder (CrIS) and Advanced Technology Microwave Sounder (ATMS) instruments as well as the Atmospheric InfraRed Sounder (AIRS) on EOS Aqua. The algorithm development at Atmospheric and Environmental Research (AER) has common heritage with the optimal estimation (OE) algorithm operationally processing S-NPP data in the Interface Data Processing Segment (IDPS), but the ESDR algorithm has a flexible, modular software structure to support experimentation and collaboration and has several features adapted to the climate orientation of ESDRs. Data record continuity benefits from the fact that the same algorithm can be applied to different sensors, simply by providing suitable configuration and data files. The radiative transfer component uses an enhanced version of optimal spectral sampling (OSS) with updated spectroscopy, treatment of emission that is not in local thermodynamic equilibrium (non-LTE), efficiency gains with "global" optimal sampling over all channels, and support for channel selection. The algorithm is designed for adaptive treatment of clouds, with capability to apply "cloud clearing" or simultaneous cloud parameter retrieval, depending on conditions. We will present retrieval results demonstrating the impact of a new capability to perform the retrievals on sigma or hybrid vertical grid (as opposed to a fixed pressure grid), which particularly affects profile accuracy over land with variable terrain height and with sharp vertical structure near the surface. In addition, we will show impacts of alternative treatments of regularization of the inversion. While OE algorithms typically implement regularization by using background estimates from

  13. Algorithm of Particle Data Association for SLAM Based on Improved Ant Algorithm

    Directory of Open Access Journals (Sweden)

    KeKe Gen

    2015-01-01

    Full Text Available The article considers the problem of data association in simultaneous localization and mapping (SLAM) for determining the route of unmanned aerial vehicles (UAVs). Such vehicles are already widely used, but are mainly controlled by a remote operator. An urgent task is to develop a control system that allows for autonomous flight. The SLAM (simultaneous localization and mapping) algorithm, which predicts the location, speed, flight parameters, and the coordinates of landmarks and obstacles in an unknown environment, is one of the key technologies for achieving truly autonomous UAV flight. The aim of this work is to study the possibility of solving this problem by using an improved ant algorithm. Data association in SLAM is meant to establish a matching between the set of observed landmarks and the landmarks in the state vector. The ant algorithm is a widely used optimization algorithm with positive feedback and the ability to search in parallel, so it is suitable for solving the data association problem in SLAM. However, the traditional ant algorithm easily falls into local optima while searching for routes. Adding random perturbations when updating the global pheromone helps to avoid local optima, and setting pheromone limits on the route can increase the search space with a reasonable amount of computation for finding the optimal route. The paper proposes a local data association algorithm for SLAM based on an improved ant algorithm. To increase the speed of calculation, local data association is used instead of global data association. The first stage of the algorithm defines targets in the matching space and the observed landmarks that may be associated by the criterion of individual compatibility (IC). The second stage determines the matched landmarks and their coordinates using the improved ant algorithm. Simulation results confirm the efficiency and
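
    The two pheromone-update modifications described in this abstract (random perturbation of the global pheromone and min/max pheromone limits on routes) can be sketched as a single update step. This is only an illustrative sketch: the matrix sizes, coefficients, and the helper name `update_pheromone` are assumptions, not taken from the paper.

    ```python
    import numpy as np

    def update_pheromone(tau, best_path, q=1.0, rho=0.1,
                         tau_min=0.01, tau_max=5.0, sigma=0.01, rng=None):
        """One global pheromone update with the two tweaks the abstract
        describes: a small random perturbation (to escape local optima)
        and pheromone limits (to keep the search space open)."""
        rng = rng or np.random.default_rng(0)
        tau = (1.0 - rho) * tau                        # evaporation
        for i, j in zip(best_path[:-1], best_path[1:]):
            tau[i, j] += q                             # reinforce best route
        tau += sigma * rng.standard_normal(tau.shape)  # random perturbation
        return np.clip(tau, tau_min, tau_max)          # pheromone limits

    tau = np.ones((5, 5))                              # 5 landmarks, toy example
    tau = update_pheromone(tau, best_path=[0, 2, 4, 1, 3])
    ```

    Clipping to `[tau_min, tau_max]` is the same idea used in max-min ant systems: no edge's pheromone can decay to zero or grow without bound, so every route keeps a nonzero selection probability.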

  14. Three dimensional intensity modulated brachytherapy (IMBT): Dosimetry algorithm and inverse treatment planning

    International Nuclear Information System (INIS)

    Shi Chengyu; Guo Bingqi; Cheng, Chih-Yao; Esquivel, Carlos; Eng, Tony; Papanikolaou, Niko

    2010-01-01

    Purpose: The feasibility of intensity modulated brachytherapy (IMBT) to improve dose conformity for irregularly shaped targets has been previously investigated by researchers by means of using partially shielded sources. However, partial shielding does not fully explore the potential of IMBT. The goal of this study is to introduce the concept of three dimensional (3D) intensity modulated brachytherapy and solve two fundamental issues regarding the application of 3D IMBT treatment planning: The dose calculation algorithm and the inverse treatment planning method. Methods: A 3D IMBT treatment planning system prototype was developed using the MATLAB platform. This system consists of three major components: (1) A comprehensive IMBT source calibration method with dosimetric inputs from Monte Carlo (EGSnrc) simulations; (2) a "modified TG-43" (mTG-43) dose calculation formalism for IMBT dosimetry; and (3) a physical constraint based inverse IMBT treatment planning platform utilizing a simulated annealing optimization algorithm. The model S700 Axxent electronic brachytherapy source developed by Xoft, Inc. (Fremont, CA), was simulated in this application. Ten intracavitary accelerated partial breast irradiation (APBI) cases were studied. For each case, an "isotropic plan" with only optimized source dwell time and a fully optimized IMBT plan were generated and compared to the original plan in various dosimetric aspects, such as the plan quality, planning, and delivery time. The issue of the mechanical complexity of the IMBT applicator is not addressed in this study. Results: IMBT approaches showed superior plan quality compared to the original plans and the isotropic plans to different extents in all studied cases. In an extremely difficult case with a small breast and a small distance to the ribs and skin, the IMBT plan reduced the high dose volume V200 by 16.1% and 4.8%, respectively, compared to the original and the isotropic plans. The conformity index for the
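
    The inverse planning component described above relies on a simulated annealing optimizer. As a rough illustration of that optimization loop only (a toy one-dimensional objective; the real physical-constraint cost function, dwell-time variables, and parameter values are not reproduced from the paper):

    ```python
    import math
    import random

    def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.995,
                            iters=2000, seed=1):
        """Generic simulated annealing: always accept improvements,
        accept uphill moves with Boltzmann probability exp(-dC/T),
        and geometrically cool the temperature T."""
        rng = random.Random(seed)
        x, c = x0, cost(x0)
        best_x, best_c = x, c
        t = t0
        for _ in range(iters):
            cand = step(x, rng)                    # propose a neighbor
            cc = cost(cand)
            if cc < c or rng.random() < math.exp((c - cc) / max(t, 1e-12)):
                x, c = cand, cc                    # accept the move
                if c < best_c:
                    best_x, best_c = x, c          # track best-so-far
            t *= cooling                           # cool down
        return best_x, best_c

    # toy objective: minimize (x - 3)^2 with uniform random steps
    x, c = simulated_annealing(lambda v: (v - 3.0) ** 2, 0.0,
                               lambda v, r: v + r.uniform(-0.5, 0.5))
    ```

    In a planner, the state would be the vector of source dwell times or intensities, and the cost would penalize violations of dose constraints on the target and organs at risk.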

  15. Three dimensional intensity modulated brachytherapy (IMBT): Dosimetry algorithm and inverse treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Shi Chengyu; Guo Bingqi; Cheng, Chih-Yao; Esquivel, Carlos; Eng, Tony; Papanikolaou, Niko [Cancer Therapy and Research Center, University of Texas Health Science Center at San Antonio, San Antonio, Texas 78229 (United States); Department of Radiation Oncology, Oklahoma University Health Science Center, Oklahoma City, Oklahoma 73104 (United States); Cancer Therapy and Research Center, University of Texas Health Science Center at San Antonio, San Antonio, Texas 78229 (United States)

    2010-07-15

    Purpose: The feasibility of intensity modulated brachytherapy (IMBT) to improve dose conformity for irregularly shaped targets has been previously investigated by researchers by means of using partially shielded sources. However, partial shielding does not fully explore the potential of IMBT. The goal of this study is to introduce the concept of three dimensional (3D) intensity modulated brachytherapy and solve two fundamental issues regarding the application of 3D IMBT treatment planning: The dose calculation algorithm and the inverse treatment planning method. Methods: A 3D IMBT treatment planning system prototype was developed using the MATLAB platform. This system consists of three major components: (1) A comprehensive IMBT source calibration method with dosimetric inputs from Monte Carlo (EGSnrc) simulations; (2) a "modified TG-43" (mTG-43) dose calculation formalism for IMBT dosimetry; and (3) a physical constraint based inverse IMBT treatment planning platform utilizing a simulated annealing optimization algorithm. The model S700 Axxent electronic brachytherapy source developed by Xoft, Inc. (Fremont, CA), was simulated in this application. Ten intracavitary accelerated partial breast irradiation (APBI) cases were studied. For each case, an "isotropic plan" with only optimized source dwell time and a fully optimized IMBT plan were generated and compared to the original plan in various dosimetric aspects, such as the plan quality, planning, and delivery time. The issue of the mechanical complexity of the IMBT applicator is not addressed in this study. Results: IMBT approaches showed superior plan quality compared to the original plans and the isotropic plans to different extents in all studied cases. In an extremely difficult case with a small breast and a small distance to the ribs and skin, the IMBT plan reduced the high dose volume V200 by 16.1% and 4.8%, respectively, compared to the original and the

  16. Three dimensional intensity modulated brachytherapy (IMBT): dosimetry algorithm and inverse treatment planning.

    Science.gov (United States)

    Shi, Chengyu; Guo, Bingqi; Cheng, Chih-Yao; Esquivel, Carlos; Eng, Tony; Papanikolaou, Niko

    2010-07-01

    The feasibility of intensity modulated brachytherapy (IMBT) to improve dose conformity for irregularly shaped targets has been previously investigated by researchers by means of using partially shielded sources. However, partial shielding does not fully explore the potential of IMBT. The goal of this study is to introduce the concept of three dimensional (3D) intensity modulated brachytherapy and solve two fundamental issues regarding the application of 3D IMBT treatment planning: The dose calculation algorithm and the inverse treatment planning method. A 3D IMBT treatment planning system prototype was developed using the MATLAB platform. This system consists of three major components: (1) A comprehensive IMBT source calibration method with dosimetric inputs from Monte Carlo (EGSnrc) simulations; (2) a "modified TG-43" (mTG-43) dose calculation formalism for IMBT dosimetry; and (3) a physical constraint based inverse IMBT treatment planning platform utilizing a simulated annealing optimization algorithm. The model S700 Axxent electronic brachytherapy source developed by Xoft, Inc. (Fremont, CA), was simulated in this application. Ten intracavitary accelerated partial breast irradiation (APBI) cases were studied. For each case, an "isotropic plan" with only optimized source dwell time and a fully optimized IMBT plan were generated and compared to the original plan in various dosimetric aspects, such as the plan quality, planning, and delivery time. The issue of the mechanical complexity of the IMBT applicator is not addressed in this study. IMBT approaches showed superior plan quality compared to the original plans and the isotropic plans to different extents in all studied cases. In an extremely difficult case with a small breast and a small distance to the ribs and skin, the IMBT plan reduced the high dose volume V200 by 16.1% and 4.8%, respectively, compared to the original and the isotropic plans. The conformity index for the target was increased by 0.13 and 0

  17. Current status in the treatment options for esophageal achalasia.

    Science.gov (United States)

    Chuah, Seng-Kee; Chiu, Chien-Hua; Tai, Wei-Chen; Lee, Jyong-Hong; Lu, Hung-I; Changchien, Chi-Sin; Tseng, Ping-Huei; Wu, Keng-Liang

    2013-09-07

    Recent advances in the treatment of achalasia include the use of high-resolution manometry to predict the outcome of patients and the introduction of peroral endoscopic myotomy (POEM). The first multicenter randomized, controlled, 2-year follow-up study conducted by the European Achalasia Trial group indicated that laparoscopic Heller myotomy (LHM) was not superior to pneumatic dilations (PD). Publications on the long-term success of laparoscopic surgery continue to emerge. In addition, laparoscopic single-site surgery is applicable to advanced laparoscopic operations such as LHM and anterior fundoplication. The optimal treatment option is an ongoing matter of debate. In this review, we provide an update of the current progress in the treatment of esophageal achalasia. Unless new conclusive data prove otherwise, LHM is considered the most durable treatment for achalasia at the expense of increased reflux-associated complications. However, PD is the first choice for non-surgical treatment and is more cost-effective. Repeated PD according to an "on-demand" strategy based on symptom recurrence can achieve long-term remission. Decision making should be based on clinical evidence that identifies a subcategory of patients who would benefit from specific treatment options. POEM has shown promise but its long-term efficacy and safety need to be assessed further.

  18. Current treatment for anorexia nervosa: efficacy, safety, and adherence

    Directory of Open Access Journals (Sweden)

    Lindsay P Bodell

    2010-10-01

    Full Text Available Lindsay P Bodell, Pamela K Keel, Department of Psychology, Florida State University, Tallahassee, FL, USA. Abstract: Anorexia nervosa (AN) is a serious psychiatric illness associated with significant medical and psychiatric morbidity, psychosocial impairment, increased risk of death, and chronicity. Given the severity of the disorder, the establishment of safe and effective treatments is necessary. Several treatments have been tried in AN, but few favorable results have emerged. This paper reviews randomized controlled trials in AN, and provides a synthesis of existing data regarding the efficacy, safety, and adherence associated with pharmacologic and psychological interventions. Randomized controlled trials for the treatment of AN published in peer-reviewed journals were identified by electronic and manual searches. Overall, pharmacotherapy has limited benefits in the treatment of AN, with some promising preliminary findings associated with olanzapine, an antipsychotic agent. No single psychological intervention has demonstrated clear superiority in treating adults with AN. In adolescents with AN, the evidence base is strongest for the use of family therapy over alternative individual psychotherapies. Results highlight challenges in both treating individuals with AN and in studying the effects of those treatments, and further emphasize the importance of continued efforts to develop novel interventions. Treatment trials currently underway and areas for future research are discussed. Keywords: anorexia nervosa, treatment, pharmacotherapy, psychotherapy, randomized controlled trials

  19. Current status and prospects of HIV treatment.

    Science.gov (United States)

    Cihlar, Tomas; Fordyce, Marshall

    2016-06-01

    Current antiviral treatments can reduce HIV-associated morbidity, prolong survival, and prevent HIV transmission. Combination antiretroviral therapy (cART) containing preferably three active drugs from two or more classes is required for durable virologic suppression. Regimen selection is based on virologic efficacy, potential for adverse effects, pill burden and dosing frequency, drug-drug interaction potential, resistance test results, comorbid conditions, social status, and cost. With prolonged virologic suppression, improved clinical outcomes, and longer survival, patients will be exposed to antiretroviral agents for decades. Therefore, maximizing the safety and tolerability of cART is a high priority. Emergence of resistance and/or lack of tolerability in individual patients require availability of a range of treatment options. Development of new drugs is focused on improving safety (e.g. tenofovir alafenamide) and/or resistance profile (e.g. doravirine) within the existing drug classes, combination therapies with improved adherence (e.g. single-tablet regimens), novel mechanisms of action (e.g. attachment inhibitors, maturation inhibitors, broadly neutralizing antibodies), and treatment simplification with infrequent dosing (e.g. long-acting injectables). In parallel with cART innovations, research and development efforts focused on agents that target persistent HIV reservoirs may lead to prolonged drug-free remission and HIV cure. Copyright © 2016 Gilead Sciences, Inc. Published by Elsevier B.V. All rights reserved.

  20. Evaluating and comparing algorithms for respiratory motion prediction

    International Nuclear Information System (INIS)

    Ernst, F; Dürichen, R; Schlaefer, A; Schweikard, A

    2013-01-01

    In robotic radiosurgery, it is necessary to compensate for systematic latencies arising from target tracking and mechanical constraints. This compensation is usually achieved by means of an algorithm which computes the future target position. In most scientific works on respiratory motion prediction, only one or two algorithms are evaluated on a limited amount of very short motion traces. The purpose of this work is to gain more insight into the real world capabilities of respiratory motion prediction methods by evaluating many algorithms on an unprecedented amount of data. We have evaluated six algorithms, the normalized least mean squares (nLMS), recursive least squares (RLS), multi-step linear methods (MULIN), wavelet-based multiscale autoregression (wLMS), extended Kalman filtering, and ε-support vector regression (SVRpred) methods, on an extensive database of 304 respiratory motion traces. The traces were collected during treatment with the CyberKnife (Accuray, Inc., Sunnyvale, CA, USA) and feature an average length of 71 min. Evaluation was done using a graphical prediction toolkit, which is available to the general public, as is the data we used. The experiments show that the nLMS algorithm—which is one of the algorithms currently used in the CyberKnife—is outperformed by all other methods. This is especially true in the case of the wLMS, the SVRpred, and the MULIN algorithms, which perform much better. The nLMS algorithm produces a relative root mean square (RMS) error of 75% or less (i.e., a reduction in error of 25% or more when compared to not doing prediction) in only 38% of the test cases, whereas the MULIN and SVRpred methods reach this level in more than 77%, the wLMS algorithm in more than 84% of the test cases. Our work shows that the wLMS algorithm is the most accurate algorithm and does not require parameter tuning, making it an ideal candidate for clinical implementation. Additionally, we have seen that the structure of a patient
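
    The nLMS baseline that this study finds to be outperformed is simple to state: predict the target position a fixed horizon ahead with a normalized least-mean-squares adaptive filter, updating the weights as each true position arrives. A minimal sketch; the tap count, step size, horizon, and sampling rate below are illustrative choices, not the CyberKnife's actual parameters:

    ```python
    import numpy as np

    def nlms_predict(signal, horizon=5, taps=16, mu=0.5, eps=1e-8):
        """Predict signal[n + horizon] from the `taps` most recent samples
        using a normalized LMS adaptive linear filter."""
        w = np.zeros(taps)
        preds = np.zeros_like(signal)
        for n in range(taps + horizon, len(signal)):
            # the prediction made `horizon` steps ago is now verifiable
            x_old = signal[n - horizon - taps:n - horizon][::-1]
            err = signal[n] - w @ x_old
            w += mu * err * x_old / (eps + x_old @ x_old)  # normalized update
            # predict ahead with the freshly adapted weights
            if n + horizon < len(signal):
                x_now = signal[n - taps + 1:n + 1][::-1]
                preds[n + horizon] = w @ x_now
        return preds

    # toy sinusoidal "breathing" trace: 25 Hz sampling, 0.25 Hz breathing
    t = np.arange(0, 60, 0.04)
    trace = np.sin(2 * np.pi * 0.25 * t)
    pred = nlms_predict(trace)
    ```

    The relative RMS error quoted in the abstract compares this filter's error against the "no prediction" baseline of simply using the current (delayed) position; on a clean sinusoid the filter converges quickly and beats that baseline by a wide margin, whereas real respiratory traces are far less regular.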

  1. Multiple brain metastases - current management and perspectives for treatment with electrochemotherapy

    DEFF Research Database (Denmark)

    Linnert, Mette; Iversen, Helle Klingenberg; Gehl, Julie

    2012-01-01

    BACKGROUND: Due to the advanced oncological treatments of cancer, an overall increase in cancer incidence, and better diagnostic tools, the incidence of brain metastases is on the rise. This review addresses the current treatment options for patients with multiple brain metastases, presenting...... of the chemotherapeutic drug bleomycin by 300 times. Preclinical data are promising and the first patient has been treated in an ongoing clinical trial for patients with brain metastases. Perspectives for ECT in the brain include treatment of primary and secondary brain tumors as well as soft tissue metastases elsewhere....

  2. The utility and limitations of current web-available algorithms to predict peptides recognized by CD4 T cells in response to pathogen infection

    Science.gov (United States)

    Chaves, Francisco A.; Lee, Alvin H.; Nayak, Jennifer; Richards, Katherine A.; Sant, Andrea J.

    2012-01-01

    The ability to track CD4 T cells elicited in response to pathogen infection or vaccination is critical because of the role these cells play in protective immunity. Coupled with advances in genome sequencing of pathogenic organisms, there is considerable appeal for implementation of computer-based algorithms to predict peptides that bind to the class II molecules, forming the complex recognized by CD4 T cells. Despite recent progress in this area, there is a paucity of data regarding their success in identifying actual pathogen-derived epitopes. In this study, we sought to rigorously evaluate the performance of multiple web-available algorithms by comparing their predictions and our results using purely empirical methods for epitope discovery in influenza that utilized overlapping peptides and cytokine Elispots, for three independent class II molecules. We analyzed the data in different ways, trying to anticipate how an investigator might use these computational tools for epitope discovery. We come to the conclusion that currently available algorithms can indeed facilitate epitope discovery, but all shared a high degree of false positive and false negative predictions. Therefore, efficiencies were low. We also found dramatic disparities among algorithms and between predicted IC50 values and true dissociation rates of peptide:MHC class II complexes. We suggest that improved success of predictive algorithms will depend less on changes in computational methods or increased data sets and more on changes in parameters used to “train” the algorithms that factor in elements of T cell repertoire and peptide acquisition by class II molecules. PMID:22467652

  3. Current pharmacological agents for the treatment of premature ejaculation

    Directory of Open Access Journals (Sweden)

    Onur Dede

    2014-06-01

    Full Text Available This study aimed to review and assess recent studies regarding medical treatment for premature ejaculation (PE), the most common sexual problem affecting men. It can affect men at all ages and has a serious impact on the quality of life of men and their partners. A wide variety of therapeutic modalities have been tried for the treatment of premature ejaculation. Psychological therapies may be helpful for patients complaining of PE. Several topical therapies have been used, including lidocaine cream and lidocaine-prilocaine cream. There has been recent interest in the selective serotonin reuptake inhibitors (SSRIs) for the treatment of PE, because one of their common side effects is delayed ejaculation. Currently used SSRIs have several non-sexual side effects and long half-lives, so there has been interest in developing a short-acting, efficacious SSRI that can be used on demand for PE. Dapoxetine has recently been evaluated for the treatment of PE by several groups, and results so far appear promising.

  4. Planar graphs theory and algorithms

    CERN Document Server

    Nishizeki, T

    1988-01-01

    Collected in this volume are most of the important theorems and algorithms currently known for planar graphs, together with constructive proofs of the theorems. Many of the algorithms are written in Pidgin PASCAL and are the best known; their complexities are linear or O(n log n). The first two chapters provide the foundations of graph-theoretic notions and algorithmic techniques. The remaining chapters discuss the topics of planarity testing, embedding, drawing, vertex- or edge-coloring, maximum independent set, subgraph listing, planar separator theorem, Hamiltonian cycles, and single- or multicommodity flows. Suitable for a course on algorithms, graph theory, or planar graphs, the volume will also be useful for computer scientists and graph theorists at the research level. An extensive reference section is included.
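
    As a small taste of the planarity material the book covers, here is a minimal sketch (our own, not from the book) of the edge bound that follows from Euler's formula; it gives a quick necessary test for planarity, whereas full planarity testing, as covered in the book, runs in linear time:

```python
def exceeds_planar_bound(num_vertices, num_edges):
    """Necessary condition from Euler's formula: a simple planar graph on
    v >= 3 vertices has at most 3v - 6 edges. Exceeding that bound proves
    non-planarity; staying within it proves nothing (the converse fails)."""
    if num_vertices < 3:
        return False  # every simple graph on < 3 vertices is planar
    return num_edges > 3 * num_vertices - 6
```

    For example, K5 (5 vertices, 10 edges) exceeds the bound and is therefore non-planar, while K4 (4 vertices, 6 edges) meets it exactly and happens to be planar.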

  5. A Novel Particle Swarm Optimization Algorithm for Global Optimization.

    Science.gov (United States)

    Wang, Chun-Feng; Liu, Kui

    2016-01-01

    Particle Swarm Optimization (PSO) is a recently developed optimization method, which has attracted the interest of researchers in various areas due to its simplicity and effectiveness, and many variants have been proposed. In this paper, a novel Particle Swarm Optimization algorithm is presented, which considers the information of the best neighbor of each particle and the best particle of the entire population in the current iteration. Meanwhile, to avoid premature convergence, an abandonment mechanism is used. Furthermore, to improve the global convergence speed of the algorithm, a chaotic search is adopted around the best solution of the current iteration. To verify the performance of the algorithm, standard test functions have been employed. The experimental results show that the algorithm is much more robust and efficient than some existing Particle Swarm Optimization algorithms.
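
    The baseline method the paper builds on can be sketched in a few lines; this is a generic standard PSO on the sphere function (parameter values are illustrative, and the paper's abandonment mechanism and chaotic search are not reproduced):

```python
import random

def pso_sphere(dim=2, n_particles=20, iters=200, seed=1):
    """Minimal standard PSO minimizing the sphere function f(x) = sum(x_i^2)."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    f = lambda x: sum(v * v for v in x)

    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]       # personal best positions
    gbest = min(pbest, key=f)[:]      # global best position

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=f)[:]
    return gbest
```

    The variant in the paper replaces the global-best term with information from each particle's best neighbor and restarts stagnant particles, which this baseline omits.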

  6. An Ultrasonic Multi-Beam Concentration Meter with a Neuro-Fuzzy Algorithm for Water Treatment Plants.

    Science.gov (United States)

    Lee, Ho-Hyun; Jang, Sang-Bok; Shin, Gang-Wook; Hong, Sung-Taek; Lee, Dae-Jong; Chun, Myung Geun

    2015-10-23

    Ultrasonic concentration meters have been widely used at water purification, sewage treatment and wastewater treatment plants to sort and transfer high-concentration sludges and to control the amount of chemical dosage. When an unusual substance is contained in the sludge, however, the attenuation of the ultrasonic waves can increase, or the waves may not reach the receiver at all. In such cases, the value measured by the concentration meter is higher than the actual concentration, or fluctuates. It is also difficult to automate the residuals treatment process in the presence of problems such as sludge attachment or sensor failure. An ultrasonic multi-beam concentration sensor was considered to solve these problems, but an abnormal concentration value from a specific ultrasonic beam degrades the accuracy of the entire measurement when a conventional arithmetic mean of all measured values is used, so this paper proposes a method to improve the accuracy of the sludge concentration determination by choosing reliable sensor values and applying a neuro-fuzzy learning algorithm. The newly developed meter is shown to give useful results in a variety of experiments at a real water treatment plant.
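
    The core idea of choosing reliable beam readings instead of a plain arithmetic mean can be illustrated with a simple robust estimator; this is our own sketch (a median-deviation outlier filter), not the paper's neuro-fuzzy method:

```python
def robust_concentration(readings, k=1.5):
    """Average only the beam readings close to the median, discarding
    outlier beams (e.g. a blocked or fouled transducer) before averaging.
    `k` scales the rejection threshold; this is an illustrative stand-in
    for the reliable-sensor selection step described in the abstract."""
    vals = sorted(readings)
    n = len(vals)
    med = vals[n // 2] if n % 2 else 0.5 * (vals[n // 2 - 1] + vals[n // 2])
    dev = sorted(abs(v - med) for v in readings)
    mad = dev[n // 2] if n % 2 else 0.5 * (dev[n // 2 - 1] + dev[n // 2])
    keep = [v for v in readings if mad == 0 or abs(v - med) <= k * 3 * mad]
    return sum(keep) / len(keep)
```

    With readings [3.0, 3.1, 2.9, 9.5] the 9.5 beam is rejected and the estimate stays near 3.0, whereas an arithmetic mean would be pulled up to about 4.6.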

  7. An AK-LDMeans algorithm based on image clustering

    Science.gov (United States)

    Chen, Huimin; Li, Xingwei; Zhang, Yongbin; Chen, Nan

    2018-03-01

    Clustering is an effective analytical technique for handling unlabeled data for value mining. Its ultimate goal is to label unclassified data quickly and correctly. We use the roadmap of current image processing as the experimental background. In this paper, we propose an AK-LDMeans algorithm that automatically locks the K value by designing the Kcost fold line and then uses a long-distance, high-density method to select the clustering centers, replacing the traditional initial-clustering-center selection and thereby improving the efficiency and accuracy of the traditional K-Means algorithm. The experimental results are compared with those of current clustering algorithms. The algorithm can provide an effective reference value in the fields of image processing, machine vision and data mining.
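
    The idea of locking K automatically from a cost-versus-K curve can be sketched with a simple elbow rule; this stand-in (ours, not the paper's Kcost fold-line construction) picks the K at the sharpest bend of a decreasing cost curve:

```python
def elbow_k(costs):
    """Pick K at the sharpest bend (largest second difference) of a
    decreasing clustering-cost curve. `costs[i]` is the cost for K = i + 1.
    A crude but common automatic-K heuristic."""
    best_k, best_bend = 1, float("-inf")
    for i in range(1, len(costs) - 1):
        # bend = drop achieved by this K minus the drop achieved by the next
        bend = (costs[i - 1] - costs[i]) - (costs[i] - costs[i + 1])
        if bend > best_bend:
            best_k, best_bend = i + 1, bend
    return best_k
```

    On a curve that falls steeply up to K = 4 and flattens afterwards, the largest bend sits at K = 4, which is what a human reading the fold line would pick.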

  8. Monte Carlo evaluation of a photon pencil kernel algorithm applied to fast neutron therapy treatment planning

    Science.gov (United States)

    Söderberg, Jonas; Alm Carlsson, Gudrun; Ahnesjö, Anders

    2003-10-01

    When dedicated software is lacking, treatment planning for fast neutron therapy is sometimes performed using dose calculation algorithms designed for photon beam therapy. In this work Monte Carlo derived neutron pencil kernels in water were parametrized using the photon dose algorithm implemented in the Nucletron TMS (treatment management system) treatment planning system. A rectangular fast-neutron fluence spectrum with energies 0-40 MeV (resembling a polyethylene filtered p(41)+ Be spectrum) was used. Central axis depth doses and lateral dose distributions were calculated and compared with the corresponding dose distributions from Monte Carlo calculations for homogeneous water and heterogeneous slab phantoms. All absorbed doses were normalized to the reference dose at 10 cm depth for a field of radius 5.6 cm in a 30 × 40 × 20 cm3 water test phantom. Agreement to within 7% was found in both the lateral and the depth dose distributions. The deviations could be explained as due to differences in size between the test phantom and that used in deriving the pencil kernel (radius 200 cm, thickness 50 cm). In the heterogeneous phantom, the TMS, with a directly applied neutron pencil kernel, and Monte Carlo calculated absorbed doses agree approximately for muscle but show large deviations for media such as adipose or bone. For the latter media, agreement was substantially improved by correcting the absorbed doses calculated in TMS with the neutron kerma factor ratio and the stopping power ratio between tissue and water. The multipurpose Monte Carlo code FLUKA was used both in calculating the pencil kernel and in direct calculations of absorbed dose in the phantom.

  9. Proportionate-type normalized least mean square algorithms

    CERN Document Server

    Wagner, Kevin

    2013-01-01

    The topic of this book is proportionate-type normalized least mean squares (PtNLMS) adaptive filtering algorithms, which attempt to estimate an unknown impulse response by adaptively giving gains proportionate to an estimate of the impulse response and the current measured error. These algorithms offer low computational complexity and fast convergence times for sparse impulse responses in network and acoustic echo cancellation applications. New PtNLMS algorithms are developed by choosing gains that optimize user-defined criteria, such as mean square error, at all times. PtNLMS algorithms ar
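
    A single update of a proportionate-type NLMS filter can be sketched as follows; this is a generic PNLMS-style rule with illustrative parameters, not a specific algorithm from the book. Each coefficient gets a gain proportional to its current magnitude, so large (active) taps of a sparse impulse response adapt faster:

```python
def pnlms_step(w, x_buf, d, mu=0.5, delta=0.01, eps=1e-8):
    """One proportionate NLMS update.
    w: current weight estimates; x_buf: input samples, newest first;
    d: desired (reference) sample. Returns (updated weights, error)."""
    L = len(w)
    mags = [abs(wi) for wi in w]
    gmax = max(max(mags), delta)
    g = [max(mi, delta * gmax) for mi in mags]   # per-tap gains, floored
    gsum = sum(g)
    g = [gi * L / gsum for gi in g]              # normalize mean gain to 1
    y = sum(wi * xi for wi, xi in zip(w, x_buf)) # filter output
    e = d - y                                    # a-priori error
    norm = sum(gi * xi * xi for gi, xi in zip(g, x_buf)) + eps
    w_new = [wi + mu * gi * xi * e / norm for wi, gi, xi in zip(w, g, x_buf)]
    return w_new, e
```

    Run in a loop over an input stream, this identifies an unknown impulse response; for sparse responses the proportionate gains concentrate adaptation on the few active taps, which is the fast-convergence property the book analyzes.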

  10. Substance Use in Rural Central Appalachia: Current Status and Treatment Considerations.

    Science.gov (United States)

    Moody, Lara; Satterwhite, Emily; Bickel, Warren K

    2017-04-01

    The burden of substance use, and especially the unmatched rates of overdoses, in rural Central Appalachia highlights the need for innovative approaches to curb initiation of drug misuse and to address current substance use disorders. Effective substance use interventions involve a thorough understanding of the region. In Central Appalachia, many of the barriers to treatment are shared with other rural and impoverished areas, including a lack of access to health care and a lack of health care providers with specialized training. Parts of Appalachia also present their own considerations, including the challenges of fostering trust and encouraging treatment-seeking in communities with dense, long-term, place-based social and family networks. Current policies and interventions for substance use have been largely inadequate in the region, as evidenced by continued increases in substance use and substance-related deaths, especially related to nonmedical prescription drug use and increasing heroin use. The authors discuss ways in which rural life, poverty, identity, and values in Appalachia have influenced substance use and treatment and propose strategies and interventions to improve outcomes.

  11. Problem solving with genetic algorithms and Splicer

    Science.gov (United States)

    Bayer, Steven E.; Wang, Lui

    1991-01-01

    Genetic algorithms are highly parallel, adaptive search procedures (i.e., problem-solving methods) loosely based on the processes of population genetics and Darwinian survival of the fittest. Genetic algorithms have proven useful in domains where other optimization techniques perform poorly. The main purpose of the paper is to discuss a NASA-sponsored software development project to develop a general-purpose tool for using genetic algorithms. The tool, called Splicer, can be used to solve a wide variety of optimization problems and is currently available from NASA and COSMIC. This discussion is preceded by an introduction to basic genetic algorithm concepts and a discussion of genetic algorithm applications.
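
    The basic loop described above (fitness-based selection, crossover, mutation) can be sketched in a few lines; this is a generic illustration of the concepts a tool like Splicer packages, not Splicer's own implementation, shown on the easy "one-max" problem of maximizing the number of 1-bits:

```python
import random

def ga_onemax(n_bits=20, pop_size=30, gens=60, seed=3):
    """Minimal generational GA: tournament selection, one-point crossover,
    bit-flip mutation, with elitism. Returns the best bitstring found."""
    rng = random.Random(seed)
    fit = sum  # fitness = number of 1-bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = [max(pop, key=fit)[:]]  # elitism: carry over the best individual
        while len(nxt) < pop_size:
            # tournament selection of two parents
            a, b = (max(rng.sample(pop, 3), key=fit) for _ in range(2))
            cut = rng.randrange(1, n_bits)
            child = a[:cut] + b[cut:]          # one-point crossover
            for i in range(n_bits):            # bit-flip mutation
                if rng.random() < 1.0 / n_bits:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fit)
```

    Only the fitness function and the bitstring encoding are problem-specific; the selection, crossover and mutation machinery is what a general-purpose GA tool provides.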

  12. Current treatment of dyslipidaemia: PCSK9 inhibitors and statin intolerance.

    Science.gov (United States)

    Koskinas, Konstantinos; Wilhelm, Matthias; Windecker, Stephan

    2016-01-01

    Statins are the cornerstone of the management of dyslipidaemias and prevention of cardiovascular disease. Although statins are, overall, safe and well tolerated, adverse events can occur and constitute an important barrier to maintaining long-term adherence to statin treatment. In patients who cannot tolerate statins, alternative treatments include switch to another statin, intermittent-dosage regimens and non-statin lipid-lowering medications. Nonetheless, a high proportion of statin-intolerant patients are unable to achieve recommended low-density lipoprotein (LDL) cholesterol goals, thereby resulting in substantial residual cardiovascular risk. Proprotein convertase subtilisin/kexin type 9 (PCSK9) is a protease implicated in LDL receptor degradation and plays a central role in cholesterol metabolism. In recent studies, PCSK9 inhibition by means of monoclonal antibodies achieved LDL cholesterol reductions of 50% to 70% across various patient populations and background lipid-lowering therapies, while maintaining a favourable safety profile. The efficacy and safety of the monoclonal antibodies alirocumab and evolocumab were confirmed in statin-intolerant patients, indicating that PCSK9 inhibitors represent an attractive treatment option in this challenging clinical setting. PCSK9 inhibitors recently received regulatory approval for clinical use and may be considered in properly selected patients according to current consensus documents, including patients with statin intolerance. In this review we summarise current evidence regarding diagnostic evaluation of statin-related adverse events, particularly statin-associated muscle symptoms, and we discuss current recommendations on the management of statin-intolerant patients. In view of emerging evidence of the efficacy and safety of PCSK9 inhibitors, we further discuss the role of monoclonal PCSK9 antibodies in the management of statin-intolerant hypercholesterolaemic patients.

  13. SU-F-J-148: A Collapsed Cone Algorithm Can Be Used for Quality Assurance for Monaco Treatment Plans for the MR-Linac

    Energy Technology Data Exchange (ETDEWEB)

    Hackett, S; Asselen, B van; Wolthaus, J; Kotte, A; Bol, G; Lagendijk, J; Raaymakers, B [University Medical Center, Utrecht (Netherlands); Feist, G [Elekta Instrument AB, Stockholm (Sweden); Pencea, S [Elekta Inc., Atlanta, GA (United States); Akhiat, H [Elekta BV, Best (Netherlands)

    2016-06-15

    Purpose: Treatment plans for the MR-linac, calculated in Monaco v5.19, include direct simulation of the effects of the 1.5 T B₀ field. We tested the feasibility of using a collapsed-cone (CC) algorithm in Oncentra, which does not account for effects of the B₀ field, as a fast online, independent 3D check of dose calculations. Methods: Treatment plans for six patients were generated in Monaco with a 6 MV FFF beam and the B₀ field. All plans were recalculated with a CC model of the same beam. Plans for the same patients were also generated in Monaco without the B₀ field. The mean dose (Dmean) and doses to 10% (D10%) and 90% (D90%) of the volume were determined, as percentages of the prescribed dose, for target volumes and OARs in each calculated dose distribution. Student's t-tests between paired parameters from Monaco plans and corresponding CC calculations were performed. Results: Figure 1 shows an example of the difference between dose distributions calculated in Monaco, with the B₀ field, and with the CC algorithm. Figure 2 shows distributions of (absolute) differences between parameters for Monaco plans, with the B₀ field, and CC calculations. The Dmean and D90% values for the CTVs and PTVs were significantly different, but differences in dose distributions arose predominantly at the edges of the target volumes. Inclusion of the B₀ field had little effect on agreement of the Dmean values, as illustrated by Figure 3, nor on agreement of the D10% and D90% values. Conclusion: Dose distributions recalculated with a CC algorithm show good agreement with those calculated with Monaco, for plans both with and without the B₀ field, indicating that the CC algorithm could be used to check online treatment planning for the MR-linac. Agreement for a wider range of treatment sites, and the feasibility of using the γ-test as a simple pass/fail criterion, will be investigated.
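
    The paired comparison used here can be reproduced with a short helper; a sketch of the paired Student's t statistic (the function name is ours; with n pairs it has n − 1 degrees of freedom, and a table or library lookup converts it to a p-value):

```python
def paired_t(a, b):
    """Paired t statistic for two equal-length samples: the mean of the
    per-pair differences divided by its standard error."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / (var / n) ** 0.5
```

    In practice one would call an existing routine such as scipy.stats.ttest_rel, which also returns the p-value.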

  14. Current state of methodological and decisions for radiation treatment of blood, its components and products

    Directory of Open Access Journals (Sweden)

    Gordeev A.V.

    2014-12-01

    Full Text Available This article presents the currently used blood transfusion media (blood components and blood products), the therapeutic effects, reactions and complications of blood transfusion, and the use of radiation treatment for transfusion fluids. The practice of radiation processing of blood components for the prevention of graft-versus-host reaction is discussed in detail, as are studies of radiation treatment of plasma for infectious safety. The current state of techniques and technical solutions for radiation treatment of transfusion-transmissible media is presented, and alternatives to radiation treatment of blood are also considered.

  15. Current knowledge and trends in age-related macular degeneration: today's and future treatments.

    Science.gov (United States)

    Velez-Montoya, Raul; Oliver, Scott C N; Olson, Jeffrey L; Fine, Stuart L; Mandava, Naresh; Quiroz-Mercado, Hugo

    2013-09-01

    To address the most dynamic and current issues concerning today's treatment options and promising research efforts regarding treatment for age-related macular degeneration. This review is intended to serve as a practical reference for more in-depth reviews on the subject. An online review of the PubMed and Ovid databases was performed, searching for the key words age-related macular degeneration, AMD, VEGF, treatment, PDT, steroids, bevacizumab, ranibizumab, VEGF-trap, radiation, combined therapy, as well as their compound phrases. The search was limited to articles published since 1985. All returned articles were carefully screened, and their references were manually reviewed for additional relevant data. The web page www.clinicaltrials.gov was also accessed in search of relevant research trials. A total of 363 articles were reviewed, including 64 additional articles extracted from the references. In the end, only 160 references were included in this review. Treatment for age-related macular degeneration is a very dynamic research field. While current treatments are mainly aimed at blocking vascular endothelial growth factor, future treatments seek to prevent vision loss due to scarring. Promising efforts have been made to address the dry form of the disease, which has lacked effective treatment.

  16. Transcranial direct-current stimulation as treatment in epilepsy.

    Science.gov (United States)

    Gschwind, Markus; Seeck, Margitta

    2016-12-01

    Neuromodulation (NM) is a complementary therapy for patients with drug-resistant epilepsy. Vagal nerve stimulation and deep brain stimulation of the anterior thalamus are established techniques and have shown their efficacy in lowering seizure frequency, but they are invasive and rarely render patients seizure-free. Non-invasive NM techniques are therefore increasingly investigated in a clinical context. Areas covered: Current knowledge about transcranial direct-current stimulation (tDCS) and other non-invasive NM in patients with epilepsy, based on the available animal and clinical studies from PubMed search. Expert commentary: tDCS modulates neuronal membrane potentials, and consequently alters cortical excitability. Cathodal stimulation leads to cortical inhibition, which is of particular importance in epilepsy treatment. The antiepileptic efficacy is promising but still lacks systematic studies. The beneficial effect, seen in ~20%, outlasts the duration of stimulation, indicating neuronal plasticity and is therefore of great interest to obtain long-term effects.

  17. [Our current approach in the treatment of sigmoid colon volvulus].

    Science.gov (United States)

    Taviloğlu, Korhan; Aydin, Erol; Ertekin, Cemalettin; Güloğlu, Recep; Kurtoğlu, Mehmet

    2002-04-01

    Our aim was to emphasize the role of endoscopic detorsion in the treatment of sigmoid colon volvulus, which we currently apply in the majority of our cases. The data of 37 patients were analyzed retrospectively over an 86-month period between May 1994 and July 2001. The patients were classified into three groups: the first group consisted of 9 patients with resection and anastomosis, the second group of 20 patients with Hartmann's procedure, and the third group of 8 patients with endoscopic detorsion. Complications were encountered in 7 patients (19%), and 3 patients (8%) died following treatment. We favor colonic resection following endoscopic treatment. Resection should be preferred if endoscopic detorsion is not successful or in the presence of a complication.

  18. Current and emerging treatment options for Peyronie's disease

    Directory of Open Access Journals (Sweden)

    Gokce A

    2013-01-01

    Full Text Available Ahmet Gokce, Julie C Wang, Mary K Powers, Wayne JG Hellstrom; Department of Urology, Tulane University School of Medicine, New Orleans, LA, USA. Abstract: Peyronie's disease (PD) is a condition of the penis, characterized by the presence of a localized fibrotic plaque in the tunica albuginea. PD is not an uncommon disorder, with recent epidemiologic studies documenting a prevalence of 3–9% of adult men affected. The actual prevalence of PD may be even higher. It is often associated with penile pain, anatomical deformities of the erect penis, and difficulty with intromission. As the definitive pathophysiology of PD has not been completely elucidated, further basic research is required to make progress in the understanding of this enigmatic condition. Similarly, research on effective therapies is limited. Currently, nonsurgical treatments are used for those men who are in the acute stage of PD, whereas surgical options are reserved for men with established PD who cannot successfully penetrate. Intralesional treatments are growing in clinical popularity as a minimally invasive approach to the initial treatment of PD. A surgical approach should be considered when men with PD do not respond to conservative, medical, or minimally invasive therapies for approximately 1 year and cannot have satisfactory sexual intercourse. As scientific breakthroughs in the understanding of the mechanisms of this disease process evolve, novel treatments for the many men suffering with PD are anticipated. Keywords: oral therapy, intralesional treatment, topical therapy, extracorporeal shockwave therapy, traction devices, plication, incision and grafting, penile prosthesis.

  19. In touch with psoriasis: topical treatments and current guidelines.

    LENUS (Irish Health Repository)

    Murphy, G

    2012-02-01

    This article describes topical therapies and treatment guidelines for psoriasis and is based on a presentation given by the authors at a satellite symposium held during the 19th Congress of the European Academy of Dermatology and Venereology, 6-10 October, 2010, in Gothenburg, Sweden. The highly variable nature of psoriasis and its individual presentation in patients can make it difficult to choose the most appropriate treatment. There are many treatment options, from topical treatment with emollients for very mild psoriasis, to systemic therapy with fumaric acid esters, methotrexate or biologics for severe disease. For the treatment of mild-to-moderate psoriasis, topical therapy is generally the most appropriate and a variety of options, both historical and recent, are available. Newer therapies offer greater convenience and fewer side-effects. Of the more recently available therapies, vitamin D analogues and topical corticosteroids are the two with the greatest proven efficacy in randomized clinical trials. A recent Cochrane review showed the highest efficacy overall with the fixed combination vitamin D analogue (calcipotriol) and corticosteroid (betamethasone dipropionate). Indeed, clinical trials have shown that two-compound calcipotriol/betamethasone dipropionate ointment has higher efficacy than calcipotriol or betamethasone dipropionate alone. With regard to safety, two-compound calcipotriol/betamethasone dipropionate was shown to be suitable for intermittent long-term treatment of mild-to-moderate psoriasis. The findings of the Cochrane review are reflected in the current treatment guidelines from the USA and Germany regarding the treatment of mild-to-moderate psoriasis. In both these guidelines, which will be discussed in this article, the recommended treatments for this patient group are vitamin D analogues and corticosteroids, particularly when used in combination.

  1. Identification and real time control of current profile in Tore-supra: algorithms and simulation; Identification et controle en temps reel du profil de courant dans Tore Supra: algorithmes et simulations

    Energy Technology Data Exchange (ETDEWEB)

    Houy, P

    1999-10-15

    The aim of this work is to propose a real-time control of the current profile in order to achieve reproducible operating modes with improved energy confinement in tokamaks. The determination of the profile is based on measurements given by the interferometry and polarimetry diagnostics. Different ways to evaluate and improve the accuracy of these measurements are presented. The position and shape of the plasma are controlled by the poloidal field system, which constrains them to prescribed reference values. Injection of gas, neutral particles, ice pellets or additional heating power are the technical means used to control other plasma parameters. These controls are performed by feedback loops. The poloidal field system of Tore Supra is presented. The main obstacle to a reliable determination of the current profile is that slightly different Faraday angles lead to very different profiles. The direct identification method presented in this work gives the profile that minimizes the squared deviation between measured and computed values. The different algorithms proposed to control current profiles on Tore Supra have been validated using a plasma simulation with the code Cronos, which solves the resistive diffusion equation of the current. (A.C.)
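
    The direct identification step, minimizing the squared misfit between measured and computed values, is a least-squares problem once the measurement model is linearized. A minimal sketch (our own, ignoring the regularization that the ill-posed profile reconstruction described in the thesis would need): with a linear model A mapping profile parameters p to interfero-polarimetry measurements m, the minimizer of ||Ap − m||² solves the normal equations (AᵀA)p = Aᵀm:

```python
def least_squares_fit(model_matrix, measured):
    """Solve min_p ||A p - m||^2 via the normal equations, with a tiny
    Gaussian elimination. A: list of rows (one per measurement), m: list
    of measured values. Illustrative only; a real solver would regularize."""
    A, m = model_matrix, measured
    n = len(A[0])
    # Build A^T A and A^T m.
    ata = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
           for i in range(n)]
    atm = [sum(A[k][i] * m[k] for k in range(len(A))) for i in range(n)]
    for col in range(n):  # forward elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atm[col], atm[piv] = atm[piv], atm[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            atm[r] -= f * atm[col]
    p = [0.0] * n
    for i in reversed(range(n)):  # back substitution
        p[i] = (atm[i] - sum(ata[i][j] * p[j] for j in range(i + 1, n))) / ata[i][i]
    return p
```

    The thesis's point that slightly different Faraday angles give very different profiles is precisely the ill-conditioning of AᵀA; in practice one would add a regularization term rather than solve the bare normal equations.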

  2. Relative Pose Estimation Algorithm with Gyroscope Sensor

    Directory of Open Access Journals (Sweden)

    Shanshan Wei

    2016-01-01

    Full Text Available This paper proposes a novel vision and inertial fusion algorithm, S2fM (Simplified Structure from Motion), for camera relative pose estimation. Unlike existing algorithms, our algorithm estimates the rotation and translation parameters separately. S2fM employs gyroscopes to estimate the camera rotation parameter, which is then fused with the image data to estimate the camera translation parameter. Our contributions are twofold. (1) Given that no inertial sensor can estimate the translation parameter accurately enough, we propose a translation estimation algorithm that fuses gyroscope data and image data. (2) Our S2fM algorithm is efficient and suitable for smart devices. Experimental results validate the efficiency of the proposed S2fM algorithm.

  3. Current and emerging treatment options for hairy cell leukemia

    Directory of Open Access Journals (Sweden)

    López-Rubio M

    2015-08-01

    Full Text Available Montserrat López-Rubio (Department of Hematology, Hospital Universitario Príncipe de Asturias, Alcalá de Henares) and Jose Antonio Garcia-Marco (Department of Hematology, Hospital Universitario Puerta de Hierro Majadahonda, Majadahonda, Madrid, Spain). Abstract: Hairy cell leukemia (HCL) is a lymphoproliferative B-cell disorder characterized by pancytopenia, splenomegaly, and characteristic cytoplasmic hairy projections. Precise diagnosis is essential in order to differentiate classic forms from HCL variants, such as the HCL-variant and the VH4-34 molecular variant, which are more resistant to available treatments. The current standard of care is treatment with purine analogs (PAs), such as cladribine or pentostatin, which provide a high rate of long-lasting clinical remissions. Nevertheless, ~30%–40% of patients relapse, some of them difficult-to-treat refractory cases. The use of the monoclonal antibody rituximab in combination with PAs appears to produce even higher responses, and it is often employed to minimize or eliminate residual disease. Currently, research in the field of HCL is focused on identifying novel therapeutic targets and potential agents that are safe and can universally cure the disease. The discovery of the BRAF mutation and progress in understanding the biology of the disease have enabled the scientific community to explore new therapeutic targets. Ongoing clinical trials are assessing various treatment strategies such as the combination of PAs and anti-CD20 monoclonal antibodies, recombinant immunotoxins targeting CD22, BRAF inhibitors, and B-cell receptor signal inhibitors. Keywords: hairy cell leukemia, purine analogs, rituximab, immunotoxins, vemurafenib, ibrutinib

  4. Municipal wastewater treatment in Mexico: current status and opportunities for employing ecological treatment systems.

    Science.gov (United States)

    Zurita, Florentina; Roy, Eric D; White, John R

    2012-06-01

    The aim of this paper is to evaluate the current status of municipal wastewater (MWW) treatment in Mexico, as well as to assess opportunities for using ecological treatment systems, such as constructed wetlands. In 2008, Mexico had 2101 MWW treatment plants that treated only 84 m³/s of wastewater (208 m³/s of MWW were collected in sewer systems). Unfortunately, most treatment plants operate below capacity owing to a lack of maintenance and a paucity of properly trained personnel. The main types of treatment systems applied in Mexico are activated sludge and waste stabilization ponds, which treat 44.3% and 18% of the MWW collected, respectively. As in many other developing nations around the world, there is a great need in Mexico for low-cost, low-maintenance wastewater treatment systems that are both economically and environmentally sustainable. In 2005, 24.3 million Mexicans lived in villages of less than 2500 inhabitants and 14.1 million lived in towns with 2500-15,000 inhabitants. An opportunity exists to extend the use of ecological treatment systems to these low-population-density areas and considerably increase the percentage of MWW that is treated in Mexico. Small-scale and medium-size constructed wetlands have been built successfully in some states, primarily during the past five years. Several barriers need to be overcome to increase the adoption and utilization of ecological wastewater technology in Mexico, including a lack of knowledge about this technology, scarce technical information in Spanish, and the government's concentration on constructing MWW treatment plants solely in urban areas.

  5. Our Approach to Toxic Epidermal Necrolysis and Review of Current Treatment Alternatives

    Directory of Open Access Journals (Sweden)

    Fatih Uygur

    2008-09-01

    Full Text Available Toxic epidermal necrolysis (TEN) is a clinical entity with a 30–40% mortality rate, in which necrolysis affects the entire epidermis. Antibiotics, nonsteroidal anti-inflammatory drugs and anticonvulsants are the main offending drugs in the etiology of TEN. A standard treatment protocol with proven efficacy is still lacking. In this study, current treatment practice and our treatment strategy for TEN are discussed, and eight patients treated in our clinic between 2001 and 2008 are reviewed.

  6. Study on Computerized Treatment Plan of Field-in-Field Intensity Modulated Radiation Therapy and Conventional Radiation Therapy according to PBC Algorithm and AAA on Breast Cancer Tangential Beam

    International Nuclear Information System (INIS)

    Yeom, Mi Suk; Bae, Seong Soo; Kim, Dae Sup; Back, Geum Mun

    2012-01-01

    The Anisotropic Analytical Algorithm (AAA) provides more accurate dose calculation than the Pencil Beam Convolution (PBC) algorithm with respect to scatter and tissue inhomogeneity. This study analyzes the differences in dose distribution between the PBC algorithm and AAA for breast cancer tangential plans. Treatment plans were created in the Eclipse treatment planning system (version 8.9, VARIAN, USA) for 10 breast cancer patients using the 6 MV energy of a linac (CL-6EX, VARIAN, USA). After a conventional radiation therapy plan (Conventional plan) and a field-in-field intensity modulated radiation therapy plan (FiF plan) were established with the PBC algorithm, the MU were fixed, the dose was recalculated with AAA, and the plans were compared and analyzed using dose volume histograms (DVH). First, for the Conventional plan, the average conformity index (CI) of the target volume was higher by 0.295 with the PBC algorithm; for the lung, V47 Gy and V45 Gy were higher by 5.83% and 4.04%, respectively, with PBC, while mean dose, V20, V5, and V3 Gy were higher by 0.6%, 0.29%, 6.35%, and 10.23%, respectively, with AAA. Second, for the FiF plan, the average CI of the target volume was higher by 0.165 with the PBC algorithm; V47 Gy, V45 Gy, and mean dose of the ipsilateral lung were higher by 6.17%, 3.80%, and 0.15%, respectively, with PBC, while V20, V5, and V3 Gy were higher by 0.14%, 4.07%, and 4.35%, respectively, with AAA. When breast cancer tangential plans were calculated with AAA rather than PBC, the conformity of the target volume was lower by 0.295 (Conventional plan) and 0.165 (FiF plan), the high-dose region of the ipsilateral lung received a smaller volume, and the low-dose region a larger volume.
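    The DVH quantities compared above (Vx Gy, mean dose) are simple functions of the per-voxel dose. As a hedged sketch, not the study's Eclipse workflow, Vx can be computed from a flat dose array as the percentage of voxels receiving at least x Gy:

```python
def v_x(doses, threshold_gy):
    """Percentage of structure volume receiving at least `threshold_gy`.

    `doses` is a flat list of per-voxel doses in Gy; equal voxel
    volumes are assumed (an illustrative simplification).
    """
    if not doses:
        raise ValueError("empty dose list")
    hits = sum(1 for d in doses if d >= threshold_gy)
    return 100.0 * hits / len(doses)

def mean_dose(doses):
    """Mean structure dose in Gy."""
    return sum(doses) / len(doses)

# Hypothetical ipsilateral-lung voxel doses (Gy), for illustration only:
lung = [1.0, 2.5, 4.0, 6.0, 18.0, 21.0, 25.0, 48.0]
print(v_x(lung, 20.0))   # share of the lung at or above 20 Gy
print(mean_dose(lung))
```

    A treatment planning system evaluates these metrics over millions of voxels per structure; the arithmetic, however, is exactly this.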

  7. Combined Dust Detection Algorithm by Using MODIS Infrared Channels over East Asia

    Science.gov (United States)

    Park, Sang Seo; Kim, Jhoon; Lee, Jaehwa; Lee, Sukjo; Kim, Jeong Soo; Chang, Lim Seok; Ou, Steve

    2014-01-01

    A new dust detection algorithm is developed by combining the results of multiple dust detection methods using IR channels onboard the MODerate resolution Imaging Spectroradiometer (MODIS). The Brightness Temperature Difference (BTD) between two wavelength channels has been widely used in previous dust detection methods. However, BTD methods have limitations in identifying the offset values of the BTD needed to discriminate clear-sky areas. The current algorithm overcomes the disadvantages of previous dust detection methods by considering the Brightness Temperature Ratio (BTR) values of the dual wavelength channels with a 30-day composite, the optical properties of the dust particles, the variability of surface properties, and cloud contamination. As a result, the current algorithm shows improvements in detecting dust-loaded regions over land during daytime. Finally, the confidence index of the current dust algorithm is given on 10 × 10 pixel blocks of the MODIS observations. From January to June 2006, the results of the current algorithm agree with those found using the fine mode fraction (FMF) and aerosol index (AI) from MODIS and the Ozone Monitoring Instrument (OMI) in 64 to 81% of cases. The agreement between the results of the current algorithm and the OMI AI over non-polluted land, considered in order to avoid errors due to anthropogenic aerosol, ranges from 60 to 67%. In addition, the developed algorithm shows statistically significant results at four AErosol RObotic NETwork (AERONET) sites in East Asia.
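    The split-window test underlying such BTD methods can be sketched as follows; the 11/12 µm channel pairing, the threshold value, and the BTR form shown here are illustrative assumptions, not the paper's tuned offsets:

```python
def btd_dust_flag(bt_11um, bt_12um, threshold=-0.5):
    """Classic split-window dust test: mineral dust tends to drive
    BT(11 um) - BT(12 um) negative, while most clouds and clear sky
    keep it positive. The threshold is scene-dependent (illustrative),
    which is exactly the offset-selection problem the abstract notes."""
    return (bt_11um - bt_12um) < threshold

def btr(bt_a, bt_b):
    """Brightness Temperature Ratio of two channels, the quantity the
    combined algorithm uses (with a 30-day composite) to sidestep
    fixed BTD offsets; the bare ratio here is an assumption."""
    return bt_a / bt_b

print(btd_dust_flag(285.0, 287.0))   # BTD = -2.0 K -> dust-like
print(btd_dust_flag(290.0, 288.5))   # BTD = +1.5 K -> not dust-like
```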

  8. Current Approaches in the Treatment of Relapsed and Refractory Acute Myeloid Leukemia

    Science.gov (United States)

    Ramos, Nestor R.; Mo, Clifton C.; Karp, Judith E.; Hourigan, Christopher S.

    2015-01-01

    The limited sensitivity of the historical treatment response criteria for acute myeloid leukemia (AML) has resulted in a different paradigm for treatment compared with most other cancers presenting with widely disseminated disease. Initial cytotoxic induction chemotherapy is often able to reduce tumor burden to a level sufficient to meet the current criteria for “complete” remission. Nevertheless, most AML patients ultimately die from their disease, most commonly as clinically evident relapsed AML. Despite a variety of available salvage therapy options, prognosis in patients with relapsed or refractory AML is generally poor. In this review, we outline the commonly utilized salvage cytotoxic therapy interventions and then highlight novel investigational efforts currently in clinical trials using both pathway-targeted agents and immunotherapy based approaches. We conclude that there is no current standard of care for adult relapsed or refractory AML other than offering referral to an appropriate clinical trial. PMID:25932335

  9. Implementation of an Algorithm for Prosthetic Joint Infection: Deviations and Problems.

    Science.gov (United States)

    Mühlhofer, Heinrich M L; Kanz, Karl-Georg; Pohlig, Florian; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; von Eisenhart-Rothe, Ruediger; Schauwecker, Johannes

    The outcome of revision surgery in arthroplasty is based on a precise diagnosis. In addition, the treatment varies based on whether the prosthetic failure is caused by aseptic or septic loosening. Algorithms can help to identify periprosthetic joint infections (PJI) and standardize diagnostic steps; however, algorithms tend to oversimplify the treatment of complex cases. We conducted a process analysis during the implementation of a PJI algorithm to determine problems and deviations associated with its implementation. Fifty patients who were treated after implementation of a standardized algorithm were monitored retrospectively. Their treatment plans and diagnostic cascades were analyzed for deviations from the implemented algorithm. Each diagnostic procedure was recorded, compared with the algorithm, and evaluated statistically. We detected 52 deviations while treating 50 patients. In 25 cases, no discrepancy was observed. Synovial fluid aspiration was not performed in 31.8% of patients (95% confidence interval [CI], 18.1%-45.6%), while white blood cell counts (WBCs) and neutrophil differentiation were assessed in 54.5% of patients (95% CI, 39.8%-69.3%). We also observed that prolonged incubation of cultures was not requested in 13.6% of patients (95% CI, 3.5%-23.8%). In seven of 13 cases (63.6%; 95% CI, 35.2%-92.1%), arthroscopic biopsy was performed; 6 arthroscopies were performed in discordance with the algorithm (12%; 95% CI, 3%-21%). Self-critical analysis of diagnostic processes and monitoring of deviations using algorithms are important and could increase the quality of treatment by revealing recurring faults.
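    The percentage intervals quoted above are consistent with a normal-approximation (Wald) confidence interval for a proportion. A minimal sketch; the denominator of 44 patients below is a hypothetical illustration, not a figure taken from the paper:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation 95% CI for a binomial proportion:
    p +/- z * sqrt(p*(1-p)/n)."""
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p - half, p + half

# e.g. 14 of 44 hypothetical patients: ~31.8% (about 18.1%-45.6%)
lo, hi = wald_ci(14, 44)
print(round(100 * lo, 1), round(100 * hi, 1))
```

    For small counts, an exact (Clopper-Pearson) or Wilson interval would be preferable; the Wald form is shown only because it matches the style of interval reported.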

  10. Parastomal hernia - current knowledge and treatment.

    Science.gov (United States)

    Styliński, Roman; Alzubedi, Adam; Rudzki, Sławomir

    2018-03-01

    Intestinal stoma creation is one of the most common surgical procedures. The most common long-term complication following stoma creation is parastomal hernia, which according to some authors is practically unavoidable. Statistical differences in its occurrence are mainly due to patient observation time and evaluation criteria. Consequently, primary prevention methods such as placement of prosthetic mesh and newly developed minimally invasive methods of stoma creation are used. In the light of evidence-based medicine, it seems that the best treatment for parastomal hernia is the one in which the surgeon undertaking therapy is most experienced, suited to the individual patient, his or her condition, and comorbidities. As a general rule, reinforcing the abdominal wall with a prosthetic mesh is the treatment of choice, with a low rate of complications and relapses over a long period of time. The current trend is to use lightweight, large-pore meshes.

  11. Practical aspects of treatment with target specific anticoagulants: initiation, payment and current market, transitions, and venous thromboembolism treatment.

    Science.gov (United States)

    Mahan, Charles E

    2015-04-01

    Target-specific oral anticoagulants (TSOACs) have recently been introduced to the US market for multiple indications, including venous thromboembolism (VTE) prevention in total hip and knee replacement surgeries, VTE treatment, and reduction in the risk of stroke in patients with non-valvular atrial fibrillation (NVAF). Currently, three TSOACs are available, rivaroxaban, apixaban, and dabigatran, with edoxaban under Food and Drug Administration review for VTE treatment and stroke prevention in NVAF. The introduction of these agents has created a paradigm shift in anticoagulation by considerably simplifying treatment and anticoagulant initiation, giving clinicians the opportunity to use a rapid-onset, rapid-offset oral agent. The availability of these rapid-onset TSOACs allows outpatient treatment of low-risk pulmonary embolism and deep vein thrombosis, which can greatly reduce healthcare costs by avoiding inpatient hospitalization; with this practice, the complications of an inpatient stay, such as nosocomial infections, may also be avoided. Single-agent approaches with TSOACs represent a paradigm shift in the treatment of VTE versus the complicated overlap of a parenteral agent with warfarin. Transitions between anticoagulants, including TSOACs, are a high-risk period for the patient, and clinicians must carefully consider patient characteristics such as renal function as well as the agents being transitioned. TSOAC use appears to be growing slowly with improved payment coverage throughout the US.

  12. An algorithm for learning real-time automata

    NARCIS (Netherlands)

    Verwer, S.E.; De Weerdt, M.M.; Witteveen, C.

    2007-01-01

    We describe an algorithm for learning simple timed automata, known as real-time automata. The transitions of real-time automata can have a temporal constraint on the time of occurrence of the current symbol relative to the previous symbol. The learning algorithm is similar to the red-blue fringe
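    A real-time automaton transition of the kind described pairs a symbol with a time guard on the delay since the previous symbol. This toy acceptor is an illustration of the model only, not the authors' red-blue fringe learner:

```python
class RTATransition:
    """Transition taken on `symbol` when the delay since the previous
    symbol lies in the interval [lo, hi] (the temporal constraint)."""
    def __init__(self, src, symbol, lo, hi, dst):
        self.src, self.symbol, self.lo, self.hi, self.dst = src, symbol, lo, hi, dst

def run(transitions, start, timed_word):
    """Consume (symbol, delay) pairs; return the end state, or None
    if no transition's guard admits the observed delay."""
    state = start
    for symbol, delay in timed_word:
        for t in transitions:
            if t.src == state and t.symbol == symbol and t.lo <= delay <= t.hi:
                state = t.dst
                break
        else:
            return None  # stuck: symbol or timing rejected
    return state

trans = [RTATransition("q0", "a", 0, 5, "q1"),
         RTATransition("q1", "b", 2, 10, "q0")]
print(run(trans, "q0", [("a", 3), ("b", 4)]))  # reaches "q0"
print(run(trans, "q0", [("a", 7)]))            # guard violated -> None
```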

  13. Determination of effective treatment duration of interferential current therapy using electromyography

    OpenAIRE

    Youn, Jong-In; Lee, Ho Sub; Lee, Sangkwan

    2016-01-01

    [Purpose] This study used electromyography to measure the effective treatment duration of interferential current therapy for muscle fatigue. [Subjects and Methods] Fifteen healthy adult men volunteered to participate in the study (age: 24.2 ± 1.3 years; weight: 67.6 ± 4.92 kg; height: 176.4 ± 4.92 cm). All subjects performed 5 min of isometric back extension exercise to produce muscle fatigue, and were then treated with interferential current therapy for 15 min, with electromyography monitori...

  14. Current obesity drug treatment

    Directory of Open Access Journals (Sweden)

    Marcio C. Mancini

    2006-03-01

    Pharmacological treatment of obesity is an area of rapid change, with ongoing development of new drugs and treatment propositions. This article presents information on physiological agents that are currently being used, as well as drugs that were widely used but are no longer available.

  15. An Algorithm for Neuropathic Pain Management in Older People.

    Science.gov (United States)

    Pickering, Gisèle; Marcoux, Margaux; Chapiro, Sylvie; David, Laurence; Rat, Patrice; Michel, Micheline; Bertrand, Isabelle; Voute, Marion; Wary, Bernard

    2016-08-01

    Neuropathic pain frequently affects older people, who generally also have several comorbidities. Elderly patients are often poly-medicated, which increases the risk of drug-drug interactions. These patients, especially those with cognitive problems, may also have restricted communication skills, making pain evaluation difficult and pain treatment challenging. Clinicians and other healthcare providers need a decisional algorithm to optimize the recognition and management of neuropathic pain. We present a decisional algorithm developed by a multidisciplinary group of experts, which focuses on pain assessment and therapeutic options for the management of neuropathic pain, particularly in the elderly. The algorithm involves four main steps: (1) detection, (2) evaluation, (3) treatment, and (4) re-evaluation. The detection of neuropathic pain is an essential step in ensuring successful management. The extent of the impact of the neuropathic pain is then assessed, generally with self-report scales, except in patients with communication difficulties who can be assessed using behavioral scales. The management of neuropathic pain frequently requires combination treatments, and recommended treatments should be prescribed with caution in these elderly patients, taking into consideration their comorbidities and potential drug-drug interactions and adverse events. This algorithm can be used in the management of neuropathic pain in the elderly to ensure timely and adequate treatment by a multidisciplinary team.

  16. A fast optimization algorithm for multicriteria intensity modulated proton therapy planning

    International Nuclear Information System (INIS)

    Chen Wei; Craft, David; Madden, Thomas M.; Zhang, Kewu; Kooy, Hanne M.; Herman, Gabor T.

    2010-01-01

    Purpose: To describe a fast projection algorithm for optimizing intensity modulated proton therapy (IMPT) plans and to describe and demonstrate the use of this algorithm in multicriteria IMPT planning. Methods: The authors develop a projection-based solver for a class of convex optimization problems and apply it to IMPT treatment planning. The speed of the solver permits its use in multicriteria optimization, where several optimizations are performed which span the space of possible treatment plans. The authors describe a plan database generation procedure which is customized to the requirements of the solver. The optimality precision of the solver can be specified by the user. Results: The authors apply the algorithm to three clinical cases: A pancreas case, an esophagus case, and a tumor along the rib cage case. Detailed analysis of the pancreas case shows that the algorithm is orders of magnitude faster than industry-standard general purpose algorithms (MOSEK's interior point optimizer, primal simplex optimizer, and dual simplex optimizer). Additionally, the projection solver has almost no memory overhead. Conclusions: The speed and guaranteed accuracy of the algorithm make it suitable for use in multicriteria treatment planning, which requires the computation of several diverse treatment plans. Additionally, given the low memory overhead of the algorithm, the method can be extended to include multiple geometric instances and proton range possibilities, for robust optimization.
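    A projection-based solver of the general kind described alternates a gradient step with a projection back onto the feasible set. The following is a generic projected-gradient sketch on a toy convex problem, not the authors' IMPT solver or its plan database machinery:

```python
def project_nonneg(x):
    """Euclidean projection onto the nonnegative orthant, a simple
    convex set (clip each coordinate at zero)."""
    return [max(v, 0.0) for v in x]

def projected_gradient(grad, project, x0, step=0.1, iters=500):
    """Minimize a smooth convex f over a convex set C by repeating:
    take a gradient step, then project the iterate back onto C."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = project([xi - step * gi for xi, gi in zip(x, g)])
    return x

# Toy problem: minimize (x1 - 2)^2 + (x2 + 1)^2 subject to x >= 0.
# The unconstrained minimum is (2, -1); projection pins x2 at 0,
# so the constrained minimizer is (2, 0).
grad = lambda x: [2 * (x[0] - 2.0), 2 * (x[1] + 1.0)]
x_star = projected_gradient(grad, project_nonneg, [0.0, 0.0])
print(x_star)
```

    The appeal for treatment planning is that each iteration is cheap and nearly memory-free, which is consistent with the "almost no memory overhead" the authors report.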

  17. A fast optimization algorithm for multicriteria intensity modulated proton therapy planning.

    Science.gov (United States)

    Chen, Wei; Craft, David; Madden, Thomas M; Zhang, Kewu; Kooy, Hanne M; Herman, Gabor T

    2010-09-01

    To describe a fast projection algorithm for optimizing intensity modulated proton therapy (IMPT) plans and to describe and demonstrate the use of this algorithm in multicriteria IMPT planning. The authors develop a projection-based solver for a class of convex optimization problems and apply it to IMPT treatment planning. The speed of the solver permits its use in multicriteria optimization, where several optimizations are performed which span the space of possible treatment plans. The authors describe a plan database generation procedure which is customized to the requirements of the solver. The optimality precision of the solver can be specified by the user. The authors apply the algorithm to three clinical cases: a pancreas case, an esophagus case, and a tumor along the rib cage case. Detailed analysis of the pancreas case shows that the algorithm is orders of magnitude faster than industry-standard general purpose algorithms (MOSEK's interior point optimizer, primal simplex optimizer, and dual simplex optimizer). Additionally, the projection solver has almost no memory overhead. The speed and guaranteed accuracy of the algorithm make it suitable for use in multicriteria treatment planning, which requires the computation of several diverse treatment plans. Additionally, given the low memory overhead of the algorithm, the method can be extended to include multiple geometric instances and proton range possibilities, for robust optimization.

  18. Childhood Enuresis: Current Diagnostic Formulations, Salient Findings, and Effective Treatment Modalities.

    Science.gov (United States)

    Thurber, Steven

    2017-06-01

    Enuresis constitutes a frequently encountered problem area for children that may adversely affect social and emotional adjustment. This type of incontinence has been of concern to the human family for centuries. A brief history of enuresis is presented followed by current conceptualizations, diagnostic criteria, prevalence rates and psychiatric comorbidities. Historic notions of causation together with ineffective, sometimes barbaric treatments are then discussed, ending with a presentation of evidence-based treatment modalities, with the urine alarm being an essential element of effective treatment. An intervention termed dry bed training combines the urine alarm with a series of procedures designed in part to reduce relapse potential and should be a primary consideration for implementation by treatment professionals. Finally, a brief case study is presented illustrating special etiological and treatment considerations with juvenile psychiatric patients. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. European audit of current practice in diagnosis and treatment of childhood growth hormone deficiency

    DEFF Research Database (Denmark)

    Juul, Anders; Bernasconi, Sergio; Clayton, Peter E

    2002-01-01

    The present survey among members of the ESPE on current practice in diagnosis and treatment of growth hormone (GH) deficiency (GHD) is of great clinical relevance and importance in the light of the recently published guidelines for diagnosis and treatment of GHD by the Growth Hormone Research Society. We have found much conformity but also numerous discrepancies between the recommendations of the Growth Hormone Research Society and current practice in Europe.

  20. A virtual-accelerator-based verification of a Monte Carlo dose calculation algorithm for electron beam treatment planning in homogeneous phantoms

    International Nuclear Information System (INIS)

    Wieslander, Elinore; Knoeoes, Tommy

    2006-01-01

    By introducing Monte Carlo (MC) techniques to the verification procedure of dose calculation algorithms in treatment planning systems (TPSs), problems associated with conventional measurements can be avoided and properties that are considered unmeasurable can be studied. The aim of the study is to implement a virtual accelerator, based on MC simulations, to evaluate the performance of a dose calculation algorithm for electron beams in a commercial TPS. The TPS algorithm is MC based and the virtual accelerator is used to study the accuracy of the algorithm in water phantoms. The basic test of the implementation of the virtual accelerator is successful for 6 and 12 MeV (γ < 1.0, 0.02 Gy/2 mm). For 18 MeV, there are problems in the profile data for some of the applicators, where the TPS underestimates the dose. For fields equipped with patient-specific inserts, the agreement is generally good. The exception is 6 MeV where there are slightly larger deviations. The concept of the virtual accelerator is shown to be feasible and has the potential to be a powerful tool for vendors and users
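    The γ < 1.0 pass criterion quoted above combines a dose tolerance with a distance-to-agreement (DTA) tolerance. A minimal 1D sketch of the standard gamma evaluation, using the 0.02 Gy / 2 mm tolerances from the text on hypothetical profiles:

```python
import math

def gamma_1d(ref, evl, spacing_mm, dose_tol_gy=0.02, dta_mm=2.0):
    """1D gamma index: for each reference point, the minimum combined
    dose-difference / distance-to-agreement metric over all evaluated
    points. gamma < 1 means the point passes the two tolerances
    jointly; identical profiles give gamma == 0 everywhere."""
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_evl in enumerate(evl):
            dist = (j - i) * spacing_mm          # spatial offset in mm
            dose_diff = d_evl - d_ref            # dose offset in Gy
            g = math.sqrt((dist / dta_mm) ** 2 + (dose_diff / dose_tol_gy) ** 2)
            best = min(best, g)
        gammas.append(best)
    return gammas

# Identical hypothetical profiles pass trivially:
ref = [1.00, 1.02, 0.98, 0.95]
print(gamma_1d(ref, ref, spacing_mm=1.0))
```

    Real gamma evaluations run in 2D or 3D with interpolation between grid points; the combined metric itself is unchanged.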

  1. Monte Carlo evaluation of the AAA treatment planning algorithm in a heterogeneous multilayer phantom and IMRT clinical treatments for an Elekta SL25 linear accelerator

    International Nuclear Information System (INIS)

    Sterpin, E.; Tomsej, M.; Smedt, B. de; Reynaert, N.; Vynckier, S.

    2007-01-01

    The Anisotropic Analytical Algorithm (AAA) is a new pencil beam convolution/superposition algorithm proposed by Varian for photon dose calculations. The configuration of AAA depends on linear accelerator design and specifications. The purpose of this study was to investigate the accuracy of AAA for an Elekta SL25 linear accelerator for small fields and intensity modulated radiation therapy (IMRT) treatments in inhomogeneous media. The accuracy of AAA was evaluated in two studies. First, AAA was compared both with Monte Carlo (MC) and with measurements in an inhomogeneous phantom simulating lung equivalent tissues and bone ribs. The algorithm was tested under lateral electronic disequilibrium conditions, using small fields (2×2 cm²). Good agreement was generally achieved for depth dose and profiles, with deviations generally below 3% in lung inhomogeneities and below 5% at interfaces. However, the effects of attenuation and scattering close to the bone ribs were not fully taken into account by AAA, and small inhomogeneities may lead to planning errors. Second, AAA and MC were compared for IMRT plans in clinical conditions, i.e., dose calculations in a computed tomography scan of a patient. One ethmoid tumor, one oropharynx tumor, and two lung tumors are presented in this paper. Small differences were found between the dose volume histograms. For instance, a 1.7% difference for the mean planning target volume dose was obtained for the ethmoid case. Since better agreement was achieved for the same plans but in homogeneous conditions, these differences must be attributed to the handling of inhomogeneities by AAA. Therefore, inherent assumptions of the algorithm, principally the assumption of independent depth and lateral directions in the scaling of the kernels, were slightly influencing AAA's validity in inhomogeneities. However, AAA showed a good accuracy overall and a great ability to handle small fields in inhomogeneous media compared to other pencil beam convolution

  2. Current and Emerging Directions in the Treatment of Eating Disorders

    Directory of Open Access Journals (Sweden)

    Tiffany A. Brown

    2012-01-01

    Eating disorders are a significant source of psychiatric morbidity in young women and demonstrate high comorbidity with mood, anxiety, and substance use disorders. Thus, clinicians may encounter eating disorders in the context of treating other conditions. This review summarizes the efficacy of current and emerging treatments for anorexia nervosa (AN), bulimia nervosa (BN), and binge eating disorder (BED). Treatment trials were identified using electronic and manual searches and by reviewing abstracts from conference proceedings. Family-based therapy has demonstrated superiority for adolescents with AN, but no treatment has established superiority for adults. For BN, both 60 mg fluoxetine and cognitive behavioral therapy (CBT) have well-established efficacy. For BED, selective serotonin reuptake inhibitors, CBT, and interpersonal psychotherapy have demonstrated efficacy. Emerging directions for AN include investigation of the antipsychotic olanzapine and several novel psychosocial treatments. Future directions for BN and BED include increasing CBT disseminability, targeting affect regulation, and individualized stepped-care approaches.

  3. Effects of visualization on algorithm comprehension

    Science.gov (United States)

    Mulvey, Matthew

    Computer science students are expected to learn and apply a variety of core algorithms which are an essential part of the field. Any one of these algorithms by itself is not necessarily extremely complex, but remembering the large variety of algorithms and the differences between them is challenging. To address this challenge, we present a novel algorithm visualization tool designed to enhance students' understanding of Dijkstra's algorithm by allowing them to discover the rules of the algorithm for themselves. It is hoped that a deeper understanding of the algorithm will help students correctly select, adapt and apply the appropriate algorithm when presented with a problem to solve, and that what is learned here will be applicable to the design of other visualization tools designed to teach different algorithms. Our visualization tool is currently in the prototype stage, and this thesis will discuss the pedagogical approach that informs its design, as well as the results of some initial usability testing. Finally, to clarify the direction for further development of the tool, four different variations of the prototype were implemented, and the instructional effectiveness of each was assessed by having a small sample of participants use the different versions of the prototype and then take a quiz to assess their comprehension of the algorithm.
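    For reference, the algorithm the tool visualizes can be stated compactly. This is textbook Dijkstra with a priority queue, not the thesis's own code:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from `source` in a graph given as
    {node: [(neighbor, edge_weight), ...]} with nonnegative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale entry; a shorter path was already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

    The "rules" students discover interactively correspond to the two branches above: settling the closest unsettled node, and relaxing its outgoing edges.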

  4. Clinical algorithms to aid osteoarthritis guideline dissemination.

    Science.gov (United States)

    Meneses, S R F; Goode, A P; Nelson, A E; Lin, J; Jordan, J M; Allen, K D; Bennell, K L; Lohmander, L S; Fernandes, L; Hochberg, M C; Underwood, M; Conaghan, P G; Liu, S; McAlindon, T E; Golightly, Y M; Hunter, D J

    2016-09-01

    Numerous scientific organisations have developed evidence-based recommendations aiming to optimise the management of osteoarthritis (OA). Uptake, however, has been suboptimal. The purpose of this exercise was to harmonize the recent recommendations and develop a user-friendly treatment algorithm to facilitate translation of evidence into practice. We updated a previous systematic review on clinical practice guidelines (CPGs) for OA management. The guidelines were assessed using the Appraisal of Guidelines for Research and Evaluation for quality and the standards for developing trustworthy CPGs as established by the National Academy of Medicine (NAM). Four case scenarios and algorithms were developed by consensus of a multidisciplinary panel. Sixteen guidelines were included in the systematic review. Most recommendations were directed toward physicians and allied health professionals, and most had multi-disciplinary input. Analysis for trustworthiness suggests that many guidelines still present a lack of transparency. A treatment algorithm was developed for each case scenario advised by recommendations from guidelines and based on panel consensus. Strategies to facilitate the implementation of guidelines in clinical practice are necessary. The algorithms proposed are examples of how to apply recommendations in the clinical context, helping the clinician to visualise the patient flow and timing of different treatment modalities. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  5. [Comparison of dose calculation algorithms in stereotactic radiation therapy in lung].

    Science.gov (United States)

    Tomiyama, Yuki; Araki, Fujio; Kanetake, Nagisa; Shimohigashi, Yoshinobu; Tominaga, Hirofumi; Sakata, Jyunichi; Oono, Takeshi; Kouno, Tomohiro; Hioki, Kazunari

    2013-06-01

    Dose calculation algorithms in radiation treatment planning systems (RTPSs) play a crucial role in stereotactic body radiation therapy (SBRT) in the lung with heterogeneous media. This study investigated the performance and accuracy of dose calculation for three algorithms: analytical anisotropic algorithm (AAA), pencil beam convolution (PBC) and Acuros XB (AXB) in Eclipse (Varian Medical Systems), by comparison against the Voxel Monte Carlo algorithm (VMC) in iPlan (BrainLab). The dose calculations were performed with clinical lung treatments under identical planning conditions, and the dose distributions and the dose volume histogram (DVH) were compared among algorithms. AAA underestimated the dose in the planning target volume (PTV) compared to VMC and AXB in most clinical plans. In contrast, PBC overestimated the PTV dose. AXB tended to slightly overestimate the PTV dose compared to VMC but the discrepancy was within 3%. The discrepancy in the PTV dose between VMC and AXB appears to be due to differences in physical material assignments, material voxelization methods, and an energy cut-off for electron interactions. The dose distributions in lung treatments varied significantly according to the calculation accuracy of the algorithms. VMC and AXB are better algorithms than AAA for SBRT.

  6. Chiari malformation Type I surgery in pediatric patients. Part 1: validation of an ICD-9-CM code search algorithm.

    Science.gov (United States)

    Ladner, Travis R; Greenberg, Jacob K; Guerrero, Nicole; Olsen, Margaret A; Shannon, Chevis N; Yarbrough, Chester K; Piccirillo, Jay F; Anderson, Richard C E; Feldstein, Neil A; Wellons, John C; Smyth, Matthew D; Park, Tae Sung; Limbrick, David D

    2016-05-01

    OBJECTIVE Administrative billing data may facilitate large-scale assessments of treatment outcomes for pediatric Chiari malformation Type I (CM-I). Validated International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code algorithms for identifying CM-I surgery are critical prerequisites for such studies but are currently only available for adults. The objective of this study was to validate two ICD-9-CM code algorithms using hospital billing data to identify pediatric patients undergoing CM-I decompression surgery. METHODS The authors retrospectively analyzed the validity of two ICD-9-CM code algorithms for identifying pediatric CM-I decompression surgery performed at 3 academic medical centers between 2001 and 2013. Algorithm 1 included any discharge diagnosis code of 348.4 (CM-I), as well as a procedure code of 01.24 (cranial decompression) or 03.09 (spinal decompression or laminectomy). Algorithm 2 restricted this group to the subset of patients with a primary discharge diagnosis of 348.4. The positive predictive value (PPV) and sensitivity of each algorithm were calculated. RESULTS Among 625 first-time admissions identified by Algorithm 1, the overall PPV for CM-I decompression was 92%. Among the 581 admissions identified by Algorithm 2, the PPV was 97%. The PPV for Algorithm 1 was lower in one center (84%) compared with the other centers (93%-94%), whereas the PPV of Algorithm 2 remained high (96%-98%) across all subgroups. The sensitivity of Algorithms 1 (91%) and 2 (89%) was very good and remained so across subgroups (82%-97%). CONCLUSIONS An ICD-9-CM algorithm requiring a primary diagnosis of CM-I has excellent PPV and very good sensitivity for identifying CM-I decompression surgery in pediatric patients. These results establish a basis for utilizing administrative billing data to assess pediatric CM-I treatment outcomes.
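    PPV and sensitivity as used above are simple confusion-matrix ratios. A minimal sketch; the counts below are illustrative back-calculations consistent with the reported 92% and 91%, not figures taken from the paper:

```python
def ppv(true_pos, false_pos):
    """Positive predictive value: of the admissions the code algorithm
    flagged, the fraction that truly were CM-I decompression surgery."""
    return true_pos / (true_pos + false_pos)

def sensitivity(true_pos, false_neg):
    """Of all true decompression admissions, the fraction the
    ICD-9-CM code query captured."""
    return true_pos / (true_pos + false_neg)

# Illustrative counts: 575 true positives among 625 flagged admissions,
# with 57 hypothetical true cases missed by the code query.
print(round(ppv(575, 50), 2))          # 0.92
print(round(sensitivity(575, 57), 2))  # 0.91
```

    Restricting to a primary diagnosis (Algorithm 2) trades a few true positives (lower sensitivity) for fewer false positives (higher PPV), which is the pattern the study reports.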

  7. Nonlinear PI Control with Adaptive Interaction Algorithm for Multivariable Wastewater Treatment Process

    Directory of Open Access Journals (Sweden)

    S. I. Samsudin

    2014-01-01

    The wastewater treatment plant (WWTP) is well known for the nonlinearity of its control parameters and is therefore difficult to control. In this paper, an enhancement of the nonlinear PI controller (ENon-PI) to compensate for the nonlinearity of the activated sludge WWTP is proposed. The ENon-PI controller is designed by cascading a sector-bounded nonlinear gain with a linear PI controller. The rate of variation of the nonlinear gain kn is automatically updated based on an adaptive interaction algorithm. The initiative to simplify the ENon-PI control structure by adapting kn has been validated by significant improvement under various dynamic influents: more than 30% of the integral square error and 14% of the integral absolute error are reduced compared to the benchmark PI for dissolved oxygen (DO) control and for nitrate in nitrogen removal control. Better average effluent quality, fewer effluent violations, and lower aeration energy consumption resulted.
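    The cascade described, a sector-bounded nonlinear gain feeding a linear PI law, can be sketched as follows. The tanh-based gain shape, its bounds, and all tuning constants here are illustrative assumptions, not the paper's adaptive kn update law:

```python
import math

def sector_gain(error, k_max=2.0, alpha=1.0):
    """Sector-bounded nonlinear gain: grows with |error| but stays
    bounded, so the scaled error k(e)*e lies in a fixed sector.
    Small errors pass nearly unchanged; large errors are amplified."""
    return 1.0 + (k_max - 1.0) * math.tanh(alpha * abs(error))

class NonlinearPI:
    """Linear PI law acting on the nonlinearly scaled error k(e)*e."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, setpoint, measurement):
        e = setpoint - measurement
        scaled = sector_gain(e) * e
        self.integral += scaled * self.dt   # rectangular integration
        return self.kp * scaled + self.ki * self.integral

# e.g. driving dissolved oxygen toward a 2.0 mg/L setpoint:
ctrl = NonlinearPI(kp=1.0, ki=0.1, dt=0.1)
u = ctrl.step(2.0, 0.5)  # large error -> amplified control action
```

    The adaptive interaction algorithm in the paper goes further by tuning the gain's rate of variation online; the fixed shape above only illustrates the cascade structure.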

  8. [Current treatment concepts for olecranon and prepatellar bursitis in Austria].

    Science.gov (United States)

    Baumbach, S F; Michel, M; Wyen, H; Buschmann, C T; Kdolsky, R; Kanz, K-G

    2013-04-01

    The limited evidence available on the diagnosis and treatment of olecranon and prepatellar bursitis indicates nationally varying treatment approaches. Therefore the aim of this study was to survey the current treatment concepts for olecranon and prepatellar bursitis in Austria. An online questionnaire comprising demographic data, questions regarding diagnostics and differentiation between septic bursitis (SB) and non-septic bursitis (NSB), as well as two case reports for therapy appraisal was sent to members of the Austrian Society of Orthopaedics and Orthopaedic Surgery (ÖGO) and the Austrian Society of Traumatology (ÖGU). The overall response rates were 46 % (ÖGU)/12 % (ÖGO). Differentiation between SB and NSB was predominantly based on medical history/clinical presentation (ÖGU: 100 %/ÖGO: 84 %) and blood sampling (ÖGU: 82 %/ÖGO: 77 %). 64/36 % of surveyed members of ÖGO/ÖGU performed a bursal aspiration. 95/55 % of Austrian ÖGU opinion leaders favoured a surgical treatment approach in cases of SB/NSB. Conversely, ÖGO members rather favoured a conservative treatment approach (28/27 %). Significant differences were found between ÖGO and ÖGU, with the latter favouring a surgical treatment approach in cases of SB and NSB. However, the international literature argues for a conservative treatment approach. Further high quality research is needed to establish an evidence-based treatment approach. Georg Thieme Verlag KG Stuttgart · New York.

  9. Algorithms for contrast enhancement of electronic portal images

    International Nuclear Information System (INIS)

    Díez, S.; Sánchez, S.

    2015-01-01

    An implementation of two new automated image processing algorithms for contrast enhancement of portal images is presented; these are suitable tools that facilitate setup verification and visualization of patients during radiotherapy treatments. In the first algorithm, called Automatic Segmentation and Histogram Stretching (ASHS), the portal image is automatically segmented into two sub-images delimited by the conformed treatment beam: one consisting of the imaged patient within the radiation treatment field, and the other of the imaged patient outside it. By segmenting the original image, histogram stretching can be performed independently and improved in both regions. The second algorithm involves a two-step process. In the first step, Normalization to Local Mean (NLM), an inverse restoration filter is applied by dividing a portal image pixel by pixel by its blurred version. In the second step, named Linearly Combined Local Histogram Equalization (LCLHE), the contrast of the original image is strongly improved by a Local Contrast Enhancement (LCE) algorithm, revealing the anatomical structures of the patient; the output image is then linearly combined with the original portal image. Finally, the output images of the two steps (NLM and LCLHE) are linearly combined once again to obtain a contrast-enhanced image. These two algorithms have been tested on several portal images with good results. - Highlights: • Two algorithms are implemented to improve the contrast of electronic portal images. • The multi-leaf and conformed beam edges are automatically segmented in portal images. • Hidden anatomical and bony structures in portal images are revealed. • Patient setup verification is facilitated by the contrast enhancement achieved.
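The NLM division step and the histogram stretching that ASHS applies per region can be sketched as follows; the box-blur kernel size and the epsilon guard are illustrative choices, not the authors' exact filters:

```python
import numpy as np

def normalize_to_local_mean(img, k=15, eps=1e-6):
    """NLM step sketch: divide the portal image by a blurred (local-mean)
    version of itself, acting as a crude inverse restoration filter.
    Kernel size k and eps are illustrative choices."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    # box blur: local mean over a k-by-k window (O(N*k^2), fine for a sketch)
    blurred = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            blurred[i, j] = padded[i:i + k, j:j + k].mean()
    return img.astype(float) / (blurred + eps)

def stretch_histogram(img):
    """Histogram stretching to the full [0, 1] range; ASHS applies this
    separately inside and outside the segmented treatment field."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + 1e-12)
```

On a uniform region NLM returns values near 1, so only genuine local structure (edges, bone) deviates from unity, which is what makes the subsequent contrast enhancement effective.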

  10. Current Treatment of Toxoplasma Retinochoroiditis: An Evidence-Based Review

    Directory of Open Access Journals (Sweden)

    Meredith Harrell

    2014-01-01

    Full Text Available Objective. To perform an evidence-based review of treatments for Toxoplasma retinochoroiditis (TRC). Methods. A systematic literature search was performed using the PubMed database with the key phrase "ocular toxoplasmosis treatment" and the filters "controlled clinical trial" and "randomized clinical trial", as well as Ovid MEDLINE (1946 to May week 2, 2014) using the keyword "ocular toxoplasmosis". The included studies were used to evaluate the various treatment modalities of TRC. Results. The electronic search yielded a total of 974 publications, of which 44 reported on the treatment of ocular toxoplasmosis. There were 9 randomized controlled studies and an additional 3 comparative studies on the treatment of acute TRC with systemic or intravitreous antibiotics or on reducing recurrences of TRC. Endpoints of the studies included visual acuity improvement, inflammatory response, lesion size changes, recurrence of lesions, and adverse effects of medications. Conclusions. There was conflicting evidence as to the effectiveness of systemic antibiotics for TRC. There is no evidence that one antibiotic regimen is superior to another, so the choice needs to be informed by the safety profile. Intravitreous clindamycin with dexamethasone seems to be as effective as systemic treatments. There is currently level I evidence that intermittent trimethoprim-sulfamethoxazole prevents recurrence of the disease.

  11. Phase II drugs currently being investigated for the treatment of hypogonadism.

    Science.gov (United States)

    Udedibia, Emeka; Kaminetsky, Jed

    2014-12-01

    Hypogonadism is the most common endocrine disorder affecting men of all age groups. Recent shifts in public awareness, increased screening and recognition of symptoms, and updated diagnostic criteria have led to an increase in men diagnosed as hypogonadal, including middle-aged and older men who would previously have been considered eugonadal. The rise in testosterone replacement therapy (TRT) has been paralleled by advances in treatment options. Although current therapies are highly efficacious for many men, there remains a need for newer therapies that are more cost-effective, preserve ease of use and administration, mitigate undesirable effects, and closely mimic physiological levels of testosterone. In this review, the authors discuss current TRTs and therapies in development for the treatment of hypogonadism. The focus is on therapies under Phase II investigation or those that have recently completed Phase II study. With several new therapies in development, the authors expect advances toward treatment benchmarks that meet the needs of the individual symptomatic hypogonadal male. Increased public awareness of hypogonadism and TRT has led to a welcome expansion in the choice of TRT options, including new delivery systems, formulations, routes of administration, and non-testosterone modalities.

  12. Current status of electron beam treatment of flue gas in China

    International Nuclear Information System (INIS)

    Wang Zhiguang

    2006-01-01

    Fossil resources, especially coal, will remain the main energy source in China over the next three to four decades. Pollution from power-station flue gas has been a pressing problem since the 1990s. Electron beam treatment of flue gas, an advanced technique, has been developed and applied by several institutes and industries in China. The current status of flue gas treatment using electron beams and the development of electron accelerators in China are reviewed. (author)

  13. Recommendations from the Spanish Oncology Genitourinary Group for the treatment of metastatic renal cancer.

    Science.gov (United States)

    Bellmunt, Joaquim; Calvo, Emiliano; Castellano, Daniel; Climent, Miguel Angel; Esteban, Emilio; García del Muro, Xavier; González-Larriba, José Luis; Maroto, Pablo; Trigo, José Manuel

    2009-03-01

    For almost two decades, interleukin-2 and interferon-alpha were the only systemic treatment options available for metastatic renal cell carcinoma. In recent years, however, five new targeted therapies, namely sunitinib, sorafenib, temsirolimus, everolimus and bevacizumab, have demonstrated clinical activity in these patients. With the availability of new targeted agents that are active in this disease, there is a need to continuously update the treatment algorithm for the disease. Given the important advances obtained, the Spanish Oncology Genitourinary Group (SOGUG) considered it useful to review the current status of the disease, including the genetic and molecular biology factors involved, the current models for predicting development of metastases, and the role of surgery, radiotherapy and systemic therapies in the early or late management of the disease. Based on this work, a treatment algorithm was developed.

  14. Fluorine-plasma surface treatment for gate forward leakage current reduction in AlGaN/GaN HEMTs

    International Nuclear Information System (INIS)

    Chen Wanjun; Zhang Jing; Zhang Bo; Chen, Kevin Jing

    2013-01-01

    The gate forward leakage current in AlGaN/GaN high electron mobility transistors (HEMTs) is investigated. It is shown that the current originating from the forward-biased Schottky gate dominates the gate forward leakage. A fluorine-plasma surface treatment is therefore presented that introduces negative ions into the AlGaN layer, resulting in a higher metal-semiconductor barrier and, consequently, a smaller gate forward leakage current. Experimental results confirm that the gate forward leakage current is one order of magnitude lower than that of a HEMT without the plasma treatment. In addition, the DC characteristics of the HEMT with plasma treatment have been studied. (semiconductor devices)
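For intuition, the thermionic-emission diode equation shows why a modest barrier-height increase suppresses the forward gate current so strongly: at room temperature, raising the barrier by about 0.06 eV cuts the current roughly tenfold. The parameter values below are generic illustrations, not taken from the paper:

```python
import math

def schottky_forward_current(v, phi_b, n=1.5, t=300.0, area=1e-7, a_star=26.4):
    """Thermionic-emission estimate of Schottky gate forward current.
    phi_b is the barrier height in eV, a_star the effective Richardson
    constant (A/cm^2/K^2), area in cm^2. All values are illustrative."""
    k = 8.617e-5  # Boltzmann constant, eV/K
    i_s = area * a_star * t**2 * math.exp(-phi_b / (k * t))  # saturation current
    return i_s * (math.exp(v / (n * k * t)) - 1.0)
```

Because the saturation current scales as exp(-phi_b/kT), a barrier increase of Delta-phi multiplies the current by exp(-Delta-phi/kT), which is about 1/10 for Delta-phi of roughly 0.06 eV at 300 K.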

  15. Head and neck paragangliomas: A two-decade institutional experience and algorithm for management.

    Science.gov (United States)

    Smith, Joshua D; Harvey, Rachel N; Darr, Owen A; Prince, Mark E; Bradford, Carol R; Wolf, Gregory T; Else, Tobias; Basura, Gregory J

    2017-12-01

    Paragangliomas of the head and neck and cranial base are typically benign, slow-growing tumors arising within the jugular foramen, middle ear, carotid bifurcation, or vagus nerve proper. The objective of this study was to provide a comprehensive characterization of our institutional experience with clinical management of these tumors and posit an algorithm for diagnostic evaluation and treatment. This was a retrospective cohort study of patients undergoing treatment for paragangliomas of the head and neck and cranial base at our institution from 2000-2017. Data on tumor location, catecholamine levels, specific imaging modalities employed in diagnostic work-up, pre-treatment cranial nerve palsy, treatment modality, utilization of preoperative angiographic embolization, complications of treatment, tumor control and recurrence, and hereditary status (ie, succinate dehydrogenase mutations) were collected and summarized. The mean (SD) age of our cohort was 51.8 (±16.1) years, with 123 (63.4%) female patients and 71 (36.6%) male patients. Catecholamine-secreting lesions were found in nine (4.6%) patients. Fifty-one patients underwent genetic testing, with mutations identified in 43 (20 SDHD, 13 SDHB, 7 SDHC, and 1 each of SDHA, SDHAF2, and NF1). Observation with serial imaging, surgical extirpation, radiation, and stereotactic radiosurgery were variably employed as treatment approaches across anatomic subsites. An algorithmic approach to clinical management of these tumors, derived from our longitudinal institutional experience and current empiric evidence, may assist otolaryngologists, radiation oncologists, and geneticists in the care of these complex neoplasms. Level of evidence: 4.

  16. Design of an optimization algorithm for clinical use

    International Nuclear Information System (INIS)

    Gustafsson, Anders

    1995-01-01

    Radiation therapy optimization has received much attention in the past few years. In combination with biological objective functions, the different optimization schemes have shown a potential to considerably increase the treatment outcome. With improved radiobiological models and increased computer capacity, radiation therapy optimization has now reached a stage where implementation in a clinical treatment planning system is realistic. A radiation therapy optimization method has been investigated with respect to its feasibility as a tool in a clinical 3D treatment planning system. The optimization algorithm is a constrained iterative gradient method. Photon dose calculation is performed using the clinically validated pencil-beam based algorithm of the clinical treatment planning system. Dose calculation within the optimization scheme is very time consuming, and measures are required to decrease the calculation time. Different methods for more efficient dose calculation within the optimization scheme have been investigated. The optimization results for adaptive sampling of calculation points and for secondary-effect approximations in the dose calculation algorithm are compared with the optimization result for accurate dose calculation in all voxels of interest.
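The constrained iterative gradient idea can be sketched with a least-squares stand-in objective; the biological objective functions and pencil-beam dose engine of the thesis are replaced here by a precomputed dose-deposition matrix D and a non-negativity projection, all of which are illustrative assumptions:

```python
import numpy as np

def optimize_beam_weights(D, prescribed, iters=200, lr=None):
    """Projected-gradient sketch of constrained beam-weight optimization:
    minimize 0.5 * ||D @ w - prescribed||^2 subject to w >= 0, where
    D[v, b] is the dose deposited in voxel v per unit weight of beam b.
    The quadratic objective is an illustrative stand-in."""
    n_vox, n_beams = D.shape
    w = np.ones(n_beams) / n_beams
    if lr is None:
        lr = 1.0 / (np.linalg.norm(D, 2) ** 2)  # safe step: 1 / Lipschitz const
    for _ in range(iters):
        grad = D.T @ (D @ w - prescribed)       # gradient of the objective
        w = np.maximum(w - lr * grad, 0.0)      # project onto the constraint w >= 0
    return w
```

The expensive part in practice is forming D (one dose calculation per beam element), which is why the thesis investigates adaptive sampling of calculation points and approximations of secondary effects inside the optimization loop.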

  17. How many neurons can we see with current spike sorting algorithms?

    Science.gov (United States)

    Pedreira, Carlos; Martinez, Juan; Ison, Matias J; Quian Quiroga, Rodrigo

    2012-10-15

    Recent studies have highlighted the disagreement between the typical number of neurons observed in extracellular recordings and the number expected based on anatomical and physiological considerations. This disagreement has mainly been attributed to the presence of sparsely firing neurons. However, it is also possible that it is due to limitations of the spike sorting algorithms used to process the data. To address this issue, we used realistic simulations of extracellular recordings and found a relatively poor spike sorting performance for simulations containing a large number of neurons. In fact, the number of correctly identified neurons for single-channel recordings showed an asymptotic behavior, saturating at about 8-10 units when up to 20 units were present in the data. This performance was significantly poorer for neurons with low firing rates, as these units were twice as likely to be missed as ones with high firing rates in simulations containing many neurons. These results uncover one of the main reasons for the relatively low number of neurons found in extracellular recordings and also stress the importance of further development of spike sorting algorithms. Copyright © 2012 Elsevier B.V. All rights reserved.
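The detection stage that precedes spike sorting can be sketched as follows; the robust noise estimate (median/0.6745) and the multiple-of-sigma threshold follow common practice in spike-sorting pipelines, though the study's exact simulation and sorting procedure are not reproduced here:

```python
import numpy as np

def detect_spikes(signal, fs, thresh_mult=4.0, window_ms=1.0):
    """Amplitude-threshold spike detection sketch. The noise standard
    deviation is estimated robustly from the median absolute signal so
    that large spikes do not inflate the threshold."""
    sigma = np.median(np.abs(signal)) / 0.6745  # robust noise std estimate
    thr = thresh_mult * sigma
    half = int(window_ms * 1e-3 * fs / 2)
    # upward threshold crossings
    crossings = np.flatnonzero((signal[1:] > thr) & (signal[:-1] <= thr))
    # enforce a refractory window so each spike yields a single event
    spikes, last = [], -np.inf
    for c in crossings:
        if c - last > 2 * half:
            spikes.append(c)
            last = c
    return np.array(spikes, dtype=int)
```

Low-firing-rate units contribute few threshold crossings, so their clusters are small and easily absorbed into noise or into larger clusters, which is consistent with the study's finding that such units are the ones most often missed.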

  18. The Current State of Empirical Support for the Pharmacological Treatment of Selective Mutism

    Science.gov (United States)

    Carlson, John S.; Mitchell, Angela D.; Segool, Natasha

    2008-01-01

    This article reviews the current state of evidence for the psychopharmacological treatment of children diagnosed with selective mutism within the context of its link to social anxiety disorder. An increased focus on potential medication treatment for this disorder has resulted from significant monetary and resource limitations in typical practice,…

  19. Algorithm integration using ADL (Algorithm Development Library) for improving CrIMSS EDR science product quality

    Science.gov (United States)

    Das, B.; Wilson, M.; Divakarla, M. G.; Chen, W.; Barnet, C.; Wolf, W.

    2013-05-01

    Algorithm Development Library (ADL) is a framework that mimics the operational system IDPS (Interface Data Processing Segment) currently used to process data from instruments aboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite. The satellite was launched successfully in October 2011. The Cross-track Infrared and Microwave Sounder Suite (CrIMSS) consists of the Advanced Technology Microwave Sounder (ATMS) and Cross-track Infrared Sounder (CrIS) instruments on board S-NPP. These instruments will also be on board JPSS (Joint Polar Satellite System), to be launched in early 2017. The primary products of the CrIMSS Environmental Data Record (EDR) include global atmospheric vertical temperature, moisture, and pressure profiles (AVTP, AVMP and AVPP) and the Ozone IP (Intermediate Product from CrIS radiances). Several algorithm updates have recently been proposed by CrIMSS scientists, including fixes to the handling of forward modeling errors, a more conservative identification of clear scenes, indexing corrections for daytime products, and relaxed constraints between surface temperature and air temperature for daytime land scenes. We have integrated these improvements into the ADL framework. This work compares the results from the ADL emulation of the future IDPS system, incorporating all the suggested algorithm updates, with the current official processing results through qualitative and quantitative evaluations. The results show that these algorithm updates improve science product quality.

  20. Addressing Prediabetes in Childhood Obesity Treatment Programs: Support from Research and Current Practice

    Science.gov (United States)

    Grow, H. Mollie; Fernandez, Cristina; Lukasiewicz, Gloria J.; Rhodes, Erinn T.; Shaffer, Laura A.; Sweeney, Brooke; Woolford, Susan J.; Estrada, Elizabeth

    2014-01-01

    Abstract Background: Type 2 diabetes mellitus (T2DM) and prediabetes have increased in prevalence among overweight and obese children, with significant implications for long-term health. There is little published evidence on the best approaches to care of prediabetes among overweight youth or the current practices used across pediatric weight management programs. Methods: This article reviews the literature and summarizes current practices for screening, diagnosis, and treatment of prediabetes at childhood obesity treatment centers. Findings regarding current practice were based on responses to an online survey from 28 pediatric weight management programs at 25 children's hospitals in 2012. Based on the literature reviewed, and empiric data, consensus support statements on prediabetes care and T2DM prevention were developed among representatives of these 25 children's hospitals' obesity clinics. Results: The evidence reviewed demonstrates that current T2DM and prediabetes diagnostic parameters are derived from adult-based studies with little understanding of clinical outcomes among youth. Very limited evidence exists on preventing progression of prediabetes. Some evidence suggests that a significant proportion of obese youth with prediabetes will revert to normoglycemia without pharmacological management. Evidence supports lifestyle modification for children with prediabetes, but further study of specific lifestyle changes and pharmacological treatments is needed. Conclusion: Evidence to guide management of prediabetes in children is limited. Current practice patterns of pediatric weight management programs show areas of variability in practice, reflecting the limited evidence base. More research is needed to guide clinical care for overweight youth with prediabetes. PMID:25055134

  1. The Texas medication algorithm project: clinical results for schizophrenia.

    Science.gov (United States)

    Miller, Alexander L; Crismon, M Lynn; Rush, A John; Chiles, John; Kashner, T Michael; Toprac, Marcia; Carmody, Thomas; Biggs, Melanie; Shores-Wilson, Kathy; Chiles, Judith; Witte, Brad; Bow-Thomas, Christine; Velligan, Dawn I; Trivedi, Madhukar; Suppes, Trisha; Shon, Steven

    2004-01-01

    In the Texas Medication Algorithm Project (TMAP), patients were given algorithm-guided treatment (ALGO) or treatment as usual (TAU). The ALGO intervention included a clinical coordinator to assist the physicians and administer a patient and family education program. The primary comparison in the schizophrenia module of TMAP was between patients seen in clinics in which ALGO was used (n = 165) and patients seen in clinics in which no algorithms were used (n = 144). A third group of patients, seen in clinics using an algorithm for bipolar or major depressive disorder but not for schizophrenia, was also studied (n = 156). The ALGO group had modestly greater improvement in symptoms (Brief Psychiatric Rating Scale) during the first quarter of treatment. The TAU group caught up by the end of 12 months. Cognitive functions were more improved in ALGO than in TAU at 3 months, and this difference was greater at 9 months (the final cognitive assessment). In secondary comparisons of ALGO with the second TAU group, the greater improvement in cognitive functioning was again noted, but the initial symptom difference was not significant.

  2. Dynamic Synchronous Capture Algorithm for an Electromagnetic Flowmeter.

    Science.gov (United States)

    Fanjiang, Yong-Yi; Lu, Shih-Wei

    2017-04-10

    This paper proposes a dynamic synchronous capture (DSC) algorithm to calculate the flow rate for an electromagnetic flowmeter. The DSC algorithm accurately computes the flow-rate signal and efficiently converts the analog signal, improving the execution performance of the microcontroller unit (MCU). Furthermore, it reduces interference from abnormal noise, is extremely stable and independent of fluctuations in the flow measurement, and can compute the current flow-rate signal immediately (m/s). The DSC algorithm can be applied to a current general MCU firmware platform without using DSP (digital signal processing) or a high-speed, high-end MCU platform, and signal amplification by hardware reduces the demand for ADC accuracy, which reduces cost.
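One plausible reading of synchronous capture for a square-wave-excited electromagnetic flowmeter is to average the stable plateau of each excitation half-cycle and take the plateau difference, which is proportional to flow velocity; the timing and calibration details below are assumptions, not the paper's exact DSC procedure:

```python
def flow_rate_dsc(samples, k_cal=1.0):
    """Synchronous-capture sketch for a square-wave excited electromagnetic
    flowmeter. `samples` is a sequence of (polarity, adc_value) pairs
    captured in sync with the excitation; k_cal is an illustrative
    calibration factor converting plateau voltage to m/s."""
    pos = [v for p, v in samples if p > 0]
    neg = [v for p, v in samples if p < 0]
    if not pos or not neg:
        return 0.0
    # the plateau difference cancels DC offset and slow electrode drift
    return k_cal * (sum(pos) / len(pos) - sum(neg) / len(neg)) / 2.0
```

Because the result depends only on the difference between the two plateaus, a constant electrode offset contributes equally to both averages and cancels, which is one reason synchronous schemes tolerate modest ADC accuracy.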

  3. Current neurotrauma treatment practice in secondary medical service centers

    International Nuclear Information System (INIS)

    Suehiro, Eiichi; Yoshino, Hiroko; Koizumi, Hiroyasu; Yoneda, Hiroshi; Suzuki, Michiyasu

    2011-01-01

    Although neurotrauma treatment comprises a significant portion of the neurosurgical workload in secondary medical service centers, little attention has been paid to neurotrauma cases, and evaluation of current treatment practices is limited. We therefore investigated current neurotrauma practices in our hospital, located in a Japanese suburban city. We analyzed 439 patients with traumatic brain injury (TBI) admitted to our hospital between April 2004 and October 2010. Patients were divided into three groups based on the Glasgow Coma Scale (GCS) score on admission: mild TBI (GCS 14-15) in 252 patients (57.4%), moderate TBI (GCS 9-13) in 116 patients (26.4%), and severe TBI (GCS 3-8) in 71 patients (16.2%). Age, gender, alcohol consumption, cause of injury, cranial CT findings, neurosurgical procedure, length of hospital stay, and clinical outcome were analyzed. The average age of the patients was 59.2 years. Male patients comprised 65%. Alcohol consumption was reported in 81 cases (18.5%), most of them with moderate TBI. Falls (208 cases, 47.4%) were the most frequent cause of injury, followed by traffic accidents (115 cases, 26.2%) and falls from height (73 cases, 16.6%). Acute subdural hematoma (174 cases, 39.6%) was the most frequent cranial CT finding on admission, and its frequency increased significantly with severity. A neurosurgical procedure was performed in 70 cases (15.9%), of which 15 (6.0%) were mild TBI and 18 (15.5%) were moderate TBI. The average hospital stay was 20.8 days, which increased significantly with severity. The overall rate of favorable outcome was 82.7%, and mortality was 8.2%; outcome deteriorated with severity. Some mild and moderate TBI cases deteriorated and required surgery or resulted in death. These findings suggest that cautious treatment is necessary even in the mild to moderate TBI cases often encountered in secondary medical service centers. (author)

  4. Current status of quality assurance of treatment planning systems

    International Nuclear Information System (INIS)

    Mijnheer, B.J.

    1997-01-01

    A review is given of the current status of quality assurance of treatment planning systems. At this moment only one comprehensive report is available. In order to review national activities a questionnaire has been distributed amongst national societies of medical physicists. From the 23 responding countries, 8 indicated that only limited efforts are underway, 8 answered that a working group is evaluating their specific national requirements while in 5 countries a document is drafted. The highlights of these reports have been summarized. (author)

  5. Optimized Bayesian dynamic advising theory and algorithms

    CERN Document Server

    Karny, Miroslav

    2006-01-01

    Written by one of the world's leading groups in the area of Bayesian identification, control, and decision making, this book provides the theoretical and algorithmic basis of optimized probabilistic advising. Starting from abstract ideas and formulations, and culminating in detailed algorithms, the book comprises a unified treatment of an important problem: the design of advisory systems supporting supervisors of complex processes. It introduces the theoretical and algorithmic basis of developed advising, relying on a novel and powerful combination of black-box modelling with dynamic mixture models.

  6. Ivabradine: Current and Future Treatment of Heart Failure.

    Science.gov (United States)

    Thorup, Lene; Simonsen, Ulf; Grimm, Daniela; Hedegaard, Elise R

    2017-08-01

    In heart failure (HF), the heart cannot pump blood efficiently and is therefore unable to meet the body's demand for oxygen, and/or there is increased end-diastolic pressure. Current treatments for HF with reduced ejection fraction (HFrEF) include angiotensin-converting enzyme (ACE) inhibitors, angiotensin receptor type 1 (AT1) antagonists, β-adrenoceptor antagonists, aldosterone receptor antagonists, diuretics, digoxin, and a combination drug with an AT1 receptor antagonist and a neprilysin inhibitor. In HF, the risk of hospital readmission and mortality is markedly higher with a heart rate (HR) above 70 bpm. Here, we review the evidence regarding the use of ivabradine for lowering HR in HF. Ivabradine is a blocker of the funny current (If) channel and causes rate-dependent inhibition of the pacemaker activity in the sinoatrial node. In clinical trials of HFrEF, treatment with ivabradine seems to improve clinical outcome, for example improved ejection fraction (EF) and fewer hospital readmissions, but the effect appears most pronounced in patients with HRs above 70 bpm, while the effect on cardiovascular death appears less consistent. The adverse effects of ivabradine include bradycardia, atrial fibrillation and visual disturbances, but ivabradine avoids the negative inotropic effects observed with β-adrenoceptor antagonists. In conclusion, in patients with stable HFrEF with EF < 35% and HR above 70 bpm, ivabradine improves the outcome and might be a first choice of therapy if β-adrenoceptor antagonists are not tolerated. Further studies must show whether this can be extended to HF patients with preserved EF. © 2017 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  7. Current Treatments of Bruxism

    OpenAIRE

    Guaita, Marc; Högl, Birgit

    2016-01-01

    Opinion statement: Despite numerous case reports, the evidence for treatment of bruxism is still low. Different treatment modalities (behavioral techniques, intraoral devices, medications, and contingent electrical stimulation) have been applied. A clinical evaluation is needed to differentiate between awake bruxism and sleep bruxism and to rule out any medical disorder or medication that could be behind its appearance (secondary bruxism). A polysomnography is required only in a few cases of sleep bruxism.

  8. Cloud detection algorithm comparison and validation for operational Landsat data products

    Science.gov (United States)

    Foga, Steven Curtis; Scaramuzza, Pat; Guo, Song; Zhu, Zhe; Dilley, Ronald; Beckmann, Tim; Schmidt, Gail L.; Dwyer, John L.; Hughes, MJ; Laue, Brady

    2017-01-01

    Clouds are a pervasive and unavoidable issue in satellite-borne optical imagery. Accurate, well-documented, and automated cloud detection algorithms are necessary to effectively leverage large collections of remotely sensed data. The Landsat project is uniquely suited for comparative validation of cloud assessment algorithms because the modular architecture of the Landsat ground system allows for quick evaluation of new code, and because Landsat has the most comprehensive manual truth masks of any current satellite data archive. Currently, the Landsat Level-1 Product Generation System (LPGS) uses separate algorithms for determining clouds, cirrus clouds, and snow and/or ice probability on a per-pixel basis. With more bands onboard the Landsat 8 Operational Land Imager (OLI)/Thermal Infrared Sensor (TIRS) satellite, and a greater number of cloud masking algorithms, the U.S. Geological Survey (USGS) is replacing the current cloud masking workflow with a more robust algorithm that is capable of working across multiple Landsat sensors with minimal modification. Because of the inherent error from stray light and intermittent data availability of TIRS, these algorithms need to operate both with and without thermal data. In this study, we created a workflow to evaluate cloud and cloud shadow masking algorithms using cloud validation masks manually derived from both Landsat 7 Enhanced Thematic Mapper Plus (ETM+) and Landsat 8 OLI/TIRS data. We created a new validation dataset consisting of 96 Landsat 8 scenes, representing different biomes and proportions of cloud cover. We evaluated algorithm performance by overall accuracy, omission error, and commission error for both cloud and cloud shadow. We found that CFMask, C code based on the Function of Mask (Fmask) algorithm, and its confidence bands have the best overall accuracy among the many algorithms tested using our validation data. The Artificial Thermal-Automated Cloud Cover Algorithm (AT-ACCA) is the most accurate
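The evaluation metrics named above (overall accuracy, omission error, commission error) can be computed from a predicted mask and a manual truth mask as follows; this is a generic sketch of the standard definitions, not the USGS validation code:

```python
import numpy as np

def mask_metrics(pred, truth):
    """Per-class agreement metrics for scoring a cloud (or cloud shadow)
    mask against a manual truth mask: overall accuracy, omission error
    (truth pixels missed), and commission error (detections that are wrong)."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = np.sum(pred & truth)    # correctly detected
    fp = np.sum(pred & ~truth)   # false detections
    fn = np.sum(~pred & truth)   # missed truth pixels
    tn = np.sum(~pred & ~truth)  # correctly clear
    return {
        "overall_accuracy": (tp + tn) / pred.size,
        "omission_error": fn / max(tp + fn, 1),     # fraction of truth missed
        "commission_error": fp / max(tp + fp, 1),   # fraction of detections wrong
    }
```

The same function is applied once with cloud masks and once with cloud-shadow masks, and scene-level scores are then aggregated over the validation dataset.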

  9. Current treatment of Graves' disease

    International Nuclear Information System (INIS)

    Harada, T.; Shimaoka, K.; Mimura, T.; Ito, K.

    1987-01-01

    In this review we describe the rationale for the appropriate treatment of patients with Graves' disease. Because the etiology of this disorder remains obscure, its management remains controversial. Since antithyroid drugs and radioiodine became readily available in the early 1950s, they have been widely used for the treatment of thyrotoxicosis, and the number of cases treated surgically has markedly decreased. However, almost four decades of experience have disclosed an unexpectedly high incidence of delayed hypothyroidism after radioiodine treatment and a low remission rate after antithyroid therapy. As a result, surgery is again being advocated as the treatment of choice. The three treatment modalities have different advantages and disadvantages, and selection of treatment is important. In principle, we believe that for most patients a subtotal thyroidectomy should be performed after the patient has been rendered euthyroid by antithyroid drugs. We attempt to leave a thyroid remnant of 6 to 8 gm. 36 references.

  10. Individualized model predicts brain current flow during transcranial direct-current stimulation treatment in responsive stroke patient.

    Science.gov (United States)

    Datta, Abhishek; Baker, Julie M; Bikson, Marom; Fridriksson, Julius

    2011-07-01

    Although numerous published reports have demonstrated the beneficial effects of transcranial direct-current stimulation (tDCS) on task performance, fundamental questions remain regarding the optimal electrode configuration on the scalp. Moreover, it is expected that lesioned brain tissue will influence current flow and should therefore be considered (and perhaps leveraged) in the design of individualized tDCS therapies for stroke. The current report demonstrates how different electrode configurations influence the flow of electrical current through brain tissue in a patient who responded positively to a tDCS treatment targeting aphasia. The patient, a 60-year-old man, sustained a left hemisphere ischemic stroke (lesion size = 87.42 mL) 64 months before his participation. In this study, we present results from the first high-resolution (1 mm³) model of tDCS in a brain with considerable stroke-related damage; the model was individualized for the patient, who received anodal tDCS to his left frontal cortex with the reference cathode electrode placed on his right shoulder. We modeled the resulting brain current flow and also considered three additional reference electrode positions: right mastoid, right orbitofrontal cortex, and a "mirror" configuration with the anode over the undamaged right cortex. Our results demonstrate the profound effect of lesioned tissue on resulting current flow and the ability to modulate the current pattern through the brain, including perilesional regions, through electrode montage design. The complexity of brain current flow modulation by detailed normal and pathologic anatomy suggests (1) that computational models are critical for the rational interpretation and design of individualized tDCS stroke therapy; and (2) that these models must accurately reproduce head anatomy, as shown here. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Tourette Syndrome and comorbid ADHD: current pharmacological treatment options.

    Science.gov (United States)

    Rizzo, Renata; Gulisano, Mariangela; Calì, Paola V; Curatolo, Paolo

    2013-09-01

    Attention Deficit Hyperactivity Disorder (ADHD) is the most common comorbid condition encountered in people with tics and Tourette Syndrome (TS). The co-occurrence of TS and ADHD is associated with higher psychopathological, social and academic impairment, and its management may represent a challenge for clinicians. We review recent advances in the management of patients with tics, Tourette Syndrome and comorbid Attention Deficit Hyperactivity Disorder. We searched peer-reviewed original medical publications (PUBMED 1990-2012) and included randomized, double-blind, controlled trials related to pharmacological treatment for tics and TS in children and adolescents with comorbid ADHD. "Tourette Syndrome" or "Tic" and "ADHD" were cross-referenced with the words "pharmacological treatment", "α-agonist", "psychostimulants", "selective norepinephrine reuptake inhibitor" and "antipsychotics". Three classes of drugs are currently used in the treatment of TS and comorbid ADHD: α-agonists (clonidine and guanfacine), stimulants (amphetamine enantiomers, methylphenidate enantiomers or slow-release preparations), and the selective norepinephrine reuptake inhibitor atomoxetine. It has recently been suggested that in a few selected cases partial dopamine agonists (aripiprazole) could be useful. Level A evidence supports the use of noradrenergic agents (clonidine). Reuptake inhibitors (atomoxetine) and stimulants (methylphenidate) could also be used for the treatment of TS and comorbid ADHD. Taking into account the risk-benefit profile, clonidine could be used as the first-line treatment. However, only a few studies meet rigorous quality criteria in terms of study design and methodology; most trials have low statistical power due to small sample size or short duration. Treatment should be "symptom targeted" and personalized for each patient. Copyright © 2013 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.

  12. Optimization of pulsed current GTAW process parameters for sintered hot forged AISI 4135 P/M steel welds by simulated annealing and genetic algorithm

    International Nuclear Information System (INIS)

    Joseph, Joby; Muthukumaran, S.

    2016-01-01

    Considerable improvements have occurred in materials processing, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel produced through powder metallurgy (P/M) was carried out, and the process parameters were optimized using Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters determining the tensile strength (TS) and percentage of elongation (% Elong) of the joint. The practical value of applying a genetic algorithm (GA) and simulated annealing (SA) to the PCGTAW process was validated by calculating the deviation between predicted and experimental welding process parameters
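As a sketch of how simulated annealing can search a weld-parameter space like this one, the toy example below anneals the four parameters named in the abstract (Ip, GFR, WS, Ib) against a hypothetical tensile-strength response surface. The quadratic objective, the bounds and the cooling schedule are all assumptions for illustration; the paper's fitted model is not given here.

```python
import math
import random

random.seed(42)

# Hypothetical stand-in for a response model mapping the four PCGTAW
# parameters to tensile strength; this quadratic surface and its optimum
# are purely illustrative, not the paper's fitted model.
def tensile_strength(ip, gfr, ws, ib):
    return (850 - (ip - 170) ** 2 / 50 - (gfr - 14) ** 2 / 4
            - (ws - 120) ** 2 / 80 - (ib - 80) ** 2 / 30)

BOUNDS = {"ip": (140, 200), "gfr": (10, 18), "ws": (90, 150), "ib": (60, 100)}

def neighbour(x):
    # Perturb one randomly chosen parameter, keeping it within its bounds.
    key = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[key]
    y = dict(x)
    y[key] = min(hi, max(lo, x[key] + random.uniform(-0.05, 0.05) * (hi - lo)))
    return y

def simulated_annealing(steps=5000, t0=10.0, cooling=0.999):
    x = {k: random.uniform(*BOUNDS[k]) for k in BOUNDS}
    best = x
    t = t0
    for _ in range(steps):
        y = neighbour(x)
        delta = tensile_strength(**y) - tensile_strength(**x)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta > 0 or random.random() < math.exp(delta / t):
            x = y
            if tensile_strength(**x) > tensile_strength(**best):
                best = x
        t *= cooling
    return best

best = simulated_annealing()
```

The same loop applies to minimizing the prediction-experiment deviation the abstract mentions; only the objective function changes.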

  13. Optimization of pulsed current GTAW process parameters for sintered hot forged AISI 4135 P/M steel welds by simulated annealing and genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Joby; Muthukumaran, S. [National Institute of Technology, Tamil Nadu (India)

    2016-01-15

    Considerable improvements have occurred in materials processing, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel produced through powder metallurgy (P/M) was carried out, and the process parameters were optimized using Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters determining the tensile strength (TS) and percentage of elongation (% Elong) of the joint. The practical value of applying a genetic algorithm (GA) and simulated annealing (SA) to the PCGTAW process was validated by calculating the deviation between predicted and experimental welding process parameters.

  14. Current Icing Product

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Current Icing Product (CIP) is an automatically-generated index suitable for depicting areas of potentially hazardous airframe icing. The CIP algorithm combines...

  15. Evaluation of a metal artifact reduction algorithm applied to post-interventional flat detector CT in comparison to pre-treatment CT in patients with acute subarachnoid haemorrhage

    International Nuclear Information System (INIS)

    Mennecke, Angelika; Svergun, Stanislav; Doerfler, Arnd; Struffert, Tobias; Scholz, Bernhard; Royalty, Kevin

    2017-01-01

    Metal artefacts can impair accurate diagnosis of haemorrhage using flat detector CT (FD-CT), especially after aneurysm coiling. Within this work we evaluate a prototype metal artefact reduction algorithm by comparison of the artefact-reduced and the non-artefact-reduced FD-CT images to pre-treatment FD-CT and multi-slice CT images. Twenty-five patients with acute aneurysmal subarachnoid haemorrhage (SAH) were selected retrospectively. FD-CT and multi-slice CT before endovascular treatment as well as FD-CT data sets after treatment were available for all patients. The algorithm was applied to post-treatment FD-CT. The effect of the algorithm was evaluated utilizing the pre-post concordance of a modified Fisher score, a subjective image quality assessment, the range of the Hounsfield units within three ROIs, and the pre-post slice-wise Pearson correlation. The pre-post concordance of the modified Fisher score, the subjective image quality, and the pre-post correlation of the ranges of the Hounsfield units were significantly higher for artefact-reduced than for non-artefact-reduced images. Within the metal-affected slices, the pre-post slice-wise Pearson correlation coefficient was higher for artefact-reduced than for non-artefact-reduced images. The overall diagnostic quality of the artefact-reduced images was improved and reached the level of the pre-interventional FD-CT images. The metal-unaffected parts of the image were not modified. (orig.)

  16. Evaluation of a metal artifact reduction algorithm applied to post-interventional flat detector CT in comparison to pre-treatment CT in patients with acute subarachnoid haemorrhage

    Energy Technology Data Exchange (ETDEWEB)

    Mennecke, Angelika; Svergun, Stanislav; Doerfler, Arnd; Struffert, Tobias [University of Erlangen-Nuremberg, Department of Neuroradiology, Erlangen (Germany); Scholz, Bernhard [Siemens Healthcare GmbH, Forchheim (Germany); Royalty, Kevin [Siemens Medical Solutions, USA, Inc., Hoffman Estates, IL (United States)

    2017-01-15

    Metal artefacts can impair accurate diagnosis of haemorrhage using flat detector CT (FD-CT), especially after aneurysm coiling. Within this work we evaluate a prototype metal artefact reduction algorithm by comparison of the artefact-reduced and the non-artefact-reduced FD-CT images to pre-treatment FD-CT and multi-slice CT images. Twenty-five patients with acute aneurysmal subarachnoid haemorrhage (SAH) were selected retrospectively. FD-CT and multi-slice CT before endovascular treatment as well as FD-CT data sets after treatment were available for all patients. The algorithm was applied to post-treatment FD-CT. The effect of the algorithm was evaluated utilizing the pre-post concordance of a modified Fisher score, a subjective image quality assessment, the range of the Hounsfield units within three ROIs, and the pre-post slice-wise Pearson correlation. The pre-post concordance of the modified Fisher score, the subjective image quality, and the pre-post correlation of the ranges of the Hounsfield units were significantly higher for artefact-reduced than for non-artefact-reduced images. Within the metal-affected slices, the pre-post slice-wise Pearson correlation coefficient was higher for artefact-reduced than for non-artefact-reduced images. The overall diagnostic quality of the artefact-reduced images was improved and reached the level of the pre-interventional FD-CT images. The metal-unaffected parts of the image were not modified. (orig.)

  17. The promise of ketamine for treatment-resistant depression: current evidence and future directions

    Science.gov (United States)

    DeWilde, Kaitlin E.; Levitch, Cara F.; Murrough, James W.; Mathew, Sanjay J.; Iosifescu, Dan V.

    2014-01-01

    Major depressive disorder (MDD) is one of the most disabling diseases worldwide and is a significant public health threat. Current treatments for MDD primarily consist of monoamine-targeting agents and have limited efficacy. However, the glutamate neurotransmitter system has recently come into focus as a promising alternative for novel antidepressant treatments. We review the current data on the glutamate NMDA receptor antagonist ketamine, which has been shown in clinical trials to act as a rapid antidepressant in MDD. We also examine ketamine efficacy on dimensions of psychopathology, including anhedonia, cognition, and suicidality, consistent with the NIMH Research Domain Criteria (RDoC) initiative. Other aspects of ketamine reviewed in this paper include safety and efficacy, different administration methods, and the risks of misuse of ketamine outside of medical settings. Finally, we conclude with a discussion of glutamatergic agents other than ketamine currently being tested as novel antidepressants. PMID:25649308

  18. Algorithmic parameterization of mixed treatment comparisons

    NARCIS (Netherlands)

    G. van Valkenhoef (Gert); T. Tervonen (Tommi); B. de Brock (Bert)

    2012-01-01

    textabstractMixed Treatment Comparisons (MTCs) enable the simultaneous meta-analysis (data pooling) of networks of clinical trials comparing ≥2 alternative treatments. Inconsistency models are critical in MTC to assess the overall consistency between evidence sources. Only in the absence of

  19. Evolutionary algorithms applied to Landau-gauge fixing

    International Nuclear Information System (INIS)

    Markham, J.F.

    1998-01-01

    Current algorithms used to put a lattice gauge configuration into Landau gauge either suffer from the problem of critical slowing-down or involve additional computational expense to overcome it. Evolutionary Algorithms (EAs), which have been widely applied to other global optimisation problems, may be of use in gauge fixing. Also, being global, they should not suffer from critical slowing-down as local gradient-based algorithms do. We apply EAs and also a Steepest Descent (SD) based method to the problem of Landau gauge fixing and compare their performance. (authors)

  20. Current status of brachytherapy in cancer treatment – short overview

    Directory of Open Access Journals (Sweden)

    Janusz Skowronek

    2017-12-01

    Full Text Available Cancer incidence and mortality depend on a number of factors, including age, socio-economic status and geographical location, and cancer prevalence is growing around the world. Most cancer treatments include external beam radiotherapy or brachytherapy. Brachytherapy, a type of radiotherapy with energy from radionuclides inserted directly into the tumor, is increasingly used in cancer treatment. For cervical and skin cancers, it has been a standard therapy for more than 100 years, and it is an important part of the treatment guidelines for other malignancies, including head and neck, breast, and prostate cancers. Compared to external beam radiotherapy, brachytherapy has the potential to deliver an ablative radiation dose over a short period of time directly to the altered tissue area, with the advantage of a rapid fall-off in dose and, consequently, sparing of adjacent organs. As a result, the patient is able to complete the treatment earlier, and the risk of a second cancer is lower than in conventional radiotherapy treatment. Brachytherapy is increasingly used as a radical or palliative treatment and has become more advanced with the spread of pulsed-dose-rate and high-dose-rate afterloading machines; the use of new 3D/4D planning systems has additionally improved the quality of the treatment. The aim of the present study was to present short summaries of current studies on brachytherapy for the most frequently diagnosed tumors. Data presented in this manuscript should help especially young physicians or physicists to explore and introduce brachytherapy in cancer treatments.

  1. Pancreatic Cancer Diagnostics and TreatmentCurrent State

    Directory of Open Access Journals (Sweden)

    Zdeněk Krška

    2015-01-01

    Full Text Available Pancreatic ductal adenocarcinoma (PDAC) represents a permanent and ever-rising issue worldwide. Five-year survival does not exceed 3 to 6%, i.e., the worst result among solid tumours. The article evaluates the current state of PDAC diagnostics and treatment, outlining developments and trends. The percentage of non-resectable tumours due to locally advanced or metastatic disease varies from 60–80%, mostly over 80%. Survival with non-resectable PDAC is 4 to 8 months (median 3.5). In contrast, R0 resection yields a survival of 18–27 months. Laboratory and imaging screening methods are not indicated on a large scale. Risk factors are smoking, alcohol abuse, chronic pancreatitis and diabetes mellitus. The genetic background of most PDAC has not been identified yet. Some genes connected with a high risk of PDAC (e.g. BRCA2, PALB2) have been identified as significant and highly penetrant, but a link between PDAC and these genes is seen in only 10–20%. This article surveys prospective oncogenes, tumour suppressor genes and microRNA. Although CT is still favoured over other imaging methods, the use of NMR is rising. Surgery prefers the "vessel first" approach, which proves to be justified especially in R0 resection. According to EBM, neither immunotherapy nor radiotherapy is significant in PDAC treatment. Chemotherapy shows limited importance in conversion treatment of locally advanced or borderline tumours or in the case of metastatic spread. Unified procedures cannot be defined due to inhomogeneous study groups. Surgical resection is the only chance for curative treatment of PDAC and depends mainly on timely indication for surgery and the quality of the multidisciplinary team in a high-volume centre.

  2. Current Situation of Treatment for Anaphylaxis in a Japanese Pediatric Emergency Center.

    Science.gov (United States)

    Ninchoji, Takeshi; Iwatani, Sota; Nishiyama, Masahiro; Kamiyoshi, Naohiro; Taniguchi-Ikeda, Mariko; Morisada, Naoya; Ishibashi, Kazuto; Iijima, Kazumoto; Ishida, Akihito; Morioka, Ichiro

    2018-04-01

    Anaphylaxis is a systemic allergic reaction that sometimes requires prompt treatment with intramuscular adrenaline. The aim of the study was to investigate the current situation regarding anaphylaxis treatment in a representative pediatric primary emergency facility in Japan. We retrospectively examined the medical records dating from April 2011 through March 2014 from Kobe Children's Primary Emergency Medical Center, where general pediatricians work on a part-time basis. Clinical characteristics and current treatments for patients with anaphylaxis who presented to the facility were investigated. Furthermore, we compared the clinical characteristics between anaphylaxis patients given intramuscular adrenaline and those not given it. During the study period, 217 patients were diagnosed with anaphylaxis. The median Sampson grade at the time of visit was 2, and 90 patients (41%) were grade 4 or higher. No patients received self-injected intramuscular adrenaline before arrival at our emergency medical center because none of the patients had been prescribed it. Further treatment during the visit was provided to 128 patients (59%), with only 17 (8%) receiving intramuscular adrenaline. Patients given intramuscular adrenaline had significantly lower peripheral oxygen saturation at the visit (P = 0.025) and more frequent transfer to a referral hospital (P < 0.001) than those not given intramuscular adrenaline. Education for Japanese pediatric practitioners and patients is warranted, because no patients used self-injected intramuscular adrenaline as a prehospital treatment for anaphylaxis, and only severely affected patients who needed oxygen therapy or hospitalization received intramuscular adrenaline in a pediatric primary emergency setting.

  3. Algorithmic parameterization of mixed treatment comparisons

    NARCIS (Netherlands)

    van Valkenhoef, Gert; Tervonen, Tommi; de Brock, Bert; Hillege, Hans

    Mixed Treatment Comparisons (MTCs) enable the simultaneous meta-analysis (data pooling) of networks of clinical trials comparing a parts per thousand yen2 alternative treatments. Inconsistency models are critical in MTC to assess the overall consistency between evidence sources. Only in the absence

  4. Two-dimensional pencil beam scaling: an improved proton dose algorithm for heterogeneous media

    International Nuclear Information System (INIS)

    Szymanowski, Hanitra; Oelfke, Uwe

    2002-01-01

    New dose delivery techniques with proton beams, such as beam spot scanning or raster scanning, require fast and accurate dose algorithms which can be applied for treatment plan optimization in clinically acceptable timescales. The clinically required accuracy is particularly difficult to achieve for the irradiation of complex, heterogeneous regions of the patient's anatomy. Currently applied fast pencil beam dose calculations based on the standard inhomogeneity correction of pathlength scaling often cannot provide the accuracy required for clinically acceptable dose distributions. This could be achieved with sophisticated Monte Carlo simulations which are still unacceptably time consuming for use as dose engines in optimization calculations. We therefore present a new algorithm for proton dose calculations which aims to resolve the inherent problem between calculation speed and required clinical accuracy. First, a detailed derivation of the new concept, which is based on an additional scaling of the lateral proton fluence is provided. Then, the newly devised two-dimensional (2D) scaling method is tested for various geometries of different phantom materials. These include standard biological tissues such as bone, muscle and fat as well as air. A detailed comparison of the new 2D pencil beam scaling with the current standard pencil beam approach and Monte Carlo simulations, performed with GEANT, is presented. It was found that the new concept proposed allows calculation of absorbed dose with an accuracy almost equal to that achievable with Monte Carlo simulations while requiring only modestly increased calculation times in comparison to the standard pencil beam approach. It is believed that this new proton dose algorithm has the potential to significantly improve the treatment planning outcome for many clinical cases encountered in highly conformal proton therapy. (author)

  5. Current state of methodological and decisions for radiation treatment of blood, its components and products

    OpenAIRE

    Gordeev A.V.; Naumova L.A.; Kharitonov S.V.

    2014-01-01

    This article presents currently used blood transfusion media — components and blood products — as well as the therapeutic effects, reactions and complications of blood transfusion, and the use of radiation treatment for blood transfusion fluids. The practice of radiation processing of blood components for the prevention of the graft-versus-host reaction is discussed in detail, along with studies of plasma radiation treatment for its infectious safety. Also presented is the current state of techniques and tec...

  6. A Novel Hybrid Firefly Algorithm for Global Optimization.

    Directory of Open Access Journals (Sweden)

    Lina Zhang

    Full Text Available Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as the gradient-based methods often struggle to deal with such problems and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions are employed and these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original version of the firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate.
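The FA + DE hybridisation idea can be sketched compactly: each individual proposes a firefly-style move towards a brighter (better) solution and a DE-style mutation, and greedy selection keeps the best candidate. The sketch below runs on the sphere benchmark; the parameter values (attractiveness, F, CR, population size) are illustrative, not those reported in the paper.

```python
import random

random.seed(1)

# Sphere benchmark: a simple unimodal test function with minimum 0 at the origin.
def sphere(x):
    return sum(v * v for v in x)

DIM, NPOP, GENS = 5, 20, 200
LO, HI = -5.0, 5.0

def clip(v):
    return min(HI, max(LO, v))

pop = [[random.uniform(LO, HI) for _ in range(DIM)] for _ in range(NPOP)]

for _ in range(GENS):
    pop.sort(key=sphere)                         # brighter fireflies first
    new = []
    for i, x in enumerate(pop):
        # Firefly move: drift towards a no-worse firefly, plus small noise.
        j = random.randrange(i + 1)
        fa = [clip(xi + 0.5 * (pop[j][d] - xi) + 0.01 * random.uniform(-1, 1))
              for d, xi in enumerate(x)]
        # DE/rand/1 mutation with binomial crossover.
        a, b, c = random.sample(pop, 3)
        de = [clip(a[d] + 0.5 * (b[d] - c[d])) if random.random() < 0.9
              else x[d] for d in range(DIM)]
        # Greedy selection keeps the best of the original and both moves.
        new.append(min((x, fa, de), key=sphere))
    pop = new

best = min(pop, key=sphere)
```

Running both operators for every individual and selecting greedily is what lets the two search strategies share information within one population.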

  7. Novel prediction- and subblock-based algorithm for fractal image compression

    International Nuclear Information System (INIS)

    Chung, K.-L.; Hsu, C.-H.

    2006-01-01

    Fractal encoding is the most time-consuming part of fractal image compression. In this paper, a novel two-phase prediction- and subblock-based fractal encoding algorithm is presented. Initially, the original gray image is partitioned into a set of variable-size blocks according to the S-tree- and interpolation-based decomposition principle. In the first phase, each current variable-size range block tries to find the best-matched domain block based on the proposed prediction-based search strategy, which utilizes the relevant neighboring variable-size domain blocks. The first phase leads to a significant computation-saving effect. If the domain block found within the predicted search space is unacceptable, in the second phase, a subblock strategy is employed to partition the current variable-size range block into smaller blocks to improve the image quality. Experimental results show that our proposed prediction- and subblock-based fractal encoding algorithm outperforms the conventional full search algorithm and the recently published spatial-correlation-based algorithm by Truong et al. in terms of encoding time and image quality. In addition, the performance comparison among our proposed algorithm and two other algorithms, the no-search-based algorithm and the quadtree-based algorithm, is also investigated.

  8. Virtual patient 3D dose reconstruction using in air EPID measurements and a back-projection algorithm for IMRT and VMAT treatments.

    Science.gov (United States)

    Olaciregui-Ruiz, Igor; Rozendaal, Roel; van Oers, René F M; Mijnheer, Ben; Mans, Anton

    2017-05-01

    At our institute, a transit back-projection algorithm is used clinically to reconstruct in vivo patient and in phantom 3D dose distributions using EPID measurements behind a patient or a polystyrene slab phantom, respectively. In this study, an extension to this algorithm is presented whereby in air EPID measurements are used in combination with CT data to reconstruct 'virtual' 3D dose distributions. By combining virtual and in vivo patient verification data for the same treatment, patient-related errors can be separated from machine, planning and model errors. The virtual back-projection algorithm is described and verified against the transit algorithm with measurements made behind a slab phantom, against dose measurements made with an ionization chamber and with the OCTAVIUS 4D system, as well as against TPS patient data. Virtual and in vivo patient dose verification results are also compared. Virtual dose reconstructions agree within 1% with ionization chamber measurements. The average γ-pass rate values (3% global dose/3mm) in the 3D dose comparison with the OCTAVIUS 4D system and the TPS patient data are 98.5±1.9%(1SD) and 97.1±2.9%(1SD), respectively. For virtual patient dose reconstructions, the differences with the TPS in median dose to the PTV remain within 4%. Virtual patient dose reconstruction makes pre-treatment verification based on deviations of DVH parameters feasible and eliminates the need for phantom positioning and re-planning. Virtual patient dose reconstructions have additional value in the inspection of in vivo deviations, particularly in situations where CBCT data is not available (or not conclusive). Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
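The γ-pass rates quoted above come from gamma analysis. As a minimal sketch of the 3% global dose / 3 mm criterion the study uses, the example below computes a 1-D gamma pass rate on toy profiles; clinical evaluations are performed on 3-D dose grids, and nothing here is patient data.

```python
import numpy as np

# 1-D gamma analysis: each reference point passes if some evaluated point
# lies within the combined dose-difference / distance-to-agreement ellipse.
def gamma_pass_rate(ref, evl, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    dmax = ref.max()                                  # global normalisation dose
    pos = np.arange(len(ref)) * spacing_mm
    passed = 0
    for i, d_ref in enumerate(ref):
        dd = (evl - d_ref) / (dose_tol * dmax)        # dose-difference term
        dta = (pos - pos[i]) / dist_mm                # distance-to-agreement term
        gamma = np.sqrt(dd ** 2 + dta ** 2).min()     # minimise over evaluated points
        passed += gamma <= 1.0
    return 100.0 * passed / len(ref)

ref = np.exp(-(((np.arange(50) - 25) / 10.0) ** 2))       # toy Gaussian dose profile
shifted = np.exp(-(((np.arange(50) - 26) / 10.0) ** 2))   # same profile, 1 mm shift
rate = gamma_pass_rate(ref, shifted, spacing_mm=1.0)
```

A 1 mm shift passes everywhere under a 3 mm distance criterion, which is why spatial tolerance is reported alongside the dose tolerance.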

  9. Coastal Zone Color Scanner atmospheric correction algorithm - Multiple scattering effects

    Science.gov (United States)

    Gordon, Howard R.; Castano, Diego J.

    1987-01-01

    Errors due to multiple scattering which are expected to be encountered in application of the current Coastal Zone Color Scanner (CZCS) atmospheric correction algorithm are analyzed. The analysis is based on radiative transfer computations in model atmospheres, in which the aerosols and molecules are distributed vertically in an exponential manner, with most of the aerosol scattering located below the molecular scattering. A unique feature of the analysis is that it is carried out in scan coordinates rather than typical earth-sun coordinates, making it possible to determine the errors along typical CZCS scan lines. Information provided by the analysis makes it possible to judge the efficacy of the current algorithm with the current sensor and to estimate the impact of the algorithm-induced errors on a variety of applications.

  10. Emergency Department Management of Suspected Calf-Vein Deep Venous Thrombosis: A Diagnostic Algorithm

    Directory of Open Access Journals (Sweden)

    Levi Kitchen

    2016-06-01

    Full Text Available Introduction: Unilateral leg swelling with suspicion of deep venous thrombosis (DVT) is a common emergency department (ED) presentation. Proximal DVT (thrombus in the popliteal or femoral veins) can usually be diagnosed and treated at the initial ED encounter. When proximal DVT has been ruled out, isolated calf-vein deep venous thrombosis (IC-DVT) often remains a consideration. The current standard for the diagnosis of IC-DVT is whole-leg vascular duplex ultrasonography (WLUS), a test that is unavailable in many hospitals outside normal business hours. When WLUS is not available from the ED, recommendations for managing suspected IC-DVT vary. The objectives of the study are to use current evidence and recommendations to (1) propose a diagnostic algorithm for IC-DVT when definitive testing (WLUS) is unavailable; and (2) summarize the controversy surrounding IC-DVT treatment. Discussion: The Figure combines D-dimer testing with serial CUS or a single deferred FLUS for the diagnosis of IC-DVT. Such an algorithm has the potential to safely direct the management of suspected IC-DVT when definitive testing is unavailable. Whether or not to treat diagnosed IC-DVT remains widely debated and awaits further evidence. Conclusion: When IC-DVT is not ruled out in the ED, the suggested algorithm, although not prospectively validated by a controlled study, offers an approach to diagnosis that is consistent with current data and recommendations. When IC-DVT is diagnosed, current references suggest that a decision between anticoagulation and continued follow-up outpatient testing can be based on shared decision-making. The risks of proximal progression and life-threatening embolization should be balanced against the generally more benign natural history of such thrombi, and an individual patient's risk factors for both thrombus propagation and complications of anticoagulation. [West J Emerg Med. 2016;17(4):384-390.]

  11. Emergency Department Management of Suspected Calf-Vein Deep Venous Thrombosis: A Diagnostic Algorithm.

    Science.gov (United States)

    Kitchen, Levi; Lawrence, Matthew; Speicher, Matthew; Frumkin, Kenneth

    2016-07-01

    Unilateral leg swelling with suspicion of deep venous thrombosis (DVT) is a common emergency department (ED) presentation. Proximal DVT (thrombus in the popliteal or femoral veins) can usually be diagnosed and treated at the initial ED encounter. When proximal DVT has been ruled out, isolated calf-vein deep venous thrombosis (IC-DVT) often remains a consideration. The current standard for the diagnosis of IC-DVT is whole-leg vascular duplex ultrasonography (WLUS), a test that is unavailable in many hospitals outside normal business hours. When WLUS is not available from the ED, recommendations for managing suspected IC-DVT vary. The objectives of the study are to use current evidence and recommendations to (1) propose a diagnostic algorithm for IC-DVT when definitive testing (WLUS) is unavailable; and (2) summarize the controversy surrounding IC-DVT treatment. The Figure combines D-dimer testing with serial CUS or a single deferred FLUS for the diagnosis of IC-DVT. Such an algorithm has the potential to safely direct the management of suspected IC-DVT when definitive testing is unavailable. Whether or not to treat diagnosed IC-DVT remains widely debated and awaits further evidence. When IC-DVT is not ruled out in the ED, the suggested algorithm, although not prospectively validated by a controlled study, offers an approach to diagnosis that is consistent with current data and recommendations. When IC-DVT is diagnosed, current references suggest that a decision between anticoagulation and continued follow-up outpatient testing can be based on shared decision-making. The risks of proximal progression and life-threatening embolization should be balanced against the generally more benign natural history of such thrombi, and an individual patient's risk factors for both thrombus propagation and complications of anticoagulation.
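The branching logic the abstract describes (proximal scan, then D-dimer, then deferred or serial imaging) can be summarized in a few lines. The sketch below is a simplified illustration of that decision flow, not a validated clinical rule; the branch labels are paraphrases.

```python
# Simplified sketch of the proposed decision flow for suspected IC-DVT when
# whole-leg ultrasound (WLUS) is unavailable from the ED; illustrative only.
def ic_dvt_disposition(proximal_cus_positive, d_dimer_positive):
    if proximal_cus_positive:
        return "treat proximal DVT"
    if not d_dimer_positive:
        return "DVT ruled out; no further imaging"
    # Positive D-dimer with a negative proximal scan: IC-DVT not excluded.
    return "serial CUS or a single deferred whole-leg ultrasound"
```

Encoding the flow this way makes explicit that only the D-dimer-positive, proximal-negative branch leaves IC-DVT unresolved and requires follow-up imaging.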

  12. An improved VSS NLMS algorithm for active noise cancellation

    Science.gov (United States)

    Sun, Yunzhuo; Wang, Mingjiang; Han, Yufei; Zhang, Congyan

    2017-08-01

    In this paper, an improved variable step size NLMS algorithm is proposed. NLMS has a fast convergence rate and low steady-state error compared to other traditional adaptive filtering algorithms, but there is a trade-off between convergence speed and steady-state error that affects the performance of the NLMS algorithm. We propose a new variable step size NLMS algorithm that dynamically changes the step size according to the current error and the iteration count. The proposed algorithm has a simple formulation and easily set parameters, and effectively resolves the trade-off in NLMS. The simulation results show that the proposed algorithm achieves good tracking ability, a fast convergence rate and low steady-state error simultaneously.
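To illustrate the idea of an NLMS filter whose step size varies over time, the sketch below identifies an unknown FIR system with a step size that decays with the iteration index between two bounds. The specific schedule is an assumption for illustration; the paper's exact error-dependent rule is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# NLMS adaptive filter with an illustrative variable step size.
def vss_nlms(x, d, taps=8, mu_max=1.0, mu_min=0.05, eps=1e-8):
    w = np.zeros(taps)
    errors = np.empty(len(x))
    for n in range(len(x)):
        u = x[max(0, n - taps + 1):n + 1][::-1]       # most recent sample first
        u = np.pad(u, (0, taps - len(u)))
        e = d[n] - w @ u                              # a-priori output error
        mu = max(mu_min, mu_max / (1.0 + 0.01 * n))   # decaying, bounded step size
        w += mu * e * u / (eps + u @ u)               # normalised LMS update
        errors[n] = e
    return w, errors

# Identify a hypothetical unknown FIR system from noisy observations.
h = np.array([0.5, -0.3, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])
x = rng.standard_normal(4000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, errors = vss_nlms(x, d)
```

Starting near mu_max gives fast initial convergence; shrinking towards mu_min reduces steady-state error, which is the trade-off a variable step size is meant to resolve.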

  13. Multiobjective Genetic Algorithm applied to dengue control.

    Science.gov (United States)

    Florentino, Helenice O; Cantane, Daniela R; Santos, Fernando L P; Bannwart, Bettina F

    2014-12-01

    Dengue fever is an infectious disease caused by a virus of the Flaviviridae family and transmitted to people by the mosquito Aedes aegypti. This disease has become a global public health problem because a single mosquito can infect up to 300 people, and between 50 and 100 million people are infected annually on all continents. Thus, dengue fever is currently a subject of research, whether in the search for vaccines and treatments for the disease or for efficient and economical forms of mosquito control. The current study examines techniques of multiobjective optimization to assist in solving problems involving the control of the mosquito that transmits dengue fever. The population dynamics of the mosquito are studied in order to understand the epidemic phenomenon and to suggest multiobjective programming strategies for mosquito control. A Multiobjective Genetic Algorithm (MGA_DENGUE) is proposed to solve the optimization model treated here, and we discuss the computational results obtained from the application of this technique. Copyright © 2014 Elsevier Inc. All rights reserved.
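The core of any multiobjective GA is Pareto dominance: no single solution minimizes both control cost and residual mosquito population, so selection keeps the nondominated set. The toy sketch below uses two conflicting one-variable objectives as a stand-in for that trade-off; the model and operators are illustrative, not those of MGA_DENGUE.

```python
import random

random.seed(7)

# Two conflicting objectives of a scalar control effort u in [0, 2]:
# cost rises with effort, residual mosquito population falls with it.
def objectives(u):
    cost = u * u
    mosquitoes = 1.0 / (0.1 + u)
    return cost, mosquitoes

def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in one.
    fa, fb = objectives(a), objectives(b)
    return all(x <= y for x, y in zip(fa, fb)) and any(x < y for x, y in zip(fa, fb))

def pareto_front(pop):
    return [p for p in pop if not any(dominates(q, p) for q in pop)]

pop = [random.uniform(0.0, 2.0) for _ in range(40)]
for _ in range(100):
    # Variation: blend crossover of two random parents plus Gaussian mutation.
    child = 0.5 * (random.choice(pop) + random.choice(pop)) + random.gauss(0.0, 0.1)
    pop.append(min(2.0, max(0.0, child)))
    # Environmental selection: keep only nondominated points, capped at 40.
    pop = pareto_front(pop)
    if len(pop) > 40:
        pop = random.sample(pop, 40)

front = sorted(pareto_front(pop))
```

The output is not a single answer but a front of trade-off solutions, from which a decision-maker picks an acceptable cost/population compromise.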

  14. Astigmatism treatment during phacoemulsification: a review of current surgical strategies and their rationale

    Directory of Open Access Journals (Sweden)

    Giuliano de Oliveira Freitas

    2013-12-01

    Full Text Available Preexisting corneal astigmatism, present at the time of cataract surgery, is reviewed in detail throughout this article in its most important aspects, such as occurrence rates, clinical relevance, and current treatment options. Special emphasis is given to the latter. Each method's rationale, advantages, and limitations are highlighted. Comparisons between treatment options, wherever possible, are also provided.

  15. Advanced life support for cardiac arrest beyond the algorithm

    DEFF Research Database (Denmark)

    Rudolph, Søren Steemann; Isbye, Dan Lou; Pfeiffer, Peter

    2018-01-01

    In an advanced emergency medical service all parts of the advanced life support (ALS) algorithm can be provided. This evidence-based algorithm outlines resuscitative efforts for the first 10-15 minutes after cardiac arrest, after which the algorithm repeats itself. Restoration of spontaneous circulation fails in most cases, but in some circumstances the patient may benefit from additional interventional approaches, in which case transport to hospital with ongoing cardiopulmonary resuscitation is indicated. This paper summarizes treatments outside the ALS algorithm which may be beneficial…

  16. The Challenge in Diagnosis and Current Treatment of Chronic Thromboembolic Pulmonary Hypertension

    Directory of Open Access Journals (Sweden)

    Anggoro Budi Hartopo

    2017-04-01

    Full Text Available Chronic thromboembolic pulmonary hypertension (CTEPH) is currently underdiagnosed and consequently undertreated in clinical practice. Deficiencies in diagnostic modalities and treatment availability, especially in developing countries, make the diagnosis of CTEPH difficult to confirm. However, a high index of clinical suspicion for CTEPH will lead to proper diagnosis and correct treatment, with significant reductions in morbidity and mortality. Left untreated, the mean survival time is 6.8 years and the three-year mortality rate may be as high as 90%. The pathophysiology, diagnosis, and treatment of CTEPH need to be shared among internists and primary care physicians in order to improve the overall outcome of these patients.

  17. A Guiding Evolutionary Algorithm with Greedy Strategy for Global Optimization Problems

    Directory of Open Access Journals (Sweden)

    Leilei Cao

    2016-01-01

    Full Text Available A Guiding Evolutionary Algorithm (GEA) with a greedy strategy for global optimization problems is proposed. Inspired by Particle Swarm Optimization, the Genetic Algorithm, and the Bat Algorithm, the GEA was designed to retain some advantages of each method while avoiding some disadvantages. In contrast to the usual Genetic Algorithm, each individual in the GEA is crossed with the current global best individual rather than with a randomly selected one. The current best individual serves as a guide, attracting offspring to its region of genotype space. Mutation is applied to offspring according to a dynamic mutation probability. To increase the capability for exploitation, a local search mechanism is applied to new individuals according to a dynamic probability of local search. Experimental results show that the GEA outperformed the three typical global optimization algorithms with which it was compared.
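    The core mechanics described above (best-guided crossover, a decaying mutation probability, and greedy survivor selection) can be sketched compactly. The toy below minimizes the sphere function; the population size, mutation scale, and decay schedule are illustrative assumptions, and the local-search step of the full GEA is omitted:

```python
import random

random.seed(1)

def sphere(x):                      # test objective: minimize sum of squares
    return sum(v * v for v in x)

DIM, POP, GENS = 5, 30, 200
pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]

def mutate(ind, p):
    return [v + random.gauss(0, 0.3) if random.random() < p else v for v in ind]

for gen in range(GENS):
    pop.sort(key=sphere)
    best = pop[0]                   # current global best guides all offspring
    p_mut = 0.5 * (1 - gen / GENS)  # dynamic mutation probability (assumed schedule)
    offspring = []
    for ind in pop:
        # Crossover with the global best instead of a random partner.
        child = [b if random.random() < 0.5 else v for b, v in zip(best, ind)]
        offspring.append(mutate(child, p_mut))
    # Greedy survivor selection: keep the better of parent and child.
    pop = [c if sphere(c) < sphere(p) else p for c, p in zip(offspring, pop)]

best_val = sphere(min(pop, key=sphere))
print(best_val)
```

    The greedy parent-versus-child comparison guarantees monotone improvement per slot, while the best-guided crossover pulls the whole population toward the incumbent optimum.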

  18. The Chandra Source Catalog: Algorithms

    Science.gov (United States)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial, and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.

  19. Scalable algorithms for contact problems

    CERN Document Server

    Dostál, Zdeněk; Sadowská, Marie; Vondrák, Vít

    2016-01-01

    This book presents a comprehensive and self-contained treatment of the authors’ newly developed scalable algorithms for the solutions of multibody contact problems of linear elasticity. The brand new feature of these algorithms is theoretically supported numerical scalability and parallel scalability demonstrated on problems discretized by billions of degrees of freedom. The theory supports solving multibody frictionless contact problems, contact problems with possibly orthotropic Tresca’s friction, and transient contact problems. It covers BEM discretization, jumping coefficients, floating bodies, mortar non-penetration conditions, etc. The exposition is divided into four parts, the first of which reviews appropriate facets of linear algebra, optimization, and analysis. The most important algorithms and optimality results are presented in the third part of the volume. The presentation is complete, including continuous formulation, discretization, decomposition, optimality results, and numerical experiments…

  20. Medical treatment of radiation injuries-Current US status

    Energy Technology Data Exchange (ETDEWEB)

    Jarrett, D.G. [OSA - CBD and CDP, 3050 Defense Pentagon, Room 3C257, Washington, DC 20301-3050 (United States)], E-mail: david.jarrett@us.army.mil; Sedlak, R.G.; Dickerson, W.E. [Uniformed Services University, Armed Forces Radiobiology Research Institute, 8901 Wisconsin Avenue, Bethesda, MD 20889-5603 (United States); Reeves, G.I. [Northrop Grumman IT, 8211 Terminal Road, Lorton, VA 22079-1421 (United States)

    2007-07-15

    A nuclear incident or major release of radioactive materials likely would result in vast numbers of patients, many of whom would require novel therapy. Fortunately, the numbers of radiation victims in the United States (USA) have been limited to date. If a mass-casualty situation occurs, there will be a need to perform rapid, accurate dose estimates and to provide appropriate medications and other treatment to ameliorate radiation injury. The medical management of radiation injury is complex. Radiation injury may include acute radiation sickness (ARS) from external and/or internal radiation exposure, internal organ damage from incorporated radioactive isotopes, and cutaneous injury. Human and animal data have shown that optimal medical care may nearly double the survivable dose of ionizing radiation. Current treatment strategies for radiation injuries are discussed with concentration on the medical management of the hematopoietic syndrome. In addition, priority areas for continuing and future research into both acute deterministic injuries and also long-term stochastic sequelae of radiation exposure have been identified. There are several near-term novel therapies that appear to offer excellent prognosis for radiation casualties, and these are also described.

  1. Osteoarthritis: detection, pathophysiology, and current/future treatment strategies.

    Science.gov (United States)

    Sovani, Sujata; Grogan, Shawn P

    2013-01-01

    Osteoarthritis (OA) is a disease of the joint, and age is the major risk factor for its development. Clinical manifestation of OA includes joint pain, stiffness, and loss of mobility. Currently, no pharmacological treatments are available to treat this specific joint disease; only symptom-modifying drugs are available. Improvement in imaging technology, identification of biomarkers, and increased understanding of the molecular basis of OA will aid in detecting the early stages of disease. Yet the development of interventional strategies remains elusive and will be critical for effective prevention of OA-associated joint destruction. The potential of cell-based therapies may be applicable in improving joint function in mild to more advanced cases of OA. Ongoing studies to understand the basis of this disease will eventually lead to prevention and treatment strategies and will also be a key in reducing the social and economic burden of this disease. Nurses are advised to provide an integrative approach of disease assessment and management in OA patients' care with a focus on education and implementation. Knowledge and understanding of OA and how this affects the individual patient form the basis for such an integrative approach to all-round patient care and disease management.

  2. Comparison of two heterogeneity correction algorithms in pituitary gland treatments with intensity-modulated radiation therapy; Comparacao de dois algoritmos de correcao de heterogeneidade em tratamentos de tumores de hipofise com radioterapia de intensidade modulada

    Energy Technology Data Exchange (ETDEWEB)

    Albino, Lucas D.; Santos, Gabriela R.; Ribeiro, Victor A.B.; Rodrigues, Laura N., E-mail: lucasdelbem1@gmail.com [Universidade de Sao Paulo (USP), Sao Paulo, SP (Brazil). Faculdade de Medicina. Instituto de Radiologia; Weltman, Eduardo; Braga, Henrique F. [Instituto do Cancer do Estado de Sao Paulo, Sao Paulo, SP (Brazil). Servico de Radioterapia

    2013-12-15

    The dose accuracy calculated by a treatment planning system is directly related to the chosen algorithm. Nowadays, several dose calculation algorithms are commercially available; they differ in calculation time and accuracy, especially when individual tissue densities are taken into account. The aim of this study was to compare two different calculation algorithms from iPlan®, BrainLAB, in the treatment of pituitary gland tumors with intensity-modulated radiation therapy (IMRT). These tumors are located in a region of tissues with variable electronic density. The deviations from the plan with no heterogeneity correction were evaluated. For initial validation of the data inserted into the planning system, an IMRT plan was simulated in an anthropomorphic phantom and the dose distribution was measured with a radiochromic film. Gamma analysis was performed on the film, comparing it with dose distributions calculated with the X-ray Voxel Monte Carlo (XVMC) algorithm and the pencil beam convolution (PBC) algorithm. Next, 33 patient plans, initially calculated with the PBC algorithm, were recalculated with the XVMC algorithm. The treatment-volume and organ-at-risk dose-volume histograms were compared. No relevant differences were found in the dose-volume histograms between XVMC and PBC. However, differences were obtained when comparing each plan with the plan without heterogeneity correction. (author)
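    The gamma analysis used to compare measured film with calculated dose combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D sketch of the standard global gamma index (3%/3 mm, on toy Gaussian profiles rather than the study's data) looks like this:

```python
import numpy as np

def gamma_1d(ref, ev, spacing_mm, dose_crit=0.03, dist_mm=3.0):
    """1D global gamma index: for each reference point, search the evaluated
    distribution for the minimum combined dose/distance deviation."""
    pos = np.arange(len(ref)) * spacing_mm
    d_max = ref.max()                       # global normalization dose
    gammas = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(pos, ref)):
        dose_term = (ev - di) / (dose_crit * d_max)
        dist_term = (pos - xi) / dist_mm
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

# Toy profiles: a Gaussian "measured" profile vs. a slightly shifted plan.
x = np.arange(100)
ref = np.exp(-((x - 50) / 12.0) ** 2)
ev = np.exp(-((x - 50.4) / 12.0) ** 2)      # 0.4 mm shift at 1 mm spacing

g = gamma_1d(ref, ev, spacing_mm=1.0)
pass_rate = (g <= 1.0).mean() * 100
print(f"gamma pass rate: {pass_rate:.1f}%")
```

    A point passes when its gamma value is at most 1, i.e. it lies within the combined 3% dose / 3 mm distance ellipse of some evaluated point; clinical film comparisons extend the same search to 2D.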

  3. Multiobjective optimization with a modified simulated annealing algorithm for external beam radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Aubry, Jean-Francois; Beaulieu, Frederic; Sevigny, Caroline; Beaulieu, Luc; Tremblay, Daniel

    2006-01-01

    Inverse planning in external beam radiotherapy often requires a scalar objective function that incorporates importance factors to mimic the planner's preferences between conflicting objectives. Defining those importance factors is not straightforward, and it frequently leads to an iterative process in which the importance factors become variables of the optimization problem. In order to avoid this drawback of inverse planning, optimization using algorithms more suited to multiobjective optimization, such as evolutionary algorithms, has been suggested. However, much inverse planning software, including one package based on simulated annealing developed at our institution, does not include multiobjective-oriented algorithms. This work investigates the performance of a modified simulated annealing algorithm used to drive aperture-based intensity-modulated radiotherapy inverse planning software in a multiobjective optimization framework. For a few test cases involving gastric cancer patients, the use of this new algorithm leads to an increase in optimization speed of a little more than a factor of 2 over a conventional simulated annealing algorithm, while giving a close approximation of the solutions produced by standard simulated annealing. A simple graphical user interface designed to facilitate the decision-making process that follows an optimization is also presented.
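    One way to avoid committing to fixed importance factors, in the spirit described above, is to resample the trade-off weight inside the annealing loop and archive every nondominated solution encountered. The sketch below is a deliberately tiny stand-in (one variable, two conflicting quadratic objectives), not the authors' aperture-based planning code:

```python
import math, random

random.seed(4)

def f1(x): return x * x            # toy stand-in for, e.g., target-coverage penalty
def f2(x): return (x - 2) ** 2     # toy stand-in for an organ-at-risk penalty

def dominates(a, b):
    return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))

archive = []                       # nondominated solutions found so far
x, T = random.uniform(-5, 5), 2.0
for step in range(5000):
    w = random.random()            # resample the trade-off weight each step
    cost = lambda v: w * f1(v) + (1 - w) * f2(v)
    cand = x + random.gauss(0, 0.5)
    delta = cost(cand) - cost(x)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = cand                   # Metropolis acceptance
    obj = (f1(x), f2(x))
    if obj not in archive and not any(dominates(a, obj) for a in archive):
        archive = [a for a in archive if not dominates(obj, a)] + [obj]
    T = max(0.01, T * 0.999)       # geometric cooling schedule (assumed)

print(len(archive))
```

    Because each annealing step optimizes a differently weighted scalarization, the archive accumulates an approximation of the Pareto front in a single run, which is what the planner then inspects instead of tuning importance factors by hand.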

  4. Evaluating a patient's request for life-prolonging treatment: an ethical framework.

    Science.gov (United States)

    Winkler, Eva C; Hiddemann, Wolfgang; Marckmann, Georg

    2012-11-01

    Contrary to the widespread concern about over-treatment at the end of life, today, patient preferences for palliative care at the end of life are frequently respected. However, ethically challenging situations in the current healthcare climate are, instead, situations in which a competent patient requests active treatment with the goal of life-prolongation while the physician suggests best supportive care only. The argument of futility has often been used to justify unilateral decisions made by physicians to withhold or withdraw life-sustaining treatment. However, we argue that neither the concept of futility nor that of patient autonomy alone is apt for resolving situations in which physicians are confronted with patients' requests for active treatment. Instead, we integrate the relevant arguments that have been put forward in the academic discussion about 'futile' treatment into an ethical algorithm with five guiding questions: (1) Is there a chance that medical intervention will be effective in achieving the patient's treatment goal? (2) How does the physician evaluate the expected benefit and the potential harm of the treatment? (3) Does the patient understand his or her medical situation? (4) Does the patient prefer receiving treatment after evaluating the benefit-harm ratio and the costs? (5) Does the treatment require many resources? This algorithm shall facilitate approaching patients' requests for treatments deemed futile by the physician in a systematic way, and responding to these requests in an ethically appropriate manner. It thereby adds substantive considerations to the current procedural approaches of conflict resolution in order to improve decision making among physicians, patients and families.

  5. Evaluation of train-speed control algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Slavik, M.M. [BKS Advantech (Pty.) Ltd., Pretoria (South Africa)

    2000-07-01

    A relatively simple and fast simulator has been developed and used for the preliminary testing of train cruise-control algorithms. The simulation is done in software on a PC. The simulator is used to gauge the consequences and feasibility of a cruise-control strategy prior to more elaborate testing and evaluation. The tool was used to design and pre-test a train-cruise control algorithm called NSS, which does not require knowledge of exact train mass, vertical alignment, or actual braking force. Only continuous measurements on the speed of the train and electrical current are required. With this modest input, the NSS algorithm effected speed changes smoothly and efficiently for a wide range of operating conditions. (orig.)

  6. BALL - biochemical algorithms library 1.3

    Directory of Open Access Journals (Sweden)

    Stöckel Daniel

    2010-10-01

    Full Text Available Abstract Background The Biochemical Algorithms Library (BALL) is a comprehensive rapid application development framework for structural bioinformatics. It provides an extensive C++ class library of data structures and algorithms for molecular modeling and structural bioinformatics. Using BALL as a programming toolbox not only greatly reduces application development times but also helps ensure stability and correctness by avoiding the error-prone reimplementation of complex algorithms, replacing them with calls into a library that has been well tested by a large number of developers. In the ten years since its original publication, BALL has seen a substantial increase in functionality and numerous other improvements. Results Here, we discuss BALL's current functionality and highlight the key additions and improvements: support for additional file formats, molecular edit functionality, new molecular mechanics force fields, novel energy minimization techniques, docking algorithms, and support for cheminformatics. Conclusions BALL is available for all major operating systems, including Linux, Windows, and MacOS X. It is available free of charge under the Lesser GNU Public License (LGPL); parts of the code are distributed under the GNU Public License (GPL). BALL is available as source code and binary packages from the project web site at http://www.ball-project.org. Recently, it has been accepted into the Debian project; integration into further distributions is currently being pursued.

  7. The (1+λ) evolutionary algorithm with self-adjusting mutation rate

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Witt, Carsten; Gießen, Christian

    2017-01-01

    We propose a new way to self-adjust the mutation rate in population-based evolutionary algorithms. Roughly speaking, it consists of creating half the offspring with a mutation rate that is twice the current mutation rate and the other half with half the current rate. The mutation rate is then updated to the rate used in the subpopulation which contains the best offspring. We analyze how the (1 + λ) evolutionary algorithm with this self-adjusting mutation rate optimizes the OneMax test function. We prove that this dynamic version of the (1 + λ) EA finds the optimum in an expected optimization time (number of fitness evaluations) of O(nλ/log λ + n log n). This time is asymptotically smaller than the optimization time of the classic (1 + λ) EA. Previous work shows that this performance is best-possible among all λ-parallel mutation-based unbiased black-box algorithms. This result shows…
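    The two-rates mechanism is simple enough to sketch directly on OneMax. The snippet below is a simplified illustration (it adopts the rate of the single best offspring and clamps the rate to an assumed range, omitting the random tie-breaking of the analyzed algorithm):

```python
import random

random.seed(0)

def one_max(bits):
    return sum(bits)

def self_adjusting_one_plus_lambda(n=50, lam=10):
    x = [random.randint(0, 1) for _ in range(n)]
    r = 2.0                                    # current mutation strength (rate = r/n)
    evals = 0
    while one_max(x) < n:
        best, best_fit, best_r = None, -1, r
        for i in range(lam):
            # Half the offspring mutate at twice the current rate,
            # the other half at half the current rate.
            ri = min(2 * r, n / 4) if i < lam // 2 else max(r / 2, 0.5)
            y = [b ^ (random.random() < ri / n) for b in x]   # flip bits w.p. ri/n
            evals += 1
            if one_max(y) > best_fit:
                best, best_fit, best_r = y, one_max(y), ri
        if best_fit >= one_max(x):             # elitist acceptance
            x = best
        r = best_r                             # adopt the winning rate
    return evals, x

evals, x = self_adjusting_one_plus_lambda()
print(evals, sum(x))
```

    Early on, the doubled rate tends to win and `r` grows; near the optimum, large mutations are almost always harmful, so the halved rate wins and `r` shrinks back toward its floor, which is exactly the adaptive behavior the runtime analysis exploits.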

  8. A Turn-Projected State-Based Conflict Resolution Algorithm

    Science.gov (United States)

    Butler, Ricky W.; Lewis, Timothy A.

    2013-01-01

    State-based conflict detection and resolution (CD&R) algorithms detect conflicts and resolve them on the basis of current state information, without the use of additional intent information from aircraft flight plans. The prediction of aircraft trajectories is therefore based solely upon the position and velocity vectors of the traffic aircraft. Most CD&R algorithms project the traffic state using only the current state vectors. However, past state vectors can be used to make a better prediction of the future trajectory of the traffic aircraft. This paper explores the idea of using past state vectors to detect traffic turns and to resolve conflicts caused by these turns using a non-linear projection of the traffic state. A new algorithm based on this idea is presented and validated using a fast-time simulator developed for this study.
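    The essential step, estimating a turn rate from past states and projecting along a circular arc instead of a straight line, can be illustrated in 2D. This is a generic sketch of the idea, not the paper's algorithm; the turn rate is taken from two heading samples `dt` seconds apart:

```python
import math

def project_turn(p1, h1, p2, h2, dt, t_ahead):
    """Project a turning aircraft forward along a circular arc.

    p1/p2: past and current (x, y) positions; h1/h2: headings (rad)
    observed dt seconds apart. Falls back to a straight-line projection
    when the estimated turn rate is negligible."""
    omega = (h2 - h1) / dt                 # estimated turn rate (rad/s)
    speed = math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / dt
    if abs(omega) < 1e-6:                  # straight flight: linear projection
        return (p2[0] + speed * t_ahead * math.cos(h2),
                p2[1] + speed * t_ahead * math.sin(h2))
    radius = speed / omega                 # signed turn radius
    # The turn-circle center sits perpendicular to the current heading.
    cx = p2[0] - radius * math.sin(h2)
    cy = p2[1] + radius * math.cos(h2)
    h_new = h2 + omega * t_ahead
    return (cx + radius * math.sin(h_new), cy - radius * math.cos(h_new))

# Demo: aircraft in a steady 0.02 rad/s turn at roughly 100 m/s.
R, om = 5000.0, 0.02
pos = lambda t: (R * math.sin(om * t), R * (1 - math.cos(om * t)))
pred = project_turn(pos(0), 0.0, pos(1), om * 1.0, dt=1.0, t_ahead=10.0)
true = pos(11)
print(pred, true)
```

    A purely linear projection of the same trajectory would miss the true position by tens of meters over the 10 s horizon, which is the error the turn-aware projection removes.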

  9. Sensitivity of NTCP parameter values against a change of dose calculation algorithm

    International Nuclear Information System (INIS)

    Brink, Carsten; Berg, Martin; Nielsen, Morten

    2007-01-01

    Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have retrospectively been recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models
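    As a concrete example of the kind of NTCP model whose parameters such a refit would shift, the widely used Lyman-Kutcher-Burman (LKB) formulation maps an equivalent uniform dose through a probit curve with parameters TD50 and m. The parameter values below are illustrative assumptions for radiation pneumonitis, not the fitted values from this study:

```python
import math

def lkb_ntcp(eud_gy, td50_gy, m):
    """Lyman-Kutcher-Burman NTCP: probit of the normalized distance
    between the (generalized) equivalent uniform dose and TD50."""
    t = (eud_gy - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative parameters (assumed): TD50 = 30.8 Gy, m = 0.37.
td50, m = 30.8, 0.37
print(lkb_ntcp(td50, td50, m))   # EUD equal to TD50 gives 50% complication probability
print(lkb_ntcp(40.0, td50, m))
```

    Because a different dose algorithm shifts the computed EUD for the same plan, keeping the published (TD50, m) while switching algorithms changes the predicted NTCP, which is why the parameters had to be refitted in this work.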

  10. Effective treatment options for musculoskeletal pain in primary care: A systematic overview of current evidence

    Science.gov (United States)

    Hill, Jonathan C.; Foster, Nadine E.; Protheroe, Joanne

    2017-01-01

    Background & aims Musculoskeletal pain, the most common cause of disability globally, is most frequently managed in primary care. People with musculoskeletal pain in different body regions share similar characteristics and prognosis, and may respond to similar treatments. This overview aims to summarise current best evidence on available treatment options for the five most common musculoskeletal pain presentations (back, neck, shoulder, knee and multi-site pain) in primary care. Methods A systematic search was conducted. Initial searches identified clinical guidelines, clinical pathways and systematic reviews. Additional searches found recently published trials and those addressing gaps in the evidence base. Data on study populations, interventions, and outcomes of intervention on pain and function were extracted. Quality of systematic reviews was assessed using AMSTAR, and strength of evidence rated using a modified GRADE approach. Results Moderate to strong evidence suggests that exercise therapy and psychosocial interventions are effective for relieving pain and improving function in musculoskeletal pain. NSAIDs and opioids reduce pain in the short term, but the effect size is modest and the potential for adverse effects needs careful consideration. Corticosteroid injections were found to be beneficial for short-term pain relief among patients with knee and shoulder pain. However, current evidence remains equivocal on optimal dose, intensity and frequency, or mode of application for most treatment options. Conclusion This review presents a comprehensive summary and critical assessment of current evidence for the treatment of pain presentations in primary care. The evidence synthesis of interventions for common musculoskeletal pain presentations shows moderate to strong evidence for exercise therapy and psychosocial interventions, with short-term benefits only from pharmacological treatments. Future research into optimal dose and application of the most…

  11. Current treatment of retinoblastoma

    International Nuclear Information System (INIS)

    Shields, J.A.

    1985-01-01

    Retinoblastoma is a highly malignant intraocular tumor of childhood which requires prompt treatment once the diagnosis has been established. The traditionally accepted methods include enucleation, external irradiation, scleral plaque irradiation, photocoagulation, cryotherapy, and chemotherapy. This article provides an update on the modern methods of treatment available for retinoblastoma. It is based largely on personal experience with approximately 200 new patients with retinoblastoma who were evaluated and treated between 1974 and 1984 in the Oncology Service of Wills Eye Hospital, with an overall survival of 97%. This article is an overall review and does not go into statistical detail. (Auth.)

  12. Gems of combinatorial optimization and graph algorithms

    CERN Document Server

    Skutella, Martin; Stiller, Sebastian; Wagner, Dorothea

    2015-01-01

    Are you looking for new lectures for your course on algorithms, combinatorial optimization, or algorithmic game theory?  Maybe you need a convenient source of relevant, current topics for a graduate student or advanced undergraduate student seminar?  Or perhaps you just want an enjoyable look at some beautiful mathematical and algorithmic results, ideas, proofs, concepts, and techniques in discrete mathematics and theoretical computer science?   Gems of Combinatorial Optimization and Graph Algorithms is a handpicked collection of up-to-date articles, carefully prepared by a select group of international experts, who have contributed some of their most mathematically or algorithmically elegant ideas.  Topics include longest tours and Steiner trees in geometric spaces, cartograms, resource buying games, congestion games, selfish routing, revenue equivalence and shortest paths, scheduling, linear structures in graphs, contraction hierarchies, budgeted matching problems, and motifs in networks.   This ...

  13. Dose Calculation Accuracy of the Monte Carlo Algorithm for CyberKnife Compared with Other Commercially Available Dose Calculation Algorithms

    International Nuclear Information System (INIS)

    Sharma, Subhash; Ott, Joseph; Williams, Jamone; Dickow, Danny

    2011-01-01

    Monte Carlo dose calculation algorithms have the potential for greater accuracy than traditional model-based algorithms. This enhanced accuracy is particularly evident in regions of lateral scatter disequilibrium, which can develop during treatments incorporating small field sizes and low-density tissue. A heterogeneous slab phantom was used to evaluate the accuracy of several commercially available dose calculation algorithms, including Monte Carlo dose calculation for CyberKnife, the Analytical Anisotropic Algorithm and Pencil Beam convolution for the Eclipse planning system, and convolution-superposition for the XiO planning system. The phantom accommodated slabs of varying density; comparisons between planned and measured dose distributions were accomplished with radiochromic film. The Monte Carlo algorithm provided the most accurate agreement between planned and measured dose distributions. In each phantom irradiation, the Monte Carlo predictions resulted in gamma analysis comparisons >97%, using acceptance criteria of 3% dose and 3-mm distance to agreement. In general, the gamma analysis comparisons for the other algorithms were <95%. The Monte Carlo dose calculation algorithm for CyberKnife provides more accurate dose distribution calculations in regions of lateral electron disequilibrium than commercially available model-based algorithms. This is primarily because Monte Carlo algorithms implicitly account for tissue heterogeneities; density scaling functions and/or effective depth correction factors are not required.

  14. Managing Behçet’s disease: An update on current and emerging treatment options

    Directory of Open Access Journals (Sweden)

    P LA van Daele

    2009-05-01

    Full Text Available P LA van Daele, J H Kappen, P M van Hagen, J AM van Laar, Department of Internal Medicine, Department of Immunology, Erasmus MC, ‘s Gravendijkwal 230, 3015 CE Rotterdam, The Netherlands. Abstract: Behçet’s disease is an autoinflammatory vasculitis of unknown origin characterized by recurrent oral and genital ulcers, uveitis, arthritis, and skin lesions. Additionally, involvement of the gastrointestinal tract, central nervous system, and large vessels may occur. The disease is prevalent in countries along the ancient Silk Road from Eastern Asia to the Mediterranean Basin. Many treatment modalities are currently available, and the choice of treatment depends on organ involvement and severity of disease. Topical treatment with corticosteroids is often sufficient for mucocutaneous involvement; however, for more severe disease with vasculitis or neurological involvement a more aggressive approach is warranted. Newer drugs (biologicals) that influence cytokines and thereby T-cell function are promising, with an acceptable side-effect profile. Unfortunately, reimbursement of the costs of biologicals for rare diseases is still a problem in various countries. In this report we discuss the current treatment modalities for Behçet’s disease. Keywords: Behçet’s disease, biologicals, treatment

  15. Algebraic dynamics algorithm: Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm

    Institute of Scientific and Technical Information of China (English)

    WANG ShunJin; ZHANG Hua

    2007-01-01

    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with Runge-Kutta algorithm and symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.

  17. Transformation Algorithm of Dielectric Response in Time-Frequency Domain

    Directory of Open Access Journals (Sweden)

    Ji Liu

    2014-01-01

    Full Text Available A transformation algorithm of dielectric response from the time domain to the frequency domain is presented. In order to shorten the measuring time of low or ultralow frequency dielectric response characteristics, the transformation algorithm is used in this paper to transform the time-domain relaxation current into a frequency-domain current for calculating the low-frequency dielectric dissipation factor. Comparison of the calculated results with actual test data shows that the two agree closely over a wide range of low frequencies. The time-domain test data of depolarization currents in dry and moist pressboards are then converted into frequency-domain results on the basis of the transformation, yielding frequency-domain curves of complex capacitance and dielectric dissipation factor in the low-frequency range. Test results of polarization and depolarization current (PDC) measurements in pressboards are also given at different voltages and polarization times. The experimental results demonstrate that the polarization and depolarization currents are significantly affected by the moisture content of the test pressboards, and that the transformation algorithm is effective down to ultralow frequencies of 10−3 Hz. Data analysis and interpretation of the test results conclude that time-frequency domain dielectric response analysis can be used for assessing the insulation system of a power transformer.
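    A classical shortcut of this kind of time-to-frequency mapping (not the specific algorithm of this paper) is the Hamon approximation, which reads a single depolarization-current sample i(t) as the dielectric loss at the frequency f ≈ 0.1/t. A minimal sketch, with a hypothetical PDC sample for illustration:

```python
import math

def hamon_loss_factor(i_depol_a, t_s, c0_f, u0_v):
    """Hamon approximation: map a depolarization-current sample i(t) [A],
    taken t seconds after switch-off, to the dielectric loss eps''(f)
    at f = 0.1 / t, for geometric capacitance c0 [F] and charging
    voltage u0 [V]."""
    f = 0.1 / t_s
    eps_loss = i_depol_a / (2.0 * math.pi * f * c0_f * u0_v)
    return f, eps_loss

# Hypothetical sample: 2 nA depolarization current 100 s after switch-off,
# 100 pF geometric capacitance, 1 kV charging voltage.
f, eps2 = hamon_loss_factor(2e-9, 100.0, 100e-12, 1000.0)
print(f, eps2)   # f = 1e-3 Hz
```

    A current sample 100 s into the depolarization record thus lands at 10−3 Hz, which is how minutes-long PDC measurements stand in for frequency sweeps that would otherwise take hours per point.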

  18. Current and emerging treatment options in the management of lupus

    Science.gov (United States)

    Jordan, Natasha; D’Cruz, David

    2016-01-01

    Systemic lupus erythematosus (SLE) is a complex autoimmune disease with variable clinical manifestations. While the clearest guidelines for the treatment of SLE exist in the context of lupus nephritis, patients with other lupus manifestations such as neuropsychiatric, hematologic, musculoskeletal, and severe cutaneous lupus frequently require immunosuppression and/or biologic therapy. Conventional immunosuppressive agents such as mycophenolate mofetil, azathioprine, and cyclophosphamide are widely used in the management of SLE, with current, more rationalized treatment regimens optimizing the use of these agents while minimizing potential toxicity. The advent of biologic therapies has advanced the treatment of SLE, particularly in patients with refractory disease. The anti-CD20 monoclonal antibody rituximab and the anti-BLyS agent belimumab are now widely used in clinical practice. Several other biologic agents are in ongoing clinical trials. While immunosuppressive and biologic agents are the foundation of inflammatory disease control in SLE, the importance of managing comorbidities such as cardiovascular risk factors, bone health, and minimizing susceptibility to infection should not be neglected. PMID:27529058

  19. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is “are there any algorithms that can design evolutionary algorithms automatically?” A more complete formulation of the question is “can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?” In this paper, a novel evolutionary algorithm based on the automatic design of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space, as most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems are conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which suggests that algorithms designed automatically by computers can compete with algorithms designed by human beings.
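
    The operator-space search itself is not specified in the abstract; as a minimal analogue, the sketch below lets each individual carry its own mutation step size, which evolves alongside the genes (self-adaptive mutation), so the operator is tuned automatically rather than fixed by hand. Population size, mutation rule, and the sphere test function are our own choices:

```python
import random

def evolve(fitness, dim=5, pop_size=30, gens=200, seed=1):
    """Minimal self-adaptive evolutionary algorithm: each individual carries
    its own mutation step size sigma, mutated alongside the genes, so the
    'operator' adapts during the run. Minimizes `fitness`."""
    rng = random.Random(seed)
    pop = [([rng.uniform(-5, 5) for _ in range(dim)], 1.0)
           for _ in range(pop_size)]
    for _ in range(gens):
        offspring = []
        for genes, sigma in pop:
            # Mutate the strategy parameter first, then the genes with it.
            new_sigma = max(1e-6, sigma * (2 ** rng.uniform(-1, 1)))
            new_genes = [g + rng.gauss(0, new_sigma) for g in genes]
            offspring.append((new_genes, new_sigma))
        # (mu + lambda) survivor selection on fitness.
        pop = sorted(pop + offspring, key=lambda ind: fitness(ind[0]))[:pop_size]
    return pop[0]

sphere = lambda x: sum(g * g for g in x)
best_genes, best_sigma = evolve(sphere)
print(sphere(best_genes))
```

    The step size shrinks automatically as the population closes in on the optimum, illustrating in miniature how searching an operator space alongside the solution space can pay off.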

  20. Two-Step Proximal Gradient Algorithm for Low-Rank Matrix Completion

    Directory of Open Access Journals (Sweden)

    Qiuyu Wang

    2016-06-01

    Full Text Available In this paper, we propose a two-step proximal gradient algorithm to solve nuclear norm regularized least squares for the purpose of recovering a low-rank data matrix from a sampling of its entries. Each iteration generated by the proposed algorithm is a combination of the latest three points, namely, the previous point, the current iterate, and its proximal gradient point. The algorithm preserves the computational simplicity of the classical proximal gradient algorithm, in which a singular value decomposition is involved in the proximal operator. Global convergence follows directly from results in the literature. Numerical results are reported to show the efficiency of the algorithm.
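
    The proximal operator of the nuclear norm is singular value soft-thresholding. The sketch below implements the classical proximal gradient iteration with a simple momentum term standing in for the paper's three-point combination; the step size, momentum weight, and regularization constant are illustrative assumptions:

```python
import numpy as np

def svt(Y, tau):
    """Proximal operator of tau * nuclear norm: soft-threshold singular values."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(M, mask, mu=0.1, step=1.0, beta=0.3, iters=300):
    """Proximal gradient for  min_X 0.5*||P(X - M)||_F^2 + mu*||X||_*,
    with a momentum/extrapolation term (beta) standing in for the paper's
    two-step combination of the last three iterates."""
    X = np.zeros_like(M)
    X_prev = X.copy()
    for _ in range(iters):
        Z = X + beta * (X - X_prev)          # extrapolate over past iterates
        grad = mask * (Z - M)                # gradient of the smooth part
        X_prev, X = X, svt(Z - step * grad, step * mu)
    return X

rng = np.random.default_rng(0)
# Rank-2 ground truth, about 60% of entries observed.
A = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random((30, 30)) < 0.6
X_hat = complete(A, mask)
err = np.linalg.norm(X_hat - A) / np.linalg.norm(A)
print(err)
```

    Each iteration costs one SVD, exactly as in the classical method; the extrapolation only changes the point at which the proximal gradient step is taken.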

  1. Current Status of Interventional Radiology Treatment of Infrapopliteal Arterial Disease

    Energy Technology Data Exchange (ETDEWEB)

    Rand, T., E-mail: thomas.rand@wienkav.at [General Hospital Hietzing, Department of Radiology (Austria); Uberoi, R. [John Radcliffe Hospital, Department of Radiology (United Kingdom)

    2013-06-15

    Treatment of infrapopliteal arteries has developed into a standard technique during the past two decades. With the introduction of innovative devices, a variety of techniques has been created and is still under investigation. Treatment options range from plain balloon angioplasty (POBA) and all sorts of stent applications, such as bare-metal, balloon-expandable, self-expanding, coated, drug-eluting, and bioabsorbable stents, to the latest developments, such as drug-eluting balloons. Regarding the scientific background, several prospective, randomized studies with relevant numbers of patients have been (or will be) published that provide Level I evidence. In contrast to older studies, which were based mostly on numeric parameters such as diameters or residual stenoses, more recent study concepts focus increasingly on clinical features, such as improvement in amputation rate or changes in clinical stage and quality-of-life standards. Although it has not yet been decided which of the individual techniques is best, we can conclude that treatment of infrapopliteal arteries, whichever technique is used, is of substantial benefit to the patient. Therefore, the goal of this review is to give an overview of current developments and techniques for the treatment of infrapopliteal arteries, to present clinical and technical results, to weigh individual techniques, and to discuss recent developments.

  2. Chemotherapy-Induced Constipation and Diarrhea: Pathophysiology, Current and Emerging Treatments

    Science.gov (United States)

    McQuade, Rachel M.; Stojanovska, Vanesa; Abalo, Raquel; Bornstein, Joel C.; Nurgali, Kulmira

    2016-01-01

    Gastrointestinal (GI) side-effects of chemotherapy are a debilitating and often overlooked clinical hurdle in cancer management. Chemotherapy-induced constipation (CIC) and diarrhea (CID) present a constant challenge in the efficient and tolerable treatment of cancer and are amongst the primary contributors to dose reductions, delays and cessation of treatment. Although the prevalence of CIC is hard to estimate, it is believed to affect approximately 16% of cancer patients, whilst the incidence of CID has been estimated to be as high as 80%. Despite this, the underlying mechanisms of both CID and CIC remain unclear, but are believed to result from a combination of intersecting mechanisms including inflammation, secretory dysfunctions, GI dysmotility and alterations in GI innervation. Current treatments for CIC and CID aim to reduce the severity of symptoms rather than combating the pathophysiological mechanisms of dysfunction, and often result in worsening of already chronic GI symptoms or trigger the onset of a plethora of other side-effects including respiratory depression, uneven heartbeat, seizures, and neurotoxicity. Emerging treatments, including those targeting the enteric nervous system, present promising avenues to alleviate CID and CIC. Identification of potential targets for novel therapies to alleviate chemotherapy-induced toxicity is essential to improve clinical outcomes and quality of life amongst cancer sufferers. PMID:27857691

  3. Current Status of Interventional Radiology Treatment of Infrapopliteal Arterial Disease

    International Nuclear Information System (INIS)

    Rand, T.; Uberoi, R.

    2013-01-01

    Treatment of infrapopliteal arteries has developed into a standard technique during the past two decades. With the introduction of innovative devices, a variety of techniques has been created and is still under investigation. Treatment options range from plain balloon angioplasty (POBA) and all sorts of stent applications, such as bare-metal, balloon-expandable, self-expanding, coated, drug-eluting, and bioabsorbable stents, to the latest developments, such as drug-eluting balloons. Regarding the scientific background, several prospective, randomized studies with relevant numbers of patients have been (or will be) published that provide Level I evidence. In contrast to older studies, which were based mostly on numeric parameters such as diameters or residual stenoses, more recent study concepts focus increasingly on clinical features, such as improvement in amputation rate or changes in clinical stage and quality-of-life standards. Although it has not yet been decided which of the individual techniques is best, we can conclude that treatment of infrapopliteal arteries, whichever technique is used, is of substantial benefit to the patient. Therefore, the goal of this review is to give an overview of current developments and techniques for the treatment of infrapopliteal arteries, to present clinical and technical results, to weigh individual techniques, and to discuss recent developments.

  4. A Current Review of the Diagnostic and Treatment Strategies of Hepatic Encephalopathy

    Directory of Open Access Journals (Sweden)

    Z. Poh

    2012-01-01

    Full Text Available Hepatic encephalopathy (HE) is a serious and potentially fatal complication in patients with cirrhotic liver disease. It is a spectrum ranging from minimal hepatic encephalopathy (MHE), without recognizable clinical symptoms or signs, to overt HE with risk of cerebral edema and death. HE results in diminished quality of life and survival. The broad range of neuropsychiatric manifestations reflects the range of pathophysiological mechanisms and impairments in neurotransmission that are purported to cause HE, including hyperammonemia, astrocyte swelling, intra-astrocytic glutamine, upregulation of the 18-kDa translocator protein (TSPO; formerly known as the peripheral benzodiazepine receptor, PBR), and manganese. A myriad of diagnostic tools is available today, ranging from simple bedside clinical assessment to more complex neuropsychological batteries and neurophysiological tests. Current treatment strategies are directed at reducing ammonia, with newer agents showing some early promise. This paper describes the pathophysiology of the disease and summarises the diagnostic and treatment options currently available.

  5. Cough in Children: Current Approaches to the Treatment

    Directory of Open Access Journals (Sweden)

    O.O. Rechkina

    2016-03-01

    Full Text Available Introduction. Cough is one of the most common symptoms in the practice of doctors of various specialties, including pediatricians. Cough treatment should begin with identification of its cause and a correct diagnosis. Most often, cough in children is due to increased viscosity of bronchial secretions, i.e. impaired sputum transport in the bronchial tree, and insufficient activity of the ciliated epithelium. The main objective of the treatment of productive cough is to dilute sputum and promote bronchial secretion and excretion, which calls for the administration of mucolytics. Currently, one of the best-known mucolytics is acetylcysteine, a derivative of the amino acid cysteine, available as ACC®. However, the question of ACC® (acetylcysteine) application in infants and young children is still debatable today. This article presents a study whose objective was to evaluate the therapeutic efficacy and tolerability of ACC® (20 mg/ml solution) in the treatment of bronchopulmonary diseases in children aged 2 to 6 years. Materials and methods. The study involved 60 children with acute tracheitis, simple bronchitis, acute obstructive bronchitis, recurrent bronchitis in the acute phase, community-acquired pneumonia, asthma exacerbation, or cystic fibrosis. Patients of the main groups (n = 40) received ACC® (20 mg/ml solution) at the age-specific dosage 3 times a day as part of combined treatment. Therapy of patients in the control group (n = 20) was conducted without ACC®. Results. During follow-up, patients who received ACC® had significant positive changes in the nature of cough and in sputum viscosity and amount, in contrast to the comparison group. Complete disappearance of cough was achieved on day 5–8 from the beginning of treatment, while in the control group this took longer. The evaluation of the effectiveness of the study drug showed that very good efficacy was achieved in 75 % of patients and good efficacy in 20 %, and among patients who…

  6. Algorithm of Functional Musculoskeletal Disorders Diagnostics

    OpenAIRE

    Alexandra P. Eroshenko

    2012-01-01

    The article scientifically justifies an algorithm for the complex diagnostics of functional musculoskeletal disorders during resort treatment, aimed at the optimal application of modern methods of physical rehabilitation (formation of correction programs) based on the findings of diagnostic methodologies.

  7. Synthesis of logic circuits with evolutionary algorithms

    Energy Technology Data Exchange (ETDEWEB)

    JONES,JAKE S.; DAVIDSON,GEORGE S.

    2000-01-26

    In the last decade there has been interest and research in the area of designing circuits with genetic algorithms, evolutionary algorithms, and genetic programming. However, the ability to design circuits of the size and complexity required by modern engineering design problems, simply by specifying required outputs for given inputs has as yet eluded researchers. This paper describes current research in the area of designing logic circuits using an evolutionary algorithm. The goal of the research is to improve the effectiveness of this method and make it a practical aid for design engineers. A novel method of implementing the algorithm is introduced, and results are presented for various multiprocessing systems. In addition to evolving standard arithmetic circuits, work in the area of evolving circuits that perform digital signal processing tasks is described.
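
    As a minimal, self-contained illustration of the approach (not the paper's implementation), the sketch below uses a (1+4) evolutionary strategy over small gate netlists to evolve a two-input XOR from AND/OR/NAND/NOR primitives; the gate set, genome length, and mutation scheme are arbitrary choices:

```python
import random

OPS = {"AND": lambda a, b: a & b, "OR": lambda a, b: a | b,
       "NAND": lambda a, b: 1 - (a & b), "NOR": lambda a, b: 1 - (a | b)}
OP_NAMES = list(OPS)

def evaluate(genome, a, b):
    """Feed-forward netlist: node i may read the two inputs or any earlier node."""
    values = [a, b]
    for op, i, j in genome:
        values.append(OPS[op](values[i], values[j]))
    return values[-1]

def fitness(genome):
    target = lambda a, b: a ^ b                      # specify only the outputs
    return sum(evaluate(genome, a, b) == target(a, b)
               for a in (0, 1) for b in (0, 1))

def random_node(pos, rng):
    return (rng.choice(OP_NAMES), rng.randrange(pos + 2), rng.randrange(pos + 2))

def evolve(n_nodes=6, seed=3, restart_every=5000):
    """(1+4) evolutionary strategy over fixed-length gate netlists."""
    rng = random.Random(seed)
    parent = [random_node(p, rng) for p in range(n_nodes)]
    gen = 0
    while fitness(parent) < 4:
        gen += 1
        if gen % restart_every == 0:                 # restart on stagnation
            parent = [random_node(p, rng) for p in range(n_nodes)]
        children = []
        for _ in range(4):
            child = list(parent)
            p = rng.randrange(n_nodes)               # point mutation of one node
            child[p] = random_node(p, rng)
            children.append(child)
        parent = max(children + [parent], key=fitness)
    return parent

circuit = evolve()
print(fitness(circuit))
```

    The designer specifies only the truth table; the circuit topology emerges from the search, which is exactly the appeal, and the scaling difficulty, that the abstract discusses.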

  8. Seizure detection algorithms based on EMG signals

    DEFF Research Database (Denmark)

    Conradsen, Isa

    Background: the currently used non-invasive seizure detection methods are not reliable. Muscle fibers are directly connected to the nerves, whereby electric signals are generated during activity. Therefore, an alarm system based on electromyography (EMG) signals is a theoretical possibility. Objective...... on the amplitude of the signal. The other algorithm was based on information about the signal in the frequency domain, and it focused on synchronisation of the electrical activity in a single muscle during the seizure. Results: The amplitude-based algorithm reliably detected seizures in 2 of the patients, while...... the frequency-based algorithm was efficient for detecting the seizures in the third patient. Conclusion: Our results suggest that EMG signals could be used to develop an automatic seizure detection system. However, different patients might require different types of algorithms/approaches....
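
    The thesis's detector parameters are not given in the abstract; the sketch below illustrates only the general amplitude-based idea, flagging samples where a moving RMS of the EMG exceeds a fixed threshold. The window length, threshold, and synthetic signal are our assumptions:

```python
import numpy as np

def detect(emg, fs, threshold, window_s=0.5):
    """Amplitude-based detector: flag samples where the moving RMS of the
    EMG exceeds a fixed threshold (a sketch of the idea, not the thesis
    algorithm, whose parameters the abstract does not give)."""
    w = max(1, int(window_s * fs))
    rms = np.sqrt(np.convolve(emg ** 2, np.ones(w) / w, mode="same"))
    return rms > threshold

# Synthetic trace: low-amplitude baseline with a burst of high-amplitude
# activity standing in for the tonic phase of a seizure.
rng = np.random.default_rng(0)
fs = 1000                                  # sampling rate, Hz
emg = 0.05 * rng.standard_normal(10 * fs)  # 10 s of baseline noise
emg[4 * fs:6 * fs] += 1.0 * rng.standard_normal(2 * fs)
alarm = detect(emg, fs, threshold=0.3)
print(alarm[5 * fs], alarm[1 * fs])        # → True False
```

    A frequency-based detector, as in the abstract, would instead look at cross-channel or within-muscle synchronisation in the spectrum rather than raw amplitude.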

  9. Adaptive discrete-ordinates algorithms and strategies

    International Nuclear Information System (INIS)

    Stone, J.C.; Adams, M.L.

    2005-01-01

    We present our latest algorithms and strategies for adaptively refined discrete-ordinates quadrature sets. In our basic strategy, which we apply here in two-dimensional Cartesian geometry, the spatial domain is divided into regions. Each region has its own quadrature set, which is adapted to the region's angular flux. Our algorithms add a 'test' direction to the quadrature set if the angular flux calculated at that direction differs by more than a user-specified tolerance from the angular flux interpolated from other directions. Different algorithms have different prescriptions for the method of interpolation and/or choice of test directions and/or prescriptions for quadrature weights. We discuss three different algorithms of different interpolation orders. We demonstrate through numerical results that each algorithm is capable of generating solutions with negligible angular discretization error. This includes elimination of ray effects. We demonstrate that all of our algorithms achieve a given level of error with far fewer unknowns than does a standard quadrature set applied to an entire problem. To address a potential issue with other algorithms, we present one algorithm that retains exact integration of high-order spherical-harmonics functions, no matter how much local refinement takes place. To address another potential issue, we demonstrate that all of our methods conserve partial currents across interfaces where quadrature sets change. We conclude that our approach is extremely promising for solving the long-standing problem of angular discretization error in multidimensional transport problems. (authors)
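
    In one angular dimension the interpolation-error criterion can be sketched as follows: insert a test direction at an interval midpoint whenever the angular flux computed there differs from the value linearly interpolated from its neighbours by more than a tolerance. The real algorithms work on 2-D quadrature sets with several interpolation orders; this is a simplified illustration with an invented forward-peaked flux:

```python
import numpy as np

def adapt_quadrature(f, a=-1.0, b=1.0, tol=1e-5, max_pass=20):
    """Adaptive direction refinement in one angular variable mu: add a 'test'
    direction at an interval midpoint when the value computed there differs
    from linear interpolation of the neighbours by more than tol."""
    mu = np.linspace(a, b, 5)
    for _ in range(max_pass):
        vals = f(mu)
        mids = 0.5 * (mu[:-1] + mu[1:])
        interp = 0.5 * (vals[:-1] + vals[1:])       # linear interpolation
        bad = mids[np.abs(f(mids) - interp) > tol]  # failed test directions
        if bad.size == 0:
            break
        mu = np.sort(np.concatenate([mu, bad]))
    # Trapezoidal weights on the refined, non-uniform direction set.
    w = np.zeros_like(mu)
    d = np.diff(mu)
    w[:-1] += d / 2
    w[1:] += d / 2
    return mu, w

# A forward-peaked 'angular flux': needs many directions near mu = 1 only.
flux = lambda m: np.exp(10 * (m - 1.0))
mu, w = adapt_quadrature(flux)
integral = np.sum(w * flux(mu))
exact = (1 - np.exp(-20)) / 10
print(abs(integral - exact), len(mu))
```

    The refined set clusters directions where the flux varies sharply and leaves the smooth region coarse, the same economy the abstract reports for its 2-D regional quadrature sets.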

  10. A compilation of jet finding algorithms

    International Nuclear Information System (INIS)

    Flaugher, B.; Meier, K.

    1990-12-01

    Technical descriptions of jet finding algorithms currently in use in p̄p collider experiments (CDF, UA1, UA2), e⁺e⁻ experiments, and Monte Carlo event generators (LUND programs, ISAJET) have been collected. 20 refs.

  11. The Applications of Genetic Algorithms in Medicine

    Directory of Open Access Journals (Sweden)

    Ali Ghaheri

    2015-11-01

    Full Text Available A great wealth of information is hidden amid medical research data that in some cases cannot be easily analyzed, if at all, using classical statistical methods. Inspired by nature, metaheuristic algorithms have been developed to offer optimal or near-optimal solutions to complex data analysis and decision-making tasks in a reasonable time. Due to their powerful features, metaheuristic algorithms have frequently been used in other fields of science. In medicine, however, the use of these algorithms is not well known to physicians, who may well benefit from applying them to solve complex medical problems. Therefore, in this paper, we introduce the genetic algorithm and its applications in medicine. The use of the genetic algorithm has promising implications in various medical specialties including radiology, radiotherapy, oncology, pediatrics, cardiology, endocrinology, surgery, obstetrics and gynecology, pulmonology, infectious diseases, orthopedics, rehabilitation medicine, neurology, pharmacotherapy, and health care management. This review introduces the applications of the genetic algorithm in disease screening, diagnosis, treatment planning, pharmacovigilance, prognosis, and health care management, and enables physicians to envision possible applications of this metaheuristic method in their medical career.

  12. The Applications of Genetic Algorithms in Medicine.

    Science.gov (United States)

    Ghaheri, Ali; Shoar, Saeed; Naderan, Mohammad; Hoseini, Sayed Shahabuddin

    2015-11-01

    A great wealth of information is hidden amid medical research data that in some cases cannot be easily analyzed, if at all, using classical statistical methods. Inspired by nature, metaheuristic algorithms have been developed to offer optimal or near-optimal solutions to complex data analysis and decision-making tasks in a reasonable time. Due to their powerful features, metaheuristic algorithms have frequently been used in other fields of science. In medicine, however, the use of these algorithms is not well known to physicians, who may well benefit from applying them to solve complex medical problems. Therefore, in this paper, we introduce the genetic algorithm and its applications in medicine. The use of the genetic algorithm has promising implications in various medical specialties including radiology, radiotherapy, oncology, pediatrics, cardiology, endocrinology, surgery, obstetrics and gynecology, pulmonology, infectious diseases, orthopedics, rehabilitation medicine, neurology, pharmacotherapy, and health care management. This review introduces the applications of the genetic algorithm in disease screening, diagnosis, treatment planning, pharmacovigilance, prognosis, and health care management, and enables physicians to envision possible applications of this metaheuristic method in their medical career.

  13. Current suicidal ideation in treatment-seeking individuals in the United Kingdom with gambling problems.

    Science.gov (United States)

    Ronzitti, Silvia; Soldini, Emiliano; Smith, Neil; Potenza, Marc N; Clerici, Massimo; Bowden-Jones, Henrietta

    2017-11-01

    Studies show a higher lifetime prevalence of suicidality in individuals with pathological gambling. However, less is known about the relationship between pathological gambling and current suicidal ideation. We investigated socio-demographic, clinical, and gambling-related variables associated with suicidality in treatment-seeking individuals. Bivariate analyses and logistic regression models were generated on data from 903 individuals to identify measures associated with aspects of suicidality. Forty-six percent of patients reported current suicidal ideation. People with current suicidal thoughts were more likely to report greater problem-gambling severity. Logistic regression models suggested that past suicidal ideation and severity of anxiety were associated with current suicidality. Our findings suggest that the severity of anxiety disorder, along with a lifetime history of suicidal ideation, may help to identify treatment-seeking individuals with pathological gambling at higher risk of suicidality, highlighting the importance of assessing suicidal ideation in clinical settings. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  15. Ant Colony Clustering Algorithm and Improved Markov Random Fusion Algorithm in Image Segmentation of Brain Images

    Directory of Open Access Journals (Sweden)

    Guohua Zou

    2016-12-01

    Full Text Available New medical imaging technologies, such as Computed Tomography and Magnetic Resonance Imaging (MRI), have been widely used in all aspects of medical diagnosis. The purpose of these imaging techniques is to obtain comprehensive and accurate qualitative and quantitative data about the patient, and to provide correct digital information for diagnosis, treatment planning, and evaluation after surgery. MR has a good diagnostic advantage for brain diseases. However, as the requirements for brain image definition and quantitative analysis keep increasing, better segmentation of MR brain images is necessary. The FCM (Fuzzy C-means) algorithm is widely applied in image segmentation, but it has some shortcomings, such as long computation time and poor anti-noise capability. In this paper, the Ant Colony algorithm is first used to determine the cluster centers and the number of clusters for the FCM algorithm, so as to improve its running speed. An improved Markov random field model is then applied to improve the algorithm's anti-noise ability. Experimental results show that the proposed algorithm has obvious advantages in both segmentation speed and segmentation quality.
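
    The core FCM update that the paper accelerates can be sketched as follows; the ant-colony seeding and MRF smoothing stages are omitted, and this is plain FCM on a toy two-class data set rather than MR intensities:

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means: alternate between recomputing cluster centres
    from the fuzzified memberships and memberships from inverse distances."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                         # membership columns sum to 1
    for _ in range(iters):
        Um = U ** m                            # fuzzified memberships
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        U = d ** (-2 / (m - 1))                # closer point -> higher membership
        U /= U.sum(axis=0)
    return centers, U

rng = np.random.default_rng(1)
# Two well-separated blobs standing in for two tissue intensity classes.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centers, U = fcm(X, c=2)
labels = U.argmax(axis=0)
print(sorted(np.round(centers[:, 0]).astype(int).tolist()))   # → [0, 3]
```

    The paper's speed-up comes from starting this loop at ant-colony-chosen centres (and cluster count) instead of random memberships, so far fewer iterations are needed.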

  16. SU-E-T-516: Dosimetric Validation of AcurosXB Algorithm in Comparison with AAA & CCC Algorithms for VMAT Technique.

    Science.gov (United States)

    Kathirvel, M; Subramanian, V Sai; Arun, G; Thirumalaiswamy, S; Ramalingam, K; Kumar, S Ashok; Jagadeesh, K

    2012-06-01

    To dosimetrically validate the AcurosXB algorithm for Volumetric Modulated Arc Therapy (VMAT) in comparison with the standard clinical Anisotropic Analytic Algorithm (AAA) and Collapsed Cone Convolution (CCC) dose calculation algorithms. The AcurosXB dose calculation algorithm is available with the Varian Eclipse treatment planning system (V10). It uses a grid-based Boltzmann equation solver to predict dose precisely in less time. This study was undertaken to assess the algorithm's ability to predict dose accurately as delivered, for which five clinical cases each of brain, head & neck, thoracic, pelvic, and SBRT treatments were taken. Verification plans were created on a multicube phantom with an iMatrixx-2D detector array; dose was then predicted with the AcurosXB, AAA, and CCC (COMPASS system) algorithms, and the plans were delivered on a CLINAC-iX treatment machine. Delivered dose was captured in the iMatrixx plane for all 25 plans. The measured dose was taken as the reference to quantify the agreement of the AcurosXB calculation algorithm with the previously validated AAA and CCC algorithms. Gamma evaluation was performed with clinical criteria of distance-to-agreement 3 & 2 mm and dose difference 3 & 2% in omnipro-I'MRT software. Plans were evaluated in terms of correlation coefficient, quantitative area gamma, and average gamma. The study shows good agreement, with mean correlations of 0.9979±0.0012, 0.9984±0.0009 & 0.9979±0.0011 for AAA, CCC & Acuros respectively. Mean area gamma for the 3mm/3% criterion was found to be 98.80±1.04, 98.14±2.31, 98.08±2.01, and for 2mm/2% was found to be 93.94±3.83, 87.17±10.54 & 92.36±5.46 for AAA, CCC & Acuros respectively. Mean average gamma for 3mm/3% was 0.26±0.07, 0.42±0.08, 0.28±0.09, and for 2mm/2% was found to be 0.39±0.10, 0.64±0.11, 0.42±0.13 for AAA, CCC & Acuros respectively. This study demonstrated that the AcurosXB algorithm had good agreement with the AAA & CCC algorithms in terms of dose prediction. In conclusion, the AcurosXB algorithm provides a valid, accurate and speedy alternative to AAA…
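
    The gamma evaluation used here combines a distance-to-agreement (DTA) criterion with a dose-difference criterion. A minimal 1-D global gamma implementation (our own sketch, not the omnipro-I'MRT code, with invented profiles) looks like this:

```python
import numpy as np

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose, dta=3.0, dd=0.03):
    """1-D global gamma analysis: dta in mm, dd as a fraction of the reference
    maximum. A point passes when gamma <= 1, i.e. some reference point is
    close enough in the combined distance/dose metric."""
    d_norm = dd * ref_dose.max()
    gam = np.empty_like(eval_dose)
    for i, (x, D) in enumerate(zip(eval_pos, eval_dose)):
        dist2 = ((ref_pos - x) / dta) ** 2
        dose2 = ((ref_dose - D) / d_norm) ** 2
        gam[i] = np.sqrt(np.min(dist2 + dose2))
    return gam

# Reference profile and a 'measurement' shifted by 1 mm with a 1% dose offset,
# both comfortably inside the 3 mm / 3% criteria.
x = np.arange(0.0, 100.0, 1.0)                    # detector positions, mm
ref = 100.0 * np.exp(-((x - 50.0) / 15.0) ** 2)   # Gaussian-like dose profile
meas = 100.0 * np.exp(-((x - 51.0) / 15.0) ** 2) * 1.01
gam = gamma_index(x, ref, x, meas, dta=3.0, dd=0.03)
pass_rate = 100.0 * np.mean(gam <= 1.0)
print(pass_rate)
```

    "Area gamma" and "average gamma" in the abstract are then just the passing fraction and the mean of this per-point gamma over the detector plane.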

  17. Adjustment disorder: current perspectives

    Directory of Open Access Journals (Sweden)

    Zelviene P

    2018-01-01

    Full Text Available Paulina Zelviene, Evaldas Kazlauskas Department of Clinical and Organizational Psychology, Vilnius University, Vilnius, Lithuania Abstract: Adjustment disorder (AjD) is among the most often diagnosed mental disorders in clinical practice. This paper reviews the current status of AjD research and discusses scientific and clinical issues associated with AjD. AjD has been included in diagnostic classifications for over 50 years. Still, the diagnostic criteria for AjD remain vague and cause difficulties for mental health professionals. Controversies in definition have resulted in the lack of reliable and valid measures of AjD. Epidemiological data on the prevalence of AjD are scarce and not reliable, because prevalence data are biased by the diagnostic algorithm, which is usually developed for each study, as no established diagnostic standards for AjD are available. Considerable changes in the field of AjD could follow the release of the 11th edition of the International Classification of Diseases (ICD-11). A new AjD symptom profile was introduced in ICD-11 with 2 main symptoms: (1) preoccupation and (2) failure to adapt. However, differences between the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition and the ICD-11 AjD diagnostic criteria could result in diverse research findings in the future. The best treatment approach for AjD remains unclear, and further treatment studies are needed to provide AjD treatment guidelines to clinicians. Keywords: adjustment disorder, review, diagnosis, prevalence, treatment, DSM, ICD

  18. Cancer of the Pancreas: Molecular Pathways and Current Advancement in Treatment.

    Science.gov (United States)

    Polireddy, Kishore; Chen, Qi

    2016-01-01

    Pancreatic cancer is one of the most lethal of all malignancies, with a poor median overall survival. Pancreatic cancers harbor a variety of genetic alterations that render them difficult to treat even with targeted therapy. Recent studies revealed that pancreatic cancers are highly enriched with a cancer stem cell (CSC) population, which is resistant to chemotherapeutic drugs, and therefore escapes chemotherapy and promotes tumor recurrence. Cancer cell epithelial-to-mesenchymal transition (EMT) is highly associated with metastasis, generation of CSCs, and treatment resistance in pancreatic cancer. Reviewed here are the molecular biology of pancreatic cancer, the major signaling pathways regulating pancreatic cancer EMT and CSCs, and advancements in current clinical and experimental treatments for pancreatic cancer.

  19. New accountant job market reform by computer algorithm: an experimental study

    Directory of Open Access Journals (Sweden)

    Hirose Yoshitaka

    2017-01-01

    Full Text Available The purpose of this study is to examine the matching of new accountants with accounting firms in Japan. A notable feature of the present study is that it brings a computer algorithm to the job-hiring task. Job recruitment activities for new accountants in Japan are one-time, short-term struggles, and the rules of the process change every year; accordingly, many have searched for new rules to replace the current ones. This study proposes modifying these job recruitment activities by combining computer and human efforts. Furthermore, the study formulates the job recruitment activities using a model and conducts experiments. As a result, the Deferred Acceptance (DA) algorithm yields a high truth-telling percentage, a stable matching percentage, and greater efficiency compared with the previous approach. This suggests the potential of the Deferred Acceptance algorithm as a replacement for current approaches. In terms of truth-telling and stability, the DA algorithm is superior to the current methods and should be adopted.
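
    The candidate-proposing Deferred Acceptance (Gale-Shapley) algorithm referred to above can be sketched in a few lines; the candidate and firm names, preference lists, and capacities below are invented for illustration:

```python
def deferred_acceptance(cand_prefs, firm_prefs, capacity=None):
    """Candidate-proposing deferred acceptance (Gale-Shapley).
    cand_prefs[c]: list of firms in c's preference order.
    firm_prefs[f]: list of candidates in f's preference order.
    Returns {candidate: firm}; the result is a stable matching, and
    truth-telling is a dominant strategy for the proposing side."""
    capacity = capacity or {f: 1 for f in firm_prefs}
    rank = {f: {c: i for i, c in enumerate(p)} for f, p in firm_prefs.items()}
    next_choice = {c: 0 for c in cand_prefs}
    held = {f: [] for f in firm_prefs}
    free = list(cand_prefs)
    while free:
        c = free.pop()
        if next_choice[c] >= len(cand_prefs[c]):
            continue                               # c has exhausted their list
        f = cand_prefs[c][next_choice[c]]
        next_choice[c] += 1
        held[f].append(c)
        held[f].sort(key=lambda x: rank[f][x])     # firm keeps its favourites
        if len(held[f]) > capacity[f]:
            free.append(held[f].pop())             # reject the worst held offer
    return {c: f for f, cs in held.items() for c in cs}

cand_prefs = {"a": ["X", "Y"], "b": ["X", "Y"], "c": ["Y", "X"]}
firm_prefs = {"X": ["b", "a", "c"], "Y": ["a", "c", "b"]}
match = deferred_acceptance(cand_prefs, firm_prefs, capacity={"X": 1, "Y": 2})
print(match)   # → {'b': 'X', 'a': 'Y', 'c': 'Y'} (order may vary)
```

    Offers are only held, never finalized, until the process stops, which is what removes the incentive for the one-shot scramble the abstract describes.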

  20. CURRENT PRINCIPLES FOR CRANIOPHARYNGIOMA TREATMENT

    Directory of Open Access Journals (Sweden)

    A. N. Konovalov

    2012-01-01

    Full Text Available The paper describes the classification and treatment options for craniopharyngiomas, benign epithelial tumors arising from cell remnants of Rathke's pouch. It presents several types of surgical approach used in the surgical treatment of this disease and gives examples of how to place an Ommaya reservoir.

  1. Current knowledge and treatment strategies for grade II gliomas

    International Nuclear Information System (INIS)

    Narita, Yoshitaka

    2013-01-01

    World Health Organization grade II gliomas (GIIGs) include diffuse astrocytoma, oligodendroglioma, and oligoastrocytoma. GIIG is a malignant brain tumor for which treatment outcomes can still be improved. A review of previous clinical trials found the following: GIIGs increase in size by 3-5 mm per year when observed or treated with surgery alone; after pathological diagnosis, survival was improved by early aggressive tumor removal compared with observation alone; although the prognosis after total tumor removal was significantly better than after partial tumor removal, half of the patients relapsed within 5 years; comparing postoperative early radiotherapy (RT) with non-early RT after relapse, early RT prolonged progression-free survival (PFS) but did not affect overall survival (OS); local RT of 45 to 64.8 Gy did not impact PFS or OS; in patients with residual tumors, RT combined with chemotherapy (procarbazine plus lomustine plus vincristine) prolonged PFS compared with RT alone but did not affect OS; and poor prognostic factors included astrocytoma, non-total tumor removal, age ≥40 years, largest tumor diameter ≥4-6 cm, tumor crossing the midline, and neurological deficit. To improve treatment outcomes, surgery with functional brain mapping or intraoperative magnetic resonance imaging, or chemoradiotherapy with temozolomide, is important. In this review, current knowledge regarding GIIG is described and treatment strategies are explored. (author)

  2. Metaheuristic algorithms for building Covering Arrays: A review

    Directory of Open Access Journals (Sweden)

    Jimena Adriana Timaná-Peña

    2016-09-01

    Full Text Available Covering Arrays (CA) are mathematical objects used in the functional testing of software components. They enable the testing of all interactions of a given size among the input parameters of a procedure, function, or logical unit in general, using the minimum number of test cases. Building CAs is a complex task (an NP-complete problem) that involves lengthy execution times and high computational loads. The most effective methods for building CAs are algebraic, greedy, and metaheuristic-based; the latter have reported the best results to date. This paper presents a description of the major contributions made by a selection of different metaheuristics, including simulated annealing, tabu search, genetic algorithms, ant colony algorithms, particle swarm algorithms, and harmony search algorithms. It is worth noting that simulated annealing-based algorithms have evolved as the most competitive and currently form the state of the art.
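
    As a concrete illustration of the metaheuristic approach this record surveys, the following is a minimal simulated-annealing sketch for covering-array construction. It is a toy, not any of the surveyed implementations: the cost function (number of uncovered t-way interactions), cooling schedule, and parameters are assumptions chosen for readability, not performance.

    ```python
    import itertools
    import math
    import random

    def uncovered(A, t, v):
        """Count the strength-t interactions not yet covered by array A over v symbols."""
        k = len(A[0])
        missing = 0
        for cols in itertools.combinations(range(k), t):
            seen = {tuple(row[c] for c in cols) for row in A}
            missing += v ** t - len(seen)
        return missing

    def anneal_ca(N, k, v, t=2, T=1.0, cooling=0.999, steps=20000, seed=1):
        """Simulated-annealing search for a covering array CA(N; t, k, v)."""
        rng = random.Random(seed)
        A = [[rng.randrange(v) for _ in range(k)] for _ in range(N)]
        cost = uncovered(A, t, v)
        while steps > 0 and cost > 0:
            r, c = rng.randrange(N), rng.randrange(k)   # mutate one random cell
            old = A[r][c]
            A[r][c] = rng.randrange(v)
            new_cost = uncovered(A, t, v)
            delta = new_cost - cost
            if delta <= 0 or rng.random() < math.exp(-delta / max(T, 1e-9)):
                cost = new_cost          # accept (always for improvements,
            else:                        #  probabilistically for worsenings)
                A[r][c] = old            # reject: undo the mutation
            T *= cooling
            steps -= 1
        return A, cost

    # e.g. anneal_ca(6, 4, 2) searches for 6 test cases covering all pairwise
    # interactions of 4 binary parameters; cost == 0 means full coverage.
    ```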

  3. Optimal Placement and Sizing of Fault Current Limiters in Distributed Generation Systems Using a Hybrid Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    N. Bayati

    2017-02-01

    Full Text Available Distributed Generation (DG) connection in a power system tends to increase the short-circuit level in the entire system, which, in turn, could eliminate the protection coordination between the existing relays. Fault Current Limiters (FCLs) are often used to reduce the short-circuit level of the network to a desirable level, provided that they are duly placed and appropriately sized. In this paper, a method is proposed for the optimal placement of FCLs and the optimal determination of their impedance values, by which the relay operation time and the number and sizes of the FCLs are minimized while maintaining relay coordination before and after DG connection. The proposed method removes low-impact FCLs and uses a hybrid Genetic Algorithm (GA) optimization scheme to determine the optimal placement of FCLs and the values of their impedances. The suitability of the proposed method is demonstrated by examining the results of relay coordination in a typical DG network before and after DG connection.
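
    The GA formulation described above can be sketched on a deliberately tiny toy model. Everything below is an assumption for illustration, not the paper's network or fitness function: a single-source system where the fault current at each bus is V / (z_line + z_fcl), a discrete set of candidate FCL impedances (0.0 meaning "no FCL"), and a fitness that penalizes limit violations first, then FCL count and total size.

    ```python
    import random

    # Toy single-source model: fault current at bus i is V / (z_line[i] + z_fcl[i]).
    V = 1.0
    Z_LINE = [0.04, 0.05, 0.08, 0.06]      # assumed per-bus base impedances (pu)
    I_LIMIT = 15.0                          # assumed maximum tolerable fault current (pu)
    Z_CHOICES = [0.0, 0.02, 0.05, 0.1]      # candidate FCL impedances; 0.0 = no FCL

    def fitness(chrom):
        """Lower is better: penalize limit violations, then FCL count and total size."""
        penalty = 0.0
        for z_base, z_fcl in zip(Z_LINE, chrom):
            i_fault = V / (z_base + z_fcl)
            if i_fault > I_LIMIT:
                penalty += 100.0 * (i_fault - I_LIMIT)
        n_fcl = sum(1 for z in chrom if z > 0)
        return penalty + n_fcl + sum(chrom)

    def ga(pop_size=40, gens=100, seed=7):
        """Plain GA: truncation selection, one-point crossover, point mutation."""
        rng = random.Random(seed)
        pop = [[rng.choice(Z_CHOICES) for _ in Z_LINE] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness)
            elite = pop[: pop_size // 2]              # keep the better half
            children = []
            while len(children) < pop_size - len(elite):
                a, b = rng.sample(elite, 2)
                cut = rng.randrange(1, len(Z_LINE))
                child = a[:cut] + b[cut:]             # one-point crossover
                if rng.random() < 0.2:                # point mutation
                    child[rng.randrange(len(child))] = rng.choice(Z_CHOICES)
                children.append(child)
            pop = elite + children
        return min(pop, key=fitness)
    ```

    The large violation penalty makes any coordination-preserving (feasible) placement dominate any infeasible one, so minimizing fitness first restores the current limits and then trims FCL count and size, mirroring the paper's objective ordering.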

  4. Algorithm of Functional Musculoskeletal Disorders Diagnostics

    Directory of Open Access Journals (Sweden)

    Alexandra P. Eroshenko

    2012-04-01

    Full Text Available The article scientifically justifies an algorithm for the complex diagnostics of functional musculoskeletal disorders during resort treatment, aimed at the optimal application of modern methods of physical rehabilitation (formation of correction programs) based on diagnostic findings

  5. Comparative analysis of time efficiency and spatial resolution between different EIT reconstruction algorithms

    International Nuclear Information System (INIS)

    Kacarska, Marija; Loskovska, Suzana

    2002-01-01

    In this paper a comparative analysis of different EIT reconstruction algorithms is presented. The analysis covers the spatial and temporal resolution of the images obtained by several different algorithms. The discussion also considers the dependence of spatial resolution on the data acquisition method. The results show that conventional applied-current EIT is more powerful than induced-current EIT. (Author)

  6. Evidence-based medicine is affordable: the cost-effectiveness of current compared with optimal treatment in rheumatoid and osteoarthritis.

    Science.gov (United States)

    Andrews, Gavin; Simonella, Leonardo; Lapsley, Helen; Sanderson, Kristy; March, Lyn

    2006-04-01

    To determine the cost-effectiveness of averting the burden of disease. We used secondary population data and meta-analyses of various government-funded services and interventions to investigate the costs and benefits of various levels of treatment for rheumatoid arthritis (RA) and osteoarthritis (OA) in adults using a burden of disease framework. Population burden was calculated for both diseases in the absence of any treatment as years lived with disability (YLD), ignoring the years of life lost. We then estimated the proportion of burden averted with current interventions, the proportion that could be averted with optimally implemented current evidence-based guidelines, and the direct treatment cost-effectiveness ratio in dollars per YLD averted for both treatment levels. The majority of people with arthritis sought medical treatment. Current treatment for RA averted 26% of the burden, with a cost-effectiveness ratio of $19,000 per YLD averted. Optimal, evidence-based treatment would avert 48% of the burden, with a cost-effectiveness ratio of $12,000 per YLD averted. Current treatment of OA in Australia averted 27% of the burden, with a cost-effectiveness ratio of $25,000 per YLD averted. Optimal, evidence-based treatment would avert 39% of the burden, with an unchanged cost-effectiveness ratio of $25,000 per YLD averted. While the precise dollar costs in each country will differ, the relativities at this level of coverage should remain the same. There is no evidence that closing the gap between evidence and practice would result in a drop in efficiency.
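
    The cost-effectiveness arithmetic used in this record reduces to one ratio: direct treatment cost divided by the YLDs averted (total burden times the fraction averted). The figures in the example below are hypothetical, chosen only to reproduce the $19,000-per-YLD shape of the calculation; they are not the study's actual inputs.

    ```python
    def cost_per_yld_averted(total_burden_yld, fraction_averted, direct_cost):
        """Cost-effectiveness ratio: dollars spent per year lived with disability (YLD) averted."""
        return direct_cost / (total_burden_yld * fraction_averted)

    # Hypothetical inputs: 1,000 YLD of untreated burden, 26% averted,
    # $4.94M direct treatment cost -> $19,000 per YLD averted.
    ratio = cost_per_yld_averted(1000, 0.26, 4_940_000)   # → 19000.0
    ```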

  7. An Alternative Route to Teaching Fraction Division: Abstraction of Common Denominator Algorithm

    Science.gov (United States)

    Zembat, Ismail Özgür

    2015-01-01

    From a curricular standpoint, the traditional invert-and-multiply algorithm for division of fractions provides few affordances for linking to a rich understanding of fractions. On the other hand, an alternative algorithm, called the common denominator algorithm, has many such affordances. The current study serves as an argument for shifting…
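
    The common denominator algorithm the record refers to can be sketched directly: rewrite both fractions over a shared denominator, after which the quotient is simply the ratio of the numerators (a/b ÷ c/d = (aL/b) / (cL/d) for common denominator L). This is a minimal illustration of the idea, not the study's instructional material.

    ```python
    from fractions import Fraction
    from math import lcm

    def divide_common_denominator(a, b, c, d):
        """Divide a/b by c/d by first rewriting both over a common denominator L.

        Over L, a/b = (a*L/b)/L and c/d = (c*L/d)/L, so the quotient is just
        the ratio of the two numerators."""
        L = lcm(b, d)                  # common denominator
        num1 = a * (L // b)            # numerator of a/b rewritten over L
        num2 = c * (L // d)            # numerator of c/d rewritten over L
        return Fraction(num1, num2)    # (num1/L) / (num2/L) = num1/num2

    # 3/4 ÷ 1/2: over the common denominator 4 this is 3/4 ÷ 2/4 = 3/2.
    result = divide_common_denominator(3, 4, 1, 2)   # → Fraction(3, 2)
    ```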

  8. Stochastic Recursive Algorithms for Optimization Simultaneous Perturbation Methods

    CERN Document Server

    Bhatnagar, S; Prashanth, L A

    2013-01-01

    Stochastic Recursive Algorithms for Optimization presents algorithms for constrained and unconstrained optimization and for reinforcement learning. Efficient perturbation approaches form a thread unifying all the algorithms considered. Simultaneous perturbation stochastic approximation and smooth fractional estimators for gradient- and Hessian-based methods are presented. These algorithms: • are easily implemented; • do not require an explicit system model; and • work with real or simulated data. Chapters on their application in service systems, vehicular traffic control and communications networks illustrate this point. The book is self-contained with necessary mathematical results placed in an appendix. The text provides easy-to-use, off-the-shelf algorithms that are given detailed mathematical treatment so the material presented will be of significant interest to practitioners, academic researchers and graduate students alike. The breadth of applications makes the book appropriate for reader from sim...

  9. Efficient motif finding algorithms for large-alphabet inputs

    Directory of Open Access Journals (Sweden)

    Pavlovic Vladimir

    2010-10-01

    Full Text Available Abstract Background We consider the problem of identifying motifs, recurring or conserved patterns, in biological sequence data sets. To solve this task, we present a new deterministic algorithm for finding patterns that are embedded as exact or inexact instances in all or most of the input strings. Results The proposed algorithm (1) improves search efficiency compared to existing algorithms, and (2) scales well with the size of the alphabet. On a synthetic planted DNA motif finding problem our algorithm is over 10× more efficient than MITRA, PMSPrune, and RISOTTO for long motifs. Improvements are orders of magnitude higher in the same setting with large alphabets. On benchmark TF-binding site problems (FNP, CRP, LexA) we observed a reduction in running time of over 12×, with high detection accuracy. The algorithm was also successful in rapidly identifying protein motifs in the Lipocalin and Zinc metallopeptidase families, and supersecondary structure motifs for the Cadherin and Immunoglobulin families. Conclusions Our algorithm reduces the computational complexity of current motif finding algorithms and demonstrates strong running-time improvements over existing exact algorithms, especially in the important and difficult case of large-alphabet sequences.
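
    To make the (l, d) motif-finding task concrete, here is a brute-force baseline, not the paper's algorithm: enumerate every candidate l-mer over the alphabet and keep those that occur within Hamming distance d in every input sequence. Its O(|alphabet|^l) cost is exactly what the efficient algorithms above are designed to avoid, which is why large alphabets are the hard case.

    ```python
    from itertools import product

    def hamming(p, q):
        """Number of positions at which two equal-length strings differ."""
        return sum(a != b for a, b in zip(p, q))

    def find_motifs(seqs, l, d, alphabet="ACGT"):
        """Return all l-mers over the alphabet that appear within Hamming
        distance d in every input sequence. Brute force: O(|alphabet|^l)
        candidates, so only practical for small l and small alphabets."""
        hits = []
        for cand in map("".join, product(alphabet, repeat=l)):
            if all(any(hamming(cand, s[i:i + l]) <= d
                       for i in range(len(s) - l + 1)) for s in seqs):
                hits.append(cand)
        return hits
    ```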

  10. A decision algorithm for patch spraying

    DEFF Research Database (Denmark)

    Christensen, Svend; Heisel, Torben; Walter, Mette

    2003-01-01

    A method that estimates an economically optimal herbicide dose according to site-specific weed composition and density is presented in this paper. The method was termed a ‘decision algorithm for patch spraying’ (DAPS) and was evaluated in a 5-year experiment in Denmark. DAPS consists of a competition model, a herbicide dose–response model and an algorithm that estimates the economically optimal doses. The experiment was designed to compare herbicide treatments with DAPS recommendations and the Danish decision support system PC-Plant Protection. The results did not show any significant grain yield difference…

  11. Clinical algorithms to aid osteoarthritis guideline dissemination

    DEFF Research Database (Denmark)

    Meneses, S. R. F.; Goode, A. P.; Nelson, A. E

    2016-01-01

    Background: Numerous scientific organisations have developed evidence-based recommendations aiming to optimise the management of osteoarthritis (OA). Uptake, however, has been suboptimal. The purpose of this exercise was to harmonize the recent recommendations and develop a user-friendly treatment algorithm to facilitate translation of evidence into practice. Methods: We updated a previous systematic review on clinical practice guidelines (CPGs) for OA management. The guidelines were assessed using the Appraisal of Guidelines for Research and Evaluation for quality and the standards for developing… to facilitate the implementation of guidelines in clinical practice are necessary. The algorithms proposed are examples of how to apply recommendations in the clinical context, helping the clinician to visualise the patient flow and timing of different treatment modalities. (C) 2016 Osteoarthritis Research…

  12. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    Science.gov (United States)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which has cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project was unable to be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
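
    One common way to generate procedural cloud textures, offered here only as a sketch and not as the IGOAL's actual algorithm, is fractal value noise: bilinearly interpolate a random lattice, then sum several octaves at increasing frequency and decreasing amplitude. The parameters (octave count, lattice sizes, 0.5 amplitude falloff) are illustrative assumptions.

    ```python
    import numpy as np

    def value_noise(size, cells, rng):
        """One octave of value noise: a (cells+1)x(cells+1) random lattice,
        sampled with smoothstep-faded bilinear interpolation."""
        lattice = rng.random((cells + 1, cells + 1))
        xs = np.linspace(0, cells, size, endpoint=False)
        i = xs.astype(int)                 # lattice cell index per pixel
        f = xs - i                         # fractional position in the cell
        f = f * f * (3 - 2 * f)            # smoothstep fade for C1 continuity
        x0, y0 = np.meshgrid(i, i, indexing="ij")
        fx, fy = np.meshgrid(f, f, indexing="ij")
        a = lattice[x0, y0]                # four surrounding lattice corners
        b = lattice[x0 + 1, y0]
        c = lattice[x0, y0 + 1]
        d = lattice[x0 + 1, y0 + 1]
        return (a * (1 - fx) + b * fx) * (1 - fy) + (c * (1 - fx) + d * fx) * fy

    def cloud_texture(size=128, octaves=4, seed=0):
        """Sum octaves of value noise (fractal Brownian motion), normalized to [0, 1]."""
        rng = np.random.default_rng(seed)
        img = np.zeros((size, size))
        amp, total = 1.0, 0.0
        for o in range(octaves):
            img += amp * value_noise(size, 2 ** (o + 2), rng)
            total += amp
            amp *= 0.5                     # each octave contributes half as much
        return img / total
    ```

    Thresholding or remapping the resulting field gives the soft, dynamically regenerable cloud shapes that a static texture map cannot provide.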

  13. Treatment planning optimization for linear accelerator radiosurgery

    International Nuclear Information System (INIS)

    Meeks, Sanford L.; Buatti, John M.; Bova, Francis J.; Friedman, William A.; Mendenhall, William M.

    1998-01-01

    Purpose: Linear accelerator radiosurgery uses multiple arcs delivered through circular collimators to produce a nominally spherical dose distribution. Production of dose distributions that conform to irregular lesions or conformally avoid critical neural structures requires a detailed understanding of the available treatment planning parameters. Methods and Materials: Treatment planning parameters that may be manipulated within a single isocenter to provide conformal avoidance and dose conformation to ellipsoidal lesions include differential arc weighting and gantry start/stop angles. More irregular lesions require the use of multiple isocenters. Iterative manipulation of treatment planning variables can be difficult and computationally expensive, especially if the effects of these manipulations are not well defined. The effects of treatment parameter manipulation are explained and illustrated. This is followed by a description of the University of Florida Stereotactic Radiosurgery Treatment Planning Algorithm. This algorithm organizes the manipulations into a practical approach for radiosurgery treatment planning. Results: Iterative treatment planning parameters may be efficiently manipulated to achieve optimal treatment plans by following the University of Florida Treatment Planning Algorithm. The ability to produce conformal stereotactic treatment plans using the algorithm is demonstrated for a variety of clinical presentations. Conclusion: The standard dose distribution produced in linear accelerator radiosurgery is spherical, but manipulation of available treatment planning parameters may result in optimal dose conformation. The University of Florida Treatment Planning Algorithm organizes available treatment parameters to efficiently produce conformal radiosurgery treatment plans.

  14. Clarus quality checking algorithm documentation report.

    Science.gov (United States)

    2010-12-21

    With funding and support from the USDOT RITA IntelliDrive(SM) initiative and direction from the FHWA Road Weather Management Program, NCAR enhanced QCh algorithms that are a part of the current Clarus System. Moreover, NCAR developed new QCh algorith...

  15. Current approaches to treatments for schizophrenia spectrum disorders, part I: an overview and medical treatments

    Directory of Open Access Journals (Sweden)

    Chien WT

    2013-09-01

    Full Text Available Wai Tong Chien, Annie LK Yip School of Nursing, Faculty of Health and Social Sciences, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong Abstract: During the last three decades, an increasing understanding of the etiology, psychopathology, and clinical manifestations of schizophrenia spectrum disorders, in addition to the introduction of second-generation antipsychotics, has optimized the potential for recovery from the illness. Continued development of various models of psychosocial intervention has shifted the goal of schizophrenia treatment from symptom control and social adaptation to an optimal restoration of functioning and/or recovery. However, it is still questionable whether these new treatment approaches can address the patients' needs for treatment and services and contribute to better patient outcomes. This article provides an overview of different treatment approaches currently used in schizophrenia spectrum disorders to address complex health problems and a wide range of abnormalities and impairments resulting from the illness. There are different treatment strategies and targets for patients at different stages of the illness, ranging from prophylactic antipsychotics and cognitive–behavioral therapy in the premorbid stage to various psychosocial interventions in addition to antipsychotics for relapse prevention and rehabilitation in the later stages of the illness. The use of antipsychotics alone as the main treatment modality may be limited not only in being unable to tackle the frequently occurring negative symptoms and cognitive impairments but also in producing a wide variety of adverse effects on the body or organ functioning. Because of varied pharmacokinetics and treatment responsiveness across agents, the medication regimen should be determined on an individual basis to ensure an optimal effect in its long-term use. This review also highlights that the recent practice guidelines and standards have

  16. Guidelines and algorithms for managing the difficult airway.

    Science.gov (United States)

    Gómez-Ríos, M A; Gaitini, L; Matter, I; Somri, M

    2018-01-01

    The difficult airway constitutes a continuous challenge for anesthesiologists. Guidelines and algorithms are key to preserving patient safety, by recommending specific plans and strategies that address an anticipated or unexpected difficult airway. However, there are currently no "gold standard" algorithms or universally accepted standards. The aim of this article is to present a synthesis of the recommendations of the main guidelines and difficult airway algorithms. Copyright © 2017 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.

  17. Path Planning Algorithms for the Adaptive Sensor Fleet

    Science.gov (United States)

    Stoneking, Eric; Hosler, Jeff

    2005-01-01

    The Adaptive Sensor Fleet (ASF) is a general purpose fleet management and planning system being developed by NASA in coordination with NOAA. The current mission of ASF is to provide the capability for autonomous cooperative survey and sampling of dynamic oceanographic phenomena such as current systems and algae blooms. Each ASF vessel is a software model that represents a real world platform that carries a variety of sensors. The OASIS platform will provide the first physical vessel, outfitted with the systems and payloads necessary to execute the oceanographic observations described in this paper. The ASF architecture is being designed for extensibility to accommodate heterogeneous fleet elements, and is not limited to using the OASIS platform to acquire data. This paper describes the path planning algorithms developed for the acquisition phase of a typical ASF task. Given a polygonal target region to be surveyed, the region is subdivided according to the number of vessels in the fleet. The subdivision algorithm seeks a solution in which all subregions have equal area and minimum mean radius. Once the subregions are defined, a dynamic programming method is used to find a minimum-time path for each vessel from its initial position to its assigned region. This path plan includes the effects of water currents as well as avoidance of known obstacles. A fleet-level planning algorithm then shuffles the individual vessel assignments to find the overall solution which puts all vessels in their assigned regions in the minimum time. This shuffle algorithm may be described as a process of elimination on the sorted list of permutations of a cost matrix. All these path planning algorithms are facilitated by discretizing the region of interest onto a hexagonal tiling.
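
    The fleet-level shuffle step can be sketched as a search over permutations of a vessel-to-region cost (transit-time) matrix, minimizing the time until the last vessel reaches its region. This is a minimal brute-force stand-in for the described process-of-elimination algorithm; the transit times below are hypothetical.

    ```python
    from itertools import permutations

    def best_assignment(cost):
        """Assign each vessel (row) to a distinct region (column) so that the
        time until the LAST vessel reaches its region is minimal. Exhaustive
        search over all permutations, which is fine for small fleets."""
        n = len(cost)
        best, best_perm = float("inf"), None
        for perm in permutations(range(n)):
            makespan = max(cost[v][perm[v]] for v in range(n))
            if makespan < best:
                best, best_perm = makespan, perm
        return best_perm, best

    # Hypothetical transit times (hours) from each vessel to each region.
    times = [[4, 9, 7],
             [6, 3, 8],
             [5, 8, 2]]
    # best_assignment(times) → ((0, 1, 2), 4): the identity assignment puts
    # every vessel in place within 4 hours.
    ```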

  18. Algorithmic Issues in Modeling Motion

    DEFF Research Database (Denmark)

    Agarwal, P. K; Guibas, L. J; Edelsbrunner, H.

    2003-01-01

    This article is a survey of research areas in which motion plays a pivotal role. The aim of the article is to review current approaches to modeling motion together with related data structures and algorithms, and to summarize the challenges that lie ahead in producing a more unified theory of mot...

  19. Personalized therapy algorithms for type 2 diabetes: a phenotype-based approach

    Directory of Open Access Journals (Sweden)

    Ceriello A

    2014-06-01

    Full Text Available Antonio Ceriello,1,2 Marco Gallo,3 Riccardo Candido,4 Alberto De Micheli,5 Katherine Esposito,6 Sandro Gentile,6 Gerardo Medea71Department of Endocrinology, Hospital Clinic de Barcelona, Institut d’Investigacions Biomèdiques August Pi i Sunyer, 2Centro de Investigacion Biomèdica en Red de Diabetes y Enfermedades Metabolicas Asociadas, Barcelona, Spain; 3Oncological Endocrinology, AOU Città della Salute e della Scienza-Molinette, Turin, 4Diabetes Center, ASS 1 Triestina, Trieste, 5Ligurian Health Agency, Genoa, 6Department of Clinical and Experimental Medicine, Second University of Naples, Naples, 7Italian College of General Practitioners, Florence, ItalyAbstract: Type 2 diabetes is a progressive disease with a complex and multifactorial pathophysiology. Patients with type 2 diabetes show a variety of clinical features, including different "phenotypes" of hyperglycemia (eg, fasting/preprandial or postprandial). Thus, the best treatment choice is sometimes difficult to make, and treatment initiation or optimization is postponed. This situation may explain why, despite the existing complex therapeutic armamentarium and guidelines for the treatment of type 2 diabetes, a significant proportion of patients do not have good metabolic control and are at risk of developing the late complications of diabetes. The Italian Association of Medical Diabetologists has developed an innovative personalized algorithm for the treatment of type 2 diabetes, which is available online. According to the main features shown by the patient, six algorithms are proposed, according to glycated hemoglobin (HbA1c >9% or ≤9%), body mass index (<30 kg/m2 or ≥30 kg/m2), occupational risk potentially related to hypoglycemia, chronic renal failure, and frail elderly status. Through self-monitoring of blood glucose, patients are phenotyped according to the occurrence of fasting/preprandial or postprandial hyperglycemia. In each of these six algorithms, the gradual choice of

  20. Spinal cord stimulation: Current applications for treatment of chronic pain.

    Science.gov (United States)

    Vannemreddy, Prasad; Slavin, Konstantin V

    2011-01-01

    Spinal cord stimulation (SCS) is thought to relieve chronic intractable pain by stimulating nerve fibers in the spinal cord. The resulting impulses in the fibers may inhibit the conduction of pain signals to the brain, according to the pain gate theory proposed by Melzack and Wall in 1965, thus blocking the sensation of pain. Although SCS may reduce pain, it will not eliminate it. After a period of concern about safety and efficacy, SCS is now regaining popularity among pain specialists for the treatment of chronic pain. The sympatholytic effect of SCS is one of its most interesting therapeutic properties. This effect is considered responsible for the effectiveness of SCS in peripheral ischemia, and at least some cases of complex regional pain syndrome. The sympatholytic effect has also been considered part of the management of other chronic pain states such as failed back surgery syndrome, phantom pain, diabetic neuropathy, and postherpetic neuralgia. In general, SCS is part of an overall treatment strategy and is used only after the more conservative treatments have failed. The concept of SCS has evolved rapidly following the technological advances that have produced leads with multiple contact electrodes and battery systems. The current prevalence of patients with chronic pain requiring treatment other than conventional medical management has significantly increased, and so has the need for SCS. With cost-benefit analysis showing significant support for SCS, it may be appropriate to offer this as an effective alternative treatment for these patients.

  1. Individualized Treatment for Tobacco Dependence in Addictions Treatment Settings: The Role of Current Depressive Symptoms on Outcomes at 3 and 6 Months.

    Science.gov (United States)

    Zawertailo, Laurie A; Baliunas, Dolly; Ivanova, Anna; Selby, Peter L

    2015-08-01

    Individuals with concurrent tobacco dependence and other addictions often report symptoms of low mood and depression and as such may have more difficulty quitting smoking. We hypothesized that current symptoms of depression would be a significant predictor of quit success among a group of smokers receiving individualized treatment for tobacco dependence within addiction treatment settings. Individuals in treatment for other addictions were enrolled in a smoking cessation program involving brief behavioral counseling and individualized dosing of nicotine replacement therapy. The baseline assessment included the Patient Health Questionnaire (PHQ9) for depression. Smoking cessation outcomes were measured at 3 and 6 months post-enrollment. Bivariate associations between cessation outcomes and PHQ9 score were analyzed. Of the 1,196 subjects enrolled to date, 1,171 (98%) completed the PHQ9. Moderate to severe depression (score >9) was reported by 28% of the sample, and another 29% reported mild depression (score between 5 and 9). Contrary to the extant literature and other findings by our own group, there was no association between current depression and cessation outcome at either 3 months (n = 1,171) (17.0% in those with PHQ9 > 9 vs. 19.8% in those with PHQ9 ≤ 9) in this addictions treatment setting. These data indicate that patients in an addictions treatment setting can successfully quit smoking regardless of current depressive symptoms. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Current technologies for biological treatment of textile wastewater--a review.

    Science.gov (United States)

    Sarayu, K; Sandhya, S

    2012-06-01

    The release of colored wastewater represents a serious environmental problem and public health concern. Color removal from textile wastewater has become a big challenge over the last decades, and to date there is no single, economically attractive treatment method that can effectively decolorize the wastewater. Effluents from textile manufacturing, dyeing, and finishing processes contain high concentrations of biologically difficult-to-degrade or even inert auxiliaries, chemicals like acids, waxes, fats, salts, binders, thickeners, urea, surfactants, reducing agents, etc. The various chemicals such as biocides and stain repellents used for brightening, sequestering, anticreasing, sizing, softening, and wetting of the yarn or fabric are also present in wastewater. Therefore, textile wastewater requires an environmentally friendly and effective treatment process. This paper provides a critical review of the current technology available for decolorization and degradation of textile wastewater and also suggests effective and economically attractive alternatives.

  3. A Multifactorial, Criteria-based Progressive Algorithm for Hamstring Injury Treatment.

    Science.gov (United States)

    Mendiguchia, Jurdan; Martinez-Ruiz, Enrique; Edouard, Pascal; Morin, Jean-Benoît; Martinez-Martinez, Francisco; Idoate, Fernando; Mendez-Villanueva, Alberto

    2017-07-01

    Given the prevalence of hamstring injuries in football, a rehabilitation program that effectively promotes muscle tissue repair and functional recovery is paramount to minimize reinjury risk and optimize player performance and availability. This study aimed to assess the effectiveness of administering an individualized and multifactorial criteria-based algorithm (rehabilitation algorithm [RA]) for hamstring injury rehabilitation in comparison with a general rehabilitation protocol (RP). Implementing a double-blind randomized controlled trial approach, two equal groups of 24 football players (48 in total) were allocated to either the RA group or a validated RP group, starting 5 d after an acute hamstring injury. Within 6 months after return to sport, six hamstring reinjuries occurred in RP versus one injury in RA (relative risk = 6, 90% confidence interval = 1-35; clinical inference: very likely beneficial effect). The average time to return to sport was possibly shorter (effect size = 0.34 ± 0.42) in RP (23.2 ± 11.7 d) compared with RA (25.5 ± 7.8 d) (-13.8%, 90% confidence interval = -34.0% to 3.4%; clinical inference: possibly small effect). At the time of return to sport, RA players showed substantially better 10-m time, maximal sprinting speed, and greater mechanical variables related to speed (i.e., maximum theoretical speed and maximal horizontal power) than the RP players. Although return to sport was slower, male football players who underwent an individualized, multifactorial, criteria-based algorithm with a performance- and primary risk factor-oriented training program from the early stages of the process markedly decreased the risk of reinjury compared with a general protocol where long-length strength training exercises were prioritized.

  4. Implementation and evaluation of an algorithm-based order set for the outpatient treatment of urinary tract infections in the spinal cord injury population in a VA Medical Center.

    Science.gov (United States)

    Patros, Clayton; Sabol, Mirella; Paniagua, Angela; Lans, Daniel

    2018-03-01

    Treatment of urinary tract infections (UTI) in the spinal cord injury (SCI) population is often difficult due to the lack of symptoms, increased resistance, and increased morbidity and mortality associated with UTIs. To develop an algorithm-based order set for the treatment of UTIs for patients with SCI based on SCI-specific antibiogram data in order to assess and improve current antimicrobial prescribing practices at the Clement J. Zablocki Veterans Affairs Medical Center (ZVAMC). This study is a retrospective, pre- and post-implementation analysis of an order set based on SCI antibiogram data. Descriptive statistics were used to compare baseline data and characteristics, and chi-squared tests were used to evaluate the primary outcome and all secondary outcomes. To achieve a power of 80% with an effect size of 0.3, the goal was to assess 45 antimicrobial treatment courses in the pre-implementation group and 45 antimicrobial treatment courses in the post-implementation group. The percentage of appropriate antimicrobial treatment courses increased from 47.9% in the pre-intervention group (n = 73) to 71.8% in the post-intervention group (n = 39), which was statistically significant (P = 0.015). Patients with SCI treated for UTIs within the ZVAMC had a significantly higher percentage of appropriate treatment courses following the implementation of a unit-specific antibiogram, electronic order set, and educational in-service for providers. An order set and unit-specific antibiogram with related education may be beneficial in improving antimicrobial therapy from a stewardship perspective.
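
    The chi-squared comparison reported above can be reproduced on a 2x2 table. The counts below are reconstructed from the reported percentages (47.9% of 73 ≈ 35 appropriate pre; 71.8% of 39 ≈ 28 appropriate post), so they are approximate; the test itself is the standard Pearson chi-squared statistic with one degree of freedom and no continuity correction.

    ```python
    from math import erfc, sqrt

    def chi2_2x2(table):
        """Pearson chi-squared test (df = 1, no continuity correction) for a 2x2 table."""
        (a, b), (c, d) = table
        n = a + b + c + d
        rows, cols = (a + b, c + d), (a + c, b + d)
        stat = 0.0
        for i, obs in enumerate((a, b, c, d)):
            exp = rows[i // 2] * cols[i % 2] / n   # expected count under independence
            stat += (obs - exp) ** 2 / exp
        p = erfc(sqrt(stat / 2))                   # chi-square(1) survival function
        return stat, p

    # Appropriate vs. inappropriate courses: pre-implementation (n=73) and
    # post-implementation (n=39), counts reconstructed from the percentages.
    pre = (35, 38)    # ~47.9% appropriate
    post = (28, 11)   # ~71.8% appropriate
    stat, p = chi2_2x2((pre, post))   # p comes out near the reported P = 0.015
    ```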

  5. Automatic treatment plan re-optimization for adaptive radiotherapy guided with the initial plan DVHs

    International Nuclear Information System (INIS)

    Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Graves, Yan Jiang; Gautier, Quentin; Mell, Loren; Jia, Xun; Jiang, Steve; Zhou, Linghong

    2013-01-01

    Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. For the re-planning process, a manual trial-and-error approach to fine-tune planning parameters is time-consuming and is usually considered impractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal intervention. In ART, prior information in the original plan is available, such as the dose–volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates the following two loops. The inner loop is the traditional fluence-map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose, with a set of fixed voxel weighting factors. In the outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those in the original plan. The process is repeated until the DVH curves are acceptable or the maximum iteration count is reached. The whole algorithm is implemented on GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of the treatment course. Compared with the DVH curves in the original plan, the DVH curves in the resulting plan using our algorithm with 30 iterations are better for almost all structures. The re-optimization process takes about 30
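
    The two-loop structure described above can be sketched on a toy 1D problem, not the paper's GPU implementation: an inner loop of projected gradient descent on the weighted quadratic fluence-map objective, and an outer loop that raises the penalty weight of voxels whose dose exceeds the original plan's. The dose matrix, weight-update rule (multiply by 1.5), and a per-voxel reference dose standing in for the DVH comparison are all assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_vox, n_beam = 50, 10
    A = rng.random((n_vox, n_beam))      # toy dose deposition matrix
    t = np.full(n_vox, 1.0)              # prescribed per-voxel dose
    ref = A @ rng.random(n_beam)         # "original plan" doses (stand-in for its DVH)

    def inner_fmo(w, iters=300):
        """Inner loop: weighted quadratic fluence-map optimization by projected
        gradient descent, keeping beamlet intensities non-negative."""
        x = np.zeros(n_beam)
        lr = 1.0 / (2 * np.linalg.norm(A, 2) ** 2 * w.max())   # safe step size
        for _ in range(iters):
            grad = 2 * A.T @ (w * (A @ x - t))
            x = np.maximum(x - lr * grad, 0.0)                 # project onto x >= 0
        return x

    def replan(outer_iters=5):
        """Outer loop: raise the weight of voxels dosed above the original plan,
        steering the new plan's per-voxel doses toward the reference."""
        w = np.ones(n_vox)
        for _ in range(outer_iters):
            x = inner_fmo(w)
            dose = A @ x
            w = np.where(dose > ref, w * 1.5, w)   # penalize voxels worse than reference
        return inner_fmo(w)
    ```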

  6. Named Entity Linking Algorithm

    Directory of Open Access Journals (Sweden)

    M. F. Panteleev

    2017-01-01

    Full Text Available In tasks of natural language text processing, Named Entity Linking (NEL) is the task of identifying an entity mentioned in the text and linking it to an entity in a knowledge base (for example, DBpedia). Currently, there is a diversity of approaches to this problem, but two main classes can be identified: graph-based approaches and machine learning-based ones. An algorithm combining graph-based and machine learning approaches is proposed, following the stated assumptions about the interrelations of named entities within a sentence and in general. In the case of graph-based approaches, it is necessary to identify an optimal set of related entities according to some metric that characterizes the distance between these entities in a graph built over a knowledge base. Due to limitations in processing power, solving this task directly is impossible, so a modification is proposed. An independent solution cannot be built on machine learning algorithms alone, due to the small volumes of training datasets relevant to the NEL task; however, their use can contribute to improving the quality of the algorithm. An adaptation of the Latent Dirichlet Allocation model is proposed in order to obtain a measure of the compatibility of attributes of various entities encountered in one context. The efficiency of the proposed algorithm was tested experimentally. A test dataset was independently generated, and on its basis the performance of a model using the proposed algorithm was compared with the open source product DBpedia Spotlight, which solves the NEL problem. The mockup based on the proposed algorithm showed low speed compared to DBpedia Spotlight; however, its higher accuracy makes further work in this direction promising. The main directions of development are proposed in order to increase the accuracy and throughput of the system.

  7. Application for verification of monitor units of the treatment planning system

    International Nuclear Information System (INIS)

    Suero Rodrigo, M. A.; Marques Fraguela, E.

    2011-01-01

    Current estimation algorithms achieve an acceptable degree of accuracy; however, they operate on the basis of unintuitive models. It is therefore necessary to verify the monitor units calculated by the treatment planning system (TPS) against those obtained by other, independent formalisms. To this end, we have developed an application based on a factorization formalism that automates the dose calculation.

  8. Algorithms for the optimization of RBE-weighted dose in particle therapy.

    Science.gov (United States)

    Horcicka, M; Meyer, C; Buschbacher, A; Durante, M; Krämer, M

    2013-01-21

    We report on various algorithms used for the nonlinear optimization of RBE-weighted dose in particle therapy. For the dose calculation, carbon ions are considered and biological effects are calculated with the Local Effect Model. Taking biological effects fully into account requires iterative methods to solve the optimization problem. We implemented several additional algorithms in GSI's treatment planning system TRiP98, such as the BFGS algorithm and the method of conjugate gradients, in order to investigate their computational performance. We modified textbook iteration procedures to improve the convergence speed. The performance of the algorithms is presented as convergence in terms of iterations and computation time. We found that the Fletcher-Reeves variant of the method of conjugate gradients has the best computational performance. With this algorithm we could speed up computation times by a factor of 4 compared to the method of steepest descent, which was used before. With our new methods it is possible to optimize complex treatment plans in a few minutes, leading to good dose distributions. Finally, we discuss future goals concerning dose optimization in particle therapy which might benefit from fast optimization solvers.
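The Fletcher-Reeves update the abstract singles out can be sketched on a toy convex objective; the quadratic below is an illustrative stand-in, not the RBE-weighted dose model:

```python
import numpy as np

def fletcher_reeves(f, grad_f, x0, max_iter=200, tol=1e-6):
    """Nonlinear conjugate gradients with the Fletcher-Reeves beta
    and a backtracking (Armijo) line search."""
    x = x0.copy()
    g = grad_f(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                     # safeguard: restart on non-descent direction
            d = -g
        t, f0, slope = 1.0, f(x), g @ d
        while f(x + t * d) > f0 + 1e-4 * t * slope:
            t *= 0.5                       # backtrack until the Armijo condition holds
        x_new = x + t * d
        g_new = grad_f(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Toy strictly convex quadratic: f(x) = 0.5 x^T Q x - b^T x.
rng = np.random.default_rng(1)
M = rng.random((8, 8))
Q = M @ M.T + 8.0 * np.eye(8)              # symmetric positive definite
b = rng.random(8)
x_star = fletcher_reeves(lambda x: 0.5 * x @ Q @ x - b @ x,
                         lambda x: Q @ x - b, np.zeros(8))
```

On a quadratic like this the minimizer satisfies Q x* = b; the reuse of the previous search direction through beta is what gives CG its advantage over steepest descent.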

  9. Azelaic acid in dermatological treatment - current state of knowledge

    Directory of Open Access Journals (Sweden)

    Radomir Reszke

    2016-09-01

    Full Text Available Azelaic acid (AZA) is a naturally occurring substance produced by Malassezia furfur which exerts various effects on the skin. Azelaic acid has antibacterial, anti-inflammatory, keratolytic, comedolytic, sebostatic and tyrosinase-inhibiting properties. Topical application of AZA as a 20% cream or 15% gel is a well-established therapeutic method in various common dermatoses, mainly acne vulgaris, rosacea and disorders associated with hyperpigmentation. Azelaic acid is also used as a component of chemical peels. The paper summarizes the most relevant issues concerning the application of AZA in dermatological treatment based on current knowledge.

  10. Clinical effectiveness of primary and secondary headache treatment by transcranial direct current stimulation

    Directory of Open Access Journals (Sweden)

    Dmitry ePinchuk

    2013-03-01

    Full Text Available The clinical effectiveness of headache treatment by transcranial direct current stimulation (tDCS) with various locations of the stimulating electrodes on the scalp was analyzed retrospectively. The results of treatment were analyzed in 90 patients aged 19 to 54 years (48 patients had migraine without aura, 32 frequent episodic tension-type headaches, and 10 chronic tension-type headaches) and in 44 adolescents aged 11-16 years with chronic posttraumatic headaches after a mild head injury. The clinical effectiveness of tDCS with a 70-150 µA current for 30-45 minutes via 6.25 cm2 stimulating electrodes is comparable to that of modern pharmacological drugs, with no negative side effects. The obtained result was maintained on average for 5 to 9 months. It was demonstrated that effectiveness depends on the localization of the stimulating electrodes used for different types of headaches.

  11. Computational methods in calculating superconducting current problems

    Science.gov (United States)

    Brown, David John, II

    Various computational problems in treating superconducting currents are examined. First, field inversion in spatial Fourier transform space is reviewed to obtain both one-dimensional transport currents flowing down a long thin tape, and a localized two-dimensional current. The problems associated with spatial high-frequency noise, created by finite resolution and experimental equipment, are presented, and resolved with a smooth Gaussian cutoff in spatial frequency space. Convergence of the Green's functions for the one-dimensional transport current densities is discussed, and particular attention is devoted to the negative effects of performing discrete Fourier transforms alone on fields asymptotically dropping like 1/r. Results of imaging simulated current densities are favorably compared to the original distributions after the resulting magnetic fields undergo the imaging procedure. The behavior of high-frequency spatial noise, and the behavior of the fields with a 1/r asymptote in the imaging procedure in our simulations is analyzed, and compared to the treatment of these phenomena in the published literature. Next, we examine calculation of Mathieu and spheroidal wave functions, solutions to the wave equation in elliptical cylindrical and oblate and prolate spheroidal coordinates, respectively. These functions are also solutions to Schrodinger's equations with certain potential wells, and are useful in solving time-varying superconducting problems. The Mathieu functions are Fourier expanded, and the spheroidal functions expanded in associated Legendre polynomials to convert the defining differential equations to recursion relations. The infinite number of linear recursion equations is converted to an infinite matrix, multiplied by a vector of expansion coefficients, thus becoming an eigenvalue problem. The eigenvalue problem is solved with root solvers, and the eigenvector problem is solved using a Jacobi-type iteration method, after preconditioning the

  12. The current place of probiotics and prebiotics in the treatment of pouchitis.

    Science.gov (United States)

    Lichtenstein, Lev; Avni-Biron, Irit; Ben-Bassat, Ofer

    2016-02-01

    Pouchitis is a common complication in patients undergoing restorative proctocolectomy for ulcerative colitis. Therapeutic attempts include manipulations of pouch flora composition. In this review, we bring together the evidence supporting the use of probiotics and prebiotics in pouchitis patients, to clarify the place of these treatments in current therapeutic regimens. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Fault Tolerant External Memory Algorithms

    DEFF Research Database (Denmark)

    Jørgensen, Allan Grønlund; Brodal, Gerth Stølting; Mølhave, Thomas

    2009-01-01

    Algorithms dealing with massive data sets are usually designed for I/O-efficiency, often captured by the I/O model by Aggarwal and Vitter. Another aspect of dealing with massive data is how to deal with memory faults, e.g. captured by the adversary based faulty memory RAM by Finocchi and Italiano. However, current fault tolerant algorithms do not scale beyond the internal memory. In this paper we investigate for the first time the connection between I/O-efficiency in the I/O model and fault tolerance in the faulty memory RAM, and we assume that both memory and disk are unreliable. We show a lower bound on the number of I/Os required for any deterministic dictionary that is resilient to memory faults. We design a static and a dynamic deterministic dictionary with optimal query performance as well as an optimal sorting algorithm and an optimal priority queue. Finally, we consider scenarios where...

  14. Congested Link Inference Algorithms in Dynamic Routing IP Network

    Directory of Open Access Journals (Sweden)

    Yu Chen

    2017-01-01

    Full Text Available The performance of current congested link inference algorithms, such as the classical CLINK algorithm, degrades noticeably in dynamic routing IP networks. To overcome this problem, based on the assumptions of the Markov property and time homogeneity, we build a Variable Structure Discrete Dynamic Bayesian (VSDDB) network as a simplified model of a dynamic routing IP network. Under the simplified VSDDB model, based on the Bayesian Maximum A Posteriori (BMAP) principle and the Rest Bayesian Network Model (RBNM), we propose an Improved CLINK (ICLINK) algorithm. Considering that congestion of multiple links often occurs concurrently, we also propose the CLILRS algorithm (Congested Link Inference based on Lagrangian Relaxation Subgradient) to infer the set of congested links. We validated our results with analytical, simulation, and real Internet experiments.

  15. Determination of Pavement Rehabilitation Activities through a Permutation Algorithm

    Directory of Open Access Journals (Sweden)

    Sangyum Lee

    2013-01-01

    Full Text Available This paper presents a mathematical programming model for optimal pavement rehabilitation planning. The model maximizes the rehabilitation area through a newly developed permutation algorithm based on the procedures outlined in the harmony search (HS) algorithm. The proposed algorithm provides an optimal solution method for the problem of multilocation rehabilitation activities on pavement structures, using empirical deterioration and rehabilitation effectiveness models under a limited maintenance budget. Thus, nonlinear pavement performance and rehabilitation activity decision models are used to maximize the objective function of the rehabilitation area within a limited budget through the permutation algorithm. Our results show that the heuristic permutation algorithm provides a good optimum in terms of maximizing the rehabilitation area, compared with the worst-first maintenance method currently used in Seoul.
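A minimal harmony-search-style selection of rehabilitation activities under a budget can be sketched as follows. The section areas, costs, and the HS parameters (HMS, HMCR, PAR) are all made-up toy values, and this plain 0/1 selection is a simplification of the paper's permutation variant:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical road sections: area gained and cost of rehabilitating each.
n = 15
area = rng.uniform(100, 1000, n)
cost = rng.uniform(10, 100, n)
budget = cost.sum() * 0.4

def fitness(sel):
    """Total rehabilitated area; selections over budget score zero."""
    return area @ sel if cost @ sel <= budget else 0.0

HMS, HMCR, PAR, iters = 10, 0.9, 0.3, 2000
memory = rng.integers(0, 2, (HMS, n))          # harmony memory of 0/1 vectors
scores = np.array([fitness(h) for h in memory])

for _ in range(iters):
    new = np.empty(n, dtype=int)
    for j in range(n):
        if rng.random() < HMCR:                # memory consideration
            new[j] = memory[rng.integers(HMS), j]
            if rng.random() < PAR:             # pitch adjustment: flip the bit
                new[j] ^= 1
        else:                                  # random selection
            new[j] = rng.integers(0, 2)
    s = fitness(new)
    worst = scores.argmin()
    if s > scores[worst]:                      # replace the worst harmony
        memory[worst], scores[worst] = new, s

best = memory[scores.argmax()]
```

The same memory/pitch-adjustment/replacement cycle carries over when the harmonies encode permutations instead of bit vectors.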

  16. Consensus Conference: A reappraisal of Gaucher disease - diagnosis and disease management algorithms

    Science.gov (United States)

    Mistry, Pramod K.; Cappellini, Maria Domenica; Lukina, Elena; Özsan, Hayri; Pascual, Sara Mach; Rosenbaum, Hanna; Solano, Maria Helena; Spigelman, Zachary; Villarrubia, Jesús; Watman, Nora Patricia; Massenkeil, Gero

    2010-01-01

    Type 1 (non-neuronopathic) Gaucher disease was the first lysosomal storage disorder for which an effective enzyme replacement therapy was developed, and it has become a prototype for treatments for related orphan diseases. There are currently four treatment options available to patients with Gaucher disease; nevertheless, almost 25% of type 1 Gaucher patients do not gain timely access to therapy because of delays in diagnosis after the onset of symptoms. Diagnosis of Gaucher disease by enzyme testing is unequivocal, but the rarity of the disease and the non-specific and heterogeneous nature of Gaucher disease symptoms may impede consideration of this disease in the differential diagnosis. To help promote timely diagnosis and optimal management of the protean presentations of Gaucher disease, a consensus meeting was convened to develop algorithms for diagnosis and disease management. PMID:21080341

  17. Time Optimized Algorithm for Web Document Presentation Adaptation

    DEFF Research Database (Denmark)

    Pan, Rong; Dolog, Peter

    2010-01-01

    Currently, information on the web is accessed through different devices. Each device has its own properties such as resolution, size, and capabilities to display information in different formats. This calls for adaptation of information presentation for such platforms. This paper proposes content-optimized and time-optimized algorithms for adapting information presentation to different devices based on a hierarchical model of the document. The model is formalized in order to experiment with different algorithms.

  18. Loading pattern optimization using ant colony algorithm

    International Nuclear Information System (INIS)

    Hoareau, Fabrice

    2008-01-01

    Electricite de France (EDF) operates 58 nuclear power plants (NPP), of the Pressurized Water Reactor type. The loading pattern optimization of these NPP is currently done by EDF expert engineers. Within this framework, EDF R and D has developed automatic optimization tools that assist the experts. LOOP is an industrial tool, developed by EDF R and D and based on a simulated annealing algorithm. In order to improve the results of such automatic tools, new optimization methods have to be tested. Ant Colony Optimization (ACO) algorithms are recent methods that have given very good results on combinatorial optimization problems. In order to evaluate the performance of such methods on loading pattern optimization, direct comparisons between LOOP and a mock-up based on the Max-Min Ant System algorithm (a particular variant of ACO algorithms) were made on realistic test-cases. It is shown that the results obtained by the ACO mock-up are very similar to those of LOOP. Future research will consist in improving these encouraging results by using parallelization and by hybridizing the ACO algorithm with local search procedures. (author)

  19. Current treatments of acne: Medications, lights, lasers, and a novel 650-μs 1064-nm Nd:YAG laser.

    Science.gov (United States)

    Gold, Michael H; Goldberg, David J; Nestor, Mark S

    2017-09-01

    The treatment of acne, especially severe acne, remains a challenge to dermatologists. Therapies include retinoids, antibiotics, hormones, lights, lasers, and various combinations of these modalities. Acne is currently considered a chronic rather than an adolescent condition. The appropriate treatment depends on the patient and the severity of disease. The purpose of this study was to review current therapies for acne of all severities and to introduce the 650-μs 1064-nm laser for the treatment of acne. © 2017 Wiley Periodicals, Inc.

  20. Fast clustering algorithm for large ECG data sets based on CS theory in combination with PCA and K-NN methods.

    Science.gov (United States)

    Balouchestani, Mohammadreza; Krishnan, Sridhar

    2014-01-01

    Long-term recording of electrocardiogram (ECG) signals plays an important role in health care systems for the diagnosis and treatment of heart diseases. Clustering and classification of the collected data are essential for detecting concealed information in the P-QRS-T waves of long-term ECG recordings. Currently used algorithms have their share of drawbacks: 1) clustering and classification cannot be done in real time; 2) they suffer from high energy consumption and sampling load. These drawbacks motivated us to develop a novel optimized clustering algorithm that can easily scan large ECG datasets to enable low-power long-term ECG recording. In this paper, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory as a random sampling procedure. Then, two dimensionality reduction methods, Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC), followed by classification of the data using the K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers, are applied to the proposed algorithm. We show that our algorithm based on PCA features in combination with the K-NN classifier performs better than the other methods, outperforming existing algorithms with an 11% increase in classification accuracy. In addition, the proposed algorithm achieves classification accuracies for the K-NN and PNN classifiers, and a Receiver Operating Characteristic (ROC) area, of 99.98%, 99.83%, and 99.75%, respectively.
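The PCA-plus-K-NN stage the abstract favors can be sketched with plain numpy. The synthetic two-class "beats" below are hypothetical stand-ins for real ECG feature vectors, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for ECG feature vectors: two well-separated classes.
n_per, dim = 100, 40
normal  = rng.normal(0.0, 1.0, (n_per, dim))
ectopic = rng.normal(2.5, 1.0, (n_per, dim))
X = np.vstack([normal, ectopic])
y = np.array([0] * n_per + [1] * n_per)

# Random train/test split.
idx = rng.permutation(len(X))
train, test = idx[:150], idx[150:]

# PCA via SVD: project onto the top-k principal components of the training set.
k = 5
mu = X[train].mean(axis=0)
_, _, Vt = np.linalg.svd(X[train] - mu, full_matrices=False)
P = Vt[:k].T
Z_train, Z_test = (X[train] - mu) @ P, (X[test] - mu) @ P

def knn_predict(z, k_nn=3):
    """Classify one reduced feature vector by majority vote of its neighbours."""
    d = np.linalg.norm(Z_train - z, axis=1)
    votes = y[train][np.argsort(d)[:k_nn]]
    return np.bincount(votes).argmax()

pred = np.array([knn_predict(z) for z in Z_test])
accuracy = float((pred == y[test]).mean())
```

The CS-based random sampling and the LCC/PNN alternatives from the abstract would slot in before and after this reduction step, respectively.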

  1. The current treatment of erectile dysfunction

    Directory of Open Access Journals (Sweden)

    Maria Isabela Sarbu

    2016-10-01

    Full Text Available Erectile dysfunction (ED) is the inability to achieve and maintain an erection sufficient for satisfactory sexual intercourse. It is the most frequent sexual dysfunction in elderly men, and its prevalence increases with age. Ever since ED was recognized as a real health problem, several treatment options have become available, and some of them have proved very efficient. PDE5 inhibitors are the mainstay treatment of ED. However, other treatment options such as intracorporal injections, surgery, vacuum devices and prostheses are also available for patients who are unresponsive to PDE5 inhibitors. Since none of the treatment options available so far has proven ideal, research in the field of sexual medicine continues. The aim of this paper is to review the most recent advances in the treatment of ED.

  2. [Current situation and thoughts on radiofrequency ablation in the treatment of thyroid cancers].

    Science.gov (United States)

    Zhang, H; Dong, W W

    2017-08-01

    Radiofrequency ablation (RFA) was originally used in thyroid surgery primarily for the treatment of regional metastatic lymph nodes from recurrent thyroid cancers. In recent years it has gradually come to be used to treat some benign thyroid nodules. However, domestic issues resulting from indiscriminately broadened RFA indications and a lack of standardization of the therapy have become increasingly prominent, including the initial treatment of operable thyroid cancers by RFA, which runs counter to the current consensus on RFA for patients with thyroid nodules and to management guidelines for patients with thyroid cancers. Therefore, RFA should be avoided as initial treatment of operable thyroid cancers until guidelines based on evidence-based medicine are introduced.

  3. Quantification of the influence of the choice of the algorithm and planning system on the calculation of a treatment plan; Cuantificacion de la influencia que tiene la eleccion del algoritmo y del sistema de planificacion en el calculo de una dosimetria clinica

    Energy Technology Data Exchange (ETDEWEB)

    Moral, F. del; Ramos, A.; Salgado, M.; Andrade, B.; Munoz, V.

    2010-07-01

    In this work, an analysis of the influence of the choice of algorithm and planning system on the calculation of the same treatment plan is presented. For this purpose, specific software has been developed for comparing plans in a series of IMRT cases of prostate and head and neck cancer calculated using the convolution, superposition and fast superposition algorithms implemented in the XiO 4.40 planning system (CMS). It has also been used to compare the same treatment plan for a lung pathology calculated in XiO with the mentioned algorithms and calculated in the Plan 4.1 planning system (Brainlab) using its pencil beam algorithm. Dose differences among the treatment plans have been quantified using a set of metrics. The recommendation that the algorithm be chosen carefully for clinical dosimetry has been numerically confirmed. (Author).

  4. [Novel current and future therapy options for treatment of dry eye disease].

    Science.gov (United States)

    Messmer, E M

    2018-02-01

    Dry eye disease was redefined by the dry eye workshop (DEWS II) in May 2017. According to the new definition "dry eye is a multifactorial disease of the ocular surface characterized by a loss of homeostasis of the tear film and accompanied by ocular symptoms". The current definition encompasses etiological factors, such as instability and hyperosmolarity of the tear film, ocular surface inflammation and damage as well as a new aspect compared to the former definition, neurosensory abnormalities. Recent and future therapeutic options for dry eye focus on treatment of the aforementioned pathogenetic events. New tear substitutes, medications and devices to stimulate tear production, innovative anti-inflammatory treatment, medications to influence corneal innervation and new methods for treatment of Meibomian gland dysfunction are already available or will be available in the near future.

  5. Linear scaling of density functional algorithms

    International Nuclear Information System (INIS)

    Stechel, E.B.; Feibelman, P.J.; Williams, A.R.

    1993-01-01

    An efficient density functional algorithm (DFA) that scales linearly with system size will revolutionize electronic structure calculations. Density functional calculations are reliable and accurate in determining many condensed matter and molecular ground-state properties. However, because current DFAs, including methods related to that of Car and Parrinello, scale with the cube of the system size, density functional studies are not routinely applied to large systems. Linear scaling is achieved by constructing functions that are both localized and fully occupied, thereby eliminating the need to calculate global eigenfunctions. It is, however, widely believed that exponential localization requires the existence of an energy gap between the occupied and unoccupied states. Despite this, the authors demonstrate that linear scaling can still be achieved for metals. Using a linear scaling algorithm, they have explicitly constructed localized, almost fully occupied orbitals for the quintessential metallic system, jellium. The algorithm is readily generalizable to any system geometry and Hamiltonian. They discuss the conceptual issues involved, convergence properties and scaling for their new algorithm

  6. Innovations in lattice QCD algorithms

    International Nuclear Information System (INIS)

    Orginos, Konstantinos

    2006-01-01

    Lattice QCD calculations demand a substantial amount of computing power in order to achieve the high precision results needed to better understand the nature of strong interactions, assist experiment to discover new physics, and predict the behavior of a diverse set of physical systems ranging from the proton itself to astrophysical objects such as neutron stars. However, computer power alone is clearly not enough to tackle the calculations we need to be doing today. A steady stream of recent algorithmic developments has made an important impact on the kinds of calculations we can currently perform. In this talk I am reviewing these algorithms and their impact on the nature of lattice QCD calculations performed today

  7. Merged Search Algorithms for Radio Frequency Identification Anticollision

    Directory of Open Access Journals (Sweden)

    Bih-Yaw Shih

    2012-01-01

    The arbitration algorithm for an RFID system is used to arbitrate among all the tags to avoid the collision problem that arises when multiple tags are present in the interrogation field of a reader. A splitting algorithm called Binary Search Tree (BST) is well known for multi-tag arbitration. In the current study, a splitting-based scheme called Merged Search Tree is proposed to capture identification codes correctly for anticollision. The performance of the proposed algorithm is compared with the original BST in terms of the time and power consumed during the arbitration process. The results show that the proposed model can reduce the searching time and power consumed, achieving better arbitration performance.
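The binary-splitting idea behind BST can be sketched as a small simulation; the 4-bit tag IDs below are made up for illustration:

```python
def binary_tree_arbitration(tag_ids, prefix=""):
    """Binary-tree (splitting) arbitration: the reader broadcasts a prefix,
    every tag whose ID starts with it answers, and a collision splits the
    responding set on the next bit. Returns IDs in the order singled out."""
    responders = [t for t in tag_ids if t.startswith(prefix)]
    if len(responders) == 0:
        return []                      # idle slot: nobody answered
    if len(responders) == 1:
        return responders              # exactly one reply: tag identified
    # Collision: recurse on prefix + '0', then prefix + '1'.
    return (binary_tree_arbitration(tag_ids, prefix + "0")
            + binary_tree_arbitration(tag_ids, prefix + "1"))

tags = ["0010", "0111", "1000", "1011", "1100"]
identified = binary_tree_arbitration(tags)   # -> all five tags, in tree order
```

Counting the recursive calls would give the query count that the time and power comparison in the abstract is based on; the Merged Search Tree variant prunes part of this tree.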

  8. Algorithms

    Indian Academy of Sciences (India)

    polynomial) division have been found in Vedic Mathematics which are dated much before Euclid's algorithm. A programming language is used to describe an algorithm for execution on a computer. An algorithm expressed using a programming.

  9. Two-Phase Algorithm for Optimal Camera Placement

    Directory of Open Access Journals (Sweden)

    Jun-Woo Ahn

    2016-01-01

    Full Text Available As markers for visual sensor networks have become larger, interest in the optimal camera placement problem has continued to increase. The most featured solution for the optimal camera placement problem is based on binary integer programming (BIP). Due to the NP-hard nature of the optimal camera placement problem, however, it is difficult to find a solution for a complex, real-world problem using BIP. Many approximation algorithms have been developed to solve this problem. In this paper, a two-phase algorithm is proposed as an approximation algorithm based on BIP that can solve the optimal camera placement problem for a placement space larger than in current studies. This study solves the problem in three-dimensional space for a real-world structure.
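BIP-based camera placement is commonly approximated greedily; a minimal set-cover-style sketch (with a made-up toy instance, and plain greedy selection rather than the paper's two-phase method) looks like:

```python
# Hypothetical toy instance: candidate camera positions and the set of
# target points each would cover.
coverage = {
    "cam_a": {1, 2, 3},
    "cam_b": {3, 4},
    "cam_c": {4, 5, 6},
    "cam_d": {2, 6},
}
targets = set().union(*coverage.values())

def greedy_placement(coverage, targets):
    """Greedy approximation to the BIP set-cover formulation: repeatedly
    pick the camera covering the most still-uncovered targets."""
    chosen, uncovered = [], set(targets)
    while uncovered:
        best = max(coverage, key=lambda c: len(coverage[c] & uncovered))
        if not coverage[best] & uncovered:
            break                      # remaining targets are uncoverable
        chosen.append(best)
        uncovered -= coverage[best]
    return chosen

cameras = greedy_placement(coverage, targets)
```

This greedy rule carries the classical ln(n) approximation guarantee for set cover, which is why it remains the usual fallback when the exact BIP becomes intractable.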

  10. Evaluation of collapsed cone convolution superposition (CCCS) algorithms in the Prowess treatment planning system for calculating symmetric and asymmetric field sizes

    Directory of Open Access Journals (Sweden)

    Tamer Dawod

    2015-01-01

    Full Text Available Purpose: This work investigated the accuracy of the Prowess treatment planning system (TPS) in dose calculation in a homogeneous phantom for symmetric and asymmetric field sizes using the collapsed cone convolution/superposition (CCCS) algorithm. Methods: The measurements were carried out at a source-to-surface distance (SSD) of 100 cm for 6 and 10 MV photon beams. A full set of measurements for symmetric and asymmetric fields, including inplane and crossplane profiles at various depths and percentage depth doses (PDDs), was obtained on the linear accelerator. Results: The results showed that asymmetric collimation can lead to significant errors (up to approximately 7%) in dose calculations if changes in primary beam intensity and beam quality are not accounted for. The largest differences in the isodose curves were found in the buildup and penumbra regions. Conclusion: The results showed that dose calculation using the Prowess TPS based on the CCCS algorithm is generally in excellent agreement with measurements.

  11. Parallel optimization of IDW interpolation algorithm on multicore platform

    Science.gov (United States)

    Guan, Xuefeng; Wu, Huayi

    2009-10-01

    Due to increasing power consumption, heat dissipation, and other physical issues, central processing unit (CPU) architecture has been turning rapidly to multicore in recent years. A multicore processor packages multiple processor cores in the same chip, which not only offers increased performance but also presents significant challenges to application developers. As a matter of fact, in the GIS field most current algorithms are implemented serially and cannot fully exploit the parallelism potential of such multicore platforms. In this paper, we choose the Inverse Distance Weighted (IDW) spatial interpolation algorithm as an example to study how to optimize current serial GIS algorithms on a multicore platform in order to maximize the performance speedup. With the help of OpenMP, threading is introduced to split and share the interpolation work among processor cores. After parallel optimization, the execution time of the interpolation algorithm is greatly reduced and good performance speedups are achieved. For example, the speedup on an Intel Xeon 5310 is 1.943 with 2 execution threads and 3.695 with 4 execution threads, respectively. An additional comparison between pre-optimization and post-optimization outputs shows that parallel optimization does not affect the final interpolation result.
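IDW parallelizes so well because every query point is independent; the numpy-vectorized sketch below exploits that same independence through broadcasting (standing in for the paper's OpenMP threads), with made-up sample points:

```python
import numpy as np

def idw(known_xy, known_z, query_xy, power=2.0):
    """Inverse Distance Weighted interpolation, vectorized over all query
    points at once. Each query point depends only on the known samples,
    which is the structure an OpenMP 'parallel for' over queries exploits."""
    # Pairwise distances, shape (n_query, n_known).
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)            # guard against division by zero
    w = 1.0 / d ** power                # weight = inverse distance^power
    return (w @ known_z) / w.sum(axis=1)

# Hypothetical sample points on a unit square and two query points.
known_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
known_z = np.array([0.0, 1.0, 1.0, 2.0])
queries = np.array([[0.5, 0.5], [0.0, 0.0]])
z = idw(known_xy, known_z, queries)
```

At the square's center all four weights are equal, so the result is the plain mean of the sample values; at a known point the inverse-distance weight dominates and the interpolant reproduces that sample.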

  12. A Framework To Support Management Of HIV/AIDS Using K-Means And Random Forest Algorithm

    Directory of Open Access Journals (Sweden)

    Gladys Iseu

    2017-06-01

    Full Text Available The healthcare industry generates large amounts of complex data about patients, hospital resources, disease management, electronic patient records, and medical devices, among others. The availability of these huge amounts of medical data creates a need for powerful mining tools to support health care professionals in the diagnosis, treatment, and management of HIV/AIDS. Several data mining techniques have been used in the management of different data sets. Data mining techniques have been categorized into regression algorithms, segmentation algorithms, association algorithms, sequence analysis algorithms, and classification algorithms. In the medical field, no specific study has incorporated two or more data mining algorithms, limiting the decision-making levels available to medical practitioners. This study identified the extent to which the K-means algorithm clusters patient characteristics, evaluated the extent to which the random forest algorithm can classify the data for informed decision making, and designed a framework to support medical decision making in the treatment of HIV/AIDS-related diseases in Kenya. The paper further used the random forest classification algorithm to compute proximities between pairs of cases that can be used in clustering, locating outliers, or, by scaling, giving interesting views of the data.

  13. Benchmarking homogenization algorithms for monthly data

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.

  14. Zero-block mode decision algorithm for H.264/AVC.

    Science.gov (United States)

    Lee, Yu-Ming; Lin, Yinyi

    2009-03-01

    In a previous paper, we proposed a zero-block intermode decision algorithm for H.264 video coding based upon the number of zero-blocks of 4 x 4 DCT coefficients between the current macroblock and the co-located macroblock. The proposed algorithm can achieve significant improvement in computation, but the computation performance is limited for high bit-rate coding. To improve computation efficiency, in this paper, we suggest an enhanced zero-block decision algorithm, which uses an early zero-block detection method to compute the number of zero-blocks instead of direct DCT and quantization (DCT/Q) calculation, and incorporates two adequate decision methods for the semi-stationary and nonstationary regions of a video sequence. In addition, the zero-block decision algorithm is also applied to intramode prediction in the P frame. The enhanced zero-block decision algorithm reduces total encoding time by an average of 27% compared to the original zero-block decision algorithm.
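
    To illustrate what counting "zero-blocks" means: a 4 x 4 residual block is a zero-block when all of its transformed-and-quantized coefficients are zero. The sketch below is a simplification, using a plain floating-point DCT and a rounding quantizer rather than H.264's integer transform and quantization tables:

```python
import numpy as np

def dct_matrix(n=4):
    # Orthonormal DCT-II basis matrix.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

def count_zero_blocks(residual, qstep):
    """Count 4x4 blocks whose transformed-and-quantized residual is
    entirely zero (a 'zero-block')."""
    C = dct_matrix(4)
    h, w = residual.shape
    zeros = 0
    for i in range(0, h, 4):
        for j in range(0, w, 4):
            coeffs = C @ residual[i:i+4, j:j+4] @ C.T   # 2-D DCT
            if not np.any(np.round(coeffs / qstep)):    # simple quantizer
                zeros += 1
    return zeros

# A nearly flat residual quantizes to all zeros at a coarse step size.
res = np.zeros((8, 8))
res[0, 0] = 1.0
print(count_zero_blocks(res, qstep=8.0))  # 4
```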

  15. A High-Order CFS Algorithm for Clustering Big Data

    Directory of Open Access Journals (Sweden)

    Fanyu Bu

    2016-01-01

    Full Text Available With the development of the Internet of Everything, such as the Internet of Things, Internet of People, and Industrial Internet, big data is being generated. Clustering is a widely used technique for big data analytics and mining. However, most current algorithms are not effective at clustering the heterogeneous data that is prevalent in big data. In this paper, we propose a high-order CFS algorithm (HOCFS) to cluster heterogeneous data by combining the CFS clustering algorithm and the dropout deep learning model, whose functionality rests on three pillars: (i) an adaptive dropout deep learning model to learn features from each type of data, (ii) a feature tensor model to capture the correlations of heterogeneous data, and (iii) a tensor distance-based high-order CFS algorithm to cluster heterogeneous data. Furthermore, we verify our proposed algorithm on different datasets, by comparison with two other clustering schemes, namely HOPCM and CFS. Results confirm the effectiveness of the proposed algorithm in clustering heterogeneous data.
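
    Assuming the CFS component refers to density-peaks clustering ("clustering by fast search and find of density peaks"), the base algorithm can be sketched in its plain, non-tensor form: each point gets a local density rho and a distance delta to the nearest higher-density point, and centers are the points maximizing rho * delta. This is a minimal illustration, not the paper's high-order variant:

```python
import numpy as np

def cfs_peaks(X, dc, k):
    """Density-peaks sketch: rho = local density (neighbours within
    radius dc), delta = distance to the nearest higher-density point.
    Cluster centers are the k points maximizing rho * delta."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    rho = (D < dc).sum(axis=1) - 1                 # exclude self
    order = np.argsort(-rho)                       # descending density
    rank = np.empty(len(X), dtype=int)
    rank[order] = np.arange(len(X))
    delta = np.empty(len(X))
    delta[order[0]] = D[order[0]].max()
    for p in order[1:]:
        delta[p] = D[p, order[:rank[p]]].min()
    centers = np.argsort(-(rho * delta))[:k]
    labels = -np.ones(len(X), dtype=int)
    labels[centers] = np.arange(k)
    # Assign each remaining point to the cluster of its nearest
    # higher-density neighbour, in descending density order.
    for p in order:
        if labels[p] < 0:
            higher = order[:rank[p]]
            labels[p] = labels[higher[np.argmin(D[p, higher])]]
    return labels

# Two well-separated triads of points.
X = np.array([[0, 0], [0, .1], [.1, 0], [5, 5], [5, 5.1], [5.1, 5]])
labels = cfs_peaks(X, dc=1.0, k=2)
print(labels)  # [0 0 0 1 1 1]
```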

  16. Betweenness-based algorithm for a partition scale-free graph

    International Nuclear Information System (INIS)

    Zhang Bai-Da; Wu Jun-Jie; Zhou Jing; Tang Yu-Hua

    2011-01-01

    Many real-world networks are found to be scale-free. However, graph partition technology, as a technology capable of parallel computing, performs poorly when applied to scale-free graphs. The reason is that traditional partitioning algorithms are designed for random networks and regular networks, rather than for scale-free networks. Multilevel graph-partitioning algorithms are currently considered to be the state of the art and are used extensively. In this paper, we analyse the reasons why traditional multilevel graph-partitioning algorithms perform poorly and present a new multilevel graph-partitioning paradigm, top-down partitioning, which derives its name from the contrast with traditional bottom-up partitioning. A new multilevel partitioning algorithm, named the betweenness-based partitioning algorithm, is also presented as an implementation of the top-down partitioning paradigm. An experimental evaluation of seven different real-world scale-free networks shows that the betweenness-based partitioning algorithm significantly outperforms the existing state-of-the-art approaches. (interdisciplinary physics and related areas of science and technology)
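
    The key ingredient, edge betweenness, counts the shortest paths that cross each edge; cutting the highest-betweenness edges separates a graph along its natural boundaries. The sketch below is a generic illustration in the spirit of Girvan-Newman (Brandes-style accumulation), not the authors' multilevel implementation:

```python
from collections import deque, defaultdict

def edge_betweenness(adj):
    """Brandes-style edge betweenness for an unweighted, undirected graph
    given as {node: set_of_neighbours}; each undirected edge accumulates
    contributions from every source, so only relative values matter."""
    bet = defaultdict(float)
    for s in adj:
        dist = {s: 0}
        sigma = {v: 0.0 for v in adj}
        sigma[s] = 1.0
        preds = defaultdict(list)
        order = []
        q = deque([s])
        while q:                                   # BFS shortest-path counting
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = defaultdict(float)
        for w in reversed(order):                  # dependency accumulation
            for v in preds[w]:
                c = sigma[v] / sigma[w] * (1 + delta[w])
                bet[tuple(sorted((v, w)))] += c
                delta[v] += c
    return bet

def components(adj):
    seen, comps = set(), []
    for s in adj:
        if s in seen:
            continue
        comp, q = set(), deque([s])
        while q:
            v = q.popleft()
            if v in comp:
                continue
            comp.add(v)
            q.extend(adj[v] - comp)
        seen |= comp
        comps.append(sorted(comp))
    return comps

# Two triangles joined by a bridge; the bridge carries every cross path,
# so it has the highest betweenness and removing it partitions the graph.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)
bet = edge_betweenness(adj)
u, v = max(bet, key=bet.get)
adj[u].discard(v)
adj[v].discard(u)
print((u, v), components(adj))  # (2, 3) [[0, 1, 2], [3, 4, 5]]
```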

  17. [Current situation and thinking of diagnosis and treatment in some types of thyroid cancer].

    Science.gov (United States)

    Yang, X Y; Yu, Y; Li, D P; Dong, L

    2017-04-07

    With the rising incidence of thyroid cancer, its treatment is becoming increasingly standardized. However, opinions differ on the treatment of some types of thyroid cancer, including the timing of surgery, the surgical method, and the follow-up observation plan. There are mainly two categories of such patients: patients identified as familial thyroid cancer mutation carriers through family screening, including medullary thyroid carcinoma and familial nonmedullary thyroid carcinoma, and patients with thyroid microcarcinoma that can be observed after diagnosis by fine-needle biopsy cytology. We discuss the current situation for the diagnosis and treatment of these patients.

  18. Nanomedicine applications in the treatment of breast cancer: current state of the art.

    Science.gov (United States)

    Wu, Di; Si, Mengjie; Xue, Hui-Yi; Wong, Ho-Lun

    2017-01-01

    Breast cancer is the most common malignant disease in women worldwide, but current drug therapy is far from optimal, as indicated by the high death rate of breast cancer patients. Nanomedicine is a promising alternative for breast cancer treatment. Nanomedicine products such as Doxil® and Abraxane® have already been extensively used for breast cancer adjuvant therapy with favorable clinical outcomes. However, these products were originally designed for generic anticancer purposes and not specifically for breast cancer treatment. With a better understanding of the molecular biology of breast cancer, a number of novel, promising nanotherapeutic strategies and devices have been developed in recent years. In this review, we first give an overview of current breast cancer treatment and the updated status of nanomedicine use in the clinical setting, then discuss the latest important trends in designing breast cancer nanomedicine, including passive and active cancer cell targeting, breast cancer stem cell targeting, tumor microenvironment-based nanotherapy, and combination nanotherapy of drug-resistant breast cancer. Researchers may gain insight from these strategies to design and develop nanomedicine that is more tailored to breast cancer, to achieve further improvements in cancer specificity, antitumorigenic effect, antimetastasis effect, and drug-resistance reversal effect.

  19. Current nutritional treatments of obesity.

    Science.gov (United States)

    Greenwald, Ashli

    2006-01-01

    Obesity in our country is a growing concern. There are several different options for weight loss; however, individuals must be self-motivated and amenable to change in order to achieve success with their weight loss goals. Several strategies used by professionals in the US today to treat overweight and obesity include diet therapy, exercise, behavior modification, pharmacotherapy, and surgery. The focus of the American Dietetic Association (ADA) Weight Management Position Statement is no longer just on weight loss but now on weight management. Reaching one's ideal body weight is recommended but not often realistic. Frequently, the goal of treatment shifts to maintenance of one's current weight or attempts at moderate weight loss. Lifestyle modification or behavioral modification interventions rely on analyzing behavior to identify events that are associated with appropriate vs. inappropriate eating, exercise, or thinking habits. Certain primary strategies that have been found useful for helping people change their behaviors so that they can lose weight and maintain their weight loss include self-monitoring, stimulus control, cognitive restructuring, stress management, social support, physical activity, and relapse prevention. Weight loss programs should strive to combine a nutritionally balanced dietary regimen with exercise and lifestyle modifications at the lowest possible cost. There are several different methods used for dietary modification: low calorie diets, very low calorie diets, fasting, formula diets and meal replacement programs, and popular diets. Bariatric surgery is gaining popularity, as it has been an effective way to treat obesity. Following gastric bypass surgery, patients must be prepared to modify their eating behaviors and dietary selections to assist with weight loss and prevent potential complications. Patients should be educated on the dietary guidelines extensively prior to surgery and again post-operatively.

  20. Implementation and analysis of list mode algorithm using tubes of response on a dedicated brain and breast PET

    Science.gov (United States)

    Moliner, L.; Correcher, C.; González, A. J.; Conde, P.; Hernández, L.; Orero, A.; Rodríguez-Álvarez, M. J.; Sánchez, F.; Soriano, A.; Vidal, L. F.; Benlloch, J. M.

    2013-02-01

    In this work we present an innovative algorithm for the reconstruction of PET images based on the List-Mode (LM) technique which improves their spatial resolution compared to results obtained with current MLEM algorithms. This study is part of a larger project aimed at improving diagnosis in early Alzheimer's disease stages by means of a newly developed hybrid PET-MR insert. At present, Alzheimer's is the most relevant neurodegenerative disease, and the best way to apply an effective treatment is early diagnosis. The PET device will consist of several monolithic LYSO crystals coupled to SiPM detectors. Monolithic crystals can reduce scanner costs with the advantage of enabling the implementation of very small virtual pixels in their geometry. This is especially useful for LM reconstruction algorithms, since they do not need a pre-calculated system matrix. We have developed an LM algorithm which has been initially tested with a large-aperture (186 mm) breast PET system. Instead of using the common lines of response, the algorithm incorporates a novel calculation of tubes of response. The new approach improves the volumetric spatial resolution by about a factor of 2 at the border of the field of view when compared with the traditionally used MLEM algorithm. Moreover, it has also been shown to decrease image noise, thus increasing image quality.
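
    For context, the baseline MLEM iteration the authors compare against can be sketched on a toy system matrix (the list-mode, tube-of-response machinery of the paper is not reproduced here; the two-voxel system is purely illustrative):

```python
import numpy as np

def mlem(A, counts, n_iter=50):
    """Standard MLEM iteration x <- x * A^T(y / Ax) / A^T 1, where
    A[i, j] is the probability that an emission in voxel j is detected
    in (tube of) response i."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                  # sensitivity image A^T 1
    for _ in range(n_iter):
        x *= (A.T @ (counts / (A @ x))) / sens
    return x

# Two voxels, three responses: recover activity (3, 1) from noiseless data.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
true = np.array([3.0, 1.0])
x = mlem(A, A @ true)
print(np.round(x, 3))  # [3. 1.]
```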

  1. Implementation and analysis of list mode algorithm using tubes of response on a dedicated brain and breast PET

    International Nuclear Information System (INIS)

    Moliner, L.; Correcher, C.; González, A.J.; Conde, P.; Hernández, L.; Orero, A.; Rodríguez-Álvarez, M.J.; Sánchez, F.; Soriano, A.; Vidal, L.F.; Benlloch, J.M.

    2013-01-01

    In this work we present an innovative algorithm for the reconstruction of PET images based on the List-Mode (LM) technique which improves their spatial resolution compared to results obtained with current MLEM algorithms. This study is part of a larger project aimed at improving diagnosis in early Alzheimer's disease stages by means of a newly developed hybrid PET-MR insert. At present, Alzheimer's is the most relevant neurodegenerative disease, and the best way to apply an effective treatment is early diagnosis. The PET device will consist of several monolithic LYSO crystals coupled to SiPM detectors. Monolithic crystals can reduce scanner costs with the advantage of enabling the implementation of very small virtual pixels in their geometry. This is especially useful for LM reconstruction algorithms, since they do not need a pre-calculated system matrix. We have developed an LM algorithm which has been initially tested with a large-aperture (186 mm) breast PET system. Instead of using the common lines of response, the algorithm incorporates a novel calculation of tubes of response. The new approach improves the volumetric spatial resolution by about a factor of 2 at the border of the field of view when compared with the traditionally used MLEM algorithm. Moreover, it has also been shown to decrease image noise, thus increasing image quality.

  2. Predictive factors for renal failure and a control and treatment algorithm

    Directory of Open Access Journals (Sweden)

    Denise de Paula Cerqueira

    2014-04-01

    Full Text Available OBJECTIVES: to evaluate the renal function of patients in an intensive care unit, to identify the predisposing factors for the development of renal failure, and to develop an algorithm to help in the control of the disease. METHOD: exploratory, descriptive, prospective study with a quantitative approach. RESULTS: a total of 30 patients (75.0%) were diagnosed with kidney failure, and the main factors associated with this disease were: advanced age, systemic arterial hypertension, diabetes mellitus, lung diseases, and antibiotic use. Of these, 23 patients (76.6%) showed a reduction in creatinine clearance in the first 24 hours of hospitalization. CONCLUSION: a decline in renal function was observed in a significant number of subjects; therefore, an algorithm was developed with the aim of helping in the control of renal failure in a practical and functional way.

  3. Management of vascular anomalies: Review of institutional management algorithm

    Directory of Open Access Journals (Sweden)

    Lalit K Makhija

    2017-01-01

    Full Text Available Introduction: Vascular anomalies are congenital lesions broadly categorised into vascular tumours (haemangiomas) and vascular dysmorphogenesis (vascular malformations). The management of these difficult problems has lately been simplified by the biological classification and a multidisciplinary approach. To standardise the treatment protocol, an algorithm has been devised. The study aims to validate the algorithm in terms of its utility and presents our experience in managing vascular anomalies. Materials and Methods: The biological classification of Mulliken and Glowacki was followed. A detailed algorithm for the management of vascular anomalies has been devised in the department. The protocol has been practiced by us for the past two decades. Data regarding the types of lesions and treatment modalities used were maintained. Results and Conclusion: This study was conducted from 2002 to 2012. A total of 784 cases of vascular anomalies were included in the study, of which 196 were haemangiomas and 588 were vascular malformations. The algorithmic approach has brought an element of much-needed objectivity to the management of vascular anomalies. It has helped us to define the management of a particular lesion by considering its pathology, extent, and the aesthetic and functional consequences of ablation, to a certain extent.

  4. Cognitive-behavior therapy for problem gambling: a critique of current treatments and proposed new unified approach.

    Science.gov (United States)

    Tolchard, Barry

    2017-06-01

    There is evidence supporting the use of cognitive-behavioral therapy (CBT) in the treatment of problem gambling. Despite this, little is known about how CBT works and which particular approach is most effective. This paper aims to synthesize the evidence for current CBT and propose a more unified approach to treatment. A literature review and narrative synthesis of the current research evidence of CBT for the treatment of problem gambling was conducted, focusing on the underlying mechanisms within the treatment approach. Several CBT approaches were critiqued. These can be divided into forms of exposure therapy (including aversion techniques, systematic desensitization, and other behavioral experiments), those focusing on cognitive restructuring techniques (such as reinforcement of nongambling activity, use of diaries, motivational enhancement, and audio-playback techniques), and third-wave techniques, including mindfulness. Findings from this synthesis, in relation to the treatment actions, are reported. The debate surrounding the treatment of problem gambling has been conducted as an either/or rather than a both/and discourse. This paper proposes a new, unified approach to the treatment of problem gambling that incorporates the best elements of both exposure and cognitive restructuring techniques, alongside the use of techniques borrowed from mindfulness and other CBT approaches.

  5. Performance of the "CCS Algorithm" in real world patients.

    Science.gov (United States)

    LaHaye, Stephen A; Olesen, Jonas B; Lacombe, Shawn P

    2015-06-01

    With the publication of the 2014 Focused Update of the Canadian Cardiovascular Society Guidelines for the Management of Atrial Fibrillation, the Canadian Cardiovascular Society Atrial Fibrillation Guidelines Committee has introduced a new triage and management algorithm, the so-called "CCS Algorithm". The CCS Algorithm is based upon expert opinion of the best available evidence; however, it has not yet been validated. Accordingly, the purpose of this study is to evaluate the performance of the CCS Algorithm in a cohort of real-world patients. We compared the CCS Algorithm with the European Society of Cardiology (ESC) Algorithm in 172 hospital inpatients at risk of stroke due to non-valvular atrial fibrillation, in whom anticoagulant therapy was being considered. The CCS Algorithm and the ESC Algorithm were concordant in 170/172 patients (99% of the time). There were two patients (1%) with vascular disease but no other thromboembolic risk factors who were classified as requiring oral anticoagulant therapy using the ESC Algorithm, but for whom ASA was recommended by the CCS Algorithm. The CCS Algorithm appears to be unnecessarily complicated insofar as it does not appear to provide any additional discriminatory value above and beyond the ESC Algorithm, and its use could result in undertreatment of patients, specifically female patients with vascular disease, whose real risk of stroke has been understated by the Guidelines.

  6. STAR Algorithm Integration Team - Facilitating operational algorithm development

    Science.gov (United States)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  7. A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing

    Science.gov (United States)

    Overmeyer, Austin D.

    2015-01-01

    A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density-gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to two current methods: a manual point-and-click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time needed to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location compared to other methods presented in the literature.

  8. Graphical Presentation of Patient-Treatment Interaction Elucidated by Continuous Biomarkers. Current Practice and Scope for Improvement.

    Science.gov (United States)

    Shen, Yu-Ming; Le, Lien D; Wilson, Rory; Mansmann, Ulrich

    2017-01-09

    Biomarkers providing evidence for patient-treatment interaction are key in the development and practice of personalized medicine. Knowledge that a patient with a specific feature, as demonstrated through a biomarker, would have an advantage under a given treatment vs. a competing treatment can aid immensely in medical decision-making. Statistical strategies to establish evidence of continuous biomarkers are complex, and their formal results are thus not easy to communicate. Good graphical representations would help to translate such findings for use in the clinical community. Although general guidelines on how to present figures in clinical reports are available, there remains little guidance for figures elucidating the role of continuous biomarkers in patient-treatment interaction (CBPTI). To combat the current lack of comprehensive reviews or adequate guides on graphical presentation within this topic, our study proposes presentation principles for CBPTI plots. In order to understand current practice, we review the development of CBPTI methodology and how CBPTI plots are currently used in clinical research. The quality of a CBPTI plot is determined by how well the presentation provides key information for clinical decision-making. Several criteria for a good CBPTI plot are proposed, including general principles of visual display, use of units presenting absolute outcome measures, appropriate quantification of statistical uncertainty, correct display of benchmarks, and informative content for answering clinical questions, especially on the quantitative advantage for an individual patient with regard to a specific treatment. We examined the development of CBPTI methodology from 2000 to 2014, and reviewed how CBPTI plots were used in clinical research in six major clinical journals from 2013 to 2014 using the principle of theoretical saturation. Each CBPTI plot found was assessed for the appropriateness of its presentation and its clinical utility.

  9. Analysis of Various Multi-Objective Optimization Evolutionary Algorithms for Monte Carlo Treatment Planning System

    CERN Document Server

    Tydrichova, Magdalena

    2017-01-01

    In this project, various available multi-objective optimization evolutionary algorithms were compared with respect to their performance and the distribution of their solutions. The main goal was to select the most suitable algorithms for applications in cancer hadron therapy planning. For our purposes, comprehensive testing and analysis software was developed. Several conclusions and hypotheses were also drawn for further research.
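
    The ranking step shared by the algorithms being compared, extracting the non-dominated (Pareto) set of candidate plans, can be sketched as follows (toy objective values, both minimized; this is the generic concept, not any specific algorithm from the study):

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective and
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset, the core ranking step of
    multi-objective evolutionary algorithms such as NSGA-II."""
    return [p for p in points
            if not any(dominates(q, p) for q in points)]

# Toy trade-off between two treatment-planning objectives.
pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(pareto_front(pts))  # [(1, 5), (2, 3), (4, 1)]
```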

  10. Coupling Algorithms for Calculating Sensitivities of Population Balances

    International Nuclear Information System (INIS)

    Man, P. L. W.; Kraft, M.; Norris, J. R.

    2008-01-01

    We introduce a new class of stochastic algorithms for calculating parametric derivatives of the solution of the space-homogeneous Smoluchowski's coagulation equation. Currently, it is very difficult to produce low variance estimates of these derivatives in reasonable amounts of computational time through the use of stochastic methods. These new algorithms consider a central difference estimator of the parametric derivative which is calculated by evaluating the coagulation equation at two different parameter values simultaneously, and causing variance reduction by maximising the covariance between these. The two different coupling strategies ('Single' and 'Double') have been compared to the case when there is no coupling ('Independent'). Both coupling algorithms converge and the Double coupling is the most 'efficient' algorithm. For the numerical example chosen we obtain a factor of about 100 in efficiency in the best case (small system evolution time and small parameter perturbation).
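
    The variance-reduction idea, evaluating the model at both perturbed parameter values on the same random draws so that the covariance between the two estimates is large, can be illustrated on a toy estimator. The function f below merely stands in for the Smoluchowski solver; all names are illustrative:

```python
import random

def sensitivity(theta, h, n, coupled, seed=0):
    """Central-difference Monte Carlo estimate of d/dtheta E[f(theta, Z)].
    With coupled=True, both parameter values see the same random draws,
    which maximizes the covariance of the two evaluations and shrinks
    the variance of their difference."""
    rng_plus = random.Random(seed)
    rng_minus = random.Random(seed if coupled else seed + 1)
    f = lambda t, z: (t + z) ** 2      # toy model standing in for the solver
    samples = [
        (f(theta + h, rng_plus.gauss(0, 1))
         - f(theta - h, rng_minus.gauss(0, 1))) / (2 * h)
        for _ in range(n)
    ]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, var

_, v_coupled = sensitivity(1.0, 0.1, 2000, coupled=True)
_, v_indep = sensitivity(1.0, 0.1, 2000, coupled=False)
print(v_coupled < v_indep)  # True
```

    For this toy model the coupled difference is exact in the random draw, so its variance stays small, while the uncoupled version divides two independent noisy evaluations by the small step 2h and its variance blows up, mirroring the 'Independent' vs. coupled comparison in the abstract.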

  11. Evolutionary Algorithms Application Analysis in Biometric Systems

    Directory of Open Access Journals (Sweden)

    N. Goranin

    2010-01-01

    Full Text Available Wide usage of biometric information for person identity verification purposes, terrorist act prevention measures, and authentication process simplification in computer systems has raised significant attention to the reliability and efficiency of biometric systems. Modern biometric systems still face many reliability- and efficiency-related issues, such as reference database search speed, errors in recognizing biometric information, or automating biometric feature extraction. Current scientific investigations show that the application of evolutionary algorithms may significantly improve biometric systems. In this article we provide a comprehensive review of the main scientific research done in the sphere of evolutionary algorithm application for biometric system parameter improvement.

  12. A parallel line sieve for the GNFS Algorithm

    OpenAIRE

    Sameh Daoud; Ibrahim Gad

    2014-01-01

    RSA is one of the most important public key cryptosystems for information security. The security of RSA depends on the integer factorization problem: it relies on the difficulty of factoring large integers. Much research has gone into the problem of factoring large numbers. Due to advances in factoring algorithms and in computing hardware, the size of the numbers that can be factorized increases exponentially year by year. The General Number Field Sieve algorithm (GNFS) is currently the best ...

  13. Dose-calculation algorithms in the context of inhomogeneity corrections for high energy photon beams

    International Nuclear Information System (INIS)

    Papanikolaou, Niko; Stathakis, Sotirios

    2009-01-01

    Radiation therapy has witnessed a plethora of innovations and developments in the past 15 years. Since the introduction of computed tomography for treatment planning there has been a steady introduction of new methods to refine treatment delivery. Imaging continues to be an integral part of the planning, but also the delivery, of modern radiotherapy. However, all the efforts of image guided radiotherapy, intensity-modulated planning and delivery, adaptive radiotherapy, and everything else that we pride ourselves in having in the armamentarium can fall short, unless there is an accurate dose-calculation algorithm. The agreement between the calculated and delivered doses is of great significance in radiation therapy since the accuracy of the absorbed dose as prescribed determines the clinical outcome. Dose-calculation algorithms have evolved greatly over the years in an effort to be more inclusive of the effects that govern the true radiation transport through the human body. In this Vision 20/20 paper, we look back to see how it all started and where things are now in terms of dose algorithms for photon beams and the inclusion of tissue heterogeneities. Convolution-superposition algorithms have dominated the treatment planning industry for the past few years. Monte Carlo techniques have an inherent accuracy that is superior to any other algorithm and as such will continue to be the gold standard, along with measurements, and maybe one day will be the algorithm of choice for all particle treatment planning in radiation therapy.

  14. Geometry correction Algorithm for UAV Remote Sensing Image Based on Improved Neural Network

    Science.gov (United States)

    Liu, Ruian; Liu, Nan; Zeng, Beibei; Chen, Tingting; Yin, Ninghao

    2018-03-01

    Aiming at the shortcomings of current geometry correction algorithms for UAV remote sensing images, a new algorithm is proposed. An adaptive genetic algorithm (AGA) and an RBF neural network are introduced into this algorithm. Combined with the geometry correction principle for UAV remote sensing images, the algorithm and solving steps of AGA-RBF are presented in order to realize geometry correction for UAV remote sensing. The correction accuracy and operational efficiency are improved by separately optimizing the structure and connection weights of the RBF neural network with the AGA and the LMS algorithm. Finally, experiments show that the AGA-RBF algorithm has the advantages of high correction accuracy, fast execution, and strong generalization ability.
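
    A minimal sketch of the RBF half of such a method: Gaussian basis functions whose output weights are trained by least squares to map distorted control-point coordinates to reference coordinates. The genetic search over network structure is omitted, and the toy distortion and all names are illustrative assumptions:

```python
import numpy as np

def gaussian_phi(pts, centers, width):
    """Design matrix of Gaussian radial basis functions."""
    d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-d ** 2 / (2 * width ** 2))

def rbf_fit(src, dst, centers, width):
    """Least-squares training of the output weights (the weight-training
    half of AGA-RBF; the genetic optimization of centers/widths is
    not sketched here)."""
    W, *_ = np.linalg.lstsq(gaussian_phi(src, centers, width), dst,
                            rcond=None)
    return W

# Toy 'distortion': a shift plus a mild nonlinear warp of control points.
rng = np.random.default_rng(0)
dst = rng.uniform(0, 10, (40, 2))          # reference (ground-control) coords
src = dst + 0.5 + 0.05 * dst ** 2          # distorted image coords
W = rbf_fit(src, dst, centers=src, width=2.0)
corrected = gaussian_phi(src, src, 2.0) @ W
err = np.abs(corrected - dst).max()
print("max correction error at control points:", err)
```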

  15. SHORT OVERVIEW OF CLINICAL TRIALS WITH CURRENT IMMUNOTHERAPEUTIC TOOLS FOR CANCER TREATMENT

    Directory of Open Access Journals (Sweden)

    T. S. Nepomnyashchikh

    2017-01-01

    Full Text Available Over the last decade, substantial progress has been made with respect to the understanding of cancer biology and its interplay with the host immune system. Different immunotherapeutic drugs based on recombinant cytokines and monoclonal antibodies are widely used in cancer therapy, and a large number of experimental cancer treatments have been developed, many of which are currently undergoing various stages of clinical trials. The recent endorsement of the recombinant oncolytic herpesvirus T-VEC for the treatment of melanoma was an important step towards safer and more efficient anticancer therapeutics. In this review, we mention only some of the most promising cancer immunotherapy strategies, namely immune checkpoint inhibitors, cellular therapy, and oncolytic viruses.

  16. A new algorithm to construct phylogenetic networks from trees.

    Science.gov (United States)

    Wang, J

    2014-03-06

    Developing appropriate methods for constructing phylogenetic networks from tree sets is an important problem, and much research is currently being undertaken in this area. BIMLR is an algorithm that constructs phylogenetic networks from tree sets. The algorithm can construct a much simpler network than other available methods. Here, we introduce an improved version of the BIMLR algorithm, QuickCass. QuickCass changes the selection strategy of the labels of leaves below the reticulate nodes (i.e., the nodes with an indegree of at least 2) in BIMLR. We show that QuickCass can construct simpler phylogenetic networks than BIMLR. Furthermore, we show that QuickCass is a polynomial-time algorithm when the output network it constructs is binary.

  17. Comprehensive eye evaluation algorithm

    Science.gov (United States)

    Agurto, C.; Nemeth, S.; Zamora, G.; Vahtel, M.; Soliz, P.; Barriga, S.

    2016-03-01

    In recent years, several research groups have developed automatic algorithms to detect diabetic retinopathy (DR) in individuals with diabetes (DM), using digital retinal images. Studies have indicated that diabetics have 1.5 times the annual risk of developing primary open angle glaucoma (POAG) compared with people without DM. Moreover, DM patients have 1.8 times the risk for age-related macular degeneration (AMD). Although numerous investigators are developing automatic DR detection algorithms, there have been few successful efforts to create an automatic algorithm that can detect other ocular diseases, such as POAG and AMD. Consequently, our aim in the current study was to develop a comprehensive eye evaluation algorithm that not only detects DR in retinal images, but also automatically identifies glaucoma suspects and AMD by integrating other personal medical information with the retinal features. The proposed system is fully automatic and provides the likelihood of each of the three eye diseases. The system was evaluated in two datasets of 104 and 88 diabetic cases. For each eye, we used two non-mydriatic digital color fundus photographs (macula and optic disc centered) and, when available, information about age, duration of diabetes, cataracts, hypertension, gender, and laboratory data. Our results show that the combination of multimodal features can increase the AUC by up to 5%, 7%, and 8% in the detection of AMD, DR, and glaucoma, respectively. Marked improvement was achieved when laboratory results were combined with retinal image features.

  18. Quantum algorithms for topological and geometric analysis of data

    Science.gov (United States)

    Lloyd, Seth; Garnerone, Silvano; Zanardi, Paolo

    2016-01-01

    Extracting useful information from large data sets can be a daunting task. Topological methods for analysing data sets provide a powerful technique for extracting such information. Persistent homology is a sophisticated tool for identifying topological features and for determining how such features persist as the data is viewed at different scales. Here we present quantum machine learning algorithms for calculating Betti numbers—the numbers of connected components, holes and voids—in persistent homology, and for finding eigenvectors and eigenvalues of the combinatorial Laplacian. The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis. PMID:26806491
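
As a purely classical illustration of what the quantum algorithm computes (not the algorithm itself), the zeroth Betti number of a point cloud at scale eps is just the number of connected components when nearby points are joined, computable with union-find:

```python
from itertools import combinations

def betti0(points, eps):
    """Number of connected components when points within eps are joined (Betti-0)."""
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    for (i, p), (j, q) in combinations(enumerate(points), 2):
        if sum((a - b) ** 2 for a, b in zip(p, q)) <= eps ** 2:
            union(i, j)
    return len({find(i) for i in range(len(points))})

# Two well-separated clusters -> two components
pts = [(0, 0), (0.5, 0), (10, 10), (10.5, 10)]
print(betti0(pts, 1.0))  # -> 2
```

Varying eps shows the "persistence" idea: at very small scales every point is its own component, and at very large scales everything merges into one.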

  19. Capacity of non-invasive hepatic fibrosis algorithms to replace transient elastography to exclude cirrhosis in people with hepatitis C virus infection: A multi-centre observational study.

    Science.gov (United States)

    Kelly, Melissa Louise; Riordan, Stephen M; Bopage, Rohan; Lloyd, Andrew R; Post, Jeffrey John

    2018-01-01

    Achievement of the 2030 World Health Organisation (WHO) global hepatitis C virus (HCV) elimination targets will be underpinned by scale-up of testing and use of direct-acting antiviral treatments. In Australia, despite publicly funded testing and treatment, less than 15% of patients were treated in the first year of treatment access, highlighting the need for greater efficiency of health service delivery. To this end, non-invasive fibrosis algorithms were examined to reduce reliance on transient elastography (TE), which is currently utilised for the assessment of cirrhosis in most Australian clinical settings. This retrospective and prospective study, with derivation and validation cohorts, examined consecutive patients in a tertiary referral centre, a sexual health clinic, and a prison-based hepatitis program. The negative predictive value (NPV) of seven non-invasive algorithms was measured using published and newly derived cut-offs. The number of TEs avoided for each algorithm, or combination of algorithms, was determined. The 850 patients included 780 (92%) with HCV mono-infection and 70 (8%) co-infected with HIV or hepatitis B. The mono-infected cohort included 612 men (79%), with an overall prevalence of cirrhosis of 16% (125/780). An 'APRI' algorithm cut-off of 1.0 had a 94% NPV (95%CI: 91-96%). Newly derived cut-offs for the 'APRI' (0.49), 'FIB-4' (0.93) and 'GUCI' (0.5) algorithms each had NPVs of 99% (95%CI: 97-100%), allowing avoidance of TE in 40% (315/780), 40% (310/780) and 40% (298/749) of patients, respectively. When used in combination, NPV was retained and TE avoidance reached 54% (405/749), regardless of gender or co-infection. Non-invasive algorithms can reliably exclude cirrhosis in many patients, allowing improved efficiency of HCV assessment services in Australia and worldwide.
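
The 'APRI' score used above has a standard published formula (AST relative to its upper limit of normal, divided by platelet count, times 100). A minimal sketch of the cut-off logic, using the 0.49 derived cut-off quoted in the abstract; the example laboratory values are hypothetical:

```python
def apri(ast_iu_l: float, ast_uln_iu_l: float, platelets_10e9_l: float) -> float:
    """AST-to-platelet ratio index: (AST / upper limit of normal) / platelets x 100."""
    return (ast_iu_l / ast_uln_iu_l) / platelets_10e9_l * 100.0

def cirrhosis_excluded(score: float, cutoff: float = 0.49) -> bool:
    """Below the cut-off, cirrhosis is considered excluded (high NPV per the study)."""
    return score < cutoff

# Hypothetical patient: AST 30 IU/L with ULN 40 IU/L and platelets 250 x 10^9/L
score = apri(30, 40, 250)  # about 0.3, i.e. below the 0.49 cut-off
```

In the study's workflow, a score below the cut-off would avoid a transient elastography examination; anything above it would still be referred for TE.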

  20. The radiological diagnosis and treatment of renal angiomyolipoma-current status.

    LENUS (Irish Health Repository)

    Halpenny, D

    2010-02-01

    Angiomyolipomas (AMLs) are the most common benign renal neoplasm and are often discovered incidentally. Due both to an increase in the use of imaging and to advances in imaging technology, they are being increasingly identified in the general population. As these lesions are benign, there is good evidence that the majority of them can be safely followed up without treatment. However, there is an increasing wealth of information suggesting that there are individuals with AMLs in whom prophylactic treatment is indicated to prevent complications such as haemorrhage. In such cases, treatment with radiological interventional techniques using subselective particle embolization has superseded surgical techniques in most cases. Even in emergency cases with catastrophic rupture, prompt embolization may save the patient, with the additional benefit of renal salvage. Confident identification of a lesion as an AML is important, as its benign nature obviates the need for surgery in most cases. The presence of fat is paramount in the confirmatory identification and characterization of these lesions. Although fat-rich AMLs are easy to diagnose, some lesions are fat poor, and it is in these cases that newer imaging techniques, such as in-phase and out-of-phase magnetic resonance imaging (MRI), may aid in making a confident diagnosis of AML. In this paper, we comprehensively review the imaging techniques for making a diagnosis of AML, including features of both characteristic and atypical lesions. In addition, we discuss current guidelines for follow-up and prophylactic treatment of these lesions, as well as the increasing role that the interventional radiologist has to play in these cases.

  1. Dealing with uncertainty in the treatment of Helicobacter pylori.

    Science.gov (United States)

    Calvet, Xavier

    2018-04-01

    Helicobacter pylori treatment may be viewed as an uncertain situation, where current knowledge is insufficient to provide evidence-based recommendations for all possible scenarios. Evidence suggests that, under uncertainty conditions, a few simple rules of thumb tend to work better than complex algorithms. Overall, five evidence-based rules of thumb are suggested: (1) Use four drugs; (2) Use maximal acid inhibition; (3) Treat for 2 weeks; (4) Do not repeat antibiotics after treatment failure; and (5) If your treatment works locally, keep using it. These simple rules of thumb may help the reader to select the best alternative for a given patient, choosing between the heterogeneous recommendations provided by the many different consensus conferences on H. pylori treatment recently published.

  2. A pencil beam algorithm for helium ion beam therapy

    Energy Technology Data Exchange (ETDEWEB)

    Fuchs, Hermann; Stroebele, Julia; Schreiner, Thomas; Hirtl, Albert; Georg, Dietmar [Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna, 1090 Vienna (Austria); Department of Radiation Oncology, Medical University of Vienna/AKH Vienna, 1090 Vienna (Austria) and Comprehensive Cancer Center, Medical University of Vienna/AKH Vienna, 1090 Vienna (Austria); Department of Radiation Oncology, Medical University of Vienna/AKH Vienna (Austria) and Comprehensive Cancer Center, Medical University of Vienna/AKH Vienna, 1090 Vienna (Austria); PEG MedAustron, 2700 Wiener Neustadt (Austria); Department of Nuclear Medicine, Medical University of Vienna, 1090 Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna, 1090 Vienna (Austria); Department of Radiation Oncology, Medical University of Vienna/AKH Vienna, 1090 Vienna (Austria) and Comprehensive Cancer Center, Medical University of Vienna/AKH Vienna, 1090 Vienna (Austria)

    2012-11-15

    presented algorithm was considered to be sufficient for clinical practice. Although only data for helium beams was presented, the performance of the pencil beam algorithm for proton beams was comparable. Conclusions: The pencil beam algorithm developed for helium ions presents a suitable tool for dose calculations. Its calculation speed was evaluated to be similar to other published pencil beam algorithms. The flexible design allows easy customization of measured depth-dose distributions and use of varying beam profiles, thus making it a promising candidate for integration into future treatment planning systems. Current work in progress deals with RBE effects of helium ions to complete the model.
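
The abstract gives no kernel details, but the generic pencil-beam superposition it refers to (a central-axis depth-dose value spread laterally by a Gaussian, summed over beamlets) can be caricatured in one lateral dimension; the depth-dose value, spot positions and sigma below are made-up numbers, not helium beam data:

```python
import math

def pencil_beam_dose(x_cm, depth_dose, spots, sigma_cm=0.5):
    """Dose at lateral position x: sum over pencil beams of
    weight * central-axis depth dose * lateral Gaussian spread
    (a 1-D caricature of pencil-beam superposition)."""
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * sigma_cm)
    return sum(w * depth_dose * norm *
               math.exp(-0.5 * ((x_cm - x0) / sigma_cm) ** 2)
               for x0, w in spots)

# Three equally weighted spots approximate a flat field between them
spots = [(-1.0, 1.0), (0.0, 1.0), (1.0, 1.0)]
centre = pencil_beam_dose(0.0, depth_dose=1.0, spots=spots)  # inside the field
edge = pencil_beam_dose(3.0, depth_dose=1.0, spots=spots)    # well outside it
```

A real implementation replaces the scalar `depth_dose` with a measured depth-dose table and lets sigma grow with depth, which is where the customization mentioned in the conclusions comes in.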

  3. PSO-RBF Neural Network PID Control Algorithm of Electric Gas Pressure Regulator

    Directory of Open Access Journals (Sweden)

    Yuanchang Zhong

    2014-01-01

    Full Text Available Current electric gas pressure regulators often adopt a conventional PID control algorithm to drive the core component (the micromotor) of the regulator. In order to further improve tracking performance and to shorten response time, this paper presents an improved intelligent PID control algorithm for the electric gas pressure regulator. The algorithm uses an improved RBF neural network, based on the PSO algorithm, to adjust the PID parameters online. Theoretical analysis and simulation results show that the algorithm shortens the step response time and improves tracking performance.
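
The paper's contribution is the PSO-trained RBF network that retunes the controller online; the underlying discrete PID law being tuned is standard. A minimal sketch, with the gains left as plain parameters that such a tuner could adjust (all numeric values illustrative):

```python
class PID:
    """Discrete PID controller; kp/ki/kd are the parameters an
    RBF-network tuner (as in the paper) would adjust online."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # backward difference
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
u = pid.step(setpoint=1.0, measurement=0.0)  # first control output
```

An online tuner would simply overwrite `pid.kp`, `pid.ki` and `pid.kd` between calls to `step`, which is the hook the PSO-RBF scheme exploits.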

  4. Automatic learning algorithm for the MD-logic artificial pancreas system.

    Science.gov (United States)

    Miller, Shahar; Nimri, Revital; Atlas, Eran; Grunberg, Eli A; Phillip, Moshe

    2011-10-01

    Applying real-time learning into an artificial pancreas system could effectively track the unpredictable behavior of glucose-insulin dynamics and adjust insulin treatment accordingly. We describe a novel learning algorithm and its performance when integrated into the MD-Logic Artificial Pancreas (MDLAP) system developed by the Diabetes Technology Center, Schneider Children's Medical Center of Israel, Petah Tikva, Israel. The algorithm was designed to establish an initial patient profile using open-loop data (Initial Learning Algorithm component) and then make periodic adjustments during closed-loop operation (Runtime Learning Algorithm component). The MDLAP system, integrated with the learning algorithm, was tested in seven different experiments using the University of Virginia/Padova simulator, comprising adults, adolescents, and children. The experiments included simulations using the open-loop and closed-loop control strategy under nominal and varying insulin sensitivity conditions. The learning algorithm was automatically activated at the end of the open-loop segment and after every day of the closed-loop operation. Metabolic control parameters achieved at selected time points were compared. The percentage of time glucose levels were maintained within 70-180 mg/dL for children and adolescents significantly improved when open-loop was compared with day 6 of closed-loop control (P < …); … was significantly reduced by approximately sevenfold (P < …); and there was a significant reduction in the Low Blood Glucose Index (P < 0.001). The new algorithm was effective in characterizing the patient profiles from open-loop data and in adjusting treatment to provide better glycemic control during closed-loop control in both conditions. These findings warrant corroboratory clinical trials.

  5. Contemporary treatment of sexual dysfunction: reexamining the biopsychosocial model.

    Science.gov (United States)

    Berry, Michael D; Berry, Philip D

    2013-11-01

    The introduction of phosphodiesterase type 5 inhibitors has revolutionized the armamentarium of clinicians in the field of sexual medicine. However, pharmacotherapy as a stand-alone treatment option has been criticized, particularly by psychosocial therapists, as incomplete. Specifically, it is widely argued that drug treatment alone often does not meet the standards of biopsychosocial (BPS) therapy. A literature review was performed to explore the role of the biopsychosocial paradigm in the treatment of sexual dysfunction and outline some of the key challenges and possible shortcomings in the current application of biopsychosocial treatment. Published treatment outcomes of integrative biopsychosocial clinical practice, including medical outcomes, psychological and relational factors, treatment of comorbid conditions, cost of treatment, and treatment efficacy, were investigated. Using Medline, PubMed, and EMBASE databases, a literature search for articles published from January 1, 1980, to March 1, 2013, was performed, examining current approaches to the biopsychosocial model of sexual dysfunction and sexual medicine. Data were reviewed and combined, allowing characterization of current treatment approaches and recommendations for clinical practice and future research. The biopsychosocial model of treatment appears to have an intuitively obvious meaning (i.e., treatment of all three facets of the patient's biological-psychological-social condition). However, research suggests that clear treatment algorithms are still in development. By virtue of the ongoing development of biopsychosocial methods in sexual medicine, new models and research initiatives may be warranted. The evidence identified allows for characterization of some of the current clinical, professional, financial, and systemic challenges to biopsychosocial treatment, with the aim of helping identify possible directions for future research. 
Implementation of biopsychosocial treatment, though mandated by

  6. Algorithm for Controlling a Centrifugal Compressor

    Science.gov (United States)

    Benedict, Scott M.

    2004-01-01

    An algorithm has been developed for controlling a centrifugal compressor that serves as the prime mover in a heat-pump system. Experimental studies have shown that the operating conditions for maximum compressor efficiency are close to the boundary beyond which surge occurs. Compressor surge is a destructive condition in which there are instantaneous reversals of flow associated with a high outlet-to-inlet pressure differential. For a given cooling load, the algorithm sets the compressor speed at the lowest possible value while adjusting the inlet guide vane angle and diffuser vane angle to maximize efficiency, subject to an overriding requirement to prevent surge. The onset of surge is detected via oscillations of the electric current supplied to the compressor motor, associated with surge-induced oscillations of the torque exerted by and on the compressor rotor. The algorithm can be implemented in any of several computer languages.
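
The current-oscillation surge cue described above can be caricatured as a rolling-window spread check on motor-current samples; the window length and threshold here are invented for illustration, not taken from the article:

```python
from collections import deque
from statistics import pstdev

class SurgeDetector:
    """Flags incipient surge when the rolling standard deviation of
    motor-current samples exceeds a threshold (illustrative values)."""
    def __init__(self, window=20, threshold_amps=0.5):
        self.samples = deque(maxlen=window)
        self.threshold_amps = threshold_amps

    def update(self, current_amps: float) -> bool:
        self.samples.append(current_amps)
        if len(self.samples) < self.samples.maxlen:
            return False  # not enough history yet
        return pstdev(self.samples) > self.threshold_amps

det = SurgeDetector()
steady = [10.0] * 25                                          # steady draw: no surge
oscillating = [10.0 + (2.0 if i % 2 else -2.0) for i in range(25)]  # +/-2 A swings
```

A controller built on this cue would back off the guide-vane and diffuser-vane settings (or raise speed) as soon as `update` first returns True.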

  7. Accuracy Analysis of Lunar Lander Terminal Guidance Algorithm

    Directory of Open Access Journals (Sweden)

    E. K. Li

    2017-01-01

    Full Text Available This article studies a proposed analytical algorithm for the terminal guidance of a lunar lander. The analytical solution, which forms the basis of the algorithm, was obtained for a constant-acceleration trajectory and thrust vector orientation programs that are essentially linear in time. The main feature of the proposed algorithm is a completely analytical solution that provides lander terminal guidance to the desired spot in 3D space when landing on an atmosphereless body, with no numerical procedures. To meet the 6 terminal conditions (the components of the position and velocity vectors at the final time), 6 guidance law parameters are used, namely time-to-go, the desired value of braking deceleration, and the initial values of the pitch and yaw angles and their rates of change. In accordance with the principle of flexible trajectories, the algorithm assumes the implementation of a regularly updated control program that ensures reaching the terminal conditions from the current state corresponding to the control program update time. The guidance law parameters, which ensure that the terminal conditions are reached, are generated as a function of the current phase coordinates of the lander. The article examines the accuracy and reliability of the proposed analytical algorithm, which provides terminal guidance of the lander in 3D space, through mathematical modeling of the lander guidance from the circumlunar pre-landing orbit to the desired spot near the lunar surface. The desired terminal position of the lunar lander is specified by the selenographic latitude, longitude and altitude above the lunar surface. The impact of variations in orbital parameters on terminal guidance accuracy has been studied. By varying the five initial orbit parameters (obliquity, ascending node longitude, argument of periapsis, periapsis height, and apoapsis height) while the terminal spot is fixed, the statistical characteristics of the terminal guidance algorithm error according to the terminal
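
The paper's fully analytical law is specific to its constant-acceleration formulation, but the flexible-trajectory idea (regenerate the command from the current state at every update so that position and velocity targets are met at the final time) can be illustrated with the well-known ZEM/ZEV feedback guidance law; the scenario numbers below are invented:

```python
def zem_zev_accel(r, v, r_f, v_f, t_go, g=(0.0, 0.0, -1.62)):
    """ZEM/ZEV guidance: commanded (thrust) acceleration to reach position
    r_f with velocity v_f in t_go seconds on an airless body
    (g defaults to lunar surface gravity, m/s^2)."""
    zem = [rf - (ri + vi * t_go + 0.5 * gi * t_go**2)      # zero-effort miss
           for ri, vi, rf, gi in zip(r, v, r_f, g)]
    zev = [vf - (vi + gi * t_go)                           # zero-effort velocity
           for vi, vf, gi in zip(v, v_f, g)]
    return [6.0 * ze / t_go**2 - 2.0 * zv / t_go for ze, zv in zip(zem, zev)]

# Forward simulation: regenerate the command every step from the current state
r, v = [0.0, 0.0, 2000.0], [100.0, 0.0, -30.0]   # start: 2 km up, moving downrange
r_f, v_f = [5000.0, 0.0, 0.0], [0.0, 0.0, 0.0]   # target: soft touchdown at the spot
t_go, dt, g = 100.0, 0.1, (0.0, 0.0, -1.62)
while t_go > dt:
    a = zem_zev_accel(r, v, r_f, v_f, t_go)
    r = [ri + vi * dt + 0.5 * (ai + gi) * dt**2 for ri, vi, ai, gi in zip(r, v, a, g)]
    v = [vi + (ai + gi) * dt for vi, ai, gi in zip(v, a, g)]
    t_go -= dt
```

As in the paper's scheme, the command depends only on the current phase coordinates and the remaining time-to-go, so an interrupted or perturbed trajectory is simply re-solved from wherever the lander happens to be.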

  8. New algorithm for risk analysis in radiotherapy

    International Nuclear Information System (INIS)

    Torres, Antonio; Montes de Oca, Joe

    2015-01-01

    Risk analyses applied to radiotherapy treatments have become an undeniable necessity, considering the dangers generated by the combination of using powerful radiation fields on patients and the occurrence of human errors and equipment failures during these treatments. The technique par excellence for executing these analyses has been the risk matrix. This paper presents the development of a new algorithm to execute the task, with wide graphic and analytic potentialities, thus transforming it into a very useful option for risk monitoring and the optimization of quality assurance. The system SECURE-MR, which is the basic software of this algorithm, has been successfully used in risk analysis of different kinds of radiotherapies. Compared to previous methods, it offers new possibilities of analysis by considering risk-controlling factors such as the robustness of the measures that reduce initiator frequency and mitigate consequences. Its analytic and graphic capabilities allow novel developments for classifying risk-contributing factors and for representing information processes as well as accident sequences. The paper shows the application of the proposed system to a generic radiotherapy treatment process using a linear accelerator. (author)

  9. Modification of transmission dose algorithm for irregularly shaped radiation field and tissue deficit

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Hyong Geon; Shin, Kyo Chul [Dankook Univ., College of Medicine, Seoul (Korea, Republic of); Huh, Soon Nyung; Woo, Hong Gyun; Ha, Sung Whan [Seoul National Univ., College of Medicine, Seoul (Korea, Republic of); Lee, Hyoung Koo [The Catholic Univ., College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    An algorithm for estimating transmission dose was modified for use in partially blocked radiation fields and in cases with tissue deficit. Beam data were measured with a flat solid phantom under various beam-block conditions, and an algorithm for correcting the transmission dose in partially blocked radiation fields was developed from the measured data. The algorithm was tested in clinical settings with irregularly shaped fields. Another algorithm, for correcting the transmission dose for tissue deficit, was developed by physical reasoning and tested in experimental settings with irregular contours mimicking breast cancer patients, using multiple sheets of solid phantom. The beam-block correction algorithm could accurately reflect the effect of beam block, with error within ±1.0%, both for square fields and for irregularly shaped fields. The tissue-deficit correction algorithm could accurately reflect the effect of tissue deficit, with errors within ±1.0% in most situations and within ±3.0% in experimental settings with irregular contours mimicking a breast cancer treatment set-up. The developed algorithms could accurately estimate the transmission dose in most radiation treatment settings, including irregularly shaped fields and irregularly shaped body contours with tissue deficit, in transmission dosimetry.

  10. Modification of transmission dose algorithm for irregularly shaped radiation field and tissue deficit

    International Nuclear Information System (INIS)

    Yun, Hyong Geon; Shin, Kyo Chul; Huh, Soon Nyung; Woo, Hong Gyun; Ha, Sung Whan; Lee, Hyoung Koo

    2002-01-01

    An algorithm for estimating transmission dose was modified for use in partially blocked radiation fields and in cases with tissue deficit. Beam data were measured with a flat solid phantom under various beam-block conditions, and an algorithm for correcting the transmission dose in partially blocked radiation fields was developed from the measured data. The algorithm was tested in clinical settings with irregularly shaped fields. Another algorithm, for correcting the transmission dose for tissue deficit, was developed by physical reasoning and tested in experimental settings with irregular contours mimicking breast cancer patients, using multiple sheets of solid phantom. The beam-block correction algorithm could accurately reflect the effect of beam block, with error within ±1.0%, both for square fields and for irregularly shaped fields. The tissue-deficit correction algorithm could accurately reflect the effect of tissue deficit, with errors within ±1.0% in most situations and within ±3.0% in experimental settings with irregular contours mimicking a breast cancer treatment set-up. The developed algorithms could accurately estimate the transmission dose in most radiation treatment settings, including irregularly shaped fields and irregularly shaped body contours with tissue deficit, in transmission dosimetry.

  11. An enhanced search algorithm for the charged fuel enrichment in equilibrium cycle analysis of REBUS-3

    International Nuclear Information System (INIS)

    Park, Tongkyu; Yang, Won Sik; Kim, Sang-Ji

    2017-01-01

    Highlights: • An enhanced search algorithm for charged fuel enrichment was developed for equilibrium cycle analysis with REBUS-3. • The new search algorithm is not sensitive to the user-specified initial guesses. • The new algorithm reduces the computational time by a factor of 2–3. - Abstract: This paper presents an enhanced search algorithm for the charged fuel enrichment in equilibrium cycle analysis of REBUS-3. The current enrichment search algorithm of REBUS-3 takes a large number of iterations to yield a converged solution or even terminates without a converged solution when the user-specified initial guesses are far from the solution. To resolve the convergence problem and to reduce the computational time, an enhanced search algorithm was developed. The enhanced algorithm is based on the idea of minimizing the number of enrichment estimates by allowing drastic enrichment changes and by optimizing the current search algorithm of REBUS-3. Three equilibrium cycle problems with recycling, without recycling and of high discharge burnup were defined and a series of sensitivity analyses were performed with a wide range of user-specified initial guesses. Test results showed that the enhanced search algorithm is able to produce a converged solution regardless of the initial guesses. In addition, it was able to reduce the number of flux calculations by a factor of 2.9, 1.8, and 1.7 for equilibrium cycle problems with recycling, without recycling, and of high discharge burnup, respectively, compared to the current search algorithm.
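
The abstract does not spell out the enhanced search itself, but the generic shape of such a search (drive a computed quantity to its target by iteratively updating the charged enrichment) can be sketched with a secant update; `toy_k` is a placeholder for the expensive REBUS-3 flux calculation, and the call counter corresponds loosely to the "number of flux calculations" the paper reports reducing:

```python
def search_enrichment(k_eff_of, target=1.0, e0=0.05, e1=0.15, tol=1e-6, max_iter=50):
    """Secant iteration on charged enrichment until k_eff_of(e) ~= target.
    Returns (enrichment, number of evaluations of k_eff_of)."""
    f0, f1 = k_eff_of(e0) - target, k_eff_of(e1) - target
    calls = 2
    for _ in range(max_iter):
        if abs(f1) < tol:
            return e1, calls
        e0, e1 = e1, e1 - f1 * (e1 - e0) / (f1 - f0)  # secant step
        f0, f1 = f1, k_eff_of(e1) - target
        calls += 1
    raise RuntimeError("enrichment search did not converge")

# Toy linear reactivity model standing in for the flux solve (purely illustrative)
toy_k = lambda e: 0.80 + 2.0 * e
enrichment, n_calls = search_enrichment(toy_k)  # exact answer here: e = 0.10
```

A robust production search additionally needs safeguards (step limits, bracketing) so that poor initial guesses cannot derail it, which is precisely the sensitivity the paper's enhanced algorithm addresses.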

  12. Selfish Gene Algorithm Vs Genetic Algorithm: A Review

    Science.gov (United States)

    Ariff, Norharyati Md; Khalid, Noor Elaiza Abdul; Hashim, Rathiah; Noor, Noorhayati Mohamed

    2016-11-01

    Evolutionary algorithms (EAs) are among the algorithms inspired by nature, and within little more than a decade hundreds of papers have reported their successful application. The Selfish Gene Algorithm (SFGA) is one of the more recent EAs, inspired by the Selfish Gene Theory, an interpretation of Darwinian ideas by the biologist Richard Dawkins. In this paper, following a brief introduction to the SFGA, the chronology of its evolution is presented. The purpose of this paper is to present an overview of the concepts of the SFGA as well as its opportunities and challenges. Accordingly, the history and the steps involved in the algorithm are discussed, and its different applications, together with an analysis of these applications, are evaluated.
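
The review gives no pseudocode, but the flavour of a Selfish Gene-style search (a "virtual population" of allele frequencies, updated by rewarding the alleles of the winner of pairwise contests) can be sketched on the OneMax toy problem; all parameters are illustrative, and this sketch is closely related to the compact genetic algorithm rather than any specific published SFGA variant:

```python
import random

def selfish_gene_onemax(n_bits=20, rate=0.05, steps=3000, seed=1):
    """Selfish Gene-style search on OneMax (maximise the number of 1-bits):
    no explicit population is stored, only a vector of allele frequencies.
    Two individuals are sampled per step; the fitter one's alleles are
    rewarded at the loser's expense wherever the two differ."""
    random.seed(seed)
    p = [0.5] * n_bits  # marginal probability that each locus carries a 1
    sample = lambda: [1 if random.random() < p[i] else 0 for i in range(n_bits)]
    for _ in range(steps):
        a, b = sample(), sample()
        winner, loser = (a, b) if sum(a) >= sum(b) else (b, a)
        for i in range(n_bits):
            if winner[i] != loser[i]:
                delta = rate if winner[i] == 1 else -rate
                p[i] = min(1.0, max(0.0, p[i] + delta))
    return [1 if q >= 0.5 else 0 for q in p]

best = selfish_gene_onemax()  # frequencies drift toward the all-ones optimum
```

The defining difference from a classical GA is visible here: selection acts on gene frequencies directly, with no crossover over a stored population.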

  13. Age-related macular degeneration: using morphological predictors to modify current treatment protocols.

    Science.gov (United States)

    Ashraf, Mohammed; Souka, Ahmed; Adelman, Ron A

    2018-03-01

    To assess predictors of treatment response in neovascular age-related macular degeneration (AMD) in an attempt to develop a patient-centric treatment algorithm. We conducted a systematic search using PubMed, EMBASE and Web of Science for prognostic indicators/predictive factors with the key words: 'age related macular degeneration', 'neovascular AMD', 'choroidal neovascular membrane (CNV)', 'anti-vascular endothelial growth factor (anti-VEGF)', 'aflibercept', 'ranibizumab', 'bevacizumab', 'randomized clinical trials', 'post-hoc', 'prognostic', 'predictive', 'response', 'injection frequency', 'treat and extend' (TAE), 'pro re nata' (PRN), 'bi-monthly' and 'quarterly'. We only included studies that had an adequate period of follow-up (>1 year), a single predefined treatment regimen with predetermined re-injection criteria, an adequate number of patients, and specific morphological [optical coherence tomography (OCT)] criteria that predicted final visual outcomes and injection frequency, and that did not include switching from one drug to the other. We were able to identify seven prospective studies and 16 retrospective studies meeting our inclusion criteria. There are several morphological and demographic prognostic indicators that can predict response to therapy in wet AMD. Smaller CNV size, subretinal fluid (SRF), retinal angiomatous proliferation (RAP) and response to therapy at 12 weeks (visual, angiographic or OCT) can all predict good visual outcomes in patients receiving anti-VEGF therapy. Patients with larger CNV, older age, pigment epithelial detachment (PED), intraretinal cysts (IRC) and vitreomacular adhesion (VMA) achieved smaller visual gains. Patients with VMA/vitreomacular traction (VMT) required more intensive treatment with increased injection frequency. Patients with both posterior vitreous detachment (PVD) and SRF require infrequent injections. Patients with PED are prone to recurrences of fluid activity with a reduction in visual acuity (VA). A regimen that involves less intensive

  14. Comparison of Acuros (AXB) and Anisotropic Analytical Algorithm (AAA) for dose calculation in treatment of oesophageal cancer: effects on modelling tumour control probability.

    Science.gov (United States)

    Padmanaban, Sriram; Warren, Samantha; Walsh, Anthony; Partridge, Mike; Hawkins, Maria A

    2014-12-23

    To investigate systematic changes in dose arising when treatment plans optimised using the Anisotropic Analytical Algorithm (AAA) are recalculated using Acuros XB (AXB) in patients treated with definitive chemoradiotherapy (dCRT) for locally advanced oesophageal cancers. We have compared treatment plans created using AAA with those recalculated using AXB. Although AAA is currently more widely used in clinical routine, AXB has been shown to calculate the dose distribution more accurately, particularly in heterogeneous regions. Studies to predict clinical outcome should be based on modelling the dose delivered to the patient as accurately as possible. CT datasets from ten patients were selected for this retrospective study. VMAT (volumetric modulated arc therapy) plans with 2 arcs, collimator rotation ± 5-10° and dose prescription 50 Gy / 25 fractions were created using Varian Eclipse (v10.0). The initial dose calculation was performed with AAA, and AXB plans were created by re-calculating the dose distribution using the same number of monitor units (MU) and multileaf collimator (MLC) files as the original plan. The difference in calculated dose to organs at risk (OAR) was compared using dose-volume histogram (DVH) statistics, and p values were calculated using the Wilcoxon signed rank test. The potential clinical effect of dosimetric differences in the gross tumour volume (GTV) was evaluated using three different TCP models from the literature. PTV median dose was 0.9 Gy lower (range: 0.5 Gy - 1.3 Gy; p < …) in AAA plans re-calculated with AXB, and GTV mean dose was reduced by on average 1.0 Gy (0.3 Gy - 1.5 Gy; p < …); OAR doses were also lower in the AXB-recalculated plan than in the AAA plan (on average, dose reduction: lung 1.7%, heart 2.4%). Similar trends were seen for CRT plans. Differences in dose distribution are observed with VMAT and CRT plans recalculated with AXB, particularly within soft tissue at the tumour/lung interface, where AXB has been shown to more
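
The TCP-modelling step can be illustrated with one common published form, the Poisson/linear-quadratic TCP model. This is not necessarily one of the three models the authors used, and the radiobiological parameters below are generic placeholders; the point is only how sensitive the modelled probability is to a ~1 Gy dose difference:

```python
import math

def poisson_lq_tcp(total_dose_gy, n_fractions, n0=1e7, alpha=0.3, beta=0.03):
    """Poisson TCP with linear-quadratic cell survival:
    TCP = exp(-N0 * SF), SF = exp(-alpha*D - beta*d*D),
    where D is total dose and d = D/n is the dose per fraction."""
    d = total_dose_gy / n_fractions
    surviving = n0 * math.exp(-alpha * total_dose_gy - beta * d * total_dose_gy)
    return math.exp(-surviving)

# Illustrative only: a ~1 Gy drop in delivered dose lowers the modelled TCP
tcp_50 = poisson_lq_tcp(50, 25)
tcp_49 = poisson_lq_tcp(49, 25)
```

With these placeholder parameters the 1 Gy difference shifts the modelled TCP by several percentage points, which is why systematic AAA-versus-AXB dose offsets can matter for outcome modelling.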

  15. Evaluation of HIV-1 rapid tests and identification of alternative testing algorithms for use in Uganda.

    Science.gov (United States)

    Kaleebu, Pontiano; Kitandwe, Paul Kato; Lutalo, Tom; Kigozi, Aminah; Watera, Christine; Nanteza, Mary Bridget; Hughes, Peter; Musinguzi, Joshua; Opio, Alex; Downing, Robert; Mbidde, Edward Katongole

    2018-02-27

    The World Health Organization recommends that countries conduct two-phase evaluations of HIV rapid tests (RTs) in order to come up with the best algorithms. In this report, we present the first ever such evaluation in Uganda, involving both blood-based and oral-based RTs. The role of weak positive (WP) bands on the accuracy of the individual RTs and of the algorithms was also investigated. In total, 11 blood-based and 3 oral-transudate kits were evaluated. Altogether, 2746 participants from seven sites, covering the four different regions of Uganda, participated. Two enzyme immunoassays (EIAs) run in parallel were used as the gold standard. The performance and cost of the different algorithms were calculated, with a pre-determined price cut-off of either cheaper than, or within 20% of the price of, the current algorithm of Determine + Statpak + Unigold. In the second phase, the three best algorithms selected in phase I were used at the point of care for purposes of quality control, using finger-stick whole blood. We identified three algorithms as having performed better and met the cost requirements: Determine + SD Bioline + Statpak and Determine + Statpak + SD Bioline, both with the same sensitivity and specificity of 99.2% and 99.1% respectively, and Determine + Statpak + Insti, with sensitivity and specificity of 99.1% and 99% respectively. There were 15 other algorithms that performed better than the current one but cost more than the 20% price threshold. None of the 3 oral mucosal transudate kits was suitable for inclusion in an algorithm because of their low sensitivities. Band intensity affected the performance of individual RTs but not the final algorithms. We have come up with three algorithms that we recommend for public or Government procurement based on accuracy and cost. If a single algorithm is preferred, we recommend replacing Unigold, the current tie-breaker, with SD Bioline. We further recommend that all 18 algorithms that have shown better performance than the current one are made
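
Under a strong assumption of conditional independence between kits, the accuracy of a simple serial (all-tests-must-be-reactive) chain can be approximated from per-kit figures. This is only an illustration of the arithmetic: the study measured its algorithms (which include a tie-breaker step) directly against the EIA gold standard, and the per-kit numbers below are hypothetical:

```python
def serial_accuracy(tests):
    """Sensitivity and specificity of a serial confirmatory chain,
    assuming conditional independence: a specimen is called positive
    only if every kit in the chain is reactive."""
    sens = 1.0
    false_pos = 1.0
    for sensitivity, specificity in tests:
        sens *= sensitivity                 # all kits must detect a true case
        false_pos *= (1.0 - specificity)    # all kits must misfire together
    return sens, 1.0 - false_pos

# Hypothetical per-kit figures (not from the study)
kits = [(0.998, 0.98), (0.997, 0.99)]
sens, spec = serial_accuracy(kits)
```

The pattern visible here matches the study's design logic: chaining kits trades a little sensitivity for a large gain in specificity, which is why the choice and order of kits (and of the tie-breaker) is evaluated empirically.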

  16. MODA: an efficient algorithm for network motif discovery in biological networks.

    Science.gov (United States)

    Omidi, Saeed; Schreiber, Falk; Masoudi-Nejad, Ali

    2009-10-01

    In recent years, interest has been growing in the study of complex networks. Since Erdös and Rényi (1960) proposed their random graph model about 50 years ago, many researchers have investigated and shaped this field. Many indicators have been proposed to assess the global features of networks. Recently, an active research area has developed in studying local features named motifs as the building blocks of networks. Unfortunately, network motif discovery is a computationally hard problem and finding rather large motifs (larger than 8 nodes) by means of current algorithms is impractical as it demands too much computational effort. In this paper, we present a new algorithm (MODA) that incorporates techniques such as a pattern growth approach for extracting larger motifs efficiently. We have tested our algorithm and found it able to identify larger motifs with more than 8 nodes more efficiently than most of the current state-of-the-art motif discovery algorithms. While most of the algorithms rely on induced subgraphs as motifs of the networks, MODA is able to extract both induced and non-induced subgraphs simultaneously. The MODA source code is freely available at: http://LBB.ut.ac.ir/Download/LBBsoft/MODA/
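
MODA's pattern-growth enumeration is beyond an abstract-sized sketch, but the baseline task it accelerates (a census of induced subgraphs of a fixed size) can be shown naively; the exponential cost of this brute-force version over all node triples is exactly what motif-discovery tools work to avoid at larger motif sizes:

```python
from itertools import combinations

def induced_3node_counts(edges):
    """Naive census of induced 3-node subgraphs of an undirected graph:
    returns (#triangles, #paths). Cost grows as C(n, k) in motif size k,
    which is why 8+-node motifs defeat brute force."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    triangles = paths = 0
    for a, b, c in combinations(sorted(adj), 3):
        k = (b in adj[a]) + (c in adj[a]) + (c in adj[b])
        if k == 3:
            triangles += 1      # all three edges present: induced triangle
        elif k == 2:
            paths += 1          # exactly two edges: induced 2-path
    return triangles, paths

# A triangle on {1,2,3} with a pendant node 4
tri, path = induced_3node_counts([(1, 2), (2, 3), (1, 3), (3, 4)])
```

Motif discovery proper then compares such counts against randomized graphs with the same degree sequence to decide which subgraphs are statistically over-represented.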

  17. Algorithms

    Indian Academy of Sciences (India)

    to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...
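One of the paradigms named above, dynamic programming, can be illustrated with the classic coin-change problem; this example is a generic sketch, not taken from the article itself:

```python
def min_coins(coins, amount):
    """Fewest coins summing to `amount`, or -1 if impossible.

    Dynamic programming: best[a] is the optimal answer for sub-amount a,
    built bottom-up from smaller sub-amounts (optimal substructure).
    """
    INF = float('inf')
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[amount] if best[amount] != INF else -1
```

The same instance also shows why the greedy strategy is not always optimal: for coins {1, 5, 12} and amount 15, greedy picks 12 + 1 + 1 + 1 (four coins), while dynamic programming finds 5 + 5 + 5 (three coins).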

  18. Early nerve repair in traumatic brachial plexus injuries in adults: treatment algorithm and first experiences.

    Science.gov (United States)

    Pondaag, Willem; van Driest, Finn Y; Groen, Justus L; Malessy, Martijn J A

    2018-01-26

    OBJECTIVE The object of this study was to assess the advantages and disadvantages of early nerve repair, within 2 weeks of injury, following adult traumatic brachial plexus injury (ATBPI). METHODS From 2009 onwards, the authors have strived to repair, as early as possible, extended C-5 to C-8 or T-1 lesions or complete loss of C-5 to C-6 or C-7 function in patients in whom there was clinical and radiological suspicion of root avulsion. Among a group of 36 patients surgically treated between 2009 and 2011, surgical findings in those who had undergone treatment within 2 weeks after trauma were retrospectively compared with results in those who had undergone delayed treatment. The result of biceps muscle reanimation was the primary outcome measure. RESULTS Five of the 36 patients were referred within 2 weeks after trauma and were eligible for early surgery. Nerve ruptures and/or avulsions were found in all cases of early surgery. The advantages of early surgery are as follows: no scar formation, easy anatomical identification, and reduction of gap length. Disadvantages include less clear demarcation of vital nerve tissue and unfamiliarity with the interpretation of frozen-section examination findings. All 5 early-treatment patients recovered a biceps force rated Medical Research Council grade 4. CONCLUSIONS Preliminary results of nerve repair within 2 weeks of ATBPI are encouraging, and the benefits outweigh the drawbacks. The authors propose a decision algorithm to select patients eligible for early surgery. Referral standards for patients with ATBPI must be adapted to enable early surgery.

  19. Aggressive spinal haemangiomas: imaging correlates to clinical presentation with analysis of treatment algorithm and clinical outcomes

    Science.gov (United States)

    Cloran, Francis J; Pukenas, Bryan A; Loevner, Laurie A; Aquino, Christopher; Schuster, James

    2015-01-01

    Objective: Aggressive spinal haemangiomas (those with significant osseous expansion/extraosseous extension) represent approximately 1% of spinal haemangiomas and are usually symptomatic. In this study, we correlate imaging findings with presenting symptomatology, review treatment strategies and their outcomes and propose a treatment algorithm. Methods: 16 patients with aggressive haemangiomas were retrospectively identified from 1995 to 2013. Imaging was assessed for size, location, CT/MR characteristics, osseous expansion and extraosseous extension. Presenting symptoms, management and outcomes were reviewed. Results: Median patient age was 52 years. Median size was 4.5 cm. Lumbar spine was the commonest location (n = 8), followed by thoracic spine (n = 7) and sacrum (n = 2); one case involved the lumbosacral junction. 12 haemangiomas had osseous expansion; 13 had extraosseous extension [epidural (n = 11), pre-vertebral/paravertebral (n = 10) and foraminal (n = 6)]. On CT, 11 had accentuated trabeculae and 5 showed lysis. On MRI, eight were T1 hyperintense, six were T1 hypointense and all were T2 hyperintense. 11 symptomatic patients underwent treatment: chemical ablation (n = 6), angioembolization (n = 3, 2 had subsequent surgery), radiotherapy (n = 2, 1 primary and 1 adjuvant) and surgery (n = 4). Median follow-up was 20 months. Four of six patients managed only by percutaneous methods had symptom resolution. Three of four patients requiring surgery had symptom resolution. Conclusion: Aggressive haemangiomas cause significant morbidity. Treatment is multidisciplinary, with surgery reserved for large lesions and those with focal neurological signs. Minimally invasive procedures may be successful in smaller lesions. Advances in knowledge: Aggressive haemangiomas are rare, but knowledge of their imaging features and treatment strategies enhances the radiologist's role in their management. PMID:26313498

  20. Algorithm for lamotrigine dose adjustment before, during, and after pregnancy

    DEFF Research Database (Denmark)

    Sabers, A

    2012-01-01

    Sabers A. Algorithm for lamotrigine dose adjustment before, during, and after pregnancy. Acta Neurol Scand: DOI: 10.1111/j.1600-0404.2011.01627.x. © 2011 John Wiley & Sons A/S. Background - Treatment with lamotrigine (LTG) during pregnancy is associated with a pronounced risk of seizure deterioration ...