Yuan-Hong Jiang
OBJECTIVES: The aim of this study was to investigate the predictive values of the total International Prostate Symptom Score (IPSS-T) and the voiding-to-storage subscore ratio (IPSS-V/S), in association with total prostate volume (TPV) and maximum urinary flow rate (Qmax), in the diagnosis of bladder outlet-related lower urinary tract dysfunction (LUTD) in men with lower urinary tract symptoms (LUTS). METHODS: A total of 298 men with LUTS were enrolled. Video-urodynamic studies were used to determine the causes of LUTS. Differences in IPSS-T, IPSS-V/S ratio, TPV and Qmax between patients with bladder outlet-related LUTD and bladder-related LUTD were analyzed. The positive and negative predictive values (PPV and NPV) for bladder outlet-related LUTD were calculated using these parameters. RESULTS: Of the 298 men, bladder outlet-related LUTD was diagnosed in 167 (56%). We found that the IPSS-V/S ratio was significantly higher among patients with bladder outlet-related LUTD than among patients with bladder-related LUTD (2.28±2.25 vs. 0.90±0.88, p<...). When IPSS-V/S>1 or >2 was factored into the equation instead of IPSS-T, PPV were 91.4% and 97.3%, respectively, and NPV were 54.8% and 49.8%, respectively. CONCLUSIONS: Combining IPSS-T with TPV and Qmax increases the PPV for bladder outlet-related LUTD. Furthermore, including IPSS-V/S>1 or >2 in the equation results in a higher PPV than IPSS-T. IPSS-V/S>1 is a stronger predictor of bladder outlet-related LUTD than IPSS-T.
20 CFR 226.52 - Total annuity subject to maximum.
2010-04-01
... rate effective on the date the supplemental annuity begins, before any reduction for a private pension... (20 CFR, Employees' Benefits, 2010-04-01 edition; COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES, Railroad Retirement Family Maximum, § 226.52...)
Water Quality Assessment and Total Maximum Daily Loads Information (ATTAINS)
U.S. Environmental Protection Agency — The Water Quality Assessment TMDL Tracking And Implementation System (ATTAINS) stores and tracks state water quality assessment decisions, Total Maximum Daily Loads...
Environmental Monitoring, Water Quality - Total Maximum Daily Load (TMDL)
NSGIC GIS Inventory (aka Ramona) — The Clean Water Act Section 303(d) establishes the Total Maximum Daily Load (TMDL) program. The purpose of the TMDL program is to identify sources of pollution and...
Environmental Monitoring, Water Quality - Total Maximum Daily Load (TMDL)
NSGIC Education | GIS Inventory — The Clean Water Act Section 303(d) establishes the Total Maximum Daily Load (TMDL) program. The purpose of the TMDL program is to identify sources of pollution and...
Validity and Reliability of the Achilles Tendon Total Rupture Score
Ganestam, Ann; Barfod, Kristoffer; Klit, Jakob
2013-01-01
The best treatment of acute Achilles tendon rupture remains debated. Patient-reported outcome measures have become cornerstones in treatment evaluations. The Achilles tendon total rupture score (ATRS) has been developed for this purpose but requires additional validation. The purpose of the present study was to validate a Danish translation of the ATRS. The ATRS was translated into Danish according to internationally adopted standards. Of 142 patients, 90 with previous rupture of the Achilles tendon participated in the validity study and 52 in the reliability study. The ATRS showed moderately...
Caregiver Burden: Looking Beyond the Unidimensional Total Score.
Lau, Sabrina; Chong, Mei Sian; Ali, Noorhazlina; Chan, Mark; Chua, Kia Chong; Lim, Wee Shiong
2015-01-01
The Zarit Burden Interview allows caregiver burden to be interpreted from a total score. However, recent studies propose a multidimensional Zarit Burden Interview model. This study aims to determine the agreement between unidimensional (UD) and multidimensional (MD) classification of burden, and differences in predictors among identified groups. We studied 165 dyads of dementia patients and primary caregivers. Caregivers were dichotomized into low-burden and high-burden groups based upon: (1) UD score using quartile cutoffs; and (2) MD model via exploratory cluster analysis. We compared UD versus MD 2×2 classification of burden using κ statistics. Caregivers not showing agreement by either definition were classified as "intermediate" burden. We performed binary logistic regression to ascertain differences in predictive factors. The 2 models showed moderate agreement (κ=0.72, P<0.01), yielding 104 low, 20 intermediate (UD "low burden"/MD "high burden"), and 41 high-burden caregivers. Neuropsychiatric symptoms [odds ratio (OR)=1.27, P=0.003], coresidence (OR=6.32, P=0.040), and decreased caregiving hours (OR=0.99, P=0.018) were associated with intermediate burden, whereas neuropsychiatric symptoms (OR=1.21, P=0.001) and adult children caregivers (OR=2.80, P=0.055) were associated with high burden. Our results highlight the differences between UD and MD classification of caregiver burden. Future studies should explore the significance of the noncongruent intermediate group and its predictors.
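The agreement statistic used above can be reproduced in a few lines. The sketch below is a minimal illustration with toy labels (not the study's data): it computes Cohen's κ for two binary raters, here standing in for the UD and MD burden classifications.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters (1 = high burden, 0 = low)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n     # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                # marginal 'high' rates
    pe = pa * pb + (1 - pa) * (1 - pb)             # agreement expected by chance
    return (po - pe) / (1 - pe)

# toy labels for eight caregivers: UD vs MD classification
ud = [1, 1, 0, 0, 1, 0, 0, 0]
md = [1, 1, 0, 0, 0, 1, 0, 0]
print(round(cohens_kappa(ud, md), 2))  # 0.47
```

Dyads where the two classifications disagree (the fifth and sixth caregivers above) correspond to the "intermediate" burden group the study describes.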
Bangalore, Harish; Gaies, Michael; Ocampo, Elena C; Heinle, Jeffrey S; Guffey, Danielle; Minard, Charles G; Checchia, Paul; Shekerdemian, Lara S
2017-08-01
The aim of the present study was to explore the association between a new vasoactive score - the Total Inotrope Exposure Score - and outcome, and to compare it with the established Vasoactive Inotrope Score, in children undergoing cardiac surgery with cardiopulmonary bypass. DESIGN: The present study was a single-centre, retrospective study. The study was carried out at a 21-bed cardiovascular ICU in a tertiary children's hospital between September, 2010 and May, 2011. METHODS: The Total Inotrope Exposure Score is a new vasoactive score that brings together cumulative vasoactive drug exposure and incorporates dose adjustments over time. The performance of these scores - average and maximum Vasoactive Inotrope Score at 24 and 48 hours, and Total Inotrope Exposure Score - in predicting primary clinical outcomes - death, cardiopulmonary resuscitation, or extra-corporeal membrane oxygenation before hospital discharge - and secondary outcomes - length of invasive mechanical ventilation, length of ICU stay, and length of hospital stay - was calculated. MAIN RESULTS: The study cohort included 167 children under 18 years of age, with 37 (22.2%) neonates and 65 (41.3%) infants aged between 1 month and 1 year. The Total Inotrope Exposure Score best predicted the primary outcome (six of 167 cases), with an unadjusted odds ratio for a poor outcome of 42 (4.8, 369.6). Although its area under the curve was higher than that of the other scores, this difference did not reach statistical significance. The Total Inotrope Exposure Score also best predicted prolonged invasive mechanical ventilation, length of ICU stay, and length of hospital stay as compared with the other scores. CONCLUSIONS: The Total Inotrope Exposure Score appears to have a good association with poor postoperative outcomes and warrants prospective validation across larger numbers of patients across institutions.
Factors Predicting the Forgotten Joint Score After Total Knee Arthroplasty.
Behrend, Henrik; Zdravkovic, Vilijam; Giesinger, Johannes; Giesinger, Karlmeinrad
2016-09-01
We recently developed the forgotten joint score-12 (FJS-12), a tool to assess joint awareness in everyday life. It is unknown whether there are patient factors that predict the outcome of the FJS-12 after total knee arthroplasty (TKA). Five hundred forty cases of TKA were analyzed. Objective clinical results were obtained for range of motion, stability, and alignment. Patient-reported outcome was assessed using the FJS-12. Baseline data and complications were recorded. Cluster analysis based on FJS-12, postoperative flexion, and age resulted in 3 groups: poor outcome (88 patients), good outcome (340 patients), and excellent outcome (118 patients). The characteristics of the "poor" and "excellent" clusters were compared more closely using bivariate comparative tests and logistic regression. We found that male patients around 63 years of age with a lower body mass index were most likely to be allocated to the "excellent" cluster (defined as high FJS-12 and high postoperative flexion). Preoperative extension and flexion, stability, varus/valgus alignment, surgery prior to TKA, and comorbidities were not predictive of the FJS-12 at 1-year follow-up. We identified 3 preoperative patient-related factors that may predict the FJS-12 after TKA: body mass index, age, and gender. These findings can be used to guide decision-making and important preoperative discussions on expectations after TKA. Copyright © 2016 Elsevier Inc. All rights reserved.
7 CFR 51.1178 - Maximum anhydrous citric acid permissible for corresponding total soluble solids.
2010-01-01
... Maximum anhydrous citric acid permissible for corresponding total soluble solids. For determining the grade of juice, the maximum permissible anhydrous citric acid content... (7 CFR, Agriculture, 2010-01-01 edition; Citrus sinensis (L.) Osbeck, § 51.1178...)
Maximum Potential Score (MPS): An operating model for a successful customer-focused strategy.
Cabello González, José Manuel
2015-11-01
One of marketers' chief objectives is to achieve customer loyalty, which is a key factor for profitable growth. Therefore, they need to develop a strategy that attracts and maintains customers, giving them adequate motives, both tangible (prices and promotions) and intangible (personalized service and treatment), to satisfy customers and make them loyal to the company. Finding a way to accurately measure satisfaction and customer loyalty is very important. Among typical Relationship Marketing measures, we can consider listening to customers, which can help to achieve a sustainable competitive advantage. Customer satisfaction surveys are essential tools for listening to customers. Short questionnaires have gained considerable acceptance among marketers as a means of measuring customer satisfaction. Our research provides an indication of the benefits of a short questionnaire (one to three questions). We find that the number of questions in a survey is significantly related to participation in the survey (Net Promoter Score, or NPS). We also show that a three-question survey is more likely to have more participants than a traditional survey (Maximum Potential Score, or MPS). Our main goal is to analyse one method as a potential predictor of customer loyalty. Using surveys, we attempt to empirically establish the causal factors determining customer satisfaction. This paper describes a maximum-potential operating model that captures, with a three-question survey, important elements for a successful customer-focused strategy. MPS may give us lower participation rates than NPS, but important information that helps to convert unhappy or merely satisfied customers into loyal customers.
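Since the abstract leans on the Net Promoter Score, a note on how NPS is conventionally computed from the single 0-10 recommendation question. The sketch and its ratings are illustrative, not from the paper:

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    from answers to the 0-10 'would you recommend us?' question."""
    n = len(ratings)
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / n

print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors -> 0.0
```

Respondents scoring 7-8 ("passives") count toward the denominator only, which is why adding neutral answers dilutes the score.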
National Aeronautics and Space Administration — Probability Calibration by the Minimum and Maximum Probability Scores in One-Class Bayes Learning for Anomaly Detection. Guichong Li, Nathalie Japkowicz, Ian Hoffman,...
Analysing relations between specific and total liking scores
Menichelli, Elena; Kraggerud, Hilde; Olsen, Nina Veflen
2013-01-01
The objective of this article is to present a new statistical approach for the study of consumer liking. Total liking data are extended by incorporating liking for specific sensory properties. The approach combines different analyses for the purpose of investigating the most important aspects of ...
A Sufficient Condition for Planar Graphs with Maximum Degree 8 to Be 9-totally Colorable
Jian Sheng CAI; Chang Chun TENG; Gui Ying YAN
2014-01-01
A total k-coloring of a graph G is a coloring of V(G) ∪ E(G) using k colors such that no two adjacent or incident elements receive the same color. The total chromatic number χ″(G) is the smallest integer k such that G has a total k-coloring. It is known that if a planar graph G has maximum degree Δ ≥ 9, then χ″(G) = Δ + 1. In this paper, we prove that if G is a planar graph with maximum degree 8 and without a fan of four adjacent 3-cycles, then χ″(G) = 9.
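To make the definition concrete, here is a small Python checker (not from the paper) that verifies a total coloring: vertex colors are keyed by vertex, edge colors by the frozenset of the edge's endpoints.

```python
from itertools import combinations

def is_total_coloring(edges, color):
    """True iff no two adjacent or incident elements share a color."""
    for u, v in edges:
        e = frozenset((u, v))
        if color[u] == color[v]:                 # adjacent vertices
            return False
        if color[e] in (color[u], color[v]):     # edge incident to its endpoints
            return False
    for e1, e2 in combinations(edges, 2):        # edges sharing an endpoint
        if set(e1) & set(e2) and color[frozenset(e1)] == color[frozenset(e2)]:
            return False
    return True

# K3 has maximum degree 2 and admits a total 3-coloring, so chi''(K3) = 3
edges = [("a", "b"), ("b", "c"), ("c", "a")]
color = {"a": 0, "b": 1, "c": 2,
         frozenset(("a", "b")): 2,
         frozenset(("b", "c")): 0,
         frozenset(("c", "a")): 1}
print(is_total_coloring(edges, color))  # True
```

Recoloring any single element to clash with a neighbor or an incident element makes the checker return False.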
Dai, Huanping; Micheyl, Christophe
2015-05-01
Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
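The Gaussian special case the article references can be checked against a brute-force simulation. The following Python sketch (not the authors' MATLAB code) computes the maximum 2AFC proportion correct analytically as Pc = Φ(d′/√2) and by Monte Carlo, assuming an optimal observer that picks the interval with the larger observation (valid when the likelihood ratio is monotone):

```python
import math
import random

def pc_2afc_gaussian(d_prime):
    # Analytic maximum Pc for 2AFC with equal-variance Gaussian noise:
    # Pc = Phi(d'/sqrt(2)) = 0.5 * (1 + erf(d'/2))
    return 0.5 * (1 + math.erf(d_prime / 2))

def pc_2afc_mc(draw_signal, draw_noise, trials=200_000, seed=1):
    # Monte Carlo estimate: with a monotone likelihood ratio the optimal
    # rule simply picks the interval containing the larger sample
    rng = random.Random(seed)
    hits = sum(draw_signal(rng) > draw_noise(rng) for _ in range(trials))
    return hits / trials

d = 1.0
analytic = pc_2afc_gaussian(d)                                      # ~0.760
simulated = pc_2afc_mc(lambda r: r.gauss(d, 1), lambda r: r.gauss(0, 1))
print(abs(analytic - simulated) < 0.01)  # True
```

Swapping the two lambdas for draws from any other distribution pair gives the Monte Carlo counterpart of the article's general formula, provided the larger-sample rule is still optimal for that pair.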
Sex differences on Purpose-In-Life Test total and factorial scores among spanish undergraduates
2011-01-01
The aim of this paper is to analyze sex differences in Purpose-In-Life Test [PIL] (Crumbaugh & Maholic, 1969) total and factorial scores among 309 Spanish undergraduates (207 women, 102 men), aged 18 to 45 years. The Spanish version of the PIL is used (Noblejas de la Flor, 1994). The PIL evaluates life-meaning achievement vs. existential vacuum. Women achieve higher means on PIL total and factorial scores, and statistical analysis shows that sex is significantly associated with total PIL sc...
Rakonjac, Zoran; Brdar, Radivoj; Popovic, Miroslav
2014-01-01
Introduction: The use of radical surgical treatment for congenital clubfoot is decreasing. Minimally invasive surgical treatment (MIST) is a way of treating congenital clubfoot that is a compromise between radical surgical treatment and non-operative treatment. Protocols from several authors (McKay, Macnicol, Stevens, Meyer, G.W. Simons, and Laaveg-Ponseti) were used in the evaluation of the results. SCIENTIFIC OBJECTIVE: To determine the importance and role of groups of parameters (clinical, radiographic and functional) in evaluating the results in patients treated with the two methods (radical operation and MIST). Subjects and methods: This paper covers children who were treated for the structural (idiopathic) form of PEVC. This prospective study was conducted in two groups of patients. Group A (radical surgical treatment), the control group, comprised 50 subjects, 35 male (70%) and 15 female (30%); 88 feet were tested. Group B (minimally invasive surgical treatment, MIST), the experimental group, comprised 48 subjects, 35 male (73%) and 13 female (27%); 84 feet were tested. A questionnaire was used for the analysis of the results, with fifteen parameters in total: five each of clinical, radiographic and functional. A normal finding or measured value was assigned 0 points. The total score (TS) ranged from 0 to 27 points, and the results were sorted into the following categories: good (0-5), satisfactory (6-11), poor (12-19), and deformity recrudescence (20-27) points. Results: The proportion of good results for the 88 feet in group A was 0.477, while for the 84 feet in group B it was significantly higher, at 0.893. The difference between these proportions is statistically highly significant (t = 5.84, p < 0.001). A chi-square test (χ² = 30.083, df = 1, N = 172, p < 0.001) indicated that there is a
Total number of tillers of different accessions of Panicum maximum Jacq.
Thiago Perez Granato
2012-12-01
The productivity of forage grasses is due to the continuous emission of leaves and tillers, ensuring the restoration of leaf area after cutting or grazing and thus the sustainability of the forage. This study aimed to assess the total number of tillers in different accessions of Panicum maximum Jacq. The experiment was carried out in a field belonging to the Instituto de Zootecnia, located in Nova Odessa/SP. Two new accessions of Panicum maximum and two commercial cultivars were evaluated: Aruana, Milenio, NO 2487 and NO 78, the two latter belonging to the germplasm collection of the IZ. The experimental design was randomized complete blocks with four replications. The experimental area consisted of 16 plots of 10 m² (5 × 2 m) each. The soil of the experimental area was analyzed and, according to the results, received dolomitic limestone at 2 t/ha two months before the implementation of the experiment. Sowing was made by broadcasting, together with 80 kg/ha of P2O5 in the form of single superphosphate. Sixty days after implantation of the experiment, the plots were given a leveling cut to a height of about 15 cm, after which 250 g of 20-00-20 fertilizer was applied per plot. Thirty days after this standardization, the total number of tillers of the cultivars was evaluated using a 0.5 × 0.5 m metal frame thrown at random on each of the 16 plots, leaving one meter from each extremity, and all tillers within the frame were counted. After the counting was finished, the plots were cut again at a height of approximately 15 cm. The second evaluation took place after thirty days, when the total number of tillers was again counted following the same procedure. The results were analyzed by Tukey test at 5% after transforming the data to log(x). In the first evaluation there was no statistical difference in the total number of tillers between cultivars. In the second evaluation, however, the total number of tillers of NO 78...
The iScore predicts total healthcare costs early after hospitalization for an acute ischemic stroke.
Ewara, Emmanuel M; Isaranuwatchai, Wanrudee; Bravata, Dawn M; Williams, Linda S; Fang, Jiming; Hoch, Jeffrey S; Saposnik, Gustavo
2015-12-01
The ischemic Stroke risk score is a validated prognostic score that clinicians can use to estimate patient outcomes after an acute ischemic stroke. In this study, we examined the association between the ischemic Stroke risk score and patients' 30-day, one-year, and two-year healthcare costs from the perspective of a third-party healthcare payer. Patients who had an acute ischemic stroke were identified from the Registry of the Canadian Stroke Network. The 30-day ischemic Stroke risk score was determined for each patient. Direct healthcare costs at each time point were determined using administrative databases in the province of Ontario. Unadjusted mean costs, and the impact on total cost of a 10-point increase in the ischemic Stroke risk score and of a patient's risk of death or disability, were determined. There were 12,686 patients eligible for the study. Total unadjusted mean costs were greatest among patients at high risk. When adjusting for patient characteristics, a 10-point increase in the ischemic Stroke risk score was associated with an 8%, 7%, and 4% increase in total costs at 30 days, one year, and two years, respectively. The same increase was found to affect patients at low, medium, and high risk differently. When adjusting for patient characteristics, patients in the high-risk group had the highest total costs at 30 days, while patients at medium risk had the highest costs at both one and two years. The ischemic Stroke risk score can be useful as a predictor of healthcare utilization and costs early after hospitalization for an acute ischemic stroke. © 2015 World Stroke Organization.
Zhang, H X
2008-01-01
An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, the setting of trading ratios can be a contentious issue and has already been listed as an obstacle by several pollutant trading programs. One of the fundamental reasons a trading ratio is often set high (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies do not provide an approach that explicitly addresses the determination of the trading ratio, and uncertainty analysis has rarely been linked to it. This paper presents a practical methodology for estimating an "equivalent trading ratio (ETR)" and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of the ETR can provide a preliminary evaluation of the tradeoffs between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of nonpoint-source load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. Rigorous quantification of the trading ratio will enhance the scientific basis, and thus the public perception, for more informed decisions in the overall watershed-based pollutant trading program.
Economic total maximum daily load for watershed-based pollutant trading.
Zaidi, A Z; deMonsabert, S M
2015-04-01
Water quality trading (WQT) is supported by the US Environmental Protection Agency (USEPA) under the framework of its total maximum daily load (TMDL) program. An innovative approach is presented in this paper that proposes post-TMDL trade by calculating pollutant rights for each pollutant source within a watershed. Several water quality trading programs are currently operating in the USA with an objective to achieve overall pollutant reduction impacts that are equivalent or better than TMDL scenarios. These programs use trading ratios for establishing water quality equivalence among pollutant reductions. The inbuilt uncertainty in modeling the effects of pollutants in a watershed from both the point and nonpoint sources on receiving waterbodies makes WQT very difficult. A higher trading ratio carries with it increased mitigation costs, but cannot ensure the attainment of the required water quality with certainty. The selection of an applicable trading ratio, therefore, is not a simple process. The proposed approach uses an Economic TMDL optimization model that determines an economic pollutant reduction scenario that can be compared with actual TMDL allocations to calculate selling/purchasing rights for each contributing source. The methodology is presented using the established TMDLs for the bacteria (fecal coliform) impaired Muddy Creek subwatershed WAR1 in Rockingham County, Virginia, USA. Case study results show that an environmentally and economically superior trading scenario can be realized by using Economic TMDL model or any similar model that considers the cost of TMDL allocations.
Jin, Cai De; Kim, Moo Hyun; Kim, Soo Jin; Lee, Kwang Min; Kim, Tae Hyung; Cho, Young-Rak; Serebruany, Victor L
2017-01-01
The optimal strategy to manage chronic total occlusion (CTO) remains unclear. The Japanese CTO multicenter registry (J-CTO) score is an established tool for predicting successful recanalization. However, it does not take into account nonangiographic predictors of final technical success. In the present study, we designed and tested a scoring model called the Busan single-center CTO registry (B-CTO) score, combining clinical and angiographic characteristics to predict successful CTO recanalization in Korean patients. Prospectively enrolled CTO patients (n = 438) undergoing coronary intervention (1999-2015) were assessed. The B-CTO score comprises 6 independent predictors: age 60-74 years and lesion length ≥20 mm were assigned 1 point each, while age ≥75 years, female gender, lesion location in the right coronary artery, blunt stump, and bending >45° were assigned 2 points each. For each predictor, the points assigned were based on the associated odds ratio in multivariate analysis. The lesions were classified into 4 groups according to the sum of points scored, to assess the probability of successful CTO recanalization: easy (score 0-1), intermediate (score 2-3), difficult (score 4-5), and very difficult (score ≥6). CTO opening was designated as the primary endpoint regardless of the interventional era or the skill of the operator. The final success rate for B-CTO was 81.1%. The probability of successful recanalization for patient groups classified as easy (n = 64), intermediate (n = 148), difficult (n = 134), and very difficult (n = 92) was 95.3, 86.5, 79.1 and 65.2%, respectively (p for trend ...). Compared with the J-CTO score, the B-CTO score demonstrated a significant improvement in discrimination, as indicated by the difference in area under the receiver-operating characteristic curve (ΔAUC 0.083; 95% CI 0.025-0.141), with a positive integrated discrimination improvement of 0.042 and a net reclassification improvement of 56.0%. The B-CTO score has been designed and validated in Korean patients.
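The additive scoring rule described above is simple to mechanize. This hypothetical Python sketch encodes the six predictors and the four difficulty groups exactly as listed in the abstract (variable names are my own, not the registry's):

```python
def b_cto_score(age, female, lesion_len_mm, rca, blunt_stump, bend_gt_45deg):
    """Sum the six B-CTO predictors as listed in the abstract."""
    pts = 0
    if 60 <= age <= 74:
        pts += 1          # age 60-74 years: 1 point
    elif age >= 75:
        pts += 2          # age >= 75 years: 2 points
    if lesion_len_mm >= 20:
        pts += 1          # lesion length >= 20 mm: 1 point
    # female gender, RCA location, blunt stump, bending > 45 deg: 2 points each
    pts += 2 * sum((female, rca, blunt_stump, bend_gt_45deg))
    return pts

def b_cto_group(score):
    """Map a B-CTO score to the abstract's four difficulty groups."""
    if score <= 1:
        return "easy"
    if score <= 3:
        return "intermediate"
    if score <= 5:
        return "difficult"
    return "very difficult"

# a 78-year-old woman with a 25 mm, bent RCA lesion scores 2+1+2+2+2 = 9
print(b_cto_group(b_cto_score(78, True, 25, True, False, True)))  # very difficult
```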
Thomsen, Morten G; Latifi, Roshan; Kallemose, Thomas; Barfod, Kristoffer W; Husted, Henrik; Troelsen, Anders
2016-06-01
Background and purpose - When evaluating the outcome after total knee arthroplasty (TKA), increasing emphasis has been put on patient satisfaction and ability to perform activities of daily living. To address this, the forgotten joint score (FJS) for assessment of knee awareness has been developed. We investigated the validity and reliability of the FJS. Patients and methods - A Danish version of the FJS questionnaire was created according to internationally accepted standards. 360 participants who underwent primary TKA were invited to participate in the study. Of these, 315 were included in a validity study and 150 in a reliability study. Correlation between the Oxford knee score (OKS) and the FJS was examined and test-retest evaluation was performed. A ceiling effect was defined as participants reaching a score within 15% of the maximum achievable score. Results - The validity study revealed a strong correlation between the FJS and the OKS (intraclass correlation coefficient (ICC) = 0.81, 95% CI: 0.77-0.85; p < 0.001). The test-retest evaluation showed almost perfect reliability for the FJS total score (ICC = 0.91, 95% CI: 0.88-0.94) and substantial reliability or better for individual items of the FJS (ICC ≥ 0.79). We found a high level of internal consistency (Cronbach's α = 0.96). The ceiling effect for the FJS was 16%, as compared to 37% for the OKS. Interpretation - The FJS showed good construct validity and test-retest reliability. It had a lower ceiling effect than the OKS. The FJS appears to be a promising tool for evaluation of small differences in knee performance in groups of patients with good clinical results after TKA.
Cui, Jin; Jia, Zhenyu; Zhi, Xin; Li, Xiaoqun; Zhai, Xiao; Cao, Liehu; Weng, Weizong; Zhang, Jun; Wang, Lin; Chen, Xiao; Su, Jiacan
2017-01-05
The Achilles tendon Total Rupture Score (ATRS), originally developed in 2007 in Swedish, is the only patient-reported outcome measure (PROM) for specific outcome assessment of an Achilles tendon rupture. The purpose of this study is to translate and cross-culturally adapt the ATRS into simplified Chinese, and to preliminarily evaluate its responsiveness, reliability and validity. The internationally recognized guideline designed by Beaton was followed in translating the ATRS from English into a simplified Chinese version (CH-ATRS). A prospective cohort study was carried out for the cross-cultural adaptation. There were 112 participants included in the study. Psychometric properties including floor and ceiling effects, Cronbach's alpha, intraclass correlation coefficient, effect size, standard response mean, and construct validity were tested. The mean CH-ATRS score was 57.42 ± 13.70. No sign of a floor or ceiling effect was found for the CH-ATRS. A high level of internal consistency was supported by the value of Cronbach's alpha (0.893). The ICC (0.979, 95% CI: 0.984-0.993) was high, indicating high test-retest reliability. Good responsiveness was shown by the high absolute values of ES and SRM (0.84 and 8.98, respectively). The total CH-ATRS score had very good correlation with the physical function and bodily pain subscales of the SF-36 (r = -0.758 and r = -0.694, respectively; p < ...). The simplified Chinese Achilles tendon Total Rupture Score (CH-ATRS) can be used as a reliable and valid instrument for assessing Achilles tendon rupture in the Chinese-speaking population. Level of evidence: II.
Cross-cultural adaptation and validation of Persian Achilles tendon Total Rupture Score.
Ansari, Noureddin Nakhostin; Naghdi, Soofia; Hasanvand, Sahar; Fakhari, Zahra; Kordi, Ramin; Nilsson-Helander, Katarina
2016-04-01
To cross-culturally adapt the Achilles tendon Total Rupture Score (ATRS) to the Persian language and to preliminarily evaluate the reliability and validity of a Persian ATRS. A cross-sectional and prospective cohort study was conducted to translate and cross-culturally adapt the ATRS to Persian (ATRS-Persian), following the steps described in guidelines. Thirty patients with total Achilles tendon rupture and 30 healthy subjects participated in this study. The psychometric properties of floor/ceiling effects (responsiveness), internal consistency reliability, test-retest reliability, standard error of measurement (SEM), smallest detectable change (SDC), construct validity, and discriminant validity were tested. Factor analysis was performed to determine the ATRS-Persian structure. There were no floor or ceiling effects, indicating the content validity and responsiveness of the ATRS-Persian. Internal consistency was high (Cronbach's α 0.95). Item-total correlations exceeded the acceptable standard of 0.3 for all items (0.58-0.95). The test-retest reliability was excellent [(ICC)agreement 0.98]. The SEM and SDC were 3.57 and 9.9, respectively. Construct validity was supported by a significant correlation between the ATRS-Persian total score and the Persian Foot and Ankle Outcome Score (PFAOS) total score and PFAOS subscales (r = 0.55-0.83). The ATRS-Persian significantly discriminated between patients and healthy subjects. Exploratory factor analysis revealed 1 component. The ATRS was cross-culturally adapted to Persian and demonstrated to be a reliable and valid instrument to measure functional outcomes in Persian patients with Achilles tendon rupture. Level of evidence: II.
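The reported SEM and SDC are consistent with the conventional formulas SEM = SD·√(1 − ICC) and SDC = 1.96·√2·SEM (the formulas are the standard ones, not stated in the abstract). A quick numeric check:

```python
import math

def sem_from_sd(sd, icc):
    # standard error of measurement from the score SD and test-retest ICC
    return sd * math.sqrt(1 - icc)

def sdc_from_sem(sem, z=1.96):
    # smallest detectable change at the individual level (95% confidence)
    return z * math.sqrt(2) * sem

# plugging in the abstract's SEM of 3.57 reproduces its reported SDC of 9.9
print(round(sdc_from_sem(3.57), 1))  # 9.9
```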
Mazhar A. Memon
2016-04-01
ABSTRACT. Objective: To evaluate the correlation between the visual prostate score (VPSS) and maximum flow rate (Qmax) in men with lower urinary tract symptoms. Material and Methods: This is a cross-sectional study conducted at a university hospital. Sixty-seven adult male patients >50 years of age were enrolled in the study after signing an informed consent. Qmax and voided volume were recorded from the uroflowmetry graph, and VPSS was assessed at the same time. The education level was assessed in various defined groups. The Pearson correlation coefficient was computed for VPSS and Qmax. Results: Mean age was 66.1±10.1 years (median 68). The mean voided volume on uroflowmetry was 268±160 mL (median 208) and the mean Qmax was 9.6±4.96 mL/sec (median 9.0). The mean VPSS score was 11.4±2.72 (median 11.0). In the univariate linear regression analysis there was a strong negative (Pearson) correlation between VPSS and Qmax (r = -0.848, p<0.001). In the multiple linear regression analysis there was a significant correlation between VPSS and Qmax after adjusting for the effects of age, voided volume (V.V) and level of education. Multiple linear regression analysis for the independent variables showed no significant correlation between the VPSS and the independent factors, including age (p=0.27), LOE (p=0.941) and V.V (p=0.082). Conclusion: There is a significant negative correlation between VPSS and Qmax. The VPSS can be used in lieu of the IPSS score. Men even with a limited educational background can complete the VPSS without assistance.
Vasquez, S.; Guidon, Marie; McHugh, E; Lennon, Olive; Grogan, L.; Breathnach, O S
2013-01-01
BACKGROUND: Chemotherapy-induced peripheral neuropathy (CIPN) is a common, potentially reversible side effect of some chemotherapeutic agents. CIPN is associated with decreased balance, function and quality of life (QoL). This association has to date been under-investigated. AIMS: To profile patients presenting with CIPN using the modified Total Neuropathy Score (mTNS) in this cross-sectional study and to examine the relationship between CIPN (measured by mTNS) and indices of balance, qual...
MODALITY OF DETERMINING THE TOTAL SCORE OF RISKS IN INTERNAL AUDIT
FRANCA DUMITRU
2012-11-01
Risk analysis materializes in: applying the level of risk assessment to the weightings of the risk factors, based on the auditors' assessments of the functionality of internal control and of the influence of quantitative and qualitative elements; and determining the total risk score, which is the sum, over all risk factors, of the products of each risk's assessed level and the corresponding factor weighting.
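The weighted-sum calculation described above can be sketched as follows; the factor names, weightings, and levels below are illustrative placeholders, not values from the paper.

```python
# Total risk score as described: sum of (assessed risk level x factor weighting)
# over all risk factors. Factor names and values are hypothetical examples.
def total_risk_score(levels: dict, weights: dict) -> int:
    """Sum of assessed level times factor weighting, over all risk factors."""
    return sum(level * weights[factor] for factor, level in levels.items())

# Illustrative weightings (percent) and assessed levels (1 = low ... 3 = high):
weights = {"internal_control": 50, "quantitative": 30, "qualitative": 20}
levels = {"internal_control": 3, "quantitative": 2, "qualitative": 1}
print(total_risk_score(levels, weights))  # 50*3 + 30*2 + 20*1 = 230
```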
Turki, Turki; Roshan, Usman
2014-11-15
Programs based on hash tables and Burrows-Wheeler are very fast for mapping short reads to genomes but have low accuracy in the presence of mismatches and gaps. Such reads can be aligned accurately with the Smith-Waterman algorithm, but it can take hours or days to map millions of reads even for bacterial genomes. We introduce a GPU program called MaxSSmap with the aim of achieving comparable accuracy to Smith-Waterman but with faster runtimes. Like most programs, MaxSSmap first identifies a local region of the genome and then performs exact alignment. Instead of using hash tables or Burrows-Wheeler in the first part, MaxSSmap calculates the maximum scoring subsequence score between the read and disjoint fragments of the genome in parallel on a GPU and selects the highest-scoring fragment for exact alignment. We evaluate MaxSSmap's accuracy and runtime when mapping simulated Illumina E. coli and human chromosome one reads of different lengths with 10% to 30% mismatches and gaps to the E. coli genome and human chromosome one. We also demonstrate applications on real data by mapping ancient horse DNA reads to modern genomes and unmapped paired reads from NA12878 in the 1000 Genomes Project. We show that MaxSSmap attains comparably high accuracy and low error to fast Smith-Waterman programs yet has much lower runtimes. We show that MaxSSmap can map reads rejected by BWA and NextGenMap with high accuracy and low error, much faster than if Smith-Waterman were used. At short read lengths of 36 and 51, both MaxSSmap and Smith-Waterman have lower accuracy than at higher lengths. On real data, MaxSSmap produces many alignments with high score and mapping quality that are not given by NextGenMap and BWA. The MaxSSmap source code in CUDA and OpenCL is freely available from http://www.cs.njit.edu/usman/MaxSSmap.
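The maximum scoring subsequence idea at the core of MaxSSmap's first stage can be illustrated with a small sketch: score each aligned position and find the contiguous run with the highest total (Kadane's algorithm). The +1/-1 scoring scheme here is a hypothetical example; the paper's actual scoring and its GPU parallelization over genome fragments are not reproduced.

```python
# Maximum scoring subsequence between a read and a genome fragment:
# +1 for a match, -1 for a mismatch (illustrative scores), then Kadane's
# algorithm finds the best-scoring contiguous run of positions.
def max_scoring_subsequence(read: str, fragment: str) -> int:
    best = cur = 0
    for r, f in zip(read, fragment):
        cur = max(0, cur + (1 if r == f else -1))  # extend or restart the run
        best = max(best, cur)
    return best

# The fragment with the highest such score would be kept for exact alignment.
print(max_scoring_subsequence("ACGTACGT", "ACGTTCGT"))  # 6
```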
A 6-Point TACS Score Predicts In-Hospital Mortality Following Total Anterior Circulation Stroke
Wood, Adrian D; Gollop, Nicholas D; Bettencourt-Silva, Joao H; Clark, Allan B; Metcalf, Anthony K; Bowles, Kristian M; Flather, Marcus D; Potter, John F
2016-01-01
Background and Purpose Little is known about the factors associated with in-hospital mortality following total anterior circulation stroke (TACS). We examined the characteristics and comorbidity data for TACS patients in relation to in-hospital mortality with the aim of developing a simple clinical rule for predicting the acute mortality outcome in TACS. Methods A routine data registry of one regional hospital in the UK was analyzed. The subjects were 2,971 stroke patients with TACS (82% ischemic; median age=81 years, interquartile age range=74–86 years) admitted between 1996 and 2012. Uni- and multivariate regression models were used to estimate in-hospital mortality odds ratios for the study covariates. A 6-point TACS scoring system was developed from regression analyses to predict in-hospital mortality as the outcome. Results Factors associated with in-hospital mortality of TACS were male sex [adjusted odds ratio (AOR)=1.19], age (AOR=4.96 for ≥85 years vs. <65 years), hemorrhagic subtype (AOR=1.70), nonlateralization (AOR=1.75), prestroke disability (AOR=1.73 for moderate disability vs. no symptoms), and congestive heart failure (CHF) (AOR=1.61). Risk stratification using the 6-point TACS Score [T=type (hemorrhage=1 point) and territory (nonlateralization=1 point), A=age (65–84 years=1 point, ≥85 years=2 points), C=CHF (if present=1 point), S=status before stroke (prestroke modified Rankin Scale score of 4 or 5=1 point)] reliably predicted a mortality outcome: score=0, 29.4% mortality; score=1, 46.2% mortality [negative predictive value (NPV)=70.6%, positive predictive value (PPV)=46.2%]; score=2, 64.1% mortality (NPV=70.6, PPV=64.1%); score=3, 73.7% mortality (NPV=70.6%, PPV=73.7%); and score=4 or 5, 81.2% mortality (NPV=70.6%, PPV=81.2%). Conclusions We have identified the key determinants of in-hospital mortality following TACS and derived a 6-point TACS Score that can be used to predict the prognosis of particular patients.
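The point assignments above translate directly into a small scoring function; this is a sketch following the abstract's stated point values, with argument names of my own choosing.

```python
# 6-point TACS Score per the abstract: T = type (hemorrhage = 1) and
# territory (nonlateralization = 1), A = age (65-84 = 1, >=85 = 2),
# C = congestive heart failure (1), S = prestroke status (mRS 4 or 5 = 1).
def tacs_score(hemorrhagic: bool, nonlateralizing: bool, age: int,
               chf: bool, prestroke_mrs: int) -> int:
    score = 0
    score += 1 if hemorrhagic else 0                       # T: stroke type
    score += 1 if nonlateralizing else 0                   # T: territory
    score += 2 if age >= 85 else (1 if age >= 65 else 0)   # A: age band
    score += 1 if chf else 0                               # C: heart failure
    score += 1 if prestroke_mrs in (4, 5) else 0           # S: prestroke status
    return score

# An 87-year-old with an ischemic, lateralized TACS, CHF, and prestroke mRS 4:
print(tacs_score(False, False, 87, True, 4))  # 2 + 1 + 1 = 4
```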
Technical evaluation of a total maximum daily load model for Upper Klamath and Agency Lakes, Oregon
Wood, Tamara M.; Wherry, Susan A.; Carter, James L.; Kuwabara, James S.; Simon, Nancy S.; Rounds, Stewart A.
2013-01-01
We reviewed a mass balance model developed in 2001 that guided establishment of the phosphorus total maximum daily load (TMDL) for Upper Klamath and Agency Lakes, Oregon. The purpose of the review was to evaluate the strengths and weaknesses of the model and to determine whether improvements could be made using information derived from studies since the model was first developed. The new data have contributed to the understanding of processes in the lakes, particularly internal loading of phosphorus from sediment, and include measurements of diffusive fluxes of phosphorus from the bottom sediments, groundwater advection, desorption from iron oxides at high pH in a laboratory setting, and estimates of fluxes of phosphorus bound to iron and aluminum oxides. None of these processes in isolation, however, is large enough to account for the episodically high values of whole-lake internal loading calculated from a mass balance, which can range from 10 to 20 milligrams per square meter per day for short periods. The possible role of benthic invertebrates in lake sediments in the internal loading of phosphorus in the lake has become apparent since the development of the TMDL model. Benthic invertebrates can increase diffusive fluxes several-fold through bioturbation and biodiffusion, and, if the invertebrates are bottom feeders, they can recycle phosphorus to the water column through metabolic excretion. These organisms have high densities (1,822–62,178 individuals per square meter) in Upper Klamath Lake. Conversion of the mean density of tubificid worms (Oligochaeta) and chironomid midges (Diptera), two of the dominant taxa, to an areal flux rate based on laboratory measurements of metabolic excretion of two abundant species suggested that excretion by benthic invertebrates is at least as important as any of the other identified processes for internal loading to the water column. Data from sediment cores collected around Upper Klamath Lake since the development of the
Higher forgotten joint score for fixed-bearing than for mobile-bearing total knee arthroplasty.
Thienpont, E; Zorman, D
2016-08-01
To compare the postoperative subjective outcome of fixed- and mobile-bearing total knee arthroplasty (TKA) using the forgotten joint score (FJS-12), a new patient-reported outcome score of 12 questions evaluating a patient's potential to forget about the operated joint. The hypothesis of this study was that a mobile-bearing TKA would have a higher level of forgotten joint than a fixed-bearing model of the same design. A retrospective cohort study was conducted in 100 patients who underwent TKA at least 1 year [mean (SD) 18 (5) months] earlier with either a fixed-bearing (N = 50) or a mobile-bearing (N = 50) TKA from the same implant family. Clinical outcome was evaluated with the knee society score and patient-reported outcome with the forgotten joint score. No difference in demographics was observed between the two study groups. The mean (SD) postoperative FJS-12 for the fixed-bearing TKA was 71 (28), compared to a mean (SD) of 56.5 (30) for the mobile-bearing TKA. The clinical relevance of the present retrospective study is that it shows for the first time a significant difference between fixed- and mobile-bearing TKA using a new patient-reported outcome score. The hypothesis that mobile-bearing TKA would have a higher degree of forgotten joint than fixed-bearing TKA could not be confirmed. A level I prospective study should be set up to objectivise these findings. Level of evidence: IV.
Carmont, Michael R; Silbernagel, Karin Grävare; Nilsson-Helander, Katarina; Mei-Dan, Omer; Karlsson, Jon; Maffulli, Nicola
2013-06-01
The Achilles tendon Total Rupture Score (ATRS) was developed because of the need for a reliable, valid and sensitive instrument to evaluate symptoms and their effects on physical activity in patients following either conservative or surgical management of an Achilles tendon rupture. Prior to using the score in a larger randomized trial in an English-speaking population, we decided to perform reliability, validity and responsiveness evaluations of the English version of the ATRS. Even though the score was published in English, the actual English version had not been validated and compared with the results of the Swedish version. From 2009 to 2010, all patients who received treatment for Achilles tendon rupture were followed up using the English version of the ATRS. Patients were asked to complete the score at 3, 6 and 12 months following treatment for Achilles tendon rupture. The ATRS was completed on arrival in the outpatient clinic and again following consultation. The outcomes of 49 (13 female and 36 male) patients were assessed. The mean (SD) age was 49 (12) years; 27 patients had treatment for a left-sided rupture and 22 for the right. All patients received treatment for ruptured Achilles tendons: 38 acute percutaneous repair, 1 open repair, 5 an Achilles tendon reconstruction using a peroneus brevis tendon transfer for delayed presentation, 1 gracilis-augmented repair for re-rupture and 4 non-operative treatment for mid-portion rupture. The English version of the ATRS was shown to have overall excellent reliability (ICC = 0.986). There was no significant difference between the results of the English version and the Swedish version when compared at the 6-month or 12-month (n.s.) follow-up appointments. The effect size was 0.93. The minimal detectable change was 6.75 points. The ATRS was culturally adapted to English and shown to be a reliable, valid and responsive method of testing functional outcome following an Achilles tendon rupture.
Unlü, Ali
2008-05-01
This note provides a direct, elementary proof of the fundamental result on monotone likelihood ratio of the total score variable in unidimensional item response theory (IRT). This result is very important for practical measurement in IRT, because it justifies the use of the total score variable to order participants on the latent trait. The proof relies on a basic inequality for elementary symmetric functions which is proved by means of a few purely algebraic, straightforward transformations. In particular, flaws in a proof of this result by Huynh [(1994). A new proof for monotone likelihood ratio for the sum of independent Bernoulli random variables. Psychometrika, 59, 77-79] are pointed out and corrected, and a natural generalization of the fundamental result to non-linear (quasi-ordered) latent trait spaces is presented. This may be useful for multidimensional IRT or knowledge space theory, in which the latent 'ability' spaces are partially ordered with respect to, for instance, coordinate-wise vector-ordering or set-inclusion, respectively.
Silvia Iacobelli
OBJECTIVE: We aimed to investigate the predictive value for severe adverse outcome of plasma protein measurements on day one of life in very preterm infants, and to compare total plasma protein levels with the validated illness severity scores CRIB, CRIB-II, SNAP-II and SNAPPE-II regarding their predictive ability for severe adverse outcome. METHODS: We analyzed a cohort of infants born at 24-31 weeks gestation, admitted to the tertiary intensive care unit of a university hospital over 10.5 years. The outcome measure was "severe adverse outcome", defined as death before discharge or severe neurological injury on cranial ultrasound. The adjusted odds ratio (aOR) and 95% confidence interval (95% CI) of severe adverse outcome for hypoproteinemia (total plasma protein level <40 g/L) were calculated by univariate and multivariate analyses. Calibration (Hosmer-Lemeshow goodness-of-fit) was performed, and the predictive ability for severe adverse outcome was assessed for total plasma protein and compared with CRIB, CRIB-II, SNAP-II and SNAPPE-II by calculating receiver operating characteristic (ROC) curves and their associated area under the curve (AUC). RESULTS: 761 infants were studied: 14.4% died and 4.1% survived with severe cerebral ultrasound findings. The aOR of severe adverse outcome for hypoproteinemia was 6.1 (95% CI 3.8-9.9). The rank order of the variables, as assessed by AUCs and 95% CIs, in predicting outcome was: total plasma protein [0.849 (0.821-0.873)], SNAPPE-II [0.822 (0.792-0.848)], CRIB [0.821 (0.792-0.848)], SNAP-II [0.810 (0.780-0.837)] and CRIB-II [0.803 (0.772-0.830)]. Total plasma protein predicted severe adverse outcome significantly better than CRIB-II and SNAP-II (both p<0.05). Calibration for total plasma protein was very good. CONCLUSIONS: Early hypoproteinemia has prognostic value for severe adverse outcome in very preterm, sick infants. Total plasma protein has a predictive performance comparable with CRIB and SNAPPE-II and greater than
A novel autofocus algorithm based on maximum total variation criteria for SAR images
MA Lun; LIAO Guisheng
2007-01-01
A novel autofocus algorithm for synthetic aperture radar (SAR) based on total variation is presented in this paper. The method starts with a complex phase-degraded SAR image; after the phase-error model is introduced into the range-compressed phase-history domain, it carries out phase-error correction by adjusting the focus until the total variation of the azimuth profile is maximized. Compared with the minimum-entropy autofocus algorithm, this algorithm has lower computational complexity and is easier to implement. Simulations and processing results on measured data show the validity of the proposed method.
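The focus criterion can be illustrated with a toy one-dimensional sketch: among candidate quadratic phase corrections, keep the one that maximizes the total variation of the azimuth intensity profile. The signal model, error coefficient, and search grid below are invented for demonstration; a real SAR autofocus operates on range-compressed phase-history data, not a single tone.

```python
import numpy as np

# Total variation of a 1-D intensity profile: sum of absolute differences
# between adjacent samples. A well-focused (impulse-like) profile maximizes it.
def total_variation(profile: np.ndarray) -> float:
    return float(np.sum(np.abs(np.diff(profile))))

n = np.arange(64)
signal = np.exp(2j * np.pi * 5 * n / 64)           # a single azimuth tone
true_error = np.exp(1j * 0.01 * (n - 32) ** 2)     # hypothetical quadratic phase error
degraded = signal * true_error

# Grid search over candidate quadratic corrections; keep the TV-maximizing one.
candidates = np.linspace(-0.02, 0.02, 41)
tvs = [total_variation(np.abs(np.fft.fft(degraded * np.exp(-1j * a * (n - 32) ** 2))))
       for a in candidates]
best = candidates[int(np.argmax(tvs))]
print(round(best, 3))  # recovers the correction matching the 0.01 error
```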
Internal Consistency and Power When Comparing Total Scores from Two Groups.
Barchard, Kimberly A; Brouwers, Vincent
2016-01-01
Researchers now know that when theoretical reliability increases, power can increase, decrease, or stay the same. However, no analytic research has examined the relationship of power to the most commonly used type of reliability, internal consistency, or to the most commonly used measures of internal consistency, coefficient alpha and ICC(A,k). We examine the relationship between the power of independent samples t tests and internal consistency. We explicate the mathematical model upon which researchers usually calculate internal consistency, one in which total scores are calculated as the sum of observed scores on K measures. Using this model, we derive a new formula for effect size to show that power and internal consistency are influenced by many of the same parameters but not always in the same direction. Changing an experiment in one way (e.g., lengthening the measure) is likely to influence multiple parameters simultaneously; thus, there are no simple relationships between such changes and internal consistency or power. If researchers revise measures to increase internal consistency, this might not increase power. To increase power, researchers should increase sample size, select measures that assess areas where group differences are largest, and use more powerful statistical procedures (e.g., ANCOVA).
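Under the model described above, where total scores are the sum of observed scores on K measures, coefficient alpha can be computed from the item variances and the variance of the total. The data below are simulated for illustration only.

```python
import numpy as np

# Coefficient alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)),
# where the total score is the row sum over the k item columns.
def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_subjects, k_items) matrix of observed scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(size=200)                              # shared latent trait
items = trait[:, None] + 0.8 * rng.normal(size=(200, 4))  # 4 noisy parallel items
print(round(cronbach_alpha(items), 2))
```

For k parallel items each equal to the same column, the formula gives exactly 1, which is a quick sanity check on the implementation.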
A. Qi
2016-01-01
This study aimed to determine whether psychological factors affect health-related quality of life (HRQL) and recovery of knee function in total knee replacement (TKR) patients. A total of 119 TKR patients (male: 38; female: 81) completed the Beck Anxiety Inventory (BAI), Beck Depression Inventory (BDI), State Trait Anxiety Inventory (STAI), Eysenck Personality Questionnaire-revised (EPQR-S), Knee Society Score (KSS), and HRQL (SF-36). At 1 and 6 months after surgery, anxiety, depression, and KSS scores in TKR patients were significantly better compared with those preoperatively (P<0.05). SF-36 scores at the sixth month after surgery were significantly improved compared with preoperative scores (P<0.001). Preoperative Physical Component Summary Scale (PCS) and Mental Component Summary Scale (MCS) scores were negatively associated with the extraversion (E) score (B=-0.986 and -0.967, respectively, both P<0.05). Postoperative PCS and State Anxiety Inventory (SAI) scores were negatively associated with the neuroticism (N) score (B=-0.137 and -0.991, respectively, both P<0.05). Postoperative MCS, SAI, Trait Anxiety Inventory (TAI), and BAI scores were also negatively associated with the N score (B=-0.367, -0.107, -0.281, and -0.851, respectively, all P<0.05). The KSS function score at the sixth month after surgery was negatively associated with TAI and N scores (B=-0.315 and -0.532, respectively, both P<0.05), but positively associated with the E score (B=0.215, P<0.05). The postoperative KSS joint score was positively associated with postoperative PCS (B=0.356, P<0.05). In conclusion, for TKR patients, the scores used for evaluating recovery of knee function and HRQL after 6 months are inversely associated with the presence of negative emotions.
Soo, M; Sneddon, N W; Lopez-Villalobos, N; Worth, A J
2015-03-01
To use estimated breeding value (EBV) analysis to investigate the genetic trend of the total hip score (to assess canine hip dysplasia) in four populous breeds of dogs using the records from the New Zealand Veterinary Association (NZVA) Canine Hip Dysplasia Scheme database (1991 to 2011). Estimates of heritability and EBV for the NZVA total hip score of individual dogs from the German Shepherd, Labrador Retriever, Golden Retriever and Rottweiler breeds were obtained using restricted maximum likelihood procedures with a within-breed linear animal model. The model included the fixed effects of gender, birth year, birth season, age at scoring and the random effect of animal. The pedigree file included animals recorded between 1990 and 2011. A total of 2,983 NZVA hip score records, from a pedigree of 3,172 animals, were available for genetic evaluation. Genetic trends of the NZVA total hip score were calculated as the regression coefficient of the EBV (weighted by reliabilities) on year of birth. The estimates of heritability for hip score were 0.32 (SE 0.08) in German Shepherd, 0.37 (SE 0.08) in Labrador Retriever, 0.29 (SE 0.08) in Golden Retriever and 0.52 (SE 0.18) in Rottweiler breeds. Genetic trend analysis revealed that only the German Shepherd breed exhibited a genetic trend towards better hip conformation over time, with a decline of 0.13 (SE 0.04) NZVA total hip score units per year; genetic trends of the hip score for the remaining three breeds were not significantly different from zero (p>0.1). Despite moderate heritability of the NZVA total hip score, there has not been substantial improvement of this trait for the four breeds analysed in the study period. Greater improvement in reducing the prevalence of canine hip dysplasia may be possible if screening were to be compulsory as a requirement for registration of pedigree breeding stock, greater selection pressure were to be applied and selection of breeding stock made on the basis of an individual's EBV rather than the NZVA
Oudyn, Frederik W; Lyons, David J; Pringle, M J
2012-01-01
Many scientific laboratories follow, as standard practice, a relatively short maximum holding time (within 7 days) for the analysis of total suspended solids (TSS) in environmental water samples. In this study we subsampled from bulk water samples stored at ∼4 °C in the dark, then analysed for TSS at time intervals up to 105 days after collection. The nonsignificant differences in TSS results observed over time demonstrate that storage at ∼4 °C in the dark is an effective method of preserving samples for TSS analysis, far beyond the 7-day standard practice. Extending the maximum holding time will ease the pressure on sample collectors and laboratory staff, who until now have had to determine TSS within an impractically short period.
Del Brutto, Victor J; Ortiz, Jorge G; Del Brutto, Oscar H; Mera, Robertino M; Zambrano, Mauricio; Biller, José
2017-05-26
Cerebral small vessel disease (SVD) is inversely associated with cognitive performance. However, whether the total SVD score is a better predictor of poor cognitive performance than individual signatures of SVD is inconclusive. We aimed to estimate the combined and independent predictive power of these MRI findings. Atahualpa residents aged ≥60 years underwent brain MRI. Cognitive performance was measured by the Montreal Cognitive Assessment (MoCA). The presence of moderate-to-severe white matter hyperintensities, deep cerebral microbleeds, lacunar infarcts, and >10 enlarged perivascular spaces was added for estimating the total SVD score, ranging from 0 to 4 points. MoCA predictive models were fitted to assess how well the total SVD score or each of its components predicts cognitive performance. Of 351 eligible candidates, 331 (94%) were included. The total SVD score was 0 points in 202 individuals (61%), 1 point in 67 (20%), 2 points in 40 (12%), 3 points in 15 (5%), and 4 points in seven (2%). A generalized linear model showed an inverse relationship between the total SVD score and the MoCA (p = 0.015). The proportion of variance in the MoCA score explained by the SVD score was 32.8% (R(2) = 0.328). This predictive power was similar for white matter hyperintensities (R(2) = 0.306), microbleeds (R(2) = 0.313), lacunar infarcts (R(2) = 0.323), and perivascular spaces (R(2) = 0.313). This study shows a significant association between the SVD score and worse cognitive performance. The SVD score is a predictor of poor cognitive performance. This predictive power is not better than that of isolated neuroimaging signatures of SVD. Copyright © 2017 John Wiley & Sons, Ltd.
Renske Uiterwijk
2016-12-01
Objectives: Hypertension is a major risk factor for white matter hyperintensities, lacunes, cerebral microbleeds and perivascular spaces, which are MRI markers of cerebral small vessel disease (SVD). Studies have shown associations between these individual MRI markers and cognitive functioning and decline. Recently, a total SVD score was proposed in which the different MRI markers were combined into one measure of SVD, to capture total SVD-related brain damage. We investigated if this SVD score was associated with cognitive decline over 4 years in patients with hypertension. Methods: In this longitudinal cohort study, 130 hypertensive patients (91 patients with uncomplicated hypertension and 39 hypertensive patients with a lacunar stroke) were included. They underwent a neuropsychological assessment at baseline and after 4 years. The presence of white matter hyperintensities, lacunes, cerebral microbleeds, and perivascular spaces was rated on baseline MRI. Presence of each individual marker was added to calculate the total SVD score (range 0-4) in each patient. Results: Uncorrected linear regression analyses showed associations between SVD score and decline in overall cognition (p=0.017), executive functioning (p<0.001) and information processing speed (p=0.037), but not with memory (p=0.911). The association between SVD score and decline in overall cognition and executive function remained significant after adjustment for age, sex, education, anxiety and depression score, potential vascular risk factors, patient group and baseline cognitive performance. Conclusions: Our study shows that a total SVD score can predict cognitive decline, specifically in executive function, over 4 years in hypertensive patients. This emphasizes the importance of considering total brain damage due to SVD.
Namazi, Mohammad Hasan; Serati, Ali Reza; Vakili, Hosein; Safi, Morteza; Parsa, Saeed Ali Pour; Saadat, Habibollah; Taherkhani, Maryam; Emami, Sepideh; Pedari, Shamseddin; Vatanparast, Masoomeh; Movahed, Mohammad Reza
2017-06-01
Total occlusion of a coronary artery for more than 3 months is defined as chronic total occlusion (CTO). The goal of this study was to develop a risk score for predicting failure or success during attempted percutaneous coronary intervention (PCI) of CTO lesions using an antegrade approach. This study was based on retrospective analyses of clinical and angiographic characteristics of CTO lesions assessed between February 2012 and February 2014. Success was defined as passing through the occlusion with successful stent deployment using an antegrade approach. A total of 188 patients were studied. Mean ± SD age was 59 ± 9 years. The failure rate was 33%. In a stepwise multivariate regression analysis, bridging collaterals (OR = 6.7, CI = 1.97-23.17, score = 2), absence of a stump (OR = 5.8, CI = 1.95-17.9, score = 2), presence of calcification (OR = 3.21, CI = 1.46-7.07, score = 1), presence of bending (OR = 2.8, CI = 1.28-6.10, score = 1), presence of a near side branch (OR = 2.7, CI = 1.08-6.57, score = 1), and absence of retrograde filling (OR = 2.5, CI = 1.03-6.17, score = 1) were independent predictors of PCI failure. A score of 7 or more was associated with a 100% failure rate, whereas a score of 2 or less was associated with an over 80% success rate. Most factors associated with failure of CTO-PCI are related to lesion characteristics. A new risk score (range 0-8) is developed to predict the CTO-PCI success or failure rate during an antegrade approach as a guide before attempting PCI of CTO lesions.
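The point assignments above can be collected into a small scoring function; the point values follow the abstract, while the argument names are my own shorthand for the six lesion characteristics.

```python
# Antegrade CTO-PCI failure score (range 0-8) per the abstract: each lesion
# characteristic contributes its listed points when present.
def cto_failure_score(bridging_collaterals: bool, no_stump: bool,
                      calcification: bool, bending: bool,
                      near_side_branch: bool, no_retrograde_filling: bool) -> int:
    return sum([
        2 * bridging_collaterals,   # OR = 6.7, 2 points
        2 * no_stump,               # OR = 5.8, 2 points
        1 * calcification,          # OR = 3.21, 1 point
        1 * bending,                # OR = 2.8, 1 point
        1 * near_side_branch,       # OR = 2.7, 1 point
        1 * no_retrograde_filling,  # OR = 2.5, 1 point
    ])

# Per the abstract, >=7 was associated with 100% failure, <=2 with >80% success.
print(cto_failure_score(True, True, True, True, False, True))  # 2+2+1+1+0+1 = 7
```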
Thomsen, Morten G; Latifi Yaghin, Roshan; Kallemose, Thomas;
2016-01-01
We investigated the validity and reliability of the FJS. Patients and methods - A Danish version of the FJS questionnaire was created according to internationally accepted standards. 360 participants who underwent primary TKA were invited to participate in the study. Of these, 315 were included in a validity study and 150 in a reliability study. Correlation between the Oxford knee score (OKS) and the FJS was examined and test-retest evaluation was performed. A ceiling effect was defined as participants reaching a score within 15% of the maximum achievable score. Results - The validity study revealed … of the FJS (ICC ≥ 0.79). We found a high level of internal consistency (Cronbach's α = 0.96). The ceiling effect for the FJS was 16%, as compared to 37% for the OKS. Interpretation - The FJS showed good construct validity and test-retest reliability. It had a lower ceiling effect than the OKS. The FJS appears …
Christopoulos, Georgios; Kandzari, David E; Yeh, Robert W; Jaffer, Farouc A; Karmpaliotis, Dimitri; Wyman, Michael R; Alaswad, Khaldoon; Lombardi, William; Grantham, J Aaron; Moses, Jeffrey; Christakopoulos, Georgios; Tarar, Muhammad Nauman J; Rangan, Bavana V; Lembo, Nicholas; Garcia, Santiago; Cipher, Daisha; Thompson, Craig A; Banerjee, Subhash; Brilakis, Emmanouil S
2016-01-11
This study sought to develop a novel parsimonious score for predicting technical success of chronic total occlusion (CTO) percutaneous coronary intervention (PCI) performed using the hybrid approach. Predicting technical success of CTO PCI can facilitate clinical decision making and procedural planning. We analyzed clinical and angiographic parameters from 781 CTO PCIs included in PROGRESS CTO (Prospective Global Registry for the Study of Chronic Total Occlusion Intervention) using a derivation and validation cohort (2:1 sampling ratio). Variables with strong association with technical success in multivariable analysis were assigned 1 point, and a 4-point score was developed from summing all points. The PROGRESS CTO score was subsequently compared with the J-CTO (Multicenter Chronic Total Occlusion Registry in Japan) score in the validation cohort. Technical success was 92.9%. On multivariable analysis, factors associated with technical success included proximal cap ambiguity (beta coefficient [b] = 0.88), moderate/severe tortuosity (b = 1.18), circumflex artery CTO (b = 0.99), and absence of "interventional" collaterals (b = 0.88). The resulting score demonstrated good calibration and discriminatory capacity in the derivation (Hosmer-Lemeshow chi-square = 2.633; p = 0.268, and receiver-operator characteristic [ROC] area = 0.778) and validation (Hosmer-Lemeshow chi-square = 5.333; p = 0.070, and ROC area = 0.720) subset. In the validation cohort, the PROGRESS CTO and J-CTO scores performed similarly in predicting technical success (ROC area 0.720 vs. 0.746, area under the curve difference = 0.026, 95% confidence interval = -0.093 to 0.144). The PROGRESS CTO score is a novel useful tool for estimating technical success in CTO PCI performed using the hybrid approach. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Nilsdotter, Anna K; Lohmander, L Stefan; Klässbo, Maria
2003-01-01
The aim of the study was to evaluate if physical functions usually associated with a younger population were of importance for an older population, and to construct an outcome measure for hip osteoarthritis with improved responsiveness compared to the Western Ontario McMaster osteoarthritis score...
Kim, Sang M; Brannan, Kevin M; Zeckoski, Rebecca W; Benham, Brian L
2014-01-01
The objective of this study was to develop bacteria total maximum daily loads (TMDLs) for the Hardware River watershed in the Commonwealth of Virginia, USA. The TMDL program is an integrated watershed management approach required by the Clean Water Act. The TMDLs were developed to meet Virginia's water quality standard for bacteria at the time, which stated that the calendar-month geometric mean concentration of Escherichia coli should not exceed 126 cfu/100 mL, and that no single sample should exceed a concentration of 235 cfu/100 mL. The bacteria impairment TMDLs were developed using the Hydrological Simulation Program-FORTRAN (HSPF). The hydrology and water quality components of HSPF were calibrated and validated using data from the Hardware River watershed to ensure that the model adequately simulated runoff and bacteria concentrations. The calibrated and validated HSPF model was used to estimate the contributions from the various bacteria sources in the Hardware River watershed to the in-stream concentration. Bacteria loads were estimated through an extensive source characterization process. Simulation results for existing conditions indicated that the majority of the bacteria came from livestock and wildlife direct deposits and pervious lands. Different source reduction scenarios were evaluated to identify scenarios that meet both the geometric mean and single sample maximum E. coli criteria with zero violations. The resulting scenarios required extreme and impractical reductions from livestock and wildlife sources. Results from studies similar to this across Virginia partially contributed to a reconsideration of the standard's applicability to TMDL development.
Emanuel, Robyn M.; Dueck, Amylou C.; Geyer, Holly L.; Kiladjian, Jean-Jacques; Slot, Stefanie; Zweegman, Sonja; te Boekhorst, Peter A.W.; Commandeur, Suzan; Schouten, Harry C.; Sackmann, Federico; Kerguelen Fuentes, Ana; Hernández-Maraver, Dolores; Pahl, Heike L.; Griesshammer, Martin; Stegelmann, Frank; Doehner, Konstanze; Lehmann, Thomas; Bonatz, Karin; Reiter, Andreas; Boyer, Francoise; Etienne, Gabriel; Ianotto, Jean-Christophe; Ranta, Dana; Roy, Lydia; Cahn, Jean-Yves; Harrison, Claire N.; Radia, Deepti; Muxi, Pablo; Maldonado, Norman; Besses, Carlos; Cervantes, Francisco; Johansson, Peter L.; Barbui, Tiziano; Barosi, Giovanni; Vannucchi, Alessandro M.; Passamonti, Francesco; Andreasson, Bjorn; Ferarri, Maria L.; Rambaldi, Alessandro; Samuelsson, Jan; Birgegard, Gunnar; Tefferi, Ayalew; Mesa, Ruben A.
2012-01-01
Purpose Myeloproliferative neoplasm (MPN) symptoms are troublesome to patients, and alleviation of this burden represents a paramount treatment objective in the development of MPN-directed therapies. We aimed to assess the utility of an abbreviated symptom score for the most pertinent and representative MPN symptoms for subsequent serial use in assessing response to therapy. Patients and Methods The Myeloproliferative Neoplasm Symptom Assessment Form total symptom score (MPN-SAF TSS) was calculated as the mean score for 10 items from two previously validated scoring systems. Questions focus on fatigue, concentration, early satiety, inactivity, night sweats, itching, bone pain, abdominal discomfort, weight loss, and fevers. Results MPN-SAF TSS was calculable for 1,408 of 1,433 patients with MPNs who had a mean score of 21.2 (standard deviation [SD], 16.3). MPN-SAF TSS results significantly differed among MPN disease subtypes (P < .001), with a mean of 18.7 (SD, 15.3), 21.8 (SD, 16.3), and 25.3 (SD, 17.2) for patients with essential thrombocythemia, polycythemia vera, and myelofibrosis, respectively. The MPN-SAF TSS strongly correlated with overall quality of life (QOL; r = 0.59; P < .001) and European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire C30 (EORTC QLQ-C30) functional scales (all P < .001 and absolute r ≥ 0.50 except social functioning r = 0.48). No significant trends were present when comparing therapy subgroups. The MPN-SAF TSS had excellent internal consistency (Cronbach's α = .83). Factor analysis identified a single underlying construct, indicating that the MPN-SAF TSS is an appropriate, unified scoring method. Conclusion The MPN-SAF TSS is a concise, valid, and accurate assessment of MPN symptom burden with demonstrated clinical utility in the largest prospective MPN symptom study to date. This new prospective scoring method may be used to assess MPN symptom burden in both clinical practice and trial settings.
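The internal-consistency figure reported above (Cronbach's α = .83) follows the standard alpha formula over the 10 TSS items. A minimal sketch, assuming complete data; the function names are illustrative, not from the paper:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item score lists,
    each holding one score per subject (no missing data)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_var = sum(var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum_item_var / var(totals))
```

With perfectly correlated items alpha equals 1; with unrelated items it falls toward 0.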
Fitzpatrick, R; Norquist, J M; Jenkinson, C; Reeves, B C; Morris, R W; Murray, D W; Gregg, P J
2004-03-01
The purpose of this study was to examine whether there are advantages, in terms of outcome assessment, of using Rasch methods of scoring the 12-item Oxford Hip Score (OHS) questionnaire over conventional Likert scoring. As part of a prospective cohort study of total hip replacements in five former regions of England, the OHS was sent to patients pre-operatively and at 3 months and 1 year post-operatively. Post-operative data were collected on over 5000 cases. Based on the level of satisfaction with surgery, patients were divided into satisfied and dissatisfied groups. Analyses were performed to test the relative precision (RP) of Rasch scoring vs. conventional Likert scoring in discriminating between groups experiencing different levels of satisfaction. Considerable gains in precision were achieved with Rasch scoring methods when groups were compared 3 and 12 months post-operatively. The results from the current study suggest that in some situations there may be substantial gains in measuring health-related outcomes using Rasch-based scoring methods.
Danek, Barbara Anna; Karatasakis, Aris; Karmpaliotis, Dimitri; Alaswad, Khaldoon; Yeh, Robert W; Jaffer, Farouc A; Patel, Mitul P; Mahmud, Ehtisham; Lombardi, William L; Wyman, Michael R; Grantham, J Aaron; Doing, Anthony; Kandzari, David E; Lembo, Nicholas J; Garcia, Santiago; Toma, Catalin; Moses, Jeffrey W; Kirtane, Ajay J; Parikh, Manish A; Ali, Ziad A; Karacsonyi, Judit; Rangan, Bavana V; Thompson, Craig A; Banerjee, Subhash; Brilakis, Emmanouil S
2016-10-11
High success rates are achievable for chronic total occlusion (CTO) percutaneous coronary intervention (PCI) using the hybrid approach, but periprocedural complications remain of concern. Although scores estimating success and efficiency in CTO PCI have been developed, there is currently no available score for estimating the risk of periprocedural complications. We sought to develop a scoring tool for prediction of periprocedural complications during CTO PCI. We analyzed data from 1569 CTO PCIs in the Prospective Global Registry for the Study of Chronic Total Occlusion Intervention (PROGRESS CTO) using a derivation and validation sampling ratio of 2:1. Variables independently associated with periprocedural complications in multivariable analysis in the derivation set were assigned points based on their respective odds ratios. Forty-four (2.8%) patients experienced complications. Three factors were independent predictors of complications and were included in the score: patient age >65 years, +3 points (odds ratio [OR] 4.85, CI 1.82-16.77); lesion length ≥23 mm, +2 points (OR 3.22, CI 1.08-13.89); and use of the retrograde approach, +1 point (OR 2.41, CI 1.04-6.05). The resulting score showed good calibration and discriminatory capacity in the derivation (Hosmer-Lemeshow χ(2) 6.271, P=0.281, receiver-operating characteristic [ROC] area=0.758) and validation (Hosmer-Lemeshow χ(2) 4.551, P=0.473, ROC area=0.793) sets. Score values of 0 to 2, 3 to 4, and ≥5 were defined as low, intermediate, and high risk of complications (derivation cohort: 0.4%, 1.8%, and 6.5%, respectively). The PROGRESS CTO complication score is a useful tool for prediction of periprocedural complications in CTO PCI. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02061436. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
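The three weighted predictors above make the PROGRESS CTO registry's complication score trivial to compute. A minimal sketch; the function names are illustrative, while the point weights and risk bands are those reported in the abstract:

```python
def complication_score(age_gt_65, lesion_len_ge_23mm, retrograde):
    """Sum the published point weights: age >65 years -> +3,
    lesion length >=23 mm -> +2, retrograde approach -> +1."""
    return 3 * bool(age_gt_65) + 2 * bool(lesion_len_ge_23mm) + bool(retrograde)

def risk_category(score):
    """Risk bands from the abstract: 0-2 low, 3-4 intermediate, >=5 high."""
    if score <= 2:
        return "low"
    if score <= 4:
        return "intermediate"
    return "high"
```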
P V Pradeep
2013-01-01
Full Text Available Context: No single factor can predict the occurrence of post total thyroidectomy (TT) hypocalcemia. Aims: This study was conducted to look at various factors usually implicated in post-TT clinically significant hypocalcemia (CSH) and to develop a scoring system using a combination of these factors to predict CSH. Settings and Design: Prospective study, tertiary care center. Materials and Methods: 145 patients who underwent total thyroidectomy for benign goiters and early carcinoma thyroid were studied. A hypocalcemia prediction score > 3 had 91% sensitivity and 84% specificity, with a PPV of 71% and NPV of 95%, whereas a score of ≥ 4 had 100% specificity and PPV in predicting CSH. Conclusions: CSH after TT is multi-factorial, and a combination of factors (hypocalcemia prediction score > 3) can be used to predict it, so as to discharge patients within 24 hours after surgery.
Thomsen, Morten G; Latifi, Roshan; Kallemose, Thomas
2016-01-01
Background and purpose - When evaluating the outcome after total knee arthroplasty (TKA), increasing emphasis has been put on patient satisfaction and ability to perform activities of daily living. To address this, the forgotten joint score (FJS) for assessment of knee awareness has been developed...
SF-36 total score as a single measure of health-related quality of life: Scoping review
Lins, Liliane; Carvalho, Fernando Martins
2016-01-01
According to the developers of the 36-Item Short Form Health Survey questionnaire, a global measure of health-related quality of life such as the “SF-36 Total/Global/Overall Score” cannot be generated from the questionnaire. However, studies keep on reporting such a measure. This study aimed to evaluate the frequency, and to describe some characteristics, of articles reporting the SF-36 Total/Global/Overall Score in the scientific literature. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses method was adapted to a scoping review. We performed searches in the PubMed, Web of Science, SCOPUS, BVS, and Cochrane Library databases for articles using such scores. We found 172 articles published between 1997 and 2015; 110 (64.0%) of them were published from 2010 onwards, and 30.0% appeared in journals with an Impact Factor of 3.00 or greater. Overall, 129 (75.0%) of the 172 studies did not specify the method for calculating the “SF-36 Total Score”; 13 studies did not specify their methods but referred to the SF-36 developers' studies or others; and 30 articles used different strategies for calculating such a score, the most frequent being arithmetic averaging of the eight SF-36 domain scores. We concluded that the “SF-36 Total/Global/Overall Score” has been increasingly reported in the scientific literature. Researchers should be aware of this procedure and of its possible impacts upon human health. PMID:27757230
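The strategy most often found in the reviewed articles, arithmetic averaging of the eight domain scores, is easy to state precisely. A minimal sketch; the function name is an assumption, and, as the review stresses, this "total" is not endorsed by the questionnaire developers:

```python
def sf36_pseudo_total(domain_scores):
    """Arithmetic average of the eight SF-36 domain scores (each 0-100).
    This mirrors the most frequent strategy found in the scoping review;
    it is NOT an official SF-36 scoring method."""
    if len(domain_scores) != 8:
        raise ValueError("SF-36 has exactly eight domain scores")
    return sum(domain_scores) / 8.0
```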
Tan, Yahang; Zhou, Jia; Zhang, Wei; Zhou, Ying; Du, Luoshan; Tian, Feng; Guo, Jun; Chen, Lian; Cao, Feng; Chen, Yundai
2017-05-15
We sought to evaluate the ability of the CT-RECTOR and J-CTO scores to predict time-efficient guidewire (GW) crossing through a chronic total occlusion (CTO) and final procedure success. Data from 191 consecutive CTO lesions with pre-procedural coronary computed tomography angiography (CCTA) from our center were analyzed retrospectively. The difficulty of the procedure was classified as easy, intermediate, difficult, or very difficult according to the CT-RECTOR and J-CTO scores. A successful GW crossing within 30 min was set as the first endpoint. Final success of the procedure was set as the second endpoint. Receiver operating characteristic curves and net reclassification improvement (NRI) were used to compare the performance of both scores in predicting both endpoints. The first and second endpoints were achieved in 55% and 76% of lesions, respectively. The NRIs for prediction of the first and second endpoints were 30.21% and 28.94%, respectively. Use of the CT-RECTOR score demonstrated a positive NRI for both the first (p=0.0027) and second (p=0.0190) endpoints. Compared with the J-CTO score (area under the curve: 0.76), the CT-RECTOR score (area under the curve: 0.85) yielded a higher predictive value for successful GW crossing within 30 min (p=0.0018). Compared with J-CTO, the CT-RECTOR scoring system provides a more accurate noninvasive tool for predicting time-efficient GW crossing and final procedure success. This scoring system, which is based on CCTA, may aid in the identification of very difficult CTO lesions and in downstream management. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Harry X.ZHANG; Shaw L.YU
2008-01-01
One of the key challenges in the total maximum daily load (TMDL) development process is how to define the critical condition for a receiving waterbody. The main concern in using a continuous simulation approach is the absence of any guarantee that the most critical condition will be captured during the selected representative hydrologic period, given the scarcity of long-term continuous data. The objectives of this paper are to clearly address the critical condition in the TMDL development process and to compare continuous and event-based approaches to defining the critical condition during TMDL development for a waterbody impacted by both point and nonpoint source pollution. A practical, event-based critical flow-storm (CFS) approach was developed to explicitly address the critical condition as a combination of a low stream flow and a storm event of a selected magnitude, both having certain frequencies of occurrence. This paper illustrated the CFS concept and provided its theoretical basis using a derived analytical conceptual model. The CFS approach clearly defined a critical condition, obtained reasonable results, and can be considered an alternative method in TMDL development.
Cassidy, J Tristan; Phillips, Michael; Fatovich, Daniel; Duke, Janine; Edgar, Dale; Wood, Fiona
2014-08-01
There is limited research validating the injury severity score (ISS) in burns. We examined the concordance of the ISS with burn mortality. We hypothesized that adding age and total body surface area (TBSA) burned to the ISS gives a more accurate mortality risk estimate. Data from the Royal Perth Hospital Trauma Registry and the Royal Perth Hospital Burns Minimum Data Set were linked. The area under the receiver operating characteristic curve (AUC) measured concordance of the ISS with mortality. Using logistic regression models with death as the dependent variable, we developed a burn-specific injury severity score (BISS). There were 1344 burns with 24 (1.8%) deaths, median TBSA 5% (IQR 2-10), and median age 36 years (IQR 23-50). The results show the ISS is a good predictor of death for burns when ISS≤15 (OR 1.29, p=0.02), but not for ISS>15 (ISS 16-24: OR 1.09, p=0.81; ISS 25-49: OR 0.81, p=0.19). Comparison of the AUCs adjusted for age, gender, and cause (ISS: 84%, 95% CI 82-85%; BISS: 95%, 95% CI 92-98%) demonstrated superior performance of the BISS as a mortality predictor for burns. The ISS is a poor predictor of death in severe burns. The BISS combines the ISS with age and TBSA and performs significantly better than the ISS. Copyright © 2013 Elsevier Ltd and ISBI. All rights reserved.
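The concordance statistic used above (AUC) has a simple pairwise interpretation: the probability that a randomly chosen death carries a higher severity score than a randomly chosen survivor. A minimal sketch of the Mann-Whitney form; function and variable names are illustrative:

```python
def roc_auc(scores_deaths, scores_survivors):
    """ROC area as the fraction of (death, survivor) pairs in which the
    death has the higher severity score; tied scores count as 0.5."""
    wins = 0.0
    for d in scores_deaths:
        for s in scores_survivors:
            if d > s:
                wins += 1.0
            elif d == s:
                wins += 0.5
    return wins / (len(scores_deaths) * len(scores_survivors))
```

An AUC of 1.0 means every death outscored every survivor; 0.5 is no better than chance.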
Kim, Man S; Koh, In J; Choi, Young J; Lee, Jong Y; In, Yong
2017-05-01
The purpose of this study was to compare patient-reported outcomes regarding joint awareness, function, and satisfaction after unicompartmental knee arthroplasty (UKA) and total knee arthroplasty (TKA). We identified all patients who underwent a UKA or TKA at our institution between September 2011 and March 2014, with a minimum follow-up of 2 years. Propensity score matching was performed for age, gender, body mass index, operation side, and the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) score. One hundred UKAs were matched to 100 TKAs. Each knee was evaluated according to the WOMAC score, Forgotten Joint Score (FJS), High Flexion Knee Score (HFKS), and patient satisfaction at 2 years postoperatively. There was no significant difference in WOMAC score at 2 years between the UKA and TKA groups. However, the FJS of the UKA group was significantly higher than that of the TKA group (67.3 ± 19.8 and 60.6 ± 16.6, respectively; P = .011). The HFKS was also significantly higher in the UKA group than in the TKA group (34.4 ± 6.4 and 31.3 ± 5.2, respectively; P < .001). Eighty-six percent of all patients who underwent UKA were satisfied, compared with 71% of those who underwent TKA (P = .027). Patients who underwent UKA had a higher FJS, HFKS, and satisfaction rate than patients who underwent TKA, indicating that UKA produced less knee awareness and better function and satisfaction than TKA. Copyright © 2016 Elsevier Inc. All rights reserved.
Christopoulos, Georgios; Wyman, R. Michael; Alaswad, Khaldoon; Karmpaliotis, Dimitri; Lombardi, William; Grantham, J. Aaron; Yeh, Robert W.; Jaffer, Farouc A.; Cipher, Daisha J.; Rangan, Bavana V.; Christakopoulos, Georgios E.; Kypreos, Megan A.; Lembo, Nicholas; Kandzari, David; Garcia, Santiago; Thompson, Craig A.; Banerjee, Subhash; Brilakis, Emmanouil S.
2015-01-01
Background The performance of the J-CTO score in predicting success and efficiency of chronic total occlusion (CTO) percutaneous coronary intervention (PCI) has received limited study. Methods and Results We examined the records of 650 consecutive patients who underwent CTO PCI between 2011 and 2014 at six experienced centers in the United States. Six hundred and fifty-seven lesions were classified as easy (J-CTO=0), intermediate (J-CTO=1), difficult (J-CTO=2), and very difficult (J-CTO≥3). The impact of the J-CTO score on technical success and procedure time was evaluated with univariable logistic and linear regression, respectively. The performance of the logistic regression model was assessed with the Hosmer-Lemeshow statistic and receiver operator characteristic curves. Antegrade wiring techniques were used more frequently in easy lesions (97%) than very difficult lesions (58%), whereas the retrograde approach became more frequent with increasing lesion difficulty (41% for very difficult lesions vs. 13% for easy lesions). The logistic regression model for technical success demonstrated satisfactory calibration and discrimination (P for Hosmer-Lemeshow=0.743 and area under curve=0.705). The J-CTO score was associated with a two-fold increase in the odds of technical failure (odds ratio 2.04, 95% confidence interval [CI] 1.52-2.80). Procedure time increased with the J-CTO score (regression coefficient 22.33, 95% CI 17.45-27.22). The J-CTO score was strongly associated with final success and efficiency in this study, supporting its expanded use in CTO interventions. Clinical Trial Registration URL: http://www.clinicaltrials.gov. Unique identifier: NCT02061436. PMID:26162857
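The four difficulty groups used above are a direct binning of the J-CTO score. A minimal sketch; the function name is an assumption, while the bins are those stated in the abstract:

```python
def jcto_group(jcto_score):
    """Map a J-CTO score (non-negative integer) to the difficulty groups
    used in the study: 0 easy, 1 intermediate, 2 difficult, >=3 very difficult."""
    groups = {0: "easy", 1: "intermediate", 2: "difficult"}
    return groups.get(jcto_score, "very difficult")
```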
Rounds, Stewart A.; Sullivan, Annett B.
2013-01-01
Flow and water-quality models are being used to support the development of Total Maximum Daily Load (TMDL) plans for the Klamath River downstream of Upper Klamath Lake (UKL) in south-central Oregon. For riverine reaches, the RMA-2 and RMA-11 models were used, whereas the CE-QUAL-W2 model was used to simulate pooled reaches. The U.S. Geological Survey (USGS) was asked to review the most upstream of these models, from Link River Dam at the outlet of UKL downstream through the first pooled reach of the Klamath River from Lake Ewauna to Keno Dam. Previous versions of these models were reviewed in 2009 by USGS. Since that time, important revisions were made to correct several problems and address other issues. This review documents an assessment of the revised models, with emphasis on the model revisions and any remaining issues. The primary focus of this review is the 19.7-mile Lake Ewauna to Keno Dam reach of the Klamath River that was simulated with the CE-QUAL-W2 model. Water spends far more time in the Lake Ewauna to Keno Dam reach than in the 1-mile Link River reach that connects UKL to the Klamath River, and most of the critical reactions affecting water quality upstream of Keno Dam occur in that pooled reach. This model review includes assessments of years 2000 and 2002 current conditions scenarios, which were used to calibrate the model, as well as a natural conditions scenario that was used as the reference condition for the TMDL and was based on the 2000 flow conditions. The natural conditions scenario included the removal of Keno Dam, restoration of the Keno reef (a shallow spot that was removed when the dam was built), removal of all point-source inputs, and derivation of upstream boundary water-quality inputs from a previously developed UKL TMDL model. This review examined the details of the models, including model algorithms, parameter values, and boundary conditions; the review did not assess the draft Klamath River TMDL or the TMDL allocations
A Reconsideration of the Self-Compassion Scale’s Total Score: Self-Compassion versus Self-Criticism
López, Angélica; Sanderman, Robbert; Smink, Ans; Zhang, Ying; van Sonderen, Eric; Ranchor, Adelita; Schroevers, Maya J.
2015-01-01
The Self-Compassion Scale (SCS) is currently the only self-report instrument to measure self-compassion. The SCS is widely used despite the limited evidence for the scale’s psychometric properties, with validation studies commonly performed in college students. The current study examined the factor structure, reliability, and construct validity of the SCS in a large representative sample from the community. The study was conducted in 1,736 persons, of whom 1,643 were included in the analyses. Besides the SCS, data were collected on positive and negative indicators of psychological functioning, as well as on rumination and neuroticism. Analyses included confirmatory factor analysis (CFA), exploratory factor analysis (EFA), and correlations. CFA showed that the SCS’s proposed six-factor structure could not be replicated. EFA suggested a two-factor solution, formed by the positively and negatively formulated items, respectively. Internal consistency was good for the two identified factors. The negative factor (i.e., the sum score of the negatively formulated items) correlated moderately to strongly with negative affect, depressive symptoms, perceived stress, rumination, and neuroticism. Compared with this negative factor, the positive factor (i.e., the sum score of the positively formulated items) correlated more weakly with these indicators, and relatively more strongly with positive affect. Results from this study do not justify the common use of the SCS total score as an overall indicator of self-compassion, and provide support for the idea, as also assumed by others, that it is important to make a distinction between self-compassion and self-criticism. PMID:26193654
Perfetti, Dean C; Schwarzkopf, Ran; Buckland, Aaron J; Paulino, Carl B; Vigdorchik, Jonathan M
2017-05-01
Lumbar-pelvic fusion reduces the variation in pelvic tilt in functional situations by reducing lumbar spine flexibility, which is thought to be important in maintaining stability of a total hip arthroplasty (THA). We compared dislocation and revision rates for patients with lumbar fusion and subsequent THA to a matched comparison cohort with hip and spine degenerative changes undergoing only THA. We identified patients in New York State who underwent primary elective lumbar fusion for degenerative disc disease pathology and subsequent THA between January 2005 and December 2012. A propensity score match was performed to compare 934 patients with prior lumbar fusion to 934 patients with only THA according to age, gender, race, Deyo comorbidity score, year of surgery, and surgeon volume. Revision and dislocation rates were assessed at 3, 6, and 12 months post-THA. At 12 months, patients with prior lumbar fusion had significantly increased rates of THA dislocation (control: 0.4%; fusion: 3.0%; P < .001) and revision (control: 0.9%; fusion: 3.9%; P < .001). At 12 months, fusion patients were 7.19 times more likely to dislocate their THA (P < .001) and 4.64 times more likely to undergo revision (P < .001). Patients undergoing lumbar fusion and subsequent THA have significantly higher risks of dislocation and revision of their hip arthroplasty than a matched cohort of patients with similar hip and spine pathology but only undergoing THA. During preoperative consultation for patients with prior lumbar fusion, orthopedic surgeons must educate the patient and family about the increased risk of dislocation and revision. Copyright © 2016 Elsevier Inc. All rights reserved.
Cesaroni, Claudio; Spogli, Luca; Alfonsi, Lucilla; De Franceschi, Giorgiana; Ciraolo, Luigi; Francisco Galera Monico, Joao; Scotto, Carlo; Romano, Vincenzo; Aquino, Marcio; Bougard, Bruno
2015-12-01
This work presents a contribution to the understanding of the ionospheric triggering of L-band scintillation in the region over São Paulo state in Brazil, under high solar activity. In particular, a climatological analysis of Global Navigation Satellite Systems (GNSS) data acquired in 2012 is presented to highlight the relationship between intensity and variability of the total electron content (TEC) gradients and the occurrence of ionospheric scintillation. The analysis is based on the GNSS data acquired by a dense distribution of receivers and exploits the integration of a dedicated TEC calibration technique into the Ground Based Scintillation Climatology (GBSC), previously developed at the Istituto Nazionale di Geofisica e Vulcanologia. Such integration enables representing the local ionospheric features through climatological maps of calibrated TEC and TEC gradients and of amplitude scintillation occurrence. The disentanglement of the contribution to the TEC variations due to zonal and meridional gradients conveys insight into the relation between the scintillation occurrence and the morphology of the TEC variability. The importance of the information provided by the TEC gradients variability and the role of the meridional TEC gradients in driving scintillation are critically described.
de Castro-Filho, Antonio; Lamas, Edgar Stroppa; Meneguz-Moreno, Rafael A; Staico, Rodolfo; Siqueira, Dimytri; Costa, Ricardo A; Braga, Sergio N; Costa, J Ribamar; Chamié, Daniel; Abizaid, Alexandre
2017-06-01
The present study examined the association between the Multicenter CTO Registry in Japan (J-CTO) score in predicting failure of percutaneous coronary intervention (PCI) and the estimated duration of chronic total occlusion (CTO). The J-CTO score does not incorporate the estimated duration of the occlusion. This was an observational retrospective study of all consecutive procedures performed at a single tertiary-care cardiology center between January 2009 and December 2014. A total of 174 patients, median age 59.5 years (interquartile range [IQR], 53-65 years), undergoing CTO-PCI were included. The median estimated occlusion duration was 7.5 months (IQR, 4.0-12.0 months). The lesions were classified as easy (score = 0), intermediate (score = 1), difficult (score = 2), and very difficult (score ≥3) in 51.1%, 33.9%, 9.2%, and 5.7% of the patients, respectively. Failure rate significantly increased with higher J-CTO score (7.9%, 20.3%, 50.0%, and 70.0% in groups with J-CTO scores of 0, 1, 2, and ≥3, respectively). The J-CTO score predicted failure of CTO-PCI independently of the estimated occlusion duration (P=.24). Areas under receiver-operating characteristic curves were computed, and it was observed that for each occlusion time period the discriminatory capacity of the J-CTO score in predicting CTO-PCI failure was good, with a C-statistic >0.70. The estimated duration of occlusion had no influence on the J-CTO score performance in predicting failure of PCI in CTO lesions. The probability of failure was mainly determined by the grade of lesion complexity.
Hazel Denzil Dias
2014-06-01
Full Text Available Background: Falls are a major problem in the elderly, leading to increased morbidity and mortality in this population. Scores from objective clinical measures of balance have frequently been associated with falls in older adults. The Berg Balance Scale (BBS), a frequently used scale to test balance impairments in the elderly, takes time to perform and has been found to have scoring inconsistencies. The purpose was to determine if individual items or a group of BBS items would have better accuracy than the total BBS in classifying community dwelling elderly individuals according to fall history. Method: 60 community dwelling elderly individuals were chosen based on a history of falls in this cross-sectional study. Each BBS item was dichotomized at three points along the scoring scale of 0-4: between scores of 1 and 2, 2 and 3, and 3 and 4. Sensitivity (Sn), specificity (Sp), and positive (+LR) and negative (-LR) likelihood ratios were calculated for all items for each scoring dichotomy, based on their accuracy in classifying subjects with a history of multiple falls. These findings were compared with the total BBS score, for which the cut-off score was derived from receiver operating characteristic (ROC) curve analysis. Results: On analysing combinations of BBS items, B9 and B11 were found to have the best sensitivity and specificity when considered together. However, the area under the curve (AUC) of these items was 0.799, which did not match that of the total score (AUC = 0.837). A combination of 4 BBS items (B9, B11, B12, and B13) also had good Sn and Sp, but the AUC was 0.815. The combination with the AUC closest to that of the total score was items B11 and B13 (AUC = 0.824); hence these two items can be used as the best predictor of falls, with a cut-off of 6.5. The ROC curve of the total BBS scores revealed a cut-off score of 48.5. Conclusion: This study showed that the combination of items B11 and B13 may be the best predictor of falls in community dwelling elderly individuals.
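The item-level statistics reported above (Sn, Sp, +LR, -LR) all derive from a single 2x2 table per dichotomized BBS item. A minimal sketch; the function and key names are illustrative:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from the counts of
    true/false positives and negatives for one dichotomized item
    (positive = classified as a faller)."""
    sn = tp / (tp + fn)   # sensitivity
    sp = tn / (tn + fp)   # specificity
    return {"Sn": sn, "Sp": sp,
            "+LR": sn / (1 - sp),   # positive likelihood ratio
            "-LR": (1 - sn) / sp}   # negative likelihood ratio
```

A +LR well above 1 (and a -LR well below 1) indicates a dichotomy that usefully shifts the odds of a fall history.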
Hazel Denzil Dias
2014-09-01
Metsna, Vahur; Vorobjov, Sigrid; Lepik, Katrin; Märtson, Aare
2014-08-01
Attempts to relate patellar cartilage involvement to anterior knee pain (AKP) have yielded conflicting results. We determined whether the condition of the cartilage of the patella at the time of knee replacement, as assessed by the OARSI score, correlates with postsurgical AKP. We prospectively studied 100 patients undergoing knee arthroplasty. At surgery, we photographed and biopsied the articular surface of the patella, leaving the patella unresurfaced. Following determination of the microscopic grade of the patellar cartilage lesion and the stage by analyzing the intraoperative photographs, we calculated the OARSI score. We interviewed the patients 1 year after knee arthroplasty using the HSS patella score for diagnosis of AKP. 57 of 95 patients examined had AKP. The average OARSI score of painless patients was 13 (6-20) and that of patients with AKP was 15 (6-20) (p = 0.04). Patients with OARSI scores of 13-24 had 50% higher risk of AKP (prevalence ratio = 1.5, 95% CI: 1.0-2.3) than patients with OARSI scores of 0-12. The depth and extent of the cartilage lesion of the knee-cap should be considered when deciding between the various options for treatment of the patella during knee replacement.
NSGIC State | GIS Inventory — Environmental Modeling dataset current as of 1999. Florida Adopted TMDLs. What is a TMDL (Total Maximum Daily Load)? A scientific determination of the maximum amount...
Tomitaka, Shinichiro; Kawasaki, Yohei; Ide, Kazuki; Akutagawa, Maiko; Yamada, Hiroshi; Yutaka, Ono; Furukawa, Toshiaki A
2017-01-01
Several recent studies have shown that total scores on depressive symptom measures in a general population approximate an exponential pattern except for the lower end of the distribution. Furthermore, we confirmed that the exponential pattern is present for the individual item responses on the Center for Epidemiologic Studies Depression Scale (CES-D). To confirm the reproducibility of such findings, we investigated the total score distribution and item responses of the Kessler Screening Scale for Psychological Distress (K6) in a nationally representative study. Data were drawn from the National Survey of Midlife Development in the United States (MIDUS), which comprises four subsamples: (1) a national random digit dialing (RDD) sample, (2) oversamples from five metropolitan areas, (3) siblings of individuals from the RDD sample, and (4) a national RDD sample of twin pairs. K6 items are scored using a 5-point scale: "none of the time," "a little of the time," "some of the time," "most of the time," and "all of the time." The patterns of the total score distribution and item responses were analyzed using graphical analysis and an exponential regression model. The total score distributions of the four subsamples exhibited an exponential pattern with similar rate parameters. The item responses of the K6 approximated a linear pattern from "a little of the time" to "all of the time" on log-normal scales, whereas the "none of the time" response was not related to this exponential pattern. The total score distribution and item responses of the K6 showed exponential patterns, consistent with other depressive symptom scales.
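An exponential total-score distribution means log-frequencies fall on a straight line, with the rate parameter equal to the negative of the fitted slope. An illustrative check (not the authors' analysis code):

```python
import math

def exponential_fit(freqs):
    """Least-squares slope and intercept of log(frequency) vs. score.

    An approximately exponential score distribution gives a good linear
    fit on the log scale; the exponential rate parameter is -slope.
    Zero-frequency scores are skipped (log undefined).
    """
    pts = [(x, math.log(f)) for x, f in enumerate(freqs) if f > 0]
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept
```

Comparing the fitted rate parameters across subsamples is then a direct way to express "exponential pattern with similar rate parameters."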
Jin, Ying; Myers, Nicholas D.; Ahn, Soyeon
2014-01-01
Previous research has demonstrated that differential item functioning (DIF) methods that do not account for multilevel data structure could result in too frequent rejection of the null hypothesis (i.e., no DIF) when the intraclass correlation coefficient (ρ) of the studied item was the same as the ρ of the total score. The current study extended…
Nevens, Daan; Deschuymer, Sarah; Langendijk, Johannes A.; Daisne, Jean -Francois; Duprez, Frederic; De Neve, Wilfried; Nuyts, Sandra
Background and purpose: A risk model, the total dysphagia risk score (TDRS), was developed to predict which patients are most at risk of developing grade ≥2 dysphagia at 6 months following radiotherapy (RT) for head and neck cancer. The purpose of this study was to validate this model at 6 months and
Barnett, Gillian C., E-mail: gillbarnett@doctors.org.uk [University of Cambridge Department of Oncology, Oncology Centre, Cambridge (United Kingdom); Cancer Research-UK Centre for Genetic Epidemiology and Department of Oncology, Strangeways Research Laboratories, Cambridge (United Kingdom); West, Catharine M.L. [School of Cancer and Enabling Sciences, Manchester Academic Health Science Centre, University of Manchester, Christie Hospital, Manchester (United Kingdom); Coles, Charlotte E. [University of Cambridge Department of Oncology, Oncology Centre, Cambridge (United Kingdom); Pharoah, Paul D.P. [Cancer Research-UK Centre for Genetic Epidemiology and Department of Oncology, Strangeways Research Laboratories, Cambridge (United Kingdom); Talbot, Christopher J. [Department of Genetics, University of Leicester, Leicester (United Kingdom); Elliott, Rebecca M. [School of Cancer and Enabling Sciences, Manchester Academic Health Science Centre, University of Manchester, Christie Hospital, Manchester (United Kingdom); Tanteles, George A. [Department of Clinical Genetics, University Hospitals of Leicester, Leicester (United Kingdom); Symonds, R. Paul [Department of Cancer Studies and Molecular Medicine, University Hospitals of Leicester, Leicester (United Kingdom); Wilkinson, Jennifer S. [University of Cambridge Department of Oncology, Oncology Centre, Cambridge (United Kingdom); Dunning, Alison M. [Cancer Research-UK Centre for Genetic Epidemiology and Department of Oncology, Strangeways Research Laboratories, Cambridge (United Kingdom); Burnet, Neil G. [University of Cambridge Department of Oncology, Oncology Centre, Cambridge (United Kingdom); Bentzen, Soren M. [University of Wisconsin, School of Medicine and Public Health, Department of Human Oncology, Madison, WI (United States)
2012-03-01
Purpose: The search for clinical and biologic biomarkers associated with late radiotherapy toxicity is hindered by the use of multiple and different endpoints from a variety of scoring systems, hampering comparisons across studies and pooling of data. We propose a novel metric, the Standardized Total Average Toxicity (STAT) score, to try to overcome these difficulties. Methods and Materials: STAT scores were derived for 1010 patients from the Cambridge breast intensity-modulated radiotherapy trial and 493 women from University Hospitals of Leicester. The sensitivity of the STAT score to detect differences between patient groups, stratified by factors known to influence late toxicity, was compared with that of individual endpoints. Analysis of residuals was used to quantify the effect of these covariates. Results: In the Cambridge cohort, STAT scores detected differences (p < 0.00005) between patients attributable to breast volume, surgical specimen weight, dosimetry, acute toxicity, radiation boost to tumor bed, postoperative infection, and smoking (p < 0.0002), with no loss of sensitivity over individual toxicity endpoints. Diabetes (p = 0.017), poor postoperative surgical cosmesis (p = 0.0036), use of chemotherapy (p = 0.0054), and increasing age (p = 0.041) were also associated with increased STAT score. When the Cambridge and Leicester datasets were combined, STAT was associated with smoking status (p < 0.00005), diabetes (p = 0.041), chemotherapy (p = 0.0008), and radiotherapy boost (p = 0.0001). STAT was independent of the toxicity scale used and was able to deal with missing data. There were correlations between residuals of the STAT score obtained using different toxicity scales (r > 0.86, p < 0.00005 for both datasets). Conclusions: The STAT score may be used to facilitate the analysis of overall late radiation toxicity, from multiple trials or centers, in studies of possible genetic and nongenetic determinants of radiotherapy toxicity.
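As described, STAT standardizes each toxicity endpoint across patients and averages the available z-scores per patient, which is also how it tolerates missing data. A simplified sketch (hypothetical data; assumes each endpoint has non-zero variance):

```python
from statistics import mean, pstdev

def stat_scores(toxicity):
    """toxicity[p][e] is endpoint e for patient p, or None if missing.

    Each endpoint is z-standardized across the patients with data, then
    each patient's STAT score is the mean of their available z-scores;
    missing endpoints are simply skipped.
    """
    n_end = len(toxicity[0])
    params = []
    for e in range(n_end):
        vals = [row[e] for row in toxicity if row[e] is not None]
        params.append((mean(vals), pstdev(vals)))
    out = []
    for row in toxicity:
        zs = [(row[e] - params[e][0]) / params[e][1]
              for e in range(n_end) if row[e] is not None]
        out.append(mean(zs))
    return out
```

Because every endpoint is put on the same standardized scale, scores pooled from different toxicity scales become comparable, which is the property the abstract emphasizes.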
Yu, Cheol-Woong; Lee, Hyun-Jong; Suh, Jon; Lee, Nae-Hee; Park, Sang-Min; Park, Taek Kyu; Yang, Jeong Hoon; Song, Young Bin; Hahn, Joo-Yong; Choi, Seung Hyuk; Gwon, Hyeon-Cheol; Lee, Sang-Hoon; Choe, Yeon Hyeon; Kim, Sung Mok; Choi, Jin-Ho
2017-04-01
We developed a model that predicts difficulty of percutaneous coronary intervention for coronary chronic total occlusion (CTO) using coronary computed tomographic angiography. A total of 684 CTO lesions with preprocedural computed tomographic angiography were enrolled from 4 centers. Data were randomly divided into derivation and validation datasets at a 2:1 ratio. The end point was successful guidewire crossing ≤30 minutes, which was met in 50%. The KCCT (Korean Multicenter CTO CT Registry) score was developed based on independent predictors identified by multivariable analysis, which were proximal blunt entry, proximal side branch, bending, occlusion length ≥15 mm, severe calcification, whole luminal calcification, reattempt, and ≥12 months or unknown duration of occlusion. The KCCT score was compared with the other prediction scores, including the angiography-based J-CTO, PROGRESS-CTO and CL scores, and the CT-based CT-RECTOR score. The probability of guidewire crossing ≤30 minutes declined consistently from 100% to 0% with increasing KCCT score, supporting its use in planning CTO percutaneous coronary intervention. © 2017 American Heart Association, Inc.
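Additive risk scores of this kind are essentially a count of binary predictors. A hypothetical sketch using the predictors named in the abstract (the one-point-per-predictor weighting and the key names are assumptions for illustration):

```python
# Predictor names taken from the abstract; dict keys are hypothetical.
KCCT_PREDICTORS = [
    "proximal_blunt_entry", "proximal_side_branch", "bending",
    "occlusion_length_ge_15mm", "severe_calcification",
    "whole_luminal_calcification", "reattempt",
    "duration_ge_12_months_or_unknown",
]

def kcct_score(lesion):
    """Sum one point per predictor present; `lesion` maps names to bools.

    Absent keys count as False, so partial records still score.
    """
    return sum(1 for p in KCCT_PREDICTORS if lesion.get(p, False))
```

A higher count then maps to a lower probability of guidewire crossing within 30 minutes, per the abstract's trend.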
Kennen, Jonathan G.; Riskin, Melissa L.; Reilly, Pamela A.; Colarullo, Susan J.
2013-01-01
More than 300 ambient monitoring sites in New Jersey have been identified by the New Jersey Department of Environmental Protection (NJDEP) in its integrated water-quality monitoring and assessment report (that is, the 305(b) Report on general water quality and 303(d) List of waters that do not support their designated uses) as being impaired with respect to aquatic life; however, no unambiguous stressors (for example, nutrients or bacteria) have been identified. Because of the indeterminate nature of the broad range of possible impairments, surrogate measures that more holistically encapsulate the full suite of potential environmental stressors need to be developed. Streamflow alteration resulting from anthropogenic changes in the landscape is one such surrogate. For example, increases in impervious surface cover (ISC) commonly cause increases in surface runoff, which can result in “flashy” hydrology and other changes in the stream corridor that are associated with streamflow alteration. The NJDEP has indicated that methodologies to support a hydrologically based Total Maximum Daily Load (hydro-TMDL) need to be developed in order to identify hydrologic targets that represent a minimal percent deviation from a baseline condition (“minimally altered”) as a surrogate measure to meet criteria in support of designated uses. The primary objective of this study was to develop an applicable hydro-TMDL approach to address aquatic-life impairments associated with hydrologic alteration for New Jersey streams. The U.S. Geological Survey, in cooperation with the NJDEP, identified 51 non- to moderately impaired gaged streamflow sites in the Raritan River Basin for evaluation. Quantile regression (QR) analysis was used to compare flow and precipitation records and identify baseline hydrographs at 37 of these sites. At sites without an appropriately long period of record (POR) or where a baseline hydrograph could not be identified with QR, a rainfall-runoff model was used
Nagarajan, Rajakumar; Iqbal, Zohaib; Burns, Brian; Wilson, Neil E; Sarma, Manoj K; Margolis, Daniel A; Reiter, Robert E; Raman, Steven S; Thomas, M Albert
2015-11-01
The overlap of metabolites is a major limitation in one-dimensional (1D) spectral-based single-voxel MRS and multivoxel-based MRSI. By combining echo planar spectroscopic imaging (EPSI) with a two-dimensional (2D) J-resolved spectroscopic (JPRESS) sequence, 2D spectra can be recorded in multiple locations in a single slice of prostate using four-dimensional (4D) echo planar J-resolved spectroscopic imaging (EP-JRESI). The goal of the present work was to validate two different non-linear reconstruction methods independently using compressed sensing-based 4D EP-JRESI in prostate cancer (PCa): maximum entropy (MaxEnt) and total variation (TV). Twenty-two patients with PCa with a mean age of 63.8 years (range, 46-79 years) were investigated in this study. A 4D non-uniformly undersampled (NUS) EP-JRESI sequence was implemented on a Siemens 3-T MRI scanner. The NUS data were reconstructed using two non-linear reconstruction methods, namely MaxEnt and TV. Using both TV and MaxEnt reconstruction methods, the following observations were made in cancerous compared with non-cancerous locations: (i) higher mean (choline + creatine)/citrate metabolite ratios; (ii) increased levels of (choline + creatine)/spermine and (choline + creatine)/myo-inositol; and (iii) decreased levels of (choline + creatine)/(glutamine + glutamate). We have shown that it is possible to accelerate the 4D EP-JRESI sequence by four times and that the data can be reliably reconstructed using the TV and MaxEnt methods. The total acquisition duration was less than 13 min and we were able to detect and quantify several metabolites.
Kearney Rebecca S
2012-02-01
Full Text Available Abstract Background The Achilles tendon Total Rupture Score was developed by a research group in 2007 in response to the need for a patient reported outcome measure for this patient population. Beyond this original development paper, no further validation studies have been published. Consequently the purpose of this study was to evaluate internal consistency, convergent validity and responsiveness of this newly developed patient reported outcome measure within patients who have sustained an isolated acute Achilles tendon rupture. Methods Sixty-four eligible patients with an acute rupture of their Achilles tendon completed the Achilles tendon Total Rupture Score alongside two further patient reported outcome measures (Disability Rating Index and EQ-5D). These were completed at baseline, six weeks, three months, six months and nine months post injury. The Achilles tendon Total Rupture Score was evaluated for internal consistency, using Cronbach's alpha, convergent validity, through correlation analysis, and responsiveness, by analysing floor and ceiling effects and calculating its relative efficiency in comparison to the Disability Rating Index and EQ-5D scores. Results The Achilles tendon Total Rupture Score demonstrated high internal consistency (Cronbach's alpha > 0.8) and correlated significantly with the comparison measures. Conclusions A universally accepted outcome measure is imperative to allow comparisons to be made across practice. This is the first study to evaluate aspects of validity of this newly developed outcome measure, outside of the developing centre. The ATRS demonstrated high internal consistency and responsiveness, with limited convergent validity. This research provides further support for the use of this outcome measure, however further research is required to advocate its universal use in patients with acute Achilles tendon ruptures. Such areas include inter-rater reliability and research to determine the minimally clinically important difference between scores.
Kearney, Rebecca S; Achten, Juul; Lamb, Sarah E; Parsons, Nicholas; Costa, Matthew L
2012-02-29
The Achilles tendon Total Rupture Score was developed by a research group in 2007 in response to the need for a patient reported outcome measure for this patient population. Beyond this original development paper, no further validation studies have been published. Consequently the purpose of this study was to evaluate internal consistency, convergent validity and responsiveness of this newly developed patient reported outcome measure within patients who have sustained an isolated acute Achilles tendon rupture. Sixty-four eligible patients with an acute rupture of their Achilles tendon completed the Achilles tendon Total Rupture Score alongside two further patient reported outcome measures (Disability Rating Index and EQ-5D). These were completed at baseline, six weeks, three months, six months and nine months post injury. The Achilles tendon Total Rupture Score was evaluated for internal consistency, using Cronbach's alpha, convergent validity, through correlation analysis, and responsiveness, by analysing floor and ceiling effects and calculating its relative efficiency in comparison to the Disability Rating Index and EQ-5D scores. The Achilles tendon Total Rupture Score demonstrated high internal consistency (Cronbach's alpha > 0.8) and correlated significantly with the comparison measures. A universally accepted outcome measure is imperative to allow comparisons to be made across practice. This is the first study to evaluate aspects of validity of this newly developed outcome measure, outside of the developing centre. The ATRS demonstrated high internal consistency and responsiveness, with limited convergent validity. This research provides further support for the use of this outcome measure, however further research is required to advocate its universal use in patients with acute Achilles tendon ruptures. Such areas include inter-rater reliability and research to determine the minimally clinically important difference between scores.
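Cronbach's alpha, the internal-consistency statistic used above, can be computed directly from item-score columns. A self-contained sketch (illustrative, not the study's code):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score lists.

    items[i][j] is item i's score for respondent j. Uses the standard
    formula alpha = k/(k-1) * (1 - sum(item variances) / var(total score)),
    with population variances.
    """
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    totals = [sum(col) for col in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Values above 0.8, as reported for the ATRS, are conventionally read as high internal consistency.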
Koshiishi, H.; Kimoto, Y.; Matsumoto, H.; Goka, T.
The Tsubasa satellite, developed by the Japan Aerospace Exploration Agency, was launched in February 2002 into Geostationary Transfer Orbit (GTO; perigee 500 km, apogee 36,000 km) and operated well until September 2003. The objective of this satellite was to verify the function of commercial parts and new technologies of bus-system components in space; thus, the on-board experiments were conducted in the more severe radiation environment of GTO rather than in Geostationary Earth Orbit (GEO) or Low Earth Orbit (LEO). The Space Environment Data Acquisition equipment (SEDA) on board the Tsubasa satellite had the Single-event Upset Monitor (SUM) and the DOSimeter (DOS) to evaluate influences on electronic devices caused by the radiation environment, which was also measured by the particle detectors of the SEDA: the Standard DOse Monitor (SDOM) for measurements of light particles and the Heavy Ion Telescope (HIT) for measurements of heavy ions. The SUM monitored single-event upsets and single-event latch-ups occurring in a test sample of two 64-Mbit DRAMs. The DOS measured accumulated radiation dose at fifty-six locations in the body of the Tsubasa satellite. Using the data obtained by these instruments, single-event and total-dose effects in GTO during the solar-activity maximum period, especially their rapid changes due to solar flares and CMEs in the region from L = 1.1 through L = 11, are discussed in this paper.
Ovalle, E. M.; Bravo, M. A.; Villalobos, C. U.; Foppiano, A. J.
2013-10-01
Ionospheric variability observed prior to major earthquakes has been studied for decades. In particular, in many such studies the identification of ionospheric precursors of large earthquakes has been regarded as a specific goal. This paper analyses observations of the maximum electron concentration (NmF2) over Concepción (36.8°S; 73.0°W) and of the total electron content (TEC) for an area covering the rupture zone corresponding to the very large Chile earthquake of 27 February 2010. The analyses used here are similar to those published before for many earthquakes in Taiwan, Japan and Russia. Possible NmF2 and TEC precursors are compared with other precursors proposed for the same earthquake using different TEC determinations and satellite observations of electron/ion concentration, energetic particle bursts and electromagnetic emissions. Some possible precursors derived from the various observations are consistent with each other. However, none can be unambiguously associated with the Chilean earthquake.
Hamilton, D F; Loth, F L; Giesinger, J M; Giesinger, K; MacDonald, D J; Patton, J T; Simpson, A H R W; Howie, C R
2017-02-01
To validate the English language Forgotten Joint Score-12 (FJS-12) as a tool to evaluate the outcome of hip and knee arthroplasty in a United Kingdom population. All patients undergoing surgery between January and August 2014 were eligible for inclusion. Prospective data were collected from 205 patients undergoing total hip arthroplasty (THA) and 231 patients undergoing total knee arthroplasty (TKA). Outcomes were assessed with the FJS-12 and the Oxford Hip and Knee Scores (OHS, OKS) pre-operatively, then at six and 12 months post-operatively. Internal consistency, convergent validity, effect size, relative validity and ceiling effects were determined. Data for the TKA and THA patients showed high internal consistency for the FJS-12 (Cronbach α = 0.97 in TKAs, 0.98 in THAs). Convergent validity with the Oxford Scores was high (r = 0.85 in TKAs, r = 0.79 for THAs). From six to 12 months, the change was higher for the FJS-12 than for the OHS in THA patients (effect size d = 0.21 versus -0.03). Ceiling effects at one-year follow-up were low for the FJS-12 with just 3.9% (TKA) and 8.8% (THA) of patients achieving the best possible score. The FJS-12 has strong measurement properties in terms of validity, internal consistency and sensitivity to change in TKA and THA patients. Low ceiling effects and good relative validity allow the monitoring of longer term outcomes, particularly in well-performing groups after total joint arthroplasty. Cite this article: Bone Joint J 2017;99-B:218-24. ©2017 Hamilton et al.
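The ceiling effect reported above is simply the share of respondents achieving the best possible score. A trivial but explicit sketch (hypothetical scores):

```python
def ceiling_effect(scores, best):
    """Percentage of respondents at the best possible score `best`."""
    return 100.0 * sum(1 for s in scores if s == best) / len(scores)
```

For the FJS-12 this would be applied with `best=100`; the low percentages (3.9% TKA, 8.8% THA) are what allow the instrument to discriminate among well-performing patients.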
Roller, F C; Harth, S; Rixe, J; Krombach, G A; Schneider, C
2016-02-01
Analyzing occluded segments with computed tomography angiography (CTA) prior to percutaneous coronary intervention (PCI) has increased revascularization success in chronic total occlusions (CTO). The aim of our study was to develop a scoring system for the prediction of PCI success in CTO. 41 consecutive CTO patients (30 male; 63.1 years ± 8.3 standard deviation) underwent CTA prior to PCI. All CTOs were categorized by two radiologists in consensus regarding the presence of special features and without knowledge of PCI outcome. All outcome criteria were evaluated. Afterwards, one point was assigned for each unequally distributed outcome criterion per CTO, and all points were added up to a single score. Severe calcifications were more frequent in the failure group (68.8% vs. 28.0% in the success group). CTA criteria predictive of PCI outcome in CTO have been identified. Success rates are improved by analyzing CTA data sets prior to revascularization approaches. Prediction of revascularization success via a scoring system based on five CTA criteria seems promising. Patient selection for the right treatment options might be improved in the future due to application of the scoring system. Also risks, complications, contrast media amounts and radiation doses might be reduced. © Georg Thieme Verlag KG Stuttgart · New York.
Zimmerman, Marc J.; Qian, Yu; Tian, Yong Q.
2011-01-01
In 2004, the Total Maximum Daily Load (TMDL) for Total Phosphorus in the Assabet River, Massachusetts, was approved by the U.S. Environmental Protection Agency. The goal of the TMDL was to decrease the concentrations of the nutrient phosphorus to mitigate some of the instream ecological effects of eutrophication on the river; these effects were, for the most part, direct consequences of the excessive growth of aquatic macrophytes. The primary instrument effecting lower concentrations of phosphorus was to be strict control of phosphorus releases from four major wastewater-treatment plants in Westborough, Marlborough, Hudson, and Maynard, Massachusetts. The improvements to be achieved from implementing this control were lower concentrations of total and dissolved phosphorus in the river, a 50-percent reduction in aquatic-plant biomass, a 30-percent reduction in episodes of dissolved oxygen supersaturation, no low-flow dissolved oxygen concentrations less than 5.0 milligrams per liter, and a 90-percent reduction in sediment releases of phosphorus to the overlying water. In 2007, the U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, initiated studies to evaluate conditions in the Assabet River prior to the upgrading of wastewater-treatment plants to remove more phosphorus from their effluents. The studies, completed in 2008, implemented a visual monitoring plan to evaluate the extent and biomass of the floating macrophyte Lemna minor (commonly known as lesser duckweed) in five impoundments and evaluated the potential for phosphorus flux from sediments in impounded and free-flowing reaches of the river. Hydrologically, the two study years 2007 and 2008 were quite different. In 2007, summer streamflows, although low, were higher than average, and in 2008, the flows were generally higher than in 2007. Visually, the effects of these streamflow differences on the distribution of Lemna were obvious. In 2007, large amounts of
Matsumoto, Mikio; Baba, Tomonori; Ochi, Hironori; Ozaki, Yu; Watari, Taiji; Homma, Yasuhiro; Kaneko, Kazuo
2017-04-25
The purpose of this study was to examine the influence of the contralateral hip state on postoperative assessment using the Forgotten Joint Score-12 (FJS-12) in comparison with the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) and the Japanese Orthopaedic Association Hip Disease Evaluation Questionnaire (JHEQ). One hundred and thirty-four hips underwent total hip arthroplasty (THA) between 2014 and 2015. Of these, the subjects were 106 hips with degenerative hip arthrosis as a primary disease for whom initial THA was performed on the affected side. The WOMAC and JHEQ were investigated before surgery and 1 month, 6 months, and 1 year after surgery. The FJS-12 was examined 1 month, 6 months, and 1 year after surgery. We divided the subjects into three groups based on the state of the contralateral hip, which was not surgically treated in this study: healthy (n = 43), THA (n = 31), and OA (n = 31) groups. One year after surgery, the mean FJS-12 scores in the healthy, THA, and OA groups were 69.1, 52.8, and 68.0 points, respectively. In the THA group, the score was significantly lower than in the healthy and OA groups. There were no significant differences in WOMAC and JHEQ scores among the three groups. The FJS-12 score in the presence of an arthroplasty on the contralateral side was more markedly influenced by the contralateral hip state compared with that in the presence of contralateral painful OA. This result suggests that it is necessary to understand the characteristics of PROs and utilize them for post-THA assessment.
Moussa, Mohamed E; Lee, Yuo-Yu; Westrich, Geoffrey H; Mehta, Nabil; Lyman, Stephen; Marx, Robert G
2017-02-01
Attaining stability during total knee arthroplasty (TKA) is essential for a successful outcome. Although traditional constrained total knee prostheses have generally been used in conjunction with intramedullary stems, some devices have been widely used without stems, referred to as non-modular constrained condylar total knee arthroplasty (NMCCK). The aim of this study was to compare revision rates after total knee replacement with an NMCCK implant versus a posterior-stabilized (PS) design. Between 2007 and 2012, primary PS total knees were compared with NMCCK implants from the same manufacturer. Propensity score matching was performed, and implant survivorship was examined using a Cox proportional hazards model. The cohort consisted of 817 PS knees and 817 NMCCKs matched for patient demographics, surgeon volume, and pre-operative diagnosis. All-cause revisions occurred in 11 of 817 (1.35%) in the PS group compared to 28 of 817 (3.43%) in the NMCCK group (p = 0.0168). Excluding revisions for infection and fracture, 8 of 817 (0.98%) PS knees required revision for mechanical failure compared to 18 of 817 (2.20%) NMCCK knees (p = 0.0193). While revision rates in both cohorts were low, there was a significantly higher revision rate with NMCCKs. Given that cases requiring the use of NMCCK implants are likely more complex than those in which PS implants are used, our findings support the judicious use of NMCCK prostheses.
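Propensity score matching, as used here, pairs each treated unit with the closest-scoring unused control. A greedy 1:1 sketch on precomputed propensity scores (hypothetical IDs; real analyses typically also apply a caliper and then fit the outcome model on the matched pairs):

```python
def greedy_match(treated, controls):
    """1:1 greedy nearest-neighbour matching on propensity scores.

    treated/controls: lists of (id, propensity). Each treated unit, taken
    in order of propensity, is paired with the closest unused control.
    Returns a list of (treated_id, control_id) pairs.
    """
    pool = list(controls)
    pairs = []
    for tid, ps in sorted(treated, key=lambda t: t[1]):
        if not pool:
            break
        j = min(range(len(pool)), key=lambda i: abs(pool[i][1] - ps))
        pairs.append((tid, pool.pop(j)[0]))
    return pairs
```

The propensity itself would come from a logistic regression of treatment (NMCCK vs. PS) on the matching covariates named in the abstract.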
Gronewold, A. D.; Alameddine, I.; Anderson, R.; Wolpert, R.; Reckhow, K.
2008-12-01
The United States Environmental Protection Agency (USEPA) total maximum daily load (TMDL) program requires that individual states assess the condition of surface waters and identify those which fail to meet ambient water quality standards. Waters failing to meet those standards must have a TMDL assessment conducted to determine the maximum allowable pollutant load which can enter the water without violating water quality standards. While most of the nearly 30,000 TMDL assessments completed since 1995 use mechanistic or empirical water quality models to forecast water quality conditions under alternative pollutant loading reduction scenarios, few, if any, also simulate water quality conditions under alternative climate change scenarios. As a result, model-based loading reduction requirements (which serve as the cornerstone for implementing water resource management plans, and initiating environmental management infrastructure projects), believed to improve water quality in impaired waters and reinstate their designated use, may misrepresent the actual required reduction when future climate change scenarios are considered. For example, recent research indicates a potential long term future increase in both the number of days between, and the intensity of, individual precipitation events. In coastal terrestrial and aquatic ecosystems, such climate conditions could lead to an increased accumulation of pollutants on the landscape between precipitation events, followed by a washoff event with a relatively high pollutant load. On the other hand, anticipated increases in average temperature and evaporation rate might not only reduce effective rainfall rates (resulting in less energy for transporting pollutants from the landscape) but also reduce the tidal exchange ratio in shallow estuaries (many of which are valuable recreational, commercial, and aesthetic natural resources). Here, we develop and apply a comprehensive watershed-scale model for simulating water quality in
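At its core, a TMDL is a maximum allowable load: streamflow times the water-quality criterion concentration, with unit conversion. An illustrative calculation (simplified; actual TMDLs also subtract a margin of safety and divide the load among point and nonpoint sources):

```python
def tmdl_daily_load(flow_cfs, criterion_mg_per_l):
    """Maximum daily load in kg/day for a given flow and criterion.

    flow_cfs: streamflow in cubic feet per second.
    criterion_mg_per_l: water-quality criterion (1 mg/L = 1 g/m^3).
    Uses 1 cfs = 2446.58 m^3/day (0.0283168 m^3/s * 86400 s).
    """
    m3_per_day = flow_cfs * 2446.58
    grams_per_day = m3_per_day * criterion_mg_per_l
    return grams_per_day / 1000.0  # g -> kg
```

Because the allowable load scales with flow, climate-driven shifts in precipitation and streamflow, as the abstract argues, directly change the loading reductions a TMDL implies.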
Hafeez, Rehana; Boulos, Paul [University College London Hospitals NHS Trust, Department of Surgery, London (United Kingdom); Punwani, Shonit; Halligan, Steve [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospitals NHS Trust, Department of Specialist X-ray, Level 2 podium, London (United Kingdom); Pendse, Doug [University College London, Centre for Medical Imaging, London (United Kingdom); Bloom, Stuart [University College London Hospitals NHS Trust, Department of Gastroenterology, London (United Kingdom); Taylor, Stuart A. [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospitals NHS Trust, Department of Specialist X-ray, Level 2 podium, London (United Kingdom)
2011-02-15
To derive an MRI score for assessing severity, therapeutic response and prognosis in acute severe inflammatory colitis. Twenty-one patients with acute severe colitis underwent colonic MRI after admission and again (n = 16) after a median of 5 days of treatment. Using T2-weighted images, two radiologists in consensus graded segmental haustral loss, mesenteric and mural oedema, mural thickness, and small bowel and colonic dilatation, producing a total colonic inflammatory score (TCIS, range 6-95). Pre- and post-treatment TCIS were compared, and correlated with CRP, stool frequency, and number of inpatient days (therapeutic response marker). A questionnaire assessing patient worry, satisfaction and discomfort, graded 1 (bad) to 7 (good), was administered. Admission TCIS correlated significantly with CRP (Kendall's tau = 0.45, 95% confidence interval [CI] 0.11-0.79, p = 0.006) and stool frequency (Kendall's tau 0.39, 95% CI 0.14-0.64, p = 0.02). TCIS fell after treatment from median 22 [range 15-31] to median 20 [range 8-25], p = 0.01. Admission TCIS, but not CRP or stool frequency, correlated with length of inpatient stay (Kendall's tau 0.40, 95% CI 0.11-0.69, p = 0.02). Patients reported some discomfort (median score 4) during MRI. The MRI TCIS falls after therapy, correlates with existing markers of disease severity, and in comparison may better predict therapeutic response. (orig.)
Burt, Richard K; Cohen, Bruce A; Russell, Eric; Spero, Kenneth; Joshi, Akash; Oyama, Yu; Karpus, William J; Luo, Kehuan; Jovanovic, Borko; Traynor, Ann; Karlin, Karyn; Stefoski, Dusan; Burns, William H
2003-10-01
Twenty-one patients with rapidly progressive multiple sclerosis (MS) were treated in a phase 1/2 study of intense immune suppressive therapy and autologous hematopoietic stem cell (HSC) support, with no 1-year mortality. Following transplantation, one patient had a confirmed acute attack of MS. Disability, as measured by the Expanded Disability Status Scale (EDSS), did not increase by 1.0 or more points in any of the 9 patients with a pretransplantation EDSS of 6.0 or less. In 8 of 12 patients with high pretransplantation disability scores (EDSS > 6.0), progressive neurologic disability, defined as at least a 1-point increase in the EDSS, occurred and was manifested as gradual neurologic deterioration. Two patients with a pretransplantation EDSS of 7.0 and 8.0 died from complications of progressive disease at 13 and 18 months following treatment. Our experience suggests that intense immune suppression using a total body irradiation (TBI)-based regimen and hematopoietic stem cell transplantation (HSCT) is not effective for patients with progressive disease and high pretransplantation disability scores. Further studies are necessary to determine the role of intense immune suppressive therapy and HSC support in ambulatory patients with less accumulated disability and more inflammatory disease activity. Specifically, more patients and longer follow-up would be required in patients with an EDSS of 6.0 or less before drawing conclusions about this subgroup.
Neff, Kristin D; Whittaker, Tiffany A; Karl, Anke
2017-01-31
This study examined the factor structure of the Self-Compassion Scale (SCS) using a bifactor model, a higher order model, a 6-factor correlated model, a 2-factor correlated model, and a 1-factor model in 4 distinct populations: college undergraduates (N = 222), community adults (N = 1,394), individuals practicing Buddhist meditation (N = 215), and a clinical sample of individuals with a history of recurrent depression (N = 390). The 6-factor correlated model demonstrated the best fit across samples, whereas the 1- and 2-factor models had poor fit. The higher order model also showed relatively poor fit across samples, suggesting it is not representative of the relationship between subscale factors and a general self-compassion factor. The bifactor model, however, had acceptable fit in the student, community, and meditator samples. Although fit was suboptimal in the clinical sample, results suggested an overall self-compassion factor could still be interpreted with some confidence. Moreover, estimates suggested a general self-compassion factor accounted for at least 90% of the reliable variance in SCS scores across samples, and item factor loadings and intercepts were equivalent across samples. Results suggest that a total SCS score can be used as an overall measure of self-compassion.
Hojat, Mohammadreza; Gonnella, Joseph S; Nasca, Thomas J; Fields, Sylvia K; Cicchetti, Americo; Lo Scalzo, Alessandra; Taroni, Francesco; Amicosante, Anna Maria Vincenza; Macinati, Manuela; Tangucci, Massimo; Liva, Carlo; Ricciardi, Gualtiero; Eidelman, Shmuel; Admi, Hanna; Geva, Hana; Mashiach, Tanya; Alroy, Gideon; Alcorta-Gonzalez, Adelina; Ibarra, David; Torres-Ruiz, Antonio
2003-05-01
This cross-cultural study was designed to compare the attitudes of physicians and nurses toward physician-nurse collaboration in the United States, Israel, Italy and Mexico. A total of 2522 physicians and nurses completed the Jefferson Scale of Attitudes Toward Physician-Nurse Collaboration (15 Likert-type items; Hojat et al., Evaluation and the Health Professions 22 (1999a) 208; Nursing Research 50 (2001) 123). They were compared on the total scores and four factors of the Jefferson Scale (shared education and team work, caring as opposed to curing, nurses' autonomy, physicians' dominance). Results showed inter- and intra-cultural similarities and differences among the study groups, providing support for social role theory (Hardy and Conway, Role Theory: Perspectives for Health Professionals, Appleton-Century-Crofts, New York, 1978) and the principle of least interest (Waller and Hill, The Family: A Dynamic Interpretation, Dryden, New York, 1951) in inter-professional relationships. Implications for promoting physician-nurse education and inter-professional collaboration are discussed.
Hansen, Maria Swennergren; Christensen, Marianne; Budolfsen, Thomas; Østergaard, Thomas Friis; Kallemose, Thomas; Troelsen, Anders; Barfod, Kristoffer Weisskirchner
2016-04-01
To investigate how the Achilles tendon Total Rupture Score (ATRS) at 3 months and 1 year after injury is associated with a patient's ability to return to work and sports, and whether sex and age influence ATRS at 3 months and 1 year. This is a retrospective study analysing data from the Danish Achilles tendon Database. A total of 366 patients were included. Logistic regression was conducted to describe the effect of ATRS on return to work and sports; the effect of age and sex on ATRS was analysed by linear regression. Three months after injury, patients with a higher ATRS had a significantly increased chance of having returned to sport at 1 year (OR 1.06 per point, p = 0.001) but no significant effect on return to work. After 1 year, a higher ATRS was associated with a significantly increased probability of having returned to sport (OR 1.11, p < 0.001) and to work (OR 1.05, p = 0.007). Men had on average a 7-point (p = 0.006) higher ATRS at 3 months and a 22-point (p = 0.006) higher ATRS at 1 year. ATRS is associated with patients' ability to return to sports and work, and ATRS at 3 months can be used as a predictor of the ability to return to sports after 1 year. Hereby, ATRS might help to individualise rehabilitation by identifying patients who do not respond adequately to the chosen treatment. Level of evidence: II.
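The per-point odds ratios reported here come from logistic regression, where the OR for a one-point increase in ATRS is the exponential of the fitted slope. A minimal sketch of that calculation, using synthetic data (not the Danish registry data; the sample size, seed, and true OR of 1.06 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic illustration: ATRS at 3 months and return-to-sport status at
# 1 year, generated with a true per-point odds ratio of 1.06.
n = 2000
atrs = rng.uniform(20, 100, size=n)
logit_true = np.log(1.06) * (atrs - 60.0)
returned = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_true)))

# Fit logistic regression by Newton-Raphson on [intercept, slope].
X = np.column_stack([np.ones(n), atrs])
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))      # predicted probabilities
    grad = X.T @ (returned - p)              # gradient of log-likelihood
    hess = X.T @ (X * (p * (1.0 - p))[:, None])  # observed information
    beta += np.linalg.solve(hess, grad)

odds_ratio = np.exp(beta[1])  # OR per one-point increase in ATRS
```

A per-point OR compounds multiplicatively: at OR 1.06, a 10-point higher ATRS multiplies the odds of returning to sport by roughly 1.06^10, about 1.8.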
Ferreira-Valente, Alexandra; Costa, Patrício; Elorduy, Marta; Virumbrales, Montserrat; Costa, Manuel J; Palés, Jorge
2016-09-19
Empathy is a key aspect of physician-patient interactions. The Jefferson Scale of Empathy (JSE) is one of the most widely used empathy measures for medical students. The development of cross-cultural empathy studies depends on valid and reliable translations of the JSE. This study sought to: (1) adapt and assess the psychometric properties, in Spanish students, of the Spanish JSE validated in Mexican students; (2) test a second-order latent factor model. The Spanish JSE was adapted from the Spanish JSE-S, resulting in a final version of the measure. A non-probabilistic sample of 1104 medical students from two Spanish medical schools completed a socio-demographic questionnaire and the Spanish JSE-S. Descriptive statistics, along with a confirmatory factor analysis, the average variance extracted (AVE), Cronbach's alphas and composite reliability (CR) coefficients, were computed. An independent samples t-test was performed to assess sex differences. The Spanish JSE-S demonstrated acceptable to good sensitivity (individual items, except for item 2, and JSE-S total score: -2.72 …). The confirmatory factor analysis supported the three-factor solution and the second-order latent factor model. The findings provide support for the sensitivity, construct validity and reliability of the adapted Spanish JSE-S with Spanish medical students. The data confirm the hypothesized second-order latent factor model. This version may be useful in future research examining empathy in Spanish medical students, as well as in cross-cultural studies.
Chapple, Christopher R; Drake, Marcus J; Van Kerrebroeck, Philip; Cardozo, Linda; Drogendijk, Ted; Klaver, Monique; Van Charldorp, Karin; Hakimi, Zalmai; Compion, Gerhard
2014-05-01
The term lower urinary tract symptoms (LUTS) encompasses a range of urinary symptoms, including storage symptoms (e.g. overactive bladder [OAB]) as well as voiding and post-micturition symptoms. Although treatment of male LUTS tends to focus on voiding symptoms, patients typically find storage symptoms the most bothersome. The core storage symptom is urgency, which drives the other main storage symptoms of increased daytime frequency, nocturia and incontinence. Although several validated questionnaires have been widely used to study urgency, few measure the two important storage parameters, urgency and frequency, in a single assessment. The total urgency and frequency score (TUFS) is a new validated tool that captures both variables and is derived from the Patient Perception of Intensity of Urgency Scale, which has been validated in patients with OAB and LUTS. The TUFS was first validated in OAB in the phase IIa BLOSSOM study, which was designed to assess the efficacy and safety of mirabegron, a β3 -adrenoceptor agonist, in 260 patients. The responsiveness of the TUFS to treatment has been confirmed in a further three large-scale randomized controlled trials of solifenacin in patients with OAB or LUTS. Changes in TUFS from baseline to end of treatment were consistent with changes in micturition diary variables in all four studies. Furthermore, the TUFS was significantly correlated with several health-related quality-of-life variables in the phase III NEPTUNE study. Thus, the TUFS appears to be useful for assessing improvements in major storage symptoms (urgency and frequency) in clinical trials.
Doganlar, Oguzhan; Doganlar, Zeynep Banu; Tabakcioglu, Kiymet
2015-10-01
In this study, we aimed to investigate the mutagenic and carcinogenic potential of a volatile organic compound (VOC) mixture by examining the response of Drosophila melanogaster using selected antioxidant gene expressions, a RAPD assay, and base-pair changes in ribosomal 18S and internal transcribed spacer (ITS2) rDNA gene sequences. For this purpose, D. melanogaster Oregon R, reared under controlled conditions on artificial diets, were treated with a mixture of thirteen VOCs commonly found in water at concentrations of 10, 20, 50, and 75 ppb for 1 and 5 days. In the random amplified polymorphic DNA (RAPD) assay, band changes were clearly detected, especially at the 50 and 75 ppb exposure levels, for both treatment periods; the band profiles exhibited clear differences between treated and untreated flies, with changes in band intensity and the loss/appearance of bands. Quantitative real-time PCR (qRT-PCR) analysis of Mn-superoxide dismutase (Mn-SOD), catalase (CAT) and glutathione synthetase (GS) expression demonstrated that these markers responded significantly to VOC-induced oxidative stress. While CAT expression increased linearly with increasing VOC concentrations and treatment times, the 50- and 75-ppb treatments caused decreases in GS expression compared to the control at 5 days. Treatment with VOCs at both exposure times, especially at high doses, caused mutation of the 18S and ITS2 ribosomal DNA. These results suggest that VOCs at the maximum permissible contamination level can cause genotoxic effects, especially in mixtures.
Kinkhabwala, Ali
2013-01-01
The most fundamental problem in statistics is the inference of an unknown probability distribution from a finite number of samples. For a specific observed data set, answers to the following questions would be desirable: (1) Estimation: Which candidate distribution provides the best fit to the observed data? (2) Goodness-of-fit: How concordant is this distribution with the observed data? (3) Uncertainty: How concordant are other candidate distributions with the observed data? A simple unified approach for univariate data that addresses these traditionally distinct statistical notions, called "maximum fidelity", is presented. Maximum fidelity is a strict frequentist approach that is fundamentally based on model concordance with the observed data. The fidelity statistic is a general information measure based on the coordinate-independent cumulative distribution and critical yet previously neglected symmetry considerations. An approximation for the null distribution of the fidelity allows its direct conversi...
Rasmussen, Teresa J.; Paxson, Chelsea R.
2017-08-25
Municipalities in Johnson County in northeastern Kansas are required to implement stormwater management programs to reduce pollutant discharges, protect water quality, and comply with applicable water-quality regulations in accordance with National Pollutant Discharge Elimination System permits for stormwater discharge. To this end, municipalities collect grab samples at streams entering and leaving their jurisdictions to determine levels of excessive nutrients, sediment, and fecal bacteria, to characterize pollutants, and to understand the factors affecting them. In 2014, the U.S. Geological Survey and the Johnson County Stormwater Management Program, with input from the Kansas Department of Health and Environment, initiated a 5-year monitoring program to satisfy minimum sampling requirements for each municipality as described by new stormwater permits issued to Johnson County municipalities. The purpose of this report is to provide a preliminary assessment of the monitoring program. The monitoring program is described, a preliminary assessment of the monitoring program design is provided using water-quality data collected during the first 2 years of the program, and the ability of the current monitoring network and sampling plan to provide data sufficient to quantify improvements in water quality resulting from implemented and planned best management practices is evaluated. The information in this initial report may be used, while data collection is still ongoing, to evaluate changes in data collection methods that may improve data utility. Discrete water-quality samples were collected at 27 sites and analyzed for nutrients, Escherichia coli (E. coli) bacteria, total suspended solids, and suspended-sediment concentration. In addition, continuous water-quality data (water temperature, pH, dissolved oxygen, specific conductance, turbidity, and nitrate plus nitrite) were collected at one site to characterize variability and provide a basis for comparison to discrete
Tanaka, Hiroyuki; Morino, Yoshihiro; Abe, Mitsuru; Kimura, Takeshi; Hayashi, Yasuhiko; Muramatsu, Toshiya; Ochiai, Masahiko; Noguchi, Yuichi; Kato, Kenichi; Shibata, Yoshisato; Hiasa, Yoshikazu; Doi, Osamu; Yamashita, Takehiro; Morimoto, Takeshi; Hinohara, Tomoaki; Fujii, Toshiharu; Mitsudo, Kazuaki
2016-01-22
We investigated the impact of the J-CTO score, a pre-procedural risk score for successful guidewire crossing within 30 minutes through chronic total occlusion (CTO) lesions, on procedural and midterm clinical outcomes in terms of target lesion revascularisation (TLR) after CTO recanalisation. The primary endpoint of this substudy was midterm TLR. The net midterm success rate was calculated by multiplying the lesion success rate by the TLR-free survival rate. The initial lesion success rates according to the J-CTO score categories of 0, 1, 2, and ≥3 were 97.0%, 92.1%, 86.5%, and 73.6%, respectively (p…). The TLR rates according to the J-CTO score categories of 0, 1, 2, and ≥3 were 5.3%, 11.1%, 16.7%, and 13.4%, respectively (p=0.082). The net midterm success rates according to the J-CTO score categories of 0, 1, 2, and ≥3 were 91.9%, 81.9%, 72.1%, and 63.7%, respectively (p…). CTO lesions with lower J-CTO scores are expected to achieve a high procedural success rate and an increased TLR-free survival rate. Patients with high J-CTO scores still remain an issue.
S.G Zuh; Ö. Nagy; Ancuța Zazgyva; O.M. Russu; Gergely, I; T.S. Pop
2014-01-01
Total hip replacement is one of the most frequently performed orthopaedic interventions that can significantly improve the functional status and the quality of life of patients suffering from hip arthrosis. Recently patient satisfaction and patient-reported results of total hip arthroplasty are increasingly emphasised as important tools for the assessments of these interventions. For patients with arthrosis secondary to hip dysplasia, these evaluations can be more difficult, due to younger ag...
Wetke, Eva; Zerahn, Bo; Kofoed, Hakon
2012-01-01
We hypothesized that a total replacement of the first metatarsophalangeal joint (MTP-1) would alter the walking pattern with medialisation of the ground reaction force (GRF) of the foot and subsequently cause an increase in bone mineral density (BMD) in the medial metatarsal bones and a decline...
Eggleston, Jack
2009-01-01
Due to elevated levels of methylmercury in fish, three streams in the Shenandoah Valley of Virginia have been placed on the State's 303d list of contaminated waters. These streams, the South River, the South Fork Shenandoah River, and parts of the Shenandoah River, are downstream from the city of Waynesboro, where mercury waste was discharged from 1929-1950 at an industrial site. To evaluate mercury contamination in fish, this total maximum daily load (TMDL) study was performed in a cooperative effort between the U.S. Geological Survey, the Virginia Department of Environmental Quality, and the U.S. Environmental Protection Agency. The investigation focused on the South River watershed, a headwater of the South Fork Shenandoah River, and extrapolated findings to the other affected downstream rivers. A numerical model of the watershed, based on Hydrological Simulation Program-FORTRAN (HSPF) software, was developed to simulate flows of water, sediment, and total mercury. Results from the investigation and numerical model indicate that contaminated flood-plain soils along the riverbank are the largest source of mercury to the river. Mercury associated with sediment accounts for 96 percent of the annual downstream mercury load (181 of 189 kilograms per year) at the mouth of the South River. Atmospherically deposited mercury contributes a smaller load (less than 1 percent) as do point sources, including current discharge from the historic industrial source area. In order to determine how reductions of mercury loading to the stream could reduce methylmercury concentrations in fish tissue below the U.S. Environmental Protection Agency criterion of 0.3 milligrams per kilogram, multiple scenarios were simulated. Bioaccumulation of mercury was expressed with a site-specific exponential relation between aqueous total mercury and methylmercury in smallmouth bass, the indicator fish species. Simulations indicate that if mercury loading were to decrease by 98.9 percent from 189
Apgar score (//medlineplus.gov/ency/article/003402.htm). … birth. Virginia Apgar, MD (1909-1974) introduced the Apgar score in 1952. How the Test is Performed: The …
Brucki S.M.D.
2004-01-01
Verbal fluency tests are used as a measure of executive functions and language, and can also be used to evaluate semantic memory. We analyzed the influence of education, gender and age on scores in a verbal fluency test using the animal category, and on the number of categories, clustering and switching. We examined 257 healthy participants (152 females and 105 males) with a mean age of 49.42 years (SD = 15.75) and a mean educational level of 5.58 years (SD = 4.25). We asked them to name as many animals as they could. Analysis of variance was performed to determine the effect of demographic variables. No significant effect of gender was observed for any of the measures. However, age seemed to influence the number of category changes, as expected for a sensitive frontal measure, after controlling for the effect of education. Educational level had a statistically significant effect on all measures except clustering. Subject performance (mean number of animals named) according to schooling was: illiterates, 12.1; 1 to 4 years, 12.3; 5 to 8 years, 14.0; 9 to 11 years, 16.7; and more than 11 years, 17.8. In all five educational groups, performance decreased over time (more items recalled during the first 15 s, followed by a progressive reduction until the fourth interval). We conclude that education had the greatest effect on category fluency in this Brazilian sample. Therefore, care must be taken when evaluating the performance of subjects with less education.
Antonio Oliveira-Neto
2012-01-01
Objective. To evaluate the performance of the Sequential Organ Failure Assessment (SOFA) score in cases of severe maternal morbidity (SMM). Design. Retrospective study of diagnostic validation. Setting. An obstetric intensive care unit (ICU) in Brazil. Population. 673 women with SMM. Main Outcome Measures. Mortality and SOFA score. Methods. Organ failure was evaluated according to the maximum score for each of the six SOFA components. The total maximum SOFA score was calculated using the poorest result of each component, reflecting the maximum degree of alteration in systemic organ function. Results. The highest total maximum SOFA scores were associated with mortality: 12.06 ± 5.47 for women who died and 1.87 ± 2.56 for survivors. There was also a significant correlation between the number of failing organs and maternal mortality, ranging from 0.2% (no failure) to 85.7% (≥3 organs). Analysis of the area under the receiver operating characteristic (ROC) curve (AUC) confirmed the excellent performance of the total maximum SOFA score for cases of SMM (AUC = 0.958). Conclusions. The total maximum SOFA score proved to be an effective tool for evaluating severity and estimating prognosis in cases of SMM. The maximum SOFA score may be used to conceptually define and stratify the degree of severity in cases of SMM.
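The total maximum SOFA score described above is computed by taking, for each of the six organ systems, the worst (maximum) daily component score over the stay and summing the six maxima. A small sketch of that aggregation; the daily values and dictionary keys are hypothetical, not data from the study:

```python
# Hypothetical daily SOFA component scores (0-4 each) for one ICU stay:
# respiration, coagulation, liver, cardiovascular, CNS, renal.
daily_scores = [
    {"resp": 2, "coag": 1, "liver": 0, "cardio": 3, "cns": 1, "renal": 2},
    {"resp": 3, "coag": 2, "liver": 1, "cardio": 2, "cns": 0, "renal": 3},
    {"resp": 1, "coag": 3, "liver": 0, "cardio": 4, "cns": 2, "renal": 1},
]

components = daily_scores[0].keys()
# Worst score reached by each organ system during the stay ...
max_per_component = {c: max(day[c] for day in daily_scores) for c in components}
# ... summed to give the total maximum SOFA score (range 0-24).
total_max_sofa = sum(max_per_component.values())
```

Because each component contributes its single worst value, two stays with very different day-to-day trajectories can share the same total maximum score; the measure captures peak severity, not duration.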
Description and validation of a scoring system for tomosynthesis in pulmonary cystic fibrosis
Vult von Steyern, Kristina; Bjoerkman-Burtscher, Isabella M.; Bozovic, Gracijela; Wiklund, Marie; Geijer, Mats [Skaane University Hospital, Lund University, Centre for Medical Imaging and Physiology, Lund (Sweden); Hoeglund, Peter [Skaane University Hospital, Competence Centre for Clinical Research, Lund (Sweden)
2012-12-15
To design and validate a scoring system for tomosynthesis (digital tomography) in pulmonary cystic fibrosis. A scoring system dedicated to tomosynthesis in pulmonary cystic fibrosis was designed. Three radiologists independently scored 88 pairs of chest radiographs and tomosynthesis examinations in 60 patients with cystic fibrosis and 7 oncology patients. Radiographs were scored according to the Brasfield scoring system and tomosynthesis examinations were scored using the new scoring system. Observer agreement for the tomosynthesis total score was almost perfect (square-weighted kappa >0.90), and generally substantial to almost perfect for subscores. Correlation between the tomosynthesis score and the Brasfield score was good for the three observers (Kendall's rank correlation tau 0.68, 0.77 and 0.78). Tomosynthesis was generally scored higher as a percentage of the maximum score. Observer agreement for the Brasfield total score was almost perfect (square-weighted kappa 0.80, 0.81 and 0.85). The tomosynthesis scoring system seems robust and correlates well with the Brasfield score. Compared with radiography, tomosynthesis is more sensitive to cystic fibrosis changes, especially bronchiectasis and mucus plugging, and the new tomosynthesis scoring system offers the possibility of more detailed and accurate scoring of disease severity. (orig.)
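Square-weighted (quadratic) kappa, the agreement statistic used above, penalises disagreements by the squared distance between ordinal categories, so near-misses cost little and distant disagreements cost a lot. A self-contained sketch; the two raters' scores are hypothetical, not the study's data:

```python
import numpy as np

def quadratic_weighted_kappa(r1, r2, n_categories):
    """Square-weighted Cohen's kappa for two raters' ordinal scores."""
    O = np.zeros((n_categories, n_categories))
    for a, b in zip(r1, r2):
        O[a, b] += 1            # observed joint frequency table
    O /= O.sum()
    # Expected table under independence of the two raters (outer product
    # of the marginal distributions).
    E = np.outer(O.sum(axis=1), O.sum(axis=0))
    # Quadratic disagreement weights: penalty grows with squared distance.
    idx = np.arange(n_categories)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_categories - 1) ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()

# Two hypothetical raters scoring 10 examinations on a 0-4 scale.
rater1 = [0, 1, 2, 2, 3, 4, 1, 0, 3, 4]
rater2 = [0, 1, 2, 3, 3, 4, 1, 1, 3, 4]
kappa = quadratic_weighted_kappa(rater1, rater2, 5)
```

With only two one-step disagreements out of ten ratings, the quadratic weighting keeps kappa in the "almost perfect" range even though raw agreement is 80%.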
Morino, Yoshihiro; Abe, Mitsuru; Morimoto, Takeshi; Kimura, Takeshi; Hayashi, Yasuhiko; Muramatsu, Toshiya; Ochiai, Masahiko; Noguchi, Yuichi; Kato, Kenichi; Shibata, Yoshisato; Hiasa, Yoshikazu; Doi, Osamu; Yamashita, Takehiro; Hinohara, Tomoaki; Tanaka, Hiroyuki; Mitsudo, Kazuaki
2011-02-01
This study sought to establish a model for grading lesion difficulty in interventional chronic total occlusion (CTO) treatment. Owing to uncertainty of success of the procedure and difficulties in selecting suitable cases for treatment, performance of interventional CTO remains infrequent. Data from 494 native CTO lesions were analyzed. To eliminate operator bias, the objective parameter of successful guidewire crossing within 30 min was set as an end point, instead of actual procedural success. All observations were randomly assigned to a derivation set and a validation set at a 2:1 ratio. The J-CTO (Multicenter CTO Registry of Japan) score was determined by assigning 1 point for each independent predictor of this end point and summing all points accrued. This value was then used to develop a model stratifying all lesions into 4 difficulty groups: easy (J-CTO score of 0), intermediate (score of 1), difficult (score of 2), and very difficult (score of ≥ 3). The set end point was achieved in 48.2% of lesions. Independent predictors included calcification, bending, blunt stump, occlusion length >20 mm, and previously failed lesion. Easy, intermediate, difficult, and very difficult groups, stratified by J-CTO score, demonstrated stepwise, proportioned, and highly reproducible differences in probability of successful guidewire crossing within 30 min (87.7%, 67.1%, 42.4%, and 10.0% in the derivation set and 92.3%, 58.3%, 34.8%, and 22.2% in the validation set, respectively). Areas under receiver-operator characteristic curves were comparable (derivation: 0.82 vs. validation: 0.76). This model predicted the probability of successful guidewire crossing within 30 min very well and can be applied for difficulty grading. Copyright © 2011 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
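The scoring rule described above is simple to express in code: one point per predictor present, with the sum mapped to the four difficulty strata. A sketch (the function and parameter names are my own, not from the registry):

```python
def j_cto_score(calcification: bool, bending: bool, blunt_stump: bool,
                occlusion_over_20mm: bool, previously_failed: bool) -> int:
    """J-CTO score: 1 point for each independent predictor present (0-5)."""
    return sum([calcification, bending, blunt_stump,
                occlusion_over_20mm, previously_failed])

def difficulty_group(score: int) -> str:
    """Map a J-CTO score to the four difficulty groups used in the study."""
    if score == 0:
        return "easy"
    if score == 1:
        return "intermediate"
    if score == 2:
        return "difficult"
    return "very difficult"  # score >= 3
```

For example, a calcified, bent lesion longer than 20 mm scores 3 points and falls in the "very difficult" group, where guidewire crossing within 30 minutes succeeded in only about 10-22% of lesions.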
Apgar Scores. As soon as your … the syringe, but is blue; her one-minute Apgar score would be 8, two points off because she …
Huber, Erika O; Meichtry, Andre; de Bie, Rob A; Bastiaenen, Caroline H
2016-02-01
The Chair Stand Test (CST) is a frequently used performance-based test in clinical studies involving individuals with knee osteoarthritis and demonstrates good reliability. The aim was to assess the construct validity of change scores of the CST compared to three other measures in patients before and after total knee replacement (TKR) surgery. The construct validity of change scores of the CST compared to the Timed Up and Go (TUG) test, the Knee Injury and Osteoarthritis Outcome Score questionnaire (KOOS, subscale ADL) and the isometric muscle strength test of the knee extensors (IMS sum) was measured 1-2 weeks before and 3 months after surgery. Change (%): CST = -4.45, TUG = -2.08, KOOS ADL = 43.90, IMS sum = -13.24. Correlations: CST-TUG = 0.56 (95% confidence interval (CI) 0.29, 0.74), CST-KOOS = -0.31 (95% CI -0.57, 0.01), CST-IMS sum = -0.11 (95% CI -0.42, 0.22). Comparison of pairwise correlations: CST-KOOS versus CST-TUG (p < 0.0004), CST-TUG versus CST-IMS sum (p < 0.0068), CST-KOOS versus CST-IMS sum (p < 0.3100). For patients undergoing TKR, the CST might not be an ideal measure to assess change between pre-surgery and 3 months post-surgery. Construct validity of change scores was close to zero, but the result might have been influenced by the relatively small, homogeneous sample and the chosen timespan of measurement. We ordered pairwise correlations based on the strength of correlation between the different instruments, which to our knowledge has not been done before. Copyright © 2015 Elsevier Ltd. All rights reserved.
Syrseloudis, Dimitrios; Secco, Gioel Gabrio; Barrero, Eduardo Alegria; Lindsay, Alistair C; Ghione, Matteo; Kilickesmez, Kadriye; Foin, Nicolas; Martos, Ramon; Di Mario, Carlo
2013-04-01
To investigate whether lesions of greater complexity are now being treated and to assess rates of procedural success per class of lesion complexity. Observational study. Despite impressive progress in treatment strategies and equipment, the success rate of percutaneous coronary intervention for chronic total occlusion (CTO) has remained relatively stable. 483 patients were consecutively treated for CTO from 2003 to 2012. The Multicenter CTO Registry of Japan (J-CTO) score was used to classify lesion complexity. The study population was subdivided into an early (period 1, n=288) and a late (period 2, n=195) period according to the routine implementation of novel techniques and advanced equipment. Period 2 was marked by more 'difficult' and 'very difficult' lesions (J-CTO grades 2 and 3) being attempted, with procedural success increasing from 68.4% to 88.1% (p…). Lesions of lower complexity (J-CTO grades 0 and 1) were less common, but with similarly high success rates (89.1% vs 96.6% (p=0.45) for easy, and 86.3% vs 86.1% (p=0.99) for intermediate). Period 2 was characterised by a trend for more successful procedures overall (by 6.1%, p=0.09). Procedural complications were similarly low in both periods. J-CTO score and technical era were identified as independent correlates of success in the total population by logistic regression analysis. Advanced CTO techniques and equipment have resulted in an increase in the successful treatment of highly complex lesions. The total success rate did not substantially improve, as it was counterbalanced by the increased rate at which complex lesions were attempted.
Fabián, Z. (Zdeněk)
2010-01-01
In this paper, we study a distribution-dependent correlation coefficient based on the concept of scalar score. This new measure of association of continuous random variables is compared by means of simulation experiments with the Pearson, Kendall and Spearman correlation coefficients.
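The scalar-score measure itself is not specified in this record, but the three benchmark coefficients it is compared against can be sketched in a few lines. The following is a minimal, self-contained illustration of Pearson, Spearman and Kendall (tau-a) correlations; the function names and toy data are ours, not the paper's.

```python
import numpy as np

def pearson(x, y):
    # Pearson's r: covariance normalized by both standard deviations.
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def ranks(x):
    # 1-based ranks, with ties replaced by their average rank.
    x = np.asarray(x, float)
    r = np.empty(len(x))
    r[np.argsort(x, kind="stable")] = np.arange(1, len(x) + 1)
    for v in np.unique(x):
        r[x == v] = r[x == v].mean()
    return r

def spearman(x, y):
    # Spearman's rho: Pearson correlation of the ranks.
    return pearson(ranks(x), ranks(y))

def kendall(x, y):
    # Kendall's tau-a: (concordant - discordant) / number of pairs.
    n, s = len(x), 0.0
    for i in range(n):
        for j in range(i + 1, n):
            s += np.sign((x[i] - x[j]) * (y[i] - y[j]))
    return float(s / (n * (n - 1) / 2))

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.4, 3.9, 5.1]   # monotone, nearly linear toy data
print(pearson(x, y), spearman(x, y), kendall(x, y))
```

On strictly monotone data such as this, the two rank-based coefficients reach 1 while Pearson's r stays slightly below it; a distribution-dependent measure like the scalar-score coefficient would be compared against these same baselines in simulation.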
Bhomi, K K; Subedi, N; Panta, P P
2017-01-01
International prostate symptom score is a validated questionnaire used to evaluate lower urinary tract symptoms in benign prostatic hyperplasia. Visual prostate symptom score is a new simplified symptom score with pictograms to evaluate the same. We evaluated the correlation of visual prostate symptom score with international prostate symptom score and uroflowmetry parameters in Nepalese male patients with lower urinary tract symptoms. Male patients aged ≥40 years attending the Urology clinic were enrolled in the study. They were given international prostate symptom score and visual prostate symptom score questionnaires to complete, with assistance provided whenever needed. Demographic data, examination findings and uroflowmetry parameters were noted. Correlation and regression analysis was used to identify the correlation between the two scoring systems and uroflowmetry parameters. Among the 66 patients enrolled, only 10 (15.15%) were able to understand English. There was a statistically significant correlation between total visual prostate symptom score and international prostate symptom score (r = 0.822). Correlations between individual scores of the two scoring systems relating to force of urinary stream, frequency, nocturia and quality of life were also statistically significant. There was also a statistically significant correlation of both scores with maximum flow rate and average flow rate. There is a statistically significant correlation of visual prostate symptom score with international prostate symptom score and uroflowmetry parameters. IPSS can be replaced with the simpler VPSS in the evaluation of lower urinary tract symptoms in elderly male patients.
Maximum Autocorrelation Factorial Kriging
Nielsen, Allan Aasbjerg; Conradsen, Knut; Pedersen, John L.
2000-01-01
This paper describes maximum autocorrelation factor (MAF) analysis, maximum autocorrelation factorial kriging, and its application to irregularly sampled stream sediment geochemical data from South Greenland. Kriged MAF images are compared with kriged images of varimax rotated factors from...
Estimating Decision Indices Based on Composite Scores
Knupp, Tawnya Lee
2009-01-01
The purpose of this study was to develop an IRT model that would enable the estimation of decision indices based on composite scores. The composite scores, defined as a combination of unidimensional test scores, were either a total raw score or an average scale score. Additionally, estimation methods for the normal and compound multinomial models…
Gilberto Silvério da Silva
2007-06-01
Full Text Available This study evaluated the tolerance capacity and the impairment state of the Atibaia River, considering the threat to aquatic life posed by Ammonia, which represents one of the main risks to the aquatic communities in the Atibaia River. With this aim, the Total Maximum Daily Load (TMDL) method, from the United States Environmental Protection Agency (EPA), was applied. The results revealed that Ammonia loads increased progressively along the Atibaia River, mainly due to point sources. The daily Ammonia loads ranged from 30 to 5000 kg NH3. The tolerance capacity of the waters of the Atibaia River to protect aquatic life against the toxic effects of Ammonia has been violated in reaches near its mouth. The impairment of these waters was more intense during the dry season. This study showed that the untreated domestic sewage of a population of approximately 250 thousand inhabitants of the city of Campinas, reaching the river via the Ribeirão Anhumas, is the main source of Ammonia in the Atibaia River basin, despite the large number of industries present there.
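The load estimates underlying a TMDL analysis reduce, in their simplest form, to concentration times flow with a unit conversion. A hedged sketch of that calculation (the function name and example values are illustrative, not taken from the study):

```python
def daily_load_kg(conc_mg_per_l, flow_m3_per_s):
    """Daily pollutant load in kg/day.

    mg/L equals g/m^3 and a day has 86,400 s, so
    kg/day = C (g/m^3) * Q (m^3/s) * 86400 (s/day) / 1000 (g/kg)
           = C * Q * 86.4
    """
    return conc_mg_per_l * flow_m3_per_s * 86.4

# e.g. 2 mg/L of ammonia in a river discharging 10 m^3/s
print(daily_load_kg(2.0, 10.0))  # about 1728 kg NH3 per day
```

The study's reported range of 30 to 5000 kg NH3 per day corresponds to such products of measured ammonia concentration and river discharge at each sampling point.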
Rice, J P; Saccone, N L; Corbett, J
2001-01-01
The lod score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential, so that pedigrees or lod curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders, where the maximum lod score (MLS) statistic shares some of the advantages of the traditional lod score approach but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the lod score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
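For a phase-known marker with r recombinants among n informative meioses, the lod score at recombination fraction theta is Z(theta) = log10[theta^r (1-theta)^(n-r) / 0.5^n], and the MLS statistic maximizes Z over theta. A small sketch (the grid search is our simplification of the maximization):

```python
import math

def lod(theta, n_rec, n_nonrec):
    """Phase-known lod score Z(theta) = log10[L(theta)/L(0.5)]."""
    n = n_rec + n_nonrec
    return (n_rec * math.log10(theta)
            + n_nonrec * math.log10(1.0 - theta)
            - n * math.log10(0.5))

def max_lod(n_rec, n_nonrec, grid=1000):
    """Maximum lod score (MLS) over a grid of theta in (0, 0.5]."""
    return max((lod(i / (2.0 * grid), n_rec, n_nonrec), i / (2.0 * grid))
               for i in range(1, grid + 1))

# 2 recombinants among 20 informative meioses
mls, theta_hat = max_lod(2, 18)
print(round(mls, 2), theta_hat)  # peaks at theta = 0.1
```

Because lod scores are log-likelihood ratios, scores from independent pedigrees simply add, which is what makes the sequential pooling described above possible.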
Aoki, Masahiko; Sato, Mariko; Hirose, Katsumi; Akimoto, Hiroyoshi; Kawaguchi, Hideo; Hatayama, Yoshiomi; Ono, Shuichi; Takai, Yoshihiro
2015-04-22
Radiation-induced rib fracture after stereotactic body radiotherapy (SBRT) for lung cancer has recently been reported. However, the incidence of radiation-induced rib fracture after SBRT using moderate fraction sizes with a long-term follow-up time has not been clarified. We examined the incidence and risk factors of radiation-induced rib fracture after SBRT using moderate fraction sizes in patients with peripherally located lung tumors. During 2003-2008, 41 patients with 42 lung tumors were treated with SBRT to 54-56 Gy in 9-7 fractions. The endpoint of the study was radiation-induced rib fracture detected by CT scan after the treatment. All ribs receiving more than 80% of the prescribed dose were selected and contoured to build dose-volume histograms (DVHs). Comparisons of several factors obtained from the DVHs and of the probabilities of rib fracture calculated by the Kaplan-Meier method were performed. Median follow-up time was 68 months. Among 75 contoured ribs, 23 rib fractures were observed in 34% of the patients during 16-48 months after SBRT; however, no patient complained of chest wall pain. The 4-year probabilities of rib fracture for maximum rib dose (Dmax) more than and less than 54 Gy were 47.7% and 12.9% (p = 0.0184), and for fraction sizes of 6, 7 and 8 Gy were 19.5%, 31.2% and 55.7% (p = 0.0458), respectively. Other factors, such as D2cc, mean rib dose, V10-55, age, sex, and planning target volume, were not significantly different. The doses and fractionations used in this study resulted in no clinically significant rib fractures in this population, but higher Dmax and larger dose per fraction resulted in an increase in asymptomatic grade 1 rib fractures.
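The fracture probabilities in the study were obtained with the Kaplan-Meier method; a bare-bones version of that estimator can be sketched as follows (the data values are invented, not the study's):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function S(t).

    times  : follow-up time for each subject
    events : 1 if the event (e.g. rib fracture) occurred, 0 if censored
    Returns (time, S(t)) steps at the distinct event times.
    """
    data = sorted(zip(times, events))
    at_risk, s, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        group = [e for tt, e in data if tt == t]   # subjects tied at t
        d = sum(group)                             # events at time t
        if d > 0:
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        at_risk -= len(group)
        i += len(group)
    return curve

# fractures at months 16 and 30; censoring at 24, 40 and 48 (invented data)
curve = kaplan_meier([16, 24, 30, 40, 48], [1, 0, 1, 0, 0])
print(curve)
```

The cumulative fracture probability by time t, as reported in the abstract, is 1 - S(t).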
Nakada, Masao; Okuno, Jun'ichi; Yokoyama, Yusuke
2016-02-01
Inference of globally averaged eustatic sea level (ESL) rise since the Last Glacial Maximum (LGM) highly depends on the interpretation of relative sea level (RSL) observations at Barbados and Bonaparte Gulf, Australia, which are sensitive to the viscosity structure of Earth's mantle. Here we examine the RSL changes at the LGM for Barbados and Bonaparte Gulf (RSL_L^Bar and RSL_L^Bon), the differential RSL for the two sites (ΔRSL_L^{Bar,Bon}) and the rate of change of the degree-two harmonic of Earth's geopotential due to the glacial isostatic adjustment (GIA) process (GIA-induced J̇_2) to infer the ESL component and the viscosity structure of Earth's mantle. The differential RSL, ΔRSL_L^{Bar,Bon}, and the GIA-induced J̇_2 are dominantly sensitive to the lower-mantle viscosity, and nearly insensitive to the upper-mantle rheological structure and GIA ice models with an ESL component of about 120-130 m. The comparison between the predicted and observationally derived ΔRSL_L^{Bar,Bon} indicates a lower-mantle viscosity higher than ~2 × 10^22 Pa s, and the observationally derived GIA-induced J̇_2 of -(6.0-6.5) × 10^-11 yr^-1 indicates two permissible solutions for the lower mantle, ~10^22 and (5-10) × 10^22 Pa s. That is, the effective lower-mantle viscosity inferred from these two observational constraints is (5-10) × 10^22 Pa s. The LGM RSL changes at both sites, RSL_L^Bar and RSL_L^Bon, are also sensitive to the ESL component and upper-mantle viscosity as well as the lower-mantle viscosity. The permissible upper-mantle viscosity increases with decreasing ESL component due to the sensitivity of the LGM sea level at Bonaparte Gulf (RSL_L^Bon) to the upper-mantle viscosity, and the inferred upper-mantle viscosity for adopted lithospheric thicknesses of 65 and 100 km is (1-3) × 10^20 Pa s for ESL ~130 m and (4-10) × 10^20 Pa s for ESL ~125 m. The former solution of (1-3) × 10^20
彭颂; 胥伯勇; 曹力
2015-01-01
BACKGROUND: Unicompartmental knee arthroplasty for osteoarthritis is accepted by a growing number of scholars, although some still consider total knee arthroplasty the better choice. OBJECTIVE: To compare knee scores before and after replacement between unicompartmental and total knee arthroplasty patients. METHODS: Fifty-three patients with knee osteoarthritis who underwent unicompartmental knee arthroplasty at the First Affiliated Hospital of Xinjiang Medical University between March 2013 and November 2014 and completed follow-up were enrolled, together with 53 followed-up patients who underwent total knee arthroplasty over the same period. Knee scores (KSS, WOMAC, OKS and HSS) were recorded before replacement and at the final follow-up, and the maximum knee flexion angle and patient satisfaction were recorded at the final follow-up. RESULTS AND CONCLUSION: After excluding patients with complications, knee scores at the final follow-up were significantly improved over pre-replacement values in both groups (P < 0.05), with no significant difference between the groups (P > 0.05). Maximum range of motion at the final follow-up was significantly larger in the unicompartmental knee arthroplasty group than in the total knee arthroplasty group (P < 0.05). Patient satisfaction was similar between the two groups. Moreover, the association between WOMAC scores and OKS scores was high. These results suggest that unicompartmental knee arthroplasty can improve a patient's quality of life as well as total knee arthroplasty does, provided that physicians observe strict indications, standard operative technique and good functional exercise after replacement, but its long-term outcomes still need further investigation.
Maximum Autocorrelation Factorial Kriging
Nielsen, Allan Aasbjerg; Conradsen, Knut; Pedersen, John L.; Steenfelt, Agnete
2000-01-01
This paper describes maximum autocorrelation factor (MAF) analysis, maximum autocorrelation factorial kriging, and its application to irregularly sampled stream sediment geochemical data from South Greenland. Kriged MAF images are compared with kriged images of varimax rotated factors from an ordinary non-spatial factor analysis, and they are interpreted in a geological context. It is demonstrated that MAF analysis contrary to ordinary non-spatial factor analysis gives an objective discrimina...
Maximum likely scale estimation
Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo
2005-01-01
A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or ...
Clean Water Act Approved Total Maximum Daily Load (TMDL) Documents
U.S. Environmental Protection Agency — Information from Approved and Established TMDL Documents as well as TMDLs that have been Withdrawn. This includes the pollutants identified in the TMDL Document, the...
Task 0715: Army Chesapeake Bay Total Maximum Daily Load Pilots
2011-05-01
NDCEE/CTC The NDCEE is operated by: Office of the Assistant Secretary of the Army for Installations, Energy and Environment Technology Transition...Center for Energy and Environment 4) Guidebook and Training Development/Delivery (Future) Guidebook – Capture lessons learned from implementation of
Maximum information photoelectron metrology
Hockett, P; Wollenhaupt, M; Baumert, T
2015-01-01
Photoelectron interferograms, manifested in photoelectron angular distributions (PADs), are a high-information, coherent observable. In order to obtain the maximum information from angle-resolved photoionization experiments it is desirable to record the full, 3D, photoelectron momentum distribution. Here we apply tomographic reconstruction techniques to obtain such 3D distributions from multiphoton ionization of potassium atoms, and fully analyse the energy and angular content of the 3D data. The PADs obtained as a function of energy indicate good agreement with previous 2D data and detailed analysis [Hockett et al., Phys. Rev. Lett. 112, 223001 (2014)] over the main spectral features, but also indicate unexpected symmetry-breaking in certain regions of momentum space, thus revealing additional continuum interferences which cannot otherwise be observed. These observations reflect the presence of additional ionization pathways and, most generally, illustrate the power of maximum information measurements of th...
Maximum Likelihood Associative Memories
Gripon, Vincent; Rabbat, Michael
2013-01-01
Associative memories are structures that store data in such a way that it can later be retrieved given only a part of its content -- a sort-of error/erasure-resilience property. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. We derive minimum residual error rates when the data stored comes from a uniform binary source. Second, we determine the minimum amo...
Maximum likely scale estimation
Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo
2005-01-01
A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or having different derivative orders. Although the principle is applicable to a wide variety of image models, the main focus here is on the Brownian model and its use for scale selection in natural images. Furthermore, in the examples provided, the simplifying assumption is made that the behavior of the measurements is completely characterized by all moments up to second order.
F. Topsøe
2001-09-01
Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
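The Mean Energy Model mentioned above has a closed-form solution: the entropy-maximizing distribution under a mean-energy constraint is the Gibbs distribution p_i proportional to exp(-lambda * E_i), with lambda chosen so the constraint holds. A small numerical sketch (bisection over lambda is our implementation choice, not part of the theory):

```python
import math

def gibbs(energies, lam):
    # Gibbs distribution: p_i proportional to exp(-lam * E_i).
    w = [math.exp(-lam * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

def mean_energy(energies, lam):
    return sum(p * e for p, e in zip(gibbs(energies, lam), energies))

def maxent_mean_energy(energies, target, lo=-50.0, hi=50.0, iters=200):
    """Bisect for lambda so that the Gibbs mean energy hits `target`.

    mean_energy is decreasing in lambda, so bisection works whenever
    target lies strictly between min(energies) and max(energies).
    """
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target:
            lo = mid        # mean too high -> increase lambda
        else:
            hi = mid
    return gibbs(energies, 0.5 * (lo + hi))

# three states with energies 0, 1, 2 and target mean energy 0.5
p = maxent_mean_energy([0.0, 1.0, 2.0], 0.5)
print([round(q, 4) for q in p])
```

Because the mean energy is monotone in lambda, the bisection converges whenever the target lies strictly between the smallest and largest energy.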
Regularized maximum correntropy machine
Wang, Jim Jing-Yan
2015-02-12
In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing transitional loss functions are sensitive to the noisy and outlying labels of training samples, because the transitional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with transitional loss functions.
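The correntropy objective at the heart of the MCC framework is the mean of a Gaussian kernel evaluated on prediction errors, which is what makes it robust to outlying labels. A minimal sketch (sigma and the toy data are ours, not the paper's):

```python
import math

def correntropy(a, b, sigma=1.0):
    """Empirical correntropy of two equal-length sequences: the mean
    of a Gaussian kernel applied to their element-wise differences."""
    return sum(math.exp(-(x - y) ** 2 / (2.0 * sigma ** 2))
               for x, y in zip(a, b)) / len(a)

clean = [0.0, 1.0, 2.0, 3.0]
noisy = [0.0, 1.0, 2.0, 100.0]   # one grossly mislabeled entry
print(correntropy(clean, clean), correntropy(clean, noisy))
```

A single gross outlier leaves the correntropy near its clean value, whereas a squared-error loss would be dominated by it; maximizing correntropy therefore down-weights outlying labels, which is the behavior the paper exploits.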
Cardiorespiratory Fitness of Inmates of a Maximum Security Prison ...
USER
Maximum Security Prison; and also to determine the effects of age, gender, and period of incarceration on CRF. A total of 247 apparently healthy inmates of Maiduguri Maximum Security ... with different types of cardiovascular and metabolic.
Duality of Maximum Entropy and Minimum Divergence
Shinto Eguchi
2014-06-01
Full Text Available We discuss a special class of generalized divergence measures by the use of generator functions. Any divergence measure in the class is separated into the difference between cross and diagonal entropy. The diagonal entropy measure in the class associates with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via minimization, for arbitrarily giving a statistical model. The dualistic relationship between the maximum entropy model and the minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized to be totally geodesic with respect to the linear connection associated with the divergence. A natural extension for the classical theory for the maximum likelihood method under the maximum entropy model in terms of the Boltzmann-Gibbs-Shannon entropy is given. We discuss the duality in detail for Tsallis entropy as a typical example.
Equalized near maximum likelihood detector
2012-01-01
This paper presents new detector that is used to mitigate intersymbol interference introduced by bandlimited channels. This detector is named equalized near maximum likelihood detector which combines nonlinear equalizer and near maximum likelihood detector. Simulation results show that the performance of equalized near maximum likelihood detector is better than the performance of nonlinear equalizer but worse than near maximum likelihood detector.
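The near maximum likelihood detector itself is not specified in this record; as a reference point, exact ML sequence detection over a short block can be sketched by exhaustive search (the two-tap channel and block length below are illustrative assumptions):

```python
import itertools

def ml_detect(received, channel, n_bits):
    """Exhaustive maximum likelihood sequence detection: choose the
    +/-1 sequence whose convolution with the channel taps is closest
    in squared error to the received samples."""
    best, best_err = None, float("inf")
    for bits in itertools.product((-1.0, 1.0), repeat=n_bits):
        # linear convolution of the candidate sequence with the channel
        y = [sum(channel[k] * bits[i - k]
                 for k in range(len(channel)) if 0 <= i - k < n_bits)
             for i in range(n_bits + len(channel) - 1)]
        err = sum((r - yi) ** 2 for r, yi in zip(received, y))
        if err < best_err:
            best, best_err = bits, err
    return best

channel = [1.0, 0.5]              # assumed two-tap bandlimited channel
rx = [1.0, -0.5, 0.5, 1.5, 0.5]   # noiseless output for bits +1 -1 +1 +1
print(ml_detect(rx, channel, 4))
```

The "near ML" and equalized variants discussed in the abstract trade this exponential search for cheaper approximations; the sketch only fixes the benchmark they approximate.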
黄威; 孔荣; 李守民; 方诗元; 禹德万; 朱晨; 李乾明; 徐泽; 王磊
2015-01-01
OBJECTIVE: Through a preliminary cadaveric study, to explore the relationship between the femoral posterior condylar offset and maximum knee flexion after total knee arthroplasty. METHODS: Total knee arthroplasty was performed on six cadaveric knees (six specimens). Using an anterior-referencing femoral measurement guide, the anterior condylar resection plane was kept unchanged while the guide's drill setting was adjusted to determine the posterior condylar resection plane; the tibial resection plane was then adjusted according to the change in the amount of posterior condylar resection, keeping the knee flexion gap and the posterior tibial slope constant. Trial femoral components of the corresponding sizes were implanted in turn, true lateral 1:1 radiographs were taken at maximum knee flexion, and the posterior condylar offset and maximum knee flexion angle were measured, recorded and analyzed statistically. RESULTS: By the Pearson and Spearman correlation tests, different posterior condylar offsets in the same specimen showed no correlation with postoperative maximum knee flexion (P > 0.05). CONCLUSION: With a posterior-stabilized prosthesis in total knee arthroplasty, the size of the femoral posterior condylar offset is not correlated with postoperative maximum knee flexion.
Cheeseman, Peter; Stutz, John
2005-01-01
A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty, and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
Meijer, Rob R.
2003-01-01
This book discusses how to obtain test scores and, in particular, how to obtain test scores from tests that consist of a combination of multiple choice and open-ended questions. The strength of the book is that scoring solutions are presented for a diversity of real world scoring problems. (SLD)
陈树国
2012-01-01
Huai'an adopted a global budget with score-based payment by disease. This payment method embodies the fund operation principle of "fixed income support, balance of payments, with a slight surplus". Over the eight years since implementation, the average annual growth of per-case medical expenses has been only 2.79%, lower than the national and provincial average growth over the same period. Despite increased inpatient admissions, a rising share of outpatient specific-project costs and higher benefit standards, the current balance rate of the pooled fund has remained at a reasonable level of 2% to 3%.
Lin, Miao-Hsiang; Hsiung, Chao A.
1994-01-01
Two simple empirical approximate Bayes estimators are introduced for estimating domain scores under binomial and hypergeometric distributions respectively. Criteria are established regarding use of these functions over maximum likelihood estimation counterparts. (SLD)
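The record does not give the estimators' formulas, but the flavor of an approximate Bayes domain-score estimator under a binomial model can be sketched as beta-binomial shrinkage toward the group mean. This is a generic illustration in that spirit, not the paper's exact functions:

```python
def mle_domain_score(x, n):
    """Maximum likelihood estimate: the observed proportion correct."""
    return x / n

def approx_bayes_domain_score(x, n, mu, var):
    """Beta-binomial shrinkage toward the group mean (a generic
    sketch, not the paper's estimator): moment-match a Beta(a, b)
    prior to the group mean and variance, then return the posterior
    mean (x + a) / (n + a + b)."""
    k = mu * (1.0 - mu) / var - 1.0    # a + b from moment matching
    a, b = mu * k, (1.0 - mu) * k
    return (x + a) / (n + a + b)

# 7 of 10 items correct; group mean 0.6, between-person variance 0.02
print(mle_domain_score(7, 10), approx_bayes_domain_score(7, 10, 0.6, 0.02))
```

The shrinkage estimate pulls the raw proportion 0.7 toward the group mean 0.6, which is the behavior that makes such estimators outperform the MLE on short tests.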
Heteroscedastic one-factor models and marginal maximum likelihood estimation
Hessen, D.J.; Dolan, C.V.
2009-01-01
In the present paper, a general class of heteroscedastic one-factor models is considered. In these models, the residual variances of the observed scores are explicitly modelled as parametric functions of the one-dimensional factor score. A marginal maximum likelihood procedure for parameter estimation is presented.
Lower bounds to the reliabilities of factor score estimators
Hessen, D.J.|info:eu-repo/dai/nl/256041717
2017-01-01
Under the general common factor model, the reliabilities of factor score estimators might be of more interest than the reliability of the total score (the unweighted sum of item scores). In this paper, lower bounds to the reliabilities of Thurstone's factor score estimators and Bartlett's factor score estimators are derived.
Optimal cutting scores using a linear loss function
Linden, van der Wim J.; Mellenbergh, Gideon J.
1977-01-01
The situation is considered in which a total score on a test is used for classifying examinees into two categories: "accepted" (with scores above a cutting score on the test) and "not accepted" (with scores below the cutting score). A value on the latent variable is fixed in advance; examinees above
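The optimal cutting score under a linear loss can be illustrated empirically: score each candidate cutoff by the average linear loss it incurs over observed (test score, criterion) pairs and take the minimizer. The loss specification below is a hypothetical stand-in for the paper's, and the data are invented:

```python
def expected_loss(cut, scores, latents, tau0, cost_fp=1.0, cost_fn=1.0):
    """Average linear loss of a cutting score over observed pairs.

    Hypothetical losses in the spirit of the paper, not its exact
    specification: accepting an examinee whose latent value tau lies
    below the criterion tau0 costs cost_fp * (tau0 - tau); rejecting
    one at or above it costs cost_fn * (tau - tau0).
    """
    total = 0.0
    for x, tau in zip(scores, latents):
        if x >= cut and tau < tau0:
            total += cost_fp * (tau0 - tau)
        elif x < cut and tau >= tau0:
            total += cost_fn * (tau - tau0)
    return total / len(scores)

def best_cut(scores, latents, tau0):
    # search candidate cutting scores on the observed score scale
    cands = sorted(set(scores)) + [max(scores) + 1]
    return min(cands, key=lambda c: expected_loss(c, scores, latents, tau0))

scores = [10, 12, 15, 18, 20, 22, 25, 28]
latents = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
print(best_cut(scores, latents, 0.55))
```

Because the loss is linear in the latent variable, misclassifications far from the criterion cost proportionally more than borderline ones, which is the key difference from a simple threshold (0-1) loss.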
European conformation and fat scores have no relationship with eating quality.
Bonny, S P F; Pethick, D W; Legrand, I; Wierzbicki, J; Allen, P; Farmer, L J; Polkinghorne, R J; Hocquette, J-F; Gardner, G E
2016-06-01
European conformation and fat grades are a major factor determining carcass value throughout Europe. The relationships between these scores and sensory scores were investigated. A total of 3786 French, Polish and Irish consumers evaluated steaks, grilled to a medium doneness, according to protocols of the 'Meat Standards Australia' system, from 18 muscles representing 455 local, commercial cattle from commercial abattoirs. A mixed linear effects model was used for the analysis. There was a negative relationship between juiciness and European conformation score. For the other sensory scores, a maximum of three muscles out of a possible 18 demonstrated negative effects of conformation score on sensory scores. There was a positive effect of European fat score on three individual muscles. However, this was accounted for by marbling score. Thus, while the European carcass classification system may indicate yield, it has no consistent relationship with sensory scores at a carcass level that is suitable for use in a commercial system. The industry should consider using an additional system related to eating quality to aid in the determination of the monetary value of carcasses, rewarding eating quality in addition to yield.
Instance Optimality of the Adaptive Maximum Strategy
L. Diening; C. Kreuzer; R. Stevenson
2016-01-01
In this paper, we prove that the standard adaptive finite element method with a (modified) maximum marking strategy is instance optimal for the total error, being the square root of the squared energy error plus the squared oscillation. This result will be derived in the model setting of Poisson's equation.
Subgroup Balancing Propensity Score
DONG, JING; Zhang, Junni L; Li, Fan
2017-01-01
We investigate the estimation of subgroup treatment effects with observational data. Existing propensity score matching and weighting methods are mostly developed for estimating overall treatment effect. Although the true propensity score should balance covariates for the subgroup populations, the estimated propensity score may not balance covariates for the subgroup samples. We propose the subgroup balancing propensity score (SBPS) method, which selects, for each subgroup, to use either the ...
The relationship between second-year medical students' OSCE scores and USMLE Step 1 scores.
Simon, Steven R; Volkan, Kevin; Hamann, Claus; Duffey, Carol; Fletcher, Suzanne W
2002-09-01
The relationship between objective structured clinical examinations (OSCEs) and standardized tests is not well known. We linked second-year medical students' physical diagnosis OSCE scores from 1998, 1999 and 2000 (n = 355) with demographic information, Medical College Admission Test (MCAT) scores, and United States Medical Licensing Examination (USMLE) Step 1 scores. The correlation coefficient for the total OSCE score with the USMLE Step 1 score was 0.41. OSCE station scores accounted for approximately 22% of the variability in USMLE Step 1 scores. A second-year OSCE in physical diagnosis is correlated with scores on the USMLE Step 1 exam, with skills that foreshadow the clinical clerkships most predictive of USMLE scores. This correlation suggests predictive validity of this OSCE and supports the use of OSCEs early in medical school.
medlineplus.gov/ency/article/003483.htm — Total protein. The total protein test measures the total amount of two classes of proteins found in the blood: albumin and globulin.
2015-10-01
The Apgar score provides an accepted and convenient method for reporting the status of the newborn infant immediately after birth and the response to resuscitation if needed. The Apgar score alone cannot be considered as evidence of, or a consequence of, asphyxia; does not predict individual neonatal mortality or neurologic outcome; and should not be used for that purpose. An Apgar score assigned during resuscitation is not equivalent to a score assigned to a spontaneously breathing infant. The American Academy of Pediatrics and the American College of Obstetricians and Gynecologists encourage use of an expanded Apgar score reporting form that accounts for concurrent resuscitative interventions.
Maximum magnitude earthquakes induced by fluid injection
McGarr, Arthur F.
2014-01-01
Analysis of numerous case histories of earthquake sequences induced by fluid injection at depth reveals that the maximum magnitude appears to be limited according to the total volume of fluid injected. Similarly, the maximum seismic moment seems to have an upper bound proportional to the total volume of injected fluid. Activities involving fluid injection include (1) hydraulic fracturing of shale formations or coal seams to extract gas and oil, (2) disposal of wastewater from these gas and oil activities by injection into deep aquifers, and (3) the development of enhanced geothermal systems by injecting water into hot, low-permeability rock. Of these three operations, wastewater disposal is observed to be associated with the largest earthquakes, with maximum magnitudes sometimes exceeding 5. To estimate the maximum earthquake that could be induced by a given fluid injection project, the rock mass is assumed to be fully saturated, brittle, to respond to injection with a sequence of earthquakes localized to the region weakened by the pore pressure increase of the injection operation and to have a Gutenberg-Richter magnitude distribution with a b value of 1. If these assumptions correctly describe the circumstances of the largest earthquake, then the maximum seismic moment is limited to the volume of injected liquid times the modulus of rigidity. Observations from the available case histories of earthquakes induced by fluid injection are consistent with this bound on seismic moment. In view of the uncertainties in this analysis, however, this should not be regarded as an absolute physical limit.
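The bound stated above translates directly into a maximum magnitude estimate: cap the seismic moment at the modulus of rigidity times the injected volume, then convert moment to moment magnitude with the standard Hanks-Kanamori relation. The 30 GPa rigidity below is an assumed typical crustal value, not a figure from the abstract:

```python
import math

def max_moment(volume_m3, rigidity_pa=3.0e10):
    """Upper bound on seismic moment (N*m): modulus of rigidity times
    total injected volume. 30 GPa is an assumed typical crustal value."""
    return rigidity_pa * volume_m3

def moment_magnitude(m0):
    """Hanks-Kanamori moment magnitude from seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# e.g. 100,000 m^3 of injected wastewater
print(round(moment_magnitude(max_moment(1.0e5)), 2))
```

Because the bound grows with the logarithm of injected volume, each additional magnitude unit requires roughly a 30-fold increase in total injected volume.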
Diet Quality Scores of Australian Adults Who Have Completed the Healthy Eating Quiz.
Williams, Rebecca L; Rollo, Megan E; Schumacher, Tracy; Collins, Clare E
2017-08-15
Higher scores obtained using diet quality and variety indices are indicators of more optimal food and nutrient intakes and lower chronic disease risk. The aim of this paper is to describe the overall diet quality and variety in a sample of Australian adults who completed an online diet quality self-assessment tool, the Healthy Eating Quiz. The Healthy Eating Quiz takes approximately five minutes to complete online and computes user responses into a total diet quality score (out of a maximum of 73 points), then categorizes them into feedback groups, from 'needs work' upward. Healthy Eating Quiz scores were higher in those aged 45-75 years compared to 16-44 years. While Healthy Eating Quiz data indicate that individuals receiving feedback on how to improve their score can improve their diet quality, there is a need for further nutrition promotion interventions in Australian adults.
The Maximum Resource Bin Packing Problem
Boyar, J.; Epstein, L.; Favrholdt, L.M.
2006-01-01
Usually, for bin packing problems, we try to minimize the number of bins used or, in the case of the dual bin packing problem, maximize the number or total size of accepted items. This paper presents results for the opposite problems, where we would like to maximize the number of bins used. We analyze the algorithms First-Fit-Increasing and First-Fit-Decreasing for the maximum resource variant of classical bin packing. For the on-line variant, we define maximum resource variants of classical and dual bin packing. For dual bin packing, no on-line algorithm is competitive. For classical bin packing, we find the competitive ratio of various natural algorithms. We study the general versions of the problems as well as the parameterized versions where there is an upper bound of 1/k on the item sizes, for some integer k.
Akai, Takanori; Taniguchi, Daigo; Oda, Ryo; Asada, Maki; Toyama, Shogo; Tokunaga, Daisaku; Seno, Takahiro; Kawahito, Yutaka; Fujii, Yosuke; Ito, Hirotoshi; Fujiwara, Hiroyoshi; Kubo, Toshikazu
2016-04-01
Contrast-enhanced magnetic resonance imaging with maximum intensity projection (MRI-MIP) is an easy, useful imaging method to evaluate synovitis in rheumatoid hands. However, the prognosis of synovitis-positive joints on MRI-MIP has not been clarified. The aim of this study was to evaluate the relationship between synovitis visualized by MRI-MIP and joint destruction on X-rays in rheumatoid hands. The wrists, metacarpophalangeal (MP) joints, and proximal interphalangeal (PIP) joints of both hands (500 joints in total) were evaluated in 25 rheumatoid arthritis (RA) patients. Synovitis was scored from grade 0 to 2 on the MRI-MIP images. The Sharp/van der Heijde score and Larsen grade were used for radiographic evaluation. The relationships between the MIP score and the progression of radiographic scores, and between the MIP score and bone marrow edema on MRI, were analyzed using the trend test. As the MIP score increased, the Sharp/van der Heijde score and Larsen grade showed more severe progression. The rate of bone marrow edema-positive joints also increased with higher MIP scores. MRI-MIP imaging of RA hands is a clinically useful method that allows easy, semi-quantitative evaluation of synovitis and can be used to predict joint destruction.
Tel, G.
1993-01-01
We define the notion of total algorithms for networks of processes. A total algorithm enforces that a "decision" is taken by a subset of the processes, and that participation of all processes is required to reach this decision. Total algorithms are an important building block in the design of distributed algorithms.
SLACK, CHARLES W.
Reinforcement and role-reversal techniques are used in the SCORE project, a low-cost program of delinquency prevention for hard-core teenage street corner boys. Committed to the belief that the boys have the potential for ethical behavior, the SCORE worker follows B.F. Skinner's theory of operant conditioning and reinforces the delinquent's good…
Evaluation of pliers' grip spans in the maximum gripping task and sub-maximum cutting task.
Kim, Dae-Min; Kong, Yong-Ku
2016-12-01
A total of 25 males participated to investigate the effects of the grip spans of pliers on the total grip force, individual finger forces and muscle activities in the maximum gripping task and wire-cutting tasks. In the maximum gripping task, results showed that the 50-mm grip span had significantly higher total grip strength than the other grip spans. In the cutting task, the 50-mm grip span also showed significantly higher grip strength than the 65-mm and 80-mm grip spans, whereas the muscle activities showed a higher value at 80-mm grip span. The ratios of cutting force to maximum grip strength were also investigated. Ratios of 30.3%, 31.3% and 41.3% were obtained by grip spans of 50-mm, 65-mm, and 80-mm, respectively. Thus, the 50-mm grip span for pliers might be recommended to provide maximum exertion in gripping tasks, as well as lower maximum-cutting force ratios in the cutting tasks.
OECD Maximum Residue Limit Calculator
With the goal of harmonizing the calculation of maximum residue limits (MRLs) across the Organisation for Economic Cooperation and Development, the OECD has developed an MRL Calculator. View the calculator.
Rudolf, Frauke; Joaquim, Luis Carlos; Vieira, Cesaltina
2013-01-01
Background: This study was carried out in Guinea-Bissau's capital Bissau among inpatients and outpatients attending for tuberculosis (TB) treatment within the study area of the Bandim Health Project, a Health and Demographic Surveillance Site. Our aim was to assess the variability between 2 physicians in performing the Bandim tuberculosis score (TBscore), a clinical severity score for pulmonary TB (PTB), and to compare it to the Karnofsky performance score (KPS). Method: From December 2008 to July 2009 we assessed the TBscore and the KPS of 100 PTB patients at inclusion in the TB cohort and…
Reporting Valid and Reliable Overall Scores and Domain Scores
Yao, Lihua
2010-01-01
In educational assessment, overall scores obtained by simply averaging a number of domain scores are sometimes reported. However, simply averaging the domain scores ignores the fact that different domains have different score points, that scores from those domains are related, and that at different score points the relationship between overall…
Calhoun, William; Dargahi-Noubary, G. R.; Shi, Yixun
2002-01-01
The widespread interest in sports in our culture provides an excellent opportunity to catch students' attention in mathematics and statistics classes. One mathematically interesting aspect of volleyball, which can be used to motivate students, is the scoring system. (MM)
Maximum margin Bayesian network classifiers.
Pernkopf, Franz; Wohlmayr, Michael; Tschiatschek, Sebastian
2012-03-01
We present a maximum margin parameter learning algorithm for Bayesian network classifiers using a conjugate gradient (CG) method for optimization. In contrast to previous approaches, we maintain the normalization constraints on the parameters of the Bayesian network during optimization, i.e., the probabilistic interpretation of the model is not lost. This enables us to handle missing features in discriminatively optimized Bayesian networks. In experiments, we compare the classification performance of maximum margin parameter learning to conditional likelihood and maximum likelihood learning approaches. Discriminative parameter learning significantly outperforms generative maximum likelihood estimation for naive Bayes and tree augmented naive Bayes structures on all considered data sets. Furthermore, maximizing the margin dominates the conditional likelihood approach in terms of classification performance in most cases. We provide results for a recently proposed maximum margin optimization approach based on convex relaxation. While the classification results are highly similar, our CG-based optimization is computationally up to orders of magnitude faster. Margin-optimized Bayesian network classifiers achieve classification performance comparable to support vector machines (SVMs) using fewer parameters. Moreover, we show that unanticipated missing feature values during classification can be easily processed by discriminatively optimized Bayesian network classifiers, a case where discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.
Maximum Entropy in Drug Discovery
Chih-Yuan Tseng
2014-07-01
Drug discovery applies multidisciplinary approaches, either experimentally, computationally or both, to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes' pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process the information at hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.
Maximum Variance Hashing via Column Generation
Lei Luo
2013-01-01
item search. Recently, a number of data-dependent methods have been developed, reflecting the great potential of learning for hashing. Inspired by the classic nonlinear dimensionality reduction algorithm—maximum variance unfolding, we propose a novel unsupervised hashing method, named maximum variance hashing, in this work. The idea is to maximize the total variance of the hash codes while preserving the local structure of the training data. To solve the derived optimization problem, we propose a column generation algorithm, which directly learns the binary-valued hash functions. We then extend it using anchor graphs to reduce the computational cost. Experiments on large-scale image datasets demonstrate that the proposed method outperforms state-of-the-art hashing methods in many cases.
Shinn, Maxwell
2013-01-01
Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. Instant MuseScore is written in an easy-to-follow format, packed with illustrations that will help you get started with this music composition software. This book is for musicians who would like to learn how to notate music digitally with MuseScore. Readers should already have some knowledge of musical terminology; however, no prior experience with music notation software is necessary.
Greenslade, Thomas B., Jr.
1985-01-01
Discusses a series of experiments performed by Thomas Hope in 1805 which show the temperature at which water has its maximum density. Early data cast into a modern form as well as guidelines and recent data collected from the author provide background for duplicating Hope's experiments in the classroom. (JN)
Abolishing the maximum tension principle
Dabrowski, Mariusz P
2015-01-01
We find a series of example theories for which the relativistic limit of maximum tension $F_{max} = c^4/4G$ represented by the entropic force can be abolished. Among them are the varying constants theories, some generalized entropy models applied both for cosmological and black hole horizons, as well as some generalized uncertainty principle models.
Abolishing the maximum tension principle
Mariusz P. Da̧browski
2015-09-01
We find a series of example theories for which the relativistic limit of maximum tension $F_{max} = c^4/4G$ represented by the entropic force can be abolished. Among them are the varying constants theories, some generalized entropy models applied both for cosmological and black hole horizons, as well as some generalized uncertainty principle models.
Zaki Noah Hasan
2010-09-01
The Guillain-Barré syndrome (GBS) is an acute post-infective autoimmune polyradiculoneuropathy and the commonest peripheral neuropathy causing respiratory failure. The aim of the study is to use the New Combined Scoring System in anticipating respiratory failure in order to perform elective measures without waiting for emergency situations to occur.
Patients and methods: Fifty patients with GBS were studied. Eight clinical parameters (including progression of patients to maximum weakness, respiratory rate/minute, breath holding count (the number of digits the patient can count while holding his breath), presence of facial muscle weakness (unilateral or bilateral), presence of weakness of the bulbar muscles, weakness of the neck flexor muscles, and limb weakness) were assessed for each patient, and a score was given to each parameter, a combined score being constructed by taking into consideration all the above-mentioned clinical parameters.
Results and discussion: Fifteen patients (30%) enrolled in our study developed respiratory failure. There was a highly significant statistical association between the development of respiratory failure and the lower grades of the bulbar muscle weakness score, breath holding count score, neck muscle weakness score, lower limb and upper limb weakness scores, and respiratory rate score, and a total sum score above 16 out of 30 (p-value = 0.000). No significant statistical difference was found regarding the progression to maximum weakness (p-value = 0.675) and facial muscle weakness (p-value = 0.482).
Conclusion: Patients who obtain a combined score above 16 of 30 are at great risk of developing respiratory failure.
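The conclusion above reduces to a simple threshold rule: sum the component scores and flag patients scoring above 16 of 30. A minimal Python sketch; the component names and individual values below are illustrative placeholders, since the abstract does not give the per-parameter scoring ranges:

```python
def respiratory_failure_risk(component_scores, threshold=16, max_total=30):
    """Sum a patient's component scores and flag high risk of respiratory
    failure when the combined score exceeds the threshold (above 16 of 30
    in the study). Component names and ranges here are hypothetical."""
    total = sum(component_scores.values())
    if total > max_total:
        raise ValueError(f"total {total} exceeds maximum {max_total}")
    return total, total > threshold

# Hypothetical patient: each value is an assumed per-parameter score
total, at_risk = respiratory_failure_risk({
    "respiratory_rate": 3, "breath_holding_count": 4, "bulbar_weakness": 3,
    "neck_flexor_weakness": 3, "upper_limb_weakness": 2,
    "lower_limb_weakness": 2, "facial_weakness": 1, "progression": 1,
})
```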
Multiple Sclerosis Questionnaire for Job Difficulties (MSQ-Job): definition of the cut-off score.
Schiavolin, Silvia; Giovannetti, Ambra Mara; Leonardi, Matilde; Brenna, Greta; Brambilla, Laura; Confalonieri, Paolo; Frangiamore, Rita; Mantegazza, Renato; Moscatelli, Marco; Clerici, Valentina Torri; Cortese, Francesca; Covelli, Venusia; Ponzio, Michela; Zaratin, Paola; Raggi, Alberto
2016-05-01
Multiple Sclerosis (MS) mainly affects people of working age. The Multiple Sclerosis Questionnaire for Job Difficulties (MSQ-Job) was designed to measure difficulties in work-related tasks. Our aim is to define the cut-off score of the MSQ-Job to identify potential critical situations that might require specific attention. A sample of patients with MS completed the MSQ-Job, WHODAS 2.0 and MSQOL-54, respectively for work difficulties, disability and health-related quality of life (HRQoL) evaluation. K-means cluster analysis was used to divide the sample into three groups on the basis of HRQoL and disability. The ANOVA test was performed to compare the response pattern between these groups. The cut-off score was defined using receiver operating characteristic (ROC) curve analyses for the MSQ-Job total and the count of MSQ-Job items scored ≥3: a score value corresponding to the maximum of the sensitivity-to-specificity ratio was chosen as the cut-off. Of the 180 patients enrolled, twenty were clustered in the higher-severity group. The area under the ROC curve was 0.845 for the MSQ-Job total and 0.859 for the count of MSQ-Job items scored ≥3, while the cut-off score was 15.8 for the MSQ-Job total and 8 for the count of items scored ≥3. We recommend the use of the MSQ-Job with this calculation as a cut-off for identifying critical situations, e.g. in vocational rehabilitation services, where work-related difficulties have a significant impact in terms of lower quality of life and higher disability.
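The cut-off selection step can be sketched as a threshold search over ROC operating points. The abstract's "maximum of the sensitivity-to-specificity ratio" is assumed here to correspond to the common Youden index (sensitivity + specificity - 1); the data below are toy values, not the study's:

```python
def best_cutoff(scores, labels):
    """Pick the cut-off maximizing Youden's J = sensitivity + specificity - 1.
    labels: 1 = severe group, 0 = otherwise; higher score = more difficulty.
    Scores at or above the threshold are classified positive."""
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= t)
        fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < t)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < t)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= t)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Toy data: a threshold of 12 perfectly separates the two groups here
cut, j = best_cutoff([1, 2, 3, 10, 12, 20, 25], [0, 0, 0, 0, 1, 1, 1])
```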
Lovell, D P
1999-06-01
Principal component analyses (PCA) have been carried out on the tissue scores from Draize eye irritation tests on the 55 formulations and chemical ingredients included in the COLIPA Eye Irritation Validation Study. A PCA was carried out on the tissue scores 24, 48 and 72 hours after instillation of the substances. The first Principal Component (PC I) explained 77% of the total variation in the tissue scores and showed a high negative correlation (r=-0.971) with the scores used to derive the Modified Maximum Average Score (MMAS). The second component (PC II) explained 7% of the total variability and contrasted corneal and iris damage with conjunctival damage, as in a similar analysis carried out previously on the ECETOC databank. The third component (PC III), while only explaining about 3% of the variability, identified individuals treated with formulations that were observed to have low corneal opacity but large corneal area scores. This may represent some particular manner of scoring at the laboratory administering the Draize test or a specific effect of some formulations. A further PCA was carried out on tissue scores from observations at 1 hour to 21 days. PC I in this analysis explained 62% of the variability and there was a high negative correlation with the sum of all the tissue scores, while PC II explained 14% of the variability and contrasted damage up to 72 hours with damage after 72 hours. A number of formulations were identified with relatively low MMAS scores but tissue damage that persisted. PCA analysis is thus shown to be a powerful method for exploring complex datasets and for identification of outliers and subgroups. It has shown that the MMAS score captures most of the information on tissue scores in the first 72 hours following exposure, and it is unlikely to be of any advantage to use individual tissue scores for comparisons with alternative tests. The relationship of the classification schemes used by three alternative methods in the COLIPA…
Haavardsholm, Espen A; Østergaard, Mikkel; Ejbjerg, Bo J;
2007-01-01
OBJECTIVES: To describe a novel scoring system for the assessment of tenosynovitis by magnetic resonance imaging (MRI) in patients with rheumatoid arthritis, and assess its intra- and inter-reader reliability in a multireader, longitudinal setting. METHODS: Flexor and extensor tenosynovitis were evaluated at the level of the wrist in 10 different anatomical areas, graded semi-quantitatively from grade 0 to 3 (total score 0-30), based on the maximum width of post-contrast enhancement within each anatomical area on axial T1-weighted MR images. Ten sets of baseline and 1-year follow-up MR images…
van de Gronde, Jasper J.; Azzopardi, George; Petkov, Nicolai
2015-01-01
Orientation scores are representations of images built using filters that only select on orientation (and not on the magnitude of the frequency). Importantly, they allow (easy) reconstruction, making them ideal for use in a filtering pipeline. Traditionally a specific set of orientations has to be chosen…
We developed scoring procedures to convert screener responses to estimates of individual dietary intake for fruits and vegetables, dairy, added sugars, whole grains, fiber, and calcium using the What We Eat in America 24-hour dietary recall data from the 2003-2006 NHANES.
Miranda, DR; Nap, R; de Rijk, A; Schaufeli, W; Lapichino, G
Objectives. The instruments used for measuring nursing workload in the intensive care unit (e.g., Therapeutic Intervention Scoring System-28) are based on therapeutic interventions related to severity of illness. Many nursing activities are not necessarily related to severity of illness, and
External validation of the discharge of hip fracture patients score
Vochteloo, Anne J. H.; Flikweert, Elvira R.; Tuinebreijer, Wim E.; Maier, Andrea B.; Bloem, Rolf M.; Pilot, Peter; Nelissen, Rob G. H. H.
This paper reports the external validation of a recently developed instrument, the Discharge of Hip fracture Patients score (DHP) that predicts discharge location on admission in patients living in their own home prior to hip fracture surgery. The DHP (maximum score 100 points) was applied to 125
Semire DIKLI
2006-01-01
The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e. word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali, 2004). AES is defined as the computer technology that evaluates and scores written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). Revision and feedback are essential aspects of the writing process. Students need to receive feedback in order to increase their writing quality. However, responding to student papers can be a burden for teachers: particularly if they have large numbers of students and assign frequent writing assignments, providing individual feedback to student essays might be quite time consuming. AES systems can be very useful because they can provide the student with a score as well as feedback within seconds (Page, 2003). Four types of AES systems are widely used by testing companies, universities, and public schools: Project Essay Grader (PEG), Intelligent Essay Assessor (IEA), E-rater, and IntelliMetric. AES is a developing technology. Many AES systems are used to overcome time, cost, and generalizability issues in writing assessment. The accuracy and reliability of these systems have been proven to be high. The search for excellence in machine scoring of essays is continuing and numerous studies are being conducted to improve the effectiveness of AES systems.
MELD-XI Scores Correlate with Post-Fontan Hepatic Biopsy Fibrosis Scores.
Evans, William N; Acherman, Ruben J; Ciccolo, Michael L; Carrillo, Sergio A; Galindo, Alvaro; Rothman, Abraham; Winn, Brody J; Yumiaco, Noel S; Restrepo, Humberto
2016-10-01
We tested the hypothesis that MELD-XI values correlated with hepatic total fibrosis scores obtained in 70 predominately stable, post-Fontan patients that underwent elective cardiac catheterization. We found a statistically significant correlation between MELD-XI values and total fibrosis scores (p = 0.003). Thus, serial MELD-XI values may be an additional useful clinical parameter for follow-up care in post-Fontan patients.
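For context, MELD-XI is the MELD score recalculated without INR, which is useful in anticoagulated patients such as those post-Fontan. A minimal sketch using the commonly cited formula (5.11 ln bilirubin + 11.76 ln creatinine + 9.44, with laboratory values floored at 1.0 mg/dL); the abstract itself does not state the formula, so treat this as an assumption:

```python
import math

def meld_xi(bilirubin_mg_dl, creatinine_mg_dl):
    """MELD-XI (MELD excluding INR), using the commonly cited formula:
    5.11*ln(bilirubin) + 11.76*ln(creatinine) + 9.44.
    Values below 1.0 mg/dL are floored at 1.0, so the minimum score is 9.44."""
    b = max(bilirubin_mg_dl, 1.0)
    c = max(creatinine_mg_dl, 1.0)
    return 5.11 * math.log(b) + 11.76 * math.log(c) + 9.44

# Example: mildly elevated bilirubin and creatinine
score = meld_xi(2.0, 1.5)
```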
Fetal Biophysical Profile Scoring
H.R. HaghighatKhah
2009-01-01
Fetal biophysical profile scoring is a sonographic-based method of fetal assessment first described by Manning and Platt in 1980. The biophysical profile score was developed as a method to integrate real-time observations of the fetus and his/her intrauterine environment in order to more comprehensively assess the fetal condition. These findings must be evaluated in the context of maternal/fetal history (i.e., chronic hypertension, post-dates, intrauterine growth restriction, etc.), fetal structural integrity (presence or absence of congenital anomalies), and the functionality of fetal support structures (placenta and umbilical cord). For example, acute asphyxia due to placental abruption may result in an absence of the acute variables of the biophysical profile score (fetal breathing movements, fetal movement, fetal tone, and fetal heart rate reactivity) with a normal amniotic fluid volume. With post-maturity the asphyxial event may be intermittent and chronic, resulting in a decrease in amniotic fluid volume but with the acute variables remaining normal. While the 5 components of the biophysical profile score have remained unchanged since 1980 (Manning, 1980), the definitions of a normal and abnormal parameter have evolved with increasing experience. In 1984 the definition of oligohydramnios was increased from a <1 cm pocket of fluid to a <2.0 x 1.0 cm pocket. Oligohydramnios is now defined as a pocket of amniotic fluid <2.0 x 2.0 cm (Manning, 1995a). If the four ultrasound variables are normal, the accuracy of the biophysical profile score was not found to be significantly improved by adding the non-stress test. As a result, in 1987 the profile score was modified to incorporate the non-stress test only when one of the ultrasound variables was abnormal (Manning, 1987). Table 1 outlines the current definitions for quantifying a variable as present or absent. Each of the 5 components of the biophysical profile score does not have equal…
Male-female differences in Scoliosis Research Society-30 scores in adolescent idiopathic scoliosis.
Roberts, David W; Savage, Jason W; Schwartz, Daniel G; Carreon, Leah Y; Sucato, Daniel J; Sanders, James O; Richards, Benjamin Stephens; Lenke, Lawrence G; Emans, John B; Parent, Stefan; Sarwark, John F
2011-01-01
Longitudinal cohort study. To compare functional outcomes between male and female patients before and after surgery for adolescent idiopathic scoliosis (AIS). There is no clear consensus in the existing literature with respect to sex differences in functional outcomes in the surgical treatment of AIS. A prospective, consecutive, multicenter database of patients who underwent surgical correction for adolescent idiopathic scoliosis was analyzed retrospectively. All patients completed Scoliosis Research Society-30 (SRS-30) questionnaires before and 2 years after surgery. Patients with previous spine surgery were excluded. Data were collected for sex, age, Risser grade, previous bracing history, maximum preoperative Cobb angle, curve correction at 2 years, and SRS-30 domain scores. Paired sample t tests were used to compare preoperative and postoperative scores within each sex. Independent sample t tests were used to compare scores between sexes. Self-image/appearance had the greatest relative improvement. Males had better self-image/appearance scores preoperatively, better pain scores at 2 years, and better mental health and total scores both preoperatively and at 2 years. Both males and females were similarly satisfied with surgery. Males treated with surgery for AIS report better preoperative self-image, less postoperative pain, and better mental health than females. These differences may be clinically significant. For both males and females, the most beneficial effect of surgery is improved self-image/appearance. Overall, the benefits of surgery for AIS are similar for both sexes.
Maximum Genus of Strong Embeddings
Er-ling Wei; Yan-pei Liu; Han Ren
2003-01-01
The strong embedding conjecture states that any 2-connected graph has a strong embedding on some surface. It implies the circuit double cover conjecture: any 2-connected graph has a circuit double cover. The converse is not true, but for a 3-regular graph the two conjectures are equivalent. In this paper, a characterization of graphs having a strong embedding with exactly 3 faces, which is the strong embedding of maximum genus, is given. In addition, some graphs with this property are provided. More generally, an upper bound on the maximum genus of strong embeddings of a graph is presented. Lastly, it is shown that the interpolation theorem holds for planar Halin graphs.
Remizov, Ivan D
2009-01-01
In this note, we represent the subdifferential of a maximum functional defined on the space of all real-valued continuous functions on a given metric compact set. For a given argument $f$, it coincides with the set of all probability measures on the set of points maximizing $f$ on the initial compact set. This complete characterization lies at the heart of several important identities in microeconomics, such as Roy's identity and Shephard's lemma, as well as duality theory in production and linear programming.
Prehospital score for acute disease: a community-based observational study in Japan
Fujiwara Hidekazu
2007-10-01
Background: Ambulance usage in Japan has increased consistently because it is free under the national health insurance system. The introduction of refusal of ambulance transfer is being debated nationally. The purpose of the present study was to investigate the relationship between prehospital data and hospitalization outcome for acute disease patients, and to develop a simple prehospital evaluation tool using prehospital data for Japan's emergency medical service system. Methods: The subjects were 9,160 consecutive acute disease patients aged ≥15 years who were transferred to hospital by Kishiwada City Fire Department ambulance between July 2004 and March 2006. The relationship between prehospital data (age, systolic blood pressure, pulse rate, respiration rate, level of consciousness, SpO2 level and ability to walk) and outcome (hospitalization or non-hospitalization) was analyzed using logistic regression models. The prehospital score component for each item of prehospital data was determined from the beta coefficients. Eligible patients were scored retrospectively and the distribution of outcomes was examined. For patients transported to the two main hospitals, outcome after hospitalization was also confirmed. Results: A total of 8,330 (91%) patients were retrospectively evaluated using a prehospital score with a maximum value of 14. The percentage of patients requiring hospitalization rose from 9% with score = 0 to 100% with score = 14. With a cut-off point of score ≥2, the sensitivity, specificity, positive predictive value and negative predictive value were 97%, 16%, 39% and 89%, respectively. Among the 6,498 patients transported to the two main hospitals, there were no deaths at scores ≤1 and the proportion of non-hospitalization was over 90%. The proportion of deaths increased rapidly at scores ≥11. Conclusion: The prehospital score could be a useful tool for deciding the refusal of ambulance transfer in Japan's emergency medical service system.
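The reported operating characteristics at the score ≥2 cut-off can be cross-checked with Bayes' rule. A small sketch; the hospitalization prevalence of about 36% used below is an assumption inferred from the reported values, not a figure stated in the abstract:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """PPV and NPV from sensitivity, specificity and prevalence (Bayes' rule)."""
    tp = sensitivity * prevalence          # true positive mass
    fp = (1 - specificity) * (1 - prevalence)
    tn = specificity * (1 - prevalence)    # true negative mass
    fn = (1 - sensitivity) * prevalence
    return tp / (tp + fp), tn / (tn + fn)

# Reported operating point for score >= 2; prevalence ~36% is an assumption
ppv, npv = predictive_values(0.97, 0.16, 0.36)
```

With sensitivity 0.97, specificity 0.16 and an assumed prevalence of 0.36, this yields a PPV of about 39% and an NPV of about 90%, close to the reported 39% and 89%.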
The Testability of Maximum Magnitude
Clements, R.; Schorlemmer, D.; Gonzalez, A.; Zoeller, G.; Schneider, M.
2012-12-01
Recent disasters caused by earthquakes of unexpectedly large magnitude (such as Tohoku) illustrate the need for reliable assessments of the seismic hazard. Estimates of the maximum possible magnitude M at a given fault or in a particular zone are essential parameters in probabilistic seismic hazard assessment (PSHA), but their accuracy remains untested. In this study, we discuss the testability of long-term and short-term M estimates and the limitations that arise from testing such rare events. Of considerable importance is whether or not those limitations imply a lack of testability of a useful maximum magnitude estimate, and whether this should have any influence on current PSHA methodology. We use a simple extreme value theory approach to derive a probability distribution for the expected maximum magnitude in a future time interval, and we perform a sensitivity analysis on this distribution to determine if there is a reasonable avenue available for testing M estimates as they are commonly reported today: devoid of an appropriate probability distribution of their own and estimated only for infinite time (or relatively large untestable periods). Our results imply that any attempt at testing such estimates is futile, and that the distribution is highly sensitive to M estimates only under certain optimal conditions that are rarely observed in practice. In the future we suggest that PSHA modelers be brutally honest about the uncertainty of M estimates, or must find a way to decrease its influence on the estimated hazard.
Alternative Multiview Maximum Entropy Discrimination.
Chao, Guoqing; Sun, Shiliang
2016-07-01
Maximum entropy discrimination (MED) is a general framework for discriminative estimation based on maximum entropy and maximum margin principles, and can produce hard-margin support vector machines under some assumptions. Recently, the multiview version of MED, multiview MED (MVMED), was proposed. In this paper, we try to explore a more natural MVMED framework by assuming two separate distributions p1(Θ1) over the first-view classifier parameter Θ1 and p2(Θ2) over the second-view classifier parameter Θ2. We name the new MVMED framework alternative MVMED (AMVMED), which enforces the posteriors of the two view margins to be equal. The proposed AMVMED is more flexible than the existing MVMED because, compared with MVMED, which optimizes one relative entropy, AMVMED assigns one relative entropy term to each of the two views, thus incorporating a tradeoff between the two views. We give the detailed solution procedure, which can be divided into two steps. The first step is solving our optimization problem without considering the equal margin posteriors from the two views; then, in the second step, we consider the equal posteriors. Experimental results on multiple real-world data sets verify the effectiveness of the AMVMED, and comparisons with MVMED are also reported.
Credit scoring for individuals
Maria DIMITRIU
2010-12-01
Full Text Available Lending money to different borrowers is profitable, but risky. The profits come from the interest rate and the fees earned on the loans. Banks do not want to make loans to borrowers who cannot repay them. Even if banks do not intend to make bad loans, over time some of them can turn bad. For instance, as a result of the recent financial crisis, the capability of many borrowers to repay their loans was affected, and many of them defaulted. That is why it is important for the bank to monitor its loans. The purpose of this paper is to focus on the main issues of credit scoring. Accordingly, we present in this paper the scoring model of an important Romanian bank. Based on this credit scoring model, and taking into account the latest lending requirements of the National Bank of Romania, we developed an assessment tool in Excel for retail loans, which is presented in the case study.
Earthquake forecast enrichment scores
Christine Smyth
2012-03-01
Full Text Available The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project aimed at testing earthquake forecast models in a fair environment. Various metrics are currently used to evaluate the submitted forecasts. However, the CSEP still lacks easily understandable metrics with which to rank the overall performance of the forecast models. In this research, we modify a well-known and respected metric from another statistical field, bioinformatics, to make it suitable for evaluating earthquake forecasts, such as those submitted to the CSEP initiative. The metric, originally called a gene-set enrichment score, is based on a Kolmogorov-Smirnov statistic. Our modified metric assesses whether, over a certain time period, the forecast values at locations where earthquakes have occurred are significantly increased compared to the values for all locations where earthquakes did not occur. Permutation testing allows a significance value to be placed upon the score. Unlike the metrics currently employed by the CSEP, the score places no assumption on the distribution of earthquake occurrence, nor does it require an arbitrary reference forecast. In this research, we apply the modified metric to simulated data and real forecast data to show it is a powerful and robust technique, capable of ranking competing earthquake forecasts.
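The enrichment-style test described above can be sketched as a two-sample Kolmogorov-Smirnov comparison between forecast values at cells that did and did not host earthquakes, with a permutation p-value. The data below are synthetic, and this is an illustration of the idea, not the CSEP implementation.

```python
import random

def ks_statistic(xs, ys):
    """Two-sample Kolmogorov-Smirnov statistic (maximum CDF gap)."""
    grid = sorted(set(xs) | set(ys))
    d = 0.0
    for g in grid:
        fx = sum(x <= g for x in xs) / len(xs)
        fy = sum(y <= g for y in ys) / len(ys)
        d = max(d, abs(fx - fy))
    return d

def permutation_pvalue(values, hit_mask, n_perm=500, seed=0):
    """Observed KS score and P(permuted KS >= observed) under random
    relabelling of which cells hosted earthquakes."""
    rng = random.Random(seed)
    hits = [v for v, h in zip(values, hit_mask) if h]
    miss = [v for v, h in zip(values, hit_mask) if not h]
    observed = ks_statistic(hits, miss)
    count = 0
    for _ in range(n_perm):
        shuffled = values[:]
        rng.shuffle(shuffled)
        if ks_statistic(shuffled[:len(hits)], shuffled[len(hits):]) >= observed:
            count += 1
    return observed, count / n_perm

# Synthetic forecast: earthquakes fell preferentially in high-forecast cells.
values = [0.1, 0.2, 0.3, 0.4, 1.5, 1.8, 2.0, 2.5]
hits = [False, False, False, False, True, True, True, True]
d, p = permutation_pvalue(values, hits)
print(d, p)
```

A high KS score with a small permutation p-value indicates the forecast concentrated probability where earthquakes actually occurred, with no reference forecast needed.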
Dichotomous decisions based on dichotomously scored items: a case study
Mellenbergh, G.J.; Koppelaar, H.; Linden, van der W.J.
1977-01-01
In a course in elementary statistics for psychology students using criterion-referenced achievement tests, the total test score, based on dichotomously scored items, was used for classifying students into those who passed and those who failed. The score on a test is considered as depending on a latent trait.
Cacti with maximum Kirchhoff index
Wang, Wen-Rui; Pan, Xiang-Feng
2015-01-01
The concept of resistance distance was first proposed by Klein and Randi\'c. The Kirchhoff index $Kf(G)$ of a graph $G$ is the sum of resistance distances between all pairs of vertices in $G$. A connected graph $G$ is called a cactus if each block of $G$ is either an edge or a cycle. Let $Cat(n;t)$ be the set of connected cacti possessing $n$ vertices and $t$ cycles, where $0\leq t \leq \lfloor\frac{n-1}{2}\rfloor$. In this paper, the maximum Kirchhoff index of cacti is characterized, as well...
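The Kirchhoff index defined above can be computed from the Moore-Penrose pseudoinverse of the graph Laplacian: the resistance distance is r_ij = L+_ii + L+_jj - 2 L+_ij, and Kf(G) sums r_ij over all vertex pairs. A minimal sketch, checked on the triangle C3 (the smallest cactus with one cycle), where each pair has resistance 2/3:

```python
import numpy as np

def kirchhoff_index(adj):
    """Kf(G) = sum of resistance distances over all vertex pairs,
    via the pseudoinverse of the Laplacian L = D - A."""
    a = np.asarray(adj, dtype=float)
    lap = np.diag(a.sum(axis=1)) - a
    lp = np.linalg.pinv(lap)
    n = len(a)
    kf = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            kf += lp[i, i] + lp[j, j] - 2 * lp[i, j]
    return kf

triangle = [[0, 1, 1],
            [1, 0, 1],
            [1, 1, 0]]
print(round(kirchhoff_index(triangle), 6))  # 3 pairs x 2/3 -> 2.0
```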
Generic maximum likely scale selection
Pedersen, Kim Steenstrup; Loog, Marco; Markussen, Bo
2007-01-01
The fundamental problem of local scale selection is addressed by means of a novel principle, which is based on maximum likelihood estimation. The principle is generally applicable to a broad variety of image models and descriptors, and provides a generic scale estimation methodology. The focus...... on second order moments of multiple measurements outputs at a fixed location. These measurements, which reflect local image structure, consist in the cases considered here of Gaussian derivatives taken at several scales and/or having different derivative orders....
Cardiovascular risk score in Rheumatoid Arthritis
Wagan, Abrar Ahmed; Mahmud, Tafazzul E Haque; Rasheed, Aflak; Zafar, Zafar Ali; Rehman, Ata ur; Ali, Amjad
2016-01-01
Objective: To determine the 10-year cardiovascular risk score with the QRISK-2 and Framingham risk calculators in Rheumatoid Arthritis (RA) and non-RA subjects, and to assess the usefulness of the QRISK-2 and Framingham calculators in both groups. Methods: During the study, 106 RA and 106 non-RA age- and sex-matched participants were enrolled from the outpatient department. Demographic data and questions regarding other study parameters were noted. After 14 hours of fasting, 5 ml of venous blood was drawn for cholesterol and HDL levels; laboratory tests were performed on a COBAS c III (ROCHE) analyzer. The QRISK-2 and Framingham risk calculators were used to obtain individual 10-year CVD risk scores. Results: The mean age of the RA group was 45.1±9.5 years and of the non-RA group 43.7±8.2 years, with the female gender more common. The mean predicted 10-year score with the QRISK-2 calculator was 14.2±17.1% in the RA group and 13.2±19.0% in the non-RA group (p-value 0.122). The 10-year Framingham risk score was 12.9±10.4% in the RA group and 8.9±8.7% in the non-RA group (p-value 0.001). In the RA group, QRISK-2 placed 24.5% and FRS 31.1% of cases in the higher risk category. Good agreement between the two calculators was observed in both groups (Kappa = 0.618 RA group; Kappa = 0.671 non-RA group). Conclusion: The QRISK-2 calculator is more appropriate, as it takes RA, ethnicity, CKD and atrial fibrillation into account as factors in the risk assessment score. PMID:27375684
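The agreement statistic reported above (Cohen's kappa between the two calculators' risk categories) can be sketched from a cross-tabulation of high-risk/low-risk classifications. The 2x2 table below is hypothetical, chosen only to yield a kappa of the same order as the study's; the paper's raw cross-tabulation is not shown here.

```python
# Cohen's kappa from a square agreement table.
# table[i][j]: count classified category i by rater A and j by rater B.
# The table below is HYPOTHETICAL, not the study's data.

def cohens_kappa(table):
    total = sum(sum(row) for row in table)
    # observed agreement: proportion on the diagonal
    po = sum(table[i][i] for i in range(len(table))) / total
    # chance agreement: product of marginal proportions, summed
    pe = sum(
        (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
        for i in range(len(table))
    )
    return (po - pe) / (1 - pe)

# Hypothetical high-risk / low-risk cross-tabulation for 106 patients:
print(round(cohens_kappa([[18, 8], [7, 73]]), 3))  # -> 0.613
```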
Economics and Maximum Entropy Production
Lorenz, R. D.
2003-04-01
Price differentials, sales volume and profit can be seen as analogues of temperature difference, heat flow, and work or entropy production in the climate system. One aspect in which economic systems exhibit more clarity than the climate is that the empirical and/or statistical mechanical tendency for systems to seek a maximum in production is very evident in economics, where the profit motive is very clear. Noting the common link between 1/f noise, power laws and Self-Organized Criticality with Maximum Entropy Production, the power law fluctuations in security and commodity prices are not inconsistent with the analogy. There is an additional thermodynamic analogy, in that scarcity is valued. A commodity concentrated among a few traders is valued highly by the many who do not have it. The market therefore encourages via prices the spreading of those goods among a wider group, just as heat tends to diffuse, increasing entropy. I explore some empirical price-volume relationships of metals and meteorites in this context.
Maximum Potential Score (MPS): An operating model for a successful customer-focused strategy.
2015-01-01
One of marketers’ chief objectives is to achieve customer loyalty, which is a key factor for profitable growth. Therefore, they need to develop a strategy that attracts and maintains customers, giving them adequate motives, both tangible (prices and promotions) and intangible (personalized service and treatment), to satisfy a customer and make him loyal to the company. Finding a way to accurately measure satisfaction and customer loyalty is very important. With regard to typical Relationship ...
PASI and PQOL-12 score in psoriasis : Is there any correlation?
Vikas Shankar
2011-01-01
Full Text Available Background: Psoriasis, a common papulo-squamous disorder of the skin, is universal in occurrence and may interfere adversely with the quality of life. Whether the extent of the disease has any bearing upon the patients' psychology has not been much studied in this part of the world. Aims: The objective of this hospital-based cross-sectional study was to assess disease severity objectively using the Psoriasis Area and Severity Index (PASI) score and the quality of life by the Psoriasis Quality-of-Life Questionnaire-12 (PQOL-12), and to draw a correlation between them, if any. Materials and Methods: The PASI score denotes an objective method of scoring the severity of psoriasis, reflecting not only the body surface area but also erythema, induration and scaling. The PQOL-12 represents a 12-item self-administered, disease-specific psychometric instrument created specifically to assess quality-of-life issues that are most important to psoriasis patients. PASI and PQOL-12 scores were calculated for each patient to objectively assess disease severity and quality of life. Results: In total, 34 psoriasis patients (16 males, 18 females), aged 8 to 55 years, were studied. Minimum and maximum PASI scores were 0.8 and 32.8, respectively, whereas minimum and maximum PQOL-12 scores were 4 and 120, respectively. PASI and PQOL-12 values showed minimal positive correlation (r = +0.422). Conclusion: Disease severity of psoriasis had no direct reflection upon quality of life. Limited psoriasis on a visible area may also have a greater impact on mental health.
Farneti, D; Fattori, B; Nacci, A; Mancini, V; Simonelli, M; Ruoppolo, G; Genovese, E
2014-04-01
This study evaluated the intra- and inter-rater reliability of the Pooling score (P-score) in the clinical endoscopic evaluation of the severity of swallowing disorders, considering excess residue in the pharynx and larynx. The score (minimum 4 - maximum 11) is obtained by summing the scores given to the site of the bolus, the amount, and the ability to control residue/bolus pooling, the latter assessed on the basis of cough, raclage, and the number of dry voluntary or reflex swallowing acts ( 5). Four judges evaluated 30 short films of pharyngeal transit of 10 solid (1/4 of a cracker), 11 creamy (1 tablespoon of jam) and 9 liquid (1 tablespoon of 5 cc of water coloured with methylene blue, 1 ml in 100 ml) boluses in 23 subjects (10 M/13 F, age from 31 to 76 yrs, mean age 58.56±11.76 years) with different pathologies. The films were randomly distributed on two CDs, which differed in terms of the sequence of the films, and were given to the judges (after an explanatory session) at time 0, 24 hours later (time 1) and after 7 days (time 2). The inter- and intra-rater reliability of the P-score was calculated using the intra-class correlation coefficient (ICC; 3,k). The possibility that the consistency of boluses could affect the scoring of the films was considered. The ICC for site, amount, management and the P-score total was found to be, respectively, 0.999, 0.997, 1.00 and 0.999. Clinical evaluation of a criterion of severity of a swallowing disorder remains a crucial point in the management of patients with pathologies that predispose to complications. The P-score, derived from static and dynamic parameters, yielded a very high correlation among the scores attributed by the four judges during observations carried out at different times. Bolus consistencies did not affect the outcome of the test: the analysis of variance, performed to verify whether the scores attributed by the four judges to the selected parameters might be influenced by the different consistencies of the boluses, was not significant.
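The reliability coefficient used above, ICC(3,k) (two-way mixed model, consistency, average of k raters), can be computed from a ratings matrix of n targets x k judges via a standard two-way ANOVA decomposition. The small matrix below is a demonstration dataset, not the study's film ratings.

```python
# ICC(3,k) = (MS_rows - MS_error) / MS_rows, from an n x k ratings matrix
# (rows = targets, columns = raters). Example data are for illustration only.

def icc3k(ratings):
    n = len(ratings)        # targets
    k = len(ratings[0])     # raters
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between targets
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / ms_rows

ratings = [[9, 2, 5, 8],
           [6, 1, 3, 2],
           [8, 4, 6, 8],
           [7, 1, 2, 6],
           [10, 5, 6, 9],
           [6, 2, 4, 7]]
print(round(icc3k(ratings), 3))  # -> 0.909
```

Values near 1, like the 0.997-1.00 reported above, mean the judges rank and space the films almost identically.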
The International Bleeding Risk Score
Laursen, Stig Borbjerg; Laine, L.; Dalton, H.
2017-01-01
The International Bleeding Risk Score: A New Risk Score that can Accurately Predict Mortality in Patients with Upper GI-Bleeding.
Parameter estimation in X-ray astronomy using maximum likelihood
Wachter, K.; Leach, R.; Kellogg, E.
1979-01-01
Methods of estimation of parameter values and confidence regions by maximum likelihood and Fisher efficient scores starting from Poisson probabilities are developed for the nonlinear spectral functions commonly encountered in X-ray astronomy. It is argued that these methods offer significant advantages over the commonly used alternatives called minimum chi-squared because they rely on less pervasive statistical approximations and so may be expected to remain valid for data of poorer quality. Extensive numerical simulations of the maximum likelihood method are reported which verify that the best-fit parameter value and confidence region calculations are correct over a wide range of input spectra.
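The Poisson maximum likelihood approach described above is often formulated through the Cash statistic, C(λ) = 2 Σ(λ - d ln λ), whose minimum coincides with the maximum of the Poisson log-likelihood. A deliberately minimal sketch with a one-parameter constant-rate model (a stand-in for the nonlinear spectral functions discussed), where the ML estimate is provably the sample mean:

```python
import math

# Cash statistic for a constant-rate model fit to Poisson counts.
# A tiny grid search stands in for the nonlinear fitting used in practice.

def cash(lam, counts):
    """C = 2 * sum(model - data * ln(model)); minimizing C maximizes
    the Poisson likelihood (data-dependent constants dropped)."""
    return 2 * sum(lam - d * math.log(lam) for d in counts)

counts = [3, 5, 2, 4, 6]
grid = [i / 100 for i in range(1, 1001)]
best = min(grid, key=lambda lam: cash(lam, counts))
print(best)  # the sample mean, 4.0, as Poisson ML predicts
```

Unlike minimum chi-squared, this statistic needs no Gaussian approximation, which is why it remains valid for the low-count data the abstract emphasizes.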
Lopez Moris E
2016-06-01
Full Text Available Total thyroidectomy is a surgery that removes all thyroid tissue from the patient. Suspected cancer in a thyroid nodule is the most frequent indication, presumed when a previous fine-needle puncture is positive or when a goiter shows significant volume increase or symptoms. Less frequent indications are hyperthyroidism that is refractory to treatment with Iodine-131 or in which such treatment is contraindicated, and cases of symptomatic thyroiditis. The thyroid gland has important anatomic relations with the inferior laryngeal nerve and the parathyroid glands; for this reason it is imperative to perform extremely meticulous dissection to recognize each of these elements and ensure their preservation. It is also essential to maintain strict hemostasis, in order to avoid any postoperative bleeding that could lead to a suffocating neck hematoma, a feared complication that represents a surgical emergency and endangers the patient's life. It is essential to follow a formal technique, without skipping steps, and to maintain the prudence and patience that should govern any surgical act.
Kittelsen, K.E.; David, B.; Moe, R.O.
2017-01-01
Lameness and impaired walking ability in rapidly growing meat-type broiler chickens are major welfare issues that cause economic losses. This study analyzed the prevalence of impaired walking and its associations with production data, abattoir registrations, and postmortem tibia measurements......, although the mean slaughter age is only 31 d and the maximum allowed animal density is relatively low. Impaired walking ability could not be predicted by the welfare indicators footpad lesion score, total on-farm mortality, and decreasing DOA prevalence. Further studies are needed to explore...
Objects of maximum electromagnetic chirality
Fernandez-Corbaton, Ivan
2015-01-01
We introduce a definition of the electromagnetic chirality of an object and show that it has an upper bound. The upper bound is attained if and only if the object is transparent for fields of one handedness (helicity). Additionally, electromagnetic duality symmetry, i.e. helicity preservation upon scattering, turns out to be a necessary condition for reciprocal scatterers to attain the upper bound. We use these results to provide requirements for the design of such extremal scatterers. The requirements can be formulated as constraints on the polarizability tensors for dipolar scatterers or as material constitutive relations. We also outline two applications for objects of maximum electromagnetic chirality: A twofold resonantly enhanced and background free circular dichroism measurement setup, and angle independent helicity filtering glasses.
Maximum mutual information regularized classification
Wang, Jim Jing-Yan
2014-09-07
In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
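The quantity being maximized above, the mutual information between a (discretized) classification response and the true label, can be estimated empirically from joint counts. This toy sketch is only the measurement step, not the paper's entropy-model optimizer:

```python
import math

def mutual_information(pairs):
    """Empirical I(R; Y) in nats from a list of (response, label) pairs."""
    n = len(pairs)
    joint, pr, py = {}, {}, {}
    for r, y in pairs:
        joint[(r, y)] = joint.get((r, y), 0) + 1
        pr[r] = pr.get(r, 0) + 1
        py[y] = py.get(y, 0) + 1
    mi = 0.0
    for (r, y), c in joint.items():
        pxy = c / n
        # p(r,y) * log( p(r,y) / (p(r) p(y)) )
        mi += pxy * math.log(pxy * n * n / (pr[r] * py[y]))
    return mi

# A perfectly informative classifier attains I = H(Y) = ln 2 on balanced labels:
perfect = [(1, 1), (1, 1), (0, 0), (0, 0)]
print(round(mutual_information(perfect), 4))  # -> 0.6931 (= ln 2)
```

A response independent of the label gives I = 0, which is exactly the uncertainty-reduction intuition in the abstract.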
The strong maximum principle revisited
Pucci, Patrizia; Serrin, James
In this paper we first present the classical maximum principle due to E. Hopf, together with an extended commentary and discussion of Hopf's paper. We emphasize the comparison technique invented by Hopf to prove this principle, which has since become a main mathematical tool for the study of second order elliptic partial differential equations and has generated an enormous number of important applications. While Hopf's principle is generally understood to apply to linear equations, it is in fact also crucial in nonlinear theories, such as those under consideration here. In particular, we shall treat and discuss recent generalizations of the strong maximum principle, and also the compact support principle, for the case of singular quasilinear elliptic differential inequalities, under generally weak assumptions on the quasilinear operators and the nonlinearities involved. Our principal interest is in necessary and sufficient conditions for the validity of both principles; in exposing and simplifying earlier proofs of corresponding results; and in extending the conclusions to wider classes of singular operators than previously considered. The results have unexpected ramifications for other problems, as will develop from the exposition, e.g. two point boundary value problems for singular quasilinear ordinary differential equations (Sections 3 and 4); the exterior Dirichlet boundary value problem (Section 5); the existence of dead cores and compact support solutions, i.e. dead cores at infinity (Section 7); Euler-Lagrange inequalities on a Riemannian manifold (Section 9); comparison and uniqueness theorems for solutions of singular quasilinear differential inequalities (Section 10). The case of p-regular elliptic inequalities is briefly considered in Section 11.
Sharma, Prashant; Saxena, Renu
2010-07-01
Thromboelastography (TEM) yields a multitude of data that are complicated to analyze. We evaluated its value in identification of global coagulopathy in overt disseminated intravascular coagulation (DIC). We studied 21 patients, each with International Society for Haemostasis and Thrombosis scores of 5 or more (compatible with overt DIC) and less than 5 (suggestive of nonovert DIC), who underwent whole blood nonadditive TEM. A TEM score based on the reaction and kappa times, alpha angle, and maximum amplitude was defined as the total number of TEM parameters deranged in the direction of hypocoagulability. The TEM score at a cutoff of 2 or more achieved sensitivity of 95.2%, specificity of 81.0%, and the highest receiver operating characteristic area under the curve of all parameters of 0.957 for identifying overt DIC. Individual TEM parameters correlated variably with conventional tests. Their combination into a cohesive TEM score possibly better captured the multiple hemostatic derangements occurring in DIC. The TEM score may bring objectivity to the analysis of TEM data.
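The TEM score defined above is simply a count of how many of the four parameters (reaction time, kappa time, alpha angle, maximum amplitude) are deranged in the hypocoagulable direction. A sketch follows; the reference-range cutoffs are placeholders for illustration, not the study's laboratory ranges.

```python
# TEM score: number of parameters deranged toward hypocoagulability.
# Cutoffs below are PLACEHOLDER reference limits, not the study's values.

CUTOFFS = {
    "reaction_time": ("high", 8.0),   # min; prolonged = hypocoagulable
    "kappa_time": ("high", 3.0),      # min; prolonged = hypocoagulable
    "alpha_angle": ("low", 55.0),     # degrees; reduced = hypocoagulable
    "max_amplitude": ("low", 51.0),   # mm; reduced = hypocoagulable
}

def tem_score(params):
    score = 0
    for name, (direction, cutoff) in CUTOFFS.items():
        value = params[name]
        if (direction == "high" and value > cutoff) or \
           (direction == "low" and value < cutoff):
            score += 1
    return score

patient = {"reaction_time": 11.2, "kappa_time": 4.5,
           "alpha_angle": 48.0, "max_amplitude": 39.0}
print(tem_score(patient))  # all four deranged -> 4; score >= 2 flags overt DIC
```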
On the lack of comonotonicity between likert scores and Rasch-based measures.
Bertoli-Barsotti, Lucio
2005-01-01
The Rating Scale Model (RSM) and the Partial Credit Model (PCM) are fairly well-known examples of Rasch models for polytomously scored items. In addition to a number of threshold parameters, both the models contain two scalar parameters characterizing item and person location on a common interval-level scale. The rank order of items and persons defined by the Likert summative scores (i.e. the raw total scores) is compared with that obtained from the Rasch-based measures (i.e. the maximum likelihood estimates of person and item parameters). It is proved that: 1) the property of comonotonicity between Likert summative scores and Rasch-based measures holds for both the person and item parameters of the RSM; 2) the property of comonotonicity between Likert summative scores and Rasch-based measures holds for the PCM only with reference to the person parameters; 3) violations of comonotonicity are possible, for particular datasets, for the item parameters of the PCM.
Fingerprinting of music scores
Irons, Jonathan; Schmucker, Martin
2004-06-01
Publishers of sheet music are generally reluctant to distribute their content via the Internet. Although the advantages of online sheet music distribution are numerous, the potential risk of Intellectual Property Rights (IPR) infringement, e.g. illegal online distribution, stifles any propensity for innovation. While active protection techniques only deter external risk factors, additional technology is necessary to adequately treat further risk factors. For several media types, including music scores, watermarking technology has been developed, which embeds information in data by suitable data modifications. Furthermore, fingerprinting or perceptual hashing methods have been developed and are being applied especially for audio. These methods allow the identification of content without prior modifications. In this article we motivate the development of watermarking and fingerprinting technologies for sheet music. Starting from the potential limitations of watermarking methods, we explain why fingerprinting methods are important for sheet music and address potential applications. Finally, we introduce a concept for fingerprinting of sheet music.
PNNL: A Supervised Maximum Entropy Approach to Word Sense Disambiguation
Tratz, Stephen C.; Sanfilippo, Antonio P.; Gregory, Michelle L.; Chappell, Alan R.; Posse, Christian; Whitney, Paul D.
2007-06-23
In this paper, we describe the PNNL Word Sense Disambiguation system as applied to the English All-Words task in SemEval 2007. We use a supervised learning approach, employing a large number of features and using Information Gain for dimension reduction. Our Maximum Entropy approach, combined with a rich set of features, produced results that are significantly better than baseline and achieved the highest F-score for the fine-grained English All-Words subtask.
On the maximum sufficient range of interstellar vessels
Cartin, Daniel
2011-01-01
This paper considers the likely maximum range of space vessels providing the basis of a mature interstellar transportation network. Using the principle of sufficiency, it is argued that this range will be less than three parsecs for the average interstellar vessel. This maximum range provides access from the Solar System to a large majority of nearby stellar systems, with total travel distances within the network not excessively greater than actual physical distance.
[Scoring--criteria for operability].
Oestern, H J
1997-01-01
For therapeutic recommendations, three different kinds of scores are essential: 1. severity scores for trauma; 2. severity scores for mangled extremities; 3. intensive care scores. The severity of polytrauma patients is measurable by the AIS, ISS, RTS, PTS and TRISS, the last being a combination of RTS, ISS, age, and mechanism of injury. For mangled extremities there are also different scores available: MESI (Mangled Extremity Syndrome Index) and MESS (Mangled Extremity Severity Score). The aim of these scores is to assist in deciding whether to amputate or to save the extremity. The intensive care scoring indices can be used to evaluate the severity of a systemic inflammatory response syndrome with respect to multiple organ failure. All scores are dynamic values that vary with improvement of therapy.
Maximum-likelihood cluster reconstruction
Bartelmann, M; Seitz, S; Schneider, P J; Bartelmann, Matthias; Narayan, Ramesh; Seitz, Stella; Schneider, Peter
1996-01-01
We present a novel method to reconstruct the mass distribution of galaxy clusters from their gravitational lens effect on background galaxies. The method is based on a least-chi-square fit of the two-dimensional gravitational cluster potential. The method combines information from shear and magnification by the cluster lens and is designed to easily incorporate possible additional information. We describe the technique and demonstrate its feasibility with simulated data. Both the cluster morphology and the total cluster mass are well reproduced.
Automated Trait Scores for "TOEFL"® Writing Tasks. Research Report. ETS RR-15-14
Attali, Yigal; Sinharay, Sandip
2015-01-01
The "e-rater"® automated essay scoring system is used operationally in the scoring of "TOEFL iBT"® independent and integrated tasks. In this study we explored the psychometric added value of reporting four trait scores for each of these two tasks, beyond the total e-rater score. The four trait scores are word choice, grammatical…
Maximum entropy production in daisyworld
Maunu, Haley A.; Knuth, Kevin H.
2012-05-01
Daisyworld was first introduced in 1983 by Watson and Lovelock as a model that illustrates how life can influence a planet's climate. These models typically involve modeling a planetary surface on which black and white daisies can grow thus influencing the local surface albedo and therefore also the temperature distribution. Since then, variations of daisyworld have been applied to study problems ranging from ecological systems to global climate. Much of the interest in daisyworld models is due to the fact that they enable one to study self-regulating systems. These models are nonlinear, and as such they exhibit sensitive dependence on initial conditions, and depending on the specifics of the model they can also exhibit feedback loops, oscillations, and chaotic behavior. Many daisyworld models are thermodynamic in nature in that they rely on heat flux and temperature gradients. However, what is not well-known is whether, or even why, a daisyworld model might settle into a maximum entropy production (MEP) state. With the aim to better understand these systems, this paper will discuss what is known about the role of MEP in daisyworld models.
Maximum stellar iron core mass
F W Giacobbe
2003-03-01
An analytical method of estimating the mass of a stellar iron core, just prior to core collapse, is described in this paper. The method employed depends, in part, upon an estimate of the true relativistic mass increase experienced by electrons within a highly compressed iron core, just prior to core collapse, and is significantly different from a more typical Chandrasekhar mass limit approach. This technique produced a maximum stellar iron core mass value of 2.69 × 10^30 kg (1.35 solar masses). This mass value is very near to the typical mass values found for neutron stars in a recent survey of actual neutron star masses. Although slightly lower and higher neutron star masses may also be found, lower mass neutron stars are believed to be formed as a result of enhanced iron core compression due to the weight of non-ferrous matter overlying the iron cores within large stars. And, higher mass neutron stars are likely to be formed as a result of fallback or accretion of additional matter after an initial collapse event involving an iron core having a mass no greater than 2.69 × 10^30 kg.
Maximum Matchings via Glauber Dynamics
Jindal, Anant; Pal, Manjish
2011-01-01
In this paper we study the classic problem of computing a maximum cardinality matching in general graphs $G = (V, E)$. The best known algorithm for this problem till date runs in $O(m \\sqrt{n})$ time due to Micali and Vazirani \\cite{MV80}. Even for general bipartite graphs this is the best known running time (the algorithm of Karp and Hopcroft \\cite{HK73} also achieves this bound). For regular bipartite graphs one can achieve an $O(m)$ time algorithm which, following a series of papers, has been recently improved to $O(n \\log n)$ by Goel, Kapralov and Khanna (STOC 2010) \\cite{GKK10}. In this paper we present a randomized algorithm based on the Markov Chain Monte Carlo paradigm which runs in $O(m \\log^2 n)$ time, thereby obtaining a significant improvement over \\cite{MV80}. We use a Markov chain similar to the \\emph{hard-core model} for Glauber Dynamics with \\emph{fugacity} parameter $\\lambda$, which is used to sample independent sets in a graph from the Gibbs Distribution \\cite{V99}, to design a faster algori...
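The chain sketched above can be illustrated in toy form: Glauber-type dynamics on matchings with fugacity λ, where each step proposes adding or removing a single edge. This is only a demonstration of the sampler's moves on a tiny graph, not the paper's algorithm or its running-time analysis.

```python
import random

def glauber_matching(edges, lam=4.0, steps=2000, seed=1):
    """Run Glauber-type dynamics on matchings of a graph (edge list of
    vertex pairs) with fugacity lam; return the largest matching size seen."""
    rng = random.Random(seed)
    matching = set()
    matched = set()          # vertices covered by the current matching
    best = 0
    for _ in range(steps):
        u, v = rng.choice(edges)
        if (u, v) in matching:
            # remove the edge with probability 1/(1+lam)
            if rng.random() < 1 / (1 + lam):
                matching.discard((u, v))
                matched -= {u, v}
        elif u not in matched and v not in matched:
            # add the (currently addable) edge with probability lam/(1+lam)
            if rng.random() < lam / (1 + lam):
                matching.add((u, v))
                matched |= {u, v}
        best = max(best, len(matching))
    return best

# 4-cycle: the maximum matching has 2 edges, which the chain quickly finds.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(glauber_matching(cycle))
```

Large λ biases the stationary distribution toward larger matchings, which is why sampling from this Gibbs distribution can be used to find near-maximum matchings.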
2011-01-10
...: Establishing Maximum Allowable Operating Pressure or Maximum Operating Pressure Using Record Evidence, and... facilities of their responsibilities, under Federal integrity management (IM) regulations, to perform... system, especially when calculating Maximum Allowable Operating Pressure (MAOP) or Maximum Operating...
Relationship of Apgar Scores and Bayley Mental and Motor Scores
Serunian, Sally A.; Broman, Sarah H.
1975-01-01
Examined the relationship of newborns' 1-minute Apgar scores to their 8-month Bayley mental and motor scores and to 8-month classifications of their development as normal, suspect, or abnormal. Also investigated relationships between Apgar scores and race, longevity, and birth weight. (JMB)
The Sherpa Maximum Likelihood Estimator
Nguyen, D.; Doe, S.; Evans, I.; Hain, R.; Primini, F.
2011-07-01
A primary goal for the second release of the Chandra Source Catalog (CSC) is to include X-ray sources with as few as 5 photon counts detected in stacked observations of the same field, while maintaining acceptable detection efficiency and false source rates. Aggressive source detection methods will result in detection of many false positive source candidates. Candidate detections will then be sent to a new tool, the Maximum Likelihood Estimator (MLE), to evaluate the likelihood that a detection is a real source. MLE uses the Sherpa modeling and fitting engine to fit a model of a background and source to multiple overlapping candidate source regions. A background model is calculated by simultaneously fitting the observed photon flux in multiple background regions. This model is used to determine the quality of the fit statistic for a background-only hypothesis in the potential source region. The statistic for a background-plus-source hypothesis is calculated by adding a Gaussian source model convolved with the appropriate Chandra point spread function (PSF) and simultaneously fitting the observed photon flux in each observation in the stack. Since a candidate source may be located anywhere in the field of view of each stacked observation, a different PSF must be used for each observation because of the strong spatial dependence of the Chandra PSF. The likelihood of a valid source being detected is a function of the two statistics (for background alone, and for background-plus-source). The MLE tool is an extensible Python module with potential for use by the general Chandra user.
Vestige: Maximum likelihood phylogenetic footprinting
Maxwell Peter
2005-05-01
Full Text Available Abstract Background Phylogenetic footprinting is the identification of functional regions of DNA by their evolutionary conservation. This is achieved by comparing orthologous regions from multiple species and identifying the DNA regions that have diverged less than neutral DNA. Vestige is a phylogenetic footprinting package built on the PyEvolve toolkit that uses probabilistic molecular evolutionary modelling to represent aspects of sequence evolution, including the conventional divergence measure employed by other footprinting approaches. In addition to measuring the divergence, Vestige allows the expansion of the definition of a phylogenetic footprint to include variation in the distribution of any molecular evolutionary processes. This is achieved by displaying the distribution of model parameters that represent partitions of molecular evolutionary substitutions. Examination of the spatial incidence of these effects across regions of the genome can identify DNA segments that differ in the nature of the evolutionary process. Results Vestige was applied to a reference dataset of the SCL locus from four species and provided clear identification of the known conserved regions in this dataset. To demonstrate the flexibility to use diverse models of molecular evolution and dissect the nature of the evolutionary process Vestige was used to footprint the Ka/Ks ratio in primate BRCA1 with a codon model of evolution. Two regions of putative adaptive evolution were identified illustrating the ability of Vestige to represent the spatial distribution of distinct molecular evolutionary processes. Conclusion Vestige provides a flexible, open platform for phylogenetic footprinting. Underpinned by the PyEvolve toolkit, Vestige provides a framework for visualising the signatures of evolutionary processes across the genome of numerous organisms simultaneously. By exploiting the maximum-likelihood statistical framework, the complex interplay between mutational
Siana Halim
2014-01-01
Full Text Available It is generally easier to predict defaults accurately if a large data set (including defaults) is available for estimating the prediction model. This puts small banks, which tend to have smaller data sets, at a disadvantage. It can also pose a problem for large banks that began to collect their own historical data only recently, or banks that recently introduced a new rating system. We used a Bayesian methodology that enables banks with small data sets to improve their default probability estimates. Another advantage of the Bayesian method is that it provides a natural way of dealing with structural differences between a bank’s internal data and additional, external data. In practice, the true scoring function may differ across the data sets, the small internal data set may contain information that is missing in the larger external data set, or the variables in the two data sets may not be exactly the same but related. The Bayesian method can handle such problems.
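The kind of Bayesian pooling described above can be sketched with a simple Beta-binomial model: the external data set defines the prior and the small internal data set updates it. All counts and the prior weight below are hypothetical illustration values, not figures from the study.

```python
# Beta-binomial pooling: the external data set defines a Beta prior,
# which the bank's small internal data set then updates.
# All counts and the prior weight are hypothetical illustration values.

ext_defaults, ext_total = 40, 2000      # larger external data set
int_defaults, int_total = 3, 80         # small internal data set

# Turn the external default rate into a Beta(alpha0, beta0) prior
# worth `prior_strength` pseudo-observations
prior_strength = 100.0
p_ext = ext_defaults / ext_total
alpha0 = p_ext * prior_strength
beta0 = (1.0 - p_ext) * prior_strength

# Conjugate update with the internal defaults / non-defaults
alpha = alpha0 + int_defaults
beta = beta0 + (int_total - int_defaults)
p_post = alpha / (alpha + beta)         # posterior mean default probability

p_naive = int_defaults / int_total      # internal-data-only estimate
print(round(p_post, 4), round(p_naive, 4))
```

The posterior shrinks the noisy internal rate toward the external rate, with `prior_strength` controlling how much weight the external data receives; handling structurally different scoring functions, as the abstract notes, requires a richer model than this sketch.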
Developmental Sentence Scoring for Japanese
Miyata, Susanne; MacWhinney, Brian; Otomo, Kiyoshi; Sirai, Hidetosi; Oshima-Takane, Yuriko; Hirakawa, Makiko; Shirai, Yasuhiro; Sugiura, Masatoshi; Itoh, Keiko
2013-01-01
This article reports on the development and use of the Developmental Sentence Scoring for Japanese (DSSJ), a new morpho-syntactical measure for Japanese constructed after the model of Lee's English Developmental Sentence Scoring model. Using this measure, the authors calculated DSSJ scores for 84 children divided into six age groups between 2;8…
McCluskey, Neal
2017-01-01
Since at least the enactment of No Child Left Behind in 2002, standardized test scores have served as the primary measures of public school effectiveness. Yet, such scores fail to measure the ultimate goal of education: maximizing happiness. This exploratory analysis assesses nation level associations between test scores and happiness, controlling…
Line Lengths and Starch Scores.
Moriarty, Sandra E.
1986-01-01
Investigates readability of different line lengths in advertising body copy, hypothesizing a normal curve with lower scores for shorter and longer lines, and scores above the mean for lines in the middle of the distribution. Finds support for lower scores for short lines and some evidence of two optimum line lengths rather than one. (SKC)
Constrained Fisher Scoring for a Mixture of Factor Analyzers
2016-09-01
global appearance model across the entire sensor network. Keywords: constrained maximum likelihood estimation, mixture of factor analyzers, Newton's method. ARL-TR-7836, US Army Research Laboratory, September 2016, by Gene T Whipps, Emre Ertin, and... Sensors and Electron
Gale, Robert W.
2007-01-01
The Commonwealth of Virginia Department of Environmental Quality, working closely with the State of West Virginia Department of Environmental Protection and the U.S. Environmental Protection Agency, is undertaking a polychlorinated biphenyl source assessment study for the Bluestone River watershed. The study area extends from the Bluefield area of Virginia and West Virginia, targets the Bluestone River and tributaries suspected of contributing to polychlorinated biphenyl, polychlorinated dibenzo-p-dioxin and dibenzofuran contamination, and includes sites near confluences of Big Branch, Brush Fork, and Beaver Pond Creek. The objectives of this study were to gather information about the concentrations, patterns, and distribution of these contaminants at specific study sites to expand current knowledge about polychlorinated biphenyl impacts and to identify potential new sources of contamination. Semipermeable membrane devices were used to integratively accumulate the dissolved fraction of the contaminants at each site. Performance reference compounds were added prior to deployment and used to determine site-specific sampling rates, enabling estimations of time-weighted average water concentrations during the deployed period. Minimum estimated concentrations of polychlorinated biphenyl congeners in water were about 1 picogram per liter per congener, and total concentrations at study sites ranged from 130 to 18,000 picograms per liter. The lowest concentration was 130 picograms per liter, about threefold greater than total hypothetical concentrations from background levels in field blanks. Polychlorinated biphenyl concentrations in water fell into three groups of sites: low (130-350 picograms per liter), medium (640-3,500 picograms per liter), and high (11,000-18,000 picograms per liter). Concentrations at the high sites, Beacon Cave and Beaverpond Branch at the Resurgence, were about four- to sixfold higher than concentrations estimated for the medium group of sites.
Thienpont, E; Vanden Berghe, A; Schwab, P E; Forthomme, J P; Cornu, O
2016-10-01
To utilize the 'Forgotten Joint' Score (FJS), a 12-item questionnaire analysing the ability to forget the joint, for comparing preoperative status in osteoarthritic patients scheduled for total hip arthroplasty (THA) or total knee arthroplasty (TKA). Higher scores represent a better result, with a maximum of 100. The hypothesis of this study was that a preoperative difference in favour of hip arthritis could eventually explain why THA is cited more often as a forgotten joint than TKA. A prospective cohort study was conducted in 150 patients with either tricompartmental knee (n = 75) or hip osteoarthritis (n = 75). Patients completed FJS-12 scores preoperatively and 1 year postoperatively. A similar preoperative FJS-12 was observed for hip (22 (15)) and knee osteoarthritis (24 (17)) (n.s.). The postoperative FJS-12 score was significantly higher for THA (80 (24)) than for TKA (70 (27)) (p forgotten'. The preoperative FJS-12 score is a powerful tool to provide patients with clearer insights into their positive evolution after surgery. The use of the FJS-12 in THA is a topic for further research, as this study found that floor and ceiling effects limit its usefulness in studies evaluating clinical outcome in this area. Level of evidence: II.
Tássia Souza Bertipaglia
2012-06-01
Full Text Available The objective of this study was to evaluate the association of visual scores of body structure, precocity and muscularity with production traits (body weight at 18 months and average daily gain) and a reproductive trait (scrotal circumference) in Brahman cattle, in order to determine the possible use of these scores as selection criteria to improve carcass quality. Covariance components were estimated by the restricted maximum likelihood method using an animal model that included contemporary group as a fixed effect. A total of 1,116 observations of body structure, precocity and muscularity were used. Heritability was 0.39, 0.43 and 0.40 for body structure, precocity and muscularity, respectively. The genetic correlations were 0.79 between body structure and precocity, 0.87 between body structure and muscularity, and 0.91 between precocity and muscularity. The genetic correlations between visual scores and body weight at 18 months were positive (0.77, 0.57 and 0.59 for body structure, precocity and muscularity, respectively). Similar genetic correlations were observed between average daily gain and visual scores (0.60, 0.57 and 0.48, respectively), whereas the genetic correlations between scrotal circumference and these scores were low (0.13, 0.02, and 0.13). The results indicate that visual scores can be used as selection criteria in Brahman breeding programs. Favorable correlated responses should be seen in average daily gain and body weight at 18 months. However, no correlated response is expected for scrotal circumference.
Validation of a new scoring system: Rapid assessment faecal incontinence score
Fernando de la Portilla; Arantxa Calero-Lillo; Rosa M Jiménez-Rodríguez; Maria L Reyes; Manuela Segovia-González; María Victoria Maestre; Ana M García-Cabrera
2015-01-01
AIM: To implement a quick and simple test, the rapid assessment faecal incontinence score (RAFIS), and show its reliability and validity. METHODS: From March 2008 through March 2010, we evaluated a total of 261 consecutive patients, including 53 patients with faecal incontinence. Demographic and comorbidity information was collected. In a single visit, patients were administered the RAFIS. The results obtained with the new score were compared with those of both the Wexner score and the faecal incontinence quality of life scale (FIQL) questionnaire. The patient completed the test without influence from the surgeon; the surgeon's role was to explain the meaning of each section and how to fill it in. Reliability of the RAFIS score was measured using intra-observer agreement and Cronbach's alpha (internal consistency) coefficient. Multivariate analysis of the main components within the different scores was performed in order to determine whether all the scores measured the same factor and to conclude whether the information could be encompassed in a single factor. A sample size of 50 patients with faecal incontinence was estimated to be enough to detect a correlation of 0.55 or better at the 5% level of significance with 80% power. RESULTS: We analysed the results obtained by 53 consecutive patients with faecal incontinence (median age 61.55 ± 12.49 years) in the three scoring systems. A total of 208 healthy volunteers (median age 58.41 ± 18.41 years) without faecal incontinence were included in the study as negative controls. Pearson's correlation coefficient between "state" and "leaks" was excellent (r = 0.92, P < 0.005). Internal consistency in the comparison of "state" and "leaks" also yielded excellent correlation (Cronbach's α = 0.93). Results in each score were compared using regression analysis and a correlation value of r = 0.98 was obtained with the Wexner score. As regards the FIQL questionnaire, the values of "r" for the different subscales of the questionnaire were: "lifestyle" r
Receiver function estimated by maximum entropy deconvolution
吴庆举; 田小波; 张乃铃; 李卫平; 曾融生
2003-01-01
Maximum entropy deconvolution is presented to estimate the receiver function, with maximum entropy as the rule to determine the auto-correlation and cross-correlation functions. The Toeplitz equation and the Levinson algorithm are used to calculate the iterative formula of the error-predicting filter, and the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps maximum entropy deconvolution stable. The maximum entropy of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method for measuring receiver functions in the time domain.
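The Toeplitz/Levinson machinery mentioned above can be sketched as the standard Levinson-Durbin recursion for the prediction-error filter; note how the reflection coefficient k stays below 1 in magnitude, which is what keeps the extrapolation stable. This is a generic textbook sketch, not the authors' code, and the autocorrelation sequence is illustrative.

```python
import numpy as np

def levinson(r, order):
    """Levinson-Durbin recursion: solves the Toeplitz normal
    equations for the prediction-error filter a and the final
    prediction-error power e, given autocorrelations r[0..order]."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    e = float(r[0])
    for m in range(1, order + 1):
        # reflection coefficient; |k| < 1 keeps the recursion stable
        k = -np.dot(a[:m], r[m:0:-1]) / e
        a[1:m + 1] += k * a[m - 1::-1]   # update filter taps
        e *= (1.0 - k * k)               # reduced prediction error
    return a, e

# Illustrative AR(1)-like autocorrelation sequence r[k] = 0.5**k
r = np.array([1.0, 0.5, 0.25, 0.125])
a, e = levinson(r, 3)
print(a, e)
```

For this r the recursion recovers the single tap -0.5 and leaves the higher-order taps at zero, as expected for a first-order process.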
Maximum Power from a Solar Panel
Michael Miller
2010-01-01
Full Text Available Solar energy has become a promising alternative to conventional fossil fuel sources. Solar panels are used to collect solar radiation and convert it into electricity. One of the techniques used to maximize the effectiveness of this energy alternative is to maximize the power output of the solar collector. In this project the maximum power is calculated by determining the voltage and the current of maximum power. These quantities are determined by finding the maximum value for the equation for power using differentiation. After the maximum values are found for each time of day, each individual quantity, voltage of maximum power, current of maximum power, and maximum power is plotted as a function of the time of day.
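The "find where dP/dV vanishes" procedure described above can be sketched numerically for a simple single-diode panel model; every parameter value below is illustrative, not from the project.

```python
import numpy as np

# Hypothetical single-diode panel model (illustrative parameters)
I_SC = 5.0    # short-circuit current [A]
I_0 = 1e-9    # diode saturation current [A]
V_T = 0.6     # thermal-voltage scale for the whole string [V]

def current(v):
    """Panel current as a function of terminal voltage."""
    return I_SC - I_0 * (np.exp(v / V_T) - 1.0)

def power(v):
    """Output power P(V) = V * I(V)."""
    return v * current(v)

# Locate the maximum of P(V): scan a fine grid and confirm that the
# derivative dP/dV vanishes there, mirroring the calculus approach
v = np.linspace(0.0, 14.0, 200001)
p = power(v)
i_mp = int(np.argmax(p))
v_mp, p_max = v[i_mp], p[i_mp]
dp = np.gradient(p, v)

print(round(v_mp, 3), round(p_max, 2))
```

Repeating this for the irradiance- and temperature-dependent parameters at each time of day yields the curves of voltage, current, and power at the maximum power point that the project plots.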
[Propensity score matching in SPSS].
Huang, Fuqiang; DU, Chunlin; Sun, Menghui; Ning, Bing; Luo, Ying; An, Shengli
2015-11-01
To realize propensity score matching in the PS Matching module of SPSS and interpret the analysis results. The R software, the plug-in linking R with the corresponding version of SPSS, and the propensity score matching package were installed. A PS matching module was added in the SPSS interface, and its use was demonstrated with test data. Score estimation and nearest-neighbor matching were achieved with the PS matching module, and the results of qualitative and quantitative statistical description and evaluation were presented in graphical form. Propensity score matching can be accomplished conveniently using SPSS software.
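Outside SPSS, the nearest-neighbour step of propensity score matching can be sketched in a few lines; the propensity scores and caliper below are hypothetical, and score estimation (normally a logistic regression) is assumed to have been done already.

```python
import numpy as np

# Hypothetical, already-estimated propensity scores
treated = np.array([0.31, 0.52, 0.74])
controls = np.array([0.12, 0.29, 0.48, 0.55, 0.70, 0.81])

# Greedy 1:1 nearest-neighbour matching without replacement,
# with a caliper (maximum allowed score distance)
caliper = 0.1
available = list(range(len(controls)))
matches = {}
for t_idx, ps in enumerate(treated):
    if not available:
        break
    # closest still-unmatched control on the propensity score
    j = min(available, key=lambda c: abs(controls[c] - ps))
    if abs(controls[j] - ps) <= caliper:
        matches[t_idx] = j
        available.remove(j)

print(matches)   # treated index -> matched control index
```

The SPSS module additionally reports balance diagnostics after matching; the greedy-with-caliper logic above is the core of what "nearest neighbor matching" does.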
Confidence scores for prediction models
Gerds, Thomas Alexander; van de Wiel, MA
2011-01-01
modelling strategy is applied to different training sets. For each modelling strategy we estimate a confidence score based on the same repeated bootstraps. A new decomposition of the expected Brier score is obtained, as well as the estimates of population average confidence scores. The latter can be used to distinguish rival prediction models with similar prediction performances. Furthermore, on the subject level a confidence score may provide useful supplementary information for new patients who want to base a medical decision on predicted risk. The ideas are illustrated and discussed using data from cancer...
Validity of the J-CTO Score and the CL-Score for predicting successful CTO recanalization.
Guelker, J E; Bansemir, L; Ott, R; Rock, T; Kroeger, K; Guelker, R; Klues, H G; Shin, D I; Bufe, A
2017-03-01
Percutaneous coronary intervention (PCI) of chronic total coronary occlusion (CTO) still remains a major challenge in interventional cardiology. Different scoring systems are available to predict the probability of a successful intervention. In this study we analyzed the validity of two scoring systems, the Japanese CTO score (J-CTO score) and the newly developed Clinical and Lesion-related score (CL score). Between 2012 and 2015 we included 379 consecutive patients. They underwent PCI for at least one CTO. Antegrade and retrograde CTO techniques were applied; the retrograde approach was used only after failed antegrade intervention. Patients undergoing CTO PCI were mainly men (84%). The overall procedural success rate was 84% (±0.4). The mean J-CTO score was 2.9 (±1.3) and the mean CL score was 4.3 (±1.7). The CL score predicted the interventional results more precisely than the J-CTO score. Our study suggests that the previously presented CL score is superior to the J-CTO score in identifying CTO lesions with a likelihood of successful recanalization. Generally, it appears to be a helpful tool for selecting patients and identifying the appropriate operator. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Modelling sequentially scored item responses
Akkermans, W.
2000-01-01
The sequential model can be used to describe the variable resulting from a sequential scoring process. In this paper two more item response models are investigated with respect to their suitability for sequential scoring: the partial credit model and the graded response model. The investigation is c
Classification of current scoring functions.
Liu, Jie; Wang, Renxiao
2015-03-23
Scoring functions are a class of computational methods widely applied in structure-based drug design for evaluating protein-ligand interactions. Dozens of scoring functions have been published since the early 1990s. In literature, scoring functions are typically classified as force-field-based, empirical, and knowledge-based. This classification scheme has been quoted for more than a decade and is still repeatedly quoted by some recent publications. Unfortunately, it does not reflect the recent progress in this field. Besides, the naming convention used for describing different types of scoring functions has been somewhat jumbled in literature, which could be confusing for newcomers to this field. Here, we express our viewpoint on an up-to-date classification scheme and appropriate naming convention for current scoring functions. We propose that they can be classified into physics-based methods, empirical scoring functions, knowledge-based potentials, and descriptor-based scoring functions. We also outline the major difference and connections between different categories of scoring functions.
The Machine Scoring of Writing
McCurry, Doug
2010-01-01
This article provides an introduction to the kind of computer software that is used to score student writing in some high stakes testing programs, and that is being promoted as a teaching and learning tool to schools. It sketches the state of play with machines for the scoring of writing, and describes how these machines work and what they do.…
Skyrocketing Scores: An Urban Legend
Krashen, Stephen
2005-01-01
A new urban legend claims, "As a result of the state dropping bilingual education, test scores in California skyrocketed." Krashen disputes this theory, pointing out that other factors offer more logical explanations of California's recent improvements in SAT-9 scores. He discusses research on the effects of California's Proposition 227,…
Quadratic prediction of factor scores
Wansbeek, T
1999-01-01
Factor scores are naturally predicted by means of their conditional expectation given the indicators y. Under normality this expectation is linear in y but in general it is an unknown function of y. It is discussed that under nonnormality factor scores can be more precisely predicted by a quadratic
Trends in Classroom Observation Scores
Casabianca, Jodi M.; Lockwood, J. R.; McCaffrey, Daniel F.
2015-01-01
Observations and ratings of classroom teaching and interactions collected over time are susceptible to trends in both the quality of instruction and rater behavior. These trends have potential implications for inferences about teaching and for study design. We use scores on the Classroom Assessment Scoring System-Secondary (CLASS-S) protocol from…
D-score: a search engine independent MD-score.
Vaudel, Marc; Breiter, Daniela; Beck, Florian; Rahnenführer, Jörg; Martens, Lennart; Zahedi, René P
2013-03-01
While peptides carrying PTMs are routinely identified in gel-free MS, the localization of the PTMs onto the peptide sequences remains challenging. Search engine scores of secondary peptide matches have been used in different approaches in order to infer the quality of site inference, by penalizing the localization whenever the search engine similarly scored two candidate peptides with different site assignments. In the present work, we show how the estimation of posterior error probabilities for peptide candidates allows the estimation of a PTM score called the D-score, for multiple search engine studies. We demonstrate the applicability of this score to three popular search engines: Mascot, OMSSA, and X!Tandem, and evaluate its performance using an already published high resolution data set of synthetic phosphopeptides. For those peptides with phosphorylation site inference uncertainty, the number of spectrum matches with correctly localized phosphorylation increased by up to 25.7% when compared to using Mascot alone, although the actual increase depended on the fragmentation method used. Since this method relies only on search engine scores, it can be readily applied to the scoring of the localization of virtually any modification at no additional experimental or in silico cost.
42 CFR 57.307 - Maximum amount of nursing student loans.
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false Maximum amount of nursing student loans. 57.307... Nursing Student Loans § 57.307 Maximum amount of nursing student loans. The total of the nursing student... longer than the 9-month academic year may be proportionately increased. The total of all nursing...
The inverse maximum dynamic flow problem
BAGHERIAN; Mehri
2010-01-01
We consider the inverse maximum dynamic flow (IMDF) problem. The IMDF problem can be described as follows: how to change the capacity vector of a dynamic network as little as possible so that a given feasible dynamic flow becomes a maximum dynamic flow. After discussing some characteristics of this problem, it is converted to a constrained minimum dynamic cut problem. Then an efficient algorithm which uses two maximum dynamic flow algorithms is proposed to solve the problem.
A tropospheric ozone maximum over the equatorial Southern Indian Ocean
L. Zhang
2012-05-01
Full Text Available We examine the distribution of tropical tropospheric ozone (O_{3}) from the Microwave Limb Sounder (MLS) and the Tropospheric Emission Spectrometer (TES) by using a global three-dimensional model of tropospheric chemistry (GEOS-Chem). MLS and TES observations of tropospheric O_{3} during 2005 to 2009 reveal a distinct, persistent O_{3} maximum, both in mixing ratio and tropospheric column, in May over the Equatorial Southern Indian Ocean (ESIO). The maximum is most pronounced in 2006 and 2008 and less evident in the other three years. This feature is also consistent with the total column O_{3} observations from the Ozone Mapping Instrument (OMI) and the Atmospheric Infrared Sounder (AIRS). Model results reproduce the observed May O_{3} maximum and the associated interannual variability. The origin of the maximum reflects a complex interplay of chemical and dynamic factors. The O_{3} maximum is dominated by O_{3} production driven by lightning nitrogen oxides (NO_{x}) emissions, which accounts for 62% of the tropospheric column O_{3} in May 2006. We find the contributions from biomass burning, soil, anthropogenic and biogenic sources to the O_{3} maximum are rather small. O_{3} production in the lightning outflow from Central Africa and South America peaks in May in both regions and is directly responsible for the O_{3} maximum over the western ESIO. The lightning outflow from Equatorial Asia dominates over the eastern ESIO. The interannual variability of the O_{3} maximum is driven largely by the anomalous anti-cyclones over the southern Indian Ocean in May 2006 and 2008. The lightning outflow from Central Africa and South America is effectively entrained by the anti-cyclones followed by northward transport to the ESIO.
RISK FACTOR DIAGNOSTIC SCORE IN DIABETIC FOOT
Mohamed Shameem P. M
2016-09-01
Full Text Available INTRODUCTION: Diabetic foot ulcers vary in their clinical presentation and severity and therefore pose a challenging problem to the treating surgeon regarding prediction of the clinical course and the end result of treatment. Clinical studies have shown that there are certain risk factors for the progression of foot ulcers in diabetics, and it may therefore be possible to predict the course of an ulcerated foot at presentation itself, thus instituting proper therapy without delay. In other words, if clinical scoring indicates that a particular ulcer has the highest chance of amputation, an early decision can be taken for the same, avoiding septic complications, inconvenience to the patient, long hospital stay and cost of treatment. AIM OF THE STUDY: To evaluate the above-mentioned scoring system in predicting the course of diabetic foot ulcers. MATERIALS AND METHODS: 50 patients with diabetic foot attending the OPD of the Department of Surgery of the Government Hospital attached to Calicut Medical College were included in the present study. After thorough history taking and clinical examination, six risk factors, namely age, pedal vessels, renal function, neuropathy, radiological findings and ulcers, were assessed by assigning scoring points to each of them. The total number of points scored by the patients at the time of admission or OPD treatment was correlated with the final outcome, whether amputation or conservative management. All the data were analysed using standard statistical methods. OBSERVATIONS AND RESULTS: There were 12 females and 38 males, a female to male ratio of 1:3.1. All were aged above 30 years. Twenty-four (48%) of them were between 30-60 years and twenty-six (52%) were above 60 years. 10 patients were treated conservatively with a risk score range of 10 to 35. Six had single toe loss with risk scores of 25 to 35. Six had multiple toe loss
1983-07-01
equal to the maximum value for this index is due to the dependence of this index upon the magnitude and sign of the factor loadings. Gorsuch (1974, p...Measurement, 1972, 9, 205-207. Gorsuch, R. L. Factor analysis. Philadelphia: W. B. Saunders Company, 1974. Guilford, J. P. A simple scoring weight for test
Maximum permissible voltage of YBCO coated conductors
Wen, J.; Lin, B.; Sheng, J.; Xu, J.; Jin, Z. [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Hong, Z., E-mail: zhiyong.hong@sjtu.edu.cn [Department of Electrical Engineering, Shanghai Jiao Tong University, Shanghai (China); Wang, D.; Zhou, H.; Shen, X.; Shen, C. [Qingpu Power Supply Company, State Grid Shanghai Municipal Electric Power Company, Shanghai (China)
2014-06-15
Highlights: • We examine the maximum permissible voltage of three kinds of tapes. • We examine the relationship between quenching duration and maximum permissible voltage. • Continuous Ic degradation occurs under repetitive quenching when tapes reach the maximum permissible voltage. • We examine the relationship between maximum permissible voltage and resistance and temperature. - Abstract: Superconducting fault current limiters (SFCL) can reduce short-circuit currents in electrical power systems. One of the most important steps in developing an SFCL is to find the maximum permissible voltage of each limiting element. The maximum permissible voltage is defined as the maximum voltage per unit length at which the YBCO coated conductors (CC) do not suffer critical current (Ic) degradation or burnout. In this research, the duration of the quenching process is varied and the voltage is raised until Ic degradation or burnout occurs. The YBCO coated conductors tested in the experiment are from American Superconductor (AMSC) and Shanghai Jiao Tong University (SJTU). As the quenching duration increases, the maximum permissible voltage of the CC decreases. When the quenching duration is 100 ms, the maximum permissible voltages of the SJTU CC, the 12 mm AMSC CC and the 4 mm AMSC CC are 0.72 V/cm, 0.52 V/cm and 1.2 V/cm, respectively. Based on the results for these samples, the total length of CC used in the design of an SFCL can be determined.
Score cards for standardized comparison of myocardial perfusion imaging reports
Jensen, Julie D; Hoff, Camilla; Bouchelouche, Kirsten
Background: When optimizing scan protocols or comparing modalities in myocardial perfusion imaging, it is necessary to compare the current method to the new method. This can be achieved by a comparison based on hard numbers such as MBF, summed rest and stress scores, total perfusion deficit etc. However, what is of importance to the patient is the total evaluation of these scores and the weight and confidence ascribed to each by the reporting physician. We suggest a standardized method summarizing the observations and the confidence of the physician in simple scores. We tested the developed score cards in a pilot project using a training scenario where 3 observers with varying experience (1 month, 5 months and 3 years, respectively) scored static rest/stress Rb-82 PET scans. Method: 10 patients with known ischemic heart disease were included. Using the 17-segment AHA cardiac model, each patient
Influence of maximum decking charge on intensity of blasting vibration
无
2006-01-01
Based on the character of short-time non-stationary random signals, the relationship between the maximum decking charge and the energy distribution of blasting vibration signals was investigated by means of the wavelet packet method. Firstly, the characteristics of the wavelet transform and wavelet packet analysis were described. Secondly, the blasting vibration signals were analyzed by wavelet packet using MATLAB, and the change of the energy distribution curve in different frequency bands was obtained. Finally, the law of the energy distribution of blasting vibration signals changing with the maximum decking charge was analyzed. The results show that with the increase of decking charge, the ratio of high-frequency energy to total energy decreases, the dominant frequency bands of blasting vibration signals tend towards low frequency, and blasting vibration does not depend on the maximum decking charge.
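The idea of splitting a signal's energy between frequency bands, as the wavelet packet analysis above does in MATLAB, can be sketched with a single-level Haar decomposition; the synthetic "vibration" signal is hypothetical, and a real analysis would use deeper wavelet packet trees.

```python
import numpy as np

def haar_split(x):
    """One level of an orthonormal Haar wavelet decomposition:
    approximation (low-frequency) and detail (high-frequency) parts."""
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

# Synthetic "vibration" signal: a 5 Hz tone plus a weaker 400 Hz
# component, sampled at 1024 Hz for one second (hypothetical values)
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
sig = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 400 * t)

a, d = haar_split(sig)
e_low, e_high = float(np.sum(a**2)), float(np.sum(d**2))

# Orthonormality preserves total energy; the high-band share is the
# kind of ratio the abstract tracks against the decking charge
ratio_high = e_high / (e_low + e_high)
print(round(ratio_high, 3))
```

Tracking how `ratio_high` changes across records with different decking charges is the essence of the analysis, with wavelet packets giving a much finer band decomposition than this one-level split.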
Generalised maximum entropy and heterogeneous technologies
Oude Lansink, A.G.J.M.
1999-01-01
Generalised maximum entropy methods are used to estimate a dual model of production on panel data of Dutch cash crop farms over the period 1970-1992. The generalised maximum entropy approach allows a coherent system of input demand and output supply equations to be estimated for each farm in the sample.
20 CFR 229.48 - Family maximum.
2010-04-01
... month on one person's earnings record is limited. This limited amount is called the family maximum. The family maximum used to adjust the social security overall minimum rate is based on the employee's Overall..., when any of the persons entitled to benefits on the insured individual's compensation would, except...
The maximum rotation of a galactic disc
Bottema, R
1997-01-01
The observed stellar velocity dispersions of galactic discs show that the maximum rotation of a disc is on average 63% of the observed maximum rotation. This criterion cannot, however, be applied to small or low surface brightness (LSB) galaxies because such systems show, in general, a continuously rising rotation curve.
Value of coronary artery calcium score to predict severity or complexity of coronary artery disease
Gökdeniz, Tayyar; Kalaycıoğlu, Ezgi; Aykan, Ahmet Çağrı; Boyacı, Faruk; Turan, Turhan; Gül, İlker; Çavuşoğlu, Gökhan; Dursun, İhsan
2014-01-01
Background: Prediction of the severity or complexity of coronary artery disease (CAD) is valuable owing to the increased risk for cardiovascular events. The association between total coronary artery calcium (CAC) score and the severity of CAD has been demonstrated previously, although the Gensini score was not used. There is no information about the association between total CAC score and the complexity of CAD. Objectives: To investigate the association between the severity or complexity of coronary artery disease (CAD), assessed by the Gensini score and SYNTAX score (SS) respectively, and the coronary artery calcium (CAC) score, a noninvasive method for CAD evaluation, in symptomatic patients with accompanying significant CAD. Methods: Two hundred and fourteen patients were enrolled. Total CAC score was obtained before angiography. The severity and complexity of CAD were assessed by the Gensini score and SS, respectively. Associations between clinical and angiographic parameters and total CAC score were analyzed. Results: Median total CAC score was 192 (23.0-729.8), and this was positively correlated with both Gensini score (r: 0.299, p 809 for SS >32 (high SS tertile). Conclusion: In symptomatic patients with accompanying significant CAD, total CAC score was independently associated with SS, and patients with SS >32 may be detected through a high Agatston score. PMID:24676367
Obstetrical disseminated intravascular coagulation score.
Kobayashi, Takao
2014-06-01
Obstetrical disseminated intravascular coagulation (DIC) is usually a very acute, serious complication of pregnancy. The obstetrical DIC score helps with making a prompt diagnosis and starting treatment early. This DIC score, in which higher scores are given for clinical parameters rather than for laboratory parameters, has three components: (i) the underlying diseases; (ii) the clinical symptoms; and (iii) the laboratory findings (coagulation tests). It is appropriate to initiate therapy for DIC when the obstetrical DIC score reaches 8 points or more, before obtaining the results of coagulation tests. Improvement of blood coagulation tests and clinical symptoms is essential to the efficacy evaluation of treatment after a diagnosis of obstetrical DIC. Therefore, efficacy evaluation criteria for obstetrical DIC are also defined to enable follow-up of the clinical efficacy of DIC therapy.
… with 2 being the best score for each component: Appearance (skin color), Pulse (heart rate), Grimace response (reflexes) …
Shower reconstruction in TUNKA-HiSCORE
Porelli, Andrea; Wischnewski, Ralf [DESY-Zeuthen, Platanenallee 6, 15738 Zeuthen (Germany)
2015-07-01
The Tunka-HiSCORE detector is a non-imaging wide-angle EAS Cherenkov array designed as an alternative technology for gamma-ray physics above 10 TeV and to study the spectrum and composition of cosmic rays above 100 TeV. An engineering array with nine stations (HiS-9) was deployed in October 2013 on the site of the Tunka experiment in Russia. In November 2014, 20 more HiSCORE stations were installed, covering a total array area of 0.24 square km. We describe the detector setup and the role of precision time measurement, and give results from the innovative WhiteRabbit time synchronization technology. Results of air shower reconstruction are presented and compared with MC simulations for both the HiS-9 and HiS-29 detector arrays.
From Rasch scores to regression
Christensen, Karl Bang
2006-01-01
Rasch models provide a framework for measurement and for modelling latent variables. Having measured a latent variable in a population, a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties. … This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables, and latent regression models based on the distribution of the score.
Commercial Building Energy Asset Score
2017-05-26
This software (Asset Scoring Tool) is designed to help building owners and managers gain insight into the as-built efficiency of their buildings. It is a web tool where users can enter their building information and obtain an asset score report. The asset score report consists of modeled building energy use (by end use and by fuel type), building systems (envelope, lighting, heating, cooling, service hot water) evaluations, and recommended energy efficiency measures. The intended users are building owners and operators who have limited knowledge of building energy efficiency. The scoring tool collects minimal building data (~20 data entries) from users and builds a full-scale energy model using the inference functionalities of the Facility Energy Decision System (FEDS). The scoring tool runs real-time building energy simulation using EnergyPlus and performs life-cycle cost analysis using FEDS. An API is also under development to allow third-party applications to exchange data with the scoring tool's web service.
How long do centenarians survive? Life expectancy and maximum lifespan.
Modig, K; Andersson, T; Vaupel, J; Rau, R; Ahlbom, A
2017-08-01
The purpose of this study was to explore the pattern of mortality above the age of 100 years. In particular, we aimed to examine whether Scandinavian data support the theory that mortality reaches a plateau at particularly old ages. Whether the maximum length of life increases with time was also investigated. The analyses were based on individual-level data on all Swedish and Danish centenarians born from 1870 to 1901; in total, 3006 men and 10 963 women were included. Birth cohort-specific probabilities of dying were calculated. Exact ages were used for calculations of maximum length of life. Whether maximum age changed over time was analysed taking into account increases in cohort size. The results confirm that there has not been any improvement in mortality amongst centenarians in the past 30 years and that the current rise in life expectancy is driven by reductions in mortality below the age of 100 years. The death risks seem to reach a plateau of around 50% at age 103 years for men and 107 years for women. Despite the rising life expectancy, the maximum age does not appear to increase, in particular after accounting for the increasing number of individuals of advanced age. Mortality amongst centenarians is not changing despite improvements at younger ages. An extension of the maximum lifespan and a sizeable extension of life expectancy both require reductions in mortality above the age of 100 years. © 2017 The Association for the Publication of the Journal of Internal Medicine.
40 CFR 142.63 - Variances and exemptions from the maximum contaminant level for total coliforms.
2010-07-01
... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER... pathogenic contamination, a treatment lapse or deficiency, or a problem in the operation or maintenance of...
40 CFR 142.60 - Variances from the maximum contaminant level for total trihalomethanes.
2010-07-01
... chloramines, chlorine dioxide or potassium permanganate. (5) Use of powdered activated carbon for THM precursor or TTHM reduction seasonally or intermittently at dosages not to exceed 10 mg/L on an...
Schuster, Tibor; Pang, Menglan; Platt, Robert W
2015-09-01
The high-dimensional propensity score algorithm attempts to improve control of confounding in typical treatment effect studies in pharmacoepidemiology and is increasingly being used for the analysis of large administrative databases. Within this multi-step variable selection algorithm, the marginal prevalence of non-zero covariate values is considered to be an indicator for a count variable's potential confounding impact. We investigate the role of the marginal prevalence of confounder variables on potentially caused bias magnitudes when estimating risk ratios in point exposure studies with binary outcomes. We apply the law of total probability in conjunction with an established bias formula to derive and illustrate relative bias boundaries with respect to marginal confounder prevalence. We show that maximum possible bias magnitudes can occur at any marginal prevalence level of a binary confounder variable. In particular, we demonstrate that, in case of rare or very common exposures, low and high prevalent confounder variables can still have large confounding impact on estimated risk ratios. Covariate pre-selection by prevalence may lead to sub-optimal confounder sampling within the high-dimensional propensity score algorithm. While we believe that the high-dimensional propensity score has important benefits in large-scale pharmacoepidemiologic studies, we recommend omitting the prevalence-based empirical identification of candidate covariates. Copyright © 2015 John Wiley & Sons, Ltd.
Huber-Wagner S
2010-05-01
Full Text Available Abstract Background There are several well-established scores for the assessment of the prognosis of major trauma patients, all of which have in common that they can be calculated at the earliest during the intensive care unit stay. We intended to develop a sequential trauma score (STS) that allows prognosis at several early stages based on the information available at a particular time. Study design In a retrospective, multicenter study using data derived from the Trauma Registry of the German Trauma Society (2002-2006), we identified the most relevant prognostic factors from the patient's basic data (P), prehospital phase (A), early (B1), and late (B2) trauma room phase. Univariate and logistic regression models as well as score quality criteria and the explanatory power were calculated. Results A total of 2,354 patients with complete data were identified. From the patient's basic data (P), logistic regression showed that age was a significant predictor of survival (AUC, area under the curve, of model P = 0.63). Logistic regression of the prehospital data (A) showed that blood pressure, pulse rate, Glasgow coma scale (GCS), and anisocoria were significant predictors (AUC model A = 0.76; AUC model P + A = 0.82). Logistic regression of the early trauma room phase (B1) showed peripheral oxygen saturation, GCS, anisocoria, base excess, and thromboplastin time to be significant predictors of survival (AUC model B1 = 0.78; AUC model P + A + B1 = 0.85). Multivariate analysis of the late trauma room phase (B2) detected cardiac massage, abbreviated injury score (AIS) of the head ≥ 3, the maximum AIS, and the need for transfusion or massive blood transfusion to be the most important predictors (AUC model B2 = 0.84; AUC final model P + A + B1 + B2 = 0.90). The explanatory power - a tool for the assessment of the relative impact of each segment on mortality - is 25% for P, 7% for A, 17% for B1 and 51% for B2. A spreadsheet for the easy calculation of the sequential trauma score …
Sørensen, Johan Kløvgaard; Jæger, Pia; Dahl, Jørgen Berg
2016-01-01
BACKGROUND: Using a peripheral nerve block after total knee arthroplasty (TKA) without impeding mobility is challenging. We hypothesized that the analgesic effect of adductor canal block (ACB) could increase the maximum voluntary isometric contraction (MVIC) of the quadriceps femoris muscle after … expressed as a percentage of postoperative preblock values. In this manner, the effect of the ACB could be isolated from the detrimental effect on muscle strength caused by the surgery. Secondary end points were differences between groups in mobility and pain scores. We planned a subgroup analysis dividing … ACB improves quadriceps femoris muscle strength, but whether this translates into enhanced mobility is not clearly supported by this study.
Rhee CK
2015-08-01
more symptomatic. We aimed to identify the ideal CAT score that exhibits minimal discrepancy with the mMRC score. Methods: A receiver operating characteristic curve of the CAT score was generated for mMRC scores of 1 and 2. A concordance analysis was applied to quantify the association between the frequencies of patients categorized into GOLD groups A–D using symptom cutoff points, and a κ-coefficient was calculated. Results: For an mMRC score of 2, a CAT score of 15 showed the maximum value of Youden's index, with a sensitivity and specificity of 0.70 and 0.66, respectively (area under the receiver operating characteristic curve [AUC] 0.74; 95% confidence interval [CI], 0.70–0.77). For an mMRC score of 1, a CAT score of 10 showed the maximum value of Youden's index, with a sensitivity and specificity of 0.77 and 0.65, respectively (AUC 0.77; 95% CI, 0.72–0.83). The κ value for concordance was highest between an mMRC score of 1 and a CAT score of 10 (0.66), followed by an mMRC score of 2 and a CAT score of 15 (0.56), an mMRC score of 2 and a CAT score of 10 (0.47), and an mMRC score of 1 and a CAT score of 15 (0.43). Conclusion: A CAT score of 10 was most concordant with an mMRC score of 1 when classifying patients with COPD into GOLD groups A–D. However, a discrepancy remains between the CAT and mMRC scoring systems. Keywords: COPD, CAT, mMRC, concordance, discrepancy
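The cutoff selection described in this abstract — choosing the CAT score that maximizes Youden's index (sensitivity + specificity − 1) against an mMRC reference — can be sketched in pure Python. The function name and the toy data below are illustrative assumptions, not the study's.

```python
# Illustrative sketch: pick the score cutoff maximizing Youden's J
# (sensitivity + specificity - 1). Data are synthetic, not the study's.

def youden_cutoff(scores, labels):
    """Return (best_cutoff, best_j) for the rule 'score >= cutoff -> positive'."""
    best_cutoff, best_j = None, -1.0
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_cutoff, best_j = c, j
    return best_cutoff, best_j

# Toy data: label 1 stands for mMRC >= 1; the classes here are separable,
# so J reaches 1.0 at the cutoff 12.
cat = [5, 8, 9, 12, 14, 18, 22, 25, 7, 11, 16, 20]
mmrc1 = [0, 0, 0, 1, 1, 1, 1, 1, 0, 0, 1, 1]
cutoff, j = youden_cutoff(cat, mmrc1)
```

On real data the two classes overlap and J peaks strictly below 1, which is what produces the sensitivity/specificity trade-off reported above.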
The Application of Maximum Principle in Supply Chain Cost Optimization
Zhou Ling
2013-09-01
Full Text Available In this paper, using the maximum principle to analyze dynamic cost, we propose a new two-stage supply chain model of the manufacturing-assembly mode for high-tech perishable products and obtain the optimal conditions and results. On this basis, we further investigate the effect of the location of the CODP (customer order decoupling point) on the total cost, and the relation between the CODP, inventory policy, and demand type, through data simulation. The simulation results show that the CODP is located downstream in the product life cycle and is a linear function of the product life cycle. The results indicate that the demand forecast is the main factor influencing the total cost, and that the mode of production according to the demand forecast is the deciding factor of the total cost. The model can also reflect the relation between the total cost of the two-stage supply chain, inventory, and demand.
A dual method for maximum entropy restoration
Smith, C. B.
1979-01-01
A simple iterative dual algorithm for maximum entropy image restoration is presented. The dual algorithm involves fewer parameters than conventional minimization in the image space. Minicomputer test results for Fourier synthesis with inadequate phantom data are given.
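As a hedged illustration of the dual idea (an assumed toy form, not the paper's restoration algorithm): for a maximum-entropy distribution on a finite set subject to a single mean constraint, the solution takes the form p_i ∝ exp(λx_i), so the dual problem collapses to a one-dimensional search for the multiplier λ — far fewer parameters than minimizing in the image space.

```python
import math

# Sketch of a dual approach to maximum entropy (assumed toy form): on
# states x with a mean constraint E[x] = m, the maxent solution is
# p_i ∝ exp(lam * x_i); we find the single dual variable lam by
# bisection on the moment condition, which is monotone in lam.

def maxent_mean(xs, m, lo=-50.0, hi=50.0, iters=200):
    def mean_for(lam):
        w = [math.exp(lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < m:  # mean too small -> need larger lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# When m equals the unconstrained mean, the constraint is inactive
# (lam -> 0) and maxent returns the uniform distribution.
p = maxent_mean([1, 2, 3, 4], m=2.5)
```

The same one-multiplier-per-constraint structure is what makes dual methods attractive for image restoration, where the primal has one variable per pixel.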
Maximum Throughput in Multiple-Antenna Systems
Zamani, Mahdi
2012-01-01
The point-to-point multiple-antenna channel is investigated in an uncorrelated block-fading environment with Rayleigh distribution. The maximum throughput and maximum expected-rate of this channel are derived under the assumption that the transmitter is oblivious to the channel state information (CSI), while the receiver has perfect CSI. First, we prove that in multiple-input single-output (MISO) channels, the optimum transmission strategy maximizing the throughput is to use all available antennas and perform equal power allocation with uncorrelated signals. Furthermore, to increase the expected-rate, multi-layer coding is applied. Analogously, we establish that sending uncorrelated signals and performing equal power allocation across all available antennas at each layer is optimum. A closed-form expression for the maximum continuous-layer expected-rate of MISO channels is also obtained. Moreover, we investigate multiple-input multiple-output (MIMO) channels, and formulate the maximum throughput in the asympt...
Photoemission spectromicroscopy with MAXIMUM at Wisconsin
Ng, W.; Ray-Chaudhuri, A.K.; Cole, R.K.; Wallace, J.; Crossley, S.; Crossley, D.; Chen, G.; Green, M.; Guo, J.; Hansen, R.W.C.; Cerrina, F.; Margaritondo, G. (Dept. of Electrical Engineering, Dept. of Physics and Synchrotron Radiation Center, Univ. of Wisconsin, Madison (USA)); Underwood, J.H.; Korthright, J.; Perera, R.C.C. (Center for X-ray Optics, Accelerator and Fusion Research Div., Lawrence Berkeley Lab., CA (USA))
1990-06-01
We describe the development of the scanning photoemission spectromicroscope MAXIMUM at the Wisconsin Synchrotron Radiation Center, which uses radiation from a 30-period undulator. The article includes a discussion of the first tests after the initial commissioning. (orig.).
Maximum-likelihood method in quantum estimation
Paris, M G A; Sacchi, M F
2001-01-01
The maximum-likelihood method for quantum estimation is reviewed and applied to the reconstruction of the density matrix of spin and radiation, as well as to the determination of several parameters of interest in quantum optics.
The maximum entropy technique. System's statistical description
Belashev, B Z
2002-01-01
The maximum entropy technique (MENT) is applied to search for the distribution functions of physical values. MENT naturally takes into account the demand of maximum entropy, the characteristics of the system, and the connection conditions. It can be applied to the statistical description of closed and open systems. Examples are considered in which MENT was used to describe equilibrium and nonequilibrium states, as well as states far from thermodynamic equilibrium.
19 CFR 114.23 - Maximum period.
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Maximum period. 114.23 Section 114.23 Customs... CARNETS Processing of Carnets § 114.23 Maximum period. (a) A.T.A. carnet. No A.T.A. carnet with a period of validity exceeding 1 year from date of issue shall be accepted. This period of validity cannot be...
Maximum-Likelihood Detection Of Noncoherent CPM
Divsalar, Dariush; Simon, Marvin K.
1993-01-01
Simplified detectors are proposed for use in maximum-likelihood sequence detection of symbols in an alphabet of size M transmitted by uncoded, full-response continuous phase modulation (CPM) over a radio channel with additive white Gaussian noise. The structures of the receivers are derived from a particular interpretation of the maximum-likelihood metrics. The receivers include front ends, whose structure depends only on M, analogous to those in receivers of coherent CPM. The parts of the receivers following the front ends have structures whose complexity depends on N.
SEXUAL DIMORPHISM OF MAXIMUM FEMORAL LENGTH
Pandya A M
2011-04-01
Full Text Available Sexual identification from skeletal parts has medicolegal and anthropological importance. The present study aims to obtain values of maximum femoral length and to evaluate its possible usefulness in determining correct sexual identification. The study sample consisted of 184 dry, normal, adult, human femora (136 male & 48 female) from the skeletal collections of the Anatomy department, M. P. Shah Medical College, Jamnagar, Gujarat. Maximum length of the femur was taken as the maximum vertical distance between the upper end of the head of the femur and the lowest point on the femoral condyle, measured with an osteometric board. Mean values obtained were 451.81 and 417.48 for right male and female, and 453.35 and 420.44 for left male and female respectively. The higher value in males was statistically highly significant (P < 0.001) on both sides. Demarking point (D.P.) analysis of the data showed that right femora with maximum length more than 476.70 were definitely male and less than 379.99 were definitely female; while for left bones, femora with maximum length more than 484.49 were definitely male and less than 385.73 were definitely female. Maximum length identified 13.43% of right male femora, 4.35% of right female femora, 7.25% of left male femora and 8% of left female femora. [National J of Med Res 2011; 1(2): 67-70]
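The demarking-point rule reported above for right femora (definitely male above 476.70, definitely female below 379.99, otherwise indeterminate from length alone) amounts to a three-way threshold classifier. A minimal sketch using the study's reported right-side cutoffs — the function and constant names are our own, and lengths are assumed to be in millimetres:

```python
# Three-way threshold classifier from the reported demarking points for
# right femora (lengths assumed in mm): > 476.70 -> male,
# < 379.99 -> female, otherwise sex cannot be determined from
# maximum length alone.

RIGHT_MALE_DP = 476.70
RIGHT_FEMALE_DP = 379.99

def classify_right_femur(length_mm):
    if length_mm > RIGHT_MALE_DP:
        return "male"
    if length_mm < RIGHT_FEMALE_DP:
        return "female"
    return "indeterminate"
```

Note how wide the indeterminate band is: the reported sample means (451.81 male, 417.48 female, right side) both fall inside it, consistent with the low identification percentages the abstract reports.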
Skin scoring in systemic sclerosis
Zachariae, Hugh; Bjerring, Peter; Halkier-Sørensen, Lars
1994-01-01
Forty-one patients with systemic sclerosis were investigated with a new and simple skin score method measuring the degree of thickening and pliability in seven regions together with area involvement in each region. The highest values were, as expected, found in diffuse cutaneous systemic sclerosis (type III SS) and the lowest in limited cutaneous systemic sclerosis (type I SS) with no lesions extending above the wrists and ankles. A positive correlation was found to the aminoterminal propeptide of type III procollagen, a serological marker for synthesis of type III collagen. The skin score …
Comparison of parent adolescent scores on Strengths and Difficulties Questionnaire
Arman, Soroor; Amel, Afsaneh Karbasi; Maracy, Mohamad Reza
2013-01-01
Background: Child and adolescent psychiatry has benefited from the application of self-report questionnaires because they are short, less costly, and easy to apply. We therefore selected the Strengths and Difficulties Questionnaire (SDQ) and evaluated the agreement between the self-report and parent-report forms. Materials and Methods: Subjects were 1934 adolescents, 11-18 years old. After obtaining the subjects' consent, the SDQ parent-rated and self-rated forms were filled in. The collected data were analyzed using the STATA statistical package, version 9. Results: The adolescents obtained higher total difficulty scores than their parents, but the difference was not significant (P = 0.203). Boys had higher total difficulty scores than girls by parent report (P = 0.001), but by self-report girls had higher total difficulty scores than boys (P = 0.42). Adolescents aged 11-14 years had higher total difficulty scores by parent report than self-report (P = 0.42), while those aged 15-18 years had higher total difficulty scores by self-report than parent report (P = 0.36). Conclusion: SDQ self-ratings from adolescents may contribute better to the diagnostic process in the clinical setting. PMID:24250700
Developing Scoring Algorithms (Earlier Methods)
We developed scoring procedures to convert screener responses to estimates of individual dietary intake for fruits and vegetables, dairy, added sugars, whole grains, fiber, and calcium using the What We Eat in America 24-hour dietary recall data from the 2003-2006 NHANES.
The maximum rotation of a galactic disc
Bottema, R
1997-01-01
The observed stellar velocity dispersions of galactic discs show that the maximum rotation of a disc is on average 63% of the observed maximum rotation. This criterion can, however, not be applied to small or low surface brightness (LSB) galaxies because such systems show, in general, a continuously rising rotation curve until the outermost measured radial position. That is why a general relation has been derived, giving the maximum rotation for a disc depending on the luminosity, surface brightness, and colour of the disc. As a physical basis of this relation serves an adopted fixed mass-to-light ratio as a function of colour. That functionality is consistent with results from population synthesis models and its absolute value is determined from the observed stellar velocity dispersions. The derived maximum disc rotation is compared with a number of observed maximum rotations, clearly demonstrating the need for appreciable amounts of dark matter in the disc region and even more so for LSB galaxies. Matters h...
Maximum permissible voltage of YBCO coated conductors
Wen, J.; Lin, B.; Sheng, J.; Xu, J.; Jin, Z.; Hong, Z.; Wang, D.; Zhou, H.; Shen, X.; Shen, C.
2014-06-01
Superconducting fault current limiters (SFCL) can reduce short-circuit currents in an electrical power system. One of the most important things in developing an SFCL is to find the maximum permissible voltage of each limiting element. The maximum permissible voltage is defined as the maximum voltage per unit length at which the YBCO coated conductors (CC) do not suffer critical current (Ic) degradation or burnout. In this research, the duration of the quenching process is varied and the voltage is raised until Ic degradation or burnout happens. The YBCO coated conductors tested in the experiment are from American Superconductor (AMSC) and Shanghai Jiao Tong University (SJTU). As the quenching duration increases, the maximum permissible voltage of the CC decreases. When the quenching duration is 100 ms, the maximum permissible voltages of the SJTU CC, 12 mm AMSC CC and 4 mm AMSC CC are 0.72 V/cm, 0.52 V/cm and 1.2 V/cm respectively. Based on the results for these samples, the total length of CC used in the design of an SFCL can be determined.
Computing Rooted and Unrooted Maximum Consistent Supertrees
van Iersel, Leo
2009-01-01
A chief problem in phylogenetics and database theory is the computation of a maximum consistent tree from a set of rooted or unrooted trees. Standard inputs are triplets, rooted binary trees on three leaves, or quartets, unrooted binary trees on four leaves. We give exact algorithms constructing rooted and unrooted maximum consistent supertrees in time O(2^n n^5 m^2 log(m)) for a set of m triplets (quartets), each one distinctly leaf-labeled by some subset of n labels. The algorithms extend to weighted triplets (quartets). We further present fast exact algorithms for constructing rooted and unrooted maximum consistent trees in polynomial space. Finally, for a set T of m rooted or unrooted trees with maximum degree D and distinctly leaf-labeled by some subset of a set L of n labels, we compute, in O(2^{mD} n^m m^5 n^6 log(m)) time, a tree distinctly leaf-labeled by a maximum-size subset X of L such that all trees in T, when restricted to X, are consistent with it.
Gasselseder, Hans-Peter
2014-01-01
This study explores immersive presence as well as emotional valence and arousal in the context of dynamic and non-dynamic music scores in the 3rd-person action-adventure video game genre, while also considering relevant personality traits of the player. 60 subjects answered self-report questionnaires … that a compatible integration of global and local goals in the ludonarrative contributes to a motivational-emotional reinforcement that can be gained through musical feedback. Shedding light on the implications of music dramaturgy within a semantic ecology paradigm, the perception of varying relational attributes …
Primary total elbow arthroplasty
Suresh Kumar
2013-01-01
Full Text Available Background: Primary total elbow arthroplasty (TEA) is a challenging procedure for orthopedic surgeons. It is not performed as frequently as hip or knee arthroplasty. The elbow is a nonweight-bearing joint; however, static loading can create forces up to three times the body weight and dynamic loading up to six times. For elderly patients with deformity and ankylosis of the elbow due to posttraumatic arthritis, rheumatoid arthritis, or comminuted fracture of the distal humerus, arthroplasty is one of the options. The aim of this study is to analyze the role of primary total elbow arthroplasty in cases of crippling deformity of the elbow. Materials and Methods: We analyzed 11 cases of TEA between December 2002 and September 2012. There were 8 females and 3 males. The average age was 40 years (range 30-69 years). The indications for TEA were rheumatoid arthritis, comminuted fracture of the distal humerus with intraarticular extension, and posttraumatic bony ankylosis of the elbow joint. The Baksi sloppy (semi-constrained) hinge elbow prosthesis was used. Clinico-radiological followup was done at 1 month, 3 months, 6 months, 1 year, and then on a yearly basis. Results: In the present study, average supination was 70° (range 60-80°) and average pronation was 70° (range 60-80°). Average flexion was 135° (range 130-135°). However, in 5 of the 11 cases (45%) there was a loss of 15 to 35° (average 25°) of extension. The mean Mayo elbow performance score was 95.4 points (range 70-100). Arm length discrepancy was present in only four patients (36% of the 11 cases). Clinico-radiologically all the elbows were stable except in one case, and no immediate postoperative complication was noted. Radiolucency or loosening of the ulnar stem was seen in 2 cases (18% of the 11 cases); in 1 case it was noted after 5 years and in another after 10 years. In the second case, revision arthroplasty was done, in which only the ulnar hinge section, hinge screw and lock screw with hexagonal head …
Maximum Multiflow in Wireless Network Coding
Zhou, Jin-Yi; Jiang, Yong; Zheng, Hai-Tao
2012-01-01
In a multihop wireless network, wireless interference is crucial to the maximum multiflow (MMF) problem, which studies the maximum throughput between multiple pairs of sources and sinks. In this paper, we observe that network coding could help to decrease the impacts of wireless interference, and propose a framework to study the MMF problem for multihop wireless networks with network coding. Firstly, a network model is set up to describe the new conflict relations modified by network coding. Then, we formulate a linear programming problem to compute the maximum throughput and show its superiority over one in networks without coding. Finally, the MMF problem in wireless network coding is shown to be NP-hard and a polynomial approximation algorithm is proposed.
Restricted total stability and total attractivity
Giuseppe Zappala'
2006-08-01
Full Text Available In this paper the new concepts of restricted total stability and total attractivity are formulated. For this purpose the classical theory of Malkin, with suitable changes, and the theory of limiting equations, introduced by Sell and developed by Artstein and Andreev, are used. Significant examples are presented.
The Wiener maximum quadratic assignment problem
Cela, Eranda; Woeginger, Gerhard J
2011-01-01
We investigate a special case of the maximum quadratic assignment problem where one matrix is a product matrix and the other matrix is the distance matrix of a one-dimensional point set. We show that this special case, which we call the Wiener maximum quadratic assignment problem, is NP-hard in the ordinary sense and solvable in pseudo-polynomial time. Our approach also yields a polynomial time solution for the following problem from chemical graph theory: Find a tree that maximizes the Wiener index among all trees with a prescribed degree sequence. This settles an open problem from the literature.
Maximum confidence measurements via probabilistic quantum cloning
Zhang Wen-Hai; Yu Long-Bao; Cao Zhuo-Liang; Ye Liu
2013-01-01
Probabilistic quantum cloning (PQC) cannot copy a set of linearly dependent quantum states. In this paper, we show that if incorrect copies are allowed to be produced, linearly dependent quantum states may also be cloned by the PQC. By exploiting this kind of PQC to clone a special set of three linearly dependent quantum states, we derive the upper bound of the maximum confidence measure of a set. An explicit transformation of the maximum confidence measure is presented.
Maximum floodflows in the conterminous United States
Crippen, John R.; Bue, Conrad D.
1977-01-01
Peak floodflows from thousands of observation sites within the conterminous United States were studied to provide a guide for estimating potential maximum floodflows. Data were selected from 883 sites with drainage areas of less than 10,000 square miles (25,900 square kilometers) and were grouped into regional sets. Outstanding floods for each region were plotted on graphs, and envelope curves were computed that offer reasonable limits for estimates of maximum floods. The curves indicate that floods may occur that are two to three times greater than those known for most streams.
Revealing the Maximum Strength in Nanotwinned Copper
Lu, L.; Chen, X.; Huang, Xiaoxu
2009-01-01
The strength of polycrystalline materials increases with decreasing grain size. Below a critical size, smaller grains might lead to softening, as suggested by atomistic simulations. The strongest size should arise at a transition in deformation mechanism from lattice dislocation activities to grain boundary–related processes. We investigated the maximum strength of nanotwinned copper samples with different twin thicknesses. We found that the strength increases with decreasing twin thickness, reaching a maximum at 15 nanometers, followed by a softening at smaller values that is accompanied by enhanced …
Maximum entropy analysis of EGRET data
Pohl, M.; Strong, A.W.
1997-01-01
EGRET data are usually analysed on the basis of the Maximum-Likelihood method \cite{ma96} in a search for point sources in excess of a model for the background radiation (e.g. \cite{hu97}). This method depends strongly on the quality of the background model, and thus may have high systematic uncertainties in regions of strong and uncertain background like the Galactic Center region. Here we show images of such regions obtained by the quantified Maximum-Entropy method. We also discuss a possible further use of MEM in the analysis of problematic regions of the sky.
Maximum phytoplankton concentrations in the sea
Jackson, G.A.; Kiørboe, Thomas
2008-01-01
A simplification of plankton dynamics using coagulation theory provides predictions of the maximum algal concentration sustainable in aquatic systems. These predictions have previously been tested successfully against results from iron fertilization experiments. We extend the test to data collected in the North Atlantic as part of the Bermuda Atlantic Time Series program as well as data collected off Southern California as part of the Southern California Bight Study program. The observed maximum particulate organic carbon and volumetric particle concentrations are consistent with the predictions …
Maximum Likelihood Dynamic Factor Modeling for Arbitrary "N" and "T" Using SEM
Voelkle, Manuel C.; Oud, Johan H. L.; von Oertzen, Timo; Lindenberger, Ulman
2012-01-01
This article has 3 objectives that build on each other. First, we demonstrate how to obtain maximum likelihood estimates for dynamic factor models (the direct autoregressive factor score model) with arbitrary "T" and "N" by means of structural equation modeling (SEM) and compare the approach to existing methods. Second, we go beyond standard time…
CytoMCS: A Multiple Maximum Common Subgraph Detection Tool for Cytoscape
Larsen, Simon; Baumbach, Jan
2017-01-01
To support such analyses we have developed CytoMCS, a Cytoscape app for computing inexact solutions to the maximum common edge subgraph problem for two or more graphs. Our algorithm uses an iterative local search heuristic for computing conserved subgraphs, optimizing a squared edge conservation score that is able …
Carla Franchi-Pinto
1999-03-01
Full Text Available Intraclass correlation coefficients for one- and five-min Apgar scores of 604 twin pairs born at a southeastern Brazilian hospital were calculated, after adjusting these scores for gestational age and sex. The data support a genetic hypothesis only for the 1-min Apgar score, probably because it is less affected by the environment than 4 min later, after the newborns have been under the care of a neonatology team. First-born twins exhibited, on average, better clinical conditions than second-born twins: the former showed a significantly lower proportion of Apgar scores under seven, both at 1 min (17.5% vs. 29.8%) and at 5 min (7.2% vs. 11.9%). The proportion of children born with "good" Apgar scores was significantly smaller among twins than among 1,522 singletons born at the same hospital; among the latter, 1- and 5-min Apgar scores under seven were exhibited by 9.2% and 3.4% of newborns, respectively.
Yihui Li
Full Text Available BACKGROUND: The CHADS2/CHA2DS2-VASc scores are used to predict thrombo-embolic/stroke risk in patients with nonvalvular atrial fibrillation (AF). Nevertheless, limited data are available regarding the association between these risk stratification scores for stroke and the left atrial (LA) remodeling status of AF patients. The purpose of this study was to explore the association between these scores and LA remodeling status assessed quantitatively by echocardiography in AF patients. METHODS: One hundred AF patients were divided into 3 groups based on the CHA2DS2-VASc/CHADS2 score: a score of 0 (low stroke risk), a score of 1 (moderate stroke risk) and a score of ≥2 (high stroke risk). All patients underwent conventional and velocity vector imaging (VVI) echocardiography. Echocardiographic parameters, maximum LA volume index (LAVImax), LA total emptying fraction (LAEFt) and LA mean strain, were obtained to assess LA remodeling status quantitatively. RESULTS: On categorizing with CHA2DS2-VASc, the score-of-1 group showed an increase in LAVImax and attenuation in LA mean strain derived from VVI, compared with the score-of-0 group (LAVImax: 40.27±21.91 vs. 26.79±7.87, p=0.002; LA mean strain: 15.18±6.36 vs. 22±8.54, p=0.001). On categorizing with the CHADS2 score, similar trends were seen between the score-of-≥2 and score-of-1 groups (LAVImax: 43.72±13.77 vs. 31.41±9.50, p<0.001; LA mean strain: 11.01±5.31 vs. 18.63±7.00, p<0.001). With multivariate logistic regression, LAVImax (odds ratio: 0.92, 95% CI: 0.85 to 0.98, p=0.01) and LA mean strain reflecting LA remodeling (odds ratio: 1.10, 95% CI: 1.02 to 1.19, p=0.01) were strongly predictive of a CHA2DS2-VASc score of 0. CONCLUSIONS: The superiority of the CHADS2 score may lie in identifying LA remodeling in AF patients with high stroke risk, whereas the CHA2DS2-VASc score was better than the CHADS2 score at identifying LA remodeling in AF patients with low stroke risk.
Marginal Maximum Likelihood Estimation of Item Response Models in R
Matthew S. Johnson
2007-02-01
Full Text Available Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.
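As a small illustration of IRT likelihood machinery (a sketch only; it uses the simpler dichotomous Rasch model rather than the paper's generalized partial credit model, and estimates an ability rather than item parameters):

```python
import math

def rasch_mle_theta(responses, difficulties, iters=50):
    """MLE of ability theta in the Rasch model, where
    P(correct | theta, b_i) = 1 / (1 + exp(-(theta - b_i))),
    found by Newton's method on the log-likelihood."""
    theta = 0.0
    for _ in range(iters):
        p = [1 / (1 + math.exp(-(theta - b))) for b in difficulties]
        grad = sum(x - pi for x, pi in zip(responses, p))  # score function
        hess = -sum(pi * (1 - pi) for pi in p)             # second derivative
        if abs(grad) < 1e-10:
            break
        theta -= grad / hess
    return theta

# Three items of difficulty -1, 0, 1; respondent answers 2 of 3 correctly
theta_hat = rasch_mle_theta([1, 1, 0], [-1.0, 0.0, 1.0])
print(round(theta_hat, 3))
```

Note that all-correct or all-incorrect response patterns have no finite MLE, which is one reason marginal (rather than joint) maximum likelihood is preferred in practice.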
Temporomandibular joint reconstruction with total alloplastic joint replacement.
Jones, R H B
2011-03-01
This preliminary paper presents the early findings of an ongoing prospective trial on the use of the TMJ Concepts and Biomet Lorenz total joint replacement systems for the reconstruction of the temporomandibular joint (TMJ). Total alloplastic replacement of the TMJ has become a viable option for many people who suffer from TMJ disease where surgical reconstruction is indicated. Degenerative joint diseases such as osteoarthritis, rheumatoid arthritis, psoriatic arthritis, TMJ ankylosis, malunited condylar fractures and tumours can be successfully treated using this technique. There are a number of TMJ prostheses available. Two of the joint replacement products that have been found to be most reliable and have FDA approval in the United States are the TMJ Concepts system and the Biomet Lorenz system, and for this reason they are being investigated in this study. This study presents the findings of seven patients with a total of 12 joint replacements using either the TMJ Concepts system or the Biomet Lorenz joint system. Two patients (3 joints) had the TMJ Concepts system and five patients (9 joints) had the Biomet Lorenz system. Although it is still early, the results were generally pleasing, with the longest replacement having been in position for three years and the most recent six months. The average postoperative mouth opening was 29.7 mm (range 25-35 mm) with an average pain score of 1.7 (range 0-3, on a scale with a minimum score of 0 and maximum of 10). Complications were minimal and related to sensory disturbance of the lip in one patient and joint dislocation in two patients.
Maximum-likelihood fits to histograms for improved parameter estimation
Fowler, Joseph W
2013-01-01
Straightforward methods for adapting the familiar chi^2 statistic to histograms of discrete events and other Poisson distributed data generally yield biased estimates of the parameters of a model. The bias can be important even when the total number of events is large. For the case of estimating a microcalorimeter's energy resolution at 6 keV from the observed shape of the Mn K-alpha fluorescence spectrum, a poor choice of chi^2 can lead to biases of at least 10% in the estimated resolution when up to thousands of photons are observed. The best remedy is a Poisson maximum-likelihood fit, through a simple modification of the standard Levenberg-Marquardt algorithm for chi^2 minimization. Where the modification is not possible, another approach allows iterative approximation of the maximum-likelihood fit.
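The bias described above is visible even in the simplest case: fitting a constant rate μ to Poisson counts. Minimizing Neyman's χ² (weights 1/dᵢ) yields the harmonic mean of the counts, which is biased low, while the Poisson maximum-likelihood estimate is the arithmetic mean. A minimal sketch (assumes all counts are nonzero so the χ² weights are defined):

```python
def fit_constant_rate(counts):
    """Fit a constant rate mu to Poisson counts d_i.
    ML estimate: arithmetic mean (stationary point of the Poisson log-likelihood).
    Neyman chi^2 estimate: minimizes sum((d_i - mu)^2 / d_i), which gives the
    harmonic mean of the counts -- systematically biased low."""
    n = len(counts)
    mu_ml = sum(counts) / n
    mu_chi2 = n / sum(1.0 / d for d in counts)
    return mu_ml, mu_chi2

ml, chi2 = fit_constant_rate([1, 2, 4, 8])
print(ml, chi2)  # ML: 3.75; the chi^2 estimate is smaller
```

The gap between the two estimators shrinks as counts grow, matching the abstract's point that the bias matters most when bins hold few events.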
Composite MRI scores improve correlation with EDSS in multiple sclerosis.
Poonawalla, A H; Datta, S; Juneja, V; Nelson, F; Wolinsky, J S; Cutter, G; Narayana, P A
2010-09-01
Quantitative measures derived from magnetic resonance imaging (MRI) have been widely investigated as non-invasive biomarkers in multiple sclerosis (MS). However, the correlation of single measures with Expanded Disability Status Scale (EDSS) is poor, especially for studies with large population samples. To explore the correlation of MRI-derived measures with EDSS through composite MRI scores. Magnetic resonance images of 126 patients with relapsing-remitting MS were segmented into white and gray matter, cerebrospinal fluid, T2-hyperintense lesions, gadolinium contrast-enhancing lesions, and T1-hypointense lesions ('black holes': BH). The volumes and average T2 values for each of these tissues and lesions were calculated and converted to a z-score (in units of standard deviation from the mean). These z-scores were combined to construct composite z-scores, and evaluated against individual z-scores for correlation with EDSS. Composite scores including relaxation times of different tissues and/or volumetric measures generally correlated more strongly with EDSS than individual measures. The maximum observed correlation of a composite with EDSS was r = 0.344 (p < …).
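The z-score composite construction described above can be sketched as follows (the measures and values are hypothetical, not the study's data):

```python
from statistics import mean, stdev

def z_scores(values):
    """Convert raw values to units of standard deviation from the sample mean."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def composite(*measure_lists):
    """Average the per-measure z-scores patient by patient."""
    z = [z_scores(m) for m in measure_lists]
    return [mean(col) for col in zip(*z)]

# Hypothetical per-patient values: lesion volume (mL) and average T2 time (ms)
lesion_vol = [2.0, 5.0, 9.0, 14.0]
t2_time = [80.0, 95.0, 100.0, 120.0]
print(composite(lesion_vol, t2_time))
```

Standardizing first puts volumes and relaxation times on a common scale, which is what lets measures with different units be averaged into one score.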
Lovell, D P
1996-10-01
The multivariate statistical method Principal Component Analysis (PCA) has been applied to a set of data from the ECETOC reference chemical data bank. PCA is a multivariate method that can be used to explore a complex data set. The results of the analysis show that most of the variability in the values for tissue damage scores for the 55 chemicals can be described by a single principal component, which explains nearly 80% of the variability. This component is derived by giving approximately equal weight to each of the 18 individual measures made on the tissues over the 24-, 48- and 72-hr observation period. The principal component scores on the first component (PC I) are very highly correlated with the maximum individual weighted Draize scores or total Draize scores (TDS) derived using the Draize scoring method. A second principal component, describing about 7% of the variability, contrasts damage measured on the iris and cornea with that measured on the conjunctiva. Plots of principal component scores show the overall pattern of responses. In general, low measures of the TDS and a positive PC I score are associated with iris and conjunctival damage (damage to the iris was never recorded in the absence of damage to the conjunctiva). High TDS and negative PC I scores are associated with corneal and/or iris and conjunctiva damage. Plots of the principal component scores identify some chemicals that appear to cause unusual patterns of damage and identify some individual animals as having outlying or idiosyncratic responses. However, the analysis suggests that (i) there is only limited evidence for differential responses of different tissues and (ii) attempts to identify alternative tests which predict specific types of tissue damage based on the results collected in a Draize test are likely to be unsuccessful. It indicates that further refinement of the results of the in vivo Draize test will not arise from more detailed analysis of the tissue scores but by …
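A minimal PCA sketch in the spirit of the analysis above, on synthetic tissue scores (not the ECETOC data) driven by one latent "damage" factor, showing how the first component's explained-variance fraction is obtained:

```python
import numpy as np

def pca_explained(X):
    """PCA via SVD of the centered data matrix; returns the
    fraction of total variance carried by each component."""
    Xc = X - X.mean(axis=0)
    _, s, _ = np.linalg.svd(Xc, full_matrices=False)
    var = s**2
    return var / var.sum()

rng = np.random.default_rng(0)
severity = rng.normal(size=50)                      # one latent damage factor
X = np.column_stack([severity * w + rng.normal(scale=0.2, size=50)
                     for w in (1.0, 0.9, 1.1)])     # three correlated scores
ratios = pca_explained(X)
print(ratios[0])  # first component carries most of the variance
```

When one factor drives all measures with roughly equal loadings, the first component explains most of the variance, which mirrors the ~80% single-component result reported above.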
Maximum Likelihood Learning of Conditional MTE Distributions
Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael
2009-01-01
We describe a procedure for inducing conditional densities within the mixtures of truncated exponentials (MTE) framework. We analyse possible conditional MTE specifications and propose a model selection scheme, based on the BIC score, for partitioning the domain of the conditioning variables. Finally, experimental results demonstrate the applicability of the learning procedure as well as the expressive power of the conditional MTE distribution.
Herzberg, Guillaume; Boeckstyns, Michel Ernest Henri; Sorensen, Allan Ibsen
2012-01-01
… preoperative and postoperative reports of "ReMotion" TWA at regular intervals. The cases of 7 centers with more than 15 inclusions were considered for this article. A total of 215 wrists were included. In the rheumatoid arthritis (RA; 129 wrists) and nonrheumatoid arthritis (non-RA; 86 wrists) groups, there were respectively 5 and 6% complications requiring implant revision, with survival rates of 96 and 92%, respectively, at an average follow-up of 4 years. Within the whole series, only one dislocation was observed, in one non-RA wrist. A total of 112 wrists (75 rheumatoid and 37 nonrheumatoid) had more than 2 years of follow-up (minimum: 2 years, maximum: 8 years). In the rheumatoid and non-RA groups, visual analog scale (VAS) pain score improved by 48 and 54 points, respectively, and QuickDASH score improved by 20 and 21 points, respectively, with no statistical differences. Average postoperative arc …
The Tipping Point: F-Score as a Function of the Number of Retrieved Items
Guns, Raf; Lioma, Christina; Larsen, Birger
2012-01-01
One of the best known measures of information retrieval (IR) performance is the F-score, the harmonic mean of precision and recall. In this article we show that the curve of the F-score as a function of the number of retrieved items is always of the same shape: a fast concave increase to a maximum, followed by a slow decrease. In other words, there exists a single maximum, referred to as the tipping point, where the retrieval situation is 'ideal' in terms of the F-score. The tipping point thus indicates the optimal number of items to be retrieved, with more or fewer items resulting in a lower F-score …
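The single-maximum behavior is easy to reproduce. Given a ranked list with binary relevance flags (an illustrative ranking, not data from the article), compute precision, recall, and F at each cutoff and locate the tipping point:

```python
def f_score_curve(relevant_flags, total_relevant):
    """F-score (harmonic mean of precision and recall) at each cutoff k."""
    curve, hits = [], 0
    for k, rel in enumerate(relevant_flags, start=1):
        hits += rel
        precision = hits / k
        recall = hits / total_relevant
        f = (2 * precision * recall / (precision + recall)) if hits else 0.0
        curve.append(f)
    return curve

# 1 marks a relevant item in the ranking; 5 relevant items exist in total
ranking = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]
curve = f_score_curve(ranking, total_relevant=5)
tipping_point = max(range(len(curve)), key=curve.__getitem__) + 1
print(tipping_point, max(curve))  # cutoff 7 is optimal here
```

Past the tipping point every additional retrieved item can only dilute precision faster than it improves recall, so the curve declines.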
Analysis of Photovoltaic Maximum Power Point Trackers
Veerachary, Mummadi
The photovoltaic generator exhibits a non-linear i-v characteristic and its maximum power point (MPP) varies with solar insolation. An intermediate switch-mode dc-dc converter is required to extract maximum power from the photovoltaic array. In this paper buck, boost and buck-boost topologies are considered and a detailed mathematical analysis, both for continuous and discontinuous inductor current operation, is given for MPP operation. The conditions on the connected load values and duty ratio are derived for achieving satisfactory maximum power point operation. Further, it is shown that certain load values, falling out of the optimal range, will drive the operating point away from the true maximum power point. A detailed comparison of the various topologies for MPPT is given. Selection of the converter topology for a given loading is discussed. A detailed discussion of circuit-oriented model development is given, and the MPPT effectiveness of the various converter systems is then verified through simulations. The proposed theory and analysis are validated through experimental investigations.
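The non-linear i-v characteristic and its maximum power point can be illustrated with a single-diode PV model and a brute-force voltage sweep (illustrative parameter values, not taken from the paper; real trackers use perturb-and-observe or similar online methods rather than a sweep):

```python
import math

def pv_current(v, i_sc=5.0, i_0=1e-9, v_t=0.7):
    """Single-diode PV model: i = Isc - I0 * (exp(v / Vt) - 1)."""
    return i_sc - i_0 * (math.exp(v / v_t) - 1)

def find_mpp(v_oc=15.0, steps=10000):
    """Sweep voltage from 0 to v_oc and return (v, p) at maximum power."""
    best_v, best_p = 0.0, 0.0
    for k in range(steps + 1):
        v = v_oc * k / steps
        p = v * max(pv_current(v), 0.0)
        if p > best_p:
            best_v, best_p = v, p
    return best_v, best_p

v_mpp, p_mpp = find_mpp()
print(v_mpp, p_mpp)  # the knee of the i-v curve, near the open-circuit voltage
```

The converter's duty ratio is what lets the operating point be held at this knee for a given load, which is the matching condition the paper analyzes.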
On maximum cycle packings in polyhedral graphs
Peter Recht
2014-04-01
Full Text Available This paper addresses upper and lower bounds for the cardinality of a maximum vertex-/edge-disjoint cycle packing in a polyhedral graph G. Bounds on the cardinality of such packings are provided, that depend on the size, the order or the number of faces of G, respectively. Polyhedral graphs are constructed, that attain these bounds.
Hard graphs for the maximum clique problem
Hoede, Cornelis
1988-01-01
The maximum clique problem is one of the NP-complete problems. There are graphs for which a reduction technique exists that transforms the problem for these graphs into one for graphs with specific properties in polynomial time. The resulting graphs do not grow exponentially in order and number. …
Maximum Likelihood Estimation of Search Costs
J.L. Moraga-Gonzalez (José Luis); M.R. Wildenbeest (Matthijs)
2006-01-01
In a recent paper Hong and Shum (forthcoming) present a structural methodology to estimate search cost distributions. We extend their approach to the case of oligopoly and present a maximum likelihood estimate of the search cost distribution. We apply our method to a data set of online p…
Weak Scale From the Maximum Entropy Principle
Hamada, Yuta; Kawana, Kiyoharu
2015-01-01
The theory of multiverse and wormholes suggests that the parameters of the Standard Model are fixed in such a way that the radiation of the $S^{3}$ universe at the final stage $S_{rad}$ becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the Standard Model, we can check whether $S_{rad}$ actually becomes maximum at the observed values. In this paper, we regard $S_{rad}$ at the final stage as a function of the weak scale (the Higgs expectation value) $v_{h}$, and show that it becomes maximum around $v_{h}={\cal{O}}(300\text{ GeV})$ when the dimensionless couplings in the Standard Model, that is, the Higgs self-coupling, the gauge couplings, and the Yukawa couplings are fixed. Roughly speaking, we find that the weak scale is given by $v_{h}\sim T_{BBN}^{2}/(M_{pl}y_{e}^{5})$, where $y_{e}$ is the Yukawa coupling of the electron, $T_{BBN}$ is the temperature at which the Big Bang nucleosynthesis starts, and $M_{pl}$ is the Planck mass.
Weak scale from the maximum entropy principle
Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu
2015-03-01
The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the S^3 universe at the final stage S_rad becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM, we can check whether S_rad actually becomes maximum at the observed values. In this paper, we regard S_rad at the final stage as a function of the weak scale (the Higgs expectation value) v_h, and show that it becomes maximum around v_h = O(300 GeV) when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings are fixed. Roughly speaking, we find that the weak scale is given by v_h ~ T_BBN^2 / (M_pl y_e^5), where y_e is the Yukawa coupling of the electron, T_BBN is the temperature at which the Big Bang nucleosynthesis starts, and M_pl is the Planck mass.
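The closing estimate can be checked numerically. The input values below are rough standard figures substituted by the editor (not quoted from the paper), so this is only an order-of-magnitude check:

```python
# Order-of-magnitude check of v_h ~ T_BBN^2 / (M_pl * y_e^5), all in GeV
T_BBN = 1e-3          # ~1 MeV, onset of Big Bang nucleosynthesis
M_pl = 1.22e19        # Planck mass
y_e = 2.9e-6          # electron Yukawa coupling, roughly sqrt(2)*m_e/v

v_h = T_BBN**2 / (M_pl * y_e**5)
print(v_h)  # a few hundred GeV, consistent with O(300 GeV)
```

The strong sensitivity to y_e (fifth power) explains why the estimate is quoted only at the order-of-magnitude level.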
Global characterization of the Holocene Thermal Maximum
Renssen, H.; Seppä, H.; Crosta, X.; Goosse, H.; Roche, D.M.V.A.P.
2012-01-01
We analyze the global variations in the timing and magnitude of the Holocene Thermal Maximum (HTM) and their dependence on various forcings in transient simulations covering the last 9000 years (9 ka), performed with a global atmosphere-ocean-vegetation model. In these experiments, we consider the i…
Maximum phonation time: variability and reliability.
Speyer, Renée; Bogaardt, Hans C A; Passos, Valéria Lima; Roodenburg, Nel P H D; Zumach, Anne; Heijnen, Mariëlle A M; Baijens, Laura W J; Fleskens, Stijn J H M; Brunings, Jan W
2010-05-01
The objective of the study was to determine maximum phonation time reliability as a function of the number of trials, days, and raters in dysphonic and control subjects. Two groups of adult subjects participated in this reliability study: a group of outpatients with functional or organic dysphonia versus a group of healthy control subjects matched by age and gender. Over a period of maximally 6 weeks, three video recordings were made of five subjects' maximum phonation time trials. A panel of five experts were responsible for all measurements, including a repeated measurement of the subjects' first recordings. Patients showed significantly shorter maximum phonation times compared with healthy controls (on average, 6.6 seconds shorter). The averaged intraclass correlation coefficient (ICC) over all raters per trial for the first day was 0.998. The averaged reliability coefficient per rater and per trial for repeated measurements of the first day's data was 0.997, indicating high intrarater reliability. The mean reliability coefficient per day for one trial was 0.939. When using five trials, the reliability increased to 0.987. The reliability over five trials for a single day was 0.836; for 2 days, 0.911; and for 3 days, 0.935. To conclude, the maximum phonation time has proven to be a highly reliable measure in voice assessment. A single rater is sufficient to provide highly reliable measurements.
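The gain in reliability from averaging more trials or days is what the Spearman–Brown prophecy formula describes; assuming it applies here (the abstract does not name the formula), the reported figures are reproduced closely:

```python
def spearman_brown(r_single, k):
    """Reliability of the mean of k parallel measurements,
    given single-measurement reliability r_single."""
    return k * r_single / (1 + (k - 1) * r_single)

# Reported single-trial reliability 0.939 -> 0.987 when averaging 5 trials
print(round(spearman_brown(0.939, 5), 3))   # 0.987
# Reported single-day reliability 0.836 -> more days
print(round(spearman_brown(0.836, 2), 3))   # 0.911, as reported for 2 days
print(round(spearman_brown(0.836, 3), 3))   # ~0.939 vs. the reported 0.935
```

The close match suggests the multi-trial and multi-day figures behave like averages of parallel measurements.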
Maximum likelihood estimation of fractionally cointegrated systems
Lasak, Katarzyna
In this paper we consider a fractionally cointegrated error correction model and investigate asymptotic properties of the maximum likelihood (ML) estimators of the matrix of the cointegration relations, the degree of fractional cointegration, the matrix of the speed of adjustment …
Maximum likelihood estimation for integrated diffusion processes
Baltazar-Larios, Fernando; Sørensen, Michael
… EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works …
Maximum gain of Yagi-Uda arrays
Bojsen, J.H.; Schjær-Jacobsen, Hans; Nilsson, E.
1971-01-01
Numerical optimisation techniques have been used to find the maximum gain of some specific parasitic arrays. The gain of an array of infinitely thin, equispaced dipoles loaded with arbitrary reactances has been optimised. The results show that standard travelling-wave design methods are not optimum. Yagi–Uda arrays with equal and unequal spacing have also been optimised with experimental verification.
Interpreting force concept inventory scores: Normalized gain and SAT scores
Vincent P. Coletta
2007-05-01
Full Text Available Preinstruction SAT scores and normalized gains (G) on the force concept inventory (FCI) were examined for individual students in interactive engagement (IE) courses in introductory mechanics at one high school (N=335) and one university (N=292), and strong, positive correlations were found for both populations (r=0.57 and r=0.46, respectively). These correlations are likely due to the importance of cognitive skills and abstract reasoning in learning physics. The larger correlation coefficient for the high school population may be a result of the much shorter time interval between taking the SAT and studying mechanics, because the SAT may provide a more current measure of abilities when high school students begin the study of mechanics than it does for college students, who begin mechanics years after the test is taken. In prior research a strong correlation between FCI G and scores on Lawson's Classroom Test of Scientific Reasoning for students from the same two schools was observed. Our results suggest that, when interpreting class average normalized FCI gains and comparing different classes, it is important to take into account the variation of students' cognitive skills, as measured either by the SAT or by Lawson's test. While Lawson's test is not commonly given to students in most introductory mechanics courses, SAT scores provide a readily available alternative means of taking account of students' reasoning abilities. Knowing the students' cognitive level before instruction also allows one to alter instruction or to use an intervention designed to improve students' cognitive level.
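The normalized gain used above is Hake's G = (post − pre)/(100 − pre) for percentage scores. A minimal sketch with a Pearson correlation against SAT scores (the student records are made up for illustration):

```python
from statistics import mean, stdev

def normalized_gain(pre, post):
    """Hake's normalized gain for percentage scores."""
    return (post - pre) / (100.0 - pre)

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my, sx, sy = mean(xs), mean(ys), stdev(xs), stdev(ys)
    n = len(xs)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / ((n - 1) * sx * sy)

# Hypothetical students: (SAT, FCI pre %, FCI post %)
students = [(1100, 30, 55), (1250, 40, 75), (1350, 35, 80), (1500, 45, 92)]
gains = [normalized_gain(pre, post) for _, pre, post in students]
sats = [sat for sat, _, _ in students]
print(gains)
print(pearson_r(sats, gains))  # positive, as in the reported correlations
```

Dividing by the headroom (100 − pre) is what makes gains comparable between students who start at different pretest levels.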
Silva, Adriana Lucia Pastore E; Croci, Alberto Tesconi; Gobbi, Riccardo Gomes; Hinckel, Betina Bremer; Pecora, José Ricardo; Demange, Marco Kawamura
2017-01-01
This study performed the translation, cultural adaptation, and validation of the new version of the Knee Society Score (The 2011 KS Score) into Brazilian Portuguese and verified its measurement properties, reproducibility, and validity. In 2012, the new version of the Knee Society Score was developed and validated. This scale comprises four separate subscales: (a) objective knee score (seven items: 100 points); (b) patient satisfaction score (five items: 40 points); (c) patient expectations score (three items: 15 points); and (d) functional activity score (19 items: 100 points). A total of 90 patients aged 55-85 years were evaluated in a clinical cross-sectional study. The pre-operative translated version was applied to patients with a TKA referral, and the post-operative translated version was applied to patients who underwent TKA. Each patient answered the same questionnaire twice and was evaluated by two experts in orthopedic knee surgery. Evaluations were performed pre-operatively and three, six, or 12 months post-operatively. The reliability of the questionnaire was evaluated using the intraclass correlation coefficient (ICC) between the two applications. Internal consistency was evaluated using Cronbach's alpha. The ICC analysis found no difference between the means of the pre-operative, three-month, and six-month post-operative evaluations across the subscale items. The Brazilian Portuguese version of The 2011 KS Score is a valid and reliable instrument for objective and subjective evaluation of the functionality of Brazilian patients who undergo TKA and revision TKA.
Bias Adjusted Precipitation Threat Scores
F. Mesinger
2008-04-01
Full Text Available Among the wide variety of performance measures available for the assessment of skill of deterministic precipitation forecasts, the equitable threat score (ETS) might well be the one used most frequently. It is typically used in conjunction with the bias score. However, apart from its mathematical definition the meaning of the ETS is not clear. It has been pointed out (Mason, 1989; Hamill, 1999) that forecasts with a larger bias tend to have a higher ETS. Even so, the present author has not seen this having been accounted for in any of numerous papers that in recent years have used the ETS along with bias "as a measure of forecast accuracy".
A method to adjust the threat score (TS) or the ETS so as to arrive at their values that correspond to unit bias, in order to show the model's or forecaster's accuracy in placing precipitation, has been proposed earlier by the present author (Mesinger and Brill), the so-called dH/dF method. A serious deficiency has since been noted with the dH/dF method: the hypothetical function that it arrives at to interpolate or extrapolate the observed value of hits to unit bias can have values of hits greater than forecast when the forecast area tends to zero. Another method is proposed here based on the assumption that the increase in hits per unit increase in false alarms is proportional to the yet unhit area. This new method removes the deficiency of the dH/dF method. Examples of its performance for 12 months of forecasts by three NCEP operational models are given.
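The stated assumption (hits grow in proportion to the yet-unhit observed area) integrates to H(F) = O·(1 − e^(−cF)) in the false-alarm area F. A sketch of that idea, with hypothetical areas and a bisection solve for the unit-bias point (this is the editor's reading of the assumption, not the operational NCEP code):

```python
import math

def adjusted_hits(hits, false_alarms, observed, tol=1e-10):
    """Adjust hits to unit bias under the assumption
    dH/dF = c * (O - H)  =>  H(F) = O * (1 - exp(-c * F)),
    with F the false-alarm area and O the observed area.
    Unit bias means the forecast area H(F) + F equals O."""
    # Calibrate c from the observed (F, H) point
    c = -math.log(1.0 - hits / observed) / false_alarms
    H = lambda F: observed * (1.0 - math.exp(-c * F))
    # H(F) + F is increasing in F, so bisection finds the unit-bias F
    lo, hi = 0.0, observed
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if H(mid) + mid < observed:
            lo = mid
        else:
            hi = mid
    return H(0.5 * (lo + hi))

# Over-forecast example: 30 hits, 40 false alarms, 50 observed (bias = 70/50)
print(adjusted_hits(30.0, 40.0, 50.0))  # fewer hits once bias is removed
```

Because H(F) never exceeds O, the adjusted hits can never exceed the observed area, which is exactly the deficiency of the dH/dF method that the new assumption removes.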
Empathy Score among Student Residence Assistants in Iran
Shahini, Najmeh; Rezayat, Kambiz Akhavane; Behdani, Fatemeh; Shojaei, Seyed Reza Habibzadeh; Rezayat, Amir Akhavan; Dadgarmoghaddam, Maliheh
2016-01-01
Introduction: Empathy, an essential component of the physician-patient relationship, may be linked to positive patient outcomes. This study aimed to determine the empathy score among student residence assistants (RAs). Methods: In this descriptive, cross-sectional study, 102 Iranian RAs participated during 2015, completing the Jefferson Scale of Empathy (JSPE). Data were analyzed using SPSS version 17; MANOVA, independent-samples t-tests, Spearman correlation, and confirmatory factor analysis (CFA) were used. Results: The mean JSPE score in the sample was 87.06 (±15.14). The mean scores for perspective taking, compassionate care, and standing in the patient's shoes were 38.90 (±13.11), 39.27 (±7.94), and 8.89 (±2.80), respectively. Among the three specialties (psychiatry, internal medicine, surgery), results showed significant differences in total empathy score (p=0.001) and perspective-taking score (p=0.008). Conclusions: This study showed significant differences in total empathy score and perspective taking across the three specialties. We suggest that the curriculum for Iranian RAs include more teaching on empathy and communication skills. PMID:28163848
Model Selection Through Sparse Maximum Likelihood Estimation
Banerjee, Onureena; D'Aspremont, Alexandre
2007-01-01
We consider the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse. Our approach is to solve a maximum likelihood problem with an added l_1-norm penalty term. The problem as formulated is convex but the memory requirements and complexity of existing interior point methods are prohibitive for problems with more than tens of nodes. We present two new algorithms for solving problems with at least a thousand nodes in the Gaussian case. Our first algorithm uses block coordinate descent, and can be interpreted as recursive l_1-norm penalized regression. Our second algorithm, based on Nesterov's first order method, yields a complexity estimate with a better dependence on problem size than existing interior point methods. Using a log determinant relaxation of the log partition function (Wainwright & Jordan (2006)), we show that these same algorithms can be used to solve an approximate sparse maximum likelihood problem for...
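The objective being maximized can be stated compactly: with empirical covariance S and precision matrix Θ, the penalized log-likelihood is log det Θ − tr(SΘ) − λ‖Θ‖₁. A minimal NumPy sketch of just this objective (not the authors' block coordinate descent or Nesterov-based solvers) might look like:

```python
import numpy as np

def penalized_loglik(theta, S, lam):
    """l1-penalized Gaussian log-likelihood:
    log det(Theta) - tr(S @ Theta) - lam * ||Theta||_1.
    Here the penalty is applied to all entries; some formulations
    exclude the diagonal."""
    sign, logdet = np.linalg.slogdet(theta)
    if sign <= 0:
        return -np.inf  # Theta must be positive definite to be a valid precision matrix
    return logdet - np.trace(S @ theta) - lam * np.abs(theta).sum()
```

For S = Θ = I in three dimensions and λ = 0.1 the value is −3.3; a solver would search over positive definite Θ to maximize this quantity, with the l1 term driving entries of Θ (and hence edges of the graphical model) to zero.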
Maximum-entropy description of animal movement.
Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M
2015-03-01
We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.
Pareto versus lognormal: a maximum entropy test.
Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano
2011-08-01
It is commonly found that distributions that seem lognormal over a broad range change to a power-law (Pareto) distribution in the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating process even when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with a lognormal body and a Pareto tail can be generated as mixtures of lognormally distributed units.
Nonparametric Maximum Entropy Estimation on Information Diagrams
Martin, Elliot A; Meinke, Alexander; Děchtěrenko, Filip; Davidsen, Jörn
2016-01-01
Maximum entropy estimation is of broad interest for inferring properties of systems across many different disciplines. In this work, we significantly extend a technique we previously introduced for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies. Specifically, we show how to apply the concept to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish a number of significant advantages of our approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases. In addition, we propose a nonparametric formulation of connected informations and give an illustrative example showing how this agrees with the existing parametric formulation in cases of interest. We furthe...
Zipf's law, power laws and maximum entropy
Visser, Matt
2013-04-01
Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified.
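The single-constraint maximization described above can be written out explicitly (a standard Lagrange-multiplier sketch consistent with the abstract, with $\mu$ and $z$ the multipliers for normalization and the $\langle \ln n \rangle = \chi$ constraint):

```latex
\begin{align*}
L &= -\sum_n p_n \ln p_n
   - \mu\Bigl(\sum_n p_n - 1\Bigr)
   - z\Bigl(\sum_n p_n \ln n - \chi\Bigr), \\
0 &= \frac{\partial L}{\partial p_n}
   = -\ln p_n - 1 - \mu - z \ln n
   \;\Longrightarrow\; p_n \propto e^{-z \ln n} = n^{-z}.
\end{align*}
```

That is, a pure power law over $n = 1, 2, \dots$ (Zipf's law for $z \approx 1$), normalized by $\zeta(z)$ for $z > 1$, with $z$ fixed implicitly by the constraint value $\chi$.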
Zipf's law, power laws, and maximum entropy
Visser, Matt
2012-01-01
Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines - from astronomy to demographics to economics to linguistics to zoology, and even warfare. A recent model of random group formation [RGF] attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present article I argue that the cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified.
Regions of constrained maximum likelihood parameter identifiability
Lee, C.-H.; Herget, C. J.
1975-01-01
This paper considers the parameter identification problem for general discrete-time, nonlinear, multiple-input/multiple-output dynamic systems with Gaussian-white distributed measurement errors. The system parameterization is assumed to be known. Regions of constrained maximum likelihood (CML) parameter identifiability are established. A computation procedure employing interval arithmetic is proposed for finding explicit regions of parameter identifiability in the case of linear systems. It is shown that if the vector of true parameters is locally CML identifiable, then with probability one it is a unique maximal point of the maximum likelihood function in the region of parameter identifiability, and the CML estimation sequence will converge to the true parameters.
A Maximum Radius for Habitable Planets.
Alibert, Yann
2015-09-01
We compute the maximum radius a planet can have in order to fulfill two constraints that are likely necessary conditions for habitability: (1) a surface temperature and pressure compatible with the existence of liquid water, and (2) no ice layer at the bottom of a putative global ocean, which would prevent the geologic carbon cycle from operating. We demonstrate that, above a given radius, these two constraints cannot both be met: in the super-Earth mass range (1-12 Mearth), the maximum radius varies between 1.8 and 2.3 Rearth. This maximum radius is reduced for planets with higher Fe/Si ratios and when irradiation effects on the structure of the gas envelope are taken into account.
Maximum Profit Configurations of Commercial Engines
Yiran Chen
2011-01-01
An investigation of commercial engines with finite capacity low- and high-price economic subsystems and a generalized commodity transfer law [n ∝ Δ (P m)] in commodity flow processes, in which effects of the price elasticities of supply and demand are introduced, is presented in this paper. Optimal cycle configurations of commercial engines for maximum profit are obtained by applying optimal control theory. In some special cases, the eventual state—market equilibrium—is solely determined by t...
A stochastic maximum principle via Malliavin calculus
Øksendal, Bernt; Zhou, Xun Yu; Meyer-Brandis, Thilo
2008-01-01
This paper considers a controlled Itô-Lévy process where the information available to the controller is possibly less than the overall information. All the system coefficients and the objective performance functional are allowed to be random, possibly non-Markovian. Malliavin calculus is employed to derive a maximum principle for the optimal control of such a system where the adjoint process is explicitly expressed.
Tissue radiation response with maximum Tsallis entropy.
Sotolongo-Grau, O; Rodríguez-Pérez, D; Antoranz, J C; Sotolongo-Costa, Oscar
2010-10-08
The expression of survival factors for radiation damaged cells is currently based on probabilistic assumptions and experimentally fitted for each tumor, radiation, and conditions. Here, we show how the simplest of these radiobiological models can be derived from the maximum entropy principle of the classical Boltzmann-Gibbs expression. We extend this derivation using the Tsallis entropy and a cutoff hypothesis, motivated by clinical observations. The obtained expression shows a remarkable agreement with the experimental data found in the literature.
Maximum Estrada Index of Bicyclic Graphs
Wang, Long; Wang, Yi
2012-01-01
Let $G$ be a simple graph of order $n$, and let $\lambda_1(G),\lambda_2(G),...,\lambda_n(G)$ be the eigenvalues of its adjacency matrix. The Estrada index of $G$ is defined as $EE(G)=\sum_{i=1}^{n}e^{\lambda_i(G)}$. In this paper we determine the unique graph with maximum Estrada index among bicyclic graphs with fixed order.
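The definition above is direct to compute numerically; a small sketch using NumPy's symmetric eigensolver (illustrative, not taken from the paper):

```python
import numpy as np

def estrada_index(adjacency):
    """Estrada index EE(G) = sum_i exp(lambda_i), where lambda_i are the
    eigenvalues of the (symmetric) adjacency matrix of the graph."""
    eigenvalues = np.linalg.eigvalsh(np.asarray(adjacency, dtype=float))
    return float(np.exp(eigenvalues).sum())
```

For a single edge (the path on two vertices) the eigenvalues are ±1, so EE = e + 1/e ≈ 3.086.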
Maximum privacy without coherence, zero-error
Leung, Debbie; Yu, Nengkun
2016-09-01
We study the possible difference between the quantum and the private capacities of a quantum channel in the zero-error setting. For a family of channels introduced by Leung et al. [Phys. Rev. Lett. 113, 030512 (2014)], we demonstrate an extreme difference: the zero-error quantum capacity is zero, whereas the zero-error private capacity is maximum given the quantum output dimension.
The HEART score for chest pain patients
Backus, B.E.
2012-01-01
The HEART score was developed to improve risk stratification in chest pain patients in the emergency department (ED). This thesis describes series of validation studies of the HEART score and sub studies for individual elements of the score. The predictive value of the HEART score for the occurrence
Scoring and Standard Setting with Standardized Patients.
Norcini, John J.; And Others
1993-01-01
The continuous method of scoring a performance test composed of standardized patients was compared with a derivative method that assigned each of the 131 examinees (medical residents) a dichotomous score, and use of Angoff's method with these scoring methods was studied. Both methods produce reasonable means and distributions of scores. (SLD)
Automatic maximum entropy spectral reconstruction in NMR.
Mobli, Mehdi; Maciejewski, Mark W; Gryk, Michael R; Hoch, Jeffrey C
2007-10-01
Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system.
Maximum entropy analysis of cosmic ray composition
Nosek, Dalibor; Vícha, Jakub; Trávníček, Petr; Nosková, Jana
2016-01-01
We focus on the primary composition of cosmic rays with the highest energies that cause extensive air showers in the Earth's atmosphere. A way of examining the two lowest order moments of the sample distribution of the depth of shower maximum is presented. The aim is to show that useful information about the composition of the primary beam can be inferred with limited knowledge we have about processes underlying these observations. In order to describe how the moments of the depth of shower maximum depend on the type of primary particles and their energies, we utilize a superposition model. Using the principle of maximum entropy, we are able to determine what trends in the primary composition are consistent with the input data, while relying on a limited amount of information from shower physics. Some capabilities and limitations of the proposed method are discussed. In order to achieve a realistic description of the primary mass composition, we pay special attention to the choice of the parameters of the sup...
A Maximum Resonant Set of Polyomino Graphs
Zhang Heping
2016-05-01
A polyomino graph P is a connected finite subgraph of the infinite plane grid such that each finite face is surrounded by a regular square of side length one and each edge belongs to at least one square. A dimer covering of P corresponds to a perfect matching. Different dimer coverings can interact via an alternating cycle (or square with respect to them. A set of disjoint squares of P is a resonant set if P has a perfect matching M so that each one of those squares is M-alternating. In this paper, we show that if K is a maximum resonant set of P, then P − K has a unique perfect matching. We further prove that the maximum forcing number of a polyomino graph is equal to the cardinality of a maximum resonant set. This confirms a conjecture of Xu et al. [26]. We also show that if K is a maximal alternating set of P, then P − K has a unique perfect matching.
The maximum rate of mammal evolution
Evans, Alistair R.; Jones, David; Boyer, Alison G.; Brown, James H.; Costa, Daniel P.; Ernest, S. K. Morgan; Fitzgerald, Erich M. G.; Fortelius, Mikael; Gittleman, John L.; Hamilton, Marcus J.; Harding, Larisa E.; Lintulaakso, Kari; Lyons, S. Kathleen; Okie, Jordan G.; Saarinen, Juha J.; Sibly, Richard M.; Smith, Felisa A.; Stephens, Patrick R.; Theodor, Jessica M.; Uhen, Mark D.
2012-03-01
How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were roughly half as large (i.e., 1.1, 3, and 5 million generations), perhaps due to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous-Paleogene (K-Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes.
Minimal Length, Friedmann Equations and Maximum Density
Awad, Adel
2014-01-01
Inspired by Jacobson's thermodynamic approach[gr-qc/9504004], Cai et al [hep-th/0501055,hep-th/0609128] have shown the emergence of Friedmann equations from the first law of thermodynamics. We extend Akbar--Cai derivation [hep-th/0609128] of Friedmann equations to accommodate a general entropy-area law. Studying the resulted Friedmann equations using a specific entropy-area law, which is motivated by the generalized uncertainty principle (GUP), reveals the existence of a maximum energy density closed to Planck density. Allowing for a general continuous pressure $p(\\rho,a)$ leads to bounded curvature invariants and a general nonsingular evolution. In this case, the maximum energy density is reached in a finite time and there is no cosmological evolution beyond this point which leaves the big bang singularity inaccessible from a spacetime prospective. The existence of maximum energy density and a general nonsingular evolution is independent of the equation of state and the spacial curvature $k$. As an example w...
Maximum saliency bias in binocular fusion
Lu, Yuhao; Stafford, Tom; Fox, Charles
2016-07-01
Subjective experience at any instant consists of a single ("unitary"), coherent interpretation of sense data rather than a "Bayesian blur" of alternatives. However, computation of Bayes-optimal actions has no role for unitary perception, instead being required to integrate over every possible action-percept pair to maximise expected utility. So what is the role of unitary coherent percepts, and how are they computed? Recent work provided objective evidence for non-Bayes-optimal, unitary coherent, perception and action in humans; and further suggested that the percept selected is not the maximum a posteriori percept but is instead affected by utility. The present study uses a binocular fusion task first to reproduce the same effect in a new domain, and second, to test multiple hypotheses about exactly how utility may affect the percept. After accounting for high experimental noise, it finds that both Bayes optimality (maximise expected utility) and the previously proposed maximum-utility hypothesis are outperformed in fitting the data by a modified maximum-salience hypothesis, using unsigned utility magnitudes in place of signed utilities in the bias function.
The maximum rate of mammal evolution
Evans, Alistair R.; Jones, David; Boyer, Alison G.; Brown, James H.; Costa, Daniel P.; Ernest, S. K. Morgan; Fitzgerald, Erich M. G.; Fortelius, Mikael; Gittleman, John L.; Hamilton, Marcus J.; Harding, Larisa E.; Lintulaakso, Kari; Lyons, S. Kathleen; Okie, Jordan G.; Saarinen, Juha J.; Sibly, Richard M.; Smith, Felisa A.; Stephens, Patrick R.; Theodor, Jessica M.; Uhen, Mark D.
2012-01-01
How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were roughly half as large (i.e., 1.1, 3, and 5 million generations), perhaps due to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous–Paleogene (K–Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes. PMID:22308461
Maximum-biomass prediction of homofermentative Lactobacillus.
Cui, Shumao; Zhao, Jianxin; Liu, Xiaoming; Chen, Yong Q; Zhang, Hao; Chen, Wei
2016-07-01
Fed-batch and pH-controlled cultures have been widely used for industrial production of probiotics. The aim of this study was to systematically investigate the relationship between the maximum biomass of different homofermentative Lactobacillus and lactate accumulation, and to develop a prediction equation for the maximum biomass concentration in such cultures. The accumulation of the end products and the depletion of nutrients by various strains were evaluated. In addition, the minimum inhibitory concentrations (MICs) of acid anions for various strains at pH 7.0 were examined. The lactate concentration at the point of complete inhibition was not significantly different from the MIC of lactate for all of the strains, although the inhibition mechanism of lactate and acetate on Lactobacillus rhamnosus was different from the other strains which were inhibited by the osmotic pressure caused by acid anions at pH 7.0. When the lactate concentration accumulated to the MIC, the strains stopped growing. The maximum biomass was closely related to the biomass yield per unit of lactate produced (YX/P) and the MIC (C) of lactate for different homofermentative Lactobacillus. Based on the experimental data obtained using different homofermentative Lactobacillus, a prediction equation was established as follows: Xmax - X0 = (0.59 ± 0.02)·YX/P·C.
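Written with explicit subscripts, the fitted relation is X_max − X_0 = (0.59 ± 0.02) · Y_X/P · C, where Y_X/P is the biomass yield per unit lactate produced and C is the MIC of lactate. A toy sketch of applying it (the input values below are invented for illustration, not the paper's data):

```python
def predict_max_biomass(x0, yield_per_lactate, mic_lactate, k=0.59):
    """Predicted maximum biomass: Xmax = X0 + k * Y_X/P * C,
    with k = 0.59 (reported in the abstract as 0.59 +/- 0.02).
    Units must be consistent, e.g. g/L biomass and g/L lactate."""
    return x0 + k * yield_per_lactate * mic_lactate
```

For example, an inoculum of 0.1 g/L, a yield of 0.2 g biomass per g lactate, and a lactate MIC of 50 g/L would give a predicted maximum of 6.0 g/L.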
The maximum rate of mammal evolution.
Evans, Alistair R; Jones, David; Boyer, Alison G; Brown, James H; Costa, Daniel P; Ernest, S K Morgan; Fitzgerald, Erich M G; Fortelius, Mikael; Gittleman, John L; Hamilton, Marcus J; Harding, Larisa E; Lintulaakso, Kari; Lyons, S Kathleen; Okie, Jordan G; Saarinen, Juha J; Sibly, Richard M; Smith, Felisa A; Stephens, Patrick R; Theodor, Jessica M; Uhen, Mark D
2012-03-13
How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were roughly half as large (i.e., 1.1, 3, and 5 million generations), perhaps due to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous-Paleogene (K-Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes.
Score lists in multipartite hypertournaments
Pirzada, Shariefuddin; Iványi, Antal
2010-01-01
Given non-negative integers $n_{i}$ and $\alpha_{i}$ with $0 \leq \alpha_{i} \leq n_i$ $(i=1,2,...,k)$, an $[\alpha_{1},\alpha_{2},...,\alpha_{k}]$-$k$-partite hypertournament on $\sum_{1}^{k}n_{i}$ vertices is a $(k+1)$-tuple $(U_{1},U_{2},...,U_{k},E)$, where the $U_{i}$ are $k$ vertex sets with $|U_{i}|=n_{i}$, and $E$ is a set of $\sum_{1}^{k}\alpha_{i}$-tuples of vertices, called arcs, with exactly $\alpha_{i}$ vertices from $U_{i}$, such that for any $\sum_{1}^{k}\alpha_{i}$-subset $\cup_{1}^{k}U_{i}^{\prime}$ of $\cup_{1}^{k}U_{i}$, $E$ contains exactly one of the $(\sum_{1}^{k} \alpha_{i})!$ possible $\sum_{1}^{k}\alpha_{i}$-tuples whose entries belong to $\cup_{1}^{k}U_{i}^{\prime}$. We obtain necessary and sufficient conditions for $k$ lists of non-negative integers in non-decreasing order to be the losing score lists and the score lists of some $k$-partite hypertournament.
Disclosure Risk from Factor Scores
Drechsler Jörg
2014-03-01
Remote access can be a powerful tool for providing data access to external researchers. Since the microdata never leave the secure environment of the data-providing agency, alterations of the microdata can be kept to a minimum. Nevertheless, remote access is not free from risk. Many statistical analyses that do not seem to provide disclosive information at first sight can be used by sophisticated intruders to reveal sensitive information. For this reason the list of allowed queries is usually restricted in a remote setting. However, it is not always easy to identify problematic queries. We therefore strongly support the argument that has been made by other authors: all queries should be monitored carefully and any microlevel information should always be withheld. As an illustrative example, we use factor score analysis, for which the output of interest - the factor loadings of the variables - seems unproblematic. However, as we show in the article, the individual factor scores that are usually returned as part of the output can be used to reveal sensitive information. Our empirical evaluations, based on a German establishment survey, emphasize that this risk is far from a purely theoretical problem.
Gabriel, Rafael; Brotons, Carlos; Tormo, M José; Segura, Antonio; Rigo, Fernando; Elosua, Roberto; Carbayo, Julio A; Gavrila, Diana; Moral, Irene; Tuomilehto, Jaakko; Muñiz, Javier
2015-03-01
In Spain, data based on large population-based cohorts adequate to provide an accurate prediction of cardiovascular risk have been scarce; thus, calibrations of the EuroSCORE and Framingham scores have been proposed and carried out for our population. The aim was to develop a native risk prediction score to accurately estimate individual cardiovascular risk in the Spanish population. Seven Spanish population-based cohorts including middle-aged and elderly participants were assembled, comprising 11,800 people (6,387 women) and representing 107,915 person-years of follow-up. A total of 1,214 cardiovascular events were identified, of which 633 were fatal. Cox regression analyses were conducted to examine the contributions of the different variables to the 10-year total cardiovascular risk. Age was the strongest cardiovascular risk factor. High systolic blood pressure, diabetes mellitus, and smoking were strong predictive factors. The contribution of serum total cholesterol was small. Antihypertensive treatment also had a significant impact on cardiovascular risk, greater in men than in women. The model showed good discriminative power (C-statistic=0.789 in men and 0.816 in women). Ten-year risk estimates are displayed graphically in risk charts, separately for men and women. The ERICE is a new native cardiovascular risk score for the Spanish population, derived from the background and contemporaneous risk of several Spanish cohorts. The ERICE score offers a direct and reliable estimation of total cardiovascular risk, taking into consideration the effect of diabetes mellitus and cardiovascular risk factor management. The ERICE score is a practical and useful tool for clinicians to estimate total individual cardiovascular risk in Spain.
Genetic scores of smoking behaviour in a Chinese population.
Yang, Shanshan; He, Yao; Wang, Jianhua; Wang, Yiyan; Wu, Lei; Zeng, Jing; Liu, Miao; Zhang, Di; Jiang, Bin; Li, Xiaoying
2016-03-07
This study sought to construct a genetic score for smoking behaviour in a Chinese population. Single-nucleotide polymorphisms (SNPs) from genome-wide association studies (GWAS) were evaluated in a community-representative sample (N = 3,553) from Beijing, China. The candidate SNPs were tested in four genetic models (dominant, recessive, heterogeneous codominant, and additive), and 7 SNPs were selected to construct a genetic score. A total of 3,553 participants (1,477 males and 2,076 females) completed the survey. Using the unweighted score, we found that participants with a high genetic score had a 34% higher risk of trying smoking and a 43% higher risk of smoking initiation (SI) at ≤ 18 years of age after adjusting for age, gender, education, occupation, ethnicity, body mass index (BMI) and sports activity time. The unweighted genetic scores were chosen as the easiest to extrapolate and interpret. Importantly, the genetic score was significantly associated with smoking behaviour (smoking status and SI at ≤ 18 years of age). These results have the potential to guide relevant health education for individuals with high genetic scores and to promote smoking control to improve the health of the population.
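An unweighted genetic score of the kind described is simply the count of risk alleles carried across the selected SNPs. A minimal sketch (the 0/1/2 genotype coding and the 7-SNP panel size follow the abstract, but the helper function itself is an illustrative assumption):

```python
def unweighted_genetic_score(risk_allele_counts):
    """Sum of risk-allele counts across SNPs; each SNP contributes 0, 1, or 2
    risk alleles under the additive coding assumed here."""
    if any(c not in (0, 1, 2) for c in risk_allele_counts):
        raise ValueError("each SNP must carry 0, 1, or 2 risk alleles")
    return sum(risk_allele_counts)
```

A participant carrying [0, 1, 2, 1, 0, 2, 1] risk alleles across the 7 SNPs scores 7; participants would then be stratified into high- and low-score groups for the association analysis.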
Evaluation of prognostic factors and scoring system in colonic perforation
Atsushi Horiuchi; Yuji Watanabe; Takashi Doi; Kouichi Sato; Syungo Yukumi; Motohira Yoshida; Yuji Yamamoto; Hiroki Sugishita; Kanji Kawachi
2007-01-01
AIM: To study the significance of scoring systems assessing severity, and of prognostic factors, in patients with colonic perforation. METHODS: A total of 26 patients (9 men, 17 women; mean age 72.7 ± 11.6 years) underwent emergency operation for colorectal perforation in our institution between 1993 and 2005. Several clinical factors were measured preoperatively and 24 h postoperatively. Acute Physiology and Chronic Health Evaluation II (APACHE II), Mannheim peritonitis index (MPI) and peritonitis index of Altona (PIA II) scores were calculated preoperatively. RESULTS: The overall postoperative mortality rate was 23.1% (6 patients). Compared with survivors, non-survivors displayed low blood pressure, low serum protein and high serum creatinine preoperatively, and low blood pressure, low white blood cell count, low pH, low PaO2/FiO2, and high serum creatinine postoperatively. The APACHE II score was significantly lower in survivors than in non-survivors (10.4 ± 3.84 vs 19.3 ± 2.87, P = 0.00003). Non-survivors tended to display high MPI scores and low PIA II scores, but no significant difference was identified. CONCLUSION: Pre- and postoperative blood pressure and serum creatinine level appear related to the prognosis of colonic perforation. The APACHE II score is most strongly associated with prognosis, and scores ≥ 20 are associated with a significantly increased mortality rate.
A Note on Stochastic Ordering of the Latent Trait Using the Sum of Polytomous Item Scores
van der Ark, L. Andries; Bergsma, Wicher P.
2010-01-01
In contrast to dichotomous item response theory (IRT) models, most well-known polytomous IRT models do not imply stochastic ordering of the latent trait by the total test score (SOL). This has been thought to make the ordering of respondents on the latent trait using the total test score questionable and throws doubt on the justifiability of using…
A strong test of the maximum entropy theory of ecology.
Xiao, Xiao; McGlinn, Daniel J; White, Ethan P
2015-03-01
The maximum entropy theory of ecology (METE) is a unified theory of biodiversity that predicts a large number of macroecological patterns using information on only species richness, total abundance, and total metabolic rate of the community. We evaluated four major predictions of METE simultaneously at an unprecedented scale using data from 60 globally distributed forest communities including more than 300,000 individuals and nearly 2,000 species. METE successfully captured 96% and 89% of the variation in the rank distribution of species abundance and individual size but performed poorly when characterizing the size-density relationship and intraspecific distribution of individual size. Specifically, METE predicted a negative correlation between size and species abundance, which is weak in natural communities. By evaluating multiple predictions with large quantities of data, our study not only identifies a mismatch between abundance and body size in METE but also demonstrates the importance of conducting strong tests of ecological theories.
Qualidade total do produto / Products total quality
Henrique Silveira de Almeida
1992-06-01
Full Text Available This paper addresses the concept of a product's total quality, its determinants, and the dimensions that make up that quality. It starts from the premise that product quality should be evaluated through the consumer's total satisfaction. For the consumer, product quality involves at least the following dimensions: the quality of the product itself; the product's quality over time; the quality of the services associated with using the product; and the product's life-cycle cost. The paper details and discusses each of these quality dimensions with consumer satisfaction in view.
Evaluation of Stress Scores Throughout Radiological Biopsies
Turkoglu
2016-06-01
Full Text Available Background Ultrasound-guided biopsy procedures are among the most prominent methods that increase the trauma, stress and anxiety experienced by patients. Objectives Our goal was to examine the level of stress in patients waiting for radiologic biopsy procedures and determine the stress and anxiety arising from waiting for a biopsy procedure. Patients and Methods This prospective study included 35 female and 65 male patients who were admitted to the interventional radiology department of Kartal Dr. Lütfi Kirdar training and research hospital, Istanbul between the years 2014 and 2015. They filled out the adult resilience scale consisting of 33 items. Patients who were undergoing invasive radiologic interventions were grouped according to their phenotypic characteristics, education level (low, intermediate, and high), and biopsy features (including biopsy localization: neck, thorax, abdomen, and bone; and the number of procedures performed, 1 or more than 1). Before the biopsy, they were also asked to complete the depression-anxiety-stress scale (DASS 42), state-trait anxiety inventory scale (STAI-I), and continuous anxiety scale (STAI-II). A total of 80 patients were biopsied (20 thyroid and parathyroid, 20 thorax, 20 liver and kidney, and 20 bone biopsies). The association of education level (primary-secondary, high school and postgraduate) and the number of biopsies (1 and more than 1) with the level of anxiety and stress was evaluated using the above-mentioned scales. Results Evaluation of sociodemographic and statistical characteristics of the patients showed that patients with biopsy in the neck region were moderately and severely depressed and stressed. In addition, the ratio of severe and extremely severe anxiety scores was significantly high. While the STAI-I and II scores were lined up as neck > bone > thorax > abdomen, STAI-I was higher in neck biopsies compared to thorax and abdomen biopsies. Regarding STAI-I and II scales, patients
TAP score: torsion angle propensity normalization applied to local protein structure evaluation
Battistutta Roberto
2007-05-01
Full Text Available Abstract Background Experimentally determined protein structures may contain errors and require validation. Conformational criteria based on the Ramachandran plot are mainly used to distinguish between distorted and adequately refined models. While the readily available criteria are sufficient to detect totally wrong structures, establishing the more subtle differences between plausible structures remains more challenging. Results A new criterion, called TAP score, measuring local sequence to structure fitness based on torsion angle propensities normalized against the global minimum and maximum is introduced. It is shown to be more accurate than previous methods at estimating the validity of a protein model in terms of commonly used experimental quality parameters on two test sets representing the full PDB database and a subset of obsolete PDB structures. Highly selective TAP thresholds are derived to recognize over 90% of the top experimental structures in the absence of experimental information. Both a web server and an executable version of the TAP score are available at http://protein.cribi.unipd.it/tap/. Conclusion A novel procedure for energy normalization (TAP has significantly improved the possibility to recognize the best experimental structures. It will allow the user to more reliably isolate problematic structures in the context of automated experimental structure determination.
Validating MMI Scores: Are We Measuring Multiple Attributes?
Oliver, Tom; Hecker, Kent; Hausdorf, Peter A.; Conlon, Peter
2014-01-01
The multiple mini-interview (MMI) used in health professional schools' admission processes is reported to assess multiple non-cognitive constructs such as ethical reasoning, oral communication, or problem evaluation. Though validation studies have been performed with total MMI scores, there is a paucity of information regarding how well MMI…
A critical review of predefined diet quality scores
Waijers, P.M.C.M.; Feskens, E.J.M.; Ocke, M.C.
2007-01-01
The literature on predefined indexes of overall diet quality is reviewed. Their association with nutrient adequacy and health outcome is considered, but our primary interest is in the make-up of the scores. In total, twenty different indexes have been reviewed, four of which have gained most attention.
Dilemmas in Uncemented Total Hip Arthroplasty
Goosen, J.H.M.
2009-01-01
In this thesis, different aspects that are related to the survivorship and clinical outcome in uncemented total hip arthroplasty are analysed. In Chapter 2, the survival rate, Harris Hip score and radiographic features of a proximally hydroxyapatite coated titanium alloy femoral stem (Bi-Metric, Bio
Spinal appearance questionnaire: factor analysis, scoring, reliability, and validity testing.
Carreon, Leah Y; Sanders, James O; Polly, David W; Sucato, Daniel J; Parent, Stefan; Roy-Beaudry, Marjolaine; Hopkins, Jeffrey; McClung, Anna; Bratcher, Kelly R; Diamond, Beverly E
2011-08-15
Cross sectional. This study presents the factor analysis of the Spinal Appearance Questionnaire (SAQ) and its psychometric properties. Although the SAQ has been administered to a large sample of patients with adolescent idiopathic scoliosis (AIS) treated surgically, its psychometric properties have not been fully evaluated. This study presents the factor analysis and scoring of the SAQ and evaluates its psychometric properties. The SAQ and the Scoliosis Research Society-22 (SRS-22) were administered to AIS patients who were being observed, braced or scheduled for surgery. Standard demographic data and radiographic measures including Lenke type and curve magnitude were also collected. Of the 1802 patients, 83% were female; with a mean age of 14.8 years and mean initial Cobb angle of 55.8° (range, 0°-123°). From the 32 items of the SAQ, 15 loaded on two factors with consistent and significant correlations across all Lenke types. There is an Appearance (items 1-10) and an Expectations factor (items 12-15). Responses are summed giving a range of 5 to 50 for the Appearance domain and 5 to 20 for the Expectations domain. The Cronbach's α was 0.88 for both domains and Total score with a test-retest reliability of 0.81 for Appearance and 0.91 for Expectations. Correlations with major curve magnitude were higher for the SAQ Appearance and SAQ Total scores compared to correlations between the SRS Appearance and SRS Total scores. The SAQ and SRS-22 Scores were statistically significantly different in patients who were scheduled for surgery compared to those who were observed or braced. The SAQ is a valid measure of self-image in patients with AIS with greater correlation to curve magnitude than SRS Appearance and Total score. It also discriminates between patients who require surgery from those who do not.
Total parenteral nutrition - infants
medlineplus.gov/ency/article/007239.htm: Total parenteral nutrition (TPN) is a method of feeding that bypasses ...
Total parenteral nutrition
medlineplus.gov/ency/patientinstructions/000177.htm: Total parenteral nutrition (TPN) is a method of feeding that bypasses ...
Maximum power operation of interacting molecular motors
Golubeva, Natalia; Imparato, Alberto
2013-01-01
We study the mechanical and thermodynamic properties of different traffic models for kinesin which are relevant in biological and experimental contexts. We find that motor-motor interactions play a fundamental role by enhancing the thermodynamic efficiency at maximum power of the motors, as compared to the non-interacting system, in a wide range of biologically compatible scenarios. We furthermore consider the case where the motor-motor interaction directly affects the internal chemical cycle and investigate the effect on the system dynamics and thermodynamics.
Maximum a posteriori decoder for digital communications
Altes, Richard A. (Inventor)
1997-01-01
A system and method for decoding by identification of the most likely phase coded signal corresponding to received data. The present invention has particular application to communication with signals that experience spurious random phase perturbations. The generalized estimator-correlator uses a maximum a posteriori (MAP) estimator to generate phase estimates for correlation with incoming data samples and for correlation with mean phases indicative of unique hypothesized signals. The result is a MAP likelihood statistic for each hypothesized transmission, wherein the highest value statistic identifies the transmitted signal.
Kernel-based Maximum Entropy Clustering
JIANG Wei; QU Jiao; LI Benxi
2007-01-01
With the development of the Support Vector Machine (SVM), the "kernel method" has been studied in a general way. In this paper, we present a novel Kernel-based Maximum Entropy Clustering algorithm (KMEC). Using Mercer kernel functions, the proposed algorithm first maps the data from their original space to a high-dimensional space where the data are expected to be more separable, then performs MEC clustering in the feature space. The experimental results show that the proposed method has better performance on non-hyperspherical and complex data structures.
The sun and heliosphere at solar maximum.
Smith, E J; Marsden, R G; Balogh, A; Gloeckler, G; Geiss, J; McComas, D J; McKibben, R B; MacDowall, R J; Lanzerotti, L J; Krupp, N; Krueger, H; Landgraf, M
2003-11-14
Recent Ulysses observations from the Sun's equator to the poles reveal fundamental properties of the three-dimensional heliosphere at the maximum in solar activity. The heliospheric magnetic field originates from a magnetic dipole oriented nearly perpendicular to, instead of nearly parallel to, the Sun's rotation axis. Magnetic fields, solar wind, and energetic charged particles from low-latitude sources reach all latitudes, including the polar caps. The very fast high-latitude wind and polar coronal holes disappear and reappear together. Solar wind speed continues to be inversely correlated with coronal temperature. The cosmic ray flux is reduced symmetrically at all latitudes.
Conductivity maximum in a charged colloidal suspension
Bastea, S
2009-01-27
Molecular dynamics simulations of a charged colloidal suspension in the salt-free regime show that the system exhibits an electrical conductivity maximum as a function of colloid charge. We attribute this behavior to two main competing effects: colloid effective charge saturation due to counterion 'condensation' and diffusion slowdown due to the relaxation effect. In agreement with previous observations, we also find that the effective transported charge is larger than the one determined by the Stern layer and suggest that it corresponds to the boundary fluid layer at the surface of the colloidal particles.
Maximum entropy signal restoration with linear programming
Mastin, G.A.; Hanson, R.J.
1988-05-01
Dantzig's bounded-variable method is used to express the maximum entropy restoration problem as a linear programming problem. This is done by approximating the nonlinear objective function with piecewise linear segments, then bounding the variables as a function of the number of segments used. The use of a linear programming approach allows equality constraints found in the traditional Lagrange multiplier method to be relaxed. A robust revised simplex algorithm is used to implement the restoration. Experimental results from 128- and 512-point signal restorations are presented.
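The core trick in the abstract above, approximating a nonlinear entropy objective with piecewise-linear segments so it fits into a linear program, can be illustrated outside any LP solver. The sketch below is my own reconstruction of the idea, not the paper's formulation: the pointwise entropy term, the tangent-point placement, and the error check are all illustrative choices.

```python
import math

def entropy(x):
    # pointwise entropy term -x ln x (concave on (0, 1])
    return 0.0 if x == 0 else -x * math.log(x)

def tangent_envelope(x, n_segments):
    # A concave function lies below each of its tangent lines, so the
    # minimum over tangents at n sample points is the piecewise-linear
    # overestimate that a simplex-type solver can handle: each tangent
    # becomes one linear constraint.
    best = float("inf")
    for i in range(1, n_segments + 1):
        t = i / (n_segments + 1)          # tangent point in (0, 1)
        slope = -math.log(t) - 1.0        # f'(t) for f(t) = -t ln t
        best = min(best, entropy(t) + slope * (x - t))
    return best

# Max approximation error over a grid shrinks as segments are added,
# which is why bounding the variables per segment count works.
grid = [j / 100 for j in range(1, 100)]
err = lambda n: max(tangent_envelope(x, n) - entropy(x) for x in grid)
print(err(4) > err(16) > err(64))  # → True
```

With enough segments the linearized objective tracks the true entropy closely, so the LP solution approaches the true maximum entropy restoration.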
COMPARISON BETWEEN FORMULAS OF MAXIMUM SHIP SQUAT
PETRU SERGIU SERBAN
2016-06-01
Full Text Available Ship squat is a combined effect of a ship's draft and trim increase due to ship motion in limited navigation conditions. Over time, researchers conducted tests on models and ships to find a mathematical formula that can define squat. Various formulas for calculating squat can be found in the literature; among the most commonly used are those of Barrass, Millward, Eryuzlu and ICORELS. This paper presents a comparison between the squat formulas to see the differences between them and which one provides the most satisfactory results. To this end, a cargo ship at different speeds was considered as a model for maximum squat calculations in canal navigation conditions.
Multi-Channel Maximum Likelihood Pitch Estimation
Christensen, Mads Græsbøll
2012-01-01
In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics. This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence...
Maximum entropy PDF projection: A review
Baggenstoss, Paul M.
2017-06-01
We review maximum entropy (MaxEnt) PDF projection, a method with wide potential applications in statistical inference. The method constructs a sampling distribution for a high-dimensional vector x based on knowing the sampling distribution p(z) of a lower-dimensional feature z = T (x). Under mild conditions, the distribution p(x) having highest possible entropy among all distributions consistent with p(z) may be readily found. Furthermore, the MaxEnt p(x) may be sampled, making the approach useful in Monte Carlo methods. We review the theorem and present a case study in model order selection and classification for handwritten character recognition.
CORA: Emission Line Fitting with Maximum Likelihood
Ness, Jan-Uwe; Wichmann, Rainer
2011-12-01
CORA analyzes emission line spectra with low count numbers and fits them to a line using the maximum likelihood technique. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise, the software derives the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. CORA has been applied to an X-ray spectrum with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory.
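The Poissonian principle CORA builds on, that the likelihood of the observed counts selects the model parameters, can be shown with a toy example. This is a sketch of the general idea only, not CORA's code: the bin counts and candidate rates below are invented, and a constant-rate model stands in for a full line profile.

```python
import math

def poisson_loglike(counts, rate):
    # log-likelihood of i.i.d. Poisson counts at a given model rate,
    # dropping the rate-independent log(k!) term
    return sum(k * math.log(rate) - rate for k in counts)

# Low-count "spectrum" bins, as in grating spectra of faint sources,
# where Gaussian chi-square fitting would be biased.
counts = [0, 2, 1, 0, 3, 1, 0, 0, 2, 1]
mean = sum(counts) / len(counts)

# For a constant-rate model the Poisson likelihood peaks exactly at
# the sample mean: every other candidate rate scores lower.
candidates = [0.5 * mean, 0.9 * mean, mean, 1.1 * mean, 2.0 * mean]
best = max(candidates, key=lambda r: poisson_loglike(counts, r))
print(best == mean)  # → True
```

In CORA the same likelihood criterion is applied to a parametric line profile rather than a constant rate, and the fixed-point equation mentioned in the abstract solves the corresponding stationarity condition for the line fluxes.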
Dynamical maximum entropy approach to flocking
Cavagna, Andrea; Giardina, Irene; Ginelli, Francesco; Mora, Thierry; Piovani, Duccio; Tavarone, Raffaele; Walczak, Aleksandra M.
2014-04-01
We derive a new method to infer from data the out-of-equilibrium alignment dynamics of collectively moving animal groups, by considering the maximum entropy model distribution consistent with temporal and spatial correlations of flight direction. When bird neighborhoods evolve rapidly, this dynamical inference correctly learns the parameters of the model, while a static one relying only on the spatial correlations fails. When neighbors change slowly and the detailed balance is satisfied, we recover the static procedure. We demonstrate the validity of the method on simulated data. The approach is applicable to other systems of active matter.
Maximum Temperature Detection System for Integrated Circuits
Frankiewicz, Maciej; Kos, Andrzej
2015-03-01
The paper describes the structure and measurement results of a system detecting the present maximum temperature on the surface of an integrated circuit. The system consists of a set of proportional-to-absolute-temperature sensors, a temperature processing path and a digital part designed in VHDL. Analogue parts of the circuit were designed with a full-custom technique. The system is a part of a temperature-controlled oscillator circuit - a power management system based on the dynamic frequency scaling method. The oscillator cooperates with a microprocessor dedicated for thermal experiments. The whole system is implemented in UMC CMOS 0.18 μm (1.8 V) technology.
Zipf's law and maximum sustainable growth
Malevergne, Y; Sornette, D
2010-01-01
Zipf's law states that the number of firms with size greater than S is inversely proportional to S. Most explanations start with Gibrat's rule of proportional growth but require additional constraints. We show that Gibrat's rule, at all firm levels, yields Zipf's law under a balance condition between the effective growth rate of incumbent firms (which includes their possible demise) and the growth rate of investments in entrant firms. Remarkably, Zipf's law is the signature of the long-term optimal allocation of resources that ensures the maximum sustainable growth rate of an economy.
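The tail law quoted in the first sentence is easy to check numerically. The sketch below is illustrative only, not the paper's model of firm entry and Gibrat growth: it simply draws sizes from a pure Zipf (Pareto tail exponent 1) law by inverting the CDF and verifies the 1/S count scaling.

```python
import random

random.seed(7)

# Draw firm sizes obeying P(size > S) = s_min / S (Zipf's law),
# sampled by inverse-CDF: S = s_min / (1 - U) for U uniform on [0, 1).
s_min = 1.0
sizes = [s_min / (1.0 - random.random()) for _ in range(200_000)]

# The number of firms larger than S falls off as 1/S, so raising the
# threshold tenfold should cut the count roughly tenfold.
n_above = lambda s: sum(size > s for size in sizes)
ratio = n_above(10) / n_above(100)
print(8 < ratio < 12)  # → True
```

The balance condition in the abstract explains why real economies sit at exactly this exponent: it is the signature of maximum sustainable growth, not an accident of the sampling law used here.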
Sánchez Vergel, Alfredo; Fundación Valle de Lili
2002-01-01
Definition / Types of prostheses / Which patients could benefit from a total hip replacement? / Hip osteoarthritis / Types of hip osteoarthritis / Alternatives to total hip replacement / Frequently asked questions about total hip replacement.
Finbow, Arthur; Frendrup, Allan; Vestergaard, Preben D.
cardinality then G is a total well dominated graph. In this paper we study composition and decomposition of total well dominated trees. By a reversible process we prove that any total well dominated tree can both be reduced to and constructed from a family of three small trees.
Bornø, Andreas; Aachmann-Andersen, Niels J; Munch-Andersen, Thor; Hulston, Carl J; Lundby, Carsten
2010-06-01
Haemoglobin concentration ([Hb]), reticulocyte percentage (retic%) and the OFF-hr score are well-implemented screening tools to determine potential recombinant human erythropoietin (rHuEpo) abuse in athletes. Recently, the International Cycling Union implemented the OFF-z score and the Hb-z score in their anti-doping testing programme. The aim of this study is to evaluate the sensitivity of these indirect screening methods. Twenty-four human subjects divided into three groups with eight subjects each (G1, G2 and G3) were injected with rHuEpo. G1 and G2 received rHuEpo for a 4-week period with 2 weeks of "boosting" followed by 2 weeks of "maintenance" and a wash-out period of 3 weeks. G3 received rHuEpo for a 10-week period (boost = 3 weeks; maintenance = 7 weeks; wash-out = 1 week). Three, seven and eight of the 24 volunteers exceeded the cut-off limits for the OFF-hr score, [Hb] and retic%, respectively. One subject from G1, nobody from G2, and seven subjects from G3 exceeded the cut-off limit for the Hb-z score. In total, ten subjects exceeded the cut-off limit for the OFF-z score: two subjects from G1, two subjects from G2 and six subjects from G3. In total, indirect screening methods were able to indicate rHuEpo injections in 58% of subjects. However, 42% of our rHuEpo-injected subjects were not detected. It should be emphasised that the test frequency in real-world anti-doping is far lower than in the present study, and hence the detection rate will be lower.
Assessment of Corrosion, Fretting, and Material Loss of Retrieved Modular Total Knee Arthroplasties.
Martin, Audrey J; Seagers, Kirsten A; Van Citters, Douglas W
2017-07-01
Modular junctions in total hip arthroplasties have been associated with fretting, corrosion, and debris release. The purpose of this study is to analyze damage severity in total knee arthroplasties of a single design by qualitative visual assessment and quantitative material loss measurements to evaluate implant performance and patient impact via material loss. Twenty-two modular knee retrievals of the same manufacturer were identified from an institutional review board-approved database. Junction designs included tapers with an axial screw and tapers with a radial screw. Constructs consisted of 2 metal alloys: CoCr and Ti6Al4V. Components were qualitatively scored and quantitatively measured for corrosion and fretting. Negative values represent adhered material. Statistical differences were analyzed using sign tests. Correlations were tested with a Spearman rank order test (P < .05). Material loss and the maximum linear depth for the total population were -0.23 mm³ and 5.84 μm, respectively. CoCr components in mixed metal junctions had higher maximum linear depth (P = .007) than corresponding Ti components. Fretting scores of Ti6Al4V alloy components in mixed metal junctions were statistically higher than those of the remaining groups. Taper angle did not correlate with material loss. Results suggest that CoCr components in mixed metal junctions are more vulnerable to corrosion than other components, suggesting preferential corrosion when interfacing with Ti6Al4V. Overall, although corrosion was noted in this series, material loss was low, and none were revised for clinical metal-related reaction. This suggests the clinical impact from corrosion in total knee arthroplasty is low. Copyright © 2017 Elsevier Inc. All rights reserved.
Accurate structural correlations from maximum likelihood superpositions.
Douglas L Theobald
2008-02-01
Full Text Available The cores of globular proteins are densely packed, resulting in complicated networks of structural interactions. These interactions in turn give rise to dynamic structural correlations over a wide range of time scales. Accurate analysis of these complex correlations is crucial for understanding biomolecular mechanisms and for relating structure to function. Here we report a highly accurate technique for inferring the major modes of structural correlation in macromolecules using likelihood-based statistical analysis of sets of structures. This method is generally applicable to any ensemble of related molecules, including families of nuclear magnetic resonance (NMR) models, different crystal forms of a protein, and structural alignments of homologous proteins, as well as molecular dynamics trajectories. Dominant modes of structural correlation are determined using principal components analysis (PCA) of the maximum likelihood estimate of the correlation matrix. The correlations we identify are inherently independent of the statistical uncertainty and dynamic heterogeneity associated with the structural coordinates. We additionally present an easily interpretable method ("PCA plots") for displaying these positional correlations by color-coding them onto a macromolecular structure. Maximum likelihood PCA of structural superpositions, and the structural PCA plots that illustrate the results, will facilitate the accurate determination of dynamic structural correlations analyzed in diverse fields of structural biology.
Maximum entropy production and the fluctuation theorem
Dewar, R C [Unite EPHYSE, INRA Centre de Bordeaux-Aquitaine, BP 81, 33883 Villenave d' Ornon Cedex (France)
2005-05-27
Recently the author used an information theoretical formulation of non-equilibrium statistical mechanics (MaxEnt) to derive the fluctuation theorem (FT) concerning the probability of second law violating phase-space paths. A less rigorous argument leading to the variational principle of maximum entropy production (MEP) was also given. Here a more rigorous and general mathematical derivation of MEP from MaxEnt is presented, and the relationship between MEP and the FT is thereby clarified. Specifically, it is shown that the FT allows a general orthogonality property of maximum information entropy to be extended to entropy production itself, from which MEP then follows. The new derivation highlights MEP and the FT as generic properties of MaxEnt probability distributions involving anti-symmetric constraints, independently of any physical interpretation. Physically, MEP applies to the entropy production of those macroscopic fluxes that are free to vary under the imposed constraints, and corresponds to selection of the most probable macroscopic flux configuration. In special cases MaxEnt also leads to various upper bound transport principles. The relationship between MaxEnt and previous theories of irreversible processes due to Onsager, Prigogine and Ziegler is also clarified in the light of these results. (letter to the editor)
Thermodynamic hardness and the maximum hardness principle
Franco-Pérez, Marco; Gázquez, José L.; Ayers, Paul W.; Vela, Alberto
2017-08-01
An alternative definition of hardness (called the thermodynamic hardness) within the grand canonical ensemble formalism is proposed in terms of the partial derivative of the electronic chemical potential with respect to the thermodynamic chemical potential of the reservoir, keeping the temperature and the external potential constant. This temperature-dependent definition may be interpreted as a measure of the propensity of a system to go through a charge transfer process when it interacts with other species, and thus it keeps the philosophy of the original definition. When the derivative is expressed in terms of the three-state ensemble model, in the regime of low temperatures and up to temperatures of chemical interest, one finds that for zero fractional charge, the thermodynamic hardness is proportional to (I - A)/T, where I is the first ionization potential, A is the electron affinity, and T is the temperature. However, the thermodynamic hardness is nearly zero when the fractional charge is different from zero. Thus, through the present definition, one avoids the presence of the Dirac delta function. We show that the chemical hardness defined in this way provides meaningful and discernible information about the hardness properties of a chemical species exhibiting an integer or a fractional average number of electrons, and this analysis allowed us to establish a link between the maximum possible value of the hardness defined here and the minimum softness principle, showing that both principles are related to minimum fractional charge and maximum stability conditions.
Maximum Likelihood Analysis in the PEN Experiment
Lehman, Martin
2013-10-01
The experimental determination of the π+ → e+ν(γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world-average experimental precision of 3.3 × 10⁻³ to 5 × 10⁻⁴ using a stopped-beam approach. During runs in 2008-10, PEN acquired over 2 × 10⁷ πe2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π+ → e+ν, π+ → μ+ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo-verified probability distribution functions of our observables (energies, times, etc.). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
Messina, Antonino; Fogliani, Anna Maria; Paradiso, Sergio
2010-08-01
An inverse correlation between social desirability and alexithymia has been observed in undergraduate students in Japan and Australia. It is not clear how this association is influenced by the personality dimension of neuroticism. This study examined the association of scores on social desirability with those on alexithymia controlled for neuroticism, in a sample of 111 Italian graduate students, with age range of 24 to 58 years. Students completed the Eysenck Personality Questionnaire (short form) and the Toronto Alexithymia Scale-20 (TAS-20). Social desirability scores inversely correlated with TAS-20 total scores, neuroticism scores, and the TAS-20 subscale, Difficulty identifying feelings. Neuroticism directly correlated with TAS-20 total score, Difficulty identifying feelings, and Difficulty describing feelings. Students with higher alexithymia and neuroticism scores seem to present themselves in less socially desirable ways. The correlation of social desirability with alexithymia was moderated by higher neuroticism scores.
Lake Basin Fetch and Maximum Length/Width
Minnesota Department of Natural Resources — Linear features representing the Fetch, Maximum Length and Maximum Width of a lake basin. Fetch, maximum length and average width are calculated from the lake polygon...
Yılmaz, Eyüp Murat; Kapçı, Mücahit; Çelik, Sebahattin; Manoğlu, Berke; Avcil, Mücahit; Karacan, Erkan
2017-01-01
Acute appendicitis is one of the most common causes of abdominal pain seen in surgical clinics. Although it can be easily diagnosed, the picture may be confusing, particularly in premenopausal women and the elderly. The present study is an evaluation of 2 current scoring systems with respect to accurate diagnosis of the disease and indication of inflammation severity. A total of 105 patients diagnosed with acute appendicitis were included in the study. Following Alvarado and Ohmann scoring, ultrasonography images were obtained and appendectomy was performed. A unique intraoperative severity scoring system was used to measure severity of inflammation, and the Alvarado and Ohmann scoring system results were compared to assess accuracy of predictive value for acute appendicitis. A moderate positive correlation was found between the Alvarado score and the Ohmann score (r=0.508; pappendicitis based on histopathological results was statistically significant (p=0.027), while that of the Ohmann score was not (p=0.807). Correlation between both scores and intraoperative grading of inflammation was weak, but statistical significance was observed between the Alvarado scoring system and intraoperative severity scoring (r=0.30; p=0.002). No statistical difference was observed between Ohmann scoring and intraoperative severity scoring (r=0.09; p=0.384). The Alvarado score is better able to predict acute appendicitis and provide an idea of severity of inflammation. The Ohmann score is more useful for providing guidance and ruling out acute appendicitis when the picture is more uncertain.
尤寿江
2014-01-01
Objective To study whether the total health risks in vascular events (THRIVE) score could predict prognosis in acute ischemic stroke patients with atrial fibrillation. Methods A total of 169 patients were enrolled in the study, with the NIH Stroke Scale (NIHSS) score, THRIVE score and CHADS2 score given to each patient at admission and the modified Rankin Scale (mRS) given at 3 months follow-up. All patients were divided into the
A. Antonelli
2011-09-01
Full Text Available Background: Systemic sclerosis (SSc) is an autoimmune disease characterized by fibrosis of the skin and visceral organs. The microangiopathy is detectable early in the course of the disease by nailfold videocapillaroscopy (NVC), a non-invasive technique with high diagnostic value. Objective: The aim of our study was to evaluate the feasibility of a quantitative score and its correlation with the digital skin ulcers that frequently complicate SSc microangiopathy. Methods: We retrospectively analysed the NVC of 65 SSc patients, performed by 200x videocapillaroscopy connected to image analysis software (Videocap; DS MediGroup, Milan, Italy). The analysis of NVC images included: total number of capillaries in the distal row (N), maximum diameter (D), number of giant capillaries (M), M/N ratio and percentage of M, and presence/absence of micro-haemorrhages and tortuosity. Results: 21/65 SSc patients experienced digital ulcers within three months after the NVC examination. The N, D, M/N, and percentage of M significantly correlated with the appearance of ischemic ulcers. A multiple regression analysis showed a statistically significant correlation for N, M/N and D, while sensitivity and specificity of these parameters were unsatisfactory. A capillaroscopic score, according to the formula D·M/N², showed high specificity and sensitivity (93.2% and 85.7%, respectively; area under ROC curve: 0.918) in predicting the appearance of digital ulcers. Conclusions: This capillaroscopic score may represent a feasible and simple tool in the assessment of SSc patients. Routine use of this parameter might permit recognition and preventive treatment of SSc patients at high risk of developing digital ulcers.
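The score formula reported in this abstract (D·M/N²) is simple enough to write as a one-line function. A minimal sketch follows; the function and argument names are ours, not the paper's, and no decision threshold is applied here.

```python
def capillaroscopic_score(n_capillaries: int, max_diameter: float, n_giants: int) -> float:
    """Score = D * M / N**2, per the formula in the abstract.
    N: total capillaries in the distal row, D: maximum diameter,
    M: number of giant capillaries."""
    if n_capillaries <= 0:
        raise ValueError("N must be positive")
    return max_diameter * n_giants / n_capillaries ** 2
```

For example, 10 distal-row capillaries with a maximum diameter of 50 and 2 giant capillaries give a score of 50·2/10² = 1.0.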
Demirkaya, Erkan; Acikel, Cengizhan; Hashkes, Philip; Gattorno, Marco; Gul, Ahmet; Ozdogan, Huri; Turker, Turker; Karadag, Omer; Livneh, Avi; Ben-Chetrit, Eldad; Ozen, Seza
2016-06-01
To develop a widely accepted international severity score for children and adult patients with familial Mediterranean fever (FMF) that can be easily applied in research and clinical practice. Candidate severity criteria were suggested by several FMF expert physicians. After three rounds of Delphi survey, the candidate criteria defined by the survey were discussed by experts in a consensus meeting. Each expert brought data on clinical manifestations, laboratory findings and physician's global assessments (PGAs) of a minimum of 20 patients from their centres. We used the PGAs for disease severity as a gold standard. Logistic regression analysis was used to evaluate the predictive value of each item, and receiver operating characteristic curve analysis was performed to demonstrate the success of the criteria set. A total of 281 patients, consisting of 162 children and 119 adults with FMF, were enrolled and available for validity analysis. Nine domains were included in the final core set of variables for the evaluation of disease severity in FMF. The International Severity Score for FMF (ISSF) may reach a maximum of 10 if all items are maximally scored. The threshold values to determine severity: severe disease ≥6, intermediate disease 3-5, mild disease ≤2. Area under the curve was calculated as 0.825 for this set in the whole group. The initial validity of the ISSF in both children and adults with FMF was demonstrated. We anticipate that it will provide a robust tool to objectively define disease severity for clinical trials and future research, as well as for therapeutic decisions in managing patients with FMF.
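The severity bands stated in the abstract (severe ≥6, intermediate 3-5, mild ≤2, on a 0-10 total) map directly to a small classification function. This is an illustrative sketch; the function name and the input validation are our additions.

```python
def issf_category(score: int) -> str:
    """Map an ISSF total (0-10) to the severity bands given in the abstract:
    severe >= 6, intermediate 3-5, mild <= 2."""
    if not 0 <= score <= 10:
        raise ValueError("ISSF total ranges from 0 to 10")
    if score >= 6:
        return "severe"
    if score >= 3:
        return "intermediate"
    return "mild"
```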
Duff, Sharon; Goyen, Traci-Anne
2010-01-01
To determine the reliability and aspects of validity of the Evaluation Tool of Children's Handwriting-Cursive (ETCH-C; Amundson, 1995), using the general scoring criteria, when assessing children who use alternative writing scripts. Children in Years 5 and 6 with handwriting problems and a group of matched control participants from their respective classrooms were assessed with the ETCH-C twice, 4 weeks apart. Total Letter scores were most reliable; more variability should be expected for Total Word scores. Total Numeral scores showed unacceptable reliability levels and are not recommended. We found good discriminant validity for Letter and Word scores and established cutoff scores to distinguish children with and without handwriting dysfunction (Total Letter <90%, Total Word <85%). The ETCH-C, using the general scoring criteria, is a reliable and valid test of handwriting for children using alternative scripts.
Use of a respiratory clinical score among different providers.
Liu, Lenna L; Gallaher, Margaret M; Davis, Robert L; Rutter, Carolyn M; Lewis, Toby C; Marcuse, Edgar K
2004-03-01
Respiratory assessment of children with asthma or bronchiolitis is problematic because both the components of the assessment and their relative importance vary among care providers. Use of a systematic standard assessment process and clinical score may reduce interobserver variation. Our objective was to determine observer agreement among physicians (MD), nurses (RN), and respiratory therapists (RT), using a standard respiratory clinical score. A clinical score was developed incorporating four physiologic parameters: respiratory rate, retractions, dyspnea, and auscultation. One hundred and sixty-five provider pairs (e.g., MD-MD, RN-RT) independently assessed a total of 55 patients admitted for asthma, bronchiolitis, or wheezing at an urban tertiary-care hospital. A weighted kappa statistic measured agreement beyond chance. Rater pairs had high observed agreement on total score of 82-88% and weighted kappas ranging from 0.52 (MD-RN; 95% CI, 0.19, 0.79) to 0.65 (RN-RN; 95% CI, 0.46, 0.87). Observed agreement on individual components of the score ranged from 58% (auscultation) to 74% (dyspnea), with unweighted kappas of 0.36 (respiratory rate; 95% CI, 0.26, 0.46) to 0.53 (dyspnea; 95% CI, 0.41, 0.65). In conclusion, this respiratory clinical score demonstrates good interobserver agreement between MDs, RNs, and RTs. Future research is needed to examine validity and responsiveness in clinical settings. By standardizing respiratory assessments, use of a clinical score may facilitate care coordination by physicians, nurses, and respiratory therapists and thereby improve care of children hospitalized with asthma and bronchiolitis.
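The weighted kappa used above to measure agreement beyond chance can be sketched as follows. The abstract does not state which weighting scheme was used, so linear disagreement weights are assumed here; the function and variable names are ours.

```python
from collections import Counter

def weighted_kappa(r1, r2, categories, weights=None):
    """Cohen's weighted kappa for two raters scoring the same patients.
    Linear disagreement weights by default: w_ij = |i - j| / (k - 1)."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    if weights is None:
        weights = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    n = len(r1)
    # Observed mean disagreement across rater pairs
    obs = sum(weights[idx[a]][idx[b]] for a, b in zip(r1, r2)) / n
    # Expected mean disagreement if the raters were independent
    p1, p2 = Counter(r1), Counter(r2)
    exp = sum(weights[idx[a]][idx[b]] * p1[a] * p2[b] for a in p1 for b in p2) / n ** 2
    return 1 - obs / exp
```

Perfect agreement yields kappa = 1; agreement no better than chance yields 0, matching the interpretation of the 0.52-0.65 values reported above.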
Association between the Gensini Score and Carotid Artery Stenosis
Fidan, Serdar; Tabakçı, Mehmet Mustafa; Toprak, Cuneyt; Alizade, Elnur; Acar, Emrah; Bayam, Emrah; Tellice, Muhammet; Naser, Abdurrahman; Kargın, Ramazan
2016-01-01
Background and Objectives The aim of this study was to evaluate the association between the extent of coronary artery disease, assessed by the Gensini score and/or the SYNTAX score, and significant carotid stenosis in patients undergoing coronary artery bypass grafting (CABG). Subjects and Methods A total of 225 patients who had carotid Doppler ultrasonography prior to CABG were included retrospectively. Significant coronary artery disease was defined as a lumen diameter stenosis of ≥50% in any of the major epicardial coronary arteries. The severity of carotid stenosis was determined by B-mode and duplex ultrasonography. Clinically significant carotid stenosis was defined as a peak systolic velocity greater than 125 cm/s. Results The mean values of the SYNTAX score and the Gensini score were highest in patients with significant carotid stenosis (22.98±7.32, p<0.001 and 77.40±32.35, p<0.001, respectively). The other risk factors for significant carotid stenosis in univariate analysis were male gender (p=0.029), carotid bruit (p<0.001), diabetes (p=0.021), left main disease (p=0.002), 3-vessel disease (p=0.008), chronic total coronary occlusion (p=0.001), and coronary artery calcification (p=0.001). However, only the Gensini score (odds ratio [OR]=1.030, p=0.004), carotid bruit (OR=0.068, p<0.001), and male gender (OR=0.190, p=0.003) were independent predictors. The Gensini score cutoff value predicting significant carotid stenosis was 50.5 with 77% sensitivity (p<0.001). Conclusion The Gensini score may be used to identify patients at high risk for significant carotid stenosis prior to CABG. PMID:27721854
Estimating landscape carrying capacity through maximum clique analysis.
Donovan, Therese M; Warrington, Gregory S; Schwenk, W Scott; Dinitz, Jeffrey H
2012-12-01
Habitat suitability (HS) maps are widely used tools in wildlife science and establish a link between wildlife populations and landscape pattern. Although HS maps spatially depict the distribution of optimal resources for a species, they do not reveal the population size a landscape is capable of supporting--information that is often crucial for decision makers and managers. We used a new approach, "maximum clique analysis," to demonstrate how HS maps for territorial species can be used to estimate the carrying capacity, N(k), of a given landscape. We estimated the N(k) of Ovenbirds (Seiurus aurocapillus) and bobcats (Lynx rufus) in an 1153-km2 study area in Vermont, USA. These two species were selected to highlight different approaches in building an HS map as well as computational challenges that can arise in a maximum clique analysis. We derived 30-m2 HS maps for each species via occupancy modeling (Ovenbird) and by resource utilization modeling (bobcats). For each species, we then identified all pixel locations on the map (points) that had sufficient resources in the surrounding area to maintain a home range (termed a "pseudo-home range"). These locations were converted to a mathematical graph, where any two points were linked if two pseudo-home ranges could exist on the landscape without violating territory boundaries. We used the program Cliquer to find the maximum clique of each graph. The resulting estimates of N(k) = 236 Ovenbirds and N(k) = 42 female bobcats were sensitive to different assumptions and model inputs. Estimates of N(k) via alternative, ad hoc methods were 1.4 to > 30 times greater than the maximum clique estimate, suggesting that the alternative results may be upwardly biased. The maximum clique analysis was computationally intensive but could handle problems with < 1500 total pseudo-home ranges (points). Given present computational constraints, it is best suited for species that occur in clustered distributions (where the problem can be
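The graph formulation above (points linked when two pseudo-home ranges can coexist, carrying capacity = maximum clique size) can be illustrated with a toy exhaustive search. This is only a sketch of the definition: it runs in exponential time, whereas the Cliquer solver used in the study relies on branch-and-bound pruning; all names here are ours.

```python
from itertools import combinations

def maximum_clique(vertices, edges):
    """Exhaustive maximum-clique search: the largest set of vertices
    (e.g., pseudo-home-range points) that are pairwise linked."""
    adj = {v: set() for v in vertices}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    # Try the largest candidate sizes first, so the first hit is maximum.
    for size in range(len(vertices), 0, -1):
        for subset in combinations(vertices, size):
            if all(b in adj[a] for a, b in combinations(subset, 2)):
                return set(subset)
    return set()
```

On a graph where points 1, 2, 3 are mutually compatible and point 4 is compatible only with 3, the maximum clique is {1, 2, 3}: at most three territories fit.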
Prognostic Value of AIMS65 Score in Cirrhotic Patients with Upper Gastrointestinal Bleeding
Vinaya Gaduputi
2014-01-01
Full Text Available Introduction. Unlike the Rockall scoring system, AIMS65 is based only on clinical and laboratory features. In this study we investigated the correlation between the AIMS65 score and the Endoscopic Rockall score in cirrhotic and noncirrhotic patients. Methods. This is a retrospective study of patients admitted with overt UGIB and undergoing esophagogastroduodenoscopy (EGD). AIMS65 and Rockall scores were calculated at the time of admission. We investigated the correlation between both scores along with stigmata of bleeding seen on endoscopy. Results. A total of 1255 patients were studied; 152 patients were cirrhotic while 1103 were noncirrhotic. There was significant correlation between AIMS65 and Total Rockall scores in both groups. There was significant correlation between the AIMS65 score and the Endoscopic Rockall score in noncirrhotics but not in cirrhotics. AIMS65 scores in both cirrhotic and noncirrhotic groups were significantly higher in patients who died from UGIB than in patients who did not. Conclusion. We observed a statistically significant correlation between the AIMS65 score and the length of hospitalization and mortality in noncirrhotic patients. We found that the AIMS65 score paralleled the endoscopic grading of the lesion causing UGIB in noncirrhotics. In cirrhotics, the AIMS65 score correlated only with mortality, not with length of hospitalization or endoscopic stigmata of bleeding.
Cardiovascular risk scores for coronary atherosclerosis.
Yalcin, Murat; Kardesoglu, Ejder; Aparci, Mustafa; Isilak, Zafer; Uz, Omer; Yiginer, Omer; Ozmen, Namik; Cingozbay, Bekir Yilmaz; Uzun, Mehmet; Cebeci, Bekir Sitki
2012-10-01
The objective of this study was to compare frequently used cardiovascular risk scores in predicting the presence of coronary artery disease (CAD) and 3-vessel disease. In 350 consecutive patients (218 men and 132 women) who underwent coronary angiography, the cardiovascular risk level was determined using the Framingham Risk Score (FRS), the Modified Framingham Risk Score (MFRS), the Prospective Cardiovascular Münster (PROCAM) score, and the Systematic Coronary Risk Evaluation (SCORE). The area under the curve for receiver operating characteristic curves showed that FRS had more predictive value than the other scores for CAD (area under curve, 0.76, P MFRS, PROCAM, and SCORE) may predict the presence and severity of coronary atherosclerosis.The FRS had better predictive value than the other scores.
Maximum entropy principle and texture formation
Arminjon, Mayeul; Imbault, Didier
2006-01-01
The macro-to-micro transition in a heterogeneous material is envisaged as the selection of a probability distribution by the Principle of Maximum Entropy (MAXENT). The material is made of constituents, e.g. given crystal orientations. Each constituent is itself made of a large number of elementary constituents. The relevant probability is the volume fraction of the elementary constituents that belong to a given constituent and undergo a given stimulus. Assuming only obvious constraints in MAXENT means describing a maximally disordered material. This is proved to have the same average stimulus in each constituent. By adding a constraint in MAXENT, a new model, potentially interesting e.g. for texture prediction, is obtained.
MLDS: Maximum Likelihood Difference Scaling in R
Kenneth Knoblauch
2008-01-01
Full Text Available The MLDS package in the R programming language can be used to estimate perceptual scales based on the results of psychophysical experiments using the method of difference scaling. In a difference scaling experiment, observers compare two supra-threshold differences (a,b) and (c,d) on each trial. The approach is based on a stochastic model of how the observer decides which perceptual difference (or interval), (a,b) or (c,d), is greater, and the parameters of the model are estimated using a maximum likelihood criterion. We also propose a method to test the model by evaluating the self-consistency of the estimated scale. The package includes an example in which an observer judges the differences in correlation between scatterplots. The example may be readily adapted to estimate perceptual scales for arbitrary physical continua.
Maximum Profit Configurations of Commercial Engines
Yiran Chen
2011-06-01
Full Text Available An investigation of commercial engines with finite capacity low- and high-price economic subsystems and a generalized commodity transfer law [n ∝ Δ(P^m)] in commodity flow processes, in which effects of the price elasticities of supply and demand are introduced, is presented in this paper. Optimal cycle configurations of commercial engines for maximum profit are obtained by applying optimal control theory. In some special cases the eventual state, market equilibrium, is solely determined by the initial conditions and the inherent characteristics of the two subsystems, while the different ways of transfer affect the model in respect of the specific forms of the paths of prices and the instantaneous commodity flow, i.e., the optimal configuration.
Maximum Segment Sum, Monadically (distilled tutorial)
Jeremy Gibbons
2011-09-01
Full Text Available The maximum segment sum problem is to compute, given a list of integers, the largest of the sums of the contiguous segments of that list. This problem specification maps directly onto a cubic-time algorithm; however, there is a very elegant linear-time solution too. The problem is a classic exercise in the mathematics of program construction, illustrating important principles such as calculational development, pointfree reasoning, algebraic structure, and datatype-genericity. Here, we take a sideways look at the datatype-generic version of the problem in terms of monadic functional programming, instead of the traditional relational approach; the presentation is tutorial in style, and leavened with exercises for the reader.
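The "very elegant linear-time solution" mentioned above is classically Kadane's algorithm. The abstract's own treatment is monadic and datatype-generic; the imperative sketch below is just the standard formulation, shown for concreteness (it allows the empty segment, with sum 0).

```python
def max_segment_sum(xs):
    """Kadane's linear-time scan over a list of integers.
    best: largest sum over all contiguous segments seen so far.
    cur:  largest sum of a segment ending at the current element."""
    best = cur = 0
    for x in xs:
        cur = max(0, cur + x)   # extend the current segment, or restart
        best = max(best, cur)
    return best
```

On Bentley's classic example list the answer is 187, from the segment [59, 26, -53, 58, 97].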
Maximum Information and Quantum Prediction Algorithms
McElwaine, J N
1997-01-01
This paper describes an algorithm for selecting a consistent set within the consistent histories approach to quantum mechanics and investigates its properties. The algorithm uses a maximum information principle to select from among the consistent sets formed by projections defined by the Schmidt decomposition. The algorithm unconditionally predicts the possible events in closed quantum systems and ascribes probabilities to these events. A simple spin model is described and a complete classification of all exactly consistent sets of histories formed from Schmidt projections in the model is proved. This result is used to show that for this example the algorithm selects a physically realistic set. Other tentative suggestions in the literature for set selection algorithms using ideas from information theory are discussed.
Maximum process problems in optimal control theory
Goran Peskir
2005-01-01
Full Text Available Given a standard Brownian motion (B_t)_{t≥0} and the equation of motion dX_t = v_t dt + 2 dB_t, we set S_t = max_{0≤s≤t} X_s and consider the optimal control problem sup_v E(S_τ − Cτ), where C > 0 and the supremum is taken over all admissible controls v satisfying v_t ∈ [μ_0, μ_1] for all t up to τ = inf{t > 0 | X_t ∉ (ℓ_0, ℓ_1)}, with μ0g∗(S_t), where s ↦ g∗(s) is a switching curve that is determined explicitly (as the unique solution to a nonlinear differential equation). The solution found demonstrates that problem formulations based on a maximum functional can be successfully included in optimal control theory (calculus of variations), in addition to the classic problem formulations due to Lagrange, Mayer, and Bolza.
Maximum Spectral Luminous Efficacy of White Light
Murphy, T W
2013-01-01
As lighting efficiency improves, it is useful to understand the theoretical limits to luminous efficacy for light that we perceive as white. Independent of the efficiency with which photons are generated, there exists a spectrally-imposed limit to the luminous efficacy of any source of photons. We find that, depending on the acceptable bandpass and---to a lesser extent---the color temperature of the light, the ideal white light source achieves a spectral luminous efficacy of 250--370 lm/W. This is consistent with previous calculations, but here we explore the maximum luminous efficacy as a function of photopic sensitivity threshold, color temperature, and color rendering index; deriving peak performance as a function of all three parameters. We also present example experimental spectra from a variety of light sources, quantifying the intrinsic efficacy of their spectral distributions.
Maximum entropy model for business cycle synchronization
Xi, Ning; Muneepeerakul, Rachata; Azaele, Sandro; Wang, Yougui
2014-11-01
The global economy is a complex dynamical system, whose cyclical fluctuations can mainly be characterized by simultaneous recessions or expansions of major economies. Thus, the researches on the synchronization phenomenon are key to understanding and controlling the dynamics of the global economy. Based on a pairwise maximum entropy model, we analyze the business cycle synchronization of the G7 economic system. We obtain a pairwise-interaction network, which exhibits certain clustering structure and accounts for 45% of the entire structure of the interactions within the G7 system. We also find that the pairwise interactions become increasingly inadequate in capturing the synchronization as the size of economic system grows. Thus, higher-order interactions must be taken into account when investigating behaviors of large economic systems.
Quantum gravity momentum representation and maximum energy
Moffat, J. W.
2016-11-01
We use the idea of the symmetry between the spacetime coordinates xμ and the energy-momentum pμ in quantum theory to construct a momentum space quantum gravity geometry with a metric sμν and a curvature tensor Pλ μνρ. For a closed maximally symmetric momentum space with a constant 3-curvature, the volume of the p-space admits a cutoff with an invariant maximum momentum a. A Wheeler-DeWitt-type wave equation is obtained in the momentum space representation. The vacuum energy density and the self-energy of a charged particle are shown to be finite, and modifications of the electromagnetic radiation density and the entropy density of a system of particles occur for high frequencies.
Video segmentation using Maximum Entropy Model
QIN Li-juan; ZHUANG Yue-ting; PAN Yun-he; WU Fei
2005-01-01
Detecting objects of interest from a video sequence is a fundamental and critical task in automated visual surveillance. Most current approaches focus only on discriminating moving objects by background subtraction, whether the objects of interest are moving or stationary. In this paper, we propose layer segmentation to detect both moving and stationary target objects from surveillance video. We extend the Maximum Entropy (ME) statistical model to segment layers with features, which are collected by constructing a codebook with a set of codewords for each pixel. We also indicate how the trained models are used for the discrimination of target objects in surveillance video. Our experimental results are presented in terms of the success rate and the segmentation precision.
Cosmic shear measurement with maximum likelihood and maximum a posteriori inference
Hall, Alex
2016-01-01
We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with very promising results. We find that the introduction of an intrinsic shape prior mitigates noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely sub-dominant. We show how biases propagate to shear estima...
Santiago-Torres, Margarita; Tinker, Lesley F; Allison, Matthew A; Breymeyer, Kara L; Garcia, Lorena; Kroenke, Candyce H; Lampe, Johanna W; Shikany, James M; Van Horn, Linda; Neuhouser, Marian L
2015-12-01
Women of Mexican descent are disproportionally affected by obesity, systemic inflammation, and insulin resistance (IR). Available approaches used to give scores to dietary patterns relative to dietary guidelines may not effectively capture traditional diets of Mexicans, who comprise the largest immigrant group in the United States. We characterized an a priori traditional Mexican diet (MexD) score high in corn tortillas, beans, soups, Mexican mixed dishes (e.g., tamales), fruits, vegetables, full-fat milk, and Mexican cheeses and low in refined grains and added sugars and evaluated the association of the MexD score with systemic inflammation and IR in 493 postmenopausal participants in the Women's Health Initiative (WHI) who are of Mexican ethnic descent. The MexD score was developed from the baseline (1993-1998) WHI food frequency questionnaire, which included Hispanic foods and was available in Spanish. Body mass index (BMI) was computed from baseline measured weight and height, and ethnicity was self-reported. Outcome variables were high sensitivity C-reactive protein (hsCRP), glucose, insulin, homeostasis model assessment of insulin resistance (HOMA-IR), and triglyceride concentrations measured at follow-up (2012-2013). Multivariable linear and logistic regression models were used to test the associations of the MexD score with systemic inflammation and IR. The mean ± SD MexD score was 5.8 ± 2.1 (12 maximum points) and was positively associated with intakes of carbohydrates, vegetable protein, and dietary fiber and inversely associated with intakes of added sugars and total fat (P low MexD scores, consistent with a more-traditional Mexican diet, had 23% and 15% lower serum hsCRP (P obese women (P-interaction adherence to a traditional Mexican diet could help reduce the future risk of systemic inflammation and IR in women of Mexican descent. © 2015 American Society for Nutrition.
An ultrasound score for knee osteoarthritis
Riecke, B F; Christensen, R.; Torp-Pedersen, S
2014-01-01
OBJECTIVE: To develop standardized musculoskeletal ultrasound (MUS) procedures and scoring for detecting knee osteoarthritis (OA) and to test the MUS score's ability to discern various degrees of knee OA, in comparison with plain radiography and the 'Knee injury and Osteoarthritis Outcome Score' (KOOS).
Breaking of scored tablets : a review
van Santen, E; Barends, D M; Frijlink, H W
2002-01-01
The literature was reviewed regarding the advantages, problems and performance indicators of score lines. Scored tablets provide dose flexibility, ease of swallowing and may reduce the costs of medication. However, many patients are confronted with scored tablets that are broken unequally and with difficulty.
Developing Score Reports for Cognitive Diagnostic Assessments
Roberts, Mary Roduta; Gierl, Mark J.
2010-01-01
This paper presents a framework to provide a structured approach for developing score reports for cognitive diagnostic assessments ("CDAs"). Guidelines for reporting and presenting diagnostic scores are based on a review of current educational test score reporting practices and literature from the area of information design. A sample diagnostic…
Credit Scores, Race, and Residential Sorting
Nelson, Ashlyn Aiko
2010-01-01
Credit scores have a profound impact on home purchasing power and mortgage pricing, yet little is known about how credit scores influence households' residential location decisions. This study estimates the effects of credit scores on residential sorting behavior using a novel mortgage industry data set combining household demographic, credit, and…
Semiparametric score level fusion: Gaussian copula approach
Susyanyo, N.; Klaassen, C.A.J.; Veldhuis, R.N.J.; Spreeuwers, L.J.
2015-01-01
Score level fusion is an appealing method for combining multi-algorithm, multi-representation, and multi-modality biometrics due to its simplicity. Often, scores are assumed to be independent, but even for dependent scores, according to the Neyman-Pearson lemma, the likelihood ratio is the opti
An objective fluctuation score for Parkinson's disease.
Malcolm K Horne
Full Text Available Establishing the presence and severity of fluctuations is important in managing Parkinson's disease, yet there is no reliable, objective means of doing this. In this study we evaluated a Fluctuation Score derived from variations in dyskinesia and bradykinesia scores produced by an accelerometry-based system. The Fluctuation Score was produced by summing the interquartile ranges of bradykinesia scores and dyskinesia scores produced every 2 minutes between 0900-1800 for at least 6 days by the accelerometry-based system and expressing the result as an algorithm. This score could distinguish between fluctuating and non-fluctuating patients with high sensitivity and selectivity and was significantly lower following activation of deep brain stimulators. The scores following deep brain stimulation lay in a band just above the score separating fluctuators from non-fluctuators, suggesting a range representing adequate motor control. When compared with control subjects, the scores of newly diagnosed patients show a loss of fluctuation with onset of PD. The score was calculated in subjects whose duration of disease was known, and this showed that newly diagnosed patients soon develop higher scores which either fall under or within the range representing adequate motor control or instead go on to develop more severe fluctuations. The Fluctuation Score described here promises to be a useful tool for identifying patients whose fluctuations are progressing and may require therapeutic changes. It also shows promise as a useful research tool. Further studies are required to more accurately identify therapeutic targets and ranges.
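The core computation described above, summing the interquartile ranges of the two score series, can be sketched directly. This is a simplified illustration: the published score aggregates 2-minute epochs over at least 6 days, and that aggregation (and any scaling in the full algorithm) is omitted here.

```python
from statistics import quantiles

def fluctuation_score(bradykinesia, dyskinesia):
    """Sum of the interquartile ranges (IQR) of the bradykinesia and
    dyskinesia score series, following the abstract's description."""
    def iqr(xs):
        q1, _, q3 = quantiles(xs, n=4)   # exclusive-method quartiles
        return q3 - q1
    return iqr(bradykinesia) + iqr(dyskinesia)
```

A series with wider spread (more fluctuation) contributes a larger IQR, so fluctuating patients receive higher scores.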
Revision Total Elbow Arthroplasty.
Ramirez, Miguel A; Cheung, Emilie V; Murthi, Anand M
2017-08-01
Despite recent technologic advances, total elbow arthroplasty has complication rates higher than those of total joint arthroplasty in other joints. With new antirheumatic treatments, the population receiving total elbow arthroplasty has shifted from patients with rheumatoid arthritis to those with posttraumatic arthritis, further compounding the high complication rate. The most common mechanisms of failure, and reasons for revision, include infection, aseptic loosening, fracture, component failure, and instability. Tension band fixation, allograft struts with cerclage wire, and/or plate and screw constructs can be used for fracture stabilization.
Totalization Data Exchange (TDEX)
Social Security Administration — The Totalization Data Exchange (TDEX) process is an exchange between SSA and its foreign country partners to identify deaths of beneficiaries residing abroad. The...
Empirical evaluation of scoring functions for Bayesian network model selection.
Liu, Zhifa; Malone, Brandon; Yuan, Changhe
2012-01-01
In this work, we empirically evaluate the capability of various scoring functions of Bayesian networks for recovering true underlying structures. Similar investigations have been carried out before, but they typically relied on approximate learning algorithms to learn the network structures. The suboptimal structures found by the approximation methods have unknown quality and may affect the reliability of their conclusions. Our study uses an optimal algorithm to learn Bayesian network structures from datasets generated from a set of gold standard Bayesian networks. Because all optimal algorithms always learn equivalent networks, this ensures that only the choice of scoring function affects the learned networks. Another shortcoming of the previous studies stems from their use of random synthetic networks as test cases. There is no guarantee that these networks reflect real-world data. We use real-world data to generate our gold-standard structures, so our experimental design more closely approximates real-world situations. A major finding of our study suggests that, in contrast to results reported by several prior works, the Minimum Description Length (MDL) (or equivalently, Bayesian information criterion (BIC)) consistently outperforms other scoring functions such as Akaike's information criterion (AIC), Bayesian Dirichlet equivalence score (BDeu), and factorized normalized maximum likelihood (fNML) in recovering the underlying Bayesian network structures. We believe this finding is a result of using both datasets generated from real-world applications rather than from random processes used in previous studies and learning algorithms to select high-scoring structures rather than selecting random models. Other findings of our study support existing work, e.g., large sample sizes result in learning structures closer to the true underlying structure; the BDeu score is sensitive to the parameter settings; and the fNML performs pretty well on small datasets. We also
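The BIC/MDL score that the study finds most reliable can be sketched as a local score for one node given its parents; this is the standard formulation, not the authors' implementation, and all names below are illustrative:

```python
import numpy as np
from collections import Counter

def bic_node_score(data, child, parents, arities):
    """BIC/MDL local score of discrete variable `child` given `parents`
    (column indices) for `data` (rows = samples): log-likelihood minus
    (log n / 2) times the number of free parameters."""
    n = len(data)
    # Joint and marginal counts over parent configurations.
    counts = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    # Maximized log-likelihood from the empirical conditional frequencies.
    ll = sum(c * np.log(c / parent_counts[pa]) for (pa, _), c in counts.items())
    # Free parameters: (r_child - 1) per parent configuration.
    n_params = (arities[child] - 1) * np.prod([arities[p] for p in parents], dtype=int)
    return ll - 0.5 * np.log(n) * n_params
```

Summing this local score over all nodes gives the decomposable network score that an optimal structure-learning algorithm maximizes; AIC, BDeu, and fNML differ only in the penalty or prior term.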
SCORE DIGITAL TECHNOLOGY: THE CONVERGENCE
Chernyshov Alexander V.
2013-12-01
Full Text Available This article explores the role of digital score writers in today's culture, education, music industry, and media environment. The guiding principle of the software's development is not only publishing innovation relating to sheet music, but also integration into composition, arrangement, education, and the creative process for works based on digital technology (film, television and radio broadcasting, the Internet, audio and video art). The convergence of music-computer technology is therefore a total phenomenon: notation programs are combined with the means of a MIDI sequencer and audio and video editors. The article contains a unique interview with the creator of a music notation processor.
The biochemical composition of plankton in a subsurface chlorophyll maximum
Dortch, Quay
1987-06-01
The biochemical composition of plankton at a station with a deep, subsurface chlorophyll maximum (SCM) below a nitrogen-depleted surface layer off the Washington coast was determined in order to answer long-standing questions about the nature and causes of SCM. The chlorophyll maximum did not correspond to a protein-biomass maximum, and chlorophyll: protein ratios indicate that only in the SCM were phytoplankton a major constituent of the total biomass. Ratios of free amino acids: protein in the particulate matter were high at all depths in the euphotic zone. From this it can be concluded that phytoplankton in the SCM are N-sufficient, since they make up 80-90% of the biomass there. Above and below the SCM, where non-phytoplankton predominate, the state of N deficiency or sufficiency of the phytoplankton cannot be ascertained until more is known about how the chemical composition of phytoplankton, zooplankton and bacteria are related. However, if it is assumed that very N-sufficient zooplankton and bacteria would not coexist with very N-deficient phytoplankton, then it seems likely that the phytoplankton were also N-sufficient or nearly so. Thus, the biochemical indicators do not support the hypothesis that the SCM forms because it represents the only layer in the water column with adequate N and light for phytoplankton growth. Comparison of the chlorophyll: protein ratios with those from cultures and from other regions suggests that oligotrophic areas have a much higher proportion of non-phytoplankton biomass than do eutrophic areas.
Azam Zaka
2014-10-01
Full Text Available This paper is concerned with the modifications of maximum likelihood, moments and percentile estimators of the two parameter Power function distribution. Sampling behavior of the estimators is indicated by Monte Carlo simulation. For some combinations of parameter values, some of the modified estimators appear better than the traditional maximum likelihood, moments and percentile estimators with respect to bias, mean square error and total deviation.
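For reference, the traditional maximum likelihood estimators that the paper modifies have a closed form for the two-parameter power function distribution f(x) = a·x^(a-1)/b^a on (0, b); the sketch below shows only the unmodified baseline, since the paper's modifications are not detailed in the abstract:

```python
import numpy as np

def power_mle(x):
    """Traditional (unmodified) MLEs for the two-parameter power function
    distribution f(x) = a * x**(a - 1) / b**a, 0 < x < b."""
    x = np.asarray(x, dtype=float)
    b_hat = x.max()                            # scale: the sample maximum
    a_hat = len(x) / np.log(b_hat / x).sum()   # shape: n / sum(log(b_hat / x_i))
    return a_hat, b_hat
```

Because b_hat = max(x) is biased low for the true endpoint b, modifications of the kind the paper studies typically adjust this boundary estimate before re-estimating the shape.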
A maximum principle for diffusive Lotka-Volterra systems of two competing species
Chen, Chiun-Chuan; Hung, Li-Chang
2016-10-01
Using an elementary approach, we establish a new maximum principle for the diffusive Lotka-Volterra system of two competing species, which involves pointwise estimates of an elliptic equation consisting of the second derivative of one function, the first derivative of another function, and a quadratic nonlinearity. This maximum principle gives a priori estimates for the total mass of the two species. Moreover, applying it to the system of three competing species leads to a nonexistence theorem of traveling wave solutions.
The Influence of Hospital Market Competition on Patient Mortality and Total Performance Score.
Haley, Donald Robert; Zhao, Mei; Spaulding, Aaron; Hamadi, Hanadi; Xu, Jing; Yeomans, Katelyn
2016-01-01
The Affordable Care Act of 2010 launch of Medicare Value-Based Purchasing has become the platform for payment reform. It is a mechanism by which buyers of health care services hold providers accountable for high-quality and cost-effective care. The objective of the study was to examine the relationship between quality of hospital care and hospital competition using the quality-quantity behavioral model of hospital behavior. The quality-quantity behavioral model of hospital behavior was used as the conceptual framework for this study. Data from the American Hospital Association database, the Hospital Compare database, and the Area Health Resources Files database were used. Multivariate regression analysis was used to examine the effect of hospital competition on patient mortality. Hospital market competition was significantly and negatively related to the 3 mortality rates. Consistent with the literature, hospitals located in more competitive markets had lower mortality rates for patients with acute myocardial infarction, heart failure, and pneumonia. The results suggest that hospitals may compete more readily on quality of care and patient outcomes. The findings are important because policies that restrict a competitive hospital environment, such as Certificate of Need legislation, may negatively affect patient mortality rates. Therefore, policymakers should encourage the development of policies that facilitate a more competitive and transparent health care marketplace to improve patient mortality rates.
Fjetland, Lars, E-mail: lars.fjetland@lyse.net; Roy, Sumit, E-mail: sumit.roy@sus.no; Kurz, Kathinka D., E-mail: kathinka.dehli.kurz@sus.no [Stavanger University Hospital, Department of Radiology (Norway); Solbakken, Tore, E-mail: tore.solbakken@sus.no [Stavanger University Hospital, Department of Neurology (Norway); Larsen, Jan Petter, E-mail: jan.petter.larsen@sus.no; Kurz, Martin W., E-mail: martin.kurz@sus.no [The Norwegian Center for Movement Disorders, Stavanger University Hospital (Norway)
2013-10-15
Purpose: Intra-arterial therapy (IAT) is used increasingly as a treatment option for acute stroke caused by central large vessel occlusions. Despite high rates of recanalization, the clinical outcome is highly variable. The authors evaluated the Houston IAT (HIAT) and the totaled health risks in vascular events (THRIVE) score, two predictive scores designed to identify patients likely to benefit from IAT. Methods: Fifty-two patients treated at Stavanger University Hospital with IAT from May 2009 to June 2012 were included in this study. We combined the scores in an additional analysis. We also performed an additional analysis according to high age and evaluated the scores with respect to technical efficacy. Results: Fifty-two patients were evaluated by the THRIVE score and 51 by the HIAT score. We found a strong correlation between the level of predicted risk and the actual clinical outcome (THRIVE p = 0.002, HIAT p = 0.003). The correlations were limited to patients successfully recanalized and to patients <80 years. By combining the scores, an additional 14.3% of the patients could be identified as poor candidates for IAT. Both scores were insufficient to identify patients with a good clinical outcome. Conclusions: Both scores showed a strong correlation to poor clinical outcome in patients <80 years. The specificity of the scores could be enhanced by combining them. Both scores were insufficient to identify patients with a good clinical outcome and showed no association with clinical outcome in patients aged ≥80 years.
The Total Embedding Distributions of Cacti and Necklaces
Yi Chao CHEN; Yan Pei LIU; Tao WANG
2006-01-01
Total embedding distributions have been known for only a few classes of graphs. In this paper the total embedding distributions of the cacti and the necklaces are obtained. Furthermore, we obtain the total embedding distributions of all graphs with maximum genus 1 by using the method of this paper.
Strictness and Totality Analysis
Solberg, K. L.; Nielson, Hanne Riis; Nielson, Flemming
1998-01-01
We define a novel inference system for strictness and totality analysis for the simply-typed lazy lambda-calculus with constants and fixpoints. Strictness information identifies those terms that definitely denote bottom (i.e. do not evaluate to WHNF) whereas totality information identifies those ...
Genoptræning efter total knæalloplastik [Rehabilitation after total knee arthroplasty]
Holm, Bente; Kehlet, Henrik
2009-01-01
The short- and long-term benefits of post-discharge physiotherapy regimens after total knee arthroplasty are debatable. A national survey including hospitals in Denmark that perform total knee arthroplasty showed a large variability in indication and regimen for post-knee arthroplasty rehabilitat...
Committee Opinion No. 644: The Apgar Score.
2015-10-01
The Apgar score provides an accepted and convenient method for reporting the status of the newborn infant immediately after birth and the response to resuscitation if needed. The Apgar score alone cannot be considered to be evidence of or a consequence of asphyxia, does not predict individual neonatal mortality or neurologic outcome, and should not be used for that purpose. An Apgar score assigned during a resuscitation is not equivalent to a score assigned to a spontaneously breathing infant. The American Academy of Pediatrics and the American College of Obstetricians and Gynecologists encourage use of an expanded Apgar score reporting form that accounts for concurrent resuscitative interventions.
Total Monomeric Anthocyanin and Total Flavonoid Content of Processed Purple Potato
Florentina Damşa
2016-01-01
Full Text Available It is well known that processing changes the physical and chemical composition of foods, thus affecting their content of bioactive substances. Potatoes are almost always consumed after processing (baked, fried or boiled), making it critical to understand the effect of such processing techniques on their content of bioactive compounds. In order to determine the influence of processing on the content of anthocyanin pigments and flavonoids, these compounds were extracted from boiled and baked purple potato tubers (Albastru-Violet de Galanesti variety). Also, in order to obtain the maximum amount of anthocyanin pigments and flavonoids from processed potatoes, ultrasonic extraction (20 kHz) was applied and mathematical modeling (central composite design) was performed using SigmaXL software. The total anthocyanin content was determined spectrophotometrically by the pH differential method and the total flavonoid content was determined colorimetrically by the AlCl3 method. This study proves that potato processing decreases the content of anthocyanin pigments and flavonoids.
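The pH differential calculation mentioned above is standard (the AOAC method); a sketch follows, expressing results as cyanidin-3-glucoside equivalents. The default constants (MW = 449.2 g/mol, molar absorptivity 26900) are the usual choices for that reference pigment, not values taken from this study:

```python
def total_monomeric_anthocyanin(a520_ph1, a700_ph1, a520_ph45, a700_ph45,
                                dilution_factor, mw=449.2, eps=26900.0,
                                path_cm=1.0):
    """pH differential method: total monomeric anthocyanin in mg/L as
    cyanidin-3-glucoside equivalents.

    The absorbance difference is read at 520 nm (corrected at 700 nm for
    haze) in pH 1.0 and pH 4.5 buffers; eps is the molar absorptivity and
    path_cm the cuvette path length.
    """
    a = (a520_ph1 - a700_ph1) - (a520_ph45 - a700_ph45)
    return a * mw * dilution_factor * 1000 / (eps * path_cm)
```

Running the same readings for raw, boiled, and baked extracts makes the processing losses the abstract describes directly comparable.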
Dependence of maximum concentration from chemical accidents on release duration
Hanna, Steven; Chang, Joseph
2017-01-01
Chemical accidents often involve releases of a total mass, Q, of stored material in a tank over a time duration, td, of less than a few minutes. The value of td is usually uncertain because of lack of knowledge of key information, such as the size and location of the hole and the pressure and temperature of the chemical. In addition, it is rare that eyewitnesses or video cameras are present at the time of the accident. For inhalation hazards, serious health effects (such as damage to the respiratory system) are determined by short-term averages. Examples of pressurized liquefied chlorine releases from tanks are given, focusing on scenarios from the Jack Rabbit I (JR I) field experiment. The analytical calculations and the predictions of the SLAB dense gas dispersion model agree that the ratio of maximum C for two different td's is greatest (as much as a factor of ten) near the source. At large distances (beyond a few km for the JR I scenarios), where the travel time exceeds both td's, the ratio of maximum C approaches unity.
Maximum Entropy, Word-Frequency, Chinese Characters, and Multiple Meanings
Yan, Xiao-Yong
2014-01-01
The word-frequency distribution of a text written by an author is well accounted for by a maximum entropy distribution, the RGF (random group formation)-prediction. The RGF-distribution is completely determined by the a priori values of the total number of words in the text (M), the number of distinct words (N) and the number of repetitions of the most common word (k_max). It is here shown that this maximum entropy prediction also describes a text written in Chinese characters. In particular it is shown that although the same Chinese text written in words and Chinese characters have quite differently shaped distributions, they are nevertheless both well predicted by their respective three a priori characteristic values. It is pointed out that this is analogous to the change in the shape of the distribution when translating a given text to another language. Another consequence of the RGF-prediction is that taking a part of a long text will change the input parameters (M, N, k_max) and consequently also the sha...
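Extracting the three a priori values (M, N, k_max) that determine the RGF prediction is straightforward; a sketch with naive whitespace tokenization (real preprocessing of Chinese text would instead require character- or word-level segmentation):

```python
from collections import Counter

def rgf_inputs(text):
    """The three a priori values that fully determine the RGF maximum
    entropy prediction: total word count M, distinct word count N, and
    the frequency k_max of the most common word."""
    freq = Counter(text.split())
    m = sum(freq.values())      # M: total number of words
    n = len(freq)               # N: number of distinct words
    k_max = max(freq.values())  # k_max: repetitions of the most common word
    return m, n, k_max
```

Taking only a part of a long text changes all three values, which is why, as the abstract notes, the predicted distribution shape changes with text length.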
Ionization and maximum energy of nuclei in shock acceleration theory
Morlino, Giovanni
2011-01-01
We study the acceleration of heavy nuclei at SNR shocks when the process of ionization is taken into account. Heavy atoms ($Z_N >$ few) in the interstellar medium which start the diffusive shock acceleration (DSA) are never fully ionized at the moment of injection. The ionization occurs during the acceleration process, when atoms already move relativistically. For typical environments around SNRs the photo-ionization due to the background galactic radiation dominates over Coulomb collisions. The main consequence of ionization is the reduction of the maximum energy which ions can achieve with respect to the standard result of the DSA. In fact the photo-ionization has a timescale comparable to the beginning of the Sedov-Taylor phase, hence the maximum energy is no longer proportional to the nuclear charge, as predicted by standard DSA, but rather to the effective ions' charge during the acceleration process, which is smaller than the total nuclear charge $Z_N$. This result can have a direct consequence in the pred...
IDENTIFICATION OF IDEOTYPES BY CANONICAL ANALYSIS IN Panicum maximum
Janaina Azevedo Martuscello
2015-04-01
Full Text Available Grouping of genotypes by canonical variable analysis is an important tool in breeding. It allows the grouping of individuals with similar characteristics that are associated with superior agronomic performance and may indicate the ideal profile of a plant for the region. The objective of the present study was to define, by canonical analysis, the agronomic profile of Panicum maximum plants adapted to the Agreste region. The experiment was conducted in a completely randomized design with 28 treatments, 22 genotypes of Panicum maximum, and cultivars Mombasa, Tanzania, Massai, Milenio, BRS Zuri, and BRS Tamani, in triplicate in 4-m² plots. Plots were harvested five times and the following traits were evaluated: plant height; total, leaf, stem, and dead dry matter yields; leaf:stem ratio; leaf percentage; and volumetric density of forage. The analysis of canonical variables was performed based on the phenotypic means of the evaluated traits and on the residual variance and covariance matrix. Genotype PM34 showed higher mean leaf dry matter yield under the conditions of the Agreste of Alagoas (on average 53% higher than cultivars Mombasa, Tanzania, Milenio, and Massai). It was possible to summarize the variation observed in eight agronomic characteristics in only two canonical variables, accounting for 81.44% of the data variation. The ideotype plant adapted to the conditions of the Agreste should be tall and present high leaf yield, leaf percentage, and leaf:stem ratio, and intermediate values of volumetric density of forage.
Azour, Lea; Kadoch, Michael A; Ward, Thomas J; Eber, Corey D; Jacobi, Adam H
Coronary artery calcium (CAC) is often identified on routine chest computed tomography (CT). The purpose of our study was to evaluate whether ordinal scoring of CAC on non-gated, routine chest CT is an accurate predictor of Agatston score ranges in a community-based population, and in particular to determine the accuracy of an ordinal score of zero on routine chest CT. Two thoracic radiologists reviewed consecutive same-day ECG-gated and routine non-gated chest CT scans of 222 individuals. CAC was quantified using the Agatston scoring on the ECG-gated scans, and using an ordinal method on routine scans, with a score from 0 to 12. The pattern and distribution of CAC was assessed. The correlation between routine exam ordinal scores and Agatston scores in ECG-gated exams, as well as the accuracy of assigning a zero calcium score on routine chest CT was determined. CAC was most prevalent in the left anterior descending coronary artery in both single and multi-vessel coronary artery disease. There was a strong correlation between the non-gated ordinal and ECG-gated Agatston scores (r = 0.811, p < 0.01). Excellent inter-reader agreement (k = 0.95) was shown for the presence (total ordinal score ≥1) or absence (total ordinal score = 0) of CAC on routine chest CT. The negative predictive value for a total ordinal score of zero on routine CT was 91.6% (95% CI, 85.1-95.9). Total ordinal scores of 0, 1-3, 4-5, and ≥6 corresponded to average Agatston scores of 0.52 (0.3-0.8), 98.7 (78.2-117.1), 350.6 (264.9-436.3) and 1925.4 (1526.9-2323.9). Visual assessment of CAC on non-gated routine chest CT accurately predicts Agatston score ranges, including the zero score, in ECG-gated CT. Inclusion of this information in radiology reports may be useful to convey important information on cardiovascular risk, particularly premature atherosclerosis in younger patients. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights
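The reported ordinal-to-Agatston correspondence can be written as a simple lookup; the bin edges follow the groups reported in the abstract (0, 1-3, 4-5, ≥6), and the returned strings are the study's averages with 95% CIs:

```python
def agatston_from_ordinal(total_ordinal):
    """Map a routine-CT total ordinal CAC score (0-12) to the average
    Agatston score (95% CI) reported for that group in the study."""
    if total_ordinal == 0:
        return "0.52 (0.3-0.8)"
    if total_ordinal <= 3:
        return "98.7 (78.2-117.1)"
    if total_ordinal <= 5:
        return "350.6 (264.9-436.3)"
    return "1925.4 (1526.9-2323.9)"
```

A lookup like this could let a report generated from a non-gated chest CT convey an approximate Agatston range without a dedicated ECG-gated scan.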
WMAXC: a weighted maximum clique method for identifying condition-specific sub-network.
Amgalan, Bayarbaatar; Lee, Hyunju
2014-01-01
Sub-networks can expose complex patterns in an entire bio-molecular network by extracting interactions that depend on temporal or condition-specific contexts. When genes interact with each other during cellular processes, they may form differential co-expression patterns with other genes across different cell states. The identification of condition-specific sub-networks is of great importance in investigating how a living cell adapts to environmental changes. In this work, we propose the weighted MAXimum clique (WMAXC) method to identify a condition-specific sub-network. WMAXC first proposes scoring functions that jointly measure condition-specific changes to both individual genes and gene-gene co-expressions. It then employs a weaker formulation of the general maximum clique problem and relates the maximum scored clique of a weighted graph to the optimization of a quadratic objective function under sparsity constraints. We combine a continuous genetic algorithm and a projection procedure to obtain a single optimal sub-network that maximizes the objective function (scoring function) over the standard simplex (sparsity constraints). We applied the WMAXC method to both simulated data and real data sets of ovarian and prostate cancer. Compared with previous methods, WMAXC selected a large fraction of cancer-related genes, which were enriched in cancer-related pathways. The results demonstrated that our method efficiently captured a subset of genes relevant under the investigated condition.
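The quadratic-over-simplex connection WMAXC builds on goes back to Motzkin and Straus: maximizing x^T A x over the standard simplex for a graph's adjacency matrix A localizes a maximum clique. The sketch below uses simple replicator dynamics on an unweighted graph, a much-simplified stand-in for WMAXC's weighted, sparsity-constrained objective and genetic-algorithm search:

```python
import numpy as np

def clique_via_motzkin_straus(adj, iters=500):
    """Replicator dynamics for maximizing f(x) = x^T A x over the standard
    simplex (Motzkin-Straus). The support of the limiting point indicates
    a (maximal) clique of the graph with adjacency matrix `adj`."""
    n = len(adj)
    x = np.full(n, 1.0 / n)          # start at the simplex barycenter
    for _ in range(iters):
        ax = adj @ x
        x = x * ax / (x @ ax)        # multiplicative update; stays on simplex
    return {i for i in range(n) if x[i] > 1e-6}
```

WMAXC replaces the 0/1 adjacency with condition-specific gene and co-expression scores, so the same machinery extracts a high-scoring, densely connected sub-network rather than a plain clique.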
Using family atopy scores to identify the risk of atopic dermatitis in infants
Melisa Anggraeni
2014-11-01
Full Text Available Background Atopic dermatitis is the first manifestation of allergic disease in early life. Early interventions may prevent the development of allergic disease. Allergy trace cards have been used to identify the level of allergic risk, based on family atopy scores. Because environmental factors may also influence the development of atopic dermatitis, the usefulness of the allergy trace card needs to be reevaluated. Objective To compare the incidence of atopic dermatitis in infants aged 0-4 months with total family atopy scores of > 0 to those with scores of 0. Methods We conducted this cohort study from June 1, 2012 to December 31, 2012 at Sanglah Hospital, Denpasar. Family atopy scores were tabulated from all pregnant women in the Obstetric Outpatient Clinic and the Maternity Room. Subjects were divided into two groups based on their total family atopy score: those with scores > 0 and those with scores of 0. The appearance of atopic dermatitis symptoms in the infants was evaluated until they reached 4 months of age. The incidence of atopic dermatitis in the two groups was compared using the Chi-square test. Results The incidence of atopic dermatitis in this study was 10.9%. The group with total family atopy scores > 0 had a significantly higher incidence of atopic dermatitis than the group with scores of 0 (adjusted RR 22.5; 95%CI 8.8 to 57.0; P = 0.001). Conclusion The incidence of atopic dermatitis is higher in infants with a total family atopy score > 0, and this group has a 22.5 times higher risk of atopic dermatitis compared to infants with a total family atopy score of 0. Allergy trace cards are relevant in differentiating the risk of atopy with regard to the development of atopic dermatitis. We suggest that family atopy scores be evaluated during antenatal care in order to limit the development of atopic dermatitis in infants. [Paediatr Indones. 2014;54:330-7.].
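The group comparison reduces to a relative risk; a sketch of the crude (unadjusted) calculation, with illustrative counts, noting that the study reports an adjusted RR of 22.5 and the adjustment itself is not shown here:

```python
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Crude relative risk: incidence in the exposed group (here, family
    atopy score > 0) divided by incidence in the unexposed group (score 0)."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)
```

With hypothetical counts of 20 cases among 100 score > 0 infants versus 1 case among 100 score 0 infants, the crude RR would be 20.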
Suh, Youngjoo; Kim, Hoirin
2014-12-01
In this paper, a new discriminative likelihood score weighting technique is proposed for speaker identification. The proposed method employs a discriminative weighting of frame-level log-likelihood scores with acoustic-phonetic classification in the Gaussian mixture model (GMM)-based speaker identification. Experiments performed on the Aurora noise-corrupted TIMIT database showed that the proposed approach provides meaningful performance improvement with an overall relative error reduction of 15.8% over the maximum likelihood-based baseline GMM approach.
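The scoring rule can be sketched as a weighted sum of frame-level log-likelihoods per speaker model; uniform weights recover the maximum likelihood baseline, and the weights here stand in for the learned, acoustic-phonetic-class-dependent weights of the paper:

```python
import numpy as np

def identify_speaker(frame_loglikes, frame_weights):
    """GMM speaker identification with weighted frame-level scores.

    frame_loglikes: array (n_speakers, n_frames) of log p(x_t | model_s)
                    from each speaker's GMM.
    frame_weights:  array (n_frames,) of per-frame weights, e.g. derived
                    from an acoustic-phonetic classification of each frame
                    (uniform weights give the standard ML decision).
    Returns the index of the highest-scoring speaker model.
    """
    scores = frame_loglikes @ frame_weights
    return int(np.argmax(scores))
```

Down-weighting frames from phonetic classes that discriminate speakers poorly (or are noise-dominated) is what yields the reported error reduction over the unweighted baseline.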
20 CFR 211.14 - Maximum creditable compensation.
2010-04-01
... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Maximum creditable compensation. 211.14... CREDITABLE RAILROAD COMPENSATION § 211.14 Maximum creditable compensation. Maximum creditable compensation... Employment Accounts shall notify each employer of the amount of maximum creditable compensation applicable...
49 CFR 230.24 - Maximum allowable stress.
2010-10-01
... 49 Transportation 4 2010-10-01 2010-10-01 false Maximum allowable stress. 230.24 Section 230.24... Allowable Stress § 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
Molé, C; Simon, E
2015-06-01
The management of cleft lip, alveolar and palate sequelae remains problematic today. To optimize it, we tried to establish a new clinical index for diagnostic and prognostic purposes. Seven tissue indicators, that we consider to be important in the management of alveolar sequelae, are listed by assigning them individual scores. The final score, obtained by adding together the individual scores, can take a low, high or maximum value. We propose a new classification (ACS: Alveolar Cleft Score) that guides the therapeutic team toward a prognostic approach, in terms of the recommended surgical and prosthetic reconstruction, the type of medical care required, and the preventive and supportive therapy to establish. Current studies are often only based on a standard radiological evaluation of the alveolar bone height at the cleft site. However, the gingival, osseous and cellular areas bordering the alveolar cleft sequelae induce many clinical parameters, which should be reflected in the morphological diagnosis, to better direct the surgical indications and the future prosthetic requirements, and to best maintain successful long-term aesthetic and functional results. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Theoretical Estimate of Maximum Possible Nuclear Explosion
Bethe, H. A.
1950-01-31
The maximum nuclear accident which could occur in a Na-cooled, Be-moderated, Pu and power producing reactor is estimated theoretically. (T.R.H.) Results of nuclear calculations for a variety of compositions of fast, heterogeneous, sodium-cooled, U-235-fueled, plutonium- and power-producing reactors are reported. Core compositions typical of plate-, pin-, or wire-type fuel elements and with uranium as metal, alloy, and oxide were considered. These compositions included atom ratios in the following range: U-238 to U-235 from 2 to 8; sodium to U-235 from 1.5 to 12; iron to U-235 from 5 to 18; and vanadium to U-235 from 11 to 33. Calculations were performed to determine the effect of lead and iron reflectors between the core and blanket. Both natural and depleted uranium were evaluated as the blanket fertile material. Reactors were compared on a basis of conversion ratio, specific power, and the product of both. The calculated results are in general agreement with the experimental results from fast reactor assemblies. An analysis of the effect of new cross-section values as they became available is included. (auth)
Proposed principles of maximum local entropy production.
Ross, John; Corlan, Alexandru D; Müller, Stefan C
2012-07-12
Articles have appeared that rely on the application of some form of "maximum local entropy production principle" (MEPP). This is usually an optimization principle that is supposed to compensate for the lack of structural information and measurements about complex systems, even systems as complex and as little characterized as the whole biosphere or the atmosphere of the Earth or even of less known bodies in the solar system. We select a number of claims from a few well-known papers that advocate this principle and we show that they are in error with the help of simple examples of well-known chemical and physical systems. These erroneous interpretations can be attributed to ignoring well-established and verified theoretical results such as (1) entropy does not necessarily increase in nonisolated systems, such as "local" subsystems; (2) macroscopic systems, as described by classical physics, are in general intrinsically deterministic: there are no "choices" in their evolution to be selected by using supplementary principles; (3) macroscopic deterministic systems are predictable to the extent to which their state and structure are sufficiently well known; usually they are not sufficiently known, and probabilistic methods need to be employed for their prediction; and (4) there is no causal relationship between the thermodynamic constraints and the kinetics of reaction systems. In conclusion, any predictions based on MEPP-like principles should not be considered scientifically founded.
Maximum entropy production and plant optimization theories.
Dewar, Roderick C
2010-05-12
Plant ecologists have proposed a variety of optimization theories to explain the adaptive behaviour and evolution of plants from the perspective of natural selection ('survival of the fittest'). Optimization theories identify some objective function--such as shoot or canopy photosynthesis, or growth rate--which is maximized with respect to one or more plant functional traits. However, the link between these objective functions and individual plant fitness is seldom quantified and there remains some uncertainty about the most appropriate choice of objective function to use. Here, plants are viewed from an alternative thermodynamic perspective, as members of a wider class of non-equilibrium systems for which maximum entropy production (MEP) has been proposed as a common theoretical principle. I show how MEP unifies different plant optimization theories that have been proposed previously on the basis of ad hoc measures of individual fitness--the different objective functions of these theories emerge as examples of entropy production on different spatio-temporal scales. The proposed statistical explanation of MEP, that states of MEP are by far the most probable ones, suggests a new and extended paradigm for biological evolution--'survival of the likeliest'--which applies from biomacromolecules to ecosystems, not just to individuals.
Maximum likelihood continuity mapping for fraud detection
Hogden, J.
1997-05-01
The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints. CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction--important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.
Maximum life spiral bevel reduction design
Savage, M.; Prasanna, M. G.; Coe, H. H.
1992-07-01
Optimization is applied to the design of a spiral bevel gear reduction for maximum life at a given size. A modified feasible directions search algorithm permits a wide variety of inequality constraints and exact design requirements to be met with low sensitivity to initial values. Gear tooth bending strength and minimum contact ratio under load are included in the active constraints. The optimal design of the spiral bevel gear reduction includes the selection of bearing and shaft proportions in addition to gear mesh parameters. System life is maximized subject to a fixed back-cone distance of the spiral bevel gear set for a specified speed ratio, shaft angle, input torque, and power. Significant parameters in the design are: the spiral angle, the pressure angle, the numbers of teeth on the pinion and gear, and the location and size of the four support bearings. Interpolated polynomials expand the discrete bearing properties and proportions into continuous variables for gradient optimization. After finding the continuous optimum, a designer can analyze near optimal designs for comparison and selection. Design examples show the influence of the bearing lives on the gear parameters in the optimal configurations. For a fixed back-cone distance, optimal designs with larger shaft angles have larger service lives.
CORA - emission line fitting with Maximum Likelihood
Ness, J.-U.; Wichmann, R.
2002-07-01
The advent of pipeline-processed data both from space- and ground-based observatories often obviates the need for full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum, and a fixed-point equation is derived, allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.
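The Poisson likelihood criterion described here can be sketched in a few lines. The Gaussian-line-plus-flat-background model and the toy count spectrum below are illustrative assumptions, not CORA's actual parameterization:

```python
import math

def line_model(x, amp, center, sigma, bg):
    """Expected counts per bin: a Gaussian emission line on a flat background."""
    return bg + amp * math.exp(-0.5 * ((x - center) / sigma) ** 2)

def poisson_log_likelihood(counts, xs, amp, center, sigma, bg):
    """log P(data | model) under independent Poisson noise in each bin."""
    ll = 0.0
    for n, x in zip(counts, xs):
        mu = line_model(x, amp, center, sigma, bg)
        ll += n * math.log(mu) - mu - math.lgamma(n + 1)
    return ll

# Toy spectrum: a wavelength grid around 13.5 A with a peak near the center.
xs = [13.3 + 0.02 * i for i in range(21)]
counts = [2, 1, 3, 2, 2, 4, 3, 6, 9, 14, 17, 13, 8, 5, 4, 2, 3, 1, 2, 2, 1]
good = poisson_log_likelihood(counts, xs, amp=15.0, center=13.5, sigma=0.03, bg=2.0)
bad = poisson_log_likelihood(counts, xs, amp=15.0, center=13.4, sigma=0.03, bg=2.0)
assert good > bad  # the correctly centered line explains the counts better
```

Maximizing this log-likelihood over the model parameters is what "fitting" means here; CORA additionally derives a fixed-point equation so the line flux need not be found by brute-force search.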
Finding maximum JPEG image block code size
Lakhani, Gopal
2012-07-01
We present a study of JPEG baseline coding. It aims to determine the minimum storage needed to buffer the JPEG Huffman code bits of 8-bit image blocks. Since DC is coded separately, and the encoder represents each AC coefficient by a run-length/AC-coefficient-level pair, the net problem is to perform an efficient search for the optimal run-level pair sequence. We formulate it as a two-dimensional, nonlinear, integer programming problem and solve it using a branch-and-bound based search method. We derive two types of constraints to prune the search space. The first one is given as an upper bound for the sum of squares of the AC coefficients of a block, and it is used to discard sequences that cannot represent valid DCT blocks. The second type of constraints is based on some interesting properties of the Huffman code table, and these are used to prune sequences that cannot be part of optimal solutions. Our main result is that if the default JPEG compression setting is used, a minimum of 346 bits and a maximum of 433 bits is sufficient to buffer the AC code bits of 8-bit image blocks. Our implementation also pruned the search space extremely well; the first constraint reduced the initial search space of 4 nodes down to less than 2 nodes, and the second set of constraints reduced it further by 97.8%.
Maximum likelihood estimates of pairwise rearrangement distances.
Serdoz, Stuart; Egri-Nagy, Attila; Sumner, Jeremy; Holland, Barbara R; Jarvis, Peter D; Tanaka, Mark M; Francis, Andrew R
2017-06-21
Accurate estimation of evolutionary distances between taxa is important for many phylogenetic reconstruction methods. Distances can be estimated using a range of different evolutionary models, from single nucleotide polymorphisms to large-scale genome rearrangements. Corresponding corrections for genome rearrangement distances fall into three categories: empirical computational studies, Bayesian/MCMC approaches, and combinatorial approaches. Here, we introduce a maximum likelihood estimator for the inversion distance between a pair of genomes, using a recently introduced group-theoretic approach to modelling inversions. This MLE functions as a corrected distance: in particular, we show that, because of the way sequences of inversions interact with each other, it is quite possible for the minimal distance and the MLE distance to order the distances of two genomes from a third differently. The second aspect of this work tackles the problem of accounting for the symmetries of circular arrangements. While generally a frame of reference is locked and all computation made accordingly, this work incorporates the action of the dihedral group so that distance estimates are free from any a priori frame of reference. The philosophy of accounting for symmetries can be applied to any existing correction method, for which examples are offered. Copyright © 2017 Elsevier Ltd. All rights reserved.
Conditional Reliability Coefficients for Test Scores.
Nicewander, W Alan
2017-04-06
The most widely used, general index of measurement precision for psychological and educational test scores is the reliability coefficient--a ratio of true variance for a test score to the true-plus-error variance of the score. In item response theory (IRT) models for test scores, the information function is the central, conditional index of measurement precision. In this inquiry, conditional reliability coefficients for a variety of score types are derived as simple transformations of information functions. It is shown, for example, that the conditional reliability coefficient for an ordinary, number-correct score, X, is equal to ρ(X,X'|θ) = I(X,θ)/[I(X,θ)+1], where θ is a latent variable measured by an observed test score X; ρ(X,X'|θ) is the conditional reliability of X at a fixed value of θ; and I(X,θ) is the score information function. This is a surprisingly simple relationship between the two basic indices of measurement precision from IRT and classical test theory (CTT). This relationship holds for item scores as well as test scores based on sums of item scores--and it holds for dichotomous as well as polytomous items, or a mix of both item types. Also, conditional reliabilities are derived for computerized adaptive test scores, and for θ-estimates used as alternatives to number-correct scores. These conditional reliabilities are all related to information in a manner similar or identical to the one given above for the number-correct (NC) score. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
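The transformation stated in the abstract is simple enough to compute directly; a minimal sketch:

```python
def conditional_reliability(information):
    """rho(X, X' | theta) = I(X, theta) / (I(X, theta) + 1),
    mapping score information at a fixed theta to a reliability in (0, 1)."""
    return information / (information + 1.0)

# Score information of 9 at some theta corresponds to conditional reliability 0.9;
# information of 1 corresponds to 0.5.
assert abs(conditional_reliability(9.0) - 0.9) < 1e-12
assert conditional_reliability(1.0) == 0.5
```

Note how the mapping is monotone and bounded: as information grows without limit the conditional reliability approaches 1, and zero information gives reliability 0.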
Boedeker, Peter
2017-01-01
Hierarchical linear modeling (HLM) is a useful tool when analyzing data collected from groups. There are many decisions to be made when constructing and estimating a model in HLM including which estimation technique to use. Three of the estimation techniques available when analyzing data with HLM are maximum likelihood, restricted maximum…
Laparoscopic total pancreatectomy
Wang, Xin; Li, Yongbin; Cai, Yunqiang; Liu, Xubao; Peng, Bing
2017-01-01
Abstract Rationale: Laparoscopic total pancreatectomy is a complicated surgical procedure and has rarely been reported. This study was conducted to investigate the safety and feasibility of laparoscopic total pancreatectomy. Patients and Methods: Three patients underwent laparoscopic total pancreatectomy between May 2014 and August 2015. We reviewed their general demographic data, perioperative details, and short-term outcomes. General morbidity was assessed using the Clavien–Dindo classification and delayed gastric emptying (DGE) was evaluated by the International Study Group of Pancreatic Surgery (ISGPS) definition. Diagnosis and Outcomes: The indications for laparoscopic total pancreatectomy were intraductal papillary mucinous neoplasm (IPMN) (n = 2) and pancreatic neuroendocrine tumor (PNET) (n = 1). All patients underwent laparoscopic pylorus- and spleen-preserving total pancreatectomy; the mean operative time was 490 minutes (range 450–540 minutes) and the mean estimated blood loss was 266 mL (range 100–400 mL). Two patients suffered postoperative complications. All the patients recovered uneventfully with conservative treatment and were discharged after a mean hospital stay of 18 days (range 8–24 days). The short-term (108 to 600 days) follow-up demonstrated that all 3 patients had normal and consistent glycated hemoglobin (HbA1c) levels with acceptable quality of life. Lessons: Laparoscopic total pancreatectomy is feasible and safe in selected patients, and the pylorus- and spleen-preserving technique should be considered. Further prospective randomized studies are needed to obtain a comprehensive understanding of the role of the laparoscopic technique in total pancreatectomy. PMID:28099344
Estonian total ozone climatology
K. Eerme
Full Text Available The climatological characteristics of total ozone over Estonia based on the Total Ozone Mapping Spectrometer (TOMS) data are discussed. The mean annual cycle during 1979–2000 for the site at 58.3° N and 26.5° E is compiled. The available ground-level data, interpolated for the period before TOMS, have been used for trend detection. During the last two decades, the quasi-biennial oscillation (QBO) corrected systematic decrease of total ozone from February–April was 3 ± 2.6% per decade. Before 1980, a spring decrease was not detectable. No decreasing trend was found in either the late autumn ozone minimum or in the summer total ozone. The QBO-related signal in the spring total ozone has an amplitude of ± 20 DU and a phase lag of 20 months. Between 1987–1992, the lagged covariance between the Singapore wind and the studied total ozone was weak. The spring (April–May) and summer (June–August) total ozone have the best correlation (coefficient 0.7) in the yearly cycle. The correlation between the May and August total ozone is higher than that between the other summer months. Seasonal power spectra of the total ozone variance show preferred periods with an over 95% significance level. Since 1986, during winter/spring, a period of 32 days prevails instead of the earlier dominant 26 days. The spectral densities of the periods from 4 days to 2 weeks exhibit high interannual variability.
Key words: Atmospheric composition and structure (middle atmosphere: composition and chemistry; volcanic effects); Meteorology and atmospheric dynamics (climatology)
Evaluation of scoring systems in predicting acute appendicitis in children.
Macco, Sven; Vrouenraets, Bart C; de Castro, Steve M M
2016-12-01
Acute appendicitis can be difficult to diagnose, especially in children. Appendicitis scoring systems have been developed as a diagnostic tool to improve the decision-making process in patients with suspected acute appendicitis. This study evaluates the Appendicitis Inflammatory Response score, Alvarado score, and Pediatric Appendicitis Score in children suspected of acute appendicitis. Data were collected retrospectively. All children younger than 18 years suspected of acute appendicitis who presented to the emergency department between January 2006 and June 2014 were included in this study. Variables were registered to evaluate 3 different appendicitis scoring systems. The diagnostic performance of the 3 scores was analyzed using the area under the receiver-operating curve and by calculating the diagnostic performances at different cut-off points. The present study included 747 consecutive children. There were 399 boys (53%) and 348 girls (47%) with a mean age of 11 years (range, 1-17 years). In total, 269 children (36%) were diagnosed with acute appendicitis. The area under the receiver-operating curve of the Appendicitis Inflammatory Response score was 0.90, the Alvarado score was 0.87, and the Pediatric Appendicitis Score was 0.82; the differences were statistically significant. The Appendicitis Inflammatory Response score was better at predicting acute appendicitis than the Alvarado score and the Pediatric Appendicitis Score. In children with low-risk acute appendicitis, false negative rates of 14% for the Appendicitis Inflammatory Response score, 7% for the Alvarado score, and 18% for the Pediatric Appendicitis Score were measured. In this study, the Appendicitis Inflammatory Response score had the highest discriminating power and outperformed the Alvarado score and Pediatric Appendicitis Score in predicting acute appendicitis in children. Excluding acute appendicitis safely in children with the scoring systems still remains uncertain. Copyright © 2016 Elsevier Inc. All rights reserved.
Total Water Management - slides
Total Water Management (TWM) examines urban water systems in an interconnected manner. It encompasses reducing water demands, increasing water recycling and reuse, creating water supply assets from stormwater management, matching water quality to end-use needs, and achieving envi...
NONE
2005-02-01
This document presents the 2004 results of Total Group: consolidated account, special items, number of shares, market environment, adjustment for amortization of Sanofi-Aventis merger-related intangibles, 4th quarter 2004 results (operating and net incomes, cash flow), upstream (results, production, reserves, recent highlights), downstream (results, refinery throughput, recent highlights), chemicals (results, recent highlights), Total's full year 2004 results (operating and net income, cash flow), 2005 sensitivities, Total SA parent company accounts and proposed dividend, adoption of IFRS accounting, summary and outlook, main operating information by segment for the 4th quarter and full year 2004: upstream (combined liquids and gas production by region, liquids production by region, gas production by region), downstream (refined product sales by region, chemicals), Total financial statements: consolidated statement of income, consolidated balance sheet (assets, liabilities and shareholder's equity), consolidated statements of cash flows, business segments information. (J.S.)
The Revised Total Coliform Rule (RTCR) aims to increase public health protection through the reduction of potential pathways for fecal contamination in the distribution system of a public water system (PWS).
U.S. Geological Survey, Department of the Interior — Total ecosystem carbon includes above- and below-ground live plant components (such as leaf, branch, stem and root), dead biomass (such as standing dead wood, down...
Maximum speeds and alpha angles of flowing avalanches
McClung, David; Gauer, Peter
2016-04-01
A flowing avalanche is one which initiates as a slab and, if consisting of dry snow, becomes enveloped in a turbulent snow-dust cloud once the speed reaches about 10 m/s. A flowing avalanche has a dense core of flowing material which dominates the dynamics by serving as the driving force for downslope motion. The flow thickness is typically on the order of 1–10 m, which is on the order of about 1% of the length of the flowing mass. We have collected estimates of maximum frontal speed um (m/s) from 118 avalanche events. The analysis is given here with the aim of scaling the maximum speed by some measure of the terrain over which the avalanches ran. We have chosen two measures for scaling, from McClung (1990), McClung and Schaerer (2006) and Gauer (2012): √H0 and √S0 (total vertical drop; total path length traversed). Our data consist of 118 avalanches with H0 (m) estimated and 106 with S0 (m) estimated. Of these, we have 29 values with H0 (m), S0 (m) and um (m/s) estimated accurately, with the avalanche speeds measured all or nearly all along the path. The remainder of the data set includes approximate estimates of um (m/s) from timing the avalanche motion over a known section of the path where the approximate maximum speed is expected, and with either H0 or S0 or both estimated. Our analysis consists of fitting the values of um/√H0 and um/√S0 to probability density functions (pdf) to estimate the exceedance probability for the scaled ratios. In general, we found that the larger data sets best fit a beta pdf, while for the subset of 29 a shifted log-logistic (s l-l) pdf was best. These determinations resulted from fitting the values to 60 different pdfs under five goodness-of-fit criteria: three goodness-of-fit statistics (K-S, Kolmogorov-Smirnov; A-D, Anderson-Darling; and C-S, Chi-squared) plus probability plots (P-P) and quantile plots (Q-Q). For less than 10% probability of exceedance the results show that
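The exceedance-probability idea can be illustrated empirically before any parametric (beta or log-logistic) fit is attempted; the scaled-speed ratio values below are invented for the sketch:

```python
def exceedance_probability(samples, threshold):
    """Empirical P(X > threshold): the fraction of observed ratios above it."""
    return sum(1 for s in samples if s > threshold) / len(samples)

# Invented um / sqrt(H0) ratios, standing in for the scaled maximum speeds.
ratios = [1.2, 1.5, 1.8, 2.0, 2.1, 2.4, 2.6, 3.0, 3.3, 3.9]
assert exceedance_probability(ratios, 3.0) == 0.2  # 2 of 10 ratios exceed 3.0
```

Fitting a parametric pdf, as the study does, smooths this step function and allows extrapolation to exceedance probabilities below the resolution of the sample.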
Forecasting the value of credit scoring
Saad, Shakila; Ahmad, Noryati; Jaffar, Maheran Mohd
2017-08-01
Nowadays, the credit scoring system plays an important role in the banking sector. This process is important in assessing the creditworthiness of customers requesting credit from banks or other financial institutions. Usually, credit scoring is used when customers submit applications for credit facilities. Based on the credit score, the bank is able to segregate the "good" clients from the "bad" clients. However, in most cases the score is useful only at that specific time and cannot be used to forecast the creditworthiness of the same applicant afterwards. Hence, the bank will not know whether "good" clients will always remain good, or whether "bad" clients may become "good" clients after a certain time. To fill this gap, this study proposes an equation to forecast the credit score of potential borrowers at a given time by using the historical scores related to the assumption. The Mean Absolute Percentage Error (MAPE) is used to measure the accuracy of the forecast scores. Results show that the forecast scores are highly accurate compared with the actual credit scores.
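The MAPE criterion used to judge the forecast accuracy can be sketched directly (the score series below is invented for illustration):

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent; lower means more accurate."""
    terms = [abs((a - f) / a) for a, f in zip(actual, forecast) if a != 0]
    return 100.0 * sum(terms) / len(terms)

# Hypothetical actual vs. forecast credit scores for four periods.
actual = [620.0, 655.0, 640.0, 700.0]
forecast = [600.0, 660.0, 650.0, 690.0]
err = mape(actual, forecast)
assert err < 5.0  # forecasts within a few percent of the actual scores
```

A common rule of thumb reads MAPE below 10% as "highly accurate" forecasting, which is the sense in which the abstract uses the term.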
Maximum likelihood molecular clock comb: analytic solutions.
Chor, Benny; Khetan, Amit; Snir, Sagi
2006-04-01
Maximum likelihood (ML) is increasingly used as an optimality criterion for selecting evolutionary trees, but finding the global optimum is a hard computational task. Because no general analytic solution is known, numeric techniques such as hill climbing or expectation maximization (EM) are used in order to find optimal parameters for a given tree. So far, analytic solutions were derived only for the simplest model--three taxa, two-state characters, under a molecular clock. Four-taxa rooted trees have two topologies--the fork (two subtrees with two leaves each) and the comb (one subtree with three leaves, the other with a single leaf). In a previous work, we devised a closed-form analytic solution for the ML molecular clock fork. In this work, we extend the state of the art in analytic solutions for ML trees to the family of all four-taxa trees under the molecular clock assumption. The change from the fork topology to the comb incurs a major increase in the complexity of the underlying algebraic system and requires novel techniques and approaches. We combine the ultrametric properties of molecular clock trees with the Hadamard conjugation to derive a number of topology-dependent identities. Employing these identities, we substantially simplify the system of polynomial equations. We finally use tools from algebraic geometry (e.g., Gröbner bases, ideal saturation, resultants) and employ symbolic algebra software to obtain analytic solutions for the comb. We show that in contrast to the fork, the comb has no closed-form solutions (expressed by radicals in the input data). In general, four-taxa trees can have multiple ML points. In contrast, we can now prove that under the molecular clock assumption, the comb has a unique (local and global) ML point. (Such uniqueness was previously shown for the fork.)
Curtis, Alexander E; Smith, Tanya A; Ziganshin, Bulat A; Elefteriades, John A
2016-08-01
Reliable methods for measuring the thoracic aorta are critical for determining treatment strategies in aneurysmal disease. Z-scores are a pragmatic alternative to the raw diameter sizes commonly used in adult medicine. They are particularly valuable in the pediatric population, who undergo rapid changes in physical development. The advantage of the Z-score is its inclusion of body surface area (BSA) in determining whether an aorta is within normal size limits. Therefore, Z-scores allow us to determine whether true pathology exists, which can be challenging in growing children. In addition, Z-scores allow for thoughtful interpretation of aortic size across genders, ethnicities, and geographical regions. Despite the advantages of using Z-scores, there are limitations. These include intra- and inter-observer bias, measurement error, and variations between alternative Z-score nomograms and BSA equations. Furthermore, it is unclear how Z-scores change in the normal population over time, which is essential when interpreting serial values. Guidelines for measuring aortic parameters have been developed by the American Society of Echocardiography Pediatric and Congenital Heart Disease Council, which may reduce measurement bias when calculating Z-scores for the aortic root. In addition, web-based Z-score calculators have been developed to aid in efficient Z-score calculation. Despite these advances, clinicians must be mindful of the limitations of Z-scores, especially when used to demonstrate a beneficial treatment effect. This review looks to unravel the mystery of the Z-score, with a focus on the thoracic aorta. Here, we will discuss how Z-scores are calculated and the limitations of their use.
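A hedged sketch of the calculation the review describes: the Du Bois BSA formula is a standard one, but the linear nomogram coefficients and SD below are placeholders; real Z-score nomograms publish their own fitted prediction models, which is exactly the source of inter-nomogram variation the review discusses.

```python
def bsa_dubois(height_cm, weight_kg):
    """Du Bois body surface area estimate, in m^2."""
    return 0.007184 * (height_cm ** 0.725) * (weight_kg ** 0.425)

def aortic_z_score(diameter_mm, bsa, intercept=13.0, slope=9.0, sd_mm=1.5):
    """Z = (observed - predicted(BSA)) / SD, using a HYPOTHETICAL linear
    nomogram predicted = intercept + slope * BSA for illustration only."""
    predicted = intercept + slope * bsa
    return (diameter_mm - predicted) / sd_mm

bsa = bsa_dubois(170.0, 70.0)       # roughly 1.8 m^2 for this build
z = aortic_z_score(32.0, bsa)       # positive Z: larger than predicted for BSA
assert z > 0
```

The point of the structure, not the placeholder numbers, is what matters: two clinicians using different nomograms (different intercept/slope/SD) can assign different Z-scores to the same measured diameter.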
Medium-term evaluation of total knee arthroplasty without patellar replacement
José Wanderley Vasconcelos
2013-06-01
Full Text Available OBJECTIVE: To evaluate, at mid-term, patients who underwent total knee arthroplasty without patellar resurfacing. METHODS: A retrospective cross-sectional study was carried out of patients who underwent total knee arthroplasty without patellar resurfacing. All patients underwent clinical examination based on the protocol of the Knee Society Scoring System, which assessed pain, range of motion, stability, contraction, knee alignment and function, as well as radiological evaluation. RESULTS: A total of 36 patients were evaluated. Of these, 7 underwent surgery only on the left knee, 12 only on the right knee, and 17 bilaterally, totaling 53 knees. Ages ranged from 26 to 84 years. Of the 53 knees evaluated, 33 (62.26%) had no pain. The maximum flexion range of motion averaged 104.7°. No knee had difficulty in active extension. As to the alignment along the anatomical axis, twelve knees (22.64%) showed deviation between 0° and 4° varus. Thirty-nine (75.49%) knees showed an unrestricted gait, and the femorotibial angle ranged between 3° varus and 13° valgus with an average of 5° valgus. The patellar index ranged from 0.2 to 1.1. CONCLUSION: Total knee arthroplasty without patellar resurfacing provides good results at mid-term evaluation.
Nakayama, Meijin; Yao, Kazuo; Nishiyama, Kouichirou; Nagai, Hiromi; Ito, Akihiko; Yokobori, Satoru; Okamoto, Makito; Hirose, Hajime
2002-01-01
We studied postoperative swallowing in 4 patients undergoing CHEP and 1 undergoing CHP. Swallowing was restored through intense swallowing rehabilitation, since only 1/4 of the larynx remained after near-total laryngectomy. Our swallowing rehabilitation program is detailed in this paper. The improvement of swallowing is classified into 3 stages. In stage I, the bolus directly intrudes into the trachea. In stage II, the bolus stagnates between the laryngeal inlet and the tracheal stoma. In stage III, the bolus flows directly through the esophageal inlet. Stage III indicates that rehabilitation is almost complete. Stage I was shortest, at 2 to 14 days, and stage II longest, at 7 to 80 days. The MTF (Method, Time, Food) score described by Fujimoto et al was used to analyze swallowing. Three cases following CHEP showed high scores shortly after the introduction of rehabilitation and reached the maximum score at discharge (15 points = normal swallowing). At present, these 3 patients are satisfied with swallowing and enjoy a good quality of life. The 2 other cases (1 CHEP and 1 CHP) both had a wide laryngeal inlet and still have some difficulty with liquids. Further modification of the surgical technique is needed, especially for CHP.
Heo, So Young; Park, Noh Hyuck; Park, Chan Sub; Seong, Su Ok [Dept. of Radiology, Myongji Hospital, Seonam University College of Medicine, Goyang (Korea, Republic of)
2016-02-15
We explored the association between Framingham risk score (FRS) and coronary artery calcium score (CACS) in asymptomatic Korean individuals. We retrospectively analyzed 2216 participants who underwent routine health screening and CACS using 64-slice multidetector computed tomography between January 2010 and June 2014. The relationship between CACS and FRS, and factors associated with discrepancies between CACS and FRS, were analyzed. CACS and FRS were positively correlated (p < 0.0001). However, in the 3.7% of participants with low coronary event risk and high CACS, age, male gender, smoking, hypertension, total cholesterol, diabetes mellitus, and body mass index (BMI ≥ 35) were associated with the discrepancy. In the diagnostic prediction model for discrepancy, the receiver operating characteristic curve including factors associated with FRS, diastolic blood pressure (≥ 75 mm Hg), diabetes mellitus, and BMI (≥ 35) showed that the area under the curve was 0.854 (95% confidence interval, 0.819–0.890), indicating good sensitivity. Diabetes mellitus or obesity (BMI ≥ 35) compensate for the weakness of FRS and may be potential indicators for the application of CACS in asymptomatic Koreans with low coronary event risk.
Sutherland, D.E.; Ferguson, R.M.; Simmons, R.L.; Kim, T.H.; Slavin, S.; Najarian, J.S.
1983-05-01
Total lymphoid irradiation by itself can produce sufficient immunosuppression to prolong the survival of a variety of organ allografts in experimental animals. The degree of prolongation is dose-dependent and is limited by the toxicity that occurs with higher doses. Total lymphoid irradiation is more effective before transplantation than after, but when used after transplantation can be combined with pharmacologic immunosuppression to achieve a positive effect. In some animal models, total lymphoid irradiation induces an environment in which fully allogeneic bone marrow will engraft and induce permanent chimerism in the recipients who are then tolerant to organ allografts from the donor strain. If total lymphoid irradiation is ever to have clinical applicability on a large scale, it would seem that it would have to be under circumstances in which tolerance can be induced. However, in some animal models graft-versus-host disease occurs following bone marrow transplantation, and methods to obviate its occurrence probably will be needed if this approach is to be applied clinically. In recent years, patient and graft survival rates in renal allograft recipients treated with conventional immunosuppression have improved considerably, and thus the impetus to utilize total lymphoid irradiation for its immunosuppressive effect alone is less compelling. The future of total lymphoid irradiation probably lies in devising protocols in which maintenance immunosuppression can be eliminated, or nearly eliminated, altogether. Such protocols are effective in rodents. Whether they can be applied to clinical transplantation remains to be seen.
Accelerated protein structure comparison using TM-score-GPU.
Hung, Ling-Hong; Samudrala, Ram
2012-08-15
Accurate comparisons of different protein structures play important roles in structural biology, structure prediction and functional annotation. The root-mean-square deviation (RMSD) after optimal superposition is the predominant measure of similarity due to the ease and speed of computation. However, global RMSD is dependent on the length of the protein and can be dominated by divergent loops that obscure local regions of similarity. A more sophisticated measure of structure similarity, the Template Modeling (TM)-score, avoids these problems, and it is one of the measures used by the community-wide Critical Assessment of protein Structure Prediction (CASP) experiments to compare predicted models with experimental structures. TM-score calculations are, however, much slower than RMSD calculations. We have therefore implemented a very fast version of TM-score for Graphical Processing Units (TM-score-GPU), using a novel hybrid Kabsch/quaternion method for calculating the optimal superposition and RMSD that is designed for parallel applications. This acceleration in speed allows TM-score to be used efficiently in computationally intensive applications such as clustering of protein models and genome-wide comparisons of structure. TM-score-GPU was applied to six sets of models from Nutritious Rice for the World for a total of 3 million comparisons. TM-score-GPU is 68 times faster on an ATI 5870 GPU, on average, than the original CPU single-threaded implementation on an AMD Phenom II 810 quad-core processor. The complete source, including the GPU code and the hybrid RMSD subroutine, can be downloaded and used without restriction at http://software.compbio.washington.edu/misc/downloads/tmscore/. The implementation is in C++/OpenCL.
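The TM-score itself (as defined by Zhang and Skolnick) is a simple length-normalized function of per-residue distances after superposition; the expensive part that the GPU accelerates is finding that optimal superposition. A minimal sketch of the scoring step alone, with an invented distance list:

```python
def tm_score(distances, l_target):
    """TM-score from per-residue distances (in Angstroms) after optimal
    superposition; l_target is the length of the target structure."""
    d0 = 1.24 * (l_target - 15) ** (1.0 / 3.0) - 1.8  # length-dependent scale
    return sum(1.0 / (1.0 + (d / d0) ** 2) for d in distances) / l_target

# A perfect superposition of every residue gives a TM-score of exactly 1.0.
assert abs(tm_score([0.0] * 100, 100) - 1.0) < 1e-12
```

The length-dependent scale d0 is what makes TM-score, unlike global RMSD, comparable across proteins of different sizes: a fixed displacement contributes less penalty in a longer protein.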
Assessment of interobserver concordance in polysomnography scoring of sleep bruxism
Ferraz, Otávio; de Moura Guimarães, Thais; Maluly Filho, Milton; Dal-Fabbro, Cibele; Abraão Crosara Cunha, Thays; Cristina Lotaif, Ana; Cristina Barros Schütz, Teresa; Santos-Silva, Rogério; Tufik, Sergio; Bittencourt, Lia
2015-01-01
Introduction: Objective evaluation of sleep bruxism (SB) using whole-night polysomnography (PSG) is relevant for diagnostic confirmation. Nevertheless, PSG electromyogram (EMG) scoring may give rise to controversy, particularly when audiovisual monitoring is not performed. Therefore, the present study assessed the concordance between two independent scorers in the visual scoring of SB on PSGs performed without audiovisual monitoring. Methods: Fifty-six PSG tests were scored from individuals with a clinical history and polysomnography criteria of SB. In addition to the protocol of conventional whole-night PSG, electrodes were also placed bilaterally on the masseter and temporal muscles. Visual EMG scoring without audiovisual monitoring was performed by two independent scorers (Dentist 1 and Dentist 2) according to the recommendations formulated in the AASM manual (2007). Kendall tau correlation was used to assess interobserver concordance for the variables "total duration of events (seconds)", "shortest events", "longest events" and index in each phasic, tonic or mixed event. Results: The correlation was positive and significant for all the investigated variables, with τ > 0.54. Conclusion: A good inter-examiner concordance rate was found for SB scoring in the absence of audiovisual monitoring. PMID:26779318
A Novel scoring system for distinguishing keratoconus from normal eyes.
Oruçoğlu, Faik; Toker, Ebru
2016-10-01
To evaluate the accuracy of a novel scoring system in differentiating keratoconus (KC) eyes from normal eyes using a Scheimpflug camera tomography system. Marmara University Hospital, Istanbul, Turkey and Birinci Eye Hospital, Istanbul, Turkey. Retrospective case-control study. The study included 624 keratoconus eyes and 512 healthy eyes. Thirty-nine significant parameters obtained from the Scheimpflug imaging system (Pentacam; Oculus Optikgeräte GmbH, Wetzlar, Germany) were studied. The cut-off value and area under receiver operating characteristic (AUROC) curve analysis for each studied parameter were established in the previous study. Minus three and plus three standard deviations of the cut-off value were scored after multiplication by the AUROC for each parameter. The sum of all scores (TKS; Total Keratoconus Score) was compared between keratoconus and normal eyes. The average TKS value was -29.57±5.65 (range -43.11 to -7.09) in normal eyes and 36.23±24.3 (range -16.82 to 97.45) in keratoconus eyes; the difference was statistically significant. TKS discriminated the keratoconus group from the normal group with 99% sensitivity and 99% specificity at the best cut-off point of -12.45. The new scoring system measured by the Scheimpflug imaging system provides excellent discrimination of keratoconus from normal corneas. Copyright © 2016 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.
Pemphigus Vulgaris Activity Score and Assessment of Convergent Validity
Cheyda Chams-Davatchi
2013-04-01
Full Text Available Pemphigus is a rare autoimmune blistering disease with different phenotypes. The evaluation of therapeutic interventions requires a reliable, valid and feasible-to-use measurement. However, there is no gold standard for measuring disease activity in clinical trials. In this study we aimed to introduce the pemphigus vulgaris activity score (PVAS) measurement and to assess its convergent validity with the experts’ opinion of disease activity. In PVAS scoring, the distribution of pemphigus vulgaris antigen expression in different anatomical regions is taken into account, with special consideration of the healing process. PVAS is a 0-18 scale based on the extent of mucocutaneous involvement, the type of lesion and the presence of Nikolsky’s sign. The sum of the scores for the total number of lesions, the number of anatomic regions involved and Nikolsky’s sign is weighted by the type of lesion. In the present study, PVAS was assessed in 50 patients diagnosed with pemphigus vulgaris by one dermatologist. Independently, five blinded experts scored all the patients through physician’s global assessment (PGA). The convergent validity with experts’ opinion was assessed. The Spearman coefficient of correlation showed the acceptable value of 0.751 (95% CI: 0.534–0.876). PVAS is a valid, objective and simple-to-use scoring measurement. It showed a good correlation with PGA of pemphigus disease activity in Iranian patients with pemphigus vulgaris.
Lecture Evaluations by Medical Students: Concepts That Correlate With Scores.
Jen, Aaron; Webb, Emily M; Ahearn, Bren; Naeger, David M
2016-01-01
The didactic lecture remains one of the most popular teaching formats in medical education; yet, the factors that most influence lecturing success in radiology education are unknown. The purpose of this study was to identify patterns of narrative student feedback that are associated with relatively higher and lower evaluation scores. All student evaluations from our core radiology elective during 1 year were compiled. All evaluation comments were tagged to identify discrete descriptive concepts. Correlation coefficients were calculated for each tag with mean evaluation scores. Tags most strongly associated with the highest- versus lowest-rated lectures were identified. A total of 3,262 comments, on 273 lectures, rated by 77 senior medical students, were analyzed. The mean lecture score was 8.96 ± 0.62. Three tags were significantly positively correlated with lecture score: "interactive"; "fun/engaging"; and "practical/important content" (r = 0.39, r = 0.34, and r = 0.32, respectively; all statistically significant). An analysis restricted to the highest- and lowest-rated lectures yielded similar results. Several factors were identified that were strongly associated with lecture score. Among the actionable characteristics, interactive lectures with appropriately targeted content (ie, practical/useful) were the most highly rated. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Pemphigus vulgaris activity score and assessment of convergent validity.
Chams-Davatchi, Cheyda; Rahbar, Ziba; Daneshpazhooh, Maryam; Mortazavizadeh, Seyed Mohammad Ali; Akhyani, Maryam; Esmaili, Nafiseh; Balighi, Kamran
2013-05-07
Pemphigus is a rare autoimmune blistering disease with different phenotypes. The evaluation of therapeutic interventions requires a reliable, valid and feasible-to-use measurement. However, there is no gold standard for measuring disease activity in clinical trials. In this study we aimed to introduce the pemphigus vulgaris activity score (PVAS) measurement and to assess its convergent validity with the experts' opinion of disease activity. In PVAS scoring, the distribution of pemphigus vulgaris antigen expression in different anatomical regions is taken into account, with special consideration of the healing process. PVAS is a 0-18 scale based on the extent of mucocutaneous involvement, the type of lesion and the presence of Nikolsky's sign. The sum of the scores for the total number of lesions, the number of anatomic regions involved and Nikolsky's sign is weighted by the type of lesion. In the present study, PVAS was assessed in 50 patients diagnosed with pemphigus vulgaris by one dermatologist. Independently, five blinded experts scored all the patients through physician's global assessment (PGA). The convergent validity with experts' opinion was assessed. The Spearman coefficient of correlation showed the acceptable value of 0.751 (95% CI: 0.534–0.876). PVAS is a valid, objective and simple-to-use scoring measurement. It showed a good correlation with PGA of pemphigus disease activity in Iranian patients with pemphigus vulgaris.
Prediction of perineal tear during childbirth by assessment of striae gravidarum score
Shital Kapadia; Swena Kapoor; Kartikeya Parmar; Kavita Patadia; Monark Vyas
2014-01-01
Background: The objective of this study was to explore the association between striae gravidarum and the risk of perineal tear during childbirth. Methods: Three hundred patients who delivered normally were included in this study. The striae gravidarum score was assessed using the Atwal numerical scoring system. The association between striae and perineal tear (defined as tears or lacerations) as the outcome measure was examined, and the total striae score (TSS) was obtained. Results: Mean age...
Random Walk Picture of Basketball Scoring
Gabel, Alan
2011-01-01
We present evidence, based on play-by-play data from all 6087 games from the 2006/07--2009/10 seasons of the National Basketball Association (NBA), that basketball scoring is well described by a weakly-biased continuous-time random walk. The time between successive scoring events follows an exponential distribution, with little memory between different scoring intervals. Using this random-walk picture that is augmented by features idiosyncratic to basketball, we account for a wide variety of statistical properties of scoring, such as the distribution of the score difference between opponents and the fraction of game time that one team is in the lead. By further including the heterogeneity of team strengths, we build a computational model that accounts for essentially all statistical features of game scoring data and season win/loss records of each team.
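A weakly biased continuous-time random walk of this kind is easy to simulate; the toy model below uses made-up parameter values (not the authors' fitted NBA rates) and awards 2 points per scoring event for simplicity:

```python
import random

def simulate_game(rate_per_sec=1 / 30, p_team_a=0.5, game_secs=2880, seed=1):
    """Toy continuous-time random walk for basketball scoring:
    exponential waiting times between scoring events; each event is
    won by team A with probability p_team_a and is worth 2 points.
    All parameter values are illustrative, not fitted to NBA data."""
    random.seed(seed)
    t, diff, lead_time_a = 0.0, 0, 0.0
    while True:
        wait = random.expovariate(rate_per_sec)  # memoryless gap between scores
        if t + wait > game_secs:                 # 48-minute game clock expires
            break
        if diff > 0:
            lead_time_a += wait                  # accumulate time A spent leading
        t += wait
        diff += 2 if random.random() < p_team_a else -2
    return diff, lead_time_a / game_secs

final_diff, frac_a_leading = simulate_game()
print(final_diff, round(frac_a_leading, 2))
```

Running many such games lets one reproduce the statistics the paper studies, such as the distribution of the final score difference and the fraction of game time one team is in the lead.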
Bounds on the inverse signed total domination numbers in graphs
M. Atapour
2016-01-01
Full Text Available Let \(G=(V,E)\) be a simple graph. A function \(f:V\rightarrow \{-1,1\}\) is called an inverse signed total dominating function if the sum of its function values over any open neighborhood is at most zero. The inverse signed total domination number of \(G\), denoted by \(\gamma_{st}^0(G)\), equals the maximum weight of an inverse signed total dominating function of \(G\). In this paper, we establish upper bounds on the inverse signed total domination number of graphs in terms of their order, size, and maximum and minimum degrees.
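The definition can be checked by brute force on small graphs; the sketch below (a hypothetical helper, exponential in |V|, not from the paper) computes \(\gamma_{st}^0\) for the 4-cycle:

```python
from itertools import product

def inverse_signed_total_domination_number(adj):
    """Brute-force gamma_st^0: maximum weight of f: V -> {-1, +1}
    whose sum over every open neighborhood is at most zero.
    adj[v] lists the neighbors of vertex v."""
    n = len(adj)
    best = None
    for f in product((-1, 1), repeat=n):
        if all(sum(f[u] for u in adj[v]) <= 0 for v in range(n)):
            w = sum(f)
            best = w if best is None or w > best else best
    return best

# Cycle C4: each vertex's open neighborhood is its two ring neighbors
c4 = [[1, 3], [0, 2], [1, 3], [0, 2]]
print(inverse_signed_total_domination_number(c4))
```

For C4 the neighborhood constraints force opposite signs on each antipodal pair, so the maximum weight is 0.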
Maximum mass of a barotropic spherical star
Fujisawa, Atsuhito; Yoo, Chul-Moon; Nambu, Yasusada
2015-01-01
The ratio of total mass $M$ to surface radius $R$ of a spherical perfect fluid ball has an upper bound, $M/R < B$. Buchdahl obtained $B = 4/9$ under the assumptions of a mass density that is non-increasing outward and a barotropic equation of state. Barraco and Hamity lowered Buchdahl's bound to $B = 3/8$ $(< 4/9)$ by adding the dominant energy condition to Buchdahl's assumptions. In this paper, we further lower the Barraco-Hamity bound to $B \simeq 0.3636403$ $(< 3/8)$ by also requiring a subluminal (slower-than-light) sound speed. In our analysis, we numerically solve the Tolman-Oppenheimer-Volkoff equations, and the mass-to-radius ratio is maximized by varying the mass, radius and pressure inside the fluid ball as functions of mass density.
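For reference, the structure equations integrated numerically are the standard Tolman-Oppenheimer-Volkoff system (textbook material in units $G=c=1$, not a new result of the paper), together with the chain of bounds discussed above:

```latex
\frac{dm}{dr} = 4\pi r^{2}\rho , \qquad
\frac{dP}{dr} = -\,\frac{(\rho + P)\bigl(m + 4\pi r^{3} P\bigr)}{r\,(r - 2m)} ,
\qquad
\frac{M}{R} < B \simeq 0.3636403 < \tfrac{3}{8} < \tfrac{4}{9}.
```

Here $m(r)$ is the mass inside radius $r$, $P(r)$ the pressure, and $\rho(r)$ the energy density; integrating outward from the center until $P=0$ yields $M$ and $R$.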
Dilemmas in Uncemented Total Hip Arthroplasty
Goosen, J. H. M.
2009-01-01
In this thesis, different aspects related to survivorship and clinical outcome in uncemented total hip arthroplasty are analysed. In Chapter 2, the survival rate, Harris Hip Score and radiographic features of a proximally hydroxyapatite-coated titanium alloy femoral stem (Bi-Metric, Biomet) were evaluated. In conclusion, at an average follow-up of 8 years, this proximally HA-coated femoral component showed a favorable clinical and radiological outcome and excellent survivorship. In ...
The Prediction of Maximum Amplitudes of Solar Cycles and the Maximum Amplitude of Solar Cycle 24
Anonymous
2002-01-01
We present a brief review of predictions of solar cycle maximum amplitude with a lead time of 2 years or more. It is pointed out that a precise prediction of the maximum amplitude with such a lead time is still an open question despite progress made since the 1960s. A method of prediction using statistical characteristics of solar cycles is developed: the solar cycles are divided into two groups, a high rising velocity (HRV) group and a low rising velocity (LRV) group, depending on the rising velocity in the ascending phase for a given duration of the ascending phase. The amplitude of Solar Cycle 24 can be predicted after the start of the cycle using the formula derived in this paper. Now, about 5 years before the start of the cycle, we can make a preliminary prediction of 83.2-119.4 for its maximum amplitude.
Scoring functions for AutoDock.
Hill, Anthony D; Reilly, Peter J
2015-01-01
Automated docking allows rapid screening of protein-ligand interactions. A scoring function composed of a force field and linear weights can be used to compute a binding energy from a docked atom configuration. For different force fields or types of molecules, it may be necessary to train a custom scoring function. This chapter describes the data and methods one must consider in developing a custom scoring function for use with AutoDock.
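A linear scoring function of this kind is just a weighted sum of per-pose force-field terms; the sketch below uses invented term values and weights for illustration, not AutoDock's calibrated coefficients:

```python
def binding_energy(terms, weights):
    """Linear scoring function: weighted sum of force-field terms
    (van der Waals, hydrogen bonding, electrostatics, desolvation,
    torsional entropy). Training a custom scoring function amounts
    to refitting the weights against known binding data."""
    return sum(weights[name] * value for name, value in terms.items())

# Illustrative weights (NOT AutoDock's published coefficients)
weights = {"vdw": 0.166, "hbond": 0.121, "elec": 0.146,
           "desolv": 0.132, "tors": 0.298}

# Hypothetical raw terms for one docked pose
pose_terms = {"vdw": -32.1, "hbond": -4.5, "elec": -2.2,
              "desolv": 1.8, "tors": 3.0}

print(round(binding_energy(pose_terms, weights), 3))
```

Fitting the weight vector by linear regression against experimental binding energies for the molecule class of interest is the essence of training a custom scoring function.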
Inter-expert and intra-expert reliability in sleep spindle scoring
Wendt, Sabrina Lyngbye; Welinder, Peter; Sørensen, Helge Bjarup Dissing
2015-01-01
Objectives: To measure the inter-expert and intra-expert agreement in sleep spindle scoring, and to quantify how many experts are needed to build a reliable dataset of sleep spindle scorings. Methods: The EEG dataset comprised 400 randomly selected 115 s segments of stage 2 sleep from 110 sleeping subjects in the general population (57 ± 8, range: 42–72 years). To assess expert agreement, a total of 24 Registered Polysomnographic Technologists (RPSGTs) scored spindles in a subset of the EEG dataset at a single electrode location (C3-M2). Intra-expert and inter-expert agreements were … with higher reliability than the estimation of spindle duration. Reliability of sleep spindle scoring can be improved by using qualitative confidence scores, rather than a dichotomous yes/no scoring system. Conclusions: We estimate that 2–3 experts are needed to build a spindle scoring dataset…
Pneumonia severity scores in resource poor settings
Jamie Rylance
2014-06-01
Full Text Available Clinical prognostic scores are increasingly used to streamline care in well-resourced settings. The potential benefits of identifying patients at risk of clinical deterioration and poor outcome, delivering appropriate higher-level clinical care, and increasing efficiency are clear. In this focused review, we examine the use and applicability of severity scores applied to patients with community-acquired pneumonia in resource-poor settings. We challenge clinical researchers working in such systems to consider the generalisability of existing severity scores in their populations and, where performance of scores is suboptimal, to promote efforts to develop and validate new tools for the benefit of patients and healthcare systems.
Security Risk Scoring Incorporating Computers' Environment
Eli Weintraub
2016-04-01
Full Text Available A framework for a Continuous Monitoring System (CMS) is presented, having new, improved capabilities. The system uses the actual real-time configuration of the system and its environment, characterized by a Configuration Management Data Base (CMDB) which includes detailed information on organizational database contents and security and privacy specifications. The Common Vulnerability Scoring System (CVSS) algorithm produces risk scores incorporating information from the CMDB. By using real, up-to-date environmental characteristics, the system achieves more accurate scores than existing practices. The framework presentation includes the system's design and an illustration of scoring computations.
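The CVSS base-score computation the framework builds on can be sketched as follows (the CVSS v3.1 formula for scope-unchanged vectors, using the spec's published metric weights; the paper's CMDB integration itself is not reproduced here):

```python
# CVSS v3.1 base score, scope unchanged. Metric weights are from the
# public specification; pulling C/I/A values from a CMDB would simply
# change the arguments passed in per asset.
WEIGHTS = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2},
    "AC": {"L": 0.77, "H": 0.44},
    "PR": {"N": 0.85, "L": 0.62, "H": 0.27},  # scope-unchanged values
    "UI": {"N": 0.85, "R": 0.62},
    "CIA": {"H": 0.56, "L": 0.22, "N": 0.0},
}

def roundup(x):
    """CVSS v3.1 'round up to one decimal' helper (spec-style, avoids
    float noise by working on an integer representation)."""
    i = int(round(x * 100000))
    return i / 100000.0 if i % 10000 == 0 else (i // 10000 + 1) / 10.0

def base_score(av, ac, pr, ui, c, i_, a):
    iss = 1 - (1 - WEIGHTS["CIA"][c]) * (1 - WEIGHTS["CIA"][i_]) \
            * (1 - WEIGHTS["CIA"][a])
    impact = 6.42 * iss
    exploitability = 8.22 * WEIGHTS["AV"][av] * WEIGHTS["AC"][ac] \
        * WEIGHTS["PR"][pr] * WEIGHTS["UI"][ui]
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# AV:N/AC:L/PR:N/UI:N/C:H/I:H/A:H is the well-known 9.8 critical vector
print(base_score("N", "L", "N", "N", "H", "H", "H"))
```

An environment-aware system like the one described would re-run this per asset with C/I/A requirements drawn from the CMDB rather than assuming worst-case impact everywhere.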
Coronary artery calcium score: current status
Neves, Priscilla Ornellas; Andrade, Joalbo; Monção, Henry
2017-01-01
The coronary artery calcium score plays an important role in cardiovascular risk stratification, showing a significant association with the medium- or long-term occurrence of major cardiovascular events. Here, we discuss the following: protocols for the acquisition and quantification of the coronary artery calcium score by multidetector computed tomography; the role of the coronary artery calcium score in coronary risk stratification and its comparison with other clinical scores; its indications, interpretation, and prognosis in asymptomatic patients; and its use in patients who are symptomatic or have diabetes. PMID:28670030
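Quantification by the common Agatston method boils down to lesion area times a peak-density weight, summed over all calcified lesions; a minimal sketch, assuming the usual 130-HU density threshold and ~1 mm² minimum lesion area:

```python
def density_weight(peak_hu):
    """Agatston density factor for a calcified lesion (threshold 130 HU)."""
    if peak_hu < 130:
        return 0
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston(lesions):
    """Sum over lesions of area (mm^2) times the density factor.
    Lesions smaller than ~1 mm^2 are conventionally discarded."""
    return sum(area * density_weight(hu) for area, hu in lesions if area >= 1)

# Hypothetical lesions: (area_mm2, peak_HU); the third is below threshold
print(agatston([(4.0, 180), (2.5, 320), (0.5, 400)]))
```

The total is then mapped onto risk categories (e.g. 0, 1-99, 100-399, ≥400) for clinical interpretation.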
[The cardiovascular surgeon and the Syntax score].
Gómez-Sánchez, Mario; Soulé-Egea, Mauricio; Herrera-Alarcón, Valentín; Barragán-García, Rodolfo
2015-01-01
The Syntax score has been established as a tool to determine the complexity of coronary artery disease and as a guide for decision-making among coronary artery bypass surgery and percutaneous coronary intervention. The purpose of this review is to systematically examine what the Syntax score is, and how the surgeon should integrate the information in the selection and treatment of patients. We reviewed the results of the SYNTAX Trial, the clinical practice guidelines, as well as the benefits and limitations of the score. Finally we discuss the future directions of the Syntax score.
Friedrich, S
2008-08-11
The total energy monitor (TE) is a thermal sensor that determines the total energy of each FEL pulse from the temperature rise induced in a silicon wafer upon absorption of the FEL beam. The TE provides a destructive measurement of the FEL pulse energy in real time on a pulse-by-pulse basis. As a thermal detector, the TE is expected to suffer least from ultra-fast non-linear effects and to be easy to calibrate. It will therefore primarily be used to cross-calibrate other detectors, such as the Gas Detector or the Direct Imager, during LCLS commissioning. This document describes the design of the TE and summarizes the considerations and calculations that led to it; it also summarizes the physics behind the operation of the Total Energy Monitor at LCLS and derives the associated engineering specifications.
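The underlying calorimetry is simply E = m·c_p·ΔT; a sketch with a made-up wafer mass and temperature rise (only the silicon specific heat, ~0.71 J/(g·K) at room temperature, is a physical constant):

```python
def pulse_energy_joules(mass_g, delta_t_kelvin, c_p=0.71):
    """Deposited pulse energy inferred from the temperature rise of a
    silicon absorber: E = m * c_p * dT.
    c_p ~ 0.71 J/(g*K) for silicon near room temperature; the wafer
    mass and temperature rise used below are illustrative numbers."""
    return mass_g * c_p * delta_t_kelvin

# A 0.2 g wafer warming by 15 mK implies roughly 2 mJ absorbed
e = pulse_energy_joules(0.2, 0.015)
print(round(e * 1000, 2), "mJ")
```

In practice the readout must also account for heat leakage between pulses, which is one reason cross-calibration against other detectors is useful.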
Algebraic totality, towards completeness
Tasson, Christine
2009-01-01
Finiteness spaces constitute a categorical model of Linear Logic (LL) whose objects can be seen as linearly topologised spaces (a class of topological vector spaces introduced by Lefschetz in 1942) and whose morphisms are continuous linear maps. First, we recall the definitions of finiteness spaces and describe their basic properties, deduced from the general theory of linearly topologised spaces. Then we give an interpretation of LL based on linear algebra. Second, thanks to separation properties, we can introduce an algebraic notion of totality candidate in the framework of linearly topologised spaces: a totality candidate is a closed affine subspace which does not contain 0. We show that finiteness spaces with totality candidates constitute a model of classical LL. Finally, we give a barycentric simply typed lambda-calculus, with booleans ${\mathcal{B}}$ and a conditional operator, which can be interpreted in this model. We prove completeness at type ${\mathcal{B}}^n\to{\mathcal{B}}$ for every n by an algebraic metho...
[Total temporomandibular joint prostheses].
Zwetyenga, N; Amroun, S; Wajszczak, B-L; Moris, V
2016-09-01
The temporomandibular joint (TMJ) is probably the most complex human joint. As with all joints, its prosthetic replacement may be indicated in selected cases. Significant advances have been made in the design of TMJ prostheses during the last three decades, and the indications have been clarified. The aim of our work was to provide an update on current total TMJ replacement. Indications, contraindications, prosthetic components, advantages, disadvantages, reasons for failure or reoperation, virtual planning and the surgical protocol are presented. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
On the total domatic number of regular graphs
H. Aram
2012-03-01
Full Text Available A set S of vertices of a graph G = (V,E) without isolated vertices is a total dominating set if every vertex of V(G) is adjacent to some vertex in S. The total domatic number of a graph G is the maximum number of total dominating sets into which the vertex set of G can be partitioned. We show that the total domatic number of a random r-regular graph is almost surely at most r.
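For tiny graphs the total domatic number can be computed by brute force; this hypothetical sketch (not from the paper) confirms that the 4-cycle, which is 2-regular, attains the value 2:

```python
from itertools import product

def is_total_dominating(adj, s):
    """Every vertex must have at least one neighbor inside s."""
    return all(any(u in s for u in adj[v]) for v in range(len(adj)))

def total_domatic_number(adj):
    """Brute force: largest k such that V partitions into k total
    dominating sets. Labelings assign each vertex a part 0..k-1."""
    n = len(adj)
    for k in range(n, 0, -1):
        for labels in product(range(k), repeat=n):
            if set(labels) != set(range(k)):   # every part must be nonempty
                continue
            parts = [{v for v in range(n) if labels[v] == c} for c in range(k)]
            if all(is_total_dominating(adj, p) for p in parts):
                return k
    return 0

c4 = [[1, 3], [0, 2], [1, 3], [0, 2]]  # the 4-cycle, 2-regular
print(total_domatic_number(c4))
```

Here {0,1} and {2,3} are both total dominating sets of C4, while no partition into three parts works (a singleton part can never totally dominate), matching the "at most r" bound.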
Pattern formation, logistics, and maximum path probability
Kirkaldy, J. S.
1985-05-01
The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are
Widening clinical applications of the SYNTAX Score.
Farooq, Vasim; Head, Stuart J; Kappetein, Arie Pieter; Serruys, Patrick W
2014-02-01
The SYNTAX Score (http://www.syntaxscore.com) has established itself as an anatomical based tool for objectively determining the complexity of coronary artery disease and guiding decision-making between coronary artery bypass graft (CABG) surgery and percutaneous coronary intervention (PCI). Since the landmark SYNTAX (Synergy between PCI with Taxus and Cardiac Surgery) Trial comparing CABG with PCI in patients with complex coronary artery disease (unprotected left main or de novo three vessel disease), numerous validation studies have confirmed the clinical validity of the SYNTAX Score for identifying higher-risk subjects and aiding decision-making between CABG and PCI in a broad range of patient types. The SYNTAX Score is now advocated in both the European and US revascularisation guidelines for decision-making between CABG and PCI as part of a SYNTAX-pioneered heart team approach. Since establishment of the SYNTAX Score, widening clinical applications of this clinical tool have emerged. The purpose of this review is to systematically examine the widening applications of tools based on the SYNTAX Score: (1) by improving the diagnostic accuracy of the SYNTAX Score by adding a functional assessment of lesions; (2) through amalgamation of the anatomical SYNTAX Score with clinical variables to enhance decision-making between CABG and PCI, culminating in the development and validation of the SYNTAX Score II, in which objective and tailored decisions can be made for the individual patient; (3) through assessment of completeness of revascularisation using the residual and post-CABG SYNTAX Scores for PCI and CABG patients, respectively. Finally, the future direction of the SYNTAX Score is covered through discussion of the ongoing development of a non-invasive, functional SYNTAX Score and review of current and planned clinical trials.
Total Quality Management Simplified.
Arias, Pam
1995-01-01
Maintains that Total Quality Management (TQM) is one method that helps to monitor and improve the quality of child care. Lists four steps for a child-care center to design and implement its own TQM program. Suggests that quality assurance in child-care settings is an ongoing process, and that TQM programs help in providing consistent, high-quality…
Total versus subtotal hysterectomy
Gimbel, Helga; Zobbe, Vibeke; Andersen, Anna Birthe;
2005-01-01
The aim of this study was to compare total and subtotal abdominal hysterectomy for benign indications with regard to urinary incontinence, postoperative complications, quality of life (SF-36), constipation, prolapse, satisfaction with sexual life, and pelvic pain at 1 year postoperatively. Eighty...
Total Quality Management Seminar.
Massachusetts Career Development Inst., Springfield.
This booklet is one of six texts from a workplace literacy curriculum designed to assist learners in facing the increased demands of the workplace. The booklet contains seven sections that cover the following topics: (1) meaning of total quality management (TQM); (2) the customer; (3) the organization's culture; (4) comparison of management…
Slavković Nemanja
2012-01-01
Full Text Available Total hip arthroplasty is the most common reconstructive hip procedure in adults. In this surgery we replace parts of the upper femur and acetabulum with biocompatible materials. The main goal of this surgery is to eliminate pain and regain full range of joint motion while maintaining hip stability. Surgical technique, biomaterials, prosthesis design and fixation techniques have evolved over time, adjusting to each other. After total hip arthroplasty, patients’ quality of life should improve. There are many possible postoperative complications: some are fatal, and some are minor and may become manifest years after surgery. Each subsequent surgical procedure after previous hip surgery has considerably lower chances of success. Therefore, in primary total hip arthroplasty, preoperative evaluation and preparation of patients are essential. Every orthopaedic surgeon needs to refine already adopted surgical skills, applying them with precision and without compromise, with the main goal of achieving long-term durability of the selected implant. The number of total hip arthroplasties will also increase in the future, and newer, higher-quality materials will be used.
CSF total protein is a test to determine the amount of protein in your spinal fluid, also called cerebrospinal fluid (CSF). ... The normal protein range varies from lab to lab, but is typically about 15 to 60 milligrams per deciliter (mg/dL) ...
Zachariassen, Frederik
2007-01-01
Total Cost of Ownership (TCO), which offers a take on how companies can gain better insight into which suppliers cause which costs, thereby forming an improved basis for decisions on savings at the supplier level. First and foremost, the article argues why TCO is...
Supravaginal or total hysterectomy?
Edvardsen, L; Madsen, E M
1994-01-01
is examined. It is concluded that the risk of developing carcinoma of the cervical stump is low, and no longer a weighty indication for the total in preference to the supravaginal hysterectomy as long as subsequent screening of the cervix is performed. At the same time it is important to inform the women...
Saya, J.M.; Vos, K.; Klein Nijenhuis, R.A.; van Maarseveen, J.H.; Ingemann, S.; Hiemstra, H.
2015-01-01
A total synthesis of the sesquiterpene lactone aquatolide has been accomplished. The central step is an intramolecular [2 + 2]-photocycloaddition of an allene onto an alpha,beta-unsaturated delta-lactone. Other key steps are an intramolecular Horner-Wadsworth-Emmons reaction to close the lactone and
Schrøder, Henrik M.; Petersen, Michael M.
2016-01-01
Total knee arthroplasty (TKA) is a successful treatment for the osteoarthritic knee, and its use has increased dramatically over the last 30 years. The indication is a painful osteoarthritic knee with relevant radiographic findings and failure of conservative measures such as painkillers and exercise. Trea...
HU Wen-Xiang; WANG Jian-Ying; XU Ming
2003-01-01
Naloxone (1) is one of the 14-hydroxyl-substituted opium antagonists, which are valuable medications for the treatment of opiate abuse, opiate overdose, and alcohol addiction. Here, the total synthesis of naloxone is described. We selected 2,6-dihydroxynaphthalene (2) as the starting material.
Focus in Change, 1992
1992-01-01
The philosophy known as Total Quality Management (TQM) is frequently presented as a way to change and improve public education. This issue of "Focus in Change" examines Deming's original 14 TQM points and their application to education. Myron Tribus lays out the core philosophy of the movement and discusses its possible application to…
Saturation biopsy improves preoperative Gleason scoring of prostate cancer.
Kahl, Philip; Wolf, Susanne; Adam, Alexander; Heukamp, Lukas Carl; Ellinger, Jörg; Vorreuther, Roland; Solleder, Gerold; Buettner, Reinhard
2009-01-01
We evaluated the differences between conventional needle biopsy (CB) and saturation biopsy (SB) techniques with regard to the prediction of Gleason score, tumor stage, and insignificant prostate cancer. Data from a total of 240 patients were analyzed. The main group, consisting of 185 patients, was diagnosed according to a saturation prostate needle biopsy protocol (SB), in which more than 12 cores were taken per biopsy. The control group was diagnosed using CB, in which 12 or fewer cores were taken per biopsy (n=55). The biopsy Gleason score was confirmed by the prostatectomy specimen in 19.5% of the main group and 23.5% of the control group (p=0.50). Upgrading after the operation was found in 56.7% of the main group and 60% of the control group (p=0.24). Downgrading after the operation was found in 23.9% of the main group and 16.3% of the control group (p=0.24). If the Gleason score of the postoperative specimen differed by only one point from the biopsy, we considered this a minor deviation. In the main group, 59% of the carcinomas were preoperatively classified correctly or showed only a minor deviation in Gleason score. In contrast, only 47% of the carcinomas in the control group were assessed correctly or with minor deviation in Gleason score. Thus, the main group demonstrated a better rate of preoperative prediction of tumor grading as assessed by Gleason score (p=0.05). In addition, the Gleason scores of both protocols were assigned to three groups (Gleason <7, Gleason 7, and Gleason >7), and group changes from the biopsy to the prostatectomy specimen were significantly more frequent in the CB group (p=0.04). There was no significant difference between the two biopsy techniques regarding tumor stage or the detection of insignificant carcinomas. The advantage of the extensive prostate needle biopsy technique (SB) is a better preoperative prediction of the Gleason score as well as of the Gleason-based risk groups. Both