WorldWideScience

Sample records for code strength values

  1. Reference Values of Grip Strength, Prevalence of Low Grip Strength, and Factors Affecting Grip Strength Values in Chinese Adults.

    Science.gov (United States)

    Yu, Ruby; Ong, Sherlin; Cheung, Osbert; Leung, Jason; Woo, Jean

    2017-06-01

    The objectives of this study were to update the reference values of grip strength, to estimate the prevalence of low grip strength, and to examine the impact of different aspects of measurement protocol on grip strength values in Chinese adults. A cross-sectional survey of Chinese men (n = 714) and women (n = 4014) aged 18-102 years was undertaken in different community settings in Hong Kong. Grip strength was measured with a digital dynamometer (TKK 5401 Grip-D; Takei, Niigata, Japan). Low grip strength was defined as grip strength 2 standard deviations or more below the mean for young adults. The effects of measurement protocol on grip strength values were examined in a subsample of 45 men and women with repeated measures of grip strength taken with a hydraulic dynamometer (Baseline; Fabrication Enterprises Inc, Irvington, NY), using paired t-tests, intraclass correlation coefficients, and Bland and Altman plots. Grip strength was greater among men than among women (P < …). The TKK digital dynamometer yielded higher grip strength values than the Baseline hydraulic dynamometer (P < …). Higher values were also observed when the measurement was performed with the elbow extended in a standing position, compared with the elbow flexed at 90° in a sitting position, using the same dynamometer (P < …). This study updated the reference values of grip strength and estimated the prevalence of low grip strength among Chinese adults spanning a wide age range. These findings might be useful for risk estimation and evaluation of interventions. However, grip strength measurements should be interpreted with caution, as grip strength values can be affected by the type of dynamometer used, assessment posture, and elbow position. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
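
    The cut-off used in this record (2 standard deviations below the young-adult mean) is straightforward to compute. A minimal Python sketch with made-up numbers, not the study's data:

        import numpy as np

        # Hypothetical grip-strength readings (kg) for a young-adult reference group.
        young_adult_grip = np.array([38.1, 42.5, 40.3, 36.8, 44.0, 39.7, 41.2, 37.5])

        mean = young_adult_grip.mean()
        sd = young_adult_grip.std(ddof=1)      # sample standard deviation
        cutoff = mean - 2 * sd                 # "low grip strength" threshold

        def is_low_grip(value_kg: float) -> bool:
            """True if a measured grip strength falls below the 2-SD cut-off."""
            return value_kg < cutoff

        print(f"cut-off = {cutoff:.1f} kg, is 30 kg low? {is_low_grip(30.0)}")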

  2. Normal values for hand grip strength in healthy Nigerian adults ...

    African Journals Online (AJOL)

    Background: Assessment of hand grip strength is used in a wide range of clinical settings particularly during management of hand injuries and diseases affecting hand function. This study aimed to determine age and gender specific normal values of hand grip strength in healthy adults in Nigeria and compare values ...

  3. Normative values of eccentric hip abduction strength in novice runners

    DEFF Research Database (Denmark)

    Ramskov, D; Pedersen, M B; Kastrup, K

    2014-01-01

    PURPOSE: Low eccentric strength of the hip abductors might increase the risk of patellofemoral pain syndrome and iliotibial band syndrome in runners. No normative values for maximal eccentric hip abduction strength have been established. Therefore, the purpose of this study was to establish norma...

  4. Normative values of eccentric hip abduction strength in novice runners

    DEFF Research Database (Denmark)

    Jørgensen, Daniel Ramskov; Pedersen, Mette Broen; Kastrup, Kristrian

    2014-01-01

    normative values of maximal eccentric hip abduction strength in novice runners. METHODS: Novice healthy runners (n = 831) were recruited through advertisements at a hospital and a university. Maximal eccentric hip abduction strength was measured with a hand-held dynamometer. The demographic variables...... associated with maximal eccentric hip abduction strength from a univariate analysis were included in a multivariate linear regression model. Based on the results from the regression model, a regression equation for normative hip abduction strength is presented. RESULTS: A significant difference in maximal...

  5. Reference Values for Respiratory Muscle Strength in Children and Adolescents.

    Science.gov (United States)

    Hulzebos, Erik; Takken, Tim; Reijneveld, Elja A; Mulder, Mark M G; Bongers, Bart C

    2018-01-17

    Measurement of respiratory muscle function is important in the diagnosis of respiratory muscle disease and respiratory failure, in assessing the impact of chronic diseases, and/or in evaluating respiratory muscle function after treatment. To establish reference values for maximal inspiratory and expiratory pressure, and the tension-time index at rest in healthy children and adolescents aged 8-19 years, as well as to present sex- and age-related reference centiles normalized for demographic and anthropometric determinants. In this cross-sectional observational study, demographic, anthropometric, and spirometric data were assessed, as well as data on respiratory muscle strength (PImax and PEmax) and work of breathing at rest (TT0.1), in a total of 251 children (117 boys and 134 girls; mean age 13.4 ± 2.9 years). Reference values are presented as reference centiles developed by use of the lambda, mu, sigma method. Boys had significantly higher PImax and PEmax values. Next to sex and age, fat-free mass appeared to be an important predictor of respiratory muscle strength. Reference centiles demonstrated a slight, almost linear increase in PImax with age in boys, and a less steep increase with age in girls. TT0.1 values did not differ between boys and girls and decreased linearly with age. This study provides reference values for respiratory muscle strength and work of breathing at rest. In addition to sex and age, fat-free mass was found to be an important predictor of respiratory muscle strength in boys and girls. © 2018 S. Karger AG, Basel.
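
    The lambda-mu-sigma (LMS) method mentioned here converts a measurement into a z-score from age-specific L, M, and S parameters. A short Python sketch of the standard Cole formulas, with illustrative parameter values rather than the study's published centiles:

        import math

        def lms_zscore(y: float, L: float, M: float, S: float) -> float:
            """z-score of measurement y given LMS parameters (Cole's method)."""
            if abs(L) < 1e-9:
                return math.log(y / M) / S
            return ((y / M) ** L - 1.0) / (L * S)

        def lms_centile_value(z: float, L: float, M: float, S: float) -> float:
            """Measurement corresponding to a given z-score (inverse transform)."""
            if abs(L) < 1e-9:
                return M * math.exp(S * z)
            return M * (1.0 + L * S * z) ** (1.0 / L)

        # Illustrative (not published) LMS parameters for PImax at one age:
        L_, M_, S_ = 0.8, 90.0, 0.25
        print(lms_zscore(70.0, L_, M_, S_))           # z-score of a measured PImax of 70
        print(lms_centile_value(-1.645, L_, M_, S_))  # value at roughly the 5th centile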

  6. Do the Microshear Test Variables Affect the Bond Strength Values?

    Directory of Open Access Journals (Sweden)

    Andrea M. Andrade

    2012-01-01

    Little is known about the effect of specimen preparation and testing protocols on micro-shear bond strength (μSBS) results. The aim was to evaluate whether variations in polyethylene rod use affect μSBS. Human dentin disks were randomly distributed into six groups: polyethylene tube (3 levels) and adhesive system (2 levels). In Group 1, polyethylene tubes filled with polymerized composite were placed on adhesive-covered surfaces. Tubes were removed 24 h after water storage, leaving only the rods. In Group 2, the same procedure was performed; however, tubes were kept in place during testing. In Group 3, composite rods without tubes were placed on adhesive-covered dentin. In all groups, adhesives were photoactivated after positioning the filled tubes/rods on the adhesive-covered surfaces. Specimens were tested in shear mode and the data subjected to two-way ANOVA and Tukey's tests. Groups 1 and 2 resulted in statistically similar mean μSBS; however, a greater number of pretest failures was observed for Group 1. Higher μSBS values were detected for Group 3, irrespective of the adhesive system used. Removing the polyethylene tube before the composite rod is placed on dentin affects μSBS values.

  7. The Nursing Code of Ethics: Its Value, Its History.

    Science.gov (United States)

    Epstein, Beth; Turner, Martha

    2015-05-31

    To practice competently and with integrity, today's nurses must have in place several key elements that guide the profession, such as an accreditation process for education, a rigorous system for licensure and certification, and a relevant code of ethics. The American Nurses Association has guided and supported nursing practice through creation and implementation of a nationally accepted Code of Ethics for Nurses with Interpretive Statements. This article will discuss ethics in society, professions, and nursing and illustrate how a professional code of ethics can guide nursing practice in a variety of settings. We also offer a brief history of the Code of Ethics, discuss the modern Code of Ethics, and describe the importance of periodic revision, including the inclusive and thorough process used to develop the 2015 Code and a summary of recent changes. Finally, the article provides implications for practicing nurses to assure that this document is a dynamic, useful resource in a variety of healthcare settings.

  8. Characteristic compression strength of a brickwork masonry starting from the strength of its components. Experimental verification of analytical equations of European codes

    OpenAIRE

    Rolando, A.

    2006-01-01

    In this paper the compression strength of a clay brickwork masonry bound with cement mortar is analyzed. The target is to obtain the characteristic compression strength of unreinforced brickwork masonry. This research tries to test the validity of the analytical equations in European codes, comparing the experimental strength with that obtained analytically from the strength of its components (clay brick and cement mortar).

  9. The added value of measuring thumb and finger strength when comparing strength measurements in hypoplastic thumb patients.

    Science.gov (United States)

    Molenaar, H M Ties; Selles, Ruud W; de Kraker, Marjolein; Stam, Henk J; Hovius, Steven E R

    2013-10-01

    When interventions to the hand are aimed at improving function of specific fingers or the thumb, the RIHM (Rotterdam Intrinsic Hand Myometer) is a validated tool and offers more detailed information for assessing strength of the involved joints besides grip and pinch measurements. In this study, strength was measured in 65 thumbs in 40 patients diagnosed with thumb hypoplasia. These 65 thumbs were classified according to Blauth. Longitudinal radial deficiencies were also classified. The strength measurements comprised grip, tip, tripod, and key pinch. Furthermore, palmar abduction and opposition of the thumb as well as abduction of the index and little finger were measured with the RIHM. For all longitudinal radial deficiency patients, grip and pinch strength as well as palmar abduction and thumb opposition were significantly lower than reference values (P<0.001). However, index finger abduction and little finger abduction strength was maintained or decreased to a lesser extent according to the degree of longitudinal radial deficiency. All strength values decreased with increasing Blauth type. Blauth-type II hands (n=15) with flexor digitorum superficialis 4 opposition transfer including stabilization of the metacarpophalangeal joint showed a trend toward a higher opposition strength without reaching statistical significance (P=0.094); however, compared to non-operated Blauth-type II hands (n=6) they showed a lower grip strength (P=0.019). The RIHM is comparable in accuracy to other strength dynamometers. Using the RIHM, we were able to illustrate strength patterns at a finger-specific level, showing added value when evaluating outcome in patients with hand-related problems. © 2013.

  10. A multidisciplinary approach to vascular surgery procedure coding improves coding accuracy, work relative value unit assignment, and reimbursement.

    Science.gov (United States)

    Aiello, Francesco A; Judelson, Dejah R; Messina, Louis M; Indes, Jeffrey; FitzGerald, Gordon; Doucet, Danielle R; Simons, Jessica P; Schanzer, Andres

    2016-08-01

    Vascular surgery procedural reimbursement depends on accurate procedural coding and documentation. Despite the critical importance of correct coding, there has been a paucity of research focused on the effect of direct physician involvement. We hypothesize that direct physician involvement in procedural coding will lead to improved coding accuracy, increased work relative value unit (wRVU) assignment, and increased physician reimbursement. This prospective observational cohort study evaluated procedural coding accuracy of fistulograms at an academic medical institution (January-June 2014). All fistulograms were coded by institutional coders (traditional coding) and by a single vascular surgeon whose codes were verified by two institution coders (multidisciplinary coding). The coding methods were compared, and differences were translated into revenue and wRVUs using the Medicare Physician Fee Schedule. Comparison between traditional and multidisciplinary coding was performed for three discrete study periods: baseline (period 1), after a coding education session for physicians and coders (period 2), and after a coding education session with implementation of an operative dictation template (period 3). The accuracy of surgeon operative dictations during each study period was also assessed. An external validation at a second academic institution was performed during period 1 to assess and compare coding accuracy. During period 1, traditional coding resulted in a 4.4% (P = .004) loss in reimbursement and a 5.4% (P = .01) loss in wRVUs compared with multidisciplinary coding. During period 2, no significant difference was found between traditional and multidisciplinary coding in reimbursement (1.3% loss; P = .24) or wRVUs (1.8% loss; P = .20). During period 3, traditional coding yielded a higher overall reimbursement (1.3% gain; P = .26) than multidisciplinary coding. This increase, however, was due to errors by institution coders, with six inappropriately used codes

  11. Correlation Between P-wave Velocity and Strength Index for Shale to Predict Uniaxial Compressive Strength Value

    Directory of Open Access Journals (Sweden)

    Awang H.

    2017-01-01

    Seismic refraction survey is a non-destructive method used in site investigation to identify the seismic velocity of subsurface strata. Although it is widely known, the reliability of its results is still questioned, and many engineers insist on conventional methods rather than newer ones, which limits the use of geophysical testing. This study aims to produce a correlation between P-wave velocity and point load strength index for shale. Both field and laboratory tests were carried out. To obtain the P-wave values, a seismic refraction survey was conducted as a field test at Precinct 4, Putrajaya, Malaysia, to measure the P-wave velocity of the shale bed. Ten samples of shale were collected from the field and laboratory tests were conducted. The tests are divided into three sections, namely non-destructive laboratory tests, physical properties tests, and mechanical properties tests. An ultrasonic pulse velocity (PUNDIT) test was conducted as the non-destructive laboratory test to determine the P-wave velocity in the laboratory. The field and laboratory P-wave velocity values were then compared and found to be consistent, lying within the same range. For the physical properties tests, rock density and porosity were determined, while the point load test was conducted for the mechanical properties. A correlation between P-wave velocity and point load strength index was obtained by producing an empirical relationship. The uniaxial compressive strength (UCS) value was then predicted by converting the point load strength value to UCS using a correlation. This empirical relationship shows that geophysical methods are able to produce reliable results; hence, wider use of geophysical methods can be expected in the future.
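
    The workflow described here chains two empirical relationships: P-wave velocity to point load strength index, then point load index to UCS. A hedged Python sketch; the linear coefficients a and b and the UCS conversion factor k below are placeholders, not the correlations reported in the paper:

        def point_load_index_from_vp(vp_m_per_s: float, a: float = 8.0e-4, b: float = -0.5) -> float:
            """Placeholder linear correlation Is50 = a*Vp + b (MPa); fit to your own data."""
            return a * vp_m_per_s + b

        def ucs_from_point_load(is50_mpa: float, k: float = 22.0) -> float:
            """UCS ~ k * Is50; factors of roughly 20-25 are often quoted, but k is rock-specific."""
            return k * is50_mpa

        vp = 2500.0                                  # example field P-wave velocity (m/s)
        is50 = point_load_index_from_vp(vp)
        print(f"Is50 ~ {is50:.2f} MPa, predicted UCS ~ {ucs_from_point_load(is50):.1f} MPa")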

  12. Normal values of key pinch strength in a healthy Nigerian population

    African Journals Online (AJOL)

    Hand grip values in an African population have been found to be lower than those of the Caucasian population.9,10 There is a dearth of data on key pinch strength in Africans in the literature. The objectives of the study were to obtain normal values of key pinch strength amongst a Nigerian population, to determine the ...

  13. Comparison of Calculated value by the Core Design ASTRA Code with Measured Data

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Ji Eun; Yang, Sung Tae [Korea Hydro and Nuclear Power Co., Daejeon (Korea, Republic of)]

    2012-05-15

    ASTRA is a unique proprietary core design code being developed by KEPCO NF. This code is still under development, but it can nonetheless be analyzed and given feedback by a utility. This study will be used to develop a risk assessment procedure based on the ASTRA code. The calculated values of the ASTRA code and the measurement values are compared here using the data for Cycle 12 of Younggwang Nuclear Unit 3. The reactor core is composed of 177 fuel assemblies, consisting of a 16x16 array with 236 fuel rods and 5 guide tubes

  14. Multiple-valued logic-protected coding for an optical non-quantum communication line

    NARCIS (Netherlands)

    Antipov, A. L.; Bykovsky, A. Yu.; Vasiliev, N. A.; Egorov, A. A.

    2006-01-01

    A simple and cheap method of secret coding in an optical line is proposed based on multiple-valued logic. This method is shown to have very high cryptography resources and is designated for bidirectional information exchange in a team of mobile robots, where quantum teleportation coding cannot yet

  15. Coefficient αcc in design value of concrete compressive strength

    Directory of Open Access Journals (Sweden)

    Goleš Danica

    2016-01-01

    Coefficient αcc introduces the effects of the rate and duration of loading on the compressive strength of concrete. These effects may be partially or completely compensated by the increase in concrete strength over time. Selection of the value of this coefficient, within the recommended range of 0.8 to 1.0, is carried out through the National Annexes to Eurocode 2. This paper presents some considerations related to the introduction of this coefficient and the value adopted in some European countries. The article considers the effect of adopting the conservative value αcc = 0.85 on the design value of the compressive and flexural resistance of a rectangular cross-section made of normal- and high-strength concrete. It analyzes the influence of different values of coefficient αcc on the area of reinforcement required to achieve the desired resistance of the cross-section.
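
    The design rule behind this discussion is the Eurocode 2 expression fcd = αcc·fck/γc. A small Python sketch showing how the choice of αcc shifts the design strength (the partial factor γc = 1.5 for persistent and transient situations is assumed here):

        def design_compressive_strength(fck_mpa: float, alpha_cc: float, gamma_c: float = 1.5) -> float:
            """EC2 design value: f_cd = alpha_cc * f_ck / gamma_c."""
            return alpha_cc * fck_mpa / gamma_c

        for alpha_cc in (0.85, 1.0):
            fcd = design_compressive_strength(30.0, alpha_cc)   # C30/37 concrete
            print(f"alpha_cc = {alpha_cc}: f_cd = {fcd:.1f} MPa")
        # alpha_cc = 0.85 gives 17.0 MPa, alpha_cc = 1.0 gives 20.0 MPa,
        # i.e. the conservative choice reduces the design strength by 15%.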

  16. 14 CFR 27.613 - Material strength properties and design values.

    Science.gov (United States)

    2010-01-01

    ...) Design values must be chosen to minimize the probability of structural failure due to material... be shown by selecting design values that assure material strength with the following probability— (1) Where applied loads are eventually distributed through a single member within an assembly, the failure...

  17. 14 CFR 29.613 - Material strength properties and design values.

    Science.gov (United States)

    2010-01-01

    .... (b) Design values must be chosen to minimize the probability of structural failure due to material... be shown by selecting design values that assure material strength with the following probability— (1) Where applied loads are eventually distributed through a single member within an assembly, the failure...

  18. 14 CFR 23.613 - Material strength properties and design values.

    Science.gov (United States)

    2010-01-01

    ... statistical basis. (b) Design values must be chosen to minimize the probability of structural failure due to... must be shown by selecting design values that ensure material strength with the following probability: (1) Where applied loads are eventually distributed through a single member within an assembly, the...

  19. Handgrip strength and its prognostic value for mortality in Moscow, Denmark, and England

    DEFF Research Database (Denmark)

    Oksuzyan, Anna; Demakakos, Panayotes; Shkolnikova, Maria

    2017-01-01

    BACKGROUND: This study compares handgrip strength and its association with mortality across studies conducted in Moscow, Denmark, and England. MATERIALS: The data collected by the Study of Stress, Aging, and Health in Russia, the Study of Middle-Aged Danish Twins and the Longitudinal Study of Aging...... Danish Twins, and the English Longitudinal Study of Ageing was utilized. RESULTS: Among the male participants, the age-standardized grip strength was 2 kg and 1 kg lower in Russia than in Denmark and in England, respectively. The age-standardized grip strength among the female participants was 1.9 kg...... in mortality among the English men and women, respectively. CONCLUSION: The study suggests that, although absolute grip strength values appear to vary across the Muscovite, Danish, and English samples, the degree to which grip strength is predictive of mortality is comparable across national populations...

  20. EVALUATION OF THE RELATIONSHIP BETWEEN LEG STRENGTH AND VELOCITY VALUES IN AMATEUR FOOTBALL PLAYERS

    Directory of Open Access Journals (Sweden)

    İsmail Gökhan

    2015-08-01

    The purpose of this study is to analyse the relationship between conditional parameters (leg strength, back strength, 30 m velocity, flexibility) and some physical (height, body weight) and physiological (systolic and diastolic blood pressure, resting heart rate) characteristics of male football players of Karakopru Belediyespor and Harran University. According to the measurements, mean age was 23.46 ± 3.50 years; mean height was 176.20 ± 5.10 cm and mean body weight was 70.16 ± 5.21 kg. Mean systolic blood pressure was 123.87 ± 14.23 mmHg, mean diastolic blood pressure was 73.60 ± 16.42 mmHg, and mean resting heart rate was 64.50 ± 10.48 beats/min. Among the conditional parameters, mean leg strength was 101.83 ± 40.48 kg, back strength was 75.83 ± 19.43 kg, flexibility was 34.16 ± 6.65 cm, and mean 30 m velocity was 4.15 ± 0.20 s. A relationship was observed between 30 m velocity and leg strength (r = -0.407). No relationship was found between 30 m velocity and back strength (r = 0.429) or between 30 m velocity and flexibility (r = 0.659). As a result, while the relationship between velocity and back strength values of the amateur football players was not significant (p > 0.05), the relationship between velocity and leg strength values was found to be significant (p < 0.05).

  1. Prognostic value of a decreased tongue strength for survival time in patients with amyotrophic lateral sclerosis

    NARCIS (Netherlands)

    Alexander Geurts; J. Weikamp; J. Hendriks; J. Schelhaas; Bert de Swart

    2012-01-01

    Decreased tongue strength (TS) might herald bulbar involvement in patients with amyotrophic lateral sclerosis (ALS) well before dysarthria or dysphagia occur, and as such might be prognostic of short survival. The purpose of this study was to investigate the prognostic value of a decreased TS, in

  2. Prognostic value of decreased tongue strength on survival time in patients with amyotrophic lateral sclerosis

    NARCIS (Netherlands)

    Weikamp, J.G.; Schelhaas, H.J.; Hendriks, J.C.M.; Swart, B.J.M. de; Geurts, A.C.H.

    2012-01-01

    Decreased tongue strength (TS) might herald bulbar involvement in patients with amyotrophic lateral sclerosis (ALS) well before dysarthria or dysphagia occur, and as such might be prognostic of short survival. The purpose of this study was to investigate the prognostic value of a decreased TS, in

  3. Characteristic compression strength of a brickwork masonry starting from the strength of its components. Experimental verification of analytical equations of European codes

    Directory of Open Access Journals (Sweden)

    Rolando, A.

    2006-09-01

    In this paper the compression strength of a clay brickwork masonry bound with cement mortar is analyzed. The target is to obtain the characteristic compression strength of unreinforced brickwork masonry. This research tries to test the validity of the analytical equations in European codes, comparing the experimental strength with that obtained analytically from the strength of its components (clay brick and cement mortar).
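
    The analytical equation in question is of the Eurocode 6 form fk = K·fb^0.7·fm^0.3 for masonry with general purpose mortar. A hedged Python sketch; the value K = 0.55 below is only an illustrative choice (the code tabulates K by unit group and mortar type):

        def masonry_char_strength(fb_mpa: float, fm_mpa: float, K: float = 0.55) -> float:
            """EN 1996-1-1 style estimate: f_k = K * f_b**0.7 * f_m**0.3 (general purpose mortar)."""
            return K * fb_mpa ** 0.7 * fm_mpa ** 0.3

        # Example: clay units with normalized strength 20 MPa and M10 mortar.
        print(f"f_k ~ {masonry_char_strength(20.0, 10.0):.2f} MPa")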

  4. Handgrip strength and its prognostic value for mortality in Moscow, Denmark, and England.

    Science.gov (United States)

    Oksuzyan, Anna; Demakakos, Panayotes; Shkolnikova, Maria; Thinggaard, Mikael; Vaupel, James W; Christensen, Kaare; Shkolnikov, Vladimir M

    2017-01-01

    This study compares handgrip strength and its association with mortality across studies conducted in Moscow, Denmark, and England. The data collected by the Study of Stress, Aging, and Health in Russia, the Study of Middle-Aged Danish Twins and the Longitudinal Study of Aging Danish Twins, and the English Longitudinal Study of Ageing was utilized. Among the male participants, the age-standardized grip strength was 2 kg and 1 kg lower in Russia than in Denmark and in England, respectively. The age-standardized grip strength among the female participants was 1.9 kg and 1.6 kg lower in Russia than in Denmark and in England, respectively. In Moscow, a one-kilogram increase in grip strength was associated with a 4% (hazard ratio [HR] = 0.96, 95% confidence interval [CI]: 0.94, 0.99) reduction in mortality among men and a 10% (HR = 0.90, 95%CI: 0.86, 0.94) among women. Meanwhile, a one-kilogram increase in grip strength was associated with a 6% (HR = 0.94, 95%CI: 0.93, 0.95) and an 8% (HR = 0.92, 95%CI: 0.90, 0.94) decrease in mortality among Danish men and women, respectively, and with a 2% (HR = 0.98, 95%CI: 0.97, 0.99) and a 3% (HR = 0.97, 95%CI: 0.95, 0.98) reduction in mortality among the English men and women, respectively. The study suggests that, although absolute grip strength values appear to vary across the Muscovite, Danish, and English samples, the degree to which grip strength is predictive of mortality is comparable across national populations with diverse socioeconomic and health profiles and life expectancy levels.
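
    The hazard ratios reported here are per one kilogram of grip strength; under a proportional-hazards model with a linear covariate effect they scale multiplicatively with the size of the difference. A quick Python illustration using the Danish estimate for men (HR = 0.94 per kg) from the abstract:

        def hr_for_difference(hr_per_unit: float, difference: float) -> float:
            """Hazard ratio for a given covariate difference under a log-linear Cox model."""
            return hr_per_unit ** difference

        # HR = 0.94 per 1 kg (Danish men, from the abstract); effect of being 5 kg stronger:
        print(f"HR for +5 kg: {hr_for_difference(0.94, 5):.2f}")   # ~0.73, i.e. ~27% lower hazard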

  5. Handgrip strength and its prognostic value for mortality in Moscow, Denmark, and England.

    Directory of Open Access Journals (Sweden)

    Anna Oksuzyan

    This study compares handgrip strength and its association with mortality across studies conducted in Moscow, Denmark, and England. The data collected by the Study of Stress, Aging, and Health in Russia, the Study of Middle-Aged Danish Twins and the Longitudinal Study of Aging Danish Twins, and the English Longitudinal Study of Ageing was utilized. Among the male participants, the age-standardized grip strength was 2 kg and 1 kg lower in Russia than in Denmark and in England, respectively. The age-standardized grip strength among the female participants was 1.9 kg and 1.6 kg lower in Russia than in Denmark and in England, respectively. In Moscow, a one-kilogram increase in grip strength was associated with a 4% (hazard ratio [HR] = 0.96, 95% confidence interval [CI]: 0.94, 0.99) reduction in mortality among men and a 10% (HR = 0.90, 95%CI: 0.86, 0.94) among women. Meanwhile, a one-kilogram increase in grip strength was associated with a 6% (HR = 0.94, 95%CI: 0.93, 0.95) and an 8% (HR = 0.92, 95%CI: 0.90, 0.94) decrease in mortality among Danish men and women, respectively, and with a 2% (HR = 0.98, 95%CI: 0.97, 0.99) and a 3% (HR = 0.97, 95%CI: 0.95, 0.98) reduction in mortality among the English men and women, respectively. The study suggests that, although absolute grip strength values appear to vary across the Muscovite, Danish, and English samples, the degree to which grip strength is predictive of mortality is comparable across national populations with diverse socioeconomic and health profiles and life expectancy levels.

  6. Positive predictive values of peripheral arterial and venous thrombosis codes in French hospital database.

    Science.gov (United States)

    Prat, Mandy; Derumeaux, Hélène; Sailler, Laurent; Lapeyre-Mestre, Maryse; Moulis, Guillaume

    2018-02-01

    French hospital database, called Programme de Médicalisation des Systèmes d'Information (PMSI), covers all hospital stays in France (>66 million inhabitants). The aim of this study was to estimate the positive predictive values (PPVs) of primary diagnosis codes of peripheral arterial and venous thrombosis codes in the PMSI, encoded with the International Classification of Diseases, 10th revision. Data were extracted from the PMSI database of Toulouse University Hospital, south of France. We identified all the hospital stays in 2015 with a code of peripheral arterial or venous thrombosis as primary diagnosis. We randomly selected 100 stays for each category of thrombosis and reviewed the corresponding medical charts. The PPV of peripheral arterial thrombosis codes was 83.0%, 95% confidence interval (CI): 73.9-89.1, and the PPV of correct location of thrombosis was 81.0%, 95% CI: 72.2-87.5. The PPV of pulmonary embolism was 99.0%, 95% CI: 93.8-99.9. The PPV of peripheral venous thrombosis was 95.0%, 95% CI: 88.2-98.1, and the PPV of correct location of thrombosis was 85.0%, 95% CI: 76.7-90.7. Primary diagnoses of peripheral arterial and venous thrombosis demonstrated good PPVs in the PMSI. © 2017 Société Française de Pharmacologie et de Thérapeutique.
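
    A PPV with a 95% confidence interval, as reported in this study, can be reproduced from the confirmed/total chart-review counts. A minimal Python sketch using the Wilson score interval (the paper does not state which interval method it used, so the exact bounds may differ slightly):

        import math

        def ppv_with_wilson_ci(confirmed: int, total: int, z: float = 1.959964):
            """Positive predictive value and Wilson score 95% CI for confirmed/total."""
            p = confirmed / total
            denom = 1 + z**2 / total
            centre = (p + z**2 / (2 * total)) / denom
            half = z * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
            return p, centre - half, centre + half

        ppv, lo, hi = ppv_with_wilson_ci(83, 100)   # e.g. 83 of 100 reviewed charts confirmed
        print(f"PPV = {ppv:.1%} (95% CI {lo:.1%}-{hi:.1%})")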

  7. Mapping scores from the Strengths and Difficulties Questionnaire (SDQ) to preference-based utility values.

    Science.gov (United States)

    Furber, Gareth; Segal, Leonie; Leach, Matthew; Cocks, Jane

    2014-03-01

    Quality of life mapping methods such as "Transfer to Utility" can be used to translate scores on disease-specific measures to utility values, when traditional utility measurement methods (e.g. standard gamble, time trade-off, preference-based multi-attribute instruments) have not been used. The aim of this study was to generate preliminary ordinary least squares (OLS) regression-based algorithms to transform scores from the Strengths and Difficulties Questionnaires (SDQ), a widely used measure of mental health in children and adolescents, to utility values obtained using the preference-based Child Health Utility (CHU9D) instrument. Two hundred caregivers of children receiving community mental health services completed the SDQ and CHU9D during a telephone interview. Two OLS regressions were run with the CHU9D utility value as the dependent variable and SDQ subscales as predictors. Resulting algorithms were validated by comparing predicted and observed group mean utility values in randomly selected subsamples. Preliminary validation was obtained for two algorithms, utilising five and three subscales of the SDQ, respectively. Root mean square error values (.124) for both models suggested poor fit at an individual level, but both algorithms performed well in predicting mean group observed utility values. This research generated algorithms for translating SDQ scores to utility values and providing researchers with an additional tool for conducting health economic evaluations with child and adolescent mental health data.
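
    The "Transfer to Utility" mapping described here is an ordinary least squares regression of CHU9D utilities on SDQ subscale scores. A self-contained Python sketch with synthetic data; the published algorithms and their coefficients are not reproduced here:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        sdq = rng.integers(0, 11, size=(n, 5)).astype(float)       # five SDQ subscale scores (0-10)
        true_beta = np.array([-0.02, -0.015, -0.01, -0.012, 0.008])
        utility = 0.95 + sdq @ true_beta + rng.normal(0, 0.05, n)   # synthetic CHU9D utilities

        X = np.column_stack([np.ones(n), sdq])                      # add intercept
        beta, *_ = np.linalg.lstsq(X, utility, rcond=None)          # OLS fit
        pred = X @ beta
        rmse = np.sqrt(np.mean((utility - pred) ** 2))

        print("intercept and subscale coefficients:", np.round(beta, 3))
        print(f"RMSE = {rmse:.3f}; group mean observed {utility.mean():.3f} vs predicted {pred.mean():.3f}")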

  8. Calculation of low-cycle fatigue in accordance with the national standard and strength codes

    Science.gov (United States)

    Kontorovich, T. S.; Radin, Yu. A.

    2017-08-01

    Over the most recent 15 years, the Russian power industry has largely relied on imported equipment manufactured in compliance with foreign standards and procedures. This inevitably necessitates their harmonization with the regulatory documents of the Russian Federation, which include calculations of strength and low-cycle fatigue and assessment of the equipment service life. An important regulatory document providing the engineering foundation for cyclic strength and life assessment of high-load components of the boiler and steamline of a water/steam circuit is RD 10-249-98:2000, Standard Method of Strength Estimation in Stationary Boilers and Steam and Water Piping. In January 2015, the national standard GOST R 55682.3-2013/EN 12952-3:2001 was introduced, regulating the design and calculation of the pressure parts of water-tube boilers and auxiliary installations. Thus, two documents are simultaneously valid in the same field and use different methods for calculating low-cycle fatigue strength, which leads to different results. This situation can lead to incorrect conclusions about the cyclic strength and the service life of high-temperature boiler parts. The article shows that the results of calculations performed in accordance with GOST R 55682.3-2013/EN 12952-3:2001 are less conservative than the results of the standard RD 10-249-98. Since the calculation of the expected service life of boiler parts should use GOST R 55682.3-2013/EN 12952-3:2001, it becomes necessary to establish the applicability scope of each of the above documents.

  9. Interval Estimation of Stress-Strength Reliability Based on Lower Record Values from Inverse Rayleigh Distribution

    Directory of Open Access Journals (Sweden)

    Bahman Tarvirdizade

    2014-01-01

    We consider the estimation of stress-strength reliability based on lower record values when X and Y are independent but not identically distributed inverse Rayleigh random variables. The maximum likelihood, Bayes, and empirical Bayes estimators of R are obtained and their properties are studied. Confidence intervals, exact and approximate, as well as Bayesian credible sets for R are obtained. A real example is presented in order to illustrate the inferences discussed in the previous sections. A simulation study is conducted to investigate and compare the performance of the intervals presented in this paper and some bootstrap intervals.
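
    For the inverse Rayleigh parametrization F(x; θ) = exp(-θ/x²), the stress-strength reliability R = P(Y < X) has the closed form θx/(θx + θy). A small Python Monte Carlo check of that identity; this illustrates the quantity R itself, not the record-value estimators studied in the paper, and the parametrization is an assumption:

        import numpy as np

        def sample_inverse_rayleigh(theta: float, size: int, rng) -> np.ndarray:
            """Inverse-CDF sampling for F(x) = exp(-theta / x**2)."""
            u = rng.uniform(size=size)
            return np.sqrt(-theta / np.log(u))

        rng = np.random.default_rng(42)
        theta_x, theta_y = 2.0, 1.0                    # X = strength, Y = stress (illustrative)
        x = sample_inverse_rayleigh(theta_x, 200_000, rng)
        y = sample_inverse_rayleigh(theta_y, 200_000, rng)

        r_mc = np.mean(y < x)
        r_exact = theta_x / (theta_x + theta_y)
        print(f"Monte Carlo R = {r_mc:.3f}, closed form R = {r_exact:.3f}")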

  10. Generating Code with Polymorphic let: A Ballad of Value Restriction, Copying and Sharing

    Directory of Open Access Journals (Sweden)

    Oleg Kiselyov

    2017-02-01

    Getting polymorphism and effects such as mutation to live together in the same language is a tale worth telling, under the recurring refrain of copying vs. sharing. We add new stanzas to the tale, about the ordeal to generate code with polymorphism and effects, and be sure it type-checks. Generating well-typed-by-construction polymorphic let-expressions is impossible in the Hindley-Milner type system: even the author believed that. The polymorphic-let generator turns out to exist. We present its derivation and the application for the lightweight implementation of quotation via a novel and unexpectedly simple source-to-source transformation to code-generating combinators. However, generating let-expressions with polymorphic functions demands more than even the relaxed value restriction can deliver. We need a new deal for let-polymorphism in ML. We conjecture the weaker restriction and implement it in a practically-useful code-generation library. Its formal justification is formulated as the research program.

  11. Value and probability coding in a feedback-based learning task utilizing food rewards.

    Science.gov (United States)

    Tricomi, Elizabeth; Lempert, Karolina M

    2015-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort. Copyright © 2015 the American Physiological Society.

  12. The neural dynamics of reward value and risk coding in the human orbitofrontal cortex.

    Science.gov (United States)

    Li, Yansong; Vanni-Mercier, Giovanna; Isnard, Jean; Mauguière, François; Dreher, Jean-Claude

    2016-04-01

    The orbitofrontal cortex is known to carry information regarding expected reward, risk and experienced outcome. Yet, due to inherent limitations in lesion and neuroimaging methods, the neural dynamics of these computations has remained elusive in humans. Here, taking advantage of the high temporal definition of intracranial recordings, we characterize the neurophysiological signatures of the intact orbitofrontal cortex in processing information relevant for risky decisions. Local field potentials were recorded from the intact orbitofrontal cortex of patients suffering from drug-refractory partial epilepsy with implanted depth electrodes as they performed a probabilistic reward learning task that required them to associate visual cues with distinct reward probabilities. We observed three successive signals: (i) around 400 ms after cue presentation, the amplitudes of the local field potentials increased with reward probability; (ii) a risk signal emerged during the late phase of reward anticipation and during the outcome phase; and (iii) an experienced value signal appeared at the time of reward delivery. Both the medial and lateral orbitofrontal cortex encoded risk and reward probability while the lateral orbitofrontal cortex played a dominant role in coding experienced value. The present study provides the first evidence from intracranial recordings that the human orbitofrontal cortex codes reward risk both during late reward anticipation and during the outcome phase at a time scale of milliseconds. Our findings offer insights into the rapid mechanisms underlying the ability to learn structural relationships from the environment. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. The Value Compressive Strength and Split Tensile Strength on Concrete Mixture With Expanded Polystyrene Coated by Surfactant Span 80 as a Partial Substitution of Fine Aggregate

    Directory of Open Access Journals (Sweden)

    Hidayat Irpan

    2014-03-01

    The density of normal concrete ranges between 2200 and 2400 kg/m3. The use of Expanded Polystyrene (EPS) as a substitute for fine aggregate can therefore reduce the density of concrete. The purpose of this research is to reduce the density of normal concrete while increasing the compressive strength of EPS concrete, using a surfactant as a coating for the EPS. The substitution percentages of EPS and of EPS coated by surfactant are 5%, 10%, 15%, 20%, and 25%. The concrete mix design is based on SNI 03-2834-2000, “Tata Cara Pembuatan Rencana Campuran Beton Normal (Provisions for Proportioning Normal Concrete Mixture)”. The test results show that every increase in the EPS substitution percentage decreases the compressive strength by around 1.74 MPa and the density by 34.03 kg/m3. Using the surfactant as a coating for the EPS increases the compressive strength relative to uncoated EPS concrete, by 0.19 MPa on average, and increases the density by 20.03 kg/m3; the average decrease in the split tensile strength of the surfactant-coated EPS concrete is 0.84 MPa.

  14. The Value Compressive Strength and Split Tensile Strength on Concrete Mixture With Expanded Polystyrene Coated by Surfactant Span 80 as a Partial Substitution of Fine Aggregate

    Science.gov (United States)

    Hidayat, Irpan; Siauwantara, Alice

    2014-03-01

    The density of normal concrete ranges between 2200 and 2400 kg/m3. The use of Expanded Polystyrene (EPS) as a substitute for fine aggregate can therefore reduce the density of concrete. The purpose of this research is to reduce the density of normal concrete while increasing the compressive strength of EPS concrete, using a surfactant as a coating for the EPS. The substitution percentages of EPS and of EPS coated by surfactant are 5%, 10%, 15%, 20%, and 25%. The concrete mix design is based on SNI 03-2834-2000, "Tata Cara Pembuatan Rencana Campuran Beton Normal (Provisions for Proportioning Normal Concrete Mixture)". The test results show that every increase in the EPS substitution percentage decreases the compressive strength by around 1.74 MPa and the density by 34.03 kg/m3. Using the surfactant as a coating for the EPS increases the compressive strength relative to uncoated EPS concrete, by 0.19 MPa on average, and increases the density by 20.03 kg/m3; the average decrease in the split tensile strength of the surfactant-coated EPS concrete is 0.84 MPa.

  15. Eating Disorder Behaviors, Strength of Faith, and Values in Late Adolescents and Emerging Adults: An Exploration of Associations

    Science.gov (United States)

    King, Stephanie L.

    2012-01-01

    Adolescents entering college are often affected by eating disorders and during this transition to emerging adulthood, individuals begin to establish personal values and beliefs, which makes this population interesting when studying Eating Disorders, values, and faith. This research project seeks to examine the association among strength of…

  16. Developing a Material Strength Design Value Based on Compression after Impact Damage for the Ares I Composite Interstage

    Science.gov (United States)

    Nettles, A. T.; Jackson, J. R.

    2009-01-01

    The derivation of design values for compression after impact strength for two types of honeycomb sandwich structures is presented. The sandwich structures in this study had an aluminum core and composite laminate facesheets of either 16-ply quasi-isotropic or 18-ply directional lay-ups. The results show that a simple power law curve fit to the data can be used to create A- and B-basis residual strength curves.
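
    The residual-strength model referred to here is a simple power law, strength = a · (impact damage metric)^b, fitted on log-log axes. A Python sketch with made-up compression-after-impact data, not the Ares I dataset:

        import numpy as np

        # Hypothetical (impact energy [J], residual compression strength [MPa]) pairs.
        energy = np.array([2.0, 4.0, 6.0, 8.0, 12.0, 16.0])
        strength = np.array([310.0, 255.0, 228.0, 205.0, 182.0, 160.0])

        # Fit strength = a * energy**b by linear regression in log-log space.
        b, log_a = np.polyfit(np.log(energy), np.log(strength), 1)
        a = np.exp(log_a)

        predict = lambda e: a * e ** b
        print(f"a = {a:.1f}, b = {b:.3f}, predicted strength at 10 J = {predict(10.0):.0f} MPa")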

  17. Received Signal Strength Recovery in Green WLAN Indoor Positioning System Using Singular Value Thresholding

    Directory of Open Access Journals (Sweden)

    Lin Ma

    2015-01-01

    Green WLAN is a promising technique for accessing future indoor Internet services. It is designed not only for high-speed data communication purposes but also for energy efficiency. The basic strategy of green WLAN is that all the access points are not always powered on, but rather work on-demand. Though powering off idle access points does not affect data communication, a serious asymmetric matching problem will arise in a WLAN indoor positioning system due to the fact that the received signal strength (RSS) readings from the available access points are different in their offline and online phases. This asymmetry problem will no doubt invalidate the fingerprint algorithm used to estimate the mobile device location. Therefore, in this paper we propose a green WLAN indoor positioning system, which can recover RSS readings and achieve good localization performance based on singular value thresholding (SVT) theory. By solving the nuclear norm minimization problem, SVT recovers not only the radio map, but also online RSS readings from a sparse matrix by sensing only a fraction of the RSS readings. We have implemented the method in our lab and evaluated its performances. The experimental results indicate the proposed system could recover the RSS readings and achieve good localization performance.

  18. Received signal strength recovery in green WLAN indoor positioning system using singular value thresholding.

    Science.gov (United States)

    Ma, Lin; Xu, Yubin

    2015-01-12

    Green WLAN is a promising technique for accessing future indoor Internet services. It is designed not only for high-speed data communication purposes but also for energy efficiency. The basic strategy of green WLAN is that all the access points are not always powered on, but rather work on-demand. Though powering off idle access points does not affect data communication, a serious asymmetric matching problem will arise in a WLAN indoor positioning system due to the fact the received signal strength (RSS) readings from the available access points are different in their offline and online phases. This asymmetry problem will no doubt invalidate the fingerprint algorithm used to estimate the mobile device location. Therefore, in this paper we propose a green WLAN indoor positioning system, which can recover RSS readings and achieve good localization performance based on singular value thresholding (SVT) theory. By solving the nuclear norm minimization problem, SVT recovers not only the radio map, but also online RSS readings from a sparse matrix by sensing only a fraction of the RSS readings. We have implemented the method in our lab and evaluated its performances. The experimental results indicate the proposed system could recover the RSS readings and achieve good localization performance.
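
    Singular value thresholding as used here recovers a low-rank RSS matrix from a subset of observed entries by iterating an SVD soft-threshold step. A compact NumPy sketch of the generic SVT iteration; the step size and threshold are illustrative choices, not values tuned for a real radio map:

        import numpy as np

        def svt_complete(observed: np.ndarray, mask: np.ndarray,
                         tau: float, delta: float, iters: int = 500) -> np.ndarray:
            """Recover a low-rank matrix from entries where mask == 1 (Cai-Candes-Shen SVT)."""
            Y = np.zeros_like(observed)
            X = np.zeros_like(observed)
            for _ in range(iters):
                U, s, Vt = np.linalg.svd(Y, full_matrices=False)
                X = (U * np.maximum(s - tau, 0.0)) @ Vt        # singular value shrinkage
                Y = Y + delta * mask * (observed - X)          # gradient step on observed entries
            return X

        # Toy example: a rank-2 "radio map" with about 50% of RSS readings observed.
        rng = np.random.default_rng(1)
        M = rng.normal(size=(40, 2)) @ rng.normal(size=(2, 30))
        mask = (rng.uniform(size=M.shape) < 0.5).astype(float)
        rec = svt_complete(M * mask, mask, tau=5 * np.sqrt(M.size), delta=1.5)
        print("relative recovery error:", np.linalg.norm(rec - M) / np.linalg.norm(M))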

  19. Mean Hand Grip Strength and Cut-off Value for Sarcopenia in Korean Adults Using KNHANES VI.

    Science.gov (United States)

    Yoo, Jun Il; Choi, Hana; Ha, Yong Chan

    2017-05-01

    The purpose of this study was to report the age- and gender-specific distribution of hand grip strength (HGS) using data from the Korea National Health and Nutrition Examination Survey (KNHANES) VI-3 (2015) and to determine cut-off values for low muscle strength of HGS in Koreans. Of a total of 7,380 participants, 4,553 were subjected to measurements of HGS, including 1,997 men and 2,556 women with a mean age of 49.3 years (range, 19-80 years). The mean ages of men and women were 49.0 and 49.5 years, respectively. HGS was measured using a digital hand dynamometer. It was defined as the maximal measured grip strength of the dominant hand. The cut-off value for low muscle strength was defined as the lower 20th percentile of HGS of the study population. Maximum grip strength of men was significantly higher than that of women (40.2 kg in men vs. 24.2 kg in women, P < …). The cut-off values of HGS in the healthy elderly male and female populations were 28.6 and 16.4 kg, respectively. These data might be used as reference values when evaluating sarcopenia and assessing hand injuries. © 2017 The Korean Academy of Medical Sciences.

  20. Prognostic value of long non-coding RNA HOTAIR in various cancers.

    Directory of Open Access Journals (Sweden)

    Qiwen Deng

    Long non-coding RNA has been implicated in cancer progression, and high HOX transcript antisense intergenic RNA (HOTAIR) expression is thought to be a poor prognostic indicator in tumorigenesis of multiple types of cancer. Hence, the present study further evaluates its prognostic value in tumor malignancy. A systematic review of PubMed and Web of Science was carried out to select literature relevant to the correlation between HOTAIR expression levels and clinical outcome of various tumors. Overall survival (OS), metastasis-free survival (MFS), recurrence-free survival (RFS), and disease-free survival (DFS) were subsequently analyzed. Data from studies directly reporting a hazard ratio (HR) and the corresponding 95% confidence interval (CI) or a P value, as well as survival curves, were pooled in the current meta-analysis. A total of 2255 patients from 19 studies, almost all published in 2011 or later, were included in the analysis. The results suggest that HOTAIR was highly associated with an HR for OS of 2.33 (95% CI = 1.77-3.09, Pheterogeneity = 0.016). Stratified analyses indicate that an elevated level of HOTAIR appears to be a powerful prognostic biomarker for patients with colorectal cancer (HR = 3.02, 95% CI = 1.84-4.95, Pheterogeneity = 0.699) and esophageal squamous cell carcinoma (HR = 2.24, 95% CI = 1.67-3.01, Pheterogeneity = 0.711); a similar effect was also observed across analysis methods and specimen types, but not across ethnicities. In addition, hazard ratios for up-regulation of HOTAIR for MFS, RFS, and DFS were 2.32 (P < 0.001), 1.98 (P = 0.369), and 3.29 (P = 0.001), respectively. In summary, a high level of HOTAIR is intimately associated with adverse OS in numerous cancers, suggesting that HOTAIR may act as a potential biomarker for the development of malignancies.
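
    Pooling hazard ratios across studies, as done in this meta-analysis, is typically performed on the log scale with inverse-variance weights. A Python sketch of a fixed-effect pooled HR from hypothetical per-study HRs and 95% CIs (not the 19 studies analysed here):

        import math

        # Hypothetical (HR, lower 95% CI, upper 95% CI) triples from individual studies.
        studies = [(2.1, 1.4, 3.2), (1.8, 1.1, 2.9), (2.6, 1.6, 4.2), (1.5, 0.9, 2.5)]

        log_hrs = [math.log(hr) for hr, lo, hi in studies]
        ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for hr, lo, hi in studies]
        weights = [1 / se**2 for se in ses]

        pooled_log_hr = sum(w * l for w, l in zip(weights, log_hrs)) / sum(weights)
        pooled_se = math.sqrt(1 / sum(weights))
        hr = math.exp(pooled_log_hr)
        ci = (math.exp(pooled_log_hr - 1.96 * pooled_se), math.exp(pooled_log_hr + 1.96 * pooled_se))
        print(f"fixed-effect pooled HR = {hr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")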

  1. Strength Measurements in Acute Hamstring Injuries: Intertester Reliability and Prognostic Value of Handheld Dynamometry

    NARCIS (Netherlands)

    Reurink, Gustaaf; Goudswaard, Gert Jan; Moen, Maarten H.; Tol, Johannes L.; Verhaar, Jan A. N.; Weir, Adam

    2016-01-01

    Study Design Cohort study, repeated measures. Background Although hamstring strength measurements are used for assessing prognosis and monitoring recovery after hamstring injury, their actual clinical relevance has not been established. Handheld dynamometry (HHD) is a commonly used method of

  2. Handgrip strength and its prognostic value for mortality in Moscow, Denmark, and England

    OpenAIRE

    Anna Oksuzyan; Panayotes Demakakos; Maria Shkolnikova; Mikael Thinggaard; Vaupel, James W.; Kaare Christensen; Shkolnikov, Vladimir M

    2017-01-01

    BACKGROUND: This study compares handgrip strength and its association with mortality across studies conducted in Moscow, Denmark, and England. MATERIALS: The data collected by the Study of Stress, Aging, and Health in Russia, the Study of Middle-Aged Danish Twins and the Longitudinal Study of Aging Danish Twins, and the English Longitudinal Study of Ageing was utilized. RESULTS: Among the male participants, the age-standardized grip strength was 2 kg and 1 kg lower in Russia than in Denmark and...

  3. Deviance among young Italians: investigating the predictive strength of value systems.

    Science.gov (United States)

    Froggio, Giacinto; Lori, Massimo

    2010-08-01

    Despite more than three decades of research on the relationship between values and deviance, results have not yet completely clarified what kind of relationship it is. Criminological theories (sociological and psychosocial) emphasize a relationship between certain values, such as hedonistic and materialistic ones, and deviance, but few theorists explain whether these values are predictors of deviance. In this study, using a sample of 500 young Italians, the authors try to verify whether value systems can be considered predictors of juvenile deviance and, if so, which type of value system. An attempt is made to identify the capability of values to explain deviance in comparison with other psychosocial and demographic predictors. The results confirm past findings of a relation between hedonistic and materialistic values and deviance but show only a modest and peripheral ability of values to explain deviance.

  4. Value and probability coding in a feedback-based learning task utilizing food rewards

    OpenAIRE

    Tricomi, Elizabeth; Karolina M Lempert

    2014-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedb...

  5. Coding of the long-term value of multiple future rewards in the primate striatum.

    Science.gov (United States)

    Yamada, Hiroshi; Inokawa, Hitoshi; Matsumoto, Naoyuki; Ueda, Yasumasa; Enomoto, Kazuki; Kimura, Minoru

    2013-02-01

    Decisions maximizing benefits involve a tradeoff between the quantity of a reward and the cost of elapsed time until an animal receives it. The estimation of long-term reward values is critical to attain the most desirable outcomes over a certain period of time. Reinforcement learning theories have established algorithms to estimate the long-term reward values of multiple future rewards in which the values of future rewards are discounted as a function of how many steps of choices are necessary to achieve them. Here, we report that presumed striatal projection neurons represent the long-term values of multiple future rewards estimated by a standard reinforcement learning model while monkeys are engaged in a series of trial-and-error choices and adaptive decisions for multiple rewards. We found that the magnitude of activity of a subset of neurons was positively correlated with the long-term reward values, and that of another subset of neurons was negatively correlated throughout the entire decision-making process in individual trials: from the start of the task trial, estimation of the values and their comparison among alternatives, choice execution, and evaluation of the received rewards. An idiosyncratic finding was that neurons showing negative correlations represented reward values in the near future (high discounting), while neurons showing positive correlations represented reward values not only in the near future, but also in the far future (low discounting). These findings provide a new insight that long-term value signals are embedded in two subsets of striatal neurons as high and low discounting of multiple future rewards.
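
    The long-term value referred to here is the standard discounted sum used in reinforcement learning, V = sum over k of gamma^k * r_k, where a smaller discount factor gamma corresponds to steeper discounting of distant rewards. A tiny Python illustration:

        def long_term_value(rewards, gamma: float) -> float:
            """Discounted sum of a sequence of future rewards: V = sum_k gamma**k * r_k."""
            return sum((gamma ** k) * r for k, r in enumerate(rewards))

        future_rewards = [0.0, 0.0, 1.0, 1.0]               # rewards expected over the next trials
        print(long_term_value(future_rewards, gamma=0.9))   # low discounting: ~1.54
        print(long_term_value(future_rewards, gamma=0.3))   # high discounting: ~0.12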

  6. A parametric study of MELCOR Accident Consequence Code System 2 (MACCS2) Input Values for the Predicted Health Effect

    Energy Technology Data Exchange (ETDEWEB)

    Kim, So Ra; Min, Byung Il; Park, Ki Hyun; Yang, Byung Mo; Suh, Kyung Suk [KAERI, Daejeon (Korea, Republic of)]

    2016-05-15

    The MELCOR Accident Consequence Code System 2 (MACCS2) has been the most widely used off-site consequence analysis code in the world. The MACCS2 code is used to estimate the radionuclide concentrations, radiological doses, health effects, and economic consequences that could result from hypothetical nuclear accidents. Most of the MACCS model parameter values are defined by the user, and those input parameters can have a significant impact on the output. A limited parametric study was performed to identify the relative importance of each input parameter in determining the predicted early and latent health effects in MACCS2. These results are not applicable to every nuclear accident scenario, because only a limited calculation was performed with Kori-specific data. The endpoints of the assessment were early and latent cancer risk in the exposed population; parametric studies for other endpoints, such as contamination level, absorbed dose, and economic cost, might therefore produce different results. Accident consequence assessment is important for decision making to minimize the health effects of radiation exposure; accordingly, sufficient parametric studies are required for the various endpoints and input parameters in further research.
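
    The parametric study described here amounts to varying one input at a time and observing the change in the predicted health-effect endpoint. A generic Python sketch of such a one-at-a-time sensitivity loop; run_consequence_model is a hypothetical stand-in, not the MACCS2 interface:

        def run_consequence_model(params: dict) -> float:
            """Hypothetical stand-in for an off-site consequence calculation.

            Returns a single health-effect endpoint (e.g. latent cancer risk)."""
            return (params["source_term"] * params["dose_factor"]) / params["evacuation_speed"]

        baseline = {"source_term": 1.0, "dose_factor": 0.05, "evacuation_speed": 2.0}
        base_result = run_consequence_model(baseline)

        # Perturb each parameter by +20% in turn and record the relative change in output.
        for name in baseline:
            perturbed = dict(baseline, **{name: baseline[name] * 1.2})
            rel_change = (run_consequence_model(perturbed) - base_result) / base_result
            print(f"{name:>18}: {rel_change:+.1%} change in endpoint for a +20% input change")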

  7. Effect of hardness values on the creep rupture strength in a Mod.9Cr1Mo steel

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Y. S.; Ryu, S. H.; Kong, B. O.; Kim, J. T. [Doosan Heavy Industries and Construction, Changwon (Korea, Republic of)]

    2003-07-01

    The modified 9Cr-1Mo steel identified as T91, P91, and F91 in the ASME specification has been widely used for the construction of modern power plants. The available data on the influence of process parameters during manufacturing and fabrication on its properties are not sufficient. In this study, the influence of various thermal cycles on the hardness and the creep rupture strength was analyzed in the base metal and in weldments made in tube and pipe of a Mod.9Cr-1Mo steel. Material with a low hardness of 155 Hv showed creep rupture strength below the allowable stresses for T91 base metal in the ASME specification. This low value was attributed to the fully recovered dislocation structure and the weakening of precipitation hardening associated with the abnormal thermal cycles.

  8. 14 CFR 25.613 - Material strength properties and material design values.

    Science.gov (United States)

    2010-01-01

    ... following probability: (1) Where applied loads are eventually distributed through a single member within an... statistical basis. (b) Material design values must be chosen to minimize the probability of structural... probability with 95 percent confidence. (2) For redundant structure, in which the failure of individual...

  9. Positive predictive value of diagnosis coding for hemolytic anemias in the Danish National Patient Register.

    Science.gov (United States)

    Hansen, Dennis Lund; Overgaard, Ulrik Malthe; Pedersen, Lars; Frederiksen, Henrik

    2016-01-01

    The nationwide public health registers in Denmark provide a unique opportunity for evaluation of disease-associated morbidity if the positive predictive values (PPVs) of the primary diagnosis are known. The aim of this study was to evaluate the predictive values of hemolytic anemias registered in the Danish National Patient Register. All patients with a first-ever diagnosis of hemolytic anemia from either specialist outpatient clinic contact or inpatient admission at Odense University Hospital from January 1994 through December 2011 were considered for inclusion. Patients with mechanical reason for hemolysis such as an artificial heart valve, and patients with vitamin-B12 or folic acid deficiency were excluded. We identified 412 eligible patients: 249 with a congenital hemolytic anemia diagnosis and 163 with acquired hemolytic anemia diagnosis. In all, hemolysis was confirmed in 359 patients, yielding an overall PPV of 87.1% (95% confidence interval [CI]: 83.5%-90.2%). A diagnosis could be established in 392 patients of whom 355 patients had a hemolytic diagnosis. Diagnosis was confirmed in 197 of the 249 patients with congenital hemolytic anemia, yielding a PPV of 79.1% (95% CI: 73.5%-84.0%). Diagnosis of acquired hemolytic anemia could be confirmed in 136 of the 163 patients, resulting in a PPV of 83.4% (95% CI: 76.8%-88.8%). For hemoglobinopathy PPV was 84.1% (95% CI: 77.4%-89.4%), for hereditary spherocytosis PPV was 80.6% (95% CI: 69.5%-88.9%), and for autoimmune hemolytic anemia PPV was 78.4% (95% CI: 70.4%-85.0%). The PPV of hemolytic anemias was moderately high. The PPVs were comparable in the three main categories of overall hemolysis, and congenital and acquired hemolytic anemia.
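
    For readers who want to reproduce this kind of figure, the sketch below computes a positive predictive value with a Wilson score interval; the record does not state which interval method the authors used, so the Wilson interval is only a common, reasonable stand-in.

    import math

    def ppv_with_ci(confirmed, total, z=1.96):
        # Positive predictive value with an approximate 95% Wilson score interval.
        p = confirmed / total
        denom = 1 + z ** 2 / total
        centre = (p + z ** 2 / (2 * total)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / total + z ** 2 / (4 * total ** 2))
        return p, centre - half, centre + half

    # Overall figure from the abstract: 359 confirmed of 412 registered cases.
    print(ppv_with_ci(359, 412))  # ~0.871, interval close to the reported 83.5%-90.2%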

  10. Definition of the ethical values and ethics codes for Turkish midwifery: a focused group study in kocaeli.

    Science.gov (United States)

    Berkiten Ergin, Ayla; Ozcan, Müesser; Ersoy, Nermin; Acar, Zeynep

    2013-09-01

    The independent roles of midwives have not been properly defined, and midwifery ethical values and moral codes proper to Turkish culture have not been developed. The absence of legal regulations concerning midwifery has negatively affected midwifery in the process of professionalization. The purpose of this study was to identify the professional values of midwifery in Turkey. A focus group was created with the participation of nine midwives working at two state hospitals and a university hospital that provide birth services for women in Kocaeli, which is the most important industrial city in Turkey. The opinions of the midwives on the characteristics that a good midwife should possess and the professional values that a good midwife should observe were collected via in-depth interviews. The interviews were recorded. A total of three meetings were held with the participants. Finally, the notes taken by the reporter during these interviews were rearranged, and the recordings were transcribed by the researchers. The characteristics suggested by the participants were classified into three categories: professional, personal, and interpersonal. Professional competence, capacity to properly inform interested parties, trustworthiness, respect for individuals and human dignity, and empathy were the most commonly named characteristics. As for the professional values of midwifery, professional competence, trustworthiness, responsibility, maximum benefit, and protection of privacy were the most often identified. Midwives also reported that most of the difficulties they faced in the exercise of daily tasks concerned protecting the privacy of their patients as well as the integrity and prestige of the profession, achieving the maximum benefit and least harm for patients, and providing a just and equal service. The professional values mentioned by the participant midwives were similar to the values proposed by international professional organizations. But there were some

  11. Positive predictive value of ICD-9th codes for upper gastrointestinal bleeding and perforation in the Sistema Informativo Sanitario Regionale database.

    Science.gov (United States)

    Cattaruzzi, C; Troncon, M G; Agostinis, L; García Rodríguez, L A

    1999-06-01

    We identified patients whose records in the Sistema Informativo Sanitario Regionale database in the Italian region of Friuli-Venezia Giulia showed a code for upper gastrointestinal bleeding (UGIB) or perforation according to the International Classification of Diseases, 9th revision (ICD-9). The validity of site- and lesion-specific codes (531 to 534) and nonspecific codes (5780, 5781, and 5789) was ascertained through manual review of hospital clinical records. The initial group consisted of 1779 potential cases of UGIB identified by one of these codes. First, positive predictive values (PPVs) were calculated in a random sample. Because of the high PPVs observed for the 531 and 532 codes, additional hospital charts were requested only for the remaining potential cases with 533, 534, and 578 ICD-9 codes. The overall PPV reached a high of 97% for 531 and 532 site-specific codes, 84% for 534 site-specific codes, and 80% for 533 lesion-specific codes, and a low of 59% for nonspecific codes. These data suggest a considerable research potential for this new computerized health care database in Southern Europe.

  12. Nonlinear QR code based optical image encryption using spiral phase transform, equal modulus decomposition and singular value decomposition

    Science.gov (United States)

    Kumar, Ravi; Bhaduri, Basanta; Nishchal, Naveen K.

    2018-01-01

    In this study, we propose a quick response (QR) code based nonlinear optical image encryption technique using the spiral phase transform (SPT), equal modulus decomposition (EMD), and singular value decomposition (SVD). First, the primary image is converted into a QR code and then multiplied by a spiral phase mask (SPM). Next, the product is spiral phase transformed with a particular spiral phase function, and the EMD is performed on the output of the SPT, which results in two complex images, Z1 and Z2. Of these, Z1 is further Fresnel propagated over a distance d, and Z2 is reserved as a decryption key. Afterwards, SVD is performed on the Fresnel-propagated output to obtain three decomposed matrices, i.e., one diagonal matrix and two unitary matrices. The two unitary matrices are modulated with two different SPMs, and then the inverse SVD is performed using the diagonal matrix and the modulated unitary matrices to get the final encrypted image. Numerical simulation results confirm the validity and effectiveness of the proposed technique. The proposed technique is robust against noise attack, specific attack, and brute-force attack. Simulation results are presented in support of the proposed idea.
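
    The SVD-and-modulation step can be pictured with a few lines of NumPy; the sketch below covers only that step, and the input field and the random phase masks standing in for the spiral phase masks are placeholders (the SPT, EMD, and Fresnel-propagation stages are omitted).

    import numpy as np

    rng = np.random.default_rng(0)
    # Placeholder complex field standing in for the Fresnel-propagated output Z1.
    field = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))

    # Two illustrative random phase masks standing in for the SPMs.
    spm1 = np.exp(1j * 2 * np.pi * rng.random((64, 64)))
    spm2 = np.exp(1j * 2 * np.pi * rng.random((64, 64)))

    # SVD of the field, element-wise modulation of the unitary matrices,
    # then inverse SVD with the unchanged diagonal matrix.
    U, s, Vh = np.linalg.svd(field)
    encrypted = (U * spm1) @ np.diag(s) @ (Vh * spm2)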

  13. A compilation of radionuclide transfer factors for the plant, meat, milk, and aquatic food pathways and the suggested default values for the RESRAD code

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.Y.; Biwer, B.M.; Yu, C.

    1993-08-01

    The ongoing development and revision of the RESRAD computer code at Argonne National Laboratory requires update of radionuclide transfer factors for the plant, meat, milk, and aquatic food pathways. Default values for these transfer factors used in published radiological assessment reports are compiled and compared with values used in RESRAD. The differences among the reported default values used in different radiological assessment codes and reports are also discussed. In data comparisons, values used in more recent reports are given more weight because more recent experimental work tends to be conducted under better-defined laboratory or field conditions. A new default value is suggested for RESRAD if one of the following conditions is met: (1) values used in recent reports are an order of magnitude higher or lower than the default value currently used in RESRAD, or (2) the same default value is used in several recent radiological assessment reports.
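
    The two update criteria quoted above can be written as a small check; in the sketch below the threshold used for "several recent reports" (three) and the function name are assumptions for illustration only.

    import math

    def needs_new_default(current_default, recent_values, consensus_count=3):
        # Criterion (1): a recent report uses a value at least an order of
        # magnitude higher or lower than the current RESRAD default.
        off_by_order = any(abs(math.log10(v / current_default)) >= 1.0
                           for v in recent_values)
        # Criterion (2): the same (different) value appears in several reports.
        counts = {}
        for v in recent_values:
            counts[v] = counts.get(v, 0) + 1
        consensus = any(n >= consensus_count
                        for v, n in counts.items() if v != current_default)
        return off_by_order or consensus

    # Example: current default 0.02, several recent reports use 0.2.
    print(needs_new_default(0.02, [0.2, 0.2, 0.2, 0.05]))  # True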

  14. Acceptance and valued living as critical appraisal and coping strengths for caregivers dealing with terminal illness and bereavement.

    Science.gov (United States)

    Davis, Esther L; Deane, Frank P; Lyons, Geoffrey C B

    2015-04-01

    Informal caregivers of palliative care patients play an essential role in the coordination of care for patients during their final phases of life. However, undertaking a caregiving role can have enduring psychological consequences for caregivers and interfere with functioning. Studies have investigated a variety of factors associated with individual differences in caregiver psychosocial outcomes, but little is known about their relative impact, and there is a need for guiding models to support research in this area. A review of the literature was conducted on factors influencing the psychological distress and grief of caregivers. Drawing from acceptance and commitment therapy (ACT) and Stroebe and colleagues' integrative risk factor framework, we developed a process model to describe individual differences in caregiver psychological distress and grief. The model presents caregiver psychological distress and grief as functions of death attitudes and communication about death and dying, mediated by acceptance and valued living from an ACT perspective. An outline of the empirical and theoretical underpinnings for each component in the model is provided. Significance of results: The presented model is an inherently strengths-based model that is concordant with acceptance- and values-based (ACT) interventions to facilitate coping in caregivers.

  15. Adherence to a standardized protocol for measuring grip strength and appropriate cut-off values in adults over 65 years with sarcopenia: a systematic review protocol.

    Science.gov (United States)

    Fox, Benjamin; Henwood, Tim; Schaap, Laura; Bruyère, Olivier; Reginster, Jean-Yves; Beaudart, Charlotte; Buckinx, Fanny; Roberts, Helen; Cooper, Cyrus; Cherubini, Antonio; dellʼAquilla, Giuseppina; Maggio, Marcello; Volpato, Stefano

    2015-10-01

    The objective of this review is to examine the use of grip strength analysis in well and unwell populations in adults 65 years and over as a tool to establish muscle strength in sarcopenia. More specifically, the main review question is: 1. What protocol, if any, is most commonly used among older adults with sarcopenia, and does this match the standardized protocol suggested in 2011 by Roberts et al.? Secondary review questions are: 2. What are the reported cut-off values being used to determine sarcopenia in older adults, with consideration for ethnic and gender variability? 3. Is grip strength, as a tool to measure muscle strength, suitable for people with common comorbidities and geriatric syndromes, such as osteoarthritis, often associated with sarcopenia? Sarcopenia, a commonly used concept in geriatrics and gerontology, is characterized by a loss of muscle mass, muscle strength and/or physical functioning. Prevalence rates vary between 1% and 39% in community-dwelling older populations and between 14% and 33% in long-term care populations. Several epidemiological studies have shown the association of sarcopenia with adverse health outcomes such as falls, disability, hospitalization and mortality. Originally, sarcopenia referred to the loss of muscle mass with aging; the definition was later complemented with loss of muscle strength and physical functioning. In 2010, the European Working Group on Sarcopenia in Older People (EWGSOP) reported a consensus definition of sarcopenia, which included measurement of low muscle mass and low muscle function (strength or physical performance). This consensus definition can be used to identify sarcopenia patients in clinical practice and to select individuals for clinical trials. Well-designed clinical trials could ultimately lead to effective treatment and prevention strategies for sarcopenia. Since the publication of the consensus report, many studies have adopted this definition, which could potentially lead to better comparison of results between

  16. Emerging putative associations between non-coding RNAs and protein-coding genes in Neuropathic Pain. Added value from re-using microarray data.

    Directory of Open Access Journals (Sweden)

    Enrico Capobianco

    2016-10-01

    Full Text Available Regeneration of injured nerves is likely occurring in the peripheral nervous system, but not in the central nervous system. Although protein-coding gene expression has been assessed during nerve regeneration, little is currently known about the role of non-coding RNAs (ncRNAs). This leaves open questions about the potential effects of ncRNAs at transcriptome level. Due to the limited availability of human neuropathic pain data, we have identified the most comprehensive time-course gene expression profile referred to sciatic nerve injury, and studied in a rat model, using two neuronal tissues, namely dorsal root ganglion (DRG) and sciatic nerve (SN). We have developed a methodology to identify differentially expressed bioentities starting from microarray probes, and re-purposing them to annotate ncRNAs, while analyzing the expression profiles of protein-coding genes. The approach is designed to reuse microarray data and perform first profiling and then meta-analysis through three main steps. First, we used contextual analysis to identify what we considered putative or potential protein coding targets for selected ncRNAs. Relevance was therefore assigned to differential expression of neighbor protein-coding genes, with neighborhood defined by a fixed genomic distance from long or antisense ncRNA loci, and of parent genes associated with pseudogenes. Second, connectivity among putative targets was used to build networks, in turn useful to conduct inference at interactomic scale. Last, network paths were annotated to assess relevance to neuropathic pain. We found significant differential expression in long intergenic ncRNAs (32 lincRNAs in SN and 8 in DRG), antisense RNAs (31 asRNAs in SN and 12 in DRG), and pseudogenes (456 in SN and 56 in DRG). In particular, contextual analysis centered on pseudogenes revealed some targets with known association to neurodegeneration and/or neurogenesis processes. While modules of the olfactory receptors were clearly

  17. A review of the empirical evidence of the value of structuring and coding of clinical information within electronic health records for direct patient care

    Directory of Open Access Journals (Sweden)

    Dipak Kalra

    2013-05-01

    Full Text Available Background The case has historically been presented that structured and/or coded electronic health records (EHRs) benefit direct patient care, but the evidence base for this is not well documented. Methods We searched for evidence of direct patient care value from the use of structured and/or coded information within EHRs. We interrogated nine international databases from 1990 to 2011. Value was defined using the Institute of Medicine’s six areas for improvement for healthcare systems: effectiveness, safety, patient-centredness, timeliness, efficiency and equitability. We included studies satisfying the Cochrane Effective Practice and Organisation of Care (EPOC) group criteria. Results Of 5016 potentially eligible papers, 13 studies satisfied our criteria: 10 focused on effectiveness, with eight demonstrating potential for improved proxy and actual clinical outcomes if a structured and/or coded EHR was combined with alerting or advisory systems in a focused clinical domain. Three studies demonstrated improvement in safety outcomes. No studies were found reporting value in relation to patient-centredness, timeliness, efficiency or equitability. Conclusions We conclude that, to date, there has been patchy effort to investigate empirically the value from structuring and coding EHRs for direct patient care. Future investments in structuring and coding of EHRs should be informed by robust evidence as to the clinical scenarios in which patient care benefits may be realised.

  18. Predictive value of vertebral artery extracranial color-coded duplex sonography for ischemic stroke-related vertigo

    Directory of Open Access Journals (Sweden)

    Li-Min Liou

    2013-12-01

    Full Text Available Vertigo can be a major presentation of posterior circulation stroke and can be easily misdiagnosed because of its complicated presentation. We thus prospectively assessed the predictive value of vertebral artery extracranial color-coded duplex sonography (ECCS) for the prediction of ischemic stroke-related vertigo. The inclusion criteria were: (1) a sensation of whirling (vertigo); (2) intractable vertigo for more than 1 hour despite appropriate treatment; and (3) the ability to complete cranial magnetic resonance imaging (MRI) and vertebral artery (V2 segment) ECCS studies. Eventually, 76 consecutive participants with vertigo were enrolled from Kaohsiung Municipal Hsiao-Kang Hospital, Kaohsiung, Taiwan between August 2010 and August 2011. Demographic data, neurological symptoms, neurologic examinations, and V2 ECCS were assessed. We chose the parameters of peak systolic velocity (PSV), end diastolic velocity (EDV), PSV/EDV, mean velocity (MV), resistance index (RI), and pulsatility index (PI) to represent the hemodynamics. Values from both sides of the V2 segments were averaged. We then calculated the average RI (aRI), average PI (aPI), average PSV/EDV (aPSV/EDV), and average MV (aMV). Axial and coronal diffusion-weighted MRI findings determined the existence of acute ischemic stroke. We grouped and analyzed participants in two ways (way I and way II analyses) based on the diffusion-weighted MRI findings (to determine whether there was acute stroke) and neurological examinations. Using way I analysis, the MRI(+) group had significantly higher impedance (aRI, aPI, and aPSV/EDV ratio) and lower velocity (aPSV, aEDV, and aMV = (PSV + EDV)/2) compared to the MRI(−) group. The cutoff value/sensitivity/specificity of aPSV, aEDV, aMV, aPI, aRI, and aPSV/EDV between the MRI(+) and MRI(−) groups were 41.15/61.5/66.0 (p = 0.0101), 14.55/69.2/72.0 (p = 0.0003), 29.10/92.1/38.0 (p = 0.0013), 1.07/76.9/64.0 (p = 0.0066), 0.62/76.9/64.0 (p = 0.0076), and 2
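
    The velocity-derived indices listed in the record can be reproduced as below; the abstract gives mean velocity as (PSV + EDV)/2, while the resistance and pulsatility index formulas are the conventional Doppler definitions and are assumed here rather than quoted from the paper.

    def hemodynamic_indices(psv, edv):
        mv = (psv + edv) / 2.0        # mean velocity, as stated in the abstract
        ri = (psv - edv) / psv        # resistance index (conventional definition)
        pi = (psv - edv) / mv         # pulsatility index (conventional definition)
        return {"MV": mv, "RI": ri, "PI": pi, "PSV/EDV": psv / edv}

    # Illustrative velocities in cm/s (not patient data from the study).
    print(hemodynamic_indices(psv=41.0, edv=14.5))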

  19. A Maximum Muscle Strength Prediction Formula Using Theoretical Grade 3 Muscle Strength Value in Daniels et al.’s Manual Muscle Test, in Consideration of Age: An Investigation of Hip and Knee Joint Flexion and Extension

    Directory of Open Access Journals (Sweden)

    Hideyuki Usa

    2017-01-01

    Full Text Available This study attempted to develop a formula for predicting maximum muscle strength values for young, middle-aged, and elderly adults using the theoretical Grade 3 muscle strength value (moment fair, Mf: the static muscular moment needed to support a limb segment against gravity) from the manual muscle test by Daniels et al. A total of 130 healthy Japanese individuals, divided by age group, performed isometric muscle contractions at maximum effort for various movements of hip joint flexion and extension and knee joint flexion and extension; the accompanying resisting force was measured, and the maximum muscle strength value (moment max, Mm) was calculated. Body weight and limb segment lengths (thigh and lower leg length) were measured, and Mf was calculated from these anthropometric measures by theoretical calculation. There was a linear correlation between Mf and Mm for each of the four movement types in all groups, except knee flexion in the elderly group. However, the formula for predicting maximum muscle strength was not sufficiently accurate for middle-aged and elderly adults, suggesting that the formula obtained in this study is applicable to young adults only.
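
    A hedged sketch of the Mf idea follows: the static moment needed to hold a limb segment horizontal against gravity, computed from body mass and segment length. The segment-mass and centre-of-mass fractions below come from generic anthropometric tables and are assumptions, not the coefficients used in the study.

    G = 9.81  # gravitational acceleration, m/s^2

    def moment_fair(body_mass_kg, segment_length_m, mass_fraction, com_fraction):
        # Mf = segment weight x lever arm of the segment's centre of mass (N*m).
        segment_mass = mass_fraction * body_mass_kg
        lever_arm = com_fraction * segment_length_m
        return segment_mass * G * lever_arm

    # Example: shank of a 70 kg adult, 0.40 m long, with illustrative fractions
    # (~4.65% of body mass, centre of mass at ~43% of segment length).
    print(moment_fair(70.0, 0.40, mass_fraction=0.0465, com_fraction=0.433))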

  20. A comprehensive strength testing protocol offers no clinical value in predicting risk of hamstring injury: a prospective cohort study of 413 professional football players.

    Science.gov (United States)

    van Dyk, Nicol; Bahr, Roald; Burnett, Angus F; Whiteley, Rod; Bakken, Arnhild; Mosler, Andrea; Farooq, Abdulaziz; Witvrouw, Erik

    2017-12-01

    Hamstring injuries remain prevalent across a number of professional sports. In football, the incidence has even increased by 4% per year at the Champions League level over the last decade. The role of muscle strength or strength ratios and their association with risk of hamstring injury remain restricted by small sample sizes and inconclusive results. The purpose of this study is to identify risk factors for hamstring injury in professional football players in an adequately powered, prospective cohort study. Using both established (isokinetic) and novel (eccentric hamstring test device) measures of muscle strength, we aimed to investigate the relationship between these strength characteristics over the entire range of motion with risk of hamstring injury. All teams (n=18) eligible to compete in the premier football league in Qatar underwent a comprehensive strength assessment during their annual periodic health evaluation at Aspetar Orthopaedic and Sports Medicine Hospital in Doha, Qatar. Variables included isokinetic strength, Nordic hamstring exercise strength and dynamic hamstring: quadriceps ratios. Of the 413 players included (68.2% of all league players), 66 suffered a hamstring injury over the two seasons. Only isokinetic quadriceps concentric at 300°/s (adjusted for bodyweight) was associated with risk of hamstring injury when considered categorically. Age, body mass and playing position were also associated with risk of hamstring injury. None of the other 23 strength variables examined were found to be associated with hamstring injury. The clinical value of isolated strength testing is limited, and its use in musculoskeletal screening to predict future hamstring injury is unfounded. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. Reference values of grip strength measured with a Jamar dynamometer in 1526 adults with intellectual disabilities and compared to adults without intellectual disability

    NARCIS (Netherlands)

    Cuesta-Vargas, A. (Antonio); T.I.M. Hilgenkamp (Thessa)

    2015-01-01

    Aim: The aim of this study was to investigate grip strength in a large sample of people with intellectual disabilities, to establish reference values for adults with intellectual disabilities (ID), and to compare them to adults without intellectual disability. Methods: This study analysed

  2. Software testing and source code for the calculation of clearance values. Final report; Erprobung von Software und Quellcode zur Berechnung von Freigabewerten. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Artmann, Andreas; Meyering, Henrich

    2016-11-15

    The GRS research project was aimed at testing the appropriateness of the software package ''residual radioactivity'' (RESRAD) for the calculation of clearance values according to German and European regulations. Comparative evaluations were performed with RESRAD-OFFSITE, the code SiWa-PRO DSS used by GRS, and the GRS program code ARTM. It is recommended to use RESRAD-OFFSITE for comparative calculations. The dose-relevant air-path dispersion of radionuclides should not be modeled with RESRAD-OFFSITE; the use of ARTM is recommended instead. The sensitivity analysis integrated into RESRAD-OFFSITE allows fast identification of the crucial parameters.

  3. Positive predictive values of the International Classification of Disease, 10th edition diagnoses codes for diverticular disease in the Danish National Registry of Patients

    Directory of Open Access Journals (Sweden)

    Rune Erichsen

    2010-10-01

    Full Text Available Rune Erichsen(1), Lisa Strate(2), Henrik Toft Sørensen(1), John A Baron(3). (1) Department of Clinical Epidemiology, Aarhus University Hospital, Denmark; (2) Division of Gastroenterology, University of Washington, Seattle, WA, USA; (3) Departments of Medicine and of Community and Family Medicine, Dartmouth Medical School, NH, USA. Objective: To investigate the accuracy of diagnostic coding for diverticular disease in the Danish National Registry of Patients (NRP). Study design and setting: At Aalborg Hospital, Denmark, with a catchment area of 640,000 inhabitants, we identified 100 patients recorded in the NRP with a diagnosis of diverticular disease (International Classification of Disease codes, 10th revision [ICD-10] K572–K579) during the 1999–2008 period. We assessed the positive predictive value (PPV) as a measure of the accuracy of discharge codes for diverticular disease, using information from discharge abstracts and outpatient notes as the reference standard. Results: Of the 100 patients coded with diverticular disease, 49 had complicated diverticular disease, whereas 51 had uncomplicated diverticulosis. For the overall diagnosis of diverticular disease (K57), the PPV was 0.98 (95% confidence intervals [CIs]: 0.93, 0.99). For the more detailed subgroups of diagnosis indicating the presence or absence of complications (K573–K579), the PPVs ranged from 0.67 (95% CI: 0.09, 0.99) to 0.92 (95% CI: 0.52, 1.00). The diagnosis codes did not allow accurate identification of uncomplicated disease or any specific complication. However, the combined ICD-10 codes K572, K574, and K578 had a PPV of 0.91 (95% CI: 0.71, 0.99) for any complication. Conclusion: The diagnosis codes in the NRP can be used to identify patients with diverticular disease in general; however, they do not accurately discern patients with uncomplicated diverticulosis or with specific diverticular complications. Keywords: diverticulum, colon, diverticulitis, validation studies

  4. Systematic Reviews and Meta-Analyses - Literature-based Recommendations for Evaluating Strengths, Weaknesses, and Clinical Value.

    Science.gov (United States)

    Beitz, Janice M; Bolton, Laura L

    2015-11-01

    Good quality systematic reviews (SRs) summarizing best available evidence can help inform clinical decisions, improving patient and wound outcomes. Weak SRs can misinform readers, undermining care decisions and evidence-based practice. To examine the strengths and weaknesses of SRs and meta-analyses and the role of SRs in contemporary evidence-based wound care practice, and using the search terms systematic review, meta-analysis, and evidence-based practice, the authors searched Medline and the Cumulative Index to Nursing and Allied Health Literature (CINAHL) for important terminology and recommendations to help clinicians evaluate SRs with meta-analysis. Reputable websites, recent textbooks, and synthesized available literature also were reviewed to describe and summarize SR strengths and weaknesses. After developing a checklist for critically evaluating SR objectives, inclusion/exclusion criteria, study quality, data extraction and synthesis methods, meta-analysis homogeneity, accuracy of results, interpretation, and consistency between significant findings and abstract or conclusions, the checklist was applied to topical wound care SRs identified in Cochrane and MEDLINE searches. Best available evidence included in the SRs from 169 randomized controlled trials on 11,571 patients supporting topical intervention healing effects on burns, surgical sites, and diabetic, venous, or pressure ulcers was summarized and showed SRs and clinical trials can demonstrate different outcomes because the information/data are compiled differently. The results illustrate how evidence insufficient to support firm conclusions may still meet immediate needs to guide carefully considered clinical wound and patient care decisions while encouraging better future science.

  5. Protocol for validating cardiovascular and cerebrovascular ICD-9-CM codes in healthcare administrative databases: the Umbria Data Value Project

    Science.gov (United States)

    Cozzolino, Francesco; Orso, Massimiliano; Mengoni, Anna; Cerasa, Maria Francesca; Eusebi, Paolo; Ambrosio, Giuseppe; Montedori, Alessandro

    2017-01-01

    Introduction Administrative healthcare databases can provide a comprehensive assessment of the burden of diseases in terms of major outcomes, such as mortality, hospital readmissions and use of healthcare resources, thus providing answers to a wide spectrum of research questions. However, a crucial issue is the reliability of information gathered. Aim of this protocol is to validate International Classification of Diseases, 9th Revision—Clinical Modification (ICD-9-CM) codes for major cardiovascular diseases, including acute myocardial infarction (AMI), heart failure (HF), atrial fibrillation (AF) and stroke. Methods and analysis Data from the centralised administrative database of the entire Umbria Region (910 000 residents, located in Central Italy) will be considered. Patients with a first hospital discharge for AMI, HF, AF or stroke, between 2012 and 2014, will be identified in the administrative database using the following groups of ICD-9-CM codes located in primary position: (1) 410.x for AMI; (2) 427.31 for AF; (3) 428 for HF; (4) 433.x1, 434 (excluding 434.x0), 436 for ischaemic stroke, 430 and 431 for haemorrhagic stroke (subarachnoid haemorrhage and intracerebral haemorrhage). A random sample of cases, and of non-cases, will be selected, and the corresponding medical charts retrieved and reviewed for validation by pairs of trained, independent reviewers. For each condition considered, case adjudication of disease will be based on symptoms, laboratory and diagnostic tests, as available in medical charts. Divergences will be resolved by consensus. Sensitivity and specificity with 95% CIs will be calculated. Ethics and dissemination Research protocol has been granted approval by the Regional Ethics Committee. Study results will be disseminated widely through peer-reviewed publications and presentations at national and international conferences. PMID:28360241
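
    Once cases and non-cases have been adjudicated against the medical charts, the planned sensitivity and specificity follow directly from the 2x2 counts; the sketch below uses a simple normal-approximation interval, which is an assumption since the protocol does not state its CI method, and the counts in the example are invented.

    import math

    def proportion_ci(k, n, z=1.96):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)

    def sensitivity_specificity(tp, fn, tn, fp):
        # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
        return {"sensitivity": proportion_ci(tp, tp + fn),
                "specificity": proportion_ci(tn, tn + fp)}

    # Illustrative chart-review counts for one condition (not study data).
    print(sensitivity_specificity(tp=90, fn=10, tn=190, fp=10))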

  6. The persuasive strength of values, reputation, and interest arguments for promoting ethical behavior in a global corporate setting

    DEFF Research Database (Denmark)

    Trapp, Leila

    2010-01-01

    This paper examines survey results regarding staff evaluations of various company-issued arguments used to promote ethical behavior in a global corporate setting. The aim of this is to question the appropriateness of approaching business ethics communication from within a corporate communication......, reveal that although there are some important differences between affiliates, there is also an impressive degree of agreement that corporate identity, values, and reputation are important sources of motivation for ethical behavior. These findings provide practical guidance for the development...... of persuasive business ethics programs while strengthening the corporate communication stance that consistent global communication has its benefits, at least in the case of global, values-based companies....

  7. A Method to Determine Lankford Coefficients (R-Values) for Ultra High Strength Low Alloy (Uhsla) Steels

    Science.gov (United States)

    Gösling, M.

    2017-09-01

    For ultra high strength low alloy (UHSLA) steels it is difficult to determine Lankford parameters, since the measurement of a stable strain ratio is often not possible. This report presents a method for determining Lankford coefficients for UHSLA steels. The method is based on a combination of a theoretical material model and experience from a material database. The Hill’48 yield condition is used to calculate the Lankford coefficients as a function of the yield stress. An empirical model based on the BILSTEIN material database is used to predict the anisotropy. The result from the earing test is used as an input parameter for the empirical model. The method is first checked using data from tensile tests, and the predicted Lankford coefficients are compared with measured Lankford coefficients. In a further step, the method is applied to low-alloy steels with a yield stress of more than 900 MPa, for which the Lankford coefficients could not be measured by tensile tests. The predicted Lankford coefficients are used in the numerical simulation of the earing test and compared with experimental results. In summary, the method presented here is suitable for predicting Lankford coefficients in cases where direct measurement is not possible.
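
    For orientation, the textbook Hill'48 relations linking the anisotropy parameters F, G, H, N to the Lankford coefficients are sketched below; this is the generic relation only, not the BILSTEIN empirical model or the yield-stress dependence developed in the report.

    def lankford_from_hill48(F, G, H, N):
        # Standard Hill'48 expressions for the rolling (r0), diagonal (r45)
        # and transverse (r90) Lankford coefficients.
        r0 = H / G
        r90 = H / F
        r45 = N / (F + G) - 0.5
        return r0, r45, r90

    # Isotropy check: F = G = H = 0.5 and N = 1.5 give r0 = r45 = r90 = 1.
    print(lankford_from_hill48(0.5, 0.5, 0.5, 1.5))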

  8. Separate mechanisms for magnitude and order processing in the spatial-numerical association of response codes (SNARC) effect: The strange case of musical note values.

    Science.gov (United States)

    Prpic, Valter; Fumarola, Antonia; De Tommaso, Matteo; Luccio, Riccardo; Murgia, Mauro; Agostini, Tiziano

    2016-08-01

    The spatial-numerical association of response codes (SNARC) effect is considered an evidence of the association between numbers and space, with faster left key-press responses to small numbers and faster right key-press responses to large numbers. We examined whether visually presented note values produce a SNARC-like effect. Differently from numbers, note values are represented as a decreasing left-to-right progression, allowing us to disambiguate the contribution of order and magnitude in determining the direction of the effect. Musicians with formal education performed a note value comparison in Experiment 1 (direct task), a line orientation judgment in Experiment 2 (indirect task), and a detection task in Experiment 3 (indirect task). When note values were task relevant (direct task), participants responded faster to large note values with the left key-press, and vice versa. Conversely, when note values were task irrelevant (indirect tasks), the direction of this association was reversed. This evidence suggests the existence of separate mechanisms underlying the SNARC effect. Namely, an Order-Related Mechanism (ORM) and a Magnitude-Related Mechanism (MRM) that are revealed by different task demands. Indeed, according to a new model we proposed, ordinal and magnitude related information appears to be preferentially involved in direct and indirect tasks, respectively. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. The predictive value of ICD-10 diagnostic coding used to assess Charlson comorbidity index conditions in the population-based Danish National Registry of Patients

    Directory of Open Access Journals (Sweden)

    Lash Timothy L

    2011-05-01

    Full Text Available Abstract Background The Charlson comorbidity index is often used to control for confounding in research based on medical databases. There are few studies of the accuracy of the codes obtained from these databases. We examined the positive predictive value (PPV) of the ICD-10 diagnostic coding in the Danish National Registry of Patients (NRP) for the 19 Charlson conditions. Methods Among all hospitalizations in Northern Denmark between 1 January 1998 and 31 December 2007 with a first-listed diagnosis of a Charlson condition in the NRP, we selected 50 hospital contacts for each condition. We reviewed discharge summaries and medical records to verify the NRP diagnoses, and computed the PPV as the proportion of confirmed diagnoses. Results A total of 950 records were reviewed. The overall PPV for the 19 Charlson conditions was 98.0% (95% CI; 96.9, 98.8). The PPVs ranged from 82.0% (95% CI; 68.6%, 91.4%) for diabetes with diabetic complications to 100% (one-sided 97.5% CI; 92.9%, 100%) for congestive heart failure, peripheral vascular disease, chronic pulmonary disease, mild and severe liver disease, hemiplegia, renal disease, leukaemia, lymphoma, metastatic tumour, and AIDS. Conclusion The PPV of NRP coding of the Charlson conditions was consistently high.

  10. Adaptive coding of the value of social cues with oxytocin, an fMRI study in autism spectrum disorder.

    Science.gov (United States)

    Andari, Elissar; Richard, Nathalie; Leboyer, Marion; Sirigu, Angela

    2016-03-01

    The neuropeptide oxytocin (OT) is one of the major targets of research in neuroscience, with respect to social functioning. Oxytocin promotes social skills and improves the quality of face processing in individuals with social dysfunctions such as autism spectrum disorder (ASD). Although one of OT's key functions is to promote social behavior during dynamic social interactions, the neural correlates of this function remain unknown. Here, we combined acute intranasal OT (IN-OT) administration (24 IU) and fMRI with an interactive ball game and a face-matching task in individuals with ASD (N = 20). We found that IN-OT selectively enhanced the brain activity of early visual areas in response to faces as compared to non-social stimuli. OT inhalation modulated the BOLD activity of amygdala and hippocampus in a context-dependent manner. Interestingly, IN-OT intake enhanced the activity of mid-orbitofrontal cortex in response to a fair partner, and insula region in response to an unfair partner. These OT-induced neural responses were accompanied by behavioral improvements in terms of allocating appropriate feelings of trust toward different partners' profiles. Our findings suggest that OT impacts the brain activity of key areas implicated in attention and emotion regulation in an adaptive manner, based on the value of social cues. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk, and distortion; long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk, and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible, and secure services that can carry a multitude of signal types (such as voice, data, and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding usually refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where the speech is reconstructed or synthesized using the received set of codes. A more generic term that applies to these techniques and is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
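
    As a minimal illustration of waveform speech coding in the sense used above, the sketch below applies mu-law companding followed by uniform quantisation; it uses the continuous mu-law formula rather than the segmented G.711 codec, and the signal is a synthetic tone, so all parameters are illustrative.

    import numpy as np

    MU = 255.0

    def mu_law_encode(x, bits=8):
        # x in [-1, 1] -> integer code words after mu-law companding.
        y = np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)
        levels = 2 ** bits
        return np.round((y + 1) / 2 * (levels - 1)).astype(int)

    def mu_law_decode(code, bits=8):
        levels = 2 ** bits
        y = code / (levels - 1) * 2 - 1
        return np.sign(y) * ((1 + MU) ** np.abs(y) - 1) / MU

    t = np.linspace(0, 0.02, 160)                    # 20 ms at 8 kHz
    signal = 0.5 * np.sin(2 * np.pi * 300 * t)       # synthetic "speech-like" tone
    reconstructed = mu_law_decode(mu_law_encode(signal))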

  12. Positive predictive value of the International Classification of Diseases, 10th edition diagnosis codes for anemia caused by bleeding in the Danish National Registry of Patients.

    Science.gov (United States)

    Zalfani, Jihen; Frøslev, Trine; Olsen, Morten; Ben Ghezala, Inès; Gammelager, Henrik; Arendt, Johan Fb; Erichsen, Rune

    2012-01-01

    Valid data on anemia caused by bleeding are needed for epidemiological research and for monitoring health care. The Danish National Registry of Patients (DNRP) is a nationwide medical database with information on all Danish residents' hospital history. We aimed to assess the positive predictive value (PPV) of the diagnostic coding of anemia caused by bleeding in the DNRP. In the DNRP, we identified all patients with International Classification of Disease, 10th edition codes for anemia caused by bleeding (acute: D50.0; chronic: D62.6) at three Danish hospitals from 2000 through 2009. For these patients we computed the PPV using hemoglobin level data from the Aarhus University laboratory database as the reference standard. Anemia was defined by a hemoglobin level less than 7.0 mmol/L for women and less than 8.0 mmol/L for men. We identified 3391 patients in the DNRP with a diagnosis of anemia caused by bleeding. The overall PPV was 95.4% (95% confidence interval [CI]: 94.6%-96.0%). The PPV was 97.6% (95% CI: 96.6%-98.3%) for men and 94.0% (95% CI: 92.9%-94.9%) for women, and the PPV increased with age at diagnosis. The PPV varied according to the type of discharging department, from 89.2% (95% CI: 83.4%-93.4%) in gynecology to 96.8% (95% CI: 94.9%-98.2%) in surgery, and was lower for outpatients compared with inpatients. We found a high PPV of the coding for anemia caused by bleeding in the DNRP. The registry is a valid source of data on anemia caused by bleeding for various purposes, including research and monitoring of health care.

  13. Values and ethical principles for practicing as magistrate/ legal advisor out of the perspective of the codes and national and international statements of principles

    Directory of Open Access Journals (Sweden)

    Marţian Iovan

    2016-10-01

    Full Text Available This article analyses the coordinating and regulating role of moral values and of the deontological code in practicing the profession of magistrate/legal advisor, so that their decisions correspond to the universal imperative of the practical accomplishment of justice and, implicitly, to the public's expectations regarding the efficiency and efficacy of the services delivered by the institutions of the judicial system. The subject is of obvious topical relevance, given the considerable number of cases of violation or distortion of the ethical principles and of the specific deontological norms for legal advisors, and especially for magistrates, that occur in performing the act of justice. The author highlights, through examples, the harmful effects of some magistrates' departures from the ethical principles (independence, impartiality, integrity) stipulated in the most important deontological codes, statements of principles, and national and international conventions. The logical conclusion resulting from the analyses is aimed at perfecting the judicial system, the moral component of legal higher education, and the continuous training and assessment of magistrates.

  14. Long Non-Coding RNA lincRNA-ROR Promotes the Progression of Colon Cancer and Holds Prognostic Value by Associating with miR-145.

    Science.gov (United States)

    Zhou, Peng; Sun, Lixia; Liu, Danfeng; Liu, Changkuo; Sun, Lei

    2016-10-01

    Large intergenic non-coding RNA-ROR (lincRNA-ROR) has been reported to affect the maintenance of induced pluripotent stem cells and embryonic stem cells and to play important roles in human hepatocellular cancer. It contributes to tumorigenesis and metastasis and functions as a competing endogenous RNA (ceRNA) by sponging miR-145 in breast cancer. However, its clinical significance and prognostic value in colon cancer remain unknown. The aim of the present study was to clarify the clinicopathological role and prognostic value of lincRNA-ROR and miR-145 in colon cancer. In the present study, qRT-PCR was performed to measure the expression levels of lincRNA-ROR in colon cancer tissues and cell lines. Then, the clinicopathological significance and prognostic value of lincRNA-ROR were analyzed. LincRNA-ROR expression correlated with pT stage, pN stage, AJCC stage, and vascular invasion. Knockdown of lincRNA-ROR restored the expression of miR-145 and had a significant influence on colon cancer cell proliferation, migration, and invasion. Patients in the high lincRNA-ROR/low miR-145 group had significantly poorer outcomes than those in the low lincRNA-ROR/high miR-145 group. Taken together, overexpression of lincRNA-ROR combined with depletion of miR-145 may have a crucial impact on colon cancer prognosis evaluation and treatment.

  15. Positive predictive value between medical-chart body-mass-index category and obesity versus codes in a claims-data warehouse.

    Science.gov (United States)

    Caplan, Eleanor O; Kamble, Pravin S; Harvey, Raymond A; Smolarz, B Gabriel; Renda, Andrew; Bouchard, Jonathan R; Huang, Joanna C

    2018-01-01

    To evaluate the positive predictive value of claims-based V85 codes for identifying individuals with varying degrees of BMI relative to their measured BMI obtained from medical record abstraction. This was a retrospective validation study utilizing administrative claims and medical chart data from 1 January 2009 to 31 August 2015. Randomly selected samples of patients enrolled in a Medicare Advantage Prescription Drug (MAPD) or commercial health plan and with a V85 claim were identified. The claims-based BMI category (underweight, normal weight, overweight, obese class I-III) was determined via corresponding V85 codes and compared to the BMI category derived from chart abstracted height, weight and/or BMI. The positive predictive values (PPVs) of the claims-based BMI categories were calculated with the corresponding 95% confidence intervals (CIs). The overall PPVs (95% CIs) in the MAPD and commercial samples were 90.3% (86.3%-94.4%) and 91.1% (87.3%-94.9%), respectively. In each BMI category, the PPVs (95% CIs) for the MAPD and commercial samples, respectively, were: underweight, 71.0% (55.0%-87.0%) and 75.9% (60.3%-91.4%); normal, 93.8% (85.4%-100%) and 87.8% (77.8%-97.8%); overweight, 97.4% (92.5%-100%) and 93.5% (84.9%-100%); obese class I, 96.9 (90.9%-100%) and 97.2% (91.9%-100%); obese class II, 97.0% (91.1%-100%) and 93.0% (85.4%-100%); and obese class III, 85.0% (73.3%-96.1%) and 97.1% (91.4%-100%). BMI categories derived from administrative claims, when available, can be used successfully particularly in the context of obesity research.
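
    The chart-derived BMI category used as the reference can be computed as below; the cut-offs follow the usual WHO/CDC convention, which is an assumption since the record does not spell out the exact mapping behind the V85 categories.

    def bmi_category(weight_kg, height_m):
        bmi = weight_kg / height_m ** 2
        if bmi < 18.5:
            return "underweight"
        if bmi < 25:
            return "normal"
        if bmi < 30:
            return "overweight"
        if bmi < 35:
            return "obese class I"
        if bmi < 40:
            return "obese class II"
        return "obese class III"

    # Example: chart-abstracted height and weight for one (hypothetical) patient.
    print(bmi_category(weight_kg=95.0, height_m=1.70))  # obese class I (BMI ~32.9)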

  16. Autocorrelation Structure in the Macaque Dorsolateral, But not Orbital or Polar, Prefrontal Cortex Predicts Response-Coding Strength in a Visually Cued Strategy Task.

    Science.gov (United States)

    Fascianelli, Valeria; Tsujimoto, Satoshi; Marcos, Encarni; Genovesio, Aldo

    2017-12-08

    In previous work, we studied the activity of neurons in the dorsolateral (PFdl), orbital (PFo), and polar (PFp) prefrontal cortex while monkeys performed a strategy task with 2 spatial goals. A cue instructed 1 of 2 strategies in each trial: stay with the previous goal or shift to the alternative goal. Each trial started with a fixation period, followed by a cue. Subsequently, a delay period was followed by a "go" signal that instructed the monkeys to choose one goal. After each choice, feedback was provided. In this study, we focused on the temporal receptive fields of the neurons, as measured by the decay in autocorrelation (time constant) during the fixation period, and examined the relationship with response and strategy coding. The temporal receptive field in PFdl correlated with the response-related but not with the strategy-related modulation in the delay and the feedback periods: neurons with longer time constants in PFdl tended to show stronger and more prolonged response coding. No such correlation was found in PFp or PFo. These findings demonstrate that the temporal specialization of neurons for temporally extended computations is predictive of response coding, and neurons in PFdl, but not PFp or PFo, develop such predictive properties. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
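
    One common way to quantify such a temporal receptive field is to fit an exponential decay to the spike-count autocorrelation across time bins of the fixation period; the bin width, lag range, offset term, and the synthetic test data below are assumptions for illustration, not the paper's analysis settings.

    import numpy as np
    from scipy.optimize import curve_fit

    def exp_decay(lag_ms, amplitude, tau_ms, offset):
        return amplitude * np.exp(-lag_ms / tau_ms) + offset

    def fit_time_constant(spike_counts, bin_ms=50, max_lag_bins=10):
        # spike_counts: array (trials x bins) of fixation-period spike counts.
        lags, autocorr = [], []
        for lag in range(1, max_lag_bins):
            x = spike_counts[:, :-lag].ravel()
            y = spike_counts[:, lag:].ravel()
            lags.append(lag * bin_ms)
            autocorr.append(np.corrcoef(x, y)[0, 1])
        params, _ = curve_fit(exp_decay, np.array(lags), np.array(autocorr),
                              p0=(0.5, 100.0, 0.0), maxfev=10000)
        return params[1]  # estimated time constant in ms

    # Quick check with an AR(1) surrogate whose true time constant is 150 ms.
    rng = np.random.default_rng(1)
    phi = np.exp(-50.0 / 150.0)
    counts = np.zeros((200, 12))          # continuous surrogate for spike counts
    for b in range(1, 12):
        counts[:, b] = phi * counts[:, b - 1] + rng.standard_normal(200)
    print(fit_time_constant(counts))      # roughly 150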

  17. Prioritizing association strength versus value: the influence of self-regulatory modes on means evaluation in single goal and multigoal contexts.

    Science.gov (United States)

    Orehek, Edward; Mauro, Romina; Kruglanski, Arie W; van der Bles, Anne Marthe

    2012-01-01

    Means of goal attainment are said to be multifinal when they are capable of attaining more than 1 goal at the same time. Such means have an advantage over unifinal means because they have the potential to attain greater overall value. However, they have the disadvantage (relative to unifinal means) of diluting the association between the means and each of the goals (Zhang, Fishbach, & Kruglanski, 2007). In turn, diluted association strength is often interpreted as reduced means' instrumentality. Given these tradeoffs between value (favoring a multifinal option) and instrumentality (favoring the unifinal option), the question is under what conditions one or the other would be selected. Based on regulatory mode theory (Higgins, Kruglanski, & Pierro, 2003; Kruglanski et al., 2000), we predicted and found in 5 experiments that individuals operating in a locomotion self-regulatory mode prefer unifinal to multifinal means, whereas individuals operating in an assessment mode prefer multifinal to unifinal means. Implications of these findings for self-regulatory phenomena are discussed.

  18. Twisted Reed-Solomon Codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Puchinger, Sven; Rosenkilde ne Nielsen, Johan

    2017-01-01

    We present a new general construction of MDS codes over a finite field Fq. We describe two explicit subclasses which contain new MDS codes of length at least q/2 for all values of q ≥ 11. Moreover, we show that most of the new codes are not equivalent to a Reed-Solomon code.

  19. Pull strength evaluation of Sn-Pb solder joints made to Au-Pt-Pd and Au thick film structures on low-temperature co-fired ceramic -final report for the MC4652 crypto-coded switch (W80).

    Energy Technology Data Exchange (ETDEWEB)

    Uribe, Fernando; Vianco, Paul Thomas; Zender, Gary L.

    2006-06-01

    A study was performed that examined the microstructure and mechanical properties of 63Sn-37Pb (wt.%, Sn-Pb) solder joints made to thick film layers on low-temperature co-fired ceramic (LTCC) substrates. The thick film layers were combinations of the DuPont 4596 (Au-Pt-Pd) conductor and the DuPont 5742 (Au) conductor, the latter having been deposited between the 4596 layer and the LTCC substrate. Single (1x) and triple (3x) thicknesses of the 4596 layer were evaluated, along with three footprint sizes of the 5742 thick film. The solder joints exhibited excellent solderability of both the copper (Cu) lead and the thick film surface. In all test sample configurations, the 5742 thick film prevented side-wall cracking of the vias. The pull strengths were in the range of 3.4-4.0 lbs, only slightly lower than historical values for alumina (Al2O3) substrates. General (qualitative) observations: (a) Pull strength was maximized when the total number of thick film layers was between two and three. Fewer than two layers did not develop as strong a bond at the thick film/LTCC interface; more than three layers, or an increased footprint area, developed higher residual stresses at the thick film/LTCC interface and in the underlying LTCC material that weakened the joint. (b) Minimizing the area of the weaker 4596/LTCC interface (e.g., a larger 5742 area) improved pull strength. Specific observations: (a) In the presence of vias and the need for the 3x 4596 thick film, the preferred 4596:5742 ratio was 1.0:0.5. (b) For those LTCC components that require the 3x 4596 layer but do not have vias, it is preferred to refrain from using the 5742 layer. (c) In the absence of vias, the highest strength was realized with a 1x thick 5742 layer, a 1x thick 4596 layer, and a footprint ratio of 1.0:1.0.

  20. Cyclone Codes

    OpenAIRE

    Schindelhauer, Christian; Jakoby, Andreas; Köhler, Sven

    2016-01-01

    We introduce Cyclone codes which are rateless erasure resilient codes. They combine Pair codes with Luby Transform (LT) codes by computing a code symbol from a random set of data symbols using bitwise XOR and cyclic shift operations. The number of data symbols is chosen according to the Robust Soliton distribution. XOR and cyclic shift operations establish a unitary commutative ring if data symbols have a length of $p-1$ bits, for some prime number $p$. We consider the graph given by code sym...
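
    A rough sketch of the encoding step described above: each code symbol is the XOR of randomly cyclic-shifted data symbols, with the degree drawn here from the Ideal Soliton distribution as a simpler stand-in for the Robust Soliton distribution the authors use; the value of p, the symbol width, and all names are illustrative.

    import math
    import random

    P = 13
    SYMBOL_BITS = P - 1                    # data symbols are (p - 1)-bit integers
    MASK = (1 << SYMBOL_BITS) - 1

    def cyclic_shift(symbol, shift):
        shift %= SYMBOL_BITS
        return ((symbol << shift) | (symbol >> (SYMBOL_BITS - shift))) & MASK

    def ideal_soliton_degree(k):
        # Stand-in for the Robust Soliton distribution: P(1) = 1/k,
        # P(d) = 1 / (d (d - 1)) for d = 2..k, sampled by inverting the CDF.
        u = random.random()
        if u < 1.0 / k:
            return 1
        return min(max(math.ceil(1.0 / (1.0 + 1.0 / k - u)), 2), k)

    def encode_symbol(data_symbols):
        k = len(data_symbols)
        chosen = random.sample(range(k), ideal_soliton_degree(k))
        out = 0
        for index in chosen:
            out ^= cyclic_shift(data_symbols[index], random.randrange(SYMBOL_BITS))
        return out, chosen

    data = [random.getrandbits(SYMBOL_BITS) for _ in range(8)]
    code_symbol, neighbours = encode_symbol(data)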

  1. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.

  2. Potential value of color-coded dynamic breast-specific gamma-imaging; comparing 99mTc-(V)-DMSA, 99mTc-MIBI, and 99mTc-HDP in a mouse mammary tumor model

    Energy Technology Data Exchange (ETDEWEB)

    Leeuwen, Fijs W.B. van, E-mail: fw.v.leeuwen@nki.n [Departments of Radiology and Nuclear Medicine, Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, 1066 CX Amsterdam (Netherlands); Buckle, Tessa; Batteau, Lukas; Pool, Bert; Sinaasappel, Michiel [Departments of Radiology and Nuclear Medicine, Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, 1066 CX Amsterdam (Netherlands); Jonkers, Jos [Department of Molecular Biology, Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, 1066 CX Amsterdam (Netherlands); Gilhuijs, Kenneth G.A. [Departments of Radiology and Nuclear Medicine, Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, 1066 CX Amsterdam (Netherlands)

    2010-12-15

    Using a mouse mammary tumor model based on orthotopic transplantation of luciferase-expressing mouse ILC cells (KEP1-Luc cells), we evaluated the diagnostic value of three clinically applied tracers: 99mTc-(V)-DMSA, 99mTc-MIBI, and 99mTc-HDP. Uptake of the tracers is compared using static and dynamic imaging procedures. We found that dynamic imaging in combination with pixel-by-pixel color coding has an added value over (high resolution) static imaging procedures. Such dynamic imaging procedures could enhance the potential of breast-specific gamma-imaging.

  3. Strength Training

    Science.gov (United States)

    ... big difference between strength training, powerlifting, and competitive bodybuilding! Strength training uses resistance methods like free weights, ... a person can lift at one time. Competitive bodybuilding involves evaluating muscle definition and symmetry, as well ...

  4. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown

  5. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    Summary of the most important points from the main report: Documentation and evaluation of Coding Class.

  6. code {poems}

    Directory of Open Access Journals (Sweden)

    Ishac Bertran

    2012-08-01

    Full Text Available "Exploring the potential of code to communicate at the level of poetry," the code­ {poems} project solicited submissions from code­writers in response to the notion of a poem, written in a software language which is semantically valid. These selections reveal the inner workings, constitutive elements, and styles of both a particular software and its authors.

  7. The association between patient attitudes and values and the strength of consideration for contralateral prophylactic mastectomy in a population-based sample of breast cancer patients.

    Science.gov (United States)

    Hawley, Sarah T; Griffith, Kent A; Hamilton, Ann S; Ward, Kevin C; Morrow, Monica; Janz, Nancy K; Katz, Steven J; Jagsi, Reshma

    2017-12-01

    Little is known about how the individual decision styles and values of breast cancer patients at the time of treatment decision making are associated with the consideration of different treatment options and specifically with the consideration of contralateral prophylactic mastectomy (CPM). Newly diagnosed patients with early-stage breast cancer who were treated in 2013-2014 were identified through the Surveillance, Epidemiology, and End Results registries of Los Angeles and Georgia and were surveyed approximately 7 months after surgery (n = 2578; response rate, 71%). The primary outcome was the consideration of CPM (strong vs less strong). The association between patients' values and decision styles and strong consideration was assessed with multivariate logistic regression. Approximately one-quarter of women (25%) reported strong/very strong consideration of CPM, and another 29% considered it moderately/weakly. Decision styles, including a rational-intuitive approach to decision making, varied. The factors most valued by women at the time of treatment decision making were as follows: avoiding worry about recurrence (82%) and reducing the need for more surgery (73%). In a multivariate analysis, patients who preferred to make their own decisions, those who valued avoiding worry about recurrence, and those who valued avoiding radiation significantly more often strongly considered CPM (P breast did so less often. Many patients considered CPM, and the consideration was associated with both decision styles and values. The variability in decision styles and values observed in this study suggests that formally evaluating these characteristics at or before the initial treatment encounter could provide an opportunity for improving patient-clinician discussions. Cancer 2017;123:4547-4555. © 2017 American Cancer Society.

  8. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.

  9. COLLISION STRENGTHS AND EFFECTIVE COLLISION STRENGTHS FOR TRANSITIONS WITHIN THE GROUND-STATE CONFIGURATION OF S III

    Energy Technology Data Exchange (ETDEWEB)

    Hudson, C. E.; Ramsbottom, C. A.; Scott, M. P., E-mail: c.hudson@qub.ac.uk, E-mail: c.ramsbottom@qub.ac.uk, E-mail: p.scott@qub.ac.uk [Department of Applied Maths and Theoretical Physics, The Queen's University of Belfast, Belfast BT7 1NN (United Kingdom)

    2012-05-01

    We have carried out a 29-state R-matrix calculation in order to calculate collision strengths and effective collision strengths for the electron impact excitation of S III. The recently developed parallel RMATRX II suite of codes has been used, which performs the calculation in intermediate coupling. Collision strengths have been generated over an electron energy range of 0-12 Ryd, and effective collision strength data have been calculated from these at electron temperatures in the range 1000-100,000 K. Results are presented here for the fine-structure transitions within the ground-state configuration 3s{sup 2}3p{sup 2}, i.e. between the {sup 3}P{sub 0,1,2}, {sup 1}D{sub 2}, and {sup 1}S{sub 0} levels, and the values given resolve a discrepancy between two previous R-matrix calculations.

  10. Prioritizing Association Strength Versus Value : The Influence of Self-Regulatory Modes on Means Evaluation in Single Goal and Multigoal Contexts

    NARCIS (Netherlands)

    Orehek, Edward; Mauro, Romina; Kruglanski, Arie W.; van der Bles, Anne Marthe

    Means of goal attainment are said to be multifinal when they are capable of attaining more than 1 goal at the same time. Such means have an advantage over unifinal means because they have the potential to attain greater overall value. However, they have the disadvantage (relative to unifinal means)

  11. Sharing code.

    Science.gov (United States)

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  12. Multidetector-row computed tomographic evaluation of myocardial perfusion in reperfused chronic myocardial infarction: value of color-coded perfusion map in a porcine model.

    Science.gov (United States)

    Yim, Nam Yeol; Kim, Yun-Hyeon; Choi, Song; Seon, Hyun Ju; Kim, Yeong Cheol; Jeong, Gwang Woo; Min, Byeong In; Lee, Sang Rok; Jeong, Myeong Ho; Kim, Jae Kyu; Park, Jin Gyoon; Kang, Heoung Keun

    2009-04-01

    We aimed to develop color-coded CT perfusion maps (CPM) of infarcted myocardium and assess the utility of CPM in evaluating ischemic heart disease on a cardiac multi-detector CT (MDCT) in a porcine reperfused-myocardial-infarction model. Myocardial infarctions were induced by 30 min occlusions of the proximal left anterior descending coronary artery (LAD) in 17 healthy adult female pigs. First-pass and 5 min-delayed cardiac MDCTs were performed after 4 weeks of LAD occlusion. Myocardial CPMs were obtained by using the CPM program. Triphenyltetrazolium chloride (TTC)-staining was performed on the cardiac specimens. We analyzed the intermodality agreement on the size and location of the myocardial infarctions. TTC staining revealed myocardial infarction in 16 of 17 pigs, and 15 of these (94%) showed matched infarcts on the CPM and first-pass images. The areas of perfusion deficit noted in early arterial phase images and CPM coincided exactly with the areas of poor TTC staining in 12 of 15 pigs (80%). In the three remaining pigs, the areas of poor TTC staining were larger than those of a perfusion deficit demonstrated by either early arterial phase images or CPM. The agreement between these tests is calculated to be moderate to good (k = 0.736, P infarction; CPM was helpful in visualizing the infarcted myocardium.

  13. Analog Coding.

    Science.gov (United States)

    CODING, ANALOG SYSTEMS, INFORMATION THEORY, DATA TRANSMISSION SYSTEMS, TRANSMITTER RECEIVERS, WHITE NOISE, PROBABILITY, ERRORS, PROBABILITY DENSITY FUNCTIONS, DIFFERENTIAL EQUATIONS, SET THEORY, COMPUTER PROGRAMS

  14. On Some Ternary LCD Codes

    OpenAIRE

    Darkunde, Nitin S.; Patil, Arunkumar R.

    2018-01-01

    The main aim of this paper is to study $LCD$ codes. Linear codes with complementary dual ($LCD$) are those codes whose intersection with their dual code is $\{0\}$. In this paper we give an alternative proof of Massey's theorem \cite{8}, which is one of the most important characterizations of $LCD$ codes. Let $LCD[n,k]_3$ denote the maximum of possible values of $d$ among $[n,k,d]$ ternary $LCD$ codes. In \cite{4}, the authors have given an upper bound on $LCD[n,k]_2$ and extended th...
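
    For a concrete check of the defining property, Massey's criterion says a linear code with generator matrix G is LCD exactly when G·G^T is nonsingular over the field. A minimal sketch over GF(3) follows; the matrix below is an illustrative example, not one of the codes from the paper.

        def rank_mod3(M):
            """Rank of an integer matrix over GF(3), by Gaussian elimination."""
            M = [row[:] for row in M]
            rows, cols, r = len(M), len(M[0]), 0
            for c in range(cols):
                piv = next((i for i in range(r, rows) if M[i][c] % 3 != 0), None)
                if piv is None:
                    continue
                M[r], M[piv] = M[piv], M[r]
                inv = 1 if M[r][c] % 3 == 1 else 2      # 1*1 = 1 and 2*2 = 4 = 1 (mod 3)
                M[r] = [(x * inv) % 3 for x in M[r]]
                for i in range(rows):
                    if i != r and M[i][c] % 3:
                        f = M[i][c]
                        M[i] = [(a - f * b) % 3 for a, b in zip(M[i], M[r])]
                r += 1
            return r

        def is_lcd(G):
            """Massey's criterion: the code is LCD iff G G^T has full rank."""
            GGt = [[sum(a * b for a, b in zip(u, v)) % 3 for v in G] for u in G]
            return rank_mod3(GGt) == len(G)

        G = [[1, 0, 1, 0],
             [0, 1, 0, 1]]      # a small ternary [4, 2] generator matrix
        print(is_lcd(G))        # True: G G^T = diag(2, 2) is invertible mod 3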

  15. Divergence coding for convolutional codes

    Directory of Open Access Journals (Sweden)

    Valery Zolotarev

    2017-01-01

    Full Text Available In this paper we propose a new coding/decoding scheme based on the divergence principle. A new divergent multithreshold decoder (MTD) for convolutional self-orthogonal codes contains two threshold elements. The second threshold element decodes the code with a code distance one greater than that for the first threshold element. The error-correcting capability of the new MTD modification is higher than that of the traditional MTD. Simulation results show that the divergent schemes bring the region of their effective operation approximately 0.5 dB closer to channel capacity. Note that if a sufficiently effective Viterbi decoder is included instead of the first threshold element, the divergence principle can achieve even more. Index Terms: error-correcting coding, convolutional code, decoder, multithreshold decoder, Viterbi algorithm.

  16. Assessing the value of and contextual and cultural acceptability of the Strengths and Difficulties Questionnaire (SDQ) in evaluating mental health problems in HIV/AIDS-affected children.

    Science.gov (United States)

    Skinner, Donald; Sharp, Carla; Marais, Lochner; Serekoane, Motsaathebe; Lenka, Molefi

    The Strengths and Difficulties Questionnaire (SDQ) is a robust, powerful and internationally recognised diagnostic screening tool for emotional and behaviour problems among children, with the particular advantage that it can be used by non-health professionals. This makes it useful in a South African context characterized by shortages of professional mental health carers. However the cultural and contextual acceptability and potential uses of the SDQ have not yet been examined in the South African context. The aim of the current study was to evaluate the acceptability of the SDQ in a Sesotho speaking area of South Africa. As part of a larger study to standardise the SDQ for use among Sotho speakers, teachers were asked to use the tool to assess learners in their class. Ten teachers were then asked to write a report on their experience of the SDQ and how useful and applicable they found it for their school setting. These findings were discussed at two later meetings with larger groupings of teachers. Reports were analysed using a modified contextualised interpretative content analysis method. Teachers found the SDQ very useful in the classroom and easy to administer and understand. They found it contextually relevant and particularly useful in gaining an understanding of the learners and the challenges that learners were facing. It further allowed them to differentiate between scholastic and emotional problems, assisting them in developing relationships with the pupils and facilitating accurate referrals. There were very few concerns raised, with the major problem being that it was difficult to assess items concerning contexts outside of the school setting. The teachers expressed interest in obtaining further training in the interpretation of the SDQ and a greater understanding of diagnostic labels so as to assist their learners. The SDQ was found to be acceptable and useful in the context of this very disadvantaged community. The teachers felt it assisted them in

  17. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software development, Speaking Code unfolds an argument to undermine the distinctions between criticism and practice, and to emphasize the aesthetic and political aspects of software studies. Not reducible to its functional aspects, program code mirrors the instability inherent in the relationship of speech...; alternatives to mainstream development, from performances of the live-coding scene to the organizational forms of commons-based peer production; the democratic promise of social media and their paradoxical role in suppressing political expression; and the market’s emptying out of possibilities for free

  18. Reference values for hand grip strength, muscle mass, walking time, and one-leg standing time as indices for locomotive syndrome and associated disability: the second survey of the ROAD study.

    Science.gov (United States)

    Yoshimura, Noriko; Oka, Hiroyuki; Muraki, Shigeyuki; Akune, Toru; Hirabayashi, Naoki; Matsuda, Shinji; Nojiri, Takako; Hatanaka, Kazuhiro; Ishimoto, Yuyu; Nagata, Keiji; Yoshida, Munehito; Tokimura, Fumiaki; Kawaguchi, Hiroshi; Nakamura, Kozo

    2011-11-01

    We established reference values for hand grip strength, muscle mass, walking time, and one-leg standing time as indices reflecting components of locomotive syndrome and associated disability using a large-scale population-based sample from the second survey of the Research on Osteoarthritis/Osteoporosis Against Disability (ROAD) cohort. We measured the above-mentioned indices in 2,468 individuals ≥ 40 years old (826 men, 1,642 women; mean age 71.8 years) during the second visit of the ROAD study. Disability was defined as certified disability according to the long-term care insurance system through public health centres of each municipality. Mean values for hand grip strength (weaker side), muscle mass of the thighs, walking time for 6 m at the usual pace, and the fastest pace for men were 32.7 kg, 7.0 kg, 5.6 s, and 3.7 s, respectively, and those for women were 20.8 kg, 5.2 kg, 5.9 s, and 4.1 s, respectively. The median values for one-leg standing time (weaker side) were 14 s for men and 12 s for women. The prevalence of disability in men aged 65-69, 70-74, 75-79, and ≥ 80 was 0.0, 1.0, 6.3, and 8.8%, respectively, and in women was 3.4, 3.5, 9.2, and 14.7%, respectively. There were significant associations between the presence of disability and walking time for 6 m at the usual pace and at the fastest pace, and between the presence of disability and walking speed. We established reference values for indices reflecting components of locomotive syndrome, and identified significant associations between walking ability and disability.

  19. Strengths only or strengths and relative weaknesses? A preliminary study.

    Science.gov (United States)

    Rust, Teri; Diessner, Rhett; Reade, Lindsay

    2009-10-01

    Does working on developing character strengths and relative character weaknesses cause lower life satisfaction than working on developing character strengths only? The present study provides a preliminary answer. After 76 college students completed the Values in Action Inventory of Strengths (C. Peterson & M. E. P. Seligman, 2004), the authors randomly assigned them to work on 2 character strengths or on 1 character strength and 1 relative weakness. Combined, these groups showed significant gains on the Satisfaction With Life Scale (E. Diener, R. A. Emmons, R. J. Larsen, & S. Griffin, 1985), compared with a 32-student no-treatment group. However, there was no significant difference in gain scores between the 2-strengths group and the 1-character-strength-and-1-relative-character-weakness group. The authors discuss how focusing on relative character weaknesses (along with strengths) does not diminish, and may assist in increasing, life satisfaction.

  20. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  1. Coding labour

    National Research Council Canada - National Science Library

    McCosker, Anthony; Milne, Esther

    2014-01-01

    ... software. Code encompasses the laws that regulate human affairs and the operation of capital, behavioural mores and accepted ways of acting, but it also defines the building blocks of life as DNA...

  2. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Full Text Available Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
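
    The effect of the two codings discussed above is easy to see numerically: with a general mean in the model, the 0/1/2 coding and the centered coding give breeding values that differ only by a constant. A minimal sketch (toy genotypes and marker effects, not from the study):

        import numpy as np

        # 4 individuals x 3 markers, coded 0/1/2 (count of the second allele)
        Z = np.array([[0, 1, 2],
                      [1, 1, 0],
                      [2, 0, 1],
                      [1, 2, 1]], dtype=float)

        # Centered allele coding: subtract the column mean within each marker
        Zc = Z - Z.mean(axis=0)

        u = np.array([0.3, -0.1, 0.2])      # some marker effects
        print(Z @ u)                        # breeding values under 0/1/2 coding
        print(Zc @ u)                       # centered coding: same values, shifted
        print(Z @ u - Zc @ u)               # constant offset, absorbed by the general mean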

  3. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely, however, that national views of good governance reflect different political cultures and institutional heritages. Fourteen national codes of conduct are analyzed. The findings suggest that public values converge and that they match model codes from the United Nations and the European Council as well as conceptions of good governance from other international organizations. While values converge, they are balanced and communicated differently, and seem to some extent to be translated into the national cultures. The set of global public values derived from this analysis includes public interest, regime dignity...

  4. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

    Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how...... of reliability based code calibration of LRFD based design codes....

  5. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R&D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (material behaviour, large deformations, specific loads, unloading and loss-of-load-proportionality indicators, global algorithm, contact and friction); fracture mechanics (G energy release rate, energy release rate in thermo-elasto-plasticity, 3D local energy release rate, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and random dynamics, non-linear dynamics, dynamic sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and the metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  6. Network Coding

    Indian Academy of Sciences (India)

    Network coding is a technique to increase the amount of information flow in a network by making the key observation that information flow is fundamentally different from commodity flow. Whereas, under traditional methods of operation of data networks, intermediate nodes are restricted to simply forwarding their incoming.

  7. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was initiated in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator for the research and development environment Digitalisation in Schools (DiS), Department of School and Learning, Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy, Research Lab: IT and Learning Design (ILD-LAB), Department of Communication and Psychology, Aalborg University in Copenhagen. We followed, evaluated and documented the Coding Class project in the period November 2016 to May 2017...

  8. Network Coding

    Indian Academy of Sciences (India)

    Network Coding. K V Rashmi, Nihar B Shah, P Vijay Kumar. General Article, Volume 15, Issue 7, July 2010, pp 604-621. Permanent link: http://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  9. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes - The Sipser–Spielman Construction. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India. General Article, Volume 10, Issue 1.

  10. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio and/or digital signal processing. It provides a clear connection between the whys, hows and whats, thus enabling a clear view of the necessity, purpose and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: what do we want to achieve, and especially why is this goal important? Resource Information: what information is available, and how can it be useful? Resource Platform: what kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory and acoustic properties and the transmission capacity of devices used. The book goes on to address Solutions: which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  11. Empirically Derived Strength of Residential Roof Structures for Solar Installations.

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, Stephen F.; Sanchez, Alfred; Campos, Ivan A.; Gerstle, Walter H.

    2014-12-01

    Engineering certification for the installation of solar photovoltaic (PV) modules on wood roofs is often denied because existing wood roofs do not meet structural design codes. This work is intended to show that many roofs are actually sufficiently strong given the conservatism in codes, documented allowable strengths, roof structure system effects, and beam composite action produced by joist-sheathing interaction. This report provides results from a testing program to establish the actual load carrying capacity of residential rooftops. The results reveal that the structural members and systems tested are significantly stronger than the allowable loads provided by the International Residential Code (IRC 2009) and the national structural code found in Minimum Design Loads for Buildings and Other Structures (ASCE 7-10). Engineering analysis of residential rooftops typically ignores the system effects and beam composite action in determining rooftop stresses for a potential PV installation. This extreme conservatism, combined with conservatism in codes and published allowable stress values for roof building materials (NDS 2012), leads to the perception that well-built homes may not have adequate load bearing capacity to enable a rooftop PV installation. However, based on the test results presented in this report for residential rooftop structural systems, the actual load bearing capacity is several times higher than published values (NDS 2012).

  12. Polar Codes

    Science.gov (United States)

    2014-12-01

    added by the decoder is K/ρ + Td. By the last assumption, Td and Te are both ≤ K/ρ, so the total latency added is between 2K/ρ and 4K/ρ. For example... better resolution near the decision point. Reference [12] showed that in decoding a (1024, 512) polar code, using 6-bit LLRs resulted in performance

  13. Convolutional-Code-Specific CRC Code Design

    OpenAIRE

    Lou, Chung-Yu; Daneshrad, Babak; Wesel, Richard D.

    2015-01-01

    Cyclic redundancy check (CRC) codes check if a codeword is correctly received. This paper presents an algorithm to design CRC codes that are optimized for the code-specific error behavior of a specified feedforward convolutional code. The algorithm utilizes two distinct approaches to computing undetected error probability of a CRC code used with a specific convolutional code. The first approach enumerates the error patterns of the convolutional code and tests if each of them is detectable. Th...
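
    The core test behind such an algorithm is simple: an error pattern passes a CRC undetected exactly when, viewed as a GF(2) polynomial, it is divisible by the CRC generator polynomial. A minimal sketch of that divisibility check (the generator below is illustrative, not one designed in the paper):

        def crc_remainder(poly, gen):
            """Remainder of poly divided by gen over GF(2); ints with bit i = coefficient of x^i."""
            glen = gen.bit_length()
            while poly.bit_length() >= glen:
                poly ^= gen << (poly.bit_length() - glen)
            return poly

        def undetected(error_pattern, gen):
            """True if the CRC defined by gen fails to detect this error pattern."""
            return crc_remainder(error_pattern, gen) == 0

        gen = 0b1011                          # x^3 + x + 1, a toy 3-bit CRC
        print(undetected(0b1011000, gen))     # True: this pattern is gen * x^3
        print(undetected(0b1000001, gen))     # False: detected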

  14. Positive predictive values of International Classification of Diseases, 10th revision codes for dermatologic events and hypersensitivity leading to hospitalization or emergency room visit among women with postmenopausal osteoporosis in the Danish and Swedish national patient registries

    Directory of Open Access Journals (Sweden)

    Adelborg K

    2017-03-01

    Full Text Available Kasper Adelborg,1 Lotte Brix Christensen,1 Troels Munch,1 Johnny Kahlert,1 Ylva Trolle Lagerros,2,3 Grethe S Tell,4 Ellen M Apalset,4,5 Fei Xue,6 Vera Ehrenstein1 1Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus N, Denmark; 2Department of Medicine, Clinical Epidemiology Unit, Karolinska Institutet, 3Department of Medicine, Clinic of Endocrinology, Metabolism and Diabetes, Karolinska University Hospital, Stockholm, Sweden; 4Department of Global Public Health and Primary Care, University of Bergen, 5Department of Rheumatology, Haukeland University Hospital, Bergen, Norway; 6Center for Observational Research, Amgen Inc. Thousand Oaks, CA, USA Background: Clinical epidemiology research studies, including pharmacoepidemiology and pharmacovigilance studies, use routinely collected health data, such as diagnoses recorded in national health and administrative registries, to assess clinical effectiveness and safety of treatments. We estimated positive predictive values (PPVs of International Classification of Diseases, 10th revision (ICD-10 codes for primary diagnoses of dermatologic events and hypersensitivity recorded at hospitalization or emergency room visit in the national patient registries of Denmark and Sweden among women with postmenopausal osteoporosis (PMO. Methods: This validation study included women with PMO identified from the Danish and Swedish national patient registries (2005–2014. Medical charts of the potential cases served as the gold standard for the diagnosis confirmation and were reviewed and adjudicated by physicians. Results: We obtained and reviewed 189 of 221 sampled medical records (86%. The overall PPV was 92.4% (95% confidence interval [CI], 85.1%–96.3% for dermatologic events, while the PPVs for bullous events and erythematous dermatologic events were 52.5% (95% CI, 37.5%–67.1% and 12.5% (95% CI, 2.2%–47.1%, respectively. The PPV was 59.0% (95% CI, 48.3%–69.0% for hypersensitivity; however
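
    The headline numbers are simple proportions from the chart review: PPV = confirmed cases / reviewed cases, reported with a 95% confidence interval. A small sketch of that arithmetic follows; the paper does not state which interval method it used (Wilson is one common choice), and the counts below are illustrative, not taken from the study.

        from math import sqrt

        def ppv_wilson(confirmed, reviewed, z=1.96):
            """Point estimate and Wilson 95% CI for a proportion."""
            p = confirmed / reviewed
            denom = 1 + z**2 / reviewed
            centre = (p + z**2 / (2 * reviewed)) / denom
            half = z * sqrt(p * (1 - p) / reviewed + z**2 / (4 * reviewed**2)) / denom
            return p, centre - half, centre + half

        print(ppv_wilson(121, 131))   # about 0.92, with a CI of roughly 0.87-0.96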

  15. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values....... Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate...... and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes....
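
    The link between FORM results and partial safety factors that both calibration abstracts refer to can be sketched in a few lines. For a lognormally distributed resistance with coefficient of variation V, FORM sensitivity factor alpha_R and target reliability index beta, the design value and a 5%-fractile characteristic value (both taken relative to the median, using the small-V approximation for the log standard deviation) give a partial safety factor gamma_m = x_k / x_d. All numbers below are illustrative assumptions, not values from the papers.

        from math import exp

        def gamma_m(V, beta, alpha_R=0.8, k_fractile=1.645):
            x_k = exp(-k_fractile * V)        # characteristic value / median (5% fractile)
            x_d = exp(-alpha_R * beta * V)    # FORM design value / median
            return x_k / x_d

        print(gamma_m(V=0.15, beta=4.2))      # about 1.29 for these assumptions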

  16. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  17. Concatenated codes with convolutional inner codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Thommesen, Christian; Zyablov, Viktor

    1988-01-01

    The minimum distance of concatenated codes with Reed-Solomon outer codes and convolutional inner codes is studied. For suitable combinations of parameters the minimum distance can be lower-bounded by the product of the minimum distances of the inner and outer codes. For a randomized ensemble of concatenated codes a lower bound of the Gilbert-Varshamov type is proved...
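
    The lower bound referred to above is the usual product bound for concatenation; in generic notation (not the paper's own symbols) it reads

        $d_{\min}(C_{\mathrm{concat}}) \ge d_{\min}(C_{\mathrm{out}}) \cdot d_{\min}(C_{\mathrm{in}})$

    so, for example, an outer code of minimum distance 5 combined with an inner code of minimum distance 3 yields a concatenated code of minimum distance at least 15 under the stated parameter conditions.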

  18. Towards Actualizing the Value Potential of Korea Health Insurance Review and Assessment (HIRA) Data as a Resource for Health Research: Strengths, Limitations, Applications, and Strategies for Optimal Use of HIRA Data.

    Science.gov (United States)

    Kim, Jee Ae; Yoon, Seokjun; Kim, Log Young; Kim, Dong Sook

    2017-05-01

    Health Insurance Review and Assessment (HIRA) data in South Korea, also called National Health Insurance (NHI) data, is a repository of claims data collected in the process of reimbursing healthcare providers. Under the universal coverage system, with fee-for-service claims covering all citizens in South Korea, HIRA contains comprehensive and rich information pertaining to healthcare services such as treatments, pharmaceuticals, procedures, and diagnoses for almost 50 million beneficiaries. This corpus of HIRA data, which constitutes a large repository of data in the healthcare sector, has enormous potential to create value in several ways: enhancing the efficiency of the healthcare delivery system without compromising quality of care; adding supporting evidence for a given intervention; and providing the information needed to prevent (or monitor) adverse events. In order to actualize this potential, HIRA data need to be actively utilized for research, and understanding these data would greatly enhance that potential. We introduce HIRA data as an important source for health research and provide guidelines for researchers who are currently utilizing HIRA, or interested in doing so, to answer their research questions. We present the characteristics and structure of HIRA data. We discuss strengths and limitations that should be considered in conducting research with HIRA data and suggest strategies for optimal utilization of HIRA data by reviewing published research using HIRA data. © 2017 The Korean Academy of Medical Sciences.

  19. Some optimal partial-unit-memory codes. [time-invariant binary convolutional codes

    Science.gov (United States)

    Lauer, G. S.

    1979-01-01

    A class of time-invariant binary convolutional codes is defined, called partial-unit-memory codes. These codes are optimal in the sense of having maximum free distance for given values of R, k (the number of encoder inputs), and mu (the number of encoder memory cells). Optimal codes are given for rates R = 1/4, 1/3, 1/2, and 2/3, with mu not greater than 4 and k not greater than mu + 3, whenever such a code is better than previously known codes. An infinite class of optimal partial-unit-memory codes is also constructed based on equidistant block codes.

  20. Adaptive Hybrid Picture Coding.

    Science.gov (United States)

    1983-02-05

    process, namely displacement or motion detection and estimation. DISPLACEMENT AND MOTION. Simply stated, motion is defined to be a time series of spatial... regressive model in that the prediction is made with respect to a time series. That is, future values of a time series are to be predicted on... B8 - 90. Robbins, John D., and Netravali, Arun N., "Interframe Television Coding Using Movement Compensation," International Conference on

  1. The use of QR Code as a learning technology: an exploratory study

    Directory of Open Access Journals (Sweden)

    Stefano Besana

    2010-12-01

    Full Text Available This paper discusses a pilot study on the potential benefits of QR (Quick Response Codes as a tool for facilitating and enhancing learning processes. An analysis is given of the strengths and added value of QR technologies applied to museum visits, with precautions regarding the design of learning environments like the one presented. Some possible future scenarios are identified for implementing these technologies in contexts more strictly related to teaching and education.

  2. Forearm Torque and Lifting Strength: Normative Data.

    Science.gov (United States)

    Axelsson, Peter; Fredrikson, Per; Nilsson, Anders; Andersson, Jonny K; Kärrholm, Johan

    2018-02-10

    To establish reference values for new methods designed to quantitatively measure forearm torque and lifting strength and to compare these values with grip strength. A total of 499 volunteers, 262 males and 237 females, aged 15 to 85 (mean, 44) years, were tested for lifting strength and forearm torque with the Kern and Baseline dynamometers. These individuals were also tested for grip strength with a Jamar dynamometer. Standardized procedures were used and information about sex, height, weight, hand dominance, and whether their work involved high or low manual strain was collected. Men had approximately 70% higher forearm torque and lifting strength compared with females. Male subjects aged 26 to 35 years and female subjects aged 36 to 45 years showed highest strength values. In patients with dominant right side, 61% to 78% had a higher or equal strength on this side in the different tests performed. In patients with dominant left side, the corresponding proportions varied between 41% and 65%. There was a high correlation between grip strength and forearm torque and lifting strength. Sex, body height, body weight, and age showed a significant correlation to the strength measurements. In a multiple regression model sex, age (entered as linear and squared) could explain 51% to 63% of the total variances of forearm torque strength and 30% to 36% of lifting strength. Reference values for lifting strength and forearm torque to be used in clinical practice were acquired. Grip strength has a high correlation to forearm torque and lifting strength. Sex, age, and height can be used to predict forearm torque and lifting strength. Prediction equations using these variables were generated. Normative data of forearm torque and lifting strength might improve the quality of assessment of wrist and forearm disorders as well as their treatments. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  3. Codes of Ethics and Teachers' Professional Autonomy

    Science.gov (United States)

    Schwimmer, Marina; Maxwell, Bruce

    2017-01-01

    This article considers the value of adopting a code of professional ethics for teachers. After having underlined how a code of ethics stands to benefits a community of educators--namely, by providing a mechanism for regulating autonomy and promoting a shared professional ethic--the article examines the principal arguments against codes of ethics.…

  4. Repetition versus noiseless quantum codes for correlated errors

    Energy Technology Data Exchange (ETDEWEB)

    Cafaro, Carlo, E-mail: carlo.cafaro@unicam.i [Dipartimento di Fisica, Universita di Camerino, I-62032 Camerino (Italy); Mancini, Stefano, E-mail: stefano.mancini@unicam.i [Dipartimento di Fisica, Universita di Camerino, I-62032 Camerino (Italy)

    2010-06-07

    We study the performance of simple quantum error correcting codes with respect to correlated noise errors characterized by a finite correlation strength {mu}. Specifically, we consider bit flip (phase flip) noisy quantum memory channels and use repetition and noiseless quantum codes. We characterize the performance of the codes by means of the entanglement fidelity F({mu},p) as a function of the error probability p and degree of memory {mu}. Finally, comparing the entanglement fidelities of repetition and noiseless quantum codes, we find a threshold {mu}{sup *}(p) for the correlation strength that allows one to select the code with the better performance.

  5. Code of ethics: principles for ethical leadership.

    Science.gov (United States)

    Flite, Cathy A; Harman, Laurinda B

    2013-01-01

    The code of ethics for a professional association incorporates values, principles, and professional standards. A review and comparative analysis of a 1934 pledge and codes of ethics from 1957, 1977, 1988, 1998, 2004, and 2011 for a health information management association was conducted. Highlights of some changes in the healthcare delivery system are identified as a general context for the codes of ethics. The codes of ethics are examined in terms of professional values and changes in the language used to express the principles of the various codes.

  6. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field. * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual
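
    To make the objects the book discusses concrete, here is a minimal rate-1/2 feedforward convolutional encoder (the classic generators 7 and 5 in octal, constraint length 3); this is a generic textbook example, not code from the book.

        def conv_encode(bits, g=(0b111, 0b101)):
            """Rate-1/2 feedforward convolutional encoder, generators (7, 5) octal."""
            state, out = 0, []
            for b in bits:
                state = ((state << 1) | b) & 0b111          # bit 0 = current input bit
                out.extend(bin(state & gi).count("1") % 2 for gi in g)
            return out

        print(conv_encode([1, 0, 1, 1]))    # -> [1, 1, 1, 0, 0, 0, 0, 1]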

  7. Assessment of Shear Strength in Silty Soils

    Directory of Open Access Journals (Sweden)

    Stefaniak Katarzyna

    2015-06-01

    Full Text Available The article presents a comparison of shear strength values in silty soils from the area of Poznań, determined based on selected Nkt values recommended in the literature, with values of shear strength established on the basis of Nkt values recommended by the author. The analysed silty soils are characterized by a carbonate cementation zone, which made it possible to compare the selected empirical coefficients in both normally consolidated and overconsolidated soils.
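
    For context, the empirical cone factor Nkt discussed above enters the standard piezocone estimate of undrained shear strength (a textbook relation, not quoted from the article):

        $s_u = \dfrac{q_t - \sigma_{v0}}{N_{kt}}$

    where $q_t$ is the corrected cone resistance and $\sigma_{v0}$ the total vertical stress, so a lower assumed $N_{kt}$ directly yields a higher interpreted shear strength.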

  8. Strength properties of fly ash based controlled low strength materials.

    Science.gov (United States)

    Türkel, S

    2007-08-25

    Controlled low strength material (CLSM) is a flowable mixture that can be used as a backfill material in place of compacted soils. Flowable fill requires no tamping or compaction to achieve its strength and typically has a load carrying capacity much higher than compacted soils, but it can still be excavated easily. The selection of CLSM type should be based on technical and economic considerations for specific applications. In this study, a mixture of high-volume fly ash (FA), crushed limestone powder (filler) and a low percentage of pozzolana cement has been tried in different compositions. The amount of pozzolana cement was kept constant for all mixes at 5% of the fly ash weight. The amount of mixing water was chosen in order to provide optimum pumpability, determined from the spreading ratio of the CLSM mixtures using the flow table method. The shear strength of the material is a measure of its ability to support imposed stresses. The shear strength properties of the CLSM mixtures have been investigated in a series of laboratory tests. The direct shear test procedure was applied for determining the strength parameters Phi (angle of shearing resistance) and C(h) (cohesion intercept) of the material. The test results indicated that CLSM mixtures have superior shear strength properties compared to compacted soils. The shear strength, cohesion intercept and angle of shearing resistance values of the CLSM mixtures exceeded the corresponding properties of conventional soil materials at 7 days. These parameters showed that CLSM mixtures are suitable materials for backfill applications.
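
    The two direct-shear parameters reported above combine in the usual Mohr-Coulomb form (a standard relation, written with the abstract's symbols):

        $\tau = C_h + \sigma \tan\Phi$

    i.e. the shear strength $\tau$ available at a normal stress $\sigma$ grows linearly from the cohesion intercept with slope $\tan\Phi$.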

  9. Oscillator strengths for Be I

    Energy Technology Data Exchange (ETDEWEB)

    Ates, Sule, E-mail: suleates@selcuk.edu.tr; Oezarslan, Selma; Celik, Gueltekin; Taser, Mehmet

    2012-07-15

    The electric dipole oscillator strengths for lines between some singlet and triplet levels have been calculated using the weakest bound electron potential model theory and the quantum defect orbital theory for Be I. In the calculations both multiplet and fine structure transitions are studied. We employed both the numerical Coulomb approximation method and numerical non-relativistic Hartree-Fock wavefunctions for expectation values of radii. The necessary energy values have been taken from experimental energy data in the literature. The calculated oscillator strengths have been compared with available theoretical results. A good agreement with the results in the literature has been obtained.

  10. An Outline of the New Norwegian Criminal Code

    Directory of Open Access Journals (Sweden)

    Jørn Jacobsen

    2015-12-01

    Full Text Available This article gives an overview of the new criminal code, its background and content. It maps out the code’s background, the legislative process and central ideas. Furthermore, the article gives an outline of the general criteria for criminal responsibility according to the code, the offences and forms of punishment and other reactions. The article emphasises the most important changes from the previous code of 1902. To some degree, strengths and weaknesses of the new code are addressed.

  11. Orthogonality of binary codes derived from Reed-Solomon codes

    Science.gov (United States)

    Retter, Charles T.

    1991-07-01

    A simple method is developed for determining the orthogonality of binary codes derived from Reed-Solomon codes and other cyclic codes of length (2 exp m) - 1 over GF(2 exp m) for m bits. Depending on the spectra of the codes, it is sufficient to test a small number of single-frequency pairs for orthogonality, and a pair of bases may be tested in each case simply by summing the appropriate powers of elements of the dual bases. This simple test can be used to find self-orthogonal codes. For even values of m, the author presents a technique that can be used to choose a basis that produces a self-orthogonal, doubly-even code in certain cases, particularly when m is highly composite. If m is a power of 2, this technique can be used to find self-dual bases for GF(2 exp m). Although the primary emphasis is on testing for self orthogonality, the fundamental theorems presented apply also to the orthogonality of two different codes.
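
    The dual-basis test itself is specific to the construction described above, but the property it targets is ordinary binary self-orthogonality, which can be checked directly from a generator matrix: the code is self-orthogonal iff every pair of rows (each row with itself included) has even overlap, i.e. G·G^T = 0 over GF(2). A minimal sketch with a classical example:

        def is_self_orthogonal(G):
            """True iff every pair of rows of G (including a row with itself) has even overlap."""
            return all(sum(a & b for a, b in zip(u, v)) % 2 == 0 for u in G for v in G)

        # The [8, 4] extended Hamming code is self-dual, hence self-orthogonal.
        G = [[1, 0, 0, 0, 0, 1, 1, 1],
             [0, 1, 0, 0, 1, 0, 1, 1],
             [0, 0, 1, 0, 1, 1, 0, 1],
             [0, 0, 0, 1, 1, 1, 1, 0]]
        print(is_self_orthogonal(G))    # True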

  12. Affine Grassmann codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Beelen, Peter; Ghorpade, Sudhir Ramakant

    2010-01-01

    We consider a new class of linear codes, called affine Grassmann codes. These can be viewed as a variant of generalized Reed-Muller codes and are closely related to Grassmann codes.We determine the length, dimension, and the minimum distance of any affine Grassmann code. Moreover, we show...

  13. Turbo Codes Extended with Outer BCH Code

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1996-01-01

    The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is propose...... including an outer BCH code correcting a few bit errors.......The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed...

  14. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both...... the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding....

  15. Generalized concatenated quantum codes

    Science.gov (United States)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng, Bei

    2009-05-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  16. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  17. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  18. The Value of Value

    DEFF Research Database (Denmark)

    Sørensen, Asger

    As a social scientist of ethics and morality, Luhmann has noticed the ethical wave that has recently swept across the western world, and states that this particular kind of wave seems to have a wavelength of about one hundred years (cf. Luhmann 1989: 9 ff.). Even though the frequency and the regu...... attempted to answer this question by investigating what the use of the term `value' leads to in ethical discourses, i.e., what moral implications it has for ethics to focus on the concept of value....... parts of business ethics given prominence to especially one term, namely `value'. The question that interests me is the following: What does the articulation of ethics and morality in terms of values mean for ethics and morality as such. Or, to put the question in a more fashionably way: What...... is the value of value for morality and ethics?To make things a bit more precise, we can make use of the common distinction between ethics and morality, i.e. that morality is the immediate, collective and unconscious employment of morals, whereas ethics is the systematic, individual and conscious reflections...

  19. Muscle Strength and Poststroke Hemiplegia: A Systematic Review of Muscle Strength Assessment and Muscle Strength Impairment.

    Science.gov (United States)

    Kristensen, Otto H; Stenager, Egon; Dalgas, Ulrik

    2017-02-01

    To systematically review (1) psychometric properties of criterion isokinetic dynamometry testing of muscle strength in persons with poststroke hemiplegia (PPSH); and (2) literature that compares muscle strength in patients poststroke with that in healthy controls assessed by criterion isokinetic dynamometry. A systematic literature search of 7 databases was performed. Included studies (1) enrolled participants with definite poststroke hemiplegia according to defined criteria; (2) assessed muscle strength or power by criterion isokinetic dynamometry; (3) had undergone peer review; and (4) were available in English or Danish. The psychometric properties of isokinetic dynamometry were reviewed with respect to reliability, validity, and responsiveness. Furthermore, comparisons of strength between paretic, nonparetic, and comparable healthy muscles were reviewed. Twenty studies covering 316 PPSH were included. High intraclass correlation coefficient (ICC) inter- and intrasession reliability was reported for isokinetic dynamometry, which was independent of the tested muscle group, contraction mode, and contraction velocity. Slightly higher ICC values were found for the nonparetic extremity. Standard error of the mean (SEM) values showed that a change of 7% to 20% was required for a real group change to take place for most muscle groups, with the knee extensors showing the smallest SEM% values. The muscle strength of paretic muscles showed deficits when compared with both healthy and nonparetic muscles, independent of muscle group, contraction mode, and contraction velocity. Nonparetic muscles only showed minor strength impairments when compared with healthy muscles. Criterion isokinetic dynamometry is a reliable test in persons with stroke, generally showing marked reductions in muscle strength of paretic and, to a lesser degree, nonparetic muscles when compared with healthy controls, independent of muscle group, contraction mode, and contraction velocity. Copyright

  20. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  1. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64-bit Mac, Linux and Windows.

  2. Voluntary codes: private governance, the public interest and innovation

    National Research Council Canada - National Science Library

    Webb, Kernaghan

    2004-01-01

    This volume is a logical extension of the Office of Consumer Affairs' work in the area of voluntary codes that may assist all parties in developing a better understanding of the strengths, weaknesses...

  3. Algebraic geometric codes

    Science.gov (United States)

    Shahshahani, M.

    1991-01-01

    The performance characteristics are discussed of certain algebraic geometric codes. Algebraic geometric codes have good minimum distance properties. On many channels they outperform other comparable block codes; therefore, one would expect them eventually to replace some of the block codes used in communications systems. It is suggested that it is unlikely that they will become useful substitutes for the Reed-Solomon codes used by the Deep Space Network in the near future. However, they may be applicable to systems where the signal to noise ratio is sufficiently high so that block codes would be more suitable than convolutional or concatenated codes.

  4. What Value "Value Added"?

    Science.gov (United States)

    Richards, Andrew

    2015-01-01

    Two quantitative measures of school performance are currently used, the average points score (APS) at Key Stage 2 and value-added (VA), which measures the rate of academic improvement between Key Stage 1 and 2. These figures are used by parents and the Office for Standards in Education to make judgements and comparisons. However, simple…

  5. Monomial-like codes

    CERN Document Server

    Martinez-Moro, Edgar; Ozbudak, Ferruh; Szabo, Steve

    2010-01-01

    As a generalization of cyclic codes of length p^s over F_{p^a}, we study n-dimensional cyclic codes of length p^{s_1} X ... X p^{s_n} over F_{p^a} generated by a single "monomial". Namely, we study multi-variable cyclic codes of this form in F_{p^a}[x_1, ..., x_n] / <x_1^{p^{s_1}} - 1, ..., x_n^{p^{s_n}} - 1>. We call such codes monomial-like codes. We show that these codes arise from the product of certain single variable codes and we determine their minimum Hamming distance. We determine the dual of monomial-like codes yielding a parity check matrix. We also present an alternative way of constructing a parity check matrix using the Hasse derivative. We study the weight hierarchy of certain monomial-like codes. We simplify an expression that gives us the weight hierarchy of these codes.

  6. Strength and failure modes of ceramic multilayers

    DEFF Research Database (Denmark)

    Sørensen, Bent F.; Toftegaard, Helmuth Langmaack; Linderoth, Søren

    2012-01-01

    A model was developed for the prediction of the tensile strength of thin, symmetric 3-layer sandwich specimens. The model predictions rationalize the effect of heat-treatment temperature on the strength of sandwich specimens consisting of an YSZ (Yttria-Stabilized Zirconia) substrate coated...... and propagating into the substrate. These predictions are consistent with microstructural observations of the fracture surfaces. A good agreement was found between the measured strength values and model predictions. © 2012 Elsevier Ltd. All rights reserved....

  7. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  8. Review of design codes of concrete encased steel short columns under axial compression

    Directory of Open Access Journals (Sweden)

    K.Z. Soliman

    2013-08-01

    Full Text Available In recent years, the use of encased steel concrete columns has been increased significantly in medium-rise or high-rise buildings. The aim of the present investigation is to assess experimentally the current methods and codes for evaluating the ultimate load behavior of concrete encased steel short columns. The current state of design provisions for composite columns from the Egyptian codes ECP203-2007 and ECP-SC-LRFD-2012, as well as, American Institute of Steel Construction, AISC-LRFD-2010, American Concrete Institute, ACI-318-2008, and British Standard BS-5400-5 was reviewed. The axial capacity portion of both the encased steel section and the concrete section was also studied according to the previously mentioned codes. Ten encased steel concrete columns have been investigated experimentally to study the effect of concrete confinement and different types of encased steel sections. The measured axial capacity of the tested ten composite columns was compared with the values calculated by the above mentioned codes. It is concluded that non-negligible discrepancies exist between codes and the experimental results as the confinement effect was not considered in predicting both the strength and ductility of concrete. The confining effect was obviously influenced by the shape of the encased steel section. The tube-shaped steel section leads to better confinement than the SIB section. Among the used codes, the ECP-SC-LRFD-2012 led to the most conservative results.

  9. A case for a code of ethics.

    Science.gov (United States)

    Bayliss, P

    1994-03-01

    Ethical dilemmas in business and health have become a familiar topic over recent times. Doubts remain, however, as to whether a code should be produced and the recently issued IHSM consultation paper argues the case for "a statement of primary values" rather than a code of ethics. In a second article on the subject, Paul Bayliss examines the importance of having a code, looks at some of the contextual issues and suggests an approach to producing one.

  10. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By
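    The kind of computation sketched in this record can be reproduced in a few lines. The example below is only an illustration of the approach, not the paper's exact procedure: it assumes the Kyte-Doolittle hydropathy scale as the amino-acid property and Biopython's standard codon table, and averages the squared property change over all single-nucleotide substitutions between sense codons.

        # Sketch: mean-square error robustness of the standard genetic code,
        # using Kyte-Doolittle hydropathy as the amino-acid property
        # (an assumption; the paper may use a different scale and normalisation).
        from itertools import product
        from Bio.Data import CodonTable  # Biopython

        KD = {  # Kyte-Doolittle hydropathy values
            'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
            'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
            'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
            'Y': -1.3, 'V': 4.2,
        }

        def ms_robustness(code):
            """Mean squared hydropathy change over all single-nucleotide
            substitutions between sense codons (lower = more robust)."""
            diffs = []
            for codon, aa in code.items():
                for pos, alt in product(range(3), 'ACGT'):
                    if alt == codon[pos]:
                        continue
                    mutant = codon[:pos] + alt + codon[pos + 1:]
                    if mutant in code:  # skip mutations to stop codons
                        diffs.append((KD[aa] - KD[code[mutant]]) ** 2)
            return sum(diffs) / len(diffs)

        standard = CodonTable.unambiguous_dna_by_id[1].forward_table
        print(f"MS error value of the standard code: {ms_robustness(standard):.3f}")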

  11. A Realistic Model Under Which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, Harry; van der Gulik, Peter T. S.; Klau, Gunnar W.; Schaffner, Christian; Speijer, Dave; Stougie, Leen

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By

  12. Genetic code for sine

    Science.gov (United States)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed the table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using the square root approach reveals the pattern in the signs (plus, minus) and the sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
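    The paper's square-root "genetic code" construction is not reproduced in this record, but the classical approximation it builds on is well known. The sketch below simply evaluates Bhaskara I's rational formula against the exact sine; it illustrates the formula mentioned above, not the paper's own method.

        import math

        def bhaskara_sine(x):
            """Bhaskara I's 7th-century rational approximation of sin(x)
            for x in [0, pi] (radians): 16x(pi - x) / (5pi^2 - 4x(pi - x))."""
            return 16 * x * (math.pi - x) / (5 * math.pi ** 2 - 4 * x * (math.pi - x))

        for deg in (0, 15, 30, 45, 60, 90):
            x = math.radians(deg)
            print(f"{deg:3d} deg: approx={bhaskara_sine(x):.5f}  exact={math.sin(x):.5f}")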

  13. On the binding effect of air pollution concentration values of the Technical Code Clean Air (TA Luft) in favor of the plant operator. Zur Bindungswirkung der Emissionswerte der TA Luft zugunsten des Anlagenbetreibers

    Energy Technology Data Exchange (ETDEWEB)

    Steinhoff, A.

    1991-01-01

    The dogmatic and constitutional basis as well as points of definition concerning the Technical Code Clean Air (TA Luft) are explained first, followed by an assessment of the air pollution concentration values within the overall complex of air pollution abatement measures, and an analysis of these concentration values as a concretization of the state of the art. In connection with the differentiation of the various applicability levels of the TA Luft, its in-house validity for authorities, its importance for law courts and, finally, its external binding effect for plant operators and possible third parties are discussed. At the center of attention are questions concerning validity claims and validity possibilities. (HSCH).

  14. receive signal strength prediction in the gsm band using wavelet

    African Journals Online (AJOL)

    user

    which type of fading phenomenon attenuates the signal strength ... The measurement application was obtained from the ANDROID Play Store and run on a mobile equipment (ME), a Samsung Galaxy Tab 4. The software is ... Recorded measurement parameters include: Phone Type: GSM; Transmitter Distance: 1033 m; MNC (Mobile Network Code); MCC (Mobile Country Code). The 'system info' readout is calibrated from –113 dBm to –47 dBm where, from ...

  15. Feature-based fast coding unit partition algorithm for high efficiency video coding

    Directory of Open Access Journals (Sweden)

    Yih-Chuan Lin

    2015-04-01

    Full Text Available High Efficiency Video Coding (HEVC), which is the newest video coding standard, has been developed for the efficient compression of ultra high definition videos. One of the important features in HEVC is the adoption of a quad-tree based video coding structure, in which each incoming frame is represented as a set of non-overlapped coding tree blocks (CTBs) by a variable-block-sized prediction and coding process. To do this, each CTB needs to be recursively partitioned into coding units (CUs), prediction units (PUs) and transform units (TUs) during the coding process, leading to a huge computational load in the coding of each video frame. This paper proposes to extract visual features in a CTB and use them to simplify the coding procedure by reducing the depth of the quad-tree partition for each CTB in HEVC intra coding mode. A measure of the edge strength in a CTB, defined with simple Sobel edge detection, is used to constrain the possible maximum depth of the quad-tree partition of the CTB. With the constrained partition depth, the proposed method can save a large amount of encoding time. Experimental results with HM10.1 show that the average time saving is about 13.4% at an increase in encoded BD-Rate of only 0.02%, a smaller performance degradation than that of other similar methods.
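    As an illustration of the idea summarized above (an edge-strength measure limiting the quad-tree depth), the following sketch computes a Sobel-based edge strength for a block and maps it to a maximum partition depth. The threshold values and the depth mapping are placeholder assumptions, not the ones used in the cited paper.

        import numpy as np
        from scipy import ndimage

        def max_partition_depth(ctb, thresholds=(2.0, 8.0, 20.0)):
            """Illustrative sketch: map the mean Sobel gradient magnitude of a
            CTB (e.g. a 64x64 luma block) to a maximum quad-tree depth in {0..3}.
            The threshold values are placeholders, not those of the cited paper."""
            gx = ndimage.sobel(ctb.astype(float), axis=1)
            gy = ndimage.sobel(ctb.astype(float), axis=0)
            edge_strength = np.mean(np.hypot(gx, gy))
            depth = sum(edge_strength > t for t in thresholds)  # 0 = flat, 3 = detailed
            return depth

        # Flat blocks get shallow trees, textured blocks keep the full depth.
        flat = np.full((64, 64), 128, dtype=np.uint8)
        noisy = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
        print(max_partition_depth(flat), max_partition_depth(noisy))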

  16. Spin resonance strength calculation through single particle tracking for RHIC

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dutheil, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Huang, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Ranjbar, V. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-05-03

    The strengths of spin resonances for polarized-proton operation in the Relativistic Heavy Ion Collider are currently calculated with the code DEPOL, which numerically integrates through the ring based on an approximate analytical formula. In this article, we test a new way to calculate the spin resonance strengths: performing a Fourier transformation of the actual transverse magnetic fields seen by a single particle traveling through the ring. Calculated spin resonance strengths are compared between this method and DEPOL.
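    Neither DEPOL nor the article's tracking procedure is reproduced here; the sketch below only illustrates the core operation of extracting a single Fourier harmonic from a field sampled along one turn. All physical prefactors that enter a real resonance strength are deliberately omitted, and the field is a toy stand-in.

        import numpy as np

        def fourier_component(theta, field, tune):
            """Amplitude of the e^{i*tune*theta} harmonic of a field sample taken
            along one turn (theta in [0, 2*pi)), via direct numerical integration.
            Physical prefactors (energy, gyromagnetic anomaly, etc.) are omitted."""
            dtheta = np.diff(np.append(theta, theta[0] + 2 * np.pi))
            return np.sum(field * np.exp(1j * tune * theta) * dtheta) / (2 * np.pi)

        # Toy example: a field with a single harmonic is recovered at that tune.
        theta = np.linspace(0, 2 * np.pi, 4096, endpoint=False)
        field = 0.3 * np.cos(7.2 * theta)   # stand-in for the transverse field seen by the particle
        # ~0.15: half the amplitude, up to leakage from the non-integer tune.
        print(abs(fourier_component(theta, field, 7.2)))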

  17. Code Verification by the Method of Manufactured Solutions

    Energy Technology Data Exchange (ETDEWEB)

    SALARI,KAMBIZ; KNUPP,PATRICK

    2000-06-01

    A procedure for code verification by the Method of Manufactured Solutions (MMS) is presented. Although the procedure requires a certain amount of creativity and skill, we show that MMS can be applied to a variety of engineering codes which numerically solve partial differential equations. This is illustrated by detailed examples from computational fluid dynamics. The strength of the MMS procedure is that it can identify any coding mistake that affects the order of accuracy of the numerical method. A set of examples using a blind-test protocol demonstrates the kinds of coding mistakes that can (and cannot) be exposed via the MMS code verification procedure. The principal advantage of the MMS procedure over traditional methods of code verification is that code capabilities are tested in full generality. The procedure thus results in a high degree of confidence that all coding mistakes which prevent the equations from being solved correctly have been identified.
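    A minimal sketch of the MMS workflow for a hypothetical 1-D Poisson solver: choose a manufactured solution, derive the matching source term analytically, and confirm that the observed order of accuracy matches the scheme's formal order. This is a generic illustration of the technique, not one of the CFD examples of the report.

        import numpy as np

        # MMS for a 1-D Poisson solver: pick u_exact, derive the source term
        # analytically, feed it to the solver, and check the observed order.
        u_exact = lambda x: np.sin(np.pi * x)              # manufactured solution
        source  = lambda x: np.pi**2 * np.sin(np.pi * x)   # f = -u''

        def solve_poisson(n):
            """Second-order finite-difference solve of -u'' = f, u(0)=u(1)=0."""
            x = np.linspace(0.0, 1.0, n + 1)
            h = 1.0 / n
            A = (np.diag(2.0 * np.ones(n - 1)) -
                 np.diag(np.ones(n - 2), 1) - np.diag(np.ones(n - 2), -1)) / h**2
            u_inner = np.linalg.solve(A, source(x[1:-1]))
            return x, np.concatenate(([0.0], u_inner, [0.0]))

        errors = []
        for n in (16, 32, 64):
            x, u = solve_poisson(n)
            errors.append(np.max(np.abs(u - u_exact(x))))
        orders = np.log2(np.array(errors[:-1]) / np.array(errors[1:]))
        print("observed orders:", orders)   # should approach 2 for a correct code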

  18. TIPONLINE Code Table

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coded items are entered in the tiponline data entry program. The codes and their explanations are necessary in order to use the data

  19. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  20. Index coding via linear programming

    CERN Document Server

    Blasiak, Anna; Lubetzky, Eyal

    2010-01-01

    Index Coding has received considerable attention recently motivated in part by applications such as fast video-on-demand and efficient communication in wireless networks and in part by its connection to Network Coding. The basic setting of Index Coding encodes the side-information relation, the problem input, as an undirected graph and the fundamental parameter is the broadcast rate $\\beta$, the average communication cost per bit for sufficiently long messages (i.e. the non-linear vector capacity). Recent nontrivial bounds on $\\beta$ were derived from the study of other Index Coding capacities (e.g. the scalar capacity $\\beta_1$) by Bar-Yossef et al (FOCS'06), Lubetzky and Stav (FOCS'07) and Alon et al (FOCS'08). However, these indirect bounds shed little light on the behavior of $\\beta$ and its exact value remained unknown for \\emph{any graph} where Index Coding is nontrivial. Our main contribution is a hierarchy of linear programs whose solutions trap $\\beta$ between them. This enables a direct information-...

  1. ARC Code TI: ROC Curve Code Augmentation

    Data.gov (United States)

    National Aeronautics and Space Administration — ROC (Receiver Operating Characteristic) curve Code Augmentation was written by Rodney Martin and John Stutz at NASA Ames Research Center and is a modification of ROC...

  2. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  3. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  4. The Procions` code; Le code Procions

    Energy Technology Data Exchange (ETDEWEB)

    Deck, D.; Samba, G.

    1994-12-19

    This paper presents a new code to simulate plasmas generated by inertial confinement. This multi-species kinetic code makes no angular approximation concerning the ions and works in planar and spherical geometry. First, the physical model, based on the Fokker-Planck equation, is presented. Then, the numerical model used to solve the Fokker-Planck operator in the Rosenbluth form is introduced. Finally, several numerical tests are proposed. (TEC). 17 refs., 27 figs.

  5. Some Bounds on Binary LCD Codes

    OpenAIRE

    Galvez, Lucky; Kim, Jon-Lark; Lee, Nari; Roe, Young Gun; Won, Byung-Sun

    2017-01-01

    A linear code with a complementary dual (or LCD code) is defined to be a linear code $C$ whose dual code $C^{\\perp}$ satisfies $C \\cap C^{\\perp}$= $\\left\\{ \\mathbf{0}\\right\\} $. Let $LCD{[}n,k{]}$ denote the maximum of possible values of $d$ among $[n,k,d]$ binary LCD codes. We give exact values of $LCD{[}n,k{]}$ for $1 \\le k \\le n \\le 12$. We also show that $LCD[n,n-i]=2$ for any $i\\geq2$ and $n\\geq2^{i}$. Furthermore, we show that $LCD[n,k]\\leq LCD[n,k-1]$ for $k$ odd and $LCD[n,k]\\leq LCD[...
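    A quick computational check of the LCD property can complement the bounds above. The sketch relies on Massey's criterion that a code with generator matrix G is LCD exactly when G·G^T is nonsingular over the field; the matrices and helper names below are illustrative, not taken from the paper.

        import numpy as np

        def gf2_rank(M):
            """Rank of a 0/1 matrix over GF(2) via Gaussian elimination."""
            M = M.copy() % 2
            rank = 0
            rows, cols = M.shape
            for c in range(cols):
                pivot = next((r for r in range(rank, rows) if M[r, c]), None)
                if pivot is None:
                    continue
                M[[rank, pivot]] = M[[pivot, rank]]
                for r in range(rows):
                    if r != rank and M[r, c]:
                        M[r] ^= M[rank]
                rank += 1
            return rank

        def is_lcd(G):
            """Massey's criterion: a binary code with generator matrix G is LCD
            iff G @ G.T is nonsingular over GF(2)."""
            k = G.shape[0]
            return gf2_rank((G @ G.T) % 2) == k

        # [4,2] examples: the first is LCD, the second (self-orthogonal rows) is not.
        G1 = np.array([[1, 0, 1, 1],
                       [0, 1, 0, 1]])
        G2 = np.array([[1, 1, 0, 0],
                       [0, 0, 1, 1]])
        print(is_lcd(G1), is_lcd(G2))   # True False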

  6. The structure of dual Grassmann codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Pinero, Fernando

    2016-01-01

    In this article we study the duals of Grassmann codes, certain codes coming from the Grassmannian variety. Exploiting their structure, we are able to count and classify all their minimum weight codewords. In this classification the lines lying on the Grassmannian variety play a central role. Related codes, namely the affine Grassmann codes, were introduced more recently in Beelen et al. (IEEE Trans Inf Theory 56(7):3166–3176, 2010), while their duals were introduced and studied in Beelen et al. (IEEE Trans Inf Theory 58(6):3843–3855, 2010). In this paper we also classify and count the minimum weight codewords of the dual affine Grassmann codes. Combining the above classification results, we are able to show that the dual of a Grassmann code is generated by its minimum weight codewords. We use these properties to establish that the increase of value of successive generalized Hamming weights...

  7. Strengths-based Learning

    DEFF Research Database (Denmark)

    Ledertoug, Mette Marie

    'Strength-based Learning - Children's Character Strengths as Means to their Learning Potential' is a Ph.D.-project aiming to create a strength-based mindset in school settings and at the same time introducing strength-based interventions as specific tools to improve both learning and well-being. The Ph.D.-project in strength-based learning took place in a Danish school with 750 pupils age 6-16 and a similar school was functioning as a control group. The presentation will focus on both the aware-explore-apply processes and the practical implications for the schools involved, and on measurable...

  8. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  9. Strength Modeling Report

    Science.gov (United States)

    Badler, N. I.; Lee, P.; Wong, S.

    1985-01-01

    Strength modeling is a complex and multi-dimensional issue. There are numerous parameters to the problem of characterizing human strength, most notably: (1) position and orientation of body joints; (2) isometric versus dynamic strength; (3) effector force versus joint torque; (4) instantaneous versus steady force; (5) active force versus reactive force; (6) presence or absence of gravity; (7) body somatotype and composition; (8) body (segment) masses; (9) muscle group involvement; (10) muscle size; (11) fatigue; and (12) practice (training) or familiarity. In surveying the available literature on strength measurement and modeling, an attempt was made to examine as many of these parameters as possible. The conclusions reached at this point indicate the feasibility of implementing computationally reasonable human strength models. The assessment of the accuracy of any model against a specific individual, however, will probably not be possible on any realistic scale. Taken statistically, strength modeling may be an effective tool for general questions of task feasibility and strength requirements.

  10. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
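    The coupling pattern described above (write an input file, run the external application, read its outputs back) can be illustrated with a short script. Everything in the sketch below, including the file names, the executable name and the output format, is hypothetical; it is not GoldSim's or DLLExternalCode's actual interface.

        import subprocess
        from pathlib import Path

        def run_external_code(inputs, workdir="run", exe="external_solver"):
            """Illustrative wrapper: inputs in, external run, outputs out."""
            work = Path(workdir)
            work.mkdir(exist_ok=True)

            # 1. Create the input file from the list of input values.
            with open(work / "inputs.txt", "w") as f:
                for name, value in inputs.items():
                    f.write(f"{name} = {value}\n")

            # 2. Run the external application and wait for it to finish.
            subprocess.run([exe, "inputs.txt"], cwd=work, check=True)

            # 3. Read the outputs the external application wrote.
            outputs = {}
            for line in (work / "outputs.txt").read_text().splitlines():
                name, value = line.split("=")
                outputs[name.strip()] = float(value)
            return outputs

        # outputs = run_external_code({"flow_rate": 1.5, "porosity": 0.3})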

  11. Noisy Network Coding

    CERN Document Server

    Lim, Sung Hoon; Gamal, Abbas El; Chung, Sae-Young

    2010-01-01

    A noisy network coding scheme for sending multiple sources over a general noisy network is presented. For multi-source multicast networks, the scheme naturally extends both network coding over noiseless networks by Ahlswede, Cai, Li, and Yeung, and compress-forward coding for the relay channel by Cover and El Gamal to general discrete memoryless and Gaussian networks. The scheme also recovers as special cases the results on coding for wireless relay networks and deterministic networks by Avestimehr, Diggavi, and Tse, and coding for wireless erasure networks by Dana, Gowaikar, Palanki, Hassibi, and Effros. The scheme involves message repetition coding, relay signal compression, and simultaneous decoding. Unlike previous compress--forward schemes, where independent messages are sent over multiple blocks, the same message is sent multiple times using independent codebooks as in the network coding scheme for cyclic networks. Furthermore, the relays do not use Wyner--Ziv binning as in previous compress-forward sch...

  12. Theoretical oscillator strengths, transition probabilities, and radiative lifetimes of levels in Pb V

    Energy Technology Data Exchange (ETDEWEB)

    Colón, C., E-mail: cristobal.colon@upm.es [Dpto. Física Aplicada. E.U.I.T. Industrial, Universidad Politécnica de Madrid, Ronda de Valencia 3, 28012 Madrid (Spain); Alonso-Medina, A. [Dpto. Física Aplicada. E.U.I.T. Industrial, Universidad Politécnica de Madrid, Ronda de Valencia 3, 28012 Madrid (Spain); Porcher, P. [Laboratoire de Chimie Appliquée de l’Etat Solide, CNRS-UMR 7574, Paris (France)

    2014-01-15

    Theoretical values of oscillator strengths and transition probabilities for 306 spectral lines arising from the 5d{sup 9}ns(n=7,8,9),5d{sup 9}np(n=6,7),5d{sup 9}6d, and 5d{sup 9} 5f configurations, and radiative lifetimes of 9 levels, of Pb V have been obtained. These values were obtained in intermediate coupling (IC) and using ab initio relativistic Hartree–Fock calculations including core-polarization effects. We use for the IC calculations the standard method of least squares fitting of experimental energy levels by means of computer codes from Cowan. We included in these calculations the 5d{sup 8}6s6p and 5d{sup 8}6s{sup 2} configurations. These calculations have facilitated the identification of the 214.25, 216.79, and 227.66 nm spectral lines of Pb V. In the absence of experimental results of oscillator strengths and transition probabilities, we could not make a direct comparison with our results. However, the Stark broadening parameters calculated from these values are in excellent agreement with experimental widening found in the literature. -- Highlights: •Theoretical values of transition probabilities of Pb V have been obtained. •We use for the IC calculations the standard method of least square. •The parameters calculated from these values are in agreement with the experimental values.

  13. Strength of Chemical Bonds

    Science.gov (United States)

    Christian, Jerry D.

    1973-01-01

    Students are not generally made aware of the extraordinary magnitude of the strengths of chemical bonds in terms of the forces required to pull them apart. Molecular bonds are usually considered in terms of the energies required to break them, and we are not astonished at the values encountered. For example, the Cl2 bond energy, 57.00 kcal/mole, amounts to only 9.46 x 10^-20 cal/molecule, a very small amount of energy, indeed, and impossible to measure directly. However, the forces involved in realizing the energy when breaking the bond operate over a very small distance, only 2.94 A, and, thus, f_ave ≈ D_e/(r - r_e) must be very large. The forces involved in dissociating the molecule are discussed in the following. In consideration of average forces, the molecule shall be assumed arbitrarily to be dissociated when the atoms are far enough separated so that the potential, relative to that of the infinitely separated atoms, is reduced by 99.5% from the potential of the molecule at the equilibrium bond length (r_e); for Cl2, with r_e of 1.988 A, this occurs at 4.928 A.
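    To make the abstract's figures concrete, the short calculation below converts the quoted bond energy into an average force per molecule using f_ave ≈ D_e/(r - r_e). The physical constants are standard values; the result is the "very large" force the abstract alludes to, on the order of a nanonewton per bond.

        # Worked version of the numbers in the abstract: the average force needed
        # to stretch one Cl2 bond from r_e = 1.988 A to 4.928 A.
        AVOGADRO = 6.022e23           # 1/mol
        CAL_TO_J = 4.184              # J/cal

        De_kcal_per_mol = 57.00
        De_per_molecule_cal = De_kcal_per_mol * 1e3 / AVOGADRO
        print(f"De per molecule: {De_per_molecule_cal:.2e} cal")   # ~9.46e-20 cal

        stretch_m = (4.928 - 1.988) * 1e-10                        # 2.94 Angstrom in metres
        f_ave_N = De_per_molecule_cal * CAL_TO_J / stretch_m
        print(f"average force:   {f_ave_N:.2e} N (~1.3 nanonewton per bond)")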

  14. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  15. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded......We welcome Tanya Stivers’s discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among other things of the precise ad hoc considerations that researchers need to bear in mind, both when....... Instead we propose that the promise of coding-based research lies in its ability to open up new qualitative questions....

  16. Overview of Code Verification

    Science.gov (United States)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  17. On the edge irregularity strength of corona product of cycle with isolated vertices

    Directory of Open Access Journals (Sweden)

    I. Tarawneh

    2016-12-01

    Full Text Available In this paper, we investigate the new graph characteristic, the edge irregularity strength, denoted as es, as a modification of the well known irregularity strength, total edge irregularity strength and total vertex irregularity strength. As a result, we obtain the exact value of an edge irregularity strength of corona product of cycle with isolated vertices.

  18. Effect of panel alignment and surface finish on bond strength

    Energy Technology Data Exchange (ETDEWEB)

    Wouters, J.M.; Doe, P.J. [Los Alamos National Lab., NM (United States); Baker, W.E. [New Mexico Univ., Albuquerque, NM (United States)

    1991-10-01

    The flexural strength of bonded acrylic is tested as a function of panel alignment and bond surface finish. Bond strength was shown to be highly dependent on both parameters with only a narrow range of values yielding a high strength bond. This study was performed for the heavy water-containing acrylic vessel for the Sudbury Neutrino Observatory detector.

  19. Character Strengths and Psychological Wellbeing among Students of Teacher Education

    Science.gov (United States)

    Gustems, Josep; Calderon, Caterina

    2014-01-01

    The relation between character strengths and psychological well-being can have an important effect on students' academic performance. We examined relationships between character strengths and psychological well-being as assessed by the Values in Action Inventory of Strengths and Brief Symptom Inventory. A sample of 98 teacher education students…

  20. On irregularity strength of disjoint union of friendship graphs

    Directory of Open Access Journals (Sweden)

    Ali Ahmad

    2013-11-01

    Full Text Available We investigate the vertex total and edge total modifications of the well-known irregularity strength of graphs. We determine the exact values of the total vertex irregularity strength and the total edge irregularity strength of a disjoint union of friendship graphs.

  1. Phonological coding during reading

    Science.gov (United States)

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Order, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  2. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  3. The strength compass

    DEFF Research Database (Denmark)

    Ledertoug, Mette Marie

    Individual paper presentation: The 'Strength Compass'. The results of a PhD research project among schoolchildren (age 6-16) identifying VIA strengths with respect to age, gender, mother-tongue language and possible child psychiatric diagnosis. Strengths-based interventions in schools have a theoretical ... the results for strengths display for children aged 6-16 in different categories: • Different age groups – are the same strengths present in both small children and youths? • Gender – do the results show differences between the two genders? • Danish as a mother-tongue language – do the results show any...

  4. Does a home-based strength and balance programme in people aged ≥80 years provide the best value for money to prevent falls? A systematic review of economic evaluations of falls prevention interventions.

    Science.gov (United States)

    Davis, J C; Robertson, M C; Ashe, M C; Liu-Ambrose, T; Khan, K M; Marra, C A

    2010-02-01

    To investigate the value for money of strategies to prevent falls in older adults living in the community. Systematic review of peer reviewed journal articles reporting an economic evaluation of a falls prevention intervention as part of a randomised controlled trial or a controlled trial, or using an analytical model. MEDLINE, PUBMED, EMBASE and NHS EED databases were searched to identify cost-effectiveness, cost-utility and cost-benefit studies from 1945 through July 2008. The primary outcome measure was incremental cost-effectiveness, cost-utility and cost-benefit ratios in the reported currency and in pounds sterling at 2008 prices. The quality of the studies was assessed using two instruments: (1) an economic evaluation checklist developed by Drummond and colleagues and (2) the Quality of Health Economic Studies instrument. Nine studies meeting our inclusion criteria included eight cost-effectiveness analyses, one cost-utility and one cost-benefit analysis. Three effective falls prevention strategies were cost saving in a subgroup: (1) an individually customised multifactorial programme in those with four or more of the eight targeted fall risk factors, (2) the home-based Otago Exercise Programme in people aged ≥80 years and (3) a home safety programme in the subgroup with a previous fall. These three findings were from six studies that scored ≥75% on the Quality of Health Economic Studies instrument. Best value for money came from effective single factor interventions such as the Otago Exercise Programme, which was cost saving in adults 80 years and older. This programme has broad applicability, thus warranting health policy decision-makers' close scrutiny.

  5. Generating code adapted for interlinking legacy scalar code and extended vector code

    Science.gov (United States)

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  6. Decoding of Cyclic Codes,

    Science.gov (United States)

    (*INFORMATION THEORY, DECODING), (*DATA TRANSMISSION SYSTEMS, DECODING), STATISTICAL ANALYSIS, STOCHASTIC PROCESSES, CODING, WHITE NOISE, NUMBER THEORY, CORRECTIONS, BINARY ARITHMETIC, SHIFT REGISTERS, CONTROL SYSTEMS, USSR

  7. ARC Code TI: ACCEPT

    Data.gov (United States)

    National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...

  8. Diameter Perfect Lee Codes

    CERN Document Server

    Horak, Peter

    2011-01-01

    Lee codes have been intensively studied for more than 40 years. Interest in these codes has been triggered by the Golomb-Welch conjecture on the existence of perfect error-correcting Lee codes. In this paper we deal with the existence and enumeration of diameter perfect Lee codes. As main results we determine all q for which there exists a linear diameter-4 perfect Lee code of word length n over Z_{q}, and prove that for each n\geq3 there are uncountably many diameter-4 perfect Lee codes of word length n over Z. This is in strict contrast with perfect error-correcting Lee codes of word length n over Z, as there is a unique such code for n=3, and it is conjectured that this is always the case when 2n+1 is a prime. Diameter perfect Lee codes will be constructed by an algebraic construction that is based on a group homomorphism. This will allow us to design an efficient algorithm for their decoding.
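    For readers unfamiliar with the metric, the helper below computes the Lee distance over Z_q that underlies such constructions; the example words and alphabet size are arbitrary.

        def lee_distance(x, y, q):
            """Lee distance between two words over Z_q: each coordinate contributes
            the shorter way around the cycle Z_q."""
            return sum(min((a - b) % q, (b - a) % q) for a, b in zip(x, y))

        # Example over Z_6: coordinate-wise gaps of 2, 1 and 3 give distance 6.
        print(lee_distance((0, 5, 2), (2, 0, 5), 6))   # -> 6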

  9. Expander chunked codes

    Science.gov (United States)

    Tang, Bin; Yang, Shenghao; Ye, Baoliu; Yin, Yitong; Lu, Sanglu

    2015-12-01

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During the network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes that have an analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91 to 97 % of the optimum and outperform the state-of-the-art overlapped chunked codes significantly.

  10. Evolution of coding microsatellites in primate genomes.

    Science.gov (United States)

    Loire, Etienne; Higuet, Dominique; Netter, Pierre; Achaz, Guillaume

    2013-01-01

    Microsatellites (SSRs) are highly susceptible to expansions and contractions. When located in a coding sequence, the insertion or the deletion of a single unit for a mono-, di-, tetra-, or penta(nucleotide)-SSR creates a frameshift. As a consequence, one would expect to find only very few of these SSRs in coding sequences because of their strong deleterious potential. Unexpectedly, genomes contain many coding SSRs of all types. Here, we report on a study of their evolution in a phylogenetic context using the genomes of four primates: human, chimpanzee, orangutan, and macaque. In a set of 5,015 orthologous genes unambiguously aligned among the four species, we show that, except for tri- and hexa-SSRs, for which insertions and deletions are frequently observed, SSRs in coding regions evolve mainly by substitutions. We show that the rate of substitution in all types of coding SSRs is typically two times higher than in the rest of coding sequences. Additionally, we observe that although numerous coding SSRs are created and lost by substitutions in the lineages, their numbers remain constant. This last observation suggests that the coding SSRs have reached equilibrium. We hypothesize that this equilibrium involves a combination of mutation, drift, and selection. We thus estimated the fitness cost of mono-SSRs and show that it increases with the number of units. We finally show that the cost of coding mono-SSRs greatly varies from function to function, suggesting that the strength of the selection that acts against them can be correlated to gene functions.

  11. ADAPTIVE NETWORK CODING IN WIRELESS COMMUNICATIONS

    DEFF Research Database (Denmark)

    2017-01-01

    A first network node (eNB) is configured to receive (404), from a second network node (UE), channel performance indicator values regarding a serving cell, and estimate (404) a number of network-coded packets based on the received channel performance indicator values, such that the estimated numbe...

  12. A Code of Ethics for Democratic Leadership

    Science.gov (United States)

    Molina, Ricardo; Klinker, JoAnn Franklin

    2012-01-01

    Democratic leadership rests on sacred values, awareness, judgement, motivation and courage. Four turning points in a 38-year school administrator's career revealed decision-making in problematic moments stemmed from values in a personal and professional code of ethics. Reflection on practice and theory added vocabulary and understanding to make…

  13. Public Values

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Rutgers, Mark R.

    2015-01-01

    This article provides the introduction to a symposium on contemporary public values research. It is argued that the contributions to this symposium represent a Public Values Perspective, distinct from other specific lines of research that also use public value as a core concept. Public administration is approached in terms of processes guided or restricted by public values and as public value creating: public management and public policy-making are both concerned with establishing, following and realizing public values. To study public values a broad perspective is needed. The article suggests a research agenda for this encompassing kind of public values research. Finally the contributions to the symposium are introduced.

  14. Mapping strengths into virtues: The relation of the 24 VIA-strengths to six ubiquitous virtues

    Directory of Open Access Journals (Sweden)

    Willibald eRuch

    2015-04-01

    Full Text Available The Values-in-Action (VIA-classification distinguishes six core virtues and 24 strengths. As the assignment of the strengths to the virtues was done on theoretical grounds it still needs empirical verification. As an alternative to factor analytic investigations the present study utilizes expert judgments. In a pilot study the conceptual overlap among five sources of knowledge (strength’s name including synonyms, short definitions, brief descriptions, longer theoretical elaborations, and item content about a particular strength was examined. The results show that the five sources converged quite well, with the short definitions and the items being slightly different from the other. All strengths exceeded a cut-off value but the convergence was much better for some strengths (e.g., zest than for others (e.g., perspective. In the main study 70 experts (from psychology, philosophy, theology, etc. and 41 laypersons rated how prototypical the strengths are for each of the six virtues. The results showed that 10 were very good markers for their virtues, 9 were good markers, four were acceptable markers, and only one strength failed to reach the cut-off score for its assigned virtue. However, strengths were often markers for two or even three virtues, and occasionally they marked the other virtue more strongly than the one they were assigned to. The virtue prototypicality ratings were slightly positively correlated with higher coefficients being found for justice and humanity. A factor analysis of the 24 strengths across the ratings yielded the six factors with an only slightly different composition of strengths and double loadings. It is proposed to adjust either the classification (by reassigning strengths and by allowing strengths to be subsumed under more than one virtue or to change the definition of certain strengths so that they only exemplify one virtue. The results are discussed in the context of factor analytic attempts to verify the

  15. On {\\sigma}-LCD codes

    OpenAIRE

    Carlet, Claude; Mesnager, Sihem; Tang, Chunming; Qi, Yanfeng

    2017-01-01

    Linear complementary pairs (LCP) of codes play an important role in armoring implementations against side-channel attacks and fault injection attacks. One of the most common ways to construct LCP of codes is to use Euclidean linear complementary dual (LCD) codes. In this paper, we first introduce the concept of linear codes with $\\sigma$ complementary dual ($\\sigma$-LCD), which includes known Euclidean LCD codes, Hermitian LCD codes, and Galois LCD codes. As Euclidean LCD codes, $\\sigma$-LCD ...

  16. REPETITIVE STRENGTH AMONG STUDENTS OF AGE 14

    Directory of Open Access Journals (Sweden)

    Besim Halilaj

    2014-06-01

    Full Text Available The study involved 82 male students of the primary school "Qamil Ilazi" in Kaçanik, Kosovo. Four movement tests assessing repetitive strength were conducted: 1. Pull-up, 2. Sit-up, 3. Back extension, 4. Push-up. The main goal of this study was to assess the actual motor status, specifically the repetitive strength component, of male students aged 14. In addition to assessing the actual motor status, another objective was to examine the relationships between the variables employed. Basic statistical parameters show a distribution that is not significantly different from the normal distribution, and high correlations were found among the repetitive strength tests. Factorization of the space resulted in the extraction of two latent factors, defined as a repetitive strength of the arms factor and a repetitive strength of the body factor.

  17. Codes in Permutations and Error Correction for Rank Modulation

    CERN Document Server

    Barg, Alexander

    2009-01-01

    Codes for rank modulation have been recently proposed as a means of protecting flash memory devices from errors. We study basic coding theoretic problems for such codes, representing them as subsets of the set of permutations of $n$ elements equipped with the Kendall tau distance. We derive several lower and upper bounds on the size of codes. These bounds enable us to establish the exact scaling of the size of optimal codes for large values of $n$. We also show the existence of codes whose size is within a constant factor of the sphere packing bound for any fixed number of errors.
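    The Kendall tau distance underlying these codes is easy to compute directly; the helper below counts the pairs that two permutations order differently (equivalently, the minimum number of adjacent transpositions needed to turn one permutation into the other). The example permutations are arbitrary.

        from itertools import combinations

        def kendall_tau_distance(p, q):
            """Number of pairs of elements that the two permutations order
            differently (the metric used for rank-modulation codes)."""
            pos_p = {v: i for i, v in enumerate(p)}
            pos_q = {v: i for i, v in enumerate(q)}
            return sum(
                1
                for a, b in combinations(p, 2)
                if (pos_p[a] < pos_p[b]) != (pos_q[a] < pos_q[b])
            )

        # Adjacent transpositions are at distance 1; a full reversal is maximal: n(n-1)/2.
        print(kendall_tau_distance((1, 2, 3, 4), (1, 2, 4, 3)))   # -> 1
        print(kendall_tau_distance((1, 2, 3, 4), (4, 3, 2, 1)))   # -> 6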

  18. Graph construction using adaptive Local Hybrid Coding scheme.

    Science.gov (United States)

    Dornaika, Fadi; Kejani, Mahdi Tavassoli; Bosaghzadeh, Alireza

    2017-11-01

    It is well known that dense coding with local bases (via Least Square coding schemes) can lead to large quantization errors or poor performances of machine learning tasks. On the other hand, sparse coding focuses on accurate representation without taking into account data locality due to its tendency to ignore the intrinsic structure hidden among the data. Local Hybrid Coding (LHC) (Xiang et al., 2014) was recently proposed as an alternative to the sparse coding scheme that is used in Sparse Representation Classifier (SRC). The LHC blends sparsity and bases-locality criteria in a unified optimization problem. It can retain the strengths of both sparsity and locality. Thus, the hybrid codes would have some advantages over both dense and sparse codes. This paper introduces a data-driven graph construction method that exploits and extends the LHC scheme. In particular, we propose a new coding scheme coined Adaptive Local Hybrid Coding (ALHC). The main contributions are as follows. First, the proposed coding scheme adaptively selects the local and non-local bases of LHC using data similarities provided by Locality-constrained Linear code. Second, the proposed ALHC exploits local similarities in its solution. Third, we use the proposed coding scheme for graph construction. For the task of graph-based label propagation, we demonstrate high classification performance of the proposed graph method on four benchmark face datasets: Extended Yale, PF01, PIE, and FERET. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. TECHNO-ECONOMIC ANALYSIS IN A SUPERSTRUCTURE OF A MULTIPLE FLOORS BUILDING (THREE, FIVE, SEVEN AND NINE FLOORS IN REINFORCED CONCRETE AND RIBBED SLABS WITH RECTANGULAR FORM AND DIFFERENT COMPRESSIVE STRENGTH VALUES

    Directory of Open Access Journals (Sweden)

    E. F. S. Moraes

    2017-12-01

    Full Text Available Adapting "fck" values between 25 MPa and 40 MPa in three-, five-, seven- and nine-floor buildings, for sites subject to winds of up to 30 m/s, this research calculated the cost and material inputs associated with these variations. The results aim to improve the design of multiple-floor buildings in reinforced concrete with ribbed slabs and to contribute to economic gains. The results were analysed in five stages: (I) architectural design definition in a 1:1 proportion, (II) structural conception, (III) structural design, (IV) cost composition and (V) techno-economic parameters. In summary, the results showed that lower "fck" values were more viable for buildings with few floors. As the number of floors increased, the required "fck" also rose, causing a cost increase of around 16.54% in the beams and 11.16% in the slabs. The pillars, on the other hand, showed a cost saving of 28.89%, with variations of up to 11.93% in the average thickness and 6.29% in the concrete formwork expenditure per m³. Overall, the research showed an economic gain of 5.14% in the total cost between the different numbers of floors.

  20. Anisotropic Concrete Compressive Strength

    DEFF Research Database (Denmark)

    Gustenhoff Hansen, Søren; Jørgensen, Henrik Brøner; Hoang, Linh Cao

    2017-01-01

    When the load carrying capacity of existing concrete structures is (re-)assessed it is often based on compressive strength of cores drilled out from the structure. Existing studies show that the core compressive strength is anisotropic; i.e. it depends on whether the cores are drilled parallel...

  1. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene; Dahlgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  2. Error Correcting Codes

    Indian Academy of Sciences (India)

    be fixed to define codes over such domains). New decoding schemes that take advantage of such connections can be devised. These may soon show up in a technique called code division multiple access (CDMA) which is proposed as a basis for digital cellular communication. CDMA provides a facility for many users to ...

  3. Codes of Conduct

    Science.gov (United States)

    Million, June

    2004-01-01

    Most schools have a code of conduct, pledge, or behavioral standards, set by the district or school board with the school community. In this article, the author features some schools that created a new vision of instilling code of conducts to students based on work quality, respect, safety and courtesy. She suggests that communicating the code…

  4. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March 1997 pp 33-47. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/03/0033-0047 ...

  5. Code Generation = A* + BURS

    NARCIS (Netherlands)

    Nymeyer, Albert; Katoen, Joost P.; Westra, Ymte; Alblas, H.; Gyimóthy, Tibor

    1996-01-01

    A system called BURS that is based on term rewrite systems and a search algorithm A* are combined to produce a code generator that generates optimal code. The theory underlying BURS is re-developed, formalised and explained in this work. The search algorithm uses a cost heuristic that is derived

  6. Dress Codes for Teachers?

    Science.gov (United States)

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  7. Informal control code logic

    NARCIS (Netherlands)

    Bergstra, J.A.

    2010-01-01

    General definitions as well as rules of reasoning regarding control code production, distribution, deployment, and usage are described. The role of testing, trust, confidence and risk analysis is considered. A rationale for control code testing is sought and found for the case of safety critical

  8. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low weight codewords, that gives a further advantage in the error floor region.

  9. Nuremberg code turns 60

    OpenAIRE

    Thieren, Michel; Mauron, Alexandre

    2007-01-01

    This month marks sixty years since the Nuremberg code – the basic text of modern medical ethics – was issued. The principles in this code were articulated in the context of the Nuremberg trials in 1947. We would like to use this anniversary to examine its ability to address the ethical challenges of our time.

  10. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 1. Error Correcting Codes The Hamming Codes. Priti Shankar. Series Article Volume 2 Issue 1 January ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  11. Revealing the Maximum Strength in Nanotwinned Copper

    DEFF Research Database (Denmark)

    Lu, L.; Chen, X.; Huang, Xiaoxu

    2009-01-01

    The strength of polycrystalline materials increases with decreasing grain size. Below a critical size, smaller grains might lead to softening, as suggested by atomistic simulations. The strongest size should arise at a transition in deformation mechanism from lattice dislocation activities to grain...... boundary–related processes. We investigated the maximum strength of nanotwinned copper samples with different twin thicknesses. We found that the strength increases with decreasing twin thickness, reaching a maximum at 15 nanometers, followed by a softening at smaller values that is accompanied by enhanced...
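
    As general background only (not part of the record above), the grain-size strengthening referred to in the opening sentence is commonly summarized by the empirical Hall-Petch relation, in which the yield strength rises with decreasing grain or twin-boundary spacing $d$:

        $\sigma_y = \sigma_0 + k_y\, d^{-1/2}$

    Here $\sigma_0$ and $k_y$ are material constants; the softening reported below the ~15 nanometer twin thickness marks the regime in which this relation breaks down.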

  12. The p-Wave Strength Function

    OpenAIRE

    Izumi, FURUOYA; Department of Physics, Hosei University

    1982-01-01

    The effect of the intermediate structure, the doorway state, on the overall aspect of the p-wave strength function plotted with respect to mass number is investigated. Our qualitative method is analogous to that used by Block and Feshbach in their investigation on the s-wave strength function. It is shown that low values in the p-wave strength function near A=50 and A=160 can be explained by our theory. In particular it is found that the change of the number of doorway states contributing to ...

  13. Beyond Values Clarification: Addressing Client Values in Clinical Behavior Analysis

    Science.gov (United States)

    Bonow, Jordan T.; Follette, William C.

    2009-01-01

    Ethical principles of psychology, as exemplified in the American Psychological Association (APA) Code of Ethics (2002), provide impractical advice for addressing client values during psychotherapy. These principles seem to argue that each client's values should be respected and protected at all times, except in cases in which this would result in…

  14. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particular nice examples of affine variety codes. Our study... (Contents: 4.9 Codes from order domains; 4.10 One-point geometric Goppa codes; 4.11 Bibliographical Notes; References...)

  15. Quantum Synchronizable Codes From Quadratic Residue Codes and Their Supercodes

    OpenAIRE

    Xie, Yixuan; Yuan, Jinhong; Fujiwara, Yuichiro

    2014-01-01

    Quantum synchronizable codes are quantum error-correcting codes designed to correct the effects of both quantum noise and block synchronization errors. While it is known that quantum synchronizable codes can be constructed from cyclic codes that satisfy special properties, only a few classes of cyclic codes have been proved to give promising quantum synchronizable codes. In this paper, using quadratic residue codes and their supercodes, we give a simple construction for quantum synchronizable...

  16. strength characterization of foundation soils at federal university ...

    African Journals Online (AJOL)

    HOD

    unconfined compression test is used to measure the shearing resistance and bearing capacity of soils. Value of undrained shear strength without confining pressure is equal to unconfined compressive strength. This value is theoretically twice as big as cohesion [21]. The three parameters are found in this study to be ...
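
    For readers outside geotechnics, the standard soil-mechanics identity behind these quantities (a textbook relation under the undrained, $\phi = 0$ idealization, not a result of this study) ties the unconfined compressive strength $q_u$ to the undrained shear strength $s_u$ and the cohesion $c$:

        $q_u = 2\,s_u = 2\,c \quad\Longleftrightarrow\quad s_u = q_u/2 = c$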

  17. Effect of Temperature on the Tensile Strength and Thermoelectric ...

    African Journals Online (AJOL)

    The tensile strength and thermoelectric e.m.f. values of 6063 aluminum alloy quenched at different temperatures from 250 °C to 600 °C were investigated. The result empirically confirmed that a perfect correlation exists between the tensile strength and thermoelectric e.m.f. values with concurrent minimum temperature ...

  18. Pyramid image codes

    Science.gov (United States)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.

  19. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  20. Tensile strength of glulam laminations of Nordic spruce

    DEFF Research Database (Denmark)

    Hoffmeyer, Preben; Bräuner, Lise; Boström, Lars

    1999-01-01

    Design of glulam according to the European timber code Eurocode 5 is based on the standard document prEN 1194, according to which glulam beam strength is to be established either by full scale testing or by calculation. The calculation must be based on a knowledge of lamination tensile strength....... This knowledge may be obtained either by adopting a general rule that the characteristic tensile strength is sixty percent of the characteristic bending strength, or by performing tensile tests on an adequate number of laminations representative of the whole population. The present paper presents...... an investigation aimed at establishing such an adequate experimental background for the assignment of strength classes for glulam made of visually strength graded laminations from Nordic sawmills. The investigation includes more than 1800 boards (laminations) of Norway spruce (Picea abies) sampled from eight...
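
    As a worked illustration of the sixty-percent rule mentioned above (the number below is invented for illustration and is not taken from the investigation), a lamination population with a characteristic bending strength of 30 MPa would be assigned a characteristic tensile strength of

        $f_{t,0,k} = 0.6\, f_{m,k} = 0.6 \times 30\ \mathrm{MPa} = 18\ \mathrm{MPa}.$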

  1. Anisotropic Concrete Compressive Strength

    DEFF Research Database (Denmark)

    Gustenhoff Hansen, Søren; Jørgensen, Henrik Brøner; Hoang, Linh Cao

    2017-01-01

    When the load carrying capacity of existing concrete structures is (re-)assessed it is often based on compressive strength of cores drilled out from the structure. Existing studies show that the core compressive strength is anisotropic; i.e. it depends on whether the cores are drilled parallel...... correlation to the curing time. The experiments show no correlation between the anisotropy and the curing time and a small strength difference between the two drilling directions. The literature shows variations in which drilling direction is strongest. Based on a Monte Carlo simulation of the expected...

  2. Strength of human pulleys.

    Science.gov (United States)

    Manske, P R; Lesker, P A

    1977-06-01

    The length, breaking strength, and tensile strength of each of the annular fibro-osseous pulleys of the digital flexor sheath in ten fresh human cadaver specimens were measured. The first annular pulley and the fourth annular pulley were found to be the strongest, while the second annular pulley was the weakest. The design of artificial pulleys should reproduce the strength of the first annular and fourth annular pulleys. Suggested minimum requirements for the breaking strength of artificial implant pulleys may be made based on these studies.

  3. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  4. Quantum coding theorems

    Science.gov (United States)

    Holevo, A. S.

    1998-12-01

    Contents: I. Introduction. II. General considerations: § 1. Quantum communication channel; § 2. Entropy bound and channel capacity; § 3. Formulation of the quantum coding theorem. Weak converse. III. Proof of the direct statement of the coding theorem: § 1. Channels with pure signal states; § 2. Reliability function; § 3. Quantum binary channel; § 4. Case of arbitrary states with bounded entropy. IV. c-q channels with input constraints: § 1. Coding theorem; § 2. Gauss channel with one degree of freedom; § 3. Classical signal on quantum background noise. Bibliography.
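
    For orientation, the "entropy bound" of § 2 is presumably the Holevo bound; as standard background (not quoted from the paper), for a classical-quantum channel in which input letter $i$, used with probability $p_i$, produces the state $\rho_i$, the accessible information is bounded by the Holevo quantity and the classical capacity is its maximum over input distributions:

        $\chi = S\!\Big(\sum_i p_i \rho_i\Big) - \sum_i p_i S(\rho_i), \qquad C = \max_{\{p_i\}} \chi,$

    where $S(\rho) = -\mathrm{Tr}(\rho \log \rho)$ is the von Neumann entropy.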

  5. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....

  6. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  7. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  8. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  9. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered as waste. However, if the common receiver (e.g., base station) is capable of storing the collision slots and using them in a transmission recovery process based on successive interference cancellation, the design space for access protocols is radically expanded. We present the paradigm of coded random access, in which the structure of the access protocol can be mapped to a structure of an erasure-correcting code defined on a graph. This opens the possibility to use coding theory and tools for designing efficient random access protocols, offering markedly better performance than ALOHA. Several instances of coded...
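
    To make the successive interference cancellation idea concrete, the following is a minimal, illustrative Python simulation of one frame of coded slotted ALOHA (a toy setup assumed for illustration; it is not one of the protocol instances analyzed in the record): each user transmits two replicas of its packet in randomly chosen slots, and the receiver repeatedly decodes singleton slots and cancels the corresponding replicas elsewhere.

        import random

        def simulate_frame(num_users=50, num_slots=100, repetitions=2, seed=1):
            """Toy coded slotted ALOHA frame with successive interference cancellation."""
            rng = random.Random(seed)
            slots = [set() for _ in range(num_slots)]   # users colliding in each slot
            user_slots = {}
            for user in range(num_users):
                chosen = rng.sample(range(num_slots), repetitions)
                user_slots[user] = chosen
                for s in chosen:
                    slots[s].add(user)

            resolved = set()
            progress = True
            while progress:
                progress = False
                for s in range(num_slots):
                    if len(slots[s]) == 1:              # singleton slot: decode this user
                        user = next(iter(slots[s]))
                        resolved.add(user)
                        for t in user_slots[user]:      # cancel all of its replicas
                            slots[t].discard(user)
                        progress = True
            return len(resolved) / num_users

        print("fraction of users resolved:", simulate_frame())

    Raising the load (more users per slot) eventually makes the iterative cancellation stall, which is precisely the graph-based erasure-decoding behavior that coding-theoretic design tools are used to optimize.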

  10. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.

  11. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  12. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of an increased feedback and receiver complexity....

  13. Code de conduite

    International Development Research Centre (IDRC) Digital Library (Canada)

    irocca

    their point of view, in a spirit of openness and respect. OUR CODE OF CONDUCT. IDRC is committed to conduct that meets the strictest ethical standards in all of its activities. The Code of Conduct reflects our mission, our employment philosophy and the results of the discussions ...

  14. Open Coding Descriptions

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon PhD

    2016-12-01

    Full Text Available Open coding is a big source of descriptions that must be managed and controlled when doing GT research. The goal of generating a GT is to generate an emergent set of concepts and their properties that fit and work with relevancy to be integrated into a theory. To achieve this goal, the researcher begins his research with open coding, that is coding all his data in every possible way. The consequence of this open coding is a multitude of descriptions for possible concepts that often do not fit in the emerging theory. Thus in this case the researcher ends up with many irrelevant descriptions for concepts that do not apply. To dwell on descriptions for inapplicable concepts ruins the GT theory as it starts. It is hard to stop. Confusion easily sets in. Switching the study to a QDA is a simple rescue. Rigorous focusing on emerging concepts is vital before being lost in open coding descriptions. It is important, no matter how interesting the description may become. Once a core is possible, selective coding can start which will help control against being lost in multiple descriptions.

  15. Precision measurement of the electromagnetic dipole strengths in {sup 11}Be

    Energy Technology Data Exchange (ETDEWEB)

    Kwan, E., E-mail: kwan@nscl.msu.edu [Lawrence Livermore National Laboratory, PO Box 808, Livermore, CA 94550 (United States); Wu, C.Y., E-mail: wu24@llnl.gov [Lawrence Livermore National Laboratory, PO Box 808, Livermore, CA 94550 (United States); Summers, N.C. [Lawrence Livermore National Laboratory, PO Box 808, Livermore, CA 94550 (United States); Hackman, G. [TRIUMF, 4004 Wesbrook Mall, Vancouver, British Columbia, V6T 2A3 (Canada); Drake, T.E. [Department of Physics, University of Toronto, Toronto, Ontario, M5S 1A7 (Canada); Andreoiu, C.; Ashley, R. [Department of Chemistry, Simon Fraser University, Burnaby, British Columbia, V5A 1S6 (Canada); Ball, G.C.; Bender, P.C. [TRIUMF, 4004 Wesbrook Mall, Vancouver, British Columbia, V6T 2A3 (Canada); Boston, A.J.; Boston, H.C. [Department of Physics, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Chester, A. [Department of Chemistry, Simon Fraser University, Burnaby, British Columbia, V5A 1S6 (Canada); Close, A. [TRIUMF, 4004 Wesbrook Mall, Vancouver, British Columbia, V6T 2A3 (Canada); Cline, D. [Department of Physics and Astronomy, University of Rochester, Rochester, NY, 14627 (United States); Cross, D.S. [Department of Chemistry, Simon Fraser University, Burnaby, British Columbia, V5A 1S6 (Canada); Dunlop, R.; Finlay, A. [Department of Physics, University of Guelph, Guelph, Ontario, N1G 2W1 (Canada); Garnsworthy, A.B. [TRIUMF, 4004 Wesbrook Mall, Vancouver, British Columbia, V6T 2A3 (Canada); Hayes, A.B. [Department of Physics and Astronomy, University of Rochester, Rochester, NY, 14627 (United States); Laffoley, A.T. [Department of Physics, University of Guelph, Guelph, Ontario, N1G 2W1 (Canada); and others

    2014-05-01

    The electromagnetic dipole strength in ^{11}Be between the bound states has been measured using low-energy projectile Coulomb excitation at bombarding energies of 1.73 and 2.09 MeV/nucleon on a ^{196}Pt target. An electric dipole transition probability B(E1; 1/2^− → 1/2^+) = 0.102(2) e^2 fm^2 was determined using the semi-classical code Gosia, and a value of 0.098(4) e^2 fm^2 was determined using the Extended Continuum Discretized Coupled Channels method with the quantum mechanical code FRESCO. These extracted B(E1) values are consistent with the average value determined by a model-dependent analysis of intermediate energy Coulomb excitation measurements and are approximately 14% lower than that determined by a lifetime measurement. The much-improved precisions of 2% and 4% in the measured B(E1) values between the bound states, deduced using Gosia and the Extended Continuum Discretized Coupled Channels method, respectively, compared to the previous accuracy of ∼10%, will both aid our understanding of realistic inter-nucleon interactions and help improve them.

  16. Coding stimulus amplitude by correlated neural activity.

    Science.gov (United States)

    Metzen, Michael G; Ávila-Åkerberg, Oscar; Chacron, Maurice J

    2015-04-01

    While correlated activity is observed ubiquitously in the brain, its role in neural coding has remained controversial. Recent experimental results have demonstrated that correlated but not single-neuron activity can encode the detailed time course of the instantaneous amplitude (i.e., envelope) of a stimulus. These have furthermore demonstrated that such coding required and was optimal for a nonzero level of neural variability. However, a theoretical understanding of these results is still lacking. Here we provide a comprehensive theoretical framework explaining these experimental findings. Specifically, we use linear response theory to derive an expression relating the correlation coefficient to the instantaneous stimulus amplitude, which takes into account key single-neuron properties such as firing rate and variability as quantified by the coefficient of variation. The theoretical prediction was in excellent agreement with numerical simulations of various integrate-and-fire type neuron models for various parameter values. Further, we demonstrate a form of stochastic resonance as optimal coding of stimulus variance by correlated activity occurs for a nonzero value of noise intensity. Thus, our results provide a theoretical explanation of the phenomenon by which correlated but not single-neuron activity can code for stimulus amplitude and how key single-neuron properties such as firing rate and variability influence such coding. Correlation coding by correlated but not single-neuron activity is thus predicted to be a ubiquitous feature of sensory processing for neurons responding to weak input.

  17. Development of K-Basin High-Strength Homogeneous Sludge Simulants and Correlations Between Unconfined Compressive Strength and Shear Strength

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Yasuo; Baer, Ellen BK; Chun, Jaehun; Yokuda, Satoru T.; Schmidt, Andrew J.; Sande, Susan; Buchmiller, William C.

    2011-02-20

    K-Basin sludge will be stored in the Sludge Transport and Storage Containers (STSCs) at an interim storage location on Central Plateau before being treated and packaged for disposal. During the storage period, sludge in the STSCs may consolidate/agglomerate, potentially resulting in high-shear-strength material. The Sludge Treatment Project (STP) plans to use water jets to retrieve K-Basin sludge after the interim storage. STP has identified shear strength to be a key parameter that should be bounded to verify the operability and performance of sludge retrieval systems. Determining the range of sludge shear strength is important to gain high confidence that a water-jet retrieval system can mobilize stored K-Basin sludge from the STSCs. The shear strength measurements will provide a basis for bounding sludge properties for mobilization and erosion. Thus, it is also important to develop potential simulants to investigate these phenomena. Long-term sludge storage tests conducted by Pacific Northwest National Laboratory (PNNL) show that high-uranium-content K-Basin sludge can self-cement and form a strong sludge with a bulk shear strength of up to 65 kPa. Some of this sludge has 'paste' and 'chunks' with shear strengths of approximately 3-5 kPa and 380-770 kPa, respectively. High-uranium-content sludge samples subjected to hydrothermal testing (e.g., 185 C, 10 hours) have been observed to form agglomerates with a shear strength up to 170 kPa. These high values were estimated by measured unconfined compressive strength (UCS) obtained with a pocket penetrometer. Due to its ease of use, it is anticipated that a pocket penetrometer will be used to acquire additional shear strength data from archived K-Basin sludge samples stored at the PNNL Radiochemical Processing Laboratory (RPL) hot cells. It is uncertain whether the pocket penetrometer provides accurate shear strength measurements of the material. To assess the bounding material strength and

  18. Pinch Strengths in Healthy Iranian Children and Young Adult Population

    Directory of Open Access Journals (Sweden)

    Iman Dianat

    2015-03-01

    Full Text Available Background: Data on the physical strength capabilities are essential for designing safe and usable products and are useful in a wide range of clinical settings especially during treatment of disease affecting the function of the hand. The purpose of this study was to determine peak lateral pinch strength, key pinch strength, tip-to-tip pinch strength and three-jaw pinch strength exertions in a healthy Iranian children and young adult population. Methods: The study was conducted among 511 participants (242 males and 269 females) aged 7-30 years. Measurements were carried out with both dominant and non-dominant hands in standard sitting posture using a B&L pinch gauge. Two repetitions of each strength measurement were recorded for each condition and the average value of the two trials was used in the subsequent analysis. Results: The results showed significant differences in the pinch strength data in terms of the age, gender and hand dominance. The lateral pinch strength, key pinch strength, tip-to-tip pinch strength and three-jaw pinch strength exertions by females were 68.4%, 68.8%, 78.8% and 81.8% of those exerted by males, respectively. Strength exertions with the non-dominant hand were 6.4%, 5.2%, 6.6% and 5.1% lower than strength exertions of the dominant hand for the lateral pinch strength, key pinch strength, tip-to-tip pinch strength and three-jaw pinch strength exertions, respectively. Conclusion: These findings can be used to fill the gaps in strength data for Iranian population.

  19. Bandwidth efficient coding for fading channels - Code construction and performance analysis

    Science.gov (United States)

    Schlegel, Christian; Costello, Daniel J., Jr.

    1989-01-01

    The authors apply a general method of bounding the event error probability of trellis-coded modulation schemes to fading channels and use the effective length and the minimum-squared-product distance to replace the minimum-free-squared-Euclidean distance as code design parameters for Rayleigh and Rician fading channels with a substantial multipath component. They present 8-PSK trellis codes specifically constructed for fading channels that outperform equivalent codes designed for the additive white Gaussian noise channel when v is greater than or equal to 5. For quasiregular trellis codes there exists an efficient algorithm for evaluating event error probability, and numerical results on Pe which demonstrate the importance of the effective length as a code design parameter for fading channels with or without side information have been obtained. This is consistent with the case for binary signaling, where the Hamming distance remains the best code design parameter for fading channels. The authors show that the use of Reed-Solomon block codes with expanded signal sets becomes interesting only for large values of E(s)/N(0), where they begin to outperform trellis codes.
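
    For reference, and using standard textbook definitions rather than anything quoted from the paper: if two code sequences x and x̂ differ in the set of symbol positions $\eta$, the effective length is $L = |\eta|$ and the minimum squared product distance is

        $d_p^2 = \min_{\mathbf{x} \neq \hat{\mathbf{x}}} \prod_{i \in \eta} |x_i - \hat{x}_i|^2 .$

    On an ideally interleaved Rayleigh fading channel with coherent detection, the pairwise error probability at high SNR falls off roughly as $(d_p^2)^{-1}(E_s/4N_0)^{-L}$, which is why $L$ and $d_p^2$, rather than the free Euclidean distance, drive the code design.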

  20. Value Investing

    OpenAIRE

    Kubínyi, Tomáš

    2014-01-01

    This bachelor's thesis deals with value investing in the form defined by Benjamin Graham. In clarifying the theoretical aspects, particular attention is given to the intrinsic value of stocks and to methods for calculating it. A way to overcome the deficiencies of the two most widely used calculation models is introduced: value screening, which uses a set of defined criteria to flag presumably undervalued stocks. Then the investment approach of the most successful investor, Warren B...

  1. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
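
    As a small illustration of the $[n,k,d]_q$ notation (a generic sketch; the 22 new codes themselves are not listed in this record), the following Python snippet enumerates the codewords generated by a ternary generator matrix and finds the minimum Hamming distance by brute force:

        from itertools import product

        def min_distance(generator, q=3):
            """Brute-force minimum distance of the linear code spanned by `generator` over GF(q)."""
            k, n = len(generator), len(generator[0])
            best = n
            for message in product(range(q), repeat=k):
                if all(m == 0 for m in message):
                    continue                              # skip the zero codeword
                codeword = [sum(m * g for m, g in zip(message, col)) % q
                            for col in zip(*generator)]
                best = min(best, sum(1 for c in codeword if c != 0))
            return best

        # This ternary [4, 2] generator matrix yields a code with minimum distance 3.
        G = [[1, 0, 1, 1],
             [0, 1, 1, 2]]
        print(min_distance(G))   # -> 3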

  2. Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes

    Directory of Open Access Journals (Sweden)

    H. D. Arora

    2015-01-01

    Full Text Available In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is introduced in Information Theory, electrical engineering, mathematics, and computer sciences for the transmission of data through reliable and efficient methods. We have to consider how coding of messages can be done efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of mean codeword length subject to a given constraint on codeword lengths has to be found. In this paper, we have introduced mean codeword length of order α and type β for 1:1 codes and analyzed the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with fuzzy information measure has been established.
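
    For orientation, the crisp classical quantities that the paper's order-α, type-β fuzzy measures generalize (the fuzzy versions themselves are not reproduced here) are, for a source with symbol probabilities $p_i$ and codeword lengths $l_i$ over a $q$-ary alphabet, the mean codeword length, the Kraft inequality for uniquely decodable codes, and the noiseless coding theorem for an optimal such code:

        $L = \sum_i p_i l_i, \qquad \sum_i q^{-l_i} \le 1, \qquad H_q(P) \le L < H_q(P) + 1,$

    where $H_q(P) = -\sum_i p_i \log_q p_i$ is the source entropy; 1:1 (non-singular) codes, the subject of the paper, are not required to be uniquely decodable and can achieve smaller mean lengths.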

  3. Using facilitators in mock codes: recasting the parts for success.

    Science.gov (United States)

    Cuda, S; Doerr, D; Gonzalez, M

    1999-01-01

    Members of the CHRISTUS Santa Rosa Children's Hospital staff development committee identified a need for a mock code program which would address a range of learning needs for nurses and other caregivers with varying levels of knowledge, skills, and experience. We implemented a mock code program using experienced caregivers, usually emergency room and pediatric intensive care RNs and respiratory therapists to serve as facilitators to code participants during the mock code drills. Facilitators have dual roles of teaching and guiding the code participant as well as evaluating performance. Code participants and facilitators benefit from the design of this program. Debriefing session input and written program evaluations show that code participants value the opportunity to practice their skills in a nonthreatening situation in which they receive immediate feedback as needed. Facilitators learn to teach and coach and strengthen their own code knowledge and skills at the same time. This mock code program serves as a unique way to include novice and experienced nurses in mock codes together. The knowledge, skills, and confidence of the code participants and the facilitators have matured. The design of the program allows for immediate teaching/learning where needed, as well as appropriate evaluation. This program develops stronger, calmer, more efficient, and more confident nurses during codes. Practice and equipment changes can be based on findings from the mock codes. The program is invaluable to patients, staff, and hospital.

  4. Code blue: seizures.

    Science.gov (United States)

    Hoerth, Matthew T; Drazkowski, Joseph F; Noe, Katherine H; Sirven, Joseph I

    2011-06-01

    Eyewitnesses frequently perceive seizures as life threatening. If an event occurs on the hospital premises, a "code blue" can be called which consumes considerable resources. The purpose of this study was to determine the frequency and characteristics of code blue calls for seizures and seizure mimickers. A retrospective review of a code blue log from 2001 through 2008 identified 50 seizure-like events, representing 5.3% of all codes. Twenty-eight (54%) occurred in inpatients; the other 22 (44%) events involved visitors or employees on the hospital premises. Eighty-six percent of the events were epileptic seizures. Seizure mimickers, particularly psychogenic nonepileptic seizures, were more common in the nonhospitalized group. Only five (17.9%) inpatients had a known diagnosis of epilepsy, compared with 17 (77.3%) of the nonhospitalized patients. This retrospective survey provides insights into how code blues are called on hospitalized versus nonhospitalized patients for seizure-like events. Copyright © 2011. Published by Elsevier Inc.

  5. Error coding simulations

    Science.gov (United States)

    Noble, Viveca K.

    1993-11-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal to noise needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
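
    As a concrete example of the error-detection component mentioned above, here is a short Python implementation of a 16-bit CRC in the CCITT style (polynomial 0x1021, initial value 0xFFFF, no reflection, no final XOR), which is the flavor of 16-bit CRC commonly associated with CCSDS framing; it is shown as an illustrative sketch rather than a statement of the exact parameters used in the cited work.

        def crc16_ccitt(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
            """Bitwise CRC-16 (CCITT polynomial), MSB first, no reflection, no final XOR."""
            crc = init
            for byte in data:
                crc ^= byte << 8
                for _ in range(8):
                    if crc & 0x8000:
                        crc = ((crc << 1) ^ poly) & 0xFFFF
                    else:
                        crc = (crc << 1) & 0xFFFF
            return crc

        print(hex(crc16_ccitt(b"123456789")))   # 0x29b1 for this parameter set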

  6. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  7. Manufacturer Identification Code (MID) - ACE

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  8. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  9. Split field coding: low complexity error-resilient entropy coding for image compression

    Science.gov (United States)

    Meany, James J.; Martens, Christopher J.

    2008-08-01

    In this paper, we describe split field coding, an approach for low complexity, error-resilient entropy coding which splits code words into two fields: a variable length prefix and a fixed length suffix. Once a prefix has been decoded correctly, then the associated fixed length suffix is error-resilient, with bit errors causing no loss of code word synchronization and only a limited amount of distortion on the decoded value. When the fixed length suffixes are segregated to a separate block, this approach becomes suitable for use with a variety of methods which provide varying protection to different portions of the bitstream, such as unequal error protection or progressive ordering schemes. Split field coding is demonstrated in the context of a wavelet-based image codec, with examples of various error resilience properties, and comparisons to the rate-distortion and computational performance of JPEG 2000.
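
    To illustrate the prefix/suffix split in miniature (a generic magnitude-category sketch in the spirit of the description above, with an invented bit layout; it is not the authors' codec), the variable-length prefix below tells the decoder how many suffix bits follow, and the fixed-length suffix carries the value's low-order bits, so a corrupted suffix bit perturbs the decoded value without ever desynchronizing the stream of code words.

        def split_field_encode(value: int):
            """Encode a non-negative integer as (unary prefix, fixed-length binary suffix)."""
            if value == 0:
                return "0", ""
            n_bits = value.bit_length()
            prefix = "1" * n_bits + "0"          # variable-length, sync-critical part
            suffix = format(value, "b")          # fixed-length part, n_bits wide
            return prefix, suffix

        def split_field_decode(prefix: str, suffix: str) -> int:
            n_bits = prefix.index("0")           # number of leading '1's in the prefix
            return int(suffix, 2) if n_bits else 0

        prefix, suffix = split_field_encode(13)                  # '11110', '1101'
        corrupted = suffix[:2] + ("1" if suffix[2] == "0" else "0") + suffix[3:]
        print(split_field_decode(prefix, suffix),                # 13
              split_field_decode(prefix, corrupted))             # a nearby value, still in sync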

  10. Sensitivity analysis on ultimate strength of aluminium stiffened panels

    DEFF Research Database (Denmark)

    Rigo, P.; Sarghiuta, R.; Estefen, S.

    2003-01-01

    This paper presents the results of an extensive sensitivity analysis carried out by the Committee III.1 "Ultimate Strength" of ISSC'2003 in the framework of a benchmark on the ultimate strength of aluminium stiffened panels. Previously, different benchmarks were presented by ISSC committees...... stiffened aluminium panels (including extruded profiles). Main objectives are to compare codes/models and to perform quantitative sensitivity analysis of the ultimate strength of a welded aluminium panel on various parameters (typically the heat-affected zone). Two phases were planned. In Phase A, alle...... of different parameters (sensitivity analysis)

  11. Strength of Fibrous Composites

    CERN Document Server

    Huang, Zheng-Ming

    2012-01-01

    "Strength of Fibrous Composites" addresses evaluation of the strength of a fibrous composite by using its constituent material properties and its fiber architecture parameters. Having gone through the book, a reader is able to predict the progressive failure behavior and ultimate strength of a fibrous laminate subjected to an arbitrary load condition in terms of the constituent fiber and matrix properties, as well as fiber geometric parameters. The book is useful to researchers and engineers working on design and analysis for composite materials. Dr. Zheng-Ming Huang is a professor at the School of Aerospace Engineering & Applied Mechanics, Tongji University, China. Mr. Ye-Xin Zhou is a PhD candidate at the Department of Mechanical Engineering, the University of Hong Kong, China.

  12. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  13. High strength alloys

    Energy Technology Data Exchange (ETDEWEB)

    Maziasz, Phillip James; Shingledecker, John Paul; Santella, Michael Leonard; Schneibel, Joachim Hugo; Sikka, Vinod Kumar; Vinegar, Harold J.; John, Randy Carl; Kim, Dong Sub

    2012-06-05

    High strength metal alloys are described herein. At least one composition of a metal alloy includes chromium, nickel, copper, manganese, silicon, niobium, tungsten and iron. System, methods, and heaters that include the high strength metal alloys are described herein. At least one heater system may include a canister at least partially made from material containing at least one of the metal alloys. At least one system for heating a subterranean formation may include a tubular that is at least partially made from a material containing at least one of the metal alloys.

  14. High strength alloys

    Energy Technology Data Exchange (ETDEWEB)

    Maziasz, Phillip James [Oak Ridge, TN; Shingledecker, John Paul [Knoxville, TN; Santella, Michael Leonard [Knoxville, TN; Schneibel, Joachim Hugo [Knoxville, TN; Sikka, Vinod Kumar [Oak Ridge, TN; Vinegar, Harold J [Bellaire, TX; John, Randy Carl [Houston, TX; Kim, Dong Sub [Sugar Land, TX

    2010-08-31

    High strength metal alloys are described herein. At least one composition of a metal alloy includes chromium, nickel, copper, manganese, silicon, niobium, tungsten and iron. System, methods, and heaters that include the high strength metal alloys are described herein. At least one heater system may include a canister at least partially made from material containing at least one of the metal alloys. At least one system for heating a subterranean formation may include a tubular that is at least partially made from a material containing at least one of the metal alloys.

  15. Hand grip strength

    DEFF Research Database (Denmark)

    Frederiksen, Henrik; Gaist, David; Petersen, Hans Christian

    2002-01-01

    in life is a major problem in terms of prevalence, morbidity, functional limitations, and quality of life. It is therefore of interest to find a phenotype reflecting physical functioning which has a relatively high heritability and which can be measured in large samples. Hand grip strength is known......-55%). A powerful design to detect genes associated with a phenotype is obtained using the extreme discordant and concordant sib pairs, of whom 28 and 77 dizygotic twin pairs, respectively, were found in this study. Hence grip strength is a suitable phenotype for identifying genetic variants of importance to mid...

  16. General Sentiment and Value

    DEFF Research Database (Denmark)

    Arvidsson, Adam; Etter, Michael; Colleoni, Elanor

    The aim of this paper is to deepen the understanding of the relationship between corporate reputation and financial value. Theories such as the resource-based view or the contractual view lay the ground for the assumption of a linear positive correlation between reputation and financial performance. However......, existing empirical studies have provided conflicting results regarding the direction and strength of this relationship so far. In this paper we claim that the assumption of a direct linear correlation between corporate reputation and financial value misrepresents current financial practices...... and underestimates the complexity of the calculative situations in which values are set on financial markets. Based on in-depth interviews with traders, equity analysts and financial tool developers about the real use of reputation measurement tools, we argue that reputation data rather function as interpretative...

  17. Graph Codes with Reed-Solomon Component Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2006-01-01

    We treat a specific case of codes based on bipartite expander graphs coming from finite geometries. The code symbols are associated with the branches and the symbols connected to a given node are restricted to be codewords in a Reed-Solomon code. We give results on the parameters of the codes...

  18. A study of pelvic ligament strength.

    Science.gov (United States)

    Cosson, Michel; Boukerrou, Malik; Lacaze, Sophie; Lambaudie, Eric; Fasel, Jean; Mesdagh, Henri; Lobry, Pierre; Ego, Anne

    2003-07-01

    To measure the strength at tearing of pelvic ligaments used in the cure of prolapse and urinary incontinence. We performed our measurements on pelvic ligaments from cadaveric specimens. We dissected 29 human female pelvic cadavers whose storage conditions differed. Ten were frozen, 10 fresh and 9 were stored in formalin. In each cadaver we dissected the pre-vertebral ligaments at the promontory and the right and left symmetrical ligaments. These were the iliopectineal, sacrospinous and arcus tendineus of the pelvic fascia. A subjective clinical evaluation of the ligament properties was performed by visual observation as well as finger palpation. Ligaments were classified into three groups. Group A contained high quality ligaments, in terms of thickness and apparent strength following finger palpation. Ligaments of doubtful quality were classified in group B and low apparent quality ligaments in group C. Then the ligaments were stitched by a suture taking the entire ligament, and a force was applied along the vaginal axis until tearing. The device used for strength measurement during traction was a SAMSON-type force gauge, model EASY, serial number SMS-R-ES 300N, manufactured by Andilog, that was developed for the purpose of our study. Measurements were given in newtons (N). There was a great variability in the values obtained at tearing, with minimal values at around 20 N and maximal values at 200 N. Individually measured, ligament strength varied between individuals and, for the same patient, between the type of ligament and the side. The pre-vertebral ligament was on average the strongest. There was no significant difference according to the storage condition except for the pre-vertebral ligament in formalin cadavers. For bilateral ligaments, there was no difference between the left and right side. The iliopectineal ligament was statistically significantly stronger than the sacrospinous and arcus tendineus of the pelvic fascia. There was a correlation between subjective evaluation and

  19. The value of value congruence.

    Science.gov (United States)

    Edwards, Jeffrey R; Cable, Daniel M

    2009-05-01

    Research on value congruence has attempted to explain why value congruence leads to positive outcomes, but few of these explanations have been tested empirically. In this article, the authors develop and test a theoretical model that integrates 4 key explanations of value congruence effects, which are framed in terms of communication, predictability, interpersonal attraction, and trust. These constructs are used to explain the process by which value congruence relates to job satisfaction, organizational identification, and intent to stay in the organization, after taking psychological need fulfillment into account. Data from a heterogeneous sample of employees from 4 organizations indicate that the relationships that link individual and organizational values to outcomes are explained primarily by the trust that employees place in the organization and its members, followed by communication, and, to a lesser extent, interpersonal attraction. Polynomial regression analyses reveal that the relationships emanating from individual and organizational values often deviated from the idealized value congruence relationship that underlies previous theory and research. The authors' results also show that individual and organizational values exhibited small but significant relationships with job satisfaction and organizational identification that bypassed the mediators in their model, indicating that additional explanations of value congruence effects should be pursued in future research. (c) 2009 APA, all rights reserved.

  20. Code of Medical Ethics

    Directory of Open Access Journals (Sweden)

    . SZD-SZZ

    2017-03-01

    Full Text Available The Code was approved on December 12, 1992, at the 3rd regular meeting of the General Assembly of the Medical Chamber of Slovenia and revised on April 24, 1997, at the 27th regular meeting of the General Assembly of the Medical Chamber of Slovenia. The Code was updated and harmonized with the Medical Association of Slovenia and approved on October 6, 2016, at the regular meeting of the General Assembly of the Medical Chamber of Slovenia.

  1. Physical Layer Network Coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Yomo, Hironori; Popovski, Petar

    2013-01-01

    Physical layer network coding (PLNC) has the potential to improve throughput of multi-hop networks. However, most of the works are focused on the simple, three-node model with two-way relaying, not taking into account the fact that there can be other neighboring nodes that can cause/receive inter...

  2. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication, including voice, will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networks. Offering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  3. Securing mobile code.

    Energy Technology Data Exchange (ETDEWEB)

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called ...

  4. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and likely deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  5. Strengths-based positive psychology interventions: A randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention

    Directory of Open Access Journals (Sweden)

    René T. Proyer

    2015-04-01

    Full Text Available Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths which are most typical for a person (i.e., signature strengths) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using signature strengths with one on using individual low-scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions (i.e., using five signature vs. five lesser strengths in a new way) or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to three months and decreased depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on lesser strengths rather than signature strengths and those with comparatively lower levels of strengths tended to benefit more from working on signature strengths; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths (VIA-IS) were associated with greater benefit from the interventions in the signature strengths intervention. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness.

  6. Value Representations

    DEFF Research Database (Denmark)

    Rasmussen, Majken Kirkegaard; Petersen, Marianne Graves

    2011-01-01

    Stereotypic presumptions about gender affect the design process, both in relation to how users are understood and how products are designed. As a way to decrease the influence of stereotypic presumptions in the design process, we propose not to disregard the aspect of gender in the design process......, as the perspective brings valuable insights into different approaches to technology, but instead to view gender through a value lens. Contributing to this perspective, we have developed Value Representations as a design-oriented instrument for staging a reflective dialogue with users. Value Representations...

  7. Comparison of tensile strength of different carbon fabric reinforced epoxy composites

    Directory of Open Access Journals (Sweden)

    Jane Maria Faulstich de Paiva

    2006-03-01

    Full Text Available Carbon fabric/epoxy composites are materials used in the aeronautical industry to manufacture several components such as flaps, ailerons, landing-gear doors and others. To evaluate these materials it is important to know their mechanical properties, for example the tensile strength. Tensile tests are usually performed in the aeronautical industry to determine tensile property data for material specifications, quality assurance and structural analysis. For this work, four different laminate families (F155/PW, F155/HS, F584/PW and F584/HS) were manufactured using pre-impregnated materials (prepregs) based on F155TM and F584TM epoxy resins reinforced with carbon fiber fabric styles Plain Weave (PW) and Eight Harness Satin (8HS). The F155TM matrix is an epoxy resin of the DGEBA (diglycidyl ether of bisphenol A) type that contains a curing agent, and the F584TM is a modified epoxy resin type. The laminates were obtained by a hand lay-up process following an appropriate curing cycle in an autoclave. The samples were evaluated by tensile tests according to ASTM D3039. The F584/PW laminates presented the highest values of tensile strength. However, the highest modulus results were determined for the 8HS composite laminates. The correlation of these results emphasizes the importance of the adequate combination of the polymeric matrix and the reinforcement arrangement in structural composite manufacture. The microscopic analyses of the tested specimens show valid failure modes for composites used in the aeronautical industry.
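
    For readers unfamiliar with how the tensile properties quoted above are reduced from raw test data, the minimal sketch below shows the usual reductions for a flat coupon: strength from peak load over cross-sectional area, and modulus from the slope of the initial linear portion of the stress-strain curve. The coupon dimensions and loads are invented for illustration and are not values from the study.

      # Minimal sketch of the data reduction behind a flat-coupon tensile test
      # (as in ASTM D3039): strength = peak load / cross-section; modulus = slope
      # of the initial linear region of the stress-strain curve.
      # All numbers below are hypothetical, not values from the study above.

      width_mm, thickness_mm = 25.0, 2.0              # coupon cross-section (assumed)
      area_mm2 = width_mm * thickness_mm

      peak_load_N = 45_000.0                          # hypothetical failure load
      tensile_strength_MPa = peak_load_N / area_mm2   # N/mm^2 == MPa

      # Two (stress MPa, strain) points on the initial linear region, hypothetical:
      s1, e1 = 100.0, 0.0020
      s2, e2 = 300.0, 0.0055
      modulus_GPa = (s2 - s1) / (e2 - e1) / 1000.0

      print(f"strength = {tensile_strength_MPa:.0f} MPa, modulus = {modulus_GPa:.1f} GPa")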

  8. Practical significance of weld strength matching

    Energy Technology Data Exchange (ETDEWEB)

    Sloterdijk, W. [N.V. Nederlandse Gasunie, Groningen (Netherlands); Schipaanboord, W.N. [N.V. Nederlandse Gasunie, Groningen (Netherlands)

    1996-10-01

    Defect tolerance in welds in pipelines constructed in modern high strength material depends on the balance in strength between weld material and pipe material. The Guidelines on the assessment of girth weld defects published by the European Pipeline Research Group (EPRG) define in Tier 2 defect limits assuming that the (actual) weld metal yield strength is equal to or greater than the yield strength of the parent material. The defect limits according to Tier 2 exceed the defect limits in 'workmanship standards' (l>25 mm). Nevertheless, the draft European welding standard EN 288 does not yet require a test to measure and verify the weld metal yield strength. Gasunie has performed a test program with the aim of examining the practical significance of weld strength matching in a strain controlled situation and verifying the relevance of limits given in the European welding and line pipe codes, in combination with the EPRG Guidelines. It is concluded that the results of the tests confirm the defect acceptance limits according to Tier 2 of the EPRG Guidelines. (orig.) [German abstract, translated] The tolerance of defects in girth welds in pipelines made of modern high-strength structural steels depends on the ratio of the strength of the weld metal to that of the parent material. The guideline for the assessment of girth weld defects published by the European Pipeline Research Group (EPRG) gives, in the second assessment tier (Tier 2), values for permissible weld defect sizes under the condition that the yield strength of the weld metal is greater than or equal to the yield strength of the parent material. The defects permitted according to Tier 2 are larger than the defect lengths given in good-workmanship codes (l>25 mm). By contrast, the draft European welding standard EN 288 so far lacks such a proof of yield strength. Gasunie has carried out a test programme to investigate the significance of weld metal strength under strain-controlled loading as well as ...

  9. The Strength Compass

    DEFF Research Database (Denmark)

    Ledertoug, Mette Marie

    present in both small children and youths? Gender: Do the results show differences between the two genders? Danish as a mother- tongue language: Do the results show any differences in the strengths display when considering different language and cultural backgrounds? Children with Special Needs: Do...

  10. Flexural strength and microhardness of anterior composites after accelerated aging.

    Science.gov (United States)

    Pala, Kanşad; Tekçe, Neslihan; Tuncer, Safa; Demirci, Mustafa; Öznurhan, Fatih; Serim, Merve

    2017-03-01

    This study aimed to evaluate the flexural strength and microhardness of three different anterior composites after 10 000 thermocycles. The mechanical properties of a nano-fill composite (Filtek Ultimate Universal Restorative (FUR) (Enamel)), a nano-hybrid composite (Clearfil Majesty ES2 (ES2) (Enamel)), and a micro-hybrid composite (G Aenial Anterior (GAA)) were investigated in this study. For the microhardness test, 8-mm diameter and 2-mm thickness composite discs were used (n = 10), and for the flexural strength test, 25 × 2 × 2 mm bar-shaped specimens were prepared (n = 13). The specimens were tested at 24 h and after 10 000 thermocycles. Data were analyzed using two-way analysis of variance and the post-hoc Tukey test (p < .05). Thermocycling significantly affected the microhardness values of the materials (p < .05), with FUR showing higher microhardness than ES2 and GAA. However, the flexural strength of the three composites was statistically similar at 24 h (p > .05). Pearson correlation analysis revealed that there was a negative relationship between the mean hardness and flexural strength values (correlation coefficient = -0.367, p = .043). After 10 000 thermocycles, the microhardness values of each material and the flexural strength of ES2 and GAA decreased significantly relative to the 24 h values. The nano-fill composite FUR displayed significantly higher microhardness values. However, the three resin composites were statistically similar in flexural strength. Ten thousand thermocycles significantly affected microhardness and flexural strength. Key words: Flexural strength, microhardness, anterior composites.
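
    Because the flexural strengths above come from 25 × 2 × 2 mm bars, the sketch below shows the usual three-point-bend reduction, sigma_f = 3FL/(2bd^2). The support span and failure load are hypothetical, and the study may have used a different fixture, so this is only an illustration of the arithmetic.

      # Hedged sketch: flexural strength of a bar specimen assuming a standard
      # three-point bend, sigma_f = 3*F*L / (2*b*d^2). Span and load are
      # hypothetical; only the 2 x 2 mm cross-section comes from the record above.
      F_N = 60.0              # failure load (hypothetical)
      L_mm = 20.0             # support span (hypothetical)
      b_mm, d_mm = 2.0, 2.0   # bar width and depth

      sigma_f_MPa = 3 * F_N * L_mm / (2 * b_mm * d_mm ** 2)
      print(f"flexural strength = {sigma_f_MPa:.0f} MPa")   # -> 225 MPa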

  11. Annotation of selection strengths in viral genomes

    DEFF Research Database (Denmark)

    McCauley, Stephen; de Groot, Saskia; Mailund, Thomas

    2007-01-01

    and pol are indeed annotated as such, we also discover several sites of less stringent negative selection within the env gene. To the best of our knowledge we are the first to subsequently provide a full selection annotation of the Hepatitis B genome by explicitly modelling the evolution within...... obtain an annotation of the coding regions, as well as a posterior probability for each site of the strength of selection acting on it. From this we may deduce the average posterior selection acting on the different genes. Whilst we are encouraged to see in HIV2 that the known-to-be-conserved genes gag

  12. Ptolemy Coding Style

    Science.gov (United States)

    2014-09-05

    because this would combine Ptolemy II with the GPL’d code and thus encumber Ptolemy II with the GPL. Another GNU license is the GNU Library General...permission on the source.eecs.berkeley.edu repositories, then use your local repository. bash-3.2$ svn co svn+ssh://source.eecs.berkeley.edu/chess

  13. Error Correcting Codes

    Indian Academy of Sciences (India)

    The images, which came from Galileo's flyby of the moon on June 26-27, 1996, are reported to be 20 times better than those obtained from the Voyager. Priti Shankar .... a systematic way. Thus was born a brand new field, which has since been ..... mathematically oriented, compact book on coding, containing a few topics not ...

  14. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer for Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  15. (Almost) practical tree codes

    KAUST Repository

    Khina, Anatoly

    2016-08-15

    We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.

  16. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  17. Error Correcting Codes

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 1; Issue 10. Error Correcting Codes How Numbers Protect Themselves. Priti Shankar. Series Article Volume 1 ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  18. Video Coding for ESL.

    Science.gov (United States)

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  19. Physical layer network coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Popovski, Petar; Yomo, Hiroyuki

    2014-01-01

    Physical layer network coding (PLNC) has been proposed to improve throughput of the two-way relay channel, where two nodes communicate with each other, being assisted by a relay node. Most of the works related to PLNC are focused on a simple three-node model and they do not take into account...
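
    To make the two-way relay idea concrete, the toy sketch below shows physical-layer network coding in its simplest digital form: the relay forwards the XOR of the two nodes' packets, and each node recovers the other's packet by XORing the broadcast with its own. This is only the textbook three-node illustration, not the scheme analysed in the record.

      # Toy illustration of two-way relaying with network coding: the relay
      # broadcasts x_A XOR x_B, and each end node cancels its own packet.
      import os

      x_a = os.urandom(8)          # node A's packet
      x_b = os.urandom(8)          # node B's packet

      relay = bytes(a ^ b for a, b in zip(x_a, x_b))     # broadcast by the relay

      recovered_at_a = bytes(r ^ a for r, a in zip(relay, x_a))   # A recovers x_B
      recovered_at_b = bytes(r ^ b for r, b in zip(relay, x_b))   # B recovers x_A

      assert recovered_at_a == x_b and recovered_at_b == x_a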

  20. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Computer Science and Automation, IISc. Their research addresses various aspects of algebraic and combinatorial coding theory. 1 Low Density Parity Check ..... illustrating how the variable Xd is decoded. As mentioned earlier, this algorithm runs iteratively. To start with, in the first iteration, only bits in the first level of the ...

  1. Broadcast Coded Slotted ALOHA

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Brännström, Frederik; Graell i Amat, Alexandre

    2016-01-01

    We propose an uncoordinated medium access control (MAC) protocol, called all-to-all broadcast coded slotted ALOHA (B-CSA) for reliable all-to-all broadcast with strict latency constraints. In B-CSA, each user acts as both transmitter and receiver in a half-duplex mode. The half-duplex mode gives...

  2. Student Dress Codes.

    Science.gov (United States)

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  3. Dress Codes and Uniforms.

    Science.gov (United States)

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  4. Dress Codes. Legal Brief.

    Science.gov (United States)

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  5. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    titled 'A Mathematical Theory of Communication' in the Bell System Technical Journal in 1948. The paper set up a ... 'existential' result but not a 'constructive' one. The construction of such a code evolved from the work ... several papers on hyperbolic geometry. He shifted to the Department of Pure Mathematics at Calcutta.

  6. Cracking the Codes

    Science.gov (United States)

    Heathcote, Dorothy

    1978-01-01

    Prescribes an attitude that teachers can take to help students "crack the code" of a dramatic work, combining a flexible teaching strategy, the suspension of beliefs or preconceived notions about the work, focusing on the drama's text, and choosing a reading strategy appropriate to the dramatic work. (RL)

  7. Corporate governance through codes

    NARCIS (Netherlands)

    Haxhi, I.; Aguilera, R.V.; Vodosek, M.; den Hartog, D.; McNett, J.M.

    2014-01-01

    The UK's 1992 Cadbury Report defines corporate governance (CG) as the system by which businesses are directed and controlled. CG codes are a set of best practices designed to address deficiencies in the formal contracts and institutions by suggesting prescriptions on the preferred role and

  8. Coded SQUID arrays

    NARCIS (Netherlands)

    Podt, M.; Weenink, J.; Weenink, J.; Flokstra, Jakob; Rogalla, Horst

    2001-01-01

    We report on a superconducting quantum interference device (SQUID) system to read out large arrays of cryogenic detectors. In order to reduce the number of SQUIDs required for an array of these detectors, we used code-division multiplexing. This simplifies the electronics because of a significantly
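
    The record is truncated, but the code-division multiplexing it refers to can be sketched in miniature: each detector channel is modulated with one row of an orthogonal (e.g. Hadamard/Walsh) code matrix, the modulated channels are summed onto a single readout line, and the individual signals are recovered by correlating against the codes. The matrix size and signal values below are invented for illustration.

      # Toy sketch of code-division multiplexing: each detector signal is
      # modulated by a row of a Hadamard (Walsh) matrix, summed onto one
      # readout line, and demultiplexed by correlation. Values are illustrative.
      import numpy as np

      H = np.array([[1,  1,  1,  1],
                    [1, -1,  1, -1],
                    [1,  1, -1, -1],
                    [1, -1, -1,  1]])            # 4x4 Hadamard matrix (orthogonal rows)

      signals = np.array([0.3, -1.2, 0.7, 2.0])  # hypothetical detector outputs

      summed = H.T @ signals                     # one multiplexed sample per code chip
      recovered = H @ summed / H.shape[1]        # correlate with each code and normalise

      print(np.allclose(recovered, signals))     # True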

  9. Bond strength versus dentine structure: a modelling approach.

    Science.gov (United States)

    Pashley, D H; Ciucchi, B; Sano, H; Carvalho, R M; Russell, C M

    1995-12-01

    Bond strengths of a hypothetical hydrophilic dentine-bonding agent were calculated as a function of dentine depth and resin strength to evaluate the importance of several variables in a simple model. The tested hypothesis was that the total bond strength was the sum of the strengths of resin tags, hybrid layer and surface adhesion. Each of these three variables has a range of values that can influence its relative contribution. The resulting calculations indicate the potential for higher bond strengths to deep dentine than to superficial dentine in non-vital dentine and the importance of resin strength in the development of strong bonds. Comparison of the calculated bonds with published values indicated that they were within the same order of magnitude. Such theoretical modelling of dentine bonding can identify the relative importance of variables involved in the substrate, resins and surface adhesion.
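
    A minimal sketch of the additive hypothesis described above (total bond strength as the sum of resin-tag, hybrid-layer, and surface-adhesion contributions, with the tag term depending on dentine depth and resin strength) is given below. The functional form and all parameter values are hypothetical placeholders, not those used in the paper.

      # Hedged sketch of the additive bond-strength hypothesis: total strength is
      # the sum of resin-tag, hybrid-layer, and surface-adhesion terms, with the
      # tag term growing with dentine depth (more and wider tubules) and with
      # resin strength. All parameter values are hypothetical placeholders.

      def bond_strength(depth_fraction, resin_strength_MPa):
          """depth_fraction: 0 = superficial dentine, 1 = deep dentine."""
          tubule_area_fraction = 0.01 + 0.21 * depth_fraction    # assumed
          tags = resin_strength_MPa * tubule_area_fraction       # resin-tag term
          hybrid = 5.0                                           # hybrid-layer term (assumed)
          adhesion = 3.0                                         # surface-adhesion term (assumed)
          return tags + hybrid + adhesion

      print(bond_strength(0.1, 80.0))   # superficial dentine
      print(bond_strength(0.9, 80.0))   # deep dentine: higher, as the model predicts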

  10. Reed-Solomon convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Schmale, W

    2005-01-01

    In this paper we will introduce a specific class of cyclic convolutional codes. The construction is based on Reed-Solomon block codes. The algebraic parameters as well as the distance of these codes are determined. This shows that some of these codes are optimal or near optimal.

  11. Causation, constructors and codes.

    Science.gov (United States)

    Hofmeyr, Jan-Hendrik S

    2017-09-13

    Relational biology relies heavily on the enriched understanding of causal entailment that Robert Rosen's formalisation of Aristotle's four causes has made possible, although to date efficient causes and the rehabilitation of final cause have been its main focus. Formal cause has been paid rather scant attention, but, as this paper demonstrates, is crucial to our understanding of many types of processes, not necessarily biological. The graph-theoretic relational diagram of a mapping has played a key role in relational biology, and the first part of the paper is devoted to developing an explicit representation of formal cause in the diagram and how it acts in combination with efficient cause to form a mapping. I then use these representations to show how Von Neumann's universal constructor can be cast into a relational diagram in a way that avoids the logical paradox that Rosen detected in his own representation of the constructor in terms of sets and mappings. One aspect that was absent from both Von Neumann's and Rosen's treatments was the necessity of a code to translate the description (the formal cause) of the automaton to be constructed into the construction process itself. A formal definition of codes in general, and organic codes in particular, allows the relational diagram to be extended so as to capture this translation of formal cause into process. The extended relational diagram is used to exemplify causal entailment in a diverse range of processes, such as enzyme action, construction of automata, communication through the Morse code, and ribosomal polypeptide synthesis through the genetic code. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Non‐coding RNAs in cardiac hypertrophy

    Science.gov (United States)

    Ottaviani, Lara

    2017-01-01

    Abstract Heart failure is one of the largest contributors to disease burden and healthcare outflow in the Western world. Despite significant progress in the treatment of heart failure, disease prognosis remains very poor, with the only curative therapy still being heart transplantation. To counteract the current situation, efforts have been made to better understand the underlying molecular pathways in the progression of cardiac disease towards heart failure, and to link the disease to novel therapeutic targets such as non‐coding RNAs. The non‐coding part of the genome has gained prominence over the last couple of decades, opening a completely new research field and establishing different non‐coding RNAs species as fundamental regulators of cellular functions. Not surprisingly, their dysregulation is increasingly being linked to pathology, including to cardiac disease. Pre‐clinically, non‐coding RNAs have been shown to be of great value as therapeutic targets in pathological cardiac remodelling and also as diagnostic/prognostic biomarkers for heart failure. Therefore, it is to be expected that non‐coding RNA‐based therapeutic strategies will reach the bedside in the future and provide new and more efficient treatments for heart failure. Here, we review recent discoveries linking the function and molecular interactions of non‐coding RNAs with the pathophysiology of cardiac hypertrophy and heart failure. PMID:28233323

  13. Predicting vertebral bone strength by vertebral static histomorphometry

    DEFF Research Database (Denmark)

    Thomsen, Jesper Skovhus; Ebbesen, Ebbe Nils; Mosekilde, Lis

    2002-01-01

    The study investigates the relationship between static histomorphometry and bone strength of human lumbar vertebral bone. The ability of vertebral histomorphometry to predict vertebral bone strength was compared with that of vertebral densitometry, and also with histomorphometry and bone strength...... of the entire vertebral bodies (L-2) were used for histomorphometry. The other iliac crest biopsies and the L-3 were destructively tested by compression. High correlation was found between BV/TV or Tb.Sp and vertebral bone strength (absolute value of r = 0.86 in both cases). Addition of Tb.Th significantly...... of improving the prediction of bone strength of the vertebral body. The correlations between BV/TV of L-2 and bone strength of L-3 were comparable with the correlation obtained by quantitative computed tomography (QCT), peripheral QCT (pQCT), and dual-energy X-ray absorptiometry (DEXA) of L-3 and bone strength...

  14. Effect of curing time on microstructure and mechanical strength ...

    Indian Academy of Sciences (India)

    Mechanical strength of alkali activated mortars cured at 65 °C was assessed for different curing times (4–168 h) using 10 molal NaOH solution as alkaline activator. Compressive strength values around 77 MPa after three days of curing at 65 °C were obtained. A 1.68 MPa/h compressive strength gain rate was observed in the ...

  15. Variable weight spectral amplitude coding for multiservice OCDMA networks

    Science.gov (United States)

    Seyedzadeh, Saleh; Rahimian, Farzad Pour; Glesk, Ivan; Kakaee, Majid H.

    2017-09-01

    The emergence of heterogeneous data traffic such as voice over IP, video streaming and online gaming have demanded networks with capability of supporting quality of service (QoS) at the physical layer with traffic prioritisation. This paper proposes a new variable-weight code based on spectral amplitude coding for optical code-division multiple-access (OCDMA) networks to support QoS differentiation. The proposed variable-weight multi-service (VW-MS) code relies on basic matrix construction. A mathematical model is developed for performance evaluation of VW-MS OCDMA networks. It is shown that the proposed code provides an optimal code length with minimum cross-correlation value when compared to other codes. Numerical results for a VW-MS OCDMA network designed for triple-play services operating at 0.622 Gb/s, 1.25 Gb/s and 2.5 Gb/s are considered.

  16. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. Also we use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is simplex if and only if it is of length of the form $n=2^k-1$ and is generated by an essential idempotent.
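
    As a concrete illustration of the simplex codes referred to above, the sketch below builds the binary [7, 3] simplex code (length n = 2^k - 1 with k = 3, generator columns running over all nonzero 3-bit vectors) and checks the defining property that every nonzero codeword has the same weight. This generator-matrix construction is standard and is not the idempotent-based description used in the paper.

      # Sketch: the binary [7,3] simplex code. Its generator matrix has as columns
      # all nonzero vectors of GF(2)^3, so the length is n = 2^k - 1 with k = 3,
      # and every nonzero codeword has constant weight 2^(k-1) = 4.
      from itertools import product

      k = 3
      columns = [c for c in product([0, 1], repeat=k) if any(c)]   # 7 nonzero columns

      codewords = set()
      for msg in product([0, 1], repeat=k):
          word = tuple(sum(m * c for m, c in zip(msg, col)) % 2 for col in columns)
          codewords.add(word)

      weights = {sum(w) for w in codewords if any(w)}
      print(len(codewords), weights)   # 8 codewords; all nonzero words have weight 4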

  17. Coding Theory and Projective Spaces

    Science.gov (United States)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We find the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally, we describe a search method for constant dimension lexicodes.
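
    For background on the metric underlying codes in the projective space, the sketch below computes the standard subspace distance d(U, V) = dim U + dim V - 2 dim(U ∩ V) over GF(2) by rank computations; this is the usual definition and not necessarily the new echelon-form formula derived in the work above.

      # Sketch: subspace distance d(U,V) = dim U + dim V - 2*dim(U ∩ V) over GF(2),
      # using dim(U ∩ V) = dim U + dim V - dim(U + V) and Gaussian elimination
      # for ranks. This is the standard metric, not the paper's own formula.

      def rank_gf2(rows):
          rows = [list(r) for r in rows]
          rank, ncols = 0, len(rows[0]) if rows else 0
          for col in range(ncols):
              pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
              if pivot is None:
                  continue
              rows[rank], rows[pivot] = rows[pivot], rows[rank]
              for i in range(len(rows)):
                  if i != rank and rows[i][col]:
                      rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
              rank += 1
          return rank

      def subspace_distance(U, V):
          dU, dV = rank_gf2(U), rank_gf2(V)
          d_sum = rank_gf2(U + V)          # dim(U + V) from the stacked generators
          d_int = dU + dV - d_sum          # dim(U ∩ V)
          return dU + dV - 2 * d_int

      U = [[1, 0, 0, 0], [0, 1, 0, 0]]
      V = [[0, 1, 0, 0], [0, 0, 1, 0]]
      print(subspace_distance(U, V))       # 2: the two planes share a line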

  18. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its...... correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking...... strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  19. Implementation of strength and burn models for plastic-bonded explosives and propellants

    Energy Technology Data Exchange (ETDEWEB)

    Reaugh, J E

    2009-05-07

    We have implemented the burn model in LS-DYNA. At present, the damage (porosity and specific surface area) is specified as initial conditions. However, history variables that are used by the strength model are reserved as placeholders for the next major revision, which will be a completely interactive model. We have implemented an improved strength model for explosives based on a model for concrete. The model exhibits peak strength and subsequent strain softening in uniaxial compression. The peak strength increases with increasing strain rate and/or reduced ambient temperature. Under triaxial compression, the strength continues to increase (or at least not decrease) with increasing strain. This behaviour is common to both concrete and polymer-bonded explosives (PBX) because the microstructure of these composites is similar. Both have aggregate material with a broad particle size distribution, although the length scale for concrete aggregate is two orders of magnitude larger than for PBX. The (cement or polymer) binder adheres to the aggregate, and is both pressure and rate sensitive. There is a larger binder content in concrete, compared to the explosive, and the aggregates have different hardness. As a result we expect the parameter values to differ, but the functional forms to be applicable to both. The models have been fit to data from tests on an AWE explosive that is HMX based. The decision to implement the models in LS-DYNA was based on three factors: LS-DYNA is used routinely by the AWE engineering analysis group and has a broad base of experienced users; models implemented in LS-DYNA can be transferred easily to LLNL's ALE 3D using a material model wrapper developed by Rich Becker; and LS-DYNA could accommodate the model requirements for a significant number of additional history variables without the significant time delay associated with code modification.

  20. Non-extensive trends in the size distribution of coding and non-coding DNA sequences in the human genome

    Science.gov (United States)

    Oikonomou, Th.; Provata, A.

    2006-03-01

    We study the primary DNA structure of four of the most completely sequenced human chromosomes (including chromosome 19 which is the most dense in coding), using non-extensive statistics. We show that the exponents governing the spatial decay of the coding size distributions vary between 5.2 ≤ r ≤ 5.7 for the short scales and 1.45 ≤ q ≤ 1.50 for the large scales. On the contrary, the exponents governing the spatial decay of the non-coding size distributions in these four chromosomes take the values 2.4 ≤ r ≤ 3.2 for the short scales and 1.50 ≤ q ≤ 1.72 for the large scales. These results, in particular the values of the tail exponent q, indicate the existence of correlations in the coding and non-coding size distributions with tendency for higher correlations in the non-coding DNA.
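
    For readers unfamiliar with the notation, the tail exponents q quoted above are of the kind obtained by fitting the Tsallis q-exponential family used in non-extensive statistics; one schematic parameterisation of a size distribution in that framework is shown below, where the scale s_0 and the normalisation are placeholders rather than the paper's fitted values.

      % Schematic q-exponential form used in non-extensive (Tsallis) statistics;
      % s_0 and the normalisation are placeholders, not the paper's fitted values.
      P(s) \;\propto\; \exp_q\!\left(-\frac{s}{s_0}\right)
           \;=\; \left[\,1 - (1-q)\,\frac{s}{s_0}\right]^{\frac{1}{1-q}},
      \qquad \exp_q(x) \to e^{x} \ \text{as}\ q \to 1 .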

  1. The Value of Value Sets

    DEFF Research Database (Denmark)

    Sløk-Madsen, Stefan Kirkegaard; Christensen, Jesper

    behavior. This paper attempts to test such a claim. This is done via unique and privileged access to top-level managers in a Fortune 250 company. This company is special in having very well-defined, long-running values that are in opposition to a narrowly defined homo economicus rationality. These values...... involving vignettes and games. Their results were compared to their actual knowledge of the content of the company corporate values. The results were tested against hypotheses on expected rational behavior and a control group consisting of similar-level managers from other companies. This study makes...... and anecdotally true surprisingly little hard evidence has been produced either for or against. This study attempts to rectify this. The study claims that for corporate values to matter they must at least align with, and potentially alter, employee decision-making and hence their concept of optimality and rational...

  2. Swimbladder on Fish Target Strength

    Directory of Open Access Journals (Sweden)

    Sunardi

    2008-08-01

    Full Text Available This paper discusses the target strength (TS) of Selar boops (oxeye scad) and Megalaspis cordyla (torpedo scad), among the most commercially important fish in Malaysia. TS can be determined from in situ measurements and from acoustic calculation of a fish model. The TS value, depth, and position (x-y-z) of a targeted fish can be viewed on the echogram using the FQ-80 Analyzer in the in situ measurement. X-ray imaging can be used to develop the acoustic fish model. After the X-ray process, the swimbladder of Selar boops was measured to occupy a greater percentage of body length and upper surface area than that of Megalaspis cordyla. The percentages of swimbladder width and volume relative to the body are not significantly different for the two fish. These swimbladder measurements support the result of the in situ measurement, in which the TS of Megalaspis cordyla is stronger than that of Selar boops.

  3. Advanced Code for Photocathode Design

    Energy Technology Data Exchange (ETDEWEB)

    Ives, Robert Lawrence [Calabazas Creek Research, Inc., San Mateo, CA (United States); Jensen, Kevin [Naval Research Lab. (NRL), Washington, DC (United States); Montgomery, Eric [Univ. of Maryland, College Park, MD (United States); Bui, Thuc [Calabazas Creek Research, Inc., San Mateo, CA (United States)

    2015-12-15

    The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input using a graphical user interface. Specific platform-dependent (e.g. IMSL) function calls were removed, and Fortran77 components were rewritten for Fortran95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previous rigid and unmodifiable library structures by implementing new materials library data sets and repositioning the library values to external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom and specific emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified and parametric representations updated and extended from previous versions (e.g., Modified Fowler-Dubridge, Modified Three-Step, etc.).

  4. Should managers have a code of conduct?

    Science.gov (United States)

    Bayliss, P

    1994-02-01

    Much attention is currently being given to values and ethics in the NHS. Issues of accountability are being explored as a consequence of the Cadbury report. The Institute of Health Services Management (IHSM) is considering whether managers should have a code of ethics. Central to this issue is what managers themselves think; the application of such a code may well stand or fall by whether managers are prepared to have ownership of it, and are prepared to make it work. Paul Bayliss reports on a survey of managers' views.

  5. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, determining the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  6. Coding vs non-coding: Translatability of short ORFs found in putative non-coding transcripts.

    Science.gov (United States)

    Kageyama, Yuji; Kondo, Takefumi; Hashimoto, Yoshiko

    2011-11-01

    Genome analysis has identified a number of putative non-protein-coding transcripts that do not contain ORFs longer than 100 codons. Although evidence strongly suggests that non-coding RNAs are important in a variety of biological phenomena, the discovery of small peptide-coding mRNAs confirms that some transcripts that have been assumed to be non-coding actually have coding potential. Their abundance and importance in biological phenomena makes the sorting of non-coding RNAs from small peptide-coding mRNAs a key issue in functional genomics. However, validating the coding potential of small peptide-coding RNAs is complicated, because their ORF sequences are usually too short for computational analysis. In this review, we discuss computational and experimental methods for validating the translatability of these non-coding RNAs. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
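
    To make the 100-codon criterion concrete, the simplified sketch below scans a transcript for open reading frames (ATG to the first in-frame stop) on the forward strand and reports their lengths in codons; real ORF-calling pipelines consider both strands and many more edge cases, so this is only an illustration.

      # Simplified ORF scan: find ATG...stop ORFs on the forward strand and report
      # their lengths in codons, e.g. to flag transcripts whose longest ORF is
      # <= 100 codons. Real ORF callers handle both strands and more edge cases.
      STOPS = {"TAA", "TAG", "TGA"}

      def orf_lengths(seq):
          seq = seq.upper()
          lengths = []
          for frame in range(3):
              i = frame
              while i + 3 <= len(seq):
                  if seq[i:i + 3] == "ATG":
                      j = i + 3
                      while j + 3 <= len(seq) and seq[j:j + 3] not in STOPS:
                          j += 3
                      if j + 3 <= len(seq):                  # an in-frame stop was found
                          lengths.append((j - i) // 3 + 1)   # codons including the stop
                      i = j
                  i += 3
          return lengths

      transcript = "GGATGAAACCCGGGTTTTAGCCATGTAA"             # toy sequence
      longest = max(orf_lengths(transcript), default=0)
      print(longest, "codons;", "putative non-coding" if longest <= 100 else "coding?")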

  7. Effect of Reinforcements Combination on the Mechanical Strength of ...

    African Journals Online (AJOL)

    A number of hand lay-up GRP laminates of E-glass are produced in various reinforcement combinations. A careful record of their strengths tested at room temperature is tabulated against each corresponding laminate. A close study of these results shows that their laminate strength varied considerably from initial values, ...

  8. Strength and stiffness capacity utilisation of timber members in roof ...

    African Journals Online (AJOL)

    Of all the individual strength properties, the mean bending strength capacity utilised per member was found to be the highest. The results of this study can be used for decision support related to wood property evaluation throughout the structural lumber value chain where roof truss members are the end products. Keywords: ...

  9. Synthetic histone code.

    Science.gov (United States)

    Fischle, Wolfgang; Mootz, Henning D; Schwarzer, Dirk

    2015-10-01

    Chromatin is the universal template of genetic information in all eukaryotic cells. This complex of DNA and histone proteins not only packages and organizes genomes but also regulates gene expression. A multitude of posttranslational histone modifications and their combinations are thought to constitute a code for directing distinct structural and functional states of chromatin. Methods of protein chemistry, including protein semisynthesis, amber suppression technology, and cysteine bioconjugation, have enabled the generation of so-called designer chromatin containing histones in defined and homogeneous modification states. Several of these approaches have matured from proof-of-concept studies into efficient tools and technologies for studying the biochemistry of chromatin regulation and for interrogating the histone code. We summarize pioneering experiments and recent developments in this exciting field of chemical biology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M³N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
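
    The complexity reduction described above rests on the fact that circular convolution becomes element-wise multiplication under the FFT, so the convolutional operator is diagonalised in the frequency domain and the main linear system can be solved cheaply per frequency bin. The sketch below illustrates only this building block, not the full ADMM dictionary-learning solver.

      # Building block behind the frequency-domain solver: circular convolution
      # diagonalises under the FFT, i.e. fft(d (*) x) = fft(d) * fft(x), which is
      # what allows the main linear system to be solved per frequency. This is
      # only the building block, not the full ADMM algorithm of the work above.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 256
      d = rng.standard_normal(N)      # one dictionary filter (zero-padded to length N)
      x = rng.standard_normal(N)      # its coefficient map

      direct = np.array([sum(d[k] * x[(n - k) % N] for k in range(N)) for n in range(N)])
      via_fft = np.real(np.fft.ifft(np.fft.fft(d) * np.fft.fft(x)))

      print(np.allclose(direct, via_fft))   # True: O(N^2) sum equals the O(N log N) FFT route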

  11. Movement velocity vs. strength training

    Directory of Open Access Journals (Sweden)

    Mário C. Marques

    2017-06-01

    Full Text Available Intensity during strength training has been commonly identified with relative load (percentage of one-repetition maximum, 1RM) or with performing a given maximal number of repetitions in each set (XRM: 5RM, 10RM, 15RM, etc.). Yet, none of these methods is appropriate for precisely monitoring the real training effort in each training session. The first approach requires coaches to individually assess the 1RM value for each athlete. We may agree that expressing intensity as a percentage of the maximum repetition has the advantage that it can be used to program strength training for multiple athletes simultaneously, the loads being later transformed into absolute values (kg) for each individual. Further, another advantage is that this expression of the intensity can clearly reflect the dynamics of the evolution of the training load if we understand the percentage of 1RM as an effort, and not as a simple arithmetic calculus. Nevertheless, direct assessment of 1RM has some possible disadvantages worth noting. It may be associated with risk of injury when performed incorrectly or by novice athletes, and it is time-consuming and impractical for large groups. Moreover, the actual RM can change quite rapidly after only a few training sessions and often the obtained value is not the subject's true maximum. The classic way to prescribe loading intensity is to determine, through trial and error, the maximum number of repetitions that can be performed with a given submaximal weight. For example, 5RM refers to a weight that can only be lifted five times. Some studies identified the relationship between selected percentages of 1RM and the number of repetitions to failure, establishing a repetition maximum continuum. It is believed that certain performance characteristics are best trained using specific RM load ranges. This method eliminates the need for a direct 1RM test, but it is not without drawbacks either. Using exhaustive efforts is common
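
    As a trivial illustration of the first approach above (programming intensity as a percentage of 1RM and later converting it into absolute loads per athlete), consider the sketch below; the 1RM values are invented.

      # Converting a prescribed %1RM intensity into absolute loads (kg) for several
      # athletes at once, as described above. The 1RM values are hypothetical.
      one_rm_kg = {"athlete_A": 140.0, "athlete_B": 95.0, "athlete_C": 180.0}
      prescribed_pct = 0.80                  # session prescribed at 80% of 1RM

      loads = {name: round(rm * prescribed_pct, 1) for name, rm in one_rm_kg.items()}
      print(loads)   # {'athlete_A': 112.0, 'athlete_B': 76.0, 'athlete_C': 144.0}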

  12. Status of MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, and histogramming, as well as the MAD-MARS Beam Line Builder and Graphical User Interface.

  13. Hydra Code Release

    OpenAIRE

    Couchman, H. M. P.; Pearce, F. R.; Thomas, P. A.

    1996-01-01

    Comment: A new version of the AP3M-SPH code, Hydra, is now available as a tar file from the following sites; http://coho.astro.uwo.ca/pub/hydra/hydra.html , http://star-www.maps.susx.ac.uk/~pat/hydra/hydra.html . The release now also contains a cosmological initial conditions generator, documentation, an installation guide and installation tests. A LaTeX version of the documentation is included here

  14. 36 CFR 1210.42 - Codes of conduct.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Codes of conduct. 1210.42... § 1210.42 Codes of conduct. The recipient shall maintain written standards of conduct governing the... interest is not substantial or the gift is an unsolicited item of nominal value. The standards of conduct...

  15. 7 CFR 3019.42 - Codes of conduct.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Codes of conduct. 3019.42 Section 3019.42 Agriculture... § 3019.42 Codes of conduct. The recipient shall maintain written standards of conduct governing the... interest is not substantial or the gift is an unsolicited item of nominal value. The standards of conduct...

  16. 43 CFR 12.942 - Codes of conduct.

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Codes of conduct. 12.942 Section 12.942... Requirements § 12.942 Codes of conduct. The recipient shall maintain written standards of conduct governing the... interest is not substantial or the gift is an unsolicited item of nominal value. The standards of conduct...

  17. 29 CFR 95.42 - Codes of conduct.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Codes of conduct. 95.42 Section 95.42 Labor Office of the... Standards § 95.42 Codes of conduct. The recipient shall maintain written standards of conduct governing the... interest is not substantial or the gift is an unsolicited item of nominal value. The standards of conduct...

  18. 41 CFR 105-72.502 - Codes of conduct.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Codes of conduct. 105-72... § 105-72.502 Codes of conduct. The recipient shall maintain written standards of conduct governing the... interest is not substantial or the gift is an unsolicited item of nominal value. The standards of conduct...

  19. 34 CFR 74.42 - Codes of conduct.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Codes of conduct. 74.42 Section 74.42 Education Office... Procurement Standards § 74.42 Codes of conduct. The recipient shall maintain written standards of conduct... interest is not substantial or the gift is an unsolicited item of nominal value. The standards of conduct...

  20. 2 CFR 215.42 - Codes of conduct.

    Science.gov (United States)

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Codes of conduct. 215.42 Section 215.42... Codes of conduct. The recipient shall maintain written standards of conduct governing the performance of... interest is not substantial or the gift is an unsolicited item of nominal value. The standards of conduct...

  1. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  2. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms, the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  3. Value of Fundamental Science

    Science.gov (United States)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of the fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by the technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the “mystery of the universe”. Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of value of pure science on metaphysics? If not, how can this issue be addressed in the public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  4. FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites

    Science.gov (United States)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna

    2016-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem (taken from the open literature) of a laminated CMC under off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based simulation of the damage response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.
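
    The stochastic-strength-based damage approach named above can be illustrated in miniature: element strengths are sampled from a Weibull distribution, elements whose stress exceeds their sampled strength are marked as failed, and the stiffness of failed elements is reduced. The Weibull parameters, applied stress, and knock-down factor below are placeholders, and the actual FEAMAC/CARES coupling is considerably richer.

      # Miniature illustration of stochastic-strength-based damage: sample element
      # strengths from a Weibull distribution, fail elements whose stress exceeds
      # their strength, and reduce the elastic stiffness of the failed elements.
      # All parameters are placeholders; the real FEAMAC/CARES coupling is richer.
      import numpy as np

      rng = np.random.default_rng(1)
      n_elem = 1000
      m, sigma_0 = 10.0, 400.0                       # Weibull modulus / scale (assumed), MPa
      strengths = sigma_0 * rng.weibull(m, n_elem)   # sampled element strengths

      stress = np.full(n_elem, 320.0)                # uniform applied stress (assumed)
      failed = stress > strengths

      stiffness = np.full(n_elem, 1.0)
      stiffness[failed] *= 1e-3                      # elastic stiffness reduction

      print(f"{failed.mean():.1%} of elements failed")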

  5. Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite

    Science.gov (United States)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu

    2015-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem (taken from the open literature) of a laminated CMC under off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based simulation of the damage response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  6. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  7. Allele coding in genomic evaluation

    DEFF Research Database (Denmark)

    Standen, Ismo; Christensen, Ole Fredslund

    2011-01-01

    this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results: Theoretical derivations showed that parameter...... coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being the best. Conclusions: Different allele coding methods lead to the same inference in the marker-based and equivalent models when a fixed...
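
    The snippet is truncated, but the centered allele coding it refers to is commonly implemented by coding genotypes as 0/1/2 copies of an allele and subtracting twice the allele frequency per marker; a minimal sketch of that centering with an invented genotype matrix is shown below, noting that it mirrors one common convention rather than necessarily the exact codings compared in the study.

      # Sketch of centered allele coding: genotypes coded as 0/1/2 copies of an
      # allele, then centered per marker by twice the allele frequency (2p).
      # The genotype matrix is invented; this is one common centering convention.
      import numpy as np

      M = np.array([[0, 1, 2, 1],      # rows = individuals, columns = markers
                    [1, 1, 0, 2],
                    [2, 0, 1, 1]], dtype=float)

      p = M.mean(axis=0) / 2.0         # allele frequency per marker
      Z = M - 2.0 * p                  # centered coding used in marker-based models

      print(Z.round(2))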

  8. Polynomial weights and code constructions.

    Science.gov (United States)

    Massey, J. L.; Costello, D. J., Jr.; Justesen, J.

    1973-01-01

    Study of certain polynomials with the 'weight-retaining' property that any linear combination of these polynomials with coefficients in a general finite field has Hamming weight at least as great as that of the minimum-degree polynomial included. This fundamental property is used in applications to Reed-Muller codes, a new class of 'repeated-root' binary cyclic codes, two new classes of binary convolutional codes derived from binary cyclic codes, and two new classes of binary convolutional codes derived from Reed-Solomon codes.

  9. Rupture Strength and Irregularity of Fracture Surfaces

    Science.gov (United States)

    Ficker, Tomáš

    2017-10-01

    Textural irregularities of fracture surfaces of cement-based materials seem to be an interesting source of information on some mechanical properties. Besides compressive strength, the flexural strength is strongly correlated with height irregularities (i.e. roughness) of fracture surfaces of hydrated cement pastes. This correlation has been a subject of experimental study. An analytical relation between flexural strength and height irregularities has been inferred. The formula contains height parameters, which represent basic descriptors of surface irregularities of fracture surfaces of cement pastes. These irregularities are governed by the capillary porosity of cement pastes with different water-to-cement ratios. The relation yields values that are in agreement with the empirical formula published in the technical literature.

  10. Hip strength and range of motion

    DEFF Research Database (Denmark)

    Mosler, Andrea B.; Crossley, Kay M.; Thorborg, Kristian

    2017-01-01

    Objectives To determine the normal profiles for hip strength and range of motion (ROM) in a professional football league in Qatar, and examine the effect of leg dominance, age, past history of injury, and ethnicity on these profiles. Design Cross-sectional cohort study. Methods Participants...... values are documented for hip strength and range of motion that can be used as reference profiles in the clinical assessment, screening, and management of professional football players. Leg dominance, recent past injury history and ethnicity do not need to be accounted for when using these profiles...... included 394 asymptomatic, male professional football players, aged 18–40 years. Strength was measured using a hand held dynamometer with an eccentric test in side-lying for hip adduction and abduction, and the squeeze test in supine with 45° hip flexion. Range of motion measures included: hip internal...

  11. Determining the in situ concrete strength of existing structures for assessing their structural safety

    NARCIS (Netherlands)

    Steenbergen, R.D.J.M.; Vervuurt, A.H.J.M.

    2012-01-01

    EN 13791 applies when assessing the in situ compressive strength of structures and precast concrete components. According to the code itself, it may be adopted when doubt arises about the compressive strength of a concrete. For assessing the structural safety of existing structures, however, the

  12. Fatigue experiments on very high strength steel base material and transverse butt welds

    NARCIS (Netherlands)

    Pijpers, R.J.M.; Kolstein, M.H.; Romeijn, A.; Bijlaard, F.S.K.

    2009-01-01

    Very High Strength Steels (VHSS) with nominal strengths up to 1100 MPa have been available on the market for many years. However, the use of these steels in the civil engineering industry is still uncommon, due to lack of design and fabrication knowledge and therefore limited inclusion in codes.

  13. Health Education in India: A Strengths, Weaknesses, Opportunities, and Threats (SWOT) Analysis

    Science.gov (United States)

    Sharma, Manoj

    2005-01-01

    The purpose of this study was to conduct a strengths, weaknesses, opportunities, and threats (SWOT) analysis of the health education profession and discipline in India. Materials from CINAHL, ERIC, MEDLINE, and Internet were collected to conduct the open coding of the SWOT analysis. Strengths of health education in India include an elaborate…

  14. Characteristics of the Strengths and Difficulties Questionnaire in Preschool Children

    NARCIS (Netherlands)

    Theunissen, Meinou H. C.; Vogels, Anton G. C.; de Wolff, Marianne S.; Reijneveld, Sijmen A.

    OBJECTIVES: Validated questionnaires help the preventive child healthcare (PCH) system to identify psychosocial problems. This study assesses the psychometric properties and added value of the Strengths and Difficulties Questionnaire (SDQ) for the identification of psychosocial problems among

  15. Compression strength perpendicular to grain of structural timber and glulam

    DEFF Research Database (Denmark)

    Damkilde, Lars; Hoffmeyer, Preben; Pedersen, Torben N.

    1998-01-01

    The characteristic strength values for compression perpendicular to grain as they appear in EN 338 (structural timber) and EN 1194 (glulam) are currently up for discussion. The present paper provides experimental results based on EN 1193 that may assist in the correct assignment of such strength values. The dominant failure mode of glulam specimens is shown to be fundamentally different from that of structural timber specimens. Glulam specimens often show tension perpendicular to grain failure before the compression strength value is reached. Such failure mode is not seen for structural timber… Nonetheless test results show that the levels of characteristic compression strength perpendicular to grain are of the same order for structural timber and glulam. The values are slightly lower than those appearing in EN 1194 and less than half of those appearing in EN 338. The paper presents a numerical…

  16. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes are constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distances and had not been explored previously.

  17. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  18. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Thommesen, Christian; Høholdt, Tom

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  19. New Code Matched Interleaver for Turbo Codes with Short Frames

    Directory of Open Access Journals (Sweden)

    LAZAR, G. A.

    2010-02-01

    Full Text Available Turbo codes are a parallel concatenation of two or more convolutional codes separated by interleavers, so their performance is influenced not just by the constituent encoders but also by the interleaver. For short-frame turbo codes, the selection of a proper interleaver becomes critical. This paper presents a new algorithm for obtaining a code-matched interleaver that leads to a very high minimum distance and improved performance.
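
    The record does not reproduce the paper's code-matched construction; purely as a generic illustration of the interleaver's role between the two constituent encoders, here is a minimal row-column block interleaver sketch (the frame length and block shape are arbitrary choices for the example):

```python
def block_interleave(bits, rows, cols):
    """Write bits row by row into a rows x cols block, read them out column by column."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(bits, rows, cols):
    """Inverse permutation: undo block_interleave."""
    assert len(bits) == rows * cols
    positions = [(r, c) for c in range(cols) for r in range(rows)]
    out = [None] * (rows * cols)
    for value, (r, c) in zip(bits, positions):
        out[r * cols + c] = value
    return out

frame = list(range(12))                     # a short 12-symbol frame (toy data)
permuted = block_interleave(frame, 3, 4)    # sequence fed to the second encoder
assert block_deinterleave(permuted, 3, 4) == frame
print(permuted)                             # [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]
```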

  20. Shared Value

    OpenAIRE

    Jensen, Mette; Kampmann, Hack; Nielsen, Michalis; Larsen, Rasmus

    2016-01-01

    This project investigates the kind of value being presented when two seemingly different organisations - Red Bull and The Royal Theatre (Det Kongelige Teater) - come together to collaborate and present the Red Bull Cliff Diving 2015 series in Copenhagen in June 2015. The project draws on theories of Axiology, Experience Economy, Branding and Content Marketing, Culture Theory and Public Private Partnerships, all in relation to Theodor Adorno and Max Horkheimer's theory of "Cultural Industry: En...

  1. Code Flows : Visualizing Structural Evolution of Source Code

    NARCIS (Netherlands)

    Telea, Alexandru; Auber, David

    2008-01-01

    Understanding detailed changes done to source code is of great importance in software maintenance. We present Code Flows, a method to visualize the evolution of source code geared to the understanding of fine and mid-level scale changes across several file versions. We enhance an existing visual

  2. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    P. C. Catherine. K. M. S Soyjaudah. Department of Electrical and Electronics Engineering ... in the 1960's, Gallager in his PhD thesis worked on low-density parity-check (LDPC) codes (Gallager 1963). ..... In any case however, it is hoped that the ideas behind TG codes will help in the development of future intelligent coding ...

  3. Code flows : Visualizing structural evolution of source code

    NARCIS (Netherlands)

    Telea, Alexandru; Auber, David

    Understanding detailed changes done to source code is of great importance in software maintenance. We present Code Flows, a method to visualize the evolution of source code geared to the understanding of fine and mid-level scale changes across several file versions. We enhance an existing visual

  4. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    This work proposes a blend of the two technologies, yielding a code that we nicknamed Turbo-Gallager or TG Code. The code has additional “intelligence” compared to its parents. It detects and corrects the so-called “undetected errors” and recovers from individual decoder failure by making use of a network of decoders.

  5. Effect of solute interactions in columbium /Nb/ on creep strength

    Science.gov (United States)

    Klein, M. J.; Metcalfe, A. G.

    1973-01-01

    The creep strength of 17 ternary columbium (Nb)-base alloys was determined using an abbreviated measuring technique, and the results were analyzed to identify the contributions of solute interactions to creep strength. Isostrength creep diagrams and an interaction strengthening parameter, ST, were used to present and analyze data. It was shown that the isostrength creep diagram can be used to estimate the creep strength of untested alloys and to identify compositions with the most economical use of alloy elements. Positive values of ST were found for most alloys, showing that interaction strengthening makes an important contribution to the creep strength of these ternary alloys.

  6. Further results on binary convolutional codes with an optimum distance profile

    DEFF Research Database (Denmark)

    Johannesson, Rolf; Paaske, Erik

    1978-01-01

    Fixed binary convolutional codes are considered which are simultaneously optimal or near-optimal according to three criteria: namely, distance profile d, free distance d_∞, and minimum number of weight-d_∞ paths. It is shown how the optimum distance profile criterion can be used to limit the search for codes with a large value of d_∞. We present extensive lists of such robustly optimal codes containing rate R = 1/2 nonsystematic codes, several with d_∞ superior to that of any previously known code of the same rate and memory; rate R = 2/3 systematic codes; and rate R = 2/3 nonsystematic codes. As a counterpart to quick-look-in (QLI) codes which are not "transparent," we introduce rate R = 1/2 easy-look-in-transparent (ELIT) codes with a feedforward inverse (1 + D, D). In general, ELIT codes have d_∞ superior to that of QLI codes.

  7. Investigation of Dosimetric Parameters of $^{192}$Ir MicroSelectron v2 HDR Brachytherapy Source Using EGSnrc Monte Carlo Code

    CERN Document Server

    Naeem, Hamza; Zheng, Huaqing; Cao, Ruifen; Pei, Xi; Hu, Liqin; Wu, Yican

    2016-01-01

    The $^{192}$Ir sources are widely used for high dose rate (HDR) brachytherapy treatments. The aim of this study is to simulate the $^{192}$Ir MicroSelectron v2 HDR brachytherapy source and calculate the air kerma strength, dose rate constant, radial dose function and anisotropy function established in the updated AAPM Task Group 43 protocol. The EGSnrc Monte Carlo (MC) code package is used to calculate these dosimetric parameters, including the dose contribution from the secondary electron source and the contribution of bremsstrahlung photons to the air kerma strength. The air kerma strength, dose rate constant and radial dose function, as well as the anisotropy functions at distances greater than 0.5 cm from the source center, are in good agreement with previously published studies. The value obtained from the MC simulation for the air kerma strength is $9.762\times 10^{-8}\ \mathrm{U\,Bq^{-1}}$ and the dose rate constant is $1.108 \pm 0.13\%\ \mathrm{cGy\,h^{-1}\,U^{-1}}$.
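
    For context, these parameters enter the standard AAPM TG-43 dose-rate equation (general 2D line-source form of the TG-43 update; shown here for reference, not as a result of this study):

\[
\dot{D}(r,\theta) \;=\; S_K\,\Lambda\,\frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\,g_L(r)\,F(r,\theta),
\qquad r_0 = 1\ \mathrm{cm},\ \theta_0 = 90^{\circ},
\]

    where \(S_K\) is the air kerma strength, \(\Lambda\) the dose rate constant, \(G_L\) the line-source geometry function, \(g_L\) the radial dose function, and \(F\) the 2D anisotropy function.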

  8. Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal

    Science.gov (United States)

    Zamudio, Gabriel S.; José, Marco V.

    2018-03-01

    In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
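
    The algebraic connectivity referred to here is the second-smallest eigenvalue of the graph Laplacian. A minimal sketch of how it could be computed for a small undirected graph (the adjacency matrix below is an arbitrary toy example, not one of the paper's phenotypic codon graphs):

```python
import numpy as np

def algebraic_connectivity(adjacency):
    """Second-smallest eigenvalue of the Laplacian L = D - A of an undirected graph."""
    A = np.asarray(adjacency, dtype=float)
    D = np.diag(A.sum(axis=1))            # degree matrix
    L = D - A                             # graph Laplacian
    eigenvalues = np.linalg.eigvalsh(L)   # ascending order for symmetric matrices
    return eigenvalues[1]

# Toy 4-node path graph 0-1-2-3.
A = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(algebraic_connectivity(A))  # approximately 0.586 (= 2 - sqrt(2)) for this path
```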

  9. Application and comparison of stability analysis of slope using circular arc method and strength reduction method

    OpenAIRE

    Xu Bin Bin; Xie Lin Bo; Si Wei

    2016-01-01

    In order to evaluate the accuracy of strength reduction method using FEM, the safety factors of the uniform clay slope and sand slope are investigated by Fellenius’s method, Bishop’s method using traditional limit equilibrium method and strength reduction method respectively. The limit equilibrium method is carried out based on the code for foundations in port engineering and the FE analysis is based on Plaxis3D. The results show that the safety coefficient obtained by strength reduction meth...

  10. Strength and Fractography of Glass Wool Fibres

    DEFF Research Database (Denmark)

    Lund, Majbritt Deichgræber; Yue, Yuanzheng

    strength. In spite of those advantages, GWFs show a certain degree of brittleness, which limits the mechanical performance of GWFs during both transportation and application. Therefore, a reduction in the brittleness of GWFs is an inevitable task for us. To do so, it is important to look into the fracture… behaviour and its connection to the mechanical strength. Here we report a detailed study of the fracture behaviour of GWFs by means of uniaxial tensile strength tests and SEM micrographs of fractured surfaces. The tensile strength data of GWFs are evaluated by Weibull statistics. The Weibull model does not take… between fracture strength (σf) and mirror radius (r), i.e., σf = A·r^(−1/2), is confirmed for all the GWFs studied. The material constant A (mirror constant) is found to be 2.4–2.7 MPa·m^½ for basaltic wool and 2.0 MPa·m^½ for E-glass wool, which is similar to the values reported in the literature for different…

  11. Code lists for interoperability - Principles and best practices in INSPIRE

    Science.gov (United States)

    Lutz, M.; Portele, C.; Cox, S.; Murray, K.

    2012-04-01

    Using free text for attribute values when exchanging geoscience data can lead to a number of problems, e.g. because different data providers and consumers use different languages, terminology or spellings. To overcome these issues, well-defined schemes of codes or concepts, known as code lists, are preferred to free text in defining the value domain of an attribute. The "code list" concept is well established in geospatial modelling standards (e.g. ISO 19103), however, it has been used in many different ways. Here we present some considerations relating to code lists and related interoperability requirements in spatial data infrastructures (SDIs), in particular as discussed in the INSPIRE data specifications working groups. These will form the basis for the specification of code list requirements in the INSPIRE Implementing Rules on interoperability of spatial data sets and services, which provide binding legal obligations for EU Member States for the interoperable provision of data related to the environment. Requirements or recommendations for code lists should address the following aspects: Governance: When modeling an application domain, for each feature attribute whose value is a 'term', should we re-use an existing code list or specify a new code list for the SDI initiative? Use of existing code lists is likely to maximize cross-initiative interoperability. Level of obligation: For each use of a code list, what is the level of obligation? Is use of a specified code list(s) mandatory or just recommended? This is particularly important where the specifications carry a legal mandate (as in the case of INSPIRE). Extensibility: Must data providers use only the specified values or may they extend the code list? Are arbitrary extensions allowed or do additional values have to be specialisations of existing values? Specifying values: For each code list, the allowed values have to be specified, either directly in the specification, or by reference to an existing
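
    As a minimal illustration of the core idea (a controlled value domain instead of free text), a data provider might validate attribute values against an agreed code list before publishing. The list contents below are invented placeholders, not an actual INSPIRE code list:

```python
from enum import Enum

class LithologyCode(Enum):
    """Hypothetical code list: the agreed value domain for a 'lithology' attribute."""
    GRANITE = "granite"
    BASALT = "basalt"
    SANDSTONE = "sandstone"

def validate_lithology(value: str) -> LithologyCode:
    """Accept only values from the code list; reject arbitrary free text."""
    try:
        return LithologyCode(value.strip().lower())
    except ValueError:
        allowed = [code.value for code in LithologyCode]
        raise ValueError(f"'{value}' is not in the lithology code list; allowed: {allowed}")

print(validate_lithology("Basalt"))   # LithologyCode.BASALT
# validate_lithology("basalte")       # would raise: not in the code list
```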

  12. Code Generation with Templates

    CERN Document Server

    Arnoldus, Jeroen; Serebrenik, A

    2012-01-01

    Templates are used to generate all kinds of text, including computer code. Over the last decade, the use of templates has gained a lot of popularity due to the increase of dynamic web applications. Templates are a tool for programmers, and implementations of template engines are most often based on practical experience rather than on a theoretical background. This book reveals the mathematical background of templates and shows interesting findings for improving the practical use of templates. First, a framework to determine the necessary computational power for the template metalanguage is presen
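
    As a minimal illustration of template-based code generation (using Python's standard string.Template; the generated function is a made-up example, not one from the book):

```python
from string import Template

# Template for a simple data-access function; $placeholders are filled per entity.
FUNC_TEMPLATE = Template(
    "def get_${entity}_by_id(conn, ${entity}_id):\n"
    '    """Fetch one ${entity} row by primary key."""\n'
    '    cur = conn.execute("SELECT * FROM ${table} WHERE id = ?", (${entity}_id,))\n'
    "    return cur.fetchone()\n"
)

# Instantiating the template yields ready-to-use source code.
generated_source = FUNC_TEMPLATE.substitute(entity="customer", table="customers")
print(generated_source)
```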

  13. Cinder begin creative coding

    CERN Document Server

    Rijnieks, Krisjanis

    2013-01-01

    Presented in an easy to follow, tutorial-style format, this book will lead you step-by-step through the multi-faceted uses of Cinder. "Cinder: Begin Creative Coding" is for people who already have experience in programming. It can serve as a transition from a previous background in Processing, Java in general, JavaScript, openFrameworks, C++ in general or ActionScript to the framework covered in this book, namely Cinder. If you like quick and easy to follow tutorials that will let you see progress in less than an hour - this book is for you. If you are searching for a book that will explain al

  14. Code of Ethics.

    Science.gov (United States)

    Sheppard, Glenn W.; Schulz, William E.; McMahon, Sylvia-Anne

    This booklet expresses the ethical principles and values of the Canadian Counseling Association and serves as a guide to the professional conduct of all its members. It also informs the public served by the association of the standards of ethical conduct for which members are to be responsible and accountable. This guide reflects the values of…

  15. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base, as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters into the programming tools you already use with very little effort.

  16. Influence of dentin pretreatment on bond strength of universal adhesives.

    Science.gov (United States)

    Poggio, Claudio; Beltrami, Riccardo; Colombo, Marco; Chiesa, Marco; Scribante, Andrea

    2017-01-01

    Objective: The purpose of the present study was to compare the bond strength of different universal adhesives under three different testing conditions: when no pretreatment was applied, after 37% phosphoric acid etching and after glycine application. Materials and methods: One hundred and fifty bovine permanent mandibular incisors were used as a substitute for human teeth. Five different universal adhesives were tested: Futurabond M+, Scotchbond Universal, Clearfil Universal Bond, G-Premio BOND, Peak Universal Bond. The adhesive systems were applied following each manufacturer's instructions. The teeth were randomly assigned to three different dentin surface pretreatments: no pretreatment agent (control), 37% phosphoric acid etching, glycine pretreatment. The specimens were placed in a universal testing machine in order to measure and compare bond strength values. Results: The Kruskal-Wallis analysis of variance and the Mann-Whitney test were applied to assess significant differences among the groups. Dentin pretreatments provided different bond strength values for the adhesives tested, while similar values were registered in groups without dentin pretreatment. Conclusions: In the present report, dentin surface pretreatment did not provide significant differences in shear bond strength values in almost all groups. Acid pretreatment lowered the bond strength values of the Futurabond and Peak Universal adhesives, whereas glycine pretreatment increased the bond strength values of the G-Premio BOND adhesive system.
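
    The two tests named above can be reproduced on any comparable data set with SciPy; a minimal sketch with invented bond-strength values in MPa (placeholders, not the study's measurements):

```python
from scipy.stats import kruskal, mannwhitneyu

# Hypothetical shear bond strengths (MPa) for one adhesive under the three pretreatments.
control   = [18.2, 20.1, 19.5, 21.0, 17.8]
acid_etch = [14.9, 15.6, 13.8, 16.2, 15.1]
glycine   = [19.0, 20.4, 18.7, 21.3, 19.9]

# Overall comparison across the three groups (Kruskal-Wallis H test).
h_stat, p_overall = kruskal(control, acid_etch, glycine)

# Pairwise follow-up between two groups (Mann-Whitney U test).
u_stat, p_pair = mannwhitneyu(control, acid_etch, alternative="two-sided")

print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_overall:.4f}")
print(f"Mann-Whitney (control vs acid): U = {u_stat:.1f}, p = {p_pair:.4f}")
```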

  17. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low Density Parity Check Accumulate...... (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  18. Hip strength assessment using handheld dynamometry is subject to intertester bias when testers are of different sex and strength

    DEFF Research Database (Denmark)

    Thorborg, K; Bandholm, T; Schick, M

    2013-01-01

    of this study was to examine the intertester reliability concerning strength assessments of hip abduction, adduction, external and internal rotation, flexion and extension using HHD, and to test whether systematic differences in test values exist between testers of different upper-extremity strength. Fifty...... healthy individuals (29 women), aged 25 ± 5 years were included. Two physiotherapist students (one female, one male) of different upper-extremity strength performed the measurements. The tester order and strength test order were randomized. Intraclass correlation coefficients were used to quantify...

  19. Cracking the code of change.

    Science.gov (United States)

    Beer, M; Nohria, N

    2000-01-01

    Today's fast-paced economy demands that businesses change or die. But few companies manage corporate transformations as well as they would like. The brutal fact is that about 70% of all change initiatives fail. In this article, authors Michael Beer and Nitin Nohria describe two archetypes--or theories--of corporate transformation that may help executives crack the code of change. Theory E is change based on economic value: shareholder value is the only legitimate measure of success, and change often involves heavy use of economic incentives, layoffs, downsizing, and restructuring. Theory O is change based on organizational capability: the goal is to build and strengthen corporate culture. Most companies focus purely on one theory or the other, or haphazardly use a mix of both, the authors say. Combining E and O is directionally correct, they contend, but it requires a careful, conscious integration plan. Beer and Nohria present the examples of two companies, Scott Paper and Champion International, that used a purely E or purely O strategy to create change--and met with limited levels of success. They contrast those corporate transformations with that of UK-based retailer ASDA, which has successfully embraced the paradox between the opposing theories of change and integrated E and O. The lesson from ASDA? To thrive and adapt in the new economy, companies must make sure the E and O theories of business change are in sync at their own organizations.

  20. Critical Phenomena in Population Coding

    Science.gov (United States)

    Berkowitz, John; Sharpee, Tatyana

    2014-03-01

    Populations of neurons that code for sensory stimuli are often modeled as having sigmoidal tuning curves where the midpoint and slope of the curve represent, respectively, an intrinsic firing threshold and noise level. Recent studies have shown for two subpopulations of neurons that states below a critical noise level are associated with symmetry breaking between the populations' thresholds. In this work we consider the case of up to seven distinct subpopulations encoding a common Gaussian stimulus. We optimized the mutual information between output patterns and stimuli by adjusting the thresholds for a fixed noise level. In the high-noise regime the optimal thresholds are fully redundant, whereas the low-noise limit predicts distinct threshold values that achieve histogram equalization of the input signal. Between the two limits, the thresholds exhibit a complex branching process that occurs at successive values of the noise level. Each branch corresponds to a critical point of a continuous phase transition. The behavior of the system in the limit of a large number of subpopulations is also investigated, and critical phenomena are also present in the distribution of thresholds in this limit.
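
    The low-noise limit described above corresponds to placing the thresholds at equally spaced quantiles of the stimulus distribution, so that each threshold is crossed with equal probability (histogram equalization). A minimal sketch for a standard Gaussian stimulus and a hypothetical number of subpopulations (this illustrates only the limiting case, not the paper's full mutual-information optimization):

```python
import numpy as np
from scipy.stats import norm

def equalizing_thresholds(n_subpopulations, mu=0.0, sigma=1.0):
    """Thresholds at the k/(n+1) quantiles of a Gaussian stimulus distribution."""
    quantiles = np.arange(1, n_subpopulations + 1) / (n_subpopulations + 1)
    return norm.ppf(quantiles, loc=mu, scale=sigma)

print(equalizing_thresholds(3))  # approximately [-0.6745, 0.0, 0.6745] for N(0, 1)
```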

  1. Strength Development for Young Adolescents

    Science.gov (United States)

    McDaniel, Larry W.; Jackson, Allen; Gaudet, Laura

    2009-01-01

    Participation in strength training is important for older children and young adolescents who wish to improve fitness or participate in sports. When designing strength training programs for youth, it is important to remember that this age group is immature anatomically, physiologically, and psychologically. For the younger or inexperienced group, the strength training activities…

  2. Unified strength theory and its applications

    CERN Document Server

    Yu, Mao-Hong

    2004-01-01

    This is a completely new theory dealing with the yield and failure of materials under multi-axial stresses. It provides a system of yield and failure criteria adopted for most materials, from metallic materials to rocks, concretes, soils, polymers etc. The Unified Strength Theory has been applied successfully to analyse the elastic limit and plastic limit capacities and the dynamic response behaviour of some structures under static and moderate impulsive load, and may be implemented in some elasto-plastic finite element computer codes. The Unified Strength Theory is described in detail, and by using this theory a series of results can be obtained. The Unified Strength Theory can improve on the conservative Mohr-Coulomb Theory, and since the intermediate principal stress is not taken into account in the Mohr-Coulomb theory and most experimental data do not conform to the Mohr-Coulomb Theory, a considerable economic benefit may be obtained. The book can also increase the effect of most commercial finite element computer ...

  3. Values taught, values learned, attitude and performance in mathematics

    Science.gov (United States)

    Limbaco, K. S. A.

    2015-03-01

    The purpose of the study was to identify, describe and find the relationship among values taught, values learned, attitude and performance in mathematics. The researcher used descriptive-correlational method of research to gather information and to describe the nature of situation. The following instruments were used in this study: Math Attitude Inventory, Inventory of Values Taught and Learned which were content validated by experts in the field of Mathematics, Values and Education. Generally, most of the values were taught by the teachers. All of the values were learned by the students. The following got the highest mean ratings for values taught: moral strength, sharing, charity, valuing life, love of God, truth and honesty, reason, alternativism and articulation. The following got highest mean ratings for values learned: patience/tolerance, sharing, charity, valuing life, faith, love of God, truth and honesty, analogical thinking, confidence and individual liberty. Majority of the respondents have moderately positive attitude towards mathematics. Positive statements in the Mathematics Attitude Inventory are "Generally true" while negative statements are "Neutral." In conclusion, values were taught by mathematics teacher, thus, learned by the students. Therefore, mathematics is very much related to life. Values can be learned and strengthened through mathematics; there is a significant relationship between values taught by the teachers and values learned by the students and attitude towards mathematics and performance in mathematics; values taught does not affect attitude towards mathematics and performance in mathematics. A student may have a positive attitude towards mathematics or have an exemplary performance in mathematics even if the mathematics teacher did not teach values; values learned does not affect attitude towards mathematics and performance in mathematics. A student may have a positive attitude towards mathematics or have an exemplary performance

  4. The fracture strength and frictional strength of Weber Sandstone

    Science.gov (United States)

    Byerlee, J.D.

    1975-01-01

    The fracture strength and frictional strength of Weber Sandstone have been measured as a function of confining pressure and pore pressure. Both the fracture strength and the frictional strength obey the law of effective stress, that is, the strength is determined not by the confining pressure alone but by the difference between the confining pressure and the pore pressure. The fracture strength of the rock varies by as much as 20 per cent depending on the cement between the grains, but the frictional strength is independent of lithology. Over the range 0–2 kb, τ = 0.5 + 0.6σn. This relationship also holds for other rocks such as gabbro, dunite, serpentinite, granite and limestone. © 1975.
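
    As a quick numerical check of the relation reconstructed above (with the shear and normal stresses in kilobars over the stated 0–2 kb range), a normal stress of 1 kb gives

\[
\tau \;=\; 0.5 + 0.6\,\sigma_n \;=\; 0.5 + 0.6\times 1\ \mathrm{kb} \;=\; 1.1\ \mathrm{kb}.
\]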

  5. Authorship Attribution of Source Code

    Science.gov (United States)

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  6. Coded nanoscale self-assembly

    Indian Academy of Sciences (India)

    …the number of starting particles. Figure 6. Coded self-assembly results in specific shapes. When the constituent particles are coded to combine only according to certain defined rules, the assembly always manages to generate the same shape. The simplest case of linear coding with the multiseed option is presented here. …in place the resultant…

  7. Strongly-MDS convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Rosenthal, J; Smarandache, R

    Maximum-distance separable (MDS) convolutional codes have the property that their free distance is maximal among all codes of the same rate and the same degree. In this paper, a class of MDS convolutional codes is introduced whose column distances reach the generalized Singleton bound at the

  8. Order functions and evaluation codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pellikaan, Ruud; van Lint, Jack

    1997-01-01

    Based on the notion of an order function we construct and determine the parameters of a class of error-correcting evaluation codes. This class includes the one-point algebraic geometry codes as well as the generalized Reed-Muller codes, and the parameters are determined without using the heavy...

  9. Coding Issues in Grounded Theory

    Science.gov (United States)

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  10. Product Codes for Optical Communication

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    2002-01-01

    Many optical communication systems might benefit from forward error correction. We present a hard-decision decoding algorithm for the "Block Turbo Codes", suitable for optical communication, which makes this coding scheme an alternative to Reed-Solomon codes....

  11. Time-Varying Space-Only Codes for Coded MIMO

    CERN Document Server

    Duyck, Dieter; Takawira, Fambirai; Boutros, Joseph J; Moeneclaey, Marc

    2012-01-01

    Multiple antenna (MIMO) devices are widely used to increase reliability and information bit rate. Optimal error rate performance (full diversity and large coding gain), for unknown channel state information at the transmitter and for maximal rate, can be achieved by approximately universal space-time codes, but comes at a price of large detection complexity, infeasible for most practical systems. We propose a new coded modulation paradigm: error-correction outer code with space-only but time-varying precoder (as inner code). We refer to the latter as Ergodic Mutual Information (EMI) code. The EMI code achieves the maximal multiplexing gain and full diversity is proved in terms of the outage probability. Contrary to most of the literature, our work is not based on the elegant but difficult classical algebraic MIMO theory. Instead, the relation between MIMO and parallel channels is exploited. The theoretical proof of full diversity is corroborated by means of numerical simulations for many MIMO scenarios, in te...

  12. Positive Psychology and Character Strengths: Application to Strengths-Based School Counseling

    Science.gov (United States)

    Park, Nansook; Peterson, Christopher

    2008-01-01

    The basic premise of positive psychology is that the happiness and fulfillment of children and youth entail more than the identification and treatment of their problems. This article provides an overview of positive psychology and the Values in Action (VIA) project that classifies and measures 24 widely recognized character strengths. Good…

  13. Multiple Description Coding for Closed Loop Systems over Erasure Channels

    DEFF Research Database (Denmark)

    Østergaard, Jan; Quevedo, Daniel

    2013-01-01

    In this paper, we consider robust source coding in closed-loop systems. In particular, we consider a (possibly) unstable LTI system, which is to be stabilized via a network. The network has random delays and erasures on the data-rate limited (digital) forward channel between the encoder (controller… dropouts and delays, we transmit quantized control vectors containing current control values for the decoder as well as future predicted control values. Second, we utilize multiple description coding based on forward error correction codes to further aid in the robustness towards packet erasures…

  14. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such case is that in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore such possibility and show how a small number of labeled data in the target domain can significantly leverage classification accuracy of the state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling and then learning the model in a supervised fashion can significantly improve classification accuracy.

  15. Reliability Based Code calibration. The use of the JCSS Probabilistic Model Code

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.

    2002-01-01

    A reliability based code calibration procedure is a two step procedure. In the first step target reliabilities are set on the basis of experience or optimisation and in the second step corresponding partial factors and other safety elements (e.g. PSI-values as in Eurocode 1) are derived. This paper

  16. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  17. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  18. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    Packet losses in wireless networks dramatically curb the performance of TCP. This paper introduces a simple coding shim that aids IP-layer traffic in lossy environments while being transparent to transport layer protocols. The proposed coding approach enables erasure correction while being… oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains…

  19. Modest Advertising Signals Strength.

    OpenAIRE

    Ram Orzach; Per Baltzer Overgaard; Yair Tauman

    2001-01-01

    This paper presents a signaling model where both price and advertising expenditures are used as signals of the initially unobservable quality of a newly introduced experience good. Consumers can be either "fastidious" or "indifferent". Fastidious individuals place a greater value on a high-quality product and a lesser value on the low-quality product than do indifferent individuals. It is shown that a sensible separating equilibrium exists where both firms set their full information prices. H...

  20. Fair Value or Market Value?

    Directory of Open Access Journals (Sweden)

    Bogdan Cosmin Gomoi

    2014-12-01

    Full Text Available When taking into consideration the issue of defining the "fair value" concept, those less experienced in the area often fall into the "price trap", treating price as the equivalent of the fair value of financial structures. This valuation basis appears as a consequence of the attempt to provide an "accurate image" through the financial statements and also of the premises offered by the going-concern (activity continuing) principle. The specialized literature generates ample controversy regarding the "fair value" concept and the "market value" concept. The paper aims to debate this issue, taking into account various opinions.

  1. On strength of porous material

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    1999-01-01

    The question of non-destructive testing of porous materials has always been of interest for the engineering profession. A number of empirically based MOE-MOR relations between stiffness (Modulus Of Elasticity) and strength (Modulus Of Rupture) of materials have been established in order to control… to the theoretical research on non-destructive testing of such materials relating strength to stiffness and pore geometry. It is demonstrated that solutions for stiffness, tensile strength, and pore strength (damaging pore pressure, frost, fire) for some ideal porous materials can be determined theoretically only… from knowing about pore geometry, solid phase stiffness, and zero-porosity strength. Pore geometry is the very important common denominator which controls both stiffness and strength. The accurate results obtained are finally used to suggest generalizations with respect to strength in general…

  2. Scoring the Strengths and Weaknesses of Underage Drinking Laws in the United States.

    Science.gov (United States)

    Fell, James C; Thomas, Sue; Scherer, Michael; Fisher, Deborah A; Romano, Eduardo

    2015-03-01

    Several studies have examined the impact of a number of minimum legal drinking age 21 (MLDA-21) laws on underage alcohol consumption and alcohol-related crashes in the United States. These studies have contributed to our understanding of how alcohol control laws affect drinking and driving among those who are under age 21. However, much of the extant literature examining underage drinking laws use a "Law/No law" coding which may obscure the variability inherent in each law. Previous literature has demonstrated that inclusion of law strengths may affect outcomes and overall data fit when compared to "Law/No law" coding. In an effort to assess the relative strength of states' underage drinking legislation, a coding system was developed in 2006 and applied to 16 MLDA-21 laws. The current article updates the previous endeavor and outlines a detailed strength coding mechanism for the current 20 MLDA-21 laws.

  3. Modeling of Compressive Strength for Self-Consolidating High-Strength Concrete Incorporating Palm Oil Fuel Ash

    Science.gov (United States)

    Safiuddin, Md.; Raman, Sudharshan N.; Abdus Salam, Md.; Jumaat, Mohd. Zamin

    2016-01-01

    Modeling is a very useful method for the performance prediction of concrete. Most of the models available in literature are related to the compressive strength because it is a major mechanical property used in concrete design. Many attempts have been made to develop suitable mathematical models for the prediction of compressive strength of different concretes, but not for self-consolidating high-strength concrete (SCHSC) containing palm oil fuel ash (POFA). The present study has used artificial neural networks (ANN) to predict the compressive strength of SCHSC incorporating POFA. The ANN model has been developed and validated in this research using the mix proportioning and experimental strength data of 20 different SCHSC mixes. Seventy percent (70%) of the data were used to carry out the training of the ANN model. The remaining 30% of the data were used for testing the model. The training of the ANN model was stopped when the root mean square error (RMSE) and the percentage of good patterns were 0.001 and ≈100%, respectively. The predicted compressive strength values obtained from the trained ANN model were very close to the experimental values of compressive strength. The coefficient of determination (R2) for the relationship between the predicted and experimental compressive strengths was 0.9486, which shows the higher degree of accuracy of the network pattern. Furthermore, the predicted compressive strength was found very close to the experimental compressive strength during the testing process of the ANN model. The absolute and percentage relative errors in the testing process were significantly low, with a mean value of 1.74 MPa and 3.13%, respectively, which indicated that the compressive strength of SCHSC including POFA can be efficiently predicted by the ANN. PMID:28773520
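
    A comparable workflow (70/30 split, neural-network regression, RMSE and R² reporting) can be sketched with scikit-learn. The feature columns and strength values below are synthetic placeholders, and the generic MLP used here is not the ANN architecture trained in the study:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)

# Placeholder mix-proportion features (e.g. cement, POFA, water, aggregate, admixture)
# and compressive strengths (MPa) standing in for the 20 SCHSC mixes.
X = rng.uniform(0.0, 1.0, size=(20, 5))
y = 60 + 25 * X[:, 0] - 10 * X[:, 2] + rng.normal(0.0, 1.5, size=20)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=1),
)
model.fit(X_train, y_train)

predicted = model.predict(X_test)
rmse = mean_squared_error(y_test, predicted) ** 0.5
print(f"RMSE = {rmse:.2f} MPa, R^2 = {r2_score(y_test, predicted):.3f}")
```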

  4. Modeling of Compressive Strength for Self-Consolidating High-Strength Concrete Incorporating Palm Oil Fuel Ash

    Directory of Open Access Journals (Sweden)

    Md. Safiuddin

    2016-05-01

    Full Text Available Modeling is a very useful method for the performance prediction of concrete. Most of the models available in literature are related to the compressive strength because it is a major mechanical property used in concrete design. Many attempts have been made to develop suitable mathematical models for the prediction of compressive strength of different concretes, but not for self-consolidating high-strength concrete (SCHSC) containing palm oil fuel ash (POFA). The present study has used artificial neural networks (ANN) to predict the compressive strength of SCHSC incorporating POFA. The ANN model has been developed and validated in this research using the mix proportioning and experimental strength data of 20 different SCHSC mixes. Seventy percent (70%) of the data were used to carry out the training of the ANN model. The remaining 30% of the data were used for testing the model. The training of the ANN model was stopped when the root mean square error (RMSE) and the percentage of good patterns were 0.001 and ≈100%, respectively. The predicted compressive strength values obtained from the trained ANN model were very close to the experimental values of compressive strength. The coefficient of determination (R2) for the relationship between the predicted and experimental compressive strengths was 0.9486, which shows the higher degree of accuracy of the network pattern. Furthermore, the predicted compressive strength was found very close to the experimental compressive strength during the testing process of the ANN model. The absolute and percentage relative errors in the testing process were significantly low, with a mean value of 1.74 MPa and 3.13%, respectively, which indicated that the compressive strength of SCHSC including POFA can be efficiently predicted by the ANN.

  5. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the MacWilliams' identities based on the probability of undetected error, and two important tools for algebraic decoding, namely the finite field Fourier transform and the Euclidean algorithm f...

  6. Effect of Composite Action on the Strength of Wood Roofs

    Directory of Open Access Journals (Sweden)

    Ivan A. Campos Varela

    2015-01-01

    Full Text Available Engineering certification for the installation of solar photovoltaic modules on wood roofs is often denied because existing wood roofs do not meet current building codes. Rather than requiring expensive structural retrofits, we desire to show that many roofs are actually sufficiently strong if the effect of composite action produced by joist-sheathing interaction is considered. In a series of laboratory experiments using a limited number of two-by-four wood joists with and without sheathing panels, conventionally sheathed stud-grade joists, surprisingly, exhibited between 18% and 63% higher nominal strength than similar bare joists. To explain this strength increase, a simple model was developed to predict the strengths of the nailed partially composite sections, but the model only justifies a 1.4% to 3.8% increase in bending strength of joists with an allowable bending strength of 1000 psi. More testing is indicated to resolve this discrepancy between laboratory results and analytical modeling results. In addition to elucidating nonlinear partial composite behavior of existing roof systems, this paper shows that, with minor changes in roof framing practices, strength increases of 70% or more are achievable, compared to the strengths of conventionally sheathed joists.

  7. Effects of strength training and detraining on knee extensor strength, muscle volume and muscle quality in elderly women.

    Science.gov (United States)

    Correa, Cleiton Silva; Baroni, Bruno Manfredini; Radaelli, Régis; Lanferdini, Fábio Juner; Cunha, Giovani dos Santos; Reischak-Oliveira, Álvaro; Vaz, Marco Aurélio; Pinto, Ronei Silveira

    2013-10-01

    Strength training seems to be an interesting approach to counteract decreases that affect knee extensor strength, muscle mass and muscle quality (force per unit of muscle mass) associated with ageing. However, there is no consensus regarding the changes in muscle mass and their contribution to strength during periods of training and detraining in the elderly. Therefore, this study aimed at verifying the behaviour of knee extensor muscle strength, muscle volume and muscle quality in elderly women in response to a 12-week strength training programme followed by a similar period of detraining. Statistical analysis showed no effect of time on muscle quality. However, strength and muscle volume increased from baseline to post-training (33 and 26 %, respectively). After detraining, the knee extensor strength remained 12 % superior to the baseline values, while the gains in muscle mass were almost completely lost. In conclusion, strength gains and losses due to strength training and detraining, respectively, could not be exclusively associated with muscle mass increases. Training-induced strength gains were partially maintained after 3 months of detraining in elderly subjects.

  8. New tools to analyze overlapping coding regions.

    Science.gov (United States)

    Bayegan, Amir H; Garcia-Martin, Juan Antonio; Clote, Peter

    2016-12-13

    Retroviruses transcribe messenger RNA for the overlapping Gag and Gag-Pol polyproteins, by using a programmed -1 ribosomal frameshift which requires a slippery sequence and an immediate downstream stem-loop secondary structure, together called frameshift stimulating signal (FSS). It follows that the molecular evolution of this genomic region of HIV-1 is highly constrained, since the retroviral genome must contain a slippery sequence (sequence constraint), code appropriate peptides in reading frames 0 and 1 (coding requirements), and form a thermodynamically stable stem-loop secondary structure (structure requirement). We describe a unique computational tool, RNAsampleCDS, designed to compute the number of RNA sequences that code two (or more) peptides p,q in overlapping reading frames, that are identical (or have BLOSUM/PAM similarity that exceeds a user-specified value) to the input peptides p,q. RNAsampleCDS then samples a user-specified number of messenger RNAs that code such peptides; alternatively, RNAsampleCDS can exactly compute the position-specific scoring matrix and codon usage bias for all such RNA sequences. Our software allows the user to stipulate overlapping coding requirements for all 6 possible reading frames simultaneously, even allowing IUPAC constraints on RNA sequences and fixing GC-content. We generalize the notion of codon preference index (CPI) to overlapping reading frames, and use RNAsampleCDS to generate control sequences required in the computation of CPI. Moreover, by applying RNAsampleCDS, we are able to quantify the extent to which the overlapping coding requirement in HIV-1 [resp. HCV] contribute to the formation of the stem-loop [resp. double stem-loop] secondary structure known as the frameshift stimulating signal. Using our software, we confirm that certain experimentally determined deleterious HCV mutations occur in positions for which our software RNAsampleCDS and RNAiFold both indicate a single possible nucleotide. We
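
    The overlapping coding requirement can be illustrated with a small brute check: translate a candidate messenger RNA in reading frames 0 and 1 and compare the results against the two target peptides. A minimal sketch assuming Biopython is available; the sequence and peptides are toy examples (not HIV-1 Gag/Gag-Pol data), and no BLOSUM/PAM similarity scoring is attempted:

```python
from Bio.Seq import Seq

def codes_in_overlapping_frames(rna, peptide_frame0, peptide_frame1):
    """True if `rna` encodes peptide_frame0 in reading frame 0 and
    peptide_frame1 in reading frame 1 (identity check only)."""
    def translate(frame):
        sub = rna[frame:]
        sub = sub[: len(sub) - len(sub) % 3]   # trim to whole codons
        return str(Seq(sub).translate())
    return (translate(0).startswith(peptide_frame0)
            and translate(1).startswith(peptide_frame1))

# Toy example: frame 0 reads AUG GCU UUA -> MAL, frame 1 reads UGG CUU -> WL.
rna = "AUGGCUUUAG"
print(codes_in_overlapping_frames(rna, "MAL", "WL"))  # True
```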

  9. Water's Hydrogen Bond Strength

    CERN Document Server

    Chaplin, Martin

    2007-01-01

    Water is necessary both for the evolution of life and its continuance. It possesses particular properties that cannot be found in other materials and that are required for life-giving processes. These properties are brought about by the hydrogen bonded environment particularly evident in liquid water. Each liquid water molecule is involved in about four hydrogen bonds with strengths considerably less than covalent bonds but considerably greater than the natural thermal energy. These hydrogen bonds are roughly tetrahedrally arranged such that when strongly formed the local clustering expands, decreasing the density. Such low density structuring naturally occurs at low and supercooled temperatures and gives rise to many physical and chemical properties that evidence the particular uniqueness of liquid water. If aqueous hydrogen bonds were actually somewhat stronger then water would behave similar to a glass, whereas if they were weaker then water would be a gas and only exist as a liquid at sub-zero temperature...

  10. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

    Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-square sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.

  11. Peripheral coding of taste

    Science.gov (United States)

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  12. Code des baux 2018

    CERN Document Server

    Vial-Pedroletti, Béatrice; Kendérian, Fabien; Chavance, Emmanuelle; Coutan-Lapalus, Christelle

    2017-01-01

    The Code des baux 2018 offers extremely practical and reliable content, up to date as of 1 August 2017. This 16th edition notably incorporates: the decree of 27 July 2017 on the evolution of certain rents in the context of a new letting or a lease renewal, issued pursuant to article 18 of law no. 89-462 of 6 July 1989; the law of 27 January 2017 on equality and citizenship; the law of 9 December 2016 on transparency, the fight against corruption and the modernisation of economic life; and the law of 18 November 2016 on the modernisation of justice in the 21st century.

  13. [Neural codes for perception].

    Science.gov (United States)

    Romo, R; Salinas, E; Hernández, A; Zainos, A; Lemus, L; de Lafuente, V; Luna, R

    This article describes experiments designed to show the neural codes associated with the perception and processing of tactile information. The results of these experiments have revealed the neural activity correlated with tactile perception. The neurones of the primary somatosensory cortex (S1) represent the physical attributes of tactile stimuli, and we found that these representations correlate with tactile perception. By means of intracortical microstimulation we demonstrated the causal relationship between S1 activity and tactile perception. The connection between sensory and motor representations while decisions are being taken is found in the motor areas of the frontal lobe. S1 generates neural representations of the somatosensory stimuli which seem to be sufficient for tactile perception. These neural representations are subsequently processed by areas central to S1 and seem useful in perception, memory and decision making.

  14. Code-labelling

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid; Brynskov, Martin

    The code-labelling exercise is an attempt to apply natural language education techniques for solving the challenge of teaching introductory programming to non-STEM novices in higher education. This paper presents findings from a study exploring the use of natural language teaching techniques in programming education, collected in an Action Research cycle. The results support the use of a structural approach to teaching programming to this target audience; particularly, the translation-grammar method seems to integrate well with programming education. The paper also explores the potential underlying reasons. It seems the exercise invokes an assimilation of students' existing cognitive schemata and supports a deep-learning experience. The exercise is an invitation to other teachers to create further iterations to improve their own teaching. It also seeks to enrich the portfolio of teaching activities...

  15. Transionospheric Propagation Code (TIPC)

    Science.gov (United States)

    Roussel-Dupre, Robert; Kelley, Thomas A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in FORTRAN 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, delta times of arrival (DTOA) study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of DTOAs vs TECs for a specified pair of receivers.
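
    A minimal sketch of the first task's signal chain (input pulse convolved with an ionospheric impulse response, plus additive Gaussian noise, then a matched-filter detection step); the impulse response used here is a toy stand-in and is not the code's TEC-parameterized model.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1e6                                   # sample rate, Hz (illustrative)
t = np.arange(0, 2e-3, 1 / fs)

pulse = np.exp(-((t - 2e-4) / 5e-5) ** 2)  # Gaussian input pulse (one of several analytic choices)

# Toy "ionospheric" impulse response: a delayed, smeared spike standing in for
# the TEC-dependent group delay and dispersion of the real model.
h = np.zeros_like(t)
h[300:340] = np.hanning(40)
h /= h.sum()

received = np.convolve(pulse, h)[: t.size]           # transionospheric signal
received += 0.02 * rng.standard_normal(t.size)       # additive Gaussian white noise

# Simulated detection: convolve against a filter response (here a matched filter).
detected = np.convolve(received, pulse[::-1], mode="same")
print("peak index:", int(np.argmax(detected)))       # crude time-of-arrival tag
```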

  16. Galois LCD Codes over Finite Fields

    OpenAIRE

    Liu, Xiusheng; Fan, Yun; Liu, Hualu

    2017-01-01

    In this paper, we study complementary dual codes in a more general setting (called Galois LCD codes) by a uniform method. A necessary and sufficient condition for linear codes to be Galois LCD codes is determined, and constacyclic codes that are Galois LCD codes are characterized. Some illustrative examples of constacyclic codes that are Galois LCD MDS codes are provided as well. In particular, we study Hermitian LCD constacyclic codes. Finally, we present a construction of a class of ...

  17. Quantum Quasi-Cyclic LDPC Codes

    OpenAIRE

    Hagiwara, Manabu; Imai, Hideki

    2007-01-01

    In this paper, a construction of a pair of "regular" quasi-cyclic LDPC codes as ingredient codes for a quantum error-correcting code is proposed. That is, we find quantum regular LDPC codes with various weight distributions. Furthermore, our proposed codes have many variations in length and code rate. These codes are obtained by a discrete mathematical characterization of the model matrices of quasi-cyclic LDPC codes. Our proposed codes achieve a bounded distance decoding (BDD) bound, or known a...

  18. Valuing vaccination

    Science.gov (United States)

    Bärnighausen, Till; Bloom, David E.; Cafiero-Fonseca, Elizabeth T.; O’Brien, Jennifer Carroll

    2014-01-01

    Vaccination has led to remarkable health gains over the last century. However, large coverage gaps remain, which will require significant financial resources and political will to address. In recent years, a compelling line of inquiry has established the economic benefits of health, at both the individual and aggregate levels. Most existing economic evaluations of particular health interventions fail to account for this new research, leading to potentially sizable undervaluation of those interventions. In line with this new research, we set forth a framework for conceptualizing the full benefits of vaccination, including avoided medical care costs, outcome-related productivity gains, behavior-related productivity gains, community health externalities, community economic externalities, and the value of risk reduction and pure health gains. We also review literature highlighting the magnitude of these sources of benefit for different vaccinations. Finally, we outline the steps that need to be taken to implement a broad-approach economic evaluation and discuss the implications of this work for research, policy, and resource allocation for vaccine development and delivery. PMID:25136129

  19. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
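
    To make the low-density generator matrix idea concrete, here is a hedged sketch of systematic LDGM encoding with a sparse random parity part; the block sizes and column weight are illustrative and unrelated to the schemes evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
k, m, col_weight = 16, 8, 3          # k info bits, m parity bits, sparse columns

# Sparse parity-generator part P (k x m): each column has only `col_weight` ones.
P = np.zeros((k, m), dtype=np.uint8)
for j in range(m):
    P[rng.choice(k, size=col_weight, replace=False), j] = 1

def ldgm_encode(u):
    """Systematic encoding: codeword = [u | u @ P mod 2]."""
    parity = (u @ P) % 2
    return np.concatenate([u, parity])

u = rng.integers(0, 2, size=k, dtype=np.uint8)
print(ldgm_encode(u))
```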

  20. Aspen Code Development Collaboration

    Energy Technology Data Exchange (ETDEWEB)

    none,; Cherry, Robert S. [INL; Richard, Boardman D. [INL

    2013-10-03

    Wyoming has a wealth of primary energy resources in the forms of coal, natural gas, wind, uranium, and oil shale. Most of Wyoming's coal and gas resources are exported from the state in unprocessed form rather than as refined higher value products. Wyoming's leadership recognizes the opportunity to broaden the state's economic base energy resources to make value-added products such as synthetic vehicle fuels and commodity chemicals. Producing these higher value products in an environmentally responsible manner can benefit from the use of clean energy technologies including Wyoming's abundant wind energy and nuclear energy such as new generation small modular reactors including the high temperature gas-cooled reactors.

  1. Analysis of Iterated Hard Decision Decoding of Product Codes with Reed-Solomon Component Codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2007-01-01

    Products of Reed-Solomon codes are important in applications because they offer a combination of large blocks, low decoding complexity, and good performance. A recent result on random graphs can be used to show that with high probability a large number of errors can be corrected by iterating...... minimum distance decoding. We present an analysis related to density evolution which gives the exact asymptotic value of the decoding threshold and also provides a closed form approximation to the distribution of errors in each step of the decoding of finite length codes....

  2. Swirling Strength Vortex Study in Confined Rectangular Jet

    Science.gov (United States)

    Kong, Bo; Olsen, Michael; Fox, Rodney; Hill, James

    2009-11-01

    Vortex behavior in a confined rectangular jet (Re = 20K, Re = 50K) was examined using vortex swirling strength as a defining characteristic. Instantaneous velocity fields were collected using Particle Image Velocimetry (PIV). Swirling strength fields were calculated from the velocity fields and then filtered with a universal threshold of 1.5 times the swirling-strength RMS value. Vortex structures were defined by identifying clusters in the filtered swirling strength fields. Both the instantaneous swirling strength field data and the vortex population calculation indicate that positively (counterclockwise) rotating vortices are dominant on the left side of the jet, and negatively (clockwise) rotating vortices are dominant on the right side. As the flow develops further downstream, the vortex population decreases and the flow approaches channel flow. At the locations of the left peak of turbulent kinetic energy, two-point spatial cross-correlations of swirling strength with velocity fluctuation were calculated. Linear stochastic estimation was also used to interpret the spatial correlation results and to determine conditional flow structures. High-speed PIV data were also analyzed using the swirling strength technique to trace the development of vortices. Vortex trajectories were found by tracing individual swirling strength clusters. The speed and strength of individual vortices were also studied using this method.
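
    A rough sketch of the thresholding-and-clustering step described above, using connected-component labelling on a synthetic field; the 1.5 x RMS threshold follows the abstract, while the field itself and the cluster handling are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
swirl = rng.standard_normal((128, 128)) ** 2          # synthetic swirling-strength field
swirl[40:44, 60:64] += 25.0                           # plant one strong "vortex"

threshold = 1.5 * np.sqrt(np.mean(swirl ** 2))        # universal threshold: 1.5 x RMS
mask = swirl > threshold

labels, n_vortices = ndimage.label(mask)              # clusters of above-threshold points
sizes = ndimage.sum(mask, labels, index=range(1, n_vortices + 1))
print(f"{n_vortices} candidate vortex structures, largest covers {int(sizes.max())} points")
```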

  3. Benchmarking Tokamak edge modelling codes

    Science.gov (United States)

    Contributors To The Efda-Jet Work Programme; Coster, D. P.; Bonnin, X.; Corrigan, G.; Kirnev, G. S.; Matthews, G.; Spence, J.; Contributors to the EFDA-JET work programme

    2005-03-01

    Tokamak edge modelling codes are in widespread use to interpret and understand existing experiments, and to make predictions for future machines. Little direct benchmarking has been done between the codes, and the users of the codes have tended to concentrate on different experimental machines. An important validation step is to compare the codes for identical scenarios. In this paper, two of the major edge codes, SOLPS (B2.5-Eirene) and EDGE2D-NIMBUS are benchmarked against each other. A set of boundary conditions, transport coefficients, etc. for a JET plasma were chosen, and the two codes were run on the same grid. Initially, large differences were seen in the resulting plasmas. These differences were traced to differing physics assumptions with respect to the parallel heat flux limits. Once these were switched off in SOLPS, or implemented and switched on in EDGE2D-NIMBUS, the remaining differences were small.

  4. A Canonical Password Strength Measure

    OpenAIRE

    Panferov, Eugene

    2015-01-01

    We notice that the "password security" discourse is missing the most fundamental notion of "password strength" -- it was never properly defined. We propose a canonical definition of "password strength", based on an assessment of the efficiency of a set of possible guessing attacks. Unlike naive password strength assessments, our metric takes into account the attacker's strategy, and we demonstrate the necessity of that feature. This paper does NOT advise you to include "at least three ...
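
    The metric itself is not spelled out in the abstract, but a guessing-efficiency score can be sketched as follows: given an assumed attacker strategy (an ordered guess list), a password is scored by the number of guesses needed to reach it. The dictionary and fallback model below are purely illustrative.

```python
import math

# Assumed attacker strategy: guesses tried in order of decreasing popularity.
attack_order = ["123456", "password", "qwerty", "letmein", "dragon"]

def guess_number(pw, strategy, fallback_alphabet=95):
    """Return an (illustrative) strength score: log2 of the number of guesses
    the assumed attacker needs before reaching `pw`."""
    if pw in strategy:
        return math.log2(strategy.index(pw) + 1)
    # Not in the attacker's list: fall back to brute force over printable ASCII.
    return math.log2(len(strategy) + fallback_alphabet ** len(pw))

for pw in ["password", "correct horse"]:
    print(pw, round(guess_number(pw, attack_order), 1), "bits")
```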

  5. Commitee III.1 Ultimate Strength

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    1997-01-01

    This report addresses the subject of ductile collapse of ships and offshore structures and their components due to buckling and excessive yielding under overload conditions. Consideration is given to load-deflection predictions for components with fabrication imperfections and in-service damage a...... and to the ultimate strength and post-ultimate behaviour of structural systems in order to identify the reserve strength. The effect of uncertainties in the modelling on the strength predictions is highlighted in two design examples....

  6. Adaptive subband coding of full motion video

    Science.gov (United States)

    Sharifi, Kamran; Xiao, Leping; Leon-Garcia, Alberto

    1993-10-01

    In this paper a new algorithm for digital video coding is presented that is suitable for digital storage and video transmission applications in the range of 5 to 10 Mbps. The scheme is based on frame differencing and, unlike recent proposals, does not employ motion estimation and compensation. A novel adaptive grouping structure is used to segment the video sequence into groups of frames of variable sizes. Within each group, the frame difference is taken in a closed loop Differential Pulse Code Modulation (DPCM) structure and then decomposed into different frequency subbands. The important subbands are transformed using the Discrete Cosine Transform (DCT) and the resulting coefficients are adaptively quantized and runlength coded. The adaptation is based on the variance of sample values in each subband. To reduce the computation load, a very simple and efficient way has been used to estimate the variance of the subbands. It is shown that for many types of sequences, the performance of the proposed coder is comparable to that of coding methods which use motion parameters.
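
    A toy sketch of the variance-driven adaptation described above: a frame difference is split into subbands (a one-level Haar split stands in for the paper's filter bank), and each subband's quantizer step is chosen from a cheap estimate of its spread; all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
prev = rng.random((64, 64))
curr = prev + 0.05 * rng.standard_normal((64, 64))
diff = curr - prev                       # a real closed-loop DPCM would difference against the decoded frame

def haar_split(x):
    """One-level 2D Haar split into LL, LH, HL, HH subbands (toy filter bank)."""
    a, b = x[0::2, :], x[1::2, :]
    lo, hi = (a + b) / 2, (a - b) / 2
    def cols(y):
        c, d = y[:, 0::2], y[:, 1::2]
        return (c + d) / 2, (c - d) / 2
    (ll, lh), (hl, hh) = cols(lo), cols(hi)
    return {"LL": ll, "LH": lh, "HL": hl, "HH": hh}

for name, band in haar_split(diff).items():
    sigma = np.mean(np.abs(band - band.mean())) * 1.2533   # cheap spread estimate (MAD scaled to std)
    step = max(4 * sigma, 1e-3)                            # coarser steps for low-energy subbands
    q = np.round(band / step).astype(int)
    print(name, "step", round(step, 4), "nonzero coeffs", int(np.count_nonzero(q)))
```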

  7. Coupling a Basin Modeling and a Seismic Code using MOAB

    KAUST Repository

    Yan, Mi

    2012-06-02

    We report on a demonstration of loose multiphysics coupling between a basin modeling code and a seismic code running on a large parallel machine. Multiphysics coupling, which is one critical capability for a high performance computing (HPC) framework, was implemented using the MOAB open-source mesh and field database. MOAB provides for code coupling by storing mesh data and input and output field data for the coupled analysis codes and interpolating the field values between different meshes used by the coupled codes. We found it straightforward to use MOAB to couple the PBSM basin modeling code and the FWI3D seismic code on an IBM Blue Gene/P system. We describe how the coupling was implemented and present benchmarking results for up to 8 racks of Blue Gene/P with 8192 nodes and MPI processes. The coupling code is fast compared to the analysis codes and it scales well up to at least 8192 nodes, indicating that a mesh and field database is an efficient way to implement loose multiphysics coupling for large parallel machines.
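
    The field-transfer role played by MOAB can be illustrated in miniature: two codes use different meshes, and the coupler interpolates a field from one mesh onto the other. This is a schematic 1D analogue only; the actual MOAB API and the PBSM/FWI3D data structures are not shown.

```python
import numpy as np

# Mesh used by the "basin" code and mesh used by the "seismic" code (1D stand-ins).
basin_nodes = np.linspace(0.0, 10.0, 21)
seismic_nodes = np.linspace(0.0, 10.0, 57)

# Field produced by the basin code on its own mesh (e.g., a porosity-like profile).
basin_field = np.exp(-0.3 * basin_nodes)

# The coupler's job: interpolate field values onto the other code's mesh.
seismic_field = np.interp(seismic_nodes, basin_nodes, basin_field)
print(seismic_field[:5])
```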

  8. Airborne field strength monitoring

    Science.gov (United States)

    Bredemeyer, J.; Kleine-Ostmann, T.; Schrader, T.; Münter, K.; Ritter, J.

    2007-06-01

    In civil and military aviation, ground based navigation aids (NAVAIDS) are still crucial for flight guidance even though the acceptance of satellite based systems (GNSS) increases. Part of the calibration process for NAVAIDS (ILS, DME, VOR) is to perform a flight inspection according to specified methods as stated in a document (DOC8071, 2000) by the International Civil Aviation Organization (ICAO). One major task is to determine the coverage, or, in other words, the true signal-in-space field strength of a ground transmitter. This has always been a challenge to flight inspection up to now, since, especially in the L-band (DME, 1GHz), the antenna installed performance was known with an uncertainty of 10 dB or even more. In order to meet ICAO's required accuracy of ±3 dB it is necessary to have a precise 3-D antenna factor of the receiving antenna operating on the airborne platform including all losses and impedance mismatching. Introducing precise, effective antenna factors to flight inspection to achieve the required accuracy is new and not published in relevant papers yet. The authors try to establish a new balanced procedure between simulation and validation by airborne and ground measurements. This involves the interpretation of measured scattering parameters gained both on the ground and airborne in comparison with numerical results obtained by the multilevel fast multipole algorithm (MLFMA) accelerated method of moments (MoM) using a complex geometric model of the aircraft. First results will be presented in this paper.

  9. Low complexity hevc intra coding

    OpenAIRE

    Ruiz Coll, José Damián

    2016-01-01

    Over the last few decades, much research has focused on the development and optimization of video codecs for media distribution to end-users via the Internet, broadcasts or mobile networks, but also for videoconferencing and for the recording on optical disks for media distribution. Most of the video coding standards for delivery are characterized by using a high efficiency hybrid schema, based on inter-prediction coding for temporal picture decorrelation, and intra-prediction coding for spat...

  10. IRIG Serial Time Code Formats

    Science.gov (United States)

    2016-08-01

    and G. It should be noted that this standard reflects the present state of the art in serial time code formatting and is not intended to constrain...separation for visual resolution. The LSB occurs first except for the fractional seconds subword that follows the day-of-year subword. The BCD TOY code...and P6 to complete the BCD time code word. An index marker occurs between the decimal digits in each subword to provide separation for visual

  11. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product type codes that cover a single OTN frame or a small number... of such frames. In particular we argue that a three-error-correcting BCH code is the best choice for the component code in such systems.

  12. Indices for Testing Neural Codes

    OpenAIRE

    Jonathan D. Victor; Nirenberg, Sheila

    2008-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is ...

  13. A multiscale strength model in HYDRA

    Science.gov (United States)

    Marinak, M. M.; Barton, N. R.

    2016-10-01

    We describe a multiscale strength model recently implemented in HYDRA. The model incorporates results from a hierarchy of methods which span from the atomistic to the continuum level. Those are obtained from focused physics codes that treat density functional theory, molecular statics, molecular dynamics, dislocation dynamics and continuum mechanics. The model is designed to handle extreme pressures and temperatures, and is especially appropriate for strain rates in excess of 10^4 s^-1. As such it can be used to provide insight into HEDP experimental observations. The model has demonstrated success in capturing planar Rayleigh-Taylor growth for 1 Mbar shocks in Ta and V. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.

  14. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-04-11

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicking and 4D light field view synthesis.

  15. Coding, cryptography and combinatorics

    CERN Document Server

    Niederreiter, Harald; Xing, Chaoping

    2004-01-01

    It has long been recognized that there are fascinating connections between coding theory, cryptology, and combinatorics. Therefore it seemed desirable to us to organize a conference that brings together experts from these three areas for a fruitful exchange of ideas. We decided on a venue in the Huang Shan (Yellow Mountain) region, one of the most scenic areas of China, so as to provide the additional inducement of an attractive location. The conference was planned for June 2003 with the official title Workshop on Coding, Cryptography and Combinatorics (CCC 2003). Those who are familiar with events in East Asia in the first half of 2003 can guess what happened in the end, namely the conference had to be cancelled in the interest of the health of the participants. The SARS epidemic posed too serious a threat. At the time of the cancellation, the organization of the conference was at an advanced stage: all invited speakers had been selected and all abstracts of contributed talks had been screened by the p...

  16. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-12-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.

  17. The FLUKA code: an overview

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F [University of Pavia and INFN (Italy); Battistoni, G [University of Milan and INFN (Italy); Campanella, M; Carboni, M; Cerutti, F [University of Milan and INFN (Italy); Empl, A [University of Houston, Houston (United States); Fasso, A [SLAC, Stanford (United States); Ferrari, A [CERN, CH-1211 Geneva (Switzerland); Gadioli, E [University of Milan and INFN (Italy); Garzelli, M V [University of Milan and INFN (Italy); Lantz, M [University of Milan and INFN (Italy); Liotta, M [University of Pavia and INFN (Italy); Mairani, A [University of Pavia and INFN (Italy); Mostacci, A [Laboratori Nazionali di Frascati, INFN (Italy); Muraro, S [University of Milan and INFN (Italy); Ottolenghi, A [University of Pavia and INFN (Italy); Pelliccioni, M [Laboratori Nazionali di Frascati, INFN (Italy); Pinsky, L [University of Houston, Houston (United States); Ranft, J [Siegen University, Siegen (Germany); Roesler, S [CERN, CH-1211 Geneva (Switzerland); Sala, P R [University of Milan and INFN (Italy); Scannicchio, D [University of Pavia and INFN (Italy); Trovati, S [University of Pavia and INFN (Italy); Villari, R; Wilson, T [Johnson Space Center, NASA (United States); Zapp, N [Johnson Space Center, NASA (United States); Vlachoudis, V [CERN, CH-1211 Geneva (Switzerland)

    2006-05-15

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  18. The FLUKA Code: an Overview

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani,; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U.

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  19. Understanding perception through neural "codes".

    Science.gov (United States)

    Freeman, Walter J

    2011-07-01

    A major challenge for cognitive scientists is to deduce and explain the neural mechanisms of the rapid transposition between stimulus energy and recalled memory-between the specific (sensation) and the generic (perception)-in both material and mental aspects. Researchers are attempting three explanations in terms of neural codes. The microscopic code: cellular neurobiologists correlate stimulus properties with the rates and frequencies of trains of action potentials induced by stimuli and carried by topologically organized axons. The mesoscopic code: cognitive scientists formulate symbolic codes in trains of action potentials from feature-detector neurons of phonemes, lines, odorants, vibrations, faces, etc., that object-detector neurons bind into representations of stimuli. The macroscopic code: neurodynamicists extract neural correlates of stimuli and associated behaviors in spatial patterns of oscillatory fields of dendritic activity, which self-organize and evolve on trajectories through high-dimensional brain state space. This multivariate code is expressed in landscapes of chaotic attractors. Unlike other scientific codes, such as DNA and the periodic table, these neural codes have no alphabet or syntax. They are epistemological metaphors that experimentalists need to measure neural activity and engineers need to model brain functions. My aim is to describe the main properties of the macroscopic code and the grand challenge it poses: how do very large patterns of textured synchronized oscillations form in cortex so quickly? © 2010 IEEE

  20. Programming Entity Framework Code First

    CERN Document Server

    Lerman, Julia

    2011-01-01

    Take advantage of the Code First data modeling approach in ADO.NET Entity Framework, and learn how to build and configure a model based on existing classes in your business domain. With this concise book, you'll work hands-on with examples to learn how Code First can create an in-memory model and database by default, and how you can exert more control over the model through further configuration. Code First provides an alternative to the database first and model first approaches to the Entity Data Model. Learn the benefits of defining your model with code, whether you're working with an exis

  1. High Order Modulation Protograph Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.

  2. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
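
    For readers unfamiliar with the basic operation, the sketch below computes sparse codes for a single sample against a fixed codebook using iterative soft thresholding (ISTA); the semi-supervised coupling with class labels and a classifier described in the abstract is not reproduced, and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
d, n_atoms = 20, 50
D = rng.standard_normal((d, n_atoms))
D /= np.linalg.norm(D, axis=0)                 # codebook with unit-norm atoms
x = rng.standard_normal(d)                     # data sample

def ista(x, D, lam=0.2, n_iter=200):
    """Solve min_s 0.5*||x - D s||^2 + lam*||s||_1 by iterative soft thresholding."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ s - x)
        z = s - grad / L
        s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return s

codes = ista(x, D)
print("nonzero codewords used:", int(np.count_nonzero(codes)))
```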

  3. The FLUKA code: an overview

    Science.gov (United States)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fassò, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.; Zapp, N.; Vlachoudis, V.

    2006-05-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  4. Golay and other box codes

    Science.gov (United States)

    Solomon, G.

    1992-01-01

    The (24,12;8) extended Golay Code can be generated as a 6x4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly, a (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code represented as a 9 x 7 matrix by adding a row and a column, both of odd or even parity. Furthermore, any seven columns along with the top nine rows form a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.

  5. Cancer Prevention: Distinguishing Strength of Evidence from Strength of Opinion

    Science.gov (United States)

    Barnett S. Kramer, MD, MPH, Associate Director for Disease Prevention and Director of the Office of Medical Applications of Research in the Office of Disease Prevention, Office of the Director, National Institutes of Health, Bethesda, MD, presented "Cancer Prevention: Distinguishing Strength of Evidence from Strength of Opinion".

  6. Bending strength of water-soaked glued laminated beams

    Science.gov (United States)

    Ronald W. Wolfe; Russell C. Moody

    1978-01-01

    The effects of water soaking on the bending strength and stiffness of laminated timber were determined by deriving wet-dry ratios for these properties. Values for these ratios, when compared to currently recommended wet use factors, confirm the value now used for modulus of rupture. For modulus of elasticity, the reduction due to water soaking was found to be less than...

  7. Strength and electronic structure

    Indian Academy of Sciences (India)

    Before the development of the dislocation theory of solids, this problem was of considerable interest because calculated values of the strength were typically greater than experimental values by a factor of 100 or more. Even at the present time, this problem is of theoretical and practical importance, as a knowledge of the ...

  8. Stochastic Models for Strength of Wind Turbine Blades using Tests

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2008-01-01

    The structural cost of wind turbine blades is dependent on the values of the partial safety factors which reflect the uncertainties in the design values, including statistical uncertainty from a limited number of tests. This paper presents a probabilistic model for ultimate and fatigue strength...

  9. FEAMAC-CARES Software Coupling Development Effort for CMC Stochastic-Strength-Based Damage Simulation

    Science.gov (United States)

    Nemeth, Noel N.; Bednarcyk, Brett A.; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Walton, Owen

    2015-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  10. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four...... are optimal or best known for their parameters. In chapter five we study some graph codes with Reed–Solomon component codes. The underlying graph is well known and widely used for its good characteristics. This helps us to compute the dimension of the graph codes. We also introduce a combinatorial concept...... related to the iterative encoding of graph codes with MDS component code. The last chapter deals with affine Grassmann codes and Grassmann codes. We begin with some previously known codes and prove that they are also Tanner codes of the incidence graph of the point–line partial geometry...

  11. QR code for medical information uses.

    Science.gov (United States)

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  12. Methodology, status, and plans for development and assessment of the RELAP5 code

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, G.W.; Riemke, R.A. [Idaho National Engineering Laboratory, Idaho Falls, ID (United States)

    1997-07-01

    RELAP/MOD3 is a computer code used for the simulation of transients and accidents in light-water nuclear power plants. The objective of the program to develop and maintain RELAP5 was and is to provide the U.S. Nuclear Regulatory Commission with an independent tool for assessing reactor safety. This paper describes code requirements, models, solution scheme, language and structure, user interface validation, and documentation. The paper also describes the current and near term development program and provides an assessment of the code's strengths and limitations.

  13. A Ten-Value-Type Framework Associated With Spectator Sports

    Directory of Open Access Journals (Sweden)

    Li-Shiue Gau

    2013-04-01

    Full Text Available Prior value studies in sport settings have focused on participation rather than spectatorship. This study is an initial step in examining the values associated with spectator sports. Interviews and focus groups were utilized in this qualitative study within four progressive phases for triangulation, transferability, and constant comparative assessment. A total of 54 individuals participated in the study. Because values were subjectively perceived at the highest level of abstraction reflecting desirable preference experiences, the interviews were designed to include seven categories of questions: Three categories dealt with observed behaviors and implied metaphors, and four categories included questions comparing spectator sports and sport participation, religion, other leisure activities, and substitutes. Grounded data techniques guided the coding procedure. Using the transcriptions and notes from 26 interviews and three focus groups, five coders were used to provide evidence of interrater reliability. Based on the results of the data analyses, a 10-value-type framework was developed in relation to spectator sports: (a) Enjoyment (pleasure and satisfaction), (b) Sociability (social interaction through sport spectating), (c) Identity (enhancing self-esteem), (d) Status (pursuing social recognition), (e) Moral, (f) Spirituality (inner peace, strength, meaning, and purpose in life), (g) Epistemic, (h) Aesthetics, (i) Ritual (sports spectating becomes a series of formal and serious acts followed regularly and invariably as end-experience), and (j) no or negative values. Different from Kahle's (1983), Maslow's (1970a), and Schwartz's (1992) value theories, the framework was specifically associated with spectator sports and is expected to better predict spectator sport behavior than does a scale measuring motivations of sports fans.

  14. Strength conditions for the elastic structures with a stress error

    Science.gov (United States)

    Matveev, A. D.

    2017-10-01

    As is known, the constraints (strength conditions) for the safety factor of elastic structures and design details of a particular class, e.g. aviation structures, are established; i.e., the safety factor values of such structures should be within the given range. It should be noted that the constraints are set for the safety factors corresponding to analytical (exact) solutions of elasticity problems represented for the structures. Developing the analytical solutions for most structures, especially irregularly shaped ones, is associated with great difficulties. Approximate approaches to solve the elasticity problems, e.g. the technical theories of deformation of homogeneous and composite plates, beams and shells, are widely used for a great number of structures. Technical theories based on the hypotheses give rise to approximate (technical) solutions with an irreducible error, with the exact value being difficult to determine. In static calculations of structural strength with a specified small range for the safety factors, the application of technical (Strength of Materials) solutions is difficult. However, there are some numerical methods for developing the approximate solutions of elasticity problems with arbitrarily small errors. In the present paper, the adjusted reference (specified) strength conditions for the structural safety factor corresponding to approximate solution of the elasticity problem have been proposed. The stress error estimation is taken into account using the proposed strength conditions. It has been shown that, to fulfill the specified strength conditions for the safety factor of the given structure corresponding to an exact solution, the adjusted strength conditions for the structural safety factor corresponding to an approximate solution are required. The stress error estimation which is the basis for developing the adjusted strength conditions has been determined for the specified strength conditions. The adjusted strength

  15. Tensile rock mass strength estimated using InSAR

    KAUST Repository

    Jonsson, Sigurjon

    2012-11-01

    The large-scale strength of rock is known to be lower than the strength determined from small-scale samples in the laboratory. However, it is not well known how strength scales with sample size. I estimate kilometer-scale tensional rock mass strength by measuring offsets across new tensional fractures (joints), formed above a shallow magmatic dike intrusion in western Arabia in 2009. I use satellite radar observations to derive 3D ground displacements and by quantifying the extension accommodated by the joints and the maximum extension that did not result in a fracture, I put bounds on the joint initiation threshold of the surface rocks. The results indicate that the kilometer-scale tensile strength of the granitic rock mass is 1–3 MPa, almost an order of magnitude lower than typical laboratory values.

  16. Bone strength and material properties of the glenoid

    DEFF Research Database (Denmark)

    Frich, Lars Henrik; Jensen, N.C.; Odgaard, A.

    1997-01-01

    The quality of the glenoid bone is important to a successful total shoulder replacement. Finite element models have been used to model the response of the glenoid bone to an implanted prosthesis. Because very little is known about the bone strength and the material properties at the glenoid......, these models were all based on assumptions that the material properties of the glenoid were similar to those of the tibial plateau. The osteopenetrometer was used to assess the topographic strength distribution at the glenoid. Strength at the proximal subchondral level of the glenoid averaged 66.9 MPa. Higher...... peak values were measured posteriorly, superiorly, and anteriorly to the area of maximum concavity of the glenoid joint surface known as the bare area. One millimeter underneath the subchondral plate, average strength decreased by 25%, and at the 2 mm level strength decreased by 70%. The contribution...

  17. The relationship between knee joint angle and knee flexor and extensor muscle strength.

    Science.gov (United States)

    Ha, Misook; Han, Dongwook

    2017-04-01

    [Purpose] The aim of this study was to determine a relationship between joint angle and muscular strength. In particular, this research investigated the differences in maximum muscular strength and average muscular strength according to knee-joint posture. [Subjects and Methods] The study subjects comprised eight female students in their 20s attending S University in Busan. None of the subjects had functional disabilities or had experienced damage to the lower extremities in terms of measurement of muscular strength. A BIODEX system III model (Biodex medical system, USA) was used to measure joint angles and muscular strength. The axis of the dynamometer was consistent with the axis of motion, and measurements were made at 25° and 67° to examine differences in maximum muscular strength according to joint angle. [Results] Maximum muscular strength was greater at a knee-joint angle of 67° for extension and at 25° for flexion. Average muscular strength was likewise greater at 67° for extension and at 25° for flexion. [Conclusion] The results of this study reveal that muscular strength does not reach maximum at particular range angles, such as the knee-joint resting posture angle or the knee-joint middle range angle. Rather, a stretched muscle is stronger than a contracted muscle. Therefore, it is considered that it will be necessary to study the effects of the joint change ratio on muscular strength on the basis of the maximum stretched muscle.

  18. Civil Code, 11 December 1987.

    Science.gov (United States)

    1988-01-01

    Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised in 13 June 1978); Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 5) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 6) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 7) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires as a formality of marriage a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the

  19. Error coding simulations in C

    Science.gov (United States)

    Noble, Viveca K.

    1994-10-01

    When data is transmitted through a noisy channel, errors are produced within the data rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
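
    To illustrate why interleaving increases burst-correction capability in proportion to the interleave depth, as noted above, here is a small block-interleaver sketch (in Python rather than the simulator's C): a burst of consecutive channel errors is spread across codewords so that each codeword sees only a few symbol errors.

```python
import numpy as np

depth, length = 5, 12                       # interleave depth x codeword length
codewords = np.arange(depth * length).reshape(depth, length)   # 5 codewords of 12 symbols

interleaved = codewords.T.reshape(-1)       # write row-wise, read column-wise
burst = slice(20, 28)                       # 8 consecutive symbols hit by a burst
hit = np.zeros_like(interleaved, dtype=bool)
hit[burst] = True

deinterleaved_hits = hit.reshape(length, depth).T   # undo the interleaving
print("symbol errors per codeword:", deinterleaved_hits.sum(axis=1))
# With depth 5, an 8-symbol burst leaves at most ceil(8/5) = 2 errors in any one codeword.
```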

  20. On the linear programming bound for linear Lee codes.

    Science.gov (United States)

    Astola, Helena; Tabus, Ioan

    2016-01-01

    Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced to the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form, leading to a fast execution, which allows to efficiently compute the bounds for large parameter values of the linear codes.

  1. Gasoline2: a modern smoothed particle hydrodynamics code

    Science.gov (United States)

    Wadsley, James W.; Keller, Benjamin W.; Quinn, Thomas R.

    2017-10-01

    The methods in the Gasoline2 smoothed particle hydrodynamics (SPH) code are described and tested. Gasoline2 is the most recent version of the Gasoline code for parallel hydrodynamics and gravity with identical hydrodynamics to the Changa code. As with other Modern SPH codes, we prevent sharp jumps in time-steps, use upgraded kernels and larger neighbour numbers and employ local viscosity limiters. Unique features in Gasoline2 include its Geometric Density Average Force expression, explicit Turbulent Diffusion terms and Gradient-Based shock detection to limit artificial viscosity. This last feature allows Gasoline2 to completely avoid artificial viscosity in non-shocking compressive flows. We present a suite of tests demonstrating the value of these features with the same code configuration and parameter choices used for production simulations.

  2. Fast decoding of codes from algebraic plane curves

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Jensen, Helge Elbrønd

    1992-01-01

    Improvement to an earlier decoding algorithm for codes from algebraic geometry is presented. For codes from an arbitrary regular plane curve the authors correct up to d*/2 - m^2/8 + m/4 - 9/8 errors, where d* is the designed distance of the code and m is the degree of the curve. The complexity of finding the error locator is O(n^(7/3)), where n is the length of the code. For codes from Hermitian curves the complexity of finding the error values, given the error locator, is O(n^2), and the same complexity can be obtained in the general case if only d*/2 - m^2/2 errors are corrected...

  3. An architecture for hybrid coding of NTSC TV signals

    Science.gov (United States)

    Jalali, A.; Rao, K. R.

    A hardware-feasible architecture for DCT/DPCM hybrid coding of color television (TV) signals has been developed. The coding system is based on formatting four horizontal scan lines into blocks of the same subcarrier phase elements. The samples in each block are rearranged and transformed by an FDCT processor. Based on its average energy, a given transform block is compared with its adjacent blocks and the nearest block is selected as its estimate. The difference between the actual and the estimated values of the DCT coefficients is then quantized and encoded using nonuniform quantizers and a variable length coder. Furthermore, the maximum number of different wordlengths is assumed to be five. Therefore, five sets of 256-byte encoding ROMs are used to store the quantization tables. To reduce the redundancy in the code words, an adaptive coding scheme is used. The coding scheme is based on setting two threshold levels.
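
    A hedged sketch of the transform-and-quantize core of such a hybrid coder: a block DCT of a prediction residual followed by quantization and reconstruction. The 8x8 block, uniform quantizer, and random predictor are illustrative simplifications of the architecture described above.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(6)
block = rng.random((8, 8))                 # current block of samples
predictor = rng.random((8, 8))             # estimate chosen from adjacent blocks

residual = block - predictor
coeffs = dctn(residual, norm="ortho")      # forward DCT of the difference block
step = 0.05
q = np.round(coeffs / step)                # uniform quantization (the real coder uses nonuniform tables)

decoded = predictor + idctn(q * step, norm="ortho")
print("max reconstruction error:", float(np.max(np.abs(decoded - block))))
```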

  4. Asymmetric Quantum Codes on Toric Surfaces

    DEFF Research Database (Denmark)

    Hansen, Johan P.

    2017-01-01

    Asymmetric quantum error-correcting codes are quantum codes defined over biased quantum channels: qubit-flip and phase-shift errors may have equal or different probabilities. The code construction is the Calderbank-Shor-Steane construction based on two linear codes. We present families of toric...... surfaces, toric codes and associated asymmetric quantum error-correcting codes....

  5. Influence of initial imperfections on ultimate strength of spherical shells

    Directory of Open Access Journals (Sweden)

    Chang-Li Yu

    2017-09-01

    Full Text Available Comprehensive consideration is given to the influence mechanisms of initial imperfections on the ultimate strength of spherical shells, to satisfy the requirements of deep-sea structural design. The feasibility of an innovative numerical procedure that combines welding simulation and non-linear buckling analysis is verified by good agreement with experimental and theoretical results. Spherical shells with a series of wall-thickness-to-radius ratios are studied. Residual stress and deformations from the welding process are investigated separately. Different influence mechanisms are discovered. Residual stress is demonstrated to be influential to the stress field and buckling behavior but not to the ultimate strength. Deformations are proved to have a significant impact on ultimate strength. When central angles are less than a critical value, concave magnitudes reduce ultimate strengths linearly. However, deformations with central angles above the critical value are much more harmful. Less imperfection susceptibility is found in spherical shells with larger wall-thickness-to-radius ratios.

  6. Calculating Outsourcing Strategies and Trials of Strength

    DEFF Research Database (Denmark)

    Christensen, Mark; Skærbæk, Peter; Tryggestad, Kjell

    The alternative option was an immediate outsourcing strategy with facility services being the object of large cross-functional contracts for all Danish military establishments. By succeeding in presenting ‘internal optimization’ as an outsourcing option (as opposed to the usual ‘make’ option) this case...... demonstrates the power of projects and their use of accounting calculation. We study how the two options emerged and were valued differently by the supra-national outsourcing program and the local Defense projects over 22 years and how that valuation process involved accounting. Drawing on Actor-Network Theory...... outsourcing strategies during a series of trials of strength, 2. develops the concept of ‘trial of strength’ for accounting and organization research by showing how ‘the rules of the game’ for the trials of strength can become challenged and controversial, 3. shows that, in addition to the pervasive role...

  7. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention.

    Science.gov (United States)

    Proyer, René T; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths, which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using SS with one on using individual low scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on LS rather than SS and those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness.

  8. Shear strength of dentin and dentin bonded composites.

    Science.gov (United States)

    Mondragon, E; Söderholm, K J

    2001-01-01

    The objective of this study was to compare the shear strength of dentin with the shear strength of dentin bonded composites, and to determine how variables such as composite strength and blade width used during shear testing influence shear strength values. Dentin test samples (n = 36) were made by milling the anatomical molar crowns to a shape similar to a composite rod bonded to a flat dentin surface. Dentin bonding was accomplished by bonding composites to flat dentin surfaces (n = 72) using Scotchbond MP and Z100 (n = 36) or Silux Plus (n = 36) composites. Shear testing was conducted using a guillotine-like device with a flat blade embracing half the dentin or composite cylinders. The blade thickness was either 0.25, 0.5, 0.75, 1.0, 1.25, or 1.50 mm. Six samples per material and blade thickness were tested. In addition, the bond strength of Z100 (n = 6) and Silux (n = 6) bonded with Scotchbond MP and tested with an orthodontic edgewire loop was determined and compared with the bond strength of the Z100 and Silux samples tested with the 0.5 mm thick blade. All shear testing was done at a load rate of 0.5 mm/min. The results were analyzed using ANOVA and Duncan's multiple range test. The shear strength values when tested with the blades were: dentin = 39.7 +/- 13.0 MPa, Z100 = 29.3 +/- 7.2 MPa, and Silux = 21.1 +/- 4.9 MPa; each group had significantly different values (p < 0.05). The shear strength of composite bonded to dentin with a bonding agent is thus significantly lower than the shear strength of dentin, and the shear strength depends on the testing method (blade vs loop) and the composite material.

  9. Strength Training for Young Athletes.

    Science.gov (United States)

    Kraemer, William J.; Fleck, Steven J.

    This guide is designed to serve as a resource for developing strength training programs for children. Chapter 1 uses research findings to explain why strength training is appropriate for children. Chapter 2 explains some of the important physiological concepts involved in children's growth and development as they apply to developing strength…

  10. Loading Conditions and Longitudinal Strength

    DEFF Research Database (Denmark)

    Sørensen, Herman

    1995-01-01

    Methods for the calculation of the lightweight of the ship. Loading conditions satisfying draught, trim and intact stability requirements and analysis of the corresponding stillwater longitudinal strength.

  11. Phase strength and super lattices

    Indian Academy of Sciences (India)

    Unknown

    Powder XRD investigations of dotriacontane-decane and dotriacontane-decanol mixtures were carried out. Phase strength, phase separation and the formation of superlattices are discussed, and the role of tunnel-like defects is considered. Keywords: hydrocarbons; mixtures; phase strength; tunnel-like defects; super lattices.

  12. New York State Code Adoption Analysis: Lighting Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Richman, Eric E.

    2004-10-20

    The adoption of the IECC 2003 energy code will include a set of Lighting Power Density (LPD) values that are effectively a subset of the values in Addendum g to the ASHRAE/IESNA/ANSI 90.1-2001 Standard, which will soon be printed as part of the 90.1-2004 version. The effectiveness of this adoption for New York State can be analyzed by directly comparing these values with the existing LPD levels in the current IECC 2000 code, which are themselves a subset of the current ASHRAE/IESNA/ANSI 90.1-2001 Standard (without addenda). Because the complete ASHRAE 2001 and 2004 sets of LPDs are supported by a set of detailed models, they are best suited to provide the basis for comparing the lighting power density stringency of the two code levels. It is important to note that this kind of analysis is a point-to-point comparison in which a fixed level of real-world activity is assumed. Buildings are not built precisely to code levels, and the actual degree of compliance above and below code will vary among individual buildings and building types. Without specific knowledge of this real-world activity for all buildings in existence and in the future (post-code adoption), it is not possible to analyze the actual effects of code adoption. It is, however, possible to compare code levels and determine the potential effect of changes from one code requirement level to another. This is the comparison and effectiveness assessment provided in this analysis.

  13. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high performance linacs. The code is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. LFSC can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code runs under Matlab for the MS Windows operating system and contains about 30,000 lines of source code in more than 260 subroutines. It uses LIAR (the 'Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations, and the Guinea Pig code to simulate the luminosity performance. The set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.

  14. Code & order in polygonal billiards

    OpenAIRE

    Bobok, Jozef; Troubetzkoy, Serge

    2011-01-01

    Two polygons $P,Q$ are code equivalent if there are billiard orbits $u,v$ which hit the same sequence of sides and such that the projections of the orbits are dense in the boundaries $\partial P, \partial Q$. Our main results show when code equivalent polygons have the same angles, resp. are similar, resp. affinely similar.

  15. Distributed space-time coding

    CERN Document Server

    Jing, Yindi

    2014-01-01

    Distributed Space-Time Coding (DSTC) is a cooperative relaying scheme that enables high reliability in wireless networks. This brief presents the basic concept of DSTC, its achievable performance, generalizations, code design, and differential use. Recent results on training design and channel estimation for DSTC and the performance of training-based DSTC are also discussed.

  16. Grassmann codes and Schubert unions

    DEFF Research Database (Denmark)

    Hansen, Johan Peder; Johnsen, Trygve; Ranestad, Kristian

    2009-01-01

    We study subsets of Grassmann varieties over a field , such that these subsets are unions of Schubert cycles, with respect to a fixed flag. We study such sets in detail, and give applications to coding theory, in particular for Grassmann codes. For much is known about such Schubert unions with a ...

  17. NETWORK CODING BY BEAM FORMING

    DEFF Research Database (Denmark)

    2013-01-01

    Network coding by beam forming in networks, for example, in single frequency networks, can provide aid in increasing spectral efficiency. When network coding by beam forming and user cooperation are combined, spectral efficiency gains may be achieved. According to certain embodiments, a method...

  18. Code breaking in the pacific

    CERN Document Server

    Donovan, Peter

    2014-01-01

    Covers the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945. Describes, explains and analyzes the code breaking techniques developed during the war in the Pacific. Exposes the blunders (in code construction and use) made by the Japanese Navy that led to significant US Naval victories.

  19. Squares of Random Linear Codes

    DEFF Research Database (Denmark)

    Cascudo Pueyo, Ignacio; Cramer, Ronald; Mirandola, Diego

    2015-01-01

    Given a linear code $C$, one can define the $d$-th power of $C$ as the span of all componentwise products of $d$ elements of $C$. A power of $C$ may quickly fill the whole space. Our purpose is to answer the following question: does the square of a code ``typically'' fill the whole space? We give...

  20. Interleaver Design for Turbo Coding

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl; Zyablov, Viktor

    1997-01-01

    By a combination of construction and random search based on a careful analysis of the low weight words and the distance properties of the component codes, it is possible to find interleavers for turbo coding with a high minimum distance. We have designed a block interleaver with permutations...

  1. Flow Analysis of Code Customizations

    DEFF Research Database (Denmark)

    Hessellund, Anders; Sestoft, Peter

    2008-01-01

    Inconsistency between metadata and code customizations is a major concern in modern, configurable enterprise systems. The increasing reliance on metadata, in the form of XML files, and code customizations, in the form of Java files, has led to a hybrid development platform. The expected consisten...

  2. Recommendations for ECG diagnostic coding

    NARCIS (Netherlands)

    Bonner, R.E.; Caceres, C.A.; Cuddy, T.E.; Meijler, F.L.; Milliken, J.A.; Rautaharju, P.M.; Robles de Medina, E.O.; Willems, J.L.; Wolf, H.K.; Working Group 'Diagnostic Codes'

    1978-01-01

    The Oxford dictionary defines code as "a body of laws so related to each other as to avoid inconsistency and overlapping". It is obvious that natural language with its high degree of ambiguity does not qualify as a code in the sense of this definition. Everyday experiences provide ample evidence

  3. Framework of a Contour Based Depth Map Coding Method

    Science.gov (United States)

    Wang, Minghui; He, Xun; Jin, Xin; Goto, Satoshi

    Stereo-view and multi-view video formats are heavily investigated topics given their vast application potential. The Depth Image Based Rendering (DIBR) system has been developed to improve Multiview Video Coding (MVC); in this system, a depth image is introduced to synthesize virtual views on the decoder side. A depth image is a piecewise image, filled with sharp contours and a smooth interior. Contours in a depth image are more important than the interior in the view synthesis process. In order to improve the quality of the synthesized views and reduce the bitrate of the depth image, a contour based coding strategy is proposed. First, the depth image is divided into layers by different depth value intervals. Then regions, which are defined as the basic coding unit in this work, are segmented from each layer. Each region is further divided into the contour and the interior, and two different procedures are employed to code them respectively. A vector-based strategy is applied to code the contour lines: straight lines in contours cost few bits since they are regarded as vectors, while pixels that do not lie on straight lines are coded one by one. Depth values in the interior of a region are modeled by a linear or nonlinear formula whose coefficients are retrieved by regression; this process is called interior painting. Unlike conventional block based coding methods, the residue between the original frame and the reconstructed frame (by contour rebuilding and interior painting) is not sent to the decoder. In this proposal, the contour is coded in a lossless way whereas the interior is coded in a lossy way. Experimental results show that the proposed Contour Based Depth map Coding (CBDC) achieves better performance than JMVC (the reference software of MVC) in high quality scenarios.
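
    A minimal sketch of two of the ingredients described above, splitting the depth map into layers by depth-value intervals and "painting" a region interior with a fitted linear model, might look like the following. It uses NumPy only; the layer boundaries and the plane model are illustrative assumptions, not the CBDC specification.

    ```python
    # Minimal sketch of two CBDC ingredients: layering by depth intervals and
    # "interior painting" by least-squares regression of a linear depth model.
    # Layer edges and the plane model z = a*x + b*y + c are illustrative assumptions.
    import numpy as np

    def split_into_layers(depth: np.ndarray, edges) -> list:
        """Return a boolean mask per depth-value interval [edges[i], edges[i+1])."""
        return [(depth >= lo) & (depth < hi) for lo, hi in zip(edges[:-1], edges[1:])]

    def paint_interior(depth: np.ndarray, mask: np.ndarray):
        """Fit z = a*x + b*y + c over the masked region and return the coefficients
        plus the repainted (reconstructed) interior values."""
        ys, xs = np.nonzero(mask)
        A = np.column_stack([xs, ys, np.ones_like(xs)])
        coeffs, *_ = np.linalg.lstsq(A, depth[ys, xs], rcond=None)
        recon = A @ coeffs
        return coeffs, recon

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        x, y = np.meshgrid(np.arange(64), np.arange(64))
        depth = 0.5 * x + 0.2 * y + rng.normal(scale=0.5, size=x.shape)  # toy depth map
        layers = split_into_layers(depth, edges=[depth.min(), depth.mean(), depth.max() + 1])
        for i, mask in enumerate(layers):
            coeffs, recon = paint_interior(depth, mask)
            err = np.mean(np.abs(recon - depth[mask]))
            print(f"layer {i}: plane coeffs {np.round(coeffs, 3)}, mean abs error {err:.3f}")
    ```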

  4. Numerical and experimental study on the thermal shock strength of Tungsten by laser irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Dai Zhijun [Department of Mechanical Engineering, Nagaoka University of Technology, Nagaoka, Niigata 940-2188 (Japan); Department of Engineering Mechanics, Shanghai Jiaotong University, Shanghai 200240 (China)], E-mail: daizhij@yahoo.com.cn; Mutoh, Yoshiharu [Department of Mechanical Engineering, Nagaoka University of Technology, Nagaoka, Niigata 940-2188 (Japan); Sujatanond, Supamard [Department of Mechanical Engineering, Nagaoka University of Technology, Nagaoka, Niigata 940-2188 (Japan); Department of Industrial Engineering, Thammasat University, Pathum-thani 12120 (Thailand)

    2008-01-15

    The purpose of this paper is to investigate the thermal shock properties of Tungsten through the finite element method and laser irradiation experiments. A finite element model is developed to simulate the thermal shock behavior of Tungsten irradiated by a laser beam. An axisymmetric model is adopted to perform the numerical simulation with the finite element code ABAQUS. The element removal and reactivation methods are used to simulate the melting and solidification processes, where the latent heat of Tungsten is introduced to account for the additional heat due to phase change. Distributions of the radial and circumferential stresses are discussed in detail. In addition, a three-dimensional finite element model is developed to calculate the value of K_I, and the variation of K_I at the tip of a radial crack with time during the cooling process is obtained. The critical power density curves are presented by applying the tensile strength criterion. Finally, thermal shock experiments are performed and good agreement between the numerical solutions and the experimental results is achieved. It is concluded that the critical power density curves can serve as a measure to evaluate the thermal shock strength of Tungsten.
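
    As a back-of-the-envelope companion to the finite element results, the fully constrained thermal stress sigma = E*alpha*dT/(1 - nu) gives the order of magnitude of the surface stress produced by a rapid temperature excursion. The material constants below are nominal handbook values for tungsten and the temperature rises and comparison strength are hypothetical; this is not the ABAQUS model from the paper.

    ```python
    # Order-of-magnitude estimate of constrained thermal stress in tungsten:
    # sigma = E * alpha * dT / (1 - nu). Nominal handbook properties; the
    # temperature rises and the tensile strength used for comparison are assumed.
    E = 400e9        # Pa, Young's modulus of tungsten (nominal)
    alpha = 4.5e-6   # 1/K, coefficient of thermal expansion (nominal)
    nu = 0.28        # Poisson's ratio (nominal)
    sigma_ut = 1.0e9 # Pa, assumed tensile strength for comparison

    for dT in (200.0, 500.0, 1000.0):  # hypothetical surface temperature rises, K
        sigma = E * alpha * dT / (1.0 - nu)
        verdict = "exceeds" if sigma > sigma_ut else "below"
        print(f"dT = {dT:6.0f} K -> sigma = {sigma/1e6:7.1f} MPa ({verdict} assumed tensile strength)")
    ```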

  5. STRENGTH OF NANOMODIFIED HIGH-STRENGTH LIGHTWEIGHT CONCRETES

    Directory of Open Access Journals (Sweden)

    NOZEMTСEV Alexandr Sergeevich

    2013-02-01

    Full Text Available The paper presents the results of research aimed at the development of nanomodified high-strength lightweight concrete for construction. The developed concretes have low average density and high ultimate compressive strength. It is shown that to produce this type of concrete one needs to use hollow glass and aluminosilicate microspheres. To increase the durability of adhesion between the cement stone and the fine filler, the authors propose a complex nanodimensional modifier based on iron hydroxide sol and silica sol as a surface nanomodifier for the hollow microspheres. It is hypothesized that the proposed modifier has a complex effect on the activity of cement hydration and at the same time increases the bond strength between the filler and the cement-mineral matrix. Compositions for energy-efficient nanomodified high-strength lightweight concrete with a density of 1300-1500 kg/m³ and a compressive strength of 40-65 MPa have been developed. Approaches to the design of high-strength lightweight concrete with a density of less than 2000 kg/m³ are formulated. The proposed concretes possess a dense homogeneous structure and moderate mobility, so they allow processing by vibration during production. The economic and practical implications of realizing high-strength lightweight concrete in industrial production are justified.

  6. Characterization of the Compressive Strength of Sandcrete Blocks in ...

    African Journals Online (AJOL)

    2005-05-23

    The results showed that the 450 mm x 150 mm x 225 mm sandcrete blocks in circulation as at May 23, 2005 had an average strength of 0.55 N/mm² while those of 450 mm x 225 mm x 225 mm had an average compressive strength of 0.45 N/mm² as at November 13, 2005. These values are very much lower than those ...

  7. Capacity achieving nonbinary LDPC coded non-uniform shaping modulation for adaptive optical communications.

    Science.gov (United States)

    Lin, Changyu; Zou, Ding; Liu, Tao; Djordjevic, Ivan B

    2016-08-08

    A mutual information inspired nonbinary coded modulation design with non-uniform shaping is proposed. Instead of traditional power-of-two signal constellation sizes, we design 5-QAM, 7-QAM and 9-QAM constellations, which can be used in adaptive optical networks. The non-uniform shaping and the LDPC code rate are jointly considered in the design, which results in a better-performing scheme for the same SNR values. A matched nonbinary (NB) LDPC code is used for this scheme, which further improves the coding gain and the overall performance. We analyze both coding performance and system SNR performance. We show that the proposed NB LDPC-coded 9-QAM has more than 2 dB gain in symbol SNR compared to traditional LDPC-coded star-8-QAM. On the other hand, the proposed NB LDPC-coded 5-QAM and 7-QAM have even better performance than LDPC-coded QPSK.
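
    The idea of pairing a non-power-of-two constellation with non-uniform (probabilistic) shaping can be illustrated with a toy calculation: assign Maxwell-Boltzmann-like probabilities to a small constellation and compare its average energy and entropy with uniform signalling. The 3x3 grid used as a 9-point constellation and the shaping parameter are assumptions for illustration, not the constellations designed in the paper.

    ```python
    # Toy illustration of non-uniform shaping on a 9-point constellation
    # (3x3 grid assumed for illustration; not the paper's 9-QAM design).
    import numpy as np

    points = np.array([complex(i, q) for i in (-1, 0, 1) for q in (-1, 0, 1)])

    def shaped_probs(points: np.ndarray, lam: float) -> np.ndarray:
        """Maxwell-Boltzmann-like weighting: low-energy points are used more often."""
        w = np.exp(-lam * np.abs(points) ** 2)
        return w / w.sum()

    def stats(p: np.ndarray):
        avg_energy = float(np.sum(p * np.abs(points) ** 2))
        entropy = float(-np.sum(p * np.log2(p)))   # bits per symbol
        return avg_energy, entropy

    uniform = np.full(len(points), 1 / len(points))
    shaped = shaped_probs(points, lam=0.7)          # assumed shaping parameter

    for name, p in (("uniform", uniform), ("shaped", shaped)):
        e, h = stats(p)
        print(f"{name:8s}: avg energy = {e:.3f}, entropy = {h:.3f} bits/symbol")
    ```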

  8. What Froze the Genetic Code?

    Science.gov (United States)

    Ribas de Pouplana, Lluís; Torres, Adrian Gabriel; Rafels-Ybern, Àlbert

    2017-04-05

    The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  9. What Froze the Genetic Code?

    Directory of Open Access Journals (Sweden)

    Lluís Ribas de Pouplana

    2017-04-01

    Full Text Available The frozen accident theory of the Genetic Code was a proposal by Francis Crick that attempted to explain the universal nature of the Genetic Code and the fact that it only contains information for twenty amino acids. Fifty years later, it is clear that variations to the universal Genetic Code exist in nature and that translation is not limited to twenty amino acids. However, given the astonishing diversity of life on earth, and the extended evolutionary time that has taken place since the emergence of the extant Genetic Code, the idea that the translation apparatus is for the most part immobile remains true. Here, we will offer a potential explanation to the reason why the code has remained mostly stable for over three billion years, and discuss some of the mechanisms that allow species to overcome the intrinsic functional limitations of the protein synthesis machinery.

  10. Tristan code and its application

    Science.gov (United States)

    Nishikawa, K.-I.

    Since TRISTAN: The 3-D Electromagnetic Particle Code was introduced in 1990, it has been used for many applications, including simulations of global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code to the study of global solar wind-magnetosphere interaction, including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/˜kenichi. For beginners, the code (isssrc2.f) with simpler boundary conditions is a suitable starting point for running simulations. The future of global particle simulations for a global geospace general circulation (GGCM) model with predictive capability (for the Space Weather Program) is discussed.

  11. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side offering shifting processing...... steps, conventionally performed at the video encoder side, to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from...... the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  12. Non-Protein Coding RNAs

    CERN Document Server

    Walter, Nils G; Batey, Robert T

    2009-01-01

    This book assembles chapters from experts in the Biophysics of RNA to provide a broadly accessible snapshot of the current status of this rapidly expanding field. The 2006 Nobel Prize in Physiology or Medicine was awarded to the discoverers of RNA interference, highlighting just one example of a large number of non-protein coding RNAs. Because non-protein coding RNAs outnumber protein coding genes in mammals and other higher eukaryotes, it is now thought that the complexity of organisms is correlated with the fraction of their genome that encodes non-protein coding RNAs. Essential biological processes as diverse as cell differentiation, suppression of infecting viruses and parasitic transposons, higher-level organization of eukaryotic chromosomes, and gene expression itself are found to largely be directed by non-protein coding RNAs. The biophysical study of these RNAs employs X-ray crystallography, NMR, ensemble and single molecule fluorescence spectroscopy, optical tweezers, cryo-electron microscopy, and ot...

  13. Bond strength of two component injection moulded MID

    DEFF Research Database (Denmark)

    Islam, Mohammad Aminul; Hansen, Hans Nørgaard; Tang, Peter Torben

    2006-01-01

    the two different plastic materials in the MID structure require good bonding between them. This paper finds suitable combinations of materials for MIDs from both bond strength and metallisation view-point. Plastic parts were made by two-shot injection moulding and the effects of some important process...... parameters on the resulting bond strength were investigated. A simple test setup has been used to measure the bond strength of 2k moulded plastic parts. This paper expresses the test results in numerical values and suggests suitable combinations of polymers from a large number of possibilities. The results...

  14. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at compressed bit rates similar to those of HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in a clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it also serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  15. Some algorithmic problems of plotting codes for unstructured grids

    Science.gov (United States)

    Loehner, Rainald; Parikh, Paresh; Gumbert, Clyde

    1989-01-01

    Some algorithmic problems encountered during the development of unstructured grid plotting codes are described. Chief among them are the interpolation of three-dimensional data on planes, the plotting of a three-dimensional surface with a constant value for a given unknown, and the calculation of particle and oil-flow paths. Some special features of the unstructured grid plotting code, FEPLOT3D, are also described.

  16. Design of a VLSI Decoder for Partially Structured LDPC Codes

    Directory of Open Access Journals (Sweden)

    Fabrizio Vacca

    2008-01-01

    of their parity matrix can be partitioned into two disjoint sets, namely, the structured and the random ones. For the proposed class of codes a constructive design method is provided. To assess the value of this method, the performance of the constructed codes is presented. From these results, a novel decoding method called split decoding is introduced. Finally, to prove the effectiveness of the proposed approach, a whole VLSI decoder is designed and characterized.

  17. Nonextensive statistical approach to non-coding human DNA

    Science.gov (United States)

    Oikonomou, Th.; Provata, A.; Tirnakli, U.

    2008-04-01

    We use q-exponential distributions, which maximize the nonextensive entropy S_q (defined as S_q ≡ (1 - Σ_i p_i^q)/(q - 1)), to study the size distributions of non-coding DNA (including introns and intergenic regions) in all human chromosomes. We show that the value of the exponent q describing the non-coding size distributions is similar for all chromosomes and varies between 2 ≤ q ≤ 2.3, with the exception of chromosomes X and Y.
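
    The nonextensive entropy and the q-exponential used above are easy to evaluate directly. The snippet below computes S_q for a toy size distribution generated from a q-exponential with q in the reported range; the distribution and the scale parameter x0 are illustrative assumptions, not the chromosome data.

    ```python
    # Tsallis (nonextensive) entropy S_q = (1 - sum_i p_i^q) / (q - 1) and a
    # q-exponential distribution, evaluated on toy data (illustrative only).
    import numpy as np

    def tsallis_entropy(p: np.ndarray, q: float) -> float:
        p = p[p > 0]
        if np.isclose(q, 1.0):                    # q -> 1 recovers Shannon entropy
            return float(-np.sum(p * np.log(p)))
        return float((1.0 - np.sum(p**q)) / (q - 1.0))

    def q_exponential(x: np.ndarray, q: float, x0: float) -> np.ndarray:
        """Un-normalised q-exponential e_q(-x/x0) = [1 + (q-1)*x/x0]^(-1/(q-1))."""
        return (1.0 + (q - 1.0) * x / x0) ** (-1.0 / (q - 1.0))

    if __name__ == "__main__":
        q, x0 = 2.2, 50.0                         # q in the reported range; x0 assumed
        x = np.arange(1.0, 1e4)                   # toy non-coding segment sizes
        w = q_exponential(x, q, x0)
        p = w / w.sum()
        print(f"S_q(q={q}) of the toy size distribution: {tsallis_entropy(p, q):.4f}")
    ```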

  18. ICAN Computer Code Adapted for Building Materials

    Science.gov (United States)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  19. Ensuring that User Defined Code does not See Uninitialized Fields

    DEFF Research Database (Denmark)

    Nielsen, Anders Bach

    2007-01-01

    Initialization of objects is commonly handled by user code, often in special routines known as constructors. This applies even in a virtual machine with multiple concurrent execution engines that all share the same heap. But for a language where run-time values play a role in the type system......, no user defined code can be allowed to use a field before it is initialized. This paper presents an approach which ensures that user code will not see uninitialized fields. It uses a dual-mode execution model to maintain a reasonable level of performance....

  20. Modeling of Breathy Voice Quality Using Pitch-strength Estimates.

    Science.gov (United States)

    Eddins, David A; Anand, Supraja; Camacho, Arturo; Shrivastav, Rahul

    2016-11-01

    The characteristic voice quality of a speaker conveys important linguistic, paralinguistic, and vocal health-related information. Pitch strength refers to the salience of the pitch sensation in a sound and was recently reported to be strongly correlated with the magnitude of perceived breathiness based on a small number of voice stimuli. The current study examined the relationship between perceptual judgments of breathiness and computational estimates of pitch strength based on the Aud-SWIPE (P-NP) algorithm for a large number of voice stimuli (330 synthetic and 57 natural). Similar to the earlier study, the current results confirm a strong relationship between estimated pitch strength and listener judgments of breathiness, such that low pitch-strength values are associated with voices that have high perceived breathiness. Based on this result, a model was developed for the perception of breathy voice quality using a pitch-strength estimator. Regression functions derived between the pitch-strength estimates and perceptual judgments of breathiness obtained from a matching task revealed a linear relationship for a subset of the natural stimuli. We then used this function to obtain predicted breathiness values for the synthetic and the remaining natural stimuli. Predicted breathiness values from our model were highly correlated with the perceptual data for both types of stimuli. Systematic differences between the breathiness of natural and synthetic stimuli are discussed. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
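
    The modelling step described, deriving a linear regression between pitch-strength estimates and perceived breathiness and then using it to predict breathiness for new stimuli, can be sketched with NumPy as below. The numbers are synthetic placeholders rather than the study's data, and real pitch-strength values would come from an Aud-SWIPE (P-NP) analysis.

    ```python
    # Sketch of the modelling step: fit a linear map from pitch-strength estimates
    # to perceived breathiness, then predict breathiness for new stimuli.
    # All numbers below are synthetic placeholders, not the study's data.
    import numpy as np

    rng = np.random.default_rng(42)
    pitch_strength = rng.uniform(0.05, 0.9, size=30)                      # stand-in for Aud-SWIPE output
    breathiness = 10.0 - 9.0 * pitch_strength + rng.normal(0, 0.4, 30)    # toy perceptual scores

    slope, intercept = np.polyfit(pitch_strength, breathiness, deg=1)

    def predict_breathiness(ps: np.ndarray) -> np.ndarray:
        return slope * ps + intercept

    new_stimuli = np.array([0.1, 0.4, 0.8])
    print(f"fitted model: breathiness = {slope:.2f} * pitch_strength + {intercept:.2f}")
    print("predictions :", np.round(predict_breathiness(new_stimuli), 2))
    ```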

  1. Evaluation of ICD-9-CM codes for craniofacial microsomia.

    Science.gov (United States)

    Luquetti, Daniela V; Saltzman, Babette S; Vivaldi, Daniela; Pimenta, Luiz A; Hing, Anne V; Cassell, Cynthia H; Starr, Jacqueline R; Heike, Carrie L

    2012-12-01

    Craniofacial microsomia (CFM) is a congenital condition characterized by microtia and mandibular underdevelopment. Healthcare databases and birth defects surveillance programs could be used to improve knowledge of CFM. However, no specific International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) code exists for this condition, which makes standardized data collection challenging. Our aim was to evaluate the validity of existing ICD-9-CM codes to identify individuals with CFM. Study sample eligibility criteria were developed by an expert panel and matched to 11 ICD-9-CM codes. We queried hospital discharge data from two craniofacial centers and identified a total of 12,254 individuals who had ≥1 potentially CFM-related code(s). We reviewed all (n = 799) medical records identified at the University of North Carolina (UNC) and 500 randomly selected records at Seattle Children's Hospital (SCH). Individuals were classified as a CFM case or non-case. Thirty-two individuals (6%) at SCH and 93 (12%) at UNC met the CFM eligibility criteria. At both centers, 59% of cases and 95% of non-cases had only one code assigned. At both centers, the most frequent codes were 744.23 (microtia), 754.0 and 756.0 (nonspecific codes), and the code 744.23 had a positive predictive value (PPV) >80% and sensitivity >70%. The code 754.0 had a sensitivity of 3% (PPV <1%) at SCH and 36% (PPV = 5%) at UNC, whereas 756.0 had a sensitivity of 38% (PPV = 5%) at SCH and 18% (PPV = 26%) at UNC. These findings suggest the need for a specific CFM code to facilitate CFM surveillance and research. Copyright © 2012 Wiley Periodicals, Inc.
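
    The validity figures quoted (sensitivity and positive predictive value per code) follow from simple counts of true and false positives and negatives against chart review. A minimal helper such as the one below reproduces the arithmetic; the example counts are hypothetical, not the study data.

    ```python
    # Sensitivity and positive predictive value (PPV) from chart-review counts.
    # The example counts are hypothetical, not taken from the study.
    def sensitivity(tp: int, fn: int) -> float:
        """Fraction of true CFM cases that received the code."""
        return tp / (tp + fn)

    def ppv(tp: int, fp: int) -> float:
        """Fraction of coded individuals who are true CFM cases."""
        return tp / (tp + fp)

    if __name__ == "__main__":
        tp, fp, fn = 70, 15, 25   # hypothetical counts for one ICD-9-CM code
        print(f"sensitivity = {sensitivity(tp, fn):.2%}")
        print(f"PPV         = {ppv(tp, fp):.2%}")
    ```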

  2. From “Smaller is Stronger” to “Size-Independent Strength Plateau”: Towards Measuring the Ideal Strength of Iron

    KAUST Repository

    Han, Wei-Zhong

    2015-04-17

    The trend from “smaller is stronger” to “size-independent strength plateau” is observed in the compression of spherical iron nanoparticles. When the diameter of iron nanospheres is less than a critical value, the maximum contact pressure saturates at 10.7 GPa, corresponding to a local shear stress of ≈9.4 GPa, which is comparable to the theoretical shear strength of iron.

  3. Strength Calculation of Locally Loaded Orthotropic Shells

    Directory of Open Access Journals (Sweden)

    Yu. I. Vinogradov

    2015-01-01

    Full Text Available The article studies laminated orthotropic cylindrical, conic, spherical, and toroidal shells, which in aircraft structures are often loaded locally over small areas of their surfaces. The aim of this work is to determine the stress concentration in shells as a function of the structure of the orthotropic composite material, the shell form and parameters, and the shape of the loading areas, whose borders do not coincide with the lines of principal curvature of the shells. For this purpose, an analytical computing algorithm for estimating the strength of shells in terms of stress is developed, which yields solutions of the boundary value problem with a controlled error. The differential equations are solved analytically, and the boundary value problem is solved with a multiplicative algorithm. The main results are graphs of stress concentration in orthotropic shells versus their parameters and loading areas bounded by circles and ellipses. Among other works aimed at determining stress concentration in shells, this one is distinguished by its analytical solution of applied problems for estimating the strength, in terms of stress, of shells of classical form. The developed analytical algorithm for solving the boundary value problem and the results obtained are useful in research and development.

  4. The ZPIC educational code suite

    Science.gov (United States)

    Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.

    2017-10-01

    Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to be run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems, which will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.

  5. ETR/ITER systems code

    Energy Technology Data Exchange (ETDEWEB)

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L. (ed.)

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  6. The Flutter Shutter Code Calculator

    Directory of Open Access Journals (Sweden)

    Yohann Tendero

    2015-08-01

    Full Text Available The goal of the flutter shutter is to make uniform motion blur invertible, by a "fluttering" shutter that opens and closes on a sequence of well chosen sub-intervals of the exposure time interval. In other words, the photon flux is modulated according to a well chosen sequence called the flutter shutter code. This article provides a numerical method that computes optimal flutter shutter codes in terms of mean square error (MSE). We assume that the observed objects follow a known (or learned) random velocity distribution. In this paper, Gaussian and uniform velocity distributions are considered. Snapshots are also optimized taking the velocity distribution into account. For each velocity distribution, the gain of the optimal flutter shutter code with respect to the optimal snapshot in terms of MSE is computed. This symmetric optimization of the flutter shutter and of the snapshot allows one to compare both solutions, i.e. camera designs, on an equal footing. Optimal flutter shutter codes are demonstrated to improve the MSE substantially compared to classic (patented or not) codes. A numerical method that makes it possible to reverse engineer any existing (patented or not) flutter shutter code is also described and an implementation is given; in this case we give the underlying velocity distribution from which a given optimal flutter shutter code comes. The combination of these two numerical methods furnishes a comprehensive study of the optimization of a flutter shutter that includes a forward and a backward numerical solution.
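
    The reason a fluttered exposure can make motion blur invertible is visible in the Fourier domain: a plain open shutter (a box) has zeros in its spectrum, while a well-chosen binary code keeps the spectrum bounded away from zero. The toy check below compares the minimum spectral magnitude of the two; the particular binary code is an arbitrary example, not an optimal code produced by the article's method.

    ```python
    # Toy check of blur-kernel invertibility: minimum |DFT| of a plain "snapshot"
    # (box) exposure versus a fluttered binary code of the same length.
    # The binary code below is an arbitrary example, not an optimized flutter code.
    import numpy as np

    def min_spectrum_magnitude(code, n_fft: int = 256) -> float:
        spectrum = np.fft.rfft(np.asarray(code, dtype=float), n=n_fft)
        return float(np.min(np.abs(spectrum)))

    box_code = [1] * 16                                        # plain open shutter
    flutter_code = [1,0,1,1,0,0,1,0,1,1,1,0,0,1,0,1]           # example on/off pattern

    print("min |DFT|, snapshot  :", round(min_spectrum_magnitude(box_code), 4))
    print("min |DFT|, fluttered :", round(min_spectrum_magnitude(flutter_code), 4))
    # A (near-)zero minimum means some frequencies are lost and the blur cannot be
    # stably inverted; a good flutter code avoids such nulls.
    ```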

  7. Surface code implementation of block code state distillation

    Science.gov (United States)

    Fowler, Austin G.; Devitt, Simon J.; Jones, Cody

    2013-01-01

    State distillation is the process of taking a number of imperfect copies of a particular quantum state and producing fewer better copies. Until recently, the lowest overhead method of distilling states produced a single improved |A〉 state given 15 input copies. New block code state distillation methods can produce k improved |A〉 states given 3k + 8 input copies, potentially significantly reducing the overhead associated with state distillation. We construct an explicit surface code implementation of block code state distillation and quantitatively compare the overhead of this approach to the old. We find that, using the best available techniques, for parameters of practical interest, block code state distillation does not always lead to lower overhead, and, when it does, the overhead reduction is typically less than a factor of three. PMID:23736868
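
    The overhead comparison in the abstract reduces to simple counting: the older protocol consumes 15 input |A⟩ states per improved output, whereas the block-code protocol consumes 3k + 8 inputs for k outputs. A few lines suffice to see where the block codes win on raw input count; this deliberately ignores the surface-code implementation costs the paper actually analyses.

    ```python
    # Raw input-state counting for magic-state distillation:
    # old protocol: 15 inputs -> 1 output; block code: 3k + 8 inputs -> k outputs.
    # This ignores the surface-code implementation overhead analysed in the paper.
    def inputs_per_output_old() -> float:
        return 15.0

    def inputs_per_output_block(k: int) -> float:
        return (3 * k + 8) / k

    for k in (1, 2, 4, 8, 16):
        print(f"k = {k:2d}: block code uses {inputs_per_output_block(k):5.2f} "
              f"inputs per output (old protocol: {inputs_per_output_old():.0f})")
    ```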

  8. Code-Mixing and Code Switchingin The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the specific forms of code switching and code mixing found in teaching and learning activities in the classroom, and to determine the factors influencing the occurrence of those forms. The research is a descriptive qualitative case study conducted at Al Mawaddah Boarding School, Ponorogo. The analysis shows that code mixing and code switching in learning activities at Al Mawaddah Boarding School occur between Javanese, Arabic, English, and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The factors determining code mixing in the learning process include identification of role, the desire to explain and interpret, and sourcing from the original language and its variations or from a foreign language. The factors determining code switching in the learning process include the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, the wish to evoke humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially at the Al Mawaddah boarding school, including the rules and characteristic variation in the language of teaching and learning activities in the classroom. Furthermore, the results provide input to the ustadz/ustadzah and students for developing oral communication skills and effective teaching and learning strategies in boarding schools.

  9. Prediction of Torsional Strength for Very High Early Strength Geopolymer

    Directory of Open Access Journals (Sweden)

    Woraphot PRACHASAREE

    2017-11-01

    Full Text Available Very high early strength geopolymers are gaining acceptance as alternative repair materials for highways and other infrastructure. In this study, a very rapid geopolymer binder based on Metakaolin (MK) and Parawood ash (PWA), developed by the authors, was experimentally tested and a prediction model for its torsional strength is proposed. The geopolymer samples were subjected to uniaxial compression, flexural beam, and torsion tests. The modulus of rupture and the torsional strength in terms of compressive strength were found to be well approximated by 0.7(f'c)^(1/2) and (1/7)x^2y(f'c)^(1/2), respectively. An interaction relation to describe combined bending and torsion was also developed in this study. In addition, the effects of the aspect ratio (y/x) were studied for both torsional strength and combined bending and torsion. It was found that an aspect ratio of y/x = 3 significantly reduced the torsional resistance, to about 50 % of the torsional strength of a square section. DOI: http://dx.doi.org/10.5755/j01.ms.23.4.17280
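
    With the proposed approximations, both quantities follow directly from the compressive strength and the cross-section dimensions; a short helper makes the units explicit. The example values of f'c, x and y below are hypothetical, not specimens from the study.

    ```python
    # Proposed approximations from the study: modulus of rupture ~ 0.7*sqrt(f'c)
    # and torsional strength ~ (1/7)*x^2*y*sqrt(f'c), with x the shorter side of
    # the cross section. Example inputs are hypothetical.
    import math

    def modulus_of_rupture(fc_mpa: float) -> float:
        """MPa, from 0.7 * sqrt(f'c)."""
        return 0.7 * math.sqrt(fc_mpa)

    def torsional_strength(fc_mpa: float, x_mm: float, y_mm: float) -> float:
        """N*mm, from (1/7) * x^2 * y * sqrt(f'c) with x the shorter side."""
        return (1.0 / 7.0) * x_mm**2 * y_mm * math.sqrt(fc_mpa)

    if __name__ == "__main__":
        fc = 50.0            # MPa, hypothetical compressive strength
        x, y = 100.0, 300.0  # mm, hypothetical cross-section (aspect ratio y/x = 3)
        print(f"modulus of rupture ~ {modulus_of_rupture(fc):.2f} MPa")
        print(f"torsional strength ~ {torsional_strength(fc, x, y)/1e6:.3f} kN*m")
    ```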

  10. Lossless Coding with Generalised Criteria

    CERN Document Server

    Charalambous, Charalambos D; Rezaei, Farzad

    2011-01-01

    This paper presents prefix codes which minimize various criteria constructed as a convex combination of maximum codeword length and average codeword length, or of maximum redundancy and average redundancy, including a convex combination of the average of an exponential function of the codeword length and the average redundancy. This framework encompasses as special cases several criteria previously investigated in the literature, while relations to universal coding are discussed. The derived coding algorithm is parametric, re-adjusting the initial source probabilities via a weighted probability vector according to a merging rule. The level of desirable merging has implications for applications where the maximum codeword length is bounded.

  11. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into sub tasks to make learning more organized and easy to follow along with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  12. Network Coding Fundamentals and Applications

    CERN Document Server

    Medard, Muriel

    2011-01-01

    Network coding is a field of information and coding theory and is a method of attaining maximum information flow in a network. This book is an ideal introduction for the communications and network engineer, working in research and development, who needs an intuitive introduction to network coding and to the increased performance and reliability it offers in many applications. This book is an ideal introduction for the research and development communications and network engineer who needs an intuitive introduction to the theory and wishes to understand the increased performance and reliabil

  13. Writing the Live Coding Book

    DEFF Research Database (Denmark)

    Blackwell, Alan; Cox, Geoff; Lee, Sang Wong

    2016-01-01

    This paper is a speculation on the relationship between coding and writing, and the ways in which technical innovations and capabilities enable us to rethink each in terms of the other. As a case study, we draw on recent experiences of preparing a book on live coding, which integrates a wide range...... of personal, historical, technical and critical perspectives. This book project has been both experimental and reflective, in a manner that allows us to draw on critical understanding of both code and writing, and point to the potential for new practices in the future....

  14. Linear network error correction coding

    CERN Document Server

    Guang, Xuan

    2014-01-01

    There are two main approaches in the theory of network error correction coding. In this SpringerBrief, the authors summarize some of the most important contributions following the classic approach, which represents messages by sequences, similar to algebraic coding, and also briefly discuss the main results following the other approach, which uses the theory of rank metric codes for network error correction by representing messages as subspaces. This book starts by establishing the basic linear network error correction (LNEC) model and then characterizes two equivalent descriptions. Distances an

  15. i-Review: Sharing Code

    Directory of Open Access Journals (Sweden)

    Jonas Kubilius

    2014-02-01

    Full Text Available Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF. GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  16. Neural Decoder for Topological Codes

    Science.gov (United States)

    Torlai, Giacomo; Melko, Roger G.

    2017-07-01

    We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.

  17. Strength training and shoulder proprioception

    National Research Council Canada - National Science Library

    Salles, José Inácio; Velasques, Bruna; Cossich, Victor; Nicoliche, Eduardo; Ribeiro, Pedro; Amaral, Marcus Vinicius; Motta, Geraldo

    2015-01-01

    .... To evaluate the result of an 8-week strength-training program on shoulder JPS and to verify whether using training intensities that are the same or divergent for the shoulder's dynamic-stabilizer...

  18. Muscle Strength and Poststroke Hemiplegia

    DEFF Research Database (Denmark)

    Kristensen, Otto H; Stenager, Egon; Dalgas, Ulrik

    2017-01-01

    OBJECTIVES: To systematically review (1) psychometric properties of criterion isokinetic dynamometry testing of muscle strength in persons with poststroke hemiplegia (PPSH); and (2) literature that compares muscle strength in patients poststroke with that in healthy controls assessed by criterion...... isokinetic dynamometry. DATA SOURCES: A systematic literature search of 7 databases was performed. STUDY SELECTION: Included studies (1) enrolled participants with definite poststroke hemiplegia according to defined criteria; (2) assessed muscle strength or power by criterion isokinetic dynamometry; (3) had...... undergone peer review; and (4) were available in English or Danish. DATA EXTRACTION: The psychometric properties of isokinetic dynamometry were reviewed with respect to reliability, validity, and responsiveness. Furthermore, comparisons of strength between paretic, nonparetic, and comparable healthy muscles...

  19. Strengths, weaknesses, opportunities and threats

    DEFF Research Database (Denmark)

    Bull, Joseph William; Jobstvogt, N.; Böhnke-Henrichs, A.

    2016-01-01

    The ecosystem services concept (ES) is becoming a cornerstone of contemporary sustainability thought. Challenges with this concept and its applications are well documented, but have not yet been systematically assessed alongside strengths and external factors that influence uptake. Such an assessment could form the basis for improving ES thinking, further embedding it into environmental decisions and management. The Young Ecosystem Services Specialists (YESS) completed a Strengths-Weaknesses-Opportunities-Threats (SWOT) analysis of ES through YESS member surveys. Strengths include the approach...

  20. Particle Strength of Bayer Hydrate

    Science.gov (United States)

    Anjier, J. L.; Marten, D. F. G.

    Because of the proposed use of fluid bed calciners at the Kaiser Aluminum Baton Rouge Works, studies into the strength of alumina and alumina trihydrate from eight different alumina plants were initiated. It was found in the course of these studies that the particle strength of Bayer hydrate depended on the precipitation process conditions under which it was produced. A series of laboratory precipitation tests was conducted to determine the effect on particle strength of process variables such as seed charge, temperature, caustic concentration and seed recycle. It is concluded from these studies that relative particle strength of alumina trihydrate, as measured by a modified Forsythe-Hertwig Apparatus, can be predicted from a knowledge of the precipitation process conditions.