WorldWideScience

Sample records for permits code-based claims

  1. Development and evaluation of a Naïve Bayesian model for coding causation of workers' compensation claims.

    Science.gov (United States)

    Bertke, S J; Meyers, A R; Wurzelbacher, S J; Bell, J; Lampl, M L; Robins, D

    2012-12-01

    Tracking and trending rates of injuries and illnesses classified as musculoskeletal disorders caused by ergonomic risk factors such as overexertion and repetitive motion (MSDs) and slips, trips, or falls (STFs) in different industry sectors is of high interest to many researchers. Unfortunately, identifying the cause of injuries and illnesses in large datasets such as workers' compensation systems often requires reading and coding the free-form accident text narrative for potentially millions of records. To alleviate the need for manual coding, this paper describes and evaluates a computer auto-coding algorithm that demonstrated the ability to code millions of claims quickly and accurately by learning from a set of previously manually coded claims. The auto-coding program was able to code claims as an MSD, STF, or other with approximately 90% accuracy. The program developed and discussed in this paper provides an accurate and efficient method for identifying the causation of workers' compensation claims as an STF or MSD in a large database based on the unstructured text narrative and resulting injury diagnoses. The program coded thousands of claims in minutes. The method described in this paper can be used by researchers and practitioners to relieve the manual burden of reading and identifying the causation of claims as an STF or MSD. Furthermore, the method can be easily generalized to code/classify other unstructured text narratives. Published by Elsevier Ltd.
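The abstract above describes learning cause-of-injury categories from previously coded claim narratives. A minimal multinomial Naïve Bayes classifier illustrates the general technique; this is a sketch under assumptions, not the authors' implementation, and the toy narratives and labels are invented for demonstration.

```python
import math
from collections import Counter, defaultdict

def train_nb(labeled_narratives):
    """Learn class priors and per-class word counts from (text, label) pairs."""
    words_by_class = defaultdict(list)
    for text, label in labeled_narratives:
        words_by_class[label].extend(text.lower().split())
    n = len(labeled_narratives)
    priors = {c: sum(1 for _, l in labeled_narratives if l == c) / n
              for c in words_by_class}
    counts = {c: Counter(ws) for c, ws in words_by_class.items()}
    vocab = {w for ws in words_by_class.values() for w in ws}
    return priors, counts, vocab

def classify(text, priors, counts, vocab):
    """Pick the class maximizing log P(class) + sum of log P(word | class),
    with Laplace (add-one) smoothing so unseen words get nonzero probability."""
    best_label, best_logp = None, float("-inf")
    for c, prior in priors.items():
        total = sum(counts[c].values())
        logp = math.log(prior)
        for w in text.lower().split():
            logp += math.log((counts[c][w] + 1) / (total + len(vocab)))
        if logp > best_logp:
            best_label, best_logp = c, logp
    return best_label

# Toy training set (invented narratives, not the study's data):
train = [
    ("slipped on wet floor and fell", "STF"),
    ("fell down stairs", "STF"),
    ("strained back lifting heavy boxes", "MSD"),
    ("repetitive motion wrist pain", "MSD"),
    ("cut hand on machine", "OTHER"),
]
priors, counts, vocab = train_nb(train)
print(classify("slipped and fell on stairs", priors, counts, vocab))  # STF
```

The log-space sum and add-one smoothing are the standard tricks that make this scale to millions of short narratives without underflow or zero-probability failures.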

  2. PS2-15: Coding for Obesity in a Health Plan Claims Database

    OpenAIRE

    Shainline, Michael; Carter, Shelley; Von Worley, Ann; Gunter, Margaret

    2010-01-01

    Background and Aims: The Centers for Disease Control estimated the obesity rate in New Mexico for 2008 to be 25.2%. Sources estimate the following associations between obesity and type 2 diabetes (80%); cardiovascular disease (70%); hypertension (26%). Yet obesity is infrequently coded as a secondary diagnosis among providers submitting claims. This study examines the frequency with which obesity is documented on claims forms, the relationship between age, gender, and obesity coding, and the...

  3. Non-binary unitary error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E.

    1996-06-01

    Error operator bases for systems of any dimension are defined and natural generalizations of the bit-flip/sign-change error basis for qubits are given. These bases allow generalizing the construction of quantum codes based on eigenspaces of Abelian groups. As a consequence, quantum codes can be constructed from linear codes over Z_n for any n. The generalization of the punctured code construction leads to many codes which permit transversal (i.e., fault-tolerant) implementations of certain operations compatible with the error basis.

  4. Positive predictive value between medical-chart body-mass-index category and obesity versus codes in a claims-data warehouse.

    Science.gov (United States)

    Caplan, Eleanor O; Kamble, Pravin S; Harvey, Raymond A; Smolarz, B Gabriel; Renda, Andrew; Bouchard, Jonathan R; Huang, Joanna C

    2018-01-01

    To evaluate the positive predictive value of claims-based V85 codes for identifying individuals with varying degrees of BMI relative to their measured BMI obtained from medical record abstraction. This was a retrospective validation study utilizing administrative claims and medical chart data from 1 January 2009 to 31 August 2015. Randomly selected samples of patients enrolled in a Medicare Advantage Prescription Drug (MAPD) or commercial health plan and with a V85 claim were identified. The claims-based BMI category (underweight, normal weight, overweight, obese class I-III) was determined via corresponding V85 codes and compared to the BMI category derived from chart-abstracted height, weight and/or BMI. The positive predictive values (PPVs) of the claims-based BMI categories were calculated with the corresponding 95% confidence intervals (CIs). The overall PPVs (95% CIs) in the MAPD and commercial samples were 90.3% (86.3%-94.4%) and 91.1% (87.3%-94.9%), respectively. In each BMI category, the PPVs (95% CIs) for the MAPD and commercial samples, respectively, were: underweight, 71.0% (55.0%-87.0%) and 75.9% (60.3%-91.4%); normal, 93.8% (85.4%-100%) and 87.8% (77.8%-97.8%); overweight, 97.4% (92.5%-100%) and 93.5% (84.9%-100%); obese class I, 96.9% (90.9%-100%) and 97.2% (91.9%-100%); obese class II, 97.0% (91.1%-100%) and 93.0% (85.4%-100%); and obese class III, 85.0% (73.3%-96.1%) and 97.1% (91.4%-100%). BMI categories derived from administrative claims, when available, can be used successfully, particularly in the context of obesity research.
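The PPV point estimates with 95% CIs reported above can be reproduced from raw validation counts. The sketch below uses a standard Wald interval, an assumption since the abstract does not name its CI method, and the counts in the example are invented.

```python
import math

def ppv_with_ci(true_positives, flagged, z=1.96):
    """PPV = TP / (TP + FP), with a Wald 95% confidence interval.
    `flagged` is the number of records the claims-based code flagged positive."""
    p = true_positives / flagged
    half = z * math.sqrt(p * (1 - p) / flagged)  # normal-approximation half-width
    return p, max(0.0, p - half), min(1.0, p + half)

# Illustrative counts only: 90 of 100 flagged charts confirmed on abstraction.
print(ppv_with_ci(90, 100))  # ≈ (0.90, 0.841, 0.959)
```

For the small per-category samples in studies like this one, a Wilson or exact (Clopper-Pearson) interval would be a more defensible choice than Wald.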

  5. Defining hip fracture with claims data: outpatient and provider claims matter.

    Science.gov (United States)

    Berry, S D; Zullo, A R; McConeghy, K; Lee, Y; Daiello, L; Kiel, D P

    2017-07-01

    Medicare claims are commonly used to identify hip fractures, but there is no universally accepted definition. We found that a definition using inpatient claims identified fewer fractures than a definition including outpatient and provider claims. Few additional fractures were identified by including inconsistent diagnostic and procedural codes at contiguous sites. Medicare claims data are commonly used in research studies to identify hip fractures, but there is no universally accepted definition of fracture. Our purpose was to describe potential misclassification when hip fractures are defined using Medicare Part A (inpatient) claims without considering Part B (outpatient and provider) claims and when inconsistent diagnostic and procedural codes occur at contiguous fracture sites (e.g., femoral shaft or pelvis). Participants included all long-stay nursing home residents enrolled in Medicare Parts A and B fee-for-service between 1/1/2008 and 12/31/2009 with follow-up through 12/31/2011. We compared the number of hip fractures identified using only Part A claims to (1) Part A plus Part B claims and (2) Part A and Part B claims plus discordant codes at contiguous fracture sites. Among 1,257,279 long-stay residents, 40,932 (3.2%) met the definition of hip fracture using Part A claims, and 41,687 residents (3.3%) met the definition using Part B claims. 4566 hip fractures identified using Part B claims would not have been captured using Part A claims. An additional 227 hip fractures were identified after considering contiguous fracture sites. When ascertaining hip fractures, a definition using outpatient and provider claims identified 11% more fractures than a definition with only inpatient claims. Future studies should publish their definition of fracture and specify if diagnostic codes from contiguous fracture sites were used.

  6. Sports-related injuries in New Zealand: National Insurance (Accident Compensation Corporation) claims for five sporting codes from 2012 to 2016.

    Science.gov (United States)

    King, Doug; Hume, Patria A; Hardaker, Natalie; Cummins, Cloe; Gissane, Conor; Clark, Trevor

    2018-03-12

    To provide epidemiological data and related costs for sport-related injuries of five sporting codes (cricket, netball, rugby league, rugby union and football) in New Zealand for moderate-to-serious and serious injury claims. A retrospective analytical review using detailed descriptive epidemiological data obtained from the Accident Compensation Corporation (ACC) for 2012-2016. Over the 5 years of study data, rugby union recorded the most moderate-to-serious injury entitlement claims (25 226) and costs (New Zealand dollars (NZD$) 267 359 440 (£139 084 749)), resulting in the highest mean cost (NZD$10 484 (£5454)) per moderate-to-serious injury entitlement claim. Rugby union recorded more serious injury entitlement claims (n=454) than cricket (t(4)=-66.6; P<0.0001), netball (t(4)=-45.1; P<0.0001), rugby league (t(4)=-61.4; P<0.0001) and football (t(4)=66.6; P<0.0001) for 2012-2016. There was a twofold increase in the number of female moderate-to-serious injury entitlement claims for football (RR 2.6 (95% CI 2.2 to 2.9); P<0.0001) compared with cricket, and a threefold increase when compared with rugby union (risk ratio (RR) 3.1 (95% CI 2.9 to 3.3); P<0.0001). Moderate-to-serious concussion claims increased between 2012 and 2016 for netball (RR 3.7 (95% CI 1.9 to 7.1); P<0.0001), rugby union (RR 2.0 (95% CI 1.6 to 2.4); P<0.0001) and football (RR 2.3 (95% CI 1.6 to 3.2); P<0.0001). Nearly a quarter of moderate-to-serious entitlement claims (23%) and costs (24%) were to participants aged 35 years or older. Rugby union and rugby league have the highest total number and costs associated with injury. Accurate sport exposure data are needed to enable injury risk calculations. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
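The risk ratios with 95% CIs quoted above can be computed from event counts and group denominators. The sketch below uses the standard log-scale (Katz) interval; the counts are hypothetical, since the ACC denominators are not given in the abstract.

```python
import math

def risk_ratio(events_a, total_a, events_b, total_b, z=1.96):
    """Risk ratio of group A vs. group B, with a 95% CI
    computed on the log scale (Katz method)."""
    rr = (events_a / total_a) / (events_b / total_b)
    # Standard error of log(RR) for independent binomial samples:
    se = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# Hypothetical counts: 20 claims among 100 participants vs. 10 among 100.
rr, lo, hi = risk_ratio(20, 100, 10, 100)
print(f"RR {rr:.1f} (95% CI {lo:.2f} to {hi:.2f})")
```

Note the CI is asymmetric around the point estimate, as in the intervals quoted in the abstract, because it is symmetric only on the log scale.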

  7. Claims-based definition of death in Japanese claims database: validity and implications.

    Science.gov (United States)

    Ooba, Nobuhiro; Setoguchi, Soko; Ando, Takashi; Sato, Tsugumichi; Yamaguchi, Takuhiro; Mochizuki, Mayumi; Kubota, Kiyoshi

    2013-01-01

    For the pending National Claims Database in Japan, researchers will not have access to death information in the enrollment files. We developed and evaluated a claims-based definition of death. We used healthcare claims and enrollment data between January 2005 and August 2009 for 195,193 beneficiaries aged 20 to 74 in 3 private health insurance unions. We developed claims-based definitions of death using discharge or disease status and Charlson comorbidity index (CCI). We calculated sensitivity, specificity and positive predictive values (PPVs) using the enrollment data as a gold standard in the overall population and subgroups divided by demographic and other factors. We also assessed bias and precision in two example studies where an outcome was death. The definition based on the combination of discharge/disease status and CCI provided moderate sensitivity (around 60%) and high specificity (99.99%) and high PPVs (94.8%). In most subgroups, sensitivity of the preferred definition was also around 60% but varied from 28 to 91%. In an example study comparing death rates between two anticancer drug classes, the claims-based definition provided valid and precise hazard ratios (HRs). In another example study comparing two classes of anti-depressants, the HR with the claims-based definition was biased and had lower precision than that with the gold standard definition. The claims-based definitions of death developed in this study had high specificity and PPVs while sensitivity was around 60%. The definitions will be useful in future studies when used with attention to the possible fluctuation of sensitivity in some subpopulations.
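The validation quantities used in this study, sensitivity, specificity, and PPV against the enrollment-file gold standard, all come from a 2x2 table. A minimal helper, with invented counts chosen only to mimic the reported pattern (sensitivity around 60%, very high specificity and PPV), looks like:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and PPV from a 2x2 table comparing a
    claims-based definition against a gold standard (e.g., enrollment files)."""
    return {
        "sensitivity": tp / (tp + fn),  # of true cases, share the definition caught
        "specificity": tn / (tn + fp),  # of true non-cases, share not flagged
        "ppv": tp / (tp + fp),          # of flagged records, share truly cases
    }

# Invented counts (not the study's): 60 true cases caught, 40 missed,
# 3 false positives among a large non-case pool.
print(diagnostic_metrics(tp=60, fp=3, fn=40, tn=99897))
```

This makes the study's trade-off concrete: with a rare outcome, even a few false positives barely dent specificity, while PPV and sensitivity are far more informative about the definition's usefulness.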

  8. Determinants of consumer understanding of health claims

    DEFF Research Database (Denmark)

    Grunert, Klaus G; Scholderer, Joachim; Rogeaux, Michel

    2011-01-01

    The new EU regulation on nutrition and health claims states that claims can be permitted only if they can be expected to be understood by consumers. Investigating determinants of consumer understanding of health claims has therefore become an important topic. Understanding of a health claim on a yoghurt product was investigated with a sample of 720 category users in Germany. Health claim understanding was measured using open answers, which were subsequently content analysed and classified by comparison with the scientific dossier of the health claim. Based on this, respondents were classified as safe, risky or other. In addition to the open questions on claim understanding, respondents rated a number of statements on claim interpretation for agreement and completed scales on interest in healthy eating, attitude to functional foods, and subjective knowledge on food and health. Results showed…

  9. Claims-Based Definition of Death in Japanese Claims Database: Validity and Implications

    Science.gov (United States)

    Ooba, Nobuhiro; Setoguchi, Soko; Ando, Takashi; Sato, Tsugumichi; Yamaguchi, Takuhiro; Mochizuki, Mayumi; Kubota, Kiyoshi

    2013-01-01

    Background For the pending National Claims Database in Japan, researchers will not have access to death information in the enrollment files. We developed and evaluated a claims-based definition of death. Methodology/Principal Findings We used healthcare claims and enrollment data between January 2005 and August 2009 for 195,193 beneficiaries aged 20 to 74 in 3 private health insurance unions. We developed claims-based definitions of death using discharge or disease status and Charlson comorbidity index (CCI). We calculated sensitivity, specificity and positive predictive values (PPVs) using the enrollment data as a gold standard in the overall population and subgroups divided by demographic and other factors. We also assessed bias and precision in two example studies where an outcome was death. The definition based on the combination of discharge/disease status and CCI provided moderate sensitivity (around 60%) and high specificity (99.99%) and high PPVs (94.8%). In most subgroups, sensitivity of the preferred definition was also around 60% but varied from 28 to 91%. In an example study comparing death rates between two anticancer drug classes, the claims-based definition provided valid and precise hazard ratios (HRs). In another example study comparing two classes of anti-depressants, the HR with the claims-based definition was biased and had lower precision than that with the gold standard definition. Conclusions/Significance The claims-based definitions of death developed in this study had high specificity and PPVs while sensitivity was around 60%. The definitions will be useful in future studies when used with attention to the possible fluctuation of sensitivity in some subpopulations. PMID:23741526

  10. Identifying Psoriasis and Psoriatic Arthritis Patients in Retrospective Databases When Diagnosis Codes Are Not Available: A Validation Study Comparing Medication/Prescriber Visit-Based Algorithms with Diagnosis Codes.

    Science.gov (United States)

    Dobson-Belaire, Wendy; Goodfield, Jason; Borrelli, Richard; Liu, Fei Fei; Khan, Zeba M

    2018-01-01

    Using diagnosis code-based algorithms is the primary method of identifying patient cohorts for retrospective studies; nevertheless, many databases lack reliable diagnosis code information. To develop precise algorithms based on medication claims/prescriber visits (MCs/PVs) to identify psoriasis (PsO) patients and psoriatic patients with arthritic conditions (PsO-AC), a proxy for psoriatic arthritis, in Canadian databases lacking diagnosis codes. Algorithms were developed using medications with narrow indication profiles in combination with prescriber specialty to define PsO and PsO-AC. For a 3-year study period from July 1, 2009, algorithms were validated using the PharMetrics Plus database, which contains both adjudicated medication claims and diagnosis codes. Positive predictive value (PPV), negative predictive value (NPV), sensitivity, and specificity of the developed algorithms were assessed using diagnosis code as the reference standard. Chosen algorithms were then applied to Canadian drug databases to profile the algorithm-identified PsO and PsO-AC cohorts. In the selected database, 183,328 patients were identified for validation. The highest PPVs for PsO (85%) and PsO-AC (65%) occurred when a predictive algorithm of two or more MCs/PVs was compared with the reference standard of one or more diagnosis codes. NPV and specificity were high (99%-100%), whereas sensitivity was low (≤30%). Reducing the number of MCs/PVs or increasing diagnosis claims decreased the algorithms' PPVs. We have developed an MC/PV-based algorithm to identify PsO patients with a high degree of accuracy, but accuracy for PsO-AC requires further investigation. Such methods allow researchers to conduct retrospective studies in databases in which diagnosis codes are absent. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
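The "two or more MCs/PVs" rule described above is, at heart, a threshold on qualifying events per patient. A hypothetical sketch (patient IDs and event labels invented, not the PharMetrics Plus schema):

```python
def identify_cohort(events_by_patient, min_events=2):
    """Return patient IDs with at least `min_events` qualifying
    medication claims or prescriber visits (MCs/PVs)."""
    return {pid for pid, events in events_by_patient.items()
            if len(events) >= min_events}

# Hypothetical qualifying-event lists keyed by patient ID:
claims = {
    "p1": ["narrow_indication_rx", "specialist_visit"],  # 2 events -> in cohort
    "p2": ["narrow_indication_rx"],                      # 1 event  -> excluded
    "p3": [],
}
print(identify_cohort(claims))  # {'p1'}
```

Raising `min_events` is exactly the PPV/sensitivity lever the abstract describes: a stricter threshold flags fewer patients but with higher confidence.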

  11. Health and nutrition content claims on Australian fast-food websites.

    Science.gov (United States)

    Wellard, Lyndal; Koukoumas, Alexandra; Watson, Wendy L; Hughes, Clare

    2017-03-01

    To determine the extent that Australian fast-food websites contain nutrition content and health claims, and whether these claims are compliant with the new provisions of the Australia New Zealand Food Standards Code ('the Code'). Systematic content analysis of all web pages to identify nutrition content and health claims. Nutrition information panels were used to determine whether products with claims met Nutrient Profiling Scoring Criteria (NPSC) and qualifying criteria, and to compare them with the Code to determine compliance. Australian websites of forty-four fast-food chains including meals, bakery, ice cream, beverage and salad chains. Any products marketed on the websites using health or nutrition content claims. Of the forty-four fast-food websites, twenty (45%) had at least one claim. A total of 2094 claims were identified on 371 products, including 1515 nutrition content claims (72%) and 579 health claims (28%). Five products with health claims (5%) and 157 products with nutrition content claims (43%) did not meet the requirements of the Code to allow them to carry such claims. New provisions in the Code came into effect in January 2016 after a 3-year transition. Food regulatory agencies should review fast-food websites to ensure compliance with the qualifying criteria for nutrition content and health claim regulations. This would prevent consumers from viewing unhealthy foods as healthier choices. Healthy choices could be facilitated by applying the NPSC to nutrition content claims. Fast-food chains should be educated on the requirements of the Code regarding claims.

  12. Consumer attitudes and understanding of cholesterol-lowering claims on food: randomized mock-package experiments with plant sterol and oat fibre claims.

    Science.gov (United States)

    Wong, C L; Mendoza, J; Henson, S J; Qi, Y; Lou, W; L'Abbé, M R

    2014-08-01

    Few studies have examined consumer acceptability or comprehension of cholesterol-lowering claims on food labels. Our objective was to assess consumer attitudes and understanding of cholesterol-lowering claims regarding plant sterols (PS) and oat fibre (OF). We conducted two studies on: (1) PS claims and (2) OF claims. Both studies involved a randomized mock-package experiment within an online survey administered to Canadian consumers. In the PS study (n=721), we tested three PS-related claims (disease risk reduction claim, function claim and nutrient content claim) and a 'tastes great' claim (control) on identical margarine containers. Similarly, in the OF study (n=710), we tested three claims related to OF and a 'tastes great' claim on identical cereal boxes. In both studies, participants answered the same set of questions on attitudes and understanding of claims after seeing each mock package. All claims that mentioned either PS or OF resulted in more positive attitudes than the taste control claim. How consumers responded to the nutrition claims between the two studies was influenced by contextual factors such as familiarity with the functional food/component and the food product that carried the claim. Permitted nutrition claims are approved based on physiological evidence and are allowed on any food product as long as the product meets the associated nutrient criteria. However, it is difficult to generalize attitudes and understanding of claims when they are so highly dependent on contextual factors.

  13. A Kantian claim permitting the randomised clinical trial.

    Science.gov (United States)

    Katz, P

    2001-01-01

    Among the most contested aspects of medical research is the randomized clinical trial (RCT). While the majority of arguments justifying the RCT and its use in medical research rest within a utilitarian framework, many Kantians claim that a deontological ethical framework prohibits the use of RCTs in medical research. This paper argues that, in fact, the RCT is permissible within a deontological framework.

  14. Health claims in the labelling and marketing of food products:

    Science.gov (United States)

    Asp, Nils-Georg; Bryngelsson, Susanne

    2007-01-01

    Since 1990 certain health claims in the labelling and marketing of food products have been allowed in Sweden within the food sector's Code of Practice. The rules were developed in close dialogue with the authorities. The legal basis was a decision by the authorities not to apply the medicinal products’ legislation to “foods normally found on the dinner table” provided the rules defined in the Code were followed. The Code of Practice lists nine well-established diet–health relationships eligible for generic disease risk reduction claims in two steps and general rules regarding nutrient function claims. Since 2001, there has also been the possibility for using “product-specific physiological claims (PFP)”, subject to premarketing evaluation of the scientific dossier supporting the claim. The scientific documentation has been approved for 10 products with PFP, and another 15 products have been found to fulfil the Code's criteria for “low glycaemic index”. In the third edition of the Code, active since 2004, conditions in terms of nutritional composition were set, i.e. “nutrient profiles”, with a general reference to the Swedish National Food Administration's regulation on the use of a particular symbol, i.e. the keyhole symbol. Applying the Swedish Code of practice has provided experience useful in the implementation of the European Regulation on nutrition and health claims made on foods, effective from 2007.

  15. Use of health care claims data to study patients with ophthalmologic conditions.

    Science.gov (United States)

    Stein, Joshua D; Lum, Flora; Lee, Paul P; Rich, William L; Coleman, Anne L

    2014-05-01

    To describe what information is or is not included in health care claims data, provide an overview of the main advantages and limitations of performing analyses using health care claims data, and offer general guidance on how to report and interpret findings of ophthalmology-related claims data analyses. Systematic review. Not applicable. A literature review and synthesis of methods for claims-based data analyses. Not applicable. Some advantages of using claims data for analyses include large, diverse sample sizes, longitudinal follow-up, lack of selection bias, and potential for complex, multivariable modeling. The disadvantages include (a) the inherent limitations of claims data, such as incomplete, inaccurate, or missing data, or the lack of specific billing codes for some conditions; and (b) the inability, in some circumstances, to adequately evaluate the appropriateness of care. In general, reports of claims data analyses should include clear descriptions of the following methodological elements: the data source, the inclusion and exclusion criteria, the specific billing codes used, and the potential confounding factors incorporated in the multivariable models. The use of claims data for research is expected to increase with the enhanced availability of data from Medicare and other sources. The use of claims data to evaluate resource use and efficiency and to determine the basis for supplementary payment methods for physicians is anticipated. Thus, it will be increasingly important for eye care providers to use accurate and descriptive codes for billing. Adherence to general guidance on the reporting of claims data analyses, as outlined in this article, is important to enhance the credibility and applicability of findings. Guidance on optimal ways to conduct and report ophthalmology-related investigations using claims data will likely continue to evolve as health services researchers refine the metrics to analyze large administrative data sets.

  16. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases.

    Science.gov (United States)

    Weycker, Derek; Sofrygin, Oleg; Seefeld, Kim; Deeter, Robert G; Legg, Jason; Edelsberg, John

    2013-02-13

    Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (an ANC-based definition) and per the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24-45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78-95) and sensitivity was 57% (46-68). For the definition including neutropenia in any position (n=71), PPV was 77% (68-87) and sensitivity was 67% (56-77). Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases--with an acceptable level of mis-classification--using diagnosis codes for neutropenia, or neutropenia plus fever.

  17. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    Directory of Open Access Journals (Sweden)

    Weycker Derek

    2013-02-01

    Full Text Available Abstract Background Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (an ANC-based definition, with body temperature ≥38.3°C or receipt of antibiotics) and per the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Results Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24–45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78–95) and sensitivity was 57% (46–68). For the definition including neutropenia in any position (n=71), PPV was 77% (68–87) and sensitivity was 67% (56–77). Conclusions Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases--with an acceptable level of mis-classification--using diagnosis codes for neutropenia, or neutropenia plus fever.

  18. 36 CFR 1201.4 - What types of claims are excluded from these regulations?

    Science.gov (United States)

    2010-07-01

    ...) Any debt based in whole or in part on conduct in violation of the antitrust laws or involving fraud... claims arising under the Internal Revenue Code (26 U.S.C. 1 et seq.) or the tariff laws of the United... collection; (e) Claims between Federal agencies; (f) Unless otherwise provided by law, administrative offset...

  19. Utilization of genetic tests: analysis of gene-specific billing in Medicare claims data.

    Science.gov (United States)

    Lynch, Julie A; Berse, Brygida; Dotson, W David; Khoury, Muin J; Coomer, Nicole; Kautter, John

    2017-08-01

    We examined the utilization of precision medicine tests among Medicare beneficiaries through analysis of gene-specific tier 1 and 2 billing codes developed by the American Medical Association in 2012. We conducted a retrospective cross-sectional study. The primary source of data was 2013 Medicare 100% fee-for-service claims. We identified claims billed for each laboratory test, the number of patients tested, expenditures, and the diagnostic codes indicated for testing. We analyzed variations in testing by patient demographics and region of the country. Pharmacogenetic tests were billed most frequently, accounting for 48% of the expenditures for new codes. The most common indications for testing were breast cancer, long-term use of medications, and disorders of lipid metabolism. There was underutilization of guideline-recommended tumor mutation tests (e.g., epidermal growth factor receptor) and substantial overutilization of a test discouraged by guidelines (methylenetetrahydrofolate reductase). Methodology-based tier 2 codes represented 15% of all claims billed with the new codes. The highest rate of testing per beneficiary was in Mississippi and the lowest rate was in Alaska. Gene-specific billing codes significantly improved our ability to conduct population-level research of precision medicine. Analysis of these data in conjunction with clinical records should be conducted to validate findings. Genet Med advance online publication 26 January 2017.

  20. Sensitivity of Claims-Based Algorithms to Ascertain Smoking Status More Than Doubled with Meaningful Use.

    Science.gov (United States)

    Huo, Jinhai; Yang, Ming; Tina Shih, Ya-Chen

    2018-03-01

    The "meaningful use of certified electronic health record" policy requires eligible professionals to record smoking status for more than 50% of all individuals aged 13 years or older in 2011 to 2012. To explore whether the coding to document smoking behavior has increased over time and to assess the accuracy of smoking-related diagnosis and procedure codes in identifying previous and current smokers. We conducted an observational study with 5,423,880 enrollees from the year 2009 to 2014 in the Truven Health Analytics database. Temporal trends of smoking coding, sensitivity, specificity, positive predictive value, and negative predictive value were measured. The rate of coding of smoking behavior improved significantly by the end of the study period. The proportion of patients in the claims data recorded as current smokers increased 2.3-fold and the proportion of patients recorded as previous smokers increased 4-fold during the 6-year period. The sensitivity of each International Classification of Diseases, Ninth Revision, Clinical Modification code was generally less than 10%. The diagnosis code of tobacco use disorder (305.1X) was the most sensitive code (9.3%) for identifying smokers. The specificities of these codes and the Current Procedural Terminology codes were all more than 98%. A large improvement in the coding of current and previous smoking behavior has occurred since the inception of the meaningful use policy. Nevertheless, the use of diagnosis and procedure codes to identify smoking behavior in administrative data is still unreliable. This suggests that quality improvements toward medical coding on smoking behavior are needed to enhance the capability of claims data for smoking-related outcomes research. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  1. INCIDENCE AND PREVALENCE OF ACROMEGALY IN THE UNITED STATES: A CLAIMS-BASED ANALYSIS.

    Science.gov (United States)

    Broder, Michael S; Chang, Eunice; Cherepanov, Dasha; Neary, Maureen P; Ludlam, William H

    2016-11-01

Acromegaly, a rare endocrine disorder, results from excessive growth hormone secretion, leading to multisystem-associated morbidities. Using 2 large nationwide databases, we estimated the annual incidence and prevalence of acromegaly in the U.S. We used 2008 to 2013 data from the Truven Health MarketScan® Commercial Claims and Encounters Database and IMS Health PharMetrics healthcare insurance claims databases, identifying health plan enrollees with claims for acromegaly (International Classification of Diseases, 9th Revision, Clinical Modification Code [ICD-9-CM] 253.0), or 1 claim with acromegaly and 1 claim for pituitary tumor, pituitary surgery, or cranial stereotactic radiosurgery. Annual incidence was calculated for each year from 2009 to 2013, and prevalence in 2013. Estimates were stratified by age and sex. Incidence was up to 11.7 cases per million person-years (PMPY) in MarketScan and 9.6 cases PMPY in PharMetrics. Rates were similar by sex but typically lowest in ≤17 year olds and higher in >24 year olds. The prevalence estimates were 87.8 and 71.0 per million per year in MarketScan and PharMetrics, respectively. Prevalence consistently increased with age but was similar by sex in each database. The current U.S. incidence of acromegaly may be up to 4 times higher and prevalence may be up to 50% higher than previously reported in European studies. Our findings correspond with the estimates reported by a recent U.S. study that used a single managed care database, supporting the robustness of these estimates in this population. Our study indicates there are approximately 3,000 new cases of acromegaly per year, with a prevalence of about 25,000 acromegaly patients in the U.S. CT = computed tomography GH = growth hormone IGF-1 = insulin-like growth factor 1 ICD-9-CM Code = International Classification of Diseases, 9th Revision, Clinical Modification Codes MRI = magnetic resonance imaging PMPY = per million person-years.
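
The rate forms used above (cases per million person-years for incidence, cases per million enrollees for prevalence) reduce to simple scalings; a sketch with invented counts, not the study's denominators:

```python
def incidence_pmpy(new_cases, person_years):
    """Incident cases per million person-years (PMPY)."""
    return new_cases / person_years * 1_000_000

def prevalence_per_million(cases, enrollees):
    """Prevalent cases per million enrollees in the observation year."""
    return cases / enrollees * 1_000_000

# e.g. 117 incident cases observed over 10 million person-years,
# and 878 prevalent cases among 10 million enrollees (illustrative numbers)
rate = incidence_pmpy(117, 10_000_000)          # ~11.7 cases PMPY
prev = prevalence_per_million(878, 10_000_000)  # ~87.8 per million
```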

  2. Guide to Permitting Hydrogen Motor Fuel Dispensing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Rivkin, Carl [National Renewable Energy Lab. (NREL), Golden, CO (United States); Buttner, William [National Renewable Energy Lab. (NREL), Golden, CO (United States); Burgess, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-03-28

    The purpose of this guide is to assist project developers, permitting officials, code enforcement officials, and other parties involved in developing permit applications and approving the implementation of hydrogen motor fuel dispensing facilities. The guide facilitates the identification of the elements to be addressed in the permitting of a project as it progresses through the approval process; the specific requirements associated with those elements; and the applicable (or potentially applicable) codes and standards by which to determine whether the specific requirements have been met. The guide attempts to identify all applicable codes and standards relevant to the permitting requirements.

  3. Musculoskeletal disorder costs and medical claim filing in the US retail trade sector.

    Science.gov (United States)

    Bhattacharya, Anasua; Leigh, J Paul

    2011-01-01

The average costs of Musculoskeletal Disorder (MSD) claims and the odds ratios for filing MSD-related medical claims were examined. The medical claims were identified by ICD-9 codes for four US Census regions within retail trade. Large private firms' medical claims data from Thomson Reuters Inc. MarketScan databases for the years 2003 through 2006 were used. Average costs were highest for claims related to the lumbar region (ICD-9 code 724.02), and the number of claims was largest for low back syndrome (ICD-9 code 724.2). Whereas the odds of filing an MSD claim did not vary greatly over time, average costs declined over time. The odds of filing claims rose with age and were higher for females and southerners than for males and non-southerners. Total estimated national medical costs for MSDs within retail trade were $389 million (2007 USD).

  4. 7 CFR 1.51 - Claims based on negligence, wrongful act or omission.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false Claims based on negligence, wrongful act or omission. 1.51 Section 1.51 Agriculture Office of the Secretary of Agriculture ADMINISTRATIVE REGULATIONS Claims § 1.51 Claims based on negligence, wrongful act or omission. (a) Authority of the Department...

  5. Indications for spine surgery: validation of an administrative coding algorithm to classify degenerative diagnoses

    Science.gov (United States)

    Lurie, Jon D.; Tosteson, Anna N.A.; Deyo, Richard A.; Tosteson, Tor; Weinstein, James; Mirza, Sohail K.

    2014-01-01

Study Design Retrospective analysis of Medicare claims linked to a multi-center clinical trial. Objective The Spine Patient Outcomes Research Trial (SPORT) provided a unique opportunity to examine the validity of a claims-based algorithm for grouping patients by surgical indication. SPORT enrolled patients for lumbar disc herniation, spinal stenosis, and degenerative spondylolisthesis. We compared the surgical indication derived from Medicare claims to that provided by SPORT surgeons, the “gold standard”. Summary of Background Data Administrative data are frequently used to report procedure rates, surgical safety outcomes, and costs in the management of spinal surgery. However, the accuracy of using diagnosis codes to classify patients by surgical indication has not been examined. Methods Medicare claims were linked to beneficiaries enrolled in SPORT. The sensitivity and specificity of three claims-based approaches to group patients based on surgical indications were examined: 1) using the first listed diagnosis; 2) using all diagnoses independently; and 3) using a diagnosis hierarchy based on the support for fusion surgery. Results Medicare claims were obtained from 376 SPORT participants, including 21 with disc herniation, 183 with spinal stenosis, and 172 with degenerative spondylolisthesis. The hierarchical coding algorithm was the most accurate approach for classifying patients by surgical indication, with sensitivities of 76.2%, 88.1%, and 84.3% for the disc herniation, spinal stenosis, and degenerative spondylolisthesis cohorts, respectively. The specificity was 98.3% for disc herniation, 83.2% for spinal stenosis, and 90.7% for degenerative spondylolisthesis. Misclassifications were primarily due to codes attributing more complex pathology to the case. Conclusion Standardized approaches for using claims data to accurately group patients by surgical indication are of widespread interest. We found that a hierarchical coding approach correctly classified over 90
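
A hierarchical coding algorithm of the kind described (approach 3) assigns each claim the highest-ranked indication whose diagnosis codes appear. The sketch below assumes a one-code-per-indication hierarchy for illustration; the study's actual code sets and ranking are not reproduced here:

```python
# Rank order is the hierarchy: a higher-ranked diagnosis "wins" even when
# codes for lower-ranked indications are also present on the claim.
# The ICD-9-CM codes are illustrative stand-ins, not the validated sets.
HIERARCHY = [
    ("degenerative spondylolisthesis", {"738.4"}),   # acquired spondylolisthesis
    ("spinal stenosis", {"724.02"}),                 # stenosis, lumbar region
    ("disc herniation", {"722.10"}),                 # lumbar disc displacement
]

def classify_indication(dx_codes):
    """Return the highest-ranked indication whose code set matches, else None."""
    codes = set(dx_codes)
    for label, code_set in HIERARCHY:
        if codes & code_set:
            return label
    return None

# A claim listing both stenosis and spondylolisthesis codes is classified
# by the higher-ranked diagnosis:
label = classify_indication(["724.02", "738.4"])  # "degenerative spondylolisthesis"
```

Ranking by the more complex pathology is what drives the misclassification pattern the authors note: extra codes on a claim can only move it up the hierarchy, never down.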

  6. Database and Registry Research in Orthopaedic Surgery: Part I: Claims-Based Data.

    Science.gov (United States)

    Pugely, Andrew J; Martin, Christopher T; Harwood, Jared; Ong, Kevin L; Bozic, Kevin J; Callaghan, John J

    2015-08-05

The use of large-scale national databases for observational research in orthopaedic surgery has grown substantially in the last decade, and the data sets can be grossly categorized as either administrative claims or clinical registries. Administrative claims data comprise the billing records associated with the delivery of health-care services. Orthopaedic researchers have used both government and private claims to describe temporal trends, geographic variation, disparities, complications, outcomes, and resource utilization associated with both musculoskeletal disease and treatment. Medicare claims comprise one of the most robust data sets used to perform orthopaedic research, with >45 million beneficiaries. The U.S. government, through the Centers for Medicare & Medicaid Services, often uses these data to drive changes in health policy. Private claims data used in orthopaedic research often comprise more heterogeneous patient demographic samples, but allow longitudinal analysis similar to that offered by Medicare claims. Discharge databases, such as the U.S. National Inpatient Sample, provide a wide national sampling of inpatient hospital stays from all payers and allow analysis of associated adverse events and resource utilization. Administrative claims data benefit from the high patient numbers obtained through a majority of hospitals. Using claims, it is possible to follow patients longitudinally throughout encounters irrespective of the location of the institution delivering health care. Some disadvantages include lack of precision of ICD-9 (International Classification of Diseases, Ninth Revision) coding schemes. Many of these data sets are expensive to purchase, complicated to organize, and labor-intensive to manipulate--often requiring trained specialists for analysis. Given the changing health-care environment, it is likely that databases will provide valuable information that has the potential to influence clinical practice improvement and health policy for

  7. Reducing medical claims cost to Ghana?s National Health Insurance scheme: a cross-sectional comparative assessment of the paper- and electronic-based claims reviews

    OpenAIRE

    Nsiah-Boateng, Eric; Asenso-Boadi, Francis; Dsane-Selby, Lydia; Andoh-Adjei, Francis-Xavier; Otoo, Nathaniel; Akweongo, Patricia; Aikins, Moses

    2017-01-01

    Background A robust medical claims review system is crucial for addressing fraud and abuse and ensuring financial viability of health insurance organisations. This paper assesses claims adjustment rate of the paper- and electronic-based claims reviews of the National Health Insurance Scheme (NHIS) in Ghana. Methods The study was a cross-sectional comparative assessment of paper- and electronic-based claims reviews of the NHIS. Medical claims of subscribers for the year, 2014 were requested fr...

  8. Testbed for Multi-Wavelength Optical Code Division Multiplexing Based on Passive Linear Unitary Filters

    National Research Council Canada - National Science Library

    Yablonovitch, Eli

    2000-01-01

    .... The equipment purchased under this grant has permitted UCLA to purchase a number of broad-band optical components, including especially some unique code division multiplexing filters that permitted...

  9. The plaintiff's two-sided mouth: defeating ADA claims based on inconsistent positions taken by the plaintiff on other claims.

    Science.gov (United States)

    Connell, D S

    1996-01-01

In the typical ADA claim, the plaintiff will claim that he or she has a disability but is nevertheless able to perform the essential functions of his or her job. This position is often in direct conflict with other non-ADA claims that the plaintiff has made or is making, where the plaintiff is claiming total disability and/or that he or she is unable to work. This article examines these phenomena, reviews the numerous recent cases that have found for employers based on these inconsistent positions of the plaintiff, and explains how employers can develop and present this defense.

  10. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

The present paper addresses fundamental concepts of reliability based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.

  11. Can Medicaid Claims Validly Ascertain Foster Care Status?

    Science.gov (United States)

    Raghavan, Ramesh; Brown, Derek S; Allaire, Benjamin T

    2017-08-01

    Medicaid claims have been used to identify populations of children in foster care in the current literature; however, the ability of such an approach to validly ascertain a foster care population is unknown. This study linked children in the National Survey of Child and Adolescent Well-Being-I to their Medicaid claims from 36 states using their Social Security numbers. Using this match, we examined discordance between caregiver report of foster care placement and the foster care eligibility code contained in the child's Medicaid claims. Only 73% of youth placed in foster care for at least a year displayed a Medicaid code for foster care eligibility. Half of all youth coming into contact with child welfare displayed discordance between caregiver report and Medicaid claims. Children with emergency department utilization, and those in primary care case management health insurance arrangements, had the highest odds of accurate ascertainment. The use of Medicaid claims to identify a cohort of children in foster care results in high rates of underascertainment. Supplementing administrative data with survey data is one way to enhance validity of ascertainment.

  12. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how … of reliability based code calibration of LRFD based design codes …

  13. Using Self-reports or Claims to Assess Disease Prevalence: It's Complicated.

    Science.gov (United States)

    St Clair, Patricia; Gaudette, Étienne; Zhao, Henu; Tysinger, Bryan; Seyedin, Roxanna; Goldman, Dana P

    2017-08-01

    Two common ways of measuring disease prevalence include: (1) using self-reported disease diagnosis from survey responses; and (2) using disease-specific diagnosis codes found in administrative data. Because they do not suffer from self-report biases, claims are often assumed to be more objective. However, it is not clear that claims always produce better prevalence estimates. Conduct an assessment of discrepancies between self-report and claims-based measures for 2 diseases in the US elderly to investigate definition, selection, and measurement error issues which may help explain divergence between claims and self-report estimates of prevalence. Self-reported data from 3 sources are included: the Health and Retirement Study, the Medicare Current Beneficiary Survey, and the National Health and Nutrition Examination Survey. Claims-based disease measurements are provided from Medicare claims linked to Health and Retirement Study and Medicare Current Beneficiary Survey participants, comprehensive claims data from a 20% random sample of Medicare enrollees, and private health insurance claims from Humana Inc. Prevalence of diagnosed disease in the US elderly are computed and compared across sources. Two medical conditions are considered: diabetes and heart attack. Comparisons of diagnosed diabetes and heart attack prevalence show similar trends by source, but claims differ from self-reports with regard to levels. Selection into insurance plans, disease definitions, and the reference period used by algorithms are identified as sources contributing to differences. Claims and self-reports both have strengths and weaknesses, which researchers need to consider when interpreting estimates of prevalence from these 2 sources.

  14. Rates, Amounts, and Determinants of Ambulatory Blood Pressure Monitoring Claim Reimbursements Among Medicare Beneficiaries

    Science.gov (United States)

    Kent, Shia T.; Shimbo, Daichi; Huang, Lei; Diaz, Keith M.; Viera, Anthony J.; Kilgore, Meredith; Oparil, Suzanne; Muntner, Paul

    2014-01-01

    Ambulatory blood pressure monitoring (ABPM) can be used to identify white coat hypertension and guide hypertensive treatment. We determined the percentage of ABPM claims submitted between 2007–2010 that were reimbursed. Among 1,970 Medicare beneficiaries with submitted claims, ABPM was reimbursed for 93.8% of claims that had an ICD-9 diagnosis code of 796.2 (“elevated blood pressure reading without diagnosis of hypertension”) versus 28.5% of claims without this code. Among claims without an ICD-9 diagnosis code of 796.2 listed, those for the component (e.g., recording, scanning analysis, physician review, reporting) versus full ABPM procedures and performed by institutional versus non-institutional providers were each more than two times as likely to be successfully reimbursed. Of the claims reimbursed, the median payment was $52.01 (25–75th percentiles: $32.95–$64.98). In conclusion, educating providers on the ABPM claims reimbursement process and evaluation of Medicare reimbursement may increase the appropriate use of ABPM and improve patient care. PMID:25492833
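
Summaries like the median payment with 25th-75th percentiles reported above can be computed directly with the standard library; the payment amounts below are invented, not the study's claims data:

```python
import statistics

# Hypothetical reimbursed amounts for a handful of ABPM claims (USD)
payments = [28.40, 32.95, 45.10, 52.01, 60.00, 64.98, 71.25]

median = statistics.median(payments)
# quantiles with n=4 returns the three quartile cut points (Q1, Q2, Q3);
# "inclusive" treats the data as the whole population rather than a sample.
q1, q2, q3 = statistics.quantiles(payments, n=4, method="inclusive")
```

Reporting the median with the interquartile range, as the abstract does, is the usual choice for payment data, which tend to be right-skewed.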

  15. ATHENA code manual. Volume 1. Code structure, system models, and solution methods

    International Nuclear Information System (INIS)

    Carlson, K.E.; Roth, P.A.; Ransom, V.H.

    1986-09-01

    The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code has been developed to perform transient simulation of the thermal hydraulic systems which may be found in fusion reactors, space reactors, and other advanced systems. A generic modeling approach is utilized which permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of a complete facility. Several working fluids are available to be used in one or more interacting loops. Different loops may have different fluids with thermal connections between loops. The modeling theory and associated numerical schemes are documented in Volume I in order to acquaint the user with the modeling base and thus aid effective use of the code. The second volume contains detailed instructions for input data preparation

  16. 78 FR 10579 - TRICARE Revision to CHAMPUS DRG-Based Payment System, Pricing of Hospital Claims

    Science.gov (United States)

    2013-02-14

    ... 0720-AB58 TRICARE Revision to CHAMPUS DRG-Based Payment System, Pricing of Hospital Claims AGENCY... change TRICARE's current regulatory provision for hospital claims priced under the DRG-based payment... under the DRG- based payment system from the beneficiary's date of admission, to pricing such claims...

  17. Ultra-processed family foods in Australia: nutrition claims, health claims and marketing techniques.

    Science.gov (United States)

    Pulker, Claire Elizabeth; Scott, Jane Anne; Pollard, Christina Mary

    2018-01-01

    To objectively evaluate voluntary nutrition and health claims and marketing techniques present on packaging of high-market-share ultra-processed foods (UPF) in Australia for their potential impact on public health. Cross-sectional. Packaging information from five high-market-share food manufacturers and one retailer were obtained from supermarket and manufacturers' websites. Ingredients lists for 215 UPF were examined for presence of added sugar. Packaging information was categorised using a taxonomy of nutrition and health information which included nutrition and health claims and five common food marketing techniques. Compliance of statements and claims with the Australia New Zealand Food Standards Code and with Health Star Ratings (HSR) were assessed for all products. Almost all UPF (95 %) contained added sugars described in thirty-four different ways; 55 % of UPF displayed a HSR; 56 % had nutrition claims (18 % were compliant with regulations); 25 % had health claims (79 % were compliant); and 97 % employed common food marketing techniques. Packaging of 47 % of UPF was designed to appeal to children. UPF carried a mean of 1·5 health and nutrition claims (range 0-10) and 2·6 marketing techniques (range 0-5), and 45 % had HSR≤3·0/5·0. Most UPF packaging featured nutrition and health statements or claims despite the high prevalence of added sugars and moderate HSR. The degree of inappropriate or inaccurate statements and claims present is concerning, particularly on packaging designed to appeal to children. Public policies to assist parents to select healthy family foods should address the quality and accuracy of information provided on UPF packaging.

  18. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  19. Internet-based Advertising Claims and Consumer Reasons for Using Electronic Cigarettes by Device Type in the US.

    Science.gov (United States)

    Pulvers, Kim; Sun, Jessica Y; Zhuang, Yue-Lin; Holguin, Gabriel; Zhu, Shu-Hong

    2017-10-01

    Important differences exist between closed-system and open-system e-cigarettes, but it is unknown whether online companies are marketing these devices differently and whether consumer reasons for using e-cigarettes vary by device type. This paper compares Internet-based advertising claims of closed- versus open-system products, and evaluates US consumers' reasons for using closed- versus open-system e-cigarettes. Internet sites selling exclusively closed (N = 130) or open (N = 129) e-cigarettes in December 2013-January 2014 were coded for advertising claims. Current users (≥18 years old) of exclusively closed or open e-cigarettes (N = 860) in a nationally representative online survey in February-March 2014 provided their main reason for using e-cigarettes. Internet sites that exclusively sold closed-system e-cigarettes were more likely to make cigarette-related claims such as e-cigarettes being healthier and cheaper than cigarettes (ps < .0001) compared to sites selling open systems. Many sites implied their products could help smokers quit. Exclusive users of both systems endorsed cessation as their top reason. Closed-system users were more likely to report their reason as "use where smoking is banned." Although promotion of e-cigarettes as cessation aids is prohibited, consumers of both systems endorsed smoking cessation as their top reason for using e-cigarettes.

  20. Pediatric radiology malpractice claims - characteristics and comparison to adult radiology claims

    Energy Technology Data Exchange (ETDEWEB)

    Breen, Micheal A.; Taylor, George A. [Boston Children' s Hospital, Department of Radiology, Boston, MA (United States); Dwyer, Kathy; Yu-Moe, Winnie [CRICO Risk Management Foundation, Boston, MA (United States)

    2017-06-15

    Medical malpractice is the primary method by which people who believe they have suffered an injury in the course of medical care seek compensation in the United States and Canada. An increasing body of research demonstrates that failure to correctly diagnose is the most common allegation made in malpractice claims against radiologists. Since the 1994 survey by the Society of Chairmen of Radiology in Children's Hospitals (SCORCH), no other published studies have specifically examined the frequency or clinical context of malpractice claims against pediatric radiologists or arising from pediatric imaging interpretation. We hypothesize that the frequency, character and outcome of malpractice claims made against pediatric radiologists differ from those seen in general radiology practice. We searched the Controlled Risk Insurance Co. (CRICO) Strategies' Comparative Benchmarking System (CBS), a private repository of approximately 350,000 open and closed medical malpractice claims in the United States, for claims related to pediatric radiology. We further queried these cases for the major allegation, the clinical environment in which the claim arose, the clinical severity of the alleged injury, indemnity paid (if payment was made), primary imaging modality involved (if applicable) and primary International Classification of Diseases, 9th revision (ICD-9) diagnosis underlying the claim. There were a total of 27,056 fully coded claims of medical malpractice in the CBS database in the 5-year period between Jan. 1, 2010, and Dec. 31, 2014. Of these, 1,472 cases (5.4%) involved patients younger than 18 years. Radiology was the primary service responsible for 71/1,472 (4.8%) pediatric cases. There were statistically significant differences in average payout for pediatric radiology claims ($314,671) compared to adult radiology claims ($174,033). The allegations were primarily diagnosis-related in 70% of pediatric radiology claims. The most common imaging modality

  1. Pediatric radiology malpractice claims - characteristics and comparison to adult radiology claims

    International Nuclear Information System (INIS)

    Breen, Micheal A.; Taylor, George A.; Dwyer, Kathy; Yu-Moe, Winnie

    2017-01-01

    Medical malpractice is the primary method by which people who believe they have suffered an injury in the course of medical care seek compensation in the United States and Canada. An increasing body of research demonstrates that failure to correctly diagnose is the most common allegation made in malpractice claims against radiologists. Since the 1994 survey by the Society of Chairmen of Radiology in Children's Hospitals (SCORCH), no other published studies have specifically examined the frequency or clinical context of malpractice claims against pediatric radiologists or arising from pediatric imaging interpretation. We hypothesize that the frequency, character and outcome of malpractice claims made against pediatric radiologists differ from those seen in general radiology practice. We searched the Controlled Risk Insurance Co. (CRICO) Strategies' Comparative Benchmarking System (CBS), a private repository of approximately 350,000 open and closed medical malpractice claims in the United States, for claims related to pediatric radiology. We further queried these cases for the major allegation, the clinical environment in which the claim arose, the clinical severity of the alleged injury, indemnity paid (if payment was made), primary imaging modality involved (if applicable) and primary International Classification of Diseases, 9th revision (ICD-9) diagnosis underlying the claim. There were a total of 27,056 fully coded claims of medical malpractice in the CBS database in the 5-year period between Jan. 1, 2010, and Dec. 31, 2014. Of these, 1,472 cases (5.4%) involved patients younger than 18 years. Radiology was the primary service responsible for 71/1,472 (4.8%) pediatric cases. There were statistically significant differences in average payout for pediatric radiology claims ($314,671) compared to adult radiology claims ($174,033). The allegations were primarily diagnosis-related in 70% of pediatric radiology claims. The most common imaging modality implicated in

  2. Pediatric radiology malpractice claims - characteristics and comparison to adult radiology claims.

    Science.gov (United States)

    Breen, Micheál A; Dwyer, Kathy; Yu-Moe, Winnie; Taylor, George A

    2017-06-01

    Medical malpractice is the primary method by which people who believe they have suffered an injury in the course of medical care seek compensation in the United States and Canada. An increasing body of research demonstrates that failure to correctly diagnose is the most common allegation made in malpractice claims against radiologists. Since the 1994 survey by the Society of Chairmen of Radiology in Children's Hospitals (SCORCH), no other published studies have specifically examined the frequency or clinical context of malpractice claims against pediatric radiologists or arising from pediatric imaging interpretation. We hypothesize that the frequency, character and outcome of malpractice claims made against pediatric radiologists differ from those seen in general radiology practice. We searched the Controlled Risk Insurance Co. (CRICO) Strategies' Comparative Benchmarking System (CBS), a private repository of approximately 350,000 open and closed medical malpractice claims in the United States, for claims related to pediatric radiology. We further queried these cases for the major allegation, the clinical environment in which the claim arose, the clinical severity of the alleged injury, indemnity paid (if payment was made), primary imaging modality involved (if applicable) and primary International Classification of Diseases, 9th revision (ICD-9) diagnosis underlying the claim. There were a total of 27,056 fully coded claims of medical malpractice in the CBS database in the 5-year period between Jan. 1, 2010, and Dec. 31, 2014. Of these, 1,472 cases (5.4%) involved patients younger than 18 years. Radiology was the primary service responsible for 71/1,472 (4.8%) pediatric cases. There were statistically significant differences in average payout for pediatric radiology claims ($314,671) compared to adult radiology claims ($174,033). The allegations were primarily diagnosis-related in 70% of pediatric radiology claims. The most common imaging modality implicated in

  3. A systematic review of validated methods to capture acute bronchospasm using administrative or claims data.

    Science.gov (United States)

    Sharifi, Mona; Krishanswami, Shanthi; McPheeters, Melissa L

    2013-12-30

To identify and assess billing, procedural, or diagnosis code, or pharmacy claim-based algorithms used to identify acute bronchospasm in administrative and claims databases. We searched the MEDLINE database from 1991 to September 2012 using controlled vocabulary and key terms related to bronchospasm, wheeze and acute asthma. We also searched the reference lists of included studies. Two investigators independently assessed the full text of studies against pre-determined inclusion criteria. Two reviewers independently extracted data regarding participant and algorithm characteristics. Our searches identified 677 citations of which 38 met our inclusion criteria. In these 38 studies, the most commonly used ICD-9 code was 493.x. Only 3 studies reported any validation methods for the identification of bronchospasm, wheeze or acute asthma in administrative and claims databases; all were among pediatric populations and only 2 offered any validation statistics. Some of the outcome definitions utilized were heterogeneous and included other disease-based diagnoses, such as bronchiolitis and pneumonia, which are typically of an infectious etiology. One study validated algorithms using Emergency Department triage chief complaint codes to diagnose acute asthma exacerbations, with ICD-9 786.07 (wheezing) showing the highest sensitivity (56%), specificity (97%), PPV (93.5%) and NPV (76%). There is a paucity of studies reporting rigorous methods to validate algorithms for the identification of bronchospasm in administrative data. The scant validated data available are limited in their generalizability to broad-based populations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Challenges in using medicaid claims to ascertain child maltreatment.

    Science.gov (United States)

    Raghavan, Ramesh; Brown, Derek S; Allaire, Benjamin T; Garfield, Lauren D; Ross, Raven E; Hedeker, Donald

    2015-05-01

    Medicaid data contain International Classification of Diseases, Clinical Modification (ICD-9-CM) codes indicating maltreatment, yet there is little information on how valid these codes are for the purposes of identifying maltreatment from health, as opposed to child welfare, data. This study assessed the validity of Medicaid codes in identifying maltreatment. Participants (n = 2,136) in the first National Survey of Child and Adolescent Well-Being were linked to their Medicaid claims obtained from 36 states. Caseworker determinations of maltreatment were compared with eight sets of ICD-9-CM codes. Of the 1,921 children identified by caseworkers as being maltreated, 15.2% had any relevant ICD-9-CM code in any of their Medicaid files across 4 years of observation. Maltreated boys and those of African American race had lower odds of displaying a maltreatment code. Using only Medicaid claims to identify maltreated children creates validity problems. Medicaid data linkage with other types of administrative data is required to better identify maltreated children. © The Author(s) 2014.

  5. Who will use claims data for the prevention of occupational trauma?

    OpenAIRE

    Larsson, Tore J

    2003-01-01

    If claims data from the public fund workers' compensation system is merged with the relevant census data, the relative distribution of occupational injury risk in the system can be calculated. A reconstituted occupational code, made from combining the present occupational and industrial codes, can be used to differentiate occupations in relation to hazards. A four-part injury severity index, generated in the claims settling process, can be used to further differentiate occupations, tasks and ...

  6. 29 CFR 1620.20 - Pay differentials claimed to be based on extra duties.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Pay differentials claimed to be based on extra duties. 1620.20 Section 1620.20 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION THE EQUAL PAY ACT § 1620.20 Pay differentials claimed to be based on extra duties. Additional...

  7. Watershed-based point sources permitting strategy and dynamic permit-trading analysis.

    Science.gov (United States)

    Ning, Shu-Kuang; Chang, Ni-Bin

    2007-09-01

    Permit-trading policy in a total maximum daily load (TMDL) program may provide an additional avenue to produce environmental benefit, which closely approximates what would be achieved through a command and control approach, with relatively lower costs. One of the important considerations that might affect the effective trading mechanism is to determine the dynamic transaction prices and trading ratios in response to seasonal changes of assimilative capacity in the river. Advanced studies associated with multi-temporal spatially varied trading ratios among point sources to manage water pollution hold considerable potential for industries and policy makers alike. This paper aims to present an integrated simulation and optimization analysis for generating spatially varied trading ratios and evaluating seasonal transaction prices accordingly. It is designed to configure a permit-trading structure basin-wide and provide decision makers with a wealth of cost-effective, technology-oriented, risk-informed, and community-based management strategies. The case study, seamlessly integrating a QUAL2E simulation model with an optimal waste load allocation (WLA) scheme in a designated TMDL study area, helps understand the complexity of varying environmental resource values over space and time. The pollutants of concern in this region, which are eligible for trading, mainly include both biochemical oxygen demand (BOD) and ammonia-nitrogen (NH3-N). The problem solution, as a consequence, suggests an array of waste load reduction targets in a well-defined WLA scheme and exhibits a dynamic permit-trading framework among different sub-watersheds in the study area. Research findings gained in this paper may extend to any transferable dynamic-discharge permit (TDDP) program worldwide.

  8. Level of Agreement and Factors Associated With Discrepancies Between Nationwide Medical History Questionnaires and Hospital Claims Data

    Directory of Open Access Journals (Sweden)

    Yeon-Yong Kim

    2017-09-01

    Objectives: The objectives of this study were to investigate the agreement between medical history questionnaire data and claims data and to identify the factors that were associated with discrepancies between these data types. Methods: Data from self-reported questionnaires that assessed an individual’s history of hypertension, diabetes mellitus, dyslipidemia, stroke, heart disease, and pulmonary tuberculosis were collected from a general health screening database for 2014. Data for these diseases were collected from a healthcare utilization claims database between 2009 and 2014. Overall agreement, sensitivity, specificity, and kappa values were calculated. Multiple logistic regression analysis was performed to identify factors associated with discrepancies and was adjusted for age, gender, insurance type, insurance contribution, residential area, and comorbidities. Results: Agreement was highest between questionnaire data and claims data based on primary codes up to 1 year before the completion of self-reported questionnaires and was lowest for claims data based on primary and secondary codes up to 5 years before the completion of self-reported questionnaires. When comparing data based on primary codes up to 1 year before the completion of self-reported questionnaires, the overall agreement, sensitivity, specificity, and kappa values ranged from 93.2 to 98.8%, 26.2 to 84.3%, 95.7 to 99.6%, and 0.09 to 0.78, respectively. Agreement was excellent for hypertension and diabetes, fair to good for stroke and heart disease, and poor for pulmonary tuberculosis and dyslipidemia. Women, younger individuals, and employed individuals were most likely to under-report disease. Conclusions: Detailed patient characteristics that had an impact on information bias were identified through the differing levels of agreement.

  9. Level of Agreement and Factors Associated With Discrepancies Between Nationwide Medical History Questionnaires and Hospital Claims Data.

    Science.gov (United States)

    Kim, Yeon-Yong; Park, Jong Heon; Kang, Hee-Jin; Lee, Eun Joo; Ha, Seongjun; Shin, Soon-Ae

    2017-09-01

    The objectives of this study were to investigate the agreement between medical history questionnaire data and claims data and to identify the factors that were associated with discrepancies between these data types. Data from self-reported questionnaires that assessed an individual's history of hypertension, diabetes mellitus, dyslipidemia, stroke, heart disease, and pulmonary tuberculosis were collected from a general health screening database for 2014. Data for these diseases were collected from a healthcare utilization claims database between 2009 and 2014. Overall agreement, sensitivity, specificity, and kappa values were calculated. Multiple logistic regression analysis was performed to identify factors associated with discrepancies and was adjusted for age, gender, insurance type, insurance contribution, residential area, and comorbidities. Agreement was highest between questionnaire data and claims data based on primary codes up to 1 year before the completion of self-reported questionnaires and was lowest for claims data based on primary and secondary codes up to 5 years before the completion of self-reported questionnaires. When comparing data based on primary codes up to 1 year before the completion of self-reported questionnaires, the overall agreement, sensitivity, specificity, and kappa values ranged from 93.2 to 98.8%, 26.2 to 84.3%, 95.7 to 99.6%, and 0.09 to 0.78, respectively. Agreement was excellent for hypertension and diabetes, fair to good for stroke and heart disease, and poor for pulmonary tuberculosis and dyslipidemia. Women, younger individuals, and employed individuals were most likely to under-report disease. Detailed patient characteristics that had an impact on information bias were identified through the differing levels of agreement.
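    The kappa values reported in this record (0.09 to 0.78) measure agreement between questionnaire and claims data beyond what chance alone would produce. A minimal sketch of Cohen's kappa for a single disease, with hypothetical cell counts chosen only for illustration:

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 table of questionnaire vs. claims status:
    a = positive in both, b = questionnaire only, c = claims only,
    d = negative in both."""
    n = a + b + c + d
    po = (a + d) / n                        # observed agreement
    pe = ((a + b) / n) * ((a + c) / n) \
       + ((c + d) / n) * ((b + d) / n)      # agreement expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical counts for one disease among 100 screened individuals:
kappa = cohens_kappa(a=20, b=5, c=10, d=65)
```

    High overall agreement can coexist with low kappa when a disease is rare, which is why records like this one report both measures.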

  10. Leading Causes of Anesthesia-Related Liability Claims in Ambulatory Surgery Centers.

    Science.gov (United States)

    Ranum, Darrell; Beverly, Anair; Shapiro, Fred E; Urman, Richard D

    2017-11-16

    We present a contemporary analysis of patient injury, allegations, and contributing factors of anesthesia-related closed claims, which involved cases that specifically occurred in free-standing ambulatory surgery centers (ASCs). We examined ASC-closed claims data between 2007 and 2014 from The Doctors Company, a medical malpractice insurer. Findings were coded using the Comprehensive Risk Intelligence Tool developed by CRICO Strategies. We compared coded data from ASC claims with hospital operating room (HOR) claims, in terms of injury severity category, nature of injury, nature of allegation, contributing factors identified, and contributing comorbidities and claim value. Ambulatory surgery center claims were more likely to be classified as medium severity than HOR claims, more likely to involve dental damage or pain than HOR claims, but less likely to involve death or respiratory or cardiac arrest. Technical performance was the most common contributing factor: 47% of ASCs and 48% of HORs. Only 7% of allegations relating to technical performance were judged to be a direct result of poor technical performance. The most common anesthesia procedures resulting in ASC claims were injection of anesthesia into a peripheral nerve (34%) and intubation (29%). Obesity was the most common contributing comorbidity in both settings. Mean closed claim value was significantly lower for ASC than HOR claims, averaging US $87,888 versus $107,325. Analysis of ASC and HOR claims demonstrates significant differences and several common sources of liability. These include improving strategies for thorough screening, preoperative assessment and risk stratifying of patients, incorporating routine dental and airway assessment and documentation, diagnosing and treating perioperative pain adequately, and improving the efficacy of communication between patients and care providers.

  11. How Danes evaluate moral claims related to abortion

    DEFF Research Database (Denmark)

    Uldall, Sigurd Wiingaard

    2015-01-01

    OBJECTIVE: To investigate how Danish citizens evaluate four moral claims related to abortion issues, regarding the moral status of the fetus, autonomy, harm and possible negative consequences of allowing abortion and to explore the association between moral beliefs and attitudes towards abortion… to at least one moral claim. Two hundred and fifty-eight responded to all four claims without using the option 'neither agree nor disagree' and were classified as 'morally engaged responders'. A majority of these had a pro-abortion moral. The general relationship between moral beliefs and attitudes towards… abortion was morally sound. Being 'morally engaged' did not increase the likelihood of reaching moral judgement on whether requests for abortion should be permitted. Education, religion and parenthood were statistically associated with the investigated issues. DISCUSSION: The direction of causality

  12. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

    This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are prese...

  13. SCDAP/RELAP5/MOD2 code manual

    International Nuclear Information System (INIS)

    Allison, C.M.; Johnson, E.C.

    1989-09-01

    The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and the fission products and aerosols in the system during a severe accident transient as well as large and small break loss-of-coolant accidents, operational transients such as anticipated transient without SCRAM, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. The modeling theory and associated numerical schemes are documented in Volumes I and II to acquaint the user with the modeling base and thus aid in effective use of the code

  14. Coding for effective denial management.

    Science.gov (United States)

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  15. TRICARE revision to CHAMPUS DRG-based payment system, pricing of hospital claims. Final rule.

    Science.gov (United States)

    2014-05-21

    This Final rule changes TRICARE's current regulatory provision for inpatient hospital claims priced under the DRG-based payment system. Claims are currently priced by using the rates and weights that are in effect on a beneficiary's date of admission. This Final rule changes that provision to price such claims by using the rates and weights that are in effect on a beneficiary's date of discharge.

  16. Tradeoffs of Using Administrative Claims and Medical Records to Identify the Use of Personalized Medicine for Patients with Breast Cancer

    Science.gov (United States)

    Liang, Su-Ying; Phillips, Kathryn A.; Wang, Grace; Keohane, Carol; Armstrong, Joanne; Morris, William M.; Haas, Jennifer S.

    2012-01-01

    Background: Administrative claims and medical records are important data sources to examine healthcare utilization and outcomes. Little is known about identifying personalized medicine technologies in these sources. Objectives: To describe agreement, sensitivity, and specificity of administrative claims compared to medical records for two pairs of targeted tests and treatments for breast cancer. Research Design: Retrospective analysis of medical records linked to administrative claims from a large health plan. We examined whether agreement varied by factors that facilitate tracking in claims (coding and cost) and that enhance medical record completeness (records from multiple providers). Subjects: Women (35-65 years) with incident breast cancer diagnosed in 2006-2007 (n=775). Measures: Use of human epidermal growth factor receptor 2 (HER2) and gene expression profiling (GEP) testing, trastuzumab and adjuvant chemotherapy in claims and medical records. Results: Agreement between claims and records was substantial for GEP, trastuzumab, and chemotherapy, and lowest for HER2 tests. GEP, an expensive test with unique billing codes, had higher agreement (91.6% vs. 75.2%), sensitivity (94.9% vs. 76.7%), and specificity (90.1% vs. 29.2%) than HER2, a test without unique billing codes. Trastuzumab, a treatment with unique billing codes, had slightly higher agreement (95.1% vs. 90%) and sensitivity (98.1% vs. 87.9%) than adjuvant chemotherapy. Conclusions: Higher agreement and specificity were associated with services that had unique billing codes and high cost. Administrative claims may be sufficient for examining services with unique billing codes. Medical records provide better data for identifying tests lacking specific codes and for research requiring detailed clinical information. PMID:21422962

  17. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  18. Quality of care in patients with atrial fibrillation in primary care: a cross-sectional study comparing clinical and claims data.

    Science.gov (United States)

    Preuss, Rebekka; Chenot, Jean-François; Angelow, Aniela

    2016-01-01

    Objectives: Atrial fibrillation (AF) is a common cardiac arrhythmia with increased risk of thromboembolic stroke. Oral anticoagulation (OAC) reduces stroke risk by up to 68%. The aim of our study was to evaluate quality of care in patients with AF in a primary health care setting with a focus on physician guideline adherence for OAC prescription and heart rate- and rhythm management. In a second step we aimed to compare OAC rates based on primary care data with rates based on claims data. Methods: We included all GP practices in the region Vorpommern-Greifswald, Germany, which were willing to participate (N=29/182, response rate 16%). Claims data was derived from the regional association of statutory health insurance physicians. Patients with a documented AF diagnosis (ICD-10-GM-Code ICD I48.-) from 07/2011-06/2012 were identified using electronic medical records (EMR) and claims data. Stroke and bleeding risk were calculated using the CHA2DS2-VASc and HAS-BLED scores. We calculated crude treatment rates for OAC, rate and rhythm control medications and adjusted OAC treatment rates based on practice and claims data. Adjusted rates were calculated including the CHA2DS2-VASc and HAS-BLED scores and individual factors affecting guideline based treatment. Results: We identified 927 patients based on EMR and 1,247 patients based on claims data. The crude total OAC treatment rate was 69% based on EMR and 61% based on claims data. The adjusted OAC treatment rates were 90% for patients based on EMR and 63% based on claims data. 82% of the AF patients received a treatment for rate control and 12% a treatment for rhythm control. The most common reasons for non-prescription of OAC were an increased risk of falling, dementia and increased bleeding risk. Conclusion: Our results suggest that a high rate of AF patients receive a drug therapy according to guidelines. There is a large difference between crude and adjusted OAC treatment rates. This is due to individual

  19. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the inpatient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
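    The coverage and volume-skew figures in this record can be computed directly from a list of prescription product codes and a DKB mapping table. A minimal sketch; the toy data below are hypothetical, not drawn from the study:

```python
from collections import Counter

def coverage_and_skew(rx_codes, dkb_codes):
    """Return (share of prescription volume whose product code the DKB
    covers, fraction of distinct codes accounting for 50% of volume)."""
    counts = Counter(rx_codes)
    total = sum(counts.values())
    covered = sum(n for code, n in counts.items() if code in dkb_codes) / total
    cum = k = 0
    for _, n in counts.most_common():   # most-prescribed codes first
        cum += n
        k += 1
        if cum >= total / 2:
            break
    return covered, k / len(counts)

# Toy example: one dominant product code and two rare ones.
covered, skew = coverage_and_skew(["a"] * 8 + ["b", "c"], dkb_codes={"a", "b"})
```

    With a distribution this skewed, mapping only the handful of high-volume codes already captures most of the message volume, which is the pattern the record describes.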

  20. XGC developments for a more efficient XGC-GENE code coupling

    Science.gov (United States)

    Dominski, Julien; Hager, Robert; Ku, Seung-Hoe; Chang, Cs

    2017-10-01

    In the Exascale Computing Program, the High-Fidelity Whole Device Modeling project initially aims at delivering a tightly-coupled simulation of plasma neoclassical and turbulence dynamics from the core to the edge of the tokamak. To permit such simulations, the gyrokinetic codes GENE and XGC will be coupled. Numerical efforts are made to improve agreement between the numerical schemes in the coupling region. One of the difficulties of coupling these codes is the incompatibility of their grids: GENE is a continuum grid-based code, while XGC is a Particle-In-Cell code using an unstructured triangular mesh. A field-aligned filter is thus implemented in XGC. Although XGC already uses an approximately field-following mesh, this filter yields a perturbation discretization closer to the one solved in the field-aligned code GENE. Additionally, new XGC gyro-averaging matrices are implemented on a velocity grid adapted to the plasma properties, thus ensuring the same accuracy from the core to the edge regions.

  1. SCDAP/RELAP5/MOD2 code manual

    International Nuclear Information System (INIS)

    Allison, C.M.; Johnson, E.C.

    1989-09-01

    The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and the fission products and aerosols in the system during a severe accident transient as well as large and small break loss-of-coolant accidents, operational transients such as anticipated transient without SCRAM, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. The modeling theory and associated numerical schemes are documented in Volumes I and in this document, Volume II, to acquaint the user with the modeling base and thus aid in effective use of the code. 135 refs., 48 figs., 8 tabs

  2. Relationship between various pressure vessel and piping codes

    International Nuclear Information System (INIS)

    Canonico, D.A.

    1976-01-01

    Section VIII of the ASME Code provides allowable stress values for material specifications that are provided in Section II, Parts A and B. Since the adoption of the ASME Code over 60 years ago, the incidence of failure has been greatly reduced. The Codes are currently based on strength criteria, and advancements in the technology of fracture toughness and fracture mechanics should permit an even greater degree of reliability and safety. This lecture discusses the various Sections of the Code. It describes the basis for the establishment of design stress allowables and promotes the use of fracture mechanics.

  3. An Individual Claims History Simulation Machine

    Directory of Open Access Journals (Sweden)

    Andrea Gabrielli

    2018-03-01

    The aim of this project is to develop a stochastic simulation machine that generates individual claims histories of non-life insurance claims. This simulation machine is based on neural networks to incorporate individual claims feature information. We provide a fully calibrated stochastic scenario generator that is based on real non-life insurance data. This stochastic simulation machine allows everyone to simulate their own synthetic insurance portfolio of individual claims histories and back-test their preferred claims reserving method.

  4. Identification of Emergency Department Visits in Medicare Administrative Claims: Approaches and Implications

    Science.gov (United States)

    Venkatesh, Arjun K.; Mei, Hao; Kocher, Keith E.; Granovsky, Michael; Obermeyer, Ziad; Spatz, Erica S.; Rothenberg, Craig; Krumholz, Harlan M.; Lin, Zhenqui

    2018-01-01

    Objectives Administrative claims data sets are often used for emergency care research and policy investigations of healthcare resource utilization, acute care practices, and evaluation of quality improvement interventions. Despite the high profile of emergency department (ED) visits in analyses using administrative claims, little work has evaluated the degree to which existing definitions based on claims data accurately captures conventionally defined hospital-based ED services. We sought to construct an operational definition for ED visitation using a comprehensive Medicare data set and to compare this definition to existing operational definitions used by researchers and policymakers. Methods We examined four operational definitions of an ED visit commonly used by researchers and policymakers using a 20% sample of the 2012 Medicare Chronic Condition Warehouse (CCW) data set. The CCW data set included all Part A (hospital) and Part B (hospital outpatient, physician) claims for a nationally representative sample of continuously enrolled Medicare fee-for-services beneficiaries. Three definitions were based on published research or existing quality metrics including: 1) provider claims–based definition, 2) facility claims–based definition, and 3) CMS Research Data Assistance Center (ResDAC) definition. In addition, we developed a fourth operational definition (Yale definition) that sought to incorporate additional coding rules for identifying ED visits. We report levels of agreement and disagreement among the four definitions. Results Of 10,717,786 beneficiaries included in the sample data set, 22% had evidence of ED use during the study year under any of the ED visit definitions. The definition using provider claims identified a total of 4,199,148 ED visits, the facility definition 4,795,057 visits, the ResDAC definition 5,278,980 ED visits, and the Yale definition 5,192,235 ED visits. The Yale definition identified a statistically different (p services in the

  5. Examination of the accuracy of coding hospital-acquired pressure ulcer stages.

    Science.gov (United States)

    Coomer, Nicole M; McCall, Nancy T

    2013-01-01

    Pressure ulcers (PU) are considered harmful conditions that are reasonably prevented if accepted standards of care are followed. They became subject to the payment adjustment for hospital-acquired conditions (HACs) beginning October 1, 2008. We examined several aspects of the accuracy of coding for pressure ulcers under the Medicare Hospital-Acquired Condition Present on Admission (HAC-POA) Program. We used the "4010" claim format as a basis of reference to show some of the issues of the old format, such as the underreporting of pressure ulcer stages on pressure ulcer claims and how the underreporting varied by hospital characteristics. We then used the rate of Stage III and IV pressure ulcer HACs reported in the Hospital Cost and Utilization Project State Inpatient Databases data to look at the sensitivity of PU HAC-POA coding to the number of diagnosis fields. We examined Medicare claims data for FYs 2009 and 2010 to examine the degree to which the presence of stage codes was underreported on pressure ulcer claims. We selected all claims with a secondary diagnosis code of pressure ulcer site (ICD-9 diagnosis codes 707.00-707.09) that were not reported as POA (POA of "N" or "U"). We then created a binary indicator for the presence of any pressure ulcer stage diagnosis code. We examined the percentage of claims with a diagnosis of a pressure ulcer site code with no accompanying pressure ulcer stage code. Our results point to underreporting of PU stages under the "4010" format and show that the reporting of stage codes varied across hospital type and location. Further, our results indicate that under the "5010" format, a higher number of pressure ulcer HACs can be expected to be reported and we should expect to encounter a larger percentage of pressure ulcers incorrectly coded as POA under the new format. The combination of the capture of 25 diagnosis codes under the new "5010" format and the change from ICD-9 to ICD-10 will likely alleviate the observed underreporting of
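    The binary indicator described in this record (a pressure ulcer site code with no accompanying stage code) is straightforward to sketch. The claim records and the stage-code list below are assumptions for illustration, not the study's actual data:

```python
# Hypothetical claim records; each carries its list of ICD-9 diagnosis codes.
# 707.00-707.09 are the pressure ulcer site codes cited in the record;
# 707.20-707.25 is an assumed stage-code list used here for illustration.
PU_SITE = {f"707.0{i}" for i in range(10)}
PU_STAGE = {f"707.2{i}" for i in range(6)}

def share_missing_stage(claims):
    """Share of pressure ulcer claims lacking any accompanying stage code."""
    flagged = [c for c in claims if PU_SITE & set(c["dx"])]
    if not flagged:
        return 0.0
    no_stage = sum(1 for c in flagged if not (PU_STAGE & set(c["dx"])))
    return no_stage / len(flagged)

claims = [
    {"dx": ["707.03", "707.23"]},  # site + stage: fully coded
    {"dx": ["707.05"]},            # site only: stage underreported
    {"dx": ["250.00"]},            # no pressure ulcer code at all
]
```

    With more diagnosis fields per claim (the "5010" format), a stage code present in the chart is simply more likely to survive onto the claim, which is why the authors expect the indicator to improve.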

  6. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  7. Validity of vascular trauma codes at major trauma centres.

    Science.gov (United States)

    Altoijry, Abdulmajeed; Al-Omran, Mohammed; Lindsay, Thomas F; Johnston, K Wayne; Melo, Magda; Mamdani, Muhammad

    2013-12-01

    The use of administrative databases in vascular injury research has been increasing, but the validity of the diagnosis codes used in this research is uncertain. We assessed the positive predictive value (PPV) of International Classification of Diseases, tenth revision (ICD-10) vascular injury codes in administrative claims data in Ontario. We conducted a retrospective validation study using the Canadian Institute for Health Information Discharge Abstract Database, an administrative database that records all hospital admissions in Canada. We evaluated 380 randomly selected hospital discharge abstracts from the 2 main trauma centres in Toronto, Ont., St. Michael's Hospital and Sunnybrook Health Sciences Centre, between Apr. 1, 2002, and Mar. 31, 2010. We then compared these records with the corresponding patients' hospital charts to assess the level of agreement for procedure coding. We calculated the PPV and sensitivity to estimate the validity of vascular injury diagnosis coding. The overall PPV for vascular injury coding was estimated to be 95% (95% confidence interval [CI] 92.3-96.8). The PPV among code groups for neck, thorax, abdomen, upper extremity and lower extremity injuries ranged from 90.8% (95% CI 82.2-95.5) to 97.4% (95% CI 91.0-99.3), whereas sensitivity ranged from 90% (95% CI 81.5-94.8) to 98.7% (95% CI 92.9-99.8). Administrative claims hospital discharge data based on ICD-10 diagnosis codes have a high level of validity when identifying cases of vascular injury. Observational Study Level III.
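    The PPV and sensitivity estimates above reduce to proportions with confidence intervals over chart-review counts. A minimal sketch with hypothetical counts and a normal-approximation CI (the study's exact interval method is not stated):

```python
import math

def proportion_ci(k, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion k/n."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical chart-review results for one code group:
tp, fp, fn = 180, 9, 12   # true positives, false positives, false negatives
ppv = proportion_ci(tp, tp + fp)    # chart confirms the coded diagnosis
sens = proportion_ci(tp, tp + fn)   # coded cases among chart-confirmed cases
print(f"PPV {ppv[0]:.1%} ({ppv[1]:.1%}-{ppv[2]:.1%}), "
      f"sensitivity {sens[0]:.1%} ({sens[1]:.1%}-{sens[2]:.1%})")
```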

  8. Consumers’ Health-Related Motive Orientations and Reactions to Claims about Dietary Calcium

    Directory of Open Access Journals (Sweden)

    Christine Hoefkens

    2013-01-01

    Full Text Available Health claims may contribute to better informed and healthier food choices and to improved industrial competitiveness by marketing foods that support healthier lifestyles in line with consumer preferences. With the more stringent European Union regulation of nutrition and health claims, insights into consumers' health-related goal patterns and their reactions towards such claims are needed to inform the content of lawful claims. This study investigated how consumers' explicit and implicit health-related motive orientations (HRMOs), together with the type of calcium claim (nutrition claim, health claim, and reduction-of-disease-risk claim), influence the perceived credibility and purchasing intention of calcium-enriched fruit juice. Data were collected in April 2006 through a consumer survey with 341 Belgian adults. The findings indicate that stronger implicit HRMOs (i.e., indirect benefits of calcium for personal health) are associated with higher perceived credibility, which is not (yet) translated into a higher purchasing intention. Consumers' explicit HRMOs, which refer to direct benefits or physiological functions of calcium in the body (as legally permitted in current calcium claims in the EU), do not associate with reactions to the claims. Independently of consumers' HRMOs, the claim type significantly affects the perceived credibility and purchasing intention of the product. Implications for nutrition policy makers and food industries are discussed.

  9. Identification of Emergency Department Visits in Medicare Administrative Claims: Approaches and Implications.

    Science.gov (United States)

    Venkatesh, Arjun K; Mei, Hao; Kocher, Keith E; Granovsky, Michael; Obermeyer, Ziad; Spatz, Erica S; Rothenberg, Craig; Krumholz, Harlan M; Lin, Zhenqui

    2017-04-01

    Administrative claims data sets are often used for emergency care research and policy investigations of healthcare resource utilization, acute care practices, and evaluation of quality improvement interventions. Despite the high profile of emergency department (ED) visits in analyses using administrative claims, little work has evaluated the degree to which existing definitions based on claims data accurately capture conventionally defined hospital-based ED services. We sought to construct an operational definition for ED visitation using a comprehensive Medicare data set and to compare this definition to existing operational definitions used by researchers and policymakers. We examined four operational definitions of an ED visit commonly used by researchers and policymakers using a 20% sample of the 2012 Medicare Chronic Condition Warehouse (CCW) data set. The CCW data set included all Part A (hospital) and Part B (hospital outpatient, physician) claims for a nationally representative sample of continuously enrolled Medicare fee-for-service beneficiaries. Three definitions were based on published research or existing quality metrics: 1) a provider claims-based definition, 2) a facility claims-based definition, and 3) the CMS Research Data Assistance Center (ResDAC) definition. In addition, we developed a fourth operational definition (the Yale definition) that sought to incorporate additional coding rules for identifying ED visits. We report levels of agreement and disagreement among the four definitions. Of 10,717,786 beneficiaries included in the sample data set, 22% had evidence of ED use during the study year under any of the ED visit definitions. The definition using provider claims identified a total of 4,199,148 ED visits, the facility definition 4,795,057 visits, the ResDAC definition 5,278,980 ED visits, and the Yale definition 5,192,235 ED visits. The Yale definition identified a statistically different (p < 0.05) collection of ED visits than all other definitions.
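    One way to quantify agreement among such operational definitions is to treat each as a set of visit keys and compare overlaps. A minimal sketch, with hypothetical visit keys and definition names loosely following the study:

```python
# Each definition yields a set of (beneficiary, service_date) visit keys.
# Pairwise Jaccard overlap shows where the definitions agree and disagree.
# All visit keys below are hypothetical.

defs = {
    "provider": {("A", "2012-01-03"), ("B", "2012-02-10")},
    "facility": {("A", "2012-01-03"), ("C", "2012-03-07")},
    "resdac":   {("A", "2012-01-03"), ("B", "2012-02-10"), ("C", "2012-03-07")},
}

def agreement(a, b):
    """Share of visits in either definition that appear in both (Jaccard index)."""
    return len(a & b) / len(a | b)

for x in defs:
    for y in defs:
        if x < y:   # each unordered pair once
            print(x, "vs", y, f"{agreement(defs[x], defs[y]):.2f}")
```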

  10. System Based Code: Principal Concept

    International Nuclear Information System (INIS)

    Yasuhide Asada; Masanori Tashimo; Masahiro Ueta

    2002-01-01

    This paper introduces the concept of the 'System Based Code', initially proposed by the authors with the aim of giving the nuclear industry a leap forward in system reliability, performance improvement, and cost reduction. The concept of the System Based Code provides a theoretical procedure for optimizing the reliability of a system by administering every related engineering requirement throughout the life of the system, from design to decommissioning. (authors)

  11. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent (and therefore parallel) development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
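    The levelization property described above (package "use" relations forming a directed acyclic graph) can be checked mechanically. A sketch using Kahn's topological-sort algorithm; the package names are hypothetical, not from the EAP code base:

```python
# deps[p] = set of lower-level packages that p uses. Kahn's algorithm assigns a
# level to each package; any package left unprocessed means a dependency cycle,
# i.e., the package set is not levelized.

from collections import deque

def levelize(deps):
    """Return {package: level}, or raise ValueError if the graph has a cycle."""
    level = {}
    indeg = {p: len(uses) for p, uses in deps.items()}
    users = {p: [] for p in deps}
    for p, uses in deps.items():
        for q in uses:
            users[q].append(p)          # p depends on q
    queue = deque(p for p, d in indeg.items() if d == 0)
    while queue:
        q = queue.popleft()
        level[q] = max((level[u] + 1 for u in deps[q]), default=0)
        for p in users[q]:
            indeg[p] -= 1
            if indeg[p] == 0:
                queue.append(p)
    if len(level) != len(deps):
        raise ValueError("dependency cycle: package set is not levelized")
    return level

deps = {"util": set(), "mesh": {"util"}, "physics": {"mesh", "util"},
        "driver": {"physics"}}
print(levelize(deps))  # → {'util': 0, 'mesh': 1, 'physics': 2, 'driver': 3}
```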

  12. Hanford facility dangerous waste permit application, general information portion

    International Nuclear Information System (INIS)

    Hays, C.B.

    1998-01-01

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in this report)

  13. Health and nutrition content claims on websites advertising infant formula available in Australia: A content analysis.

    Science.gov (United States)

    Berry, Nina J; Gribble, Karleen D

    2017-10-01

    The use of health and nutrition content claims in infant formula advertising is restricted by many governments in response to WHO policies and WHA resolutions. The purpose of this study was to determine whether such prohibited claims could be observed on Australian websites that advertise infant formula products. A comprehensive internet search was conducted to identify websites that advertise infant formula available for purchase in Australia. Content analysis was used to identify prohibited claims. The coding frame was closely aligned with the provisions of the Australia New Zealand Food Standards Code, which prohibits these claims. The outcome measures were the presence of health claims, nutrition content claims, or references to the nutritional content of human milk. Web pages advertising 25 unique infant formula products available for purchase in Australia were identified. Every advertisement (100%) contained at least one health claim. Eighteen (72%) also contained at least one nutrition content claim. Three web pages (12%) advertising brands associated with infant formula products referenced the nutritional content of human milk. All of these claims appear in spite of national regulations prohibiting them, indicating a failure of monitoring and/or enforcement. Where countries have enacted instruments to prohibit health and other claims in infant formula advertising, the marketing of infant formula must be actively monitored for those instruments to be effective. © 2016 John Wiley & Sons Ltd.

  14. Non-fatal workplace violence workers' compensation claims (1993-1996).

    Science.gov (United States)

    Hashemi, L; Webster, B S

    1998-06-01

    More is known about fatal workplace violence than non-fatal workplace violence (NFWV). This study provides descriptive information on the number and cost of NFWV claims filed with a large workers' compensation carrier. NFWV claims from 51 US jurisdictions were selected either by cause codes or by word search from the accident-description narrative. Claims reported in 1993 through 1996 were analyzed to report the frequency, cost, gender, age, industry, and nature of injury. An analysis of a random sample of 600 claims provided information on perpetrator type, cause of events, and injury mechanism. A total of 28,692 NFWV claims were filed during the study period. No cost was incurred for 32.5% of the claims, and 15.5% received payments for lost work. As a percentage of all claims filed by industry, schools had the highest percentage (11.4%) of NFWV claims, and banking had the highest percentage (11.5%) of cost. The majority of claims in the banking random sample group (93%) were due to stress. In the random sample, 90.3% of claims were caused by criminals (51.8%) or by patients, clients, or customers (38.5%). Only 9.7% were caused by an employee (9.2%) or a personal acquaintance of the employee (0.5%). Employers should acknowledge that NFWV incidents occur, recognize that the majority of perpetrators are criminals or clients rather than employees, and develop appropriate prevention and intervention programs.
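    The two selection routes described above (cause codes or a word search over the accident-description narrative) can be sketched directly. The cause code and keyword lists below are illustrative assumptions, not the study's actual lists:

```python
# Select claims either by an assault-related cause code or by a keyword match on
# the free-text accident narrative. Codes and keywords here are hypothetical.
import re

VIOLENCE_CAUSE_CODES = {"04"}   # hypothetical "assault" cause code
KEYWORDS = re.compile(r"\b(assault|attack|punch|robbery|threat)", re.IGNORECASE)

def is_violence_claim(claim):
    return (claim["cause_code"] in VIOLENCE_CAUSE_CODES
            or bool(KEYWORDS.search(claim["narrative"])))

claims = [
    {"id": 1, "cause_code": "04", "narrative": "struck by patient"},
    {"id": 2, "cause_code": "10", "narrative": "robbery at teller window"},
    {"id": 3, "cause_code": "10", "narrative": "slipped on wet floor"},
]
print([c["id"] for c in claims if is_violence_claim(c)])  # → [1, 2]
```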

  15. Accuracy of claims-based algorithms for epilepsy research: Revealing the unseen performance of claims-based studies.

    Science.gov (United States)

    Moura, Lidia M V R; Price, Maggie; Cole, Andrew J; Hoch, Daniel B; Hsu, John

    2017-04-01

    To evaluate published algorithms for the identification of epilepsy cases in medical claims data using a unique linked dataset with both clinical and claims data. Using data from a large, regional health delivery system, we identified all patients contributing biologic samples to the health system's Biobank (n = 36K). We identified all subjects with at least one diagnosis potentially consistent with epilepsy, for example, epilepsy, convulsions, syncope, or collapse, between 2014 and 2015, or who were seen at the epilepsy clinic (n = 1,217), plus a random sample of subjects with neither claims nor clinic visits (n = 435); we then performed a medical chart review in a random subsample of 1,377 to assess the epilepsy diagnosis status. Using the chart review as the reference standard, we evaluated the test characteristics of six published algorithms. The best-performing algorithm used diagnostic and prescription drug data (sensitivity = 70%, 95% confidence interval [CI] 66-73%; specificity = 77%, 95% CI 73-81%; and area under the curve [AUC] = 0.73, 95% CI 0.71-0.76) when applied to patients age 18 years or older. Restricting the sample to adults aged 18-64 years resulted in a mild improvement in accuracy (AUC = 0.75, 95% CI 0.73-0.78). Adding information about current antiepileptic drug use to the algorithm increased test performance (AUC = 0.78, 95% CI 0.76-0.80). Other algorithms varied in their included data types and performed worse. Current approaches for identifying patients with epilepsy in insurance claims have important limitations when applied to the general population. Approaches incorporating a range of information, for example, diagnoses, treatments, and site of care/specialty of physician, improve the performance of identification and could be useful in epilepsy studies using large datasets. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
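    Evaluating a claims-based case-finding algorithm against a chart-review reference standard reduces to standard test characteristics. A self-contained sketch with hypothetical data, computing AUC via the rank (Mann-Whitney) formulation:

```python
# Sensitivity/specificity at a threshold, plus AUC as the probability that a
# true case receives a higher algorithm score than a non-case. Data are hypothetical.

def sens_spec(pred, truth):
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum((not p) and (not t) for p, t in zip(pred, truth))
    return tp / sum(truth), tn / (len(truth) - sum(truth))

def auc(scores, truth):
    pos = [s for s, t in zip(scores, truth) if t]
    neg = [s for s, t in zip(scores, truth) if not t]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

truth  = [1, 1, 1, 0, 0, 0, 0]                   # chart-confirmed epilepsy
scores = [0.9, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1]     # algorithm's predicted probability
pred   = [s >= 0.5 for s in scores]              # classify at a 0.5 threshold

print(sens_spec(pred, truth), auc(scores, truth))
```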

  16. RELAP5/MOD3 code manual: Code structure, system models, and solution methods. Volume 1

    International Nuclear Information System (INIS)

    1995-08-01

    The RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during postulated accidents. The code models the coupled behavior of the reactor coolant system and the core for loss-of-coolant accidents and operational transients, such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits simulating a variety of thermal-hydraulic systems. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater systems. RELAP5/MOD3 code documentation is divided into seven volumes: Volume I provides modeling theory and associated numerical schemes.

  17. Current status of nutrition labelling and claims in the South-East Asian region: are we in harmony?

    Science.gov (United States)

    Tee, E-Siong; Tamin, Suryani; Ilyas, Rosmulyati; Ramos, Adelisa; Tan, Wei-Ling; Lai, Darwin Kah-Soon; Kongchuntuk, Hataya

    2002-01-01

    This review covers the situation of nutrition labelling and claims in six countries in South-East Asia: Brunei, Indonesia, Malaysia, the Philippines, Singapore and Thailand. With the exception of Malaysia, there are no mandatory nutrition labelling requirements for foods in these countries, except for special categories of foods and when nutritional claims are made for fortified or enriched foods. Nevertheless, several food manufacturers, especially multinationals, do voluntarily label the nutritional content of a number of food products. There is, therefore, increasing interest among authorities in countries in the region in formulating regulations for nutrition labelling for a wider variety of foods. Malaysia has proposed new regulations to make it mandatory to label a number of foodstuffs with the four core nutrients: protein, carbohydrate, fat and energy. Other countries have preferred to start with voluntary labelling by manufacturers, but have spelt out the requirements for this voluntary labelling. The format and requirements for nutrition labelling differ widely among countries in the region. Some countries, such as Malaysia, closely follow the Codex guidelines on nutrition labelling in terms of format, components to be included and mode of expression. Other countries, such as the Philippines and Thailand, have drafted nutrition labelling regulations very similar to those of the Nutrition Labeling and Education Act (NLEA) of the United States. Nutrition and health claims are also not specifically permitted under food regulations that were enacted before 1998. However, various food products on the market have been carrying a variety of nutrition and health claims. There is concern that, without proper regulations, the food industry may not be certain as to what claims can be made. Excessive and misleading claims made by irresponsible manufacturers would only serve to confuse and mislead the consumer.
In recent years, there have been efforts in countries in

  18. Controlling for Frailty in Pharmacoepidemiologic Studies of Older Adults: Validation of an Existing Medicare Claims-based Algorithm.

    Science.gov (United States)

    Cuthbertson, Carmen C; Kucharska-Newton, Anna; Faurot, Keturah R; Stürmer, Til; Jonsson Funk, Michele; Palta, Priya; Windham, B Gwen; Thai, Sydney; Lund, Jennifer L

    2018-07-01

    Frailty is a geriatric syndrome characterized by weakness and weight loss and is associated with adverse health outcomes. It is often an unmeasured confounder in pharmacoepidemiologic and comparative effectiveness studies using administrative claims data. Among the Atherosclerosis Risk in Communities (ARIC) Study Visit 5 participants (2011-2013; n = 3,146), we conducted a validation study to compare a Medicare claims-based algorithm of dependency in activities of daily living (or dependency) developed as a proxy for frailty with a reference standard measure of phenotypic frailty. We applied the algorithm to the ARIC participants' claims data to generate a predicted probability of dependency. Using the claims-based algorithm, we estimated the C-statistic for predicting phenotypic frailty. We further categorized participants by their predicted probability of dependency (<5%, 5% to <20%, and ≥20%) and estimated associations with difficulties in physical abilities, falls, and mortality. The claims-based algorithm showed good discrimination of phenotypic frailty (C-statistic = 0.71; 95% confidence interval [CI] = 0.67, 0.74). Participants classified with a high predicted probability of dependency (≥20%) had higher prevalence of falls and difficulty in physical ability, and a greater risk of 1-year all-cause mortality (hazard ratio = 5.7 [95% CI = 2.5, 13]) than participants classified with a low predicted probability (<5%). Sensitivity and specificity varied across predicted probability of dependency thresholds. The Medicare claims-based algorithm showed good discrimination of phenotypic frailty and high predictive ability with adverse health outcomes. This algorithm can be used in future Medicare claims analyses to reduce confounding by frailty and improve study validity.
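    The exposure categorization described above, binning each participant's claims-based predicted probability of dependency at the study's <5%, 5% to <20%, and ≥20% cut points, can be sketched directly. The probabilities below are hypothetical:

```python
# Bin predicted probabilities of dependency into the study's three categories
# and tabulate the distribution. Input probabilities are hypothetical.
from collections import Counter

def dependency_category(p):
    if p < 0.05:
        return "low (<5%)"
    if p < 0.20:
        return "medium (5% to <20%)"
    return "high (>=20%)"

probs = [0.02, 0.04, 0.11, 0.19, 0.25, 0.60]
counts = Counter(dependency_category(p) for p in probs)
print(counts)  # → 2 participants in each category
```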

  19. Sickness benefit claims due to mental disorders in Brazil : associations in a population-based study

    NARCIS (Netherlands)

    Barbosa-Branco, Anadergh; Bultmann, Ute; Steenstra, Ivan

    2012-01-01

    This study aims to determine the prevalence and duration of sickness benefit claims due to mental disorders and their association with economic activity, sex, age, work-relatedness and income replacement using a population-based study of sickness benefit claims (> 15 days) due to mental disorders in

  20. Group representations, error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.
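    As a concrete instance (not drawn from the report itself): the single-qubit Pauli matrices form a nice error basis, the simplest case of the group-theoretic construction discussed above. Their defining trace-orthogonality can be verified with plain complex arithmetic:

```python
# The Pauli matrices {I, X, Y, Z} are unitary and pairwise orthogonal under the
# Hilbert-Schmidt inner product tr(a^dagger b), i.e. tr(E_i^dagger E_j) = 2*delta_ij.

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

def hs_inner(a, b):
    """Hilbert-Schmidt inner product tr(a^dagger b) for 2x2 matrices."""
    return sum(a[i][j].conjugate() * b[i][j] for i in range(2) for j in range(2))

basis = [I, X, Y, Z]
for i, a in enumerate(basis):
    for j, b in enumerate(basis):
        expected = 2 if i == j else 0
        assert abs(hs_inner(a, b) - expected) < 1e-12
print("Pauli basis is trace-orthogonal")
```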

  1. 29 CFR 4281.18 - Outstanding claims for withdrawal liability.

    Science.gov (United States)

    2010-07-01

    ... INSOLVENCY, REORGANIZATION, TERMINATION, AND OTHER RULES APPLICABLE TO MULTIEMPLOYER PLANS DUTIES OF PLAN... in insolvency proceedings. The plan sponsor shall value an outstanding claim for withdrawal liability... title 11, United States Code, or any case or proceeding under similar provisions of state insolvency...

  2. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    Fuel performance codes approximate this complex behavior using an axisymmetric, axially stacked, one-dimensional radial representation to save computation cost. However, the need for improved modeling of PCMI and, in particular, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven approach in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermo-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on a comparison of the models in these codes, the state of the art can be assessed and the requirements and direction of development for a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specialized pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed

  3. Healthfulness and nutritional composition of Canadian prepackaged foods with and without sugar claims.

    Science.gov (United States)

    Bernstein, Jodi T; Franco-Arellano, Beatriz; Schermel, Alyssa; Labonté, Marie-Ève; L'Abbé, Mary R

    2017-11-01

    The objective of this study was to evaluate differences in calories, nutrient content, overall healthfulness, and use of sweetener ingredients between products with and without sugar claims. Consumers assume products with sugar claims are healthier and lower in calories. It is therefore important that claims be found on comparatively healthier items. This study is a cross-sectional analysis of the University of Toronto's 2013 Food Label Database. Subcategories where at least 5% of products (and n ≥ 5) carried a sugar claim were included (n = 3048). Differences between products with and without sugar claims in median calorie content, nutrient content, and overall healthfulness, using the Food Standards Australia/New Zealand Nutrient Profiling Scoring criterion, were determined. The proportion of products with and without claims that had excess free sugar levels (≥10% of calories from free sugar) and that contained sweeteners was also determined. Almost half (48%) of products with sugar claims contained excess free sugar, and a greater proportion contained sweeteners than products without such claims (30% vs 5%, χ² = 338.6, p < 0.001). Products with sugar claims had lower median free sugar and calorie contents than products without claims. At the subcategory level, reductions in free sugar contents were not always met with similar reductions in calorie contents. This study highlights concerns with regard to the nutritional composition of products bearing sugar claims. Findings can support educational messaging to assist consumer interpretation of sugar claims and can inform changes in nutrition policies, for example, permitting sugar claims only on products with calorie reductions and without excess free sugar.
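    The "excess free sugar" screen above (free sugar supplying ≥10% of calories) is simple to operationalize using 4 kcal per gram of sugar. The product figures below are hypothetical:

```python
# Flag products where free sugar supplies at least 10% of total calories,
# using the standard 4 kcal per gram of sugar. Product data are hypothetical.

def excess_free_sugar(free_sugar_g, total_kcal):
    return (free_sugar_g * 4) / total_kcal >= 0.10

products = {
    "fruit snack, 'no added sugar' claim": (6.0, 90),   # 24/90 ≈ 27% of kcal
    "plain cracker, no claim":             (0.5, 120),  # 2/120 ≈ 1.7% of kcal
}
for name, (sugar_g, kcal) in products.items():
    print(name, "->", "excess" if excess_free_sugar(sugar_g, kcal) else "ok")
```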

  4. Conscience claims, metaphysics, and avoiding an LGBT eugenic.

    Science.gov (United States)

    Brummett, Abram

    2018-06-01

    Novel assisted reproductive technologies (ART) are poised to present our society with strange new ethical questions, such as whether lesbian, gay, bisexual, and transgender (LGBT) couples should be allowed to produce children biologically related to both parents, or whether trans-women who want to experience childbirth should be allowed to receive uterine transplants. Clinicians opposed to offering such technologies to LGBT couples on moral grounds are likely to seek legal shelter through the conscience clauses enshrined in U.S. law. This paper begins by briefly discussing some novel ART on the horizon and noting that it is unclear whether current conscience clauses will permit fertility clinics to deny such services to LGBT individuals. A compromise approach to conscience is any view that sees the value of respecting conscience claims within limits. I describe and critique the constraints proposed in the recent work of Wicclair, NeJaime and Siegel as ultimately begging the question. My purpose is to strengthen their arguments by suggesting that in the controversial situations that elicit claims of conscience, bioethicists should engage with the metaphysical claims in play. I argue that conscience claims against LGBT individuals ought to be constrained because the underlying metaphysic (that God has decreed the LGBT lifestyle to be sinful) is highly implausible from the perspective of a naturalized metaphysic, which ought to be the lens through which we evaluate conscience claims. © 2018 John Wiley & Sons Ltd.

  5. 38 CFR 3.160 - Status of claims.

    Science.gov (United States)

    2010-07-01

    ..., Compensation, and Dependency and Indemnity Compensation Claims § 3.160 Status of claims. The following definitions are applicable to claims for pension, compensation, and dependency and indemnity compensation. (a... for a benefit received after final disallowance of an earlier claim, or any application based on...

  6. 32 CFR 536.120 - Claims payable as maritime claims.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Claims payable as maritime claims. 536.120... ACCOUNTS CLAIMS AGAINST THE UNITED STATES Maritime Claims § 536.120 Claims payable as maritime claims. A claim is cognizable under this subpart if it arises in or on a maritime location, involves some...

  7. 20 CFR 10.405 - Who is considered a dependent in a claim based on disability or impairment?

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Who is considered a dependent in a claim based on disability or impairment? 10.405 Section 10.405 Employees' Benefits OFFICE OF WORKERS... Disability and Impairment § 10.405 Who is considered a dependent in a claim based on disability or impairment...

  8. FIRE PERMIT NOW ON EDH!

    CERN Multimedia

    TIS General Safety Group or

    2001-01-01

    The electronic version of the Fire Permit form is now active. The aim of the Fire Permit procedure is to reduce the risk of fire or explosion. It is mandatory when performing 'hot work' (mainly activities involving the use of naked flames or other heat sources, e.g. welding, brazing, cutting, grinding, etc.). Its use is explained in the CERN Fire Protection Code E (Fire Protection). The new electronic form, which is substantially unchanged from the previous authorizing procedure, will be available on the Electronic Document Handling system (https://edh.cern.ch/) as of 1st September 2001. From this date, use of the paper version should be discontinued.

  9. The cost of respirable coal mine dust: an analysis based on new black lung claims

    Energy Technology Data Exchange (ETDEWEB)

    Page, S.J.; Organiscak, J.A.; Lichtman, K. [US Bureau of Mines, Pittsburgh, PA (United States). Dept. of the Interior

    1997-12-01

    The article summarizes the monetary costs of new compensation claims associated with levels of unmitigated respirable coal mine dust and the resultant lung disease known as black lung, and compares these compensation costs to the cost of dust control technology research by the US Bureau of Mines. It presents an analysis of these expenditures and projects costs over the period from 1991 to 2010, based on projected future new claims assumed to be approved for federal and state benefit payment. Since current and future dust control research efforts cannot change past claim histories, a valid comparison of future research spending with other incurred costs must examine only the cost of future new claims. The bias of old claim costs was eliminated in this analysis by examining only claims filed since 1980. The results estimate that, for an expected 339 new approved claims annually from 1991 to 2010, Federal Trust Fund costs will be 985 million dollars. During this same period, state black lung compensation is estimated to be 18.2 billion dollars. The Bureau of Mines dust control research expenditures are estimated at 0.44% of the projected future black lung-related costs. 9 refs., 4 figs., 3 tabs.
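    A back-of-envelope check of the projection above, assuming the 0.44% research-share figure applies to the combined federal and state totals (an interpretation, not stated explicitly in the abstract):

```python
# 339 new approved federal claims per year over 1991-2010, against the stated
# $985 million Trust Fund cost and $18.2 billion state compensation estimate.

claims_per_year, years = 339, 20
total_claims = claims_per_year * years            # 6,780 projected claims
trust_fund_cost = 985e6                           # dollars, from the article
state_cost = 18.2e9                               # dollars, from the article
cost_per_claim = trust_fund_cost / total_claims   # federal cost per claim
research_share = 0.0044                           # 0.44% of projected costs
research_spend = research_share * (trust_fund_cost + state_cost)
print(total_claims, f"${cost_per_claim:,.0f}/claim",
      f"${research_spend / 1e6:.0f}M research")
```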

  10. Hanford facility dangerous waste permit application, PUREX storage tunnels

    International Nuclear Information System (INIS)

    Price, S.M.

    1997-01-01

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating treatment, storage, and/or disposal units, such as the PUREX Storage Tunnels (this document, DOE/RL-90-24). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the US Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needs defined by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the PUREX Storage Tunnels permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents Section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the PUREX Storage Tunnels permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this PUREX Storage Tunnels permit application documentation is current as of April 1997

  11. Act No. 1183, Civil Code, 23 December 1985.

    Science.gov (United States)

    1987-01-01

    This document contains major provisions of Paraguay's 1985 Civil Code. The Code sets the marriage age at 16 for males and 14 for females and forbids marriage between natural and adopted relatives as well as between persons of the same sex. Bigamy is forbidden, as is marriage between a person and someone convicted of attempting or committing homicide against that person's spouse. Legal incompetents may not marry. Underage minors may marry with the permission of their parents or a court. Noted among the rights and duties of a married couple is the stipulation that husbands (or a judge) must give their approval before wives can legally run a business or work outside of the house or perform other specified activities. Valid marriages are dissolved only upon the death of one spouse. Remarriage in Paraguay after divorce abroad is forbidden. Spouses may legally separate after 2 years of married life (married minors must remain together until 2 years past the age of majority). Marital separation may be requested for adultery, attempted homicide by one spouse upon the other, dishonest or immoral conduct, extreme cruelty or abuse, voluntary or malicious abandonment, or the state of habitual intoxication or repeated use of drugs. Marriages can be annulled in specified cases. Marital property is subject to the community property regime, but each spouse may retain control of specified types of personal property. The Code appoints the husband as manager of community property within limits and reserves certain property to the wife. The Code permits premarital agreements about property management, and covers the dissolution and liquidation of the community property regime. The Code also sets provisions governing "de facto" unions; filiation for children born in and outside of wedlock; claims for parental recognition; kinship; and the duty to provide maintenance to spouses, children, and other relatives.

  12. 38 CFR 3.311 - Claims based on exposure to ionizing radiation.

    Science.gov (United States)

    2010-07-01

    ... to ionizing radiation. 3.311 Section 3.311 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF... Evaluations; Service Connection § 3.311 Claims based on exposure to ionizing radiation. (a) Determinations of... to ionizing radiation in service, an assessment will be made as to the size and nature of the...

  13. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

    A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with the decreasing probability of predicting the source of a qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, are equally exceptional qualities of the code in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  14. State waste discharge permit application 400 Area secondary cooling water. Revision 2

    International Nuclear Information System (INIS)

    1996-01-01

    This document constitutes the Washington Administrative Code 173-216 State Waste Discharge Permit Application that serves as interim compliance as required by Consent Order DE 91NM-177, for the 400 Area Secondary Cooling Water stream. As part of the Hanford Federal Facility Agreement and Consent Order negotiations, the US Department of Energy, Richland Operations Office, the US Environmental Protection Agency, and the Washington State Department of Ecology agreed that liquid effluent discharges to the ground on the Hanford Site that affect groundwater or have the potential to affect groundwater would be subject to permitting under the structure of Chapter 173-216 of the Washington Administrative Code, the State Waste Discharge Permitting Program. As a result of this decision, the Washington State Department of Ecology and the US Department of Energy, Richland Operations Office entered into Consent Order DE 91NM-177. The Consent Order DE 91NM-177 requires a series of permitting activities for liquid effluent discharges. Based upon compositional and flow rate characteristics, liquid effluent streams on the Hanford Site have been categorized into Phase 1, Phase 2, and Miscellaneous streams. This document only addresses the 400 Area Secondary Cooling Water stream, which has been identified as a Phase 2 stream. The 400 Area Secondary Cooling Water stream includes contribution streams from the Fuels and Materials Examination Facility, the Maintenance and Storage Facility, the 481-A pump house, and the Fast Flux Test Facility

  15. Multiplexed coding in the human basal ganglia

    Science.gov (United States)

    Andres, D. S.; Cerquetti, D.; Merello, M.

    2016-04-01

    A classic controversy in neuroscience is whether information carried by spike trains is encoded by a time-averaged measure (e.g. a rate code) or by complex temporal patterns (i.e. a time code). Here we apply a tool to quantitatively analyze the neural code. We make use of an algorithm based on the calculation of the temporal structure function, which makes it possible to distinguish which scales of a signal are dominated by a complex temporal organization and which by a randomly generated process. In terms of the neural code, this kind of analysis makes it possible to detect the temporal scales at which a time-pattern coding scheme or, alternatively, a rate code is present. Additionally, by finding the temporal scale at which the correlation between interspike intervals fades, the length of the basic information unit of the code can be established, and hence the word length of the code can be found. We apply this algorithm to neuronal recordings obtained from the Globus Pallidus pars interna of a human patient with Parkinson’s disease, and show that a time-pattern coding scheme and a rate coding scheme co-exist at different temporal scales, offering a new example of multiplexed neuronal coding.
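The structure-function analysis described above can be sketched as follows; the function name and the use of a first-order (mean absolute difference) statistic over interspike intervals are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def temporal_structure_function(isi, max_lag):
    """First-order structure function S(k) = <|x[i+k] - x[i]|> of an
    interspike-interval (ISI) sequence, for lags k = 1..max_lag."""
    isi = np.asarray(isi, dtype=float)
    return np.array([np.mean(np.abs(isi[k:] - isi[:-k]))
                     for k in range(1, max_lag + 1)])

# A purely random (renewal) spike train: S(k) is essentially flat across
# lags, whereas temporal patterning shows up as structure in S(k).
rng = np.random.default_rng(0)
random_isi = rng.exponential(scale=10.0, size=5000)
s = temporal_structure_function(random_isi, max_lag=20)
```

For a renewal process the curve is flat; a change of regime in S(k) at some lag would mark the temporal scale at which interspike-interval correlations fade.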

  16. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: 1) Analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; 2) Analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. Flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, constant or variable in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  17. Hanford Facility Dangerous Waste Permit Application, 222-S Laboratory Complex

    International Nuclear Information System (INIS)

    WILLIAMS, J.F.

    2000-01-01

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating treatment, storage, and/or disposal units, such as the 222-S Laboratory Complex (this document, DOE/RL-91-27). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1987 and 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needs defined by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the 222-S Laboratory Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this 222-S Laboratory Complex permit application documentation is current as of August 2000

  18. Data linkage of inpatient hospitalization and workers' claims data sets to characterize occupational falls.

    Science.gov (United States)

    Bunn, Terry L; Slavova, Svetla; Bathke, Arne

    2007-07-01

    The identification of industry, occupation, and associated injury costs for worker falls in Kentucky has not been fully examined. The purpose of this study was to determine the associations between industry and occupation and 1) hospitalization length of stay; 2) hospitalization charges; and 3) workers' claims costs in workers suffering falls, using linked inpatient hospitalization discharge and workers' claims data sets. Hospitalization cases were selected with ICD-9-CM external cause of injury codes for falls and payer code of workers' claims for years 2000-2004. Selection criteria for workers' claims cases were International Association of Industrial Accident Boards and Commissions Electronic Data Interchange Nature (IAIABCEDIN) injuries coded as falls and/or slips. Common data variables between the two data sets, such as date of birth, gender, date of injury, and hospital admission date, were used to perform probabilistic data linkage using LinkSolv software. Statistical analysis was performed with non-parametric tests. Construction falls were the most prevalent for male workers and incurred the highest hospitalization and workers' compensation costs, whereas most female worker falls occurred in the services industry. The largest percentage of male worker falls was from one level to another, while the largest percentage of females experienced a fall, slip, or trip (not otherwise classified). When male construction worker falls were further analyzed, laborers and helpers had longer hospital stays as well as higher total charges when the worker fell from one level to another. Data linkage of hospitalization and workers' claims falls data provides additional information on industry, occupation, and costs that are not available when examining either data set alone.
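Probabilistic linkage of the kind performed here (via LinkSolv in the study) scores candidate record pairs by agreement on shared fields. A minimal Fellegi-Sunter-style sketch, with hypothetical m- and u-probabilities chosen purely for illustration:

```python
import math

# Illustrative m- and u-probabilities (hypothetical, not from the study):
#   m = P(field agrees | records are a true match)
#   u = P(field agrees | records are not a match)
FIELDS = {
    "date_of_birth": (0.95, 0.01),
    "gender":        (0.98, 0.50),
    "injury_date":   (0.90, 0.05),
    "admit_date":    (0.85, 0.10),
}

def match_weight(record_a, record_b):
    """Fellegi-Sunter composite weight: sum of log2 agreement/disagreement
    weights across the comparison fields. High weights suggest a match."""
    w = 0.0
    for field, (m, u) in FIELDS.items():
        if record_a[field] == record_b[field]:
            w += math.log2(m / u)
        else:
            w += math.log2((1 - m) / (1 - u))
    return w

hosp = {"date_of_birth": "1970-03-02", "gender": "M",
        "injury_date": "2003-07-15", "admit_date": "2003-07-15"}
claim = {"date_of_birth": "1970-03-02", "gender": "M",
         "injury_date": "2003-07-15", "admit_date": "2003-07-16"}
w = match_weight(hosp, claim)  # still a high weight despite one disagreement
```

Pairs scoring above a chosen threshold are accepted as links; a low-information field such as gender contributes little weight because its u-probability is high.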

  19. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    Science.gov (United States)

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  20. 37 CFR 10.67 - Settling similar claims of clients.

    Science.gov (United States)

    2010-07-01

    ... clients. 10.67 Section 10.67 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE... Office Code of Professional Responsibility § 10.67 Settling similar claims of clients. A practitioner who represents two or more clients shall not make or participate in the making of an aggregate settlement of the...

  1. Discussion of the question whether a local authority can claim to be affected in its planning competence by a permit issued for construction of a radwaste processing plant. High Administrative Court Lueneburg, judgement of 21.1.1993 - 7 K 5/90

    International Nuclear Information System (INIS)

    Czajka, D.

    1993-01-01

    A local authority has taken legal action against the first partial permit for the construction of a radwaste conditioning pilot plant at Gorleben, claiming to be affected in its planning competence by the fact that transport of spent fuel elements between the spent fuel storage facility and the pilot plant 2 km away would have to proceed on the rural district road. The action has been discussed. Appealable head notes: A local authority is not affected in its planning competence by a permit issued for construction of a facility for radwaste processing, although the operation of said facility may result in radwaste being transported by a road crossing the local authority's territory. (orig.)

  2. Tradeable CO2 emission permits for cost-effective control of global warming

    International Nuclear Information System (INIS)

    Kosobud, R.F.; South, D.W.; Daly, T.A.; Quinn, K.G.

    1991-01-01

    Many current global warming mitigation policy proposals call for large, near-term reductions in CO2 emissions, thereby entailing high initial carbon emission tax rates or permit prices. This paper claims that these high initial tax rates or permit prices are not cost-effective in achieving the desired degree of climate change control. A cost-effective permit system is proposed and described that, under certain assumptions, would allow markets to optimally lead permit prices along a gradually increasing trajectory over time. This price path reflects the Hotelling result and would ease the abrupt, inefficient, and costly adjustments imposed on the fossil fuel and other industries in current proposals. This finding is demonstrated using the Argonne Model, a linear programming energy-environmental-economic model that allows for intertemporal optimization of consumer energy well-being. 12 refs., 3 figs., 1 tab
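The Hotelling logic behind the gradually rising price path — the permit price grows at the discount rate, so its discounted value is constant — can be illustrated numerically (the starting price and rate below are arbitrary, not the Argonne Model's values):

```python
def hotelling_price_path(p0, r, years):
    """Permit price rising at discount rate r each year (Hotelling rule):
    the present value p_t / (1 + r)**t stays constant at p0."""
    return [p0 * (1.0 + r) ** t for t in range(years)]

# Illustrative: $5/ton starting price, 5% discount rate, 30-year horizon.
path = hotelling_price_path(p0=5.0, r=0.05, years=30)
```

The path rises smoothly rather than starting high, which is the adjustment-cost argument made in the abstract.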

  3. Normative Beliefs, Discursive Claims, and Implementation of Reform-Based Science Standards

    Science.gov (United States)

    Veal, William R.; Riley Lloyd, Mary E.; Howell, Malia R.; Peters, John

    2016-01-01

    Reform-based science instruction is guided by teachers' normative beliefs. Discursive claims are how teachers say they teach science. Previous research has studied the change in teachers' beliefs and how beliefs influence intended practice and action in the classroom. Few studies have connected what teachers believe, how they say they teach, and…

  4. Algorithms to identify colonic ischemia, complications of constipation and irritable bowel syndrome in medical claims data: development and validation.

    Science.gov (United States)

    Sands, Bruce E; Duh, Mei-Sheng; Cali, Clorinda; Ajene, Anuli; Bohn, Rhonda L; Miller, David; Cole, J Alexander; Cook, Suzanne F; Walker, Alexander M

    2006-01-01

    A challenge in the use of insurance claims databases for epidemiologic research is accurate identification and verification of medical conditions. This report describes the development and validation of claims-based algorithms to identify colonic ischemia, hospitalized complications of constipation, and irritable bowel syndrome (IBS). From the research claims databases of a large healthcare company, we selected at random 120 potential cases of IBS and 59 potential cases each of colonic ischemia and hospitalized complications of constipation. We sought the written medical records and were able to abstract 107, 57, and 51 records, respectively. We established a 'true' case status for each subject by applying standard clinical criteria to the available chart data. Comparing the insurance claims histories to the assigned case status, we iteratively developed, tested, and refined claims-based algorithms that would capture the diagnoses obtained from the medical records. We set goals of high specificity for colonic ischemia and hospitalized complications of constipation, and high sensitivity for IBS. The resulting algorithms substantially improved on the accuracy achievable from a naïve acceptance of the diagnostic codes attached to insurance claims. The specificities for colonic ischemia and serious complications of constipation were 87.2 and 92.7%, respectively, and the sensitivity for IBS was 98.9%. U.S. commercial insurance claims data appear to be usable for the study of colonic ischemia, IBS, and serious complications of constipation. (c) 2005 John Wiley & Sons, Ltd.
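The validation step — comparing algorithm output against chart-abstracted "true" case status — reduces to a confusion matrix. A generic sketch with toy data (not the study's figures):

```python
def validation_metrics(algorithm_flags, chart_status):
    """Sensitivity and specificity of a claims-based case-finding algorithm
    measured against chart-abstracted 'true' case status."""
    pairs = list(zip(algorithm_flags, chart_status))
    tp = sum(1 for a, c in pairs if a and c)          # flagged, true case
    fn = sum(1 for a, c in pairs if not a and c)      # missed true case
    tn = sum(1 for a, c in pairs if not a and not c)  # correctly not flagged
    fp = sum(1 for a, c in pairs if a and not c)      # false alarm
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 8 abstracted charts, algorithm flags 4 of them as cases.
flags = [True, True, True, False, False, False, True, False]
charts = [True, True, False, False, False, False, True, True]
sens, spec = validation_metrics(flags, charts)  # 0.75, 0.75
```

Iterating on the claims algorithm amounts to adjusting its rules until these two numbers meet the pre-set goals (here, high specificity for the rarer outcomes and high sensitivity for IBS).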

  5. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. FRAPCON and FRAPTRAN contain 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermo-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D, and are typically used to investigate localized behavior. In 2008, the Idaho National Laboratory (INL) began developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation into these codes and a comparison of their models, the requirements and direction of development for a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specialized pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed. Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena, occurring over distances ranging from inter-atomic spacing to meters, and time scales ranging from microseconds to years. This multiphysics behavior is often tightly coupled, a well-known example being the thermomechanical behavior. Adding to this complexity, important aspects of fuel behavior are inherently
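As a minimal illustration of the FE machinery these codes build on (a generic textbook example, not any particular code's thermal model), a 1-D steady heat-conduction problem with linear elements:

```python
import numpy as np

def fe_1d_heat(n_elems, length, k, q):
    """Solve -k u'' = q on [0, length] with u(0) = u(L) = 0 using linear
    finite elements on a uniform mesh. Returns nodal temperatures."""
    h = length / n_elems
    n = n_elems + 1
    K = np.zeros((n, n))
    f = np.zeros(n)
    for e in range(n_elems):
        # Element stiffness matrix and consistent load vector.
        K[e:e + 2, e:e + 2] += (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        f[e:e + 2] += q * h / 2.0
    # Dirichlet boundary conditions u(0) = u(L) = 0.
    K[0, :] = K[-1, :] = 0.0
    K[0, 0] = K[-1, -1] = 1.0
    f[0] = f[-1] = 0.0
    return np.linalg.solve(K, f)

u = fe_1d_heat(n_elems=20, length=1.0, k=2.0, q=8.0)
# Analytic solution: u(x) = q x (L - x) / (2k); midpoint value q L^2/(8k) = 0.5
```

Production fuel codes couple many such discretized equations (thermal, mechanical, species transport) on 2-D/3-D meshes, but the assembly-and-solve pattern is the same.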

  6. Incorporating Code-Based Software in an Introductory Statistics Course

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  7. Claims-based studies of oral glucose-lowering medications can achieve balance in critical clinical variables only observed in electronic health records.

    Science.gov (United States)

    Patorno, Elisabetta; Gopalakrishnan, Chandrasekar; Franklin, Jessica M; Brodovicz, Kimberly G; Masso-Gonzalez, Elvira; Bartels, Dorothee B; Liu, Jun; Schneeweiss, Sebastian

    2018-04-01

    To evaluate the extent to which balance in unmeasured characteristics of patients with type 2 diabetes (T2DM) was achieved in claims data, by comparing against more detailed information from linked electronic health records (EHR) data. Within a large US commercial insurance database and using a cohort design, we identified patients with T2DM initiating linagliptin or a comparator agent within class (ie, another dipeptidyl peptidase-4 inhibitor) or outside class (ie, pioglitazone or a sulphonylurea) between May 2011 and December 2012. We focused on comparators used at a similar stage of diabetes to linagliptin. For each comparison, 1:1 propensity score (PS) matching was used to balance >100 baseline claims-based characteristics, including proxies of diabetes severity and duration. Additional clinical data from EHR were available for a subset of patients. We assessed representativeness of the claims-EHR-linked subset, evaluated the balance of claims- and EHR-based covariates before and after PS-matching via standardized differences (SDs), and quantified the potential bias associated with observed imbalances. From a claims-based study population of 166 613 patients with T2DM, 7219 (4.3%) patients were linked to their EHR data. Claims-based characteristics in the EHR-linked and EHR-unlinked patients were similar. The balance of claims-based and EHR-based patient characteristics appeared to be reasonable before PS-matching and generally improved in the PS-matched population. PS-matching achieved balance in covariates typically unmeasured in administrative claims datasets, to the extent that residual confounding is unlikely. © 2017 John Wiley & Sons Ltd.
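Standardized differences (SDs), the balance metric used here, can be computed per covariate with the standard pooled-variance formula; the function name and the toy cohorts are ours:

```python
import math

def standardized_difference(x_treated, x_comparator):
    """Pooled standardized difference for a continuous covariate; values
    below ~0.1 are conventionally read as adequate balance."""
    m1 = sum(x_treated) / len(x_treated)
    m0 = sum(x_comparator) / len(x_comparator)
    v1 = sum((x - m1) ** 2 for x in x_treated) / (len(x_treated) - 1)
    v0 = sum((x - m0) ** 2 for x in x_comparator) / (len(x_comparator) - 1)
    return (m1 - m0) / math.sqrt((v1 + v0) / 2.0)

# Identical cohorts are perfectly balanced (SD = 0); a shifted cohort is not.
sd_balanced = standardized_difference([1, 2, 3, 4, 5], [1, 2, 3, 4, 5])
sd_shifted = standardized_difference([1, 2, 3, 4, 5], [3, 4, 5, 6, 7])
```

The study's point is that after PS-matching on claims covariates alone, SDs computed on the richer EHR covariates also shrink.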

  8. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of an LT code, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through a pseudo-random number generator such as the linear congruential method. This paper applies the pseudo-randomness of chaotic sequences in the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
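The idea can be sketched as follows; the Kent-map parameter, the seed, and the use of the ideal (rather than robust) soliton distribution are our illustrative choices, not the paper's exact settings:

```python
from collections import Counter

def kent_map(x0=0.3, m=0.7):
    """Kent (skew tent) chaotic map, used here as the pseudo-random stream
    in place of a linear congruential generator."""
    x = x0
    while True:
        x = x / m if x < m else (1.0 - x) / (1.0 - m)
        yield x

def ideal_soliton_cdf(k):
    """CDF of the ideal soliton degree distribution over k input symbols:
    p(1) = 1/k, p(d) = 1/(d(d-1)) for d = 2..k."""
    probs = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    cdf, acc = [], 0.0
    for p in probs:
        acc += p
        cdf.append(acc)
    return cdf

def sample_degree(u, cdf):
    """Invert the CDF at a chaotic variate u in (0, 1)."""
    for d, c in enumerate(cdf):
        if u <= c:
            return d
    return len(cdf) - 1

k = 100
cdf = ideal_soliton_cdf(k)
stream = kent_map()
degrees = [sample_degree(next(stream), cdf) for _ in range(2000)]
```

A second, independently seeded Kent map would pick the neighbour indices for each encoding symbol in the same way.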

  9. A no-go theorem for a two-dimensional self-correcting quantum memory based on stabilizer codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara

    2009-01-01

    We study properties of stabilizer codes that permit a local description on a regular D-dimensional lattice. Specifically, we assume that the stabilizer group of a code (the gauge group for subsystem codes) can be generated by local Pauli operators such that the support of any generator is bounded by a hypercube of size O(1). Our first result concerns the optimal scaling of the distance d with the linear size of the lattice L. We prove an upper bound d = O(L^{D-1}) which is tight for D=1, 2. This bound applies to both subspace and subsystem stabilizer codes. Secondly, we analyze the suitability of stabilizer codes for building a self-correcting quantum memory. Any stabilizer code with geometrically local generators can be naturally transformed to a local Hamiltonian penalizing states that violate the stabilizer condition. A degenerate ground state of this Hamiltonian corresponds to the logical subspace of the code. We prove that for D=1, 2, different logical states can be mapped into each other by a sequence of single-qubit Pauli errors such that the energy of all intermediate states is upper bounded by a constant independent of the lattice size L. The same result holds if there are unused logical qubits that are treated as 'gauge qubits'. It demonstrates that a self-correcting quantum memory cannot be built using stabilizer codes in dimensions D=1, 2. This result is in sharp contrast with the existence of a classical self-correcting memory in the form of a two-dimensional (2D) ferromagnet. Our results leave open the possibility for a self-correcting quantum memory based on 2D subsystem codes or on 3D subspace or subsystem codes.

  10. Nutrition issues in Codex: health claims, nutrient reference values and WTO agreements: a conference report.

    Science.gov (United States)

    Aggett, Peter J; Hathcock, John; Jukes, David; Richardson, David P; Calder, Philip C; Bischoff-Ferrari, Heike; Nicklas, Theresa; Mühlebach, Stefan; Kwon, Oran; Lewis, Janine; Lugard, Maurits J F; Prock, Peter

    2012-03-01

    Codex documents may be used as educational and consensus materials for member governments. Also, the WTO SPS Agreement recognizes Codex as the presumptive international authority on food issues. Nutrient bioavailability is a critical factor in determining the ability of nutrients to provide beneficial effects. Bioavailability also influences the quantitative dietary requirements that are the basis of nutrient intake recommendations and NRVs. Codex, EFSA and some national regulatory authorities have established guidelines or regulations that will permit several types of health claims. The scientific basis for claims has been established by the US FDA and EFSA, but not yet by Codex. Evidence-based nutrition differs from evidence-based medicine, but the differences are only recently gaining recognition. Health claims on foods may provide useful information to consumers, but many will interpret the information to mean that they can rely upon the food or nutrient to eliminate a disease risk. NRVs are designed to provide a quantitative basis for comparing the nutritive values of foods, helping to illustrate how specific foods fit into the overall diet. The INL-98 and the mean of adult male and female values provide NRVs that are sufficient when used as targets for individual intakes by most adults. WTO recognizes Codex as the primary international authority on food issues. Current regulatory schemes based on recommended dietary allowances are trade restrictive. A substantial number of decisions by the EFSA could lead to violation of WTO agreements.

  11. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized, variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The selection of which coder codes any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
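A single DCT source coder of the kind mixed by MBC can be sketched as follows; the block size, coefficient budget, and smooth test block are our illustrative choices, not the paper's configuration:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix (rows = frequencies)."""
    j = np.arange(n)
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * j[None, :] + 1) * j[:, None] / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

def code_block(block, keep):
    """Transform-code one square block: 2-D DCT, keep the `keep`
    largest-magnitude coefficients, zero the rest, inverse transform."""
    C = dct_matrix(block.shape[0])
    coeffs = C @ block @ C.T
    thresh = np.sort(np.abs(coeffs).ravel())[-keep]
    coeffs[np.abs(coeffs) < thresh] = 0.0
    return C.T @ coeffs @ C

# A smooth 8x8 block: 8 of 64 coefficients already give low distortion --
# the regime in which MBC's threshold test keeps the low-rate coder.
x = np.linspace(0.0, 1.0, 8)
smooth = np.outer(x, x)
recon = code_block(smooth, keep=8)
mse = float(np.mean((smooth - recon) ** 2))
```

In MBC, blocks whose distortion exceeds the threshold at the low rate would instead be handed to a higher-rate coder (or subdivided), which is what makes the scheme variable-rate.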

  12. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation relates this artistic fascination with code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation from the computer’s materiality. Cramer is thus the voice of a new ‘code avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation

  13. LEARNING POINTS FROM WHISTLEBLOWER CLAIMS AGAINST INSTITUTIONS OF HIGHER EDUCATION

    Directory of Open Access Journals (Sweden)

    Christopher R. Schmidt

    2015-12-01

    The types of whistleblowing claims made against institutions of higher education are not well understood, nor are the various mechanisms used to solicit, investigate, and learn from such claims at the institutional and state levels. This research obtained and analyzed whistleblower claims made against institutions of higher education, and explores and facilitates a discussion around the value of the learning opportunities that come from whistleblowing claims. Aggregate claims data and detailed workpapers for claims made against the 45 publicly funded colleges and universities in the state of Ohio, in the midwestern United States, were analyzed to identify patterns and areas of focus which could improve institutional processes and internal controls. Four areas resulted from the analysis: hiring and pay practices, prevention of the theft of institutional assets, prevention of the theft of student funds, and an institutional accreditation issue. All claims that were reported reflected real concerns on topics of strategic importance to institutions and their management practices, although not all were substantiated or corroborated. One quarter of the claims resulted in proven cases for recovery and prosecution. At the state level, completeness of investigation and administrative learning were sometimes not pursued due to the code-enforcement nature of the governing bodies, whose mandate was limited to the identification and prosecution of crimes, although improvement opportunities clearly existed. The case of Ohio demonstrates that open government and public information request processes can provide sufficient information to allow insight into the nature of the claims and to identify improvement opportunities for both the institution and state-level administration.

  14. Chiral symmetry breaking is permitted in supersymmetric QED

    International Nuclear Information System (INIS)

    Walker, M.

    2000-01-01

    Full text: A chirally symmetric theory will generally have a chirally symmetric and a chirally asymmetric solution for the dressed fermionic propagator. It has been claimed that no chirally asymmetric solution for the fermionic propagator exists in supersymmetric QED. This result in the superfield formalism uses a gauge-dependent argument whose validity has since been questioned. We present an analogous analysis using the component formalism which demonstrates that chiral symmetry breaking is permitted in this theory. We open the presentation with a brief introduction to supersymmetry, supersymmetric QED, and the superfield formalism. We describe chiral symmetry breaking and the Dyson-Schwinger equation used to analyse it. The derivation of the erroneous theorem claiming the lack of a chirally asymmetric propagator is outlined and its flaws discussed. We finish with the equivalent derivation in component fields and our contradictory result.

  15. Variable code gamma ray imaging system

    International Nuclear Information System (INIS)

    Macovski, A.; Rosenfeld, D.

    1979-01-01

    A gamma-ray source distribution in the body is imaged onto a detector using an array of apertures. The transmission of each aperture is modulated using a code such that the individual views of the source through each aperture can be decoded and separated. The codes are chosen to maximize the signal to noise ratio for each source distribution. These codes determine the photon collection efficiency of the aperture array. Planar arrays are used for volumetric reconstructions and circular arrays for cross-sectional reconstructions. 14 claims
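    The modulation scheme described above can be illustrated with a toy model (not the patent's actual codes): if each aperture's transmission is switched over time according to a distinct row of a Hadamard matrix, the multiplexed detector frames can be separated exactly by correlation. A minimal sketch in Python/NumPy, with all names and parameters hypothetical:

    ```python
    import numpy as np

    def hadamard(n):
        """Sylvester-construction Hadamard matrix; n must be a power of two."""
        H = np.array([[1]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    def encode(views, H):
        """Sum all views on one detector, modulating each aperture's transmission
        over time with a distinct Hadamard row (open = 1, closed = 0)."""
        codes = (1 + H[1:len(views) + 1]) / 2   # skip row 0 (all ones)
        return codes.T @ views                   # one composite detector frame per time step

    def decode(frames, H, n_views):
        """Correlate the frame sequence with each +/-1 code to separate the views."""
        n = H.shape[0]
        return (2.0 / n) * H[1:n_views + 1] @ frames

    rng = np.random.default_rng(0)
    H = hadamard(8)
    views = rng.random((3, 16))       # three apertures, a 16-pixel view each
    frames = encode(views, H)         # 8 multiplexed detector frames
    recovered = decode(frames, H, 3)  # exact separation in this noiseless toy
    ```

    Because the non-constant Hadamard rows are orthogonal and sum to zero, the correlation step cancels every view except the one matching the code, which is the sense in which the codes let the individual views "be decoded and separated".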

  16. 32 CFR 536.121 - Claims not payable as maritime claims.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Claims not payable as maritime claims. 536.121... ACCOUNTS CLAIMS AGAINST THE UNITED STATES Maritime Claims § 536.121 Claims not payable as maritime claims... (except at (e) and (k)), and 536.46; (b) Are not maritime in nature; (c) Are not in the best interests of...

  17. Non-Binary Protograph-Based LDPC Codes: Analysis,Enumerators and Designs

    OpenAIRE

    Sun, Yizeng

    2013-01-01

    Non-binary LDPC codes can outperform binary LDPC codes using sum-product algorithm with higher computation complexity. Non-binary LDPC codes based on protographs have the advantage of simple hardware architecture. In the first part of this thesis, we will use EXIT chart analysis to compute the thresholds of different protographs over GF(q). Based on threshold computation, some non-binary protograph-based LDPC codes are designed and their frame error rates are compared with binary LDPC codes. ...

  18. Four year-olds use norm-based coding for face identity.

    Science.gov (United States)

    Jeffery, Linda; Read, Ainsley; Rhodes, Gillian

    2013-05-01

    Norm-based coding, in which faces are coded as deviations from an average face, is an efficient way of coding visual patterns that share a common structure and must be distinguished by subtle variations that define individuals. Adults and school-aged children use norm-based coding for face identity but it is not yet known if pre-school aged children also use norm-based coding. We reasoned that the transition to school could be critical in developing a norm-based system because school places new demands on children's face identification skills and substantially increases experience with faces. Consistent with this view, face identification performance improves steeply between ages 4 and 7. We used face identity aftereffects to test whether norm-based coding emerges between these ages. We found that 4 year-old children, like adults, showed larger face identity aftereffects for adaptors far from the average than for adaptors closer to the average, consistent with use of norm-based coding. We conclude that experience prior to age 4 is sufficient to develop a norm-based face-space and that failure to use norm-based coding cannot explain 4 year-old children's poor face identification skills. Copyright © 2013 Elsevier B.V. All rights reserved.
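    The key result — larger identity aftereffects for adaptors far from the average — falls out of a simple vector model of norm-based coding. The sketch below is an illustrative toy model, not the authors' analysis; every quantity in it is hypothetical:

    ```python
    import numpy as np

    norm = np.array([0.0, 0.0])            # the average face
    identity_axis = np.array([1.0, 0.0])   # direction in face space defining one identity
    k = 0.2                                # fraction the norm shifts toward an adaptor

    def perceived_identity(face, norm):
        # identity strength = deviation from the norm along the identity axis
        return float((face - norm) @ identity_axis)

    def aftereffect(adaptor, test_face):
        before = perceived_identity(test_face, norm)
        shifted_norm = norm + k * (adaptor - norm)  # adaptation recenters the norm
        after = perceived_identity(test_face, shifted_norm)
        return after - before

    test_face = np.array([1.0, 0.0])
    far_adaptor = norm - 2.0 * identity_axis    # "anti-face" far from the average
    near_adaptor = norm - 0.5 * identity_axis   # anti-face close to the average

    far_effect = aftereffect(far_adaptor, test_face)
    near_effect = aftereffect(near_adaptor, test_face)
    ```

    Shifting the norm toward the adaptor biases perception away from it, and the bias grows with the adaptor's distance from the average — the signature used in the study to diagnose norm-based coding.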

  19. AQUIS: A PC-based air quality and permit information system

    International Nuclear Information System (INIS)

    Smith, A.E.; Huber, C.C.; Tschanz, J.; Ryckman, J.S. Jr.

    1992-01-01

    The Air Quality Utility Information System (AQUIS) was developed to calculate and track emissions, permits, and related information. The system runs on IBM-compatible personal computers using dBASE IV. AQUIS tracks more than 900 data items distributed among various source categories and allows the user to enter specific information on permit control devices, stacks, and related regulatory requirements. The system is currently operating at seven US Air Force Materiel Command facilities, large industrial operations involved in the repair and maintenance of aircraft. Environmental management personnel are responsible for the compliance status of as many as 1,000 sources at each facility. The usefulness of the system has been enhanced by providing a flexible reporting capability that permits users who are unfamiliar with database structure to design and prepare reports containing specified information. In addition to the standard six pollutants, AQUIS calculates compound-specific emissions and allows users to enter their own emission estimates. This capability will be useful in developing air toxics inventories and control plans.

  20. 78 FR 16764 - Limitation on Claims Against Proposed Public Transportation Projects

    Science.gov (United States)

    2013-03-18

    ... DEPARTMENT OF TRANSPORTATION Federal Transit Administration Limitation on Claims Against Proposed Public Transportation Projects AGENCY: Federal Transit Administration (FTA), DOT. ACTION: Notice. SUMMARY... advising the public of final agency actions subject to Section 139(l) of Title 23, United States Code (U.S...

  1. 78 FR 4191 - Limitation on Claims Against Proposed Public Transportation Projects

    Science.gov (United States)

    2013-01-18

    ... DEPARTMENT OF TRANSPORTATION Federal Transit Administration Limitation on Claims Against Proposed Public Transportation Projects AGENCY: Federal Transit Administration (FTA), DOT. ACTION: Notice. SUMMARY... advising the public of final agency actions subject to Section 139(l) of Title 23, United States Code (U.S...

  2. 78 FR 33890 - Limitation on Claims Against Proposed Public Transportation Projects

    Science.gov (United States)

    2013-06-05

    ... DEPARTMENT OF TRANSPORTATION Federal Transit Administration Limitation on Claims Against Proposed Public Transportation Projects AGENCY: Federal Transit Administration (FTA), DOT. ACTION: Notice. SUMMARY... advising the public of final agency actions subject to Section 139(l) of Title 23, United States Code (U.S...

  3. The role of radiology in diagnostic error: a medical malpractice claims review.

    Science.gov (United States)

    Siegal, Dana; Stratchko, Lindsay M; DeRoo, Courtney

    2017-09-26

    Just as radiologic studies allow us to see past the surface to the vulnerable and broken parts of the human body, medical malpractice claims help us see past the surface of medical errors to the deeper vulnerabilities and potentially broken aspects of our healthcare delivery system. And just as the insights we gain through radiologic studies provide focus for a treatment plan for healing, so too can the analysis of malpractice claims provide insights to improve the delivery of safe patient care. We review 1325 coded claims where Radiology was the primary service provider to better understand the problems leading to patient harm, and the opportunities most likely to improve diagnostic care in the future.

  4. Markets for renewable energy and pollution emissions: Environmental claims, emission-reduction accounting, and product decoupling

    International Nuclear Information System (INIS)

    Moore, Michael R.; Lewis, Geoffrey McD.; Cepela, Daniel J.

    2010-01-01

    Green electricity generation can provide an indirect route to cleaner air: by displacing generation from fossil fuels, green electricity can reduce emissions of CO2 and conventional air pollutants. Several types of voluntary markets have emerged in the United States to take advantage of this relationship, including green electricity programs, carbon offsets, and renewable energy certificates. At the same time, regulators are favoring cap-and-trade mechanisms for regulating emissions. This paper describes the appropriate framing of environmental claims for green electricity products. We apply an accounting framework for evaluating claims made for capped pollutants, with entries for emissions, avoided emissions due to green electricity, and unused emission permits. This framework is applied in case studies of two major electric utilities that operate with green electricity programs and capped pollutants. The cases demonstrate that the relative magnitude of 'unused permits' and 'emissions avoided' is a key relationship for evaluating an emissions reduction claim. Lastly, we consider the evolution of the green electricity marketplace given the reliance on cap-and-trade. In this setting, pollution-emission products could be decoupled from one another and from the various green electricity products. Several positive consequences could transpire, including better transparency of products, lower certification costs, and more product choices.
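    The 'unused permits' versus 'emissions avoided' relationship can be made concrete with a toy calculation. This is one illustrative reading of the framework, not the authors' accounting entries, and all figures are hypothetical:

    ```python
    def credible_reduction_claim(avoided_tons, unused_permit_tons):
        # Under a binding cap, generation displaced by green power frees up
        # allowances; any freed allowance that remains usable (or saleable)
        # lets the same tons be emitted elsewhere, so it cannot be claimed
        # as a net reduction.
        return avoided_tons - min(avoided_tons, unused_permit_tons)

    surplus_case = credible_reduction_claim(avoided_tons=100, unused_permit_tons=120)
    retired_case = credible_reduction_claim(avoided_tons=100, unused_permit_tons=30)
    ```

    When unused permits exceed avoided emissions, the claim is fully offset (zero net reduction); only the portion not backed by spare allowances survives as a credible emissions-reduction claim.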

  7. 32 CFR 536.129 - Claims cognizable as UCMJ claims.

    Science.gov (United States)

    2010-07-01

    ... Personnel Claims Act and chapter 11 of AR 27-20, which provides compensation only for tangible personal... 32 National Defense 3 2010-07-01 2010-07-01 true Claims cognizable as UCMJ claims. 536.129 Section 536.129 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY CLAIMS AND ACCOUNTS...

  8. Coding and decoding for code division multiple user communication systems

    Science.gov (United States)

    Healy, T. J.

    1985-01-01

    A new algorithm is introduced which decodes code division multiple user communication signals. The algorithm makes use of the distinctive form or pattern of each signal to separate it from the composite signal created by the multiple users. Although the algorithm is presented in terms of frequency-hopped signals, the actual transmitter modulator can use any of the existing digital modulation techniques. The algorithm is applicable to error-free codes or to codes where controlled interference is permitted. It can be used when block synchronization is assumed, and in some cases when it is not. The paper also discusses briefly some of the codes which can be used in connection with the algorithm, and relates the algorithm to past studies which use other approaches to the same problem.
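    The idea of separating users by the distinctive pattern of each signal can be sketched with a toy frequency-hopping example. This illustrates the general approach only, not Healy's algorithm; the hop patterns and parameters below are hypothetical and chosen so the users never collide:

    ```python
    F, HOPS_PER_BIT = 16, 4  # frequency slots; hops per data bit

    def hop_freq(user, t):
        return 4 * user + (t % 4)          # user u hops over slots 4u..4u+3

    def transmit(user, bits):
        # bit 1 -> tone at the pattern frequency; bit 0 -> pattern frequency + F//2
        tones = []
        for i, b in enumerate(bits):
            for t in range(i * HOPS_PER_BIT, (i + 1) * HOPS_PER_BIT):
                tones.append((t, hop_freq(user, t) if b else hop_freq(user, t) + F // 2))
        return tones

    def decode(occupancy, user, n_bits):
        # Score each bit hypothesis against the composite time-frequency matrix
        out = []
        for i in range(n_bits):
            hops = range(i * HOPS_PER_BIT, (i + 1) * HOPS_PER_BIT)
            score1 = sum(occupancy[t][hop_freq(user, t)] for t in hops)
            score0 = sum(occupancy[t][hop_freq(user, t) + F // 2] for t in hops)
            out.append(1 if score1 > score0 else 0)
        return out

    bits = {0: [1, 0, 1], 1: [0, 1, 1]}
    n_hops = 3 * HOPS_PER_BIT
    occupancy = [[0] * F for _ in range(n_hops)]
    for user, b in bits.items():
        for t, f in transmit(user, b):     # composite signal: all users summed
            occupancy[t][f] = 1
    ```

    Each decoder knows only its own user's hop pattern and recovers that user's bits by correlating the pattern against the composite, which is the separation-by-pattern principle the abstract describes.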

  9. [Surgical procedures involved in claims for alleged defects in praxis].

    Science.gov (United States)

    Arimany-Manso, Josep; Benet-Travé, J; Bruguera-Cortada, M; Torné-Escasany, R; Klamburg-Pujol, J; Gómez-Durán, Esperanza L

    2014-03-01

    Medical professional liability and adverse events in health care are major concerns worldwide and the analysis of claims for alleged defects in praxis is a potential source of knowledge. High rates of adverse events and complaints have been reported in surgical procedures. This article analyzes the claims registered by the Council of Medical Colleges in Catalonia between 1986 and 2012, and explores surgical procedures claimed (ICD-9-CM coding), as well as the final outcome of the claim. Among the 5,419 records identified on surgical procedures, the interventions of the musculoskeletal system and skin and integument showed the highest frequencies. Interventions related to "non-curative" medicine should be emphasized because of their higher rates of economical agreement or condemnation outcomes, which were significantly higher for mastopexia. The results underscore the importance of the surgical area in medical professional liability and the high risk of payouts among those procedures belonging to the so-called "non-curative" medicine. Copyright © 2014 Elsevier España, S.L. All rights reserved.

  10. Development of an electronic claim system based on an integrated electronic health record platform to guarantee interoperability.

    Science.gov (United States)

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-06-01

    We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. The system is designed to be used for ambulatory care by office-based physicians in the United States, and is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medicaid Services (CMS)-1500 form based on information regarding the patients and the physicians' clinical activities. It supports electronic insurance claims by creating reimbursement charges. It also contains an HL7 interface engine to exchange clinical messages between heterogeneous devices. The system partially prevents physician malpractice by suggesting proper treatments according to patient diagnoses, and supports physicians by easily preparing documents for reimbursement and submitting claim documents to insurance organizations electronically, without additional effort by the user. To show the usability of the developed system, we performed an experiment comparing the time spent filling out the CMS-1500 form directly with the time required to create electronic claim data using the developed system. From the experimental results, we conclude that the system could save considerable time for physicians in preparing claim documents. The developed system might be particularly useful for those who need a reimbursement-specialized EHR system, even though the proposed system does not completely satisfy all criteria requested by the CMS and the Office of the National Coordinator for Health Information Technology (ONC), because those criteria are a necessary but not sufficient condition for the implementation of EHR systems. The system will be upgraded continuously to implement the criteria and to offer more stable and transparent transmission of electronic claim data.
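    An HL7 interface engine of the kind mentioned above exchanges pipe-delimited HL7 v2 messages, in which segments are separated by carriage returns and fields by `|`. A minimal parsing sketch (illustrative only: the message content is made up, and a real engine must honor the encoding characters declared in MSH-2):

    ```python
    def parse_hl7_v2(message):
        """Split an HL7 v2 message into (segment_name, fields) tuples.
        Toy sketch: assumes the default '|' field separator."""
        segments = []
        for raw in message.strip().split("\r"):
            fields = raw.split("|")
            segments.append((fields[0], fields))
        return segments

    # A hypothetical registration-message fragment (all field values invented):
    msg = ("MSH|^~\\&|CLINIC_EHR|CLINIC|CLAIMS|PAYER|20110601||ADT^A04|1234|P|2.5"
           "\rPID|1||555-44-3333||DOE^JOHN")
    parsed = parse_hl7_v2(msg)
    ```

    A claim-generation step could then map fields such as the patient name from the PID segment onto the corresponding boxes of the CMS-1500 form.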

  11. Four Year-Olds Use Norm-Based Coding for Face Identity

    Science.gov (United States)

    Jeffery, Linda; Read, Ainsley; Rhodes, Gillian

    2013-01-01

    Norm-based coding, in which faces are coded as deviations from an average face, is an efficient way of coding visual patterns that share a common structure and must be distinguished by subtle variations that define individuals. Adults and school-aged children use norm-based coding for face identity but it is not yet known if pre-school aged…

  12. 50 CFR 660.25 - Permits.

    Science.gov (United States)

    2010-10-01

    ... change and the reasons for the request. If the permit requested to be changed to the base permit is..., vessel owner, or permit owner for any reason. The sablefish at-sea processing exemption will expire upon... ownership. (G) For a request to change a permit's ownership that is necessitated by divorce, the individual...

  13. Data visualization for ONEDANT and TWODANT discrete ordinates codes

    International Nuclear Information System (INIS)

    Lee, C.L.

    1993-01-01

    Effective graphical display of code calculations allows for efficient analysis of results. This is especially true in the case of discrete ordinates transport codes, which can generate thousands of flux or reaction rate data points per calculation. For this reason, a package of portable interface programs called OTTUI (ONEDANT-TWODANT-Tecplot Unix-Based Interface) has been developed at Los Alamos National Laboratory to permit rapid visualization of ONEDANT and TWODANT discrete ordinates results using the graphics package Tecplot. This paper describes the various uses of OTTUI for display of ONEDANT and TWODANT problem geometries and calculational results.

  14. Evaluation of a complex, population-based injury claims management intervention for improving injury outcomes: study protocol

    Science.gov (United States)

    Collie, Alex; Gabbe, Belinda; Fitzharris, Michael

    2015-01-01

    Introduction Injuries resulting from road traffic crashes are a substantial cause of disability and death worldwide. Injured persons receiving compensation have poorer recovery and return to work than those with non-compensable injury. Case or claims management is a critical component of injury compensation systems, and there is now evidence that claims management can have powerful positive impacts on recovery, but can also impede recovery or exacerbate mental health concerns in some injured people. This study seeks to evaluate the impact of a population-based injury claims management intervention in the State of Victoria, Australia, on the health of those injured in motor vehicle crashes, their experience of the compensation process, and the financial viability of the compensation system. Methods and analysis Evaluation of this complex intervention involves a series of linked but stand-alone research projects to assess the anticipated process changes, impacts and outcomes of the intervention over a 5-year time frame. Linkage and analysis of routine administrative and health system data is supplemented with a series of primary studies collecting new information. Additionally, a series of ‘action’ research projects will be undertaken to inform the implementation of the intervention. A program logic model designed by the state government Transport Accident Commission in conjunction with the research team provides the evaluation framework. Ethics and dissemination Relatively few studies have comprehensively examined the impact of compensation system processes on the health of injured persons, their satisfaction with systems processes, and impacts on the financial performance of the compensation scheme itself. The wholesale, population-based transformation of an injury claims management model is a rare opportunity to document impacts of system-level policy change on outcomes of injured persons. Findings will contribute to the evidence base of information on the

  15. ClaimAssociationService

    Data.gov (United States)

    Department of Veterans Affairs — Retrieves and updates a veteran's claim status and claim-rating association (claim association for current rating) from the Corporate database for a claim selected...

  16. CIVIL PROTECTION MECHANISM OF THE ASSIGNEE RIGHTS BASED ON THE PATENT CLAIM

    Directory of Open Access Journals (Sweden)

    N. V. Marchenko

    2014-04-01

    Full Text Available Purpose. Statistical analysis of inventive activity in Ukraine shows that the largest share of applications – almost 60% of all inventions – is submitted by employees of universities and research institutions. The practice of preparing inventions shows that for researchers, and especially for students, the most difficult part of the application documents is the patent claim. The purpose of this research is to synthesize and present the general principles of drafting a quality patent claim, which provides the subsequent legal protection of the patent. Methodology. Monitoring and analysis of the worldwide documentary information flow on the civil protection mechanism of the assignee's rights on the basis of the patent claim allows us to compare the world's systems of claim drafting and to summarize some key points concerning the question at hand. Examples of correct patent claim drafting, and of its interpretation in court cases on intellectual property, were analyzed. Findings. The specific properties of the patent claim were described: conciseness, breadth, completeness and certainty, compliance with the unity requirement, and novelty of the invention. On the basis of the research it is established that there is a great difference between Ukrainian and American patent claims. A number of common mistakes and shortcomings in claim drafting were identified. The need to restore training in invention drafting in the universities of Ukraine was emphasized, since on this basis one can train specialists who are able to carry out the commercialization of intellectual property results. Originality. A number of issues and techniques was investigated and summarized that can be applied by the courts in interpreting the patent claim in intellectual property cases, especially in determining whether a patent claim is correctly drafted. Practical value. This work may be used

  17. Sport-related concussions in New Zealand: a review of 10 years of Accident Compensation Corporation moderate to severe claims and costs.

    Science.gov (United States)

    King, Doug; Gissane, Conor; Brughelli, Matt; Hume, Patria A; Harawira, Joseph

    2014-05-01

    This paper provides an overview of the epidemiology of sport-related concussion requiring medical treatment, and the associated costs, in seven sports codes in New Zealand from 2001 to 2011. A retrospective review of injury entitlement claims in the seven sports from 2001 to 2011 was conducted. Data were analyzed by sporting code, age, ethnicity, gender and year of competition for total and moderate-to-severe (MSC) Accident Compensation Corporation (ACC) claims and costs. A total of 20,902 claims costing $NZD 16,546,026 were recorded over the study period, of which 1330 (6.4%) were MSC claims. The mean yearly number and cost of MSC claims were 133 ± 36 and $1,303,942 ± 378,949. Rugby union had the highest number of MSC claims per year (38; 95% CI 36-41 per 1000 MSC claims). New Zealand Māori recorded the highest total ($6,000,759) and mean cost ($21,120) per MSC claim. Although MSC injury claims were only 6.4% of total claims, they accounted for 79.1% of total costs, indicating that although the majority of sport-related concussions may be minor in severity, the economic costs associated with more serious sport-related concussion can be high. The finding that rugby union recorded the most MSC claims in the current study was not unexpected. Of concern is that rugby league recorded a low number of MSC claims but the highest mean cost per claim. Given the high mean cost per concussion, and the high total and mean cost for New Zealand Māori, further investigation is warranted. Copyright © 2013 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  18. Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes

    Science.gov (United States)

    Lin, Shu

    1998-01-01

    A code trellis is a graphical representation of a code, block or convolutional, in which every path represents a codeword (or a code sequence for a convolutional code). This representation makes it possible to implement maximum likelihood decoding (MLD) of a code with reduced decoding complexity. The best-known trellis-based MLD algorithm is the Viterbi algorithm. The trellis representation was first introduced and used for convolutional codes [23]. This representation, together with the Viterbi decoding algorithm, has resulted in a wide range of applications of convolutional codes for error control in digital communications over the last two decades. Research on the trellis structure of linear block codes, by contrast, remained inactive for many years. There were two major reasons for this inactive period. First, most coding theorists at that time believed that block codes did not have simple trellis structure like convolutional codes, and that maximum likelihood decoding of linear block codes using the Viterbi algorithm was practically impossible except for very short block codes. Second, since almost all linear block codes are constructed algebraically or based on finite geometries, it was the belief of many coding theorists that algebraic decoding was the only way to decode these codes. These two reasons seriously hindered the development of efficient soft-decision decoding methods for linear block codes and their applications to error control in digital communications, and led to a general belief that block codes were inferior to convolutional codes and hence not useful. Chapter 2 gives a brief review of linear block codes. The goal is to provide the essential background material for the development of trellis structure and trellis-based decoding algorithms for linear block codes in the later chapters. Chapters 3 through 6 present the fundamental concepts, finite-state machine model, state space formulation, basic structural properties, state labeling, construction procedures, complexity, minimality, and
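    The Viterbi algorithm can be shown concretely on the classic rate-1/2, constraint-length-3 convolutional code with generators 7 and 5 (octal) — a standard textbook example, not one drawn from this book's chapters. Every path through the 4-state trellis is a code sequence, and the decoder keeps the minimum-Hamming-distance survivor into each state:

    ```python
    def conv_encode(bits):
        """Rate-1/2 convolutional encoder, constraint length 3, generators 7 and 5 (octal)."""
        s1 = s2 = 0                          # the two most recent input bits
        out = []
        for b in bits:
            out += [b ^ s1 ^ s2, b ^ s2]
            s2, s1 = s1, b
        return out

    def viterbi_decode(received, n_bits):
        """Hard-decision Viterbi decoding over the 4-state trellis; assumes the
        encoder was flushed back to the all-zero state with two tail bits."""
        INF = float("inf")
        metrics = [0, INF, INF, INF]         # state packs (s1, s2) as s1*2 + s2
        paths = [[], [], [], []]
        for k in range(n_bits):
            r0, r1 = received[2 * k], received[2 * k + 1]
            new_metrics, new_paths = [INF] * 4, [None] * 4
            for state in range(4):
                if metrics[state] == INF:
                    continue
                s1, s2 = state >> 1, state & 1
                for b in (0, 1):
                    cost = ((b ^ s1 ^ s2) != r0) + ((b ^ s2) != r1)  # branch Hamming distance
                    ns = (b << 1) | s1
                    if metrics[state] + cost < new_metrics[ns]:
                        new_metrics[ns] = metrics[state] + cost
                        new_paths[ns] = paths[state] + [b]
            metrics, paths = new_metrics, new_paths
        return paths[0]                      # trace back from the all-zero state

    message = [1, 0, 1, 1, 0, 1, 0, 0]
    codeword = conv_encode(message + [0, 0])    # two flush bits terminate the trellis
    corrupted = codeword[:]
    corrupted[3] ^= 1                           # one channel bit error
    decoded = viterbi_decode(corrupted, len(message) + 2)[: len(message)]
    ```

    Because this terminated code has free distance 5, any single channel error leaves the transmitted path as the unique minimum-metric survivor, so the message is recovered exactly.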

  19. Effects of comparative claims in prescription drug direct-to-consumer advertising on consumer perceptions and recall.

    Science.gov (United States)

    O'Donoghue, Amie C; Williams, Pamela A; Sullivan, Helen W; Boudewyns, Vanessa; Squire, Claudia; Willoughby, Jessica Fitts

    2014-11-01

    Although pharmaceutical companies cannot make comparative claims in direct-to-consumer (DTC) ads for prescription drugs without substantial evidence, the U.S. Food and Drug Administration permits some comparisons based on labeled attributes of the drug, such as dosing. Researchers have examined comparative advertising for packaged goods; however, scant research has examined comparative DTC advertising. We conducted two studies to determine if comparative claims in DTC ads influence consumers' perceptions and recall of drug information. In Experiment 1, participants with osteoarthritis (n=1934) viewed a fictitious print or video DTC ad that had no comparative claim or made an efficacy comparison to a named or unnamed competitor. Participants who viewed print (but not video) ads with named competitors had greater efficacy and lower risk perceptions than participants who viewed unnamed competitor and noncomparative ads. In Experiment 2, participants with high cholesterol or high body mass index (n=5317) viewed a fictitious print or video DTC ad that had no comparative claim or made a comparison to a named or unnamed competitor. We varied the type of comparison (of indication, dosing, or mechanism of action) and whether the comparison was accompanied by a visual depiction. Participants who viewed print and video ads with named competitors had greater efficacy perceptions than participants who viewed unnamed competitor and noncomparative ads. Unlike Experiment 1, named competitors in print ads resulted in higher risk perceptions than unnamed competitors. In video ads, participants who saw an indication comparison had greater benefit recall than participants who saw dosing or mechanism of action comparisons. In addition, visual depictions of the comparison decreased risk recall for video ads. Overall, the results suggest that comparative claims in DTC ads could mislead consumers about a drug's efficacy and risk; therefore, caution should be used when presenting

  20. 20 CFR 410.679 - Finality of findings with respect to other claims for benefits based on the disability or death...

    Science.gov (United States)

    2010-04-01

    ... claims for benefits based on the disability or death of a miner. 410.679 Section 410.679 Employees..., Finality of Decisions, and Representation of Parties § 410.679 Finality of findings with respect to other claims for benefits based on the disability or death of a miner. Findings of fact made in a determination...

  1. An Evaluation of Algorithms for Identifying Metastatic Breast, Lung, or Colorectal Cancer in Administrative Claims Data.

    Science.gov (United States)

    Whyte, Joanna L; Engel-Nitz, Nicole M; Teitelbaum, April; Gomez Rey, Gabriel; Kallich, Joel D

    2015-07-01

    Administrative health care claims data are used for epidemiologic, health services, and outcomes cancer research and thus play a significant role in policy. Cancer stage, which is often a major driver of cost and clinical outcomes, is not typically included in claims data. Evaluate algorithms used in a dataset of cancer patients to identify patients with metastatic breast (BC), lung (LC), or colorectal (CRC) cancer using claims data. Clinical data on BC, LC, or CRC patients (between January 1, 2007 and March 31, 2010) were linked to a health care claims database. Inclusion required health plan enrollment ≥3 months before initial cancer diagnosis date. Algorithms were used in the claims database to identify patients' disease status, which was compared with physician-reported metastases. Generic and tumor-specific algorithms were evaluated using ICD-9 codes, varying diagnosis time frames, and including/excluding other tumors. Positive and negative predictive values, sensitivity, and specificity were assessed. The linked databases included 14,480 patients; of whom, 32%, 17%, and 14.2% had metastatic BC, LC, and CRC, respectively, at diagnosis and met inclusion criteria. Nontumor-specific algorithms had lower specificity than tumor-specific algorithms. Tumor-specific algorithms' sensitivity and specificity were 53% and 99% for BC, 55% and 85% for LC, and 59% and 98% for CRC, respectively. Algorithms to distinguish metastatic BC, LC, and CRC from locally advanced disease should use tumor-specific primary cancer codes with 2 claims for the specific primary cancer >30-42 days apart to reduce misclassification. These performed best overall in specificity, positive predictive values, and overall accuracy to identify metastatic cancer in a health care claims database.
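    The validation measures reported above are standard confusion-matrix quantities. A short sketch of how they are computed — the counts below are hypothetical, chosen only so the result reproduces the CRC sensitivity and specificity (59% and 98%) quoted in the abstract:

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard validation measures for a claims-based case-finding algorithm
        scored against a clinical gold standard."""
        return {
            "sensitivity": tp / (tp + fn),   # true metastatic patients the algorithm flags
            "specificity": tn / (tn + fp),   # non-metastatic patients correctly excluded
            "ppv": tp / (tp + fp),           # flagged patients who truly have metastases
            "npv": tn / (tn + fn),
        }

    # Hypothetical cohort: 1000 metastatic and 2000 non-metastatic patients
    m = diagnostic_metrics(tp=590, fp=40, fn=410, tn=1960)
    ```

    High specificity with moderate sensitivity, as seen here, is the trade-off the tumor-specific algorithms in the study make: few false positives at the cost of missing some true metastatic cases.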

  2. Mesh-based parallel code coupling interface

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, K.; Steckel, B. (eds.) [GMD - Forschungszentrum Informationstechnik GmbH, St. Augustin (DE). Inst. fuer Algorithmen und Wissenschaftliches Rechnen (SCAI)]

    2001-04-01

    MpCCI (mesh-based parallel code coupling interface) is an interface for multidisciplinary simulations. It provides industrial end-users as well as commercial code-owners with the facility to combine different simulation tools in one environment, thereby creating new solutions for multidisciplinary problems and opening new application dimensions for existing simulation tools. This Book of Abstracts gives a short overview of ongoing activities in industry and research - all presented at the 2{sup nd} MpCCI User Forum in February 2001 at GMD Sankt Augustin. (orig.)

  3. 41 CFR 105-55.003 - Antitrust, fraud, tax, interagency claims, and claims over $100,000 excluded.

    Science.gov (United States)

    2010-07-01

    ... apply to any debt based in whole or in part on conduct in violation of the antitrust laws or to any debt... antitrust laws or any claim involving fraud, the presentation of a false claim, or misrepresentation on the... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Antitrust, fraud, tax...

  4. Hardware-efficient bosonic quantum error-correcting codes based on symmetry operators

    Science.gov (United States)

    Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.

    2018-03-01

    We establish a symmetry-operator framework for designing quantum error-correcting (QEC) codes based on fundamental properties of the underlying system dynamics. Based on this framework, we propose three hardware-efficient bosonic QEC codes that are suitable for χ(2)-interaction based quantum computation in multimode Fock bases: the χ(2) parity-check code, the χ(2) embedded error-correcting code, and the χ(2) binomial code. All of these QEC codes detect photon-loss or photon-gain errors by means of photon-number parity measurements, and then correct them via χ(2) Hamiltonian evolutions and linear-optics transformations. Our symmetry-operator framework provides a systematic procedure for finding QEC codes that are not stabilizer codes, and it enables convenient extension of a given encoding to higher-dimensional qudit bases. The χ(2) binomial code is of special interest because, with m ≤ N identified from channel monitoring, it can correct m-photon-loss errors, or m-photon-gain errors, or (m-1)th-order dephasing errors using logical qudits that are encoded in O(N) photons. In comparison, other bosonic QEC codes require O(N^2) photons to correct the same degree of bosonic errors. Such improved photon efficiency underscores the additional error-correction power that can be provided by channel monitoring. We develop quantum Hamming bounds for photon-loss errors in the code subspaces associated with the χ(2) parity-check code and the χ(2) embedded error-correcting code, and we prove that these codes saturate their respective bounds. Our χ(2) QEC codes exhibit hardware efficiency in that they address the principal error mechanisms and exploit the available physical interactions of the underlying hardware, thus reducing the physical resources required for implementing their encoding, decoding, and error-correction operations, and their universal encoded-basis gate sets.

  5. SCDAP/RELAP5/MOD 3.1 code manual: User's guide and input manual. Volume 3

    International Nuclear Information System (INIS)

    Coryell, E.W.; Johnsen, E.C.; Allison, C.M.

    1995-06-01

    The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and fission product release during a severe accident transient, as well as large and small break loss-of-coolant accidents and operational transients such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. This volume provides guidelines to code users based upon lessons learned during the developmental assessment process. A description of problem control and the installation process is included. Appendix A contains the description of the input requirements.

  6. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    International Nuclear Information System (INIS)

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-01-01

    For a leap of progress in the structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options as well as combinations of technical options beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. Those concepts are very new to most nuclear power plant designers, who are naturally obliged to design to current codes and standards; applying the concepts of the System Based Code to design will entirely change the practices that designers have long been accustomed to. On the other hand, experienced designers are supposed to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed for the possibility of improving structural design both in terms of reliability and cost effectiveness by the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  7. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. The features of scientific computing programs were analyzed, and a FORTRAN code generator (FCG) based on C# was developed in this paper. FCG can automatically generate module variable definition FORTRAN code according to input metadata. FCG can also generate a memory allocation interface for dynamic variables as well as a data access interface. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The result shows that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
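    The core idea of FCG (generating FORTRAN variable-definition code from input metadata) can be illustrated with a minimal sketch. It is written in Python rather than C#, and the metadata layout, names, and output format are invented for illustration; the paper does not specify FCG's actual input schema.

```python
# Illustrative mapping from metadata types to FORTRAN declarations.
FORTRAN_TYPES = {"int": "INTEGER", "real8": "REAL(8)", "logical": "LOGICAL"}

def gen_module(name, variables):
    """Generate a FORTRAN module defining the listed variables.
    `variables` holds (name, type, rank) tuples; rank > 0 produces an
    ALLOCATABLE array that a generated allocation interface could manage."""
    lines = [f"MODULE {name}", "  IMPLICIT NONE"]
    for var, vtype, rank in variables:
        decl = f"  {FORTRAN_TYPES[vtype]}"
        if rank > 0:
            decl += f", ALLOCATABLE, DIMENSION({','.join([':'] * rank)})"
        decl += f" :: {var}"
        lines.append(decl)
    lines.append(f"END MODULE {name}")
    return "\n".join(lines)

print(gen_module("core_state", [("power", "real8", 2), ("nz", "int", 0)]))
```

    Emitting the allocation and data-access interfaces described in the abstract would follow the same pattern: walk the metadata once more and print the corresponding ALLOCATE statements and accessor subroutines.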

  8. Transgender Medicare Beneficiaries and Chronic Conditions: Exploring Fee-for-Service Claims Data

    Science.gov (United States)

    Guerino, Paul; Ewald, Erin; Laffan, Alison M.

    2017-01-01

    Purpose: Data on the health and well-being of the transgender population are limited. However, using claims data we can identify transgender Medicare beneficiaries (TMBs) with high confidence. We seek to describe the TMB population and provide comparisons of chronic disease burden between TMBs and cisgender Medicare beneficiaries (CMBs), thus laying a foundation for national level TMB health disparity research. Methods: Using a previously validated claims algorithm based on ICD-9-CM codes relating to transsexualism and gender identity disorder, we identified a cohort of TMBs using Medicare Fee-for-Service (FFS) claims data. We then describe the demographic characteristics and chronic disease burden of TMBs (N = 7454) and CMBs (N = 39,136,229). Results: Compared to CMBs, a greater observed proportion of TMBs are young (under age 65) and Black, although these differences vary by entitlement. Regardless of entitlement, TMBs have more chronic conditions than CMBs, and more TMBs have been diagnosed with asthma, autism spectrum disorder, chronic obstructive pulmonary disease, depression, hepatitis, HIV, schizophrenia, and substance use disorders. TMBs also have higher observed rates of potentially disabling mental health and neurological/chronic pain conditions, as well as obesity and other liver conditions (nonhepatitis), compared to CMBs. Conclusion: This is the first systematic look at chronic disease burden in the transgender population using Medicare FFS claims data. We found that TMBs experience multiple chronic conditions at higher rates than CMBs, regardless of Medicare entitlement. TMBs under age 65 show an already heavy chronic disease burden which will only be exacerbated with age. PMID:29125908

  9. Resistor-logic demultiplexers for nanoelectronics based on constant-weight codes.

    Science.gov (United States)

    Kuekes, Philip J; Robinett, Warren; Roth, Ron M; Seroussi, Gadiel; Snider, Gregory S; Stanley Williams, R

    2006-02-28

    The voltage margin of a resistor-logic demultiplexer can be improved significantly by basing its connection pattern on a constant-weight code. Each distinct code determines a unique demultiplexer, and therefore a large family of circuits is defined. We consider using these demultiplexers for building nanoscale crossbar memories, and determine the voltage margin of the memory system based on a particular code. We determine a purely code-theoretic criterion for selecting codes that will yield memories with large voltage margins, which is to minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords. For the specific example of a 64 × 64 crossbar, we discuss what codes provide optimal performance for a memory.
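    The code-selection criterion stated above (minimize the ratio of maximum to minimum Hamming distance over distinct codewords) is straightforward to compute. A sketch, using an arbitrary toy constant-weight code rather than any code from the paper:

```python
from itertools import combinations

def hamming(a, b):
    """Hamming distance between two equal-length binary strings."""
    return sum(x != y for x, y in zip(a, b))

def distance_ratio(code):
    """Ratio of maximum to minimum pairwise Hamming distance;
    under the stated criterion, smaller values are preferred."""
    dists = [hamming(a, b) for a, b in combinations(code, 2)]
    return max(dists) / min(dists)

# Toy length-4, weight-2 constant-weight code (invented, not from the paper).
print(distance_ratio(["1100", "1010", "0110", "0101"]))  # → 2.0
```

    Selecting among candidate constant-weight codes then reduces to ranking them by this ratio, with ties broken by whatever secondary circuit criteria apply.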

  10. Does the Holland Code Predict Job Satisfaction and Productivity in Clothing Factory Workers?

    Science.gov (United States)

    Heesacker, Martin; And Others

    1988-01-01

    Administered Self-Directed Search to sewing machine operators to determine Holland code, and assessed work productivity, job satisfaction, absenteeism, and insurance claims. Most workers were of the Social code. Social subjects were the most satisfied, Conventional and Realistic subjects next, and subjects of other codes less so. Productivity of…

  11. Coding Transparency in Object-Based Video

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2006-01-01

    A novel algorithm for coding gray level alpha planes in object-based video is presented. The scheme is based on segmentation in multiple layers. Different coders are specifically designed for each layer. In order to reduce the bit rate, cross-layer redundancies as well as temporal correlation are...

  12. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2009-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  13. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2008-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  14. Hanford facility RCRA permit condition II.U.1 report: mapping of underground piping

    Energy Technology Data Exchange (ETDEWEB)

    Hays, C.B.

    1996-09-27

    The purpose of this report is to fulfill Condition II.U.1. of the Hanford Facility (HF) Resource Conservation and Recovery Act (RCRA) Permit. The HF RCRA Permit, Number WA7890008967, became effective on September 28, 1994 (Ecology 1994). Permit Conditions II.U. (mapping) and II.V. (marking) of the HF RCRA Permit, Dangerous Waste (DW) Portion, require the mapping and marking of dangerous waste underground pipelines subject to the provisions of the Washington Administrative Code (WAC) Chapter 173-303. Permit Condition II.U.1. requires the submittal of a report describing the methodology used to generate pipeline maps and to assure their quality. Though not required by the Permit, this report also documents the approach used for the field marking of dangerous waste underground pipelines.

  15. Image content authentication based on channel coding

    Science.gov (United States)

    Zhang, Fan; Xu, Lei

    2008-03-01

    Content authentication determines whether an image has been tampered with and, if necessary, locates malicious alterations made to the image. Authentication of a still image or a video is motivated by the recipient's interest, and its principle is that a receiver must be able to identify the source of the document reliably. Several techniques and concepts based on data hiding or steganography have been designed as a means for image authentication. This paper presents a color image authentication algorithm based on convolution coding, in which the high bits of the color digital image are coded by convolution codes for tamper detection and localization. The authentication messages are hidden in the low bits of the image in order to keep the authentication invisible. All communication channels are subject to errors introduced by additive Gaussian noise in their environment. Data perturbations cannot be eliminated, but their effect can be minimized by the use of Forward Error Correction (FEC) techniques in the transmitted data stream and by decoders in the receiving system that detect and correct bits in error. The message of each pixel is convolution encoded with the encoder. After parity check and block interleaving, the redundant bits are embedded in the image offset. Tampering can thus be detected and restored without access to the original image.
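    The overall scheme (protect the high bits with a channel code and hide the resulting check bits in the low bits) can be shown in miniature. The toy version below substitutes a single even-parity bit per pixel for the paper's convolutional code, parity check, and block interleaving, so it only illustrates the embed/verify structure:

```python
def embed_parity(pixels):
    """Store even parity of each 8-bit pixel's 7 high bits in its LSB."""
    out = []
    for p in pixels:
        high = p >> 1
        parity = bin(high).count("1") & 1
        out.append((high << 1) | parity)
    return out

def suspect_pixels(pixels):
    """Indices whose high bits no longer match the embedded parity,
    i.e., locations flagged as tampered."""
    return [i for i, p in enumerate(pixels)
            if (bin(p >> 1).count("1") & 1) != (p & 1)]

img = embed_parity([200, 17, 64, 129])
print(suspect_pixels(img))   # untouched image -> []
img[2] ^= 0b100              # flip one high bit of pixel 2
print(suspect_pixels(img))   # tamper localized -> [2]
```

    A real convolutional code adds what a lone parity bit cannot: enough redundancy to correct (restore) the damaged high bits rather than merely flag them.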

  16. Physical activity and influenza-coded outpatient visits, a population-based cohort study.

    Directory of Open Access Journals (Sweden)

    Eric Siu

    Full Text Available Although the benefits of physical activity in preventing chronic medical conditions are well established, its impacts on infectious diseases, and seasonal influenza in particular, are less clearly defined. We examined the association between physical activity and influenza-coded outpatient visits, as a proxy for influenza infection. We conducted a cohort study of Ontario respondents to Statistics Canada's population health surveys over 12 influenza seasons. We assessed physical activity levels through survey responses, and influenza-coded physician office and emergency department visits through physician billing claims. We used logistic regression to estimate the risk of influenza-coded outpatient visits during influenza seasons. The cohort comprised 114,364 survey respondents who contributed 357,466 person-influenza seasons of observation. Compared to inactive individuals, moderately active (OR 0.83; 95% CI 0.74-0.94) and active (OR 0.87; 95% CI 0.77-0.98) individuals were less likely to experience an influenza-coded visit. Stratifying by age, the protective effect of physical activity remained significant for individuals <65 years (active: OR 0.86; 95% CI 0.75-0.98; moderately active: OR 0.85; 95% CI 0.74-0.97) but not for individuals ≥65 years. The main limitations of this study were the use of influenza-coded outpatient visits rather than laboratory-confirmed influenza as the outcome measure, the reliance on self-report for assessing physical activity and various covariates, and the observational study design. Moderate to high amounts of physical activity may be associated with reduced risk of influenza for individuals <65 years. Future research should use laboratory-confirmed influenza outcomes to confirm the association between physical activity and influenza.

  17. Fast H.264/AVC FRExt intra coding using belief propagation.

    Science.gov (United States)

    Milani, Simone

    2011-01-01

    In the H.264/AVC FRExt coder, the performance of Intra coding significantly surpasses that of previous still-image coding standards, such as JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase in the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of Intra coding with a small loss in compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated by adopting a belief-propagation procedure. Experimental results show that the proposed method permits saving up to 60% of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits an accurate control of the computational complexity, unlike other methods where the computational complexity depends upon the coded sequence.
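    The mode-pruning step can be sketched as follows. The probabilities here are invented placeholders, whereas the paper estimates them with belief propagation before running rate-distortion optimization on the reduced candidate set:

```python
def reduced_mode_set(mode_probs, coverage=0.9):
    """Keep the most probable intra prediction modes until their
    cumulative probability reaches `coverage`; only these candidates
    are passed on to the expensive rate-distortion search."""
    ranked = sorted(mode_probs.items(), key=lambda kv: kv[1], reverse=True)
    chosen, total = [], 0.0
    for mode, p in ranked:
        chosen.append(mode)
        total += p
        if total >= coverage:
            break
    return chosen

# Invented mode probabilities for one macroblock.
probs = {"DC": 0.35, "vertical": 0.30, "horizontal": 0.20,
         "diag_down_left": 0.10, "diag_down_right": 0.05}
print(reduced_mode_set(probs))  # 4 of 5 modes cover >= 90% probability
```

    Raising or lowering `coverage` trades compression performance against coding time, which is one way such a scheme can offer direct control over computational complexity.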

  18. Annual Hanford Site environmental permitting status report

    International Nuclear Information System (INIS)

    Sonnichsen, J.C.

    1998-01-01

    The information contained and/or referenced in this Annual Hanford Site Environmental Permitting Status Report (Status Report) addresses the State Environmental Policy Act (SEPA) of 1971 and Condition II.W. of the Resource Conservation and Recovery Act (RCRA) of 1976 Permit, Dangerous Waste Portion (DW Portion). Condition II.W. of the RCRA Permit specifies that the Permittees are responsible for all other applicable federal, state, and local permits for the development and operation of the Hanford Facility. Condition II.W. of the RCRA Permit specifies that the Permittees are to use their best efforts to obtain such permits. For the purposes of the permit condition, 'best efforts' means submittal of documentation and/or approval(s) in accordance with schedules specified in applicable regulations, or as determined through negotiations with the applicable regulatory agencies. This Status Report includes information on all existing and anticipated environmental permitting. Environmental permitting required by RCRA, the Hazardous and Solid Waste Amendments (HSWA) of 1984, and non-RCRA permitting (solid waste handling, Clean Air Act Amendments of 1990, Clean Water Act Amendments of 1987, Washington State waste discharge, and onsite sewage system) is addressed. Information on RCRA and non-RCRA is current as of July 31, 1998. For the purposes of RCRA and the State of Washington Hazardous Waste Management Act of 1976 [as administered through the Dangerous Waste Regulations, Washington Administrative Code (WAC) 173-303], the Hanford Facility is considered a single facility. As such, the Hanford Facility has been issued one US Environmental Protection Agency (EPA)/State Identification Number (WA7890008967). This EPA/State identification number encompasses over 60 treatment, storage, and/or disposal (TSD) units. The Washington State Department of Ecology (Ecology) has been delegated authority by the EPA to administer the RCRA, including mixed waste authority.
The RCRA permitting approach for

  19. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.

  20. 21 CFR 206.10 - Code imprint required.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Code imprint required. 206.10 Section 206.10 Food...: GENERAL IMPRINTING OF SOLID ORAL DOSAGE FORM DRUG PRODUCTS FOR HUMAN USE § 206.10 Code imprint required... imprint that, in conjunction with the product's size, shape, and color, permits the unique identification...

  1. Using SpaceClaim/TD Direct for Modeling Components with Complex Geometries for the Thermal Desktop-Based Advanced Stirling Radioisotope Generator Model

    Science.gov (United States)

    Fabanich, William A., Jr.

    2014-01-01

    SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces/solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the defeaturing/repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the "mark-up" of that geometry. These so-called "mark-ups" control how finite element (FE) meshes are to be generated through the "tagging" of features (e.g. edges, solids, surfaces). These tags represent parameters that include: submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. "Domain-tags" were also attached to individual and groups of surfaces and solids to allow them to be used later within TD to populate objects like, for example, heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine the objects each time as one would if using TDMesher. The use of SpaceClaim/TD Direct helps simplify the process for importing existing geometries and in the creation of high fidelity FE meshes to represent complex parts. It also saves time and effort in the subsequent analysis.

  2. Using SpaceClaim/TD Direct for Modeling Components with Complex Geometries for the Thermal Desktop-Based Advanced Stirling Radioisotope Generator Model

    Science.gov (United States)

    Fabanich, William

    2014-01-01

    SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces/solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the defeaturing/repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the "mark-up" of that geometry. These so-called "mark-ups" control how finite element (FE) meshes were generated and allowed the "tagging" of features (e.g. edges, solids, surfaces). These tags represent parameters that include: submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. "Domain-tags" were also attached to individual and groups of surfaces and solids to allow them to be used later within TD to populate objects like, for example, heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine these objects each time as one would if using TD Mesher. The use of SpaceClaim/TD Direct has helped simplify the process for importing existing geometries and in the creation of high-fidelity FE meshes to represent complex parts. It has also saved time and effort in the subsequent analysis.

  3. Error Resilience in Current Distributed Video Coding Architectures

    Directory of Open Access Journals (Sweden)

    Tonoli Claudia

    2009-01-01

    Full Text Available In distributed video coding, signal prediction is shifted to the decoder side, therefore placing most of the computational complexity burden on the receiver. Moreover, since no prediction loop exists before transmission, an intrinsic robustness to transmission errors has been claimed. This work evaluates and compares the error resilience performance of two distributed video coding architectures. In particular, we have considered a video codec based on the Stanford architecture (DISCOVER codec) and a video codec based on the PRISM architecture. Specifically, an accurate temporal and rate/distortion based evaluation of the effects of transmission errors for both the considered DVC architectures has been performed and discussed. These approaches have also been compared with H.264/AVC, in both cases of no error protection and simple FEC error protection. Our evaluations have highlighted in all cases a strong dependence of the behavior of the various codecs on the content of the considered video sequence. In particular, PRISM seems to be particularly well suited for low-motion sequences, whereas DISCOVER provides better performance in the other cases.

  4. Perception of health claims among Nordic consumers

    DEFF Research Database (Denmark)

    Grunert, Klaus G.; Lähteenmäki, Liisa; Boztug, Yasemin

    2009-01-01

    Health claim perception was investigated by a web-based instrument with a sample of 4612 respondents in the Nordic countries (Denmark, Finland, Iceland, Norway, Sweden). Respondents decided which of a pair of claims sounded better, was easier to understand, and was more convincing in their opinion. Claims were constructed from an underlying universe combining different active ingredients (familiar, unfamiliar), type of claim (combination of information about ingredient, physiological function and health benefit), framing (positive, negative) and use of qualifier (with, without 'may'). Across pairs ... of active ingredient, physiological function and health benefit, whereas the other prefers 'short' claims consisting of the health benefit only. Results also showed that the familiar ingredient is preferred to the unfamiliar one, whereas effects of positive vs. negative framing depended on the type ...

  5. An examination of structure-function claims in dietary supplement advertising in the U.S.: 2003-2009.

    Science.gov (United States)

    Avery, Rosemary J; Eisenberg, Matthew D; Cantor, Jonathan H

    2017-04-01

    Dietary supplement advertising cannot claim a causal link between the product and the treatment, prevention, or cure of a disease unless manufacturers seek approval from the FDA for a health claim. Manufacturers can make structure-function (S-F) claims without FDA approval linking a supplement to a body function or system using words such as "may help" or "promotes." These S-F claims are examined in this study in order to determine whether they mimic health claims for which the FDA requires stricter scientific evidence. Data include S-F claims in supplement advertisements (N=6179) appearing in US nationally circulated magazines (N=137) from 2003 to 2009. All advertisements were comprehensively coded for S-F claims, seals of approval, and other claims of guarantee. S-F claims associate supplements with a wide variety of health conditions, many of which are serious diseases and/or ailments. A significant number of the specific verbs used in these S-F claims are indicative of disease treatment/cure effects, thereby possibly mimicking health claims to the average consumer. The strength of the clinical associations made is largely unsubstantiated in the medical literature. Claims that a product is "scientifically proven" or "guaranteed" were largely unsubstantiated by the clinical literature. Ads carrying externally validating seals of approval were highly prevalent. S-F claims that strongly mimic FDA-prohibited health claims are likely to create confusion in interpretation, and possible public health concerns are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Taxes vs Permits. Options for Price-Based Climate Change Regulation

    International Nuclear Information System (INIS)

    Sin, I.; Kerr, S.; Hendy, J.

    2005-03-01

    This paper provides an overview of key issues involved in the choice among market-based instruments for climate change policy. Specifically, it examines the potential net benefits from shifting to a permit system for emission reduction, and the preconditions necessary for this change. It also draws out the implications of New Zealand's specific circumstances and current climate policies for future policy development.

  7. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has been successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  8. Design of ACM system based on non-greedy punctured LDPC codes

    Science.gov (United States)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes was designed. The RC-LDPC codes were constructed by a non-greedy puncturing method, which showed good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel was proposed. Under this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is reduced. Simulations show that the proposed ACM system achieves increasingly significant coding gain together with higher throughput.
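
    The rate-adaptation step of such an ACM system can be sketched as a threshold lookup. The SNR switching points below are invented for illustration; a real design would derive them from simulated frame-error-rate curves of each punctured code.

```python
# Illustrative rate-adaptation logic for an ACM system built on
# rate-compatible punctured LDPC codes. Thresholds are hypothetical.

RATES = [2/3, 3/4, 4/5, 5/6]        # code rates in the paper's range
SNR_THRESHOLDS = [2.0, 3.0, 4.0]    # invented switching points (dB)

def select_rate(snr_db: float) -> float:
    """Pick the highest code rate whose SNR threshold the channel meets."""
    rate = RATES[0]                 # default: most redundancy
    for r, thr in zip(RATES[1:], SNR_THRESHOLDS):
        if snr_db >= thr:
            rate = r
    return rate

print(select_rate(1.5))  # weak channel -> rate 2/3
print(select_rate(4.5))  # strong channel -> rate 5/6
```

    The incremental-redundancy idea then amounts to transmitting extra punctured bits (lowering the effective rate) only when the channel degrades.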

  9. Effects of School-Based Educational Interventions for Enhancing Adolescents' Abilities in Critical Appraisal of Health Claims: A Systematic Review.

    Directory of Open Access Journals (Sweden)

    Lena V Nordheim

    Adolescents are frequent media users who access health claims from various sources. The plethora of conflicting, pseudo-scientific, and often misleading health claims in popular media makes critical appraisal of health claims an essential ability. Schools play an important role in educating youth to critically appraise health claims. The objective of this systematic review was to evaluate the effects of school-based educational interventions for enhancing adolescents' abilities in critically appraising health claims. We searched MEDLINE, Embase, PsycINFO, AMED, Cinahl, Teachers Reference Centre, LISTA, ERIC, Sociological Abstracts, Social Services Abstracts, The Cochrane Library, Science Citation Index Expanded, Social Sciences Citation Index, and sources of grey literature. Studies that evaluated school-based educational interventions to improve adolescents' critical appraisal ability for health claims through advancing the students' knowledge about science were included. Eligible study designs were randomised and non-randomised controlled trials, and interrupted time series. Two authors independently selected studies, extracted data, and assessed risk of bias in included studies. Due to heterogeneity in interventions and inadequate reporting of results, we performed a descriptive synthesis of studies. We used GRADE (Grading of Recommendations, Assessment, Development, and Evaluation) to assess the certainty of the evidence. Eight studies were included: two compared different teaching modalities, while the others compared educational interventions to instruction as usual. Studies mostly reported positive short-term effects on critical appraisal-related knowledge and skills in favour of the educational interventions. However, the certainty of the evidence for all comparisons and outcomes was very low. Educational interventions in schools may have beneficial short-term effects on knowledge and skills relevant to the critical appraisal of health claims.

  10. Effects of School-Based Educational Interventions for Enhancing Adolescents' Abilities in Critical Appraisal of Health Claims: A Systematic Review.

    Science.gov (United States)

    Nordheim, Lena V; Gundersen, Malene W; Espehaug, Birgitte; Guttersrud, Øystein; Flottorp, Signe

    2016-01-01

    Adolescents are frequent media users who access health claims from various sources. The plethora of conflicting, pseudo-scientific, and often misleading health claims in popular media makes critical appraisal of health claims an essential ability. Schools play an important role in educating youth to critically appraise health claims. The objective of this systematic review was to evaluate the effects of school-based educational interventions for enhancing adolescents' abilities in critically appraising health claims. We searched MEDLINE, Embase, PsycINFO, AMED, Cinahl, Teachers Reference Centre, LISTA, ERIC, Sociological Abstracts, Social Services Abstracts, The Cochrane Library, Science Citation Index Expanded, Social Sciences Citation Index, and sources of grey literature. Studies that evaluated school-based educational interventions to improve adolescents' critical appraisal ability for health claims through advancing the students' knowledge about science were included. Eligible study designs were randomised and non-randomised controlled trials, and interrupted time series. Two authors independently selected studies, extracted data, and assessed risk of bias in included studies. Due to heterogeneity in interventions and inadequate reporting of results, we performed a descriptive synthesis of studies. We used GRADE (Grading of Recommendations, Assessment, Development, and Evaluation) to assess the certainty of the evidence. Eight studies were included: two compared different teaching modalities, while the others compared educational interventions to instruction as usual. Studies mostly reported positive short-term effects on critical appraisal-related knowledge and skills in favour of the educational interventions. However, the certainty of the evidence for all comparisons and outcomes was very low. Educational interventions in schools may have beneficial short-term effects on knowledge and skills relevant to the critical appraisal of health claims. The small

  11. Development of Monte Carlo-based pebble bed reactor fuel management code

    International Nuclear Information System (INIS)

    Setiadipura, Topan; Obara, Toru

    2014-01-01

    Highlights: • A new Monte Carlo-based fuel management code for OTTO cycle pebble bed reactors was developed. • The double heterogeneity was modeled using a statistical method in the MVP-BURN code. • The code can perform analysis of the equilibrium and non-equilibrium phases. • Code-to-code comparisons for a Once-Through-Then-Out case were investigated. • The ability of the code to accommodate the void cavity was confirmed. - Abstract: A fuel management code for pebble bed reactors (PBRs) based on the Monte Carlo method has been developed in this study. The code, named Monte Carlo burnup analysis code for PBR (MCPBR), enables a simulation of the Once-Through-Then-Out (OTTO) cycle of a PBR from the running-in phase to the equilibrium condition. In MCPBR, a burnup calculation based on a continuous-energy Monte Carlo code, MVP-BURN, is coupled with an additional utility code to simulate the OTTO cycle of a PBR. MCPBR has several advantages in modeling PBRs, namely its Monte Carlo neutron transport modeling, its capability of explicitly modeling the double heterogeneity of the PBR core, and its ability to model different axial fuel speeds in the PBR core. Analysis at the equilibrium condition of a simplified PBR was used as the validation test of MCPBR. The calculation results of the code were compared with the results of diffusion-based PBR fuel management codes, namely the VSOP and PEBBED codes. Using the JENDL-4.0 nuclide library, MCPBR gave a 4.15% and 3.32% lower k-eff value than VSOP and PEBBED, respectively, while using JENDL-3.3 it gave a 2.22% and 3.11% higher k-eff value. The ability of MCPBR to analyze neutron transport in the top void of the PBR core and its effects was also confirmed.

  12. Transmission imaging with a coded source

    International Nuclear Information System (INIS)

    Stoner, W.W.; Sage, J.P.; Braun, M.; Wilson, D.T.; Barrett, H.H.

    1976-01-01

    The conventional approach to transmission imaging is to use a rotating anode x-ray tube, which provides the small, brilliant x-ray source needed to cast sharp images of acceptable intensity. Stationary anode sources, although inherently less brilliant, are more compatible with the use of large area anodes, and so they can be made more powerful than rotating anode sources. Spatial modulation of the source distribution provides a way to introduce detailed structure in the transmission images cast by large area sources, and this permits the recovery of high resolution images in spite of the source diameter. The spatial modulation is deliberately chosen to optimize recovery of image structure; the modulation pattern is therefore called a ''code.'' A variety of codes may be used; the essential mathematical property is that the code possess a sharply peaked autocorrelation function, because this property permits the decoding of the raw image cast by the coded source. Random point arrays, non-redundant point arrays, and the Fresnel zone pattern are examples of suitable codes. This paper is restricted to the case of the Fresnel zone pattern code, which has the unique additional property of generating raw images analogous to Fresnel holograms. Because the spatial frequencies of these raw images are extremely coarse compared with actual holograms, a photoreduction step onto a holographic plate is necessary before the decoded image may be displayed with the aid of coherent illumination.
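
    The decoding principle, that a sharply peaked autocorrelation lets the raw image be recovered by correlating it with the code, can be demonstrated in one dimension. A random binary code stands in for the Fresnel zone pattern here; this is a schematic sketch, not the paper's optical processing chain.

```python
import numpy as np

# The raw image is (object convolved with code); correlating it with the code
# recovers the object because the code's autocorrelation is sharply peaked.

rng = np.random.default_rng(0)
n = 64
code = rng.integers(0, 2, n)        # open/closed source elements (0/1)
obj = np.zeros(n); obj[20] = 1.0    # a point object at position 20

# Coded raw image: circular convolution of object with code.
raw = np.real(np.fft.ifft(np.fft.fft(obj) * np.fft.fft(code)))

# Decode by circular cross-correlation with the code.
dec = np.real(np.fft.ifft(np.fft.fft(raw) * np.conj(np.fft.fft(code))))

print(int(np.argmax(dec)))  # peak lands at the object position, 20
```

    The off-peak sidelobes of the random code's autocorrelation appear as a low background in `dec`; a Fresnel zone pattern plays the same role with the additional hologram-like property noted in the abstract.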

  13. Statistical methods for improving verification of claims of origin for Italian wines based on stable isotope ratios

    International Nuclear Information System (INIS)

    Dordevic, N.; Wehrens, R.; Postma, G.J.; Buydens, L.M.C.; Camin, F.

    2012-01-01

    Highlights: ► The assessment of claims of origin is of enormous economic importance for DOC and DOCG wines. ► The official method is based on univariate statistical tests of H, C and O isotopic ratios. ► We consider 5220 Italian wine samples collected in the period 2000–2010. ► Multivariate statistical analysis leads to much better specificity and easier detection of false claims of origin. ► In the case of multi-modal data, mixture modelling provides additional improvements. - Abstract: Wine derives its economic value to a large extent from geographical origin, which has a significant impact on the quality of the wine. According to the food legislation, wines can be without geographical origin (table wine) or with origin. Wines with origin must have characteristics which are essential due to their region of production and must be produced, processed, and prepared exclusively within that region. The development of fast and reliable analytical methods for the assessment of claims of origin is very important. The current official method is based on the measurement of stable isotope ratios of water and alcohol in wine, which are influenced by climatic factors. The results in this paper are based on 5220 Italian wine samples collected in the period 2000–2010. We evaluate the univariate approach underlying the official method to assess claims of origin and propose several new methods to get better geographical discrimination between samples. It is shown that multivariate methods are superior to univariate approaches in that they show increased sensitivity and specificity. In cases where data are non-normally distributed, an approach based on mixture modelling provides additional improvements.
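
    Why multivariate methods gain sensitivity and specificity can be seen in a toy example with synthetic data (not the paper's wine samples): two strongly correlated isotope ratios that each pass a univariate interval test can still form a jointly implausible pair.

```python
import numpy as np

# Two strongly correlated "isotope ratio" variables for authentic samples.
rng = np.random.default_rng(1)
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
authentic = rng.multivariate_normal([0, 0], cov, 500)

# A suspect sample: each coordinate is within 2 sigma on its own axis,
# but the combination contradicts the correlation structure.
suspect = np.array([2.0, -2.0])

# Univariate check (per-axis 2-sigma intervals): the suspect passes.
univariate_ok = all(abs(suspect) <= 2.0)

# Multivariate check (Mahalanobis distance vs. chi-square(2) 95% cutoff):
# the suspect is flagged as an outlier.
inv_cov = np.linalg.inv(np.cov(authentic.T))
d2 = float(suspect @ inv_cov @ suspect)
multivariate_ok = d2 <= 5.99

print(univariate_ok, round(d2, 1), multivariate_ok)
```

    This is the qualitative effect the paper reports: the joint distribution carries information that per-variable official tests discard.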

  14. Etiology of work-related electrical injuries: a narrative analysis of workers' compensation claims.

    Science.gov (United States)

    Lombardi, David A; Matz, Simon; Brennan, Melanye J; Smith, Gordon S; Courtney, Theodore K

    2009-10-01

    The purpose of this study was to provide new insight into the etiology of primarily nonfatal, work-related electrical injuries. We developed a multistage, case-selection algorithm to identify electrical-related injuries from workers' compensation claims and a customized coding taxonomy to identify pre-injury circumstances. Workers' compensation claims routinely collected over a 1-year period from a large U.S. insurance provider were used to identify electrical-related injuries using an algorithm that evaluated coded injury cause information, nature of injury, "accident" description, and injury description narratives. Concurrently, a customized coding taxonomy for these narratives was developed to abstract the activity, source, initiating process, mechanism, vector, and voltage. Among the 586,567 claims reported during 2002, electrical-related injuries accounted for 1283 (0.22%) of nonfatal claims and 15 fatalities (1.2% of electrical claims). Most (72.3%) were male, with an average age of 36, working in services (33.4%), manufacturing (24.7%), retail trade (17.3%), and construction (7.2%). The body parts injured most often were the hands, fingers, or wrist (34.9%); multiple body parts/systems (25.0%); and the lower/upper arm, elbow, shoulder, and upper extremities (19.2%). The leading activities were conducting manual tasks (55.1%); working with machinery, appliances, or equipment; working with electrical wire; and operating powered or nonpowered hand tools. Primary injury sources were appliances and office equipment (24.4%); wires, cables/cords (18.0%); machines and other equipment (11.8%); fixtures, bulbs, and switches (10.4%); and lightning (4.3%). No vector was identified in 85% of cases, and the work process was initiated by others in less than 1% of cases. Injury narratives provide valuable information to overcome some of the limitations of precoded data, more specifically for identifying additional injury cases and in supplementing traditional epidemiologic data for further
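
    A stage of such a case-selection algorithm can be sketched as keyword screening of the free-text narratives. The keyword list is a hypothetical stand-in for the study's actual selection criteria.

```python
# Hypothetical first-pass screen: flag claim narratives containing
# electrical-injury keywords for downstream review. Keywords are invented.

ELECTRICAL_TERMS = ("shock", "electrocut", "voltage", "live wire", "arc flash")

def flag_narrative(narrative: str) -> bool:
    """Keep a claim for review if any electrical keyword appears."""
    text = narrative.lower()
    return any(term in text for term in ELECTRICAL_TERMS)

claims = [
    "Employee received a shock while unplugging office equipment",
    "Worker slipped on wet floor near loading dock",
    "Contact with live wire during panel maintenance",
]
flagged = [c for c in claims if flag_narrative(c)]
print(len(flagged))  # 2
```

    In the study's multistage design, a screen like this would be combined with the coded cause-of-injury and nature-of-injury fields before manual taxonomy coding.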

  15. A neutron spectrum unfolding code based on iterative procedures

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.

    2012-10-01

    In this work, version 3.0 of the neutron spectrum unfolding code called Neutron Spectrometry and Dosimetry from Universidad Autonoma de Zacatecas (NSDUAZ) is presented. This code was designed with a graphical interface in the LabVIEW programming environment and is based on the iterative SPUNIT algorithm, using as input data only the count rates obtained with 7 Bonner spheres based on a 6LiI(Eu) neutron detector. The main features of the code are: it is intuitive and friendly to the user, and it has a programming routine which automatically selects the initial guess spectrum from a set of neutron spectra compiled by the International Atomic Energy Agency. Besides the neutron spectrum, this code calculates the total flux, the mean energy, H(10), h(10), 15 dosimetric quantities for radiation protection purposes, and 7 survey meter responses, in four energy grids, based on the International Atomic Energy Agency compilation. This code generates a full report in html format with all relevant information. In this work, the neutron spectrum of a 241AmBe neutron source in air, located at 150 cm from the detector, is unfolded. (Author)
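
    The iterative unfolding idea behind codes of this kind can be sketched with a toy multiplicative update that rescales each flux bin by the ratio of measured to predicted count rates. The 3x3 response matrix and counts are invented numbers, not real Bonner-sphere responses, and the update shown is a generic MLEM-style iteration rather than the exact SPUNIT algorithm.

```python
import numpy as np

# Toy iterative unfolding: repeatedly correct each spectrum bin by the
# response-weighted ratio of measured to predicted sphere count rates.

R = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])        # R[i, j]: sphere i response to bin j
true_phi = np.array([1.0, 2.0, 0.5])   # "true" spectrum (toy)
counts = R @ true_phi                  # simulated measured count rates

phi = np.ones(3)                       # flat initial guess spectrum
for _ in range(200):
    predicted = R @ phi
    # multiplicative correction, weighted by the response matrix
    phi *= (R.T @ (counts / predicted)) / R.sum(axis=0)

print(np.round(phi, 2))                # converges toward true_phi
```

    The multiplicative form keeps the unfolded spectrum non-negative, which is one reason iterative schemes are popular for few-channel Bonner-sphere data.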

  16. CALIOP: a multichannel design code for gas-cooled fast reactors. Code description and user's guide

    International Nuclear Information System (INIS)

    Thompson, W.I.

    1980-10-01

    CALIOP is a design code for fluid-cooled reactors composed of parallel fuel tubes in hexagonal or cylindrical ducts. It may be used with gaseous or liquid coolants. It has been used chiefly for design of a helium-cooled fast breeder reactor and has built-in cross section information to permit calculations of fuel loading, breeding ratio, and doubling time. Optional cross-section input allows the code to be used with moderated cores and with other fuels

  17. 78 FR 47153 - Claims Under the Federal Tort Claims Act for Loss of or Damage to Property or for Personal Injury...

    Science.gov (United States)

    2013-08-05

    ... on negligence, wrongful act or omission. Authority: 12 U.S.C. 5492(a)(1), (11); 28 U.S.C. 2672; 28 CFR 14.11. Sec. 1076.101 Claims against a Bureau employee based on negligence, wrongful act or... representative may present a claim against a Bureau employee based on negligence, or wrongful act or omission, as...

  18. Prevalence rates for depression by industry: a claims database analysis.

    Science.gov (United States)

    Wulsin, Lawson; Alterman, Toni; Timothy Bushnell, P; Li, Jia; Shen, Rui

    2014-11-01

    To estimate and interpret differences in depression prevalence rates among industries, using a large, group medical claims database. Depression cases were identified by ICD-9 diagnosis code in a population of 214,413 individuals employed during 2002-2005 by employers based in western Pennsylvania. Data were provided by Highmark, Inc. (Pittsburgh and Camp Hill, PA). Rates were adjusted for age, gender, and employee share of health care costs. National industry measures of psychological distress, work stress, and physical activity at work were also compiled from other data sources. Rates for clinical depression in 55 industries ranged from 6.9 to 16.2% (population rate = 10.45%). Industries with the highest rates tended to be those which, on the national level, require frequent or difficult interactions with the public or clients, and have high levels of stress and low levels of physical activity. Additional research is needed to help identify industries with relatively high rates of depression in other regions and on the national level, and to determine whether these differences are due in part to specific work stress exposures and physical inactivity at work. Claims database analyses may provide a cost-effective way to identify priorities for depression treatment and prevention in the workplace.
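
    Adjusting rates for age and gender, as described above, is commonly done by direct standardization: weighting stratum-specific rates by a common reference population. The stratum rates and weights below are invented toy values, not the study's data.

```python
# Direct standardization sketch: an industry's adjusted rate is the
# weighted sum of its stratum-specific rates, with weights taken from a
# common reference population (toy numbers).

# (stratum, industry-specific rate, reference-population weight)
strata = [
    ("male<40",    0.06, 0.30),
    ("male>=40",   0.09, 0.25),
    ("female<40",  0.12, 0.25),
    ("female>=40", 0.15, 0.20),
]

adjusted = sum(rate * w for _, rate, w in strata)
print(round(adjusted, 4))  # rate the industry would show on the common standard
```

    Because every industry is weighted to the same standard, differences in adjusted rates cannot be explained by differing age or gender mixes alone.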

  19. Prevalence rates for depression by industry: a claims database analysis

    Science.gov (United States)

    Alterman, Toni; Bushnell, P. Timothy; Li, Jia; Shen, Rui

    2015-01-01

    Purpose To estimate and interpret differences in depression prevalence rates among industries, using a large, group medical claims database. Methods Depression cases were identified by ICD-9 diagnosis code in a population of 214,413 individuals employed during 2002–2005 by employers based in western Pennsylvania. Data were provided by Highmark, Inc. (Pittsburgh and Camp Hill, PA). Rates were adjusted for age, gender, and employee share of health care costs. National industry measures of psychological distress, work stress, and physical activity at work were also compiled from other data sources. Results Rates for clinical depression in 55 industries ranged from 6.9 to 16.2% (population rate = 10.45%). Industries with the highest rates tended to be those which, on the national level, require frequent or difficult interactions with the public or clients, and have high levels of stress and low levels of physical activity. Conclusions Additional research is needed to help identify industries with relatively high rates of depression in other regions and on the national level, and to determine whether these differences are due in part to specific work stress exposures and physical inactivity at work. Clinical significance Claims database analyses may provide a cost-effective way to identify priorities for depression treatment and prevention in the workplace. PMID:24907896

  20. Joint source/channel coding of scalable video over noisy channels

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, G.; Zakhor, A. [Department of Electrical Engineering and Computer Sciences University of California Berkeley, California94720 (United States)

    1997-01-01

    We propose an optimal bit allocation strategy for a joint source/channel video codec over a noisy channel when the channel state is assumed to be known. Our approach is to partition source and channel coding bits in such a way that the expected distortion is minimized. The particular source coding algorithm we use is rate scalable and is based on 3D subband coding with multi-rate quantization. We show that using this strategy, transmission of video over very noisy channels still renders acceptable visual quality and outperforms schemes that use equal error protection only. The flexibility of the algorithm also permits the bit allocation to be selected optimally when the channel state is given as a probability distribution instead of a deterministic state. © 1997 American Institute of Physics.
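
    The bit-partition idea can be sketched with a toy objective: for each candidate split of a fixed budget, compute the expected distortion, combining source distortion when the data arrive with maximal distortion when they are lost, and take the minimizing split. The distortion and loss models below are illustrative stand-ins, not the paper's.

```python
import math

# Toy joint source/channel allocation: split TOTAL_BITS between source
# coding (fewer bits -> more quantization distortion) and channel coding
# (fewer bits -> higher residual loss probability). Models are invented.

TOTAL_BITS = 100

def expected_distortion(source_bits: int) -> float:
    channel_bits = TOTAL_BITS - source_bits
    d_source = math.exp(-0.05 * source_bits)       # rate-distortion proxy
    p_loss = 0.5 * math.exp(-0.1 * channel_bits)   # residual-loss proxy
    return (1 - p_loss) * d_source + p_loss * 1.0  # loss => maximal distortion

best = min(range(1, TOTAL_BITS), key=expected_distortion)
print(best, round(expected_distortion(best), 4))
```

    The same search generalizes to the probabilistic channel-state case by averaging the objective over the channel-state distribution, which is the flexibility the abstract points out.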

  1. 20 CFR 429.103 - Who may file my claim?

    Science.gov (United States)

    2010-04-01

    ... authorized agent, or your legal representative may file the claim. (c) Claims based on death. The executor or... behalf as agent, executor, administrator, parent, guardian or other representative. ...

  2. Design of Packet-Based Block Codes with Shift Operators

    Directory of Open Access Journals (Sweden)

    Ilow Jacek

    2010-01-01

    This paper introduces packet-oriented block codes for the recovery of lost packets and the correction of an erroneous single packet. Specifically, a family of systematic codes is proposed, based on a Vandermonde matrix applied to a group of information packets to construct redundant packets, where the elements of the Vandermonde matrix are bit-level right arithmetic shift operators. The code design is applicable to packets of any size, provided that the packets within a block of information packets are of uniform length. In order to decrease the overhead associated with packet padding using shift operators, non-Vandermonde matrices are also proposed for designing packet-oriented block codes. An efficient matrix inversion procedure for the off-line design of the decoding algorithm is presented to recover lost packets. The error correction capability of the design is investigated as well. The decoding algorithm, based on syndrome decoding, to correct a single erroneous packet in a group of received packets is presented. The paper is equipped with examples of codes using different parameters. The code designs and their performance are tested using Monte Carlo simulations; the results obtained exhibit good agreement with the corresponding theoretical results.
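
    The systematic construction is easiest to see for the first Vandermonde row, where every shift operator is the identity: the redundant packet is the bitwise XOR of the information packets, so any single lost packet is recoverable from it. A minimal sketch (deeper rows, which apply genuine bit-level right shifts, are omitted here):

```python
# Row 0 of the shift-operator Vandermonde matrix is all identity operators,
# so redundant packet r0 is the XOR of the information packets. Packets are
# modeled as 8-bit integers of uniform length, as the design requires.

packets = [0b10110010, 0b01101001, 0b11100001]

r0 = 0
for p in packets:
    r0 ^= p                      # identity-operator row: plain XOR parity

# Erase one packet and recover it from r0 and the survivors.
lost_index = 1
received = [p for i, p in enumerate(packets) if i != lost_index]

recovered = r0
for p in received:
    recovered ^= p               # XOR out the surviving packets

print(recovered == packets[lost_index])  # True
```

    Recovering two or more lost packets is where the deeper rows matter: the shift operators make the sub-matrices invertible, which is what the paper's off-line matrix inversion procedure exploits.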

  3. Characteristics of claims in the management of septic arthritis in Japan: Retrospective analyses of judicial precedents and closed claims.

    Science.gov (United States)

    Otaki, Yasuhiro; DaSilva, Makiko Ishida; Saito, Yuichi; Oyama, Yasuaki; Oiso, Giichiro; Yoshida, Tomohiko; Fukuhara, Masakazu; Moriyama, Mitsuru

    2018-03-01

    Septic arthritis (SA) cases can result in claims or litigation because of poor prognosis even if it is unavoidable. Although these claims or litigation are useful for understanding causes and background factors of medical errors, the characteristics of malpractice claims associated with SA remain undetermined in Japan. This study aimed to increase our understanding of malpractice claims in the clinical management of SA. We analyzed 6 civil precedents and 16 closed claims of SA from 8530 malpractice claims processed between July 2004 and June 2014 by the Tokyo office of Sompo Japan Nipponkoa Insurance, Incorporated. We also studied 5 accident and 21 incident reports of SA based on project data compiled by the Japan Council for Quality Health Care. The rate of negligence was 83.3% in the precedents and 75.0% in closed claims. Two main malpractice claim patterns were revealed: SA in a lower extremity joint following sepsis caused by methicillin-resistant Staphylococcus aureus in newborns and SA in an injection site following joint injection. These two patterns accounted for 83.3% and 56.3% of judicial cases and closed claim cases, respectively. Breakdowns in care process of accident and incident reports were clearly differentiated from judicial cases or closed claim cases (Fisher's exact test, p < 0.001). It is important to pay particular attention to SA following sepsis in newborns and to monitor for any signs of SA after joint injection to ensure early diagnosis. Analysis of both malpractice claims and accident and incident reports is essential to ensure a full understanding of the situation in Japan. Copyright © 2017. Published by Elsevier Taiwan LLC.

  4. Costs by industry and diagnosis among musculoskeletal claims in a state workers compensation system: 1999-2004.

    Science.gov (United States)

    Dunning, Kari K; Davis, Kermit G; Cook, Chad; Kotowski, Susan E; Hamrick, Chris; Jewell, Gregory; Lockey, James

    2010-03-01

    Musculoskeletal disorders (MSDs) are a tremendous burden on industry in the United States. However, there is limited understanding of the unique issues relating to specific industry sectors, specifically the frequency and costs of different MSDs. Claim data from 1999 to 2004 from the Ohio Bureau of Workers' Compensation were analyzed as a function of industry sector (NAICS industry-sector categories) and anatomical region (ICD-9 codes). Almost 50% of the claims were lumbar spine (26.9%) or hand/wrist (21.7%). The majority of claims were from manufacturing (25.1%) and service (32.8%) industries. The industries with the highest average costs per claim were transportation, warehouse, and utilities and construction. Across industries, the highest costs per claim were consistently for the lumbar spine, shoulder, and cervical spine body regions. This study provides insight into the severity (i.e., medical and indemnity costs) of MSDs across multiple industries, providing data for prioritizing of resources for research and interventions. 2009 Wiley-Liss, Inc.
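
    The underlying aggregation, average cost per claim by industry sector and body region, can be sketched with invented numbers (not the Ohio BWC data):

```python
from collections import defaultdict

# Toy recreation of the cost aggregation: mean cost per claim keyed by
# (industry sector, anatomical region). All figures are invented.

# (industry, body_region, total_cost_usd)
claims = [
    ("construction",  "lumbar spine", 12000),
    ("construction",  "shoulder",      9000),
    ("manufacturing", "lumbar spine",  7000),
    ("manufacturing", "hand/wrist",    3000),
    ("services",      "lumbar spine",  5000),
]

totals = defaultdict(lambda: [0, 0])          # key -> [cost sum, claim count]
for industry, region, cost in claims:
    totals[(industry, region)][0] += cost
    totals[(industry, region)][1] += 1

avg_cost = {k: s / n for k, (s, n) in totals.items()}
print(avg_cost[("construction", "lumbar spine")])  # 12000.0
```

    With real claim data the same two-key grouping yields the industry-by-region cost tables used to prioritize interventions.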

  5. Statistical methods for improving verification of claims of origin for Italian wines based on stable isotope ratios

    Energy Technology Data Exchange (ETDEWEB)

    Dordevic, N.; Wehrens, R. [IASMA Research and Innovation Centre, Fondazione Edmund Mach, via Mach 1, 38010 San Michele all' Adige (Italy); Postma, G.J.; Buydens, L.M.C. [Radboud University Nijmegen, Institute for Molecules and Materials, Analytical Chemistry, P.O. Box 9010, 6500 GL Nijmegen (Netherlands); Camin, F., E-mail: federica.camin@fmach.it [IASMA Research and Innovation Centre, Fondazione Edmund Mach, via Mach 1, 38010 San Michele all' Adige (Italy)

    2012-12-13

    Highlights: ► The assessment of claims of origin is of enormous economic importance for DOC and DOCG wines. ► The official method is based on univariate statistical tests of H, C and O isotopic ratios. ► We consider 5220 Italian wine samples collected in the period 2000-2010. ► Multivariate statistical analysis leads to much better specificity and easier detection of false claims of origin. ► In the case of multi-modal data, mixture modelling provides additional improvements. - Abstract: Wine derives its economic value to a large extent from geographical origin, which has a significant impact on the quality of the wine. According to the food legislation, wines can be without geographical origin (table wine) or with origin. Wines with origin must have characteristics which are essential due to their region of production and must be produced, processed, and prepared exclusively within that region. The development of fast and reliable analytical methods for the assessment of claims of origin is very important. The current official method is based on the measurement of stable isotope ratios of water and alcohol in wine, which are influenced by climatic factors. The results in this paper are based on 5220 Italian wine samples collected in the period 2000-2010. We evaluate the univariate approach underlying the official method to assess claims of origin and propose several new methods to get better geographical discrimination between samples. It is shown that multivariate methods are superior to univariate approaches in that they show increased sensitivity and specificity. In cases where data are non-normally distributed, an approach based on mixture modelling provides additional improvements.

  6. Particle-in-Cell Codes for plasma-based particle acceleration

    CERN Document Server

    Pukhov, Alexander

    2016-01-01

    Basic principles of particle-in-cell (PIC) codes with the main application for plasma-based acceleration are discussed. The ab initio full electromagnetic relativistic PIC codes provide the most reliable description of plasmas. Their properties are considered in detail. Representing the most fundamental model, the full PIC codes are computationally expensive. Plasma-based acceleration is a multi-scale problem with very disparate scales: the smallest scale is the laser or plasma wavelength (from one to a hundred microns) and the largest scale is the acceleration distance (from a few centimeters to meters or even kilometers). The Lorentz-boost technique allows one to reduce the scale disparity, at the cost of complicating the simulations and causing unphysical numerical instabilities in the code. Another possibility is to use the quasi-static approximation, in which the disparate scales are separated analytically.

  7. BenefitClaimWebServiceBean/BenefitClaimWebService

    Data.gov (United States)

    Department of Veterans Affairs — A formal or informal request for a type of monetary or non-monetary benefit. This service provides benefit claims and benefit claim special issues data, allows the...

  8. Linking individual medicare health claims data with work-life claims and other administrative data.

    Science.gov (United States)

    Mokyr Horner, Elizabeth; Cullen, Mark R

    2015-09-30

    Researchers investigating health outcomes for populations over age 65 can utilize Medicare claims data, but these data include no direct information about individuals' health prior to age 65 and are not typically linkable to files containing data on exposures and behaviors during their work lives. The current paper is a proof of concept of merging employers' administrative data and private, employment-based health claims with Medicare data. Characteristics of the linked data, including sensitivity and specificity, are evaluated with an eye toward potential uses of such linked data. This paper uses a sample of former manufacturing workers from an industrial cohort as a test case. Medicare and employment administrative data were linked for a large cohort of manufacturing workers (employed at some point during 1996-2008) who transitioned onto Medicare between 2001 and 2009. Data on work-life health, including biometric indicators, were used to predict health at age 65 and to investigate the concordance of employment-based insurance claims with subsequent Medicare insurance claims. Chronic diseases were found to have relatively high levels of concordance between employment-based private insurance and subsequent Medicare insurance. Information about patient health prior to receipt of Medicare, including biometric indicators, was found to predict health at age 65. Combining these data allows for evaluation of continuous health trajectories, as well as modeling later-life health as a function of work-life behaviors and exposures. It also provides a potential endpoint for occupational health research. This is the first harmonization of its kind, providing a proof of concept. The dataset created by this integration could be useful for research in areas such as social epidemiology and occupational health.
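
    The concordance evaluation can be sketched by treating the employment-based claims as the test and the Medicare claims as the reference for a chronic-condition flag. The worker records below are invented examples.

```python
# Toy sensitivity/specificity of a diagnosis flag across linked claim
# sources: private (pre-65) claims vs. subsequent Medicare claims.

# (has_diagnosis_in_private_claims, has_diagnosis_in_medicare_claims)
workers = [(True, True), (True, True), (False, True),
           (False, False), (True, False), (False, False)]

tp = sum(1 for pre, post in workers if pre and post)
fn = sum(1 for pre, post in workers if not pre and post)
tn = sum(1 for pre, post in workers if not pre and not post)
fp = sum(1 for pre, post in workers if pre and not post)

sensitivity = tp / (tp + fn)   # how often Medicare-era diagnoses were
specificity = tn / (tn + fp)   # already visible in the private claims
print(round(sensitivity, 2), round(specificity, 2))
```

    High values for chronic conditions, as the paper reports, indicate the linked record carries a usable continuous disease history across the age-65 insurance transition.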

  9. ITS Version 6 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2008-04-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear, time-independent, coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.
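
    An order-independent keyword input scheme with defaults, in the spirit described above, can be sketched as follows. The keyword names and defaults are invented for illustration and are not actual ITS keywords.

```python
# Hypothetical keyword-deck parser: keywords may appear in any order,
# unspecified keywords keep their defaults, and unknown keywords are
# rejected (internal error checking). All names/values are invented.

DEFAULTS = {"cutoff-kev": 1.0, "histories": 10000, "geometry": "slab"}

def parse_deck(lines):
    """Parse 'keyword value' lines in any order, falling back to defaults."""
    params = dict(DEFAULTS)
    for line in lines:
        if not line.strip():
            continue
        key, _, value = line.strip().partition(" ")
        if key not in params:
            raise ValueError(f"unknown keyword: {key}")
        params[key] = type(params[key])(value.strip())  # coerce to default's type
    return params

deck = ["histories 50000", "geometry cylinder"]
print(parse_deck(deck))  # cutoff-kev keeps its default of 1.0
```

    The appeal of this style, which the abstract highlights, is that a minimal deck is valid: defaults and error checking do the rest.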

  10. Web- and system-code based, interactive, nuclear power plant simulators

    International Nuclear Information System (INIS)

    Kim, K. D.; Jain, P.; Rizwan, U.

    2006-01-01

    Using two different approaches, on-line, web- and system-code-based graphical user interfaces have been developed for reactor system analysis. Both are LabVIEW-based systems (LabVIEW is a graphical programming language developed by National Instruments) that allow local users, as well as those at remote sites, to run the system code, interact with it, and view its results in a web browser. In the first approach, only the data written by the system code to a tab-separated ASCII output file are accessed and displayed graphically. In the second approach, LabVIEW virtual instruments are coupled with the system code as dynamic link libraries (DLLs). RELAP5 is used as the system code to demonstrate the capabilities of these approaches. From collaborative projects between teams in geographically remote locations to providing system-code experience to distance-education students, these tools can be very beneficial in many areas of teaching and R and D. (authors)

  11. 77 FR 1501 - Special Purpose Permit Application; Draft Environmental Assessment; Hawaii-Based Shallow-Set...

    Science.gov (United States)

    2012-01-10

    ...-FF01M01000] Special Purpose Permit Application; Draft Environmental Assessment; Hawaii-Based Shallow-Set... the operation of the Hawaii-based shallow-set longline fishery that targets swordfish (Xiphias gladius... albatross, by NMFS in its regulation of the shallow-set longline fishery based in Hawaii. This fishery...

  12. 77 FR 50153 - Special Purpose Permit Application; Hawaii-Based Shallow-Set Longline Fishery; Final...

    Science.gov (United States)

    2012-08-20

    ...-FF01M01000] Special Purpose Permit Application; Hawaii-Based Shallow-Set Longline Fishery; Final... of the Hawaii-based shallow-set longline fishery, which targets swordfish. After evaluating several... take of seabirds in the shallow-set longline fishery based in Hawaii. The analysis of alternatives is...

  13. General scientific guidance for stakeholders on health claim applications

    DEFF Research Database (Denmark)

    Sjödin, Anders Mikael

    2016-01-01

    The European Food Safety Authority (EFSA) asked the Panel on Dietetic Products, Nutrition and Allergies (NDA) to update the General guidance for stakeholders on the evaluation of Article 13.1, 13.5 and 14 health claims published in March 2011. Since then, the NDA Panel has completed the evaluation of Article 13.1 claims except for claims put on hold by the European Commission, and has evaluated additional health claim applications submitted pursuant to Articles 13.5, 14 and also 19. In addition, comments received from stakeholders indicate that general issues that are common to all health claims need... The guidance is based on the experience gained to date with the evaluation of health claims, and it may be further updated, as appropriate, when additional issues are addressed.

  14. Unfolding code for neutron spectrometry based on neural nets technology

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.

    2012-10-01

    The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed in a graphical interface under the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the Robust Design of Artificial Neural Networks Methodology. The code is easy to use, friendly and intuitive to the user. It was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as input data, only seven count-rate measurements with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in HTML format with all relevant information. (Author)
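    The unfolding step itself is a fixed feedforward mapping from seven count rates to 60 spectrum bins. A minimal sketch of such a forward pass is given below; the hidden-layer width and the random stand-in weights are assumptions for illustration, not the trained network described in the record.

```python
import math
import random

def forward(counts, w1, b1, w2, b2):
    """One-hidden-layer forward pass with sigmoid activations:
    seven Bonner-sphere count rates in, sixty spectrum bins out."""
    hidden = [1.0 / (1.0 + math.exp(-(sum(w * c for w, c in zip(row, counts)) + b)))
              for row, b in zip(w1, b1)]
    return [sum(w * h for w, h in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Random stand-in weights; a real unfolding net would use trained values.
rng = random.Random(0)
n_in, n_hidden, n_out = 7, 12, 60
w1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
b1 = [rng.uniform(-1, 1) for _ in range(n_hidden)]
w2 = [[rng.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(n_out)]
b2 = [rng.uniform(-1, 1) for _ in range(n_out)]

spectrum = forward([1.0] * n_in, w1, b1, w2, b2)
```

    The trained weights, produced offline by the robust-design optimization the record mentions, would simply replace the random arrays here.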

  15. Dual Coding and Bilingual Memory.

    Science.gov (United States)

    Paivio, Allan; Lambert, Wallace

    1981-01-01

    Describes a study which tested a dual coding approach to bilingual memory using tasks that permit comparison of the effects of bilingual encoding with those of verbal-nonverbal dual encoding of items. Results provide strong support for a version of the independent or separate stores view of bilingual memory. (Author/BK)

  16. Auto Code Generation for Simulink-Based Attitude Determination Control System

    Science.gov (United States)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto-generate C code from a Simulink-based Attitude Determination Control System (ADCS) for use on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used for hardware-in-the-loop testing of satellite components in a convenient manner, with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from the simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can introduce new complications, since the execution order of these models can change as a result. Great care must be taken to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, the process is a success, since all the output requirements are met. Based on these results, the generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  17. Predicting number of hospitalization days based on health insurance claims data using bagged regression trees.

    Science.gov (United States)

    Xie, Yang; Schreier, Günter; Chang, David C W; Neubauer, Sandra; Redmond, Stephen J; Lovell, Nigel H

    2014-01-01

    Healthcare administrators worldwide are striving to lower the cost of care while improving its quality, so better clinical and administrative decision making is needed. Anticipating outcomes such as the number of hospitalization days could contribute to addressing this problem. In this paper, a method was developed, using large-scale health insurance claims data, to predict the number of hospitalization days in a population. We utilized a bagged regression tree algorithm, along with insurance claims data from 300,000 individuals over three years, to predict the number of days in hospital in the third year, based on medical admissions and claims data from the first two years. Our method performs well in the general population. For the population aged 65 years and over, the predictive model significantly improves on a baseline method (predicting a constant number of days for each patient), and achieved a specificity of 70.20% and a sensitivity of 75.69% in classifying these subjects into the two categories of 'no hospitalization' and 'at least one day in hospital'.
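    As a hedged illustration of the bagging idea only, the sketch below bootstraps one-split regression stumps over a toy single feature (age) and averages their predictions; the paper's model uses full regression trees over many claim features, which this does not reproduce.

```python
import random
from statistics import mean

def fit_stump(xs, ys):
    """Fit a one-split regression stump minimizing squared error."""
    best = None
    for t in sorted(set(xs))[:-1]:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        sse = (sum((y - mean(left)) ** 2 for y in left)
               + sum((y - mean(right)) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, t, mean(left), mean(right))
    if best is None:                      # degenerate bootstrap sample
        overall = mean(ys)
        return lambda x: overall
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def bagged_stumps(xs, ys, n_estimators=25, seed=0):
    """Bagging: fit each stump on a bootstrap resample, average predictions."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_estimators):
        idx = [rng.randrange(len(xs)) for _ in xs]
        models.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: mean(m(x) for m in models)

# Toy data: age as the single predictor, hospitalization days as target.
ages = [30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80, 85]
days = [0, 0, 1, 0, 0, 1, 2, 3, 4, 5, 6, 8]
predict = bagged_stumps(ages, days)
```

    Averaging over resampled stumps smooths the hard threshold a single tree would place, which is the variance-reduction effect bagging is used for.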

  18. Paracantor: A two group, two region reactor code

    Energy Technology Data Exchange (ETDEWEB)

    Stone, Stuart

    1956-07-01

    Paracantor I is a two-energy-group, two-region, time-independent reactor code which obtains a closed solution for a critical reactor assembly. The code deals with cylindrical reactors of finite length and with a radial reflector of finite thickness. It is programmed for the I.B.M. Magnetic Drum Data-Processing Machine, Type 650. The limited memory space available does not permit a flux solution to be included in the basic Paracantor code. A supplementary code, Paracantor II, has been programmed which computes fluxes, including adjoint fluxes, from the output of Paracantor I.

  19. RELAP/MOD3 code manual: User's guidelines. Volume 5, Revision 1

    International Nuclear Information System (INIS)

    Fletcher, C.D.; Schultz, R.R.

    1995-08-01

    The RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during postulated accidents. The code models the coupled behavior of the reactor coolant system and the core for loss-of-coolant accidents and operational transients such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits simulating a variety of thermal-hydraulic systems. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater systems. Volume V contains guidelines that have evolved over the past several years through the use of the RELAP5 code.

  20. Optical image encryption based on real-valued coding and subtracting with the help of QR code

    Science.gov (United States)

    Deng, Xiaopeng

    2015-08-01

    A novel optical image encryption based on real-valued coding and subtracting is proposed with the help of quick response (QR) code. In the encryption process, the original image to be encoded is firstly transformed into the corresponding QR code, and then the corresponding QR code is encoded into two phase-only masks (POMs) by using basic vector operations. Finally, the absolute values of the real or imaginary parts of the two POMs are chosen as the ciphertexts. In decryption process, the QR code can be approximately restored by recording the intensity of the subtraction between the ciphertexts, and hence the original image can be retrieved without any quality loss by scanning the restored QR code with a smartphone. Simulation results and actual smartphone collected results show that the method is feasible and has strong tolerance to noise, phase difference and ratio between intensities of the two decryption light beams.
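    The recovery step above relies only on pointwise subtraction of the two ciphertexts. The toy below sketches that subtraction idea alone, with real-valued masks standing in for the optically recorded phase-only masks; the QR-code generation and the optical POM construction are not modeled here.

```python
import random

def encrypt(bits, seed=42):
    """Split each binary QR-code pixel b into two real-valued ciphertexts
    c1, c2 with c1 - c2 = b; either ciphertext alone looks random."""
    rng = random.Random(seed)
    c1 = [rng.random() for _ in bits]
    c2 = [c - b for c, b in zip(c1, bits)]
    return c1, c2

def decrypt(c1, c2):
    """Recover the pixels from the difference of the two ciphertexts."""
    return [round(a - b) for a, b in zip(c1, c2)]

pixels = [1, 0, 1, 1, 0, 0, 1]
c1, c2 = encrypt(pixels)
```

    Because the recovered QR code is then rescanned, small numeric noise in the subtraction is tolerated, which is what gives the scheme its robustness.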

  1. Central Decoding for Multiple Description Codes based on Domain Partitioning

    Directory of Open Access Journals (Sweden)

    M. Spiertz

    2006-01-01

    Full Text Available Multiple Description Codes (MDC) can be used to trade redundancy against packet-loss resistance for transmitting data over lossy diversity networks. In this work we focus on MD transform coding based on domain partitioning. Compared to Vaishampayan's quantizer-based MDC, domain-based MD coding is a simple approach for generating different descriptions, by using different quantizers for each description. Commonly, only the highest-rate quantizer is used for reconstruction. In this paper we investigate the benefit of using the lower-rate quantizers to enhance the reconstruction quality at the decoder side. The comparison is done on artificial source data and on image data.
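    To see why a second description helps the central decoder, here is a toy with two staggered uniform quantizers (equal step with offset cells, an assumption for illustration rather than the paper's different-rate quantizer pair): intersecting the two quantization cells yields a smaller reconstruction interval than either description alone.

```python
def cell(x, step, offset=0.0):
    """Quantization cell (interval) containing x for a uniform quantizer."""
    i = int((x - offset) // step)
    return (offset + i * step, offset + (i + 1) * step)

def side_decode(x, step, offset=0.0):
    """Reconstruct from one description alone: midpoint of its cell."""
    lo, hi = cell(x, step, offset)
    return (lo + hi) / 2.0

def central_decode(x, step, offset):
    """Reconstruct from both descriptions: midpoint of the intersection
    of the two cells, which is narrower than either cell."""
    lo1, hi1 = cell(x, step, 0.0)
    lo2, hi2 = cell(x, step, offset)
    lo, hi = max(lo1, lo2), min(hi1, hi2)
    return (lo + hi) / 2.0
```

    For x = 0.7, step 1.0 and offset 0.5, the side decoder returns 0.5 while the central decoder returns 0.75, quartering the error.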

  2. SGV: a code to evaluate plasma reaction rates to a specified accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Devoto, R.S.; Hanson, J.D.

    1978-09-22

    A FORTRAN code to evaluate binary reaction rates (sigma-v) for a plasma to a specified accuracy is described. The distribution functions permitted are (1) two Maxwellian species at different temperatures, (2) beam-Maxwellian, (3) cold gas with Maxwellian, and (4) beam-plasma with a mirror distribution of the form f(v, theta) proportional to f(v)M(cos theta). Several functional forms are permitted for f(v) and M(cos theta). Cross-section subroutines for a number of interactions involving hydrogen, helium, and electrons are included, as is a routine allowing input of numerical data. The code is written as a subroutine to allow ready incorporation into larger plasma codes.
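    As a rough sketch of what such a rate evaluation involves, the code below averages sigma(v)*v over a single-temperature Maxwellian speed distribution with a toy constant cross section and plain Riemann quadrature; the SGV code's other distribution pairs and its adaptive accuracy control are not reproduced.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def sigma_v(sigma, temp_k, mass_kg, n=20000, vmax_factor=8.0):
    """<sigma*v> averaged over a Maxwellian speed distribution,
    integrated numerically from 0 to vmax_factor thermal speeds."""
    vth = math.sqrt(2.0 * K_B * temp_k / mass_kg)
    dv = vmax_factor * vth / n
    norm = 4.0 * math.pi * (mass_kg / (2.0 * math.pi * K_B * temp_k)) ** 1.5
    total = 0.0
    for i in range(1, n + 1):
        v = i * dv
        f = norm * v * v * math.exp(-(v / vth) ** 2)  # Maxwellian speed pdf
        total += sigma(v) * v * f * dv
    return total

# Sanity check: with a constant cross section, <sigma*v> = sigma0 * <v>,
# where the Maxwellian mean speed is sqrt(8*k*T / (pi*m)).
rate = sigma_v(lambda v: 1.0e-20, 1.0e7, 1.67e-27)
```

    A real cross-section subroutine (energy-dependent, tabulated or fitted) would replace the constant lambda, and the quadrature would be refined until the requested accuracy is met.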

  3. Design of Packet-Based Block Codes with Shift Operators

    Directory of Open Access Journals (Sweden)

    Jacek Ilow

    2010-01-01

    Full Text Available This paper introduces packet-oriented block codes for the recovery of lost packets and the correction of an erroneous single packet. Specifically, a family of systematic codes is proposed, based on a Vandermonde matrix applied to a group of k information packets to construct r redundant packets, where the elements of the Vandermonde matrix are bit-level right arithmetic shift operators. The code design is applicable to packets of any size, provided that the packets within a block of k information packets are of uniform length. In order to decrease the overhead associated with packet padding using shift operators, non-Vandermonde matrices are also proposed for designing packet-oriented block codes. An efficient matrix inversion procedure for the off-line design of the decoding algorithm is presented to recover lost packets. The error correction capability of the design is investigated as well. The decoding algorithm, based on syndrome decoding, to correct a single erroneous packet in a group of n=k+r received packets is presented. The paper is equipped with examples of codes using different parameters. The code designs and their performance are tested using Monte Carlo simulations; the results obtained exhibit good agreement with the corresponding theoretical results.
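    The r = 1 case of such a packet-level code reduces to a single XOR parity packet; the sketch below shows erasure recovery in that special case only (the paper's general construction uses bit-level shift operators in a Vandermonde matrix to obtain r > 1 redundant packets, which this toy does not attempt).

```python
def xor_packets(packets):
    """Bytewise XOR of equal-length packets."""
    out = bytearray(len(packets[0]))
    for p in packets:
        for i, byte in enumerate(p):
            out[i] ^= byte
    return bytes(out)

def make_parity(info_packets):
    """Single redundant packet: XOR of the k information packets."""
    return xor_packets(info_packets)

def recover(received, parity):
    """Recover the one lost packet (marked None) from the survivors."""
    survivors = [p for p in received if p is not None]
    return xor_packets(survivors + [parity])

info = [b"abcd", b"wxyz", b"1234"]
parity = make_parity(info)
restored = recover([info[0], None, info[2]], parity)
```

    The shift-operator rows in the full design play the role the single XOR row plays here, making the generalized system of equations invertible for multiple losses.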

  4. 40 CFR 1620.3 - Administrative claim; who may file.

    Science.gov (United States)

    2010-07-01

    ...) A claim based on death may be presented by the executor or administrator of the decedent's estate... that the basis for the representation is documented in writing. (d) A claim for loss totally...

  5. Description of the COMRADEX code

    International Nuclear Information System (INIS)

    Spangler, G.W.; Boling, M.; Rhoades, W.A.; Willis, C.A.

    1967-01-01

    The COMRADEX Code is discussed briefly and instructions are provided for the use of the code. The subject code was developed for calculating doses from hypothetical power reactor accidents. It permits the user to analyze four successive levels of containment with time-varying leak rates. Filtration, cleanup, fallout and plateout in each containment shell can also be analyzed. The doses calculated include the direct gamma dose from the containment building, the internal doses to as many as 14 organs including the thyroid, bone, lung, etc. from inhaling the contaminated air, and the external gamma doses from the cloud. While further improvements are needed, such as a provision for calculating doses from fallout, rainout and washout, the present code capabilities have a wide range of applicability for reactor accident analysis
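    A hedged sketch of the multi-shell bookkeeping such a code performs is given below: a toy Euler integration with made-up leak rates. COMRADEX's actual models also include filtration, cleanup, fallout and plateout in each shell, and the dose calculations themselves, all omitted here.

```python
def leak_chain(initial_activity, leak_rates, dt=0.1, steps=1000):
    """Track activity through successive containment shells; each shell
    leaks into the next at a (possibly time-varying) fractional rate."""
    shells = [initial_activity] + [0.0] * len(leak_rates)
    for n in range(steps):
        t = n * dt
        flows = [rate(t) * shells[i] * dt for i, rate in enumerate(leak_rates)]
        for i, flow in enumerate(flows):
            shells[i] -= flow
            shells[i + 1] += flow
    return shells

# Four successive containment levels (the last entry plays the role of
# the environment); leak rates here are arbitrary illustrative constants.
rates = [lambda t: 0.05, lambda t: 0.02, lambda t: 0.01]
final = leak_chain(1.0, rates)
```

    Because the model only transfers activity between compartments, total activity is conserved, which is a useful check on any such bookkeeping.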

  6. 40 CFR 10.3 - Administrative claims; who may file.

    Science.gov (United States)

    2010-07-01

    ...) A claim based on death may be presented by the executor or administrator of the decedent's estate or... to present a claim on behalf of the claimant as agent, executor, administrator, parent, guardian, or...

  7. Hybrid Video Coding Based on Bidimensional Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Lorenzo Granai

    2004-12-01

    Full Text Available Hybrid video coding combines two stages: first, motion estimation and compensation predict each frame from the neighboring frames; then the prediction error is coded, reducing the correlation in the spatial domain. In this work, we focus on the latter stage, presenting a scheme that profits from some of the features introduced by the standard H.264/AVC for motion estimation and replaces the transform in the spatial domain. The prediction error is thus coded using the matching pursuit algorithm, which decomposes the signal over a specially designed bidimensional, anisotropic, redundant dictionary. Comparisons are made among the proposed technique, H.264, and a DCT-based coding scheme. Moreover, we introduce fast techniques for atom selection, which exploit the spatial localization of the atoms. An adaptive coding scheme aimed at optimizing the resource allocation is also presented, together with a rate-distortion study for the matching pursuit algorithm. Results show that the proposed scheme outperforms the standard DCT, especially at very low bit rates.
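    The core loop of matching pursuit is simple to sketch. The 1-D toy below uses a tiny hand-made dictionary rather than the paper's anisotropic bidimensional one, but the greedy select-and-subtract structure is the same.

```python
def matching_pursuit(signal, dictionary, max_atoms=3, tol=1e-12):
    """Greedy decomposition: at each step pick the unit-norm atom with the
    largest inner product with the residual, then subtract its projection."""
    residual = list(signal)
    atoms = []
    for _ in range(max_atoms):
        best_k, best_c = None, 0.0
        for k, atom in enumerate(dictionary):
            c = sum(r * a for r, a in zip(residual, atom))
            if abs(c) > abs(best_c):
                best_k, best_c = k, c
        if best_k is None or abs(best_c) < tol:
            break
        atoms.append((best_k, best_c))
        residual = [r - best_c * a
                    for r, a in zip(residual, dictionary[best_k])]
    return atoms, residual

# Unit-norm redundant dictionary: a "flat" atom plus the standard basis.
flat = [0.5, 0.5, 0.5, 0.5]
basis = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
atoms, residual = matching_pursuit([1.0, 1.0, 1.0, 1.0], [flat] + basis)
```

    A redundant dictionary pays off exactly as here: the flat atom captures the whole signal in one step, where the orthogonal basis alone would need four.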

  8. H.264 Layered Coded Video over Wireless Networks: Channel Coding and Modulation Constraints

    Directory of Open Access Journals (Sweden)

    Ghandi MM

    2006-01-01

    Full Text Available This paper considers the prioritised transmission of H.264 layered coded video over wireless channels. For appropriate protection of video data, methods such as prioritised forward error correction coding (FEC or hierarchical quadrature amplitude modulation (HQAM can be employed, but each imposes system constraints. FEC provides good protection but at the price of a high overhead and complexity. HQAM is less complex and does not introduce any overhead, but permits only fixed data ratios between the priority layers. Such constraints are analysed and practical solutions are proposed for layered transmission of data-partitioned and SNR-scalable coded video where combinations of HQAM and FEC are used to exploit the advantages of both coding methods. Simulation results show that the flexibility of SNR scalability and absence of picture drift imply that SNR scalability as modelled is superior to data partitioning in such applications.

  9. 24 CFR 17.3 - Administrative claim; who may file.

    Science.gov (United States)

    2010-04-01

    ... representative. (c) A claim based on death may be presented by the executor or administrator of the decedent's... evidence of his authority to present a claim on behalf of the claimant as agent, executor, administrator...

  10. Unfolding code for neutron spectrometry based on neural nets technology

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Apdo. Postal 336, 98000 Zacatecas (Mexico)

    2012-10-15

    The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed in a graphical interface under the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the Robust Design of Artificial Neural Networks Methodology. The code is easy to use, friendly and intuitive to the user. It was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as input data, only seven count-rate measurements with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in HTML format with all relevant information. (Author)

  11. Student Dress Codes and Uniforms. Research Brief

    Science.gov (United States)

    Johnston, Howard

    2009-01-01

    According to an Education Commission of the States "Policy Report", research on the effects of dress code and school uniform policies is inconclusive and mixed. Some researchers find positive effects; others claim no effects or only perceived effects. While no state has legislatively mandated the wearing of school uniforms, 28 states and…

  12. Protograph-Based Raptor-Like Codes

    Science.gov (United States)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

    Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. Analytic and empirical results indicate that, in the short-blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible punctured turbo (RCPT) codes did not outperform convolutional codes in the short-blocklength regime. The reason is that convolutional codes with a small number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, their strength does not scale with the blocklength for a fixed number of states in the trellis.

  13. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation, and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, a short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or a directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC.

  14. Should the District Courts Have Jurisdiction Over Pre-Award Contract Claims? A Claim for the Claims Court

    National Research Council Canada - National Science Library

    Short, John J

    1987-01-01

    This thesis briefly examines the jurisdiction of the federal district courts and the United States Court of Claims over pre-award contract claims before the Federal Courts Improvement Act of October 1...

  15. 12 CFR 793.3 - Administrative claim; who may file.

    Science.gov (United States)

    2010-01-01

    ... authorized agent, or his legal representative. (c) A claim based on death may be presented by the executor or... accompanied by evidence of his authority to present a claim on behalf of the claimant as agent, executor...

  16. PERMUTATION-BASED POLYMORPHIC STEGO-WATERMARKS FOR PROGRAM CODES

    Directory of Open Access Journals (Sweden)

    Denys Samoilenko

    2016-06-01

    Full Text Available Purpose: One of the most topical trends in program code protection is code marking. The problem consists in creating digital “watermarks” which allow distinguishing different copies of the same program code. Such marks can be useful for authorship protection, for numbering code copies, for monitoring program propagation, and for information security purposes in client-server communication processes. Methods: We used methods of digital steganography adapted to program codes as text objects. The same-shape-symbols method was transformed into a same-semantic-element method owing to features of codes that make them different from ordinary texts. We use a dynamic principle of mark formation, making the codes polymorphic. Results: We examined the combinatorial capacity of the permutations possible in program codes. It was shown that a set of 5-7 polymorphic variables is suitable for most modern network applications. Mark creation and restoration algorithms were proposed and discussed. The main algorithm is based on full and partial permutations of variable names and their declaration order. The algorithm for partial permutation enumeration was optimized for computational complexity. PHP code fragments which realize the algorithms are listed. Discussion: The method proposed in this work allows distinguishing each client-server connection. If a clone of some network resource is found, the method can give information about the included marks and thereby data on the IP address, date and time, and authentication information of the client that copied the resource. Usage of polymorphic stego-watermarks should improve information security indexes in network communications.
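    The full-permutation variant can be sketched with a factorial-number-system (Lehmer) mapping between an integer watermark and a declaration order. The sketch below is illustrative Python rather than the paper's PHP, and the variable names are hypothetical; with 5 variables there are 5! = 120 distinguishable marks, consistent with the 5-7 variable range discussed above.

```python
from math import factorial

def index_to_order(watermark, names):
    """Map an integer watermark (0 .. n!-1) to a unique declaration
    order of the canonically sorted variable names."""
    pool = sorted(names)
    order = []
    for i in range(len(pool), 0, -1):
        f = factorial(i - 1)
        order.append(pool.pop(watermark // f))
        watermark %= f
    return order

def order_to_index(order):
    """Recover the watermark from an observed declaration order."""
    pool = sorted(order)
    idx = 0
    for name in order:
        j = pool.index(name)
        idx += j * factorial(len(pool) - 1)
        pool.pop(j)
    return idx

names = ["alpha", "beta", "gamma", "delta", "epsilon"]  # hypothetical vars
```

    Both sides only need to agree on the canonical (sorted) name list; the served code's declaration order then silently carries the watermark.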

  17. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
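    As an illustration of the kind of local code involved, here is plain syndrome decoding of the (7,4) Hamming code. This is hard-decision single-error correction only; the soft-decision MAP decoding via the Ashikhmin-Lytsin algorithm used in the paper is not attempted here.

```python
# Parity-check matrix of the (7,4) Hamming code: column j is the binary
# representation of j+1, so a nonzero syndrome names the error position.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(word):
    """Three parity checks over GF(2)."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def correct(word):
    """Flip the bit whose 1-based position the syndrome spells out."""
    s = syndrome(word)
    pos = s[0] * 4 + s[1] * 2 + s[2]
    fixed = list(word)
    if pos:
        fixed[pos - 1] ^= 1
    return fixed

codeword = [1, 1, 1, 0, 0, 0, 0]   # satisfies all three checks
noisy = codeword[:]
noisy[4] ^= 1                      # single bit error at position 5
```

    In a GLDPC code, many such short local decoders run over overlapping subsets of the bits, and their corrections are iterated globally.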

  18. Learning-Based Just-Noticeable-Quantization- Distortion Modeling for Perceptual Video Coding.

    Science.gov (United States)

    Ki, Sehwan; Bae, Sung-Ho; Kim, Munchurl; Ko, Hyunsuk

    2018-07-01

    Conventional predictive video coding-based approaches are reaching the limit of their potential coding efficiency improvements because of severely increasing computational complexity. As an alternative approach, perceptual video coding (PVC) attempts to achieve high coding efficiency by eliminating perceptual redundancy, using just-noticeable-distortion (JND) directed PVC. Previous JND models were built by adding white Gaussian noise or specific signal patterns into the original images, which is not appropriate for finding JND thresholds due to distortion with energy reduction. In this paper, we present a novel discrete cosine transform-based energy-reduced JND model, called ERJND, that is more suitable for JND-based PVC schemes. The proposed ERJND model is then extended to two learning-based just-noticeable-quantization-distortion (JNQD) models for preprocessing that can be applied to perceptual video coding. The two JNQD models can automatically adjust JND levels based on given quantization step sizes. One of the two JNQD models, called LR-JNQD, is based on linear regression and determines the model parameter for JNQD from extracted handcrafted features. The other JNQD model, called CNN-JNQD, is based on a convolutional neural network (CNN). To the best of our knowledge, this is the first approach to automatically adjust JND levels according to quantization step sizes for preprocessing the input to video encoders. In experiments, both the LR-JNQD and CNN-JNQD models were applied to high efficiency video coding (HEVC) and yielded maximum (average) bitrate reductions of 38.51% (10.38%) and 67.88% (24.91%), respectively, with little subjective video quality degradation, compared with the input without preprocessing applied.
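    At its simplest, the LR-JNQD component reduces to fitting a linear map from features such as the quantization step size to a JND adjustment. The ordinary-least-squares sketch below uses toy data, not the paper's handcrafted features.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Toy mapping: quantization step size -> JND scaling factor (made up).
steps = [8.0, 16.0, 24.0, 32.0]
jnd_scale = [1.1, 1.3, 1.5, 1.7]
a, b = fit_line(steps, jnd_scale)
```

    The fitted line then lets the preprocessor raise the JND threshold (and thus suppress more imperceptible detail) as the encoder's quantization step grows coarser.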

  19. Edge-preserving Intra Depth Coding based on Context-coding and H.264/AVC

    DEFF Research Database (Denmark)

    Zamarin, Marco; Salmistraro, Matteo; Forchhammer, Søren

    2013-01-01

    Depth map coding plays a crucial role in 3D video communication systems based on the “Multi-view Video plus Depth” representation, as view synthesis performance is strongly affected by the accuracy of depth information, especially at edges in the depth map image. In this paper an efficient algorithm for edge-preserving intra depth compression based on H.264/AVC is presented. The proposed method introduces a new Intra mode specifically targeted to depth macroblocks with arbitrarily shaped edges, which are typically not efficiently represented by the DCT. Edge macroblocks are partitioned into two regions, each approximated by a flat surface. Edge information is encoded by means of context-coding with an adaptive template. As a novel element, the proposed method allows exploiting the edge structure of previously encoded edge macroblocks during the context-coding step to further increase compression...

  20. Degrees of Truthfulness in Accepted Scientific Claims.

    Directory of Open Access Journals (Sweden)

    Ahmed Hassan Mabrouk

    2008-12-01

    Full Text Available Abstract: Sciences adopt different methodologies in deriving claims and establishing theories. As a result, two accepted claims or theories belonging to two different sciences may not necessarily carry the same degree of truthfulness. Examining the different methodologies of deriving claims in the sciences of ʿaqīdah (Islamic Creed), fiqh (Islamic Jurisprudence) and physics, the study shows that ʿaqīdah provides a holistic understanding of the universe. Physics falls short of interpreting physical phenomena unless these phenomena are looked at through the ʿaqīdah holistic view. Left to itself, error may creep into laws of physics due to the methodology of conducting the physical experiments, misinterpreting the experimental results, or accepting invalid assumptions. As for fiqh, it is found that apart from apparent errors, fiqh views cannot be falsified. It is, therefore, useful to consider ʿaqīdah as a master science which would permit all other sciences to live in harmony.

  1. Experimental data bases useful for quantification of model uncertainties in best estimate codes

    International Nuclear Information System (INIS)

    Wilson, G.E.; Katsma, K.R.; Jacobson, J.L.; Boodry, K.S.

    1988-01-01

    A data base is necessary for assessment of thermal hydraulic codes within the context of the new NRC ECCS Rule. Separate effect tests examine particular phenomena that may be used to develop and/or verify models and constitutive relationships in the code. Integral tests are used to demonstrate the capability of codes to model global characteristics and sequence of events for real or hypothetical transients. The nuclear industry has developed a large experimental data base of fundamental nuclear, thermal-hydraulic phenomena for code validation. Given a particular scenario, and recognizing the scenario's important phenomena, selected information from this data base may be used to demonstrate applicability of a particular code to simulate the scenario and to determine code model uncertainties. LBLOCA experimental data bases useful to this objective are identified in this paper. 2 tabs

  2. RELAP5/MOD3 code manual: User's guidelines. Volume 5, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, C.D.; Schultz, R.R. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)]

    1995-08-01

    The RELAP5 code has been developed for best estimate transient simulation of light water reactor coolant systems during postulated accidents. The code models the coupled behavior of the reactor coolant system and the core for loss-of-coolant accidents and operational transients, such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits simulating a variety of thermal hydraulic systems. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater systems. Volume V contains guidelines that have evolved over the past several years through the use of the RELAP5 code.

  3. Health claims on food products in Southeast Asia: regulatory frameworks, barriers, and opportunities.

    Science.gov (United States)

    Tan, Karin Y M; van der Beek, Eline M; Chan, M Y; Zhao, Xuejun; Stevenson, Leo

    2015-09-01

    The Association of Southeast Asian Nations aims to act as a single market and allow free movement of goods, services, and manpower. The purpose of this article is to present an overview of the current regulatory framework for health claims in Southeast Asia and to highlight the current barriers and opportunities in the regulatory frameworks in the Association of Southeast Asian Nations. To date, 5 countries in Southeast Asia, i.e., Indonesia, Malaysia, the Philippines, Singapore, and Thailand, have regulations and guidelines to permit the use of health claims on food products. There are inconsistencies in the regulations and the types of evidence required for health claim applications in these countries. A clear understanding of the regulatory frameworks in these countries may help to increase trade in this fast-growing region and to provide direction for the food industry and the regulatory community to develop and market food products with better nutritional quality tailored to the needs of Southeast Asian consumers. © The Author(s) 2015. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects becomes a more and more important issue. In this paper, a public Hamming code based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming code based watermark can be verified by Hamming code checking without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. They also show that the proposed method can improve security and achieve low distortion of the stego object.
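The verification step above rests on a standard property of Hamming codes: a valid codeword has a zero syndrome, so tampering is detectable without any stored side information. A minimal sketch of the (7,4) case in Python; the paper does not specify its code parameters, so the matrices below are the textbook systematic form, not necessarily the ones used in the scheme:

```python
# (7,4) Hamming code: 4 data bits plus 3 parity bits chosen so that any
# single bit flip produces a nonzero syndrome identifying its position.
G_ROWS = [[1,0,0,0,1,1,0],   # generator matrix G = [I4 | P]
          [0,1,0,0,1,0,1],
          [0,0,1,0,0,1,1],
          [0,0,0,1,1,1,1]]
H_ROWS = [[1,1,0,1,1,0,0],   # parity-check matrix H = [P^T | I3]
          [1,0,1,1,0,1,0],
          [0,1,1,1,0,0,1]]

def encode(data):
    """4 data bits -> 7-bit codeword (vector-matrix product mod 2)."""
    return [sum(d * g for d, g in zip(data, col)) % 2
            for col in zip(*G_ROWS)]

def syndrome(word):
    """Zero syndrome <=> valid codeword; nonzero <=> tampering detected."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H_ROWS]
```

In an LSB scheme, the seven codeword bits would be written into least significant bits of vertex data; a flipped bit then yields a nonzero syndrome matching the corresponding column of H.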

  5. RELAP5/MOD3 code manual: User's guide and input requirements. Volume 2

    International Nuclear Information System (INIS)

    1995-08-01

    The RELAP5 code has been developed for best estimate transient simulation of light water reactor coolant systems during postulated accidents. The code models the coupled behavior of the reactor coolant system and the core for loss-of-coolant accidents, and operational transients, such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits simulating a variety of thermal hydraulic systems. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater systems. Volume II contains detailed instructions for code application and input data preparation

  6. Extension of the EQ3/6 computer codes to geochemical modeling of brines

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.J.; Wolery, T.J.

    1984-10-23

    Recent modifications to the EQ3/6 geochemical modeling software package provide for the use of Pitzer's equations to calculate the activity coefficients of aqueous species and the activity of water. These changes extend the range of solute concentrations over which the codes can be used to dependably calculate equilibria in geochemical systems, and permit the inclusion of ion pairs, complexes, and undissociated acids and bases as explicit component species in the Pitzer model. Comparisons of calculations made by the EQ3NR and EQ6 computer codes with experimental data confirm that the modifications not only allow the codes to accurately evaluate activity coefficients in concentrated solutions, but also permit prediction of solubility limits of evaporite minerals in brines at 25 °C and elevated temperatures. Calculations for a few salts can be made at temperatures up to approximately 300 °C, but the temperature range for most electrolytes is constrained by the availability of requisite data to 100 °C or less. The implementation of Pitzer's equations in EQ3/6 allows application of these codes to problems involving calculation of geochemical equilibria in brines, such as evaluation of the chemical environment that might be anticipated for nuclear waste canisters located in a salt repository. 26 references, 3 figures, 1 table.
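For orientation, the long-range electrostatic part of Pitzer's activity-coefficient model can be sketched in a few lines. This is only the Debye-Hückel term with Pitzer's universal b = 1.2 kg^1/2 mol^-1/2; the electrolyte-specific β and C virial terms, which the EQ3/6 extension supplies from its own database, are omitted, and the A_φ value is an approximate 25 °C figure, not taken from the codes:

```python
import math

A_PHI = 0.392   # Debye-Hueckel slope for water near 25 C (approximate)
B = 1.2         # Pitzer's universal b parameter, kg^(1/2) mol^(-1/2)

def ionic_strength(molalities, charges):
    """I = 1/2 * sum(m_i * z_i^2) over all dissolved ions."""
    return 0.5 * sum(m * z * z for m, z in zip(molalities, charges))

def f_gamma(I):
    """Long-range (Debye-Hueckel) part of Pitzer's activity-coefficient
    expression; the short-range beta/C virial terms are omitted here."""
    s = math.sqrt(I)
    return -A_PHI * (s / (1 + B * s) + (2 / B) * math.log(1 + B * s))
```

For a 1 molal 1:1 electrolyte such as NaCl, `ionic_strength([1.0, 1.0], [1, -1])` is 1.0, and `f_gamma` returns the negative long-range contribution to ln γ±.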

  7. TRANSNET -- access to radioactive and hazardous materials transportation codes and databases

    International Nuclear Information System (INIS)

    Cashwell, J.W.

    1992-01-01

    TRANSNET has been developed and maintained by Sandia National Laboratories under the sponsorship of the United States Department of Energy (DOE) Office of Environmental Restoration and Waste Management to permit outside access to computerized routing, risk and systems analysis models, and associated databases. The goal of the TRANSNET system is to enable transfer of transportation analytical methods and data to qualified users by permitting direct, timely access to the up-to-date versions of the codes and data. The TRANSNET facility comprises a dedicated computer with telephone ports on which these codes and databases are adapted, modified, and maintained. To permit the widest spectrum of outside users, TRANSNET is designed to minimize hardware and documentation requirements. The user is thus required to have an IBM-compatible personal computer, a Hayes-compatible modem with communications software, and a telephone. Maintenance and operation of the TRANSNET facility are underwritten by the program sponsor(s), as are updates to the respective models and data; thus the only charges to the user of the system are telephone hookup charges. TRANSNET provides access to the most recent versions of the models and data developed by or for Sandia National Laboratories. Code modifications that have been made since the last published documentation are noted to the user on the introductory screens. User-friendly interfaces have been developed for each of the codes and databases on TRANSNET. In addition, users are provided with default input data sets for typical problems, which can either be used directly or edited. Direct transfers of analytical or data files between codes are provided to permit the user to perform complex analyses with a minimum of input. Recent developments to the TRANSNET system include use of the system to directly pass data files between both national and international users, as well as development and integration of graphical depiction techniques.

  8. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

    In this paper, we analyze the characteristics of pseudo-random codes, taking the m sequence as a case study. Drawing on coding theory, we introduce the jamming methods and simulate the interference effect and probability model in MATLAB. Based on the length of decoding time the adversary spends, we derive the optimal formula and optimal coefficients through machine learning, and thus obtain a new optimal interference code. First, in the recognition phase, this study judges the effect of interference by simulating the length of time over the decoding period of the laser seeker. Next, we use laser active deception jamming to simulate the interference process in the tracking phase. To improve the performance of the interference, this paper simulates the model in MATLAB. We find the least number of pulse intervals that must be received, from which we determine the precise interval number of the laser pointer for m-sequence encoding. To find the shortest interval, we choose the greatest-common-divisor method. Then, combining this with the coding regularity found earlier, we restore the pulse intervals of the received pseudo-random code. Finally, we can control the time period of laser interference, obtain the optimal interference code, and increase the probability of interference.
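The two ingredients the abstract relies on, generation of an m sequence and recovery of the basic pulse interval by the greatest-common-divisor method, can be sketched as follows. The paper works in MATLAB; this is an equivalent Python illustration with an assumed degree-4 register, not the authors' actual parameters:

```python
from math import gcd
from functools import reduce

def m_sequence(taps, state, length):
    """Maximal-length sequence from a Fibonacci LFSR.
    taps: register bit indices XORed to form the feedback bit."""
    out = []
    for _ in range(length):
        out.append(state[-1])          # output the last register bit
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]      # shift, insert feedback
    return out

# Degree-4 primitive polynomial x^4 + x + 1 -> period 2^4 - 1 = 15.
seq = m_sequence(taps=[0, 3], state=[1, 0, 0, 0], length=30)

# Pulse intervals: gaps between ones; their gcd recovers the basic
# chip interval (the greatest-common-divisor method named in the paper).
ones = [i for i, b in enumerate(seq[:15]) if b]
gaps = [b - a for a, b in zip(ones, ones[1:])]
interval = reduce(gcd, gaps)
```

`seq` repeats with period 2^4 − 1 = 15 and contains eight ones and seven zeros per period, the balance property of m sequences.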

  9. SPORTS - a simple non-linear thermalhydraulic stability code

    International Nuclear Information System (INIS)

    Chatoorgoon, V.

    1986-01-01

    A simple code, called SPORTS, has been developed for two-phase stability studies. A novel method of solution of the finite difference equations was devised and incorporated, and many of the approximations that are common in other stability codes are avoided. SPORTS is believed to be accurate and efficient; both small and large time-steps are permitted, making it suitable for micro-computers. (orig.)

  10. Hanford facility dangerous waste permit application, general information portion. Revision 3

    International Nuclear Information System (INIS)

    Sonnichsen, J.C.

    1997-01-01

    For purposes of the Hanford facility dangerous waste permit application, the US Department of Energy's contractors are identified as "co-operators" and sign in that capacity (refer to Condition I.A.2. of the Dangerous Waste Portion of the Hanford Facility Resource Conservation and Recovery Act Permit). Any identification of these contractors as an "operator" elsewhere in the application is not meant to conflict with the contractors' designation as co-operators but rather is based on the contractors' contractual status with the U.S. Department of Energy, Richland Operations Office. The Dangerous Waste Portion of the initial Hanford Facility Resource Conservation and Recovery Act Permit, which incorporated five treatment, storage, and/or disposal units, was based on information submitted in the Hanford Facility Dangerous Waste Permit Application and in closure plan and closure/postclosure plan documentation. During 1995, the Dangerous Waste Portion was modified twice to incorporate another eight treatment, storage, and/or disposal units; during 1996, the Dangerous Waste Portion was modified once to incorporate another five treatment, storage, and/or disposal units. The permit modification process will be used at least annually to incorporate additional treatment, storage, and/or disposal units as permitting documentation for these units is finalized. The units to be included in annual modifications are specified in a schedule contained in the Dangerous Waste Portion of the Hanford Facility Resource Conservation and Recovery Act Permit. Treatment, storage, and/or disposal units will remain in interim status until incorporated into the Permit. The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (this document, DOE/RL-91-28) and a Unit-Specific Portion.
The scope of the Unit-Specific Portion is limited to individual operating treatment, storage, and/or disposal units for which

  11. Hanford facility dangerous waste permit application, general information portion. Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Sonnichsen, J.C.

    1997-08-21

    For purposes of the Hanford facility dangerous waste permit application, the US Department of Energy's contractors are identified as "co-operators" and sign in that capacity (refer to Condition I.A.2. of the Dangerous Waste Portion of the Hanford Facility Resource Conservation and Recovery Act Permit). Any identification of these contractors as an "operator" elsewhere in the application is not meant to conflict with the contractors' designation as co-operators but rather is based on the contractors' contractual status with the U.S. Department of Energy, Richland Operations Office. The Dangerous Waste Portion of the initial Hanford Facility Resource Conservation and Recovery Act Permit, which incorporated five treatment, storage, and/or disposal units, was based on information submitted in the Hanford Facility Dangerous Waste Permit Application and in closure plan and closure/postclosure plan documentation. During 1995, the Dangerous Waste Portion was modified twice to incorporate another eight treatment, storage, and/or disposal units; during 1996, the Dangerous Waste Portion was modified once to incorporate another five treatment, storage, and/or disposal units. The permit modification process will be used at least annually to incorporate additional treatment, storage, and/or disposal units as permitting documentation for these units is finalized. The units to be included in annual modifications are specified in a schedule contained in the Dangerous Waste Portion of the Hanford Facility Resource Conservation and Recovery Act Permit. Treatment, storage, and/or disposal units will remain in interim status until incorporated into the Permit. The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (this document, DOE/RL-91-28) and a Unit-Specific Portion.
The scope of the Unit-Specific Portion is limited to individual operating treatment, storage, and/or disposal units for which

  12. Protograph LDPC Codes for the Erasure Channel

    Science.gov (United States)

    Pollara, Fabrizio; Dolinar, Samuel J.; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews the use of protograph Low Density Parity Check (LDPC) codes for erasure channels. A protograph is a Tanner graph with a relatively small number of nodes. A "copy-and-permute" operation can be applied to the protograph to obtain larger derived graphs of various sizes. For very high code rates and short block sizes, a low asymptotic threshold criterion is not the best approach to designing LDPC codes; simple protographs with much regularity and low maximum node degrees appear to be the best choices. Quantized-rateless protograph LDPC codes can be built by careful design of the protograph such that multiple puncturing patterns will still permit message-passing decoding to proceed.
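The "copy-and-permute" operation itself is simple to state: every edge of the protograph becomes an N × N permutation block in the derived parity-check matrix, so node degrees are preserved. A sketch with an arbitrary small base matrix, not one of the presentation's actual protographs:

```python
import random

def lift(proto, N):
    """Copy-and-permute: each 1 in the protograph base matrix becomes an
    N x N permutation block; each 0 becomes an N x N zero block."""
    rows, cols = len(proto), len(proto[0])
    H = [[0] * (cols * N) for _ in range(rows * N)]
    for i in range(rows):
        for j in range(cols):
            if proto[i][j]:
                perm = list(range(N))
                random.shuffle(perm)           # one permutation per edge
                for k in range(N):
                    H[i * N + k][j * N + perm[k]] = 1
    return H

proto = [[1, 1, 1, 0],     # assumed toy base graph, 2 checks x 4 variables
         [0, 1, 1, 1]]
H = lift(proto, 4)         # derived graph is 4x larger
```

Whatever permutations are drawn, every derived check node keeps the degree of its base check, and every derived variable node keeps the degree of its base variable.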

  13. Framing and Claiming: How Information-Framing Affects Expected Social Security Claiming Behavior.

    Science.gov (United States)

    Brown, Jeffrey R; Kapteyn, Arie; Mitchell, Olivia S

    2016-03-01

    This paper provides evidence that Social Security benefit claiming decisions are strongly affected by framing and are thus inconsistent with expected utility theory. Using a randomized experiment that controls for both observable and unobservable differences across individuals, we find that the use of a "breakeven analysis" encourages early claiming. Respondents are more likely to delay when later claiming is framed as a gain, and the claiming age is anchored at older ages. Additionally, the financially less literate, individuals with credit card debt, and those with lower earnings are more influenced by framing than others.

  14. Spatially coded backscatter radiography

    International Nuclear Information System (INIS)

    Thangavelu, S.; Hussein, E.M.A.

    2007-01-01

    Conventional radiography requires access to two opposite sides of an object, which makes it unsuitable for the inspection of extended and/or thick structures (airframes, bridges, floors etc.). Backscatter imaging can overcome this problem, but the indications obtained are difficult to interpret. This paper applies the coded aperture technique to gamma-ray backscatter-radiography in order to enhance the detectability of flaws. This spatial coding method involves the positioning of a mask with closed and open holes to selectively permit or block the passage of radiation. The obtained coded-aperture indications are then mathematically decoded to detect the presence of anomalies. Indications obtained from Monte Carlo calculations were utilized in this work to simulate radiation scattering measurements. These simulated measurements were used to investigate the applicability of this technique to the detection of flaws by backscatter radiography
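A one-dimensional toy version shows how the open/closed hole pattern and the mathematical decoding fit together: with a mask whose cyclic cross-correlation against a balanced decoding array is delta-like (here a length-15 m-sequence, an assumed stand-in for the paper's actual mask), correlating the detector readings with the decoding array returns the source distribution exactly:

```python
MASK = [0,0,0,1,1,1,1,0,1,0,1,1,0,0,1]   # m-sequence mask: open (1) / closed (0)
N = len(MASK)
DECODE = [2 * m - 1 for m in MASK]        # balanced +/-1 decoding array

def record(obj):
    """Detector reading: cyclic correlation of the source with the mask."""
    return [sum(obj[k] * MASK[(k + j) % N] for k in range(N)) for j in range(N)]

def reconstruct(det):
    """Correlate with the decoding array; the mask's delta-like
    cross-correlation returns the source scaled by the open-hole count."""
    holes = sum(MASK)
    return [sum(det[j] * DECODE[(j + i) % N] for j in range(N)) / holes
            for i in range(N)]
```

For an m-sequence mask the mask/decoder cross-correlation is exactly `holes` at zero shift and 0 elsewhere, which is why the reconstruction is exact in this noiseless sketch.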

  15. Construction of Quasi-Cyclic LDPC Codes Based on Fundamental Theorem of Arithmetic

    Directory of Open Access Journals (Sweden)

    Hai Zhu

    2018-01-01

    Full Text Available Quasi-cyclic (QC) LDPC codes play an important role in 5G communications and have been chosen as the standard codes for the 5G enhanced mobile broadband (eMBB) data channel. In this paper, we study the construction of QC LDPC codes based on an arbitrary given expansion factor (or lifting degree). First, we analyze the cycle structure of QC LDPC codes and give the necessary and sufficient condition for the existence of short cycles. Based on the fundamental theorem of arithmetic in number theory, we divide the integer factorization into three cases and present three classes of QC LDPC codes accordingly. Furthermore, a general construction method of QC LDPC codes with girth of at least 6 is proposed. Numerical results show that the constructed QC LDPC codes perform well over the AWGN channel when decoded with iterative algorithms.
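The short-cycle condition the paper builds on is the classical 4-cycle test for exponent matrices: with circulant size (lifting degree) Z, girth ≥ 6 holds exactly when no 2 × 2 submatrix of shift values has alternating differences summing to 0 mod Z. A direct check, assuming all entries correspond to nonzero circulants:

```python
from itertools import combinations

def girth_at_least_6(P, Z):
    """4-cycle test for a QC-LDPC exponent matrix P with circulant size Z:
    girth >= 6 iff P[i1][j1] - P[i1][j2] + P[i2][j2] - P[i2][j1] != 0 (mod Z)
    for every pair of rows i1 < i2 and columns j1 < j2 (all-nonzero circulants
    assumed)."""
    for i1, i2 in combinations(range(len(P)), 2):
        for j1, j2 in combinations(range(len(P[0])), 2):
            if (P[i1][j1] - P[i1][j2] + P[i2][j2] - P[i2][j1]) % Z == 0:
                return False
    return True
```

The matrices in the assertions below are illustrative, not the paper's constructions.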

  16. SyncClaimService

    Data.gov (United States)

    Department of Veterans Affairs — Provides various methods to sync Claim related data for NWQ processing. It includes web operations to get Claims, get Unique Contention Classifications, get Unique...

  17. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  18. RELAP5/MOD3 code manual. Volume 4, Models and correlations

    International Nuclear Information System (INIS)

    1995-08-01

    The RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during postulated accidents. The code models the coupled behavior of the reactor coolant system and the core for loss-of-coolant accidents and operational transients such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits simulating a variety of thermal hydraulic systems. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater systems. RELAP5/MOD3 code documentation is divided into seven volumes: Volume I presents modeling theory and associated numerical schemes; Volume II details instructions for code application and input data preparation; Volume III presents the results of developmental assessment cases that demonstrate and verify the models used in the code; Volume IV discusses in detail RELAP5 models and correlations; Volume V presents guidelines that have evolved over the past several years through the use of the RELAP5 code; Volume VI discusses the numerical scheme used in RELAP5; and Volume VII presents a collection of independent assessment calculations

  19. Exercise-induced bronchospasm: coding and billing for physician services.

    Science.gov (United States)

    Pohlig, Carol

    2009-01-01

    Physician reporting of the service to insurance companies for reimbursement is multifaceted and perplexing to those who do not understand the factors to consider. Test selection should be individualized based on the patient's history and/or needs. Federal regulations concerning physician supervision of diagnostic tests mandate different levels of physician supervision based on the type and complexity of the test. Many factors play a key role in physician claim submission. These include testing location, component services, coding edits, and additional visits. Medical necessity of the service(s) must also be demonstrated for payer consideration and reimbursement. The following article reviews various tests for exercise-induced bronchospasm and focuses on issues to assist the physician in reporting the services accurately and appropriately.

  20. Contributions at the Tripoli Monte Carlo code qualifying on critical experiences and at neutronic interaction study of fissile units

    International Nuclear Information System (INIS)

    Nouri, A.

    1994-01-01

    Criticality studies in the nuclear fuel cycle are based on the Monte Carlo method. These codes use multigroup cross sections, which can be verified against experimental configurations or against reference codes such as Tripoli 2. In the Tripoli 2 code, nuclear data carry attached uncertainties and call for validation against critical experiments; this is one of the aims of this thesis. To calculate the keff of interacting fissile units we used the multigroup Monte Carlo code Moret, which has convergence problems. A new estimator of reaction rates permits a better approximation of the neutron exchange between units, and a new importance function has been tested. 2 annexes

  1. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    Science.gov (United States)

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, due to employing the Gauss-Jordan elimination method, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method such that there is no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficients entry instead of n into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations. As a result, peers incur very low computational complexity, and MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay, and initial startup delay.
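The mechanics being optimized can be sketched over a prime field (GF(257) here for readability; practical RNC uses GF(2^8)). The Vandermonde construction below is one way to guarantee a dependency-free coefficient matrix; it merely stands in for MATIN's actual generation method, and the Gauss-Jordan solver is precisely the decoding cost the framework seeks to reduce:

```python
P = 257  # prime field; stands in for the GF(2^8) used in practice

def vandermonde(n):
    """Distinct nonzero evaluation points -> provably invertible matrix,
    hence no linear dependency among coefficient rows (illustrative only,
    NOT MATIN's construction)."""
    return [[pow(x, j, P) for j in range(n)] for x in range(1, n + 1)]

def combine(row, blocks):
    """One encoded block = linear combination of all source blocks."""
    return [sum(c * blk[k] for c, blk in zip(row, blocks)) % P
            for k in range(len(blocks[0]))]

def solve(coeffs, encoded):
    """Gauss-Jordan elimination over GF(P) on [coeffs | encoded]:
    the decoding step whose complexity MATIN aims to reduce."""
    n = len(coeffs)
    A = [row[:] + enc[:] for row, enc in zip(coeffs, encoded)]
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] % P)
        A[col], A[piv] = A[piv], A[col]
        inv = pow(A[col][col], P - 2, P)          # modular inverse (Fermat)
        A[col] = [x * inv % P for x in A[col]]
        for r in range(n):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [(x - f * y) % P for x, y in zip(A[r], A[col])]
    return [row[n:] for row in A]                 # recovered source blocks
```

Encoding n blocks and decoding them back is then a round trip: the receiver needs only the coefficient rows and the encoded payloads.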

  2. A neutron spectrum unfolding computer code based on artificial neural networks

    International Nuclear Information System (INIS)

    Ortiz-Rodríguez, J.M.; Reyes Alfaro, A.; Reyes Haro, A.; Cervantes Viramontes, J.M.; Vega-Carrillo, H.R.

    2014-01-01

    The Bonner Spheres Spectrometer consists of a thermal neutron sensor placed at the center of a number of moderating polyethylene spheres of different diameters. From the measured readings, information can be derived about the spectrum of the neutron field where the measurements were made. Disadvantages of the Bonner system are the weight associated with each sphere and the need to irradiate the spheres sequentially, requiring long exposure periods. Provided a well-established response matrix and adequate irradiation conditions, the most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Intelligence, mainly Artificial Neural Networks, have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This code, called the Neutron Spectrometry and Dosimetry with Artificial Neural Networks unfolding code, was designed with a graphical interface. The core of the code is an embedded neural network architecture previously optimized using the robust design of artificial neural networks methodology. The code is easy to use, friendly, and intuitive to the user. It was designed for a Bonner sphere system based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. As entrance data for unfolding the neutron spectrum, only seven count rates measured with the seven Bonner spheres are required; simultaneously, the code calculates 15 dosimetric quantities as well as the total flux for radiation protection purposes. 
This code generates a full report with all information of the unfolding
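Architecturally, the unfolding network maps the seven sphere count rates directly to the 60-bin spectrum. The skeleton below shows only a forward pass with that input/output dimensionality; the hidden-layer size, activation, and random placeholder weights are assumptions for illustration, whereas the actual code uses an architecture and weights fixed by the robust-design optimization:

```python
import math
import random

random.seed(0)
N_IN, N_HID, N_OUT = 7, 12, 60   # 7 Bonner counts -> 60 energy bins
                                  # (hidden width is a placeholder)
W1 = [[random.gauss(0, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
W2 = [[random.gauss(0, 0.5) for _ in range(N_HID)] for _ in range(N_OUT)]

def unfold(counts):
    """One-hidden-layer forward pass: 7 count rates in, 60-bin spectrum out."""
    h = [math.tanh(sum(w * c for w, c in zip(row, counts))) for row in W1]
    out = [sum(w * v for w, v in zip(row, h)) for row in W2]
    return [max(o, 0.0) for o in out]   # physical spectra are nonnegative
```

With trained weights, the same structure also yields the 15 dosimetric quantities via additional output nodes or a post-multiplication by fluence-to-dose conversion coefficients.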

  3. 3D Scan-Based Wavelet Transform and Quality Control for Video Coding

    Directory of Open Access Journals (Sweden)

    Parisot Christophe

    2003-01-01

    Full Text Available Wavelet coding has been shown to achieve better compression than DCT coding and moreover allows scalability. 2D DWT can be easily extended to 3D and thus applied to video coding. However, 3D subband coding of video suffers from two drawbacks. The first is the amount of memory required for coding large 3D blocks; the second is the lack of temporal quality due to the temporal splitting of the sequence. In fact, 3D block-based video coders produce jerks, which appear at the temporal borders of blocks during video playback. In this paper, we propose a new temporal scan-based wavelet transform method for video coding that combines the advantages of wavelet coding (performance, scalability) with acceptably reduced memory requirements, no additional CPU complexity, and no jerks. We also propose an efficient quality-allocation procedure to ensure a constant quality over time.
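The temporal side of the scheme can be illustrated with one level of a Haar transform along time: frames are consumed pairwise as they arrive, which is what keeps the memory footprint bounded in a scan-based (rather than large-3D-block) organization. A sketch on flattened frames, with Haar chosen for brevity rather than being the paper's filter:

```python
def haar_temporal(frames):
    """One level of a temporal Haar DWT: consecutive frame pairs ->
    a low band (pairwise averages) and a high band (pairwise differences)."""
    lo, hi = [], []
    for a, b in zip(frames[0::2], frames[1::2]):
        lo.append([(x + y) / 2 for x, y in zip(a, b)])
        hi.append([(x - y) / 2 for x, y in zip(a, b)])
    return lo, hi

def inverse(lo, hi):
    """Perfect reconstruction: a = mean + diff, b = mean - diff."""
    frames = []
    for m, d in zip(lo, hi):
        frames.append([x + y for x, y in zip(m, d)])
        frames.append([x - y for x, y in zip(m, d)])
    return frames
```

Because each pair is transformed independently, only two frames need be buffered per level, rather than a whole 3D block.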

  4. Framing and Claiming: How Information-Framing Affects Expected Social Security Claiming Behavior

    Science.gov (United States)

    Brown, Jeffrey R.; Kapteyn, Arie; Mitchell, Olivia S.

    2017-01-01

    This paper provides evidence that Social Security benefit claiming decisions are strongly affected by framing and are thus inconsistent with expected utility theory. Using a randomized experiment that controls for both observable and unobservable differences across individuals, we find that the use of a “breakeven analysis” encourages early claiming. Respondents are more likely to delay when later claiming is framed as a gain, and the claiming age is anchored at older ages. Additionally, the financially less literate, individuals with credit card debt, and those with lower earnings are more influenced by framing than others. PMID:28579641

  5. Using 'big data' to validate claims made in the pharmaceutical approval process.

    Science.gov (United States)

    Wasser, Thomas; Haynes, Kevin; Barron, John; Cziraky, Mark

    2015-01-01

    Big Data in the healthcare setting refers to the storage, assimilation, and analysis of large quantities of information regarding patient care. These data can be collected and stored in a wide variety of ways, including electronic medical records collected at the patient bedside or medical records that are coded and passed to insurance companies for reimbursement. When these data are processed, it is possible to validate claims as a part of the regulatory review process regarding the anticipated performance of medications and devices. To properly analyze claims by manufacturers and others, there is a need to express claims in terms that are testable in a timeframe that is useful and meaningful to formulary committees. Claims for the comparative benefits and costs, including budget impact, of products and devices need to be expressed in measurable terms, ideally in the context of submission or validation protocols. Claims should be either consistent with accessible Big Data or able to support observational studies where Big Data identifies target populations. Protocols should identify, in disaggregated terms, key variables that would lead to direct or proxy validation. Once these variables are identified, Big Data can be used to query massive quantities of data in the validation process. Research can be passive or active in nature: passive, where the data are collected retrospectively; active, where the researcher is prospectively looking for indicators of co-morbid conditions, side-effects, or adverse events, testing these indicators to determine whether claims are within desired ranges set forth by the manufacturer. Additionally, Big Data can be used to assess the effectiveness of therapy through health insurance records. This, for example, could indicate that disease or co-morbid conditions cease to be treated. 
Understanding the basic strengths and weaknesses of Big Data in the claim validation process provides a glimpse of the value that this research
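The active-surveillance idea described above can be sketched as a simple range check over coded claims records. The record layout, diagnosis codes, and claimed range below are all hypothetical, invented purely for illustration:

```python
# Hypothetical claims records: one dict per patient, with diagnosis
# codes as they might appear after insurance coding.
claims = [
    {"patient": 1, "codes": {"E11.9", "I10"}},   # diabetes + hypertension
    {"patient": 2, "codes": {"I10"}},
    {"patient": 3, "codes": {"E11.9"}},
    {"patient": 4, "codes": {"J45.909"}},
]

def indicator_rate(records, code):
    """Fraction of records carrying a given indicator code."""
    hits = sum(1 for r in records if code in r["codes"])
    return hits / len(records)

def claim_within_range(records, code, low, high):
    """Active validation: does the observed indicator rate fall inside
    the range set forth by the manufacturer's claim?"""
    rate = indicator_rate(records, code)
    return low <= rate <= high, rate

ok, rate = claim_within_range(claims, "E11.9", low=0.3, high=0.6)
# rate = 0.5 -> inside the (invented) claimed 30-60% range
```

In a real passive design the same check would run retrospectively over an existing extract; in an active design the records would accrue prospectively and the check would be re-run as data arrive.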

  6. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems, as well as the thiol redox proteome, in space and time in biological systems. The code is richly elaborated in oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology for assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. The complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis for a modern redox medicine.

  7. Warped Discrete Cosine Transform-Based Low Bit-Rate Block Coding Using Image Downsampling

    Directory of Open Access Journals (Sweden)

    Ertürk Sarp

    2007-01-01

This paper presents warped discrete cosine transform (WDCT)-based low bit-rate block coding using image downsampling. While the WDCT aims to improve the performance of the conventional DCT by frequency warping, it has only been applicable to high bit-rate coding applications because of the overhead required to define the parameters of the warping filter. Recently, low bit-rate block coding based on image downsampling prior to block coding, followed by upsampling after decoding, has been proposed to improve the compression performance of low bit-rate block coders. This paper demonstrates that superior performance can be achieved if the WDCT is used in conjunction with image downsampling-based block coding for low bit-rate applications.
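The downsampling-based pipeline this work builds on can be sketched in one dimension. A plain (unwarped) DCT stands in for the WDCT here, and the block size, quantizer step, and averaging/repetition resamplers are illustrative choices, not the paper's:

```python
import math

def dct_matrix(n):
    """Orthonormal DCT-II basis; rows are basis vectors."""
    return [[math.sqrt((1 if k == 0 else 2) / n)
             * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
             for i in range(n)] for k in range(n)]

def transform(block, basis):
    return [sum(basis[k][i] * block[i] for i in range(len(block)))
            for k in range(len(block))]

def inverse(coeffs, basis):
    n = len(coeffs)
    return [sum(basis[k][i] * coeffs[k] for k in range(n)) for i in range(n)]

def code_signal(x, q=8.0):
    """Downsample by 2, 8-point block DCT, uniform quantization,
    inverse DCT, then upsample by sample repetition.
    Assumes len(x) is a multiple of 16 so all blocks are full."""
    basis = dct_matrix(8)
    down = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    out = []
    for b in range(0, len(down), 8):
        coeffs = transform(down[b:b + 8], basis)
        coeffs = [q * round(c / q) for c in coeffs]   # quantize
        out.extend(inverse(coeffs, basis))
    return [v for v in out for _ in (0, 1)]           # upsample x2
```

With a very small `q` the pipeline reduces to downsample/upsample, so any residual error comes from the resolution loss alone; at low bit rates the quantization saving on the half-length signal is what makes the scheme pay off.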

  8. Computer codes for beam dynamics analysis of cyclotronlike accelerators

    Science.gov (United States)

    Smirnov, V.

    2017-12-01

Computer codes suitable for the study of beam dynamics in cyclotronlike (classical and isochronous cyclotrons, synchrocyclotrons, and fixed-field alternating-gradient) accelerators are reviewed. Computer modeling of cyclotron segments, such as the central zone, acceleration region, and extraction system, is considered. The author does not claim to give a full and detailed description of the methods and algorithms used in the codes. Special attention is paid to codes already proven and confirmed at existing accelerator facilities. Descriptions of programs prepared in well-known accelerator centers worldwide are provided. The basic features of the programs available to users and the limitations of their applicability are described.

  9. Medical chart validation of an algorithm for identifying multiple sclerosis relapse in healthcare claims.

    Science.gov (United States)

    Chastek, Benjamin J; Oleen-Burkey, Merrikay; Lopez-Bresnahan, Maria V

    2010-01-01

Relapse is a common measure of disease activity in relapsing-remitting multiple sclerosis (MS). The objective of this study was to test the content validity of an operational algorithm for detecting relapse in claims data. A claims-based relapse detection algorithm was tested by comparing its detection rate over a 1-year period with relapses identified based on medical chart review. According to the algorithm, MS patients in a US healthcare claims database who had either (1) a primary claim for MS during hospitalization or (2) a corticosteroid claim following an MS-related outpatient visit were designated as having a relapse. Patient charts were examined for explicit indication of relapse or care suggestive of relapse. Positive and negative predictive values were calculated. Medical charts were reviewed for 300 MS patients, half of whom had a relapse according to the algorithm. The claims-based criteria correctly classified 67.3% of patients with relapses (positive predictive value) and 70.0% of patients without relapses (negative predictive value; kappa 0.373), supporting the validity of the operational algorithm. Limitations of the algorithm include lack of differentiation between relapsing-remitting MS and other types, and that it does not incorporate measures of function and disability. The claims-based algorithm appeared to successfully detect moderate-to-severe MS relapse. This validated definition can be applied to future claims-based MS studies.
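The reported accuracy figures can be reproduced from a 2x2 algorithm-vs-chart confusion table. The counts below are reconstructed (hypothetical) from the stated cohort size (300 patients, half flagged by the algorithm) and the PPV and NPV, and they yield the reported kappa of 0.373:

```python
def ppv_npv_kappa(tp, fp, tn, fn):
    """Positive/negative predictive value and Cohen's kappa
    for an algorithm-vs-chart 2x2 confusion table."""
    n = tp + fp + tn + fn
    ppv = tp / (tp + fp)                  # flagged and chart-confirmed
    npv = tn / (tn + fn)                  # not flagged, chart-negative
    p_obs = (tp + tn) / n                 # observed agreement
    # chance agreement from the marginal totals
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return ppv, npv, (p_obs - p_exp) / (1 - p_exp)

# Counts consistent with the abstract: 150 flagged, 150 not flagged
ppv, npv, kappa = ppv_npv_kappa(tp=101, fp=49, tn=105, fn=45)
# -> ppv 0.673, npv 0.700, kappa 0.373
```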

  10. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. The book treats convolutional, turbo, and low-density parity-check (LDPC) coding and polar codes in a unified framework, covers advanced research-related developments such as spatial coupling, and focuses on the algorithmic and implementation aspects of error control coding.

  11. 20 CFR 405.410 - Selecting claims for Decision Review Board review.

    Science.gov (United States)

    2010-04-01

    ... will not review claims based on the identity of the administrative law judge who decided the claim. (b... Decision Review Board review. (a)(1) The Board may review your claim if the administrative law judge made a decision under §§ 405.340 or 405.370 of this part, regardless of whether the administrative law judge's...

  12. 32 CFR 842.121 - Referring a claim to the US Attorney.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Referring a claim to the US Attorney. 842.121 Section 842.121 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE CLAIMS AND... to the US Attorney. Only HQ USAF/JACC authorizes referral of a claim to the US Attorney. The base SJA...

  13. European consumers and health claims: attitudes, understanding and purchasing behaviour.

    Science.gov (United States)

    Wills, Josephine M; Storcksdieck genannt Bonsmann, Stefan; Kolka, Magdalena; Grunert, Klaus G

    2012-05-01

Health claims on food products are often used as a means to highlight scientifically proven health benefits associated with consuming those foods. But do consumers understand and trust health claims? This paper provides an overview of recent research on consumers and health claims, including attitudes, understanding and purchasing behaviour. A majority of studies investigated selective product-claim combinations, with ambiguous findings apart from consumers' self-reported generic interest in health claims. There are clear indications that consumer responses differ substantially according to the nature of the carrier product, the type of health claim, the functional ingredient used, or a combination of these components. Health claims tend to be perceived more positively when linked to a product with an overall positive health image. While some studies demonstrate higher perceived credibility for general health claims (e.g. omega-3 and brain development) than for disease risk reduction claims (e.g. bioactive peptides to reduce risk of heart disease), others report the opposite. Inconsistent evidence also exists on the correlation between having a positive attitude towards products with health claims and purchase intentions. Familiarity with the functional ingredient and/or its claimed health effect seems to result in a more favourable evaluation. Better nutritional knowledge, however, does not automatically lead to a positive attitude towards products carrying health messages. Legislation in the European Union requires that the claim is understood by the average consumer. As most studies on consumers' understanding of health claims are based on subjective understanding, this remains an area for more investigation.

  14. Context based Coding of Quantized Alpha Planes for Video Objects

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2002-01-01

In object based video, each frame is a composition of objects that are coded separately. The composition is performed through the alpha plane that represents the transparency of the object. We present an alternative to MPEG-4 for coding of alpha planes that considers their specific properties. Comparisons in terms of rate and distortion are provided, showing that the proposed coding scheme for still alpha planes is better than the algorithms for I-frames used in MPEG-4.
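Context-based coding of a binary alpha plane can be illustrated with a sequential Krichevsky-Trofimov estimator. The two-pixel (left, above) context below is a deliberate simplification of the larger template contexts such coders actually use; it shows why a shaped object costs far fewer bits than raw storage:

```python
import math

def context_code_length(plane):
    """Ideal adaptive code length (bits) for a binary alpha plane,
    using a (left, above) context and per-context KT estimates.
    Out-of-bounds neighbors are taken as 0."""
    counts = {}                      # context -> [zeros, ones]
    bits = 0.0
    for r, row in enumerate(plane):
        for c, pixel in enumerate(row):
            left = row[c - 1] if c > 0 else 0
            above = plane[r - 1][c] if r > 0 else 0
            n0, n1 = counts.get((left, above), [0, 0])
            # Krichevsky-Trofimov estimate of P(pixel | context)
            p = ((n1 if pixel else n0) + 0.5) / (n0 + n1 + 1)
            bits -= math.log2(p)     # ideal arithmetic-code cost
            counts[(left, above)] = [n0 + (pixel == 0), n1 + (pixel == 1)]
    return bits

# A solid 8x8 object codes in far fewer bits than the 64 of raw storage
solid = [[1] * 8 for _ in range(8)]
```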

  15. Room Heat-Up Analysis with GOTHIC code

    International Nuclear Information System (INIS)

    Jimenez, G.; Olza, J. M.

    2010-01-01

The GOTHIC™ computer code is a state-of-the-art program for modeling multiphase, multicomponent fluid flow. GOTHIC is rapidly becoming the industry-standard code for performing both containment design basis accident (DBA) analyses and analyses to support equipment qualification. GOTHIC has a flexible noding structure that allows both lumped-parameter and 3-D modeling capabilities. Multidimensional analysis capabilities greatly enhance the study of noncondensable gases and stratification and permit the calculation of flow field details within any given volume.

  16. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
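The total-probability composition such a framework formalizes can be sketched numerically. The two hazards, their probabilities, and the conditional failure rates below are invented for illustration; independence of the hazardous conditions is an assumption of this sketch, whereas the paper's framework also handles their interactions:

```python
from itertools import product

def prob_safe(hazards, fail_given):
    """P(system satisfies its property), composing independent hazardous
    conditions by total probability over all present/absent combinations.

    hazards    : {name: P(condition present)}
    fail_given : frozenset of present conditions ->
                 P(property violated | exactly those conditions)."""
    names = sorted(hazards)
    total = 0.0
    for bits in product([0, 1], repeat=len(names)):
        present = frozenset(n for n, b in zip(names, bits) if b)
        p_state = 1.0
        for n, b in zip(names, bits):
            p_state *= hazards[n] if b else 1 - hazards[n]
        total += p_state * (1 - fail_given.get(present, 0.0))
    return total

# Two invented hazards for a conflict-detection property
hazards = {"sensor_noise": 0.1, "late_update": 0.05}
fail_given = {
    frozenset(): 0.0,
    frozenset({"sensor_noise"}): 0.01,
    frozenset({"late_update"}): 0.02,
    frozenset({"sensor_noise", "late_update"}): 0.2,
}
p = prob_safe(hazards, fail_given)   # -> 0.99715
```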

  17. 38 CFR 3.316 - Claims based on chronic effects of exposure to mustard gas and Lewisite.

    Science.gov (United States)

    2010-07-01

    ... effects of exposure to mustard gas and Lewisite. 3.316 Section 3.316 Pensions, Bonuses, and Veterans... Compensation Ratings and Evaluations; Service Connection § 3.316 Claims based on chronic effects of exposure to mustard gas and Lewisite. (a) Except as provided in paragraph (b) of this section, exposure to the...

  18. A neutron spectrum unfolding computer code based on artificial neural networks

    Science.gov (United States)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2014-02-01

The Bonner Spheres Spectrometer consists of a thermal neutron sensor placed at the center of a number of moderating polyethylene spheres of different diameters. From the measured readings, information can be derived about the spectrum of the neutron field where the measurements were made. Disadvantages of the Bonner system are the weight associated with each sphere and the need to irradiate the spheres sequentially, requiring long exposure periods. Provided a well-established response matrix and adequate irradiation conditions, the most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Intelligence, mainly Artificial Neural Networks, have been widely investigated. In this work, a neutron spectrum unfolding code based on neural net technology is presented. This code, called Neutron Spectrometry and Dosimetry with Artificial Neural Networks, was designed with a graphical interface and is easy, friendly, and intuitive to use. The core of the code is an embedded neural network architecture previously optimized using the robust design of artificial neural networks methodology. The code was designed for a Bonner Sphere System based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. Its main feature is that, as input data for unfolding the neutron spectrum, only seven count rates measured with seven Bonner spheres are required; simultaneously the code calculates 15 dosimetric quantities as well as the total flux for radiation protection purposes. 
This code generates a full report with all information of the unfolding in
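The unfolding step described above amounts to a trained feedforward map from seven count rates to a 60-bin spectrum. The sketch below uses random, untrained weights, an invented hidden-layer size, and a sigmoid activation purely to show the shapes involved; only the 7-input/60-output dimensions come from the abstract:

```python
import math, random

random.seed(0)

def dense(x, weights, biases):
    """One fully connected layer with a sigmoid activation."""
    return [1 / (1 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
            for row, b in zip(weights, biases)]

def make_layer(n_in, n_out):
    w = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
    b = [random.uniform(-1, 1) for _ in range(n_out)]
    return w, b

# 7 Bonner-sphere count rates -> hidden layer -> 60 energy bins
w1, b1 = make_layer(7, 12)    # hidden size 12 is an arbitrary choice
w2, b2 = make_layer(12, 60)

def unfold(count_rates):
    """Map seven count rates to a 60-bin spectrum estimate."""
    return dense(dense(count_rates, w1, b1), w2, b2)

spectrum = unfold([0.8, 1.1, 1.4, 1.2, 0.9, 0.6, 0.3])
# len(spectrum) == 60; values are meaningless without trained weights
```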

  19. Lossy to lossless object-based coding of 3-D MRI data.

    Science.gov (United States)

    Menegaz, Gloria; Thiran, Jean-Philippe

    2002-01-01

We propose a fully three-dimensional (3-D) object-based coding system exploiting the diagnostic relevance of the different regions of the volumetric data for rate allocation. The data are first decorrelated via a 3-D discrete wavelet transform. The implementation via the lifting steps scheme allows integer-to-integer mapping, enabling lossless coding, and facilitates the definition of the object-based inverse transform. The coding process assigns disjoint segments of the bitstream to the different objects, which can be independently accessed and reconstructed at any up-to-lossless quality. Two fully 3-D coding strategies are considered: embedded zerotree coding (EZW-3D) and multidimensional layered zero coding (MLZC), both generalized for region of interest (ROI)-based processing. In order to avoid artifacts along region boundaries, some extra coefficients must be encoded for each object. This gives rise to an overhead in the bitstream with respect to the case where the volume is encoded as a whole. The amount of such extra information depends on both the filter length and the decomposition depth. The system is characterized on a set of head magnetic resonance images. Results show that MLZC and EZW-3D have competitive performances. In particular, the best MLZC mode outperforms the other state-of-the-art techniques on one of the datasets for which results are available in the literature.
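The integer-to-integer mapping that lifting provides can be illustrated with the simplest case, the S-transform (integer Haar) in one dimension; the longer biorthogonal filters used in 3-D coders follow the same predict/update pattern, and exact reconstruction holds by construction:

```python
def s_transform(x):
    """Forward integer Haar via lifting: predict then update.
    Input length must be even; returns (approximation, detail)."""
    even, odd = x[0::2], x[1::2]
    d = [o - e for o, e in zip(odd, even)]          # predict step
    s = [e + (di >> 1) for e, di in zip(even, d)]   # update step (integer)
    return s, d

def s_inverse(s, d):
    """Exact integer reconstruction: undo update, then undo predict."""
    even = [si - (di >> 1) for si, di in zip(s, d)]
    odd = [di + e for di, e in zip(d, even)]
    return [v for pair in zip(even, odd) for v in pair]

x = [12, 14, 200, 202, 7, 3, 0, 255]
s, d = s_transform(x)
assert s_inverse(s, d) == x          # lossless by construction
```

Because the same rounded value `di >> 1` is added in the forward pass and subtracted in the inverse, the rounding cancels exactly, which is what makes lossless coding possible despite the non-integer filter gains.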

  20. A novel construction method of QC-LDPC codes based on CRT for optical communications

    Science.gov (United States)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4851, 4546) code is respectively 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB more than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32640, 30592) code in ITU-T G.975.1, the QC-LDPC(3664, 3436) code constructed by the improved combining construction method based on the CRT, and the irregular QC-LDPC(3843, 3603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all these five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4851, 4546) code constructed by the proposed construction method has excellent error-correction performance, and can be more suitable for optical transmission systems.
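The CRT step at the heart of such constructions combines circulant shift exponents defined modulo coprime circulant sizes into one exponent modulo their product. A minimal sketch with invented base matrices follows; the paper's actual construction involves additional structure (girth and rate constraints) not shown here:

```python
def egcd(a, b):
    """Extended Euclid: returns (g, u, v) with u*a + v*b == g."""
    if b == 0:
        return a, 1, 0
    g, u, v = egcd(b, a % b)
    return g, v, u - (a // b) * v

def crt_pair(r1, m1, r2, m2):
    """Smallest x with x % m1 == r1 and x % m2 == r2 (m1, m2 coprime)."""
    g, u, _ = egcd(m1, m2)           # u is m1's inverse modulo m2
    assert g == 1, "moduli must be coprime"
    return (r1 + (r2 - r1) * u % m2 * m1) % (m1 * m2)

def combine_exponent_matrices(E1, m1, E2, m2):
    """Entry-wise CRT combination of two shift-exponent matrices,
    giving circulants of size m1*m2."""
    return [[crt_pair(a, m1, b, m2) for a, b in zip(r1, r2)]
            for r1, r2 in zip(E1, E2)]

# Invented 2x3 base exponent matrices over circulant sizes 5 and 7
E = combine_exponent_matrices([[0, 1, 2], [3, 4, 0]], 5,
                              [[0, 2, 4], [1, 3, 5]], 7)
```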

  1. 32 CFR 536.77 - Applicable law for claims under the Military Claims Act.

    Science.gov (United States)

    2010-07-01

    ... contributory negligence be interpreted and applied according to the law of the place of the occurrence... 32 National Defense 3 2010-07-01 2010-07-01 true Applicable law for claims under the Military... Act § 536.77 Applicable law for claims under the Military Claims Act. (a) General principles—(1) Tort...

  2. SCDAP/RELAP5/MOD 3.1 code manual: Interface theory. Volume 1

    International Nuclear Information System (INIS)

    Coryell, E.W.

    1995-06-01

The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and fission product release during a severe accident transient, as well as large and small break loss-of-coolant accidents and operational transients such as anticipated transient without SCRAM, loss of off-site power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. This volume describes the organization and manner of the interface between severe accident models, which are resident in the SCDAP portion of the code, and hydrodynamic models, which are resident in the RELAP5 portion of the code. A description of the organization and structure of SCDAP/RELAP5 is presented. Additional information is provided regarding the manner in which models in one portion of the code impact other parts of the code, and models which are dependent on and derive information from other subcodes

  3. 40 CFR 233.21 - General permits.

    Science.gov (United States)

    2010-07-01

... ensure compliance with existing permit conditions and any reporting, monitoring, or prenotification... apply for an individual permit. This discretionary authority will be based on concerns for the aquatic environment including compliance with paragraph (b) of this section and the 404(b)(1) Guidelines (40 CFR part...

  4. Field-based tests of geochemical modeling codes: New Zealand hydrothermal systems

    International Nuclear Information System (INIS)

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1993-12-01

Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal fields suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions

  5. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit an amount of required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one, the Combined Compression Technique, to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead incurred). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. 
In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures
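A Huffman stage like the one underlying these techniques can be sketched over byte-sized patterns. The pattern stream below is invented, and the ratio ignores the decoding-table overhead that the article's techniques specifically work to reduce:

```python
import heapq
from collections import Counter

def huffman_lengths(freqs):
    """Optimal prefix-code length (bits) per symbol from frequencies."""
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    # heap entries: (weight, tiebreak, {symbol: depth so far})
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**a, **b}.items()}
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def compression_ratio(patterns, bits_per_pattern=8):
    """Compressed size / original size, ignoring table overhead."""
    freqs = Counter(patterns)
    lengths = huffman_lengths(freqs)
    compressed = sum(freqs[s] * lengths[s] for s in freqs)
    return compressed / (len(patterns) * bits_per_pattern)

# Invented, heavily skewed pattern stream (skew is what Huffman exploits)
stream = [0x00] * 60 + [0xFF] * 25 + [0x0F] * 10 + [0xF0] * 5
ratio = compression_ratio(stream)   # well below 1 for a skewed stream
```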

  6. A Monte Carlo burnup code linking MCNP and REBUS

    International Nuclear Information System (INIS)

    Hanan, N.A.; Olson, A.P.; Pond, R.B.; Matos, J.E.

    1998-01-01

The REBUS-3 burnup code, used in the ANL RERTR Program, is a very general code that uses diffusion theory (DIF3D) to obtain the fluxes required for reactor burnup analyses. Diffusion theory works well for most reactors. However, to include the effects of exact geometry and strong absorbers that are difficult to model using diffusion theory, a Monte Carlo method is required. MCNP, a general-purpose, generalized-geometry, time-dependent, Monte Carlo transport code, is the most widely used Monte Carlo code. This paper presents a linking of the MCNP code and the REBUS burnup code to perform these difficult analyses. The linked code will permit the use of the full capabilities of REBUS which include non-equilibrium and equilibrium burnup analyses. Results of burnup analyses using this new linked code are also presented. (author)

  7. A Monte Carlo burnup code linking MCNP and REBUS

    International Nuclear Information System (INIS)

    Hanan, N. A.

    1998-01-01

    The REBUS-3 burnup code, used in the ANL RERTR Program, is a very general code that uses diffusion theory (DIF3D) to obtain the fluxes required for reactor burnup analyses. Diffusion theory works well for most reactors. However, to include the effects of exact geometry and strong absorbers that are difficult to model using diffusion theory, a Monte Carlo method is required. MCNP, a general-purpose, generalized-geometry, time-dependent, Monte Carlo transport code, is the most widely used Monte Carlo code. This paper presents a linking of the MCNP code and the REBUS burnup code to perform these difficult burnup analyses. The linked code will permit the use of the full capabilities of REBUS which include non-equilibrium and equilibrium burnup analyses. Results of burnup analyses using this new linked code are also presented

  8. Wavelet based multicarrier code division multiple access ...

    African Journals Online (AJOL)

This paper presents a study of wavelet-transform-based Multicarrier Code Division Multiple Access (MC-CDMA) for a downlink wireless channel. The performance of the system is studied for an Additive White Gaussian Noise (AWGN) channel and slowly varying multipath channels. The bit error rate (BER) versus ...

  9. Pluvial, urban flood mechanisms and characteristics - Assessment based on insurance claims

    Science.gov (United States)

    Sörensen, Johanna; Mobini, Shifteh

    2017-12-01

Pluvial flooding is a problem in many cities, and for city planning purposes the mechanisms behind pluvial flooding are of interest. Previous studies seldom use insurance claim data to analyse city-scale characteristics that lead to flooding. In the present study, two long time series (∼20 years) of flood claims from property owners have been collected and analysed in detail to investigate the mechanisms and characteristics leading to urban flooding. The flood claim data come from the municipal water utility company and property owners with insurance that covers property loss from overland flooding, groundwater intrusion through basement walls and flooding from the drainage system. These data are used as a proxy for flood severity for several events in the Swedish city of Malmö. The paper discusses which rainfall characteristics produce the most flooding, why some rainfall events do not lead to severe flooding, how city-scale topography and sewerage system type influence the spatial distribution of flood claims, and what impact high sea levels have on flooding in Malmö. Three severe flood events are described in detail and compared with a number of smaller flood events. It was found that the main mechanisms and characteristics of flood extent and its spatial distribution in Malmö are the intensity and spatial distribution of rainfall, distance to the main sewer system as well as overland flow paths, and type of drainage system, while high sea level has little impact on the flood extent. Finally, measures that could be taken to lower the flood risk in Malmö, and other cities with similar characteristics, are discussed.

  10. Content analysis of e-cigarette products, promotions, prices and claims on Internet tobacco vendor websites, 2013-2014.

    Science.gov (United States)

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza K; LaFleur, Kevin

    2017-11-03

To identify the population of Internet e-cigarette vendors (IEVs) and conduct content analysis of products sold and IEVs' promotional, claims, and pricing practices. Multiple sources were used to identify IEV websites, primarily complex search algorithms scanning over 180 million websites. In 2013, 32,446 websites were manually screened, identifying 980 IEVs, with the 281 most popular selected for content analysis. This methodology yielded 31,239 websites for manual screening in 2014, identifying 3,096 IEVs, with 283 selected for content analysis. While the majority of IEVs (71.9%) were US based in 2013, this dropped to 64.3% in 2014, with IEVs located in at least 38 countries and 12% providing location indicators reflecting two or more countries, complicating jurisdictional determinations. Reflecting the retail market, IEVs are transitioning from offering disposable and 'cigalike' e-cigarettes to larger tank and 'mod' systems. Flavored e-cigarettes were available from 85.9% of IEVs in 2014, with fruit and candy flavors being most popular. Most vendors (76.5%) made health claims in 2013, dropping to 43.1% in 2014. Some IEVs featured conflicting claims about whether or not e-cigarettes aid in smoking cessation. There was wide variation in pricing, with e-cigarettes available for as little as one dollar, well within the affordable range for adults and teens. The number of Internet e-cigarette vendors grew threefold from 2013 to 2014, far surpassing the number of Internet cigarette vendors (N=775) at the 2004 height of that industry. New and expanded regulations for online e-cigarette sales are needed, including restrictions on flavors and marketing claims. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Lossless Image Compression Based on Multiple-Tables Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Rung-Ching Chen

    2009-01-01

This paper presents a lossless image compression method based on multiple-tables arithmetic coding (MTAC) to encode a gray-level image f. First, the MTAC method employs a median edge detector (MED) to reduce the entropy rate of f, exploiting the fact that the gray levels of two adjacent pixels in an image are usually similar. A base-switching transformation approach is then used to reduce the spatial redundancy of the image, since the gray levels of some pixels are more common than those of others. Finally, arithmetic encoding is applied to reduce the coding redundancy of the image. To promote high performance of the arithmetic encoding, the MTAC method first classifies the data and then encodes each cluster of data using a distinct code table. The experimental results show that, in most cases, the MTAC method uses storage space more efficiently than lossless JPEG2000 does.
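The MED predictor mentioned above is the gradient-adjusted predictor also used in JPEG-LS (LOCO-I): it selects min(a, b) or max(a, b) when the above-left neighbor suggests an edge, and the planar prediction a + b - c otherwise. A sketch, with a boundary convention (out-of-bounds neighbors taken as 0) chosen for simplicity:

```python
def med_predict(a, b, c):
    """Median edge detector: a = left, b = above, c = above-left."""
    if c >= max(a, b):
        return min(a, b)     # edge likely on the max side
    if c <= min(a, b):
        return max(a, b)     # edge likely on the min side
    return a + b - c         # smooth region: planar prediction

def residuals(image):
    """Prediction residuals for a gray-level image (list of rows);
    out-of-bounds neighbors are taken as 0."""
    out = []
    for r, row in enumerate(image):
        for col, x in enumerate(row):
            a = row[col - 1] if col > 0 else 0
            b = image[r - 1][col] if r > 0 else 0
            c = image[r - 1][col - 1] if r > 0 and col > 0 else 0
            out.append(x - med_predict(a, b, c))
    return out
```

On smooth images the residuals cluster tightly around zero, which is exactly the lowered entropy rate the subsequent arithmetic-coding stage exploits.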

  12. Occupational injuries and diseases in Alberta : lost-time claims and claim rates in the upstream oil and gas industries, 2001 to 2005

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-15

In order to provide a detailed review of workplace health and safety, the Alberta Ministry of Human Resources and Employment prepares an annual report on the occupational injuries and diseases in the upstream oil and gas industries. The purpose of the report is to provide government, employers, workers, and health and safety professionals with information about key health and safety issues. This report presented estimates of the risk of injury or disease at the provincial, industry sector and subsector level as well as general descriptions of the incidents and injured workers. It also revealed the fatality rates for the major industry sectors as well as the occupational fatalities that the Workers' Compensation Board (WCB) accepted for compensation. The number of employers that earned a certificate of recognition was also identified. The injury and disease analysis was discussed in terms of injured worker characteristics; nature of injury or disease; part of body injured; source of injury or disease; type of event or exposure; and duration of disability. The report also provided terms, definitions and formulas and upstream oil and gas WCB industry codes. It was found that in 2005, the WCB accepted 1,481 lost-time claims from upstream oil and gas workers, representing 4.2 per cent of all lost-time claims in the province. In addition, employers with 20 to 39 person-years had the highest lost-time claim rate, at 2.4 per 100 person-years. tabs., figs., 2 appendices.
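The lost-time claim rate cited above follows the standard person-year normalization; a quick sketch (the numbers in the example call are illustrative, not taken from the report):

```python
def lost_time_claim_rate(claims, person_years):
    """Lost-time claims per 100 person-years of employment."""
    return 100 * claims / person_years

# e.g. 24 accepted claims over 1,000 person-years of exposure
rate = lost_time_claim_rate(24, 1000)   # -> 2.4 per 100 person-years
```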

  13. State Waste Discharge Permit application: 200-W Powerhouse Ash Pit

    Energy Technology Data Exchange (ETDEWEB)

    Atencio, B.P.

    1994-06-01

As part of the Hanford Federal Facility Agreement and Consent Order negotiations, the US Department of Energy, Richland Operations Office, the US Environmental Protection Agency, and the Washington State Department of Ecology agreed that liquid effluent discharges to the ground on the Hanford Site which affect groundwater or have the potential to affect groundwater would be subject to permitting under the structure of Chapter 173-216 (or 173-218 where applicable) of the Washington Administrative Code, the State Waste Discharge Permit Program. This document constitutes the State Waste Discharge Permit application for the 200-W Powerhouse Ash Pit. The 200-W Powerhouse Ash Waste Water discharges to the 200-W Powerhouse Ash Pit via dedicated pipelines. The 200-W Powerhouse Ash Waste Water is the only discharge to the 200-W Powerhouse Ash Pit. The 200-W Powerhouse is a steam generation facility consisting of a coal-handling and preparation section and boilers.

  14. State Waste Discharge Permit application: 200-E Powerhouse Ash Pit

    Energy Technology Data Exchange (ETDEWEB)

    Atencio, B.P.

    1994-06-01

    As part of the Hanford Federal Facility Agreement and Consent Order negotiations, the US Department of Energy, Richland Operations Office, the US Environmental Protection Agency, and the Washington State Department of Ecology agreed that liquid effluent discharges to the ground on the Hanford Site which affect groundwater or have the potential to affect groundwater would be subject to permitting under the structure of Chapter 173-216 (or 173-218 where applicable) of the Washington Administrative Code, the State Waste Discharge Permit Program. This document constitutes the State Waste Discharge Permit application for the 200-E Powerhouse Ash Pit. The 200-E Powerhouse Ash Waste Water discharges to the 200-E Powerhouse Ash Pit via dedicated pipelines. The 200-E Ash Waste Water is the only discharge to the 200-E Powerhouse Ash Pit. The 200-E Powerhouse is a steam generation facility consisting of a coal-handling and preparation section and boilers.

  15. Comorbidity ascertainment from the ESRD Medical Evidence Report and Medicare claims around dialysis initiation: a comparison using US Renal Data System data.

    Science.gov (United States)

    Krishnan, Mahesh; Weinhandl, Eric D; Jackson, Scott; Gilbertson, David T; Lacson, Eduardo

    2015-11-01

    The end-stage renal disease Medical Evidence Report serves as a source of comorbid condition data for risk adjustment of quality metrics. We sought to compare comorbid condition data in the Medical Evidence Report around dialysis therapy initiation with diagnosis codes in Medicare claims. Observational cohort study using US Renal Data System data. Medicare-enrolled elderly (≥66 years) patients who initiated maintenance dialysis therapy July 1 to December 31, 2007, 2008, or 2009. 12 comorbid conditions ascertained from claims during the 6 months before dialysis therapy initiation, the Medical Evidence Report, and claims during the 3 months after dialysis therapy initiation. None. Comorbid condition prevalence according to claims before dialysis therapy initiation generally exceeded prevalence according to the Medical Evidence Report. The κ statistics for comorbid condition designations other than diabetes ranged from 0.06 to 0.43. Discordance of designations was associated with age, race, sex, and end-stage renal disease Network. During 23,930 patient-years of follow-up from 4 to 12 months after dialysis therapy initiation (8,930 deaths), designations from claims during the 3 months after initiation better discriminated risk of death than designations from the Medical Evidence Report (C statistics of 0.674 vs 0.616). Between the Medical Evidence Report and claims, standardized mortality ratios changed by >10% for more than half the dialysis facilities. Neither the Medical Evidence Report nor diagnosis codes in claims constitute a gold standard of comorbid condition data; results may not apply to nonelderly patients or patients without Medicare coverage. Discordance of comorbid condition designations from the Medical Evidence Report and claims around dialysis therapy initiation was substantial and significantly associated with patient characteristics, including location. These patterns may engender bias in risk-adjusted quality metrics. In lieu of the Medical
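
The agreement statistic at the heart of this comparison is straightforward to compute. The sketch below (Python) applies Cohen's kappa to one comorbid condition flagged by two sources; the toy designations are illustrative, not the study's data.

```python
# Cohen's kappa between two binary comorbidity designations (e.g. the
# Medical Evidence Report vs. claims). The study reports kappas of
# 0.06-0.43 for conditions other than diabetes; counts here are toy data.

def cohens_kappa(a, b):
    """a, b: equal-length lists of 0/1 designations for one condition."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(1 for x, y in zip(a, b) if x == y) / n
    p_a1 = sum(a) / n          # prevalence according to source A
    p_b1 = sum(b) / n          # prevalence according to source B
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)  # chance agreement
    return (observed - expected) / (1 - expected)

# hypothetical flags: Medical Evidence Report vs. claims-based
mer    = [1, 0, 1, 0, 0, 1, 0, 0]
claims = [1, 1, 1, 0, 1, 1, 0, 1]
print(round(cohens_kappa(mer, claims), 3))  # 0.333
```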

  16. Hanford Facility dangerous waste permit application, general information

    International Nuclear Information System (INIS)

    1993-05-01

    The current Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (this document, number DOE/RL-91-28) and a treatment, storage, and/or disposal Unit-Specific Portion, which includes documentation for individual TSD units (e.g., document numbers DOE/RL-89-03 and DOE/RL-90-01). Both portions consist of a Part A division and a Part B division. The Part B division consists of 15 chapters that address the content of the Part B checklists prepared by the Washington State Department of Ecology (Ecology 1987) and the US Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information requirements mandated by the Hazardous and Solid Waste Amendments of 1984 and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology checklist section numbers, in brackets, follow the chapter headings and subheadings. Documentation contained in the General Information Portion (i.e., this document, number DOE/RL-91-28) is broader in nature and applies to all treatment, storage, and/or disposal units for which final status is sought. Because of its broad nature, the Part A division of the General Information Portion references the Hanford Facility Dangerous Waste Part A Permit Application (document number DOE/RL-88-21), a compilation of all Part A documentation for the Hanford Facility

  17. The Modularized Software Package ASKI - Full Waveform Inversion Based on Waveform Sensitivity Kernels Utilizing External Seismic Wave Propagation Codes

    Science.gov (United States)

    Schumacher, F.; Friederich, W.

    2015-12-01

    We present the modularized software package ASKI, which is a flexible and extendable toolbox for seismic full waveform inversion (FWI) as well as sensitivity or resolution analysis operating on the sensitivity matrix. It utilizes established wave propagation codes for solving the forward problem and offers an alternative to the monolithic, inflexible and hard-to-modify codes that have typically been written for solving inverse problems. It is available under the GPL at www.rub.de/aski. The Gauss-Newton FWI method for 3D-heterogeneous elastic earth models is based on waveform sensitivity kernels and can be applied to inverse problems at various spatial scales in both Cartesian and spherical geometries. The kernels are derived in the frequency domain from Born scattering theory as the Fréchet derivatives of linearized full waveform data functionals, quantifying the influence of elastic earth model parameters on the particular waveform data values. As an important innovation, we keep two independent spatial descriptions of the earth model: one for solving the forward problem and one representing the inverted model updates. Thereby we account for the independent needs of spatial model resolution of the forward and inverse problem, respectively. Due to pre-integration of the kernels over the (in general much coarser) inversion grid, storage requirements for the sensitivity kernels are dramatically reduced. ASKI can be flexibly extended to other forward codes by providing it with specific interface routines that contain knowledge about forward-code-specific file formats and auxiliary information provided by the new forward code. In order to sustain flexibility, the ASKI tools must communicate via file output/input, thus large storage capacities need to be accessible in a convenient way. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full

  18. CRACKLE: a computer code for CFR fuel management calculations

    International Nuclear Information System (INIS)

    Burstall, R.F.; Ball, M.A.; Thornton, D.E.J.

    1975-12-01

    The CRACKLE computer code is designed to perform rapid fuel management surveys of CFR systems. The code calculates overall features such as reactivity, power distributions and breeding gain, and also calculates for each sub-assembly plutonium content and power output. A number of alternative options are built into the code, in order to permit different fuel management strategies to be calculated, and to perform more detailed calculations when necessary. A brief description is given of the methods of calculation, and the input facilities of CRACKLE, with examples. (author)

  19. State Waste Discharge Permit Application: Electric resistance tomography testing

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-01

    This permit application documentation is for a State Waste Discharge Permit issued in accordance with requirements of Washington Administrative Code 173-216. The activity being permitted is a technology test using electrical resistance tomography. The electrical resistance tomography technology was developed at Lawrence Livermore National Laboratory and has been used at other waste sites to track underground contamination plumes. The electrical resistance tomography technology measures soil electrical resistance between two electrodes. If a fluid contaminated with electrolytes is introduced into the soil, the soil resistance is expected to drop. By using an array of measurement electrodes in several boreholes, the areal extent of contamination can be estimated. At the Hanford Site, the purpose of the testing is to determine if the electrical resistance tomography technology can be used in the vicinity of large underground metal tanks without the metal tank interfering with the test. It is anticipated that the electrical resistance tomography technology will provide a method for accurately detecting leaks from the bottom of underground tanks, such as the Hanford Site single-shell tanks.
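
The detection principle described above (a contaminant plume lowers soil resistance between electrode pairs) can be sketched in a few lines. The threshold, electrode labels, and resistance values below are illustrative assumptions, not Hanford data.

```python
# Flag electrode pairs whose measured soil resistance has dropped by
# more than a chosen fraction relative to a baseline survey, the basic
# signature an electrolyte-bearing leak would produce.

def flag_leaks(baseline, current, drop_fraction=0.2):
    """baseline, current: dicts mapping electrode pair -> resistance (ohms).
    Returns pairs whose resistance dropped by more than drop_fraction."""
    flagged = []
    for pair, r0 in baseline.items():
        r1 = current[pair]
        if (r0 - r1) / r0 > drop_fraction:
            flagged.append(pair)
    return flagged

# hypothetical borehole electrodes b1..b3
baseline = {("b1", "b2"): 120.0, ("b2", "b3"): 95.0, ("b1", "b3"): 150.0}
current  = {("b1", "b2"): 118.0, ("b2", "b3"): 60.0, ("b1", "b3"): 149.0}
print(flag_leaks(baseline, current))  # pair ("b2", "b3") shows a ~37% drop
```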

  20. State Waste Discharge Permit Application: Electric resistance tomography testing

    International Nuclear Information System (INIS)

    1994-04-01

    This permit application documentation is for a State Waste Discharge Permit issued in accordance with requirements of Washington Administrative Code 173-216. The activity being permitted is a technology test using electrical resistance tomography. The electrical resistance tomography technology was developed at Lawrence Livermore National Laboratory and has been used at other waste sites to track underground contamination plumes. The electrical resistance tomography technology measures soil electrical resistance between two electrodes. If a fluid contaminated with electrolytes is introduced into the soil, the soil resistance is expected to drop. By using an array of measurement electrodes in several boreholes, the areal extent of contamination can be estimated. At the Hanford Site, the purpose of the testing is to determine if the electrical resistance tomography technology can be used in the vicinity of large underground metal tanks without the metal tank interfering with the test. It is anticipated that the electrical resistance tomography technology will provide a method for accurately detecting leaks from the bottom of underground tanks, such as the Hanford Site single-shell tanks.

  1. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    Directory of Open Access Journals (Sweden)

    Behrang Barekatain

    Full Text Available In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet. This is probably due to the fact that RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector in the header has been the most important challenge of RNC. Moreover, due to employing the Gauss-Jordan elimination method, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method, so that there is no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficient entry instead of n into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations. In this regard, peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with the Gauss-Jordan elimination method, providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.

  2. Permit trading and credit trading

    DEFF Research Database (Denmark)

    Boom, Jan-Tjeerd; R. Dijstra, Bouwe

    This paper compares emissions trading based on a cap on total emissions (permit trading) and on relative standards per unit of output (credit trading). Two types of market structure are considered: perfect competition and Cournot oligopoly. We find that output, abatement costs and the number...... of firms are higher under credit trading. Allowing trade between permit-trading and credit-trading sectors may increase welfare. With perfect competition, permit trading always leads to higher welfare than credit trading. With imperfect competition, credit trading may outperform permit trading....... Environmental policy can lead to exit, but also to entry of firms. Entry and exit have a profound impact on the performance of the schemes, especially under imperfect competition. We find that it may be impossible to implement certain levels of total industry emissions. Under credit trading several levels

  3. Optimal Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro

    1994-01-01

    Calibration of partial safety factors is considered in general, including classes of structures where no code exists beforehand. The partial safety factors are determined such that the difference between the reliability for the different structures in the class considered and a target reliability...... level is minimized. Code calibration on a decision theoretical basis is also considered and it is shown how target reliability indices can be calibrated. Results from code calibration for rubble mound breakwater designs are shown....

  4. Environmental performances of coproducts. Application of Claiming-Based Allocation models to straw and vetiver biorefineries in an Indian context.

    Science.gov (United States)

    Gnansounou, Edgard; Raman, Jegannathan Kenthorai

    2018-04-24

    Among the renewables, biofuels based on non-food feedstocks and wastelands are essential for the transport sector to achieve a country's climate mitigation targets. With the growing interest in biorefineries, setting policy requirements for other coproducts along with biofuels is necessary to improve the product portfolio of a biorefinery, increase the perception of bioproducts by consumers and push the technology forward. In this context, Claiming-Based Allocation models were used in a comparative life cycle assessment of multiple products from a wheat straw biorefinery and a vetiver biorefinery. The vetiver biorefinery shows promising greenhouse gas emission savings (181-213%) compared to the common crop-based lignocellulose (wheat straw) biorefinery. The Claiming-Based Allocation models help identify the affordable allocation limit (0-80%) among the coproducts in order to achieve the individual prospective policy targets. Such models show promising application in multiproduct life cycle assessment studies where appropriate allocation among coproducts is challenging and individual product emissions are subject to policy targets. Copyright © 2018 Elsevier Ltd. All rights reserved.
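
The idea of capping how much of the total burden each coproduct may claim can be illustrated as follows. The numbers, the product names, and the energy-based split of the unclaimed remainder are assumptions for illustration, not the paper's model or data.

```python
# Claiming-based allocation sketch: each coproduct "claims" a fraction of
# the biorefinery's total GHG burden; the unclaimed remainder is split by
# a default key (here: energy content). All figures are hypothetical.

def allocate(total_ghg, products):
    """products: {name: {"claim": fraction of total claimed,
                          "energy": energy output in MJ}}.
    Returns {name: allocated GHG}. Claims must sum to <= 1."""
    claimed = sum(p["claim"] for p in products.values())
    assert claimed <= 1.0, "claims cannot exceed the total burden"
    energy_total = sum(p["energy"] for p in products.values())
    out = {}
    for name, p in products.items():
        residual_share = (1 - claimed) * p["energy"] / energy_total
        out[name] = total_ghg * (p["claim"] + residual_share)
    return out

shares = allocate(100.0, {
    "ethanol": {"claim": 0.5, "energy": 60.0},
    "biogas":  {"claim": 0.1, "energy": 30.0},
    "lignin":  {"claim": 0.0, "energy": 10.0},
})
print({k: round(v, 1) for k, v in shares.items()})
# {'ethanol': 74.0, 'biogas': 22.0, 'lignin': 4.0}
```

Sweeping the claim fractions over a range (the paper's 0-80%) then shows which allocations keep each product under its policy target.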

  5. Claiming health in food products

    DEFF Research Database (Denmark)

    Lähteenmäki, Liisa

    2013-01-01

    Health-related information is increasingly used on food products to convey their benefits. Health claims as a subcategory of these messages link the beneficial component, functions or health outcomes with specific products. For consumers, health claims seem to carry the message of increased...... healthiness, but not necessarily making the product more appealing. The wording of the claim seems to have little impact on claim perception, yet the health image of carrier products is important. From consumer-related factors the relevance and attitudes towards functional foods play a role, whereas socio......-demographic factors have only minor impact and the impact seems to be case-dependent. Familiarity with claims and functional foods increase perceived healthiness and acceptance of these products. Apparently consumers make rather rational interpretations of claims and their benefits when forced to assess...

  6. Warranty claim analysis considering human factors

    International Nuclear Information System (INIS)

    Wu Shaomin

    2011-01-01

    Warranty claims are not always due to product failures. They can also be caused by two types of human factors. On the one hand, consumers might claim warranty due to misuse and/or failures caused by various human factors. Such claims might account for more than 10% of all reported claims. On the other hand, consumers might not be bothered to claim warranty for failed items that are still under warranty, or they may claim warranty after they have experienced several intermittent failures. These two types of human factors can affect warranty claim costs. However, research in this area has received rather little attention. In this paper, we propose three models to estimate the expected warranty cost when the two types of human factors are included. We consider two types of failures: intermittent and fatal failures, which might result in different claim patterns. Consumers might report claims after a fatal failure has occurred, and upon intermittent failures they might report claims after a number of failures have occurred. Numerical examples are given to validate the results derived.
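
A simple way to see how the two human factors shift expected warranty cost is a Monte Carlo sketch. The exponential failure models, the rates, and the claim-after-k-intermittent-failures rule below are illustrative assumptions, not the paper's three models.

```python
# Monte Carlo estimate of expected warranty cost per item with:
# (1) misuse claims unrelated to product failure, and
# (2) intermittent failures reported only after k occurrences.

import random

def expected_claim_cost(n_items, warranty_t, fatal_rate, inter_rate,
                        misuse_rate, k_report, cost_per_claim, seed=42):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_items):
        claims = 0
        # fatal failure: claimed immediately if it occurs within warranty
        if rng.expovariate(fatal_rate) < warranty_t:
            claims += 1
        # intermittent failures: one claim per k_report occurrences
        t, inter_count = 0.0, 0
        while True:
            t += rng.expovariate(inter_rate)
            if t >= warranty_t:
                break
            inter_count += 1
        claims += inter_count // k_report
        # misuse claim, independent of product failures
        if rng.random() < misuse_rate:
            claims += 1
        total += claims * cost_per_claim
    return total / n_items

avg = expected_claim_cost(n_items=20000, warranty_t=1.0, fatal_rate=0.1,
                          inter_rate=0.5, misuse_rate=0.1,
                          k_report=2, cost_per_claim=100.0)
print(round(avg, 1))
```

Setting `misuse_rate` to zero or `k_report` to one isolates the cost contribution of each human factor.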

  7. Short-Block Protograph-Based LDPC Codes

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher

    2010-01-01

    Short-block low-density parity-check (LDPC) codes of a special type are intended to be especially well suited for potential applications that include transmission of command and control data, cellular telephony, data communications in wireless local area networks, and satellite data communications. [In general, LDPC codes belong to a class of error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels.] The codes of the present special type exhibit low error floors, low bit and frame error rates, and low latency (in comparison with related prior codes). These codes also achieve low maximum rate of undetected errors over all signal-to-noise ratios, without requiring the use of cyclic redundancy checks, which would significantly increase the overhead for short blocks. These codes have protograph representations; this is advantageous in that, for reasons that exceed the scope of this article, the applicability of protograph representations makes it possible to design highspeed iterative decoders that utilize belief-propagation algorithms.

  8. Trellis-coded CPM for satellite-based mobile communications

    Science.gov (United States)

    Abrishamkar, Farrokh; Biglieri, Ezio

    1988-01-01

    Digital transmission for satellite-based land mobile communications is discussed. To satisfy the power and bandwidth limitations imposed on such systems, a combination of trellis coding and continuous-phase modulated signals are considered. Some schemes based on this idea are presented, and their performance is analyzed by computer simulation. The results obtained show that a scheme based on directional detection and Viterbi decoding appears promising for practical applications.

  9. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Science.gov (United States)

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.
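
The two steps, texture-based ROI detection and differentiated QP assignment, can be caricatured in a few lines. The 16x16 block size, variance threshold, and QP offset below are assumptions; the paper's actual algorithm operates inside the H.265/HEVC quad-tree and is far more elaborate.

```python
# Mark blocks whose texture (pixel variance) exceeds a threshold as ROI,
# then assign ROIs a lower QP (finer quantization) than non-ROIs.

def block_variance(frame, x0, y0, bs):
    vals = [frame[y][x] for y in range(y0, y0 + bs)
                        for x in range(x0, x0 + bs)]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def qp_map(frame, bs=16, var_thresh=100.0, qp_base=32, roi_offset=-6):
    """Return a per-block QP map for a frame given as a 2-D list of
    luma samples; high-texture blocks get qp_base + roi_offset."""
    h, w = len(frame), len(frame[0])
    qps = []
    for y0 in range(0, h, bs):
        row = []
        for x0 in range(0, w, bs):
            is_roi = block_variance(frame, x0, y0, bs) > var_thresh
            row.append(qp_base + (roi_offset if is_roi else 0))
        qps.append(row)
    return qps

# toy 32x32 frame: flat left half, textured right half
frame = [[10 if x < 16 else (x * 7 + y * 13) % 64 for x in range(32)]
         for y in range(32)]
print(qp_map(frame))  # [[32, 26], [32, 26]]
```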

  10. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Directory of Open Access Journals (Sweden)

    Yueying Wu

    Full Text Available High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.

  11. Image Coding Based on Address Vector Quantization.

    Science.gov (United States)

    Feng, Yushu

    Image coding is finding increased application in teleconferencing, archiving, and remote sensing. This thesis investigates the potential of Vector Quantization (VQ), a relatively new source coding technique, for compression of monochromatic and color images. Extensions of the Vector Quantization technique to the Address Vector Quantization method have been investigated. In Vector Quantization, the image data to be encoded are first processed to yield a set of vectors. A codeword from the codebook which best matches the input image vector is then selected. Compression is achieved by replacing the image vector with the index of the codeword which produced the best match; the index is sent to the channel. Reconstruction of the image is done by using a table lookup technique, where the label is simply used as an address for a table containing the representative vectors. A codebook of representative vectors (codewords) is generated using an iterative clustering algorithm such as K-means or the generalized Lloyd algorithm. A review of different Vector Quantization techniques is given in Chapter 1. Chapter 2 gives an overview of codebook design methods, including the Kohonen neural network. During the encoding process the correlation of the addresses is considered, and Address Vector Quantization is developed for color and monochrome image coding. Address VQ, which includes static and dynamic processes, is introduced in Chapter 3. In order to overcome the problems in Hierarchical VQ, Multi-layer Address Vector Quantization is proposed in Chapter 4. This approach gives the same performance as the normal VQ scheme, but with a bit rate about 1/2 to 1/3 that of the normal VQ method. In Chapter 5, a Dynamic Finite State VQ, based on a probability transition matrix to select the best subcodebook to encode the image, is developed. In Chapter 6, a new adaptive vector quantization scheme suitable for color video coding, called "A Self-Organizing
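
The basic VQ pipeline described above (codebook training, nearest-codeword encoding, table-lookup decoding) looks like this in outline; the k-means trainer and the toy 2-D data are illustrative, and the address-correlation step that distinguishes Address VQ is omitted.

```python
# Plain vector quantization: train a codebook with k-means, encode each
# vector as the index of its nearest codeword, decode by table lookup.

import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(vs):
    return [sum(col) / len(vs) for col in zip(*vs)]

def kmeans(vectors, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(vectors, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            groups[min(range(k), key=lambda c: dist2(v, centers[c]))].append(v)
        centers = [mean(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

def encode(vectors, codebook):
    """Replace each vector by the index of its best-matching codeword."""
    return [min(range(len(codebook)), key=lambda c: dist2(v, codebook[c]))
            for v in vectors]

def decode(indices, codebook):
    return [codebook[i] for i in indices]   # table lookup

# toy "image vectors": two well-separated 2-D clusters
data = [[0, 0], [1, 0], [0, 1], [10, 10], [11, 10], [10, 11]]
cb = kmeans(data, 2)
idx = encode(data, cb)
print(idx[:3] != idx[3:])  # the two clusters map to different codewords
```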

  12. [Comparative review of the Senegalese and French deontology codes].

    Science.gov (United States)

    Soumah, M; Mbaye, I; Bah, H; Gaye Fall, M C; Sow, M L

    2005-01-01

    Medical deontology groups together the duties of physicians and regulates the practice of medicine. The Senegalese code of medical deontology, inspired by the French medical deontology code, has not been revised since its institution, whereas the French code has undergone three revisions. Comparing the two codes title by title and article by article, this work goes beyond a simple parallel and highlights the advances in bioethics that underlie the revisions of the French medical deontology code. This article supports advocacy by health professionals for bringing the Senegalese code of medical deontology up to date, because medical litigation, already substantial in developed countries, is intensifying in developing countries; it is inherent to technological progress and to patients' growing awareness of their rights.

  13. 24 CFR 17.4 - Administrative claim; evidence and information to be submitted.

    Science.gov (United States)

    2010-04-01

    ... bearing on either the responsibility of the United States for the death or the damages claimed. (c... information which may have a bearing on either the responsibility of the United States for the personal injury or the damages claimed. (b) Death. In support of a claim based on death, the claimant may be required...

  14. Predictive values of diagnostic codes for identifying serious hypocalcemia and dermatologic adverse events among women with postmenopausal osteoporosis in a commercial health plan database.

    Science.gov (United States)

    Wang, Florence T; Xue, Fei; Ding, Yan; Ng, Eva; Critchlow, Cathy W; Dore, David D

    2018-04-10

    Post-marketing safety studies of medicines often rely on administrative claims databases to identify adverse outcomes following drug exposure. Valid ascertainment of outcomes is essential for accurate results. We aim to quantify the validity of diagnostic codes for serious hypocalcemia and dermatologic adverse events from insurance claims data among women with postmenopausal osteoporosis (PMO). We identified potential cases of serious hypocalcemia and dermatologic events through ICD-9 diagnosis codes among women with PMO within claims from a large US healthcare insurer (June 2005-May 2010). A physician adjudicated potential hypocalcemic and dermatologic events identified from the primary position on emergency department (ED) or inpatient claims through medical record review. Positive predictive values (PPVs) and 95% confidence intervals (CIs) quantified the fraction of potential cases that were confirmed. Among 165,729 patients with PMO, medical charts were obtained for 40 of 55 (73%) potential hypocalcemia cases; 16 were confirmed (PPV 40%, 95% CI 25-57%). The PPV was higher for ED than inpatient claims (82 vs. 24%). Among 265 potential dermatologic events (primarily urticaria or rash), we obtained 184 (69%) charts and confirmed 128 (PPV 70%, 95% CI 62-76%). The PPV was higher for ED than inpatient claims (77 vs. 39%). Diagnostic codes for hypocalcemia and dermatologic events may be sufficient to identify events giving rise to emergency care, but are less accurate for identifying events within hospitalizations.
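
A PPV and its 95% CI are straightforward to reproduce from the reported counts. The sketch below uses the Wilson score interval; the abstract does not state which interval method was used, so exact bounds may differ slightly from the reported 25-57%.

```python
# Positive predictive value with a Wilson score 95% confidence interval,
# applied to the reported hypocalcemia counts (16 of 40 reviewed charts
# confirmed).

from math import sqrt

def ppv_ci(confirmed, reviewed, z=1.96):
    """Return (ppv, ci_lower, ci_upper) using the Wilson score interval."""
    p = confirmed / reviewed
    n = reviewed
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, center - half, center + half

p, lo, hi = ppv_ci(16, 40)
print(round(p, 2), round(lo, 2), round(hi, 2))  # 0.4 0.26 0.55
```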

  15. Mortgage development in Serbia, speciality in the Serbian Civil Code of 1844

    Directory of Open Access Journals (Sweden)

    Popov Danica

    2012-01-01

    Full Text Available The history of Serbian mortgage law is not long; it began in the first half of the XIX century. The first statute on mortgage was the Intabulation Law of 1839. A mortgage is a right of lien on an immovable which authorizes the creditor to seek satisfaction of his claim from the value of such immovable before other creditors that do not have a mortgage on it, as well as before creditors who have acquired a mortgage on such immovable subsequently to him, irrespective of a contingent change of owner of the encumbered immovable. The mortgage is an accessory right: one of the basic features of a security right is its dependence on the claim it secures. The mortgage creditor is entitled to request satisfaction of the claim from the value of the immovable encumbered by the mortgage, regardless of whether it is still in the possession of the mortgage debtor or has been conveyed into the ownership of a third party. The Intabulation Law was amended twice: first in 1842 and again in 1853. The shortcoming of both amendments was the absence of the principle of speciality. The principle of speciality marks the fact that a guarantee (security) may secure only a definite claim of one creditor and that the guarantee may exist only on a definite set of objects (assets). On the one hand, speciality of a real security interest forbids securing an indefinite number of claims or an indefinite amount of a claim and, on the other, forbids the prospect of indefinite assets, or all assets of the debtor, being subject to such a security interest. This shortcoming was eliminated in the new Intabulation Law of 1854, which was incorporated into the Serbian Civil Code of 1844. The subject of this article is the Intabulation Law of 1854.

  16. The PHREEQE Geochemical equilibrium code data base and calculations

    International Nuclear Information System (INIS)

    Andersoon, K.

    1987-01-01

    Compilation of a thermodynamic data base for actinides and fission products for use with PHREEQE has begun, and a preliminary set of actinide data has been tested with a version of the PHREEQE code run on an IBM XT computer. The work until now has shown that the PHREEQE code mostly gives satisfactory results for speciation of actinides in a natural water environment. For U and Np under oxidizing conditions, however, the code has difficulty converging with pH and Eh conserved when a solubility limit is applied. For further calculations of actinide and fission product speciation and solubility in a waste repository and in the surrounding geosphere, more data are needed. It is necessary to evaluate the influence of the large uncertainties of some data. Quality assurance and a check on the consistency of the data base are also needed. Further work with data bases should include: an extension to fission products; an extension to engineering materials; an extension to ligands other than hydroxide and carbonate; inclusion of more mineral phases; inclusion of enthalpy data; a control of primary references in order to decide whether values from different compilations are taken from the same primary reference; and contacts and discussions with other groups working with actinide data bases, e.g. at the OECD/NEA and at the IAEA. (author)

  17. ITS version 5.0: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes with CAD geometry.

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2005-09-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.
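
    The order-independent keyword input scheme with defaults and internal error checking can be sketched as follows; the keyword names, value layout, and parsing rules here are invented for illustration and are not the actual ITS input format.

```python
def parse_keyword_input(text, defaults):
    """Order-independent keyword input with defaults and error checking,
    in the general style the abstract describes.  One keyword per line,
    followed by one or more numeric values; unmentioned keywords keep
    their defaults.  All names here are hypothetical, not ITS syntax.
    """
    params = dict(defaults)                 # unmentioned keywords keep defaults
    for line in text.splitlines():
        fields = line.split()
        if not fields:
            continue                        # blank line
        key, vals = fields[0].upper(), fields[1:]
        if key not in defaults:             # internal error checking
            raise ValueError("unknown keyword: " + key)
        if not vals:
            raise ValueError("keyword needs a value: " + key)
        params[key] = [float(v) for v in vals] if len(vals) > 1 else float(vals[0])
    return params
```

Because every keyword is looked up by name, the lines may appear in any order, and a misspelled keyword fails loudly instead of being silently ignored.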

  18. A Low-Jitter Wireless Transmission Based on Buffer Management in Coding-Aware Routing

    Directory of Open Access Journals (Sweden)

    Cunbo Lu

    2015-08-01

    Full Text Available It is important to reduce packet jitter for real-time applications in a wireless network. Existing coding-aware routing algorithms use the opportunistic network coding (ONC) scheme in their packet coding algorithms. The ONC scheme never delays packets to wait for the arrival of a future coding opportunity, and the loss of some potential coding opportunities may limit the contribution of network coding to jitter performance. In addition, most existing coding-aware routing algorithms assume that all flows participating in the network have equal rates; this is unrealistic, since multi-rate environments are common. To overcome these problems and extend coding-aware routing to multi-rate scenarios, we present, from the viewpoint of data transmission, a low-jitter wireless transmission algorithm based on buffer management (BLJCAR), which schedules packets at the coding node according to a queue-length-based threshold policy instead of the regular ONC policy used in existing coding-aware routing algorithms. BLJCAR is a unified framework covering both the single-rate and the multi-rate case. Simulation results show that the BLJCAR algorithm embedded in coding-aware routing outperforms the traditional ONC policy in terms of jitter, packet delivery delay, packet loss ratio and network throughput under network congestion at any traffic rate.
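
    The queue-length-based threshold idea can be sketched as follows. The packet fields and the XOR-pairing rule are hypothetical simplifications, not the paper's exact algorithm: a packet that lacks a coding partner is held back, unless the queue has grown past the threshold, which bounds the queueing delay and hence the jitter.

```python
from collections import deque

def dispatch(queue, threshold):
    """Queue-length threshold policy at a coding node (illustrative).

    If the two head packets belong to different flows they can be
    XOR-coded and sent together.  Otherwise a plain ONC policy would
    send the head packet natively at once; here we hold it for a
    future coding opportunity *unless* the queue length has exceeded
    `threshold`, which caps how long any packet can wait.
    """
    sent = []
    while queue:
        if len(queue) >= 2 and queue[0]["flow"] != queue[1]["flow"]:
            a, b = queue.popleft(), queue.popleft()
            sent.append(("coded", a["id"], b["id"]))      # coding opportunity used
        elif len(queue) > threshold:
            p = queue.popleft()
            sent.append(("native", p["id"]))              # delay bound hit: send natively
        else:
            break                                         # hold, awaiting a coding partner
    return sent
```

With a large threshold the node behaves like a pure "wait for coding" policy; with threshold 0 it degenerates to the never-delay ONC behavior.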

  19. 'Blocked area' of a citizens' action group in operating plan permits according to Mining Law

    Energy Technology Data Exchange (ETDEWEB)

    1982-05-26

    On the question of whether a citizens' action group, organized in the form of a registered club, has the right to file suit as defined by paragraph 2 of sect. 42 of the rules of administrative courts, where it argues that its right to the reforestation of an estate, secured by easement, would be affected by a skeleton operating plan permit issued under mining law. Since the protection of the recreational function of forests is a task assigned solely to bodies of public administration, a holder of a real right may not claim neighbourly protection under public law in this respect. Also on the relationship between operating plan approval procedures under mining law and the licensing procedures for construction permits.

  20. Impact of Different Spreading Codes Using FEC on DWT Based MC-CDMA System

    OpenAIRE

    Masum, Saleh; Kabir, M. Hasnat; Islam, Md. Matiqul; Shams, Rifat Ara; Ullah, Shaikh Enayet

    2012-01-01

    The effect of different spreading codes in a DWT-based MC-CDMA wireless communication system is investigated. In this paper, we present the Bit Error Rate (BER) performance of different spreading codes (Walsh-Hadamard codes, orthogonal Gold codes and Golay complementary sequences) using Forward Error Correction (FEC) in the proposed system. The data are analyzed and compared among the different spreading codes in both coded and uncoded cases. It is found via computer simulation that the performance...
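
    Of the spreading codes compared, the Walsh-Hadamard family is the simplest to construct: the rows of the Sylvester-recursion matrix below are mutually orthogonal ±1 chip sequences. This shows only code generation, not the DWT-based MC-CDMA system itself.

```python
def walsh_hadamard(n):
    """2^n x 2^n Walsh-Hadamard matrix by Sylvester's recursion:
    H_{2m} = [[H_m, H_m], [H_m, -H_m]].  Each row is a spreading code
    of +/-1 chips, and distinct rows are exactly orthogonal.
    """
    H = [[1]]
    for _ in range(n):
        top = [row + row for row in H]                    # [H, H]
        bottom = [row + [-x for x in row] for row in H]   # [H, -H]
        H = top + bottom
    return H
```

Orthogonality is what lets each user's data be recovered by correlating the received sum against that user's own row.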

  1. Diagnosis-based and external cause-based criteria to identify adverse drug reactions in hospital ICD-coded data: application to an Australian population-based study

    Directory of Open Access Journals (Sweden)

    Wei Du

    2017-04-01

    Full Text Available Objectives: External cause International Classification of Diseases (ICD) codes are commonly used to ascertain adverse drug reactions (ADRs) related to hospitalisation. We quantified ascertainment of ADR-related hospitalisation using external cause codes and additional ICD-based hospital diagnosis codes. Methods: We reviewed the scientific literature to identify different ICD-based criteria for ADR-related hospitalisations, developed algorithms to capture ADRs based on candidate hospital ICD-10 diagnoses and external cause codes (Y40–Y59), and incorporated previously published causality ratings estimating the probability that a specific diagnosis was ADR related. We applied the algorithms to the NSW Admitted Patient Data Collection records of 45 and Up Study participants (2011–2013). Results: Of 493 442 hospitalisations among 267 153 study participants during 2011–2013, 18.8% (n = 92 953) had hospital diagnosis codes that were potentially ADR related; 1.1% (n = 5305) had high/very high–probability ADR-related diagnosis codes (causality ratings A1 and A2); and 2.0% (n = 10 039) had ADR-related external cause codes. Overall, 2.2% (n = 11 082) of cases were classified as ADR-related hospitalisations on the basis of either external cause codes or high/very high–probability ADR-related diagnosis codes. Hence, adding high/very high–probability ADR-related diagnosis codes to the standard external cause codes alone (Y40–Y59) increased the number of hospitalisations classified as having an ADR-related diagnosis by 10.4%. Only 6.7% of cases with high-probability ADR-related mental symptoms were captured by external cause codes. Conclusion: Selective use of high-probability ADR-related hospital diagnosis codes in addition to external cause codes yielded a modest increase in hospitalised ADR incidence, which is of potential clinical significance. Clinically validated combinations of diagnosis codes could potentially further enhance capture.
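
    The combined criterion — an external cause code in Y40–Y59, or a diagnosis code with an A1/A2 causality rating — can be sketched as a record-level flag. The causality-rating table below is a hypothetical stand-in for the published ratings; the Y40–Y59 check follows the ICD-10 block cited in the abstract.

```python
def is_y40_y59(code):
    """True for ICD-10 external cause codes in the Y40-Y59 block
    (drugs, medicaments and biological substances in therapeutic use)."""
    if not code.startswith("Y") or len(code) < 3:
        return False
    try:
        return 40 <= int(code[1:3]) <= 59
    except ValueError:
        return False

# Hypothetical causality ratings for a few drug-related ICD-10 diagnoses;
# only A1/A2 (high / very high probability) codes count toward the flag.
# Real analyses use the full published rating lists.
CAUSALITY = {"L27.0": "A1", "D61.1": "A1", "E03.2": "A2", "I95.2": "B"}

def is_adr_related(diagnoses, external_causes):
    """Record-level flag: ADR-related if any external cause code falls
    in Y40-Y59 OR any diagnosis carries an A1/A2 causality rating."""
    if any(is_y40_y59(c) for c in external_causes):
        return True
    return any(CAUSALITY.get(d) in ("A1", "A2") for d in diagnoses)
```

The abstract's 10.4% increment corresponds to records caught by the second branch but not the first.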

  2. Solution weighting for the SAND-II Monte Carlo code

    International Nuclear Information System (INIS)

    Oster, C.A.; McElroy, W.N.; Simons, R.L.; Lippincott, E.P.; Odette, G.R.

    1976-01-01

    Modifications to the SAND-II Error Analysis Monte Carlo code to include solution weighting based on input data uncertainties have been made and are discussed, together with background information on the SAND-II algorithm. The new procedure permits input data having smaller uncertainties to have a greater influence on the solution spectrum than data having larger uncertainties. The results of an in-depth study to find a practical procedure, and the first results of its application to three important Interlaboratory LMFBR Reaction Rate (ILRR) program benchmark spectra (CFRMF, ΣΣ, and 235U fission), are discussed
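
    The weighting principle described — input data with smaller uncertainties should influence the solution more — is the familiar inverse-variance rule. The sketch below illustrates it on a plain weighted average; it is a generic illustration of the principle, not the SAND-II algorithm.

```python
def inverse_variance_weights(sigmas):
    """Normalized weights proportional to 1/sigma^2, so that data with
    smaller uncertainty carry more influence (generic illustration)."""
    w = [1.0 / (s * s) for s in sigmas]
    total = sum(w)
    return [wi / total for wi in w]

def weighted_solution(values, sigmas):
    """Inverse-variance-weighted combination of the input values."""
    w = inverse_variance_weights(sigmas)
    return sum(wi * v for wi, v in zip(w, values))
```

A value known to 1% thus pulls the solution far harder than one known only to 10%, which is exactly the behavior the modified code gives the input spectra.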

  3. LSB-Based Steganography Using Reflected Gray Code

    Science.gov (United States)

    Chen, Chang-Chu; Chang, Chin-Chen

    Steganography aims to hide secret data in an innocuous cover medium for transmission, so that an attacker cannot easily recognize the presence of the secret data. Even if the stego-medium is captured by an eavesdropper, the slight distortion is hard to detect. LSB-based data hiding is one of the steganographic methods, used to embed secret data into the least significant bits of the pixel values of a cover image. In this paper, we propose an LSB-based scheme using reflected Gray code, which is applied to determine the embedded bit from the secret information. Following the transformation rule, the LSBs of the stego-image are not always equal to the secret bits, and experiments show that they differ in up to almost 50% of cases. According to mathematical deduction and experimental results, the proposed scheme has the same image quality and payload as the simple LSB substitution scheme. In fact, the proposed data hiding scheme in the case of the G1 (one-bit Gray code) system is equivalent to the simple LSB substitution scheme.
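
    The one-bit (G1) case can be sketched directly: embed a secret bit so that the least significant bit of the pixel's reflected Gray code equals it. As the abstract notes, the stego LSB then need not equal the secret bit, yet each pixel changes by at most 1, matching the distortion of simple LSB substitution. This is an illustrative reading of the scheme, not the authors' code.

```python
def gray(n):
    """Reflected binary Gray code of n."""
    return n ^ (n >> 1)

def embed(pixel, bit):
    """Embed one secret bit so that gray(pixel) & 1 == bit.
    Since bit 0 of gray(p) is p0 XOR p1, flipping the pixel's own LSB
    flips the Gray-code LSB, so at most a +/-1 change is ever needed."""
    if gray(pixel) & 1 == bit:
        return pixel
    return pixel ^ 1

def extract(pixel):
    """Recover the secret bit from a stego pixel."""
    return gray(pixel) & 1
```

For example, pixel 2 (binary 10) has Gray code 11, so it already carries secret bit 1 even though its own LSB is 0 — the stego LSB and the secret bit disagree, exactly as the abstract describes.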

  4. Properties of an Arithmetic Code for Geodesic Flows

    International Nuclear Information System (INIS)

    Chaves, Daniel P B; Palazzo, Reginaldo Jr; Rios Leite, Jose R

    2011-01-01

    Topological analysis of chaotic dynamical systems emerged in the nineties as a powerful tool for studying strange attractors in low-dimensional dynamical systems. It is based on identifying the stretching and squeezing mechanisms responsible for creating a strange attractor and on organizing all the unstable periodic orbits in that attractor, and it is concerned with the manifold generated by the chaotic system. As mathematical objects, manifolds have a well-studied geometric and algebraic structure, particularly in the case of compact surfaces. Intending to use this structure in the analysis and application of chaotic systems through their topological characteristics, we determine the properties of geodesic codes for compact surfaces that are necessary for constructing encoders from the symbolic sequences of experimental data generated by the unstable periodic orbits of the strange attractor (related to the behavior changes of the system as control parameters vary) to geodesic code sequences, which permits the surface structure to be used to study the system's orbits.

  5. Claims in civil engineering contracts

    CERN Document Server

    Speirs, N A

    1999-01-01

    This paper considers claims arising during civil engineering construction contracts. The meaning of the word 'claim' is considered, along with its possible implications for additional cost and time to completion. The conditions of the construction contract selected will influence the risk apportionment between contractor and client and the price offered by the contractor for the work. Competitive bidding constraints and profit margins in the construction industry, however, may also influence the price offered, and this in turn can influence the likelihood of claims arising. The client, for his part, is concerned to complete the work within an agreed time and budget. The circumstances under which claims may arise are reviewed in relation to typical conditions of contract, and these circumstances are then related to the CERN LHC civil works. Ways of avoiding claims, where this is possible, are considered. Finally, the means of evaluating and settling claims are considered.

  6. Performance analysis of multiple interference suppression over asynchronous/synchronous optical code-division multiple-access system based on complementary/prime/shifted coding scheme

    Science.gov (United States)

    Nieh, Ta-Chun; Yang, Chao-Chin; Huang, Jen-Fa

    2011-08-01

    A complete complementary/prime/shifted prime (CPS) code family for optical code-division multiple-access (OCDMA) systems is proposed. Thanks to the properties of complete complementary (CC) codes, multiple-access interference (MAI) can be suppressed and eliminated in a spectral amplitude coding (SAC) OCDMA system under asynchronous or synchronous transmission. By utilizing the shifted prime (SP) code in the SAC scheme, the hardware implementation of the encoder/decoder can be simplified with a reduced number of optical components, such as arrayed waveguide gratings (AWG) and fiber Bragg gratings (FBG). This system has superior performance compared with previous bipolar-bipolar coding OCDMA systems.

  7. Second WCB claims: who is at risk?

    Science.gov (United States)

    Cherry, Nicola M; Sithole, Fortune; Beach, Jeremy R; Burstyn, Igor

    2010-01-01

    Many workers with one Workers' Compensation Board (WCB) claim make further claims. If the characteristics of the job, the initial injury, or the worker were predictive of an early second claim, interventions at the time of return to work after the first claim might be effective in reducing the burden of work-related injury. This report explores the characteristics of those who make a second claim. Records of all Alberta WCB claims from January 1, 1995, to December 31, 2004, for individuals aged 18 or over were extracted, including the date of first claim, sex and age of claimant, type of injury, type of accident, occupation, industry, an indicator of company size, and industry claim rate, as well as the date of any second claim. The likelihood of a second claim and the mean time to second claim were estimated. Multivariate analyses were performed using Cox regression. 1,047,828 claims were identified from 490,230 individuals. Of these, 49.2% had at least two claims. In the multivariate model, a reduced time to second claim was associated with male sex, younger age, and some types of injury and accident. Machining trades were at highest risk of an early second claim (hazard ratio [HR] 2.54 compared with administration), and among industry sectors manufacturing was at highest risk (HR 1.37 compared with business, personal and professional services). Some caution is needed in interpreting these data, as they may be affected by under-reporting and job changes between claims. Nonetheless, they suggest that there remains room for interventions to reduce the considerable differences in risk of a second claim among workers, jobs and industries.

  8. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    Science.gov (United States)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest High Efficiency Video Coding (HEVC) standard significantly increases encoding complexity in exchange for improved coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Given the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme, chosen through offline statistics, is applied at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (down to 10% of full complexity) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, average gains of 0.63 and 0.17 dB in BD-PSNR are observed over 18 sequences when the target complexity is around 40%.
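
    The flavor of complexity-constrained mode selection can be sketched with a greedy budget rule: spend the remaining encoding-time budget on the prediction modes with the best expected gain per unit cost. The mode names, costs, and gains below are invented; the paper's method uses offline statistics and adaptive mode sorting rather than this plain greedy rule.

```python
def choose_mode_set(budget, modes):
    """Pick prediction modes to evaluate under a complexity budget.

    `modes` is a list of (name, time_cost, expected_rd_gain) tuples
    (all hypothetical here).  Greedy by gain-per-cost: keep adding the
    most 'efficient' mode that still fits in the remaining budget.
    """
    chosen, spent = [], 0.0
    for name, cost, gain in sorted(modes, key=lambda m: m[2] / m[1], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent
```

Shrinking the budget shrinks the candidate mode set, which is how encoding time is traded against rate-distortion performance.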

  9. GPU-accelerated 3D neutron diffusion code based on finite difference method

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Q.; Yu, G.; Wang, K. [Dept. of Engineering Physics, Tsinghua Univ. (China)]

    2012-07-01

    The finite difference method, a traditional numerical solution to the neutron diffusion equation, although considered simpler and more precise than coarse-mesh nodal methods, faces a bottleneck to wide application in the huge memory and unendurable computation time it requires. In recent years, the concept of general-purpose computation on GPUs has provided a powerful computational engine for scientific research. In this study, a GPU-accelerated multi-group 3D neutron diffusion code based on the finite difference method was developed. First, a clean-sheet neutron diffusion code (3DFD-CPU) was written in C++ on the CPU architecture, and later ported to GPUs under NVIDIA's CUDA platform (3DFD-GPU). The IAEA 3D PWR benchmark problem was calculated in the numerical test, where three different codes, including the original CPU-based sequential code, the HYPRE (High Performance Preconditioners)-based diffusion code, and CITATION, were used as reference points to test the efficiency and accuracy of the GPU-based program. The results demonstrate both high efficiency and adequate accuracy of the GPU implementation for the neutron diffusion equation. A speedup factor of about 46 was obtained using an NVIDIA GeForce GTX470 GPU card against a 2.50 GHz Intel Quad Q9300 CPU processor. Compared with the HYPRE-based code running in parallel on an 8-core tower server, a speedup of about 2 could still be observed. More encouragingly, without any mathematical acceleration technology, the GPU implementation ran about 5 times faster than CITATION, which was sped up by using the SOR method and the Chebyshev extrapolation technique. (authors)

  10. GPU-accelerated 3D neutron diffusion code based on finite difference method

    International Nuclear Information System (INIS)

    Xu, Q.; Yu, G.; Wang, K.

    2012-01-01

    The finite difference method, a traditional numerical solution to the neutron diffusion equation, although considered simpler and more precise than coarse-mesh nodal methods, faces a bottleneck to wide application in the huge memory and unendurable computation time it requires. In recent years, the concept of general-purpose computation on GPUs has provided a powerful computational engine for scientific research. In this study, a GPU-accelerated multi-group 3D neutron diffusion code based on the finite difference method was developed. First, a clean-sheet neutron diffusion code (3DFD-CPU) was written in C++ on the CPU architecture, and later ported to GPUs under NVIDIA's CUDA platform (3DFD-GPU). The IAEA 3D PWR benchmark problem was calculated in the numerical test, where three different codes, including the original CPU-based sequential code, the HYPRE (High Performance Preconditioners)-based diffusion code, and CITATION, were used as reference points to test the efficiency and accuracy of the GPU-based program. The results demonstrate both high efficiency and adequate accuracy of the GPU implementation for the neutron diffusion equation. A speedup factor of about 46 was obtained using an NVIDIA GeForce GTX470 GPU card against a 2.50 GHz Intel Quad Q9300 CPU processor. Compared with the HYPRE-based code running in parallel on an 8-core tower server, a speedup of about 2 could still be observed. More encouragingly, without any mathematical acceleration technology, the GPU implementation ran about 5 times faster than CITATION, which was sped up by using the SOR method and the Chebyshev extrapolation technique. (authors)
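
    A minimal version of the underlying numerics: one-group, one-dimensional diffusion with zero-flux boundaries, discretized by central differences and solved with Jacobi sweeps. This toy stands in for the multigroup 3-D solver; the point is that each sweep updates every node from its neighbors' old values independently, which is exactly the access pattern that ports well to a GPU.

```python
def solve_diffusion_1d(n, h, D, sigma_a, source, tol=1e-10, max_iter=100000):
    """Solve  -D*phi'' + Sigma_a*phi = S  on a slab, zero-flux boundaries.

    Central differences give (2D/h^2 + Sigma_a)*phi_i =
    S + (D/h^2)*(phi_{i-1} + phi_{i+1}); Jacobi iteration updates all
    nodes from the previous sweep's values (embarrassingly parallel).
    Returns the flux at the n interior nodes.
    """
    phi = [0.0] * (n + 2)                 # padded with fixed zero boundaries
    a = D / (h * h)
    diag = 2.0 * a + sigma_a
    for _ in range(max_iter):
        new = phi[:]
        delta = 0.0
        for i in range(1, n + 1):
            new[i] = (source + a * (phi[i - 1] + phi[i + 1])) / diag
            delta = max(delta, abs(new[i] - phi[i]))
        phi = new
        if delta < tol:
            break
    return phi[1:-1]
```

Far from the boundaries the flux approaches the infinite-medium value S/Σa, and the profile is symmetric with its maximum at the slab center.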

  11. 13 CFR 114.104 - What evidence and information may SBA require relating to my claim?

    Science.gov (United States)

    2010-01-01

    ... BUSINESS ADMINISTRATION ADMINISTRATIVE CLAIMS UNDER THE FEDERAL TORT CLAIMS ACT AND REPRESENTATION AND... relevant to the government's alleged liability or the damages you claim. (c) For a claim based on death: (1) An authenticated death certificate or other competent evidence showing cause of death, date of death...

  12. Valuation of Non-Life Liabilities from Claims Triangles

    Directory of Open Access Journals (Sweden)

    Mathias Lindholm

    2017-07-01

    Full Text Available This paper provides a complete program for the valuation of aggregate non-life insurance liability cash flows based on claims triangle data. The valuation is fully consistent with the principle of valuing the liability at the cost of transferring it to a so-called reference undertaking that is subject to capital requirements throughout the run-off of the liability cash flow. The valuation program includes complete details on parameter estimation, bias correction and conservative estimation of the value of the liability under partial information. The latter is based on a new approach to estimating the mean squared error of claims reserve prediction.
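
    As a reference point for what "claims triangle data" provides, the classical chain-ladder completion of a cumulative run-off triangle is sketched below. The paper's valuation program builds cost-of-capital transfer pricing, bias correction, and conservative estimation on top of this kind of input, so this is a baseline only.

```python
def chain_ladder(triangle):
    """Complete a cumulative run-off triangle with volume-weighted
    chain-ladder development factors.

    `triangle[i]` holds cumulative claims for accident year i, one
    entry per observed development year (later years have fewer).
    """
    n = len(triangle)
    # Development factors from pairs of observed adjacent columns only.
    factors = []
    for j in range(1, n):
        num = sum(row[j] for row in triangle if len(row) > j)
        den = sum(row[j - 1] for row in triangle if len(row) > j)
        factors.append(num / den)
    # Project every row out to full length with those factors.
    full = [row[:] for row in triangle]
    for row in full:
        for j in range(len(row), n):
            row.append(row[-1] * factors[j - 1])
    return full
```

The difference between a row's projected ultimate and its last observed entry is that accident year's reserve.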

  13. Workers Compensation Claim Data -

    Data.gov (United States)

    Department of Transportation — This data set contains DOT employee workers compensation claim data for current and past DOT employees. Types of data include claim data consisting of PII data (SSN,...

  14. Blackout sequence modeling for Atucha-I with MARCH3 code

    International Nuclear Information System (INIS)

    Baron, J.; Bastianelli, B.

    1997-01-01

    The modeling of a blackout sequence in the Atucha I nuclear power plant is presented in this paper as a preliminary phase of a level II probabilistic safety assessment. The sequence is analyzed with the MARCH3 code from the Source Term Code Package (STCP), based on a specific model developed for Atucha that takes into account its peculiarities. The analysis includes all the severe accident phases, from the initial transient (loss of heat sink), through loss of coolant via the safety valves, core uncovery, heatup, metal-water reaction, melting and relocation, heatup and failure of the pressure vessel, and core-concrete interaction in the reactor cavity, to heatup and failure of the (multi-compartment) containment building due to quasi-static overpressurization. The results obtained permit visualization of the time sequence of these events and provide the basis for source term studies. (author) [es

  15. 32 CFR 842.110 - Claims not payable.

    Science.gov (United States)

    2010-07-01

    ...) Claims for a maritime occurrence covered under U.S. admiralty laws. (o) Claims for: (1) Any tax or... International Agreements Claims Act. (4) The Air Force Admiralty Claims Act and the Admiralty Extensions Act. (5...) Claims from the combat activities of the armed forces during war or armed conflict. (c) Claims for...

  16. Claimed Versus Calculated Cue-Weighting Systems for Screening Employee Applicants

    Science.gov (United States)

    Blevins, David E.

    1975-01-01

    This research compares the cue-weighting system which assessors claimed they used with the cue-weighting system one would infer they used based on multiple observations of their assessing behavior. The claimed cue-weighting systems agreed poorly with the empirically calculated cue-weighting systems for all assessors except one who utilized only…
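
    The "calculated" weighting system in studies like this is conventionally obtained by policy capturing: regressing the ratings an assessor actually gave on the cue values shown. A minimal two-cue, no-intercept least-squares sketch (not the study's exact procedure):

```python
def inferred_cue_weights(cues, ratings):
    """Infer an assessor's cue weights by ordinary least squares.

    `cues` is a list of (x1, x2) applicant cue values; `ratings` the
    assessor's scores.  Solves the 2x2 normal equations directly
    (two cues, no intercept) -- a minimal policy-capturing sketch.
    """
    s11 = sum(x1 * x1 for x1, _ in cues)
    s22 = sum(x2 * x2 for _, x2 in cues)
    s12 = sum(x1 * x2 for x1, x2 in cues)
    b1 = sum(x1 * y for (x1, _), y in zip(cues, ratings))
    b2 = sum(x2 * y for (_, x2), y in zip(cues, ratings))
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det,
            (s11 * b2 - s12 * b1) / det)
```

Comparing these fitted weights with the weights assessors say they use is exactly the claimed-versus-calculated contrast the abstract describes.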

  17. Finger Vein Recognition Based on Local Directional Code

    Science.gov (United States)

    Meng, Xianjing; Yang, Gongping; Yin, Yilong; Xiao, Rongyang

    2012-01-01

    Finger vein patterns are considered one of the most promising biometric authentication methods for their security and convenience. Most currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade recognition accuracy, binary-pattern-based methods have been proposed, such as the Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by these existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called the Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary (base-8) digit. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP. PMID:23202194

  18. Finger Vein Recognition Based on Local Directional Code

    Directory of Open Access Journals (Sweden)

    Rongyang Xiao

    2012-11-01

    Full Text Available Finger vein patterns are considered one of the most promising biometric authentication methods for their security and convenience. Most currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade recognition accuracy, binary-pattern-based methods have been proposed, such as the Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by these existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called the Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary (base-8) digit. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP.
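
    The core idea — coding local gradient orientation as a base-8 digit — can be sketched as follows; the published LDC builds its codes from a specific operator, so treat the gradient stencil and quantization here as an illustrative simplification.

```python
import math

def ldc_codes(image):
    """Quantize each interior pixel's gradient orientation into one of
    eight directions (an 'octonary' digit 0..7).

    `image` is a 2-D list of intensities.  Central differences give the
    gradient; atan2 gives its orientation, which is binned into pi/4
    sectors.  Border pixels keep code 0 for simplicity.
    """
    h, w = len(image), len(image[0])
    codes = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]
            gy = image[y + 1][x] - image[y - 1][x]
            angle = math.atan2(gy, gx) % (2 * math.pi)
            codes[y][x] = int(angle // (math.pi / 4)) % 8
    return codes
```

A horizontal intensity ramp yields code 0 (gradient pointing right), a vertical ramp code 2 (gradient pointing down), and so on around the eight sectors.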

  19. Scintillator Based Coded-Aperture Imaging for Neutron Detection

    International Nuclear Information System (INIS)

    Hayes, Sean-C.; Gamage, Kelum-A-A.

    2013-06-01

    In this paper we assess the variations in neutron images using a series of Monte Carlo simulations. We study neutron images of the same neutron source at different source locations, using a scintillator-based coded-aperture system. The Monte Carlo simulations were conducted making use of the EJ-426 neutron scintillator detector. This type of detector has a low sensitivity to gamma rays and is therefore of particular use in a system whose source emits a mixed radiation field. Using different source locations, several neutron images have been produced and compared, both qualitatively and quantitatively, for each case. This allows conclusions to be drawn on how well suited the scintillator-based coded-aperture neutron imaging system is to detecting various neutron source locations. This type of neutron imaging system can be easily used to identify and locate nuclear materials precisely. (authors)
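
    Coded-aperture imaging recovers the source distribution by correlating the detector counts with a decoding pattern matched to the mask. The idealized, noise-free 1-D sketch below uses a length-7 quadratic-residue mask, whose balanced decoder gives a delta-like point-spread function; the actual system is 2-D and scintillator-based.

```python
def correlate(a, b):
    """Circular cross-correlation r[s] = sum_i a[(i+s) % n] * b[i]."""
    n = len(a)
    return [sum(a[(i + s) % n] * b[i] for i in range(n)) for s in range(n)]

# Open positions 1, 2, 4 are the quadratic residues mod 7; the balanced
# decoder (+1 where open, -1 where closed) then has a flat sidelobe level.
MASK = [0, 1, 1, 0, 1, 0, 0]
DECODER = [1 if m else -1 for m in MASK]

def detector_counts(source):
    """Idealized detector: each open mask element casts a shifted copy
    of the source distribution onto the detector (no noise, no blur)."""
    n = len(MASK)
    d = [0.0] * n
    for j, open_ in enumerate(MASK):
        if open_:
            for s, flux in enumerate(source):
                d[(s + j) % n] += flux
    return d

def reconstruct(detector):
    """Decoded image: peaks at the true source position."""
    return correlate(detector, DECODER)
```

For a point source, the reconstruction peaks (value 3, the number of open elements) at the source position, with a uniform -1 elsewhere — which is why moving the source moves the peak rather than reshaping it.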

  20. THE IMPACT OF SHRINKING HANFORD BOUNDARIES ON PERMITS FOR TOXIC AIR POLLUTANT EMISSIONS FROM THE HANFORD 200 WEST AREA

    International Nuclear Information System (INIS)

    JOHNSON, R.E.

    2005-01-01

    This presentation (CE-580, Graduate Seminar) gives a brief description of an approach that uses a simpler dispersion modeling method (SCREEN3) in conjunction with joint frequency tables for Hanford wind conditions to evaluate the impacts of shrinking the Hanford boundaries on the current permits for facilities in the 200 West Area. To fulfill requirements for the graduate student project (CE-702, Master's Special Problems), this evaluation will be completed and published over the next two years. Toxic air emissions play an important role in environmental quality and require a state-approved permit. One example relates to containers of waste designated as Transuranic Waste (TRU), which are required to have venting devices due to hydrogen generation. The Washington State Department of Ecology (Ecology) determined that the filters used did not meet the definition of a "pressure relief device" and that a permit application would have to be submitted by the Central Waste Complex (CWC) for criteria pollutant and toxic air pollutant (TAP) emissions in accordance with Washington Administrative Code (WAC) 173-400 and 173-460. The permit application submitted to Ecology in 2000 used Industrial Source Complex (ISC3) dispersion modeling to demonstrate that it was not possible for the CWC to release a sufficient quantity of fugitive TAP emissions to exceed the Acceptable Source Impact Levels (ASILs) at the Hanford Site boundary. The modeled emission rates were based on diurnal breathing in and out through the vented drums (approximately 20% of the drums), using published vapor pressure, molecular weight, and specific gravity data for all 600+ compounds, with a conservative estimate of one exchange volume per day (208 liters per drum). Two further permit applications were submitted to Ecology, for the Waste Receiving and Processing Facility and the T Plant Complex. Both permit applications were based on the Central Waste Complex approach, and

  1. Triboelectric-Based Transparent Secret Code.

    Science.gov (United States)

    Yuan, Zuqing; Du, Xinyu; Li, Nianwu; Yin, Yingying; Cao, Ran; Zhang, Xiuling; Zhao, Shuyu; Niu, Huidan; Jiang, Tao; Xu, Weihua; Wang, Zhong Lin; Li, Congju

    2018-04-01

    Private and secure information for personal identification requires an encryption tool that extends communication channels between human and machine through a convenient and secure method. Here, a triboelectric-based transparent secret code (TSC) that enables self-powered sensing and information identification simultaneously in a rapid process is reported. The transparent and hydrophobic TSC can conform to any cambered surface due to its high flexibility, which greatly extends its application scenarios. Independent of a power source, the TSC induces distinct electric signals on surface contact alone. The TSC is velocity-dependent and capable of achieving a peak voltage of ≈4 V at a resistance load of 10 MΩ and a sliding speed of 0.1 m s⁻¹, for a 2 mm × 20 mm rectangular stripe. The fabricated TSC maintains its performance after about 5000 cycles of reciprocating rolling. Applications of the TSC as a self-powered code device are demonstrated: the ordered signals can be recognized through the heights of the electric peaks, which can be further translated into specific information by the processing program. The designed TSC has great potential in personal identification, commodity circulation, valuables management, and security defense applications.

  2. SCDAP/RELAP5/MOD 3.1 Code Manual: Developmental assessment. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Hohorst, J.K.; Johnsen, E.C. [eds.]; Allison, C.M. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)]

    1995-06-01

    The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and fission product release during a severe accident transient, as well as large- and small-break loss-of-coolant accidents and operational transients such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. This volume contains detailed code-to-data comparisons performed using SCDAP/RELAP5/MOD3.1, as well as comparison calculations performed with earlier code versions. Results of full-plant calculations, which include Surry, TMI-2, and Browns Ferry, are described. Results of a nodalization study, which accounted for both axial and radial nodalization of the core, are also reported.

  3. SCDAP/RELAP5/MOD 3.1 Code Manual: Developmental assessment. Volume 5

    International Nuclear Information System (INIS)

    Hohorst, J.K.; Johnsen, E.C.

    1995-06-01

    The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of Light Water Reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and fission product release during a severe accident transient, as well as large and small break loss of coolant accidents and operational transients such as anticipated transient without SCRAM, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. This volume contains detailed code-to-data calculations performed using SCDAP/RELAP5/MOD3.1, as well as comparison calculations performed with earlier code versions. Results of full-plant calculations, which include Surry, TMI-2, and Browns Ferry, are described. Results of a nodalization study, which accounted for both axial and radial nodalization of the core, are also reported.

  4. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating a 3-D reactor core kinetics analysis code, MASTER, into the system transient code, RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by the OECD/NEA to verify the performance of coupled kinetics and system transient codes.

  5. Variable disparity-motion estimation based fast three-view video coding

    Science.gov (United States)

    Bae, Kyung-Hoon; Kim, Seung-Cheol; Hwang, Yong Seok; Kim, Eun-Soo

    2009-02-01

    In this paper, variable disparity-motion estimation (VDME) based 3-view video coding is proposed. In the encoding, key-frame coding (KFC) based motion estimation and variable disparity estimation (VDE) are processed for effectively fast three-view video encoding. The proposed algorithms enhance the performance of the 3-D video encoding/decoding system in terms of accuracy of disparity estimation and computational overhead. In experiments on the 'Pot Plant' and 'IVO' stereo sequences, the proposed algorithm achieves PSNRs of 37.66 and 40.55 dB, with processing times of 0.139 and 0.124 sec/frame, respectively.
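
    The PSNR figures quoted above follow the standard definition PSNR = 10·log10(MAX²/MSE). A minimal reference implementation for 8-bit samples (the toy pixel values are our own, not the paper's data):

```python
import math

def psnr(original, reconstructed, max_val=255):
    """Peak signal-to-noise ratio in dB for two equal-length 8-bit pixel lists."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical signals: distortion-free
    return 10 * math.log10(max_val ** 2 / mse)

# Toy 4-pixel "frame": every pixel is off by 1, so MSE = 1 and
# PSNR = 10*log10(255^2) ~= 48.13 dB.
print(round(psnr([100, 101, 102, 103], [101, 100, 103, 102]), 2))  # -> 48.13
```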

  6. Regulations, Codes, and Standards (RCS) Template for California Hydrogen Dispensing Stations

    Energy Technology Data Exchange (ETDEWEB)

    Rivkin, C.; Blake, C.; Burgess, R.; Buttner, W.; Post, M.

    2012-11-01

    This report explains the Regulations, Codes, and Standards (RCS) requirements for hydrogen dispensing stations in the State of California. The report shows, in a simple schematic drawing, the basic components of a hydrogen dispensing station; the permits and approvals that would typically be required for the construction and operation of a hydrogen dispensing station; and a basic permit that might be employed by an Authority Having Jurisdiction (AHJ).

  7. FEAST: a two-dimensional non-linear finite element code for calculating stresses

    International Nuclear Information System (INIS)

    Tayal, M.

    1986-06-01

    The computer code FEAST calculates stresses, strains, and displacements. The code is two-dimensional: either plane or axisymmetric calculations can be done. The code models elastic, plastic, creep, and thermal strains and stresses. Cracking can also be simulated. The finite element method is used to solve equations describing the following fundamental laws of mechanics: equilibrium; compatibility; constitutive relations; yield criterion; and flow rule. FEAST combines several unique features that permit large time-steps in even severely non-linear situations. The features include a special formulation permitting many finite elements to cross the boundary from elastic to plastic behaviour simultaneously; accommodation of large drops in yield strength due to changes in local temperature; and a three-step predictor-corrector method for plastic analyses. These features reduce computing costs. Comparisons against twenty analytical solutions and against experimental measurements show that predictions of FEAST are generally accurate to ±5%.

  8. IBO Claim Taking Project

    Data.gov (United States)

    Social Security Administration — IBO manually tracks all Canadian Claims and DSU claims via this report. It also provides a summary for each region and office of origin that the DSU works with. This...

  9. Health and Stress Management and Mental-health Disability Claims.

    Science.gov (United States)

    Marchand, Alain; Haines, Victor Y; Harvey, Steve; Dextras-Gauthier, Julie; Durand, Pierre

    2016-12-01

    This study examines the associations between health and stress management (HSM) practices and mental-health disability claims. Data from the Salveo study were collected during 2009-2012 within 60 workplaces nested in 37 companies located in Canada (Quebec) and insured by a large insurance company. In each company, 1-h interviews were conducted with human resources managers in order to obtain data on 63 HSM practices. Companies and workplaces were sorted into low-claims and high-claims groups according to the median rate of the population of the insurer's corporate clients. Logistic regression adjusted for design effect and multidimensional scaling was used to analyse the data. After controlling for company size and economic sector, task design, demands control, gratifications, physical activity and work-family balance were associated with low mental-health disability claims rates. Further analyses revealed three company profiles that were qualified as laissez-faire, integrated and partially integrated approaches to HSM. Of the three, the integrated profile was associated with low mental-health disability claims rates. The results of this study provide evidence-based guidance for a better control of mental-health disability claims. Copyright © 2015 John Wiley & Sons, Ltd.

  10. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs (PP-CPNs), a subclass of CPNs equipped with an explicit separation of process control flow, message passing, and access to shared and local data. We show how PP-CPNs cater for a four-phase, structure-based automatic code generation process directed by the control flow of processes. The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF).

  11. RADTRAN: a computer code to analyze transportation of radioactive material

    International Nuclear Information System (INIS)

    Taylor, J.M.; Daniel, S.L.

    1977-04-01

    A computer code is presented which predicts the environmental impact of any specific scheme of radioactive material transportation. Results are presented in terms of annual latent cancer fatalities and annual early fatality probability resulting from exposure during normal transportation or transport accidents. The code is developed in a generalized format to permit wide application, including normal transportation analysis, consideration of alternatives, and detailed consideration of specific sectors of industry.
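
    The expected-value bookkeeping that a code like RADTRAN automates can be illustrated with a deliberately simplified calculation: latent cancer fatalities as the sum over exposure pathways of collective dose times a risk coefficient. All numbers below (shipment counts, doses, and the 0.05/Sv coefficient) are hypothetical placeholders, not RADTRAN data or models.

```python
# Toy illustration of annual latent-cancer-fatality accounting; the pathway
# figures and the risk coefficient are invented for demonstration only.
RISK_PER_PERSON_SV = 0.05  # ICRP-style fatal-cancer risk coefficient, per sievert

pathways = [  # (shipments/yr, persons exposed per shipment, avg dose per person in Sv)
    (200, 500, 1e-7),  # people along the normal transport route
    (200, 50, 5e-7),   # people near stops and handling operations
]

expected_fatalities = sum(n * people * dose * RISK_PER_PERSON_SV
                          for n, people, dose in pathways)
print(f"{expected_fatalities:.2e}")  # expected annual latent cancer fatalities
```

    A real analysis would add accident scenarios weighted by their probabilities; the structure of the sum stays the same.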

  12. State Waste Discharge Permit application, 183-N Backwash Discharge Pond

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    As part of the Hanford Federal Facility Agreement and Consent Order negotiations (Ecology et al. 1994), the US Department of Energy, Richland Operations Office, the US Environmental Protection Agency, and the Washington State Department of Ecology agreed that liquid effluent discharges to the ground on the Hanford Site which affect groundwater, or have the potential to affect groundwater, would be subject to permitting under the structure of Chapter 173-216 (or 173-218 where applicable) of the Washington Administrative Code, the State Waste Discharge Permit Program. As a result of this decision, the Washington State Department of Ecology and the US Department of Energy, Richland Operations Office entered into Consent Order No. DE91NM-177 (Ecology and DOE-RL 1991). Consent Order No. DE91NM-177 requires a series of permitting activities for liquid effluent discharges. Liquid effluents on the Hanford Site have been classified as Phase I, Phase II, and Miscellaneous Streams. Consent Order No. DE91NM-177 establishes milestones for State Waste Discharge Permit application submittals for all Phase I and Phase II streams, as well as the following 11 Miscellaneous Streams, as identified in Table 4 of Consent Order No. DE91NM-177.

  13. Does a Claims Diagnosis of Autism Mean a True Case?

    Science.gov (United States)

    Burke, James P.; Jain, Anjali; Yang, Wenya; Kelly, Jonathan P.; Kaiser, Marygrace; Becker, Laura; Lawer, Lindsay; Newschaffer, Craig J.

    2014-01-01

    The purpose of this study was to validate autism spectrum disorder cases identified through claims-based case identification algorithms against a clinical review of medical charts. Charts were reviewed for 432 children who fell into one of three groups: (a) two or more claims with an autism spectrum disorder diagnosis…

  14. Secure-Network-Coding-Based File Sharing via Device-to-Device Communication

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2017-01-01

    In order to increase the efficiency and security of file sharing in next-generation networks, this paper proposes a large-scale file sharing scheme based on secure network coding via device-to-device (D2D) communication. In our scheme, when a user needs to share data with others in the same area, the source node and all the intermediate nodes need to perform the secure network coding operation before forwarding the received data. This process continues until all the mobile devices in the networks successfully recover the original file. The experimental results show that secure network coding is very feasible and suitable for such file sharing. Moreover, the sharing efficiency and security outperform those of the traditional replication-based sharing scheme.
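
    The per-hop recoding idea can be illustrated with plain random linear network coding over GF(2); the paper's scheme additionally protects the coding operation against eavesdroppers, which this sketch omits. Each coded packet carries a random coefficient vector plus the XOR of the selected source packets, and a receiver recovers the originals by Gaussian elimination once it has collected a full-rank set.

```python
import random

def encode(packets, k, rng):
    """One coded packet: (random nonzero GF(2) coefficient vector, XOR payload)."""
    while True:
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if any(coeffs):
            break
    payload = 0
    for c, p in zip(coeffs, packets):
        if c:
            payload ^= p
    return coeffs, payload

def decode(coded, k):
    """Gauss-Jordan elimination over GF(2); None if the packets are not full rank."""
    rows = [(list(c), p) for c, p in coded]
    for col in range(k):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            return None  # rank deficient: need more coded packets
        rows[col], rows[pivot] = rows[pivot], rows[col]
        pivot_coeffs, pivot_payload = rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                c, p = rows[i]
                rows[i] = ([a ^ b for a, b in zip(c, pivot_coeffs)],
                           p ^ pivot_payload)
    return [rows[i][1] for i in range(k)]

# A source shares three packets; relays forward fresh random combinations
# until a receiver can solve for the originals.
source = [0x41, 0x42, 0x43]
rng = random.Random(7)
recovered = None
while recovered is None:  # retry until a full-rank set of packets arrives
    coded = [encode(source, 3, rng) for _ in range(3)]
    recovered = decode(coded, 3)
print(recovered == source)  # -> True
```

    Production schemes work over larger fields such as GF(256) and secure the coefficients; GF(2) keeps the elimination step readable.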

  15. Field-based tests of geochemical modeling codes using New Zealand hydrothermal systems

    International Nuclear Information System (INIS)

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1994-06-01

    Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal fields suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions.
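
    For each candidate mineral, the fluid-solid comparison such codes perform reduces to a saturation index, and equivalently a chemical affinity, of the form SI = log10(IAP/K) and A = RT·ln(IAP/K). A toy version with made-up activities (not EQ3/6 output or data):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def saturation_index(ion_activity_product, k_eq):
    """SI = log10(IAP/K): zero at equilibrium, positive when supersaturated."""
    return math.log10(ion_activity_product / k_eq)

def affinity_kj(ion_activity_product, k_eq, temp_k):
    """Chemical affinity in kJ/mol; positive favours precipitation of the mineral."""
    return math.log(ion_activity_product / k_eq) * R * temp_k / 1000.0

# Hypothetical mineral: IAP ten times its equilibrium constant at 200 degC (473.15 K).
si = saturation_index(1e-8, 1e-9)
print(round(si, 6), round(affinity_kj(1e-8, 1e-9, 473.15), 2))  # -> 1.0 9.06
```

    Plotting such affinities against temperature for each mineral is what an affinity-temperature diagram, as mentioned above, amounts to.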

  16. Medicaid Drug Claims Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.

  17. Nine-year-old children use norm-based coding to visually represent facial expression.

    Science.gov (United States)

    Burton, Nichola; Jeffery, Linda; Skinner, Andrew L; Benton, Christopher P; Rhodes, Gillian

    2013-10-01

    Children are less skilled than adults at making judgments about facial expression. This could be because they have not yet developed adult-like mechanisms for visually representing faces. Adults are thought to represent faces in a multidimensional face-space, and have been shown to code the expression of a face relative to the norm, or average, face in face-space. Norm-based coding is economical and adaptive, and may be what makes adults more sensitive to facial expression than children. This study investigated the coding system that children use to represent facial expression. An adaptation aftereffect paradigm was used to test 24 adults and 18 children (9 years 2 months to 9 years 11 months old). Participants adapted to weak and strong antiexpressions. They then judged the expression of an average face. Adaptation created aftereffects that made the test face look like the expression opposite that of the adaptor. Consistent with the predictions of norm-based but not exemplar-based coding, aftereffects were larger for strong than weak adaptors in both age groups. Results indicate that, like adults, children's coding of facial expressions is norm-based. PsycINFO Database Record (c) 2013 APA, all rights reserved.
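
    The norm-based prediction tested above, larger aftereffects for stronger adaptors because the norm itself shifts toward the adaptor, can be captured in a toy one-dimensional model. This is our own illustration, not the study's analysis; the shift rate is an arbitrary constant.

```python
# Toy norm-based coding model: an expression is represented by its offset from
# the norm. Adapting to an anti-expression drags the norm toward the adaptor,
# so the (previously neutral) average face acquires an apparent offset in the
# opposite direction, and the offset grows with adaptor strength.
def aftereffect(adaptor_strength, shift_rate=0.3):
    """Perceived offset of the average face after adapting to an anti-expression."""
    norm_shift = shift_rate * adaptor_strength  # norm moves toward the adaptor
    return -norm_shift                          # test face appears opposite

weak, strong = aftereffect(0.5), aftereffect(1.0)
print(abs(strong) > abs(weak))  # -> True: stronger adaptors, larger aftereffects
```

    An exemplar-based account, by contrast, predicts no such monotonic growth, which is the contrast the experiment exploits.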

  18. Hanford Facility dangerous waste permit application, liquid effluent retention facility and 200 area effluent treatment facility

    International Nuclear Information System (INIS)

    Coenenberg, J.G.

    1997-01-01

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, 'operating' treatment, storage, and/or disposal units, such as the Liquid Effluent Retention Facility and 200 Area Effluent Treatment Facility (this document, DOE/RL-97-03). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1987 and 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needs defined by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Liquid Effluent Retention Facility and 200 Area Effluent Treatment Facility permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents Section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Liquid Effluent Retention Facility and 200 Area Effluent Treatment Facility permit application documentation makes cross-reference to the General Information Portion, rather than duplicating

  19. Hanford Facility dangerous waste permit application, liquid effluent retention facility and 200 area effluent treatment facility

    Energy Technology Data Exchange (ETDEWEB)

    Coenenberg, J.G.

    1997-08-15

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, 'operating' treatment, storage, and/or disposal units, such as the Liquid Effluent Retention Facility and 200 Area Effluent Treatment Facility (this document, DOE/RL-97-03). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1987 and 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needs defined by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Liquid Effluent Retention Facility and 200 Area Effluent Treatment Facility permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents Section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Liquid Effluent Retention Facility and 200 Area Effluent Treatment Facility permit application documentation makes cross-reference to the General Information Portion, rather than duplicating

  20. Nutritional characterisation of foods: Science-based approach to nutrient profiling - Summary report of an ILSI Europe workshop held in April 2006

    DEFF Research Database (Denmark)

    Tetens, Inge; Oberdörfer, R.; Madsen, C.

    2007-01-01

    The background of the workshop was the proposed EU legislation to regulate nutrition and health claims for foods in Europe. This regulation will require the development of a science-based nutrient profiling system in order to determine which foods or categories of foods will be permitted to make nutrition or health claims. Nutrient profiling can also be used to categorize foods, based on an assessment of their nutrient composition according to scientific principles. Today, various nutrient profiling schemes are available to classify foods based on their nutritional characteristics. The aim … profiles for the purpose of regulating nutrition and health claims. The 76 workshop participants were scientists from European academic institutions, research institutes, food standards agencies, food industry and other interested parties, all of whom contributed their thinking on this topic. The workshop

  1. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    Science.gov (United States)

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance in documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting. We implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before the intervention (FY2012) with prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary-care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management (E&M) codes as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased from 2.14 to 3.05, reflecting improved coding and billing of outpatient clinic encounters. Using externally audited coding data, we demonstrate significantly increased rates of higher-complexity E&M coding in a stable patient population based on improved documentation and billing awareness by the residents. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  2. Calculation code MIXSET for Purex process

    International Nuclear Information System (INIS)

    Gonda, Kozo; Fukuda, Shoji.

    1977-09-01

    MIXSET is a FORTRAN IV calculation code for the Purex process that simulates the dynamic behavior of solvent extraction processes in mixer-settlers. Two options permit terminating the dynamic phase by time or by achieving steady state; these options also permit continuing a calculation successively using new inputs from an arbitrary phase. A third option permits an artificial rapid close to steady state, and a fourth option permits searching for the optimum input that satisfies both the specification and the recovery rate of the product. MIXSET handles a chemical system of up to eight components, with or without mutual dependence of the distribution of the components. The chemical system in MIXSET includes chemical reactions and/or decay reactions. Distribution data can be supplied by third-power polynomial equations or tables, and kinetic data by tables or given constants. Fluctuation of the interfacial level height in the settler is converted into flow rate changes of the organic and aqueous streams to follow the dynamic behavior of the extraction process in detail. MIXSET can be applied to flowsheet studies, start-up and/or shut-down procedure studies, and real-time process management in countercurrent solvent extraction processes. (auth.)
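
    The steady-state behaviour such a code converges to can be illustrated with a drastically simplified countercurrent cascade: one solute, a linear distribution coefficient, and ideal equilibrium stages swept repeatedly until the concentrations stop changing. MIXSET itself handles up to eight interacting components with reaction and kinetic data; every number below (flows, distribution coefficient, stage count) is a made-up minimal case, not Purex chemistry.

```python
def cascade_steady_state(n_stages=4, feed=1.0, aq_flow=1.0, org_flow=1.0,
                         dist_coeff=2.0, tol=1e-10, max_iter=10000):
    """Aqueous/organic solute concentrations per stage, countercurrent flow.

    Aqueous feed enters stage 0 and flows forward; fresh organic solvent
    enters the last stage and flows backward. Each stage equilibrates with
    y = dist_coeff * x; sweeps repeat until the profile is stationary.
    """
    x = [0.0] * n_stages  # aqueous concentration in each stage
    y = [0.0] * n_stages  # organic concentration in each stage
    for _ in range(max_iter):
        x_old = list(x)
        for i in range(n_stages):
            x_in = feed if i == 0 else x[i - 1]            # aqueous: 0 -> N-1
            y_in = 0.0 if i == n_stages - 1 else y[i + 1]  # organic: N-1 -> 0
            total_in = aq_flow * x_in + org_flow * y_in
            x[i] = total_in / (aq_flow + org_flow * dist_coeff)
            y[i] = dist_coeff * x[i]
        if max(abs(a - b) for a, b in zip(x, x_old)) < tol:
            break  # steady state reached
    return x, y

x, y = cascade_steady_state()
print(round(x[-1], 4))  # raffinate leaving the last stage -> 0.0323
```

    For this linear case the result matches the closed-form Kremser solution (x_out/x_feed = (E-1)/(E^(N+1)-1) with extraction factor E = 2, giving 1/31), which is a useful check on the iteration.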

  3. McBits: fast constant-time code-based cryptography

    NARCIS (Netherlands)

    Bernstein, D.J.; Chou, T.; Schwabe, P.

    2015-01-01

    This paper presents extremely fast algorithms for code-based public-key cryptography, including full protection against timing attacks. For example, at a 2^128 security level, this paper achieves a reciprocal decryption throughput of just 60493 cycles (plus cipher cost etc.) on a single Ivy Bridge core.

  4. nRC: non-coding RNA Classifier based on structural features.

    Science.gov (United States)

    Fiannaca, Antonino; La Rosa, Massimo; La Paglia, Laura; Rizzo, Riccardo; Urso, Alfonso

    2017-01-01

    Non-coding RNAs (ncRNAs) are small non-coding sequences involved in gene expression regulation in many biological processes and diseases. The recent discovery of a large set of different ncRNAs with biologically relevant roles has opened the way to developing methods able to discriminate between the different ncRNA classes. Moreover, the lack of knowledge about the complete mechanisms of regulative processes, together with the development of high-throughput technologies, has required the help of bioinformatics tools in providing biologists and clinicians with a deeper comprehension of the functional roles of ncRNAs. In this work, we introduce a new ncRNA classification tool, nRC (non-coding RNA Classifier). Our approach is based on feature extraction from the ncRNA secondary structure together with a supervised classification algorithm implementing a deep learning architecture based on convolutional neural networks. We tested our approach on the classification of 13 different ncRNA classes and obtained classification scores using the most common statistical measures. In particular, we reach accuracy and sensitivity scores of about 74%. The proposed method outperforms other similar classification methods based on secondary structure features and machine learning algorithms, including the RNAcon tool that, to date, is the reference classifier. The nRC tool is freely available as a docker image at https://hub.docker.com/r/tblab/nrc/. The source code of the nRC tool is also available at https://github.com/IcarPA-TBlab/nrc.

  5. 37 CFR 360.25 - Copies of claims.

    Science.gov (United States)

    2010-07-01

    ... Section 360.25 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS SUBMISSION OF ROYALTY CLAIMS FILING OF CLAIMS TO ROYALTY FEES COLLECTED UNDER COMPULSORY LICENSE Digital Audio Recording Devices and Media Royalty Claims § 360.25 Copies of claims. A claimant shall, for each claim...

  6. 32 CFR 842.94 - Assertable claims.

    Science.gov (United States)

    2010-07-01

    ..., against a tort-feasor when: (a) Damage results from negligence and the claim is for: (1) More than $100... ADMINISTRATIVE CLAIMS Property Damage Tort Claims in Favor of the United States (31 U.S.C. 3701, 3711-3719) § 842.... (The two claims should be consolidated and processed under subpart N). (d) The Tort-feasor or his...

  7. Researching on knowledge architecture of design by analysis based on ASME code

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2003-01-01

    The quality of a knowledge-based system's knowledge architecture is one of the decisive factors in the system's validity and rationality. For designing the ASME code knowledge-based system, this paper presents a knowledge acquisition method that extracts knowledge through document analysis, consulting domain experts' knowledge. The paper then describes the knowledge architecture of design by analysis based on the related rules in the ASME code. The knowledge of the architecture is divided into two categories: one is empirical knowledge, and the other is ASME code knowledge. As the basis of the knowledge architecture, a general procedural process of design by analysis that meets the engineering design requirements and designers' conventional mode is generalized and explained in detail in the paper. For the sake of improving inference efficiency and concurrent computation of the KBS, a kind of knowledge Petri net (KPN) model is proposed and adopted in expressing the knowledge architecture. Furthermore, for validating and verifying the empirical rules, five knowledge validation and verification theorems are given in the paper. The research results are also applicable to designing the knowledge architecture of ASME codes or other engineering standards. (author)

  8. Novel power saving architecture for FBG based OCDMA code generation

    Science.gov (United States)

    Osadola, Tolulope B.; Idris, Siti K.; Glesk, Ivan

    2013-10-01

    A novel architecture for generating incoherent, 2-dimensional wavelength-hopping time-spreading optical CDMA codes is presented. The architecture is designed to facilitate the reuse of the optical source signal that is left unused after an OCDMA code has been generated using fiber Bragg grating based encoders. Effective utilization of the available optical power is therefore achieved by cascading several OCDMA encoders, enabling 3 dB savings in optical power.

  9. Iterative channel decoding of FEC-based multiple-description codes.

    Science.gov (United States)

    Chang, Seok-Ho; Cosman, Pamela C; Milstein, Laurence B

    2012-03-01

    Multiple description coding has been receiving attention as a robust transmission framework for multimedia services. This paper studies the iterative decoding of FEC-based multiple description codes. The proposed decoding algorithms take advantage of the error detection capability of Reed-Solomon (RS) erasure codes. The information of correctly decoded RS codewords is exploited to enhance the error correction capability of the Viterbi algorithm at the next iteration of decoding. In the proposed algorithm, an intradescription interleaver is synergistically combined with the iterative decoder. The interleaver does not affect the performance of noniterative decoding but greatly enhances the performance when the system is iteratively decoded. We also address the optimal allocation of RS parity symbols for unequal error protection. For the optimal allocation in iterative decoding, we derive mathematical equations from which the probability distributions of description erasures can be generated in a simple way. The performance of the algorithm is evaluated over an orthogonal frequency-division multiplexing system. The results show that the performance of the multiple description codes is significantly enhanced.
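
    The erasure-recovery core of FEC-based multiple description coding can be shown with the simplest possible code: one XOR parity description (a Reed-Solomon code with a single parity symbol reduces to XOR, while the paper's codes use full RS erasure coding). The descriptions below are toy byte strings.

```python
from functools import reduce

def xor_bytes(a, b):
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_descriptions(blocks):
    """Equal-length data blocks -> descriptions: the blocks plus one XOR parity."""
    return list(blocks) + [reduce(xor_bytes, blocks)]

def recover(descriptions):
    """Tolerates at most one erased (None) description; returns the data blocks."""
    missing = [i for i, d in enumerate(descriptions) if d is None]
    if len(missing) == 1:
        present = [d for d in descriptions if d is not None]
        descriptions[missing[0]] = reduce(xor_bytes, present)
    return descriptions[:-1]  # drop the parity description

descs = make_descriptions([b"multi", b"media", b"codec"])
descs[1] = None                          # one description erased in transit
print(recover(descs))  # -> [b'multi', b'media', b'codec']
```

    With k parity descriptions instead of one, up to k erasures become recoverable, which is exactly the unequal-error-protection trade-off the parity-allocation analysis above optimizes.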

  10. 21 CFR 101.13 - Nutrient content claims-general principles.

    Science.gov (United States)

    2010-04-01

    ... from recognized data bases for raw and processed foods, recipes, and other means to compute nutrient... and amounts of ingredients, cooking temperatures, etc.). Firms making claims on foods based on this...

  11. Probabilistic Decision Based Block Partitioning for Future Video Coding

    KAUST Repository

    Wang, Zhao

    2017-11-29

    In the latest Joint Video Exploration Team development, the quadtree plus binary tree (QTBT) block partitioning structure has been proposed for future video coding. Compared to the traditional quadtree structure of the High Efficiency Video Coding (HEVC) standard, QTBT provides more flexible patterns for splitting the blocks, which results in dramatically increased combinations of block partitions and high computational complexity. In view of this, a confidence-interval-based early termination (CIET) scheme is proposed for QTBT to identify the unnecessary partition modes in the sense of rate-distortion (RD) optimization. In particular, an RD model is established to predict the RD cost of each partition pattern without the full encoding process. Subsequently, the mode decision problem is cast into a probabilistic framework to select the final partition based on the confidence interval decision strategy. Experimental results show that the proposed CIET algorithm can speed up the QTBT block partitioning structure, reducing encoding time by 54.7% with only a 1.12% increase in bit rate. Moreover, the proposed scheme performs consistently well for high-resolution sequences, for which video coding efficiency is crucial in real applications.
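
    The early-termination logic can be sketched as follows: fully evaluate a candidate partition only if the lower edge of its predicted-cost confidence interval still beats the best cost found so far. The cost model, interval width, and candidate modes below are hypothetical stand-ins, not the paper's RD model or any codec's code.

```python
# Illustrative confidence-interval early-termination sketch (made-up numbers).
def select_partition(modes, predict, full_rd_cost, interval_width=0.1):
    """Run the expensive full RD search only for plausibly-best modes."""
    best_mode, best_cost, evaluated = None, float("inf"), 0
    for mode in sorted(modes, key=predict):       # cheapest predicted first
        lower_bound = predict(mode) * (1 - interval_width)
        if lower_bound >= best_cost:
            continue  # confidently worse than the incumbent: skip full search
        cost = full_rd_cost(mode)                  # the expensive step
        evaluated += 1
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode, evaluated

true_cost = {"quad": 100.0, "horiz": 80.0, "vert": 150.0, "none": 120.0}
noisy_pred = {"quad": 105.0, "horiz": 78.0, "vert": 160.0, "none": 118.0}
mode, n_eval = select_partition(true_cost, noisy_pred.get, true_cost.get)
print(mode, n_eval)  # -> horiz 1: three of four full searches were skipped
```

    The width of the interval trades speed against the risk of skipping the true best mode, mirroring the time-saving versus bit-rate-loss trade-off reported above.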

  12. An in-depth assessment of a diagnosis-based risk adjustment model based on national health insurance claims: the application of the Johns Hopkins Adjusted Clinical Group case-mix system in Taiwan.

    Science.gov (United States)

    Chang, Hsien-Yen; Weiner, Jonathan P

    2010-01-18

    Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. 
Given the widespread availability of claims data and the superior explanatory
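The two performance measures used in the study above, adjusted R² at the individual level and the predictive ratio (predicted over actual expenditure) at the group level, are simple to compute. The sketch below illustrates them on invented numbers, not the study's data:

```python
# Toy illustration of two risk-adjustment validation measures:
# adjusted R^2 at the individual level and the predictive ratio
# (total predicted / total actual expenditure) at the group level.
# All expenditure values are invented for illustration.

def adjusted_r2(actual, predicted, n_predictors):
    n = len(actual)
    mean_a = sum(actual) / n
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    r2 = 1 - ss_res / ss_tot
    # Penalize for the number of predictors in the model.
    return 1 - (1 - r2) * (n - 1) / (n - n_predictors - 1)

def predictive_ratio(actual, predicted):
    """>1 means the model over-predicts the group, <1 under-predicts."""
    return sum(predicted) / sum(actual)

actual = [120.0, 80.0, 300.0, 50.0, 450.0, 90.0]
predicted = [100.0, 95.0, 260.0, 70.0, 380.0, 110.0]

print(round(adjusted_r2(actual, predicted, n_predictors=2), 3))  # 0.895
print(round(predictive_ratio(actual, predicted), 3))             # 0.931
```

A predictive ratio below 1 for the invented group mirrors the paper's finding that the highest-expenditure groups tend to be under-predicted.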

  13. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)

  14. Energy information data base: report number codes

    International Nuclear Information System (INIS)

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used

  15. A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems

    Science.gov (United States)

    Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge

    Mobile agents are software entities consisting of code, data, state and itinerary that can migrate autonomously from host to host executing their code. Despite their benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against the attacks of malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have barely been studied in the literature. This paper presents a mechanism based on code passing that avoids collusion attacks. Our proposal is based on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing each host with the information needed to extract its own variant, and for taking trusted timestamps that will be used to verify time coherence.

  16. Assigning clinical codes with data-driven concept representation on Dutch clinical free text.

    Science.gov (United States)

    Scheurwegs, Elyne; Luyckx, Kim; Luyten, Léon; Goethals, Bart; Daelemans, Walter

    2017-05-01

    Clinical codes are used for public reporting purposes, are fundamental to determining public financing for hospitals, and form the basis for reimbursement claims to insurance providers. They are assigned to a patient stay to reflect the diagnosis and performed procedures during that stay. This paper aims to enrich algorithms for automated clinical coding by taking a data-driven approach and by using unsupervised and semi-supervised techniques for the extraction of multi-word expressions that convey a generalisable medical meaning (referred to as concepts). Several methods for extracting concepts from text are compared, two of which are constructed from a large unannotated corpus of clinical free text. A distributional semantic model (i.c. the word2vec skip-gram model) is used to generalize over concepts and retrieve relations between them. These methods are validated on three sets of patient stay data, in the disease areas of urology, cardiology, and gastroenterology. The datasets are in Dutch, which introduces a limitation on available concept definitions from expert-based ontologies (e.g. UMLS). The results show that when expert-based knowledge in ontologies is unavailable, concepts derived from raw clinical texts are a reliable alternative. Both concepts derived from raw clinical texts and concepts derived from expert-created dictionaries outperform a bag-of-words approach in clinical code assignment. Adding features based on tokens that appear in a semantically similar context has a positive influence for predicting diagnostic codes. Furthermore, the experiments indicate that a distributional semantics model can find relations between semantically related concepts in texts but also introduces erroneous and redundant relations, which can undermine clinical coding performance. Copyright © 2017. Published by Elsevier Inc.
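As a rough illustration of the distributional-semantics idea, the sketch below builds co-occurrence vectors and compares them with cosine similarity. This is a simplified count-based model, not the word2vec skip-gram training used in the paper, and the tiny "clinical" corpus is invented:

```python
from collections import Counter
from math import sqrt

# Minimal distributional-semantics sketch: represent each term by the
# counts of terms co-occurring with it in a fixed window, then compare
# terms with cosine similarity. Simplified stand-in for skip-gram
# embeddings; the tiny corpus below is invented for illustration.

corpus = [
    "renal colic treated with ureteral stent placement".split(),
    "ureteral stent removed after renal colic resolved".split(),
    "cardiac catheterization performed for chest pain".split(),
]

def vectors(sentences, window=2):
    vecs = {}
    for sent in sentences:
        for i, w in enumerate(sent):
            ctx = sent[max(0, i - window):i] + sent[i + 1:i + 1 + window]
            vecs.setdefault(w, Counter()).update(ctx)
    return vecs

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

vecs = vectors(corpus)
# Terms from the same disease area end up closer together.
print(cosine(vecs["renal"], vecs["ureteral"]) >
      cosine(vecs["renal"], vecs["cardiac"]))   # True
```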

  17. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and to suppress the effect of phase induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC, based on the Jordan block matrix and constructed by simple algebraic methods. Four sets of DEU code families based on the code weight W and number of users N are constructed for the combinations (even, even), (even, odd), (odd, odd) and (odd, even). This flexibility in the choice of code weight and number of users makes DEU a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms previously reported codes. In addition, simulation results from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU performs better and can support long spans at high data rates.
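The ideal in-phase cross-correlation property that such code designs target is easy to check mechanically. The sketch below verifies it for invented weight-2 binary codewords; this is not the actual DEU construction from the paper:

```python
# Generic in-phase cross-correlation check for binary SAC-OCDMA
# codewords. Designs such as DEU aim for a fixed, ideal cross
# correlation between any two distinct codewords. The codewords
# below are invented for illustration, NOT the DEU construction.

def cross_correlation(a, b):
    """In-phase cross correlation: count of positions where both are 1."""
    return sum(x & y for x, y in zip(a, b))

# Three weight-2 codewords with pairwise in-phase CC of exactly 1.
codes = [
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 0, 1, 0],
]

for i in range(len(codes)):
    for j in range(i + 1, len(codes)):
        print(i, j, cross_correlation(codes[i], codes[j]))  # always 1
```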

  18. ITS Version 3.0: The Integrated TIGER Series of coupled electron/photon Monte Carlo transport codes

    International Nuclear Information System (INIS)

    Halbleib, J.A.; Kensek, R.P.; Valdez, G.D.; Mehlhorn, T.A.; Seltzer, S.M.; Berger, M.J.

    1993-01-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields. It combines operational simplicity and physical accuracy in order to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Flexibility of construction permits tailoring of the codes to specific applications and extension of code capabilities to more complex applications through simple update procedures

  19. ITS Version 3.0: The Integrated TIGER Series of coupled electron/photon Monte Carlo transport codes

    Energy Technology Data Exchange (ETDEWEB)

    Halbleib, J.A.; Kensek, R.P.; Valdez, G.D.; Mehlhorn, T.A. [Sandia National Labs., Albuquerque, NM (United States); Seltzer, S.M.; Berger, M.J. [National Inst. of Standards and Technology, Gaithersburg, MD (United States). Ionizing Radiation Div.

    1993-06-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields. It combines operational simplicity and physical accuracy in order to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Flexibility of construction permits tailoring of the codes to specific applications and extension of code capabilities to more complex applications through simple update procedures.

  20. Demonstration of Emulator-Based Bayesian Calibration of Safety Analysis Codes: Theory and Formulation

    Directory of Open Access Journals (Sweden)

    Joseph P. Yurko

    2015-01-01

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This work uses Gaussian Process (GP)-based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
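The workflow described above, replacing the expensive code with a cheap surrogate and running MCMC against it, can be sketched in a few lines. Here a nearest-neighbour lookup over precomputed runs stands in for the GP emulator, and a toy friction-factor pressure-drop model stands in for the safety code; all names and data are invented:

```python
import math
import random

# Hedged sketch of emulator-based Bayesian calibration with Metropolis
# sampling. A nearest-neighbour lookup over precomputed runs stands in
# for the paper's Gaussian Process emulator; a toy friction-factor
# pressure-drop model stands in for the safety-analysis code.

random.seed(0)

def expensive_code(lam):
    """Stand-in for the expensive code: dp = lam * q^2 at three flows."""
    return [lam * q * q for q in (1.0, 2.0, 3.0)]

# "Training" runs of the code on a design grid, done once up front.
grid = [0.01 * k for k in range(1, 101)]
runs = {lam: expensive_code(lam) for lam in grid}

def emulator(lam):
    """Cheap surrogate: return the stored run nearest to lam."""
    nearest = min(grid, key=lambda g: abs(g - lam))
    return runs[nearest]

# Synthetic observations generated at the "true" friction factor 0.30.
data = [y + random.gauss(0.0, 0.05) for y in expensive_code(0.30)]

def log_post(lam, sigma=0.05):
    if not 0.0 < lam <= 1.0:          # flat prior on (0, 1]
        return float("-inf")
    pred = emulator(lam)
    return -sum((d - p) ** 2 for d, p in zip(data, pred)) / (2 * sigma ** 2)

# Metropolis sampling over the friction factor, using only the emulator.
lam, chain = 0.5, []
for _ in range(5000):
    prop = lam + random.gauss(0.0, 0.05)
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(lam))):
        lam = prop
    chain.append(lam)

posterior_mean = sum(chain[1000:]) / len(chain[1000:])
print(round(posterior_mean, 2))
```

The posterior mean recovers the "true" parameter without ever re-running the expensive code inside the sampler, which is the point of the emulator-based approach.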

  1. GRADSPMHD: A parallel MHD code based on the SPH formalism

    Science.gov (United States)

    Vanaverbeke, S.; Keppens, R.; Poedts, S.

    2014-03-01

    We present GRADSPMHD, a completely Lagrangian parallel magnetohydrodynamics code based on the SPH formalism. The implementation of the equations of SPMHD in the “GRAD-h” formalism assembles known results, including the derivation of the discretized MHD equations from a variational principle, the inclusion of time-dependent artificial viscosity, resistivity and conductivity terms, as well as the inclusion of a mixed hyperbolic/parabolic correction scheme for satisfying the ∇·B=0 constraint on the magnetic field. The code uses a tree-based formalism for neighbor finding and can optionally use the tree code for computing the self-gravity of the plasma. The structure of the code closely follows the framework of our parallel GRADSPH FORTRAN 90 code which we added previously to the CPC program library. We demonstrate the capabilities of GRADSPMHD by running 1, 2, and 3 dimensional standard benchmark tests and we find good agreement with previous work done by other researchers. The code is also applied to the problem of simulating the magnetorotational instability in 2.5D shearing box tests as well as in global simulations of magnetized accretion disks. We find good agreement with available results on this subject in the literature. Finally, we discuss the performance of the code on a parallel supercomputer with distributed memory architecture. Catalogue identifier: AERP_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERP_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 620503 No. of bytes in distributed program, including test data, etc.: 19837671 Distribution format: tar.gz Programming language: FORTRAN 90/MPI. Computer: HPC cluster. Operating system: Unix. Has the code been vectorized or parallelized?: Yes, parallelized using MPI. RAM: ~30 MB for a

  2. Energy-Efficient Cluster Based Routing Protocol in Mobile Ad Hoc Networks Using Network Coding

    Directory of Open Access Journals (Sweden)

    Srinivas Kanakala

    2014-01-01

    In mobile ad hoc networks, all nodes are energy constrained. In such situations, it is important to reduce energy consumption. In this paper, we consider the issues of energy efficient communication in MANETs using network coding. Network coding is an effective method to improve the performance of wireless networks. The COPE protocol implements the network coding concept to reduce the number of transmissions by mixing packets at intermediate nodes. We incorporate COPE into a cluster based routing protocol to further reduce energy consumption. The proposed energy-efficient coding-aware cluster based routing protocol (ECCRP) scheme applies network coding at cluster heads to reduce the number of transmissions. We also modify the queue management procedure of the COPE protocol to further improve coding opportunities, and use an energy efficient scheme when selecting the cluster head, which helps to increase the lifetime of the network. We evaluate the performance of the proposed energy efficient cluster based protocol using simulation. Simulation results show that the proposed ECCRP algorithm reduces energy consumption and increases the lifetime of the network.
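The core network-coding trick that COPE exploits, XORing packets at an intermediate node so that one broadcast serves several receivers, fits in a few lines. Packet contents below are invented:

```python
# Minimal illustration of the COPE-style coding idea: a relay XORs two
# packets so a single broadcast serves two receivers, each of which
# already holds one of the originals. Packet contents are invented.

def xor_packets(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"hello"        # packet from node A, destined for node B
p2 = b"world"        # packet from node B, destined for node A

coded = xor_packets(p1, p2)      # relay broadcasts ONE coded packet

# Node B holds p2, so it recovers p1; node A holds p1 and recovers p2.
print(xor_packets(coded, p2))    # b'hello'
print(xor_packets(coded, p1))    # b'world'
```

Two unicast transmissions are replaced by one broadcast, which is where the energy saving in coding-aware routing comes from.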

  3. Guarani Morphology in Paraguayan Spanish: Insights from Code-Mixing Typology

    Science.gov (United States)

    Estigarribia, Bruno

    2017-01-01

    In this paper we examine the use of Guarani affixes and clitics in colloquial Paraguayan Spanish. We depart from the traditional view of these as "borrowings," and instead explore the idea that these phenomena can be integrated within Muysken's (2000, 2013, 2014) typology of code-mixing. We claim that most of these uses may stem from a…

  4. Supporting Situated Learning Based on QR Codes with Etiquetar App: A Pilot Study

    Science.gov (United States)

    Camacho, Miguel Olmedo; Pérez-Sanagustín, Mar; Alario-Hoyos, Carlos; Soldani, Xavier; Kloos, Carlos Delgado; Sayago, Sergio

    2014-01-01

    EtiquetAR is an authoring tool for supporting the design and enactment of situated learning experiences based on QR tags. Practitioners use etiquetAR for creating, managing and personalizing collections of QR codes with special properties: (1) codes can have more than one link pointing at different multimedia resources, (2) codes can be updated…

  5. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    Science.gov (United States)

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relative static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are firstly classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with a slightly additional encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
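The two prediction modes described above can be illustrated on toy one-dimensional "frames": a running-average background model supplies the background reference, and foreground is coded relative to it in the difference domain. All pixel values are invented:

```python
# Toy sketch of the two prediction modes described above, on 1-D
# "frames": a running-average background model (background reference
# prediction) and a residual in the background difference domain
# (background difference prediction). Pixel values are invented.

frames = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 90, 10, 10],   # a "moving object" appears at position 1
]

# Background model: average of the earlier input frames.
background = [sum(f[i] for f in frames[:2]) / 2 for i in range(4)]

current = frames[2]

# Background difference: residual after subtracting the modeled
# background; only the foreground position carries energy to encode.
residual = [c - b for c, b in zip(current, background)]
print(background)   # [10.0, 10.0, 10.0, 10.0]
print(residual)     # [0.0, 80.0, 0.0, 0.0]
```

A pure background block predicts perfectly from the model (zero residual), while the foreground block leaves only its difference from the background to be coded.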

  6. QR code based noise-free optical encryption and decryption of a gray scale image

    Science.gov (United States)

    Jiao, Shuming; Zou, Wenbin; Li, Xia

    2017-03-01

    In optical encryption systems, speckle noise is one major challenge in obtaining high quality decrypted images. This problem can be addressed by employing a QR code based noise-free scheme. Previous works have been conducted for optically encrypting a few characters or a short expression employing QR codes. This paper proposes a practical scheme for optically encrypting and decrypting a gray-scale image based on QR codes for the first time. The proposed scheme is compatible with common QR code generators and readers. Numerical simulation results reveal the proposed method can encrypt and decrypt an input image correctly.

  7. An Examination of the Performance Based Building Code on the Design of a Commercial Building

    Directory of Open Access Journals (Sweden)

    John Greenwood

    2012-11-01

    The Building Code of Australia (BCA) is the principal code under which building approvals in Australia are assessed. The BCA adopted performance-based solutions for building approvals in 1996. Performance-based codes are based upon a set of explicit objectives, stated in terms of a hierarchy of requirements beginning with key general objectives. With this in mind, the research presented in this paper aims to analyse the impact of the introduction of the performance-based code within Western Australia to gauge the effect and usefulness of alternative design solutions in commercial construction using a case study project. The research revealed that there are several advantages to the use of alternative designs and that all parties, in general, are in favour of the performance-based building code of Australia. It is suggested that a change in the assessment process to streamline the alternative design path is needed for greater use of the performance-based alternative. With appropriate quality control measures, minor variations to the deemed-to-satisfy provisions could easily be managed by the current and future building surveying profession.

  8. External validation of a multivariable claims-based rule for predicting in-hospital mortality and 30-day post-pulmonary embolism complications

    Directory of Open Access Journals (Sweden)

    Craig I. Coleman

    2016-10-01

    Background Low-risk pulmonary embolism (PE) patients may be candidates for outpatient treatment or abbreviated hospital stay. There is a need for a claims-based prediction rule that payers/hospitals can use to risk stratify PE patients. We sought to validate the In-hospital Mortality for PulmonAry embolism using Claims daTa (IMPACT) prediction rule for in-hospital and 30-day outcomes. Methods We used the Optum Research Database from 1/2008-3/2015 and included adults hospitalized for PE (415.1x in the primary position or secondary position when accompanied by a primary code for a PE complication) and having continuous medical and prescription coverage for ≥6-months prior and 3-months post-inclusion or until death. In-hospital and 30-day mortality and 30-day complications (recurrent venous thromboembolism, rehospitalization or death) were assessed and prognostic accuracies of IMPACT with 95 % confidence intervals (CIs) were calculated. Results In total, 47,531 PE patients were included. In-hospital and 30-day mortality occurred in 7.9 and 9.4 % of patients and 20.8 % experienced any complication within 30-days. Of the 19.5 % of patients classified as low-risk by IMPACT, 2.0 % died in-hospital, resulting in a sensitivity and specificity of 95.2 % (95 % CI, 94.4–95.8) and 20.7 % (95 % CI, 20.4–21.1). Only 1 additional low-risk patient died within 30-days of admission and 12.2 % experienced a complication, translating into a sensitivity and specificity of 95.9 % (95 % CI, 95.3–96.5) and 21.1 % (95 % CI, 20.7–21.5) for mortality and 88.5 % (95 % CI, 87.9–89.2) and 21.6 % (95 % CI, 21.2–22.0) for any complication. Conclusion IMPACT had acceptable sensitivity for predicting in-hospital and 30-day mortality or complications and may be valuable for retrospective risk stratification of PE patients.

  9. External validation of a multivariable claims-based rule for predicting in-hospital mortality and 30-day post-pulmonary embolism complications.

    Science.gov (United States)

    Coleman, Craig I; Peacock, W Frank; Fermann, Gregory J; Crivera, Concetta; Weeda, Erin R; Hull, Michael; DuCharme, Mary; Becker, Laura; Schein, Jeff R

    2016-10-22

    Low-risk pulmonary embolism (PE) patients may be candidates for outpatient treatment or abbreviated hospital stay. There is a need for a claims-based prediction rule that payers/hospitals can use to risk stratify PE patients. We sought to validate the In-hospital Mortality for PulmonAry embolism using Claims daTa (IMPACT) prediction rule for in-hospital and 30-day outcomes. We used the Optum Research Database from 1/2008-3/2015 and included adults hospitalized for PE (415.1x in the primary position or secondary position when accompanied by a primary code for a PE complication) and having continuous medical and prescription coverage for ≥6-months prior and 3-months post-inclusion or until death. In-hospital and 30-day mortality and 30-day complications (recurrent venous thromboembolism, rehospitalization or death) were assessed and prognostic accuracies of IMPACT with 95 % confidence intervals (CIs) were calculated. In total, 47,531 PE patients were included. In-hospital and 30-day mortality occurred in 7.9 and 9.4 % of patients and 20.8 % experienced any complication within 30-days. Of the 19.5 % of patients classified as low-risk by IMPACT, 2.0 % died in-hospital, resulting in a sensitivity and specificity of 95.2 % (95 % CI, 94.4-95.8) and 20.7 % (95 % CI, 20.4-21.1). Only 1 additional low-risk patient died within 30-days of admission and 12.2 % experienced a complication, translating into a sensitivity and specificity of 95.9 % (95 % CI, 95.3-96.5) and 21.1 % (95 % CI, 20.7-21.5) for mortality and 88.5 % (95 % CI, 87.9-89.2) and 21.6 % (95 % CI, 21.2-22.0) for any complication. IMPACT had acceptable sensitivity for predicting in-hospital and 30-day mortality or complications and may be valuable for retrospective risk stratification of PE patients.
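The sensitivity/specificity figures and confidence intervals reported above come from a standard 2×2 calculation. The sketch below shows the arithmetic with invented counts chosen only to land near the reported point estimates; these are not the study's actual data:

```python
from math import sqrt

# Sensitivity, specificity and Wilson 95% confidence intervals from a
# 2x2 confusion matrix, the quantities reported in the validation
# above. The counts below are invented, not the study's data.

def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return ((centre - half) / denom, (centre + half) / denom)

tp, fn, tn, fp = 3580, 180, 9050, 34700   # invented counts

sens = tp / (tp + fn)   # died AND flagged non-low-risk
spec = tn / (tn + fp)   # survived AND flagged low-risk
print(round(sens, 3), [round(x, 3) for x in wilson_ci(tp, tp + fn)])
print(round(spec, 3), [round(x, 3) for x in wilson_ci(tn, tn + fp)])
```

For a rule meant to rule patients in for early discharge, high sensitivity (few deaths among those called low-risk) matters more than the modest specificity seen here.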

  10. On Rational Interpolation-Based List-Decoding and List-Decoding Binary Goppa Codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Høholdt, Tom; Nielsen, Johan Sebastian Rosenkilde

    2013-01-01

    We derive the Wu list-decoding algorithm for generalized Reed–Solomon (GRS) codes by using Gröbner bases over modules and the Euclidean algorithm as the initial algorithm instead of the Berlekamp–Massey algorithm. We present a novel method for constructing the interpolation polynomial fast. We gi...... and a duality in the choice of parameters needed for decoding, both in the case of GRS codes and in the case of Goppa codes....

  11. 32 CFR 537.15 - Statutory authority for maritime claims and claims involving civil works of a maritime nature.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Statutory authority for maritime claims and claims involving civil works of a maritime nature. 537.15 Section 537.15 National Defense Department of....15 Statutory authority for maritime claims and claims involving civil works of a maritime nature. (a...

  12. IDENTITY CLAIMS, TEXTS, ROME AND GALATIANS

    African Journals Online (AJOL)

    inform the identity claimed and negotiated by people and groups. When ..... 24 To some extent, going against the grain of Bourdieu's notion that “what exist in the social world are .... based on words and information that create reality” (Lampe 1995:940, emphasis ..... Jesus, the Early Church and the Roman superpower.

  13. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post-test calculations of FARO Test L-11, the so-called Base Case Test. The FARO Facility, located at JRC Ispra, is used to simulate the consequences of Severe Accidents in Nuclear Power Plants under a variety of conditions. The COMETA Code has a six-equation two-phase flow field and a three-phase corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the fuel-coolant interaction pre-mixing phase.

  14. State waste discharge permit application: Hydrotest, maintenance and construction discharges. Revision 0

    International Nuclear Information System (INIS)

    1995-11-01

    On December 23, 1991, the US DOE, Richland Operations Office (RL) and the Washington State Department of Ecology (Ecology) agreed to adhere to the provisions of the Department of Ecology Consent Order No. DE91NM-177 (216 Consent Order) (Ecology and US DOE 1991). The 216 Consent Order lists regulatory milestones for liquid effluent streams at the Hanford Site and requires compliance with the permitting requirements of the Washington Administrative Code. Hanford Site liquid effluent streams discharging to the soil column have been categorized in the 216 Consent Order as follows: Phase I Streams; Phase II Streams; Miscellaneous Streams. Phase I and Phase II Streams were initially addressed in two reports. Miscellaneous Streams are subject to the requirements of several milestones identified in the 216 Consent Order. This document constitutes the Categorical State Waste Discharge Permit application for hydrotest, maintenance and construction discharges throughout the Hanford Site. This categorical permit application form was prepared and approved by Ecology.

  15. State waste discharge permit application: Hydrotest, maintenance and construction discharges. Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    On December 23, 1991, the US DOE, Richland Operations Office (RL) and the Washington State Department of Ecology (Ecology) agreed to adhere to the provisions of the Department of Ecology Consent Order No. DE91NM-177 (216 Consent Order) (Ecology and US DOE 1991). The 216 Consent Order lists regulatory milestones for liquid effluent streams at the Hanford Site and requires compliance with the permitting requirements of the Washington Administrative Code. Hanford Site liquid effluent streams discharging to the soil column have been categorized in the 216 Consent Order as follows: Phase I Streams; Phase II Streams; Miscellaneous Streams. Phase I and Phase II Streams were initially addressed in two reports. Miscellaneous Streams are subject to the requirements of several milestones identified in the 216 Consent Order. This document constitutes the Categorical State Waste Discharge Permit application for hydrotest, maintenance and construction discharges throughout the Hanford Site. This categorical permit application form was prepared and approved by Ecology.

  16. ALFITeX. A new code for the deconvolution of complex alpha-particle spectra

    International Nuclear Information System (INIS)

    Caro Marroyo, B.; Martin Sanchez, A.; Jurado Vargas, M.

    2013-01-01

    A new code for the deconvolution of complex alpha-particle spectra has been developed. The ALFITeX code is written in Visual Basic for Microsoft Office Excel 2010 spreadsheets, incorporating several features aimed at making it a fast, robust and useful tool with a user-friendly interface. The deconvolution procedure is based on the Levenberg-Marquardt algorithm, with the curve fitting the experimental data being the mathematical function formed by the convolution of a Gaussian with two left-handed exponentials in the low-energy-tail region. The code also includes the capability of fitting a possible constant background contribution. The application of the singular value decomposition method for matrix inversion permits the fit of any kind of alpha-particle spectra, even those presenting singularities or an ill-conditioned curvature matrix. ALFITeX has been checked with its application to the deconvolution and the calculation of the alpha-particle emission probabilities of 239Pu, 241Am and 235U. (author)
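The line shape described above, a Gaussian convolved with left-handed exponentials, has a closed form in terms of the complementary error function. The sketch below implements the single-exponential version; the full model used by codes such as ALFITeX adds a second exponential and a mixing weight, and the parameter values here are invented:

```python
from math import exp, erfc, sqrt

# Single-tail version of the alpha-peak line shape: a unit-area
# Gaussian (mu, sigma) convolved with a left-handed exponential
# exp(u/tau) for u <= 0. The full ALFITeX model adds a second
# exponential and a weight; parameter values below are invented.

def alpha_peak(e, mu, sigma, tau):
    arg = ((e - mu) / sigma + sigma / tau) / sqrt(2.0)
    return (0.5 / tau) * exp((e - mu) / tau + sigma ** 2 / (2 * tau ** 2)) * erfc(arg)

mu, sigma, tau = 5156.0, 8.0, 15.0   # keV, invented

# The exponential tail skews the peak toward low energies: equal
# distances below and above the peak give unequal intensities.
below = alpha_peak(mu - 20.0, mu, sigma, tau)
above = alpha_peak(mu + 20.0, mu, sigma, tau)
print(below > above)   # True: left-handed tail
```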

  17. Application programming interface document for the modernized Transient Reactor Analysis Code (TRAC-M)

    International Nuclear Information System (INIS)

    Mahaffy, J.; Boyack, B.E.; Steinke, R.G.

    1998-05-01

    The objective of this document is to ease the task of adding new system components to the Transient Reactor Analysis Code (TRAC) or altering old ones. Sufficient information is provided to permit replacement or modification of physical models and correlations. Within TRAC, information is passed at two levels. At the upper level, information is passed by system-wide and component-specific data modules at and above the level of component subroutines. At the lower level, information is passed through a combination of module-based data structures and argument lists. This document describes the basic mechanics involved in the flow of information within the code. The discussion of interfaces in the body of this document has been kept to a general level to highlight key considerations. The appendices cover instructions for obtaining a detailed list of variables used to communicate in each subprogram, definitions and locations of key variables, and proposed improvements to intercomponent interfaces that are not available in the first level of code modernization

  18. State waste discharge permit application, 200-E chemical drain field

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    As part of the Hanford Federal Facility Agreement and Consent Order negotiations (Ecology et al. 1994), the US Department of Energy, Richland Operations Office, the US Environmental Protection Agency, and the Washington State Department of Ecology agreed that liquid effluent discharges to the ground on the Hanford Site which affect groundwater or have the potential to affect groundwater would be subject to permitting under the structure of Chapter 173-216 (or 173-218 where applicable) of the Washington Administrative Code, the State Waste Discharge Permit Program. As a result of this decision, the Washington State Department of Ecology and the US Department of Energy, Richland Operations Office entered into Consent Order No. DE 91NM-177, (Ecology and DOE-RL 1991). The Consent Order No. DE 91NM-177 requires a series of permitting activities for liquid effluent discharges. This document presents the State Waste Discharge Permit (SWDP) application for the 200-E Chemical Drain Field. Waste water from the 272-E Building enters the process sewer line directly through a floor drain, while waste water from the 2703-E Building is collected in two floor drains, (north and south) that act as sumps and are discharged periodically. The 272-E and 2703-E Buildings constitute the only discharges to the process sewer line and the 200-E Chemical Drain Field.

  19. State waste discharge permit application, 200-E chemical drain field

    International Nuclear Information System (INIS)

    1994-06-01

    As part of the Hanford Federal Facility Agreement and Consent Order negotiations (Ecology et al. 1994), the US Department of Energy, Richland Operations Office, the US Environmental Protection Agency, and the Washington State Department of Ecology agreed that liquid effluent discharges to the ground on the Hanford Site which affect groundwater or have the potential to affect groundwater would be subject to permitting under the structure of Chapter 173-216 (or 173-218 where applicable) of the Washington Administrative Code, the State Waste Discharge Permit Program. As a result of this decision, the Washington State Department of Ecology and the US Department of Energy, Richland Operations Office entered into Consent Order No. DE 91NM-177, (Ecology and DOE-RL 1991). The Consent Order No. DE 91NM-177 requires a series of permitting activities for liquid effluent discharges. This document presents the State Waste Discharge Permit (SWDP) application for the 200-E Chemical Drain Field. Waste water from the 272-E Building enters the process sewer line directly through a floor drain, while waste water from the 2703-E Building is collected in two floor drains, (north and south) that act as sumps and are discharged periodically. The 272-E and 2703-E Buildings constitute the only discharges to the process sewer line and the 200-E Chemical Drain Field.

  20. 32 CFR 536.19 - Disaster claims planning.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Disaster claims planning. 536.19 Section 536.19... AGAINST THE UNITED STATES The Army Claims System § 536.19 Disaster claims planning. All ACOs will prepare... requirements related to disaster claims planning. ...

  1. The drift flux model in the ASSERT subchannel code

    International Nuclear Information System (INIS)

    Carver, M.B.; Judd, R.A.; Kiteley, J.C.; Tahir, A.

    1987-01-01

    The ASSERT subchannel code has been developed specifically to model flow and phase distributions within CANDU fuel bundles. ASSERT uses a drift-flux model that permits the phases to have unequal velocities, and can thus model phase separation tendencies that may occur in horizontal flow. The basic principles of ASSERT are outlined, and computed results are compared against data from various experiments for validation purposes. The paper concludes with an example of the use of the code to predict critical heat flux in CANDU geometries

  2. Workplace health and safety regulations: Impact of enforcement and consultation on workers' compensation claims rates in Washington State.

    Science.gov (United States)

    Baggs, James; Silverstein, Barbara; Foley, Michael

    2003-05-01

    There has been considerable debate in the public policy arena about the appropriate mix of regulatory enforcement and consultation in achieving desired health and safety behavior across industries. Recently there has been a shift in federal policy toward voluntary approaches and constraining the scope of enforcement programs, although there is little evidence that this might improve health and safety outcomes. To address this, we examined changes in lost-time workers' compensation claims rates for Washington State employers who had (1) no OSHA State Plan (WISHA) activity, (2) enforcement, (3) consultation, and (4) both types of visits. Compensable claims rates, hours, and WISHA activity were determined for each employer account with a single business location that had payroll hours reported for every quarter from 1997-2000 and more than 10 employees. We used a generalized estimating equations (GEE) approach to Poisson regression to model the association between WISHA activity and claims rate, controlling for other external factors. Controlling for previous claims rate and average size, claims rates for employers with WISHA enforcement activity declined 22.5% in fixed-site industry SIC codes compared to 7% among employers with no WISHA activity (P 0.10). WISHA consultation activity was not associated with a greater decline in compensable claims rates (-2.3% for fixed sites and +3.5% for non-fixed sites). WISHA activity did not adversely affect worksite survivability through the study period. Enforcement inspections are significantly associated with decreasing compensable workers' compensation claims rates, especially for fixed-site employers. We were unable to identify an association between consultation activities and decreasing claims rates. Copyright 2003 Wiley-Liss, Inc.

  3. 78 FR 54875 - Privacy Act of 1974; Computer Matching Program Between the Department of Education (ED) and the...

    Science.gov (United States)

    2013-09-06

    ....C. 1091(g) and (p). SSA will verify the issuance of an SSN to, and will confirm the citizenship... verifying the accuracy of each individual's SSN and claim to a citizenship status that permits that... the Code of Federal Regulations is available via the Federal Digital System at: www.gpo.gov/fsys . At...

  4. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    Science.gov (United States)

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; but the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements, placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; 2) they remain a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered multiple descriptions of the original image, so the proposed scheme also has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive with existing methods, with a unique strength in recovering fine details and sharp edges at low bit-rates.
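    To make the encoder idea concrete, here is a minimal NumPy sketch of local random measurements: a random binary convolution followed by polyphase down-sampling. The kernel size, seed, and normalization are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_random_measurements(img, k=3, factor=2):
    """Pre-filter with a random binary convolution kernel, then
    polyphase down-sample by keeping every `factor`-th pixel."""
    kernel = rng.integers(0, 2, size=(k, k)).astype(float)
    if kernel.sum() == 0:
        kernel[k // 2, k // 2] = 1.0       # avoid an all-zero kernel
    kernel /= kernel.sum()
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    for i in range(h):                      # direct 2-D convolution
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + k, j:j + k] * kernel)
    return out[::factor, ::factor]          # still an ordinary image

img = rng.random((8, 8))
meas = local_random_measurements(img)
print(meas.shape)  # (4, 4)
```

    Because the measurements stay in the ordinary pixel domain, they could be handed to any standard codec; a sparsity-based solver would then recover the full image at the decoder.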

  5. Operating room fires: a closed claims analysis.

    Science.gov (United States)

    Mehta, Sonya P; Bhananker, Sanjay M; Posner, Karen L; Domino, Karen B

    2013-05-01

    To assess patterns of injury and liability associated with operating room (OR) fires, closed malpractice claims in the American Society of Anesthesiologists Closed Claims Database since 1985 were reviewed. All claims related to fires in the OR were compared with nonfire-related surgical anesthesia claims. An analysis of fire-related claims was performed to identify causative factors. There were 103 OR fire claims (1.9% of 5,297 surgical claims). Electrocautery was the ignition source in 90% of fire claims. OR fire claims more frequently involved older outpatients compared with other surgical anesthesia claims. Electrocautery-induced fires (n = 93) increased over time, and most occurred during head, neck, or upper chest procedures (high-fire-risk procedures). Oxygen served as the oxidizer in 95% of electrocautery-induced OR fires (84% with open delivery system). Most electrocautery-induced fires (n = 75, 81%) occurred during monitored anesthesia care. Oxygen was administered via an open delivery system in all high-risk procedures during monitored anesthesia care. In contrast, alcohol-containing prep solutions and volatile compounds were present in only 15% of OR fires during monitored anesthesia care. Electrocautery-induced fires during monitored anesthesia care were the most common cause of OR fire claims. Recognition of the fire triad (oxidizer, fuel, and ignition source), particularly the critical role of supplemental oxygen by an open delivery system during use of the electrocautery, is crucial to prevent OR fires. Continuing education and communication among OR personnel along with fire prevention protocols in high-fire-risk procedures may reduce the occurrence of OR fires.

  6. Quality assurance aspects of the environmental code NECTAR

    International Nuclear Information System (INIS)

    Macdonald, H.F.; Nair, S.; Mascall, R.A.

    1986-02-01

    This report describes the quality assurance (QA) procedures which have been adopted in respect of the Environment code NECTAR (Nuclear Environmental Consequences, Transport of Activity and Risks). These procedures involve the verification, validation and evaluation of the individual NECTAR modules, namely RICE, SIRKIT, ATMOS, POPDOS and FOODWEB. The verification and validation of each module are considered in turn, while the final part of the report provides an overall evaluation of the code. The QA procedures are designed to provide reassurance that the NECTAR code is free from systematic errors and will perform calculations within the range of uncertainty and limitations claimed in its documentation. Following consideration of a draft version of this report by the Off-site Dose Methodology Working Group, the ATMOS, POPDOS and FOODWEB modules of NECTAR have been endorsed for use by the Board in reactor design and safety studies. (author)

  7. Protograph based LDPC codes with minimum distance linearly growing with block size

    Science.gov (United States)

    Divsalar, Dariush; Jones, Christopher; Dolinar, Sam; Thorpe, Jeremy

    2005-01-01

    We propose several LDPC code constructions that simultaneously achieve good threshold and error floor performance. Minimum distance is shown to grow linearly with block size (similar to regular codes of variable degree at least 3) by considering ensemble average weight enumerators. Our constructions are based on projected graph, or protograph, structures that support high-speed decoder implementations. As with irregular ensembles, our constructions are sensitive to the proportion of degree-2 variable nodes. A code with too few such nodes tends to have an iterative decoding threshold that is far from the capacity threshold. A code with too many such nodes tends to not exhibit a minimum distance that grows linearly in block length. In this paper we also show that precoding can be used to lower the threshold of regular LDPC codes. The decoding thresholds of the proposed codes, which have linearly increasing minimum distance in block size, outperform that of regular LDPC codes. Furthermore, a family of low to high rate codes, with thresholds that adhere closely to their respective channel capacity thresholds, is presented. Simulation results for a few example codes show that the proposed codes have low error floors as well as good threshold SNR performance.
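    The protograph construction can be illustrated by "lifting" a small base matrix: each edge of the protograph is replaced by a circulant permutation block. The toy base matrix and random circulant shifts below are my own choices, not one of the paper's ensembles.

```python
import numpy as np

rng = np.random.default_rng(1)

def lift_protograph(base, N):
    """Replace each nonzero protograph edge with a random NxN
    circulant permutation (identity rolled by a random shift);
    zeros become NxN all-zero blocks."""
    I = np.eye(N, dtype=int)
    block_rows = []
    for row in base:
        blocks = []
        for e in row:
            if e:
                blocks.append(np.roll(I, int(rng.integers(N)), axis=1))
            else:
                blocks.append(np.zeros((N, N), dtype=int))
        block_rows.append(np.hstack(blocks))
    return np.vstack(block_rows)

base = np.array([[1, 1, 1, 0],
                 [1, 1, 0, 1]])   # toy protograph base matrix
H = lift_protograph(base, 4)      # lifted parity-check matrix
print(H.shape)  # (8, 16)
```

    The lifted matrix inherits the protograph's degree profile: every row of H has the weight of its base row, which is what makes protograph ensembles easy to analyze and to decode with parallel hardware.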

  8. Health Claims Data Warehouse (HCDW)

    Data.gov (United States)

    Office of Personnel Management — The Health Claims Data Warehouse (HCDW) will receive and analyze health claims data to support management and administrative purposes. The Federal Employee Health...

  9. High-Capacity Quantum Secure Direct Communication Based on Quantum Hyperdense Coding with Hyperentanglement

    International Nuclear Information System (INIS)

    Wang Tie-Jun; Li Tao; Du Fang-Fang; Deng Fu-Guo

    2011-01-01

    We first present a quantum hyperdense coding protocol with hyperentanglement in the polarization and spatial-mode degrees of freedom of photons, and then give the details of a quantum secure direct communication (QSDC) protocol based on this hyperdense coding protocol. This QSDC protocol has the advantage of a higher capacity than quantum communication protocols based on a qubit system. Compared with the QSDC protocol based on superdense coding with d-dimensional systems, this QSDC protocol is more feasible, as the preparation of a high-dimensional quantum system is more difficult than that of a two-level quantum system at present. (general)
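    The qubit-pair mechanism underlying this idea can be checked with plain linear algebra. The sketch below is the standard superdense-coding scheme (two classical bits per transmitted qubit), not the hyperentangled protocol of the paper: Alice applies one of four Pauli operations to her half of a Bell pair, and the four resulting states are mutually orthogonal, so Bob can distinguish them.

```python
import numpy as np

# Pauli operators and the |Phi+> Bell state (basis |00>,|01>,|10>,|11>)
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Alice encodes two classical bits by acting on her qubit only
encode = {(0, 0): I2, (0, 1): X, (1, 0): Z, (1, 1): X @ Z}

def dense_code(bits):
    """Two-qubit state Bob holds after Alice encodes a 2-bit message."""
    return np.kron(encode[bits], I2) @ phi_plus

# The four resulting states are orthogonal Bell states, so one Bell
# measurement recovers both bits: the Gram matrix is the identity.
states = [dense_code(b) for b in encode]
g = np.array([[abs(np.vdot(a, b)) for b in states] for a in states])
print(np.allclose(g, np.eye(4)))  # True
```

    Hyperdense coding extends this by encoding independently in two degrees of freedom of the same photon pair, which is where the extra capacity comes from.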

  10. 37 CFR 360.5 - Copies of claims.

    Science.gov (United States)

    2010-07-01

    ... Section 360.5 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS SUBMISSION OF ROYALTY CLAIMS FILING OF CLAIMS TO ROYALTY FEES COLLECTED UNDER COMPULSORY LICENSE Cable Claims... hand delivery or by mail, file an original and one copy of the claim to cable royalty fees. ...

  11. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
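    A systematic LDGM encoder is simple to sketch: the codeword is the message followed by parities formed from a sparse generator part. The toy dimensions and column weight below are my own illustrative choices; the paper's concatenated construction is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)

def ldgm_encode(u, P):
    """Systematic LDGM encoding: parity = u @ P (mod 2),
    codeword = [u | parity]."""
    parity = (u @ P) % 2
    return np.concatenate([u, parity])

k, m, col_weight = 8, 4, 2
# Sparse (low-density) generator part: each parity column has
# `col_weight` ones, i.e. each parity checks only a few info bits.
P = np.zeros((k, m), dtype=int)
for j in range(m):
    P[rng.choice(k, size=col_weight, replace=False), j] = 1

u = rng.integers(0, 2, size=k)
c = ldgm_encode(u, P)
print(len(c))  # 12
```

    The low-density generator keeps encoding cost linear in block length; the error floor that plain LDGM codes exhibit is what the concatenated scheme in the paper is designed to reduce.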

  12. An in-depth assessment of a diagnosis-based risk adjustment model based on national health insurance claims: the application of the Johns Hopkins Adjusted Clinical Group case-mix system in Taiwan

    Directory of Open Access Journals (Sweden)

    Weiner Jonathan P

    2010-01-01

    Full Text Available Abstract Background Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. Methods A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those enrolled in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at the individual level, and the predictive ratio of total expenditure at the group level. Results The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Groups) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Clusters). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. Conclusions Given the
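    The group-level predictive ratio used in such evaluations (predicted over actual expenditure within each group, with 1.0 meaning perfect aggregate prediction) can be sketched as follows. The synthetic spending data and the deliberately shrunk predictor are purely illustrative, not the study's model.

```python
import numpy as np

def predictive_ratio(pred, actual, groups):
    """Predicted / actual expenditure summed within each group."""
    return {int(g): pred[groups == g].sum() / actual[groups == g].sum()
            for g in np.unique(groups)}

rng = np.random.default_rng(5)
actual = rng.gamma(2.0, 500.0, size=1000)        # skewed spending
pred = 0.8 * actual + 0.2 * actual.mean()        # shrunk-to-mean model
# Expenditure quintiles (group 0 = lowest spenders, 4 = highest)
quintile = np.digitize(actual, np.quantile(actual, [0.2, 0.4, 0.6, 0.8]))
pr = predictive_ratio(pred, actual, quintile)
print(pr[0] > 1 and pr[4] < 1)  # True
```

    A shrunk predictor over-predicts the lowest-spending quintile and under-predicts the highest one, which is exactly the pattern the abstract reports for all of its models on the top expenditure group.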

  13. Working while incapable to work? Changing concepts of permitted work in the UK disability benefit system

    Directory of Open Access Journals (Sweden)

    Jackie Gulland

    2017-11-01

    Full Text Available This article focusses on the borderland between "work" and "not work" in UK disability benefit systems. People who claim disability benefits often have to prove that they are "incapable of work" in order to qualify. The idea of incapacity for work requires an understanding of the meaning of the term "work," a concept which has a common sense simplicity but which is much more difficult to define in practice. UK disability benefit systems have developed the notion of "permitted work" to allow people to do small amounts of paid work while retaining entitlement to benefit. This concept of "permitted work" has its roots in the early twentieth century when claimants were sometimes entitled to disability benefits if any work that they did was considered to be sufficiently trivial to not count as "work." Policy on this changed over time, with particular developments after the Second World War, as rehabilitation and therapy became the key focus of permitted work rules. Current developments in UK social security policy treat almost everyone as a potential worker, changing the way in which permitted work operates. This article uses archive material on appeals against refusals of benefit, policy documents and case law to consider the social meanings of these moving boundaries of permitted work. Disability benefits are not value neutral: they are measures of social control which divide benefit claimants into those who are required to participate in the labour market and those who are exempted from this requirement.

  14. Development of a potential based code for MHD analysis of LLCB TBM

    International Nuclear Information System (INIS)

    Bhuyan, P.J.; Goswami, K.S.

    2010-01-01

    A two-dimensional solver is developed for MHD flows with low magnetic Reynolds number, based on the electrostatic potential formulation for the Lorentz forces and current densities, which will be used to calculate the MHD pressure drop inside the channels of liquid breeder based Test Blanket Modules (TBMs). The computational domain is the rectangular duct cross-section perpendicular to the flow direction; variations of the flow and electrostatic potential along the flow direction are neglected. A structured, non-uniform, collocated grid is used in the calculation, where the velocity (u), pressure (p) and electrostatic potential (φ) are calculated at the cell centers, whereas the current densities are calculated at the cell faces. Special relaxation techniques are employed in calculating the electrostatic potential to ensure the divergence-free condition for the current density. The code is benchmarked on a square channel for various Hartmann numbers up to 25,000, with and without insulation coatings, by (i) comparing the pressure drop with the approximate analytical results found in the literature and (ii) comparing the pressure drop with the ones obtained in our previous calculations based on the induction formulation for the electromagnetic part. Finally, the code is used to determine the MHD pressure drop in the case of the LLCB TBM. The code gives results similar to those obtained in our previous calculations based on the induction formulation; however, convergence is much faster with the potential based code.

  15. Emulating Wired Backhaul with Wireless Network Coding

    DEFF Research Database (Denmark)

    Thomsen, Henning; De Carvalho, Elisabeth; Popovski, Petar

    2014-01-01

    In this paper we address the need for wireless network densification. We propose a solution wherein the wired backhaul employed in heterogeneous cellular networks is replaced with wireless links, while maintaining the rate requirements of the uplink and downlink traffic of each user. The first...... of the two-way protocol. The transmit power is set high enough to enable successive decoding at the small cell base station where the downlink data to the user is first decoded and its contribution removed from the received signal followed by the uplink data from the user. The decoding of the second layer......, the uplink traffic to the user, remains identical to the one performed in a wired system. In the broadcast phase, the decoding of the downlink traffic can also be guaranteed to remain identical. Hence, our solution claims an emulation of a wired backhaul with wireless network coding with same performance. We...

  16. Fractal Image Coding Based on a Fitting Surface

    Directory of Open Access Journals (Sweden)

    Sheng Bi

    2014-01-01

    Full Text Available A no-search fractal image coding method based on a fitting surface is proposed. In our research, an improved gray-level transform with a fitting surface is introduced. One advantage of this method is that the fitting surface is used for both the range and domain blocks, so one set of parameters can be saved. Another advantage is that the fitting surface can approximate the range and domain blocks better than the previous fitting planes; this results in smaller block matching errors and better decoded image quality. Since the no-search and quadtree techniques are adopted, smaller matching errors also imply fewer block matches, which results in a faster encoding process. Moreover, by combining all the fitting surfaces, a fitting surface image (FSI) is also proposed to speed up the fractal decoding. Experiments show that our proposed method can yield superior performance over the other three methods. Relative to the range-averaged image, the FSI provides a faster fractal decoding process. Finally, by combining the proposed fractal coding method with JPEG, a hybrid coding method is designed which can provide higher PSNR than JPEG while maintaining the same Bpp.
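    The fitting-surface idea can be illustrated with a least-squares bilinear surface per block; the parameterization z = a + bx + cy + dxy below is my own simplified form, and the paper's exact surface may differ.

```python
import numpy as np

def fit_surface(block):
    """Least-squares fit of z = a + b*x + c*y + d*x*y to a block;
    returns the fitted surface with the same shape."""
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w]
    A = np.stack([np.ones(h * w), x.ravel(), y.ravel(),
                  (x * y).ravel()], axis=1)
    coef, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
    return (A @ coef).reshape(h, w)

rng = np.random.default_rng(3)
block = rng.random((8, 8))
surf = fit_surface(block)
residual = block - surf      # what remains to be matched fractally
print(surf.shape)  # (8, 8)
```

    Subtracting such a surface from both range and domain blocks removes the smooth gray-level trend before matching, which is why the matching errors shrink relative to plane-based fits.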

  17. Claim prevention at reactor facilities

    International Nuclear Information System (INIS)

    Colby, B.P.

    1987-01-01

    Why does a radiation worker bring a claim alleging bodily injury from radiation exposure? Natural cancer, fear of radiation induced cancer, financial gain, emotional distress and mental anguish are some reasons for workers' claims. In this paper the author describes what power reactor health physicists are doing to reduce the likelihood of claims by establishing programs which provide sound protection of workers, prevent radiological events, improve workers' knowledge of radiological conditions and provide guidance for radiological incident response

  18. Development of SAGE, A computer code for safety assessment analyses for Korean Low-Level Radioactive Waste Disposal

    International Nuclear Information System (INIS)

    Zhou, W.; Kozak, Matthew W.; Park, Joowan; Kim, Changlak; Kang, Chulhyung

    2002-01-01

    This paper describes a computer code, called SAGE (Safety Assessment Groundwater Evaluation) to be used for evaluation of the concept for low-level waste disposal in the Republic of Korea (ROK). The conceptual model in the code is focused on releases from a gradually degrading engineered barrier system to an underlying unsaturated zone, thence to a saturated groundwater zone. Doses can be calculated for several biosphere systems including drinking contaminated groundwater, and subsequent contamination of foods, rivers, lakes, or the ocean by that groundwater. The flexibility of the code will permit both generic analyses in support of design and site development activities, and straightforward modification to permit site-specific and design-specific safety assessments of a real facility as progress is made toward implementation of a disposal site. In addition, the code has been written to easily interface with more detailed codes for specific parts of the safety assessment. In this way, the code's capabilities can be significantly expanded as needed. The code has the capability to treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface.

  19. 40 CFR 35.6600 - Contractor claims.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Contractor claims. 35.6600 Section 35... Actions Procurement Requirements Under A Cooperative Agreement § 35.6600 Contractor claims. (a) General... prepared by the contractor to support a claim against the recipient; and (4) The award official determines...

  20. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance in order to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  1. 31 CFR 361.8 - Claim for replacement.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Claim for replacement. 361.8 Section... § 361.8 Claim for replacement. Claim for replacement shall be made in writing to the Secretary, to the..., Parkersburg, WV 26106-1328. The claim, accompanied by a recommendation regarding the manner of replacement...

  2. Reserving by detailed conditioning on individual claim

    Science.gov (United States)

    Kartikasari, Mujiati Dwi; Effendie, Adhitya Ronnie; Wilandari, Yuciana

    2017-03-01

    The estimation of claim reserves is an important activity for insurance companies in meeting their liabilities. Recently, individual claim reserving methods have attracted a lot of interest in actuarial science, as they overcome some deficiencies of aggregated claims methods. This paper explores the Reserving by Detailed Conditioning (RDC) method, which uses all available claim information, applied to individual liability insurance claims from an Indonesian general insurance company. Furthermore, we compare it to the Chain Ladder and Bornhuetter-Ferguson methods.
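    For comparison, the aggregated Chain Ladder baseline mentioned above can be sketched on a toy cumulative run-off triangle (the numbers are invented for illustration; they are not from the Indonesian data set).

```python
import numpy as np

def chain_ladder(tri):
    """Complete a cumulative run-off triangle (NaN = future cell)
    using volume-weighted development factors; return the completed
    triangle and the total reserve (ultimates minus latest observed)."""
    T = tri.astype(float).copy()
    n = T.shape[1]
    for j in range(n - 1):
        obs = ~np.isnan(tri[:, j + 1])            # observed pairs only
        f = tri[obs, j + 1].sum() / tri[obs, j].sum()  # dev. factor
        fill = np.isnan(tri[:, j + 1])
        T[fill, j + 1] = T[fill, j] * f
    latest = np.array([row[~np.isnan(row)][-1] for row in tri])
    return T, float(T[:, -1].sum() - latest.sum())

tri = np.array([[100.0, 150.0, 165.0],
                [110.0, 160.0, np.nan],
                [120.0, np.nan, np.nan]])
T, reserve = chain_ladder(tri)
print(round(reserve, 2))  # 90.86
```

    Individual-claim methods such as RDC replace these aggregate development factors with conditioning on claim-level covariates, which is where their extra accuracy can come from.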

  3. 28 CFR 14.4 - Administrative claims; evidence and information to be submitted.

    Science.gov (United States)

    2010-07-01

    ... bearing on either the responsibility of the United States for the death or the damages claimed. (b... submitted. (a) Death. In support of a claim based on death, the claimant may be required to submit the following evidence or information: (1) An authenticated death certificate or other competent evidence...

  4. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished for the establishment of a self-reliable-technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through the exercise of plant application. An education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications.

  5. Gröbner Bases, Coding, and Cryptography

    CERN Document Server

    Sala, Massimiliano; Perret, Ludovic

    2009-01-01

    Coding theory and cryptography allow secure and reliable data transmission, which is at the heart of modern communication. This book offers a comprehensive overview on the application of commutative algebra to coding theory and cryptography. It analyzes important properties of algebraic/geometric coding systems individually.

  6. Incidence of catheter-related complications in patients with central venous or hemodialysis catheters: a health care claims database analysis.

    Science.gov (United States)

    Napalkov, Pavel; Felici, Diana M; Chu, Laura K; Jacobs, Joan R; Begelman, Susan M

    2013-10-16

    Central venous catheter (CVC) and hemodialysis (HD) catheter usage are associated with complications that occur during catheter insertion, dwell period, and removal. This study aims to identify and describe the incidence rates of catheter-related complications in a large patient population in a United States-based health care claims database after CVC or HD catheter placement. Patients in the i3 InVision DataMart® health care claims database with at least 1 CVC or HD catheter insertion claim were categorized into CVC or HD cohorts using diagnostic and procedural codes from the US Renal Data System, American College of Surgeons, and American Medical Association's Physician Performance Measures. Catheter-related complications were identified using published diagnostic and procedural codes. Incidence rates (IRs)/1000 catheter-days were calculated for complications including catheter-related bloodstream infections (CRBSIs), thrombosis, embolism, intracranial hemorrhage (ICH), major bleeding (MB), and mechanical catheter-related complications (MCRCs). Thirty percent of the CVC cohort and 54% of the HD cohort had catheter placements lasting <90 days. Catheter-related complications occurred most often during the first 90 days of catheter placement. IRs were highest for CRBSIs in both cohorts (4.0 [95% CI, 3.7-4.3] and 5.1 [95% CI, 4.7-5.6], respectively). Other IRs in CVC and HD cohorts, respectively, were thrombosis, 1.3 and 0.8; MCRCs, 0.6 and 0.7; embolism, 0.4 and 0.5; MB, 0.1 and 0.3; and ICH, 0.1 in both cohorts. Patients with cancer at baseline had significantly higher IRs for CRBSIs and thrombosis than non-cancer patients. CVC or HD catheter-related complications were most frequently seen in patients 16 years or younger. The risk of catheter-related complications is highest during the first 90 days of catheter placement in patients with CVCs and HD catheters and in younger patients (≤16 years of age) with HD catheters. Data provided in this study can be applied
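    An incidence rate per 1000 catheter-days, with a normal-approximation 95% CI, can be computed as below. The event and exposure counts are invented for illustration and are not the study's data; the study may also have used a different (e.g. exact Poisson) interval.

```python
import math

def incidence_rate(events, catheter_days, per=1000.0):
    """Point estimate and approximate 95% CI for a Poisson rate,
    expressed per `per` catheter-days (normal approximation)."""
    ir = events / catheter_days * per
    se = math.sqrt(events) / catheter_days * per   # sd of a Poisson count
    return ir, (ir - 1.96 * se, ir + 1.96 * se)

# e.g. 400 CRBSIs over 100,000 catheter-days (illustrative numbers)
ir, (lo, hi) = incidence_rate(400, 100_000)
print(round(ir, 1), round(lo, 1), round(hi, 1))  # 4.0 3.6 4.4
```

    Comparing such rates across cohorts or follow-up windows is what underlies the abstract's observation that complication risk is concentrated in the first 90 days of catheter placement.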

  7. Code accuracy evaluation of ISP 35 calculations based on NUPEC M-7-1 test

    International Nuclear Information System (INIS)

    Auria, F.D.; Oriolo, F.; Leonardi, M.; Paci, S.

    1995-01-01

    Quantitative evaluation of code uncertainties is a necessary step in the code assessment process, above all if best-estimate codes are utilised for licensing purposes. Aiming at quantifying the code accuracy, an integral methodology based on the Fast Fourier Transform (FFT) has been developed at the University of Pisa (DCMN) and has already been applied to several calculations related to primary system test analyses. This paper deals with the first application of the FFT-based methodology to containment code calculations, based on a hydrogen mixing and distribution test performed in the NUPEC (Nuclear Power Engineering Corporation) facility. It refers to pre-test and post-test calculations submitted for International Standard Problem (ISP) No. 35, a blind exercise simulating the effects of steam injection and spray behaviour on gas distribution and mixing. The results of the application of this methodology to nineteen selected variables calculated by ten participants are summarized here, together with a comparison (where possible) of the accuracy evaluated for the pre-test and post-test calculations of the same user. (author)
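    The core of such an FFT-based accuracy measure is an average amplitude of the form AA = Σ|F(calc − exp)| / Σ|F(exp)|, computed over the spectrum of the discrepancy between calculated and experimental traces. The signals below are invented for illustration; the full FFTBM also uses a weighted-frequency measure not shown here.

```python
import numpy as np

def fft_accuracy(calc, ref):
    """Average amplitude AA = sum|F(calc - ref)| / sum|F(ref)|:
    0 means perfect agreement; larger values mean worse accuracy."""
    err_spec = np.abs(np.fft.rfft(calc - ref))
    ref_spec = np.abs(np.fft.rfft(ref))
    return err_spec.sum() / ref_spec.sum()

t = np.linspace(0.0, 1.0, 256, endpoint=False)
ref = np.sin(2 * np.pi * 3 * t)                    # "experimental" trace
good = ref + 0.05 * np.sin(2 * np.pi * 7 * t)      # small discrepancy
bad = ref + 0.50 * np.sin(2 * np.pi * 7 * t)       # large discrepancy
aa_good, aa_bad = fft_accuracy(good, ref), fft_accuracy(bad, ref)
print(round(aa_good, 3), round(aa_bad, 3))  # 0.05 0.5
```

    Working in the frequency domain makes the measure insensitive to small time shifts between calculation and experiment, which is one reason it is attractive for integral code assessment.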

  8. An eddy-permitting, dynamically consistent adjoint-based assimilation system for the tropical Pacific: Hindcast experiments in 2000

    KAUST Repository

    Hoteit, Ibrahim; Cornuelle, B.; Heimbach, P.

    2010-01-01

    An eddy-permitting adjoint-based assimilation system has been implemented to estimate the state of the tropical Pacific Ocean. The system uses the Massachusetts Institute of Technology's general circulation model and its adjoint. The adjoint method

  9. 32 CFR 842.43 - Filing a claim.

    Science.gov (United States)

    2010-07-01

    ... completed Standard Form 95 or other signed and written demand for money damages in a sum certain. A claim... Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE CLAIMS AND LITIGATION ADMINISTRATIVE... amend a claim at any time prior to final action. To amend a claim, the claimant or his or her authorized...

  10. Image coding based on maximum entropy partitioning for identifying ...

    Indian Academy of Sciences (India)

    A new coding scheme based on maximum entropy partitioning is proposed in our work, particularly to identify the improbable intensities related to different emotions. The improbable intensities, when used as a mask, decode the facial expression correctly, providing an effective platform for future emotion categorization ...
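    One common reading of maximum entropy partitioning is quantile (equal-probability) binning of the intensity histogram, since equal-mass bins maximize the entropy of the partition. The sketch below is my own minimal version of that idea, not the authors' scheme.

```python
import numpy as np

def max_entropy_partition(pixels, n_bins=4):
    """Bin edges at empirical quantiles, so each bin carries roughly
    equal probability mass (which maximizes the partition entropy)."""
    qs = np.linspace(0.0, 1.0, n_bins + 1)[1:-1]
    return np.quantile(pixels, qs)

rng = np.random.default_rng(4)
pixels = rng.integers(0, 256, size=10_000)   # stand-in intensity data
edges = max_entropy_partition(pixels, 4)
codes = np.digitize(pixels, edges)           # per-pixel bin index
counts = np.bincount(codes, minlength=4)
print(counts.min() > 2000)  # True: roughly equal-mass bins
```

    Intensities falling in the lightly populated tails of each bin are the "improbable" ones; masking with them is what the abstract exploits for expression decoding.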

  11. Survey of malpractice claims in dermatology

    International Nuclear Information System (INIS)

    Altman, J.

    1975-01-01

    A statistical survey of malpractice claims asserted against dermatologists was made. The subject matter of the claims was divided into eight major categories: drug reactions, x-ray burns, poor cosmetic result following surgery, poor cosmetic result following medication, failure to diagnose cancer, improper diagnosis, infection from treatment, and miscellaneous. The study showed that a group of ''serious'' damage cases, which accounted for 34 percent of total claims, generated 94 percent of total dollar losses. The problem areas for malpractice claims appeared to be drug reactions, cosmetic chemosurgery, and failure to diagnose cancer. (U.S.)

  12. Quadratic Hedging Methods for Defaultable Claims

    International Nuclear Information System (INIS)

    Biagini, Francesca; Cretarola, Alessandra

    2007-01-01

    We apply the local risk-minimization approach to defaultable claims and we compare it with intensity-based evaluation formulas and the mean-variance hedging. We solve analytically the problem of finding respectively the hedging strategy and the associated portfolio for the three methods in the case of a default put option with random recovery at maturity
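
The setting can be made concrete by recalling the standard payoff decomposition of a defaultable claim (standard notation from the intensity-based literature; the paper's exact conventions may differ):

$$
H \;=\; X\,\mathbf{1}_{\{\tau > T\}} \;+\; Z\,\mathbf{1}_{\{\tau \le T\}},
$$

where $X$ is the promised payoff at maturity $T$ (for a put, $X = (K - S_T)^+$), $Z$ is the random recovery paid at maturity, and $\tau$ is the default time. The three approaches compared above differ in how they hedge the jump risk carried by the indicator $\mathbf{1}_{\{\tau \le T\}}$.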

  13. 37 CFR 7.12 - Claim of color.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Claim of color. 7.12 Section... § 7.12 Claim of color. (a) If color is claimed as a feature of the mark in the basic application and/or registration, the international application must include a statement that color is claimed as a...

  14. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  15. A Review on Block Matching Motion Estimation and Automata Theory based Approaches for Fractal Coding

    Directory of Open Access Journals (Sweden)

    Shailesh Kamble

    2016-12-01

    Full Text Available Fractal compression is a lossy compression technique in the field of gray/color image and video compression. It gives a high compression ratio and better image quality with fast decoding time, but improvement in encoding time remains a challenge. This review article presents an analysis of the most significant existing approaches in the field of fractal-based gray/color image and video compression: different block matching motion estimation approaches for finding the motion vectors in a frame, based on inter-frame and intra-frame (i.e., individual frame) coding, and automata theory based coding approaches for representing an image or a sequence of images. Though other review papers exist on fractal coding, this paper differs in many respects. One can develop new shape patterns for motion estimation and combine existing block matching motion estimation with automata coding to explore the fractal compression technique, with a specific focus on reducing encoding time and achieving better image/video reconstruction quality. This paper is useful for beginners in the domain of video compression.
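
The block matching motion estimation surveyed above can be illustrated by the simplest variant, exhaustive full search minimizing the sum of absolute differences (SAD). A minimal sketch (real coders use faster patterns such as three-step or diamond search):

```python
def sad(a, b):
    # sum of absolute differences between two equal-sized blocks (lists of rows)
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def block(frame, r, c, n):
    # n-by-n sub-block of `frame` with top-left corner (r, c)
    return [row[c:c + n] for row in frame[r:r + n]]

def full_search(ref, cur, r, c, n=2, p=1):
    """Find the displacement (dr, dc) within a +/-p search window that
    minimises the SAD between the current block and the reference frame."""
    target = block(cur, r, c, n)
    best = None
    for dr in range(-p, p + 1):
        for dc in range(-p, p + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr <= len(ref) - n and 0 <= cc <= len(ref[0]) - n:
                cost = sad(block(ref, rr, cc, n), target)
                if best is None or cost < best[0]:
                    best = (cost, dr, dc)
    return best  # (min SAD, row offset, col offset)

# The 2x2 block at (2,2) in `cur` is the reference block shifted down-right by one.
ref = [[9, 9, 9, 9],
       [9, 1, 2, 9],
       [9, 3, 4, 9],
       [9, 9, 9, 9]]
cur = [[9, 9, 9, 9],
       [9, 9, 9, 9],
       [9, 9, 1, 2],
       [9, 9, 3, 4]]
mv = full_search(ref, cur, 2, 2, n=2, p=1)  # -> (0, -1, -1)
```

The motion vector points back into the reference frame, so a perfect match yields SAD 0 with offsets (-1, -1) here.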

  16. Construction method of QC-LDPC codes based on multiplicative group of finite field in optical communication

    Science.gov (United States)

    Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui

    2016-09-01

    In order to meet the needs of the high-speed development of optical communication systems, a construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity check matrix of the code constructed by this method has no cycle of length 4, which ensures that the obtained code has good distance properties. Simulation results show that when the bit error rate (BER) is 10^-6, in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3 780, 3 540) code with a code rate of 93.7% is improved by 2.18 dB and 1.6 dB respectively compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32 640, 30 592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3 780, 3 540) code is respectively 0.2 dB and 0.4 dB higher compared with those of the SG-QC-LDPC(3 780, 3 540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3 780, 3 540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3 780, 3 540) code can be well applied in optical communication systems.
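
The flavour of such constructions, and the length-4-cycle check, can be sketched. Below, the exponent matrix entry is a product a^i * b^j in the multiplicative group of GF(q) (q prime for simplicity; the cited paper's exact matrix may differ), and the absence of 4-cycles after circulant lifting is tested with Fossorier's well-known condition:

```python
def base_matrix(q, rows, cols, a, b):
    """Exponent matrix whose (i, j) entry is a^i * b^j mod q, with a, b
    nonzero elements of the multiplicative group of GF(q)."""
    return [[(pow(a, i, q) * pow(b, j, q)) % q for j in range(cols)]
            for i in range(rows)]

def girth_at_least_6(B, L):
    """Fossorier's condition: a QC-LDPC code lifted with circulant size L
    has a length-4 cycle iff B[i1][j1]-B[i1][j2]+B[i2][j2]-B[i2][j1] = 0
    (mod L) for some i1<i2, j1<j2. True when no such cycle exists."""
    m, n = len(B), len(B[0])
    for i1 in range(m):
        for i2 in range(i1 + 1, m):
            for j1 in range(n):
                for j2 in range(j1 + 1, n):
                    if (B[i1][j1] - B[i1][j2] + B[i2][j2] - B[i2][j1]) % L == 0:
                        return False
    return True

# Distinct powers of a and b make the cycle sum factor as
# (a^i1 - a^i2)(b^j1 - b^j2) != 0 mod q, so the lifted graph has girth >= 6.
B = base_matrix(q=7, rows=3, cols=4, a=3, b=5)
ok = girth_at_least_6(B, L=7)  # True
```

An all-zero exponent matrix, by contrast, fails the check immediately, which is why structured exponents are needed.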

  17. 45 CFR 35.4 - Administrative claims; evidence and information to be submitted.

    Science.gov (United States)

    2010-10-01

    ... bearing on either the responsibility of the United States for the death or the damages claimed. (b... information to be submitted. (a) Death. In support of a claim based on death, the claimant may be required to submit the following evidence or information: (1) An authenticated death certificate or other competent...

  18. Completeness of retail pharmacy claims data: implications for pharmacoepidemiologic studies and pharmacy practice in elderly patients.

    Science.gov (United States)

    Polinski, Jennifer M; Schneeweiss, Sebastian; Levin, Raisa; Shrank, William H

    2009-09-01

    In the elderly (those aged ≥65 years), retail pharmacy claims are used to study drug use among the uninsured after drug policy changes, to prevent drug-drug interactions and duplication of therapy, and to guide medication therapy management. Claims include only prescriptions filled at 1 pharmacy location or within 1 pharmacy chain and do not include prescriptions filled at outside pharmacies, potentially limiting research accuracy and pharmacy-based safety interventions. The aims of this study were to assess elderly patients' pharmacy loyalty and to identify predictors of using multiple pharmacies. Patients enrolled in the Pharmaceutical Assistance Contract for the Elderly (PACE) pharmacy benefit program with corresponding Medicare claims in the state of Pennsylvania comprised the study cohort. Among patients with pharmacy claims from all pharmacies used in 2004-2005, a primary pharmacy was defined as the pharmacy where at least 50% of a patient's prescriptions were filled. The number of pharmacies/chains used and prescriptions filled in 2005 was calculated. Predictors of using multiple pharmacies in 2005 were age, female gender, white race, urban residency, comorbidities, number of distinct chemical drugs (unique medications) used, and number of prescriptions filled, which were all assessed in 2004. In total, pharmacy claims data from 182,116 patients (147,718 women [81.1%]; mean [SD] age, 78.8 [7.1] years; 168,175 white [92.3%]; 76,580 [42.1%] residing in an urban zip code area) were included. Of the 182,116 PACE patients in the study, a primary pharmacy was identified for 180,751 patients (99.3%). In 2005, patients filled an average of 59.3 prescriptions, with 57.0 prescriptions (96.1%) having been filled at the primary pharmacy. Compared with patients who used <5 unique medications in 2004, patients who used ≥15 unique medications had a 2.66 times (95% CI, 2.53-2.80) greater likelihood of using multiple pharmacies in 2005. Patients aged ≥85 years were 1.07 times (95% CI, 1.04-1.11) as likely to use
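
The study's primary-pharmacy definition (the pharmacy filling at least 50% of a patient's prescriptions) is a simple tally over claims. A sketch with hypothetical pharmacy names:

```python
from collections import Counter

def primary_pharmacy(claims, threshold=0.5):
    """Return the pharmacy that filled at least `threshold` of the
    prescriptions in `claims` (a list of pharmacy identifiers, one per
    filled prescription), or None when no pharmacy reaches it."""
    counts = Counter(claims)
    pharmacy, n = counts.most_common(1)[0]
    return pharmacy if n >= threshold * len(claims) else None

# Illustrative patient: 57 of 59 fills at one chain (names are made up).
fills = ["ChainA"] * 57 + ["Indep"] * 2
primary = primary_pharmacy(fills)  # "ChainA"
```

Patients for whom this returns None are the minority (0.7% in the study) without an identifiable primary pharmacy.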

  19. Completeness of Retail Pharmacy Claims Data: Implications for Pharmacoepidemiologic Studies and Pharmacy Practice in Elderly Adults

    Science.gov (United States)

    Polinski, Jennifer M.; Schneeweiss, Sebastian; Levin, Raisa; Shrank, William H.

    2009-01-01

    Background In the elderly (those aged ≥65 years), retail pharmacy claims are used to study drug use among the uninsured after drug policy changes, to prevent drug-drug interactions and duplication of therapy, and to guide medication therapy management. Claims include only prescriptions filled at one pharmacy location or within one pharmacy chain and do not include prescriptions filled at outside pharmacies, potentially limiting research accuracy and pharmacy-based safety interventions. Objectives The aims of this study were to assess elderly patients’ pharmacy loyalty and to identify predictors of using multiple pharmacies. Methods Patients enrolled in the Pharmaceutical Assistance Contract for the Elderly pharmacy benefit program with corresponding Medicare claims in the state of Pennsylvania comprised the study cohort. Among patients with pharmacy claims from all pharmacies used in 2004–2005, a primary pharmacy was defined as the pharmacy where >50% of a patient’s prescriptions were filled. The number of pharmacies/chains used and prescriptions filled in 2005 was calculated. Predictors of using multiple pharmacies in 2005 were age, gender, race, urban residency, comorbidities, number of unique medications used, and number of prescriptions, which were all assessed in 2004. Results In total, pharmacy claims data from 182,235 patients (147,718 [81.1%] women; mean [SD] age 78.8 [7.1] years; 168,175 white; 76,580 residing in an urban zip code area) were included. In 2005, patients filled an average of 59.3 prescriptions, with 57.0 (96.1%) prescriptions having been filled at the primary pharmacy. Compared with patients who used <5 unique medications in 2004, patients who used 6 to 9 unique medications had 1.39 times (95% CI, 1.34–1.44), and patients who used ≥15 unique medications had 2.68 times (95% CI, 2.55–2.82) greater likelihood of using multiple pharmacies in 2005. Patients aged ≥85 years were 1.07 times (95% CI, 1.03–1.11) as likely to use

  20. Help, my rating looks bad! Coding comorbidities in arthroplasty.

    Science.gov (United States)

    Galloway, Joseph D; Voss, Frank R

    2016-09-01

    In medicine today, there is a trend toward increasing transparency. Higher quality and better value are being sought, and one of the methods being used is publicly reported health care outcomes. However, there is a problem that comes from our loss of anonymity. Physicians who are being individually watched have to choose between doing what is best for the patient and doing what would look good when it is publicly reported. Often this might mean choosing not to treat a particularly sick patient who is unlikely to have a good outcome. Adjusting outcomes to account for risk factors should be a way to prevent this effect, but these methods need to be studied more. The current performance measures being released are based on administrative claims data, and to date, much of that information is not properly risk adjusted. To ensure that the increasing transparency reveals an accurate picture, it is critical that the complexity of care provided by surgeons be carefully documented. Therefore, we propose accurate coding of patients' comorbidities during hospitalization for total knee arthroplasty and total hip arthroplasty, and we have included a chart detailing our recommendations of the specific diagnostic codes that are most important.

  1. Validation of intellectual disability coding through hospital morbidity records using an intellectual disability population-based database in Western Australia.

    Science.gov (United States)

    Bourke, Jenny; Wong, Kingsley; Leonard, Helen

    2018-01-23

    To investigate how well intellectual disability (ID) can be ascertained using hospital morbidity data compared with a population-based data source. All children born in 1983-2010 with a hospital admission in the Western Australian Hospital Morbidity Data System (HMDS) were linked with the Western Australian Intellectual Disability Exploring Answers (IDEA) database. The International Classification of Diseases hospital codes consistent with ID were also identified. The characteristics of those children identified with ID through either or both sources were investigated. Of the 488 905 individuals in the study, 10 218 (2.1%) were identified with ID in either IDEA or HMDS with 1435 (14.0%) individuals identified in both databases, 8305 (81.3%) unique to the IDEA database and 478 (4.7%) unique to the HMDS dataset only. Of those unique to the HMDS dataset, about a quarter (n=124) had died before 1 year of age and most of these (75%) before 1 month. Children with ID who were also coded as such in the HMDS data were more likely to be aged under 1 year, female, non-Aboriginal and have a severe level of ID, compared with those not coded in the HMDS data. The sensitivity of using HMDS to identify ID was 14.7%, whereas the specificity was much higher at 99.9%. Hospital morbidity data are not a reliable source for identifying ID within a population, and epidemiological researchers need to take these findings into account in their study design. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
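
The reported sensitivity and specificity follow directly from the linkage counts above, treating the IDEA register as the reference standard for hospital (HMDS) ID coding. A quick check:

```python
def sens_spec(tp, fp, fn, tn):
    """Sensitivity and specificity of a screening source against a gold
    standard, from the four cells of the 2x2 linkage table."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts from the abstract: both sources, IDEA-only, HMDS-only.
total = 488_905
both, idea_only, hmds_only = 1_435, 8_305, 478
tn = total - (both + idea_only) - hmds_only
sens, spec = sens_spec(tp=both, fp=hmds_only, fn=idea_only, tn=tn)
# sens ~ 0.147 and spec ~ 0.999, matching the reported 14.7% and 99.9%
```

The calculation makes the paper's conclusion concrete: hospital coding misses most ID cases (low sensitivity) while rarely mislabelling unaffected children (high specificity).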

  2. A Secure Network Coding Based on Broadcast Encryption in SDN

    Directory of Open Access Journals (Sweden)

    Yue Chen

    2016-01-01

    Full Text Available By allowing intermediate nodes to encode the received packets before sending them out, network coding improves the capacity and robustness of multicast applications. However, it is vulnerable to pollution attacks. Some signature schemes have been proposed to thwart such attacks, but most of them need to be homomorphic, so their keys cannot be generated and managed easily. In this paper, we propose a novel fast and secure switch network coding multicast (SSNC) scheme on software-defined networks (SDN). In our scheme, the complicated secure multicast management is separated from the fast data transmission based on the SDN. Multiple multicasts are aggregated into one multicast group according to the requirements of services and the network status. The controller then routes the aggregated multicast group with network coding; only trusted switches are allowed to join the network coding, by using broadcast encryption. The proposed scheme can use traditional cryptography without homomorphism, which greatly reduces the complexity of the computation and improves the efficiency of transmission.
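
The coding operation being secured is the intermediate node's linear combination of received packets. A minimal sketch over GF(2), where combining is just XOR (the SDN controller and broadcast-encryption machinery are not modelled):

```python
def encode(packets, coeffs):
    """Forward a GF(2) linear combination of packets: XOR together the
    packets selected by the binary coefficient vector."""
    out = [0] * len(packets[0])
    for c, p in zip(coeffs, packets):
        if c:
            out = [b ^ q for b, q in zip(out, p)]
    return out

p1, p2 = [1, 0, 1, 1], [0, 1, 1, 0]
mix = encode([p1, p2], [1, 1])         # p1 XOR p2 = [1, 1, 0, 1]
recovered = encode([mix, p2], [1, 1])  # XOR p2 back out recovers p1
```

Because any node can emit such combinations, a single polluted packet contaminates everything downstream, which is the attack the signature and broadcast-encryption schemes above are designed to prevent.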

  3. Chiropractic claims in the English-speaking world.

    Science.gov (United States)

    Ernst, Edzard; Gilbey, Andrew

    2010-04-09

    Some chiropractors and their associations claim that chiropractic is effective for conditions that lack sound supporting evidence or scientific rationale. This study therefore sought to determine the frequency of World Wide Web claims of chiropractors and their associations to treat asthma, headache/migraine, infant colic, colic, ear infection/earache/otitis media, neck pain, and whiplash (not supported by sound evidence), and lower back pain (supported by some evidence). A review of 200 chiropractor websites and 9 chiropractic associations' World Wide Web claims in Australia, Canada, New Zealand, the United Kingdom, and the United States was conducted between 1 October 2008 and 26 November 2008. The outcome measure was claims (either direct or indirect) regarding the eight reviewed conditions, made in the context of chiropractic treatment. We found evidence that 190 (95%) chiropractor websites made unsubstantiated claims regarding at least one of the conditions. When colic and infant colic data were collapsed into one heading, there was evidence that 76 (38%) chiropractor websites made unsubstantiated claims about all the conditions not supported by sound evidence. Fifty-six (28%) websites and 4 of the 9 (44%) associations made claims about lower back pain, whereas 179 (90%) websites and all 9 associations made unsubstantiated claims about headache/migraine. Unsubstantiated claims were also made about asthma, ear infection/earache/otitis media, and neck pain. The majority of chiropractors and their associations in the English-speaking world seem to make therapeutic claims that are not supported by sound evidence, whilst only 28% of chiropractor websites promote lower back pain treatment, which is supported by some evidence. We suggest that the ubiquity of these unsubstantiated claims constitutes an ethical and public health issue.

  4. Development of the Monju core safety analysis numerical models by super-COPD code

    International Nuclear Information System (INIS)

    Yamada, Fumiaki; Minami, Masaki

    2010-12-01

    Japan Atomic Energy Agency constructed a computational model for safety analysis of the Monju reactor core, to be built into the modularized plant dynamics analysis code Super-COPD, for the purpose of evaluating heat removal capability for the 21 transients defined in the annex to the construction permit application. The applicability of this model to core heat removal capability evaluation has been assessed by back-to-back comparisons of the constituent models with conventionally applied codes and by application of the unified model. The numerical model for core safety analysis has been built on the best-estimate model validated against plant behavior actually measured at up to 40% rated power, taking over the safety analysis models of the conventionally applied COPD and HARHO-IN codes, and is capable of overall calculations of the entire plant including the safety protection and control systems. Among the constituents of the analytical model, the neutronic-thermal model and the heat transfer and hydraulic models of the PHTS, SHTS, and water/steam system are individually verified by comparison with the conventional calculations. Comparisons are also made with plant behavior actually measured at up to 40% rated power to confirm the adequacy of the calculations and the conservativeness of the input data. The unified analytical model was applied to analyses of eight anomaly events in total: reactivity insertion, abnormal power distribution, and decrease and increase of coolant flow rate in the PHTS, SHTS, and water/steam systems. The resulting maximum values and temporal variations of the key safety evaluation parameters (temperatures of the fuel, cladding, in-core sodium coolant, and RV inlet and outlet coolant) show negligible discrepancies from the existing analysis results in the annex to the construction permit application, verifying the unified analytical model. These works have enabled analytical evaluation of Monju core heat removal capability by Super-COPD utilizing the

  5. 20 CFR 405.340 - Deciding a claim without a hearing before an administrative law judge.

    Science.gov (United States)

    2010-04-01

    ....340 Deciding a claim without a hearing before an administrative law judge. (a) Decision wholly... the decision is based. (b) You do not wish to appear. The administrative law judge may decide a claim... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Deciding a claim without a hearing before an...

  6. 15 CFR 700.90 - Protection against claims.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Protection against claims. 700.90 Section 700.90 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE NATIONAL SECURITY INDUSTRIAL BASE REGULATIONS...

  7. Internet-based Advertising Claims and Consumer Reasons for Using Electronic Cigarettes by Device Type in the US

    OpenAIRE

    Pulvers, K; Sun, JY; Zhuang, Y-L; Holguin, G; Zhu, S-H

    2017-01-01

    Objectives Important differences exist between closed-system and open-system e-cigarettes, but it is unknown whether online companies are marketing these devices differently and whether consumer reasons for using e-cigarettes vary by device type. This paper compares Internet-based advertising claims of closed- versus open-system products, and evaluates US consumers’ reasons for using closed- versus open-system e-cigarettes. Methods Internet sites selling exclusively closed (N = 130)...

  8. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    Science.gov (United States)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size from 16 to 32 in the 1980s and from 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for a specific system. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including the use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.
For the past thirty-plus years

  9. A Secure Network Coding-based Data Gathering Model and Its Protocol in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Qian Xiao

    2012-09-01

    Full Text Available To provide security for data gathering based on network coding in wireless sensor networks (WSNs), a secure network coding-based data gathering model is proposed, and a data-privacy preserving and pollution preventing (DPP&PP) protocol using network coding is designed. DPP&PP makes use of a newly proposed pollution symbol selection and pollution (PSSP) scheme based on a new obfuscation idea to pollute existing symbols. Analyses of DPP&PP show that it not only requires low overhead in computation and communication, but also provides high security against brute-force attacks.

  10. 12 CFR 627.2750 - Priority of claims-banks.

    Science.gov (United States)

    2010-01-01

    ...) All claims for taxes. (f) All claims of creditors which are secured by specific assets or equities of... accordance with priorities of applicable Federal or State law. (g) All claims of holders of bonds issued by... claims of holders of consolidated and System-wide bonds and all claims of the other Farm Credit banks...

  11. 12 CFR 793.4 - Administrative claims; evidence and information to be submitted.

    Science.gov (United States)

    2010-01-01

    ... evidence or information which may have a bearing on the responsibility of the United States for the death... GOVERNMENT Procedures § 793.4 Administrative claims; evidence and information to be submitted. (a) Death. In support of a claim based on death, the claimant may be required to submit the following evidence or...

  12. Online advertising and marketing claims by providers of proton beam therapy: are they guideline-based?

    Science.gov (United States)

    Corkum, Mark T; Liu, Wei; Palma, David A; Bauman, Glenn S; Dinniwell, Robert E; Warner, Andrew; Mishra, Mark V; Louie, Alexander V

    2018-03-15

    Cancer patients frequently search the Internet for treatment options, and hospital websites are seen as reliable sources of knowledge. Guidelines support the use of proton radiotherapy in specific disease sites or on clinical trials. This study aims to evaluate direct-to-consumer advertising content and claims made by proton therapy centre (PTC) websites worldwide. Operational PTC websites in English were identified through the Particle Therapy Co-Operative Group website. Data abstraction of website content was performed independently by two investigators. Eight international guidelines were consulted to determine guideline-based indications for proton radiotherapy. Univariate and multivariate logistic regression models were used to determine the characteristics of PTC websites that indicated proton radiotherapy offered greater disease control or cure rates. Forty-eight PTCs with 46 English websites were identified. 60·9% of PTC websites claimed proton therapy provided improved disease control or cure. U.S. websites listed more indications than international websites (15·5 ± 5·4 vs. 10·4 ± 5·8, p = 0·004). The most common disease sites advertised were prostate (87·0%), head and neck (87·0%) and pediatrics (82·6%), all of which were indicated in at least one international guideline. Several disease sites advertised were not present in any consensus guidelines, including pancreatobiliary (52·2%), breast (50·0%), and esophageal (43·5%) cancers. Multivariate analysis found that advertising a greater number of disease sites and claiming that the centre was a local or regional leader in proton radiotherapy were associated with indicating that proton radiotherapy offers greater disease control or cure. Information from PTC websites often differs from recommendations found in international consensus guidelines. As online marketing information may have significant influence on patient decision-making, alignment of such information with accepted guidelines and consensus
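
For a single binary website characteristic, the univariate logistic regression used above reduces to an odds ratio from a 2x2 table (exp(beta) for the predictor). A sketch with hypothetical counts, not the paper's data:

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Odds ratio from a 2x2 table; for one binary predictor this equals
    exp(beta) from a univariate logistic regression."""
    return (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)

# Hypothetical: 20 of 25 "regional leader" sites vs. 8 of 21 other sites
# claimed greater disease control or cure.
or_leader = odds_ratio(20, 5, 8, 13)  # (20/5)/(8/13) = 6.5
```

The multivariate models in the study then adjust such ratios for the other website characteristics simultaneously.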

  13. 20 CFR 670.520 - Are students permitted to hold jobs other than work-based learning opportunities?

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Are students permitted to hold jobs other than work-based learning opportunities? 670.520 Section 670.520 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT...

  14. MatMCNP: A Code for Producing Material Cards for MCNP

    Energy Technology Data Exchange (ETDEWEB)

    DePriest, Kendall Russell [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Saavedra, Karen C. [American Structurepoint, Inc., Indianapolis, IN (United States)

    2014-09-01

    A code for generating MCNP material cards (MatMCNP) has been written and verified for naturally occurring, stable isotopes. The program allows for material specification as either atomic or weight percent (fractions). MatMCNP also permits the specification of enriched lithium, boron, and/or uranium. In addition to producing the material cards for MCNP, the code calculates the atomic (or number) density in atoms/barn-cm as well as the multiplier that should be used to convert neutron and gamma fluences into dose in the material specified.
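
The atomic density MatMCNP reports follows from the standard relation N = rho * N_A / A, converted from atoms/cm^3 to atoms/barn-cm (1 barn-cm = 1e-24 cm^3). A worked sketch (the aluminum values are illustrative reference data, not taken from the report):

```python
AVOGADRO = 6.02214076e23  # atoms/mol

def atom_density(rho_g_cc, atomic_mass_g_mol):
    """Atomic (number) density in atoms/barn-cm, the quantity used on
    MCNP material/cell cards: N = rho * N_A / A * 1e-24."""
    return rho_g_cc * AVOGADRO / atomic_mass_g_mol * 1e-24

# Aluminum: rho = 2.6989 g/cm^3, A = 26.9815 g/mol
n_al = atom_density(2.6989, 26.9815)  # ~0.0602 atoms/barn-cm
```

The same density also yields the multiplier for converting a fluence (particles/cm^2) into dose in the specified material, as the abstract notes.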

  15. Nutrition labelling, marketing techniques, nutrition claims and health claims on chip and biscuit packages from sixteen countries.

    Science.gov (United States)

    Mayhew, Alexandra J; Lock, Karen; Kelishadi, Roya; Swaminathan, Sumathi; Marcilio, Claudia S; Iqbal, Romaina; Dehghan, Mahshid; Yusuf, Salim; Chow, Clara K

    2016-04-01

    Food packages were objectively assessed to explore differences in nutrition labelling, selected promotional marketing techniques and health and nutrition claims between countries, in comparison to national regulations. Cross-sectional. Chip and sweet biscuit packages were collected from sixteen countries at different levels of economic development in the EPOCH (Environmental Profile of a Community's Health) study between 2008 and 2010. Seven hundred and thirty-seven food packages were systematically evaluated for nutrition labelling, selected promotional marketing techniques relevant to nutrition and health, and health and nutrition claims. We compared pack labelling in countries with labelling regulations, with voluntary regulations and no regulations. Overall 86 % of the packages had nutrition labels, 30 % had health or nutrition claims and 87 % displayed selected marketing techniques. On average, each package displayed two marketing techniques and one health or nutrition claim. In countries with mandatory nutrition labelling a greater proportion of packages displayed nutrition labels, had more of the seven required nutrients present, more total nutrients listed and higher readability compared with those with voluntary or no regulations. Countries with no health or nutrition claim regulations had fewer claims per package compared with countries with regulations. Nutrition label regulations were associated with increased prevalence and quality of nutrition labels. Health and nutrition claim regulations were unexpectedly associated with increased use of claims, suggesting that current regulations may not have the desired effect of protecting consumers. Of concern, lack of regulation was associated with increased promotional marketing techniques directed at children and misleadingly promoting broad concepts of health.

  16. Project W-314 phase I environmental permits and approvals plan

    International Nuclear Information System (INIS)

    TOLLEFSON, K.S.

    1999-01-01

    This document describes the range of environmental actions, including required permits and other agency approvals, for Project W-314 activities in the Hanford Site's Tank Waste Remediation System. This document outlines alternative approaches to satisfying applicable environmental standards, and describes selected strategies for acquiring permits and other approvals needed for waste feed delivery to proceed. This document also includes estimated costs and schedule to obtain the required permits and approvals based on the selected strategy. It also provides estimated costs for environmental support during design and construction based on the preliminary project schedule provided

  17. SCDAP/RELAP5/MOD 3.1 code manual: Damage progression model theory. Volume 2

    International Nuclear Information System (INIS)

    Davis, K.L.

    1995-06-01

    The SCDAP/RELAP5 code has been developed for best estimate transient simulation of light water reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, fission products released during a severe accident transient as well as large and small break loss of coolant accidents, operational transients such as anticipated transient without SCRAM, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. This volume contains detailed descriptions of the severe accident models and correlations. It provides the user with the underlying assumptions and simplifications used to generate and implement the basic equations into the code, so an intelligent assessment of the applicability and accuracy of the resulting calculation can be made

  18. SCDAP/RELAP5/MOD 3.1 code manual: Damage progression model theory. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

Davis, K.L. (ed.); Allison, C.M.; Berna, G.A. (Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)); and others

    1995-06-01

The SCDAP/RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and fission products released during a severe accident transient, as well as large- and small-break loss-of-coolant accidents and operational transients such as anticipated transient without SCRAM, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. This volume contains detailed descriptions of the severe accident models and correlations. It provides the user with the underlying assumptions and simplifications used to generate and implement the basic equations into the code, so that an intelligent assessment of the applicability and accuracy of the resulting calculation can be made.

  19. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  20. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

The entanglement-assisted formalism generalizes the standard stabilizer formalism and can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  1. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
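
As a point of reference for the dynamic algorithm, the static Shannon code it builds on can be sketched in a few lines. The symbol probabilities below are illustrative, not taken from the paper:

```python
import math

def shannon_code(probs):
    """Classic (static) Shannon coding: symbol i gets the first
    ceil(-log2 p_i) bits of the binary expansion of the cumulative
    probability of all more-probable symbols. The result is prefix-free."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])  # decreasing probability
    code, cum = {}, 0.0
    for sym, p in items:
        length = math.ceil(-math.log2(p))
        bits, frac = [], cum
        for _ in range(length):           # first `length` bits of binary(cum)
            frac *= 2
            bit, frac = int(frac), frac - int(frac)
            bits.append(str(bit))
        code[sym] = "".join(bits)
        cum += p
    return code

code = shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

Each codeword length is within one bit of the symbol's self-information, which is the bound the dynamic variants compete against.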

  2. 76 FR 36176 - Fully Developed Claim (Fully Developed Claims-Applications for Compensation, Pension, DIC, Death...

    Science.gov (United States)

    2011-06-21

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0747] Fully Developed Claim (Fully Developed Claims--Applications for Compensation, Pension, DIC, Death Pension, and/or Accrued Benefits); Correction AGENCY: Veterans Benefits Administration, Department of Veterans Affairs. ACTION: Notice; correction...

  3. Risk of Contractors’ Claims On the Example of Road Works

    Science.gov (United States)

    Rybka, Iwona; Bondar-Nowakowska, Elżbieta; Pawluk, Katarzyna; Połoński, Mieczysław

    2017-10-01

The aim of the study is to analyse claims filed by building contractors during project implementation. The work is divided into two parts. In the first part, problems associated with the management of claims in the construction process are discussed. Bearing in mind that claims may result in prolongation of the investment or exceeding the planned budget, the possibilities of applying information included in documents connected with the claims procedure to risk management were analysed in the second part of the study. The basis of the analysis is a review of 226 documents. They originate from 8 construction sites completed in the last 5 years in southwestern Poland. In each case, these were linear road projects, executed by different contractors, according to the conditions of contract set out in the “Yellow Book” FIDIC. In the study, other documents relating to events that, according to the contractors, entitled them to claim were also analysed. They included, among others: project documentation, terms of reference, the construction log, reports and correspondence under the contract. The events constituting the reason for contractors' claims were classified according to their sources. 8 areas of potential threats were distinguished. They were presented in the form of a block diagram. Most events initiating claims were reported in the group of adverse actions of third parties, while the fewest were recorded in the group of lack of access to the construction site. Based on calculated similarity indicators, it was found that the considered construction sites were diversified in terms of the number of occurrences of events that generated claims and their sources. In recent years, many road projects have been completed behind schedule and their initially planned budgets significantly exceeded. The conducted research indicated that data derived from the analysis of documents connected with claims can be applied to identify and classify both cost and schedule risk factors

  4. Coding update of the SMFM definition of low risk for cesarean delivery from ICD-9-CM to ICD-10-CM.

    Science.gov (United States)

    Armstrong, Joanne; McDermott, Patricia; Saade, George R; Srinivas, Sindhu K

    2017-07-01

In 2015, the Society for Maternal-Fetal Medicine developed a low-risk-for-cesarean-delivery definition based on administrative claims-based diagnosis codes described by the International Classification of Diseases, Ninth Revision, Clinical Modification. The Society for Maternal-Fetal Medicine definition is a clinical enrichment of 2 available measures from the Joint Commission and the Agency for Healthcare Research and Quality. The Society for Maternal-Fetal Medicine measure excludes diagnosis codes that represent clinically relevant risk factors that are absolute or relative contraindications to vaginal birth while retaining diagnosis codes, such as labor disorders, that are discretionary risk factors for cesarean delivery. The introduction of the International Statistical Classification of Diseases, 10th Revision, Clinical Modification in October 2015 expanded the number of available diagnosis codes and enabled a greater depth and breadth of clinical description. These coding improvements further enhance the clinical validity of the Society for Maternal-Fetal Medicine definition and its potential utility in tracking progress toward the goal of safely lowering the US cesarean delivery rate. This report updates the Society for Maternal-Fetal Medicine definition of low risk for cesarean delivery using International Statistical Classification of Diseases, 10th Revision, Clinical Modification coding. Copyright © 2017. Published by Elsevier Inc.
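
Mechanically, a claims-based measure of this kind is an exclusion filter over a delivery's diagnosis codes. A minimal sketch, using placeholder codes rather than the published SMFM lists:

```python
# Placeholder exclusion codes for illustration only; the actual
# ICD-10-CM lists are defined in the SMFM measure specification.
EXCLUSIONS = {"X10.1", "X20.2", "X30.3"}

def low_risk(claim_codes):
    """A delivery counts as low risk for cesarean if none of its claim
    diagnosis codes is an excluded (contraindicating) code; discretionary
    codes such as labor disorders are retained in the denominator."""
    return EXCLUSIONS.isdisjoint(claim_codes)

# a delivery with only non-excluded codes stays in the low-risk cohort
keep = low_risk({"DX1", "DX2"})
drop = low_risk({"DX1", "X10.1"})
```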

  5. Performance of asynchronous fiber-optic code division multiple access system based on three-dimensional wavelength/time/space codes and its link analysis.

    Science.gov (United States)

    Singh, Jaswinder

    2010-03-10

A novel family of three-dimensional (3-D) wavelength/time/space codes for asynchronous optical code-division multiple-access (CDMA) systems with "zero" off-peak autocorrelation and "unity" cross correlation is reported. Antipodal signaling and differential detection are employed in the system. A maximum of [(W x T+1) x W] codes are generated for unity cross correlation, where W and T are the number of wavelengths and time chips used in the code and are prime. The conditions for violation of the cross-correlation constraint are discussed. The expressions for the number of generated codes are determined for various code dimensions. It is found that the maximum number of codes is generated for S systems. The codes have a code-set-size to code-size ratio greater than W/S. For instance, with a code size of 2065 (59 x 7 x 5), a total of 12,213 users can be supported, and 130 simultaneous users at a bit-error rate (BER) of 10^-9. An arrayed-waveguide-grating-based reconfigurable encoder/decoder design for 2-D implementation of the 3-D codes is presented so that the need for multiple star couplers and fiber ribbons is eliminated. The hardware requirements of the coders used for various modulation/detection schemes are given. The effect of insertion loss in the coders is shown to be significantly reduced with loss compensation by using an amplifier after encoding. An optical CDMA system for four users is simulated, and the results presented show the improvement in performance with the use of loss compensation.

  6. Are the correct herbal claims by Hildegard von Bingen only lucky strikes? A new statistical approach.

    Science.gov (United States)

    Uehleke, Bernhard; Hopfenmueller, Werner; Stange, Rainer; Saller, Reinhard

    2012-01-01

Ancient and medieval herbal books are often believed to describe the same claims still in use today. Medieval herbal books, however, provide long lists of claims for each herb, most of which are not approved today, while the herb's modern use is often missing. So the hypothesis arises that a medieval author could have randomly hit on 'correct' claims among his many 'wrong' ones. We developed a statistical procedure based on a simple probability model. We applied our procedure to the herbal books of Hildegard von Bingen (1098-1179) as an example for its usefulness. Claim attributions for a certain herb were classified as 'correct' if approximately the same as indicated in actual monographs. The number of 'correct' claim attributions was significantly higher than it could have been by pure chance, even though the vast majority of Hildegard von Bingen's claims were not 'correct'. The hypothesis that Hildegard would have achieved her 'correct' claims purely by chance can be clearly rejected. The finding that medical claims provided by a medieval author are significantly related to modern herbal use supports the importance of traditional medicinal systems as an empirical source. However, since many traditional claims are not in accordance with modern applications, they should be used carefully and analyzed in a systematic, statistics-based manner. Our statistical approach can be used for further systematic comparison of herbal claims of traditional sources as well as in the fields of ethnobotany and ethnopharmacology. Copyright © 2012 S. Karger AG, Basel.
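
The core of such a procedure is an upper-tail binomial probability: how likely is it to hit at least k 'correct' claims out of n purely by chance? A sketch with purely illustrative numbers (the paper's actual counts and per-claim chance differ):

```python
from math import comb

def p_at_least(n, k, p):
    """Upper-tail binomial probability P(X >= k) for X ~ Bin(n, p):
    the chance of at least k 'correct' claims arising purely at random."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# hypothetical numbers for illustration only (not from the paper):
# 300 claim attributions, 30 judged 'correct', 5% chance per claim
p_value = p_at_least(300, 30, 0.05)
```

A small p-value here is exactly the sense in which the chance hypothesis is "clearly rejected".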

  7. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...
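
The Reed-Solomon re-interpretation the monograph starts from is concrete enough to sketch: a codeword is the list of values of a low-degree message polynomial at distinct field points, and any [n, k] code built this way meets the MDS bound n - k + 1 on minimum distance. A toy example over GF(7):

```python
def rs_encode(msg, xs, p):
    """Reed-Solomon codeword over GF(p): evaluate the message polynomial
    msg[0] + msg[1]*x + ... + msg[k-1]*x^(k-1) at the distinct points xs."""
    return [sum(c * pow(x, i, p) for i, c in enumerate(msg)) % p for x in xs]

# an [n=6, k=3] code over GF(7): two distinct degree-<3 polynomials agree
# at fewer than 3 points, so codewords differ in at least 6 - 3 + 1 = 4 places
cw = rs_encode([2, 0, 1], list(range(6)), 7)  # message polynomial 2 + x^2
```

Algebraic-geometric codes generalize this picture by replacing the evaluation points with divisors on a curve, which is how the asymptotically better parameters mentioned above become reachable.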

  8. Code cases for implementing risk-based inservice testing in the ASME OM code

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices

  9. Code cases for implementing risk-based inservice testing in the ASME OM code

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

    Historically inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly-conservative scope for IST components is to use the PRA and plant expert panels to create a two tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for the type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases are being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategy for type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices.

  10. Nature of Medical Malpractice Claims Against Radiation Oncologists

    Energy Technology Data Exchange (ETDEWEB)

Marshall, Deborah; Tringale, Kathryn [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); Connor, Michael [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); University of California Irvine School of Medicine, Irvine, California (United States); Punglia, Rinaa [Department of Radiation Oncology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts (United States); Recht, Abram [Department of Radiation Oncology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, Massachusetts (United States); Hattangadi-Gluth, Jona, E-mail: jhattangadi@ucsd.edu [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States)

    2017-05-01

    Purpose: To examine characteristics of medical malpractice claims involving radiation oncologists closed during a 10-year period. Methods and Materials: Malpractice claims filed against radiation oncologists from 2003 to 2012 collected by a nationwide liability insurance trade association were analyzed. Outcomes included the nature of claims and indemnity payments, including associated presenting diagnoses, procedures, alleged medical errors, and injury severity. We compared the likelihood of a claim resulting in payment in relation to injury severity categories (death as referent) using binomial logistic regression. Results: There were 362 closed claims involving radiation oncology, 102 (28%) of which were paid, resulting in $38 million in indemnity payments. The most common alleged errors included “improper performance” (38% of closed claims, 18% were paid; 29% [$11 million] of total indemnity), “errors in diagnosis” (25% of closed claims, 46% were paid; 44% [$17 million] of total indemnity), and “no medical misadventure” (14% of closed claims, 8% were paid; less than 1% [$148,000] of total indemnity). Another physician was named in 32% of claims, and consent issues/breach of contract were cited in 18%. Claims for injury resulting in death represented 39% of closed claims and 25% of total indemnity. “Improper performance” was the primary alleged error associated with injury resulting in death. Compared with claims involving death, major temporary injury (odds ratio [OR] 2.8, 95% confidence interval [CI] 1.29-5.85, P=.009), significant permanent injury (OR 3.1, 95% CI 1.48-6.46, P=.003), and major permanent injury (OR 5.5, 95% CI 1.89-16.15, P=.002) had a higher likelihood of a claim resulting in indemnity payment. Conclusions: Improper performance was the most common alleged malpractice error. Claims involving significant or major injury were more likely to be paid than those involving death. Insights into the nature of liability claims against
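
With a single categorical predictor, the logistic-regression odds ratios reported above reduce to ordinary 2x2 table odds ratios, which can be sketched directly. The counts below are hypothetical, not the study's data:

```python
from math import exp, log, sqrt

def odds_ratio(paid, unpaid, paid_ref, unpaid_ref):
    """Odds ratio of a claim being paid for one injury category versus the
    referent category (death), with a 95% Woolf confidence interval."""
    or_ = (paid * unpaid_ref) / (unpaid * paid_ref)
    se = sqrt(1/paid + 1/unpaid + 1/paid_ref + 1/unpaid_ref)
    return or_, exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)

# hypothetical counts: 30/100 claims paid for one injury category
# versus 15/100 paid for the death (referent) category
or_, lo, hi = odds_ratio(30, 70, 15, 85)
```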

  11. Frictions in Project-Based Supply of Permits

    International Nuclear Information System (INIS)

    Liski, M.; Virrankoski, J.

    2004-01-01

    Emissions trading in climate change can entail large overall cost savings and transfers between developed and developing countries. However, the search for acceptable JI or CDM projects implies a deviation from the perfect market framework used in previous estimations. Our model combines the search market for projects with a frictionless permit market to quantify the supply-side frictions in the CO2 market. We also decompose the effects of frictions into the effects of search friction, bargaining, and bilateralism. A calibration using previous cost estimates of CO2 reductions illustrate changes in cost savings and allocative implications

  12. 32 CFR 842.95 - Non-assertable claims.

    Science.gov (United States)

    2010-07-01

    ... ADMINISTRATIVE CLAIMS Property Damage Tort Claims in Favor of the United States (31 U.S.C. 3701, 3711-3719) § 842...) Reimbursement for military or civilian employees for their negligence claims paid by the United States. (b) Loss...

  13. 37 CFR 360.12 - Form and content of claims.

    Science.gov (United States)

    2010-07-01

    ... SUBMISSION OF ROYALTY CLAIMS FILING OF CLAIMS TO ROYALTY FEES COLLECTED UNDER COMPULSORY LICENSE Satellite Claims § 360.12 Form and content of claims. (a) Forms. (1) Each claim to compulsory license royalty fees... owner entitled to claim the royalty fees. (ii) A general statement of the nature of the copyright owner...

  14. 37 CFR 360.3 - Form and content of claims.

    Science.gov (United States)

    2010-07-01

    ... SUBMISSION OF ROYALTY CLAIMS FILING OF CLAIMS TO ROYALTY FEES COLLECTED UNDER COMPULSORY LICENSE Cable Claims § 360.3 Form and content of claims. (a) Forms. (1) Each claim to cable compulsory license royalty fees... copyright owner entitled to claim the royalty fees. (ii) A general statement of the nature of the copyright...

  15. Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network

    Directory of Open Access Journals (Sweden)

    Kai Lin

    2016-07-01

With the development of wireless technology, the widespread use of 5G is already an irreversible trend, and millimeter-wave sensor networks are becoming more and more common. However, due to the high degree of complexity and bandwidth bottlenecks, the millimeter-wave sensor network still faces numerous problems. In this paper, we propose a novel content-based multi-channel network coding algorithm, which uses the functions of data fusion, multi-channel and network coding to improve the data transmission; the algorithm is referred to as content-based multi-channel network coding (CMNC). The CMNC algorithm provides a fusion-driven model based on the Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to the data content. By using the result of the classification, the CMNC algorithm also provides the channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared to other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and has better performance than the compared methods.
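
The fusion step named above rests on Dempster's rule of combination, which merges two bodies of evidence and renormalizes away their conflict. A generic sketch (the mass assignments below are illustrative, not part of the CMNC algorithm):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets; mass on empty intersections (conflict) is
    discarded and the remainder renormalized."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

# two hypothetical sensors' evidence over content classes "A" and "B"
m1 = {frozenset({"A"}): 0.6, frozenset({"A", "B"}): 0.4}
m2 = {frozenset({"A"}): 0.5, frozenset({"B"}): 0.5}
fused = dempster_combine(m1, m2)
```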

  16. Quantum BCH Codes Based on Spectral Techniques

    International Nuclear Information System (INIS)

    Guo Ying; Zeng Guihua

    2006-01-01

When the time variable in quantum signal processing is discrete, the Fourier transform exists on the vector space of n-tuples over the Galois field F_2, which plays an important role in the investigation of quantum signals. By using Fourier transforms, the ideas of quantum coding theory can be described in a setting much different from that seen thus far. Quantum BCH codes can be defined as codes whose quantum states have certain specified consecutive spectral components equal to zero, and the error-correcting ability is also described by the number of consecutive zeros. Moreover, the decoding of quantum codes can be described spectrally with more efficiency.

  17. A Novel Error Resilient Scheme for Wavelet-based Image Coding Over Packet Networks

    OpenAIRE

    WenZhu Sun; HongYu Wang; DaXing Qian

    2012-01-01

This paper presents a robust transmission strategy for wavelet-based scalable bit streams over packet erasure channels. By taking advantage of bit-plane coding and multiple description coding, the proposed strategy adopts layered multiple description coding (LMDC) for embedded wavelet coders to improve the error resilience of the important bit planes in the sense of the rate-distortion function D(R). Then, the post-compression rate-distortion (PCRD) optimization process is used to impro...

  18. Local coding based matching kernel method for image classification.

    Directory of Open Access Journals (Sweden)

    Yan Song

This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Word (BoV techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.

  19. The influence of motor vehicle legislation on injury claim incidence.

    Science.gov (United States)

    Lemstra, Mark; Olszynski, W P

    2005-01-01

Although there have been numerous strategies to prevent motor vehicle collisions and their subsequent injuries, few have been effective in preventing motor vehicle injury claims. In this paper, we examine the role of legislation and compensation systems in altering injury claim incidence. The population base for our natural experiment was all Saskatchewan, Manitoba, British Columbia and Quebec residents who submitted personal injury claims to their respective motor vehicle insurance provider from 1990 to 1999. The provinces of Saskatchewan and Manitoba switched from tort to pure no-fault insurance on January 1, 1995 and on March 1, 1994, respectively. British Columbia maintained tort insurance and Quebec maintained pure no-fault insurance throughout the entire 10-year period. The conversion from tort insurance to pure no-fault motor vehicle insurance resulted in a five-year 31% (RR = 0.69; 95% CI 0.68-0.70) reduction in total injury claims per 100,000 residents in Saskatchewan and a five-year 43% (RR = 0.57; 95% CI 0.56-0.58) reduction in Manitoba. Over the same period, British Columbia, which retained tort insurance, had a five-year 5% reduction (RR = 0.95; 95% CI 0.94-0.99). Quebec, which retained pure no-fault throughout the entire 10-year period, had less than one-third the injury claims per 100,000 residents of the tort province of British Columbia. The conversion from tort to pure no-fault legislation has a large influence in reducing motor vehicle injury claim incidence in Canada. Legislative system and injury compensation scheme have an observable impact on injury claim incidence and can therefore have a significant impact on the health care system.
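
The rate ratios quoted above follow directly from claim counts and population denominators; the standard computation, with its log-scale 95% CI, is easy to sketch. The counts below are constructed to reproduce RR = 0.69 and are not the study's raw data:

```python
from math import exp, log, sqrt

def relative_risk(cases_1, pop_1, cases_0, pop_0):
    """Rate ratio of claims per resident in period 1 versus period 0,
    with a 95% CI from the usual log-RR normal approximation."""
    rr = (cases_1 / pop_1) / (cases_0 / pop_0)
    se = sqrt(1/cases_1 - 1/pop_1 + 1/cases_0 - 1/pop_0)
    return rr, exp(log(rr) - 1.96 * se), exp(log(rr) + 1.96 * se)

# constructed counts: 690 vs 1,000 claims per 100,000 residents
rr, lo, hi = relative_risk(690, 100_000, 1_000, 100_000)
```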

  20. 32 CFR Appendix to Part 281 - Claims Description

    Science.gov (United States)

    2010-07-01

    ... advance decision functions for claims under the following statutes: (a) 31 U.S.C. 3702, concerning claims... SETTLING PERSONNEL AND GENERAL CLAIMS AND PROCESSING ADVANCE DECISION REQUESTS Pt. 281, App. Appendix to... Personnel Management performs these functions for claims involving civilian employees' compensation and...

  1. 42 CFR 456.722 - Electronic claims management system.

    Science.gov (United States)

    2010-10-01

    ... Electronic Claims Management System for Outpatient Drug Claims § 456.722 Electronic claims management system...'s Medicaid Management Information System (MMIS) applicable to prescription drugs. (ii) Notifying the... 42 Public Health 4 2010-10-01 2010-10-01 false Electronic claims management system. 456.722...

  2. The role of health-related claims and health-related symbols in consumer behaviour

    NARCIS (Netherlands)

    Hieke, S.; Kuljanic, N.; Wills, J.M.; Pravst, I.; Kaur, A.; Raats, M.M.; Trijp, van H.C.M.; Verbeke, W.; Grunert, K.G.

    2015-01-01

    Health claims and symbols are potential aids to help consumers identify foods that are healthier options. However, little is known as to how health claims and symbols are used by consumers in real-world shopping situations, thus making the science-based formulation of new labelling policies and

  3. Development and validation of GWHEAD, a three-dimensional groundwater head computer code

    International Nuclear Information System (INIS)

    Beckmeyer, R.R.; Root, R.W.; Routt, K.R.

    1980-03-01

A computer code has been developed to solve the groundwater flow equation in three dimensions. The code uses finite-difference approximations solved by the strongly implicit solution procedure. Input parameters to the code include hydraulic conductivity, specific storage, porosity, accretion (recharge), and initial hydraulic head. These parameters may be input as spatially varying. The hydraulic conductivity may be input as isotropic or anisotropic. The boundaries may either permit flow across them or be impermeable. The code has been used to model leaky confined groundwater conditions and spherical flow to a continuous point sink, both of which have exact analytical solutions. The results generated by the computer code compare well with those of the analytical solutions. The code was designed to be used to model groundwater flow beneath fuel reprocessing and waste storage areas at the Savannah River Plant
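
The finite-difference idea behind such a solver is easy to illustrate in one dimension: with homogeneous conductivity and no recharge, the steady-state head equation reduces to d2h/dx2 = 0, and each interior node relaxes to the average of its neighbours. This is a plain Gauss-Seidel sketch, not GWHEAD's strongly implicit procedure:

```python
def solve_head_1d(h_left, h_right, n, iters=5000):
    """Steady-state 1-D groundwater head on n nodes by finite differences,
    with fixed-head (Dirichlet) boundaries: each interior node is relaxed
    to the mean of its neighbours, the discrete form of d2h/dx2 = 0."""
    h = [h_left] + [0.0] * (n - 2) + [h_right]
    for _ in range(iters):
        for i in range(1, n - 1):          # one Gauss-Seidel sweep
            h[i] = 0.5 * (h[i - 1] + h[i + 1])
    return h

heads = solve_head_1d(100.0, 90.0, 11)
```

The exact solution is a linear head profile between the two fixed boundaries, which makes this a handy verification case in the spirit of the analytical comparisons mentioned above.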

  4. Electronic Health Record-Related Events in Medical Malpractice Claims.

    Science.gov (United States)

    Graber, Mark L; Siegal, Dana; Riah, Heather; Johnston, Doug; Kenyon, Kathy

    2015-11-06

There is widespread agreement that the full potential of health information technology (health IT) has not yet been realized and of particular concern are the examples of unintended consequences of health IT that detract from the safety of health care or from the use of health IT itself. The goal of this project was to obtain additional information on these health IT-related problems, using a mixed methods (qualitative and quantitative) analysis of electronic health record-related harm in cases submitted to a large database of malpractice suits and claims. Cases submitted to the CRICO claims database and coded during 2012 and 2013 were analyzed. A total of 248 cases (<1%) involving health IT were identified and coded using a proprietary taxonomy that identifies user- and system-related sociotechnical factors. Ambulatory care accounted for most of the cases (146 cases). Cases were most typically filed as a result of an error involving medications (31%), diagnosis (28%), or a complication of treatment (31%). More than 80% of cases involved moderate or severe harm, although lethal cases were less likely in cases from ambulatory settings. Etiologic factors spanned all of the sociotechnical dimensions, and many recurring patterns of error were identified. Adverse events associated with health IT vulnerabilities can cause extensive harm and are encountered across the continuum of health care settings and sociotechnical factors. The recurring patterns provide valuable lessons that both practicing clinicians and health IT developers could use to reduce the risk of harm in the future. The likelihood of harm seems to relate more to a patient's particular situation than to any one class of error. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used

  5. Claim Assessment Profile: A Method for Capturing Healthcare Evidence in the Scientific Evaluation and Review of Claims in Health Care (SEaRCH).

    Science.gov (United States)

    Hilton, Lara; Jonas, Wayne B

    2017-02-01

Grounding health claims in an evidence base is essential for determining safety and effectiveness. However, it is not appropriate to evaluate all healthcare claims with the same methods. "Gold standard" randomized controlled trials may skip over important qualitative and observational data about use, benefits, side effects, and preferences, issues especially salient in research on complementary and integrative health (CIH) practices. This gap has prompted a move toward studying treatments in their naturalistic settings. In the 1990s, a program initiated under the National Institutes of Health was designed to provide outreach to CIH practices for assessing the feasibility of conducting retrospective or prospective evaluations. The Claim Assessment Profile further develops this approach, within the framework of Samueli Institute's Scientific Evaluation and Review of Claims in Health Care (SEaRCH) method. The goals of a Claim Assessment Profile are to clarify the elements that constitute a practice, define key outcomes, and create an explanatory model of these impacts. The main objective is to determine readiness and capacity of a practice to engage in evaluation of effectiveness. This approach is informed by a variety of rapid assessment and stakeholder-driven methods. Site visits, structured qualitative interviews, surveys, and observational data on implementation provide descriptive data about the practice. Logic modeling defines inputs, processes, and outcome variables; path modeling defines an analytic map to explore. The Claim Assessment Profile is a rapid assessment of the evaluability of a healthcare practice. The method was developed for use on CIH practices but has also been applied in resilience research and may be applied beyond the healthcare sector. Findings are meant to provide sufficient data to improve decision-making for stakeholders. This method provides an important first step for moving existing promising yet untested practices into

  6. 77 FR 22267 - Eagle Permits; Changes in the Regulations Governing Eagle Permitting

    Science.gov (United States)

    2012-04-13

    ... with rotating wind turbines. Permit Duration and Transferability In February 2011, we published draft... permit applicants, because of the known risk to eagles from collisions with wind turbines and electric... change does not affect the tenure of any other migratory bird or eagle permit type. DATES: Electronic...

  7. Development and Application of a Code for Internal Exposure (CINEX) based on the CINDY code

    International Nuclear Information System (INIS)

    Kravchik, T.; Duchan, N.; Sarah, R.; Gabay, Y.; Kol, R.

    2004-01-01

Internal exposure to radioactive materials at the NRCN is evaluated using the CINDY (Code for Internal Dosimetry) package. The code was developed by the Pacific Northwest Laboratory to assist in the interpretation of bioassay data, provide bioassay projections, and evaluate committed and calendar-year doses from intake or bioassay measurement data. It provides capabilities to calculate organ dose and effective dose equivalents using the International Commission on Radiological Protection (ICRP) 30 approach. The CINDY code operates under the DOS operating system, and running it therefore requires a relatively long procedure with extensive manual typing, which can lead to human error. A new code has been developed at the NRCN, the CINEX (Code for Internal Exposure), which is an Excel application; it reduces calculation time significantly (on the order of 5 to 10 times) along with the risk of human error. The code uses a database containing tables, constructed with CINDY, of the bioassay values predicted by the ICRP 30 model after an intake of a unit of activity of each isotope. Using the database, the code then calculates the appropriate intake and, from it, the committed effective dose and organ dose. Calculations with the CINEX code were compared to similar calculations with the CINDY code. The discrepancies were less than 5%, which is the rounding error of the CINDY code. Attached is a table which compares parameters calculated with the CINEX and the CINDY codes (for a class Y uranium). The CINEX is now used at the NRCN to calculate occupational intakes and doses for workers handling radioactive materials
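The intake-and-dose arithmetic the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the NRCN code: the table value and the dose coefficient below are invented stand-ins for the ICRP 30 model outputs that the CINEX database stores.

```python
def estimate_intake(measured_bq, predicted_bq_per_unit_intake):
    """Scale a bioassay measurement by the model's prediction for a unit intake.

    predicted_bq_per_unit_intake is the ICRP 30-style table value: the activity
    a bioassay would show at this elapsed time if exactly 1 Bq had been taken in.
    """
    return measured_bq / predicted_bq_per_unit_intake

def committed_effective_dose(intake_bq, dose_coeff_sv_per_bq):
    """Committed effective dose: intake times an isotope-specific dose coefficient."""
    return intake_bq * dose_coeff_sv_per_bq

# Hypothetical numbers: a bioassay sample reads 50 Bq, and the model predicts
# 0.025 Bq per unit intake at this time after exposure.
intake = estimate_intake(50.0, 0.025)              # 2000 Bq inferred intake
dose = committed_effective_dose(intake, 5.7e-7)    # Sv; coefficient is illustrative
```

The same lookup-then-scale step, repeated per isotope and per bioassay type, is what moving from interactive DOS sessions to a spreadsheet front end accelerates.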

  8. Estimation of Missed Statin Prescription Use in an Administrative Claims Dataset.

    Science.gov (United States)

    Wade, Rolin L; Patel, Jeetvan G; Hill, Jerrold W; De, Ajita P; Harrison, David J

    2017-09-01

Nonadherence to statin medications is associated with increased risk of cardiovascular disease and poses a challenge to lipid management in patients who are at risk for atherosclerotic cardiovascular disease. Numerous studies have examined statin adherence based on administrative claims data; however, these data may underestimate statin use in patients who participate in generic drug discount programs or who have alternative coverage. To estimate the proportion of patients with missing statin claims in a claims database and determine how missing claims affect commonly used utilization metrics. This retrospective cohort study used pharmacy data from the PharMetrics Plus (P+) claims dataset linked to the IMS longitudinal pharmacy point-of-sale prescription database (LRx) from January 1, 2012, through December 31, 2014. Eligible patients were represented in the P+ and LRx datasets, had ≥ 1 claim for a statin (index claim) in either database, and had ≥ 24 months of continuous enrollment in P+. Patients were linked between P+ and LRx using a deterministic method. Duplicate claims between LRx and P+ were removed to produce a new dataset composed of P+ claims augmented with LRx claims. Statin use was then compared between P+ and the augmented P+ dataset. Utilization metrics that were evaluated included the percentage of patients with ≥ 1 missing statin claim over 12 months in P+; the number of patients misclassified as new users in P+; the number of patients misclassified as nonstatin users in P+; the change in 12-month medication possession ratio (MPR) and proportion of days covered (PDC) in P+; the comparison between P+ and LRx of classifications of statin treatment patterns (statin intensity and patients with treatment modifications); and the payment status for missing statin claims. Data from 965,785 patients with statin claims in P+ were analyzed (mean age 56.6 years; 57% male). In P+, 20.1% had ≥ 1 missing statin claim post-index; 13.7% were misclassified as
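The two adherence metrics evaluated above have standard definitions, sketched below under the usual conventions (PDC counts each calendar day in the window at most once; MPR simply sums days supplied and can exceed 1). The fill data are invented for illustration.

```python
from datetime import date, timedelta

def pdc(fills, start, end):
    """Proportion of days covered: distinct days in [start, end] with drug on hand."""
    covered = set()
    for fill_date, days_supply in fills:
        for offset in range(days_supply):
            day = fill_date + timedelta(days=offset)
            if start <= day <= end:
                covered.add(day)
    return len(covered) / ((end - start).days + 1)

def mpr(fills, start, end):
    """Medication possession ratio: total days supplied over days in the period."""
    return sum(supply for _, supply in fills) / ((end - start).days + 1)

fills = [(date(2014, 1, 1), 30), (date(2014, 2, 15), 30)]  # two 30-day statin fills
start, end = date(2014, 1, 1), date(2014, 3, 31)           # 90-day observation window
```

A missing claim, such as a fill paid through a cash discount program, simply never enters `fills`, while the denominator is unchanged; that is exactly the mechanism by which claims-only MPR and PDC are biased downward.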

  9. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

Full Text Available A method in which a real-coded quantum-inspired genetic algorithm (RQGA) is used to optimize the weights and thresholds of a BP neural network is proposed, to overcome the defect that the gradient descent method makes the algorithm easily fall into local optima during learning. The quantum genetic algorithm (QGA) has good global optimization ability, but the conventional QGA is based on binary coding, so the coding and decoding processes slow the calculation. RQGA is therefore introduced to explore the search space, and an improved variable learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm rapidly converges to solutions that satisfy the constraint conditions.
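A minimal real-coded genetic algorithm of the kind this abstract builds on can be sketched as follows. This is not the authors' RQGA: it omits the quantum-inspired rotation update, and it optimizes a stand-in quadratic loss rather than actual BP network weights; blend crossover and Gaussian mutation are generic choices, not the paper's operators.

```python
import random

def real_coded_ga(fitness, dim, pop_size=20, generations=80, lo=-1.0, hi=1.0, seed=1):
    """Minimize `fitness` over real-valued chromosomes (no binary encode/decode step)."""
    rng = random.Random(seed)
    population = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)               # elitist: keep the better half
        parents = population[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            alpha = rng.random()                   # blend crossover on real genes
            child = [alpha * x + (1 - alpha) * y for x, y in zip(a, b)]
            if rng.random() < 0.5:                 # occasional Gaussian mutation
                i = rng.randrange(dim)
                child[i] += rng.gauss(0.0, 0.1)
            children.append(child)
        population = parents + children
    return min(population, key=fitness)

# Stand-in for a network training loss: squared distance to a known optimum.
target = [0.5, -0.3]
loss = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))
best = real_coded_ga(loss, dim=2)
```

Working directly on real-valued genes is the point of the "real-coded" variant: the binary encode/decode step of a conventional QGA disappears entirely.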

  10. Hanford facility dangerous waste Part A, Form 3 and Part B permit application documentation, Central Waste Complex (WA7890008967)(TSD: TS-2-4)

    Energy Technology Data Exchange (ETDEWEB)

    Saueressig, D.G.

    1998-05-20

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating, treatment, storage, and/or disposal units, such as the Central Waste Complex (this document, DOE/RL-91-17). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Central Waste Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Central Waste Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this Central Waste Complex permit application documentation is current as of May 1998.

  11. 28 CFR 32.32 - Time for filing claim.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Time for filing claim. 32.32 Section 32.32 Judicial Administration DEPARTMENT OF JUSTICE PUBLIC SAFETY OFFICERS' DEATH, DISABILITY, AND EDUCATIONAL ASSISTANCE BENEFIT CLAIMS Educational Assistance Benefit Claims § 32.32 Time for filing claim. (a...

  12. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

This thesis consists of six chapters. The first chapter contains a short introduction to coding theory, in which we explain the coding theory concepts we use. In the second chapter, we present the required theory for evaluation codes and also give an example of some fundamental codes in coding theory as evaluation codes. Chapter three consists of the introduction to graph-based codes, such as Tanner codes and graph codes. In chapter four, we compute the dimension of some graph-based codes with a result combining graph-based codes and subfield subcodes. Moreover, some codes in chapter four...

  13. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be achieved by using the EDW code compared with existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that EDW gives much better performance than the Hadamard and MFH codes.
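"Ideal cross-correlation" here means that any two codewords overlap in at most one chip position. A quick check on hypothetical 0/1 codewords (chosen to have the property, not actual EDW constructions):

```python
def cross_correlation(a, b):
    """In-phase cross-correlation of two 0/1 spectral code sequences."""
    return sum(x * y for x, y in zip(a, b))

# Hypothetical codewords with at most one overlapping chip (cross-correlation <= 1):
c1 = [1, 1, 0, 0, 0, 0]
c2 = [0, 1, 1, 0, 0, 0]
c3 = [0, 0, 0, 1, 1, 0]
```

Low cross-correlation is what limits multiple-access interference between users sharing the spectrum, which is why it drives the performance comparison against Hadamard and MFH codes.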

  14. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  15. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, determining the number of shared pairs required to construct entanglement-assisted quantum codes is not easy. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  16. Reviewing the Suitability of Affirmative Action and the Inherent Requirements of the Job as Grounds of Justification to Equal Pay Claims in Terms Of the Employment Equity Act 55 of 1998

    Directory of Open Access Journals (Sweden)

    Shamier

    2018-01-01

Full Text Available The Employment Equity Act 55 of 1998 ("EEA") has been amended to include a specific provision dealing with equal pay claims in the form of section 6(4). Section 6(4) of the EEA prohibits unfair discrimination in terms and conditions of employment between employees performing the same or substantially the same work or work of equal value. The Minister of Labour has issued Regulations and a Code to assist with the implementation of the principle of equal pay. Both the Regulations and the Code set out the criteria for assessing work of equal value as well as the grounds of justification to a claim of equal pay for work of equal value (factors justifying differentiation in terms and conditions of employment). The EEA refers to two grounds of justification in respect of unfair discrimination claims, namely affirmative action and the inherent requirements of the job. There is support for the view that these grounds of justification are not suitable to equal pay claims. There is a contrary view that these grounds of justification can apply to equal pay claims. The Labour Courts have not had the opportunity to analyse these grounds of justification in the context of equal pay claims. It is thus necessary to analyse these grounds of justification in order to ascertain whether they provide justifications proper to equal pay claims. The purpose of this article is to analyse the grounds of justification of pay discrimination as contained in South African law, the Conventions and Materials of the International Labour Organisation and the equal pay laws of the United Kingdom. Lastly, an analysis will be undertaken to determine whether affirmative action and the inherent requirements of the job provide justifications proper to equal pay claims.

  17. VLSI Architectures for Sliding-Window-Based Space-Time Turbo Trellis Code Decoders

    Directory of Open Access Journals (Sweden)

    Georgios Passas

    2012-01-01

    Full Text Available The VLSI implementation of SISO-MAP decoders used for traditional iterative turbo coding has been investigated in the literature. In this paper, a complete architectural model of a space-time turbo code receiver that includes elementary decoders is presented. These architectures are based on newly proposed building blocks such as a recursive add-compare-select-offset (ACSO unit, A-, B-, Γ-, and LLR output calculation modules. Measurements of complexity and decoding delay of several sliding-window-technique-based MAP decoder architectures and a proposed parameter set lead to defining equations and comparison between those architectures.

  18. DEVELOPMENT OF SALES APPLICATION OF PREPAID ELECTRICITY VOUCHER BASED ON ANDROID PLATFORM USING QUICK RESPONSE CODE (QR CODE)

    Directory of Open Access Journals (Sweden)

    Ricky Akbar

    2017-09-01

Full Text Available Perusahaan Listrik Negara (PLN) has implemented a smart electricity system, or prepaid electricity: customers pay for an electricity voucher before using the electricity. The token contained in the electricity voucher purchased by the customer is entered into the Meter Prabayar (MPB) installed at the customer's location. When customers purchase a voucher, they receive a receipt that contains all of the customer's identity details and the 20-digit voucher code (token) to be entered into the MPB in exchange for electrical energy credit. The receipt obtained by the customer is of course vulnerable to loss or misuse by irresponsible parties. In this study, the authors designed and developed an Android-based application that uses QR code technology as a replacement for the prepaid electricity receipt, containing the identity of the customer and the 20-digit voucher code. The application was developed following the waterfall methodology. The waterfall steps as applied were: (1) analysis of the functional requirements of the system, through a preliminary study and data collection based on field and literature studies; (2) system design, using UML diagrams, Business Process Model and Notation (BPMN), and entity-relationship diagrams (ERD); (3) implementation of the design using object-oriented programming (OOP) techniques, with the web application developed using the Laravel PHP framework with a MySQL database and the mobile application developed using B4A; and (4) testing of the developed system using the black-box method. The final result of this research is a pair of web and mobile applications for the sale of electricity vouchers using QR code technology.
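The receipt-replacement idea boils down to packing the customer identity and the 20-digit token into a machine-readable payload for the QR image. A stdlib-only sketch follows; the field names and the Luhn check are illustrative assumptions, not PLN's actual format, and a real app would hand the resulting string to a QR-encoding library.

```python
import json

def luhn_check_digit(digits):
    """Standard Luhn check digit, used here to guard the token against scan errors."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:           # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def voucher_payload(customer_id, token20):
    """JSON string to embed in the QR code (hypothetical schema)."""
    assert len(token20) == 20 and token20.isdigit()
    return json.dumps({
        "customer": customer_id,
        "token": token20,
        "check": luhn_check_digit(token20),
    })

payload = voucher_payload("C-0001", "79927398717992739871")
```

On scan, the mobile app would recompute the check digit before offering the token for entry into the MPB, catching most single-digit misreads.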

  19. Ex-vessel break in ITER divertor cooling loop analysis with the ECART code

    CERN Document Server

    Cambi, G; Parozzi, F; Porfiri, MT

    2003-01-01

A hypothetical double-ended pipe rupture in the ex-vessel section of the International Thermonuclear Experimental Reactor (ITER) divertor primary heat transfer system during pulse operation has been assessed using the nuclear source term ECART code. That code was originally designed and validated for traditional nuclear power plant safety analyses, and has been internationally recognized as a relevant nuclear source term code for nuclear fission plants. It permits the simulation of chemical reactions and transport of radioactive gases and aerosols under two-phase flow transients in generic flow systems, using a built-in thermal-hydraulic model. A comparison with the results given in the ITER Generic Site Safety Report, obtained by using a thermal-hydraulic system code (ATHENA), a containment code (INTRA) and an aerosol transportation code (NAUA) in a sequential way, is also presented and discussed.

  20. Tradeable carbon permits

    International Nuclear Information System (INIS)

    Koutstaal, P.R.

    1995-01-01

The research project on tradeable carbon permits has focused on three elements. First of all, the practical implications of designing a system of tradeable emission permits for reducing CO2 have been studied. In the second part, the consequences of introducing a system of tradeable carbon permits for entry barriers have been considered. Finally, the institutional requirements and welfare effects of coordination of CO2 abatement in a second-best world have been examined.

  1. 50 CFR 679.4 - Permits.

    Science.gov (United States)

    2010-10-01

    ... this section, with the exception that an IFQ hired master permit or a CDQ hired master permit need not... program permit or card type is: Permit is in effect from issue date through the end of: For more... section (C) Halibut & sablefish hired master permits Specified fishing year Paragraph (d)(2) of this...

  2. Worst-Case-Optimal Dynamic Reinsurance for Large Claims

    DEFF Research Database (Denmark)

    Korn, Ralf; Menkens, Olaf; Steffensen, Mogens

    2012-01-01

We control the surplus process of a non-life insurance company by dynamic proportional reinsurance. The objective is to maximize expected (utility of the) surplus under the worst-case claim development. In the large claim case with a worst-case upper limit on claim numbers and claim sizes, we find...

  3. 32 CFR 536.29 - Revision of filed claims.

    Science.gov (United States)

    2010-07-01

    ... AGAINST THE UNITED STATES Investigation and Processing of Claims § 536.29 Revision of filed claims. (a... the writing alleges a new theory of liability, a new tortfeasor, a new party claimant, a different... amendment, not a new claim. Similarly, the addition of required information not on the original claim...

  4. Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.

    Science.gov (United States)

    Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting

    2012-09-01

In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploiting the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficient sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little increase in computation, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.
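The payoff of a mode-matched scan order is easy to see on a toy residual: if a prediction mode tends to leave energy in the first column, a column-first scan packs the nonzeros together and lengthens the trailing run of zeros that an entropy coder can terminate early. The scan orders below are illustrative, not the paper's actual MD-templates.

```python
RASTER = list(range(16))                                        # 4x4 block, row-major
COLUMN_FIRST = [r * 4 + c for c in range(4) for r in range(4)]  # hypothetical mode-matched order

def scan(block, order):
    """Reorder a flattened 4x4 coefficient block along a scan order."""
    return [block[i] for i in order]

def trailing_zeros(seq):
    """Length of the final run of zeros the entropy coder could skip."""
    n = 0
    for v in reversed(seq):
        if v != 0:
            break
        n += 1
    return n

# Residual whose energy sits in the first column of the 4x4 block:
block = [5, 0, 0, 0,
         3, 0, 0, 0,
         2, 0, 0, 0,
         1, 0, 0, 0]
```

Here the raster order leaves only a 3-zero tail, while the column-first order yields a 12-zero tail for the same content, which is the effect the per-mode templates aim for.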

  5. State Licenses & Permits

    Data.gov (United States)

    Small Business Administration — Starting a business? Confused about whether you need a business license or permit? Virtually every business needs some form of license or permit to operate legally....

  6. Coding response to a case-mix measurement system based on multiple diagnoses.

    Science.gov (United States)

    Preyra, Colin

    2004-08-01

    To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post.
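The base/complexity decomposition described above can be made concrete with a weighted-share identity: the case-mix index is the share-weighted mean of group cost weights, so measured growth can come purely from shares shifting toward higher-complexity groups. The tiers, weights, and shares below are invented for illustration.

```python
def cmi(shares, weights):
    """Case-mix index as the share-weighted mean of group cost weights."""
    return sum(s * w for s, w in zip(shares, weights))

# Hypothetical complexity tiers and cost weights, with two years of discharge shares:
weights   = [1.0, 1.4, 2.2]        # no / moderate / high complexity
shares_y1 = [0.70, 0.20, 0.10]
shares_y2 = [0.55, 0.25, 0.20]     # more secondary diagnoses coded in year 2

growth = cmi(shares_y2, weights) - cmi(shares_y1, weights)
```

Measured case mix rises by 0.14 even if real resource use per case is flat, which is precisely the coding-response effect the study documents: the classification shifted, not the patients.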

  8. Guidance for writing permits for the use or disposal of sewage sludge. Draft report

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    Section 405(d) of the Clean Water Act (CWA) directs the U.S. Environmental Protection Agency (EPA) to develop regulations containing guidelines for the use and disposal of sewage sludge. On February 19th, 1993, EPA published final regulations at 40 Code of Federal Regulations (CFR) Part 503 as the culmination of a major effort to develop technical standards in response to Section 405(d). These regulations govern three sewage sludge use and disposal practices: land application, surface disposal, and incineration. A key element in EPA's implementation of the Part 503 regulations is educating Agency and State personnel about these new requirements. Although the regulations are generally directly enforceable against all persons involved in the use and disposal of sewage sludge, they will also be implemented through permits issued to treatment works treating domestic sewage as defined in 40 CFR 122.22. Thus, the primary focus of the manual is to assist permit writers in incorporating the Part 503 requirements into permits; it serves as an update to the Guidance for Writing Case-by-Case Permit Conditions for Municipal Sewage Sludge (PB91-145508/HDM).

  9. 20 CFR 410.232 - Withdrawal of a claim.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Withdrawal of a claim. 410.232 Section 410.232 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL COAL MINE HEALTH AND SAFETY ACT OF 1969... Claims and Evidence § 410.232 Withdrawal of a claim. (a) Before adjudication of claim. A claimant (or an...

  10. Comparison of the sand liquefaction estimated based on codes and practical earthquake damage phenomena

    Science.gov (United States)

    Fang, Yi; Huang, Yahong

    2017-12-01

Estimating sand liquefaction based on codes is an important part of geotechnical design. However, the result sometimes fails to conform to the actual earthquake damage. Based on the damage from the Tangshan earthquake and on engineering geological conditions, three typical sites were chosen. The sand liquefaction probability was evaluated at the three sites using the method in the Code for Seismic Design of Buildings, and the results were compared with the sand liquefaction phenomena observed in the earthquake. The results show that the difference between sand liquefaction estimated based on codes and the actual earthquake damage is mainly attributable to the following factors: disparity between the seismic fortification intensity and the actual seismic motion, changes in the groundwater level, the thickness of the overlying non-liquefied soil layer, local site effects, and personal error. Meanwhile, although the judgment methods in the codes exhibit certain universality, they are a further source of the difference, owing to the limitations of the basic data and qualitative anomalies in the judgment formulas.
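For reference, the standard-penetration-test screening in the Chinese Code for Seismic Design of Buildings compares a measured blow count against a critical value. One commonly cited form of the formula is N_cr = N0 · β · [ln(0.6 d_s + 1.5) − 0.1 d_w] · √(3/ρ_c); it is reproduced here from memory, so treat both the expression and the parameter values as assumptions to verify against the code text.

```python
import math

def critical_blow_count(n0, ds, dw, rho_c, beta=0.95):
    """Critical SPT blow count N_cr.

    n0    - reference blow count for the design intensity
    ds    - depth of the saturated sand layer (m)
    dw    - depth of the groundwater table (m)
    rho_c - clay particle content (%)
    beta  - design earthquake group adjustment factor (assumed value)
    """
    return n0 * beta * (math.log(0.6 * ds + 1.5) - 0.1 * dw) * math.sqrt(3.0 / rho_c)

# Illustrative site: n0 = 10, sand at 5 m, water table at 2 m, 3% clay content.
n_cr = critical_blow_count(10, 5.0, 2.0, 3.0)
liquefiable = 8 < n_cr   # measured N = 8 below the critical value: flag liquefaction
```

The sensitivity of N_cr to the water-table depth d_w and the clay content ρ_c is exactly why the groundwater-level changes and site conditions cited above can flip the code-based judgment relative to observed damage.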

  11. Spike-based population coding and working memory.

    Directory of Open Access Journals (Sweden)

    Martin Boerlin

    2011-02-01

Compelling behavioral evidence suggests that humans can make optimal decisions despite the uncertainty inherent in perceptual or motor tasks. A key question in neuroscience is how populations of spiking neurons can implement such probabilistic computations. In this article, we develop a comprehensive framework for optimal, spike-based sensory integration and working memory in a dynamic environment. We propose that probability distributions are inferred spike-per-spike in recurrently connected networks of integrate-and-fire neurons. As a result, these networks can combine sensory cues optimally, track the state of a time-varying stimulus and memorize accumulated evidence over periods much longer than the time constant of single neurons. Importantly, we propose that population responses and persistent working memory states represent entire probability distributions and not only single stimulus values. These memories are reflected by sustained, asynchronous patterns of activity which make relevant information available to downstream neurons within their short time window of integration. Model neurons act as predictive encoders, firing only those spikes which account for new information that has not yet been signaled. Thus, spike times deterministically signal a prediction error, in contrast to rate codes, in which spike times are considered random samples of an underlying firing rate. As a consequence of this coding scheme, a multitude of spike patterns can reliably encode the same information. This results in weakly correlated, Poisson-like spike trains that are sensitive to initial conditions but robust to even high levels of external neural noise. This spike train variability reproduces that observed in cortical sensory spike trains, but cannot be equated to noise. On the contrary, it is a consequence of optimal spike-based inference. In contrast, we show that rate-based models perform poorly when implemented with stochastically spiking neurons.
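    The predictive-encoder idea can be illustrated with a minimal sketch (all parameters are hypothetical, and a single leaky readout stands in for the authors' recurrent network): a model neuron fires only when the readout's prediction error exceeds a threshold, and each spike corrects the prediction.

```python
import numpy as np

def predictive_spike_encode(signal, dt=0.001, tau=0.02, threshold=0.05):
    """Emit a spike only when the prediction error exceeds `threshold`."""
    estimate = 0.0
    spikes = []
    for t, x in enumerate(signal):
        estimate *= np.exp(-dt / tau)   # leaky decay of the running estimate
        if x - estimate > threshold:    # new information not yet signaled?
            spikes.append(t)            # the spike signals the error...
            estimate += threshold       # ...and corrects the readout
    return spikes

signal = np.clip(np.sin(np.linspace(0, np.pi, 1000)), 0, None)
spike_times = predictive_spike_encode(signal)
print(f"{len(spike_times)} spikes emitted")
```

    Because spikes are triggered deterministically by the error, tiny changes to the initial estimate shift the whole spike pattern while the decoded estimate stays close to the signal, mirroring the variability argument above.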

  12. 27 CFR 70.608 - Action on claims.

    Science.gov (United States)

    2010-04-01

    ... Section 70.608 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT... appropriate TTB officer shall date stamp and examine each claim filed under this subpart and will determine the validity of the claim. Claims and supporting data involving customs duties will be forwarded to...

  13. The Educational Benefits Claimed for Physical Education and School Sport: An Academic Review

    Science.gov (United States)

    Bailey, Richard; Armour, Kathleen; Kirk, David; Jess, Mike; Pickup, Ian; Sandford, Rachel

    2009-01-01

    This academic review critically examines the theoretical and empirical bases of claims made for the educational benefits of physical education and school sport (PESS). An historical overview of the development of PESS points to the origins of claims made in four broad domains: physical, social, affective and cognitive. Analysis of the evidence…

  14. Visual and intelligent transients and accidents analyzer based on thermal-hydraulic system code

    International Nuclear Information System (INIS)

    Meng Lin; Rui Hu; Yun Su; Ronghua Zhang; Yanhua Yang

    2005-01-01

Full text of publication follows: Many thermal-hydraulic system codes were developed in the past twenty years, such as RELAP5, RETRAN, ATHLET, etc. Because of their general and advanced thermal-hydraulic computation features, they are widely used worldwide to analyze transients and accidents. Most of these original thermal-hydraulic system codes, however, share the following disadvantages. First, models are built through input decks, so the input files are complex and non-graphical, and deck style varies across users and models. Second, results are written to off-line data files, which is inconvenient for analysts who want to follow how dynamic parameters trend and change. Third, these codes offer few interfaces to other programs, which restricts their extension. The subject of this paper is to develop a powerful analyzer, based on these thermal-hydraulic system codes, that analyzes transients and accidents more simply, accurately and quickly. First, modeling is visual and intelligent: users build the thermal-hydraulic system model from component objects according to their needs, without having to face bare input decks. The input decks created automatically by the analyzer have a unified style and are easily readable by others. Second, parameters of interest to the analyst can be displayed, and even changed, dynamically. Third, the analyzer provides an interface between the thermal-hydraulic system code and other programs, making parallel computation between them possible. In conclusion, through this visual and intelligent approach, an analyzer based on general and advanced thermal-hydraulic system codes can be used to analyze transients and accidents more effectively. The main purpose of this paper is to present developmental activities, assessment and application results of the visual and intelligent
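    The component-object-to-input-deck idea can be sketched in a few lines (the component kinds, card layout, and field names below are invented for illustration and are not actual RELAP5 or RETRAN syntax): each object knows how to render itself as a card, and the builder emits a consistently formatted deck.

```python
class Component:
    """Hypothetical model component that renders itself as one deck card."""
    def __init__(self, card_id, kind, **params):
        self.card_id, self.kind, self.params = card_id, kind, params

    def to_card(self):
        fields = " ".join(f"{k}={v}" for k, v in sorted(self.params.items()))
        return f"{self.card_id:>8} {self.kind:<10} {fields}"

def build_deck(title, components):
    """Emit a unified-style deck: title line, then cards in card-id order."""
    lines = [f"= {title}"]
    lines += [c.to_card() for c in sorted(components, key=lambda c: c.card_id)]
    return "\n".join(lines)

deck = build_deck("demo loop", [
    Component(200, "junction", from_vol=100, to_vol=300),
    Component(100, "pipe", length=2.0, area=0.01),
])
print(deck)
```

    The point is the one the abstract makes: because the deck is generated, its style is uniform regardless of who built the model.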

  15. Secure-Network-Coding-Based File Sharing via Device-to-Device Communication

    OpenAIRE

    Wang, Lei; Wang, Qing

    2017-01-01

To increase the efficiency and security of file sharing in next-generation networks, this paper proposes a large-scale file-sharing scheme based on secure network coding via device-to-device (D2D) communication. In our scheme, when a user needs to share data with others in the same area, the source node and all the intermediate nodes perform a secure network coding operation before forwarding the received data. This process continues until all the mobile devices in the netw...
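    The per-hop coding operation can be sketched as a plain linear-network-coding combination over GF(2) (the scheme's actual security layer, key handling, and field size are not shown, and all names here are illustrative): every forwarded packet is an XOR combination of inputs, and a receiver with enough independent combinations recovers the originals by Gaussian elimination.

```python
def xor_combine(coeffs, packets):
    """One network-coding operation: XOR together the packets with coeff 1."""
    out = bytearray(len(packets[0]))
    for c, p in zip(coeffs, packets):
        if c:
            out = bytearray(a ^ b for a, b in zip(out, p))
    return bytes(out)

def gf2_decode(coded, n_src):
    """Gauss-Jordan elimination over GF(2) recovers the source packets."""
    rows = [(list(c), bytearray(p)) for c, p in coded]
    for col in range(n_src):
        pivot = next(i for i in range(col, len(rows)) if rows[i][0][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                           bytearray(a ^ b for a, b in zip(rows[i][1], rows[col][1])))
    return [bytes(p) for _, p in rows[:n_src]]

src = [b"AAAA", b"BBBB", b"CCCC"]
coeff_sets = [[1, 0, 0], [1, 1, 0], [1, 1, 1]]   # full rank over GF(2)
coded = [(c, xor_combine(c, src)) for c in coeff_sets]
print(gf2_decode(coded, 3))  # → [b'AAAA', b'BBBB', b'CCCC']
```

    In a real D2D deployment the coefficients would be chosen randomly at each hop and protected cryptographically; the deterministic, full-rank set above just keeps the demo decodable.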

  16. DATA-POOL : a direct-access data base for large-scale nuclear codes

    International Nuclear Information System (INIS)

    Yamano, Naoki; Koyama, Kinji; Naito, Yoshitaka; Minami, Kazuyoshi.

    1991-12-01

A direct-access data base, DATA-POOL, has been developed for large-scale nuclear codes. Data can be stored and retrieved by specifying simple node names, using the DATA-POOL access package written in FORTRAN 77. A management utility, POOL, is also provided. A typical application of the DATA-POOL to the RADHEAT-V4 code system, developed for performing radiation-shielding safety analyses, is shown. Many samples and error messages are also documented so that the DATA-POOL can be applied to other code systems. This report serves as a manual for DATA-POOL. (author)
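    The node-name access pattern can be mimicked in a few lines (the real DATA-POOL is a FORTRAN 77 package with its own file layout; the record size and naming below are invented for illustration): fixed-size records live in one flat byte store, and an index maps each node name to its record offset, so retrieval is one direct access rather than a scan.

```python
import struct

class NodePool:
    """Toy direct-access store keyed by node name (illustrative layout)."""
    RECORD = 64  # bytes per record (hypothetical)

    def __init__(self):
        self.index = {}           # node name -> record offset
        self.data = bytearray()   # stands in for the direct-access file

    def store(self, node, values):
        packed = struct.pack(f"<{len(values)}d", *values).ljust(self.RECORD, b"\0")
        assert len(packed) == self.RECORD, "record overflow"
        self.index[node] = len(self.data)
        self.data += packed

    def retrieve(self, node, count):
        off = self.index[node]    # direct access: jump straight to the record
        return list(struct.unpack(f"<{count}d", bytes(self.data[off:off + 8 * count])))

pool = NodePool()
pool.store("FLUX/GRP1", [1.5, 2.5, 3.5])
print(pool.retrieve("FLUX/GRP1", 3))  # → [1.5, 2.5, 3.5]
```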

  17. 32 CFR 537.16 - Scope for maritime claims.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Scope for maritime claims. 537.16 Section 537.16... BEHALF OF THE UNITED STATES § 537.16 Scope for maritime claims. The Army Maritime Claims Settlement Act... claims for damage to: (1) DA-accountable properties of a kind that are within the federal maritime...

  18. 32 CFR 536.119 - Scope for maritime claims.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Scope for maritime claims. 536.119 Section 536... CLAIMS AGAINST THE UNITED STATES Maritime Claims § 536.119 Scope for maritime claims. The AMCSA applies...) Damage that is maritime in nature and caused by tortious conduct of U.S. military personnel or federal...

  19. Error Concealment using Neural Networks for Block-Based Image Coding

    Directory of Open Access Journals (Sweden)

    M. Mokos

    2006-06-01

In this paper, a novel adaptive error concealment (EC) algorithm, which lowers the requirements on channel coding, is proposed. It conceals errors in block-based image coding systems by using a neural network. In the proposed algorithm, only intra-frame information is used to reconstruct an image with isolated damaged blocks. The information of the pixels surrounding a damaged block is used to recover the errors with neural network models. Computer simulation results show that both the visual quality and the MSE of a reconstructed image are significantly improved by the proposed EC algorithm. We also propose a simple non-neural approach for comparison.
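    A non-neural baseline of the kind mentioned for comparison can be sketched as bilinear interpolation from the intact pixels bordering a damaged block (the block size and weighting here are assumptions, not the paper's exact method):

```python
import numpy as np

def conceal_block(img, r, c, size=8):
    """Fill the size x size block at (r, c) from its four intact borders."""
    top    = img[r - 1, c:c + size].astype(float)
    bottom = img[r + size, c:c + size].astype(float)
    left   = img[r:r + size, c - 1].astype(float)
    right  = img[r:r + size, c + size].astype(float)
    out = img.copy().astype(float)
    for i in range(size):
        wv = (i + 1) / (size + 1)                  # vertical weight
        for j in range(size):
            wh = (j + 1) / (size + 1)              # horizontal weight
            vert = (1 - wv) * top[j] + wv * bottom[j]
            horz = (1 - wh) * left[i] + wh * right[i]
            out[r + i, c + j] = (vert + horz) / 2  # average both directions
    return out

img = np.tile(np.arange(32, dtype=float), (32, 1))   # smooth horizontal ramp
damaged = img.copy()
damaged[12:20, 12:20] = 0                            # simulate a lost block
restored = conceal_block(damaged, 12, 12)
print(np.abs(restored[12:20, 12:20] - img[12:20, 12:20]).max())
```

    On a smooth ramp this interpolation is essentially exact; a learned model earns its keep on textured regions, where simple interpolation blurs detail.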

  20. File compression and encryption based on LLS and arithmetic coding

    Science.gov (United States)

    Yu, Changzhi; Li, Hengjian; Wang, Xiyu

    2018-03-01

We propose a file compression and encryption model based on arithmetic coding. First, the original symbols to be encoded are input to the encoder one by one; a set of chaotic sequences is produced using the logistic and sine chaos system (LLS), and the values of these chaotic sequences randomly modify the upper and lower limits of the current symbol's probability interval. To achieve encryption, the upper and lower limits of all character probabilities are modified while encoding each symbol. Experimental results show that the proposed model achieves data encryption while retaining almost the same compression efficiency as plain arithmetic coding.
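    The keystream side of this can be sketched as follows (the abstract does not specify the exact LLS coupling or how the limits are shifted, so both are assumptions, and a full arithmetic coder is omitted): a seeded logistic-plus-sine iteration yields a reproducible chaotic sequence, and each value perturbs a symbol's probability interval before coding.

```python
import math

def lls_stream(x0, r=3.99, n=16):
    """Keyed chaotic keystream: logistic map step, then a sine map step
    (this particular coupling is an assumption, not the paper's exact LLS)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)                # logistic map step
        x = abs(math.sin(math.pi * x))     # sine map step
        out.append(x)
    return out

def perturb_interval(low, high, k, eps=0.01):
    """Shift a symbol's [low, high) interval by a key-dependent amount."""
    shift = eps * (high - low) * (k - 0.5)
    return low + shift, high + shift

ks = lls_stream(0.31415)                   # the seed x0 acts as the key
low, high = perturb_interval(0.2, 0.5, ks[0])
print(high - low)                          # interval width is preserved
```

    A decoder holding the same seed regenerates the same stream and undoes each shift; without the seed, the perturbed intervals make the arithmetic-coded output undecodable.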