WorldWideScience

Sample records for generic core scales

  1. The Performance of the PedsQL™ Generic Core Scales in Children with Sickle Cell Disease

    OpenAIRE

    2008-01-01

    The objective of this study was to determine the feasibility, reliability and validity of the Pediatric Quality of Life Inventory™ generic core scales (PedsQL™ questionnaire) in children with sickle cell disease. This was a cross-sectional study of children from an urban hospital-based sickle cell disease clinic and an urban primary care clinic. The study participants were children ages 2 to 18 years who presented to clinic for a routine visit. Health-related quality of life (HRQL) was the ma...

  2. Validation of the Persian Version of PedsQL™ 4.0 Generic Core Scales in Toddlers and Children

    Directory of Open Access Journals (Sweden)

    Alaleh Gheissari

    2012-01-01

    Conclusion: Results showed that the Persian version of the PedsQL™ 4.0 Generic Core Scales is valid and acceptable for pediatric health research. It is necessary to alter the scoring of the 2-4-year-old questionnaire and to find a way to increase reliability for healthy children aged 8-12 years in particular, in accordance with Iranian culture.

  3. Validation of Persian Version of PedsQL™ 4.0 Generic Core Scales in Toddlers and Children

    Science.gov (United States)

    Gheissari, Alaleh; Farajzadegan, Ziba; Heidary, Maryam; Salehi, Fatemeh; Masaeli, Ali; Mazrooei, Amin; Varni, James W; Fallah, Zahra; Zandieh, Fariborz

    2012-01-01

    Introduction: To evaluate the reliability, validity and feasibility of the Persian version of the Pediatric Quality of Life Inventory (PedsQL™) 4.0 Generic Core Scales in healthy Iranian students ages 7-15 and chronically ill children ages 2-18. Methods: We followed the translation methodology proposed by the developer to validate the Persian version of the PedsQL™ 4.0 Generic Core Scales for children. Six hundred and sixty children and adolescents and their parents were enrolled. A sample of 160 healthy students was chosen by the random cluster method from 4 regions of the Isfahan education offices, and 60 chronically ill children were recruited from the St. Alzahra hospital private clinics. The questionnaires were completed by the participants. Results: The Persian version of the PedsQL™ 4.0 Generic Core Scales discriminated between healthy and chronically ill children (the mean score of healthy students was 12.3 points better than that of chronically ill children, P<0.001). Cronbach's alpha internal consistency values exceeded 0.7 for child self-reports and for proxy reports of children 5-7 years old and 13-18 years old. Reliability of proxy reports for 2-4 year-olds was much lower than 0.7. Although proxy-report reliability for chronically ill children 8-12 years old was above 0.7, it was slightly below 0.7 for healthy children in the same age group. Construct, criterion, face and content validity were acceptable. In addition, the Persian version of the PedsQL™ 4.0 Generic Core Scales was feasible and easy to complete. Conclusion: Results showed that the Persian version of the PedsQL™ 4.0 Generic Core Scales is valid and acceptable for pediatric health research. It is necessary to alter the scoring of the 2-4-year-old questionnaire and to find a way to increase reliability for healthy children aged 8-12 years in particular, in accordance with Iranian culture. PMID:22701775
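
    Several of the records above judge internal consistency against the conventional 0.70 benchmark for Cronbach's alpha. As an illustrative sketch (toy Likert-type data, not drawn from any study listed here), the coefficient can be computed directly from an item-score matrix:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 6 respondents answering 4 Likert-type items (invented values).
scores = np.array([
    [4, 4, 3, 4],
    [3, 3, 3, 2],
    [5, 4, 5, 5],
    [2, 2, 1, 2],
    [4, 5, 4, 4],
    [1, 2, 2, 1],
])
alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.3f}")  # strongly correlated items -> alpha well above 0.70
```

    Alpha rises with inter-item correlation and with the number of items; the 0.70 threshold cited in these abstracts is the standard for group-level comparisons, with 0.90 recommended for individual-level use.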

  4. The PedsQL™ in Pediatric Patients with Spinal Muscular Atrophy: Feasibility, Reliability, and Validity of the Pediatric Quality of Life Inventory™ Generic Core Scales and Neuromuscular Module

    Science.gov (United States)

    Iannaccone, Susan T.; Hynan, Linda S.; Morton, Anne; Buchanan, Renee; Limbers, Christine A.; Varni, James W.

    2009-01-01

    For Phase II and III clinical trials in children with Spinal Muscular Atrophy (SMA), reliable and valid outcome measures are necessary. Since 2000, the American Spinal Muscular Atrophy Randomized Trials (AmSMART) group has established reliability and validity for measures of strength, lung function, and motor function in the population from age 2 years to 18 years. The PedsQL™ (Pediatric Quality of Life Inventory™) Measurement Model was designed to integrate the relative merits of generic and disease-specific approaches, with disease-specific modules. The PedsQL™ 3.0 Neuromuscular Module was designed to measure HRQOL dimensions specific to children ages 2 to 18 years with neuromuscular disorders, including SMA. One hundred seventy-six children with SMA and their parents completed the PedsQL™ 4.0 Generic Core Scales and PedsQL™ 3.0 Neuromuscular Module. The PedsQL™ demonstrated feasibility, reliability and validity in the SMA population. Consistent with the conceptualization of disease-specific symptoms as causal indicators of generic HRQOL, the majority of intercorrelations among the Neuromuscular Module Scales and the Generic Core Scales were in the medium to large range, supporting construct validity. For the purposes of a clinical trial, the PedsQL™ Neuromuscular Module and Generic Core Scales provide an integrated measurement model with the advantages of both generic and condition-specific instruments. PMID:19846309

  5. Using Rasch rating scale model to reassess the psychometric properties of the Persian version of the PedsQL™ 4.0 Generic Core Scales in school children

    Directory of Open Access Journals (Sweden)

    Jafari Peyman

    2012-03-01

    Abstract Background Item response theory (IRT) is extensively used to develop adaptive instruments of health-related quality of life (HRQoL). However, each IRT model has its own function for estimating item and category parameters, and hence different results may be found using the same response categories with different IRT models. The present study used the Rasch rating scale model (RSM) to examine and reassess the psychometric properties of the Persian version of the PedsQL™ 4.0 Generic Core Scales. Methods The PedsQL™ 4.0 Generic Core Scales were completed by 938 Iranian school children and their parents. Convergent, discriminant and construct validity of the instrument were assessed by classical test theory (CTT). The RSM was applied to investigate person and item reliability, item statistics and the ordering of response categories. Results The CTT method showed that the scaling success rates for convergent and discriminant validity were 100% in all domains, with the exception of physical health in the child self-report. Moreover, confirmatory factor analysis supported a four-factor model similar to the original version. The RSM showed that 22 out of 23 items had acceptable infit and outfit statistics (0.6), that person reliabilities were low while item reliabilities were high, and that item difficulty ranged from -1.01 to 0.71 for child self-report and from -0.68 to 0.43 for parent proxy-report. The RSM also showed that the successive response categories for all items were not located in the expected order. Conclusions This study revealed that, in all domains, the five response categories did not perform adequately. It is not known whether this problem is a function of the meaning of the response choices in the Persian language or an artifact of a mostly healthy population that did not use the full range of the response categories. The response categories should be evaluated in further validation studies, especially in large samples of chronically ill patients.
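
    Under the rating scale model used in the record above, the probability of response category x for a person of ability θ on an item of difficulty δ with step thresholds τ_j is proportional to exp(Σ_{j≤x}(θ − δ − τ_j)). "Categories not located in the expected order" means the thresholds are disordered, so some category is never the most probable response at any ability level. A small sketch with invented threshold values (not estimates from the study) makes this visible:

```python
import numpy as np

def rsm_probs(theta, delta, taus):
    """Category probabilities P(X = 0..m) under the Rasch rating scale model."""
    steps = theta - delta - np.asarray(taus, dtype=float)   # one logit per step
    logits = np.concatenate(([0.0], np.cumsum(steps)))      # cumulative step sums
    expx = np.exp(logits - logits.max())                    # stabilised softmax
    return expx / expx.sum()

# Hypothetical step thresholds for a 5-category item (categories 0..4).
threshold_sets = {
    "ordered":    [-1.5, -0.5, 0.5, 1.5],   # each category modal in turn
    "disordered": [-1.5, 0.8, -0.2, 1.5],   # step 2 threshold out of order
}
modal_categories = {}
for name, taus in threshold_sets.items():
    grid = np.linspace(-4, 4, 801)          # sweep person ability theta
    modal_categories[name] = {int(np.argmax(rsm_probs(t, 0.0, taus)))
                              for t in grid}
    print(name, sorted(modal_categories[name]))
# With the disordered thresholds, category 2 is never the modal response.
```

    Diagnostics of this kind underlie the abstract's conclusion that the five response categories did not all function as intended.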

  6. Validity and reliability of the Iranian version of the Pediatric Quality of Life Inventory™ 4.0 (PedsQL™) Generic Core Scales in children

    Directory of Open Access Journals (Sweden)

    Amiri Parisa

    2012-01-01

    Abstract Background This study aimed to investigate the reliability and validity of the Iranian version of the Pediatric Quality of Life Inventory™ (PedsQL™) 4.0 Generic Core Scales in children. Methods A standard forward and backward translation procedure was used to translate the US English version of the PedsQL™ 4.0 Generic Core Scales for children into the Iranian language (Persian). The Iranian version of the PedsQL™ 4.0 Generic Core Scales was completed by 503 healthy and 22 chronically ill children aged 8-12 years and their parents. Reliability was evaluated using internal consistency. Known-groups discriminant comparisons were made, and exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were conducted. Results The internal consistency, as measured by Cronbach's alpha coefficients, exceeded the minimum reliability standard of 0.70. All monotrait-multimethod correlations were higher than multitrait-multimethod correlations. The intraclass correlation coefficients (ICC) between the child self-reports and parent proxy-reports showed moderate to high agreement. Exploratory factor analysis extracted six factors from the PedsQL™ 4.0 for both self- and proxy-reports, accounting for 47.9% and 54.8% of total variance, respectively. The results of the confirmatory factor analysis for 6-factor models for both self-report and proxy-report indicated acceptable fit for the proposed models. Regarding health status, as hypothesized from previous studies, healthy children reported significantly higher health-related quality of life than those with chronic illnesses. Conclusions The findings support the initial reliability and validity of the Iranian version of the PedsQL™ 4.0 as a generic instrument to measure health-related quality of life of children in Iran.
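
    The self-report/proxy-report agreement mentioned above is quantified with intraclass correlation coefficients. One common variant, ICC(2,1) (two-way random effects, absolute agreement, single measures), can be computed from ANOVA mean squares. The sketch below uses invented paired total scores, not data from the study:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    ratings: (n_targets, k_raters) array, e.g. child vs. parent total scores."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # targets
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Invented paired scores: column 0 = child self-report, column 1 = parent proxy.
pairs = np.array([[78, 75], [62, 60], [90, 88], [55, 58], [70, 66], [83, 80]])
print(f"ICC(2,1) = {icc_2_1(pairs):.2f}")  # close agreement -> ICC near 1
```

    Unlike a Pearson correlation, the ICC penalises systematic differences between raters, which matters when proxy reports run consistently lower than self-reports, as several records in this list observe.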

  7. Health-related quality of life assessment in Pakistani paediatric cancer patients using the PedsQL™ 4.0 Generic Core Scales and PedsQL™ Cancer Module

    Directory of Open Access Journals (Sweden)

    Chaudhry Zainab

    2012-05-01

    Abstract Background The purpose of the study was to evaluate the HRQOL of paediatric cancer patients in comparison to healthy children across age groups, using the PedsQL™ 4.0 Generic Core Scales and the PedsQL™ Cancer Module. Method The PedsQL™ 4.0 Generic Core Scales and PedsQL™ Cancer Module 3.0 were administered to 56 children, comprising 26 cancer patients and 30 healthy children, employing self-report and proxy-report forms. The results for the cancer patients were then compared with those of the healthy comparison group. Results The results indicated a significant relationship between the HRQOL reports of cancer patients and those of their parents. However, the mean score of the paediatric cancer patients was significantly lower than that of the healthy comparison group. The mean of the proxy reports was lower overall on both the PedsQL™ and the PedsQL™ Cancer Module. Conclusion Overall, the HRQOL of cancer patients was lower than that of healthy children but quite similar to their parents' perception, and the parental means on the PedsQL™ and PedsQL™ 3.0 Cancer Module were significantly low. The study indicated a marked difference between cancer patients' and healthy children's HRQOL perception; unfortunately, in a country like Pakistan, where cancer is on the increase, no significant work has yet been done to explore this area of research. The present study highlights the need to focus on the particular psychological health services required to serve this physically challenged population.

  8. Generic maximum likely scale selection

    DEFF Research Database (Denmark)

    Pedersen, Kim Steenstrup; Loog, Marco; Markussen, Bo

    2007-01-01

    The fundamental problem of local scale selection is addressed by means of a novel principle, which is based on maximum likelihood estimation. The principle is generally applicable to a broad variety of image models and descriptors, and provides a generic scale estimation methodology. The focus is on second order moments of multiple measurement outputs at a fixed location. These measurements, which reflect local image structure, consist in the cases considered here of Gaussian derivatives taken at several scales and/or having different derivative orders.
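
    The record above estimates a local scale from Gaussian-derivative measurements. The paper's maximum-likelihood principle itself is not reproduced here; as a runnable point of reference, the sketch below applies the classical alternative of scale-normalized derivative responses (Lindeberg-style selection) to a 1D Gaussian blob, where the γ=1-normalized second-derivative response peaks analytically at t = 2·t0:

```python
import numpy as np

def smooth(signal, t, dx):
    """Convolve with a sampled, normalised Gaussian of variance t (grid step dx)."""
    radius = int(4 * np.sqrt(t) / dx) + 1
    xs = np.arange(-radius, radius + 1) * dx
    kernel = np.exp(-xs**2 / (2 * t))
    kernel /= kernel.sum()
    return np.convolve(signal, kernel, mode="same")

dx = 0.05
x = np.arange(-30, 30, dx)
t0 = 4.0                                   # blob variance to be recovered
blob = np.exp(-x**2 / (2 * t0))
centre = len(x) // 2                       # x = 0

scales = np.linspace(0.5, 20, 100)
responses = []
for t in scales:
    L = smooth(blob, t, dx)                            # scale-space smoothing
    Lxx = np.gradient(np.gradient(L, dx), dx)          # second derivative
    responses.append(t * abs(Lxx[centre]))             # gamma=1 normalisation
best = scales[int(np.argmax(responses))]
print(f"selected t = {best:.2f}")   # analytic optimum here is t = 2*t0 = 8
```

    The numerically selected scale tracks the analytic optimum; the cited paper replaces this hand-picked normalisation with a likelihood criterion over second order moments of such derivative measurements.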

  9. Generic Dynamic Scaling in Kinetic Roughening

    OpenAIRE

    Ramasco, José J.; López, Juan M.; Rodríguez, Miguel A.

    2000-01-01

    We study the dynamic scaling hypothesis in invariant surface growth. We show that the existence of power-law scaling of the correlation functions (scale invariance) does not determine a unique dynamic scaling form of the correlation functions, which leads to the different anomalous forms of scaling recently observed in growth models. We derive all the existing forms of anomalous dynamic scaling from a new generic scaling ansatz. The different scaling forms are subclasses of this generic scaling ansatz…

  10. Measuring health-related quality of life in children with cancer living in mainland China: feasibility, reliability and validity of the Chinese mandarin version of PedsQL 4.0 Generic Core Scales and 3.0 Cancer Module

    Directory of Open Access Journals (Sweden)

    Ji Yi

    2011-11-01

    Abstract Background The Pediatric Quality of Life Inventory (PedsQL) is a widely used instrument to measure pediatric health-related quality of life (HRQOL) for children aged 2 to 18 years. The purpose of the current study was to investigate the feasibility, reliability and validity of the Chinese mandarin version of the PedsQL 4.0 Generic Core Scales and 3.0 Cancer Module in a group of Chinese children with cancer. Methods The PedsQL 4.0 Generic Core Scales and the PedsQL 3.0 Cancer Module were administered to children with cancer (aged 5-18 years) and parents of such children (aged 2-18 years). For comparison, a survey of a demographically group-matched sample of the general population, with children (aged 5-18 years) and parents of children (aged 2-18 years), was conducted with the PedsQL 4.0 Generic Core Scales. Results The minimal mean percentage of missing item responses (except on the School Functioning scale) supported the feasibility of the PedsQL 4.0 Generic Core Scales and 3.0 Cancer Module for Chinese children with cancer. Most of the scales showed satisfactory reliability, with Cronbach's α exceeding 0.70, and all scales demonstrated sufficient test-retest reliability. Assessing the clinical validity of the questionnaires, statistically significant differences were found between healthy children and children with cancer, and between children on-treatment versus off-treatment ≥12 months. Positive significant correlations were observed between the scores of the PedsQL 4.0 Generic Core Scales and the PedsQL 3.0 Cancer Module. Exploratory factor analysis demonstrated sufficient factorial validity. Moderate to good agreement was found between child self- and parent proxy-reports. Conclusion The findings support the feasibility, reliability and validity of the Chinese Mandarin version of the PedsQL 4.0 Generic Core Scales and 3.0 Cancer Module in children with cancer living in mainland China.

  11. Validation of the Korean version of the Pediatric Quality of Life Inventory™ 4.0 (PedsQL™) Generic Core Scales in school children and adolescents using the Rasch model

    Directory of Open Access Journals (Sweden)

    Varni James W

    2008-06-01

    Abstract Background The Pediatric Quality of Life Inventory™ (PedsQL™) is a child self-report and parent proxy-report instrument designed to assess health-related quality of life (HRQOL) in healthy and ill children and adolescents. It has been translated into over 70 international languages and proposed as a valid and reliable pediatric HRQOL measure. This study aimed to assess the psychometric properties of the Korean translation of the PedsQL™ 4.0 Generic Core Scales. Methods Following the guidelines for linguistic validation, the original US English scales were translated into Korean and cognitive interviews were administered. The field-testing responses of 1425 school children and adolescents and 1431 parents to the Korean version of the PedsQL™ 4.0 Generic Core Scales were analyzed utilizing confirmatory factor analysis and the Rasch model. Results Consistent with studies using the US English instrument and other translation studies, score distributions were skewed toward higher HRQOL in a predominantly healthy population. Confirmatory factor analysis supported a four-factor and a second-order factor model. The analysis using the Rasch model showed that person reliabilities are low, item reliabilities are high, and the majority of items fit the model's expectation. The Rasch rating scale diagnostics showed that the PedsQL™ 4.0 Generic Core Scales in general have the optimal number of response categories, but category 4 (almost always a problem) is somewhat problematic for the healthy school sample. The agreement between child self-report and parent proxy-report was moderate. Conclusion The results demonstrate the feasibility, validity, item reliability, item fit, and agreement between child self-report and parent proxy-report of the Korean version of the PedsQL™ 4.0 Generic Core Scales for school population health research in Korea. However, the utilization of the Korean version of the PedsQL™ 4.0 Generic Core Scales for healthy school

  12. Impaired health-related quality of life in children and adolescents with chronic conditions: a comparative analysis of 10 disease clusters and 33 disease categories/severities utilizing the PedsQL™ 4.0 Generic Core Scales

    Directory of Open Access Journals (Sweden)

    Burwinkle Tasha M

    2007-07-01

    Abstract Background Advances in biomedical science and technology have resulted in dramatic improvements in the healthcare of pediatric chronic conditions. With enhanced survival, health-related quality of life (HRQOL) issues have become more salient. The objectives of this study were to compare generic HRQOL across ten chronic disease clusters and 33 disease categories/severities from the perspectives of patients and parents. Comparisons were also benchmarked against data from healthy children. Methods The analyses were based on over 2,500 pediatric patients from 10 physician-diagnosed disease clusters and 33 disease categories/severities and over 9,500 healthy children, utilizing the PedsQL™ 4.0 Generic Core Scales. Patients were recruited from general pediatric clinics, subspecialty clinics, and hospitals. Results Pediatric patients with diabetes, gastrointestinal conditions, cardiac conditions, asthma, obesity, end stage renal disease, psychiatric disorders, cancer, rheumatologic conditions, and cerebral palsy self-reported progressively more impaired overall HRQOL than healthy children, respectively, with medium to large effect sizes. Patients with cerebral palsy self-reported the most impaired HRQOL, while patients with diabetes self-reported the best HRQOL. Parent proxy-reports generally paralleled patient self-report, with several notable differences. Conclusion The results demonstrate differential effects of pediatric chronic conditions on patient HRQOL across disease clusters, categories, and severities utilizing the PedsQL™ 4.0 Generic Core Scales from the perspectives of pediatric patients and parents. The data contained within this study represent a larger and more diverse population of pediatric patients with chronic conditions than previously reported in the extant literature. The findings contribute important information on the differential effects of pediatric chronic conditions on generic HRQOL from the perspectives of children and

  13. Factorial invariance of child self-report across healthy and chronic health condition groups: a confirmatory factor analysis utilizing the PedsQL™ 4.0 Generic Core Scales.

    Science.gov (United States)

    Limbers, Christine A; Newman, Daniel A; Varni, James W

    2008-07-01

    The objective of the present study was to examine the factorial invariance of the PedsQL 4.0 Generic Core Scales for child self-report across 11,433 children ages 5-18 with chronic health conditions and healthy children. Multigroup Confirmatory Factor Analysis was performed specifying a five-factor model. Two multigroup structural equation models, one with constrained parameters and the other with unconstrained parameters, were proposed in order to compare the factor loadings across children with chronic health conditions and healthy children. Metric invariance (i.e., equal factor loadings) was demonstrated based on stability of the Comparative Fit Index (CFI) between the two models, and several additional indices of practical fit including the root mean squared error of approximation, the Non-normed Fit Index, and the Parsimony Normed Fit Index. The findings support an equivalent five-factor structure on the PedsQL 4.0 Generic Core Scales across healthy and chronic health condition groups. These findings suggest that when differences are found across chronic health condition and healthy groups when utilizing the PedsQL, these differences are more likely real differences in self-perceived health-related quality of life, rather than differences in interpretation of the PedsQL items as a function of health status.

  14. Comparison between Utility of the Thai Pediatric Quality of Life Inventory 4.0 Generic Core Scales and 3.0 Cerebral Palsy Module

    Science.gov (United States)

    Tantilipikorn, Pinailug; Watter, Pauline; Prasertsukdee, Saipin

    2013-01-01

    Health-related quality of life (HRQOL) is increasingly being considered in the management of patients with various conditions. HRQOL instruments can be broadly classified as generic or disease-specific measures. Several generic HRQOL instruments in different languages have been developed for paediatric populations including the Pediatric Quality…

  15. Health-related quality of life in young adult patients with rheumatoid arthritis in Iran: reliability and validity of the Persian translation of the PedsQL™ 4.0 Generic Core Scales Young Adult Version.

    Science.gov (United States)

    Pakpour, Amir H; Zeidi, Isa Mohammadi; Hashemi, Fariba; Saffari, Mohsen; Burri, Andrea

    2013-01-01

    The objective of the present study was to determine the reliability and validity of the Persian translation of the Pediatric Quality of Life Inventory (PedsQL™) 4.0 Generic Core Scales Young Adult Version in an Iranian sample of young adult patients with rheumatoid arthritis (RA). One hundred ninety-seven young adult patients with RA completed the 23-item PedsQL™ and the 36-item Short-Form Health Survey (SF-36). Disease activity based on the Disease Activity Score 28 was also measured. Internal consistency and test-retest reliability, as well as construct, discriminant, and convergent validity, were tested. Confirmatory factor analysis (CFA) was used to verify the original factor structure of the PedsQL™. Responsiveness to change in PedsQL™ scores over time was also assessed. Cronbach's alpha coefficients ranged from α = 0.82 to α = 0.91. Test-retest reproducibility was satisfactory for all scales and the total scale score. The PedsQL™ showed good convergent validity with the SF-36. The PedsQL™ distinguished well between young adult patients and healthy young adults, and also between RA groups with different comorbidities. The CFA did not confirm the original four-factor model; instead, analyses revealed a best-fitting five-factor model for the PedsQL™ Young Adult Version. Repeated measures analysis of variance indicated that the PedsQL™ scale scores for young adults increased significantly over time. The Persian translation of the PedsQL™ 4.0 Generic Core Scales Young Adult Version demonstrated good psychometric properties in young adult patients with RA and can be recommended for use in RA research in Iran.

  16. Factorial invariance of child self-report across age subgroups: a confirmatory factor analysis of ages 5 to 16 years utilizing the PedsQL 4.0 Generic Core Scales.

    Science.gov (United States)

    Limbers, Christine A; Newman, Daniel A; Varni, James W

    2008-01-01

    The utilization of health-related quality of life (HRQOL) measurement in an effort to improve pediatric health and well-being and determine the value of health care services has grown dramatically over the past decade. The paradigm shift toward patient-reported outcomes (PROs) in clinical trials has provided the opportunity to emphasize the value and essential need for pediatric patient self-report. In order for HRQOL/PRO comparisons to be meaningful for subgroup analyses, it is essential to demonstrate factorial invariance. This study examined age subgroup factorial invariance of child self-report for ages 5 to 16 years on more than 8,500 children utilizing the PedsQL 4.0 Generic Core Scales. Multigroup Confirmatory Factor Analysis (MGCFA) was performed specifying a five-factor model. Two multigroup structural equation models, one with constrained parameters and the other with unconstrained parameters, were proposed to compare the factor loadings across the age subgroups. Metric invariance (i.e., equal factor loadings) across the age subgroups was demonstrated based on stability of the Comparative Fit Index between the two models, and several additional indices of practical fit including the Root Mean Squared Error of Approximation, the Non-Normed Fit Index, and the Parsimony Normed Fit Index. The findings support an equivalent five-factor structure across the age subgroups. Based on these data, it can be concluded that children across the age subgroups in this study interpreted items on the PedsQL 4.0 Generic Core Scales in a similar manner regardless of their age.

  17. How young can children reliably and validly self-report their health-related quality of life?: An analysis of 8,591 children across age subgroups with the PedsQL™ 4.0 Generic Core Scales

    Directory of Open Access Journals (Sweden)

    Burwinkle Tasha M

    2007-01-01

    Abstract Background The last decade has evidenced a dramatic increase in the development and utilization of pediatric health-related quality of life (HRQOL) measures in an effort to improve pediatric patient health and well-being and determine the value of healthcare services. The emerging paradigm shift toward patient-reported outcomes (PROs) in clinical trials has provided the opportunity to further emphasize the value and essential need for pediatric patient self-reported outcomes measurement. Data from the PedsQL™ DatabaseSM were utilized to test the hypothesis that children as young as 5 years of age can reliably and validly report their HRQOL. Methods The sample analyzed represented child self-report age data on 8,591 children ages 5 to 16 years from the PedsQL™ 4.0 Generic Core Scales DatabaseSM. Participants were recruited from general pediatric clinics, subspecialty clinics, and hospitals in which children were being seen for well-child checks, mild acute illness, or chronic illness care (n = 2,603; 30.3%), and from a State Children's Health Insurance Program (SCHIP) in California (n = 5,988; 69.7%). Results Items on the PedsQL™ 4.0 Generic Core Scales had minimal missing responses for children as young as 5 years old, supporting feasibility. The majority of the child self-report scales across the age subgroups, including for children as young as 5 years, exceeded the minimum internal consistency reliability standard of 0.70 required for group comparisons, while the Total Scale Scores across the age subgroups approached or exceeded the reliability criterion of 0.90 recommended for analyzing individual patient scale scores. Construct validity was demonstrated utilizing the known-groups approach. For each PedsQL™ scale and summary score, across age subgroups, including children as young as 5 years, healthy children demonstrated a statistically significant difference in HRQOL (better HRQOL) than children with a known chronic health

  18. Initial validation of the Argentinean Spanish version of the PedsQL™ 4.0 Generic Core Scales in children and adolescents with chronic diseases: acceptability and comprehensibility in low-income settings

    Directory of Open Access Journals (Sweden)

    Bauer Gabriela

    2008-08-01

    Abstract Background To validate the Argentinean Spanish version of the PedsQL™ 4.0 Generic Core Scales in Argentinean children and adolescents with chronic conditions and to assess the impact of socio-demographic characteristics on the instrument's comprehensibility and acceptability. Reliability, known-groups validity, and convergent validity were tested. Methods Consecutive sample of 287 children with chronic conditions and 105 healthy children, ages 2–18, and their parents. Chronically ill children were (1) attending outpatient clinics and (2) had one of the following diagnoses: stem cell transplant, chronic obstructive pulmonary disease, HIV/AIDS, cancer, end stage renal disease, or complex congenital cardiopathy. Patients and adult proxies completed the PedsQL™ 4.0 and an overall health status assessment. Physicians were asked to rate the degree of health status impairment. Results The PedsQL™ 4.0 was feasible (only 9 children, all 5 to 7 year-olds, could not complete the instrument), easy to administer, completed without, or with minimal, help by most children and parents, and required a brief administration time (average 5–6 minutes). People living below the poverty line and/or with low literacy needed more help to complete the instrument. Cronbach's alpha internal consistency values for the total and subscale scores exceeded 0.70 for self-reports of children over 8 years old and parent-reports of children over 5 years of age. Reliability of proxy-reports of 2–4 year-olds was low but improved when school items were excluded. Internal consistency for 5–7 year-olds was low (α range = 0.28–0.76). Construct validity was good. Child self-report and parent proxy-report PedsQL™ 4.0 scores were moderately but significantly correlated (ρ = 0.39). Conclusion Results suggest that the Argentinean Spanish PedsQL™ 4.0 is suitable for research purposes in the public health setting for children over 8 years old and parents of children over 5 years old

  19. A generic scale for assessment of attitudes towards social robots

    DEFF Research Database (Denmark)

    2016-01-01

    The research field into social robotics is expanding and with it the need for consistent methods for assessing attitudinal stance towards social robots. In this paper we describe the development and planned validation of the Attitudes towards social robots scale (ASOR-5): a generic questionnaire...

  20. A generic library for large scale solution of PDEs on modern heterogeneous architectures

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

    2012-01-01

    Adapting to new programming models for modern multi- and many-core architectures requires code rewriting and changes to algorithms and data structures in order to achieve good efficiency and scalability. We present a generic library for solving large scale partial differential equations (PDEs)… As a prototype for large scale PDE solvers, we present the assembly of a tool for simulation of three-dimensional fully nonlinear water waves. Measurements show scalable performance results, in the same order as a dedicated non-library version of the wave tool. Introducing a domain decomposition…
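
    The library described above targets large heterogeneous systems, but the computational core of such PDE solvers is an iterative stencil update. As a minimal, hedged sketch (plain NumPy, no GPU back-end or domain decomposition; grid size and iteration count chosen for illustration only), a Jacobi iteration for the Poisson problem −∇²u = 1 on the unit square looks like:

```python
import numpy as np

def jacobi_poisson(n=32, iters=2000):
    """Solve -laplace(u) = 1 on the unit square with u = 0 on the boundary,
    using Jacobi iterations on an n x n interior grid."""
    h = 1.0 / (n + 1)
    u = np.zeros((n + 2, n + 2))       # includes the zero boundary ring
    f = np.ones((n + 2, n + 2))        # constant right-hand side
    for _ in range(iters):
        # 5-point stencil: each interior point becomes the average of its
        # neighbours plus the scaled source term.
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                + u[1:-1, :-2] + u[1:-1, 2:]
                                + h * h * f[1:-1, 1:-1])
    return u

u = jacobi_poisson()
print(f"max u = {u.max():.4f}")  # peak of the solution, near the domain centre
```

    Libraries of the kind described in the record keep this update pattern but distribute the grid across subdomains and execute the stencil on accelerator hardware; the known continuous peak value for this problem is about 0.0737, which the discrete solution approaches.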

  1. Scaling Turbo Boost to a 1000 cores

    CERN Document Server

    S, Ananth Narayan; Fedorova, Alexandra

    2010-01-01

    The Intel Core i7 processor code-named Nehalem provides a feature named Turbo Boost, which opportunistically varies the frequencies of the processor's cores. The frequency of a core is determined by core temperature, the number of active cores, the estimated power consumption, the estimated current consumption, and operating system frequency scaling requests. For a chip multiprocessor (CMP) that has a small number of physical cores and a small set of performance states, deciding the Turbo Boost frequency to use on a given core might not be difficult. However, we do not know the complexity of this decision-making process in the context of a large number of cores, scaling to the 100s, as predicted by researchers in the field.
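
    To make the decision inputs concrete, here is a deliberately simplified, hypothetical governor in the spirit of the description above. The bin sizes, limits, and back-off rule are invented for illustration and are not Intel's actual algorithm:

```python
# Hypothetical sketch of the decision the record describes: pick a per-core
# boost frequency subject to active-core-count, thermal, and power headroom.
# All names and limit values below are illustrative assumptions.

def boost_frequency(base_ghz, bin_ghz, active_cores, temp_c, est_power_w,
                    temp_limit_c=95.0, power_limit_w=130.0):
    """Return the highest frequency whose constraints are all satisfied."""
    # Fewer active cores leave more headroom: allow more 133 MHz-style bins.
    max_bins = max(0, 4 - (active_cores - 1))
    freq = base_ghz + max_bins * bin_ghz
    # Back off one bin at a time while thermal or (naively frequency-scaled)
    # power estimates exceed their limits.
    while freq > base_ghz and (temp_c > temp_limit_c or
                               est_power_w * (freq / base_ghz) > power_limit_w):
        freq -= bin_ghz
    return round(freq, 3)

one_core = boost_frequency(2.66, 0.133, active_cores=1, temp_c=70, est_power_w=95)
all_cores = boost_frequency(2.66, 0.133, active_cores=4, temp_c=70, est_power_w=95)
print(one_core, all_cores)  # a lone active core earns more boost bins
```

    Even this toy version shows why the authors question scalability: with hundreds of cores, the feasible combinations of per-core states and shared thermal/power budgets grow far beyond a simple table lookup.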

  2. Validity and reliability of the Pediatric Quality of Life Inventory Version 4.0 Generic Core Scales in Chinese children and adolescents

    Institute of Scientific and Technical Information of China (English)

    陈裕明; 何丽萍; 麦锦城; 郝元涛; 熊莉华; 陈维清; 吴江南

    2008-01-01

    Objective: To evaluate the reliability and validity of the parent proxy-report scales of the Pediatric Quality of Life Inventory Version 4.0 (PedsQL™ 4.0) Generic Core Scales, Chinese Version. Methods: 3493 school students aged 6-18 years were recruited using a multistage cluster sampling method. Health-related quality of life was assessed using the above-mentioned PedsQL™ 4.0 scales. Internal consistency was assessed using Cronbach's α coefficient, while validity was tested through correlation analysis, t-tests and exploratory factor analysis. Results: The internal consistency reliability for the Total Scale Score (Cronbach's α = 0.90), Physical Health Summary Score (α = 0.81), and Psychosocial Health Summary Score (α = 0.89) was excellent. Six major factors were extracted by factor analysis, which basically matched the designed structure of the original version and accounted for nearly 66% of the variance. The Total Scale Score significantly decreased by 3.5 to 13.3 points (P<0.05) in children and adolescents who had conditions including colds, skin hypersensitivity, food allergy, muscle or joint pain, or breathlessness with a frequency of six times or more per year, or who had asthma, as compared to those with a lower frequency (≤5 times/year) of these conditions or without asthma. We found moderate to high correlations between items and the subscales, with correlation coefficients ranging from 0.45 to 0.84 (P<0.01). Conclusion: The reliability and validity of the parent proxy-report scales of the PedsQL™ 4.0 Generic Core Scales, Chinese Version, were as good as those of the original version. Our findings suggest that the scales could be applied to evaluate health-related quality of life in children in Chinese regions similar to Guangzhou.
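    Internal-consistency figures like the Cronbach's α values reported above (0.81-0.90) can be computed directly from item-level scores. A minimal sketch in plain Python, using made-up toy data rather than the study's responses:

    ```python
    def cronbach_alpha(items):
        # items: list of equal-length score lists, one per questionnaire item;
        # items[i][j] is respondent j's score on item i.
        k = len(items)          # number of items
        n = len(items[0])       # number of respondents

        def var(xs):
            # Sample variance (n - 1 denominator).
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        item_var = sum(var(it) for it in items)
        totals = [sum(it[j] for it in items) for j in range(n)]
        # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
        return k / (k - 1) * (1 - item_var / var(totals))
    ```

    With real questionnaire data the item score lists would come from the survey responses; values above roughly 0.7 are conventionally taken as acceptable internal consistency, matching the thresholds discussed in these records.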

  3. Generic BWR-4 degraded core in-vessel study. Status report

    Energy Technology Data Exchange (ETDEWEB)

    1984-11-01

    The original intent of this project was to produce a phenomenological study of the in-vessel degradation which occurs during the TQUX and TQUV sequences for a generic BWR-4, from the initiation of the FSAR Chapter 15 operational transient through core debris bed formation to the failure of the primary pressure boundary. Bounding calculations were to be performed for the two high pressure and low pressure non-LOCA scenarios to assess the uncertainties in the current state of knowledge regarding the source terms for containment integrity studies. Source terms as such were defined in terms of hydrogen generation, unreacted metal, and coolant inventory, and in terms of the form, sequencing and mode of dispersal through the primary vessel boundary. Fission product release was not to be considered as part of this study. Premature termination of the project, however, led to the discontinuation of work on an as-is basis. Work on the in-core phase, from the point of scram to core debris bed formation, was largely completed. A preliminary scoping calculation on the debris bed phase had been initiated. This report documents the status of the study at termination.

  4. A Solution to Fastest Distributed Consensus Problem for Generic Star & K-cored Star Networks

    CERN Document Server

    Jafarizadeh, Saber

    2012-01-01

    Distributed average consensus is the main mechanism in algorithms for decentralized computation. In the distributed average consensus algorithm, each node has an initial state, and the goal is to compute the average of these initial states in every node. To accomplish this task, each node updates its state with a weighted average of its own and its neighbors' states, using local communication between neighboring nodes. In networks with fixed topology, the convergence rate of the distributed average consensus algorithm depends on the choice of weights. This paper studies the weight optimization problem in the distributed average consensus algorithm. The network topology considered here is a star network where the branches have different lengths. Closed-form formulas for the optimal weights and the convergence rate of the algorithm are determined in terms of the network's topological parameters. Furthermore, a generic K-cored star topology is introduced as an alternative to the star topology. The introduced topology benefits from faster con...
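    The weighted-average update described in this abstract can be illustrated on a small star network. The sketch below uses simple Metropolis-type weights for a hub with three leaves, purely to show the update rule; it does not reproduce the paper's closed-form optimal weights:

    ```python
    def consensus_step(x, W):
        # One synchronous update: each node takes a weighted average
        # of its own state and its neighbors' states.
        n = len(x)
        return [sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]

    # Star network: hub = node 0, leaves = nodes 1..3.
    # Metropolis weights w_ij = 1/(1 + max(deg_i, deg_j)) on edges (illustrative choice).
    W = [
        [0.25, 0.25, 0.25, 0.25],
        [0.25, 0.75, 0.00, 0.00],
        [0.25, 0.00, 0.75, 0.00],
        [0.25, 0.00, 0.00, 0.75],
    ]

    x = [4.0, 0.0, 8.0, 0.0]          # initial states; true average is 3.0
    for _ in range(200):
        x = consensus_step(x, W)       # every state converges to the average
    ```

    Because this weight matrix is symmetric and doubly stochastic, the iteration provably converges to the global average; the paper's contribution is choosing the weights to make that convergence as fast as possible for star and K-cored star topologies.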

  5. Scaling-up the use of generic antiretrovirals in resource-limited countries: generic drugs for health.

    Science.gov (United States)

    Beck, Eduard J; Passarelli, Carlos; Lui, Iris; Guichard, Anne-Claire; Simao, Mariangela; De Lay, Paul; Loures, Luiz

    2014-01-01

    The number of people living with HIV (PLHIV) continues to increase around the world because of the increasing number on antiretroviral therapy (ART) and their associated increase of life expectancy, in addition to the number of people newly infected with HIV each year. Unless a 'cure' can be found for HIV infection, PLHIV can anticipate the need to take antiretroviral drugs (ARVs) for the rest of their lives. Because ARVs are now being used for HIV prevention, as well as for therapeutic purposes, the need for effective, affordable ARVs with few adverse effects will continue to rise. It is important to note that the dramatic growth in treatment coverage of PLHIV seen during the past decade has been primarily due to the increased use of generic ARVs. Thus, there will be a need to scale-up the research and development, production, distribution and access to generic ARVs and ART regimens. However, these processes must occur within national and international regulated free-market economic systems and must deal with increasingly multifaceted patent issues affecting the price while ensuring the quality of the ARVs. National and international regulatory mechanisms will have to evolve, which will affect broader national and international economic and trade issues. Because of the complexity of these issues, the Editors of this Supplement conceived of asking experts in their fields to describe the various steps from relevant research and development, to production of generic ARVs, their delivery to countries and subsequently to PLHIV in low- and middle-income countries. A main objective was to highlight how these steps are interrelated, how the production and delivery of these drugs to PLHIV in resource-limited countries can be made more effective and efficient, and what the lessons are for the production and delivery of a broader set of drugs to people in low- and middle-income countries.

  6. Saddle stresses for generic theories with a preferred acceleration scale

    CERN Document Server

    Magueijo, Joao

    2012-01-01

    We show how scaling arguments may be used to generate templates for the tidal stresses around saddles for a vast class of MONDian theories, detached from their obligations as dark matter alternatives. Such theories are to be seen simply as alternative theories of gravity with a preferred acceleration scale, and could be tested in the solar system by extending the LISA Pathfinder mission. The constraints thus obtained may then be combined, if one wishes, with requirements arising from astrophysical and cosmological applications, but a clear separation of the issues is achieved. The central technical content of this paper is the derivation of a scaling prescription allowing complex numerical work to be bypassed in the generation of templates.

  7. Methionine Oxidation Perturbs the Structural Core of the Prion Protein and Suggests a Generic Misfolding Pathway*

    Science.gov (United States)

    Younan, Nadine D.; Nadal, Rebecca C.; Davies, Paul; Brown, David R.; Viles, John H.

    2012-01-01

    Oxidative stress and misfolding of the prion protein (PrPC) are fundamental to prion diseases. We have therefore probed the effect of oxidation on the structure and stability of PrPC. Urea unfolding studies indicate that H2O2 oxidation reduces the thermodynamic stability of PrPC by as much as 9 kJ/mol. 1H-15N NMR studies indicate methionine oxidation perturbs key hydrophobic residues on one face of helix-C as follows: Met-205, Val-209, and Met-212 together with residues Val-160 and Tyr-156. These hydrophobic residues pack together and form the structured core of the protein, stabilizing its ternary structure. Copper-catalyzed oxidation of PrPC causes a more significant alteration of the structure, generating a monomeric molten globule species that retains its native helical content. Further copper-catalyzed oxidation promotes extended β-strand structures that lack a cooperative fold. This transition from the helical molten globule to β-conformation has striking similarities to a misfolding intermediate generated at low pH. PrP may therefore share a generic misfolding pathway to amyloid fibers, irrespective of the conditions promoting misfolding. Our observations support the hypothesis that oxidation of PrP destabilizes the native fold of PrPC, facilitating the transition to PrPSc. This study gives a structural and thermodynamic explanation for the high levels of oxidized methionine in scrapie isolates. PMID:22654104

  8. A Generic Length-scale Equation For Second-order Turbulence Models of Oceanic Boundary Layers

    Science.gov (United States)

    Umlauf, L.; Burchard, H.

    A generic transport equation for a generalized length-scale in second-order turbulence closure models for geophysical boundary layers is suggested. This variable consists of products of powers of the turbulent kinetic energy, k, and the integral length-scale, l. The new approach generalizes traditional second-order models used in geophysical boundary layer modelling, e.g. the Mellor-Yamada model and the k-ε model, which can be recovered as special cases. It is demonstrated how this new model can be calibrated with measurements in some typical geophysical boundary layer flows. As an example, the generic model is applied to the uppermost oceanic boundary layer directly influenced by the effects of breaking surface waves. Recent measurements show that in this layer the classical law of the wall is invalid, since turbulence there is dominated by turbulent transport of TKE from above, and not by shear production. A widely accepted approach to describe the wave-affected layer with a one-equation turbulence model was suggested by Craig and Banner (1994). Here, some deficiencies of their solutions are pointed out and a generalization of their ideas for the case of two-equation models is suggested. Direct comparison of very recently obtained measurements of the dissipation rate, ε, in the wave-affected boundary layer with computed results clearly demonstrates that only the generic two-equation model yields correct predictions for the profiles of ε and the turbulent length scale, l. Also, the predicted velocity profiles in the wave-affected layer, important e.g. for the interpretation of surface drifter experiments, are reproduced correctly only by the generic model. Implementation and computational costs of the generic model are comparable with traditional two-equation models.
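    The generalized length-scale variable is a product of powers of k and l; in the commonly cited formulation it is written ψ = (c_μ⁰)^p k^m l^n, where particular exponent choices recover specific closures (for instance ψ = ε for the k-ε model, with p = 3, m = 3/2, n = -1). A minimal sketch under those assumptions, with an illustrative value for c_μ⁰:

    ```python
    def psi(k, l, p=3.0, m=1.5, n=-1.0, cmu0=0.5477):
        # Generalized length-scale variable: psi = cmu0**p * k**m * l**n.
        # Defaults correspond to the k-epsilon choice (psi = dissipation rate).
        return cmu0 ** p * k ** m * l ** n

    def length_scale(k, psi_val, p=3.0, m=1.5, n=-1.0, cmu0=0.5477):
        # Invert psi to recover the integral length scale l from (k, psi).
        return (psi_val / (cmu0 ** p * k ** m)) ** (1.0 / n)
    ```

    Setting p = 0, m = 1, n = 1 instead gives ψ = k·l, the Mellor-Yamada variable, which is how the generic model recovers the traditional closures as special cases.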

  9. Classification as a generic tool for characterising status and changes of regional scale groundwater systems

    Science.gov (United States)

    Barthel, Roland; Haaf, Ezra

    2016-04-01

    Regional hydrogeology is becoming increasingly important, but at the same time, scientifically sound, universal solutions for typical groundwater problems encountered on the regional scale are hard to find. While managers, decision-makers and state agencies operating on regional and national levels have always shown a strong interest in regional scale hydrogeology, researchers from academia tend to avoid the subject, focusing instead on local scales. Additionally, hydrogeology has always had a tendency to regard every problem as unique to its own site- and problem-specific context. Regional scale hydrogeology is therefore pragmatic rather than aiming at developing generic methodology (Barthel, 2014; Barthel and Banzhaf, 2016). One of the main challenges encountered on the regional scale in hydrogeology is the extreme heterogeneity that generally increases with the size of the studied area - paired with relative data scarcity. Even in well-monitored regions of the world, groundwater observations are usually clustered, leaving large areas without any direct data. However, there are many good reasons for assessing the status and predicting the behavior of groundwater systems under conditions of global change even for those areas and aquifers without observations. This is typically done by using rather coarsely discretized and / or poorly parameterized numerical models, or by using very simplistic conceptual hydrological models that do not take into account the complex three-dimensional geological setup. Numerical models heavily rely on local data and are resource-demanding. Conceptual hydrological models only deliver reliable information on groundwater if the geology is extremely simple. In this contribution, we present an approach to derive statistically relevant information for un-monitored areas, making use of existing information from similar localities that are or have been monitored. The approach combines site-specific knowledge with conceptual assumptions on

  10. Measuring belief in conspiracy theories: The Generic Conspiracist Beliefs scale (GCB)

    Directory of Open Access Journals (Sweden)

    Robert eBrotherton

    2013-05-01

    The psychology of conspiracy theory beliefs is not yet well understood, although research indicates that there are stable individual differences in conspiracist ideation – individuals' general tendency to engage with conspiracy theories. Researchers have created several short self-report measures of conspiracist ideation. These measures largely consist of items referring to an assortment of prominent conspiracy theories regarding specific real-world events. However, these instruments have not been psychometrically validated, and this assessment approach suffers from practical and theoretical limitations. Therefore, we present the Generic Conspiracist Beliefs (GCB) scale: a novel measure of individual differences in generic conspiracist ideation. The scale was developed and validated across four studies. In Study 1, exploratory factor analysis of a novel 75-item measure of non-event-based conspiracist beliefs identified five conspiracist facets. The 15-item GCB scale was developed to sample from each of these themes. Studies 2, 3 and 4 examined the structure and validity of the GCB, demonstrating internal reliability, content, criterion-related, convergent and discriminant validity, and good test-retest reliability. In sum, this research indicates that the GCB is a psychometrically sound and practically useful measure of conspiracist ideation, and the findings add to our theoretical understanding of conspiracist ideation as a monological belief system underpinned by a relatively small number of generic assumptions about the typicality of conspiratorial activity in the world.

  11. Generic framework for meso-scale assessment of climate change hazards in coastal environments

    DEFF Research Database (Denmark)

    Appelquist, Lars Rosendahl

    2013-01-01

    This paper presents a generic framework for assessing inherent climate change hazards in coastal environments through a combined coastal classification and hazard evaluation system. The framework is developed to be used at scales relevant for regional and national planning and aims to cover all coastal environments worldwide through a specially designed coastal classification system containing 113 generic coastal types. The framework provides information on the degree to which key climate change hazards are inherent in a particular coastal environment, and covers the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding. The system includes a total of 565 individual hazard evaluations, each graduated into four different hazard levels based on a scientific literature review. The framework uses a simple assessment methodology with limited data…

  12. Generic, Extensible, Configurable Push-Pull Framework for Large-Scale Science Missions

    Science.gov (United States)

    Foster, Brian M.; Chang, Albert Y.; Freeborn, Dana J.; Crichton, Daniel J.; Woollard, David M.; Mattmann, Chris A.

    2011-01-01

    The push-pull framework was developed in hopes that an infrastructure would be created that could connect to any given remote site and, given a set of restrictions, download files from that remote site based on those restrictions. The Cataloging and Archiving Service (CAS) has recently been re-architected and re-factored in its canonical services, including file management, workflow management, and resource management. Additionally, a generic CAS Crawling Framework was built based on motivation from Apache's open-source search engine project, Nutch. Nutch is an Apache effort to provide search engine services (akin to Google), including crawling, parsing, content analysis, and indexing. It has produced several stable software releases, and is currently used in production services at companies such as Yahoo, and at NASA's Planetary Data System. The CAS Crawling Framework supports many of the Nutch crawler's generic services, including metadata extraction, crawling, and ingestion. However, one service that was not ported over from Nutch is a generic protocol layer service that allows the Nutch crawler to obtain content using protocol plug-ins that download content using implementations of remote protocols, such as HTTP, FTP, WinNT file system, HTTPS, etc. Such a generic protocol layer would greatly aid the CAS Crawling Framework, as it would allow the framework to generically obtain content (i.e., data products) from remote sites using protocols such as FTP and others. Augmented with this capability, the Orbiting Carbon Observatory (OCO) and NPP (NPOESS Preparatory Project) Sounder PEATE (Product Evaluation and Analysis Tools Elements) would be provided with an infrastructure to support generic FTP-based pull access to remote data products, obviating the need for any specialized software outside of the context of their existing process control systems. This extensible, configurable framework was created in Java, and allows the use of

  13. Scaling of Core Material in Rubble Mound Breakwater Model Tests

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Liu, Z.; Troch, P.

    1999-01-01

    The permeability of the core material influences armour stability, wave run-up and wave overtopping. The main problem related to the scaling of core materials in models is that the hydraulic gradient and the pore velocity vary in space and time. This makes it impossible to arrive at a fully correct scaling. The paper presents an empirical formula for the estimation of the wave-induced pressure gradient in the core, based on measurements in models and a prototype. The formula, together with the Forchheimer equation, can be used for the estimation of pore velocities in cores. The paper proposes that the diameter of the core material in models is chosen in such a way that the Froude scale law holds for a characteristic pore velocity. The characteristic pore velocity is chosen as the average velocity of a most critical area in the core with respect to porous flow. Finally, the method is demonstrated…
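    The two ingredients named here fit together in a simple way: the Forchheimer equation relates the hydraulic gradient i to the pore velocity u as i = a·u + b·u², with a and b empirical coefficients that depend on the core material, while Froude similarity requires velocities to scale with the square root of the geometric scale. A hedged sketch of those two relations (the paper's own pressure-gradient formula and coefficient values are not reproduced):

    ```python
    import math

    def pore_velocity(i, a, b):
        # Solve the Forchheimer equation i = a*u + b*u**2 for u >= 0
        # (positive root of the quadratic b*u**2 + a*u - i = 0).
        return (-a + math.sqrt(a * a + 4.0 * b * i)) / (2.0 * b)

    def froude_target_velocity(u_prototype, length_scale):
        # Froude similarity: model velocity = prototype velocity / sqrt(scale).
        return u_prototype / math.sqrt(length_scale)
    ```

    In the procedure the abstract outlines, one would estimate the prototype pore velocity in the critical core region, compute the Froude-scaled target velocity for the model, and then pick a model core diameter whose a and b coefficients reproduce that velocity under the model's hydraulic gradient.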

  14. Can manual ability be measured with a generic ABILHAND scale? A cross-sectional study conducted on six diagnostic groups

    Science.gov (United States)

    Arnould, Carlyne; Vandervelde, Laure; Batcho, Charles Sèbiyo; Penta, Massimo; Thonnard, Jean-Louis

    2012-01-01

    Objectives Several ABILHAND Rasch-built manual ability scales were previously developed for chronic stroke (CS), cerebral palsy (CP), rheumatoid arthritis (RA), systemic sclerosis (SSc) and neuromuscular disorders (NMD). The present study aimed to explore the applicability of a generic manual ability scale unbiased by diagnosis and to study the nature of manual ability across diagnoses. Design Cross-sectional study. Setting Outpatient clinic homes (CS, CP, RA), specialised centres (CP), reference centres (CP, NMD) and university hospitals (SSc). Participants 762 patients from six diagnostic groups: 103 CS adults, 113 CP children, 112 RA adults, 156 SSc adults, 124 NMD children and 124 NMD adults. Primary and secondary outcome measures Manual ability as measured by the ABILHAND disease-specific questionnaires, diagnosis and nature (ie, uni-manual or bi-manual involvement and proximal or distal joints involvement) of the ABILHAND manual activities. Results The difficulties of most manual activities were diagnosis dependent. A principal component analysis highlighted that 57% of the variance in the item difficulty between diagnoses was explained by the symmetric or asymmetric nature of the disorders. A generic scale was constructed, from a metric point of view, with 11 items sharing a common difficulty among diagnoses and 41 items displaying a category-specific location (asymmetric: CS, CP; and symmetric: RA, SSc, NMD). This generic scale showed that CP and NMD children had significantly less manual ability than RA patients, who had significantly less manual ability than CS, SSc and NMD adults. However, the generic scale was less discriminative and responsive to small deficits than disease-specific instruments. Conclusions Our finding that most of the manual item difficulties were disease-dependent emphasises the danger of using generic scales without prior investigation of item invariance across diagnostic groups. 
Nevertheless, a generic manual ability scale could be

  16. Development of a Dynamically Scaled Generic Transport Model Testbed for Flight Research Experiments

    Science.gov (United States)

    Jordan, Thomas; Langford, William; Belcastro, Christine; Foster, John; Shah, Gautam; Howland, Gregory; Kidd, Reggie

    2004-01-01

    This paper details the design and development of the Airborne Subscale Transport Aircraft Research (AirSTAR) test-bed at NASA Langley Research Center (LaRC). The aircraft is a 5.5% dynamically scaled, remotely piloted, twin-turbine, swept wing, Generic Transport Model (GTM) which will be used to provide an experimental flight test capability for research experiments pertaining to dynamics modeling and control beyond the normal flight envelope. The unique design challenges arising from the dimensional, weight, dynamic (inertial), and actuator scaling requirements necessitated by the research community are described along with the specific telemetry and control issues associated with a remotely piloted subscale research aircraft. Development of the necessary operational infrastructure, including operational and safety procedures, test site identification, and research pilots is also discussed. The GTM is a unique vehicle that provides significant research capacity due to its scaling, data gathering, and control characteristics. By combining data from this testbed with full-scale flight and accident data, wind tunnel data, and simulation results, NASA will advance and validate control upset prevention and recovery technologies for transport aircraft, thereby reducing vehicle loss-of-control accidents resulting from adverse and upset conditions.

  17. A generic trust framework for large-scale open systems using machine learning

    CERN Document Server

    Liu, Xin; Datta, Anwitaman

    2011-01-01

    In many large scale distributed systems and on the web, agents need to interact with other unknown agents to carry out some tasks or transactions. The ability to reason about and assess the potential risks in carrying out such transactions is essential for providing a safe and reliable environment. A traditional approach to reason about the trustworthiness of a transaction is to determine the trustworthiness of the specific agent involved, derived from the history of its behavior. As a departure from such traditional trust models, we propose a generic, machine learning approach based trust framework where an agent uses its own previous transactions (with other agents) to build a knowledge base, and utilize this to assess the trustworthiness of a transaction based on associated features, which are capable of distinguishing successful transactions from unsuccessful ones. These features are harnessed using appropriate machine learning algorithms to extract relationships between the potential transaction and prev...
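    The core idea, assessing a potential transaction from features of the agent's own past transactions rather than from the counterparty's reputation, can be sketched with any standard learner. The toy nearest-neighbour rule below is purely illustrative and is not the paper's specific algorithm:

    ```python
    def predict_trust(history, features, k=3):
        # history: list of (feature_vector, outcome) pairs from the agent's own
        # past transactions, where outcome is 1 (successful) or 0 (unsuccessful).
        # features: feature vector of the candidate transaction.
        def sq_dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))

        # Take the k most similar past transactions and average their outcomes.
        nearest = sorted(history, key=lambda h: sq_dist(h[0], features))[:k]
        return sum(outcome for _, outcome in nearest) / k
    ```

    For example, with a history whose successful transactions cluster near one region of feature space and failures near another, the returned score approximates the probability that the candidate transaction succeeds; the paper's framework uses richer features and proper machine learning algorithms to extract these relationships.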

  18. Effects of neutronics characteristics for a generic gas core reactor when selected parameters are changed

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Claudio Luiz de [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Nuclear

    2000-07-01

    The Battelle Revised Thermos code, which solves the integral neutron transport equation, is used to analyze a chosen system that includes materials and number densities typically encountered in gaseous core reactors. Selected parameters, such as the pressure (or number density, if the gas temperature is kept constant) of the hydrogen, and the temperature and material of the external moderator, are varied. (author)

  19. Scaling properties in the adsorption of ionic polymeric surfactants on generic nanoparticles of metallic oxides by mesoscopic simulation

    CERN Document Server

    Mayoral, E

    2014-01-01

    We study the scaling of adsorption isotherms of polyacrylic dispersants on generic surfaces of metallic oxides X_nO_m as a function of the number of monomeric units, using Electrostatic Dissipative Particle Dynamics simulations. The simulations show how the scaling properties in these systems emerge and how the isotherms rescale to a universal curve, reproducing reported experimental results. The critical exponent for these systems is also obtained, in perfect agreement with the scaling theory of de Gennes. Some important applications are mentioned.

  20. Radiological consequence assessments of degraded core accident scenarios derived from a generic Level 2 PSA of a BWR

    Energy Technology Data Exchange (ETDEWEB)

    Homma, Toshimitsu; Ishikawa, Jun; Tomita, Kenichi; Muramatsu, Ken [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2000-12-01

    Radiological consequence assessments have been made of postulated core damage accidents with source terms derived from a generic Level 2 PSA of a BWR carried out by the Japan Atomic Energy Research Institute (JAERI). The source terms used were for five core damage accident sequences with drywell and wetwell failure cases, a release control case by venting of the containment, and an accident termination case by the containment spray. The radiological consequences were assessed for individual dose, collective dose, individual risk of early health effects and individual risk of late health effects using OSCAAR, a probabilistic accident consequence assessment code developed at JAERI. The following conclusions were obtained for the assumed source terms. In the case of overpressure failures of the primary containment vessel, early fatalities can be mitigated through the implementation of early countermeasures, and late cancer fatalities remain small. For the release control and accident termination cases, the individual and collective doses to the public can be reduced without any countermeasures due to the reduced release of volatile radionuclides such as iodine and cesium. (author)

  1. Psychometric performance of a generic walking scale (Walk-12G) in multiple sclerosis and Parkinson's disease.

    Science.gov (United States)

    Bladh, Stina; Nilsson, Maria H; Hariz, Gun-Marie; Westergren, Albert; Hobart, Jeremy; Hagell, Peter

    2012-04-01

    Walking difficulties are common in neurological and other disorders, as well as among the elderly. There is a need for reliable and valid instruments for measuring walking difficulties in everyday life, since existing gait tests are clinician-rated and focus on situation-specific capacity. The Walk-12G was adapted from the 12-item Multiple Sclerosis Walking Scale as a generic patient-reported rating scale for walking difficulties in everyday life. The aim of this study was to examine the psychometric properties of the Walk-12G in people with multiple sclerosis (MS) and Parkinson's disease (PD). The Walk-12G was translated into Swedish and evaluated qualitatively among 25 people with and without various neurological and other conditions. Postal survey (MS, n = 199; PD, n = 189) and clinical (PD, n = 36) data were used to test its psychometric properties. Respondents considered the Walk-12G relevant and easy to use. Mean completion time was 3.5 min. Data completeness was good (0.6). Coefficient alpha and test-retest reliabilities were >0.9, and standard errors of measurement were 2.3-2.8. Construct validity was supported by correlations in accordance with a priori expectations. Results are similar to those with previous Walk-12G versions, indicating that the scale adaptation was successful. The data suggest that the Walk-12G meets rating scale criteria for clinical trials, making it a valuable complement to available gait tests. Further studies involving other samples and application of modern psychometric methods are warranted to examine the scale in more detail.
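    The standard errors of measurement quoted above follow from the scale's standard deviation and its reliability via SEM = SD·√(1 − r). A one-line sketch with illustrative numbers (not the study's actual SD values):

    ```python
    import math

    def sem(sd, reliability):
        # Standard error of measurement: SEM = SD * sqrt(1 - reliability).
        # Smaller SEM means individual scores are measured more precisely.
        return sd * math.sqrt(1.0 - reliability)
    ```

    With a reliability above 0.9, as reported for the Walk-12G, the SEM is at most about a third of the scale's standard deviation, which is what makes the 2.3-2.8 values plausible for a patient-reported scale of this length.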

  2. Generic and low dose antiretroviral therapy in adults and children: implication for scaling up treatment in resource limited settings

    Directory of Open Access Journals (Sweden)

    Ramautarsing Reshmie

    2010-06-01

    Abstract Although access to antiretroviral therapy (ART) for the treatment of HIV has increased during the last decade, many patients are still in need of treatment. With limited funds to provide ART to millions of patients worldwide, there is a need for alternative ways to scale up ART in resource-limited settings. This review provides an overview of pharmacokinetic, safety and efficacy studies of generic and reduced-dose ART. The production of generic ART has greatly influenced the decline in drug prices and the increase in ART access. Generic ART has a good pharmacokinetic profile, safety and efficacy. Toxicity is, however, the main cause of ART discontinuation. Several dose-reduction studies have shown adequate pharmacokinetic parameters and short-term efficacy with reduced-dose ART. Ethnicity may affect drug metabolism; several pharmacokinetic studies have confirmed higher plasma ART concentrations in Asians. A randomized efficacy trial of reduced versus standard-dose ART is warranted.

  3. Fine-scale heterogeneity in the Earth's inner core

    Science.gov (United States)

    Vidale; Earle

    2000-03-16

    The seismological properties of the Earth's inner core have become of particular interest as we understand more about its composition and thermal state. Observations of anisotropy and velocity heterogeneity in the inner core are beginning to reveal how it has grown and whether it convects. The attenuation of seismic waves in the inner core is strong, and studies of seismic body waves have found that this high attenuation is consistent with either scattering or intrinsic attenuation. The outermost portion of the inner core has been inferred to possess layering and to be less anisotropic than at greater depths. Here we present observations of seismic waves scattered in the inner core which follow the expected arrival time of the body-wave reflection from the inner-core boundary. The amplitude of these scattered waves can be explained by stiffness variations of 1.2% with a scale length of 2 kilometres across the outermost 300 km of the inner core. These variations might be caused by variations in composition, by pods of partial melt in a mostly solid matrix or by variations in the orientation or strength of seismic anisotropy.

  4. Identification of candidate categories of the International Classification of Functioning, Disability and Health (ICF) for a Generic ICF Core Set based on regression modelling

    Directory of Open Access Journals (Sweden)

    Üstün Bedirhan T

    2006-07-01

    Full Text Available Abstract Background The International Classification of Functioning, Disability and Health (ICF) is the framework developed by WHO to describe functioning and disability at both the individual and population levels. While condition-specific ICF Core Sets are useful, a Generic ICF Core Set is needed to describe and compare problems in functioning across health conditions. Methods The aims of the multi-centre, cross-sectional study presented here were: (a) to propose a method to select ICF categories when a large amount of ICF-based data has to be handled, and (b) to identify candidate ICF categories for a Generic ICF Core Set by examining their explanatory power in relation to item one of the SF-36. The data were collected from 1039 patients using the ICF checklist, the SF-36 and a Comorbidity Questionnaire. ICF categories to be entered in an initial regression model were selected following systematic steps in accordance with the ICF structure. Based on an initial regression model, additional models were designed by systematically substituting the ICF categories included in it with ICF categories with which they were highly correlated. Results Fourteen different regression models were performed. The variance accounted for by these models ranged from 22.27% to 24.0%. The ICF category that explained the highest amount of variance in all the models was sensation of pain. In total, thirteen candidate ICF categories for a Generic ICF Core Set were proposed. Conclusion The selection strategy based on the ICF structure and the examination of the best possible alternative models does not provide a final answer about which ICF categories must be considered, but leads to a selection of suitable candidates which needs further consideration and comparison with the results of other selection strategies in developing a Generic ICF Core Set.

  5. Generic experiments at the sump model 'Zittauer Stroemungswanne' (ZSW) for the behaviour of mineral wool in the sump and the reactor core

    Energy Technology Data Exchange (ETDEWEB)

    Alt, Soeren; Hampel, R.; Kaestner, Wolfgang [Hochschule Zittau/Goerlitz (DE)] (and others)

    2011-03-15

    The investigation of insulation debris transport, sedimentation, penetration into the reactor core and head-loss build-up becomes important to reactor safety research for PWR and BWR when considering the long-term behaviour of emergency core cooling systems during loss-of-coolant accidents. Research projects are being performed in cooperation between the University of Applied Sciences Zittau/Goerlitz and the Helmholtz-Zentrum Dresden-Rossendorf. The projects include experimental investigations of different processes and phenomena of insulation debris in coolant flow and the development of CFD models. Generic complex experiments serve to build up a data base for the validation of models for single effects and their coupling in CFD codes. This paper includes a description of the experimental facility for complex generic experiments (ZSW), an overview of experimental boundary conditions and results for upstream and downstream phenomena, as well as the long-term behaviour due to corrosive processes. (orig.)

  6. Validation of the generic medical interview satisfaction scale: the G-MISS questionnaire.

    Science.gov (United States)

    Maurice-Szamburski, Axel; Michel, Pierre; Loundou, Anderson; Auquier, Pascal

    2017-02-14

    Patients have about seven medical consultations a year. Despite the importance of medical interviews in the healthcare process, there is no generic instrument to assess patients' experiences in general practices, medical specialties, and surgical specialties. The main objective was to validate a questionnaire assessing patients' experiences with medical consultations in various practices. The G-MISS study was a prospective multi-center trial that enrolled patients from May to July 2016. A total of 2055 patients were included from general practices, medical specialties, and surgical specialties. Patients filled out a questionnaire assessing various aspects of their experience and satisfaction within 1 week after their medical interview. The validation process relied on item response theory. Internal validity was examined using exploratory factorial analysis. The statistical model used the root mean square error of approximation, confirmatory fit index, and standard root mean square residual as fit indices. Scalability and reliability were assessed with the Rasch model and Cronbach's alpha coefficients, respectively. Scale properties across the three subgroups were explored with differential item functioning. The G-MISS final questionnaire contained 16 items, structured in three dimensions of patients' experiences: "Relief", "Communication", and "Compliance". A global index of patients' experiences was computed as the mean of the dimension scores. All fit indices from the statistical model were satisfactory (RMSEA = 0.03, CFI = 0.98, SRMR = 0.06). The overall scalability had a good fit to the Rasch model. Each dimension was reliable, with Cronbach's alpha ranging from 0.73 to 0.86. Differential item functioning across the three consultation settings was negligible. Patients undergoing medical or surgical specialties reported higher scores in the "Relief" dimension compared with general practice (83.0 ± 11.6 or 82.4 ± 11.6 vs. 73.2 ± 16
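
    The reliability figures quoted above (Cronbach's alpha from 0.73 to 0.86 per dimension) can be reproduced from raw item scores in a few lines. A minimal sketch, using hypothetical scores rather than the G-MISS data:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows of equal-length item scores."""
    k = len(items[0])                                   # number of items
    item_vars = sum(variance(col) for col in zip(*items))
    total_var = variance([sum(row) for row in items])   # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-respondent, 4-item dimension (illustrative only)
scores = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
alpha = cronbach_alpha(scores)  # ≈ 0.914 for this toy matrix
```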

  7. A new generic approach for estimating the concentrations of down-the-drain chemicals at catchment and national scale

    Energy Technology Data Exchange (ETDEWEB)

    Keller, V.D.J. [Centre for Ecology and Hydrology, Hydrological Risks and Resources, Maclean Building, Crowmarsh Gifford, Wallingford OX10 8BB (United Kingdom)]. E-mail: vke@ceh.ac.uk; Rees, H.G. [Centre for Ecology and Hydrology, Hydrological Risks and Resources, Maclean Building, Crowmarsh Gifford, Wallingford OX10 8BB (United Kingdom); Fox, K.K. [University of Lancaster (United Kingdom); Whelan, M.J. [Unilever Safety and Environmental Assurance Centre, Colworth (United Kingdom)

    2007-07-15

    A new generic approach for estimating chemical concentrations in rivers at catchment and national scales is presented. Domestic chemical loads in waste water are estimated using gridded population data. River flows are estimated by combining predicted runoff with topographically derived flow direction. Regional scale exposure is characterised by two summary statistics: PEC_works, the average concentration immediately downstream of emission points, and PEC_area, the catchment-average chemical concentration. The method was applied to boron at national (England and Wales) and catchment (Aire-Calder) scales. Predicted concentrations were within 50% of measured mean values in the Aire-Calder catchment and in agreement with results from the GREAT-ER model. The concentration grids generated provide a picture of the spatial distribution of expected chemical concentrations at various scales, and can be used to identify areas of potentially high risk. - A new grid-based approach to predict spatially-referenced freshwater concentrations of domestic chemicals.
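
    The two summary statistics are, at heart, dilution calculations: a chemical load divided by a receiving flow. The sketch below is a hypothetical illustration of a PEC_works-style estimate, not the paper's actual gridded formulation; the per-capita load and river flow figures are invented for the example:

```python
def pec_works(population, per_capita_load_g_day, river_flow_m3_s,
              removal_fraction=0.0):
    """Concentration (mg/L) immediately downstream of an emission point:
    domestic chemical load in waste water diluted by the receiving river flow."""
    load_mg_s = population * per_capita_load_g_day * 1000.0 / 86400.0
    flow_l_s = river_flow_m3_s * 1000.0
    return load_mg_s * (1.0 - removal_fraction) / flow_l_s

# 50,000 people discharging 0.6 g/person/day into a river flowing at 2 m^3/s
conc = pec_works(50_000, 0.6, 2.0)  # ≈ 0.174 mg/L
```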

  8. On the performance of a generic length scale turbulence model within an adaptive finite element ocean model

    Science.gov (United States)

    Hill, Jon; Piggott, M. D.; Ham, David A.; Popova, E. E.; Srokosz, M. A.

    2012-10-01

    Research into the use of unstructured mesh methods for ocean modelling has been growing steadily in the last few years. One advantage of using unstructured meshes is that one can concentrate resolution where it is needed. In addition, dynamic adaptive mesh optimisation (DAMO) strategies allow resolution to be concentrated when this is required. Despite the advantage that DAMO gives in terms of improving the spatial resolution where and when required, small-scale turbulence in the oceans still requires parameterisation. A two-equation, generic length scale (GLS) turbulence model (one equation for turbulent kinetic energy and another for a generic turbulence length-scale quantity) adds this parameterisation and can be used in conjunction with adaptive mesh techniques. In this paper, an implementation of the GLS turbulence parameterisation is detailed in a non-hydrostatic, finite-element, unstructured mesh ocean model, Fluidity-ICOM. The implementation is validated by comparing to both a laboratory-scale experiment and real-world observations, on both fixed and adaptive meshes. The model performs well, matching laboratory and observed data, with resolution being adjusted as necessary by DAMO. Flexibility in the prognostic fields used to construct the error metric used in DAMO is required to ensure best performance. Moreover, the adaptive mesh models perform as well as fixed mesh models in terms of root mean square error to observation or theoretical mixed layer depths, but use fewer elements and hence have a reduced computational cost.

  9. Implementation of a reference-scaled average bioequivalence approach for highly variable generic drug products by the US Food and Drug Administration.

    Science.gov (United States)

    Davit, Barbara M; Chen, Mei-Ling; Conner, Dale P; Haidar, Sam H; Kim, Stephanie; Lee, Christina H; Lionberger, Robert A; Makhlouf, Fairouz T; Nwakama, Patrick E; Patel, Devvrat T; Schuirmann, Donald J; Yu, Lawrence X

    2012-12-01

    Highly variable (HV) drugs are defined as those for which within-subject variability (%CV) in bioequivalence (BE) measures is 30% or greater. Because of this high variability, studies designed to show whether generic HV drugs are bioequivalent to their corresponding HV reference drugs may need to enroll large numbers of subjects even when the products have no significant mean differences. To avoid unnecessary human testing, the US Food and Drug Administration's Office of Generic Drugs developed a reference-scaled average bioequivalence (RSABE) approach, whereby the BE acceptance limits are scaled to the variability of the reference product. For an acceptable RSABE study, an HV generic drug product must meet the scaled BE limit and a point estimate constraint. The approach has been implemented successfully. To date, the RSABE approach has supported four full approvals and one tentative approval of HV generic drug products.
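
    The core of the RSABE idea is that the BE acceptance limits widen with the reference product's within-subject variability, while a point estimate constraint (0.80-1.25) is retained. The sketch below uses the widely cited regulatory constant ln(1.25)/0.25; it is a simplified numerical illustration of the scaling, not the full FDA statistical procedure (which works with a confidence bound on a linearized scaled criterion):

```python
import math

REG_CONST = math.log(1.25) / 0.25   # regulatory constant, ≈ 0.893

def rsabe_limits(s_wr):
    """Scaled BE limits for a reference within-subject SD s_wr (log scale)."""
    half_width = REG_CONST * s_wr
    return math.exp(-half_width), math.exp(half_width)

def point_estimate_ok(gmr):
    # The point estimate constraint is kept no matter how far the limits scale
    return 0.80 <= gmr <= 1.25

# A CV of 30% (the HV threshold) corresponds to s_wr = sqrt(ln(1 + 0.30**2))
s_wr = math.sqrt(math.log(1 + 0.30 ** 2))
lo, hi = rsabe_limits(s_wr)  # already slightly wider than 0.80-1.25
```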

  10. Generic Mechanism of Optimal Energy Transfer Efficiency: A Scaling Theory of the Mean First Passage Time in Exciton Systems

    CERN Document Server

    Wu, Jianlan; Silbey, Robert J

    2013-01-01

    An asymptotic scaling theory is presented using the conceptual basis of trapping-free subspace (i.e., orthogonal subspace) to establish the generic mechanism of optimal efficiency of excitation energy transfer (EET) in light-harvesting systems. Analogous to Kramers' turnover in classical rate theory, the enhanced efficiency in the weak damping limit and the suppressed efficiency in the strong damping limit define two asymptotic scaling regimes, which are interpolated to predict the functional form of optimal efficiency of the trapping-free subspace. In the presence of static disorder, the scaling law of transfer time with respect to dephasing rate changes from linear to square root, suggesting a weaker dependence on the environment. Though formulated in the context of EET, the analysis and conclusions apply in general to open quantum processes, including electron transfer, fluorescence emission, and heat conduction.

  11. Design of a Generic and Flexible Data Structure for Efficient Formulation of Large Scale Network Problems

    DEFF Research Database (Denmark)

    Quaglia, Alberto; Sarup, Bent; Sin, Gürkan;

    2013-01-01

    The formulation of Enterprise-Wide Optimization (EWO) problems as mixed integer nonlinear programming requires collecting, consolidating and systematizing a large amount of data, coming from different sources and specific to different disciplines. In this manuscript, a generic and flexible data structure for efficient formulation of enterprise-wide optimization problems is presented. Through the integration of the described data structure in our synthesis and design framework, the problem formulation workflow is automated in a software tool, reducing the time and resources needed to formulate large problems, while ensuring at the same time data consistency and quality at the application stage.

  12. Small scale folding observed in the NEEM ice core

    Science.gov (United States)

    Jansen, Daniela; Llorens, Maria-Gema; Westhoff, Julien; Steinbach, Florian; Bons, Paul D.; Kipfstuhl, Sepp; Griera, Albert; Weikusat, Ilka

    2015-04-01

    Disturbances on the centimeter scale in the layering of the NEEM ice core (North Greenland) can be mapped by means of visual stratigraphy as long as the ice has visible layering, such as, for example, cloudy bands. Different focal depths of the visual stratigraphy method allow, to a certain extent, a three-dimensional view of the structures. In this study we present a structural analysis of the visible folds, discuss their characteristics and frequency, and present examples of typical fold structures. With this study we aim to quantify the potential impact of small-scale folding on the integrity of climate proxy data. We also analyze the structures with regard to the stress environment under which they formed. The structures evolve from gentle waves at about 1700 m to overturned z-folds with increasing depth. Occasionally, the folding causes significant thickening of layers. Their shape indicates that they are passive features and are probably not initiated by rheology differences between layers. Layering is heavily disturbed and tracing of single layers is no longer possible below a depth of 2160 m. Lattice orientation distributions for the corresponding core sections were analyzed where available in addition to visual stratigraphy. The data show axial-plane-parallel strings of grains with c-axis orientations that deviate from that of the matrix, which has a more or less single-maximum fabric at the depth where the folding occurs. We conclude from these data that folding is a consequence of deformation along localized shear planes and kink bands. The findings are compared with results from other deep ice cores. The observations presented are supplemented by microstructural modeling using a crystal plasticity code that reproduces deformation, applying a Fast Fourier Transform (FFT), coupled with ELLE to include dynamic recrystallization processes. The model results reproduce the development of bands of grains with a tilted orientation relative to the single maximum

  13. A Bioequivalence Approach for Generic Narrow Therapeutic Index Drugs: Evaluation of the Reference-Scaled Approach and Variability Comparison Criterion.

    Science.gov (United States)

    Jiang, Wenlei; Makhlouf, Fairouz; Schuirmann, Donald J; Zhang, Xinyuan; Zheng, Nan; Conner, Dale; Yu, Lawrence X; Lionberger, Robert

    2015-07-01

    Various health communities have expressed concerns regarding whether average bioequivalence (BE) limits (80.00-125.00%) for the 90% confidence interval of the test-to-reference geometric mean ratio are sufficient to ensure therapeutic equivalence between a generic narrow therapeutic index (NTI) drug and its reference listed drug (RLD). Simulations were conducted to investigate the impact of different BE approaches for NTI drugs on study power, including (1) direct tightening of average BE limits and (2) a scaled average BE approach where BE limits are tightened based on the RLD's within-subject variability. Addition of a variability comparison (using a one-tailed F test) increased the difficulty for generic NTIs more variable than their corresponding RLDs to demonstrate bioequivalence. Based on these results, the authors evaluate the fully replicated, 2-sequence, 2-treatment, 4-period crossover study design for NTI drugs where the test product demonstrates BE based on a scaled average bioequivalence criterion and a within-subject variability comparison criterion.

  14. Validation Results for Core-Scale Oil Shale Pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Staten, Josh; Tiwari, Pankaj

    2015-03-01

    This report summarizes a study of oil shale pyrolysis at various scales and the subsequent development of a model for in situ production of oil from oil shale. Oil shale from the Mahogany zone of the Green River formation was used in all experiments. Pyrolysis experiments were conducted at four scales: powdered samples (100 mesh) and core samples of 0.75”, 1” and 2.5” diameters. The batch, semi-batch and continuous-flow pyrolysis experiments were designed to study the effect of temperature (300°C to 500°C), heating rate (1°C/min to 10°C/min), pressure (ambient and 500 psig) and size of the sample on product formation. Comprehensive analyses were performed on reactants and products - liquid, gas and spent shale. These experimental studies were designed to understand the relevant coupled phenomena (reaction kinetics, heat transfer, mass transfer, thermodynamics) at multiple scales. A model for oil shale pyrolysis was developed in the COMSOL multiphysics platform. A general kinetic model was integrated with important physical and chemical phenomena that occur during pyrolysis. The secondary reactions of coking and cracking in the product phase were addressed. The multiscale experimental data generated and the models developed provide an understanding of the simultaneous effects of chemical kinetics, and heat and mass transfer on oil quality and yield. The comprehensive data collected in this study will help advance the move to large-scale in situ oil production from the pyrolysis of oil shale.
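
    The strong temperature sensitivity that such experiments probe can be illustrated with a generic first-order Arrhenius decomposition model. The pre-exponential factor and activation energy below are typical order-of-magnitude values for kerogen, not the parameters fitted in this study:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def kerogen_conversion(T_kelvin, t_seconds, A=1e13, Ea=2.1e5):
    """Fraction of kerogen converted after t seconds at constant temperature,
    assuming a single first-order reaction with an Arrhenius rate constant."""
    k = A * math.exp(-Ea / (R * T_kelvin))
    return 1.0 - math.exp(-k * t_seconds)

# One hour at 450 °C versus 300 °C: near-complete versus negligible conversion
high = kerogen_conversion(723.15, 3600)
low = kerogen_conversion(573.15, 3600)
```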

  15. Generic and Layered Framework Components for the Control of a Large Scale Data Acquisition System

    CERN Document Server

    Köstner, S; Charlet, D; Fontanelli, F; Frank, M; Gaspar, C; Haefeli, G; Jacobsson, R; Jost, B; Mini, G; Neufeld, N; Nogueira, R; Potterat, C; Robbe, P; Sannino, M; Videau, I

    2008-01-01

    The complexity of today's experiments in High Energy Physics results in a large number of readout channels, which can reach a million and above. The experiments in general consist of various subsystems which themselves comprise a large number of detectors requiring sophisticated DAQ and readout electronics. We report here on the structured software layers used to control such a data acquisition system in the case of LHCb, one of the four experiments at the LHC. Additional focus is given to the protocols in use as well as to the required hardware. An abstraction layer was implemented to allow access to the different and distinct hardware types in a coherent and generic manner. The hierarchical structure which allows commands to be propagated down to the subsystems is explained. Via finite state machines, an expert system with auto-recovery abilities can be modeled.

  16. Can key vegetation parameters be retrieved at the large-scale using LAI satellite products and a generic modelling approach ?

    Science.gov (United States)

    Dewaele, Helene; Calvet, Jean-Christophe; Carrer, Dominique; Laanaia, Nabil

    2016-04-01

    In the context of climate change, the need to assess and predict the impact of droughts on vegetation and water resources increases. Generic approaches permitting the modelling of continental surfaces at large scale have progressed in recent decades towards land surface models able to couple the cycles of water, energy and carbon. A major source of uncertainty in these generic models is the maximum available water content of the soil (MaxAWC) usable by plants, which is constrained by the rooting depth parameter and is unobservable at the large scale. In this study, vegetation products derived from SPOT/VEGETATION satellite data available since 1999 are used to optimize the model rooting depth over rainfed croplands and permanent grasslands at 1 km x 1 km resolution. The inter-annual variability of the Leaf Area Index (LAI) is simulated over France using the Interactions between Soil, Biosphere and Atmosphere, CO2-reactive (ISBA-A-gs) generic land surface model and a two-layer force-restore (FR-2L) soil profile scheme. The leaf nitrogen concentration directly impacts the modelled value of the maximum annual LAI. In a first step, this parameter is estimated for the last 15 years by using an iterative procedure that matches the maximum values of LAI modelled by ISBA-A-gs to the highest satellite-derived LAI values. The Root Mean Square Error (RMSE) is used as a cost function to be minimized. In a second step, the model rooting depth is optimized in order to reproduce the inter-annual variability resulting from the drought impact on the vegetation. The evaluation of the retrieved soil rooting depth is achieved using the French agricultural statistics of Agreste. Retrieved leaf nitrogen concentrations are compared with values from previous studies. The preliminary results show a good potential of this approach to estimate these two vegetation parameters (leaf nitrogen concentration, MaxAWC) at the large scale over grassland areas. Besides, a marked impact of the
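
    The iterative matching step described in the abstract amounts to choosing the parameter value that minimizes an RMSE cost between modelled and satellite-derived maxima. A toy grid-search skeleton of that idea (the 'model' and all numbers are invented; ISBA-A-gs itself is far more involved):

```python
import math

def rmse(pred, obs):
    """Root mean square error between two equal-length sequences."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def fit_parameter(candidates, model, observed):
    # Pick the candidate value minimizing the RMSE cost function
    return min(candidates, key=lambda p: rmse(model(p), observed))

# Toy model: annual maximum LAI proportional to a leaf-nitrogen-like parameter
observed_max_lai = [3.0, 3.2, 2.8]
model = lambda n: [n * s for s in (1.0, 1.05, 0.95)]
best_n = fit_parameter([2.5, 3.0, 3.5], model, observed_max_lai)  # → 3.0
```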

  17. Generic mechanism of optimal energy transfer efficiency: a scaling theory of the mean first-passage time in exciton systems.

    Science.gov (United States)

    Wu, Jianlan; Silbey, Robert J; Cao, Jianshu

    2013-05-17

    An asymptotic scaling theory is presented using the conceptual basis of trapping-free subspace (i.e., orthogonal subspace) to establish the generic mechanism of optimal efficiency of excitation energy transfer in light-harvesting systems. A quantum state orthogonal to the trap will exhibit noise-assisted transfer, clarifying the significance of initial preparation. For such an initial state, the efficiency is enhanced in the weak damping limit (⟨t⟩ ∼ 1/Γ), and suppressed in the strong damping limit (⟨t⟩ ∼ Γ), analogous to Kramers turnover in classical rate theory. An interpolating expression ⟨t⟩ = A/Γ + B + CΓ quantitatively describes the trapping time over the entire range of the dissipation strength, and predicts the optimal efficiency at Γ(opt) ∼ J for homogenous systems. In the presence of static disorder, the scaling law of transfer time with respect to dephasing rate changes from linear to square root, suggesting a weaker dependence on the environment. The prediction of the scaling theory is verified in a symmetric dendrimer system by numerically exact quantum calculations. Though formulated in the context of excitation energy transfer, the analysis and conclusions apply in general to open quantum processes, including electron transfer, fluorescence emission, and heat conduction.
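
    The interpolating expression quoted above makes the optimum easy to locate analytically: minimizing ⟨t⟩ = A/Γ + B + CΓ gives Γ_opt = √(A/C). A small sketch with made-up coefficients:

```python
import math

def mean_trapping_time(gamma, A, B, C):
    """Interpolating form <t> = A/gamma + B + C*gamma from the abstract."""
    return A / gamma + B + C * gamma

def optimal_gamma(A, C):
    # d<t>/dgamma = -A/gamma**2 + C = 0  ->  gamma_opt = sqrt(A/C)
    return math.sqrt(A / C)

# Illustrative coefficients (hypothetical, not fitted to any real system)
A, B, C = 2.0, 0.5, 0.125
g_opt = optimal_gamma(A, C)                 # 4.0
t_min = mean_trapping_time(g_opt, A, B, C)  # 1.5
```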

  18. Fitting the Generic Multi-Parameter Crossover Model: Towards Realistic Scaling Estimates

    NARCIS (Netherlands)

    Struzik, Z.R.; Dooijes, E.H.; Groen, F.C.A.

    1997-01-01

    The primary concern of fractal metrology is providing a means of reliable estimation of scaling exponents such as fractal dimension, in order to prove the null hypothesis that a particular object can be regarded as fractal. In the particular context to be discussed in this contribution, the central

  19. The relevance of non-generic events in scale space models

    NARCIS (Netherlands)

    Kuijper, Arjan; Florack, L.

    2002-01-01

    In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of spatial critical points under the influence of blurring. We show how the mathematical framework of catastrophe theory can be used to describe and model the behaviour of critical point

  20. A generic trust framework for large-scale open systems using machine learning

    OpenAIRE

    Liu, Xin; Tredan, Gilles; Datta, Anwitaman

    2011-01-01

    In many large scale distributed systems and on the web, agents need to interact with other unknown agents to carry out some tasks or transactions. The ability to reason about and assess the potential risks in carrying out such transactions is essential for providing a safe and reliable environment. A traditional approach to reason about the trustworthiness of a transaction is to determine the trustworthiness of the specific agent involved, derived from the history of its behavior. As a depart...

  1. Effect of wettability on scale-up of multiphase flow from core-scale to reservoir fine-grid-scale

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y.C.; Mani, V.; Mohanty, K.K. [Univ. of Houston, TX (United States)

    1997-08-01

    Typical field simulation grid-blocks are internally heterogeneous. The objective of this work is to study how the wettability of the rock affects the scale-up of multiphase flow properties from core scale to fine-grid reservoir simulation scale (approximately 10′ x 10′ x 5′). Reservoir models need another level of upscaling to coarse-grid simulation scale, which is not addressed here. Heterogeneity is modeled here as a correlated random field parameterized in terms of its variance and two-point variogram. Variogram models of both finite (spherical) and infinite (fractal) correlation length are included as special cases. Local core-scale porosity, permeability, capillary pressure function, relative permeability functions, and initial water saturation are assumed to be correlated. Water injection is simulated and effective flow properties and flow equations are calculated. For strongly water-wet media, capillarity has a stabilizing/homogenizing effect on multiphase flow. For small variance in permeability, and for small correlation length, effective relative permeability can be described by capillary equilibrium models. At higher variance and moderate correlation length, the average flow can be described by a dynamic relative permeability. As the oil wettability increases, the capillary stabilizing effect decreases and the deviation from this average flow increases. For fractal fields with large variance in permeability, effective relative permeability is not adequate in describing the flow.

  2. How is the kinematic structure connected from the filament scale to the core scale? Mopra multi-line mapping observations of dense cores in Lupus I

    Science.gov (United States)

    Kiyokane, Kazuhiro; Saito, Masao; Tachihara, Kengo; Saigo, Kazuya; van Kempen, Tim; Cortes, Paulo; Hill, Tracey; Knee, Lewis; Kurono, Yasutaka; Takahashi, Satoko; Aya, Higuchi; Nyman, Lars-Ake

    2014-06-01

    Recently, high-sensitivity mappings of nearby molecular clouds in far-infrared and submillimeter wavelengths with Herschel and AzTEC/ASTE have shown the ubiquitous existence of filamentary structures with a uniform 0.1-pc width. It is important to investigate dense core formation from large-scale structure via fragmentation. We have conducted Mopra multi-line mapping observations covering 0.02 - 0.2 pc scales of 8 dense cores in a filamentary cloud of nearby Lupus I at 140 pc. A Class 0/I protostellar core, IRAS 15398-3359, is included as a sample, which has an adjacent prestellar core with a separation of 0.13 pc in the west. The maps of N2H+, HNC, and HC3N are well associated with each core. The velocity field of C18O shows a gradient of 1.4 km/s/pc from north to south over the region containing two dense cores, which is consistent with past observations with NANTEN. In contrast to the C18O results, the velocity field of HC3N shows different structures, which suggest counter rotation of the two dense cores: 1.2 km/s/pc from north-west to south-east around the protostellar core and 0.8 km/s/pc from east to west around the prestellar core. A filament will fragment and collapse into dense cores when its line density exceeds 2Cs²/G (where Cs is the sound speed and G is the gravitational constant). If the observed velocity gradient were caused by such a process, it should be red-blue-red-blue across the two dense cores, but the observed kinematics is not consistent with this scenario, which would require the filament structure to be extremely curved with a skew angle. Although we cannot reject the collapse interpretation, these results suggest a spin-up rotation picture decoupled from the large-scale structure.
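
    The fragmentation threshold invoked above is the standard critical line mass of an isothermal filament, 2c_s²/G. A quick back-of-the-envelope evaluation for typical molecular gas (T = 10 K and a mean molecular weight of 2.33, both assumed values, not taken from this abstract):

```python
K_B = 1.380649e-23    # Boltzmann constant, J/K
M_H = 1.6735575e-27   # hydrogen atom mass, kg
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30      # solar mass, kg
PC = 3.0857e16        # parsec, m

def critical_line_mass(T, mu=2.33):
    """Critical line mass 2*c_s**2/G of an isothermal filament, in M_sun/pc."""
    cs2 = K_B * T / (mu * M_H)   # isothermal sound speed squared
    return 2.0 * cs2 / G * PC / M_SUN

line_mass = critical_line_mass(10.0)  # ≈ 16.5 M_sun/pc at 10 K
```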

  3. Generic Issues on Broad-Scale Soil Monitoring Schemes: A Review

    Institute of Scientific and Technical Information of China (English)

    D. ARROUAYS; B. P. MARCHANT; N. P. A. SABY; J. MEERSMANS; T. G. ORTON; M. P. MARTIN; P. H. BELLAMY; R. M. LARK; M. KIBBLEWHITE

    2012-01-01

    Numerous scientific challenges arise when designing a soil monitoring network (SMN), especially when assessing large areas and several properties that are driven by numerous controlling factors of various origins and scales. Different broad approaches to the establishment of SMNs are distinguished. It is essential to establish an adequate sampling protocol that can be applied rigorously at each sampling location and time. We make recommendations regarding the within-site sampling of soil. Different statistical methods should be associated with the different types of sampling design. We review new statistical methods that account for different sources of uncertainty. Except for those parameters for which a consensus exists, the question of testing method harmonisation remains a very difficult issue. The establishment of benchmark sites devoted to harmonisation and inter-calibration is advocated as a technical solution. However, to our present knowledge, no study has addressed crucial scientific issues such as how many calibration sites are necessary and how to locate them.

  4. Adapting a generic tuberculosis control operational guideline and scaling it up in China: a qualitative case study

    Directory of Open Access Journals (Sweden)

    Liu Feiying

    2008-07-01

    Full Text Available Abstract Background The TB operational guideline (the deskguide) is a detailed action guide for county TB doctors aiming to improve the quality of DOTS, while the China national TB policy guide is a guide to TB control that is comprehensive but lacks operational usability for frontline TB doctors. This study reports the process of deskguide adaptation, its scale-up and lessons learnt for policy implications. Methods The deskguide was translated, reviewed, and revised in a working group process. Details of the eight adaptation steps are reported here. An operational study was embedded in the adaptation process. Two comparable prefectures were chosen as pilot and control sites in each of two participating provinces. In the pilot sites, the deskguide was used with the national policy guide in routine in-service training and supervisory trips; in the control sites, only the national policy guide was used. In-depth interviews and focus groups were conducted with 16 county TB doctors, 16 township doctors, 17 village doctors, 63 TB patients and 57 patient family members. Following piloting, the deskguide was incorporated into the national TB guidelines for county TB dispensary use. Results Qualitative research identified that the deskguide was useful in the daily practice of county TB doctors. Patients in the pilot sites had better knowledge of TB and better treatment support compared with those in the control sites. Conclusion The adaptation process highlighted a number of general strategies for adapting generic guidelines into country-specific ones: (1) local policy-makers and practitioners should have a leading role; (2) a systematic working process should be employed with capable focal persons; and (3) the guideline should be embedded within current programmes so it is sustainable and replicable for further scale-up.

  5. Light-year scale radio cores in four LINER galaxies

    NARCIS (Netherlands)

    Filho, ME; Barthel, PD; Ho, LC

    2002-01-01

    The LINER galaxies NGC2911, NGC3079, NGC3998, and NGC6500 were observed at 5 GHz with the European VLBI Network at a resolution of 5 milliarcseconds and found to possess flat-spectrum, variable, high-brightness-temperature (T_B > 10^8 K) radio cores. These radio characteristics reinforce the view that

  6. Generic variation?

    DEFF Research Database (Denmark)

    Jensen, Torben Juel

    2009-01-01

    Abstract In modern Danish, a handful of pronouns may be used to refer to a generic referent. In recent decades, the second person singular pronoun du has gained ground, apparently in parallel to similar recent developments in other languages. Even though generic du may not be as old as the tradit...

  7. Psychometric testing of an instrument measuring core competencies of nursing students: an application of Mokken scaling.

    Science.gov (United States)

    Perng, Shoa-Jen; Watson, Roger

    2013-01-01

    Assessing the core competencies of nursing students provides information about students' learning outcomes for educational evaluation and improvement. The aim of this study was to develop the Nursing Students Core Competencies scale to measure 8 core competencies of nursing students in Taiwan. The study employed factor analysis and Mokken scaling analysis for psychometric testing of this instrument with a group of nursing graduates and their evaluators. The results indicated that the Nursing Students Core Competencies scale demonstrated evidence of internal consistency, structural validity, unidimensionality, and a hierarchy of items for students' self-assessment and instructor's rating. The use of Mokken scaling analysis extends the knowledge of developing competence assessment tools; it can be used to reveal the domains or items of competency that nursing students perceive as easy or difficult, providing information for curricular design.
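Mokken analysis centres on Loevinger's scalability coefficient H (values above 0.3 are conventionally taken to indicate a usable scale). A minimal sketch for dichotomous items, assuming the standard definition of H; the instrument above uses rated items and the function name is ours, so this is illustrative only:

```python
import numpy as np

def loevinger_H(X):
    """Loevinger's scalability coefficient H for a binary item matrix X
    (respondents x items): 1 minus the ratio of observed Guttman errors
    to the errors expected under marginal independence of the items."""
    X = np.asarray(X)
    n, k = X.shape
    p = X.mean(axis=0)                      # item popularity (high = easy)
    obs = exp = 0.0
    for i in range(k):
        for j in range(i + 1, k):
            e, h = (i, j) if p[i] >= p[j] else (j, i)       # easier, harder
            obs += np.sum((X[:, h] == 1) & (X[:, e] == 0))  # Guttman errors
            exp += n * p[h] * (1 - p[e])
    return 1.0 - obs / exp
```

A perfect Guttman pattern gives H = 1, while unscalable items drive H toward or below zero.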

  8. Rethinking generic skills

    Directory of Open Access Journals (Sweden)

    Roy Canning

    2013-10-01

    Full Text Available The paper provides a critical analysis of the notion of generic or transversal skills contained within European Union policy discourses. The author presents a conceptual framework that challenges the idea that generic skills are universal, transferable and autonomous. An alternative analysis is put forward that argues the case for contextualising skills and knowledge within particular understandings and cultures that are more collective than individualistic in nature. The arguments are framed within wider cross-disciplinary debates in linguistics, geosemiotics and social-cultural theory and build upon an earlier paper exploring core skills in the UK (Canning, 2007).

  9. Large-Scale Flow and Spiral Core Instability in Rayleigh-Benard Convection

    CERN Document Server

    Aranson, I S; Steinberg, V; Tsimring, L S; Aranson, Igor; Assenheimer, Michel; Steinberg, Victor; Tsimring, Lev S.

    1996-01-01

    The spiral core instability, observed in large aspect ratio Rayleigh-Benard convection, is studied numerically in the framework of the Swift-Hohenberg equation coupled to a large-scale flow. It is shown that the instability leads to non-trivial core dynamics and is driven by the self-generated vorticity. Moreover, the recently reported transition from spirals to hexagons near the core is shown to occur only in the presence of a non-variational nonlinearity, and is triggered by the spiral core instability. Qualitative agreement between the simulations and the experiments is demonstrated.

  10. A Bioequivalence Approach for Generic Narrow Therapeutic Index Drugs: Evaluation of the Reference-Scaled Approach and Variability Comparison Criterion

    OpenAIRE

    Jiang, Wenlei; Makhlouf, Fairouz; Schuirmann, Donald J.; Zhang, Xinyuan; Zheng, Nan; Conner, Dale; Yu, Lawrence X.; Lionberger, Robert

    2015-01-01

    Various health communities have expressed concerns regarding whether average bioequivalence (BE) limits (80.00–125.00%) for the 90% confidence interval of the test-to-reference geometric mean ratio are sufficient to ensure therapeutic equivalence between a generic narrow therapeutic index (NTI) drug and its reference listed drug (RLD). Simulations were conducted to investigate the impact of different BE approaches for NTI drugs on study power, including (1) direct tightening of average BE lim...
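The simulation question can be illustrated with a stripped-down trial model: how often does the 90% CI of the geometric mean ratio land inside 80.00-125.00%? This sketch assumes a parallel design and a normal critical value (the authors evaluate replicate crossover designs and scaled limits), and all names are illustrative:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2015)
Z90 = NormalDist().inv_cdf(0.95)   # normal approx. to the t critical value

def average_be_pass(n, true_gmr, cv):
    """One simulated trial: pass if the 90% CI of the test/reference
    geometric mean ratio lies within the 80.00-125.00% BE limits."""
    sigma = np.sqrt(np.log(1 + cv ** 2))      # log-scale SD from the CV
    log_t = rng.normal(np.log(true_gmr), sigma, n)
    log_r = rng.normal(0.0, sigma, n)
    diff = log_t.mean() - log_r.mean()
    se = np.sqrt(log_t.var(ddof=1) / n + log_r.var(ddof=1) / n)
    return np.log(0.80) < diff - Z90 * se and diff + Z90 * se < np.log(1.25)

def power(n_trials, **kw):
    """Empirical study power: fraction of simulated trials that pass."""
    return float(np.mean([average_be_pass(**kw) for _ in range(n_trials)]))
```

With a true GMR of 1.0 and modest variability the pass rate is high; a truly inequivalent product (GMR well outside the limits) almost never passes.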

  11. Generic safety documentation model

    Energy Technology Data Exchange (ETDEWEB)

    Mahn, J.A.

    1994-04-01

    This document is intended to be a resource for preparers of safety documentation for Sandia National Laboratories, New Mexico facilities. It provides standardized discussions of some topics that are generic to most, if not all, Sandia/NM facilities safety documents. The material provides a "core" upon which to develop facility-specific safety documentation. The use of the information in this document will reduce the cost of safety document preparation and improve consistency of information.

  12. Using Profile Analysis via Multidimensional Scaling (PAMS) to identify core profiles from the WMS-III.

    Science.gov (United States)

    Frisby, Craig L; Kim, Se-Kang

    2008-03-01

    Profile Analysis via Multidimensional Scaling (PAMS) is a procedure for extracting latent core profiles in a multitest data set. The PAMS procedure offers several advantages compared with other profile analysis procedures. Most notably, PAMS estimates individual profile weights that reflect the degree to which an individual's observed profile approximates the shape and scatter of latent core profiles. The PAMS procedure was applied to index scores of nonreplicated participants from the standardization sample (N = 1,033) for the Wechsler Memory Scale--Third Edition (D. Tulsky, J. Zhu, & M. F. Ledbetter, 2002). PAMS extracted discrepant visual memory and auditory memory versus working memory core profiles for the complete 16- to 89-year-old sample and discrepant working memory and auditory memory versus working memory core profiles for the 75- to 89-year-old cohort. Implications for use of PAMS in future research are discussed.
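A minimal sketch of the PAMS idea, assuming classical (Torgerson) MDS for the latent core profiles and ordinary least squares for the individual profile weights; the published procedure differs in detail and the function name is ours:

```python
import numpy as np

def pams(scores, n_dims=2):
    """Sketch of PAMS: classical MDS on between-subtest Euclidean distances
    yields latent core profiles (one per dimension); each person's observed
    profile is then fit by least squares to a level term plus the core
    profiles, giving individual profile weights."""
    X = np.asarray(scores, float)                 # persons x subtests
    D = np.linalg.norm(X[:, :, None] - X[:, None, :], axis=0)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                   # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:n_dims]
    core = V[:, top] * np.sqrt(np.maximum(w[top], 0.0))  # core profiles
    A = np.column_stack([np.ones(n), core])       # level + profile shapes
    coef, *_ = np.linalg.lstsq(A, X.T, rcond=None)
    return core, coef.T                           # profiles, person weights
```

For data generated from a single latent profile plus individual levels, one MDS dimension reconstructs every observed profile exactly.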

  13. Typed combinators for generic traversal

    NARCIS (Netherlands)

    Lämmel, R.; Vonk, J.

    2001-01-01

    Lacking support for generic traversal, functional programming languages suffer from a scalability problem when applied to large-scale program transformation problems. As a solution, we introduce functional strategies: typeful generic functions that not only can be applied to terms of any type,

  14. Design and Implementation of a Generic Energy-Harvesting Framework Applied to the Evaluation of a Large-Scale Electronic Shelf-Labeling Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Kamerman Ad

    2010-01-01

    Full Text Available Most wireless sensor networks (WSNs consist of battery-powered nodes and are limited to hundreds of nodes. Battery replacement is a very costly operation and a key factor in limiting successful large-scale deployments. The recent advances in both energy harvesters and low-power communication systems hold promise for deploying large-scale wireless green-powered sensor networks (WGSNs. This will enable new applications and will eliminate environmentally unfriendly battery disposal. This paper explores the use of energy harvesters to scavenge power for nodes in a WSN. The design and implementation of a generic energy-harvesting framework, suited for a WSN simulator as well as a real-life testbed, are proposed. These frameworks are used to evaluate whether a carrier sense multiple access with collision avoidance scheme is sufficiently reliable for use in emerging large-scale energy harvesting electronic shelf label (EHESL systems (i.e., 12000 labels in a star topology. Both the simulator and testbed experiments yielded an average success rate up to 92%, with an arrival rate of 40 transceive cycles per second. We have demonstrated that our generic energy-harvesting framework is useful for WGSN research because the simulator allowed us to verify the achieved results on the real-life testbed and vice versa.
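The contention behaviour evaluated above can be roughly approximated at the slot level with a Monte Carlo model; this sketch uses a slotted-ALOHA success criterion (the paper models CSMA/CA with carrier sensing and backoff, which this ignores), and the function name is ours:

```python
import numpy as np

def aloha_success_rate(n_nodes, p_tx, n_slots, seed=1):
    """Fraction of slots in which exactly one node transmits (slotted-ALOHA
    success criterion; two or more simultaneous transmissions collide)."""
    rng = np.random.default_rng(seed)
    tx = rng.random((n_slots, n_nodes)) < p_tx   # who transmits in each slot
    return float(np.mean(tx.sum(axis=1) == 1))
```

The analytic counterpart is n·p·(1−p)^(n−1), against which a simulator can be sanity-checked, mirroring the paper's cross-validation of simulator and testbed.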

  15. Reactive/Adsorptive transport in (partially-) saturated porous media: from pore scale to core scale

    NARCIS (Netherlands)

    Raoof, A.

    2011-01-01

    Pore-scale modeling provides opportunities to study transport phenomena in fundamental ways because detailed information is available at the microscopic pore scale. This offers the best hope for bridging the traditional gap that exists between pore scale and macro (lab) scale description of the proc

  16. Family Genericity

    DEFF Research Database (Denmark)

    Ernst, Erik

    2006-01-01

    Type abstraction in object-oriented languages embodies two techniques, each with its own strengths and weaknesses. The first technique is extension, yielding abstraction mechanisms with good support for gradual specification. The prime example is inheritance. The second technique is functional abst...... the result as family genericity. The presented language design has been implemented.

  18. A new scale for disaster nursing core competencies: Development and psychometric testing.

    Science.gov (United States)

    Al Thobaity, Abdulellah; Williams, Brett; Plummer, Virginia

    2016-02-01

    All nurses must have core competencies in preparing for, responding to and recovering from a disaster. In the Kingdom of Saudi Arabia (KSA), as in many other countries, disaster nursing core competencies are not fully understood and lack reliable, validated tools. Thus, it is imperative to develop a scale for exploring disaster nursing core competencies, roles and barriers in the KSA. This study's objective is to develop a valid, reliable scale that identifies and explores core competencies of disaster nursing, nurses' roles in disaster management and barriers to developing disaster nursing in the KSA. This study developed a new scale and tested its validity and reliability. A principal component analysis (PCA) was used to develop and test psychometric properties of the new scale. The PCA used a purposive sample of nurses from emergency departments in two hospitals in the KSA. Participants rated 93 paper-based, self-report questionnaire items from 1 to 10 on a Likert scale. PCA using Varimax rotation was conducted to explore factors emerging from responses. The study's participants were 132 nurses (66% response rate). PCA of the 93 questionnaire items revealed 49 redundant items (which were deleted) and 3 factors with eigenvalues > 1. The remaining 44 items accounted for 77.3% of the total variance. The overall Cronbach's alpha was 0.96 for all factors: 0.98 for Factor 1, 0.92 for Factor 2 and 0.86 for Factor 3. This study provided a validated, reliable scale for exploring nurses' core competencies, nurses' roles and barriers to developing disaster nursing in the KSA. The new scale has many implications, such as for improving education, planning and curricula. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
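Two of the psychometric quantities reported here, Cronbach's alpha and the eigenvalue > 1 (Kaiser) factor count, follow directly from standard formulas; a generic sketch (not the authors' code, and the rotation step is omitted):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for an items matrix X (respondents x items):
    k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    X = np.asarray(X, float)
    k = X.shape[1]
    return k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum()
                          / X.sum(axis=1).var(ddof=1))

def kaiser_n_factors(X):
    """Number of retained factors by the Kaiser criterion: eigenvalues of
    the item correlation matrix that exceed 1."""
    R = np.corrcoef(np.asarray(X, float), rowvar=False)
    return int(np.sum(np.linalg.eigvalsh(R) > 1.0))
```

Perfectly parallel items give alpha = 1 and a single retained factor.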

  19. Power scaling estimate of crystalline fiber waveguides with rare earth doped YAG cores

    Science.gov (United States)

    Li, Da; Hong, Pengda; Meissner, Stephanie K.; Meissner, Helmuth E.

    2016-03-01

    Power scaling analysis based on the model by Dawson et al. [1,2] for circular-core fibers has been applied to estimate the power scaling of crystalline fiber waveguides (CFWs) with RE3+ doped single crystalline or ceramic YAG (RE=rare earth: Yb, Er, Tm and Ho). Power scaling limits include stimulated Brillouin scattering, the thermal lensing effect, and limits on coupling of pump light into CFWs. The CFW designs we have considered consist, in general, of a square doped RE3+:YAG core, an inner cladding of either undoped or laser-inactive-ion-doped YAG and an outer cladding of sapphire. The presented data have been developed for the structures fabricated using the Adhesive-Free Bonding (AFB®) technique, but the results should be essentially independent of fabrication technique, assuming perfect core/inner cladding/outer cladding interfaces. Hard power scaling limits exist for a specific CFW design and are strongly based on the physical constants of the material and its spectroscopic specifics. For example, the power scaling limit was determined to be ~16 kW for 2.5% ceramic Yb:YAG/YAG (core material/inner cladding material) at a fiber length of 1.7 m and a core diameter of 69 μm. Considering the present manufacturing limit for CFW length to be, e.g., 0.5 m, the actual maximum output power will be limited to ~4.4 kW for a Yb:YAG/YAG CFW. Power limit estimates have also been computed for CFWs with Er3+-, Tm3+- and Ho3+-doped cores.

  20. Using Profile Analysis via Multidimensional Scaling (PAMS) to Identify Core Profiles from the WMS-III

    Science.gov (United States)

    Frisby, Craig L.; Kim, Se-Kang

    2008-01-01

    Profile Analysis via Multidimensional Scaling (PAMS) is a procedure for extracting latent core profiles in a multitest data set. The PAMS procedure offers several advantages compared with other profile analysis procedures. Most notably, PAMS estimates individual profile weights that reflect the degree to which an individual's observed profile…

  1. PWR core and spent fuel pool analysis using scale and nestle

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, J. E.; Maldonado, G. I. [Dept. of Nuclear Engineering, Univ. of Tennessee, Knoxville, TN 37996-2300 (United States); St Clair, R.; Orr, D. [Duke Energy, 526 S. Church St, Charlotte, NC 28202 (United States)

    2012-07-01

    The SCALE nuclear analysis code system [SCALE, 2011], developed and maintained at Oak Ridge National Laboratory (ORNL), is widely recognized as high-quality software for analyzing nuclear systems. The SCALE code system is composed of several validated computer codes and methods with standard control sequences, such as the TRITON/NEWT lattice physics sequence, which supplies dependable and accurate analyses for industry, regulators, and academia. Although TRITON generates energy-collapsed and space-homogenized few-group cross sections, SCALE does not itself include a full-core nodal neutron diffusion simulation module. However, in the past few years, the open-source NESTLE core simulator [NESTLE, 2003], originally developed at North Carolina State Univ. (NCSU), has been updated and upgraded via collaboration between ORNL and the Univ. of Tennessee (UT), so it now has an increasingly seamless coupling to the TRITON/NEWT lattice physics [Galloway, 2010]. This study presents the methodology used to couple lattice physics data between TRITON and NESTLE in order to perform a three-dimensional full-core analysis employing a 'real-life' Duke Energy PWR as the test bed. The focus for this step was to compare the key parameters of core reactivity and radial power distribution versus plant data. Following the core analysis of a three-cycle burn, a spent fuel pool analysis was performed using information generated by NESTLE for the discharged bundles and was compared to Duke Energy spent fuel pool models. The KENO control module from SCALE was employed for this latter stage of the project. (authors)

  2. Core-scale solute transport model selection using Monte Carlo analysis

    CERN Document Server

    Malama, Bwalya; James, Scott C

    2013-01-01

    Model applicability to core-scale solute transport is evaluated using breakthrough data from column experiments conducted with conservative tracers tritium (H-3) and sodium-22, and the retarding solute uranium-232. The three models considered are single-porosity, double-porosity with single-rate mobile-immobile mass-exchange, and the multirate model, which is a deterministic model that admits the statistics of a random mobile-immobile mass-exchange rate coefficient. The experiments were conducted on intact Culebra Dolomite core samples. Previously, data were analyzed using single- and double-porosity models although the Culebra Dolomite is known to possess multiple types and scales of porosity, and to exhibit multirate mobile-immobile-domain mass transfer characteristics at field scale. The data are reanalyzed here and null-space Monte Carlo analysis is used to facilitate objective model selection. Prediction (or residual) bias is adopted as a measure of the model structural error. The analysis clearly shows ...

  3. CORE

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Hundebøll, Martin

    2013-01-01

    different flows. Instead of maintaining these approaches separate, we propose a protocol (CORE) that brings together these coding mechanisms. Our protocol uses random linear network coding (RLNC) for intra- session coding but allows nodes in the network to setup inter- session coding regions where flows...... intersect. Routes for unicast sessions are agnostic to other sessions and setup beforehand, CORE will then discover and exploit intersecting routes. Our approach allows the inter-session regions to leverage RLNC to compensate for losses or failures in the overhearing or transmitting process. Thus, we...... increase the benefits of XORing by exploiting the underlying RLNC structure of individual flows. This goes beyond providing additional reliability to each individual session and beyond exploiting coding opportunistically. Our numerical results show that CORE outperforms both forwarding and COPE...
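The intra-session mechanism above, random linear network coding, can be sketched over GF(2): mix the source packets with random binary coefficients, and any full-rank set of coded packets decodes the originals. Practical RLNC typically works in GF(2^8), and the function names here are ours:

```python
import numpy as np

def gf2_solve(A, B):
    """Solve A X = B over GF(2) by Gauss-Jordan elimination (A invertible)."""
    A, B = A.copy() % 2, B.copy() % 2
    n = A.shape[0]
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r, col])
        A[[col, pivot]] = A[[pivot, col]]         # bring pivot row up
        B[[col, pivot]] = B[[pivot, col]]
        for r in range(n):
            if r != col and A[r, col]:            # clear the column
                A[r] ^= A[col]
                B[r] ^= B[col]
    return B

def rlnc_roundtrip(packets, seed=7):
    """Encode packets with random GF(2) coefficients, then decode."""
    rng = np.random.default_rng(seed)
    P = np.asarray(packets, dtype=np.uint8)
    n = P.shape[0]
    while True:  # redraw until the coefficient matrix is invertible mod 2
        C = rng.integers(0, 2, (n, n), dtype=np.uint8)
        if int(round(np.linalg.det(C))) % 2:      # odd det => invertible
            break
    coded = (C @ P) % 2                           # the coded packets
    return gf2_solve(C, coded)                    # recover the originals
```

Losing a coded packet is harmless as long as the receiver still holds n linearly independent combinations, which is the property CORE leans on for reliability.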

  5. Analysis of Monolith Cores from an Engineering Scale Demonstration of a Prospective Cast Stone Process

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, C. L. [Savannah River Site (SRS), Aiken, SC (United States); Cozzi, A. D. [Savannah River Site (SRS), Aiken, SC (United States); Hill, K. A. [Savannah River Site (SRS), Aiken, SC (United States)

    2016-06-01

    The primary disposition path of Low Activity Waste (LAW) at the DOE Hanford Site is vitrification. A cementitious waste form is one of the alternatives being considered for the supplemental immobilization of the LAW that will not be treated by the primary vitrification facility. Washington River Protection Solutions (WRPS) has been directed to generate and collect data on cementitious or pozzolanic waste forms such as Cast Stone. This report documents the coring and leach testing of monolithic samples cored from an engineering-scale demonstration (ES Demo) with non-radioactive simulants. The ES Demo was performed at SRNL in October of 2013 using the Scaled Continuous Processing Facility (SCPF) to fill an 8.5 ft. diameter x 3.25 ft. high container with simulated Cast Stone grout. The Cast Stone formulation was chosen from the previous screening tests. Legacy salt solution from previous Hanford salt waste testing was adjusted to correspond to the average LAW composition generated from the Hanford Tank Waste Operation Simulator (HTWOS). The dry blend materials, ordinary portland cement (OPC), Class F fly ash, and ground granulated blast furnace slag (GGBFS or BFS), were obtained from Lafarge North America in Pasco, WA. In 2014, core samples originally obtained approximately six months after filling the ES Demo were tested along with bench scale molded samples that were collected during the original pour. A later set of core samples was obtained in late March of 2015, eighteen months after completion of the original ES Demo. Core samples were obtained using a 2” diameter x 11” long coring bit. The ES Demo was sampled in three different regions consisting of an outer ring, a middle ring and an inner core zone. Cores from these three lateral zones were further segregated into upper, middle and lower vertical segments. Monolithic core samples were tested using the Environmental Protection Agency (EPA) Method 1315, which is designed to provide mass transfer rates

  6. Core-scale solute transport model selection using Monte Carlo analysis

    Science.gov (United States)

    Malama, Bwalya; Kuhlman, Kristopher L.; James, Scott C.

    2013-06-01

    Model applicability to core-scale solute transport is evaluated using breakthrough data from column experiments conducted with conservative tracers tritium (3H) and sodium-22 (22Na), and the retarding solute uranium-232 (232U). The three models considered are single-porosity, double-porosity with single-rate mobile-immobile mass-exchange, and the multirate model, which is a deterministic model that admits the statistics of a random mobile-immobile mass-exchange rate coefficient. The experiments were conducted on intact Culebra Dolomite core samples. Previously, data were analyzed using single-porosity and double-porosity models although the Culebra Dolomite is known to possess multiple types and scales of porosity, and to exhibit multirate mobile-immobile-domain mass transfer characteristics at field scale. The data are reanalyzed here and null-space Monte Carlo analysis is used to facilitate objective model selection. Prediction (or residual) bias is adopted as a measure of the model structural error. The analysis clearly shows single-porosity and double-porosity models are structurally deficient, yielding late-time residual bias that grows with time. On the other hand, the multirate model yields unbiased predictions consistent with the late-time -5/2 slope diagnostic of multirate mass transfer. The analysis indicates the multirate model is better suited to describing core-scale solute breakthrough in the Culebra Dolomite than the other two models.
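The late-time bias criterion used for model selection can be illustrated on synthetic breakthrough tails: a multirate-like t^(-5/2) power-law tail versus an exponential (single-porosity-like) fit. The metric name and the synthetic curves are ours, not the paper's data:

```python
import numpy as np

def residual_bias(t, observed, predicted, t_min):
    """Mean signed log residual over the late-time tail t >= t_min.
    Near zero for a structurally adequate model; for a deficient model the
    bias grows in magnitude as the window moves later in time."""
    m = t >= t_min
    return float(np.mean(np.log(predicted[m]) - np.log(observed[m])))

# synthetic tails: power-law "data" vs. an exponential-tailed model fit
t = np.linspace(1.0, 100.0, 500)
obs = t ** -2.5                 # the -5/2 multirate diagnostic slope
exp_fit = np.exp(-t / 2.0)      # exponential tail decays far too fast
```

A correct model is unbiased at all window positions; the exponential fit shows bias whose magnitude grows as the evaluation window moves to later times.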

  7. Radial And Lateral Topographic Scales of the Inner-Core Boundary

    Science.gov (United States)

    Zheng, Y.; Cormier, V. F.; Fehler, M. C.

    2012-12-01

    Strong seismic evidence suggests that the inner core boundary region is dynamic. First, the strong PKP-df coda wave train in the previous time-lapse studies using earthquake doublets cannot be explained by simple rotation of the anisotropic inner core. Second, the observed PKiKP reflection amplitude from nuclear tests does not follow the prediction of a simple spherical model such as the PREM or IASP91. Third, the observed amplitude and traveltime of the PKP C-diff arrival favor different, mutually inconsistent models (a PREM-like linear gradient and an IASP91-like absence of gradient, respectively) for the lowermost outer core. Fourth, in the range of 150-153 degrees we observed a seismic phase that had not been reported in the literature; it arrives about 2.0 seconds after PKP-df and has a slightly positive slowness deviation compared to PKIKP-df. Seismic migration showed that this arrival is associated with scattering objects above the turning depth of the PKIKP-df, close to the inner-core boundary. Using an elastic boundary element method, which takes into account the fluid-solid boundary condition, we simulated high frequency (> 1 Hz) wave propagation and scattering in the inner core boundary region. We propose the presence of inner core topography as a plausible mechanism to explain all these observations. Our preliminary results from modeling the PKiKP amplitude showed that the previously proposed "mosaic structure" of the inner core could be well explained by inner core topography, with horizontal scale ~10 km and vertical scale ~2 km. In addition, for a fluid-solid boundary, topography can generate a strong Scholte wave, which is an interface wave (like the Rayleigh wave) whose amplitude decays exponentially away from the boundary. The Scholte wave can leak energy out of the C-diff wave, thereby reducing the C-diff amplitude. Modeling the PKiKP, PKP-df coda and PKP C-diff allows us to place vertical and horizontal topographic bounds for the inner-core boundary.

  8. Accelerated gravity testing of aquitard core permeability and implications at formation and regional scale

    Directory of Open Access Journals (Sweden)

    W. A. Timms

    2015-03-01

    Full Text Available Evaluating the possibility of leakage through low-permeability geological strata is critically important for sustainable water supplies, the extraction of fuels from strata such as coal beds, and the confinement of waste within the earth. The current work demonstrates that relatively rapid and reliable hydraulic conductivity (K) measurement of aquitard cores using accelerated gravity can inform and constrain larger-scale assessments of hydraulic connectivity. Steady-state fluid velocity through a low-K porous sample is linearly related to accelerated gravity (g-level) in a centrifuge permeameter (CP) unless consolidation or geochemical reactions occur. The CP module was custom designed to fit a standard 2 m diameter geotechnical centrifuge (550 g maximum) with a capacity for sample dimensions of 30 to 100 mm diameter and 30 to 200 mm in length, and a maximum total stress of ~2 MPa at the base of the core. Formation fluids were used as influent to limit any shrink–swell phenomena which may alter the permeability. Vertical hydraulic conductivity (Kv) results from CP testing of cores from three sites within the same regional clayey silt formation varied (10−7 to 10−9 m s−1, n = 14). Results at one of these sites (1.1 × 10−10 to 3.5 × 10−9 m s−1, n = 5) were similar to in situ Kv values (3 × 10−9 m s−1) that were obtained from pore pressure responses over several weeks within a 30 m clayey sequence. Core scale and in situ Kv results were compared with vertical connectivity within a regional flow model, and considered in the context of heterogeneity and preferential flow paths at site and formation scale. More reliable assessments of leakage and solute transport through aquitards over multi-decadal timescales can be achieved by accelerated core testing together with advanced geostatistical and numerical methods.
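The core of the method is that steady Darcy velocity is linear in the g-level, so the 1-g conductivity can be recovered as K = v / (N·i). A sketch of this reduction; the formula and function name are an illustrative assumption, not the authors' exact data-processing chain:

```python
import math

def centrifuge_Kv(Q, area, g_level, gradient=1.0):
    """Hydraulic conductivity (m/s) from steady centrifuge-permeameter flow:
    K = v / (N * i), with v = Q/area the Darcy flux at elevated g, N the
    g-level, and i the equivalent 1-g hydraulic gradient (assumed here)."""
    v = Q / area
    return v / (g_level * gradient)

# e.g. 1 nL/s of steady discharge through a 50 mm diameter core at 100 g
Kv = centrifuge_Kv(Q=1.0e-9, area=math.pi * 0.025 ** 2, g_level=100.0)
```

For these illustrative numbers, Kv is about 5 × 10−9 m/s, within the 10−7 to 10−9 m s−1 range reported for the formation.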

  9. Harnessing Petaflop-Scale Multi-Core Supercomputing for Problems in Space Science

    Science.gov (United States)

    Albright, B. J.; Yin, L.; Bowers, K. J.; Daughton, W.; Bergen, B.; Kwan, T. J.

    2008-12-01

    The particle-in-cell kinetic plasma code VPIC has been migrated successfully to the world's fastest supercomputer, Roadrunner, a hybrid multi-core platform built by IBM for the Los Alamos National Laboratory. How this was achieved will be described and examples of state-of-the-art calculations in space science, in particular, the study of magnetic reconnection, will be presented. With VPIC on Roadrunner, we have performed, for the first time, plasma PIC calculations with over one trillion particles, >100× larger than calculations considered "heroic" by community standards. This allows examination of physics at unprecedented scale and fidelity. Roadrunner is an example of an emerging paradigm in supercomputing: the trend toward multi-core systems with deep hierarchies and where memory bandwidth optimization is vital to achieving high performance. Getting VPIC to perform well on such systems is a formidable challenge: the core algorithm is memory bandwidth limited with low compute-to-data ratio and requires random access to memory in its inner loop. That we were able to get VPIC to perform and scale well, achieving >0.374 Pflop/s and linear weak scaling on real physics problems on up to the full 12240-core Roadrunner machine, bodes well for harnessing these machines for our community's needs in the future. Many of the design considerations encountered carry over to other multi-core and accelerated (e.g., via GPU) platforms and we modified VPIC with flexibility in mind. These will be summarized and strategies for how one might adapt a code for such platforms will be shared. Work performed under the auspices of the U.S. DOE by the LANS LLC Los Alamos National Laboratory. Dr. Bowers is a LANL Guest Scientist; he is presently at D. E. Shaw Research LLC, 120 W 45th Street, 39th Floor, New York, NY 10036.

  10. On the magnetic field evolution time-scale in superconducting neutron star cores

    Science.gov (United States)

    Passamonti, Andrea; Akgün, Taner; Pons, José A.; Miralles, Juan A.

    2017-08-01

    We revisit the various approximations employed to study the long-term evolution of the magnetic field in neutron star cores and discuss their limitations and possible improvements. A recent controversy on the correct form of the induction equation and the relevant evolution time-scale in superconducting neutron star cores is addressed and clarified. We show that this ambiguity in the estimation of time-scales arises as a consequence of nominally large terms that appear in the induction equation, but which are, in fact, mostly irrotational. This subtlety leads to a discrepancy by many orders of magnitude when velocity fields are absent or ignored. Even when internal velocity fields are accounted for, only the solenoidal part of the electric field contributes to the induction equation, which can be substantially smaller than the irrotational part. We also argue that stationary velocity fields must be incorporated in the slow evolution of the magnetic field as the next level of approximation.

  11. Core and peripheral criteria of video game addiction in the game addiction scale for adolescents.

    Science.gov (United States)

    Brunborg, Geir Scott; Hanss, Daniel; Mentzoni, Rune Aune; Pallesen, Ståle

    2015-05-01

    Assessment of video game addiction often involves measurement of peripheral criteria that indicate high engagement with games, and core criteria that indicate problematic use of games. A survey of the Norwegian population aged 16-74 years (N=10,081, response rate 43.6%) was carried out in 2013, which included the Gaming Addiction Scale for Adolescents (GAS). Confirmatory factor analysis showed that a two-factor structure, which separated peripheral criteria from core criteria, fitted the data better (CFI=0.963; RMSEA=0.058) compared to the original one-factor solution where all items are determined to load only on one factor (CFI=0.905, RMSEA=0.089). This was also found when we analyzed men aged ≤33 years, men aged >33 years, women aged ≤33 years, and women aged >33 years separately. This indicates that the GAS measures both engagement and problems related to video games. Multi-group measurement invariance testing showed that the factor structure was valid in all four groups (configural invariance) for the two-factor structure but not for the one-factor structure. A novel approach to categorization of problem gamers and addicted gamers where only the core criteria items are used (the CORE 4 approach) was compared to the approach where all items are included (the GAS 7 approach). The current results suggest that the CORE 4 approach might be more appropriate for classification of problem gamers and addicted gamers compared to the GAS 7 approach.
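The GAS 7 versus CORE 4 distinction amounts to which endorsed items count toward classification. A hedged sketch: the core-item indices, the endorsement cutoff, and the function name are illustrative assumptions, not the paper's exact scoring rules:

```python
def classify_gamer(item_scores, core_items=(3, 4, 5, 6), cutoff=3):
    """Classify one respondent's 7 GAS item scores two ways:
    GAS 7 requires endorsement (score >= cutoff) of all 7 items;
    CORE 4 requires endorsement of the core items only.
    Which items are 'core' is an assumption here, for illustration."""
    endorsed = [s >= cutoff for s in item_scores]
    return {"GAS 7": all(endorsed),
            "CORE 4": all(endorsed[i] for i in core_items)}
```

A respondent who endorses only the core (problem) items is flagged by CORE 4 but not by GAS 7, which also demands the peripheral (engagement) items.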

  12. Crystallization of ion clouds in octupole traps: structural transitions, core melting, and scaling laws

    CERN Document Server

    Calvo, Florent; Yurtsever, Ersin

    2009-01-01

    The stable structures and melting properties of ion clouds in isotropic octupole traps are investigated using a combination of semi-analytical and numerical models, with particular emphasis on finite-size scaling effects. Small clouds are found to be hollow and arranged in shells corresponding approximately to solutions of the Thomson problem. The shell structure is lost in clusters containing more than a few thousand ions, the inner parts of the cloud becoming soft and amorphous. Although melting is triggered in the core shells, the melting temperature unexpectedly follows the rule expected for dense three-dimensional particle systems, with a depression scaling linearly with the inverse radius.
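    The Thomson-problem configurations mentioned above can be approximated numerically by minimizing the Coulomb energy of point charges constrained to a sphere. A crude illustrative sketch using projected gradient descent (not the semi-analytical models used in the paper):

```python
import numpy as np

def thomson_energy(x):
    """Coulomb energy of unit charges at positions x (shape (N, 3))."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    iu = np.triu_indices(len(x), k=1)
    return np.sum(1.0 / d[iu])

def minimize_thomson(n, steps=2000, lr=0.01, seed=0):
    """Projected gradient descent for the Thomson problem (illustrative)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(n, 3))
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    for _ in range(steps):
        diff = x[:, None, :] - x[None, :, :]
        d = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(d, np.inf)
        # Pairwise Coulomb repulsion: (x_i - x_j) / |x_i - x_j|^3
        f = np.sum(diff / d[..., None] ** 3, axis=1)
        x += lr * f
        x /= np.linalg.norm(x, axis=1, keepdims=True)  # project back onto sphere
    return x

x = minimize_thomson(4)
print(thomson_energy(x))  # tetrahedral minimum, energy near 3.674
```

    For four charges the minimizer is the regular tetrahedron; for larger N the paper's shell structures generalize these configurations to the soft confinement of an octupole trap.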

  13. From Snakes to Stars, the Statistics of Collapsed Objects - II. Testing a Generic Scaling Ansatz for Hierarchical Clustering

    CERN Document Server

    Munshi, D; Melott, A L; Munshi, Dipak; Coles, Peter; Melott, Adrian L.

    1999-01-01

    We develop a diagrammatic technique to represent the multi-point cumulative probability density function (CPDF) of mass fluctuations in terms of the statistical properties of individual collapsed objects, and relate this to other statistical descriptors such as cumulants, cumulant correlators and factorial moments. We use this approach to establish key scaling relations describing various measurable statistical quantities if clustering follows a simple general scaling ansatz, as expected in hierarchical models. We test these detailed predictions against high-resolution numerical simulations. We show that, when appropriate variables are used, the count probability distribution function (CPDF) and the void probability function (VPF) show clear scaling properties in the non-linear regime. Generalising the results to the two-point count probability distribution function (2CPDF) and the bivariate void probability function (2VPF), we find a good match with numerical simulations. We explore the behaviour of t...

  14. The development and validation of the core competencies scale (CCS) for the college and university students.

    Science.gov (United States)

    Ruan, Bin; Mok, Magdalena Mo Ching; Edginton, Christopher R; Chin, Ming Kai

    2012-01-01

    This article describes the development and validation of the Core Competencies Scale (CCS) using Bok's (2006) competency framework for undergraduate education. The framework included: communication, critical thinking, character development, citizenship, diversity, global understanding, widening of interests, and career and vocational development. The sample comprised 70 college and university students. Results of analysis using Rasch rating scale modelling provided strong empirical evidence for the validity of the CCS measures in terms of content, structure, interpretation, generalizability, and response options. Because the study yielded valid and dependable Rasch-based measures for gauging the value added by college and university education, feedback generated from the CCS can support evidence-based decision and policy making. Further, program effectiveness can be measured, strengthening accountability for the achievement of program objectives.
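    Rasch rating scale modelling, as used here, assigns response-category probabilities from a person ability, an item location, and a shared set of category thresholds (Andrich's rating scale model). A minimal sketch with hypothetical parameter values:

```python
import math

def rsm_probs(theta, b, taus):
    """Category probabilities under the Andrich rating scale model.

    theta: person ability; b: item location; taus: category thresholds
    (one fewer than the number of categories). Values are illustrative."""
    logits = [0.0]
    for tau in taus:
        # Cumulative logit for each successive category step
        logits.append(logits[-1] + (theta - b - tau))
    exps = [math.exp(l) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

# A person slightly above the item location, four response categories
p = rsm_probs(theta=0.5, b=0.0, taus=[-1.0, 0.0, 1.0])
print(round(sum(p), 6))  # 1.0
```

    The category with the largest probability shifts upward as ability increases past the item's thresholds, which is what the Rasch analysis of the CCS response options checks.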

  15. Targeting, out-scaling and prioritising climate-smart interventions in agricultural systems: Lessons from applying a generic framework to the livestock sector in sub-Saharan Africa.

    Science.gov (United States)

    Notenbaert, An; Pfeifer, Catherine; Silvestri, Silvia; Herrero, Mario

    2017-02-01

    As a result of population growth, urbanization and climate change, agricultural systems around the world face enormous pressure on the use of resources. There is a pressing need for wide-scale innovation leading to development that improves the livelihoods and food security of the world's population while at the same time addressing climate change adaptation and mitigation. A variety of promising climate-smart interventions have been identified. What remains, however, is the prioritization of interventions for investment and broad dissemination. The suitability and adoption of interventions depend on a variety of bio-physical and socio-economic factors. Moreover, their impacts, once adopted and out-scaled, are likely to be highly heterogeneous. This heterogeneity expresses itself not only spatially and temporally but also in terms of the stakeholders affected: some might win and some might lose. A mechanism that facilitates a systematic, holistic assessment of the likely spread and consequential impact of potential interventions is one way of improving the selection and targeting of such options. In this paper we provide climate-smart agriculture (CSA) planners and implementers at all levels with a generic framework for evaluating and prioritising potential interventions. This entails an iterative process of mapping out recommendation domains, assessing adoption potential and estimating impacts. Through examples related to livestock production in sub-Saharan Africa, we demonstrate each of the steps and how they are interlinked. The framework is applicable in many different forms, scales and settings. It has wide applicability beyond the examples presented, and we hope to stimulate readers to integrate the concepts in the planning process for climate-smart agriculture, which invariably involves multi-stakeholder, multi-scale and multi-objective decision-making.

  16. An integrated, cross-disciplinary study of soil hydrophobicity at atomic, molecular, core and landscape scales

    Science.gov (United States)

    Matthews, G. Peter; Doerr, Stefan; Van Keulen, Geertje; Dudley, Ed; Francis, Lewis; Whalley, Richard; Gazze, Andrea; Hallin, Ingrid; Quinn, Gerry; Sinclair, Kat; Ashton, Rhys

    2017-04-01

    Soil hydrophobicity can lead to reduced soil fertility and heightened flood risk caused by increased run-off. Soil hydrophobicity is a well-known phenomenon when induced by natural events such as wildfires or by anthropogenic causes, including the addition of organic wastes or hydrocarbon contaminants. This presentation concerns a much more subtle effect - the naturally occurring changes between hydrophilic and hydrophobic states caused by periods of wetness and drought. Although subtle, these changes affect vast areas of soil, so their effects can be very significant, and they are predicted to increase under climate change conditions. To understand the effect, a major interdisciplinary study has been commissioned by the UK's Natural Environment Research Council (NERC) to investigate soil hydrophobicity over length scales ranging from the atomic through the molecular, core and landscape scales. We present the key findings from the many publications currently in preparation. The programme is predicated on the hypothesis that changes in soil protein abundance and localization, induced by variations in soil moisture and temperature, are crucial driving forces for transitions between hydrophobic and hydrophilic conditions at soil particle surfaces, and that these effects can be meaningfully upscaled from the molecular to the landscape scale. Three soils were chosen based on the severity of hydrophobicity that can be achieved in the field: severe to extreme (natural rough pasture, Wales), intermediate to severe (pasture, Wales), and subcritical (managed research grassland, Rothamsted Research, England). The latter is already highly characterised, so it was also used as a control. Hydrophobic/hydrophilic transitions were determined from water droplet penetration times. 
Scientific advances in the following five areas will be described: (i) the identification of these soil proteins by proteomic methods, using novel separation methods which reduce interference by humic acids and allow identification

  17. Core and Peripheral Criteria of Video Game Addiction in the Game Addiction Scale for Adolescents

    Science.gov (United States)

    Hanss, Daniel; Mentzoni, Rune Aune; Pallesen, Ståle

    2015-01-01

    Assessment of video game addiction often involves measurement of peripheral criteria that indicate high engagement with games, and core criteria that indicate problematic use of games. A survey of the Norwegian population aged 16–74 years (N=10,081, response rate 43.6%) was carried out in 2013, which included the Gaming Addiction Scale for Adolescents (GAS). Confirmatory factor analysis showed that a two-factor structure, which separated peripheral criteria from core criteria, fitted the data better (CFI=0.963; RMSEA=0.058) than the original one-factor solution in which all items load on a single factor (CFI=0.905, RMSEA=0.089). This was also found when we analyzed men aged ≤33 years, men aged >33 years, women aged ≤33 years, and women aged >33 years separately. This indicates that the GAS measures both engagement and problems related to video games. Multi-group measurement invariance testing showed that the factor structure was valid in all four groups (configural invariance) for the two-factor structure but not for the one-factor structure. A novel approach to categorization of problem gamers and addicted gamers in which only the core criteria items are used (the CORE 4 approach) was compared with the approach in which all items are included (the GAS 7 approach). The current results suggest that the CORE 4 approach might be more appropriate than the GAS 7 approach for classifying problem gamers and addicted gamers. PMID:25826043

  18. Transport coefficients and entropy-scaling law in liquid iron up to Earth-core pressures.

    Science.gov (United States)

    Cao, Qi-Long; Wang, Pan-Pan; Huang, Duo-Hui; Yang, Jun-Sheng; Wan, Ming-Jie; Wang, Fan-Hou

    2014-03-21

    Molecular dynamics simulations were applied to study the structural and transport properties, including the pair distribution function, the structure factor, the pair correlation entropy, the self-diffusion coefficient, and the viscosity, of liquid iron under high-temperature and high-pressure conditions. Our calculated results reproduce experimentally determined structure factors of liquid iron, and the calculated self-diffusion coefficients and viscosities agree well with previous simulation results. We show that there is a moderate increase of the self-diffusion coefficient and viscosity along the melting curve up to the Earth-core pressure. Furthermore, the temperature dependencies of the pair correlation entropy, self-diffusion, and viscosity under high-pressure conditions have been investigated. Our results suggest that the temperature dependence of the pair correlation entropy is well described by T^-1 scaling, while the Arrhenius law describes well the temperature dependencies of the self-diffusion coefficient and viscosity under high pressure. In particular, we find that the entropy-scaling laws, proposed by Rosenfeld [Phys. Rev. A 15, 2545 (1977)] and Dzugutov [Nature (London) 381, 137 (1996)] for self-diffusion coefficients and viscosity in liquid metals under ambient pressure, still hold well for liquid iron under high-temperature and high-pressure conditions. Using the entropy-scaling laws, we can obtain transport properties from structural properties under such conditions. The results provide a useful ingredient in understanding the transport properties of planetary cores.
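    The entropy-scaling relations referred to above connect the pair correlation entropy, computed from the pair distribution function g(r), to reduced transport coefficients. An illustrative sketch with a toy g(r); the prefactor and exponent in the Rosenfeld-style relation D* ≈ 0.6 exp(0.8 S2) are a commonly quoted empirical fit, used here as an assumption rather than a result of this paper:

```python
import numpy as np

def pair_correlation_entropy(r, g, rho):
    """Two-body excess entropy S2/kB = -2*pi*rho * integral [g ln g - g + 1] r^2 dr.

    The integrand tends to 1 as g -> 0 (excluded-volume core), handled explicitly."""
    safe = np.clip(g, 1e-300, None)
    integrand = np.where(g > 0.0, g * np.log(safe) - g + 1.0, 1.0)
    dr = r[1] - r[0]
    return -2.0 * np.pi * rho * np.sum(integrand * r**2) * dr

def rosenfeld_diffusion(s2):
    """Rosenfeld-style empirical scaling for the reduced diffusion coefficient."""
    return 0.6 * np.exp(0.8 * s2)

# Toy pair distribution function: excluded-volume core plus damped oscillations
r = np.linspace(0.01, 10.0, 2000)
g = 1.0 + np.exp(-r) * np.sin(2.0 * np.pi * r)
g[r < 0.9] = 0.0
s2 = pair_correlation_entropy(r, g, rho=0.8)
print(s2 < 0.0)  # True: a structured liquid has negative two-body entropy
```

    Because S2 is negative and becomes more negative as the liquid orders, the scaling predicts slower diffusion in more structured states, which is what makes S2 a useful bridge from structure to transport.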

  19. Comparison of Prestellar Core Elongations and Large-scale Molecular Cloud Structures in the Lupus I Region

    Science.gov (United States)

    Poidevin, Frédérick; Ade, Peter A. R.; Angile, Francesco E.; Benton, Steven J.; Chapin, Edward L.; Devlin, Mark J.; Fissel, Laura M.; Fukui, Yasuo; Gandilo, Natalie N.; Gundersen, Joshua O.; Hargrave, Peter C.; Klein, Jeffrey; Korotkov, Andrei L.; Matthews, Tristan G.; Moncelsi, Lorenzo; Mroczkowski, Tony K.; Netterfield, Calvin B.; Novak, Giles; Nutter, David; Olmi, Luca; Pascale, Enzo; Savini, Giorgio; Scott, Douglas; Shariff, Jamil A.; Diego Soler, Juan; Tachihara, Kengo; Thomas, Nicholas E.; Truch, Matthew D. P.; Tucker, Carole E.; Tucker, Gregory S.; Ward-Thompson, Derek

    2014-08-01

    Turbulence and magnetic fields are expected to be important for regulating molecular cloud formation and evolution. However, their effects on sub-parsec to 100 parsec scales, leading to the formation of starless cores, are not well understood. We investigate the prestellar core structure morphologies obtained from analysis of the Herschel-SPIRE 350 μm maps of the Lupus I cloud. This distribution is first compared on a statistical basis to the large-scale shape of the main filament. We find the distribution of the elongation position angle of the cores to be consistent with a random distribution, which means no specific orientation of the morphology of the cores is observed with respect to the mean orientation of the large-scale filament in Lupus I, nor relative to a large-scale bent filament model. This distribution is also compared to the mean orientation of the large-scale magnetic fields probed at 350 μm with the Balloon-borne Large Aperture Telescope for Polarimetry during its 2010 campaign. Here again we do not find any correlation between the core morphology distribution and the average orientation of the magnetic fields on parsec scales. Our main conclusion is that the local filament dynamics—including secondary filaments that often run orthogonally to the primary filament—and possibly small-scale variations in the local magnetic field direction, could be the dominant factors for explaining the final orientation of each core.

  20. Comparison of prestellar core elongations and large-scale molecular cloud structures in the Lupus I region

    Energy Technology Data Exchange (ETDEWEB)

    Poidevin, Frédérick [UCL, KLB, Department of Physics and Astronomy, Gower Place, London WC1E 6BT (United Kingdom); Ade, Peter A. R.; Hargrave, Peter C.; Nutter, David [School of Physics and Astronomy, Cardiff University, Queens Buildings, The Parade, Cardiff CF24 3AA (United Kingdom); Angile, Francesco E.; Devlin, Mark J.; Klein, Jeffrey [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Benton, Steven J.; Netterfield, Calvin B. [Department of Physics, University of Toronto, 60 St. George Street, Toronto, ON M5S 1A7 (Canada); Chapin, Edward L. [XMM SOC, ESAC, Apartado 78, E-28691 Villanueva de la Canãda, Madrid (Spain); Fissel, Laura M.; Gandilo, Natalie N. [Department of Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, ON M5S 3H4 (Canada); Fukui, Yasuo [Department of Physics, Nagoya University, Chikusa-ku, Nagoya, Aichi 464-8601 (Japan); Gundersen, Joshua O. [Department of Physics, University of Miami, 1320 Campo Sano Drive, Coral Gables, FL 33146 (United States); Korotkov, Andrei L. [Department of Physics, Brown University, 182 Hope Street, Providence, RI 02912 (United States); Matthews, Tristan G.; Novak, Giles [Department of Physics and Astronomy, Northwestern University, 2145 Sheridan Road, Evanston, IL 60208 (United States); Moncelsi, Lorenzo; Mroczkowski, Tony K. [California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Olmi, Luca, E-mail: fpoidevin@iac.es [Physics Department, University of Puerto Rico, Rio Piedras Campus, Box 23343, UPR station, San Juan, PR 00931 (United States); and others

    2014-08-10

    Turbulence and magnetic fields are expected to be important for regulating molecular cloud formation and evolution. However, their effects on sub-parsec to 100 parsec scales, leading to the formation of starless cores, are not well understood. We investigate the prestellar core structure morphologies obtained from analysis of the Herschel-SPIRE 350 μm maps of the Lupus I cloud. This distribution is first compared on a statistical basis to the large-scale shape of the main filament. We find the distribution of the elongation position angle of the cores to be consistent with a random distribution, which means no specific orientation of the morphology of the cores is observed with respect to the mean orientation of the large-scale filament in Lupus I, nor relative to a large-scale bent filament model. This distribution is also compared to the mean orientation of the large-scale magnetic fields probed at 350 μm with the Balloon-borne Large Aperture Telescope for Polarimetry during its 2010 campaign. Here again we do not find any correlation between the core morphology distribution and the average orientation of the magnetic fields on parsec scales. Our main conclusion is that the local filament dynamics—including secondary filaments that often run orthogonally to the primary filament—and possibly small-scale variations in the local magnetic field direction, could be the dominant factors for explaining the final orientation of each core.

  1. Millennial and sub-millennial scale climatic variations recorded in polar ice cores over the last glacial period

    National Research Council Canada - National Science Library

    Capron, E; Landais, A; Chappellaz, J; Schilt, A; Buiron, D; Dahl-Jensen, D; Johnsen, S. J; Jouzel, J; Lemieux-Dudon, B; Loulergue, L; Leuenberger, M; Masson-Delmotte, V; Meyer, H; Oerter, H; Stenni, B

    2010-01-01

    Since its discovery in Greenland ice cores, the millennial scale climatic variability of the last glacial period has been increasingly documented at all latitudes with studies focusing mainly on Marine Isotopic Stage 3 (MIS 3...

  2. Implementation of a reference-scaled average bioequivalence approach for highly variable generic drug products of agomelatine in Chinese subjects

    Directory of Open Access Journals (Sweden)

    Fang Tang

    2016-01-01

    The aim of this study was to apply the reference-scaled average bioequivalence (RSABE) approach to evaluate the bioequivalence of 2 formulations of agomelatine, and to investigate the pharmacokinetic properties of agomelatine in healthy Chinese male subjects. This was a single-dose, randomized-sequence, open-label, four-way crossover study with a one-day washout period between doses. Healthy Chinese males were randomly assigned to receive 25 mg of either the test or reference formulation. The formulations were considered bioequivalent if the 90% confidence intervals (CIs) for the log-transformed ratios and the ratio of geometric means (GMR) of AUC and Cmax of agomelatine were within the predetermined bioequivalence range based on the RSABE method. Results showed that the 90% CIs for the log-transformed ratios of AUC and Cmax of 7-desmethyl-agomelatine and 3-hydroxy-agomelatine were within the predetermined bioequivalence range. The 90% CIs for the natural log-transformed ratios of Cmax, AUC0–t and AUC0–∞ of agomelatine (104.42–139.86, 101.33–123.83 and 97.90–117.94) were within the RSABE acceptance limits, and those of 3-hydroxy-agomelatine (105.55–123.03, 101.95–109.10 and 101.72–108.70) and 7-desmethyl-agomelatine (104.50–125.23, 102.36–111.50 and 101.62–110.64) were within the FDA bioequivalence definition intervals (0.80–1.25 for AUC and 0.75–1.33 for Cmax). The RSABE approach was successful in evaluating the bioequivalence of these two formulations.
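    The RSABE criterion widens the bioequivalence limits in proportion to the within-subject variability of the reference formulation. A simplified decision sketch; note that it uses point estimates only, whereas a real submission applies a 95% upper confidence bound to the scaled criterion:

```python
import math

def rsabe_decision(gmr, s_wr, theta=(math.log(1.25) / 0.25) ** 2):
    """Simplified FDA-style RSABE check (illustrative, point estimates only).

    gmr: geometric mean ratio test/reference; s_wr: within-subject standard
    deviation of the reference on the log scale."""
    if s_wr < 0.294:
        # Not highly variable: fall back to the unscaled 80-125% limits
        return 0.80 <= gmr <= 1.25
    # Scaled criterion: (ln GMR)^2 - theta * s_wr^2 <= 0,
    # plus the point-estimate constraint that GMR stay within 0.80-1.25
    criterion = math.log(gmr) ** 2 - theta * s_wr ** 2
    return criterion <= 0.0 and 0.80 <= gmr <= 1.25

print(rsabe_decision(gmr=1.10, s_wr=0.35))  # True: passes the scaled limits
print(rsabe_decision(gmr=1.40, s_wr=0.35))  # False: GMR outside 0.80-1.25
```

    The cutoff s_wr ≥ 0.294 corresponds to a within-subject CV of about 30%, the usual definition of a highly variable drug.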

  3. Skin rash during treatment with generic itraconazole

    Directory of Open Access Journals (Sweden)

    Antonio De Vuono

    2014-01-01

    Generic drugs have the same active substance, the same pharmaceutical form and the same therapeutic indications as the reference (branded) medicinal product, and a similar bioequivalence. Although similar efficacy is postulated, some cases of clinical inefficacy during treatment with generic formulations have been reported. Here we describe a woman with onychomycosis who developed a skin rash during treatment with a generic formulation of itraconazole. Drug administration and re-challenge confirmed the association between itraconazole and the skin rash. Both the Naranjo probability scale and the World Health Organization causality assessment scale documented a probable association between generic itraconazole and skin rash. The switch from the generic formulation to the branded one induced an improvement of symptoms. Since we are unable to evaluate the role of each excipient in the development of the skin rash, we cannot rule out their involvement. However, more data are necessary to better define the similarities and differences between branded and generic formulations.

  4. The Symptom Checklist-core depression (SCL-CD6) scale

    DEFF Research Database (Denmark)

    Magnusson Hanson, Linda L; Westerlund, Hugo; Leineweber, Constanze

    2014-01-01

    AIMS: Major depressive disorders are common, with substantial impact on individuals and society. Brief scales for depression severity, based on a small number of characteristics all of which are necessary for diagnosis, have been recommended in self-reported versions for clinical work or research when aiming to quickly and accurately measure depression. We have examined the psychometric properties of a brief 6-item version of the Symptom Checklist (SCL), the Symptom Checklist core depression scale (SCL-CD6), and aimed to identify a cut-point for epidemiological research. METHODS: The psychometric evaluation of the SCL-CD6 was mainly performed by a Mokken analysis of unidimensionality in a random sample of 1476 residents of Stockholm County, aged 18-64 years. The standardization of the SCL-CD6 was based on ROC analysis, using the Major Depression Inventory as the index of validity. Predictive validity...

  5. Infrared length scale and extrapolations for the no-core shell model

    CERN Document Server

    Wendt, K A; Papenbrock, T; Sääf, D

    2015-01-01

    We precisely determine the infrared (IR) length scale of the no-core shell model (NCSM). In the NCSM, the $A$-body Hilbert space is truncated by the total energy, and the IR length can be determined by equating the intrinsic kinetic energy of $A$ nucleons in the NCSM space to that of $A$ nucleons in a $3(A-1)$-dimensional hyper-radial well with a Dirichlet boundary condition for the hyper radius. We demonstrate that this procedure indeed yields a very precise IR length by performing large-scale NCSM calculations for $^{6}$Li. We apply our result and perform accurate IR extrapolations for bound states of $^{4}$He, $^{6}$He, $^{6}$Li, $^{7}$Li. We also attempt to extrapolate NCSM results for $^{10}$B and $^{16}$O with bare interactions from chiral effective field theory over tens of MeV.
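    The IR extrapolations described here assume an exponential correction of the form E(L) = E_inf + a*exp(-2*k_inf*L). With three energies computed at equally spaced IR lengths, the extrapolated E_inf can be eliminated in closed form (a geometric, Shanks-type extrapolation). A sketch with synthetic values; the numbers are hypothetical, not from the paper:

```python
def ir_extrapolate(e0, e1, e2):
    """Extract E_inf from three energies at equally spaced IR lengths L, L+d, L+2d,
    assuming the exponential IR correction E(L) = E_inf + a*exp(-2*k*L).

    Writing q = exp(-2*k*d), the successive differences form a geometric
    sequence, so q = (e1 - e2) / (e0 - e1) and the tail can be summed exactly."""
    r = (e1 - e2) / (e0 - e1)
    return e2 - (e1 - e2) * r / (1.0 - r)

# Synthetic check: E_inf = -28.30 (hypothetical, in MeV), a = 5.0, q = 0.4
q = 0.4
es = [-28.30 + 5.0 * q**i for i in range(3)]
print(round(ir_extrapolate(*es), 6))  # recovers -28.3
```

    In practice one fits the exponential form to many NCSM energies rather than three points, but the closed-form version shows why the correction is removable once the IR length is defined precisely.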

  6. Development of an integrated generic model for multi-scale assessment of the impacts of agro-ecosystems on major ecosystem services in West Africa.

    Science.gov (United States)

    Belem, Mahamadou; Saqalli, Mehdi

    2017-11-01

    This paper presents an integrated model assessing the impacts of climate change, agro-ecosystem and demographic transition patterns on major ecosystem services in West Africa, alongside a partial overview of economic aspects (poverty reduction, food self-sufficiency and income generation). The model is based on an agent-based model associated with a soil model and a multi-scale spatial model. The resulting Model for West-Africa Agro-Ecosystem Integrated Assessment (MOWASIA) is ecologically generic, meaning it is designed for all sudano-sahelian environments, and may therefore be used as an experimentation facility for testing different scenarios combining ecological and socioeconomic dimensions. A case study in Burkina Faso is examined to assess the environmental and economic performances of semi-continuous and continuous farming systems. Results show that the semi-continuous system, using organic fertilizer and fallowing practices, contributes better to environmental preservation and food security than the more economically performant continuous system. This study also showed that farmer heterogeneity could play an important role in agricultural policy planning and assessment. Finally, the results showed that MOWASIA is an effective tool for designing agro-ecosystems and analysing their impacts. Copyright © 2017. Published by Elsevier Ltd.

  7. Detailed Interstellar Polarimetric Properties of the Pipe Nebula at Core Scales

    CERN Document Server

    Franco, G A P; Girart, J M

    2010-01-01

    We use R-band CCD linear polarimetry collected for about 12000 background field stars in 46 fields of view toward the Pipe nebula to investigate the properties of the polarization across this dark cloud. Based on archival 2MASS data we estimate that the surveyed areas present total visual extinctions in the range 0.6 < Av < 4.6. While the observed polarizations show a well-ordered large-scale pattern, with polarization vectors aligned almost perpendicular to the cloud's long axis, at core scales one sees details that are characteristic of each core. Although many observed stars present degrees of polarization that are unusual for the common interstellar medium, our analysis suggests that the dust grains constituting the diffuse parts of the Pipe nebula have the same properties as the normal Galactic interstellar medium. Estimates of the second-order structure function of the polarization angles suggest that most of the Pipe nebula is magnetically dominated and that turbulence is sub-Alfvénic. T...
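    The second-order structure function of polarization angles mentioned above averages squared angle differences over star pairs at a given separation, with care for the 180-degree ambiguity of polarization vectors. An illustrative estimator on synthetic data (not the Pipe nebula measurements):

```python
import numpy as np

def angle_diff_deg(a, b):
    """Smallest difference between two polarization angles (180-deg ambiguity)."""
    d = np.abs(a - b) % 180.0
    return np.minimum(d, 180.0 - d)

def structure_function(x, y, phi, bins):
    """Square root of the second-order structure function <dphi^2> of
    polarization angles, binned by pair separation (a simple estimator)."""
    i, j = np.triu_indices(len(phi), k=1)
    sep = np.hypot(x[i] - x[j], y[i] - y[j])
    dphi = angle_diff_deg(phi[i], phi[j])
    idx = np.digitize(sep, bins)
    out = []
    for b in range(1, len(bins)):
        sel = idx == b
        out.append(np.sqrt(np.mean(dphi[sel] ** 2)) if sel.any() else np.nan)
    return np.array(out)

# Synthetic field: a well-ordered mean angle with small angular dispersion
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 10, 200), rng.uniform(0, 10, 200)
phi = (30.0 + rng.normal(0, 5, 200)) % 180.0
s = structure_function(x, y, phi, bins=np.array([0.0, 2.0, 5.0, 10.0]))
print(np.all(s < 20.0))  # True: small dispersion, small structure function
```

    A small, flat structure function indicates an ordered field with weak turbulent dispersion, which is the basis for the magnetically dominated, sub-Alfvénic interpretation.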

  8. Scaling Denitrification Fluxes from Cores to Catchments: Spatial and Temporal Controls

    Science.gov (United States)

    Duncan, J. M.; Band, L. E.; Groffman, P. M.

    2015-12-01

    The influence of spatial and temporal heterogeneity on nitrogen cycling can be profound, but catchment-scale understanding remains elusive. One of the largest sources of uncertainty is the importance of denitrification. Determining in situ rates of denitrification in landscape elements that remove a disproportionately high amount of N (hot spots) in response to seasonal and event-driven conditions (hot moments) is critical to closing watershed nitrogen budgets. We develop an approach to scale denitrification flux from seasonal soil cores collected in different landscape positions to the entire watershed using a combination of laboratory core experiments, terrain analysis, and in situ soil oxygen and soil moisture content sensors. In the Pond Branch watershed in the Piedmont region of Maryland, nitrogen deposition values are relatively high (9 kg/ha/yr) with low stream export (0.5 kg/ha/yr). Our data suggest that at least 16-27% of this retention can be accounted for by denitrification in certain areas of the riparian zone. We highlight the importance of riparian microtopography and the need to better link observations and models.

  9. Disentangling the dynamic core: a research program for a neurodynamics at the large-scale.

    Science.gov (United States)

    Le Van Quyen, Michel

    2003-01-01

    My purpose in this paper is to sketch a research direction based on Francisco Varela's pioneering work in neurodynamics (see also Rudrauf et al. 2003, in this issue). Very early on he argued that the internal coherence of every mental-cognitive state lies in the global self-organization of brain activities at the large scale, constituting a fundamental pole of integration called here a "dynamic core". Recent neuroimaging evidence appears to broadly support this hypothesis and suggests that a global brain dynamics emerges at the large-scale level from the cooperative interactions among widely distributed neuronal populations. Despite a growing body of evidence supporting this view, our understanding of these large-scale brain processes remains hampered by the lack of a theoretical language for expressing these complex behaviors in dynamical terms. In this paper, I propose a rough cartography of a comprehensive approach that offers a conceptual and mathematical framework to analyze spatio-temporal large-scale brain phenomena. I emphasize how these nonlinear methods can be applied, what properties might be inferred from neuronal signals, and where one might productively proceed in the future. This paper is dedicated, with respect and affection, to the memory of Francisco Varela.

  10. From Cores to Envelopes to Disks: A Multi-scale View of Magnetized Star Formation

    Science.gov (United States)

    Hull, Charles L. H.

    2014-12-01

    Observations of polarization in star forming regions have been made across many wavelengths, many size scales, and many stages of stellar evolution. One of the overarching goals of these observations has been to determine the importance of magnetic fields -- which are the cause of the polarization -- in the star formation process. We begin by describing the commissioning and the calibration of the 1.3 mm dual-polarization receiver system we built for CARMA (the Combined Array for Research in Millimeter-wave Astronomy), a radio telescope in the eastern Sierra region of California. One of the primary science drivers behind the polarization system is to observe polarized thermal emission from dust grains in the dense clumps of dust and gas where the youngest, Class 0 protostars are forming. We go on to describe the CARMA TADPOL survey -- the largest high-resolution (~1000 AU scale) survey to date of dust polarization in low-mass protostellar cores -- and discuss our main findings: (1) Magnetic fields (B-fields) on scales of ~1000 AU are not tightly aligned with protostellar outflows. Rather, the data are consistent both with scenarios where outflows and magnetic fields are preferentially misaligned (perpendicular) and where they are randomly aligned. (2) Sources with high CARMA polarization fractions have consistent B-field orientations on large scales (~20'', measured using single-dish submillimeter telescopes) and small scales (~2.5'', measured by CARMA). We interpret this to mean that in at least some cases B-fields play a role in regulating the infall of material all the way down to the ~1000 AU scales of protostellar envelopes. Finally, (3) While on the whole outflows appear to be randomly aligned with B-fields, in sources with low polarization fractions there is a hint that outflows are preferentially perpendicular to small-scale B-fields, which suggests that in these sources the fields have been wrapped up by envelope rotation. 
This work shows that the ~1000 AU

  11. The (mismeasurement of the Dark Triad Dirty Dozen: exploitation at the core of the scale

    Directory of Open Access Journals (Sweden)

    Petri J. Kajonius

    2016-03-01

    Background. The dark side of human character has been conceptualized in the Dark Triad Model: Machiavellianism, psychopathy, and narcissism. These three dark traits are often measured using a separate long instrument for each trait. Nevertheless, there is a need for short and valid personality measures in psychological research. As an independent research group, we replicated the factor structure, convergent validity and item response for one of the most recent and widely used short measures operationalizing these malevolent traits, namely Jonason's Dark Triad Dirty Dozen. We aimed to expand the understanding of what the Dirty Dozen really captures, because of the mixed results on construct validity in previous research. Method. We used the largest sample to date to respond to the Dirty Dozen (N = 3,698). We first investigated the factor structure using Confirmatory Factor Analysis and an exploratory distribution analysis of the items in the Dirty Dozen. Second, using a sub-sample (n = 500) and correlation analyses, we investigated the convergent validity of the Dirty Dozen dark traits against Machiavellianism measured by the Mach-IV, psychopathy measured by Eysenck's Personality Questionnaire Revised, narcissism measured by the Narcissism Personality Inventory, and both neuroticism and extraversion from Eysenck's questionnaire. Finally, besides these Classical Test Theory analyses, we analyzed the responses for each Dirty Dozen item using Item Response Theory (IRT). Results. The results confirmed previous findings of a bi-factor model fit: one latent core dark trait and three dark traits. All three Dirty Dozen traits had a striking bi-modal distribution, which might indicate unconcealed social undesirability of the items. The three Dirty Dozen traits did converge, although not strongly, with the corresponding full-length Dark Triad scales (r between .41 and .49). The probabilities of filling out steps on the Dirty Dozen narcissism-items were

  12. Soil hydrophobicity - relating effects at atomic, molecular, core and national scales

    Science.gov (United States)

    Matthews, Peter; Doerr, Stefan; Van Keulen, Geertje; Dudley, Ed; Francis, Lewis; Whalley, Richard; Gazze, Andrea; Hallin, Ingrid; Quinn, Gerry; Sinclair, Kat; Ashton, Rhys

    2016-04-01

    The detrimental impacts of soil hydrophobicity include increased runoff, erosion and flooding, reduced biomass production, inefficient use of irrigation water and preferential leaching of pollutants. Its impacts may exacerbate flood risk associated with more extreme drought and precipitation events predicted with UK climate change scenarios. The UK's Natural Environment Research Council (NERC) has therefore funded a major research programme to investigate soil hydrophobicity over length scales ranging from atomic through molecular, core and landscape scale. This presentation gives an overview of the findings to date. The programme is predicated on the hypothesis that changes in soil protein abundance and localization, induced by variations in soil moisture and temperature, are crucial driving forces for transitions between hydrophobic and hydrophilic conditions at soil particle surfaces. Three soils were chosen based on the severity of hydrophobicity that can be achieved in the field: severe to extreme (Cefn Bryn, Gower, Wales), intermediate to severe (National Botanical Garden, Wales), and subcritical (Park Grass, Rothamsted Research near London). The latter is already highly characterised so was also used as a control. Hydrophobic/ hydrophilic transitions were measured from water droplet penetration times. Scientific advances in the following five areas will be described: (i) the identification of these soil proteins by proteomic methods, using a novel separation method which reduces interference by humic acids, and allows identification by ESI and MALDI TOF mass spectrometry and database searches, (ii) the examination of such proteins, which form ordered hydrophobic ridges, and measurement of their elasticity, stickiness and hydrophobicity at nano- to microscale using atomic force microscopy adapted for the rough surfaces of soil particles, (iii) the novel use of a picoliter goniometer to show hydrophobic effects at a 1 micron diameter droplet level, which

  13. A Rasch scaling validation of a 'core' near-death experience.

    Science.gov (United States)

    Lange, Rense; Greyson, Bruce; Houran, James

    2004-05-01

    For those with true near-death experiences (NDEs), Greyson's (1983, 1990) NDE Scale satisfactorily fits the Rasch rating scale model, thus yielding a unidimensional measure with interval-level scaling properties. With increasing intensity, NDEs reflect peace, joy and harmony, followed by insight and mystical or religious experiences, while the most intense NDEs involve an awareness of things occurring in a different place or time. The semantics of this variable are invariant across True-NDErs' gender, current age, age at time of NDE, and latency and intensity of the NDE, thus identifying NDEs as 'core' experiences whose meaning is unaffected by external variables, regardless of variations in NDEs' intensity. Significant qualitative and quantitative differences were observed between True-NDErs and other respondent groups, mostly revolving around the differential emphasis on paranormal/mystical/religious experiences vs. standard reactions to threat. The findings further suggest that False-Positive respondents reinterpret other profound psychological states as NDEs. Accordingly, the Rasch validation of the typology proposed by Greyson (1983) also provides new insights into previous research, including the possibility of embellishment over time (as indicated by the finding of positive, as well as negative, latency effects) and the potential roles of religious affiliation and religiosity (as indicated by the qualitative differences surrounding paranormal/mystical/religious issues).
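The Rasch rating scale model referenced above places persons and items on a single logit (interval-level) dimension. As a minimal, illustrative sketch, the dichotomous Rasch model gives the probability of endorsing an item as a logistic function of the person-item distance (the function name and example values below are assumptions for illustration, not part of the study):

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Probability of endorsing an item under the dichotomous Rasch model.

    theta: person location (e.g. NDE intensity), b: item difficulty;
    both live on the same logit scale, giving interval-level measurement.
    """
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A person located 1 logit above an item's difficulty endorses it
# with probability ~0.73; at the item's difficulty, exactly 0.5.
p = rasch_probability(1.0, 0.0)
```

The rating-scale variant used for the NDE Scale extends this by adding shared category thresholds for polytomous items, but the person-item logic is the same.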

  14. Mpc-scale diffuse radio emission in two massive cool-core clusters of galaxies

    Science.gov (United States)

    Sommer, Martin W.; Basu, Kaustuv; Intema, Huib; Pacaud, Florian; Bonafede, Annalisa; Babul, Arif; Bertoldi, Frank

    2017-04-01

    Radio haloes are diffuse synchrotron sources on scales of ∼1 Mpc that are found in merging clusters of galaxies, and are believed to be powered by electrons re-accelerated by merger-driven turbulence. We present measurements of extended radio emission on similarly large scales in two clusters of galaxies hosting cool cores: Abell 2390 and Abell 2261. The analysis is based on interferometric imaging with the Karl G. Jansky Very Large Array, Very Large Array and Giant Metrewave Radio Telescope. We present detailed radio images of the targets, subtract the compact emission components and measure the spectral indices for the diffuse components. The radio emission in A2390 extends beyond a known sloshing-like brightness discontinuity, and has a very steep in-band spectral slope at 1.5 GHz that is similar to some known ultrasteep spectrum radio haloes. The diffuse signal in A2261 is more extended than in A2390 but has lower luminosity. X-ray morphological indicators, derived from XMM-Newton X-ray data, place these clusters in the category of relaxed or regular systems, although some asymmetric features that can indicate past minor mergers are seen in the X-ray brightness images. If these two Mpc-scale radio sources are categorized as giant radio haloes, they question the common assumption of radio haloes occurring exclusively in clusters undergoing violent merging activity, in addition to commonly used criteria for distinguishing between radio haloes and minihaloes.

  15. Small-scale cyclones on the periphery of a Gulf Stream warm-core ring

    Science.gov (United States)

    Kennelly, M. A.; Evans, R. H.; Joyce, T. M.

    1985-01-01

    Small-scale cyclones found around Gulf Stream warm-core ring 82B are investigated by using infrared satellite images and current information obtained with an acoustic-Doppler velocimeter. Currents in these cyclones reveal speeds ranging from 20 to 80 cm/s. One small cyclone or 'ringlet' found in June 1982 was studied extensively by removing the basic rotational velocities of 82B. The azimuthal velocity field for this ringlet was used with the gradient current equation to calculate the absolute dynamic topography at 100 dbar. It was found that the ringlet was 13 dyn-cm lower than its surroundings. In addition, neglect of the centrifugal term would have changed the dynamic topography of the ringlet by 30 percent. From a comparison with CTD data the absolute reference level was determined, and a vertical profile of horizontal currents was calculated for the ringlet. Other cyclones were found throughout the slope water region around warm-core ring 82B with observable lifetimes of 1 to 2 weeks. The northeast quadrant of 82B was a favored generation site for ringlets. Two cyclones were observed to form in this region and were advected anticyclonically around 82B. Typically, at any one time, six cyclones with diameters of approximately 40 to 50 km can be detected north of the Gulf Stream by using satellite images.

  16. The Core Self-Evaluation Scale: psychometric properties of the German version in a representative sample.

    Science.gov (United States)

    Zenger, Markus; Körner, Annett; Maier, Günter W; Hinz, Andreas; Stöbel-Richter, Yve; Brähler, Elmar; Hilbert, Anja

    2015-01-01

    The Core Self-Evaluation Scale (CSES) is an economical self-report instrument that assesses fundamental evaluations of self-worthiness and capabilities. The broad aim of this study was to test the CSES's psychometric properties. The study is based on a representative survey of the German general population. Confirmatory factor analyses were conducted for different models with 1, 2, and 4 latent factors. The CSES was found to be reliable and valid, as it correlated as expected with measures of depression, anxiety, quality of life, self-reported health status, and pain. A 2-factor model with 2 related factors (r = -.62) showed the best model fit. Furthermore, the CSES was measurement invariant across gender and age. In general, males had higher positive self-evaluations and lower negative self-evaluations than females. It is concluded that the CSES is a useful tool for assessing resource-oriented personality constructs.
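The reliability reported for scales such as the CSES is conventionally summarized by an internal-consistency coefficient such as Cronbach's alpha. A minimal sketch of the standard formula (the function name and data are illustrative assumptions, not taken from the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items: list of equal-length lists, one list per item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three perfectly correlated items give alpha of (approximately) 1.
a = cronbach_alpha([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
```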

  17. Testing the Large-scale Environments of Cool-core and Non-cool-core Clusters with Clustering Bias

    Science.gov (United States)

    Medezinski, Elinor; Battaglia, Nicholas; Coupon, Jean; Cen, Renyue; Gaspari, Massimo; Strauss, Michael A.; Spergel, David N.

    2017-02-01

    There are well-observed differences between cool-core (CC) and non-cool-core (NCC) clusters, but the origin of this distinction is still largely unknown. Competing theories can be divided into internal (inside-out), in which internal physical processes transform or maintain the NCC phase, and external (outside-in), in which the cluster type is determined by its initial conditions, which in turn leads to different formation histories (i.e., assembly bias). We propose a new method that uses the relative assembly bias of CC to NCC clusters, as determined via the two-point cluster-galaxy cross-correlation function (CCF), to test whether formation history plays a role in determining their nature. We apply our method to 48 ACCEPT clusters, which have well resolved central entropies, and cross-correlate with the SDSS-III/BOSS LOWZ galaxy catalog. We find that the relative bias of NCC over CC clusters is b = 1.42 ± 0.35 (1.6σ different from unity). Our measurement is limited by the small number of clusters with core entropy information within the BOSS footprint, 14 CC and 34 NCC clusters. Future compilations of X-ray cluster samples, combined with deep all-sky redshift surveys, will be able to better constrain the relative assembly bias of CC and NCC clusters and determine the origin of the bimodality.

  18. Determining pore length scales and pore surface relaxivity of rock cores by internal magnetic fields modulation at 2MHz NMR.

    Science.gov (United States)

    Liu, Huabing; Nogueira d'Eurydice, Marcel; Obruchkov, Sergei; Galvosas, Petrik

    2014-09-01

    Pore length scales and pore surface relaxivities of rock cores with different lithologies were studied on a 2MHz Rock Core Analyzer. To determine the pore length scales of the rock cores, the high eigenmodes of spin-bearing molecules satisfying the diffusion equation were detected with optimized encoding periods in the presence of internal magnetic fields Bin. The results were confirmed using a 64MHz NMR system, which supports the feasibility of high-eigenmode detection at fields as low as 2MHz. Furthermore, this methodology was combined with relaxometry measurements in a two-dimensional experiment, which provides a correlation between pore length and relaxation time. This technique also yields information on the surface relaxivity of the rock cores. The estimated surface relaxivities were then compared to the results obtained using an independent NMR method.
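For context on how surface relaxivity links relaxation times to pore length scales, the standard fast-diffusion relation 1/T2 ≈ ρ·(S/V) can be rearranged for a pore size; the spherical-pore geometry and the numerical values below are illustrative assumptions, not results from this study:

```python
def pore_diameter_from_t2(t2_s: float, rho_m_per_s: float) -> float:
    """Estimate a pore diameter from the surface-relaxation time T2.

    Uses the fast-diffusion limit 1/T2 ~ rho * (S/V) with an assumed
    spherical pore, for which S/V = 6/d, giving d = 6 * rho * T2.
    """
    return 6.0 * rho_m_per_s * t2_s

# Example (illustrative values): rho = 10 um/s and T2 = 100 ms
# give a pore diameter of about 6 um.
d = pore_diameter_from_t2(0.1, 10e-6)
```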

  19. Induced Core Formation Time in Subcritical Magnetic Clouds by Large-Scale Trans-Alfvénic Flows

    CERN Document Server

    Kudoh, Takahiro

    2014-01-01

    We clarify the mechanism of accelerated core formation by large-scale nonlinear flows in subcritical magnetic clouds by finding a semi-analytical formula for the core formation time and describing the physical processes that lead to it. Recent numerical simulations show that nonlinear flows induce rapid ambipolar diffusion that leads to localized supercritical regions that can collapse. Here, we employ non-ideal magnetohydrodynamic simulations including ambipolar diffusion for gravitationally stratified sheets threaded by vertical magnetic fields. One of the horizontal dimensions is eliminated, resulting in a simpler two-dimensional simulation that can clarify the basic process of accelerated core formation. A parameter study of simulations shows that the core formation time is inversely proportional to the square of the flow speed when the flow speed is greater than the Alfvén speed. We find a semi-analytical formula that explains this numerical result. The formula also predicts that the core formation t...
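The numerical scaling described above can be summarized compactly (the symbols below are assumed notation for illustration, not the paper's):

```latex
% Core formation time vs. large-scale flow speed, as stated in the
% abstract: inversely proportional to the square of the flow speed,
% valid in the super-Alfvenic regime. t_core, v_0, v_A are assumed
% names for the core formation time, flow speed and Alfven speed.
t_{\mathrm{core}} \;\propto\; v_0^{-2}, \qquad v_0 > v_A
```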

  20. Tropical Ice Core Records: Evidence for Asynchronous Glaciation on Milankovitch Time Scales

    Science.gov (United States)

    Thompson, L. G.

    2001-12-01

    Ice core records are available from selected high-altitude, low- and mid-latitude ice caps. Comparisons are made among the histories from the Tibetan Plateau, the tropical Andes of South America, and Kilimanjaro in East Africa. Three of these records (Guliya in China, Huascarán in Peru, and Sajama in Bolivia) contain ice deposited during the Last Glacial Stage (LGS). The oxygen isotopic ratios (δ18O) of this ice suggest significant tropical cooling (~5 °C). Comparison of a global array of cores reveals large-scale similarities as well as important regional differences. The δ18O shift from the Early Holocene to the Last Glacial Maximum (LGM) is 5.4‰ on Sajama, 6.3‰ on Huascarán, ~5.3‰ in central Greenland, 6.6‰ at Byrd Station in Antarctica and 5.4‰ at Vostok, also in Antarctica. These records all show similar isotopic depletion, reflecting significant global cooling at the LGM. As continental ice sheets form only in high latitudes (>40°), those regions have provided most of the evidence for the pulsing of Quaternary glaciations. In low latitudes, glaciers are restricted to the high mountains, and only recently have enough long tropical ice core histories become available to investigate the timing of glaciations there. Long ice cores recovered to bedrock at 7 high-altitude (>5300 m) sites on three continents are investigated for synchroneity of their glaciation histories. The cores from Huascarán in Peru at 9°S and Sajama in Bolivia at 18°S contain continuous records back into the LGS. Both glaciers clearly survived the early Holocene warm period (9 to 6 ka B.P.), but neither contains a long record of glacial stage climate back to the previous interglacial. Rather, the published records from Huascarán and Sajama extend back ~19 kyr and 25 kyr, respectively. Hence, both mountaintops, among the highest in South America, appear to have been ice free during a time considered significantly colder than the Holocene. The records from Dasuopu (28°N) and

  1. A Search for Small-Scale Clumpiness in Dense Cores of Molecular Clouds

    CERN Document Server

    Pirogov, L E; 10.1134/S1063772908120020

    2009-01-01

    We have analyzed HCN(1-0) and CS(2-1) line profiles obtained with high signal-to-noise ratios toward distinct positions in three selected objects in order to search for small-scale structure in molecular cloud cores associated with regions of high-mass star formation. In some cases, ripples were detected in the line profiles, which could be due to the presence of a large number of unresolved small clumps in the telescope beam. The number of clumps for regions with linear scales of ~0.2-0.5 pc is determined using an analytical model and detailed calculations for a clumpy cloud model; this number varies in the range ~2×10^4-3×10^5, depending on the source. The clump densities range from ~3×10^5-10^6 cm^{-3}, and the sizes and volume filling factors of the clumps are ~(1-3)×10^{-3} pc and ~0.03-0.12. The clumps are surrounded by inter-clump gas with densities not lower than ~(2-7)×10^4 cm^{-3}. The internal thermal energy of the gas in the model clumps is much higher than their gravitational energy. Their mean ...
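As a rough consistency check of the quoted numbers, the volume filling factor implied by the clump counts and sizes can be estimated as f ≈ N·(l/L)³; the function and the specific values below are illustrative assumptions, not the paper's model:

```python
def volume_filling_factor(n_clumps: float, clump_size_pc: float,
                          region_size_pc: float) -> float:
    """Volume filling factor of n identical clumps inside a region.

    f = n * (l_clump / L_region)**3, assuming roughly cubic/spherical
    geometry for both clumps and region (an order-of-magnitude check).
    """
    return n_clumps * (clump_size_pc / region_size_pc) ** 3

# ~1e5 clumps of ~2e-3 pc inside a ~0.3 pc region give f ~ 0.03,
# which sits at the low end of the quoted range 0.03-0.12.
f = volume_filling_factor(1e5, 2e-3, 0.3)
```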

  2. Mpc-scale diffuse radio emission in two massive cool-core clusters of galaxies

    CERN Document Server

    Sommer, Martin W; Intema, Huib; Pacaud, Florian; Bonafede, Annalisa; Babul, Arif; Bertoldi, Frank

    2016-01-01

    Radio halos are diffuse synchrotron sources on scales of ~1 Mpc that are found in merging clusters of galaxies, and are believed to be powered by electrons re-accelerated by the merger-driven turbulence. We present measurements of extended radio emission on similarly large scales in two clusters of galaxies hosting cool cores: Abell 2390 and Abell 2261. The analysis is based on interferometric imaging with the JVLA, VLA and GMRT. We present detailed radio images of the targets, subtract the compact emission components, and measure the spectral indices for the diffuse components. The radio emission in A2390 extends beyond a known sloshing-like brightness discontinuity, and has a very steep in-band spectral slope at 1.5 GHz that is similar to some known ultra-steep spectrum radio halos. The diffuse signal in A2261 is more extended than in A2390 but has lower luminosity. X-ray morphological indicators, derived from XMM-Newton X-ray data, place these clusters in the category of relaxed or regular systems, althoug...

  3. An ice core derived 1013-year catchment-scale annual rainfall reconstruction in subtropical eastern Australia

    Science.gov (United States)

    Tozer, Carly R.; Vance, Tessa R.; Roberts, Jason L.; Kiem, Anthony S.; Curran, Mark A. J.; Moy, Andrew D.

    2016-05-01

    Paleoclimate research indicates that the Australian instrumental climate record (~100 years) does not cover the full range of hydroclimatic variability that is possible. To better understand the implications of this for catchment-scale water resources management, a 1013-year (1000-2012 common era (CE)) annual rainfall reconstruction was produced for the Williams River catchment in coastal eastern Australia. No high-resolution paleoclimate proxies are located in the region, and so a teleconnection between summer sea salt deposition recorded in ice cores from East Antarctica and rainfall variability in eastern Australia was exploited to reconstruct the catchment-scale rainfall record. The reconstruction shows that significantly longer and more frequent wet and dry periods were experienced in the preinstrumental period compared to the instrumental period. This suggests that existing drought and flood risk assessments underestimate the true risks due to the reliance on data and statistics obtained from only the instrumental record. This raises questions about the robustness of existing water security and flood protection measures and has serious implications for water resources management, infrastructure design and catchment planning. The method used in this proof-of-concept study is transferable and enables similar insights into the true risk of flood/drought to be gained for other paleoclimate-proxy-poor regions for which suitable remote teleconnected proxies exist. This will lead to improved understanding and ability to deal with the impacts of multi-decadal to centennial hydroclimatic variability.

  4. WETTABILITY AND IMBIBITION: MICROSCOPIC DISTRIBUTION OF WETTING AND ITS CONSEQUENCES AT THE CORE AND FIELD SCALES

    Energy Technology Data Exchange (ETDEWEB)

    Jill S. Buckley; Norman R. Morrow; Chris Palmer; Purnendu K. Dasgupta

    2003-02-01

    The questions of reservoir wettability have been approached in this project from three directions. First, we have studied the properties of crude oils that contribute to wetting alteration in a reservoir. A database of more than 150 different crude oil samples has been established to facilitate examination of the relationships between crude oil chemical and physical properties and their influence on reservoir wetting. In the course of this work an improved SARA analysis technique was developed and major advances were made in understanding asphaltene stability including development of a thermodynamic Asphaltene Solubility Model (ASM) and empirical methods for predicting the onset of instability. The CO-Wet database is a resource that will be used to guide wettability research in the future. The second approach is to study crude oil/brine/rock interactions on smooth surfaces. Contact angle measurements were made under controlled conditions on mica surfaces that had been exposed to many of the oils in the CO-Wet database. With this wealth of data, statistical tests can now be used to examine the relationships between crude oil properties and the tendencies of those oils to alter wetting. Traditionally, contact angles have been used as the primary wetting assessment tool on smooth surfaces. A new technique has been developed using an atomic forces microscope that adds a new dimension to the ability to characterize oil-treated surfaces. Ultimately we aim to understand wetting in porous media, the focus of the third approach taken in this project. Using oils from the CO-Wet database, experimental advances have been made in scaling the rate of imbibition, a sensitive measure of core wetting. Application of the scaling group to mixed-wet systems has been demonstrated for a range of core conditions. Investigations of imbibition in gas/liquid systems provided the motivation for theoretical advances as well. As a result of this project we have many new tools for studying

  5. Finding generically stable measures

    CERN Document Server

    Simon, Pierre

    2010-01-01

    We discuss two constructions for obtaining generically stable Keisler measures in an NIP theory. First, we show how to symmetrize an arbitrary invariant measure to obtain a generically stable one from it. Next, we show that suitable sigma-additive probability measures give rise to generically stable measures. Also included is a proof that generically stable measures over o-minimal theories and the p-adics are smooth.

  6. The generic article

    NARCIS (Netherlands)

    Farkas, D.F.; Swart, Henriëtte de

    2005-01-01

    We take a fresh look at the connection between genericity and (in)definiteness by reconsidering a long-standing puzzle concerning the relation between definiteness and genericity. We contrast English on the one hand and Romance languages and Hungarian on the other, focusing on generic sentences invo

  7. Testing the Large-Scale Environments of Cool-core and Noncool-core Clusters with Clustering Bias

    CERN Document Server

    Medezinski, Elinor; Coupon, Jean; Cen, Renyue; Gaspari, Massimo; Strauss, Michael A; Spergel, David N

    2016-01-01

    There is a well-observed bimodality in X-ray astronomy between cool-core (CC) and non-cool-core (NCC) clusters, but the origin of this distinction is still largely unknown. Competing theories can be divided into internal (inside-out), in which internal physical processes transform or maintain the NCC phase, and external (outside-in), in which the cluster type is determined by its initial conditions, which in turn lead to different formation histories (i.e., assembly bias). We propose a new method that uses the relative assembly bias of CC to NCC clusters, as determined via the two-point cluster-galaxy cross-correlation function (CCF), to test whether formation history plays a role in determining their nature. We apply our method to 48 ACCEPT clusters, which have well resolved central entropies, and cross-correlate with the SDSS-III/BOSS LOWZ galaxy catalog. We find that the relative bias of NCC over CC clusters is $b = 1.42 \pm 0.35$ ($1.6\sigma$ different from unity). Our measurement is limited by the small ...

  8. Substituting supplementary subtests for core subtests on reliability of WISC-IV Indexes and Full Scale IQ.

    Science.gov (United States)

    Ryan, Joseph J; Glass, Laura A

    2006-02-01

    The effects of replacing core subtests with supplementary subtests on composite score reliabilities were evaluated for the WISC-IV Indexes and Full Scale IQ. When Wechsler's guidelines are followed (i.e., only one substitution for each Index, and no more than two substitutions, from different Indexes, when assessing the Full Scale IQ), summary score reliabilities remain high, and measurement error, as defined by confidence intervals around obtained scores, never increases by more than 1 index score point. In three instances, substitution of a supplementary subtest for a core subtest actually increased the reliabilities and decreased the amount of associated measurement error.
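The confidence intervals mentioned above are conventionally built from the standard error of measurement, SEM = SD·√(1 − r). A minimal sketch (the reliability value is an illustrative assumption; SD = 15 is the conventional Wechsler composite-score metric):

```python
import math

def score_confidence_interval(obtained, sd, reliability, z=1.96):
    """Confidence interval around an obtained test score.

    Uses the standard error of measurement, SEM = SD * sqrt(1 - r),
    where r is the composite's reliability; z = 1.96 gives a 95% CI.
    """
    sem = sd * math.sqrt(1.0 - reliability)
    return (obtained - z * sem, obtained + z * sem)

# Illustrative: with SD = 15 and reliability .96, SEM = 3.0 points,
# so the 95% CI spans roughly +/- 5.9 points around the obtained score.
lo, hi = score_confidence_interval(100.0, 15.0, 0.96)
```

A drop in composite reliability after a subtest substitution widens this interval, which is why the abstract frames measurement error in terms of confidence-interval width.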

  9. Is there a connection between Earth's core and climate at multidecadal time scales?

    Science.gov (United States)

    Lambert, Sébastien; Marcus, Steven; de Viron, Olivier

    2017-04-01

    The length-of-day (LOD) undergoes multidecadal variations of several milliseconds (ms) attributed to changes in the fluid outer core angular momentum. These variations resemble a quasi-periodic oscillation of duration 60 to 70 years, although the periodicity (and its exact length) is disputable because of the relatively short observational time span and the lower quality of the observations before the 20th century. Interestingly, similar variations show up in various measured or reconstructed climate indices including the sea surface (SST) and surface air (SAT) temperatures. It has been shown in several studies that LOD variations lead SST and SAT variations by a few years. No clear scenarios have been raised so far to explain the link between external, astronomical forcing (e.g., Solar wind), Earth's rotation (core-driven torsional oscillations), and Earth's surface processes (climate variations) at these time scales. Accumulating evidence, however, suggests that the centrifugal tides generated by multidecadal LOD variations act as a 'valve' to control the transfer of thermal energy from the lithosphere to the surface via geothermal fluxes. This hypothesis is supported by recent studies reporting significant correlations between tidal and rotational excitation and seafloor and surface volcanism. In this study, we extend recent works by us and by other independent authors by re-assessing the correlations between multidecadal LOD, climate indices, Solar and magnetic activities, as well as gridded data including SST, SAT, and cloud cover. We pay special attention to the time lags: when a significant correlation is found, the value of the lag may help to discriminate between various possible scenarios. We locate some 'hot spots', particularly in the Atlantic ocean and along the trajectory of the upper branch of the Atlantic meridional overturning circulation (AMOC), where the 70-yr oscillation is strongly marked. In addition, we discuss the possibility for centrifugal

  10. Comparison of Prestellar Core Elongations and Large-Scale Molecular Cloud Structures in the Lupus I Region

    CERN Document Server

    Poidevin, F; Angile, F E; Benton, S J; Chapin, E L; Devlin, M J; Fissel, L M; Fukui, Y; Gandilo, N N; Gundersen, J O; Hargrave, P C; Klein, J; Korotkov, A L; Matthews, T G; Moncelsi, L; Mroczkowski, T K; Netterfield, C B; Novak, G; Nutter, D; Olmi, L; Pascale, E; Savini, G; Scott, D; Shariff, J A; Soler, J D; Tachihara, K; Thomas, N E; Truch, M D P; Tucker, C E; Tucker, G S; Ward-Thompson, D

    2014-01-01

    Turbulence and magnetic fields are expected to be important for regulating molecular cloud formation and evolution. However, their effects on subparsec to 100-parsec scales, leading to the formation of starless cores, are not well understood. We investigate the prestellar core structure morphologies obtained from analysis of the Herschel-SPIRE 350 $\mu$m maps of the Lupus I cloud. This distribution is first compared on a statistical basis to the large-scale shape of the main filament. We find the distribution of the elongation position angle of the cores to be consistent with a random distribution, which means no specific orientation of the morphology of the cores is observed with respect to a large-scale filament shape model for Lupus I, or relative to a large-scale bent filament model. This distribution is also compared to the mean orientation of the large-scale magnetic fields probed at 350 $\mu$m with the Balloon-borne Large Aperture Telescope for Polarimetry (BLASTPol) during its 2010 campaign. Here again...

  11. Climatic changes on orbital and sub-orbital time scale recorded by the Guliya ice core in Tibetan Plateau

    Institute of Scientific and Technical Information of China (English)

    姚檀栋; 徐柏青; 蒲健辰

    2001-01-01

    Based on ice core records from the Tibetan Plateau and Greenland, the features and possible causes of climatic changes on orbital and sub-orbital time scales were discussed. Orbital time scale climatic change recorded in ice cores from the Tibetan Plateau typically leads that from polar regions, which indicates that climatic change in the Tibetan Plateau might occur earlier than in polar regions. The solar radiation change is a major factor that dominates climatic change on the orbital time scale. However, climatic events on the sub-orbital time scale occurred later in the Tibetan Plateau than in the Arctic region, indicating a different mechanism. For example, the Younger Dryas and Heinrich events took place earlier in the Greenland ice core record than in the Guliya ice core record. It is reasonable to propose the hypothesis that these climatic events were possibly affected by the Laurentide Ice Sheet. Therefore, ice sheets are critically important to climatic change on the sub-orbital time scale in some ice ages.

  12. Dimensional regularization is generic

    CERN Document Server

    Fujikawa, Kazuo

    2016-01-01

    The absence of the quadratic divergence in the Higgs sector of the Standard Model in the dimensional regularization is usually regarded as an exceptional property of a specific regularization. To understand what is going on in the dimensional regularization, we illustrate how to reproduce the results of the dimensional regularization for the $\lambda\phi^{4}$ theory in a more conventional regularization such as the higher derivative regularization; the basic postulate involved is that the quadratically divergent induced mass, which is independent of the scale change of the physical mass, is kinematical and unphysical. This is consistent with the derivation of the Callan-Symanzik equation, which is a comparison of two theories with slightly different masses, for the $\lambda\phi^{4}$ theory without encountering the quadratic divergence. We thus suggest that the dimensional regularization is generic in a bottom-up approach starting with a successful low-energy theory. We also define a modified version of t...

  13. The validity of generic trends on multiple scales in rock-physical and rock-mechanical properties of the Whitby Mudstone, United Kingdom

    NARCIS (Netherlands)

    Douma, L.A.N.R.; Primarini, M.I.W.; Houben, M.E.; Barnhoorn, A.

    2017-01-01

    Finding generic trends in mechanical and physical rock properties will help to make predictions of the rock-mechanical behaviour of shales. Understanding the rock-mechanical behaviour of shales is important for the successful development of unconventional hydrocarbon reservoirs. This paper presents

  15. Qualification of a full plant nodalization for the prediction of the core exit temperature through a scaling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, J., E-mail: jordi.freixa-terradas@upc.edu; Martínez-Quiroga, V., E-mail: victor.martinez.quiroga@upc.edu; Reventós, F., E-mail: francesc.reventos@upc.edu

    2016-11-15

    Highlights: • Core exit temperature is used in PWRs as an indication of core heat-up. • Qualification of full-scale nuclear reactors by means of a scaling methodology. • Scaling of RELAP5 calculations to full-scale power plants. - Abstract: System codes and their necessary power plant nodalizations are an essential step in thermal-hydraulic safety analysis. In order to assess the safety of a particular power plant, in addition to the validation and verification of the code, the nodalization of the system needs to be qualified. Since most existing experimental data come from scaled-down facilities, any qualification process must therefore address scale considerations. The Group of Thermal Hydraulic Studies at the Technical University of Catalonia has developed a scaling-up methodology (SCUP) for the qualification of full-scale nodalizations through a systematic procedure based on the extrapolation of post-test simulations of Integral Test Facility experiments. In the present work, the SCUP methodology will be employed to qualify the nodalization of the Ascó NPP, a Pressurized Water Reactor (PWR), for the reproduction of an important safety phenomenon: the effectiveness of the Core Exit Temperature (CET) as an Accident Management (AM) indicator. Given the difficulties in placing measurements in the core region, CET measurements are used as a criterion for the initiation of safety operational procedures during accidental conditions in PWRs. However, the CET response has some limitations in detecting inadequate core cooling, simply because the measurement is not taken in the position where the cladding exposure occurs. In order to apply the SCUP methodology, the OECD/NEA ROSA-2 Test 3, an SBLOCA in the hot leg, has been selected as a starting point. This experiment was conducted at the Large Scale Test Facility (LSTF), a facility operated by the Japan Atomic Energy Agency (JAEA), and was focused on the assessment of the effectiveness of AM actions triggered by

  16. USGS Small-scale Dataset - 1:1,000,000-Scale Core Based Statistical Areas 201309 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer portrays Core Based Statistical Areas in the United States and Puerto Rico. The map layer was created from the CENSUS 2010 TIGER/Line files produced...

  17. An ice core derived 1013-year catchment scale annual rainfall reconstruction in subtropical eastern Australia

    Science.gov (United States)

    Tozer, C. R.; Vance, T. R.; Roberts, J.; Kiem, A. S.; Curran, M. A. J.; Moy, A. D.

    2015-12-01

    Paleoclimate research indicates that the instrumental climate record (~100 years in Australia) does not cover the full range of hydroclimatic variability possible. To better understand the implications of this for catchment-scale water resources management, an annual rainfall reconstruction is produced for the Williams River catchment in coastal eastern Australia. No high resolution palaeoclimate proxies are located in the region and so a teleconnection between summer sea salt deposition recorded in ice cores from East Antarctica and rainfall variability in eastern Australia was exploited to reconstruct 1013 years of rainfall (AD 1000-2012). The reconstruction shows that significantly longer and more frequent wet and dry periods were experienced in the preinstrumental compared to the instrumental period. This suggests that existing drought and flood risk assessments underestimate the true risks due to the reliance on data and statistics obtained from only the instrumental record. This raises questions about the robustness of existing water security and flood protection measures and has serious implications for water resources management, infrastructure design, and catchment planning. The method used in this proof of concept study is transferable and enables similar insights into the true risk of flood/drought to be gained for other locations that are teleconnected to East Antarctica. This will lead to improved understanding and ability to deal with the impacts of multidecadal to centennial hydroclimatic variability.

  18. Integrating multi-scale geophysical and drill-core data to improve hydraulic characterization of continental sedimentary basins

    Science.gov (United States)

    Kukowski, Nina; Methe, Pascal; Goepel, Andreas

    2017-04-01

    Physical properties of rocks in the uppermost continental crust, e.g. in sedimentary basins, are very heterogeneously distributed and anisotropic, making it necessary to apply advanced post-processing techniques to geophysical data. Whereas methods such as electrical resistivity or seismic tomography only allow physical property variability to be identified on a scale of roughly several tens to several hundreds of metres, drill cores reveal physical heterogeneity at the cm scale. To study the impact of small-scale acoustic and hydraulic heterogeneity on fluid flow in a sedimentary basin, we use combined data sets from the Thuringian Basin in Germany, a small southern extension of the North German Basin characterised by Permian to Triassic sediments. Our data sets consist of three reflection seismic lines acquired within the framework of the multidisciplinary project INFLUINS (INtegrated FLUid dynamics IN Sedimentary basins) and as a site survey for deep drilling, geophysical logging data from a 1,179 m deep drill hole in the centre of the Thuringian Basin, and Multi Sensor Core Logger (MSCL) data from the cores recovered from this drill hole. Geophysical borehole logging was performed immediately after drilling at the highest vertical resolution possible (about 10 cm) using state-of-the-art commercial logging tools. MSCL data were acquired at an even higher resolution of about 1 to 2 cm, which enables both calibrating the logging data and zooming in on the spatial heterogeneity of physical properties. These measurements are complemented by laboratory measurements of rock physical properties (e.g. thermal conductivity, permeability) on selected core samples. Here, we mainly focus on seismic (sonic velocity, density) and hydraulic (porosity, permeability) parameters. 
This multi-methodological approach allows us on the one hand to estimate improved local to regional average values for physical parameters but, most importantly, also to highlight the role of thin layers, the physical

  19. Generic drugs in dermatology: part I.

    Science.gov (United States)

    Payette, Michael; Grant-Kels, Jane M

    2012-03-01

    The cost of health care in the United States is increasing. In order to help control these rising costs, all parties involved in the delivery of health care, including dermatologists, need to be part of the solution of ethically reducing the cost of delivery of care. One potential means of meeting this goal is to increase the use of generic medications in daily practice. Generic medications can offer equally efficacious therapy at significantly lower prices, which can translate into large scale savings for the individual patient, the payer, and the overall health care system. Herein we provide an overview of new drug development, review the history of the generic drug industry, describe how generic drugs are approved by the US Food and Drug Administration, and define the concepts of bioequivalence and therapeutic equivalence. In part II, we explore various factors impacting generic drug use, provide cost analyses of dermatologic brand name and generic drugs, and review data addressing potential differences in the effectiveness of brand name versus generic drugs in dermatology. The cost of brand name and generic medications is highly variable by pharmacy, state, and payer. We used one source (www.drugstore.com) as an example and for consistency across all medications discussed herein. Prices included here may not reflect actual retail prices across the United States.

  20. Crustal concealing of small-scale core-field secular variation

    DEFF Research Database (Denmark)

    Hulot, G.; Olsen, Nils; Thebault, E.;

    2009-01-01

    The Earth's magnetic field is mainly produced within the Earth's liquid and electrically conducting core, as a result of a process known as the geodynamo. Many other sources also contribute to the magnetic signal accessible to observation at the Earth's surface, partly obscuring the main core...

  1. Effect of the scale of quantitative trait data on the representativeness of a cotton germplasm sub-core collection

    Institute of Scientific and Technical Information of China (English)

    Jian-cheng WANG; Jin HU; Ya-jing GUAN; Yan-fang ZHU

    2013-01-01

    A cotton germplasm collection with data for 20 quantitative traits was used to investigate the effect of the scale of quantitative trait data on the representativeness of plant sub-core collections. The relationship between the representativeness of a sub-core collection and two influencing factors, the number of traits and the sampling percentage, was studied. A mixed linear model approach was used to eliminate environmental errors and predict genotypic values of accessions. Sub-core collections were constructed using a least distance stepwise sampling (LDSS) method combining standardized Euclidean distance and an unweighted pair-group method with arithmetic means (UPGMA) cluster method. The mean difference percentage (MD), variance difference percentage (VD), coincidence rate of range (CR), and variable rate of coefficient of variation (VR) served as evaluation parameters. Monte Carlo simulation was conducted to study the relationship among the number of traits, the sampling percentage, and the four evaluation parameters. The results showed that the representativeness of a sub-core collection was affected greatly by the number of traits and the sampling percentage, and that these two influencing factors were closely connected. Increasing the number of traits improved the representativeness of a sub-core collection when the data of genotypic values were used. The change in the genetic diversity of sub-core collections with different sampling percentages showed a linear tendency when the number of traits was small, and a logarithmic tendency when the number of traits was large. However, the change in the genetic diversity of sub-core collections with different numbers of traits always showed a strong logarithmic tendency when the sampling percentage was changing. A CR threshold method based on Monte Carlo simulation is proposed to determine the rational number of traits for a relevant sampling percentage of a sub-core collection.
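Two of the evaluation parameters can be illustrated with a short sketch. The formulas below follow the definitions commonly used in the core-collection literature (CR as the mean ratio of sub-core to whole-collection trait ranges, VR as the mean ratio of coefficients of variation, both in percent); the trait names and data are invented for illustration and are not from the study.

```python
import statistics

def cr_vr(whole, sub):
    """Coincidence rate of range (CR) and variable rate of the
    coefficient of variation (VR), both in percent.  `whole` and
    `sub` map trait name -> list of genotypic values for the full
    collection and the sub-core collection, respectively."""
    m = len(whole)
    # CR: mean ratio of the sub-core trait range to the full range.
    cr = 100.0 * sum(
        (max(sub[t]) - min(sub[t])) / (max(whole[t]) - min(whole[t]))
        for t in whole) / m
    # VR: mean ratio of the coefficients of variation (stdev/mean).
    def cv(xs):
        return statistics.stdev(xs) / statistics.mean(xs)
    vr = 100.0 * sum(cv(sub[t]) / cv(whole[t]) for t in whole) / m
    return cr, vr

# Invented data for two traits; a good sub-core retains the extremes.
whole = {"boll_weight": [4.0, 4.5, 5.0, 5.5, 6.0, 6.5],
         "fiber_length": [26.0, 27.0, 28.0, 29.0, 30.0, 31.0]}
sub = {"boll_weight": [4.0, 5.0, 6.5],
       "fiber_length": [26.0, 28.5, 31.0]}
cr, vr = cr_vr(whole, sub)
```

Because this sub-core keeps the extreme accessions of both traits, CR evaluates to 100%, while a VR above 100% reflects the higher relative variability expected of a well-chosen core subset.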

  2. Utilisation of real-scale renewable energy test facility for validation of generic wind turbine and wind power plant controller models

    Energy Technology Data Exchange (ETDEWEB)

    Zeni, Lorenzo; Hesselbæk, Bo; Bech, John; Sørensen, Poul Ejnar; Gevorgian, Vahan; Wallen, Robb

    2016-09-01

    This article presents an example of the application of a modern test facility conceived for experiments on the integration of renewable energy in the power system. The capabilities of the test facility are used to validate dynamic simulation models of wind power plants and their controllers. The models are based on standard and generic blocks. The successful validation of events related to the control of active power (control phenomena in the <10 Hz range, including frequency control and power oscillation damping) is described, demonstrating the capabilities of the test facility and charting the course for future work and improvements.

  3. Bioequivalence of generic drugs.

    Science.gov (United States)

    Andrade, Chittaranjan

    2015-09-01

    Generic drugs are bioequivalent to the original brand; this is a prerequisite for marketing approval. It is theoretically possible that one generic drug may overestimate the pharmacokinetic (PK) parameters of the original and another generic may underestimate these PK parameters; in consequence, these 2 generics may not be bioequivalent between themselves. The result could be loss of efficacy or development of drug-related adverse effects if these generics are interchanged in stable patients. In a recent study involving 292 indirect comparisons of generic formulations of 9 different drugs, mathematical modeling showed that in most cases (87.0% for maximum concentration, 90.1% for area under the curve, and 80.5% for both) generic drugs are bioequivalent to each other. These reassuring findings notwithstanding, prudence dictates that, in stable patients, generic drugs should be interchanged only if there is a good reason for it. This is because bioequivalent brands of drugs may differ in their excipient content, and this can result in variations in safety profiles.
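The bioequivalence criterion underlying such comparisons can be sketched numerically. Regulatory assessment requires the 90% confidence interval of the test/reference geometric-mean ratio of Cmax and AUC to fall within 80–125%; the simplified point-estimate check below, with made-up AUC values, only illustrates the idea.

```python
import math

def within_be_limits(test_values, ref_values, lo=0.80, hi=1.25):
    """Check whether the geometric-mean ratio of a PK parameter
    (e.g. Cmax or AUC) for a test vs. reference formulation lies
    inside the conventional 80-125% bioequivalence window.  A real
    regulatory assessment tests the 90% confidence interval of the
    ratio; this point-estimate check is a simplified sketch."""
    def gmean(xs):
        return math.exp(sum(math.log(x) for x in xs) / len(xs))
    ratio = gmean(test_values) / gmean(ref_values)
    return lo <= ratio <= hi, ratio

# Made-up AUC values (ng*h/mL) for two generic formulations being
# compared indirectly against each other.
ok, ratio = within_be_limits([98, 105, 101, 110], [100, 102, 99, 108])
```

Two generics that each sit near opposite edges of the window against the original brand could still fail this check against each other, which is the scenario the indirect-comparison study quantifies.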

  4. Generic Fortran Containers (GFC)

    Energy Technology Data Exchange (ETDEWEB)

    2016-09-01

    The Fortran language does not provide a standard library that implements generic containers, like linked lists, trees, dictionaries, etc. The GFC software provides an implementation of generic Fortran containers natively written in Fortran 2003/2008 language. The following containers are either already implemented or planned: Stack (done), Linked list (done), Tree (done), Dictionary (done), Queue (planned), Priority queue (planned).
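What a generic container means in practice can be shown language-agnostically. The minimal type-parameterized stack below, written in Python rather than Fortran, mirrors the concept; it is not GFC's actual API.

```python
from typing import Generic, List, TypeVar

T = TypeVar("T")

class Stack(Generic[T]):
    """Minimal generic (type-parameterized) stack -- conceptually the
    kind of container a library such as GFC supplies for Fortran."""
    def __init__(self) -> None:
        self._items: List[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        # Raises IndexError on an empty stack, like list.pop().
        return self._items.pop()

    def __len__(self) -> int:
        return len(self._items)

s: Stack[int] = Stack()
s.push(1)
s.push(2)
top = s.pop()
```

The same class works unchanged for any element type, which is exactly the reuse that Fortran 2003/2008 achieves through its own polymorphism features.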

  5. 1:1,000,000-Scale Core Based Statistical Areas - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer portrays Core Based Statistical Areas in the United States and Puerto Rico. The map layer was created from the CENSUS 2010 TIGER/Line files produced...

  6. Centennial-scale records of total organic carbon in sediment cores from the South Yellow Sea, China

    Science.gov (United States)

    Zhu, Qing; Lin, Jia; Hong, Yuehui; Yuan, Lirong; Liu, Jinzhong; Xu, Xiaoming; Wang, Jianghai

    2017-05-01

    Global carbon cycling is a significant factor that controls climate change. The centennial-scale variations in total organic carbon (TOC) content and its sources in marginal sea sediments may reflect the influence of human activities on global climate change. In this study, two fine-grained sediment cores from the Yellow Sea Cold Water Mass of the South Yellow Sea were used to systematically determine TOC contents and stable carbon isotope ratios. These results were combined with previous data of black carbon and 210Pb dating, from which we reconstructed the centennial-scale initial sequences of TOC, terrigenous TOC (TOCter) and marine autogenous TOC (TOCmar) after selecting suitable models to correct the measured TOC (TOCcor). These sequences showed that the TOCter decreased with time in both cores while the TOCmar increased, particularly the rapid growth in core H43 since the late 1960s. According to the correlation between the Huanghe (Yellow) River discharge and the TOCcor, TOCter, or TOCmar, we found that the TOCter in the two cores mainly derived from the Huanghe River and was transported by it, and that higher Huanghe River discharge could strengthen the decomposition of TOCmar. The newly obtained initial TOC sequences provide important insights into the interaction between human activities and natural processes.

  7. Calculations of 3D full-scale VVER fuel assembly and core models using MCU and BIPR-7A codes

    Energy Technology Data Exchange (ETDEWEB)

    Aleshin, Sergey S.; Bikeev, Artem S.; Bolshagin, Sergey N.; Kalugin, Mikhail A.; Kosourov, Evgeniy K.; Pavlovichev, Aleksandr M.; Pryanichnikov, Aleksandr V.; Sukhino-Khomenko, Evgenia A.; Shcherenko, Anna I.; Shcherenko, Anastasia I.; Shkarovskiy, Denis A. [Nuclear Research Centre ' ' Kurchatov Institute' ' , Moscow (Russian Federation)

    2015-09-15

    Two types of calculations were made to compare BIPR-7A and MCU results for 3D full-scale models. First, the EPS (emergency protection system) efficiency and in-core power distributions were analyzed for an equilibrium fuel load of a VVER-1000 assuming its operation within an 18-month cycle. Computations were performed without feedbacks and with fuel burnup distributed over the core. Afterwards, 3D infinite lattices of full-scale VVER-1000 fuel assemblies with 4.4%-enriched uranium fuel and 4.4%-enriched uranium-erbium fuel (1 wt.% Er2O3) were considered. Computations were performed with feedbacks and fuel burnup at constant power level. For different time moments the effective multiplication factor and power distribution were obtained. EPS efficiency and reactivity effects at chosen time moments were analyzed.

  8. Bosonics: Phononics, Magnonics, Plasmonics in Nano-Scale Disorder(Nanonics), Metamaterials, Astro-Seismology (Meganonics): Brillouin-Siegel GENERIC: Generalized-Disorder Collective-Boson Mode-Softening Universality-Principle (G...P) With PIPUB Many-Body Localization

    Science.gov (United States)

    Siegel, Edward

    Siegel and Matsubara[Statphys-13(`77) Intl.Conf.Lattice-Dyn.(`77)Scripta Met.13,913(`80)]JMMM:5, 1, 84 (`77)22,1:41,58(`80)Mag.Lett.(`80)Phys./Chem.Liquids:4,(4) (`75)5,(1)(76)] generalization to GENERIC Siegel[J.Non-Xline-Sol.40,453(`80)] G...P GENERIC Brillouin[Wave-Propagation in Periodic-Structures(`22)]-Landau[`41]-Feynman[`51]-de Boer[in Phonons/Phonon-Interactions(`64)]-Egelstaff[Intro.Liquid-State(`65)]-Hubbard-Beebe[J.Phys.C(`67)]-``Anderson''[1958]- Siegel [J.Non-Xl.-Sol. 40, 453(`80)] GENERIC many-body localization. GENERIC Hubbard-Beebe[J.Phys.C(`67)] static structure-factor S(k) modulated kinetic-energy ω(k) = ℏ ⌃(2)k⌃(2)/2mS(k) expressing G....P(``bass-ackwardly'') aka homogeneity and isotropy creates GENERIC G...P with GENERIC pseudo-isotropic pseudo-Umklapp backscattering (PIPUB) for GENERIC many-body localization of and/or by mutually interacting collective-bosons: phonons(phononics) with magnons(magnonics) with plasmons(plasmonics) with fermions (electros, holes)...etc. in nano-scale ``disorder'', metamaterials and on very-macro-scales (surprisingly) Bildsten et.al. astro-seismology(meganonics) of red-giant main-sequence stars(Mira, Betelguese)!

  9. The effects of core stability strength exercise on muscle activity and trunk impairment scale in stroke patients.

    Science.gov (United States)

    Yu, Seong-Hun; Park, Seong-Doo

    2013-01-01

    The purpose of this study was to examine the effects of core stability-enhancing exercises on the lower trunk and muscle activity of stroke patients. The control group (n = 10) underwent standard exercise therapy, while the experiment group (n = 10) underwent both the core stability-enhancing exercise and standard exercise therapy simultaneously. The standard exercise therapy applied to the two groups included weight bearing, weight shifts, and joint movements to improve flexibility and range of motion. The core stability-enhancing exercise was performed 5 times a week for 30 min over a period of 4 weeks in the room where the patients were treated. For all 20 subjects, the items measured before the exercise were measured again after the therapeutic intervention, and changes in muscle activity of the lower trunk were evaluated. The activity and stability of the core muscles were measured using surface electromyography and the trunk impairment scale (TIS). The mean TIS score and muscle activity of the lower trunk increased significantly in the experiment group after performing the core stability-enhancing exercise. These results indicate that the core stability-enhancing exercise is effective in improving muscle activity of the lower trunk, which is affected by hemiplegia.

  10. Unlocking the Physiochemical Controls on Organic Carbon Dynamics from the Soil Pore- to Core-Scale

    Science.gov (United States)

    Smith, A. P.; Tfaily, M. M.; Bond-Lamberty, B. P.; Todd-Brown, K. E.; Bailey, V. L.

    2015-12-01

    The physical organization of soil includes pore networks of varying size and connectivity. These networks control microbial access to soil organic carbon (C) by spatially separating microorganisms and C by both distance and size exclusion. The extent to which this spatially isolated C is vulnerable to microbial transformation under hydrologically dynamic conditions is unknown, and limits our ability to predict the source and sink capacity of soils. We investigated the effects of shifting hydrologic connectivity and soil structure on greenhouse gas C emissions from surface soils collected from the Disney Wilderness Preserve (Florida, USA). We subjected intact soil cores and re-packed homogenized soil cores to simulated groundwater rise or precipitation, monitoring their CO2 and CH4 emissions over 24 hours. Soil pore water was then extracted from each core using different suctions to sample water retained by pore throats of different sizes and then characterized by Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry. Greater respiration rates were observed from homogenized cores compared to intact cores, and from soils wet from below, in which the wetting front is driven by capillary forces, filling fine pores first. This suggests that C located in fine pores may turn over via diffusion processes that lead to the colocation of this C with other resources and microorganisms. Both the complexity and concentration of soluble-C increased with decreasing pore size domains. Pore water extracted from homogenized cores had greater C concentrations than from intact cores, with the greatest concentrations in pore waters sampled from very fine pores, highlighting the importance of soil structure in physically protecting C. These results suggest that the spatial separation of decomposers from C is a key mechanism stabilizing C in these soils. 
Further research is ongoing to accurately represent this protection mechanism, and the conditions under which it breaks

  11. Cost assessment of a generic magnetic fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sheffield, J.; Dory, R.A.; Cohn, S.M.; Delene, J.G.; Parsly, L.F.; Ashby, D.E.T.F.; Reiersen, W.T.

    1986-03-01

    A generic reactor model is used to examine the economic viability of generating electricity by magnetic fusion. The simple model uses components that are representative of those used in previous reactor studies of deuterium-tritium-burning tokamaks, stellarators, bumpy tori, reversed-field pinches (RFPs), and tandem mirrors. Conservative costing assumptions are made. The generic reactor is not a tokamak; rather, it is intended to emphasize what is common to all magnetic fusion reactors. The reactor uses a superconducting toroidal coil set to produce the dominant magnetic field. To this extent, it is not as good an approximation to systems such as the RFP in which the main field is produced by a plasma current. The main output of the study is the cost of electricity as a function of the weight and size of the fusion core - blanket, shield, structure, and coils. The model shows that a 1200-MW(e) power plant with a fusion core weight of about 10,000 tonnes should be competitive in the future with fission and fossil plants. Studies of the sensitivity of the model to variations in the assumptions show that this result is not sensitively dependent on any given assumption. Of particular importance is the result that a fusion reactor of this scale may be realized with only moderate advances in physics and technology capabilities.

  12. A Generic Dynamic Emulator

    CERN Document Server

    Albert, Carlo

    2011-01-01

    In applied sciences, we often deal with deterministic simulation models that are too slow for simulation-intensive tasks such as calibration or real-time control. In this paper, an emulator for a generic dynamic model, given by a system of ordinary non-linear differential equations, is developed. The non-linear differential equations are linearized and Gaussian white noise is added to account for the non-linearities. The resulting linear stochastic system is conditioned on a set of solutions of the non-linear equations that have been calculated prior to the emulation. A path-integral approach is used to derive the Gaussian distribution of the emulated solution. The solution reveals that most of the computational burden can be shifted to the conditioning phase of the emulator and the complexity of the actual emulation step only scales like $\mathcal{O}(Nnm^2)$, where $N$ is the number of time-points at which the solution is to be emulated, $n$ the number of solutions the emulator is conditioned on and $m$ the n...
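The linearization step at the heart of such an emulator can be sketched as follows. The logistic dynamics, the Euler scheme, and all numbers are assumptions chosen for illustration; the Gaussian-noise term and the path-integral conditioning described in the paper are omitted.

```python
def f(x):
    # Example nonlinear dynamics (logistic growth) -- an assumed
    # stand-in, not the system studied in the paper.
    return x * (1.0 - x)

def dfdx(x, h=1e-6):
    # Central finite-difference derivative of the dynamics.
    return (f(x + h) - f(x - h)) / (2.0 * h)

def emulate(x0, ref, dt):
    """Euler-integrate the dynamics linearized around a precomputed
    reference trajectory `ref`: dx/dt ~ f(r) + f'(r) * (x - r)."""
    x = x0
    out = [x]
    for r in ref[:-1]:
        x = x + dt * (f(r) + dfdx(r) * (x - r))
        out.append(x)
    return out

dt = 0.01
# Reference solution of the full nonlinear model (Euler, 200 steps) --
# the "conditioning" data computed prior to emulation.
ref = [0.1]
for _ in range(200):
    ref.append(ref[-1] + dt * f(ref[-1]))

# Emulate a nearby initial condition using only the linearization.
em = emulate(0.12, ref, dt)
```

Each emulation step costs a constant amount of work per time-point, which is the spirit of the cheap emulation phase the abstract describes, while the expensive nonlinear solves all happen up front.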

  13. redGEM: Systematic reduction and analysis of genome-scale metabolic reconstructions for development of consistent core metabolic models.

    Science.gov (United States)

    Ataman, Meric; Hernandez Gardiol, Daniel F; Fengos, Georgios; Hatzimanikatis, Vassily

    2017-07-01

    Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks as they encapsulate all known metabolic capabilities of the organisms from genes to proteins to their functions. However, the complexity of these large metabolic networks often hinders their utility in various practical applications. Although reduced models are commonly used for modeling and for integrating experimental data, they are often inconsistent across different studies and laboratories due to different criteria and detail, which can compromise transferability of the findings and also integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to be able to capture key properties of the genome-scale models and preserve consistency in terms of biomass and by-product yields, flux and concentration variability and gene essentiality. The development of these "consistently-reduced" models will help to clarify and facilitate integration of different experimental data to draw new understanding that can be directly extended to genome-scale models.
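The graph-based search stage of such a reduction can be illustrated with a toy breadth-first expansion around seed metabolites. The network and metabolite names below are invented for illustration, and redGEM's actual procedure additionally involves optimization-based steps not shown here.

```python
from collections import deque

def expand_core(adjacency, seeds, max_depth=1):
    """Collect all nodes within `max_depth` hops of the seed
    metabolites in a metabolic-network graph -- a toy version of the
    graph-search stage used to connect core subsystems.  `adjacency`
    maps node -> iterable of neighbouring nodes."""
    core = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue  # do not expand beyond the requested radius
        for nb in adjacency.get(node, ()):
            if nb not in core:
                core.add(nb)
                frontier.append((nb, depth + 1))
    return core

# Tiny invented network linking a glycolysis-like chain to the TCA entry.
net = {
    "glucose": ["g6p"],
    "g6p": ["glucose", "f6p"],
    "f6p": ["g6p", "pyruvate"],
    "pyruvate": ["f6p", "accoa"],
    "accoa": ["pyruvate", "citrate"],
    "citrate": ["accoa"],
}
core = expand_core(net, {"pyruvate"}, max_depth=1)
```

Raising `max_depth` grows the core outward from the chosen subsystems, which is how the degree of reduction can be tuned consistently rather than by ad-hoc manual selection.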

  14. redGEM: Systematic reduction and analysis of genome-scale metabolic reconstructions for development of consistent core metabolic models.

    Directory of Open Access Journals (Sweden)

    Meric Ataman

    2017-07-01

    Full Text Available Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks as they encapsulate all known metabolic capabilities of the organisms from genes to proteins to their functions. However, the complexity of these large metabolic networks often hinders their utility in various practical applications. Although reduced models are commonly used for modeling and for integrating experimental data, they are often inconsistent across different studies and laboratories due to different criteria and detail, which can compromise transferability of the findings and also integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to be able to capture key properties of the genome-scale models and preserve consistency in terms of biomass and by-product yields, flux and concentration variability and gene essentiality. The development of these "consistently-reduced" models will help to clarify and facilitate integration of different experimental data to draw new understanding that can be directly extended to genome-scale models.

  15. Scaling to 150K cores: Recent algorithm and performance engineering developments enabling XGC1 to run at scale

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Mark F [Department of Applied Physics and Applied Mathematics, Columbia University (United States); Ku, Seung-Hoe; Chang, C-S [Courant Institute of Mathematical Sciences, New York University (United States); Worley, Patrick; D'Azevedo, Ed [Computer Science and Mathematics Division, Oak Ridge National Laboratory (United States); Cummings, Julian C, E-mail: mark.adams@columbia.edu, E-mail: sku@cims.nyu.edu, E-mail: worleyph@ornl.gov, E-mail: dazevedoef@ornl.gov, E-mail: cummings@cacr.caltech.edu, E-mail: cschang@cims.nyu.edu [Center for Advanced Computing Research, California Institute of Technology (United States)

    2009-07-01

    Particle-in-cell (PIC) methods have for many decades proven to be effective in discretizing the Vlasov-Maxwell system of equations describing the core of toroidal burning plasmas. Recent physical understanding of the importance of edge physics for stability and transport in tokamaks has led to the development of the first fully toroidal edge PIC code - XGC1. The edge region poses special problems in meshing for PIC methods due to the lack of closed flux surfaces, which makes field-line following meshes and coordinate systems problematic. We present a solution to this problem with a semi-field line following mesh method in a cylindrical coordinate system. Additionally, modern supercomputers require highly concurrent algorithms and implementations, with all levels of the memory hierarchy being efficiently utilized to realize optimal code performance. This paper presents a mesh and particle partitioning method, suitable to our meshing strategy, for use on highly concurrent cache-based computing platforms.

  16. Microstructure-dependent mechanical properties of electrospun core-shell scaffolds at multi-scale levels.

    Science.gov (United States)

    Horner, Christopher B; Ico, Gerardo; Johnson, Jed; Zhao, Yi; Nam, Jin

    2016-06-01

    Mechanical factors, among many physiochemical properties of scaffolds for stem cell-based tissue engineering, significantly affect tissue morphogenesis by controlling stem cell behaviors including proliferation and phenotype-specific differentiation. Core-shell electrospinning provides a unique opportunity to control the mechanical properties of scaffolds independent of surface chemistry, rendering greater freedom to tailor designs for specific applications. In this study, we synthesized electrospun core-shell scaffolds having different core compositions and/or core-to-shell dimensional ratios. Two independent biocompatible polymer systems were utilized: polyetherketoneketone (PEKK) and gelatin as the core materials, while maintaining polycaprolactone (PCL) as the shell polymer. The mechanics of such scaffolds was analyzed at the micro- and macroscales to determine the potential implications for cell-material and tissue-material interactions. The mechanical properties of individual core-shell fibers were controlled by core-shell composition and structure. The individual fiber modulus correlated with the increase in percent core size, ranging from 0.55 ± 0.10 GPa to 1.74 ± 0.22 GPa and 0.48 ± 0.12 GPa to 1.53 ± 0.12 GPa for the PEKK-PCL and gelatin-PCL fibers, respectively. More importantly, it was demonstrated that the mechanical properties of the scaffolds at the macroscale were dominantly determined by porosity under compression. The increase of scaffold porosity from 70.2% ± 1.0% to 93.2% ± 0.5% with increasing core size in the PEKK-PCL scaffold resulted in a decrease of the compressive elastic modulus from 227.67 ± 20.39 kPa to 14.55 ± 1.43 kPa, while a greater change in the porosity of the gelatin-PCL scaffold, from 54.5% ± 4.2% to 89.6% ± 0.4%, resulted in a compressive elastic modulus change from 484.01 ± 30.18 kPa to 17.57 ± 1.40 kPa. On the other hand, the biphasic behaviors under tensile mechanical loading result in a range from a minimum of 5.42 ± 1.05 MPa to a maximum

  17. Numerical simulation of a Hypothetical Core Disruptive Accident in a small-scale model of a nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Robbe, M.F. E-mail: robbe@aquilon.cea.fr; mfrobbe@cea.fr; Lepareux, M.; Treille, E.; Cariou, Y

    2003-08-01

    In the case of a Hypothetical Core Disruptive Accident (HCDA) in a Liquid Metal Fast Breeder Reactor, it is assumed that the core of the nuclear reactor has melted partially and that the chemical interaction between molten fuel and liquid sodium has created a high-pressure gas bubble in the core. The violent expansion of this bubble loads and deforms the reactor vessel and the internal structures, thus endangering the safety of the nuclear plant. The MARA 10 experimental test simulates a HCDA in a 1/30-scale mock-up schematising a reactor block. In the mock-up, the liquid sodium cooling the reactor core is replaced by water and the argon blanket laying below the reactor roof is simulated by an air blanket. The explosion is triggered by an explosive charge. This paper presents a numerical simulation of the test with the EUROPLEXUS code and an analysis of the computed results. In particular, the evolution of the fluid flows and the deformations of the internal and external structures are analysed in detail. Finally, the current computed results are compared with the experimental ones and with previous numerical results computed with the SIRIUS and CASTEM-PLEXUS codes.

  18. Membrane biofilm communities in full-scale membrane bioreactors are not randomly assembled and consist of a core microbiome

    KAUST Repository

    Matar, Gerald K.

    2017-06-21

    Finding efficient biofouling control strategies requires a better understanding of the microbial ecology of membrane biofilm communities in membrane bioreactors (MBRs). Studies that characterized the membrane biofilm communities in lab- and pilot-scale MBRs are numerous, yet similar studies in full-scale MBRs are limited. Also, most of these studies have characterized the mature biofilm communities, with very few studies addressing early biofilm communities. In this study, five full-scale MBRs located in Seattle (Washington, U.S.A.) were selected to address two questions concerning membrane biofilm communities (early and mature): (i) Is the assembly of biofilm communities (early and mature) the result of random immigration of species from the source community (i.e. activated sludge)? and (ii) Is there a core membrane biofilm community in full-scale MBRs? Membrane biofilm (early and mature) and activated sludge (AS) samples were collected from the five MBRs, and 16S rRNA gene sequencing was applied to investigate the bacterial communities of AS and membrane biofilms (early and mature). Alpha and beta diversity measures revealed clear differences in the bacterial community structure between the AS and biofilm (early and mature) samples in the five full-scale MBRs. These differences were mainly due to the presence of a large number of unique but rare operational taxonomic units (∼13% of total reads in each MBR) in each sample. In contrast, a high percentage (∼87% of total reads in each MBR) of sequence reads was shared between AS and biofilm samples in each MBR, and these shared sequence reads mainly belong to the dominant taxa in these samples. Despite the large fraction of shared sequence reads between AS and biofilm samples, simulated biofilm communities from random sampling of the respective AS community revealed that biofilm communities differed significantly from the random assemblages (P < 0.001 for each MBR), indicating that the biofilm communities (early

  19. Millennial and sub-millennial scale climatic variations recorded in polar ice cores over the last glacial period

    DEFF Research Database (Denmark)

    Capron, E.; Landais, A.; Chappellaz, J.

    2010-01-01

    Since its discovery in Greenland ice cores, the millennial scale climatic variability of the last glacial period has been increasingly documented at all latitudes with studies focusing mainly on Marine Isotopic Stage 3 (MIS 3; 28–60 thousand years before present, hereafter ka) and characterized...... a succession of abrupt events associated with long Greenland InterStadial phases (GIS) enabling us to highlight a sub-millennial scale climatic variability depicted by (i) short-lived and abrupt warming events preceding some GIS (precursor-type events) and (ii) abrupt warming events at the end of some GIS...... (rebound-type events). The occurrence of these sub-millennial scale events is suggested to be driven by the insolation at high northern latitudes together with the internal forcing of ice sheets. Thanks to a recent NorthGRIP-EPICA Dronning Maud Land (EDML) common timescale over MIS 5, the bipolar sequence...

  20. Practicing the Generic (City)

    DEFF Research Database (Denmark)

    Hansen, Lone Koefoed

    2010-01-01

    Flanagan proposes that most locative media artworks neglect the particularities of spaces, their historical and political layers. Koolhaas, on the other hand, states that all urban areas are alike, that we are facing a global Generic City. The paper analyses digital media artist Esther Polak......’s NomadicMILK project in light of the generic and particular properties of space as laid out by Flanagan and Koolhaas in order to discuss the possible reconfiguring practices of locative media....

  1. Acoustic Source Localization via Distributed Sensor Networks using Tera-scale Optical-Core Devices

    Energy Technology Data Exchange (ETDEWEB)

    Imam, Neena [ORNL; Barhen, Jacob [ORNL; Wardlaw, Michael [Office of Naval Research

    2008-01-01

    For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. The complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building-block algorithms used in underwater source localization on an optical-core digital processing platform recently introduced by Lenslet Inc. The authors investigate key concepts of threat-detection algorithms such as Time Difference Of Arrival (TDOA) estimation via sensor data correlation in the time domain with the purpose of implementation on the optical-core processor. They illustrate their results with the aid of numerical simulation and actual optical hardware runs. The major accomplishments of this research, in terms of computational speedup and numerical accuracy achieved via the deployment of optical processing technology, should be of substantial interest to the acoustic signal processing community.
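The building block named here — TDOA estimation via time-domain cross-correlation of sensor data — can be sketched in a few lines. This is a brute-force scalar reference implementation, not the optical-core version; the signals and sample rate below are hypothetical.

```python
def tdoa_estimate(x, y, fs):
    """Estimate the Time Difference Of Arrival between two sensor
    signals x and y by brute-force time-domain cross-correlation.
    Returns the delay (in seconds) of y relative to x, i.e. the lag
    that maximizes sum_i x[i] * y[i + lag]."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-(n - 1), n):
        val = sum(x[i] * y[i + lag]
                  for i in range(n)
                  if 0 <= i + lag < n)
        if val > best_val:
            best_val, best_lag = val, lag
    return best_lag / fs

# hypothetical pulse arriving at sensor y two samples after sensor x
fs = 1000.0
x = [0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0, 0.0]
y = [0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0]
delay = tdoa_estimate(x, y, fs)
```

The O(n²) correlation loop is exactly the kind of dense multiply-accumulate workload that motivated offloading to an optical-core processor in the work above.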

  2. Sedimentary Facies Controls on the Upscaling of Petrophysical Properties from Core to Log Scales and Its Implications to Petroleum Exploration

    Institute of Scientific and Technical Information of China (English)

    Liu Keyu; Brett Topham; Lincoln Paterson; Peter Eadington; Pang Xiongqi

    2004-01-01

    The clastic sedimentary realm comprises a number of genetically distinct depositional systems, which are dominated by distinct depositional processes. A variogram and a Levy-stable probability distribution-based geostatistical method have been applied to analyze petrophysical properties from well logs and cores from a variety of depositional environments in sedimentary basins of Australia to quantify the heterogeneity and upscaling range of different depositional systems. Two reservoir sequences with contrasting sedimentary facies, depositional processes and diagenetic histories are investigated for their petrographic, petrophysical and log characters and their scaling behaviour. The microscopically derived petrophysical parameters, including visual porosity, grain size, sorting and amount of matrix, core-plug measured porosity and permeability, and log-derived V-shale, porosity and permeability, have been found to be well correlated (|R| = 0.72 to 0.91) across all scales for the reservoir sequence deposited under a single predominant depositional process and a gradational change of the energy regime (Bilyara-1). In contrast, for the reservoir sequence (East Swan-2), which was deposited under heterogeneous processes and underwent diagenetic alteration, the cross-correlation of the petrophysical properties derived from the three different scales is extremely poor (|R| = 0.01 to 0.54). Log-derived porosity and permeability for a thinly bedded reservoir sequence with individual beds thinner than one metre can therefore be affected by the intrinsic averaging effects of the logging tools.

  3. A stochastic nonlinear oscillator model for glacial millennial-scale climate transitions derived from ice-core data

    Directory of Open Access Journals (Sweden)

    F. Kwasniok

    2012-11-01

    A stochastic Duffing-type oscillator model, i.e. noise-driven motion with inertia in a potential landscape, is considered for glacial millennial-scale climate transitions. The potential and noise parameters are estimated from a Greenland ice-core record using a nonlinear Kalman filter. For the period from 60 to 20 ky before present, a bistable potential with a deep well corresponding to a cold stadial state and a shallow well corresponding to a warm interstadial state is found. The system is in the strongly dissipative regime and can be very well approximated by an effective one-dimensional Langevin equation.
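The effective one-dimensional Langevin dynamics described above can be illustrated with a short Euler-Maruyama integration. This is a generic sketch of noise-driven motion in a bistable quartic potential, not the potential or noise level estimated from the ice-core record; all parameter values are illustrative.

```python
import math
import random

def simulate_langevin(steps=200_000, dt=0.01, sigma=0.7, seed=3):
    """Euler-Maruyama integration of the overdamped Langevin equation
    dx = -U'(x) dt + sigma dW with the bistable quartic potential
    U(x) = x**4/4 - x**2/2, whose wells at x = -1 and x = +1 play the
    roles of the cold stadial and warm interstadial states. Parameters
    are illustrative, not the fitted values from the paper."""
    rng = random.Random(seed)
    x, path = -1.0, []
    for _ in range(steps):
        drift = -(x ** 3 - x)                      # -U'(x)
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

path = simulate_langevin()
# with a noise level comparable to the barrier height, the trajectory
# undergoes spontaneous transitions between the two wells
```

The abrupt, noise-induced jumps between the wells are the model analogue of the millennial-scale stadial-interstadial transitions in the record.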

  20. The orbital scale evolution of regional climate recorded in a long sediment core from Heqing, China

    Institute of Scientific and Technical Information of China (English)

    SHEN Ji; XIAO HaiFeng; WANG SuMin; AN ZhiSheng; QIANG XiaoKe; XIAO XiaYun

    2007-01-01

    Based on the analysis of carbonate content and loss on ignition for a long sediment core (737 m in length) drilled in Heqing, the orbital scale evolution of the Southwest Monsoon is revealed by using overlapped spectral analysis and filter methods. It is shown that the obliquity cycle and precession cycle are the key factors in Southwest Monsoon evolution, and that changes in global ice volume and the uplift of the Qinghai-Tibetan Plateau also impose great influences on it.

  5. The taxonomy of the Japanese oak red scale insect, Kuwania quercus (Kuwana) (Hemiptera: Coccoidea: Kuwaniidae), with a generic diagnosis, a key to species and description of a new species from California.

    Science.gov (United States)

    San'An, Wu; Nan, Nan; Gullan, Penny; Deng, Jun

    2013-01-01

    The oak red scale insect, Kuwania quercus (Kuwana), was described from specimens collected from the bark of oak trees (Quercus species) in Japan. More recently, the species has been identified from California and China, but Californian specimens differ morphologically from Japanese material and are considered here to be a new species based on both morphological and molecular data. In this paper, an illustrated redescription of K. quercus is provided based on type specimens consisting of adult females, first-instar nymphs and intermediate-stage females, and a lectotype is designated for Sasakia quercus Kuwana. The new Californian species, Kuwania raygilli Wu & Gullan, is described and illustrated based on the adult female, first-instar nymph and intermediate-stage female. A new generic diagnosis for Kuwania Cockerell based on adult females and first-instar nymphs, and a key to species based on adult females are included.

  6. Verification of the CENTRM Module for Adaptation of the SCALE Code to NGNP Prismatic and PBR Core Designs

    Energy Technology Data Exchange (ETDEWEB)

    Ganapol, Barry; Maldonado, Ivan

    2014-01-23

    The generation of multigroup cross sections lies at the heart of the very high temperature reactor (VHTR) core design, whether the prismatic (block) or pebble-bed type. The design process, generally performed in three steps, is quite involved and its execution is crucial to proper reactor physics analyses. The primary purpose of this project is to develop the CENTRM cross-section processing module of the SCALE code package for application to prismatic or pebble-bed core designs. The team will include a detailed outline of the entire processing procedure for application of CENTRM in a final report complete with demonstration. In addition, they will conduct a thorough verification of the CENTRM code, which has yet to be performed. The tasks for this project are to: Thoroughly test the panel algorithm for neutron slowing down; Develop the panel algorithm for multi-materials; Establish a multigroup convergence 1D transport acceleration algorithm in the panel formalism; Verify CENTRM in 1D plane geometry; Create and test the corresponding transport/panel algorithm in spherical and cylindrical geometries; and, Apply the verified CENTRM code to current VHTR core design configurations for an infinite lattice, including assessing effectiveness of Dancoff corrections to simulate TRISO particle heterogeneity.

  7. A Strategy for Finding the Optimal Scale of Plant Core Collection Based on Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Jiancheng Wang

    2014-01-01

    Core collection is an ideal resource for genome-wide association studies (GWAS). A subcore collection is a subset of a core collection. A strategy was proposed for finding the optimal sampling percentage for a plant subcore collection based on Monte Carlo simulation. A cotton germplasm group of 168 accessions with 20 quantitative traits was used to construct subcore collections. A mixed linear model approach was used to eliminate the environment effect and the GE (genotype × environment) effect. The least distance stepwise sampling (LDSS) method, combining 6 commonly used genetic distances and the unweighted pair-group average (UPGMA) cluster method, was adopted to construct subcore collections. A homogeneous population assessing method was adopted to assess the validity of 7 evaluating parameters of subcore collections. Monte Carlo simulation was conducted on the sampling percentage, the number of traits, and the evaluating parameters. A new method for "distilling free-form natural laws from experimental data" was adopted to find the best formula to determine the optimal sampling percentages. The results showed that the coincidence rate of range (CR) was the most valid evaluating parameter and was suitable to serve as a threshold for finding the optimal sampling percentage. Principal component analysis showed that subcore collections constructed with the optimal sampling percentages calculated by the present strategy were well representative.
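The Monte Carlo element of the strategy — estimating how well a randomly sampled subset retains the trait variation of the full collection at a given sampling percentage — can be sketched as follows. The coincidence rate of range (CR) is the standard core-collection criterion; the trait matrix below is synthetic, and this random-sampling baseline is a sketch, not the LDSS method itself.

```python
import random

def coincidence_rate(full, subset):
    """Coincidence rate of range (CR, %): for each trait, the fraction of
    the full collection's range retained by the subset, averaged over
    traits. `full` and `subset` are lists of per-accession trait rows."""
    n_traits = len(full[0])
    ratios = []
    for t in range(n_traits):
        col_full = [row[t] for row in full]
        col_sub = [row[t] for row in subset]
        full_range = max(col_full) - min(col_full)
        sub_range = max(col_sub) - min(col_sub)
        ratios.append(sub_range / full_range if full_range else 1.0)
    return 100.0 * sum(ratios) / n_traits

def expected_cr(full, fraction, n_sim=500, seed=2):
    """Monte Carlo estimate of the CR achieved by random subsets at a
    given sampling fraction (the random baseline, not LDSS)."""
    rng = random.Random(seed)
    k = max(2, round(fraction * len(full)))
    total = 0.0
    for _ in range(n_sim):
        total += coincidence_rate(full, rng.sample(full, k))
    return total / n_sim

# synthetic stand-in for an accessions-by-traits matrix
rng = random.Random(0)
data = [[rng.random() for _ in range(3)] for _ in range(100)]
cr_full = coincidence_rate(data, data)
cr_10 = expected_cr(data, 0.10, n_sim=300, seed=5)
cr_50 = expected_cr(data, 0.50, n_sim=300, seed=5)
```

Plotting expected CR against the sampling fraction and picking the point where CR crosses a chosen threshold is, in spirit, how such a simulation identifies an optimal sampling percentage.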

  8. Design and implementation of a generic energy-harvesting framework applied to the evaluation of a large-scale electronic shelf-labeling wireless sensor network

    OpenAIRE

    Kamerman Ad; De Mil Pieter; Jooris Bart; Tytgat Lieven; Catteeuw Ruben; Moerman Ingrid; Demeester Piet

    2010-01-01

    Most wireless sensor networks (WSNs) consist of battery-powered nodes and are limited to hundreds of nodes. Battery replacement is a very costly operation and a key factor in limiting successful large-scale deployments. The recent advances in both energy harvesters and low-power communication systems hold promise for deploying large-scale wireless green-powered sensor networks (WGSNs). This will enable new applications and will eliminate environmentally unfriendly battery disposal. This pape...

  9. Patients' attitude about generics –Bulgarian perspective

    Directory of Open Access Journals (Sweden)

    Hristina Lebanova

    2012-01-01

    OBJECTIVE: The aim of the present study is to investigate (1) patients' attitudes towards generic medicines in Bulgaria, (2) their preferences for using them, and (3) the main factors influencing their opinion. METHODS: Using pseudo-randomization, we selected a sample of 225 participants, men and women from the general population, recruited as patients in community pharmacies. For the survey we used a standardized ten-item self-administered questionnaire. The influence of sex, age, education, medical history, knowledge of generic drugs, and experience with generic substitution and generic medicines was examined through Chi-square tests. RESULTS: The results show that 74% of the participants were not informed about generic drugs, while 26% had received valuable and relevant information from their general practitioner or pharmacist. 94% believed that generic medicines are inferior to brand medicines in quality, safety and efficacy. CONCLUSIONS: The main reason for almost all of the participants (94%) to prefer original medicines over generics is the insufficient information they have. The core factors forming patients' opinions of and expectations for generic drugs are medical professionals' recommendations and previous experience. The main advantages of generics, according to the participants in the study, are the lower price and better accessibility. The results raise the issue of awareness and level of knowledge about generic medicines and rational drug use in the general population.
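The association tests reported in the METHODS can be illustrated with a minimal Pearson chi-square computation for a 2x2 contingency table. The counts below are hypothetical, not the study's data; with 1 degree of freedom, a statistic above 3.841 is significant at alpha = 0.05.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. informed-about-generics (yes/no) cross-tabulated
    against prior experience with generic substitution (yes/no)."""
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    n = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n          # under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# critical value for alpha = 0.05 at 1 degree of freedom
CHI2_CRIT_1DF = 3.841

# hypothetical counts: rows = informed yes/no, columns = experience yes/no
stat = chi_square_2x2([[10, 20], [20, 10]])
```

Here the statistic exceeds the critical value, so the (hypothetical) association between being informed and prior experience would be deemed significant — the same decision rule as the study's Chi-square tests.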

  10. Generic Airspace Survey

    Science.gov (United States)

    Mogford, Richard H.; Bridges, Wayne; Gujarl, Vimmy; Lee, Paul U.; Preston, William

    2013-01-01

    This paper reports on an extension of generic airspace research to explore the amount of memorization and specialized skills required to manage sectors with specific characteristics or factors. Fifty-five retired controllers were given an electronic survey where they rated the amount of memorization or specialized skills needed for sixteen generic airspace factors. The results suggested similarities in the pattern of ratings between different areas of the US (East, Central, and West). The average of the ratings for each area also showed some differences between regions, with ratings being generally higher in the East area. All sixteen factors were rated as moderately to highly important and may be useful for future research on generic airspace, air traffic controller workload, etc.

  11. Generic robot architecture

    Science.gov (United States)

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2010-09-21

    The present invention provides methods, computer readable media, and apparatuses for a generic robot architecture providing a framework that is easily portable to a variety of robot platforms and is configured to provide hardware abstractions, abstractions for generic robot attributes, environment abstractions, and robot behaviors. The generic robot architecture includes a hardware abstraction level and a robot abstraction level. The hardware abstraction level is configured for developing hardware abstractions that define, monitor, and control hardware modules available on a robot platform. The robot abstraction level is configured for defining robot attributes and provides a software framework for building robot behaviors from the robot attributes. Each of the robot attributes includes hardware information from at least one hardware abstraction. In addition, each robot attribute is configured to substantially isolate the robot behaviors from the at least one hardware abstraction.

  12. Effects of risperidone on core symptoms of autistic disorder based on childhood autism rating scale: An open label study

    Directory of Open Access Journals (Sweden)

    Padideh Ghaeli

    2014-01-01

    Background: The aim of the present study was to evaluate the effect of risperidone in patients afflicted by autistic disorder, with regard to its three core symptoms, including "relating to others", "communication skills", and "stereotyped behaviors", based on the Childhood Autism Rating Scale (CARS). Materials and Methods: An 8-week open-label study of risperidone for the treatment of autistic disorder in children 4-17 years old was designed. Risperidone dose titration was as follows: 0.02 mg/kg/day in the first week, 0.04 mg/kg/day in the second week, and 0.06 mg/kg/day in the third week and thereafter. The outcome measures were scores obtained by CARS, the Aberrant Behavior Checklist (ABC), and the Clinical Global Impression-Improvement (CGI-I) scale. Results: Fifteen patients completed this study. After 8 weeks, the CARS total score decreased significantly (P=0.001). At the end of the study, social interactions and verbal communication skills of the patients were significantly improved (P<0.001 and P=0.03, respectively). However, stereotypic behaviors did not show any significant change in this study. Increased appetite and somnolence were the most commonly reported side effects. Conclusion: This study suggests that risperidone may be an effective treatment for the management of the core symptoms of autistic disorder.

  13. Unified Scaling Law for flux pinning in practical superconductors: III. Minimum datasets, core parameters, and application of the Extrapolative Scaling Expression

    Science.gov (United States)

    Ekin, Jack W.; Cheggour, Najib; Goodrich, Loren; Splett, Jolene

    2017-03-01

    of the USL in several new areas: (1) A five-fold reduction in the measurement space for unified temperature-strain apparatuses through extrapolation of minimum datasets; (2) Combination of data from separate temperature and strain apparatuses, which provides flexibility and productive use of more limited data; and (3) Full conductor characterization from as little as a single Ic(B) curve when a few core parameters have been measured in a similar conductor. Default core scaling parameter values are also given, based on analysis of a wide range of practical Nb3Sn conductors.

  14. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
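The state- and covariance-update and -propagation cycle that the GKF generalizes can be illustrated with the simplest possible case, a scalar linear Kalman filter. This is a pedagogical Python sketch, not the GKF's ANSI C implementation; the random-walk state model and noise variances are illustrative.

```python
def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Minimal scalar linear Kalman filter for the random-walk model
    x_k = x_{k-1} + w (process variance q) with measurement
    z_k = x_k + v (measurement variance r). Each iteration performs
    the covariance-propagation and state/covariance-update steps that
    the GKF implements generically for vector states."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # propagation (predict): state unchanged, covariance grows
        p = p + q
        # update (correct): blend prediction and measurement
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# hypothetical noisy measurements of a constant signal at 5.0
zs = [5.0 + (0.3 if i % 2 == 0 else -0.3) for i in range(50)]
estimates = kalman_1d(zs)
```

An application-specific program built on the GKF supplies the matrix analogues of these steps (state-transition, measurement, and noise models) through its user-provided subfunctions.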

  15. Jacobsen protocols for large-scale epoxidation of cyclic dienyl sulfones: application to the (+)-pretazettine core.

    Science.gov (United States)

    Ebrahimian, G Reza; du Jourdin, Xavier Mollat; Fuchs, Philip L

    2012-05-18

    A Jacobsen epoxidation protocol using H2O2 as oxidant was designed for the large-scale preparation of various epoxy vinyl sulfones. A number of cocatalysts were screened, and pH control led to increased reaction rate, higher turnover number, and improved reliability.

  16. Comparative genome-scale metabolic modeling of actinomycetes : The topology of essential core metabolism

    NARCIS (Netherlands)

    Alam, Mohammad Tauqeer; Medema, Marnix H.; Takano, Eriko; Breitling, Rainer; Gojobori, Takashi

    2011-01-01

    Actinomycetes are highly important bacteria. On one hand, some of them cause severe human and plant diseases, on the other hand, many species are known for their ability to produce antibiotics. Here we report the results of a comparative analysis of genome-scale metabolic models of 37 species of act

  20. Modelling of Generic Slung Load System

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Bendtsen, Jan Dimon; La Cour-Harbo, Anders

    2006-01-01

    This paper presents the result of modelling and verification of a generic slung load system using a small-scale helicopter. The model is intended for use in simulation, pilot training, estimation, and control. The model is derived using a redundant coordinate formulation based on Gauss Principle ...

  1. A large scale dynamo and magnetoturbulence in rapidly rotating core-collapse supernovae

    CERN Document Server

    Mösta, Philipp; Radice, David; Roberts, Luke F; Schnetter, Erik; Haas, Roland

    2015-01-01

    Magnetohydrodynamic (MHD) turbulence is of key importance in many high-energy astrophysical systems, including black-hole accretion disks, protoplanetary disks, neutron stars, and stellar interiors. MHD instabilities can amplify local magnetic field strength over very short time scales, but it is an open question whether this can result in the creation of a large scale ordered and dynamically relevant field. Specifically, the magnetorotational instability (MRI) has been suggested as a mechanism to grow magnetar-strength magnetic field ($\\gtrsim 10^{15}\\, \\mathrm{G}$) and magnetorotationally power the explosion of a rotating massive star. Such stars are progenitor candidates for type Ic-bl hypernova explosions that involve relativistic outflows and make up all supernovae connected to long gamma-ray bursts (GRBs). We have carried out global 3D general-relativistic magnetohydrodynamic (GRMHD) turbulence simulations that resolve the fastest growing mode (FGM) of the MRI. We show that MRI-driven MHD turbulence in ...

  2. Core-scale electrical resistivity tomography (ERT) monitoring of CO2-brine mixture in Fontainebleau sandstone

    Science.gov (United States)

    Bosch, David; Ledo, Juanjo; Queralt, Pilar; Bellmunt, Fabian; Luquot, Linda; Gouze, Philippe

    2016-07-01

    The main goal of the monitoring stage of Carbon Capture and Storage (CCS) is to obtain an accurate estimation of the subsurface CO2 accumulation and to detect any possible leakage. Laboratory experiments are necessary to investigate the small-scale processes governing the CO2-brine-rock interaction. They also provide a means to calibrate the results coming from field-scale geophysical methods. In this work we set up an experimental system which is able to perform Electrical Resistivity Tomography (ERT) measurements on centimeter-scale rock samples at various P-T conditions. We present the results of two new experiments related to CO2 monitoring, performed on a cylindrical (4 × 8 cm) Fontainebleau rock sample. In the first one, we have quantified the CO2 saturation at different volume fractions, representing zones from a deep saline aquifer with varying degrees of saturation. In the second one, we have monitored and quantified the effect of CO2 dissolution in the brine at a pressure of 40 bar during eight days, emulating the invasion of CO2 into a shallow aquifer. Results highlight the importance of accounting for the contribution of surface conductivity in highly CO2-saturated regions, even in clay-free rocks, and also for brine conductivity variation due to CO2 dissolution. Ignoring either of these effects will result in underestimation of the CO2 saturation. We present a modified CO2 saturation equation to account for these two influences.
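The baseline that such resistivity-based saturation estimates correct is the classical Archie relation between bulk resistivity and water saturation, which can be sketched as follows. The study's modified equation is not reproduced here; the exponents a, m, n below are generic textbook values, not fitted ones, and the input resistivities are hypothetical.

```python
def co2_saturation_archie(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Classical Archie estimate of CO2 saturation from bulk resistivity
    rt (ohm.m), brine resistivity rw (ohm.m), and porosity phi.
    Water saturation: Sw = (a * rw / (phi**m * rt))**(1/n); the
    non-conductive CO2 phase then occupies 1 - Sw of the pore space.
    Surface conductivity and brine-conductivity changes due to CO2
    dissolution (the effects the study corrects for) are ignored here."""
    sw = (a * rw / (phi ** m * rt)) ** (1.0 / n)
    return 1.0 - min(1.0, sw)

# fully brine-saturated baseline: rt = a*rw/phi**m gives ~0 CO2 saturation
s0 = co2_saturation_archie(rt=5.0, rw=0.05, phi=0.1)
# a four-fold resistivity increase implies Sw = 0.5 with n = 2
s1 = co2_saturation_archie(rt=20.0, rw=0.05, phi=0.1)
```

Because surface conduction and CO2 dissolution both lower the measured resistivity relative to this ideal model, applying plain Archie to such data yields the saturation underestimation the abstract warns about.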

  3. Consistently dated records from three Greenland ice cores reveal regional millennial-scale isotope gradients with possible Heinrich Event imprint

    Science.gov (United States)

    Seierstad, Inger K.; Rasmussen, Sune O.

    2014-05-01

    Here we present records from the NGRIP, GRIP and GISP2 ice cores tied to the same chronology for the past 104 ka at an unprecedented time resolution. The three ice cores have been linked by matching distinct peaks in volcanic proxy records and other impurity records from the three ice cores, assuming that these layers of elevated impurity content represent the same, instantaneous event in the past at all three sites. In total there are more than 900 identified marker horizons between the three cores, including previously published match points, of which we introduce a minor revision. Our matching is independently confirmed by new and existing volcanic ash layers (tephra). The depth-depth relationship from the detailed matching is used to transfer the most recent and widely used Greenland ice core chronology, the GICC05modelext timescale, to the two Summit cores, GRIP and GISP2. Furthermore, we provide gas chronologies for the Summit cores that are consistent with the GICC05modelext timescale by utilizing both existing and new unpublished gas data. A comparison of the GICC05modelext and the former GISP2 timescale reveals major discrepancies in short time intervals during the glacial section. We detect a pronounced change in the relative annual layer thickness between the two Summit sites and NGRIP across the Last Glacial termination and early-to-mid Holocene, which can be explained by a relative accumulation increase at NGRIP compared to the Summit region as a response to the onset of the Holocene and the climatic optimum. Between stadials and interstadials we infer that the accumulation contrast typically was nearly 10% greater at Summit than at NGRIP. The δ18O temperature-proxy records from NGRIP, GRIP and GISP2 are generally very similar and display a synchronous behavior at climate transitions, but the δ18O difference between Summit and NGRIP is slowly changing over the last glacial-interglacial cycle, superimposed by abrupt millennial- to centennial-scale

  4. Exploring Generic Haskell

    NARCIS (Netherlands)

    Löh, A.

    2004-01-01

    This thesis is an exploration -- an exploration of a language extension of the functional programming language Haskell. The extension is called Generic Haskell, albeit the name has been used to refer to different objects over the last several years: Many papers have described different proposals, fe

  5. Generic Market Models

    NARCIS (Netherlands)

    R. Pietersz (Raoul); M. van Regenmortel

    2005-01-01

    textabstractCurrently, there are two market models for valuation and risk management of interest rate derivatives, the LIBOR and swap market models. In this paper, we introduce arbitrage-free constant maturity swap (CMS) market models and generic market models featuring forward rates that span perio

  6. A large-scale dynamo and magnetoturbulence in rapidly rotating core-collapse supernovae.

    Science.gov (United States)

    Mösta, Philipp; Ott, Christian D; Radice, David; Roberts, Luke F; Schnetter, Erik; Haas, Roland

    2015-12-17

    Magnetohydrodynamic turbulence is important in many high-energy astrophysical systems, where instabilities can amplify the local magnetic field over very short timescales. Specifically, the magnetorotational instability and dynamo action have been suggested as a mechanism for the growth of magnetar-strength magnetic fields (of 10(15) gauss and above) and for powering the explosion of a rotating massive star. Such stars are candidate progenitors of type Ic-bl hypernovae, which make up all supernovae that are connected to long γ-ray bursts. The magnetorotational instability has been studied with local high-resolution shearing-box simulations in three dimensions, and with global two-dimensional simulations, but it is not known whether turbulence driven by this instability can result in the creation of a large-scale, ordered and dynamically relevant field. Here we report results from global, three-dimensional, general-relativistic magnetohydrodynamic turbulence simulations. We show that hydromagnetic turbulence in rapidly rotating protoneutron stars produces an inverse cascade of energy. We find a large-scale, ordered toroidal field that is consistent with the formation of bipolar magnetorotationally driven outflows. Our results demonstrate that rapidly rotating massive stars are plausible progenitors for both type Ic-bl supernovae and long γ-ray bursts, and provide a viable mechanism for the formation of magnetars. Moreover, our findings suggest that rapidly rotating massive stars might lie behind potentially magnetar-powered superluminous supernovae.

  7. Narrow Scale Flow and a Weak Field by the Top of Earth's Core: Evidence from Orsted, Magsat and Secular Variation

    Science.gov (United States)

    Voorhies, Coerte V.

    2004-01-01

    As Earth's main magnetic field weakens, our magnetic shield against the onslaught of the solar wind thins. And the field strength needed to fend off battering by solar coronal mass ejections is decreasing, just when the delicate complexity of modern, vulnerable, electro-technological systems is increasing at an unprecedented rate. Recently, a working group of distinguished scientists from across the nation has asked NASA's Solid Earth and Natural Hazards program a key question: What are the dynamics of Earth's magnetic field and its interactions with the Earth system? Paleomagnetic studies of crustal rocks magnetized in the geologic past reveal that polarity reversals have occurred many times during Earth's history. Networked super-computer simulations of core field and flow, including effects of gravitational, pressure, rotational Coriolis, magnetic and viscous forces, suggest how this might happen in detail. And space-based measurements of the real, time-varying magnetic field help constrain estimates of the speed and direction of fluid iron flowing near the top of the core and enable tests of some hypotheses about such flow. Now scientists at NASA's Goddard Space Flight Center have developed and applied methods to test the hypotheses of narrow scale flow and of a dynamically weak magnetic field near the top of Earth's core. Using two completely different methods, C. V. Voorhies has shown these hypotheses lead to specific theoretical forms for the "spectrum" of Earth's main magnetic field and the spectrum of its rate of change. Much as solar physicists use a prism to separate sunlight into its spectrum, from long wavelength red to short wavelength blue light, geophysicists use a digital prism, spherical harmonic analysis, to separate the measured geomagnetic field into its spectrum, from long to short wavelength fields. They do this for the rate of change of the field as well.

  8. Short scale variation in presence and structure of complex core-mantle boundary regions beneath northern Mexico

    Science.gov (United States)

    Jasbinsek, J. J.

    2016-12-01

    A set of nine intermediate-depth earthquakes with closely spaced epicenters in Central America recorded at a small-aperture array in the western United States contains clear core-mantle boundary (CMB) reflections. Cross-correlation of [0.5, 2] Hz bandpass-filtered seismograms at the 11-station array results in well-constrained stacked PcP and ScP waveforms. Most events contain both PcP and ScP waveforms, providing two distinct areas of core-mantle boundary sampling. In approximately half of the stacked waveforms, additional pre- and/or post-cursory arrivals are observed with both PcP and ScP, suggesting the presence of complicated CMB structures. Commonly the extra arrivals have the visual appearance of reverberations. Two primary observations are made: (1) one-dimensional forward modeling indicates that simple one-layer ultra-low velocity zone (ULVZ) models do not accurately reproduce the PcP and ScP waveforms; instead, multi-layer ULVZ models provide a better fit to the waveforms; (2) spatially, the pattern of CMB regions requiring extra structure is contiguous, but changes to a simple CMB structure over short distance scales. The simple one-dimensional modeling explored here cannot uniquely constrain the three-dimensional CMB structure, but provides insight into potential CMB structure that may be resolvable with higher-accuracy and more computationally intensive forward seismogram modeling.

  9. Generic and biosimilar medicines: quid?

    Directory of Open Access Journals (Sweden)

    Steven Simoens

    2012-12-01

    Full Text Available Once intellectual property protection, data and marketing exclusivity of reference medicines have expired, generic medicines and biosimilar medicines can enter the off-patent market. This market entry is conditional on the approval of marketing authorization, pricing and reimbursement. Given that there tends to be confusion surrounding generic and biosimilar medicines, this Editorial introduces basic concepts related to generic and biosimilar medicines and presents the different studies and articles included in this supplement dedicated to generic and biosimilar medicines.

  10. Superlinear scaling in master-slave quantum chemical calculations using in-core storage of two-electron integrals.

    Science.gov (United States)

    Fossgård, Eirik; Ruud, Kenneth

    2006-02-01

    We describe the implementation of a parallel, in-core, integral-direct Hartree-Fock and density functional theory code for the efficient calculation of Hartree-Fock wave functions and density functional theory energies. The algorithm is based on a parallel master-slave scheme, and the two-electron integrals calculated by a slave are stored in available local memory. To ensure the greatest computational savings, the master node keeps track of all integral batches stored on the different slaves. The code can reuse undifferentiated two-electron integrals both in the wave function optimization and in the evaluation of second-, third-, and fourth-order molecular properties. Superlinear scaling is achieved in a series of test examples, with speedups of up to 55 achieved for calculations run on medium-sized molecules on 16 processors with respect to the time used on a single processor.
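
The bookkeeping idea in this abstract (a master that remembers which worker cached each integral batch, so batches are reused rather than recomputed) can be sketched as follows. This is a hypothetical illustration, not the authors' code; the class and batch-key names are invented.

```python
# Hypothetical sketch of master-side bookkeeping for in-core integral
# storage: the master records which worker holds each two-electron integral
# batch and routes repeat requests to the owner instead of recomputing.

class IntegralMaster:
    def __init__(self, n_workers):
        self.batch_owner = {}          # batch key -> worker that stored it
        self.load = [0] * n_workers    # number of batches stored per worker

    def request(self, batch_key):
        """Return (worker, reused). A known batch is routed to its owner;
        a new batch is assigned to the least-loaded worker and cached."""
        if batch_key in self.batch_owner:
            return self.batch_owner[batch_key], True
        worker = self.load.index(min(self.load))
        self.batch_owner[batch_key] = worker
        self.load[worker] += 1
        return worker, False

master = IntegralMaster(n_workers=4)
w1, reused1 = master.request(("mu", "nu", "lam", "sig"))
w2, reused2 = master.request(("mu", "nu", "lam", "sig"))  # same batch: reused
```

The reuse path is what enables the reported superlinear speedups: aggregate slave memory grows with the number of processors, so more integrals stay in core.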

  11. Generic wormhole throats

    CERN Document Server

    Visser, Matt; Hochberg, David

    1997-01-01

    Wormholes and black holes have traditionally been treated as quite separate objects with relatively little overlap. The possibility of a connection arises in that wormholes, if they exist, might have profound influence on black holes, their event horizons, and their internal structure. After discussing some connections, we embark on an overview of what can generally be said about traversable wormhole throats. We discuss the violations of the energy conditions that typically occur at and near the throat of any traversable wormhole and emphasize the generic nature of this result. We discuss the original Morris-Thorne wormhole and its generalization to a spherically symmetric time-dependent wormhole, and also discuss spherically symmetric Brans-Dicke wormholes. We also discuss the relationship with the topological censorship theorem. Finally we turn to a rather general class of wormholes that permit explicit analysis: generic static traversable wormholes (without any symmetry). We define the wormhole throat in te...

  12. Generic Network Location Service

    Directory of Open Access Journals (Sweden)

    Laban Mwansa

    2010-11-01

    Full Text Available This work presents the Generic Network Location Service based on the Chord implementation, utilizing data structures called distributed hash tables (DHT) or structured overlay networks, which are used to build scalable self-managing distributed systems. The provided algorithms guarantee resilience in the presence of dynamism: they guarantee consistent lookup results in the presence of nodes failing and leaving. The Generic Network Location Service provides a location service system based on DHT technology, storing device location records in nodes within a Chord DHT. Location records consist of network device identification keys as attributes, which are used to create replicas of additional location records through established Chord hashing mechanisms. Storing device location records in places addressable (using the DHT lookup) by individual location record keys provides a simple way of implementing translation functions similar to well-known network services (e.g. ARP, DNS, ENUM). The generic network location service presented in the paper is not supposed to be a substitution of the existing translation techniques (e.g. ARP, DNS, ENUM), but is considered an overlay service that uses data available in existing systems and provides some translations currently unavailable.
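
The core Chord idea described here (a record is stored at the first node whose hashed identifier is the successor of the record key on a circular identifier space) can be sketched minimally. This is an illustrative toy, not the paper's implementation: node names, the 16-bit ring size, and the record fields are all invented.

```python
import hashlib

# Minimal Chord-style sketch: device location records are stored at the
# successor node of the record key on a circular hash space.

M = 16                  # bits in the identifier space (toy value)
RING = 2 ** M

def h(key: str) -> int:
    """Map a name or record key onto the identifier ring."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % RING

class ChordRing:
    def __init__(self, node_names):
        self.nodes = sorted(h(n) for n in node_names)
        self.store = {nid: {} for nid in self.nodes}

    def successor(self, k: int) -> int:
        """First node id >= k, wrapping around the ring."""
        for nid in self.nodes:
            if nid >= k:
                return nid
        return self.nodes[0]

    def put(self, key, record):
        self.store[self.successor(h(key))][key] = record

    def get(self, key):
        return self.store[self.successor(h(key))].get(key)

ring = ChordRing(["node-a", "node-b", "node-c"])
ring.put("00:1b:44:11:3a:b7", {"ip": "10.0.0.7"})   # MAC -> IP, ARP-like
```

Because `put` and `get` use the same successor computation, any node can resolve a key with the standard Chord lookup, which is what makes the ARP/DNS/ENUM-style translations described above possible.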

  13. Model-data comparison of soil organic matter cycling: soil core scale

    Science.gov (United States)

    Wutzler, Thomas; Reichstein, Markus

    2010-05-01

    Soil organic matter (SOM) cycling is usually modeled as a donor-controlled process, most often by first-order kinetics. However, evidence contradicting this donor paradigm is appearing. One alternative hypothesis is that microbial consumers of SOM play an important role and need to be taken into account more explicitly. Here we link SOM cycling to the modeling of microbial growth kinetics. We set up a suite of alternative models of microbial growth. Explicitly modelling the cycling of a label across carbon pools allowed us to compare the model outputs to data of a soil priming experiment. The experimental data were taken from U. Hamer & B. Marschner (2002, Journal of Plant Nutrition and Soil Science 165(3)), who incubated several 14C-labelled substrates at 20°C for 26 days in a model system that consisted of sand mixed with lignin. Data streams of time series of total respiration, respiration from the labelled amendment, and prior information on model parameters were used to determine the posterior probability density function of the model parameters of each of the model variants and to calculate Bayes factors, the ratios of the likelihoods of the different model variants. This kind of data and Bayesian analysis is usable to compare model structures adapted to processes that determine the dynamics at this scale: co-limitation of depolymerization of older soil organic matter by both substrate and decomposers, preferential substrate usage, activation and deactivation and predation of microbes, and usage of both assimilated carbon and carbon of internal pools for maintenance and growth respiration.
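
The model-selection step named in this abstract, a Bayes factor as the ratio of two marginal likelihoods, can be illustrated with a toy calculation. The data and the two candidate "models" below are synthetic (a decay curve and two rate priors), not the paper's respiration models; the marginal likelihood is approximated by averaging the likelihood over prior samples.

```python
import math
import random

# Toy Bayes-factor sketch: marginal likelihood of each model is estimated
# by Monte Carlo averaging of the data likelihood over the model's prior.

random.seed(0)
data = [math.exp(-0.5 * t) for t in range(10)]   # synthetic decay observations

def log_likelihood(rate, sigma=0.05):
    """Gaussian log-likelihood of the data under an exponential-decay model."""
    return sum(-0.5 * ((y - math.exp(-rate * t)) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi))
               for t, y in enumerate(data))

def marginal_likelihood(prior_sampler, n=2000):
    return sum(math.exp(log_likelihood(prior_sampler())) for _ in range(n)) / n

# Model A: prior concentrated near the true rate; Model B: far from it.
mA = marginal_likelihood(lambda: random.uniform(0.3, 0.7))
mB = marginal_likelihood(lambda: random.uniform(2.0, 3.0))
bayes_factor = mA / mB   # > 1 favours model A
```

Note the automatic Occam effect: the marginal likelihood averages over the whole prior, so a model is rewarded only where its prior mass overlaps parameter values that actually fit the data.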

  14. On the extent of size range and power law scaling for particles of natural carbonate fault cores

    Science.gov (United States)

    Billi, Andrea

    2007-09-01

    To determine the size range and both the type and extent of the scaling laws for particles of loose natural carbonate fault rocks, six granular fault cores from Mesozoic carbonate strata of central Italy were sampled. Particle size distributions of twelve samples were determined by combining sieving and sedimentation methods. Results show that, regardless of fault geometry, kinematics, and tectonic history, the size of fault rock particles follows a power law distribution across approximately four orders of magnitude. The fractal dimension ( D) of the particle size distribution in the analysed samples ranges between ˜2.0 and ˜3.5. A lower bound to the power law trend is evident in all samples except those with the highest D-values; in these samples, the smallest analysed particles (˜0.0005 mm in diameter) were also included in the power law interval, meaning that the lower size limit of the power law distribution decreases with increasing D-values and that the smallest particles start to be comminuted with increasing strain (i.e. increasing fault displacement and D-values). With increasing D-values, the largest particles also tend to decrease in number, but this evidence may be affected by a censoring bias connected with the sample size. Stick-slip behaviour is suggested for the studied faults on the basis of the inferred particle size evolutions. Although further analyses are necessary to make the results of this study more generalizable, the preliminary definition of the scaling rules for fault rock particles may serve as a tool for predicting the full size range of fault rock particles once a limited range is known. In particular, data from this study may prove useful as input numbers in numerical models addressing the packing of fault rock particles for frictional and hydraulic purposes.
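
The power-law scaling analysis behind the fractal dimension D can be sketched numerically: for a cumulative distribution N(>d) proportional to d^(-D), the log-log plot of cumulative count against diameter is a straight line of slope -D. The sketch below fits that slope on a synthetic particle sample (invented values, not the paper's sieve/sedimentation data).

```python
import math
import random

# Estimate the fractal dimension D of a particle size distribution from
# the slope of log N(>d) vs log d, using a synthetic power-law sample
# drawn by inverse-transform sampling: d = d_min * u^(-1/D).

random.seed(1)
D_true, d_min = 2.5, 0.0005                       # d_min in mm
diams = sorted(d_min * random.random() ** (-1.0 / D_true)
               for _ in range(5000))

# Cumulative counts: the i-th smallest diameter has ~(n - i) particles >= it.
xs = [math.log(d) for d in diams[:-1]]
ys = [math.log(len(diams) - i) for i in range(len(diams) - 1)]

# Ordinary least-squares slope of the log-log relation.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
D_est = -slope
```

On real sieve data the same fit would only be applied within the power-law interval, i.e. between the lower and upper bounds discussed in the abstract, since the tails bias the slope.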

  15. Small-scale disturbances in the stratigraphy of the NEEM ice core: observations and numerical model simulations

    Science.gov (United States)

    Jansen, D.; Llorens, M.-G.; Westhoff, J.; Steinbach, F.; Kipfstuhl, S.; Bons, P. D.; Griera, A.; Weikusat, I.

    2016-02-01

    Disturbances on the centimetre scale in the stratigraphy of the North Greenland Eemian Ice Drilling (NEEM) ice core (North Greenland) can be mapped by an optical line scanner as long as the ice has visual layering, such as, for example, cloudy bands. Different focal depths allow, to a certain extent, a three-dimensional view of the structures. In this study we present a detailed analysis of the visible folds, discuss their characteristics and frequency, and present examples of typical fold structures. We also analyse the structures with regard to the deformation boundary conditions under which they formed. The structures evolve from gentle waves at about 1500 m to overturned z folds with increasing depth. Occasionally, the folding causes significant thickening of layers. Their similar fold shape indicates that they are passive features and are probably not initiated by rheology differences between alternating layers. Layering is heavily disturbed and tracing of single layers is no longer possible below a depth of 2160 m. C axes orientation distributions for the corresponding core sections were analysed, where available, in addition to visual stratigraphy. The data show axial-plane parallel strings of grains with c axis orientations that deviate from that of the matrix, which shows a single maximum fabric at the depth where the folding occurs. Numerical modelling of crystal viscoplastic deformation and dynamic recrystallisation was used to improve the understanding of the formation of the observed structures during deformation. The modelling reproduces the development of bands of grains with a tilted-lattice orientation relative to the single maximum fabric of the matrix, and also the associated local deformation. We conclude from these results that the observed folding can be explained by formation of these tilted-lattice bands.

  16. Higher fine-scale genetic structure in peripheral than in core populations of a long-lived and mixed-mating conifer - eastern white cedar (Thuja occidentalis L.)

    Directory of Open Access Journals (Sweden)

    Pandey Madhav

    2012-04-01

    Full Text Available Abstract Background: Fine-scale or spatial genetic structure (SGS) is one of the key genetic characteristics of plant populations. Several evolutionary and ecological processes and population characteristics influence the level of SGS within plant populations. Higher fine-scale genetic structure may be expected in peripheral than core populations of long-lived forest trees, owing to the differences in the magnitude of operating evolutionary and ecological forces such as gene flow, genetic drift, effective population size and founder effects. We addressed this question using eastern white cedar (Thuja occidentalis) as a model species for declining to endangered long-lived tree species with a mixed-mating system. Results: We determined the SGS in two core and two peripheral populations of eastern white cedar from its Maritime Canadian eastern range using six nuclear microsatellite DNA markers. Significant SGS ranging from 15 m to 75 m distance classes was observed in the four studied populations. An analysis of the four populations combined revealed significant positive SGS up to the 45 m distance class. The mean positive significant SGS observed in the peripheral populations was up to six times (up to 90 m) that observed in the core populations (15 m). Spatial autocorrelation coefficients and correlograms of single and sub-sets of populations were statistically significant. The extent of within-population SGS was significantly negatively correlated with all genetic diversity parameters. Significant heterogeneity of within-population SGS was observed for 0-15 m and 61-90 m between core and peripheral populations. Average Sp and gene flow distances were higher in peripheral (Sp = 0.023, σg = 135 m) than in core (Sp = 0.014, σg = 109 m) populations. However, the mean neighborhood size was higher in the core (Nb = 82) than in the peripheral (Nb = 48) populations. Conclusion: Eastern white cedar populations have significant fine-scale genetic structure at short

  17. Generic medications in ophthalmology.

    Science.gov (United States)

    Zore, Matt; Harris, Alon; Tobe, Leslie Abrams; Siesky, Brent; Januleviciene, Ingrida; Behzadi, Jennifer; Amireskandari, Annahita; Egan, Patrick; Garff, Kevin; Wirostko, Barbara

    2013-03-01

    The purpose of this review is to discuss the process of genericisation of medications in the US and Europe with a focus on ophthalmic drugs. Regulatory guidelines of the US Food and Drug Administration and the European Medicines Agency will be discussed, and the advantages and concerns of genericisation will be explored. We will look at various studies concerning the safety and efficacy of generic drugs compared to their branded counterparts. In particular, the challenges of assuring bioequivalence and therapeutic equivalence in topical ophthalmic drugs will be examined.

  18. Generic patch inference

    DEFF Research Database (Denmark)

    Andersen, Jesper; Lawall, Julia

    2010-01-01

    A key issue in maintaining Linux device drivers is the need to keep them up to date with respect to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spdiff, that identifies common changes...... developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect...

  19. Grain-scale imaging and compositional characterization of cryo-preserved India NGHP 01 gas-hydrate-bearing cores

    Science.gov (United States)

    Stern, Laura A.; Lorenson, T.D.

    2014-01-01

    We report on grain-scale characteristics and gas analyses of gas-hydrate-bearing samples retrieved by NGHP Expedition 01 as part of a large-scale effort to study gas hydrate occurrences off the eastern-Indian Peninsula and along the Andaman convergent margin. Using cryogenic scanning electron microscopy, X-ray spectroscopy, and gas chromatography, we investigated gas hydrate grain morphology and distribution within sediments, gas hydrate composition, and methane isotopic composition of samples from Krishna–Godavari (KG) basin and Andaman back-arc basin borehole sites from depths ranging 26 to 525 mbsf. Gas hydrate in KG-basin samples commonly occurs as nodules or coarse veins with typical hydrate grain size of 30–80 μm, as small pods or thin veins 50 to several hundred microns in width, or disseminated in sediment. Nodules contain abundant and commonly isolated macropores, in some places suggesting the original presence of a free gas phase. Gas hydrate also occurs as faceted crystals lining the interiors of cavities. While these vug-like structures constitute a relatively minor mode of gas hydrate occurrence, they were observed in near-seafloor KG-basin samples as well as in those of deeper origin (>100 mbsf) and may be original formation features. Other samples exhibit gas hydrate grains rimmed by NaCl-bearing material, presumably produced by salt exclusion during original hydrate formation. Well-preserved microfossil and other biogenic detritus are also found within several samples, most abundantly in Andaman core material where gas hydrate fills microfossil crevices. The range of gas hydrate modes of occurrence observed in the full suite of samples suggests a range of formation processes were involved, as influenced by local in situ conditions. The hydrate-forming gas is predominantly methane with trace quantities of higher molecular weight hydrocarbons of primarily microbial origin. The composition indicates the gas hydrate is Structure I.

  20. Leveraging the power of multi-core platforms for large-scale geospatial data processing: Exemplified by generating DEM from massive LiDAR point clouds

    Science.gov (United States)

    Guan, Xuefeng; Wu, Huayi

    2010-10-01

    In recent years, improvements in spatial data acquisition technologies such as LiDAR have resulted in an explosive increase in the volume of spatial data, presenting unprecedented challenges for computation capacity. At the same time, the kernel of computing platforms, the CPU, evolved from a single-core to a multi-core architecture. This radical change significantly affects existing data processing algorithms. Exemplified by the problem of generating a DEM from massive airborne LiDAR point clouds, this paper studies how to leverage the power of multi-core platforms for large-scale geospatial data processing and demonstrates how multi-core technologies can improve performance. Pipelining is adopted to exploit the thread-level parallelism of multi-core platforms. First, raw point clouds are partitioned into overlapped blocks. Second, these discrete blocks are interpolated concurrently on parallel pipelines. As interpolation runs, intermediate results are sorted and finally merged into an integrated DEM. This parallelization demonstrates the great potential of multi-core platforms, with high data throughput and a low memory footprint. The approach achieves excellent performance speedup with greatly reduced processing time. For example, on a 2.0 GHz quad-core Intel Xeon platform, the proposed parallel approach can process approximately one billion LiDAR points (16.4 GB) in about 12 min and produce a 27,500×30,500 raster DEM, using less than 800 MB of main memory.
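
The partition-interpolate-merge pipeline described above can be sketched in miniature: split a point cloud into overlapped blocks along one axis, process the blocks concurrently, and collect the results in block order. This is an illustrative toy (a tiny synthetic cloud and a one-value-per-block "interpolation"), not the paper's DEM code.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy pipeline: overlapped block partition -> concurrent per-block
# interpolation -> ordered merge. Points are (x, y, z) tuples.

points = [(x * 0.1, y * 0.1, x + y) for x in range(100) for y in range(100)]

def partition(points, n_blocks, overlap=0.5):
    """Split points into n_blocks x-ranges, widened by `overlap` on each
    side so neighbouring blocks share boundary points."""
    xmax = max(p[0] for p in points)
    w = xmax / n_blocks
    return [[p for p in points
             if i * w - overlap <= p[0] <= (i + 1) * w + overlap]
            for i in range(n_blocks)]

def interpolate(block):
    # One value per block here; real code would rasterize the block.
    return sum(z for _, _, z in block) / len(block)

blocks = partition(points, n_blocks=4)
with ThreadPoolExecutor(max_workers=4) as pool:
    dem = list(pool.map(interpolate, blocks))   # merged, in block order
```

The overlap is what lets each block be interpolated independently without edge artifacts, which is the property that makes the pipeline parallelism in the paper possible; `ThreadPoolExecutor.map` preserves input order, so the merge step stays trivial.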

  1. Multi-scale analysis on last millennium climate variations in Greenland by its ice core oxygen isotope

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The empirical mode decomposition method is used for analyzing the paleoclimate proxy δ18O from the Greenland GISP2 ice core. The results show that millennial climate change trends in Greenland record the Medieval Warm Period (MWP) from 860 AD to 1350 AD, lasting about 490 years, and the Little Ice Age (LIA) from 1350 AD to 1920 AD, lasting about 570 years. During these events, smaller-scale cooling-warming variations occurred. The record's multi-scale oscillations changed with quasi-periods of 3, 6.5, 12, 24, 49, 96, 213 and 468 years, and are affected not only by ENSO but also by solar activity. The oscillations of intrinsic mode functions IMF7 and IMF8 and their tendency change markedly around 1350 AD, which is considered the key stage of the transformation between the MWP and the LIA. The results give more detailed changes and their stages of millennial climate change in the high-latitude areas of the Northern Hemisphere.

  2. .NET 4.0 Generics Beginner's Guide

    CERN Document Server

    Mukherjee, Sudipta

    2012-01-01

    This is a concise, practical guide that will help you learn Generics in .NET, with lots of real-world and fun-to-build examples and clear explanations. It is packed with screenshots to aid your understanding of the process. This book is aimed at beginners in Generics. It assumes some working knowledge of C#, but it isn't mandatory. The following would get the most use out of the book: newbie C# developers struggling with Generics, and experienced C++ and Java programmers who are migrating to C# and looking for an alternative to other generic frameworks like STL and JCF.

  3. Generic Quantum Fourier Transforms

    CERN Document Server

    Moore, Cristopher; Rockmore, Daniel; Russell, Alexander

    2003-01-01

    The quantum Fourier transform (QFT) is the principal algorithmic tool underlying most efficient quantum algorithms. We present a generic framework for the construction of efficient quantum circuits for the QFT by "quantizing" the separation of variables technique that has been so successful in the study of classical Fourier transform computations. Specifically, this framework applies the existence of computable Bratteli diagrams, adapted factorizations, and Gel'fand-Tsetlin bases to offer efficient quantum circuits for the QFT over a wide variety of finite Abelian and non-Abelian groups, including all group families for which efficient QFTs are currently known and many new group families. Moreover, the method gives rise to the first subexponential-size quantum circuits for the QFT over the linear groups GL_k(q), SL_k(q), and the finite groups of Lie type, for any fixed prime power q.

  4. Generic torus canards

    Science.gov (United States)

    Vo, Theodore

    2017-10-01

    Torus canards are special solutions of fast/slow systems that alternate between attracting and repelling manifolds of limit cycles of the fast subsystem. A relatively new dynamic phenomenon, torus canards have been found in neural applications to mediate the transition from tonic spiking to bursting via amplitude-modulated spiking. In R3, torus canards are degenerate: they require one-parameter families of 2-fast/1-slow systems in order to be observed and even then, they only occur on exponentially thin parameter intervals. The addition of a second slow variable unfolds the torus canard phenomenon, making it generic and robust. That is, torus canards in fast/slow systems with (at least) two slow variables occur on open parameter sets. So far, generic torus canards have only been studied numerically, and their behaviour has been inferred based on averaging and canard theory. This approach, however, has not been rigorously justified since the averaging method breaks down near a fold of periodics, which is exactly where torus canards originate. In this work, we combine techniques from Floquet theory, averaging theory, and geometric singular perturbation theory to show that the average of a torus canard is a folded singularity canard. In so doing, we devise an analytic scheme for the identification and topological classification of torus canards in fast/slow systems with two fast variables and k slow variables, for any positive integer k. We demonstrate the predictive power of our results in a model for intracellular calcium dynamics, where we explain the mechanisms underlying a novel class of elliptic bursting rhythms, called amplitude-modulated bursting, by constructing the torus canard analogues of mixed-mode oscillations. We also make explicit the connection between our results here with prior studies of torus canards and torus canard explosion in R3, and discuss how our methods can be extended to fast/slow systems of arbitrary (finite) dimension.

  5. Article choice in plural generics

    NARCIS (Netherlands)

    Farkas, D.F.; Swart, Henriëtte de

    2007-01-01

    We discuss two groups of languages where article use contrasts in generic plural sentences but is otherwise essentially similar. The languages in the first group (English and Dutch) use bare plurals in the expression of kind reference (‘Dinosaurs are extinct’) and in generic generalizations (‘Dogs a

  6. Large-scale fabrication and application of magnetite coated Ag NW-core water-dispersible hybrid nanomaterials.

    Science.gov (United States)

    Wang, Baoyu; Zhang, Min; Li, Weizhen; Wang, Linlin; Zheng, Jing; Gan, Wenjun; Xu, Jingli

    2015-05-07

    In this work, we report a large-scale synthetic procedure that allows attachment of magnetite nanoparticles onto Ag NWs in situ, conducted in a triethylene glycol (TREG) solution with iron acetylacetonate and Ag NWs as starting materials. The as-prepared Ag NW/Fe3O4 NP composites are well characterized by SEM, TEM, XRD, XPS, FT-IR, and VSM techniques. It was found that the mass ratio of iron acetylacetonate to Ag NWs plays a crucial role in controlling the amount of magnetite nanoparticles decorated on the Ag NWs. The resulting Ag NW/Fe3O4 NP composites exhibit superparamagnetic properties at room temperature, and can be well dispersed in aqueous and organic solutions, which is greatly beneficial for their application and functionality. Thus, the as-prepared magnetic silver nanowires show good catalytic activity, using the catalytic reduction of methylene blue (MB) as a model reaction. Furthermore, the Ag NW/Fe3O4 NP composites can be functionalized with polydopamine (Pdop), resorcinol-formaldehyde resin (PFR), and SiO2, respectively, in aqueous/ethanol solution. Meanwhile they can also be coated with polyphosphazene (PZS) in organic solution, resulting in a unique nanocable with well-defined core-shell structures. In addition, taking Ag NW/Fe3O4@SiO2 as an example, a hollow magnetic silica nanotube can be obtained with the use of Ag NWs as physical templates and a solution of ammonium and H2O2. These routes can greatly extend the applications of the Ag NW/Fe3O4 NP composites. The above as-synthesized nanocomposites have high potential for applications in the fields of polymers, wastewater treatment, sensors, and biomaterials.

  7. Theoretical simulation of a polarization splitter based on dual-core soft glass PCF with micron-scale gold wire

    Science.gov (United States)

    Liu, Qiang; Li, Shuguang; Wang, Xinyu; Shi, Min

    2016-12-01

    A polarization splitter based on dual-core soft glass photonic crystal fiber (PCF) filled with micron-scale gold wire is proposed. The characteristics of the polarization splitter are studied by changing the structural parameters of the PCF and the diameter of the gold wire with the finite element method (FEM). The simulation results reveal that the coupling length ratio of the soft glass-based PCF is close to 2 and the corresponding curve is flatter than that of the silica-based PCF. The bandwidth in which the extinction ratio is below -20 dB reaches 226 nm for the soft glass-based PCF, i.e., from 1465 nm to 1691 nm, which is competitive among reported polarization splitters; the corresponding bandwidth is just 32 nm for the silica-based PCF. The insertion loss of our polarization splitter is just 0.00248 dB and 0.43 dB at the wavelengths of 1.47 μm and 1.55 μm, respectively. The birefringence is obviously increased and the coupling length is decreased by filling gold wire into the soft glass-based or the silica-based PCF. Also, the birefringence of the silica-based PCF is much larger than that of the soft glass-based PCF whether or not the gold wire is introduced. The fabrication tolerance of the polarization splitter is also considered by changing the structural parameters. The polarization splitter possesses broad bandwidth, low insertion loss, simple structure and high fabrication tolerance. Project supported by the National Natural Science Foundation of China (Grant Nos. 61178026, 61475134, and 61505175).

  8. Phase-matched waveguide four-wave mixing scaled to higher peak powers with large-core-area hollow photonic-crystal fibers.

    Science.gov (United States)

    Konorov, S O; Serebryannikov, E E; Fedotov, A B; Miles, R B; Zheltikov, A M

    2005-05-01

    Hollow photonic-crystal fibers with large core diameters are shown to allow waveguide nonlinear-optical interactions to be scaled to higher pulse peak powers. Phase-matched four-wave mixing is predicted theoretically and demonstrated experimentally for millijoule nanosecond pulses propagating in a hollow photonic-crystal fiber with a core diameter of about 50 μm, suggesting a way to substantially enhance the efficiency of nonlinear-optical spectral transformations and wave mixing of high-power laser pulses in the gas phase.
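
Why a larger hollow core helps phase matching can be shown with a back-of-envelope calculation. Assuming (as a simplification, not the paper's full PCF model) the capillary-like mode dispersion beta(omega) ≈ omega/c - u^2*c/(2*a^2*omega) for core radius a and mode constant u ≈ 2.405, the waveguide contribution to the four-wave-mixing mismatch 2*beta_p - beta_s - beta_i scales as 1/a^2. The wavelengths below are illustrative.

```python
import math

# Waveguide contribution to the FWM phase mismatch in a hollow guide,
# under the assumed capillary dispersion beta = w/c - u^2*c/(2*a^2*w).

c, u = 3e8, 2.405   # speed of light (m/s), EH11 mode constant

def beta(omega, a):
    return omega / c - u**2 * c / (2 * a**2 * omega)

def mismatch(lp, ls, li, a):
    """2*beta_p - beta_s - beta_i for pump/signal/idler wavelengths."""
    w = lambda lam: 2 * math.pi * c / lam
    return 2 * beta(w(lp), a) - beta(w(ls), a) - beta(w(li), a)

# Energy conservation 2/lp = 1/ls + 1/li fixes the idler wavelength.
lp, ls = 800e-9, 600e-9
li = 1 / (2 / lp - 1 / ls)

dk_small = abs(mismatch(lp, ls, li, a=5e-6))    # 10 um core diameter
dk_large = abs(mismatch(lp, ls, li, a=25e-6))   # 50 um core diameter
```

Since the frequency-linear parts of beta cancel exactly under energy conservation, the residual mismatch is purely the waveguide term, and going from a 10 μm to a 50 μm core reduces it by a factor of 25 (the square of the radius ratio), while the larger mode area simultaneously raises the damage-limited peak power.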

  9. Granulation of core particles suitable for film coating by agitation fluidized bed III. Effect of scale, agitator rotational speed and blade shape on granule properties and development of a high accuracy scale-up theory.

    Science.gov (United States)

    Hamashita, Tomohiro; Ono, Tetsuo; Ono, Masaki; Tsunenari, Yoshinobu; Aketo, Takao; Watano, Satoru

    2009-04-01

    The preparation of core particles suitable for subsequent film coating was examined using different scales of agitation fluidized beds. Specifically, the effects of agitator rotational speed and agitator blade shape in different scales of granulators on granule properties such as mass median diameter, apparent density, friability and shape factor were studied. As the agitator rotational speed was increased or when the agitator blade height and angle were large, the mass median diameter and friability of the granules decreased, while the apparent density and shape factor increased, in a manner independent of the vessel size because the granules were subjected to greater compression, shearing and rolling effects. The same core particles could not be prepared using granulators with different vessel sizes by simply adopting a conventional scale-up theory(1,2) based on kinetic energy similarity. Here, a novel scale-up theory that takes into account agitator blade shape factors is proposed.(3) When the two scale-up theories were compared, our new theory was capable of predicting the granule properties more accurately than the conventional theory. By adopting this novel theory, the same core particles could be prepared under different operating conditions in any scale of granulator.

  10. Algorithms and data structures for massively parallel generic adaptive finite element codes

    KAUST Repository

    Bangerth, Wolfgang

    2011-12-01

    Today's largest supercomputers have 100,000s of processor cores and offer the potential to solve partial differential equations discretized by billions of unknowns. However, the complexity of scaling to such large machines and problem sizes has so far prevented the emergence of generic software libraries that support such computations, although these would lower the threshold of entry and enable many more applications to benefit from large-scale computing. We are concerned with providing this functionality for mesh-adaptive finite element computations. We assume the existence of an "oracle" that implements the generation and modification of an adaptive mesh distributed across many processors, and that responds to queries about its structure. Based on querying the oracle, we develop scalable algorithms and data structures for generic finite element methods. Specifically, we consider the parallel distribution of mesh data, global enumeration of degrees of freedom, constraints, and postprocessing. Our algorithms remove the bottlenecks that typically limit large-scale adaptive finite element analyses. We demonstrate scalability of complete finite element workflows on up to 16,384 processors. An implementation of the proposed algorithms, based on the open source software p4est as mesh oracle, is provided under an open source license through the widely used deal.II finite element software library. © 2011 ACM 0098-3500/2011/12-ART10 $10.00.
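
One of the building blocks named in this abstract, global enumeration of degrees of freedom, is commonly done with an exclusive prefix sum over per-process counts, so that each process owns a contiguous, globally unique index range. The sketch below simulates that step serially with invented counts; it is an illustration of the general technique, not deal.II's actual implementation.

```python
# Global DOF enumeration sketch: each process owns some locally created
# DOFs; an exclusive prefix sum of the per-process counts yields each
# process's starting global index, so indices are unique without any
# further communication.

local_counts = [5, 3, 7, 4]          # DOFs owned by processes 0..3 (toy)

offsets = [0]                        # exclusive prefix sum
for c in local_counts[:-1]:
    offsets.append(offsets[-1] + c)

global_ids = [list(range(off, off + c))
              for off, c in zip(offsets, local_counts)]
```

In an actual MPI code the prefix sum is a single `MPI_Exscan`, after which only indices of DOFs shared between neighbouring processes need to be exchanged.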

  11. USGS Small-scale Dataset - 1:1,000,000-Scale Core Based Statistical Areas 201309 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer portrays Core Based Statistical Areas in the United States and Puerto Rico. The map layer was created from the CENSUS 2010 TIGER/Line files produced...

  12. A refined TALDICE-1a age scale from 55 to 112 ka before present for the Talos Dome ice core based on high-resolution methane measurements

    Directory of Open Access Journals (Sweden)

    S. Schüpbach

    2011-09-01

    A precise synchronization of different climate records is indispensable for a correct dynamical interpretation of paleoclimatic data. A chronology for the TALDICE ice core from the Ross Sea sector of East Antarctica has recently been presented, based on methane synchronization with the Greenland and EDC ice cores and on δ18Oice synchronization with EDC in the bottom part (TALDICE-1). Using new high-resolution methane data obtained with a continuous flow analysis technique, we present a refined age scale for the interval from 55–112 thousand years (ka) before present, where TALDICE is synchronized with EDC. New and more precise tie points reduce the uncertainties of the age scale from up to 1900 yr in TALDICE-1 to below 1100 yr over most of the refined interval and shift the Talos Dome dating to significantly younger ages during the onset of Marine Isotope Stage 3. Thus, discussions of climate dynamics at sub-millennial time scales are now possible back to 110 ka, in particular during the inception of the last ice age. Calcium data from EDC and TALDICE are compared to show the impact of the refinement on the synchronization of the two ice cores, not only for the gas but also for the ice age scale.
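    The mechanics of such a tie-point chronology can be illustrated with a piecewise-linear age model between synchronization points; the depths and ages below are made-up numbers, not TALDICE data:

```python
import numpy as np

# Hypothetical methane tie points: depth [m] -> EDC-synchronized age [ka BP].
tie_depth = np.array([1200.0, 1260.0, 1310.0, 1350.0])
tie_age   = np.array([  55.0,   72.0,   90.0,  112.0])

def age_at(depth):
    """Piecewise-linear age model between tie points (the simplest possible
    transfer function; real ice-core chronologies use more careful models
    and propagate tie-point uncertainties)."""
    return np.interp(depth, tie_depth, tie_age)

assert age_at(1200.0) == 55.0          # exact at a tie point
assert 72.0 < age_at(1285.0) < 90.0    # interpolated between tie points
```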

  13. A refined TALDICE-1a age scale from 55 to 112 ka before present for the Talos Dome ice core based on high-resolution methane measurements

    Directory of Open Access Journals (Sweden)

    S. Schüpbach

    2011-04-01

    A precise synchronization of different climate records is indispensable for a correct dynamical interpretation of paleoclimatic data. A chronology for the TALDICE ice core from the Ross Sea sector of East Antarctica has recently been presented, based on methane synchronization with the Greenland and EDC ice cores and on δ18Oice synchronization with EDC in the bottom part (TALDICE-1). Using new high-resolution methane data obtained with a continuous flow analysis technique, we present a refined age scale for the interval from 55–112 ka before present, where TALDICE is synchronized with EDC. New and more precise tie points reduce the uncertainties of the age scale from up to 2000 yr in TALDICE-1 to below 1000 yr over most of the refined interval. Thus, discussions of climate dynamics at sub-millennial time scales are now possible back to 110 ka, in particular during the inception of the last ice age. Calcium data from EDC and TALDICE are compared to show the impact of the refinement on the synchronization of the two ice cores, not only for the gas but also for the ice age scale.

  14. Large-scale synthesis of nearly monodisperse CdSe/CdS core/shell nanocrystals using air-stable reagents via successive ion layer adsorption and reaction.

    Science.gov (United States)

    Li, J Jack; Wang, Y Andrew; Guo, Wenzhuo; Keay, Joel C; Mishima, Tetsuya D; Johnson, Matthew B; Peng, Xiaogang

    2003-10-15

    Successive ion layer adsorption and reaction (SILAR), originally developed for the deposition of thin films on solid substrates from solution baths, is introduced as a technique for the growth of high-quality core/shell nanocrystals of compound semiconductors. The shell was grown one monolayer at a time by alternating injections of air-stable and inexpensive cationic and anionic precursors into the reaction mixture containing the core nanocrystals. The principles of SILAR were demonstrated on the CdSe/CdS core/shell model system, using its shell-thickness-dependent optical spectra as probes, with CdO and elemental S as the precursors. For this reaction system, a relatively high temperature, about 220-240 degrees C, was found to be essential for SILAR to fully occur. The synthesis can be readily performed on a multigram scale. The size distribution of the core/shell nanocrystals was maintained even after five monolayers of CdS shell (equivalent to about a 10 times volume increase for a 3.5 nm CdSe nanocrystal) were grown onto the core nanocrystals. The epitaxial growth of the core/shell structures was verified by optical spectroscopy, TEM, XRD, and XPS. The photoluminescence quantum yield (PL QY) of the as-prepared CdSe/CdS core/shell nanocrystals ranged from 20% to 40%, and the PL full-width at half-maximum (fwhm) was maintained between 23 and 26 nm, even for those nanocrystals for which the UV-vis and PL peaks red-shifted by about 50 nm from those of the core nanocrystals. Several types of brightening phenomena were observed, some of which can further boost the PL QY of the core/shell nanocrystals. The CdSe/CdS core/shell nanocrystals were found to be superior to the highly luminescent CdSe plain core nanocrystals. The SILAR technique reported here can also be used for the growth of complex colloidal semiconductor nanostructures, such as quantum shells and colloidal quantum wells.
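    The quoted "~10 times volume increase" for five CdS monolayers on a 3.5 nm CdSe core can be sanity-checked with simple sphere geometry, assuming roughly 0.35 nm of radial growth per monolayer (an approximate value, not taken from the paper):

```python
def volume_ratio(core_diam_nm, n_monolayers, ml_thickness_nm=0.35):
    """(core+shell volume) / (core volume) for a spherical core, assuming
    a fixed radial thickness per deposited monolayer."""
    r0 = core_diam_nm / 2.0
    r = r0 + n_monolayers * ml_thickness_nm
    return (r / r0) ** 3

ratio = volume_ratio(3.5, 5)
assert 7.9 < ratio < 8.1   # radius doubles, so the volume grows 8-fold
```

With these assumptions the ratio comes out at 8, the same order as the quoted ~10x.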

  15. Wide-scale detection of earthquake waveform doublets and further evidence for inner core super-rotation

    Science.gov (United States)

    Zhang, Jian; Richards, Paul G.; Schaff, David P.

    2008-09-01

    We report on more than 100 earthquake waveform doublets in five subduction zones, including an earthquake nest in Bucaramanga, Colombia. Each doublet is presumed to be a pair of earthquakes that repeat at essentially the same location. These doublets are important for studying earthquake physics, as well as temporal changes of the inner core. Particularly, our observation from one South Sandwich Islands (SSI) doublet recorded at station INK in Canada shows an inner core traveltime change of ~0.1 s over ~6 yr, confirming the inner-core differential motion occurring beneath Central America. Observations from one Aleutian Islands doublet, recorded at station BOSA in South Africa, and from one Kuril Islands doublet, recorded at station BDFB in Brazil, show an apparent inner core traveltime change of ~0.1 s over ~7 yr and ~6 yr, respectively, providing evidence for the temporal change of inner core properties beneath Central Asia and Canada, respectively. On the other hand, observations from one Tonga-Fiji-Solomon Islands doublet, recorded at station PTGA in Brazil, and from one Bucaramanga doublet, recorded at station WRAB in Australia and station CHTO in Thailand, show no/little temporal change (no more than 0.005 s yr-1, if any) of inner core traveltimes for the three corresponding ray paths for which the path in the inner core is nearly parallel to the equatorial plane. Such a pattern of observations showing both presence and possible absence of inner-core traveltime change can be explained by the geometry and relative directions of ray path, lateral velocity gradient and inner-core particle motion due to an eastward super-rotation of a few tenths of a degree per year.
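    The contrast the authors draw can be made explicit with the numbers in the abstract: a change of ~0.1 s over ~6 yr on the paths sampling inner-core motion, versus an upper bound of 0.005 s/yr on the near-equatorial paths:

```python
def drift_rate(dt_s, span_yr):
    """Apparent inner-core differential travel-time change per year."""
    return dt_s / span_yr

ssi_rate = drift_rate(0.1, 6.0)   # SSI doublet at INK: ~0.017 s/yr
equatorial_bound = 0.005          # quoted upper bound for equatorial paths
assert ssi_rate > 3 * equatorial_bound   # drifting vs. non-drifting paths
```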

  16. Scale Model Test and Transient Analysis of Steam Injector Driven Passive Core Injection System for Innovative-Simplified Nuclear Power Plant

    Science.gov (United States)

    Ohmori, Shuichi; Narabayashi, Tadashi; Mori, Michitsugu

    A steam injector (SI) is a simple, compact and passive pump that also acts as a high-performance direct-contact compact heater. This enables the SI to serve as a direct-contact feed-water heater that heats feed-water using steam extracted from the turbine. Our technology development aims to significantly simplify equipment and reduce physical quantities by applying "high-efficiency SIs", which are applicable to a wide range of operating regimes beyond the performance and applicable range of existing SIs and enable unprecedented multistage and parallel operation, to the low-pressure feed-water heaters and emergency core cooling system of nuclear power plants, and to achieve high inherent safety against severe accidents by keeping the core covered with water (a severe-accident-free concept). This paper describes the results of the scale model test and the transient analysis of the SI-driven passive core injection system (PCIS).

  17. Gamma-Ray Measurements of Naturally Occurring Radioactive Materials in Sludge, Scale and Well Cores of the Oil Industry in Southern Iraq

    Directory of Open Access Journals (Sweden)

    Abdul Ridha Hussain SUBBER

    2013-12-01

    The radioactivity of the nuclides 238U, 226Ra, 232Th and 40K was measured in soil by γ-ray spectrometry using a NaI(Li) detector. A criterion was set in order to analyze sludge samples from oil fields and oil well cores in southern Basrah, in the Iraq oil fields. More than three γ-ray energy peaks were used for the determination of the 226Ra and 232Th activity concentrations to obtain more accurate results. Relationships between the measured radionuclides are discussed. The radionuclides 238U and 226Ra were found in disequilibrium, with a ratio of specific activities (238U/226Ra) less than unity for most of the sludge and core samples. The content of radioactive elements in the sludge, scale and well cores is within the range of other petroleum countries in the region. doi:10.14456/WJST.2014.93

  18. 助产士核心胜任力量表信度和效度研究%Midwife Core Competency Scale: Reliability and validity assessment

    Institute of Scientific and Technical Information of China (English)

    王德慧; 陆虹; 孙红

    2011-01-01

    目的:对助产士核心胜任力量表进行信度和效度的检测.方法:采用文献回顾的方法,重点参考国际助产联盟制定的助产士胜任力标准,通过助产专业的专家,形成助产士核心胜任力量表,并对北京市19家医院的300名助产士进行测评,对量表进行信度和效度分析,最终形成量表.结果:有效量表295份.助产士核心胜任力量表共由6个维度,54项条目组成,其内部一致性Cronbach'sα系数为0.978,各分维度的Cronbach'sα系数为0.921 ~ 0.938之间,均在0.9以上,总量表的内容效度比为0.95,结构效度6个因子的累计解释变量为70.927%,均在测量学可接受的范围.结论:该助产士核心胜任力量表具有良好的信度和效度,条目设置适用于我国助产士核心胜任力的评价.%Objective: To develop a scale to assess the midwife's core competency and to test the reliability and validity of Midwife Core Competency Scale. Methods: We developed Midwife Core Competency Scale through literature interview whose key points were the essential competencies for midwifery practice formulated by International Confederation of Midwives (ICM), through the midwifery experts consultation, through the investigation of 300 midwives from nineteen hospital in Beijing, and through the assessment of the reliability and validity of the questionnaire. Results: The number of effective scale was 295. The Scale comprised 6 dimensions and 54 items. The internal consistency Cronbach's a coefficient was 0.978.The content validity index was 0.95. The construct validity yielded six factors with an cumulative explained variance of 70.927% which was in the acceptable range. Conclusions: Midwife Core Competency Scale can be considered a reliable and valid scale for assessing midwifes' core competency.

  19. Encouraging generic use can yield significant savings.

    Science.gov (United States)

    Zimmerman, Christina

    2012-11-01

    Key findings. (1) Zero copayment for generic drugs is the greatest influencer of generic statin utilization. (2) Both higher copayments for generic drugs and lower copayments for competing brands are associated with a decreased probability of using generic statins. (3) Prior authorization and step therapy requirements for brand-name statins are associated with an increased use of generic drugs. (4) Greater use of generic statins should reduce costs for patients, plans, and Medicare.

  20. The global and persistent millennial-scale variability in the thermoluminescence profiles of shallow and deep mediterranean sea cores

    Energy Technology Data Exchange (ETDEWEB)

    Cini Castagnoli, G.; Bonino, G.; Taricco, C. [Turin Univ. (Italy). Dip. di Fisica Generale]|[CNR, Turin (Italy). Ist. di Cosmo-Geofisica

    1998-07-01

    In this paper the Authors present the thermoluminescence (TL) profile in the last 7500 y, measured in the upper part of the deep Tyrrhenian sea core CT85-5. The Authors show that millennial periodicity persisted during the last deglaciacion. The transition to Holocene was determined in our core by the oxygen isotope ratio {delta} {sup 18}O measured in Globigerina bulloides. The fact that the observed TL changes do not have a local character is also suggested by the excellent agreement between this deep sea TL profile of the uppermost part of the core and the TL profile measured in the shallow Ionian sea GT89-3 core over the last 2500 y, with a time resolution of 3.096 y.

  1. Pt monolayer shell on hollow Pd core electrocatalysts: Scale up synthesis, structure, and activity for the oxygen reduction reaction

    Directory of Open Access Journals (Sweden)

    Vukmirovic Miomir B.

    2013-01-01

    We report on the synthesis, characterization and oxygen reduction reaction (ORR) kinetics of Pt monolayer shells on Pd (hollow) or Pd-Au (hollow) core electrocatalysts. A comparison between the ORR catalytic activity of the electrocatalysts with hollow cores and those of Pt solid and Pt hollow nanoparticles was obtained using the rotating disk electrode technique. Hollow nanoparticles were made using Ni or Cu nanoparticles as sacrificial templates. The Pt ORR specific and mass activities of the electrocatalysts with hollow cores were found to be considerably higher than those of the electrocatalysts with solid cores. We attribute this enhanced Pt activity to the smooth surface morphology and hollow-induced lattice contraction, in addition to the mass-saving geometry of hollow particles.

  2. The effects of core stability strength exercise on muscle activity and trunk impairment scale in stroke patients

    OpenAIRE

    Yu, Seong-Hun; Park, Seong-Doo

    2013-01-01

    The purpose of this study was to examine the effects of core stability-enhancing exercises on the lower trunk and muscle activity of stroke patients. The control group (n = 10) underwent standard exercise therapy, while the experimental group (n = 10) underwent both the core stability-enhancing exercise and standard exercise therapy simultaneously. The standard exercise therapy applied to the two groups included weight bearing and weight shifts and joint movements to improve flexibility and the ...

  3. Generic substitution - comparing the clinical efficacy of a generic ...

    African Journals Online (AJOL)

    1998-03-03

    Mar 3, 1998 ... Departments of Psychiatry and Biostatistics, University of the. Orange Free State ... industry has expanded rapidly during the last 2 decades! The need to contain the ... Substitution of a generic drug product for an innovator.

  4. Turbulence in the ICM from mergers, cool-core sloshing and jets: results from a new multi-scale filtering approach

    CERN Document Server

    Vazza, F; Brueggen, M

    2012-01-01

    We have designed a simple multi-scale method that identifies turbulent motions in hydrodynamical grid simulations. The method does not assume an a priori coherence scale to distinguish laminar and turbulent flows. Instead, the local mean velocity field around each cell is reconstructed with a multi-scale filtering technique, yielding the maximum scale of turbulent eddies by means of iterations. The method is robust, fast and easily applicable to any grid simulation. We present here the application of this technique to the study of spatial and spectral properties of turbulence in the intracluster medium, measuring turbulent diffusion and anisotropy of the turbulent velocity field for a variety of driving mechanisms: a) accretion of matter in galaxy clusters (simulated with ENZO); b) sloshing motions around cool cores (simulated with FLASH); c) jet outflows from AGN (simulated with FLASH). The turbulent velocities driven by matter accretion in galaxy clusters are mostly tangential in the inner regions (inside ...
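    The idea of reconstructing a local mean flow by filtering at progressively larger scales, without fixing a coherence scale in advance, can be sketched in 1D. This cartoon is an assumption-laden simplification of the published 3D, per-cell method:

```python
import numpy as np

def box_filter(v, w):
    """Centered moving average of half-width w (edge windows shrink)."""
    return np.array([v[max(0, i - w):i + w + 1].mean() for i in range(len(v))])

def turbulent_fluctuation(v, max_w=64, tol=1e-3):
    """Grow the filtering scale until the rms of the fluctuation
    v - <v>_w stops changing, then report the fluctuation field.
    The published method iterates this per cell in 3D; here one
    global scale is used for illustration."""
    prev = None
    fluct = v - box_filter(v, 2)
    for w in (2 ** k for k in range(1, 20)):
        if w > max_w:
            break
        fluct = v - box_filter(v, w)
        rms = float(np.sqrt(np.mean(fluct ** 2)))
        if prev is not None and abs(rms - prev) < tol * max(prev, 1e-30):
            break
        prev = rms
    return fluct

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 512)
v = np.sin(x) + 0.1 * rng.standard_normal(512)   # laminar trend + "turbulence"
fluct = turbulent_fluctuation(v)
assert np.std(fluct) < np.std(v)   # the large-scale trend has been removed
```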

  5. Generic interpreters and microprocessor verification

    Science.gov (United States)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  6. Generic ISIS Transport Module Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the Generic ISIS Transport Module is to provide a means to bring living specimens to and from orbit. In addition to living specimens, the module can...

  7. Hanford Generic Interim Safety Basis

    Energy Technology Data Exchange (ETDEWEB)

    Lavender, J.C.

    1994-09-09

    The purpose of this document is to identify WHC programs and requirements that are an integral part of the authorization basis for nuclear facilities that are generic to all WHC-managed facilities. The purpose of these programs is to implement the DOE Orders, as WHC becomes contractually obligated to implement them. The Hanford Generic ISB focuses on the institutional controls and safety requirements identified in DOE Order 5480.23, Nuclear Safety Analysis Reports.

  8. The Generic Data Capture Facility

    Science.gov (United States)

    Connell, Edward B.; Barnes, William P.; Stallings, William H.

    The Generic Data Capture Facility, which can provide data capture support for a variety of different types of spacecraft while enabling operations costs to be carefully controlled, is discussed. The data capture functions, data protection, isolation of users from data acquisition problems, data reconstruction, and quality and accounting are addressed. The TDM and packet data formats utilized by the system are described, and the development of generic facilities is considered.

  9. Knowledge Development Generic Framework Concept

    Science.gov (United States)

    2008-12-18

    This is the final MNE 5 document on Knowledge Development. Contact ZTransfBw Abt II CDE@bundeswehr.org for inquiries regarding subsequent updates beyond MNE 5 efforts. Version 1.30, 18 December 2008.

  10. Modelling of Generic Slung Load System

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Bendtsen, Jan Dimon; La Cour-Harbo, Anders

    2006-01-01

    This paper presents the result of modelling and verification of a generic slung load system using a small-scale helicopter. The model is intended for use in simulation, pilot training, estimation, and control. It is derived using a redundant coordinate formulation based on Gauss' Principle of Least Constraint using the Udwadia-Kalaba equation, and can be used to model all body-to-body slung load suspension types. The model gives an intuitive and easy-to-use way of modelling and simulating different slung load suspension types, and it includes detection of and response to wire slacking.

  11. Abstracting and Metrics of Core Frame Structure in Large-Scale Software Based on k-Core

    Institute of Scientific and Technical Information of China (English)

    李辉; 赵海; 郝立颖; 何滨

    2011-01-01

    A case study on the hierarchical structure of large-scale open-source software shows that software systems exhibit a flat hierarchical structure. On this basis, we use the k-core to divide the software system structure into layers and abstract the core frame structure (CFS) of the software system. Statistics on the weighted connection degree between the CFS and the other layers show that the CFS communicates tightly with the other layers and that its nodes strongly influence the nodes of the other layers. Metrics on the network parameters of the CFS show that it exhibits scale-free and small-world characteristics and reflects a high degree of software reuse. In addition, the CFS occupies a dominant position in the overall structure of the software system.
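    The k-core extraction underlying the CFS abstraction is the standard peeling algorithm; a minimal sketch on a toy undirected graph (node names and edges are illustrative):

```python
from collections import defaultdict

def k_core(edges, k):
    """Iteratively peel nodes of degree < k; the survivors form the k-core.
    For a software graph (nodes = modules, edges = dependencies) the
    innermost non-empty core is a candidate 'core frame structure'."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    changed = True
    while changed:
        changed = False
        for node in list(adj):
            if node in adj and len(adj[node]) < k:
                for nb in adj.pop(node):   # remove node and detach neighbors
                    adj[nb].discard(node)
                changed = True
    return set(adj)

# Toy graph: a triangle (the 2-core) with two pendant vertices.
edges = [("A", "B"), ("B", "C"), ("C", "A"), ("A", "D"), ("B", "E")]
assert k_core(edges, 2) == {"A", "B", "C"}
assert k_core(edges, 3) == set()
```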

  12. Dynamic Voltage-Frequency and Workload Joint Scaling Power Management for Energy Harvesting Multi-Core WSN Node SoC

    Science.gov (United States)

    Li, Xiangyu; Xie, Nijie; Tian, Xinyue

    2017-01-01

    This paper proposes a scheduling and power management solution for an energy-harvesting heterogeneous multi-core WSN node SoC such that the system continues to operate perennially and uses the harvested energy efficiently. The solution consists of a task scheduling algorithm oriented to heterogeneous multi-core systems and a low-complexity dynamic workload scaling and configuration optimization algorithm suitable for lightweight platforms. Moreover, considering that the power consumption of most WSN applications is data-dependent, we introduce a branch-handling mechanism into the solution as well. The experimental results show that the proposed algorithm can operate in real time on a lightweight embedded processor (MSP430), and that it enables a system to perform more valuable work and to use more than 99.9% of the power budget. PMID:28208730

  13. Dynamic Voltage-Frequency and Workload Joint Scaling Power Management for Energy Harvesting Multi-Core WSN Node SoC

    Directory of Open Access Journals (Sweden)

    Xiangyu Li

    2017-02-01

    This paper proposes a scheduling and power management solution for an energy-harvesting heterogeneous multi-core WSN node SoC such that the system continues to operate perennially and uses the harvested energy efficiently. The solution consists of a task scheduling algorithm oriented to heterogeneous multi-core systems and a low-complexity dynamic workload scaling and configuration optimization algorithm suitable for lightweight platforms. Moreover, considering that the power consumption of most WSN applications is data-dependent, we introduce a branch-handling mechanism into the solution as well. The experimental results show that the proposed algorithm can operate in real time on a lightweight embedded processor (MSP430), and that it enables a system to perform more valuable work and to use more than 99.9% of the power budget.

  14. Dynamic Voltage-Frequency and Workload Joint Scaling Power Management for Energy Harvesting Multi-Core WSN Node SoC.

    Science.gov (United States)

    Li, Xiangyu; Xie, Nijie; Tian, Xinyue

    2017-02-08

    This paper proposes a scheduling and power management solution for an energy-harvesting heterogeneous multi-core WSN node SoC such that the system continues to operate perennially and uses the harvested energy efficiently. The solution consists of a task scheduling algorithm oriented to heterogeneous multi-core systems and a low-complexity dynamic workload scaling and configuration optimization algorithm suitable for lightweight platforms. Moreover, considering that the power consumption of most WSN applications is data-dependent, we introduce a branch-handling mechanism into the solution as well. The experimental results show that the proposed algorithm can operate in real time on a lightweight embedded processor (MSP430), and that it enables a system to perform more valuable work and to use more than 99.9% of the power budget.
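    The abstract does not publish the scheduler itself. As a hypothetical cartoon of the budget-driven part of such a policy (the operating points, numbers, and function names are invented for illustration):

```python
# Hypothetical operating points (frequency MHz, active power mW) for a WSN node core.
OPERATING_POINTS = [(8, 3.0), (16, 7.0), (25, 12.0)]   # sorted by frequency

def pick_operating_point(work_cycles, deadline_s, budget_mj):
    """Pick the lowest frequency that meets the deadline while keeping the
    task's energy (power * runtime) within the harvested budget. A cartoon
    of dynamic voltage-frequency + workload scaling; the real scheduler
    also scales the workload itself and handles data-dependent branches."""
    for freq_mhz, power_mw in OPERATING_POINTS:
        runtime_s = work_cycles / (freq_mhz * 1e6)
        energy_mj = power_mw * runtime_s          # mW * s = mJ
        if runtime_s <= deadline_s and energy_mj <= budget_mj:
            return freq_mhz, runtime_s, energy_mj
    return None   # infeasible: the scheduler must shed workload instead

point = pick_operating_point(work_cycles=2_000_000, deadline_s=0.2, budget_mj=2.0)
assert point is not None and point[0] == 16   # 8 MHz misses the deadline
```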

  15. Constraints on the Small Scale Heterogeneity in D" from Precursors to Short Distance PcP Wave and Implication for Roughness of Core-Mantle Boundary

    Science.gov (United States)

    Ni, S.; Zhang, B.

    2016-12-01

    Volumetric heterogeneity in the D" layer and topography variations of the core-mantle boundary (CMB) are well established on large (thousands of km) to intermediate scales by seismological approaches. However, there are controversies regarding the level of heterogeneity in the D" layer at small scales (a few km to 10 km), with lower-bound estimates ranging from 0.1% to a few percent, and there are very limited reports of small-scale topography of the CMB. We take advantage of the small-amplitude PcP waves at near-podal distances (0-10 degrees) and use the ratio of short-period (1 Hz) PcP and its precursors to constrain the level of small-scale heterogeneity in the D" layer. We computed short-period synthetic seismograms with a 2D finite code for a series of volumetric heterogeneity models in the crust and in D", and find that PcP is not observable if the heterogeneity in D" is above 2%. We will present evidence of clearly observed PcP at short distances and argue for weak small-scale heterogeneity in D". Assuming the topography of the CMB is related to isostasy, the volumetric heterogeneity in D" can be used to estimate CMB roughness.

  16. Metapopulation theory identifies biogeographical patterns among core and satellite marine bacteria scaling from tens to thousands of kilometers

    DEFF Research Database (Denmark)

    Lindh, Markus V.; Sjöstedt, Johanna; Ekstam, Börje

    2017-01-01

    Metapopulation theory developed in terrestrial ecology provides applicable frameworks for interpreting the role of local and regional processes in shaping species distribution patterns. Yet, empirical testing of metapopulation models on microbial communities is essentially lacking. We determined … with a satellite mode of rare endemic populations and a core mode of abundant cosmopolitan populations (e.g. Synechococcus, SAR11 and SAR86 clade members). Temporal changes in population distributions supported several theoretical frameworks. Still, bimodality was found among bacterioplankton communities across …

  17. CopperCore Service Integration

    Science.gov (United States)

    Vogten, Hubert; Martens, Harrie; Nadolski, Rob; Tattersall, Colin; van Rosmalen, Peter; Koper, Rob

    2007-01-01

    In an e-learning environment there is a need to integrate various e-learning services like assessment services, collaboration services, learning design services and communication services. In this article we present the design and implementation of a generic integrative service framework, called CopperCore Service Integration (CCSI). We will…

  18. Development of TDLAS sensor for diagnostics of CO, H2O and soot concentrations in reactor core of pilot-scale gasifier

    Science.gov (United States)

    Sepman, A.; Ögren, Y.; Gullberg, M.; Wiinikka, H.

    2016-02-01

    This paper reports on the development of a tunable diode laser absorption spectroscopy sensor near 4350 cm-1 (2298 nm) for measurements of CO and H2O mole fractions and soot volume fraction under gasification conditions. Due to careful selection of the molecular transitions [CO (υ″ = 0 → υ' = 2) R34-R36 and H2O at 4349.337 cm-1], a very weak (negligible) sensitivity of the measured species mole fractions to the temperature distribution inside the high-temperature zone (1000 K < T < 1900 K) of the gasification process is achieved. The selected transitions are covered by the tuning range of a single diode laser. The CO and H2O concentrations measured in flat flames generally agree to better than 10% with the results of 1-D flame simulations. Calibration-free absorption measurements of the studied species in the reactor core of an atmospheric pilot-scale entrained-flow gasifier operated at 0.1 MW power are reported. Soot concentration is determined from the measured broadband transmittance. The estimated uncertainties in the reactor core CO and H2O measurements are 15 and 20%, respectively. The reactor-core average-path CO mole fractions are in quantitative agreement with the µGC CO concentrations sampled at the gasifier output.
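    A calibration-free retrieval of this kind rests on Beer-Lambert absorption; a schematic inversion from measured transmittance to mole fraction (units and numbers are illustrative, not the sensor's actual calibration):

```python
import math

def mole_fraction(i0, i_t, line_strength, path_cm, n_total):
    """Beer-Lambert inversion: absorbance -ln(I/I0) = S * x * n * L, so
    x = -ln(I/I0) / (S * n * L).

    i0, i_t       incident and transmitted intensity (same units)
    line_strength S, here in cm^-1 / (molecule cm^-2)  [illustrative]
    path_cm       absorption path length L [cm]
    n_total       total number density [molecule cm^-3]
    """
    absorbance = -math.log(i_t / i0)
    return absorbance / (line_strength * n_total * path_cm)

# Illustrative: absorbance 0.1 over a 4 m path at roughly ambient density.
x = mole_fraction(1.0, math.exp(-0.1), 1e-20, 400.0, 2.5e19)
assert abs(x - 1e-3) < 1e-12   # 0.1 / (1e-20 * 2.5e19 * 400) = 1e-3
```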

  19. Large Scale Finite Element Thermal Analysis of the Bolts of a French PWR Core Internal Baffle Structure

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, Isabelle; Christophe, Peniguel [EDF R and D, Paris (France); Tommy, Martin Michel [1 av du General de Gaulle, Paris (France)

    2009-11-15

    The internal core baffle structure of a French Pressurized Water Reactor (PWR) consists of a collection of baffles and formers that are attached to the barrel. The connections are made by a large number of bolts (about 1500). After inspection, some of the bolts were found to be cracked. This has been attributed to Irradiation Assisted Stress Corrosion Cracking (IASCC). Electricité de France (EDF) has set up a research program to gain better knowledge of the temperature distribution, which may affect the bolts and the whole structure. The temperature distribution in the structure was calculated with the thermal code SYRTHES, which uses a finite element approach. The heat transfer between the by-pass flow inside the cavities of the core baffle and the structure was accounted for by a strong thermal coupling between SYRTHES and the CFD code Code_Saturne. The results for the CP0 plant design show that both the high temperature and strong temperature gradients could potentially induce mechanical stresses. The CPY design, where each bolt is individually cooled, led to a reduction of temperatures inside the structures. A new parallel version of SYRTHES, based on MPI and intended for calculations on very large meshes, has been developed. A demonstration test on the complete structure, comprising about 1.1 billion linear tetrahedra, was run on 2048 processors of the EDF Blue Gene computer.

  20. TRAC code assessment using data from SCTF Core-III, a large-scale 2D/3D facility

    Energy Technology Data Exchange (ETDEWEB)

    Boyack, B.E.; Shire, P.R.; Harmony, S.C.; Rhee, G.

    1988-01-01

    Nine tests from the SCTF Core-III configuration have been analyzed using TRAC-PF1/MOD1. The objectives of these assessment activities were to obtain a better understanding of the phenomena occurring during the refill and reflood phases of a large-break loss-of-coolant accident, to determine the accuracy with which key parameters are calculated, and to identify deficiencies in key code correlations and models that provide closure for the differential equations defining thermal-hydraulic phenomena in pressurized water reactors. Overall, the agreement between calculated and measured values of peak cladding temperature is reasonable. In addition, TRAC adequately predicts many of the trends observed in both the integral-effect and separate-effect tests conducted in SCTF Core-III. Potential contributors to discrepancies between measured and calculated results are described as arising from three sources: (1) knowledge about the facility configuration and operation, (2) facility modeling for code input, and (3) deficiencies in code correlations and models. An example is provided. 8 refs., 7 figs., 2 tabs.

  1. SWIFT: Using task-based parallelism, fully asynchronous communication, and graph partition-based domain decomposition for strong scaling on more than 100,000 cores

    CERN Document Server

    Schaller, Matthieu; Chalk, Aidan B G; Draper, Peter W

    2016-01-01

    We present a new open-source cosmological code, called SWIFT, designed to solve the equations of hydrodynamics using a particle-based approach (Smooth Particle Hydrodynamics) on hybrid shared/distributed-memory architectures. SWIFT was designed from the bottom up to provide excellent strong scaling on both commodity clusters (Tier-2 systems) and Top100-supercomputers (Tier-0 systems), without relying on architecture-specific features or specialized accelerator hardware. This performance is due to three main computational approaches: (1) Task-based parallelism for shared-memory parallelism, which provides fine-grained load balancing and thus strong scaling on large numbers of cores. (2) Graph-based domain decomposition, which uses the task graph to decompose the simulation domain such that the work, as opposed to just the data, as is the case with most partitioning schemes, is equally distributed across all nodes. (3) Fully dynamic and asynchronous communication, in which communication is modelled as just anot...

  2. Rectification of two generic names

    NARCIS (Netherlands)

    Büttikofer, J.

    1896-01-01

    I am sorry to say that amongst the new generic names, occurring in my recent paper on the genus Pycnonotus and some allied Genera (N. L. M. XVII), Centrolophus and Gymnocrotaphus are already preoccupied among the Fishes, the first being used by Lacépède, the second by Günther. I propose, therefore,

  3. GENERIC model for multiphase systems

    NARCIS (Netherlands)

    Sagis, L.M.C.

    2010-01-01

    GENERIC is a nonequilibrium thermodynamic formalism in which the dynamic behavior of a system is described by a single compact equation involving two types of brackets: a Poisson bracket and a dissipative bracket. This formalism has proved to be a very powerful instrument to model the dynamic behavi
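
    The "single compact equation" referred to above has, in the standard GENERIC formalism, the following two-generator structure (a sketch of the textbook form, not reproduced from this particular paper):

```latex
% GENERIC time evolution of the state vector x
\frac{\mathrm{d}x}{\mathrm{d}t}
  = L(x)\,\frac{\delta E}{\delta x} + M(x)\,\frac{\delta S}{\delta x},
% degeneracy conditions ensuring energy conservation and
% non-negative entropy production:
L(x)\,\frac{\delta S}{\delta x} = 0, \qquad
M(x)\,\frac{\delta E}{\delta x} = 0,
% equivalently, for any observable A, the two-bracket form:
\frac{\mathrm{d}A}{\mathrm{d}t} = \{A, E\} + [A, S].
```

    Here E and S are the total energy and entropy functionals, L is the antisymmetric (Poisson) operator generating the reversible dynamics, and M is the symmetric, positive semi-definite (dissipative) operator generating the irreversible dynamics.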

  4. Generic Hurricane Extreme Seas State

    DEFF Research Database (Denmark)

    Wehmeyer, Christof; Skourup, Jesper; Frigaard, Peter

    2012-01-01

    the US east coast and the Gulf of Mexico (1851 - 2009) and Japanese east coast (1951 -2009) form the basis for Weibull extreme value analyses to determine return period respective maximum wind speeds. Unidirectional generic sea state spectra are obtained by application of the empirical models...

  5. Generic Software Architecture for Launchers

    Science.gov (United States)

    Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre

    2015-09-01

    The definition and reuse of generic software architecture for launchers is not so usual, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega for the last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying an ad hoc development of the launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star-trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On Board Control Procedure, etc.). While some of these reasons are still valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of generic software architecture which could be envisaged for future launchers, based on the principles described above and supported by model-driven engineering and automatic code generation.

  6. Multi-scale petrophysical and geomechanical characterization of full core from the Groningen Field to understand mechanical stratigraphy and compaction behavior

    Science.gov (United States)

    van Eijs, Rob; Hol, Sander; Marcelis, Fons; Ishmukhametova, Gulfiia; van der Linden, Arjan; Zuiderwijk, Pedro; Makurat, Axel

    2016-04-01

    The Groningen gas field in The Netherlands is one of the largest onshore gas reserves known. Advancing production from the field has resulted in field-scale deformation, with surface subsidence and accompanying local seismicity. Part of the deformation is associated with compaction of the Permian reservoir. While depletion-induced reservoir compaction is expected to be controlled locally by grain-scale physical mechanisms such as sub-critical cracking or particle re-arrangement and intergranular pressure solution creep, understanding of the intra-reservoir variability of these mechanisms is still limited, though crucial for predicting the coupling between production, rock deformation, and surface effects. To aid an improved understanding of fundamental processes and scaling effects, approximately 200 meters of core over the reservoir section was taken from a well in the Groningen Field, drilled in July 2015 close to the village of Zeerijp. Using this material, we have performed detailed laboratory investigations, and will continue to do so in significant numbers, to compare the results obtained with well- and field-scale observations. In this contribution, we present several exemplary mechanical data sets for the reservoir and caprock, and compare these data with well-scale petrophysical and mechanical information, notably sonic, scratch and visual geological details, with the aim of arriving at a multi-scale description of petrophysical and geomechanical rock properties. Our first comparison reveals a strong contrast in compressibility and strength between the reservoir and caprock, as well as a contribution of inelastic strain to the total strain response of the tested rock samples. We will discuss the observed mechanical stratigraphy in the context of regional and field-scale deformation patterns.

  7. Characterization of reactive flow-induced evolution of carbonate rocks using digital core analysis- part 1: Assessment of pore-scale mineral dissolution and deposition.

    Science.gov (United States)

    Qajar, Jafar; Arns, Christoph H

    2016-09-01

    The application of X-ray micro-computed tomography (μ-CT) for quantitatively characterizing reactive-flow-induced pore structure evolution, including local particle detachment, displacement and deposition in carbonate rocks, is investigated. In the studies conducted in this field of research, the experimental procedure has involved alternating steps of imaging and ex-situ core sample alteration. In practice, it is impossible to return the sample, with micron precision, to the same position and orientation. Furthermore, successive images of a sample in pre- and post-alteration states are usually taken at different conditions, such as different scales, resolutions and signal-to-noise ratios. These conditions, together with subresolution features in the images, make voxel-by-voxel comparisons of successive images problematic. In this paper, we first address the respective challenges in voxel-wise interpretation of successive images of carbonate rocks subject to reactive flow. Reactive corefloods in two carbonate cores with different rock types are considered. For the first rock, we used the experimental and imaging results published by Qajar et al. (2013), which showed a quasi-uniform dissolution regime. A similar reactive coreflood was conducted in the second rock, which resulted in a wormhole-like dissolution regime. We particularly examine the major image processing operations, such as transformation of images to the same grey scale, noise filtering and segmentation thresholding, and propose quantitative methods to evaluate the effectiveness of these operations in voxel-wise analysis of successive images of a sample. In the second part, we generalize the methodology, based on three-phase segmentation of normalized images, microporosity assignment and 2D histograms of image intensities, to estimate grey-scale changes of individual image voxels for the general case in which the greyscale images are segmented into an arbitrary number of phases. The results show that local (voxel
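
    The grey-scale transformation step mentioned above, bringing pre- and post-alteration images onto the same grey scale before voxel-wise comparison, can be sketched as a simple two-landmark linear mapping. This is a hypothetical illustration, not the authors' actual normalization pipeline; the landmark values are invented:

```python
import numpy as np

def match_greyscale(img, src_low, src_high, dst_low, dst_high):
    """Linearly remap `img` so that two landmark intensities (e.g. the
    pore and grain peaks of its histogram) coincide with the corresponding
    landmarks of a reference image."""
    scale = (dst_high - dst_low) / (src_high - src_low)
    return (img - src_low) * scale + dst_low

# Hypothetical example: a post-alteration image whose pore peak sits at 20
# and grain peak at 180 is mapped onto a reference with peaks at 10 and 200.
post = np.array([20.0, 100.0, 180.0])
matched = match_greyscale(post, 20.0, 180.0, 10.0, 200.0)  # maps 20->10, 180->200
```

    Real workflows also need registration and noise filtering before such a mapping is meaningful; the sketch only shows the intensity rescaling itself.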

  9. Measuring the triple O2 isotopic composition of air trapped in ice cores and quantifying the causes of δ18Oatm Millennial Scale Variations

    DEFF Research Database (Denmark)

    Reutenauer, Corentin

    …a region of the world which still lacks climatic reconstructions and whose role is widely debated in the context of millennial-scale climate variations. Atmospheric O2 is enriched in heavy isotopologues (δ17O, δ18O) relative to O2 in ocean water, the ultimate source of O2 for photosynthesis… in atmospheric δ18O and 17Δ (17Δatm = ln(δ17Oatm) − 0.516 · ln(δ18Oatm)). We review the current understanding of past orbital and millennial time-scale variations of atmospheric O2 isotopes. We also give a description of air transport and associated processes in the firn, which alter the climatic signal preserved… in ice core bubbles. Second, a very high analytical precision and accuracy is required to measure the past variations of δ18Oatm and especially 17Δatm preserved in ice core bubbles. One must primarily have the ability to measure variations as small as 10 per meg (0.01‰), corresponding to the millennial…

  10. The parsec-scale distributions of intensity, linear polarization and Faraday rotation in the core and jet of Mrk501 at 8.4-1.6 GHz

    Science.gov (United States)

    Croke, S. M.; O'Sullivan, S. P.; Gabuzda, D. C.

    2010-02-01

    Previous very long baseline interferometry (VLBI) observations of the nearby (z = 0.0337) active galactic nucleus (AGN) Mrk501 have revealed a complex total-intensity structure with an approximately 90° misalignment between the jet orientations on parsec and kiloparsec scales. The jet displays a `spine' of magnetic field orthogonal to the jet surrounded by a `sheath' of magnetic field aligned with the jet. Mrk501 is also one of a handful of AGN that are regularly detected at TeV energies, indicating the presence of high-energy phenomena in the core. However, multi-epoch analyses of the VLBI total-intensity structure have yielded only very modest apparent speeds for features in the VLBI jet. We investigate the total-intensity and linear-polarization structures of the parsec- to decaparsec-scale jet of Mrk501 using VLBA observations at 8.4, 5, 2.2 and 1.6 GHz. The rotation-measure distribution displays the presence of a Faraday rotation gradient across an extended stretch of the jet, providing new evidence for a helical magnetic field associated with the jet of this AGN. The position of the radio core from the base of the jet follows the law rcore(ν) ~ ν-1.1+/-0.2, consistent with the compact inner jet region being in equipartition. Hence, we estimate a magnetic field strength of ~40 mG at a distance of 1 pc.

  11. Item banking to improve, shorten and computerize self-reported fatigue: an illustration of steps to create a core item bank from the FACIT-Fatigue Scale.

    Science.gov (United States)

    Lai, Jin-shei; Cella, David; Chang, Chih-Hung; Bode, Rita K; Heinemann, Allen W

    2003-08-01

    Fatigue is a common symptom among cancer patients and the general population. Due to its subjective nature, fatigue has been difficult to assess effectively and efficiently. Modern computerized adaptive testing (CAT) can enable precise assessment of fatigue using a small number of items from a fatigue item bank. CAT enables brief assessment by selecting questions from an item bank that provide the maximum amount of information given a person's previous responses. This article illustrates the steps to prepare such an item bank, using 13 items from the Functional Assessment of Chronic Illness Therapy Fatigue Subscale (FACIT-F) as the basis. Samples included 1022 cancer patients and 1010 people from the general population. An Item Response Theory (IRT)-based rating scale model, a polytomous extension of the Rasch dichotomous model, was utilized. Nine items demonstrating acceptable psychometric properties were selected and positioned on the fatigue continuum. The fatigue levels measured by these nine items, along with their response categories, covered 66.8% of the general population and 82.6% of the cancer patients. Although the operational CAT algorithms to handle polytomously scored items are still in progress, we illustrate how CAT may work by using the nine core items to measure level of fatigue. In this illustration, a fatigue measure comparable to full-length 13-item scale administration was obtained using four items. The resulting item bank can serve as a core to which a psychometrically sound and operational item bank covering the entire fatigue continuum will be added.
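
    The item-selection step described above, administering whichever unanswered item is most informative at the current ability estimate, can be sketched for the simpler dichotomous Rasch case. This is a hypothetical toy, not the authors' operational polytomous rating-scale algorithm; the item names and difficulties are invented:

```python
import math

def p_endorse(theta, b):
    """Rasch model: probability of endorsing an item of difficulty b
    at fatigue level theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a dichotomous Rasch item: p * (1 - p),
    maximal when the item difficulty matches the person's level."""
    p = p_endorse(theta, b)
    return p * (1.0 - p)

def select_item(theta, bank, answered):
    """CAT step: pick the unanswered item with maximum information."""
    candidates = [i for i in bank if i not in answered]
    return max(candidates, key=lambda i: item_information(theta, bank[i]))

def update_theta(theta, responses, bank, steps=10):
    """Newton-Raphson maximum-likelihood update of the fatigue estimate.
    (Diverges for all-0 or all-1 response patterns; real CATs use
    Bayesian estimators to handle those.)"""
    for _ in range(steps):
        grad = sum(r - p_endorse(theta, bank[i]) for i, r in responses.items())
        info = sum(item_information(theta, bank[i]) for i in responses)
        if info == 0:
            break
        theta += grad / info
    return theta

# Hypothetical 4-item bank: difficulties on the fatigue continuum (logits).
bank = {"tired": -1.2, "weak": -0.4, "listless": 0.3, "exhausted": 1.1}
responses, theta = {}, 0.0
item = select_item(theta, bank, responses)   # most informative near theta = 0
responses[item] = 1                          # respondent endorses it
theta = update_theta(theta, responses, bank) # estimate moves up the continuum
```

    With polytomous items the response probabilities and information function change (rating scale model), but the select-then-update loop is the same.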

  12. Mylan to Offer Generic EpiPen

    Science.gov (United States)

    ... news/fullstory_160669.html Mylan to Offer Generic EpiPen Manufacturer responds to mounting criticism about price hikes ... cheaper generic version of the emergency allergy treatment EpiPen will be made available within the next few ...

  13. The generic danger and the idiosyncratic support

    Science.gov (United States)

    Temme, Arnaud; Nijp, Jelmer; van der Meij, Marijn; Samia, Jalal; Masselink, Rens

    2016-04-01

    This contribution argues two main points. First, that generic landscapes used in some modelling studies sometimes have properties, or cause simulation results, that are unrealistic. Such initially flat or straight-sloped landscapes, sometimes with minor random perturbations, form, for example, the backdrop for ecological simulations of vegetation growth and competition that predict catastrophic shifts. Exploratory results for semi-arid systems suggest that results based on these generic landscapes are end-members of a distribution of results, rather than an unbiased, typical outcome. Apparently, the desire to avoid idiosyncrasy has unintended consequences. Second, we argue and illustrate that new insights in fact often come from close inspection of idiosyncratic case studies. Our examples from landslide systems, connectivity and soil formation show how a central role for the case study, either in empirical work or to provide model targets, has advanced our understanding. Both points contribute to the conclusion that it is dangerous to forget about annoying, small-scale, idiosyncratic and, indeed, perhaps bad-ass case studies in Earth Sciences.

  14. mJ range all-fiber MOPA prototype with hollow-core fiber beam delivery designed for large scale laser facilities seeding (Conference Presentation)

    Science.gov (United States)

    Scol, Florent; Gouriou, Pierre; Perrin, Arnaud; Gleyze, Jean-François; Valentin, Constance; Bouwmans, Géraud; Hugonnot, Emmanuel

    2017-03-01

    The Laser Megajoule (LMJ) is a French large-scale laser facility dedicated to inertial fusion research. Its front-ends are based on fiber laser technology and generate highly controlled beams in the nanojoule range. Scaling the energy of those fiber seeders to the millijoule range is one way explored to upgrade the LMJ's architecture. We report on a fully integrated narrow-linewidth all-fiber MOPA prototype at 1053 nm designed to meet the stringent requirements of large-scale laser facility seeding. We achieve 750 µJ temporally-shaped pulses of a few nanoseconds at 1 kHz. Thanks to its original longitudinal geometry and its wide output core (26 µm MFD), the Yb-doped tapered fiber used in the power amplifier stage ensures single-mode operation and negligible spectro-temporal distortions. The transport of 30 kW peak power pulses (from the tapered fiber) in a 17 m long large-mode-area (39 µm) hollow-core (HC) fiber is presented, and highlights frequency-modulation-to-amplitude-modulation (FM-to-AM) conversion management issues. An S² measurement of this fiber allows us to attribute this conversion to a slightly multimode behavior (< 13 dB of extinction between the fundamental mode and higher-order modes). Other HC fibers exhibiting truly single-mode behavior (< 20 dB) have been tested, and the comparison will be presented at the conference. Finally, fiber spatial beam shaping from a coherent Gaussian beam to a coherent top-hat intensity profile beam in the mJ range, with a specifically designed and fabricated fiber, will also be presented.

  15. Compositional Design of a Generic Design Agent

    NARCIS (Netherlands)

    Brazier, F.M.T.; Jonker, C.M.; Treur, J.; Wijngaards, N.J.E.

    2001-01-01

    This paper presents a generic architecture for a design agent, to be used in an Internet environment. The design agent is based on an existing generic agent model, and includes a refinement of a generic model for design, in which strategic reasoning

  16. Comparing approaches to generic programming in Haskell

    NARCIS (Netherlands)

    Hinze, R.; Jeuring, J.T.; Löh, A.

    2007-01-01

    The last decade has seen a number of approaches to datatype-generic programming: PolyP, Functorial ML, `Scrap Your Boilerplate', Generic Haskell, `Generics for the Masses', etc. The approaches vary in sophistication and target audience: some propose full-blown programming languages, some sugge

  17. CO2 Exsolution from CO2 Saturated Water: Core-Scale Experiments and Focus on Impacts of Pressure Variations.

    Science.gov (United States)

    Xu, Ruina; Li, Rong; Ma, Jin; Jiang, Peixue

    2015-12-15

    For CO2 sequestration and utilization in shallow reservoirs, reservoir pressure changes can result from changes in injection rate, a leakage event, or brine withdrawal for reservoir pressure balance. The amount of exsolved CO2, which is influenced by the pressure reduction and the subsequent secondary imbibition process, has a significant effect on the stability and capacity of CO2 sequestration and utilization. In this study, the exsolution behavior of CO2 has been studied experimentally using a core flooding system in combination with NMR/MRI equipment. Three series of pressure variation profiles, including depletion followed by imbibition without or with repressurization, and repetitive depletion and repressurization/imbibition cycles, were designed to investigate the exsolution responses to these complex pressure variation profiles. We found that the exsolved CO2 phase preferentially occupies the larger pores and exhibits a uniform spatial distribution. The mobility of CO2 is low during the imbibition process, and the residual trapping ratio is extraordinarily high. During the cyclic pressure variation process, the first cycle has the largest contribution to the amount of exsolved CO2. The low CO2 mobility implies a certain degree of self-sealing during a possible reservoir depletion.

  18. Enhancing Safety through Generic Competencies

    Directory of Open Access Journals (Sweden)

    S. Mockel

    2014-03-01

    This article provides insights into proactive safety management and mitigation. An analysis of accident reports reveals categories of supervening causes of accidents which can be directly linked to the concept of generic competencies (information management, communication and coordination, problem solving, and effect control). These findings strongly suggest adding the human element as another safety-constituting pillar to the concept of ship safety, next to technology and regulation. We argue that the human element has unique abilities in dealing with critical and highly dynamic situations which can contribute to the system's recovery from non-routine or critical situations. By educating seafarers in generic competencies we aim to enable the people onboard to successfully deal with critical situations.

  19. Efficient Generation of Generic Entanglement

    CERN Document Server

    Oliveira, R; Plenio, M B

    2006-01-01

    We find that generic entanglement is physical, in the sense that it can be generated in polynomial time from two-qubit gates picked at random. We prove as the main result that such a process generates the average entanglement of the uniform (Haar) measure in at most $O(N^3)$ steps for $N$ qubits. This is despite an exponentially growing number of such gates being necessary for generating that measure fully on the state space. Numerics furthermore show a variation cut-off allowing one to associate a specific time with the achievement of the uniform measure entanglement distribution. Various extensions of this work are discussed. The results are relevant to entanglement theory and to protocols that assume generic entanglement can be achieved efficiently.

  20. Integrative analysis of large scale expression profiles reveals core transcriptional response and coordination between multiple cellular processes in a cyanobacterium

    Directory of Open Access Journals (Sweden)

    Bhattacharyya-Pakrasi Maitrayee

    2010-08-01

    Background: Cyanobacteria are the only known prokaryotes capable of oxygenic photosynthesis. They play significant roles in global biogeochemical cycles and carbon sequestration, and have recently been recognized as potential vehicles for the production of renewable biofuels. Synechocystis sp. PCC 6803 has been extensively used as a model organism for cyanobacterial studies. DNA microarray studies in Synechocystis have shown varying degrees of transcriptome reprogramming under altered environmental conditions. However, it is not clear from published work how transcriptome reprogramming affects pre-existing networks of fine-tuned cellular processes. Results: We have integrated 163 transcriptome data sets generated in response to numerous environmental and genetic perturbations in Synechocystis. Our analyses show that a large number of genes, defined as the core transcriptional response (CTR), are commonly regulated under most perturbations. The CTR contains nearly 12% of Synechocystis genes found on its chromosome. The majority of genes in the CTR are involved in photosynthesis, translation, energy metabolism and stress protection. Our results indicate that a large number of differentially regulated genes identified in most reported studies in Synechocystis under different perturbations are associated with the general stress response. We also find that a majority of genes in the CTR are coregulated with 25 regulatory genes. Some of these regulatory genes have been implicated in cellular responses to oxidative stress, suggesting that reactive oxygen species are involved in the regulation of the CTR. A Bayesian network, based on the regulation of various KEGG pathways determined from the expression patterns of their associated genes, has revealed new insights into the coordination between different cellular processes. Conclusion: We provide here the first integrative analysis of transcriptome data sets generated in a cyanobacterium. This

  1. Generic and disease-specific measures of quality of life in patients with mild Alzheimer's disease

    DEFF Research Database (Denmark)

    Bhattacharya, Suvosree; Vogel, Asmus; Hansen, Marie-Louise H

    2010-01-01

    The aim of the study was to investigate the pattern of association of generic and disease-specific quality of life (QoL) scales with standard clinical outcome variables in Alzheimer's disease (AD).

  3. Modeling of Generic Slung Load System

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Bendtsen, Jan Dimon; la Cour-Harbo, Anders

    2009-01-01

    This paper presents the result of the modelling and verification of a generic slung load system using a small-scale helicopter. The model is intended for use in simulation, pilot training, estimation, and control. The model is derived using a redundant coordinate formulation based on Gauss' Principle of Least Constraint using the Udwadia-Kalaba equation, and can be used to model all body-to-body slung load suspension types. The model provides intuitive and easy-to-use means of modelling and simulating different slung load suspension types. It includes detection of, and response to, wire slackening and tightening as well as aerodynamic coupling between the helicopter and the load. Furthermore, it is shown how the model can be easily used for multi-lift systems, either with multiple helicopters or multiple loads. A numerical stabilisation algorithm is introduced and finally the use…
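
    The Udwadia-Kalaba equation referred to above has a standard closed form (quoted from the general literature, not extracted from this paper). For an unconstrained system with mass matrix M and applied force Q, and constraints differentiated to the linear form A q̈ = b:

```latex
% Unconstrained acceleration:
a = M^{-1} Q
% Constrained acceleration (Gauss' Principle of Least Constraint):
\ddot{q} = a + M^{-1/2}\,\bigl(A\,M^{-1/2}\bigr)^{+}\,(b - A\,a)
```

    where (·)⁺ denotes the Moore-Penrose pseudoinverse. The second term is the smallest acceleration correction (in the M-weighted norm) that enforces the constraints, which is what makes the redundant-coordinate formulation convenient for modelling arbitrary suspension types.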

  4. Interannual to decadal scale North Pacific climate dynamics during the last millennium from Eclipse Icefield (St. Elias Mountains) ice core stable isotope records

    Science.gov (United States)

    Kreutz, K. J.; Wake, C.; Yalcin, K.; Vogan, N.; Introne, D.; Fisher, D.; Osterberg, E.

    2006-12-01

    A 345-meter ice core recovered from the St. Elias Mountains, Yukon Territory, Canada during 2002 has been continuously analyzed for stable hydrogen isotopes (δD), and is used to interpret changes in the North Pacific hydrologic cycle and climate variability over the past 1000 years. Given the high annual snow accumulation rate at the site (1.5 meters/year), the record is high resolution (subannual) and annually dated to 1450 AD, and dated with ice flow models prior to 1450 AD. Five-year averaged isotope data over the past millennium display a classic Little Ice Age (LIA)/Medieval Climate Anomaly (MCA) pattern; that is, lower isotope ratios during the LIA, and higher isotope ratios during the MCA. Using the simple isotope/temperature relationship typically applied to ice core data, the Eclipse record may indicate lower regional temperatures and enhanced temperature variability during the period 1250 to 1700 AD. However, isotope data from an ice core recovered near the summit of Mt. Logan are clearly related to different hydrologic regimes. Regardless of the scaling used on the Eclipse isotope data, a distinct drop in isotope ratio occurs just prior to 1200 AD, and may correspond with changes observed in tropical coral records. We suggest that fundamental changes in teleconnection and/or ENSO/PDO dynamics between the high and low latitudes in the Pacific may be responsible for the 13th century event. Based on the 1000-year record at 5-year resolution, as well as annual isotope data for the past 550 years, the 20th century is not anomalous with respect to previous time periods.

  5. A single-step route for large-scale synthesis of core-shell palladium@platinum dendritic nanocrystals/reduced graphene oxide with enhanced electrocatalytic properties

    Science.gov (United States)

    Liu, Qi; Xu, Yan-Ru; Wang, Ai-Jun; Feng, Jiu-Ju

    2016-01-01

    In this report, a facile, seed-less and single-step method is developed for large-scale synthesis of core-shell Pd@Pt dendritic nanocrystals anchored on reduced graphene oxide (Pd@Pt DNC/rGO) under mild conditions. Poly(ethylene oxide) is employed as a structure-directing and stabilizing agent. Compared with commercial Pt/C (20 wt%) and Pd/C (20 wt%) catalysts, the as-obtained nanocomposite has a large electrochemically active surface area (114.15 m² g⁻¹ of metal), and shows superior catalytic activity and stability, with mass activities of 1210.0 and 1128.5 mA mg⁻¹ of metal for methanol and ethanol oxidation, respectively. The improved catalytic activity is mainly a consequence of the synergistic effects between Pd and Pt in the dendritic structures, as well as of rGO as a support.

  6. Validation of a core outcome measure for palliative care in Africa: the APCA African Palliative Outcome Scale

    Directory of Open Access Journals (Sweden)

    Moll Tony

    2010-01-01

    Background: Despite the burden of progressive incurable disease in Africa, there is almost no evidence on patient care or outcomes. A primary reason has been the lack of appropriate locally-validated outcome tools. This study aimed to validate a multidimensional scale (the APCA African Palliative Outcome Scale) in a multi-centred international study. Methods: Validation was conducted across 5 African services and in 3 phases. Phase 1, face validity: content analysis of qualitative interviews and cognitive interviewing of POS. Phase 2, construct validity: correlation of POS with the Missoula-Vitas Quality of Life Index (Spearman's rank tests). Phase 3, internal consistency (Cronbach's alpha, calculated twice using 2 datasets), test-retest reliability (intraclass correlation coefficients calculated for 2 time points) and time to complete (calculated twice using 2 datasets). Results: The validation involved 682 patients and 437 family carers, interviewed in 8 different languages. Phase 1: qualitative interviews (N = 90 patients; N = 38 carers) showed POS items mapped well onto identified needs; cognitive interviews (N = 73 patients; N = 29 carers) demonstrated good interpretation. Phase 2: POS-MVQoLI Spearman's rank correlations were low-moderate, as expected (N = 285). Phase 3 (N = 307; 2nd assessment mean 21.2 hours after first, SD 7.2): Cronbach's alpha was 0.6 on both datasets, indicating expected moderate internal consistency; test-retest found high intraclass correlation coefficients for all items (0.78-0.89); median time to complete was 7 minutes, reducing to 5 minutes at the second visit. Conclusions: The APCA African POS has sound psychometric properties, is well comprehended and brief to use. Application of this tool offers the opportunity to at last address the omissions of palliative care research in Africa.

  7. Generic drugs in dermatology: part II.

    Science.gov (United States)

    Payette, Michael; Grant-Kels, Jane M

    2012-03-01

    In part I, we discussed new drug development, reviewed the history of the generic drug industry, described how generic drugs are approved by the US Food and Drug Administration, and defined the concepts of bioequivalence and therapeutic equivalence. Herein, we explore various factors impacting generic drug use across the different parties involved: the prescriber, the pharmacist, the patient, and the payer. We also include original cost analysis of dermatologic brand name and generic drugs and show the potential cost savings that can be achieved through generic substitution. We conclude with a review of the data addressing potential differences in the effectiveness of brand name versus generic drugs in dermatology. The cost of brand name and generic medications is highly variable by pharmacy, state, and payer. We used one source (www.drugstore.com) as an example and for consistency across all medications discussed herein. Prices included here may not reflect actual retail prices across the United States.

  8. Generic Graph Grammar: A Simple Grammar for Generic Procedural Modelling

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Bærentzen, Jakob Andreas

    2012-01-01

    Methods for procedural modelling tend to be designed either for organic objects, which are described well by skeletal structures, or for man-made objects, which are described well by surface primitives. Procedural methods, which allow for modelling of both kinds of objects, are few and usually... in a directed cyclic graph. Furthermore, the basic productions are chosen such that Generic Graph Grammar seamlessly combines the capabilities of L-systems to imitate biological growth (to model trees, animals, etc.) and those of split grammars to design structured objects (chairs, houses, etc.). This results...

  9. Generic products of antiepileptic drugs: a perspective on bioequivalence and interchangeability.

    Science.gov (United States)

    Bialer, Meir; Midha, Kamal K

    2010-06-01

    Most antiepileptic drugs (AEDs) are currently available as generic products, yet neurologists and patients are reluctant to switch to generics. Generic AEDs are regarded as bioequivalent to brand AEDs after meeting the average bioequivalence criteria; consequently, they are considered to be interchangeable with their respective brands without loss of efficacy and safety. According to the U.S. Food and Drug Administration (FDA) the present bioequivalence requirements are already so rigorous and constrained that there is little possibility that generics that meet regulatory bioequivalence criteria could lead to therapeutic problems. So is there a scientific rationale for the concerns about switching patients with epilepsy to bioequivalent generics? Herein we discuss the assessment of bioequivalence and propose a scaled-average bioequivalence approach where scaling of bioequivalence is carried out based on brand lot-to-lot variance as an alternative to the conventional bioequivalence test as a means to determine whether switching patients to generic formulations, or vice versa, is a safe and effective therapeutic option. Meeting the proposed scaled-average bioequivalence requirements will ensure that when an individual patient is switched, he or she has fluctuations in plasma levels similar to those from lot-to-lot of the brand reference levels and thus should make these generic products safely switchable without change in efficacy and safety outcomes.
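The average bioequivalence criterion discussed above is conventionally operationalized as requiring the 90% confidence interval for the test/reference geometric mean ratio of an exposure measure (e.g. AUC) to fall within 0.80-1.25. A minimal sketch of that check, using entirely hypothetical crossover data and a hard-coded t critical value (not the scaled-average variant the authors propose, which would widen the limits based on brand lot-to-lot variance):

```python
import math
import statistics

def bioequivalence_90ci(log_ratios, t_crit):
    """90% CI for the geometric mean ratio (test/reference), computed
    from per-subject log(AUC_test / AUC_ref) differences."""
    n = len(log_ratios)
    mean = statistics.fmean(log_ratios)
    se = statistics.stdev(log_ratios) / math.sqrt(n)
    return math.exp(mean - t_crit * se), math.exp(mean + t_crit * se)

# Hypothetical crossover data: log-ratios for 12 subjects.
log_ratios = [0.05, -0.02, 0.08, 0.01, -0.04, 0.03,
              0.06, -0.01, 0.02, 0.04, 0.00, -0.03]
t_crit = 1.796  # two-sided 90% t critical value for df = 11
lo, hi = bioequivalence_90ci(log_ratios, t_crit)
print(f"90% CI for GMR: [{lo:.3f}, {hi:.3f}]")
print("average bioequivalence met:", 0.80 <= lo and hi <= 1.25)
```

A scaled-average approach of the kind proposed here would replace the fixed 0.80-1.25 limits with limits widened in proportion to the reference product's lot-to-lot variability.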

  10. Generic Crystalline Disposal Reference Case

    Energy Technology Data Exchange (ETDEWEB)

    Painter, Scott Leroy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chu, Shaoping [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harp, Dylan Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Perry, Frank Vinton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wang, Yifeng [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-20

    A generic reference case for disposal of spent nuclear fuel and high-level radioactive waste in crystalline rock is outlined. The generic cases are intended to support development of disposal system modeling capability by establishing relevant baseline conditions and parameters. Establishment of a generic reference case requires that the emplacement concept, waste inventory, waste form, waste package, backfill/buffer properties, EBS failure scenarios, host rock properties, and biosphere be specified. The focus in this report is on those elements that are unique to crystalline disposal, especially the geosphere representation. Three emplacement concepts are suggested for further analyses: a waste package containing 4 PWR assemblies emplaced in boreholes in the floors of tunnels (KBS-3 concept), a 12-assembly waste package emplaced in tunnels, and a 32-assembly dual purpose canister emplaced in tunnels. In addition, three failure scenarios were suggested for future use: a nominal scenario involving corrosion of the waste package in the tunnel emplacement concepts, a manufacturing defect scenario applicable to the KBS-3 concept, and a disruptive glaciation scenario applicable to both emplacement concepts. The computational approaches required to analyze EBS failure and transport processes in a crystalline rock repository are similar to those of argillite/shale, with the most significant difference being that the EBS in a crystalline rock repository will likely experience highly heterogeneous flow rates, which should be represented in the model. The computational approaches required to analyze radionuclide transport in the natural system are very different because of the highly channelized nature of fracture flow. Computational workflows tailored to crystalline rock based on discrete transport pathways extracted from discrete fracture network models are recommended.

  11. A study of small-scale foliation in lengths of core enclosing fault zones in borehole WD-3, Permit Area D, Lac du Bonnet Batholith

    Energy Technology Data Exchange (ETDEWEB)

    Ejeckam, R.B.

    1992-12-01

    Small-scale foliation measurements in lengths of core from borehole WD-3 of Permit Area D of the Lac du Bonnet Batholith have defined five major mean orientation sets. They strike NW, N and NE. The orientations (strike to the left of the dip direction/dip) of these sets are as follows: Set I - 028/74 deg; II - 001/66 deg; III - 100/58 deg; IV - 076/83 deg; and V - 210/40 deg. The small-scale foliations were defined by different mineral types such as biotite crystals, plagioclase, mineral banding and quartz lenses. Well-developed biotite foliation is commonly present whenever well-developed plagioclase foliation exists, but as the strength of development weakens, the preferred orientations of plagioclase foliation do not correspond to those of biotite. It is also noted that the foliations appear to strike in directions orthogonal to the fractures in the fracture zones in the same depth interval. No significant change in foliation orientation was observed in Zones I to IV. Set V, however, whose mean orientation is 210/40 deg, is absent from the Zone IV interval, ranging from 872 to 905 m. (auth)

  12. Generic behaviours in impact fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Sator, N.; Mechkov, S.; Sausset, F. [Paris-6 Univ. Pierre et Marie Curie, Lab. de Physique Theorique de la Matiere Condensee, UMR CNRS 7600, 75 - Paris (France); Mechkov, S. [Ecole Normale Superieure, Lab. de Physique Statistique, 75 - Paris (France)

    2008-02-15

    From atomic nuclei to supernovae, including plates and rocks, every cohesive system can be broken into fragments, provided that the deposited energy is sufficiently large compared to its cohesive energy. We present a simple numerical model for investigating the general properties of fragmentation. By use of molecular dynamics simulations, we study the impact fragmentation of a solid disk of interacting particles with a wall. Regardless of the particular form of the interaction potential, the fragment size distribution exhibits a power law behaviour with an exponent that increases logarithmically with the energy deposited in the system, in agreement with experiments. We expect this behaviour to be generic in fragmentation phenomena. (authors)
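The power-law fragment size distribution described above is typically characterized by fitting its exponent; the standard continuous maximum-likelihood estimator can be sketched as follows (synthetic samples with hypothetical parameters, not the authors' simulation data):

```python
import math
import random

def sample_power_law(n, alpha, xmin, rng):
    """Inverse-transform sampling from p(x) proportional to x^(-alpha), x >= xmin."""
    return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def mle_exponent(sizes, xmin):
    """Continuous maximum-likelihood estimate of the power-law exponent:
    alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
    logs = [math.log(s / xmin) for s in sizes if s >= xmin]
    return 1.0 + len(logs) / sum(logs)

rng = random.Random(42)
sizes = sample_power_law(50_000, alpha=2.3, xmin=1.0, rng=rng)
print(f"estimated exponent: {mle_exponent(sizes, 1.0):.2f}")
```

In a fragmentation study one would apply the estimator to measured fragment sizes at each deposited energy and examine how the fitted exponent varies with that energy.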

  13. Factors influencing agreement between child self-report and parent proxy-reports on the Pediatric Quality of Life Inventory™ 4.0 (PedsQL™ generic core scales

    Directory of Open Access Journals (Sweden)

    Eiser Christine

    2006-08-01

    Full Text Available Abstract Background In situations where children are unable or unwilling to respond for themselves, measurement of quality of life (QOL) is often obtained by parent proxy-report. However the relationship between child self and parent proxy-reports has been shown to be poor in some circumstances. Additionally the most appropriate statistical method for comparing ratings between child and parent proxy-reports has not been clearly established. The objectives of this study were to assess the: (1) agreement between child and parent proxy-reports on an established child QOL measure (the PedsQL™) using two different statistical methods; (2) effect of chronological age and domain type on agreement between children's and parents' reports on the PedsQL™; (3) relationship between parents' own well-being and their ratings of their child's QOL. Methods One hundred and forty-nine healthy children (5.5-6.5, 6.5-7.5, and 7.5-8.5 years) completed the PedsQL™. One hundred and three of their parents completed these measures in relation to their child, and a measure of their own QOL (SF-36). Results Consistency between child and parent proxy-reports on the PedsQL™ was low, with Intra-Class correlation coefficients ranging from 0.02 to 0.23. Correlations were higher for the oldest age group for Total Score and Psychosocial Health domains, and for the Physical Health domain in the youngest age group. Statistically significant median differences were found between child and parent-reports on all subscales of the PedsQL™. The largest median differences were found for the two older age groups. Statistically significant correlations were found between parents' own QOL and their proxy-reports of child QOL across the total sample and within the middle age group. Conclusion Intra-Class correlation coefficients and median difference testing can provide different information on the relationship between parent proxy-reports and child self-reports. Our findings suggest that differences in the levels of parent-child agreement previously reported may be an artefact of the statistical method used. In addition, levels of agreement can be affected by child age, domains investigated, and parents' own QOL. Further studies are needed to establish the optimal predictors of levels of parent-child agreement.
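The Intra-Class correlation coefficients this study relies on can be computed from a one-way ANOVA decomposition of the paired ratings. A sketch with entirely hypothetical child/parent score pairs (not the study's data), where systematically discordant pairs drive the ICC toward or below zero:

```python
import statistics

def icc_oneway(pairs):
    """ICC(1,1): one-way random-effects intraclass correlation for paired
    ratings (e.g. child self-report vs parent proxy-report).
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), with k = 2 raters."""
    n = len(pairs)  # number of children
    k = 2           # raters per child
    grand = statistics.fmean(v for pair in pairs for v in pair)
    # Between-child and within-child mean squares from one-way ANOVA.
    msb = k * sum((statistics.fmean(p) - grand) ** 2 for p in pairs) / (n - 1)
    msw = sum((v - statistics.fmean(p)) ** 2 for p in pairs for v in p) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical PedsQL-style scores: (child self-report, parent proxy-report).
pairs = [(80, 62), (75, 90), (60, 81), (92, 70), (68, 88),
         (85, 66), (71, 95), (64, 77), (90, 63), (58, 86)]
print(f"ICC = {icc_oneway(pairs):.2f}")  # discordant pairs: ICC below zero
```

Median difference testing, the study's second method, would instead compare the paired score distributions directly (e.g. with a Wilcoxon signed-rank test), which is why the two approaches can tell different stories about agreement.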

  14. GENFIT - a generic track reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Hoeppner, Christian; Neubert, Sebastian [Technische Universitaet Muenchen, Physik Department E18, 85748 Garching (Germany)

    2008-07-01

    Experiments in high energy physics use a combination of widely different detector systems to achieve an optimal measurement of particle trajectories. The software package GENFIT has been developed to provide a consistent treatment of track parameter estimation with hits from detectors providing different spatial information, e.g. strip projections, 3-D space points, drift distances to wires, etc. The concept is based on the idea of a full separation of parameterizations (hit-measurements and track models) from the algebra of regression algorithms. This implements the possibility to switch between different track propagation algorithms and detector geometries without changing the core fitting classes. Key components of the system are the Kalman filter and so-called virtual detector planes. An interface to the propagation package GEANE has also been realized. The poster illustrates the object-oriented architecture of the toolkit which uses generic programming techniques to realize the flexible and portable design. Some applications in the framework of the PANDA simulation studies are shown.
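The Kalman filter at the core of such a toolkit can be illustrated with a deliberately minimal straight-track fit: state (position, slope), measurements at successive detector planes. This is a sketch of the idea only, not GENFIT's actual classes or API:

```python
import random

def kalman_line_fit(zs, meas, sigma):
    """Minimal Kalman filter fitting a straight track x = x0 + slope * z
    from noisy position measurements at detector planes at positions zs."""
    x, s = 0.0, 0.0                     # state: position, slope
    P = [[1e6, 0.0], [0.0, 1e6]]        # large initial covariance
    r2 = sigma ** 2
    z_prev = zs[0]
    for z, m in zip(zs, meas):
        dz = z - z_prev
        z_prev = z
        # Predict: propagate state and covariance with F = [[1, dz], [0, 1]].
        x = x + s * dz
        P = [[P[0][0] + dz * (P[1][0] + P[0][1]) + dz * dz * P[1][1],
              P[0][1] + dz * P[1][1]],
             [P[1][0] + dz * P[1][1], P[1][1]]]
        # Update with the plane measurement (H = [1, 0]).
        S = P[0][0] + r2
        K = (P[0][0] / S, P[1][0] / S)
        resid = m - x
        x += K[0] * resid
        s += K[1] * resid
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
    return x, s

rng = random.Random(1)
zs = [float(i) for i in range(20)]            # detector plane positions
meas = [3.0 + 0.5 * z + rng.gauss(0, 0.1) for z in zs]  # true x0=3, slope=0.5
x, s = kalman_line_fit(zs, meas, 0.1)
print(f"fitted slope: {s:.2f}")
```

GENFIT's contribution is to keep this recursion generic: the track model (here a straight line) and the measurement projection (here H = [1, 0]) are abstracted behind virtual detector planes so that the same fitting core serves strips, space points, and drift measurements.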

  15. GENFIT - a Generic track reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Neubert, Sebastian; Hoeppner, Christian [Physik Department E18, TU Muenchen, D-85748 Garching (Germany)

    2008-07-01

    Experiments in high energy physics use a combination of widely different detector systems to achieve an optimal measurement of particle trajectories. The software package GENFIT has been developed to provide a consistent treatment of track parameter estimation with hits from detectors providing different spatial information, e.g. strip projections, 3-D space points, drift distances to wires, etc. The concept is based on the idea of a full separation of parameterizations (hit-measurements and track models) from the algebra of regression algorithms. This implements the possibility to switch between different track propagation algorithms and detector geometries without changing the core fitting classes. Key components of the system are the Kalman filter and so-called virtual detector planes. An interface to the propagation package GEANE has also been realized. The poster illustrates the object-oriented architecture of the toolkit which uses generic programming techniques to realize the flexible and portable design. Some applications in the framework of the PANDA simulation studies are shown.

  16. Impact of brand or generic labeling on medication effectiveness and side effects.

    Science.gov (United States)

    Faasse, Kate; Martin, Leslie R; Grey, Andrew; Gamble, Greg; Petrie, Keith J

    2016-02-01

    Branding medication with a known pharmaceutical company name or product name bestows on the drug an added assurance of authenticity and effectiveness compared to a generic preparation. This study examined the impact of brand name and generic labeling on medication effectiveness and side effects. 87 undergraduate students with frequent headaches took part in the study. Using a within-subjects counterbalanced design, each participant took tablets labeled either as brand name "Nurofen" or "Generic Ibuprofen" to treat each of 4 headaches. In reality, half of the tablets were placebos, and half were active ibuprofen (400 mg). Participants recorded their headache pain on a verbal descriptor and visual analogue scale prior to taking the tablets, and again 1 hour afterward. Medication side effects were also reported. Pain reduction following the use of brand name labeled tablets was similar whether the tablets were active ibuprofen or a placebo. However, if the tablets had a generic label, placebo tablets were significantly less effective than active ibuprofen. Fewer side effects were attributed to placebo tablets with brand name labeling compared to the same placebo tablets with a generic label. Branding of a tablet appears to have conferred a treatment benefit in the absence of an active ingredient, while generic labeled tablets were substantially less effective if they contained no active ingredient. Branding is also associated with reduced attribution of side effects to placebo tablets. Future interventions to improve perceptions of generics may have utility in improving treatment outcomes from generic drugs. (c) 2016 APA, all rights reserved.

  17. Toward a generic UGV autopilot

    Science.gov (United States)

    Moore, Kevin L.; Whitehorn, Mark; Weinstein, Alejandro J.; Xia, Junjun

    2009-05-01

    Much of the success of small unmanned air vehicles (UAVs) has arguably been due to the widespread availability of low-cost, portable autopilots. While the development of unmanned ground vehicles (UGVs) has led to significant achievements, as typified by recent grand challenge events, to date the UGV equivalent of the UAV autopilot is not available. In this paper we describe our recent research aimed at the development of a generic UGV autopilot. Assuming we are given a drive-by-wire vehicle that accepts as inputs steering, brake, and throttle commands, we present a system that adds sonar ranging sensors, GPS/IMU/odometry, stereo camera, and scanning laser sensors, together with a variety of interfacing and communication hardware. The system also includes a finite state machine-based software architecture as well as a graphical user interface for the operator control unit (OCU). Algorithms are presented that enable an end-to-end scenario whereby an operator can view stereo images as seen by the vehicle and can input GPS waypoints either from a map or in the vehicle's scene-view image, at which point the system uses the environmental sensors as inputs to a Kalman filter for pose estimation and then computes control actions to move through the waypoint list, while avoiding obstacles. The long-term goal of the research is a system that is generically applicable to any drive-by-wire unmanned ground vehicle.

  18. PROBLEM OF GENERIC REPLACEMENT: ADVANTAGES AND DISADVANTAGES

    Directory of Open Access Journals (Sweden)

    S. N. Tolpygina

    2009-01-01

    Full Text Available The main differences between original and generic drugs as well as registration criteria for generics are described. Possible reasons for discrepancy in bioequivalence and therapeutic equivalence of original and generic drugs are reviewed. Examples of such a discrepancy revealed in comparative clinical trials (enalapril maleate) are discussed. Approaches to planning of comparative trials on drug therapeutic equivalence are presented.

  19. Generic domain models in software engineering

    Science.gov (United States)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  20. GENERIC COMMODITY PROMOTION AND PRODUCT DIFFERENTIATION

    OpenAIRE

    1999-01-01

    This paper considers whether generic promotion lowers the differentiation among competing brands as claimed in the 1997 Supreme Court case (Wileman et al. v. Glickman). Commodity promotion is modeled as a multi-stage game where products are vertically differentiated. Analytical results show that if the benefits of generic advertising from increased demand are outweighed by the costs from lower product differentiation then high-quality producers will not benefit from generic promotion but prod...

  1. Determinants of generic drug substitution in Switzerland

    Directory of Open Access Journals (Sweden)

    Lufkin Thomas M

    2011-01-01

    Full Text Available Abstract Background Since generic drugs have the same therapeutic effect as the original formulation but at generally lower costs, their use should be more heavily promoted. However, a considerable number of barriers to their wider use have been observed in many countries. The present study examines the influence of patients, physicians and certain characteristics of the generics' market on generic substitution in Switzerland. Methods We used reimbursement claims' data submitted to a large health insurer by insured individuals living in one of Switzerland's three linguistic regions during 2003. All dispensed drugs studied here were substitutable. The outcome (use of a generic or not) was modelled by logistic regression, adjusted for patients' characteristics (gender, age, treatment complexity, substitution groups) and with several variables describing reimbursement incentives (deductible, co-payments) and the generics' market (prices, packaging, co-branded original, number of available generics, etc.). Results The overall generics' substitution rate for 173,212 dispensed prescriptions was 31%, though this varied considerably across cantons. Poor health status (older patients, complex treatments) was associated with lower generic use. Higher rates were associated with higher out-of-pocket costs, greater price differences between the original and the generic, and with the number of generics on the market, while reformulation and repackaging were associated with lower rates. The substitution rate was 13% lower among hospital physicians. The adoption of the prescribing practices of the canton with the highest substitution rate would increase substitution in other cantons to as much as 26%. Conclusions Patient health status explained a part of the reluctance to substitute an original formulation by a generic. Economic incentives were efficient, but with a moderate global effect.
The huge interregional differences indicated that prescribing behaviours and
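The kind of logistic model described in the Methods can be sketched with plain gradient descent on entirely synthetic data; the covariates (standardized price difference and patient age) and all numbers below are hypothetical, not the study's:

```python
import math
import random

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Gradient-descent logistic regression: models the probability that a
    dispensed prescription is a generic as a function of covariates."""
    w = [0.0] * len(xs[0])
    b = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            err = p - y
            for j, xj in enumerate(x):
                gw[j] += err * xj
            gb += err
        w = [wi - lr * gi / n for wi, gi in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Synthetic data: x = (price difference, age), y = 1 if a generic was dispensed.
rng = random.Random(0)
xs, ys = [], []
for _ in range(400):
    price_diff, age = rng.gauss(0, 1), rng.gauss(0, 1)
    logit = 1.5 * price_diff - 1.0 * age  # assumed "true" effects
    ys.append(1 if rng.random() < 1 / (1 + math.exp(-logit)) else 0)
    xs.append((price_diff, age))
w, b = fit_logistic(xs, ys)
print(f"weights: ({w[0]:.1f}, {w[1]:.1f})")  # positive price effect, negative age effect
```

The signs of the recovered weights mirror the study's findings: larger price differences favour substitution, while older (sicker) patients substitute less.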

  2. PROBLEM OF GENERIC REPLACEMENT: ADVANTAGES AND DISADVANTAGES

    Directory of Open Access Journals (Sweden)

    S. N. Tolpygina

    2016-01-01

    Full Text Available The main differences between original and generic drugs as well as registration criteria for generics are described. Possible reasons for discrepancy in bioequivalence and therapeutic equivalence of original and generic drugs are reviewed. Examples of such a discrepancy revealed in comparative clinical trials (enalapril maleate) are discussed. Approaches to planning of comparative trials on drug therapeutic equivalence are presented.

  3. Long-distance relationship between large-scale tropical SSTs and ice core-derived oxygen isotopic records in the Third Pole Region

    Science.gov (United States)

    Thompson, L. G.; Yao, T.; Mosley-Thompson, E. S.; Lin, P.

    2012-12-01

    The tropical hydrological cycle is a key factor coupling isotopic records from ice core, speleothem and lake records with tropical SSTs and the vertical amplification of temperature in the Tropics. Stable isotopic ratios, particularly of oxygen, preserved in glacier ice provide high resolution records of climate changes over long time periods. In polar ice sheets the isotopic signal is driven primarily by temperature while in low-latitudes it depends on a variety of hydrologic and thermal influences in the broad geographic region that supplies moisture to the mountain glaciers. The strong correlation between ice core-derived isotopic records throughout the low- and mid-latitudes and tropical SSTs likely reflects the dominance of tropical evaporation in the flux of water vapor to the atmosphere and provides a possible explanation for the large-scale isotopic links among low- and mid-latitude paleoclimate records. Many low- to mid-latitude ice fields provide continuous, annually-resolved proxy records of climatic and environmental variability recorded by many preserved and measurable parameters including oxygen and hydrogen isotopic ratios and net mass balance (accumulation). These records present an opportunity to examine the nature of climate variability in these regions in greater detail and to extract new information about long-distance relationships in the climate system. Understanding these relationships is essential for proper interpretation of the isotopic records archived in glaciers, lakes, speleothems and other paleo-archives in the Third Pole (TP) Region. Here we compare high resolution records from Dasuopu Glacier in the Himalaya, a speleothem record from Wanxiang Cave in Gansu Province on the TP and the annually resolved ice core records from the Quelccaya Ice Cap in the tropical Andes of South America. The purpose is to explore the role of long-distance processes in determining the isotopic composition of paleo archives on the TP. 
Running correlations

  4. Reference pricing with endogenous generic entry.

    OpenAIRE

    Kurt R. Brekke; Canta, Chiara; Straume, Odd Rune

    2015-01-01

    In this paper we study the effect of reference pricing on pharmaceutical prices and expenditures when generic entry is endogenously determined. We develop a Salop-type model where a brand-name producer competes with generic producers in terms of prices. In the market there are two types of consumers: (i) brand biased consumers who choose between brand-name and generic drugs, and (ii) brand neutral consumers who choose between the different generic drugs. We find that, for a given number of ...

  5. Generic modules for trivial extension algebras

    Institute of Scientific and Technical Information of China (English)

    杜先能

    1995-01-01

    Let A be a finite-dimensional algebra over an algebraically closed field k. An indecomposable (right) A-module M is called generic provided M is infinite-dimensional over k but of finite length as a (left) EndA(M)-module. Let R = A ⋉ DA be the trivial extension algebra of A. Generic R-modules are constructed from generic A-modules using some functors between Mod A and Mod R. It is also proved that if A is a tame hereditary algebra, then R has only two generic modules.

  6. "Generic Entry and the Pricing of Pharmaceuticals"

    OpenAIRE

    Frank, Richard G.; David S. Salkever

    1995-01-01

    During the 1980s the share of prescriptions sold by retail pharmacies that was accounted for by generic products roughly doubled. The price response to generic entry of brand-name products has been a source of controversy. In this paper we estimate models of price responses to generic entry in the market for brand-name and generic drugs. We study a sample of 32 drugs that lost patent protection during the early to mid-1980s. Our results provide strong evidence that brand-name prices increase ...

  7. Proteomics Core

    Data.gov (United States)

    Federal Laboratory Consortium — Proteomics Core is the central resource for mass spectrometry based proteomics within the NHLBI. The Core staff help collaborators design proteomics experiments in a...

  8. Proteomics Core

    Data.gov (United States)

    Federal Laboratory Consortium — Proteomics Core is the central resource for mass spectrometry based proteomics within the NHLBI. The Core staff help collaborators design proteomics experiments in...

  9. Rational use of generic psychotropic drugs.

    Science.gov (United States)

    Carbon, Maren; Correll, Christoph U

    2013-05-01

    For economic reasons, the generic substitution of branded medications is common and welcome. These replacements are based on the concept of bioequivalence, which is considered equal to therapeutic equivalence. Regulatory standards for bioequivalence require the 90 % confidence intervals of group averages of pharmacokinetic measures of a generic and the original drug to overlap within ±20 %. However, therapeutic equivalence has been challenged for several psychotropic agents by retrospective studies and case reports. To evaluate the degree of bioequivalence and therapeutic equivalence of branded and generic psychotropic drugs, we performed an electronic search (from database inception until 24 May 2012 and without language restrictions) in PubMed/MEDLINE, Cochrane Library, and Web of Science. Search terms were "(generic) AND (psychotropic OR psychoactive OR antipsychotic OR antiepileptic OR antidepressant OR stimulant OR benzodiazepine)" or the respective individual substances. We included clinical studies, regardless of design, comparing branded with generic psychotropic drug formulations, identifying 35 such studies. We also included case reports/series reporting on outcomes after a switch between brand and generic psychotropics, identifying 145 clinical cases. Bioequivalence studies in healthy controls or animals, in-vitro studies, and health economics studies without medical information were excluded. An overview of the few randomized controlled studies supports that US FDA regulations assure clinically adequate drug delivery in the majority of patients switched from brand to generic. However, with a growing number of competing generic products for one substance, and growing economic pressure to substitute with the currently cheapest generic, frequent generic-generic switches, often unbeknownst to prescribing clinicians, raise concerns, particularly for antiepileptics/mood stabilizers. Generic-generic switches may vary by more than ±20 % from each other in

  10. The early Cretaceous orogen-scale Dabieshan metamorphic core complex: implications for extensional collapse of the Triassic HP-UHP orogenic belt in east-central China

    Science.gov (United States)

    Ji, Wenbin; Lin, Wei; Faure, Michel; Shi, Yonghong; Wang, Qingchen

    2016-03-01

    The Dabieshan massif is famous as a portion of the world's largest HP-UHP metamorphic belt in east-central China that was built by the Triassic North-South China collision. The central domain of the Dabieshan massif is occupied by a huge migmatite-cored dome [i.e., the central Dabieshan dome (CDD)]. Origin of this domal structure remains controversial. Synthesizing previous and our new structural and geochronological data, we define the Cretaceous Dabieshan as an orogen-scale metamorphic core complex (MCC) with a multistage history. Onset of lithospheric extension in the Dabieshan area occurred as early as the commencement of crustal anatexis at the earliest Cretaceous (ca. 145 Ma), which was followed by primary (early-stage) detachment during 142-130 Ma. The central Dabieshan complex in the footwall and surrounding detachment faults recorded a consistently top-to-the-NW shearing. It is thus inferred that the primary detachment was initiated from a flat-lying detachment zone at the middle crust level. Removal of the orogenic root by delamination at ca. 130 Ma marked the extensional climax, and subsequent isostatic rebound resulted in rapid doming. Along with exhumation of the footwall, the mid-crustal detachment zone was warped as shear zones around the CDD. After 120 Ma, the detachment system probably experienced a migration accommodated to the crustal adjustment, which led to secondary (late-stage) detachment with localized ductile shearing at ca. 110 Ma. The migmatite-gneiss with HP/UHP relicts in the CDD (i.e., the central Dabieshan complex) was a product of the Cretaceous crustal anatexis that consumed the deep-seated part of the HP-UHP slices and the underlying para-autochthonous basement. Compared with the contemporaneous MCCs widely developed along the eastern margin of the Eurasian continent, we propose that the occurrence of the Dabieshan MCC shares the same tectonic setting as the "destruction of the North China craton". 
However, geodynamic trigger

  11. The early Cretaceous orogen-scale Dabieshan metamorphic core complex: implications for extensional collapse of the Triassic HP-UHP orogenic belt in east-central China

    Science.gov (United States)

    Ji, Wenbin; Lin, Wei; Faure, Michel; Shi, Yonghong; Wang, Qingchen

    2017-06-01

    The Dabieshan massif is famous as a portion of the world's largest HP-UHP metamorphic belt in east-central China that was built by the Triassic North-South China collision. The central domain of the Dabieshan massif is occupied by a huge migmatite-cored dome [i.e., the central Dabieshan dome (CDD)]. Origin of this domal structure remains controversial. Synthesizing previous and our new structural and geochronological data, we define the Cretaceous Dabieshan as an orogen-scale metamorphic core complex (MCC) with a multistage history. Onset of lithospheric extension in the Dabieshan area occurred as early as the commencement of crustal anatexis at the earliest Cretaceous (ca. 145 Ma), which was followed by primary (early-stage) detachment during 142-130 Ma. The central Dabieshan complex in the footwall and surrounding detachment faults recorded a consistently top-to-the-NW shearing. It is thus inferred that the primary detachment was initiated from a flat-lying detachment zone at the middle crust level. Removal of the orogenic root by delamination at ca. 130 Ma marked the extensional climax, and subsequent isostatic rebound resulted in rapid doming. Along with exhumation of the footwall, the mid-crustal detachment zone was warped as shear zones around the CDD. After 120 Ma, the detachment system probably experienced a migration accommodated to the crustal adjustment, which led to secondary (late-stage) detachment with localized ductile shearing at ca. 110 Ma. The migmatite-gneiss with HP/UHP relicts in the CDD (i.e., the central Dabieshan complex) was a product of the Cretaceous crustal anatexis that consumed the deep-seated part of the HP-UHP slices and the underlying para-autochthonous basement. Compared with the contemporaneous MCCs widely developed along the eastern margin of the Eurasian continent, we propose that the occurrence of the Dabieshan MCC shares the same tectonic setting as the "destruction of the North China craton". 
However, geodynamic trigger

  12. 76 FR 54507 - Proposed Generic Communication; Draft NRC Generic Letter 2011-XX: Seismic Risk Evaluations for...

    Science.gov (United States)

    2011-09-01

    ... COMMISSION Proposed Generic Communication; Draft NRC Generic Letter 2011-XX: Seismic Risk Evaluations for... the effects of natural phenomena, including earthquakes, without losing the capability to perform... Electric Power Research Institute models to estimate earthquake ground motion and updated models...

  13. Analysis of French generic medicines retail market: why the use of generic medicines is limited.

    Science.gov (United States)

    Dylst, Pieter; Vulto, Arnold; Simoens, Steven

    2014-12-01

    The market share of generic medicines in France is low compared to other European countries. This perspective paper provides an overview of the generic medicines retail market in France and how the current policy environment may affect the long-term sustainability. Looking at the French generic medicines retail market and the surrounding regulatory framework, all conditions seem to be in place to create a healthy generic medicines market: the country has well-respected regulatory authorities, generic medicines enter the market in a timely manner and prices of generic medicines are competitive compared with other European countries. Despite the success of the demand-side policies targeted at pharmacists and patients, those targeted at physicians were less successful due to a lack of enforcement and a lack of trust in generic medicines by French physicians. Recommendations to increase the use of generic medicines in France round off this perspective paper.

  14. Generic multiset programming with discrimination-based joins and symbolic Cartesian products

    DEFF Research Database (Denmark)

    Henglein, Fritz; Larsen, Ken Friis

    2010-01-01

    This paper presents GMP, a library for generic, SQL-style programming with multisets. It generalizes the querying core of SQL in a number of ways: Multisets may contain elements of arbitrary first-order data types, including references (pointers), recur- sive data types and nested multisets; it c...
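GMP itself is a Haskell library; the following is a minimal Python sketch (names and data hypothetical) of the core idea behind discrimination-based joins: instead of comparing element pairs or sorting, elements are bucketed by their join key, so an equi-join over multisets runs in linear time and preserves duplicates.

```python
def disc_join(xs, ys, key_x, key_y):
    """Equi-join two multisets in O(|xs| + |ys|) by discriminating
    (bucketing) on the join key instead of pairwise comparison."""
    buckets = {}
    for x in xs:
        buckets.setdefault(key_x(x), []).append(x)
    # Duplicates on either side are preserved: multiset semantics.
    return [(x, y) for y in ys for x in buckets.get(key_y(y), [])]

# Hypothetical data: orders joined to users on user name.
orders = [("o1", "alice"), ("o2", "bob"), ("o3", "alice")]
users = [("alice", 30), ("bob", 25)]
pairs = disc_join(orders, users, key_x=lambda o: o[1], key_y=lambda u: u[0])
```

SQL's own join semantics over bags is the same: every matching pair appears once per occurrence on each side.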

  15. Knowledge and attitudes of physicians and pharmacists towards the use of generic medicines in Bosnia and Herzegovina.

    Science.gov (United States)

    Čatić, Tarik; Avdagić, Lejla; Martinović, Igor

    2017-02-01

    Aim: To investigate and assess the knowledge and attitudes of pharmacists and physicians towards prescribing generic drugs, in order to evaluate current trends and obstacles to prescribing/dispensing generics and to suggest possible improvements to rational and economical prescribing, given scarce public drug budgets. Methods: A cross-sectional survey of 450 primary care physicians (prescribers) and pharmacists in four major cities in Bosnia and Herzegovina (Sarajevo, Banja Luka, Tuzla and Mostar) was conducted between January and March 2016. A questionnaire was developed, and physicians' and pharmacists' perceptions were examined using a 5-point Likert scale. Descriptive statistics were used to examine respondents' characteristics and their responses to survey questions. Respondents' perceptions across different characteristics were assessed using ordinal logistic regression. Results: Generally, positive attitudes towards generic drugs were found. The majority of respondents, 392 (87.0%), considered generic drugs the same as originators and mutually substitutable. Physicians were more likely to prescribe branded drugs, 297 (66.6%), even though 391 (86.8%) were aware of generic alternatives. Respondents believed that patients considered generic drugs less effective, 204 (45.4%), and disapproved of generic substitution, 221 (49.0%). Conclusion: Our findings suggest that further education and more information about the benefits of generic drugs should be provided to key stakeholders, including patients. Clearer generic drug policies should also be introduced in order to improve generic prescribing and potentially improve access and optimize public pharmaceutical expenditures.

  16. Pharmaceutical policy regarding generic drugs in Belgium.

    Science.gov (United States)

    Simoens, Steven; De Bruyn, Kristien; Bogaert, Marc; Laekeman, Gert

    2005-01-01

    Pressure to control pharmaceutical expenditure and price competition among pharmaceutical companies are fuelling the development of generic drug markets in EU countries. However, in Belgium, the market for generic drugs is underdeveloped compared with other countries. To promote the use of generic drugs, the government introduced a reference pricing (RP) scheme in 2001. The aim of this paper is to discuss Belgian pharmaceutical policy regarding generic drugs and to analyse how the Belgian drug market has evolved following initiation of the RP scheme. The market share held by generic drugs increased following implementation of the RP scheme. Focusing on volume, average market share (by semester) for generic drugs amounted to 2.05% of the total pharmaceutical market from January 1998 to June 2001, compared with 6.11% from July 2001 to December 2003. As new generic drugs are introduced, their market share tends to increase in the first couple of months, after which it levels off. Faced with increasing generic competition, some manufacturers have launched new variants of their original drug, thereby effectively extending the period of patent protection. Strategies consisting of price reductions in return for the abolition of prescribing conditions and the launch of new dosages or formulations appear to have been successful in maintaining the market share of original drugs. Nevertheless, the introduction of the RP scheme was associated with savings amounting to 1.8% of pharmaceutical expenditure by the third-party payer in 2001 and 2.1% in 2002. The findings of this paper indicate that the RP scheme has stimulated the Belgian generic drug market. However, existing policy has largely failed to take into account the role that physicians and pharmacists can play in stimulating generic drug use. Therefore, further development of the Belgian generic drug market seems to hinge on the creation of appropriate incentives for physicians to prescribe, and for pharmacists to

  17. Interrater Reliability and Concurrent Validity of a New Rating Scale to Assess the Performance of Everyday Life Tasks in Dementia: The Core Elements Method.

    Science.gov (United States)

    de Werd, Maartje M E; Hoelzenbein, Angela C; Boelen, Daniëlle H E; Rikkert, Marcel G M Olde; Hüell, Michael; Kessels, Roy P C; Voigt-Radloff, Sebastian

    2016-12-01

    Errorless learning (EL) is an instructional procedure involving error reduction during learning. Errorless learning is mostly examined by counting correctly executed task steps or by rating them using a Task Performance Scale (TPS). Here, we explore the validity and reliability of a new assessment procedure, the core elements method (CEM), which rates essential building blocks of activities rather than individual steps. Task performance was assessed in 35 patients with Alzheimer's dementia recruited from the Relearning methods on Daily Living task performance of persons with Dementia (REDALI-DEM) study using TPS and CEM independently. Results showed excellent interrater reliabilities for both measure methods (CEM: intraclass coefficient [ICC] = .85; TPS: ICC = .97). Also, both methods showed a high agreement (CEM: mean of measurement difference [MD] = -3.44, standard deviation [SD] = 14.72; TPS: MD = -0.41, SD = 7.89) and correlated highly (>.75). Based on these results, TPS and CEM are both valid for assessing task performance. However, since TPS is more complicated and time consuming, CEM may be the preferred method for future research projects.

  18. Augmenting the core battery with supplementary subtests: Wechsler adult intelligence scale--IV measurement invariance across the United States and Canada.

    Science.gov (United States)

    Bowden, Stephen C; Saklofske, Donald H; Weiss, Lawrence G

    2011-06-01

    Examination of measurement invariance provides a powerful method to evaluate the hypothesis that the same set of psychological constructs underlies a set of test scores in different populations. If measurement invariance is observed, then the same psychological meaning can be ascribed to scores in both populations. In this study, the measurement model including core and supplementary subtests of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) was compared across the U.S. and Canadian standardization samples. Populations were compared on the 15-subtest version of the test in people aged 70 and younger and on the 12-subtest version in people aged 70 or older. Results indicated that a slightly modified version of the four-factor model reported in the WAIS-IV technical manual provided the best fit in both populations and in both age groups. The null hypothesis of measurement invariance across populations was not rejected, and the results provide direct evidence for the generalizability of convergent and discriminant validity studies with the WAIS-IV across populations. Small to medium differences in latent means favoring Canadians highlight the value of local norms.

  19. Performance Assessment Modeling and Sensitivity Analyses of Generic Disposal System Concepts.

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David; Freeze, Geoffrey A.; Gardner, William Payton; Hammond, Glenn Edward; Mariner, Paul

    2014-09-01

    directly, rather than through simplified abstractions. It also allows for complex representations of the source term, e.g., the explicit representation of many individual waste packages (i.e., meter-scale detail of an entire waste emplacement drift). This report fulfills the Generic Disposal System Analysis Work Package Level 3 Milestone - Performance Assessment Modeling and Sensitivity Analyses of Generic Disposal System Concepts (M3FT-14SN0808032).

  20. Once more the generic name Passerina Vieillot

    NARCIS (Netherlands)

    Oort, van E.D.

    1910-01-01

    The note on the generic name of the Snow-bunting by Dr. E. Hartert in this part of our periodical gives me cause to revert to the subject of my note on the generic name Passerina Vieillot and to state here, that I stand to what I have said about the rejection of this name in Zoology (Notes Leyden Mu

  1. Designing Generic and Efficient Negotiation Strategies

    NARCIS (Netherlands)

    Tykhonov, D.

    2010-01-01

    The central aim of this thesis is the design of generic and efficient automated strategies for two-party negotiations in which negotiating parties do not reveal their preferences explicitly. A strategy for negotiation is the decision mechanism for determining the actions of a negotiator. Generic ref

  2. Typed generic traversals in $S'_\gamma$

    NARCIS (Netherlands)

    Lämmel, R.

    2001-01-01

    A typed model of strategic rewriting is developed. An innovation is that generic traversals are covered. To this end, we define a rewriting calculus $S'_\gamma$. The calculus offers a few strategy combinators for generic traversals. There is, for example, a combinator to apply a strategy to all imme

  3. [Generic drugs: quality, efficacy, safety and interchangeability].

    Science.gov (United States)

    Tschabitscher, Doris; Platzer, Peter; Baumgärtel, Christoph; Müllner, Marcus

    2008-01-01

    Since the introduction of generic drugs to the pharmaceutical market, a sometimes emotional debate has persisted over whether they are well investigated and of high quality. There is some uncertainty about whether evidence of bioequivalence is enough to guarantee the efficacy and safety of generic drugs. Some physicians question whether competent authorities are able to ascertain that the pharmaceutical quality of generics is acceptable. Doctors and patients are sometimes ill at ease about the interchangeability of innovator and generic products. This article describes how European Union legislation ensures that a generic drug is approved only if its risk-benefit relationship is favourable and it is essentially similar to the innovator product. In this context, pharmacokinetic parameters are accepted as surrogates for clinical results because bioequivalence implies therapeutic equivalence as well. For most drugs, current bioequivalence testing generally enables clinicians to routinely substitute generic for innovator products. Published findings, however, suggest that particular drugs may not be ideally suited for generic substitution when a patient is already on that drug. These are the so-called critical-dose medicinal products (drugs with a narrow therapeutic range). When starting a new therapy with any generic drug, however, its similarity to the innovator drug in terms of efficacy, safety and quality is guaranteed.

  4. GENERIC DRUG USER FEE: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    Darshit S. Patel*, Abhishek R. Patel and Narendra A. Patel

    2012-09-01

    The globalization of generic drug manufacturing, supply and testing, and a growing workload that has far outpaced USFDA's resources, have created new challenges. The USFDA and industry proposed a generic drug user fee to address the need for globalization of the inspection process and to speed the timely review of generic product applications. The Generic Drug User Fee (GDUF) proposal has been agreed upon by the generic industry and the USFDA and is focused on three key aims: safety, access, and transparency. Under the program, USFDA will receive nearly $1.5 billion over five years in supplemental funding through generic industry user fees in order to help the agency expedite access to generic drugs, enhance drug quality and safety, and ensure inspection parity of both foreign and domestic manufacturing sites. GDUF will also help accelerate the market entry of additional manufacturers of drugs currently in short supply and improve quality, consistency, and availability within the supply chain, further helping to mitigate drug shortages. The GDUF legislation is a milestone for the generic giants and a major win for American health care consumers.

  5. Defining Generic Skills. At a Glance.

    Science.gov (United States)

    National Centre for Vocational Education Research, Leabrook (Australia).

    Generic skills--skills that apply across a variety of jobs and life contexts--are taking on increased importance in Australia and internationally. There is a high demand for generic skills in the workplace because employers seek to ensure business success by recruiting and retaining employees who have a variety of skills and personal attributes as…

  6. More on core instabilities of magnetic monopoles

    CERN Document Server

    Striet, J

    2003-01-01

    In this paper we present new results on the core instability of the 't Hooft Polyakov monopoles we reported on before. This instability, where the spherical core decays in a toroidal one, typically occurs in models in which charge conjugation is gauged. In this paper we also discuss a third conceivable configuration denoted as ``split core'', which brings us to some details of the numerical methods we employed. We argue that a core instability of 't Hooft Polyakov type monopoles is quite a generic feature of models with charged Higgs particles.

  7. Explaining Communication Displacement and Large-Scale Social Change in Core Networks: A Cross-National Comparison of Why Bigger is Not Better and Less Can Mean More

    DEFF Research Database (Denmark)

    Hampton, Keith; Ling, Richard

    2013-01-01

    The size and diversity of Americans' core social networks have declined. Some suggest that the replacement of face-to-face contact with new media, combined with more insular core networks, is detrimental to both individual and societal well-being. Based on a cross-national comparison of the Uni... with larger core networks and more frequent in-person contact. However, while contact is generally associated with contact, frequent in-person interaction within the context of low societal well-being is associated with a smaller core network.

  8. A Generic Agent Organisation Framework for Autonomic Systems

    Science.gov (United States)

    Kota, Ramachandra; Gibbins, Nicholas; Jennings, Nicholas R.

    Autonomic computing is being advocated as a tool for managing large, complex computing systems. Specifically, self-organisation provides a suitable approach for developing such autonomic systems by incorporating self-management and adaptation properties into large-scale distributed systems. To aid in this development, this paper details a generic problem-solving agent organisation framework that can act as a modelling and simulation platform for autonomic systems. Our framework describes a set of service-providing agents accomplishing tasks through social interactions in dynamically changing organisations. We particularly focus on the organisational structure as it can be used as the basis for the design, development and evaluation of generic algorithms for self-organisation and other approaches towards autonomic systems.

  9. Hollow-Core Fiber Lamp

    Science.gov (United States)

    Yi, Lin (Inventor); Tjoelker, Robert L. (Inventor); Burt, Eric A. (Inventor); Huang, Shouhua (Inventor)

    2016-01-01

    Hollow-core capillary discharge lamps on the millimeter or sub-millimeter scale are provided. The hollow-core capillary discharge lamps achieve an increased intensity ratio of 194 nm (useful) to 254 nm (useless) light compared with conventional lamps. The capillary discharge lamps may include a cone to increase light output. Hollow-core photonic crystal fiber (HCPCF) may also be used.

  10. Risks and benefits of generic antiepileptic drugs.

    Science.gov (United States)

    Gómez-Alonso, Juan; Kanner, Andrés M; Herranz, José Luis; Molins, Albert; Gil-Nagel, Antonio

    2008-11-01

    In most therapeutic areas, prescribing generic drugs seems to lower costs without sacrificing efficacy. The use of generic drugs for treating epilepsy may, however, be more controversial. A systematic review of the literature on generic antiepileptic drugs has been carried out, based primarily on a bibliographical search of the Medline database. Published studies are usually of a descriptive nature and are sometimes based on generic drugs that were approved at a time when regulatory agency requirements were not as strict as they are now. Experts claim that a change in pharmaceutical formulation could cause seizure recurrence in cases that had been successfully controlled in the past, with severe effects on patients. Meanwhile, several health organizations have provided inconsistent recommendations on the use of generic antiepileptic drugs. In order to obtain scientific evidence on the potential risks and benefits of interchanging branded and generic antiepileptic drugs, comparative studies of high methodological quality are necessary. Such studies could bring consensus about the role of generic drugs in treating epilepsy.

  11. Generic physical protection logic trees

    Energy Technology Data Exchange (ETDEWEB)

    Paulus, W.K.

    1981-10-01

    Generic physical protection logic trees, designed for application to nuclear facilities and materials, are presented together with a method of qualitative evaluation of the trees for design and analysis of physical protection systems. One or more defense zones are defined where adversaries interact with the physical protection system. Logic trees that are needed to describe the possible scenarios within a defense zone are selected. Elements of a postulated or existing physical protection system are tagged to the primary events of the logic tree. The likelihood of adversary success in overcoming these elements is evaluated on a binary, yes/no basis. The effect of these evaluations is propagated through the logic of each tree to determine whether the adversary is likely to accomplish the end event of the tree. The physical protection system must be highly likely to overcome the adversary before he accomplishes his objective. The evaluation must be conducted for all significant states of the site. Deficiencies uncovered become inputs to redesign and further analysis, closing the loop on the design/analysis cycle.
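The binary yes/no propagation described above can be sketched as a recursive evaluation over AND/OR gates; the tree below is a hypothetical example, not one of the report's actual logic trees.

```python
def evaluate(node):
    """Propagate binary adversary-success evaluations up a logic tree.
    A node is either a bool (primary event: True = adversary likely
    overcomes that element) or a (gate, children) pair, gate in
    {'AND', 'OR'}."""
    if isinstance(node, bool):
        return node
    gate, children = node
    results = [evaluate(child) for child in children]
    return all(results) if gate == "AND" else any(results)

# Hypothetical defense zone: the adversary must defeat the fence AND
# (either the sensor OR the guard patrol) to reach the end event.
tree = ("AND", [True, ("OR", [False, True])])
```

If the end event evaluates to True for any significant site state, the corresponding elements become inputs to redesign, closing the design/analysis loop.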

  12. Generic Magnetic Fusion Reactor Revisited

    Science.gov (United States)

    Sheffield, John; Milora, Stanley

    2015-11-01

    The original Generic Magnetic Fusion Reactor paper was published in 1986. This update describes what has changed in 30 years. Notably, the construction of ITER is providing important benchmark numbers for technologies and costs. In addition, we use a more conservative neutron wall flux and fluence. But these cost-increasing factors are offset by greater optimism on the thermal-electric conversion efficiency and potential availability. The main examples show the cost of electricity (COE) as a function of aspect ratio and neutron flux to the first wall. The dependence of the COE on availability, thermo-electric efficiency, electrical power output, and the present day's low interest rates is also discussed. Interestingly, at fixed aspect ratio there is a shallow minimum in the COE at neutron flux around 2.5 MW/m2. The possibility of operating with only a small COE penalty at even lower wall loadings (to 1.0 MW/m2 at larger plant size) and the use of niobium-titanium coils are also investigated. J. Sheffield was supported by ORNL subcontract 4000088999 with the University of Tennessee.

  13. Combining Vertex-centric Graph Processing with SPARQL for Large-scale RDF Data Analytics

    KAUST Repository

    Abdelaziz, Ibrahim

    2017-06-27

    Modern applications, such as drug repositioning, require sophisticated analytics on RDF graphs that combine structural queries with generic graph computations. Existing systems support either declarative SPARQL queries or generic graph processing, but not both. We bridge the gap by introducing Spartex, a versatile framework for complex RDF analytics. Spartex extends SPARQL to support programs that seamlessly combine generic graph algorithms (e.g., PageRank, Shortest Paths, etc.) with SPARQL queries. Spartex builds on existing vertex-centric graph processing frameworks, such as Graphlab or Pregel. It implements a generic SPARQL operator as a vertex-centric program that interprets SPARQL queries and executes them efficiently using a built-in optimizer. In addition, any graph algorithm implemented in the underlying vertex-centric framework can be executed in Spartex. We present various scenarios where our framework significantly simplifies the implementation of complex RDF data analytics programs. We demonstrate that Spartex scales to datasets with billions of edges, and show that our core SPARQL engine is at least as fast as state-of-the-art specialized RDF engines. For complex analytical tasks that combine generic graph processing with SPARQL, Spartex is at least an order of magnitude faster than existing alternatives.
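Spartex builds on Pregel-style vertex-centric frameworks; as a rough illustration of that computation model (not Spartex's actual API), here is a minimal synchronous PageRank loop in which each vertex sends its rank share along out-edges and then updates from the messages it received.

```python
def pagerank_vertex_centric(edges, n, iters=20, d=0.85):
    """Minimal synchronous (Pregel-style) PageRank over n vertices:
    per superstep, each vertex sends rank/out_degree along its
    out-edges, then recomputes its rank from incoming messages."""
    out = [[] for _ in range(n)]
    for u, v in edges:
        out[u].append(v)
    rank = [1.0 / n] * n
    for _ in range(iters):
        msgs = [0.0] * n
        for u in range(n):
            if out[u]:
                share = rank[u] / len(out[u])
                for v in out[u]:
                    msgs[v] += share
        rank = [(1 - d) / n + d * m for m in msgs]
    return rank

# Tiny cycle graph 0 -> 1 -> 2 -> 0: by symmetry every rank stays 1/3.
ranks = pagerank_vertex_centric([(0, 1), (1, 2), (2, 0)], n=3)
```

Real frameworks distribute the per-vertex compute step across workers; the superstep structure is the same.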

  14. Generic substitution, financial interests, and imperfect agency.

    Science.gov (United States)

    Rischatsch, Maurus; Trottmann, Maria; Zweifel, Peter

    2013-06-01

    Policy makers around the world seek to encourage generic substitution. In this paper, the importance of prescribing physicians' imperfect agency is tested using the fact that some Swiss jurisdictions allow physicians to dispense drugs on their own account (physician dispensing, PD) while others disallow it. We estimate a model of physician drug choice with the help of drug claim data, finding a significant positive association between PD and the use of generics. While this points to imperfect agency, generics are prescribed more often to patients with high copayments or low incomes.

  15. Generic Argillite/Shale Disposal Reference Case

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Liange; Colon, Carlos Jové; Bianchi, Marco; Birkholzer, Jens

    2014-08-08

    Radioactive waste disposal in a deep subsurface repository hosted in clay/shale/argillite is a subject of widespread interest given the desirable isolation properties, geochemically reduced conditions, and widespread geologic occurrence of this rock type (Hansen 2010; Bianchi et al. 2013). Bianchi et al. (2013) provide a description of diffusion in a clay-hosted repository based on single-phase flow and full saturation, using parametric data from documented studies in Europe (e.g., ANDRA 2005). The predominance of diffusive transport and sorption phenomena in this clay media are key attributes to impede radionuclide mobility, making clay rock formations target sites for disposal of high-level radioactive waste. The reports by Hansen et al. (2010) and those from numerous studies in clay-hosted underground research laboratories (URLs) in Belgium, France and Switzerland outline the extensive scientific knowledge obtained to assess long-term clay/shale/argillite repository isolation performance of nuclear waste. In the past several years under the UFDC, various kinds of models have been developed for the argillite repository to demonstrate model capability, understand the spatial and temporal alteration of the repository, and evaluate different scenarios. These models include the coupled Thermal-Hydrological-Mechanical (THM) and Thermal-Hydrological-Mechanical-Chemical (THMC) models (e.g. Liu et al. 2013; Rutqvist et al. 2014a, Zheng et al. 2014a) that focus on THMC processes in the Engineered Barrier System (EBS) bentonite and argillite host rock, the large-scale hydrogeologic model (Bianchi et al. 2014) that investigates the hydraulic connection between an emplacement drift and surrounding hydrogeological units, and the Disposal Systems Evaluation Framework (DSEF) models (Greenberg et al. 2013) that evaluate thermal evolution in the host rock approximated as a thermal conduction process to facilitate the analysis of design options. However, the assumptions and the

  16. Ice cores

    DEFF Research Database (Denmark)

    Svensson, Anders

    2014-01-01

    Ice cores from Antarctica, from Greenland, and from a number of smaller glaciers around the world yield a wealth of information on past climates and environments. Ice cores offer unique records on past temperatures, atmospheric composition (including greenhouse gases), volcanism, solar activity, dustiness, and biomass burning, among others. In Antarctica, ice cores extend back more than 800,000 years before present (Jouzel et al. 2007), whereas Greenland ice cores cover the last 130,000 years...

  18. Dissolution testing for generic drugs: an FDA perspective.

    Science.gov (United States)

    Anand, Om; Yu, Lawrence X; Conner, Dale P; Davit, Barbara M

    2011-09-01

    In vitro dissolution testing is an important tool used for development and approval of generic dosage forms. The objective of this article is to summarize how dissolution testing is used for the approval of safe and effective generic drug products in the United States (US). Dissolution testing is routinely used for stability and quality control purposes for both oral and non-oral dosage forms. The dissolution method should be developed using an appropriate validated method depending on the dosage form. There are several ways in which dissolution testing plays a pivotal role in regulatory decision-making. It may be used to waive in vivo bioequivalence (BE) study requirements, as BE documentation for Scale Up and Post Approval Changes (SUPAC), and to predict the potential for a modified-release (MR) drug product to dose-dump if co-administered with alcoholic beverages. Thus, in vitro dissolution testing plays a major role in FDA's efforts to reduce the regulatory burden and unnecessary human studies in generic drug development without sacrificing the quality of the drug products.
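One concrete tool behind such BE waivers is the f2 similarity factor from FDA dissolution guidance, which compares a test dissolution profile against a reference profile; profiles with f2 ≥ 50 are conventionally read as similar. A minimal sketch with hypothetical dissolution data:

```python
import math

def f2_similarity(reference, test):
    """f2 similarity factor for two dissolution profiles (% dissolved at
    matched time points): f2 = 50 * log10(100 / sqrt(1 + mean squared
    difference)). Identical profiles give f2 = 100; f2 >= 50 is
    conventionally considered similar."""
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50 * math.log10(100 / math.sqrt(1 + msd))

# Hypothetical % dissolved at 15/30/45/60 minutes
ref = [35, 58, 79, 92]
test = [38, 61, 80, 93]
f2 = f2_similarity(ref, test)
```

The guidance attaches conditions (e.g., a limited number of time points past 85% dissolved) that this sketch does not enforce.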

  19. Generic tacrolimus in solid organ transplantation.

    Science.gov (United States)

    Taube, D; Jones, G; O'Beirne, J; Wennberg, L; Connor, A; Rasmussen, A; Backman, L

    2014-05-01

    The availability of a wide range of immunosuppressive therapies has revolutionized the management of patients who have undergone solid organ transplantation (SOT). However, the cost of immunosuppressive drugs remains high. This situation has led to the development of generic equivalents, which are similar in quality, safety, and efficacy to their approved innovator drugs. There are data available for three generic brands, tacrolimus (Intas), tacrolimus (PharOS), and tacrolimus (Sandoz). Bioequivalence has been demonstrated for generic tacrolimus (Sandoz) within a narrow therapeutic range to its innovator tacrolimus drug (Prograf) in both healthy volunteers and kidney transplant patients. Clinical experience with this generic tacrolimus formulation has also been established in both de novo and conversion patients who have undergone kidney and liver transplantation, as well as in conversion of other SOT patients, including lung and heart recipients.

  20. Are generic drugs really inferior medicines?

    Science.gov (United States)

    Moore, N; Berdaï, D; Bégaud, B

    2010-09-01

    In this issue Gagne et al. report an elegant case-crossover study of seizures in patients on antiepileptic drugs. They found that a dispensation episode approximately triples the risk of having a seizure within 21 days, but the risk is not statistically different whether the dispensation was of the same brand-name or generic drug as previously used or a switch from brand-name to a generic or from a generic to a brand name. The cause of the seizure might be a delay in taking medication or late redispensation, among others, but apparently the nature of the product dispensed is not relevant in this study; this may alleviate some of the concerns about generic drugs and epilepsy.

  1. Generic substitution: issues for problematic drugs.

    Science.gov (United States)

    Henderson, J D; Esham, R H

    2001-01-01

    The methodology and criteria for bioequivalence testing have been firmly established by the Food and Drug Administration (FDA). For certain drugs with a narrow therapeutic index (e.g., digoxin, levothyroxine, warfarin), generic substitution may not be advisable or even allowable, depending on the substitution laws of individual states. Digoxin and levothyroxine tablets are examples of drugs for which no New Drug Applications (NDAs) currently exist. However, commercially available generic products for both of these drugs have not been determined by the FDA to be therapeutically equivalent to the innovator products. Generic versions of warfarin have been approved by the FDA as being therapeutically equivalent to the innovator products, as have generic versions of the rescue inhaler albuterol. Yet, misinformation and myths persist regarding the adequacy and proven reliability of the FDA's determination of bioequivalence for these products.

  2. Impacts of Generic Competition and Benefit Management...

    Data.gov (United States)

    U.S. Department of Health & Human Services — According to findings reported in Impacts of Generic Competition and Benefit Management Practices on Spending for Prescription Drugs - Evidence from Medicare's Part D...

  3. Impacts of Generic Competition and Benefit Management...

    Data.gov (United States)

    U.S. Department of Health & Human Services — According to findings reported in Impacts of Generic Competition and Benefit Management Practices on Spending for Prescription Drugs - Evidence from Medicare's Part D...

  4. Generic User Process Interface for Event Generators

    CERN Document Server

    Boos, E; Giele, W T; Hinchliffe, Ian; Huston, J; Ilyin, V A; Kanzaki, J; Kato, K; Kurihara, Y; Lönnblad, L; Mangano, Michelangelo L; Mrenna, S; Paige, Frank E; Richter-Was, Elzbieta; Seymour, Michael H; Sjöstrand, Torbjörn; Webber, Bryan R; Zeppenfeld, Dieter

    2001-01-01

    Generic Fortran common blocks are presented for use by High Energy Physics event generators for the transfer of event configurations from parton level generators to showering and hadronization event generators.
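The accord's common blocks (HEPRUP for run-level and HEPEUP for event-level information) carry, per event, a list of partons with particle IDs, status codes, mother pointers, and four-momenta. A rough Python analogue of such an event record is sketched below; the class and field names are illustrative, not the accord's actual Fortran layout:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Parton:
    pdg_id: int                       # particle ID (PDG numbering scheme)
    status: int                       # e.g. -1 incoming, +1 outgoing
    mothers: Tuple[int, int]          # 1-based indices of mother partons (0 = none)
    momentum: Tuple[float, float, float, float, float]  # (px, py, pz, E, m)

@dataclass
class Event:
    process_id: int                   # which hard process produced this event
    weight: float                     # event weight
    scale: float                      # scale of the hard process (GeV)
    partons: List[Parton] = field(default_factory=list)

# A toy e+e- -> mu+mu- configuration handed from a parton-level
# generator to a showering/hadronization generator
ev = Event(process_id=1, weight=1.0, scale=91.2)
ev.partons.append(Parton(11,  -1, (0, 0), (0.0, 0.0,  45.6, 45.6, 0.0)))
ev.partons.append(Parton(-11, -1, (0, 0), (0.0, 0.0, -45.6, 45.6, 0.0)))
ev.partons.append(Parton(13,   1, (1, 2), (20.0, 5.0, 10.0, 22.9, 0.106)))
ev.partons.append(Parton(-13,  1, (1, 2), (-20.0, -5.0, -10.0, 22.9, 0.106)))
print(len(ev.partons))  # → 4
```

The showering generator only needs to read such a record; it never needs to know how the parton-level generator produced it, which is the decoupling the interface provides.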

  5. Reliability, Validity and Construct Confirmation of the Core Self-Evaluations Scale

    Institute of Scientific and Technical Information of China (English)

    杜建政; 张翔; 赵燕

    2012-01-01

    This study tested a Chinese version of the Core Self-Evaluations Scale. Judge et al. (1997) proposed a higher-order construct they termed core self-evaluations, defined as the basic, bottom-line evaluations that individuals hold about themselves; it is a broad dispositional trait indicated by four more specific traits: self-esteem, generalized self-efficacy, locus of control and emotional stability. Past research has shown that core self-evaluations play an important role in attitudes and behavior at work and in life, and Judge et al. (2003) developed a direct measure of the trait, the Core Self-Evaluations Scale (CSES). The purpose of this study was to revise the scale and test the reliability and validity of the Chinese CSES. Based on the scale developed by Judge et al., a Chinese version was translated and revised using a sample of 526 valid responses from enterprise employees (from 3 companies) and university students, with a cross-validation design: an exploratory factor analysis was conducted first, followed by a confirmatory factor analysis of the model, and the reliability and validity of the revised scale were examined while controlling for common method bias. The fit indices of the single-factor model were broadly acceptable: χ²/df, GFI, CFI, NFI, IFI, TLI and RMSEA were 2.20, 0.94, 0.91, 0.86, 0.92, 0.90 and 0.07, respectively. The revised scale had a Cronbach's α of 0.83, a split-half reliability of 0.84, and a test-retest reliability of 0.82 over a three-week interval. The correlation between core self-evaluations and life satisfaction was 0.48. Core self-evaluations showed a single-factor structure, and the revised Chinese Core Self-Evaluations Scale has good reliability and validity, making it an effective and practical personality measure.
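Cronbach's α, the internal-consistency statistic reported above for the revised scale (α = 0.83), is computed from the per-item variances and the variance of the total score. A minimal sketch with invented item scores (not the CSES data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) matrix of item scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of each respondent's total
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative data: 6 respondents answering 4 items on a 1-5 scale
scores = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 4, 4, 5],
    [3, 3, 3, 2],
    [4, 5, 4, 4],
    [1, 2, 2, 1],
])
print(round(cronbach_alpha(scores), 2))  # → 0.96
```

Highly inter-correlated items, as in this toy matrix, drive α toward 1; values above roughly 0.7 are the conventional threshold the abstracts in this listing refer to.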

  6. Generic methodology for calibrating profiling nacelle lidars

    DEFF Research Database (Denmark)

    Borraccino, Antoine; Courtney, Michael; Wagner, Rozenn

    is calibrated rather than a reconstructed parameter. This contribution presents a generic methodology to calibrate profiling nacelle-mounted lidars. The application of profiling lidars to wind turbine power performance and corresponding need for calibration procedures is introduced in relation to metrological...... standards. Further, two different calibration procedure concepts are described along with their strengths and weaknesses. The main steps of the generic methodology are then explained and illustrated by calibration results from two types of profiling lidars. Finally, measurement uncertainty assessment...

  7. On the genericity of spacetime singularities

    Indian Academy of Sciences (India)

    Pankaj S Joshi

    2007-07-01

    We consider here the genericity aspects of spacetime singularities that occur in cosmology and in gravitational collapse. The singularity theorems (that predict the occurrence of singularities in general relativity) allow the singularities of gravitational collapse to be either visible to external observers or covered by an event horizon of gravity. It is shown that the visible singularities that develop as final states of spherical collapse are generic. Some consequences of this fact are discussed.

  8. On Phases of Generic Toric Singularities

    CERN Document Server

    Sarkar, Tapobrata

    2007-01-01

    We systematically study the phases of generic toric singularities, using methods initiated in hep-th/0612046. These correspond to Gauged Linear Sigma Models with arbitrary charges. We show that complete information about generic $U(1)^r$ GLSMs can be obtained by studying the GLSM Lagrangian, appropriately modified in the different phases of the theory. This can be used to study the different phases of $L^{a,b,c}$ spaces and their non-supersymmetric counterparts.

  9. Aerodynamic and acoustic effects of eliminating core swirl from a full scale 1.6 stage pressure ratio fan (QF-5A)

    Science.gov (United States)

    Woodward, R. P.; Acker, L. W.; Stakolich, E. G.

    1978-01-01

    Fan QF-5A was a modification of fan QF-5 that added a core stator and adjusted the support struts to turn the core exit flow from a 30 deg swirl to the axial direction. This modification was necessary to eliminate the impingement of the swirling core flow on the axial support pylon of the NASA-Lewis Quiet Fan Facility, which had caused aerodynamic, acoustic and structural problems with the original fan stage at fan speeds greater than 85 percent of design. The redesigned fan QF-5A did attain the design bypass ratio with an increased core airflow, suggesting that the flow problem was resolved. Acoustically, the redesigned stage showed a low-frequency broadband noise reduction compared with the results for fan QF-5 at similar operating conditions.

  10. Determination and variation of core bacterial community in a two-stage full-scale anaerobic reactor treating high-strength pharmaceutical wastewater.

    Science.gov (United States)

    Ma, Haijun; Ye, Lin; Hu, Haidong; Zhang, Lulu; Ding, Lili; Ren, Hongqiang

    2017-08-25

    The functional characterization and temporal variation of the anaerobic bacterial population are important for a better understanding of the microbial processes in a two-stage anaerobic reactor. However, because of the high diversity of anaerobic bacteria, attention should first be paid to the frequently abundant bacteria, defined here as core bacteria and putatively functionally important. In this study, using Miseq sequencing technology, a core bacterial community of 98 operational taxonomic units (OTUs) was determined in a two-stage upflow blanket filter reactor treating pharmaceutical wastewater. The core bacterial community accounted for 61.66% of the total sequences and predicted the sample locations in the principal coordinates analysis (PCoA) scatter plot as accurately as the total bacterial OTUs did. The core bacterial communities in the first-stage (FS) and second-stage (SS) reactors were generally distinct: the FS core community was more related to higher-level fermentation processes, whereas the SS core community contained more microbes in syntrophic cooperation with methanogens. Moreover, the different responses of the FS and SS core communities to temperature shock and to an influent disturbance caused by solid contamination were fully investigated. Co-occurrence analysis at the order level implied that Bacteroidales, Selenomonadales, Anaerolineales, Synergistales and Thermotogales might play keystone roles in anaerobic digestion because of their high abundance and tight correlation with other microbes. These findings advance our knowledge of the core bacterial community and its temporal variability for future comparative research and improved operation of two-stage anaerobic systems.
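The "core" community above is defined as the frequently abundant OTUs. A common way to operationalize that is with relative-abundance and prevalence thresholds; a minimal sketch, where both the thresholds and the toy count table are illustrative rather than the study's actual criteria:

```python
import numpy as np

def core_otus(counts: np.ndarray, min_rel_abund=0.001, min_prevalence=0.8):
    """counts: (n_samples, n_otus) read-count table.
    Returns indices of OTUs whose relative abundance exceeds
    min_rel_abund in at least min_prevalence of the samples."""
    rel = counts / counts.sum(axis=1, keepdims=True)   # per-sample relative abundance
    prevalent = (rel > min_rel_abund).mean(axis=0)     # fraction of samples where abundant
    return np.where(prevalent >= min_prevalence)[0]

# Toy table: 3 samples x 4 OTUs
counts = np.array([
    [500, 40, 1, 0],
    [450, 60, 0, 2],
    [600, 30, 2, 1],
])
print(core_otus(counts))  # → [0 1]
```

OTUs 2 and 3 clear the abundance cutoff in only two of the three samples, so they fall outside the core; the 98-OTU core set in the study plays the analogous role for its 61.66% of sequences.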

  11. Genericness of Big Bounce in isotropic loop quantum cosmology

    OpenAIRE

    Date, Ghanashyam; Hossain, Golam Mortuza

    2004-01-01

    The absence of isotropic singularity in loop quantum cosmology can be understood in an effective classical description as the universe exhibiting a Big Bounce. We show that with scalar matter field, the big bounce is generic in the sense that it is independent of quantization ambiguities and details of scalar field dynamics. The volume of the universe at the bounce point is parametrized by a single parameter. It provides a minimum length scale which serves as a cut-off for computations of den...

  12. 42 CFR 447.506 - Authorized generic drugs.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Authorized generic drugs. 447.506 Section 447.506... (CONTINUED) MEDICAL ASSISTANCE PROGRAMS PAYMENTS FOR SERVICES Payment for Drugs § 447.506 Authorized generic drugs. (a) Authorized generic drug defined. For the purposes of this subpart, an authorized generic...

  13. The diffusion of generics after patent expiry in Germany.

    Science.gov (United States)

    Fischer, Katharina Elisabeth; Stargardt, Tom

    2016-11-01

    To identify the influences on the diffusion of generics after patent expiry, we analyzed 65 generic entries using prescription data of a large German sickness fund between 2007 and 2012 in a sales model. According to theory, several elements are responsible for technology diffusion: (1) time, reflecting the rate of adoption within the social system, (2) communication channels, and (3) the degree of incremental innovation, e.g., modifications of an existing active ingredient's strength. We investigated diffusion in two ways: (1) generic market share (the percentage of generic prescriptions among all prescriptions of a substance) and (2) generic sales quantity (the number of units sold) over time, specifying mixed regression models. Generic diffusion takes considerable time: an average generic market share of about 75% was not reached until 48 months after entry. Time since generic entry had a positive effect on generic market share (p < 0.001), and the degree of incremental innovation influenced generic market share (mostly p < 0.001) but not generic sales quantity. Market structure, e.g., the number of generic manufacturers (p < 0.001), and prices influenced both generic market share and sales. Imperfections in generic uptake through informational cascades appear to be widespread. Third-party payers could strengthen means of promoting generic diffusion to amplify the savings from generic entry.
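The study's first diffusion measure, generic market share, is simply the generic fraction of all prescriptions of a substance in each period. A minimal sketch with made-up monthly counts (illustrative numbers, not the paper's data):

```python
# Monthly prescription counts for one substance after generic entry
# (invented for illustration)
generic = {1: 150, 12: 480, 24: 640, 48: 760}
total   = {1: 1000, 12: 1000, 24: 1000, 48: 1000}

def market_share(month: int) -> float:
    """Generic market share: generic prescriptions / all prescriptions."""
    return generic[month] / total[month]

for m in sorted(generic):
    print(f"month {m:2d}: {market_share(m):.0%}")
# The toy share climbs slowly and passes 75% only at month 48,
# echoing the slow uptake (~75% at 48 months) the study reports.
```

In the paper this quantity is the dependent variable of a mixed regression model over many substances; the sketch only shows the measure itself.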

  14. Generics and the specific features of their regulation

    Directory of Open Access Journals (Sweden)

    E. A. Ushkalova

    2016-01-01

    The article discusses the factors that influence the efficacy and safety of generic drugs, including regular generics, biosimilars, and generic nonbiological complex drugs. It emphasizes the importance of adequate regulatory requirements to provide a comparable therapeutic efficacy and a comparable cost-effectiveness ratio for generics versus brand-name drugs.

  15. Gram-scale synthesis, thermal stability, magnetic properties, and microwave absorption application of extremely small Co-C core-shell nanoparticles

    Science.gov (United States)

    Kuang, Daitao; Hou, Lizhen; Yu, Bowen; Liang, Bingbing; Deng, Lianwen; Huang, Han; Ma, Songshan; He, Jun; Wang, Shiliang

    2017-07-01

    Co-C core-shell nanoparticles have been synthesized in large (gram-scale) quantity by metal-organic chemical vapor deposition with analytical cobalt (III) acetylacetonate as the precursor. Extremely small nanoparticles with an average core diameter of 3 nm and a shell thickness of 1-2 nm, and relatively large nanoparticles with an average core diameter of 23 nm and a shell thickness of 5-20 nm, were obtained, depending on the deposition region. The 3 nm Co nanocores are thermally stable up to 200 °C in air, and exhibit no visible structural or morphological changes after exposure to air at room temperature for 180 days. The extremely small core-shell nanoparticles exhibit typical superparamagnetic behavior with a small coercivity of 5 Oe, whereas the relatively large nanoparticles are a typical ferromagnetic material with a high coercivity of 584 Oe. In microwave absorption tests, a reflection loss (RL) as low as -80.3 dB and a large effective bandwidth (the frequency range for RL ≤ -10 dB) of 10.1 GHz are obtained in nanoparticle-paraffin composites with appropriate layer thicknesses and particle contents. This suggests that the as-synthesized Co-C core-shell nanoparticles have high potential as microwave-absorbing materials.
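The reflection loss quoted above is conventionally computed from transmission-line theory for a metal-backed single-layer absorber. The sketch below implements that standard formula; the permittivity and permeability values are illustrative placeholders, not measured parameters of the Co-C composites:

```python
import cmath
import math

def reflection_loss(eps_r: complex, mu_r: complex, f_hz: float, d_m: float) -> float:
    """RL (dB) of a metal-backed single layer with relative permittivity
    eps_r, relative permeability mu_r, frequency f_hz, and thickness d_m."""
    c = 299_792_458.0
    # Normalized input impedance of the absorber layer (transmission-line model)
    z_in = cmath.sqrt(mu_r / eps_r) * cmath.tanh(
        1j * 2 * math.pi * f_hz * d_m / c * cmath.sqrt(mu_r * eps_r))
    # RL = 20 log10 |(Z_in - Z_0)/(Z_in + Z_0)| with Z_0 normalized to 1
    return 20 * math.log10(abs((z_in - 1) / (z_in + 1)))

# Zero thickness: everything reflects off the metal backing, so RL = 0 dB
print(reflection_loss(10 - 2j, 1 - 0.5j, 10e9, 0.0))   # → 0.0
# A 2 mm lossy layer absorbs part of the wave, so RL goes negative
print(reflection_loss(10 - 2j, 1 - 0.5j, 10e9, 0.002) < 0)  # → True
```

More negative RL means stronger absorption; the "effective bandwidth" in the abstract is the frequency span over which this quantity stays at or below -10 dB (≥ 90% absorption).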

  16. The application of the Midwife Core Competency Scale among Beijing midwives

    Institute of Scientific and Technical Information of China (English)

    张贤; 陆虹; 王德慧

    2013-01-01

    Objective To evaluate the reliability and validity of the Midwife Core Competency Scale. Methods A total of 369 midwives from 32 hospitals in Beijing were surveyed using the Midwife Core Competency Scale, and its reliability and validity were assessed. Results The scale had good internal consistency (Cronbach's α = 0.950) and stability (test-retest reliability 0.832). The distribution of the 54 items across the 6 extracted common factors (cumulative variance = 68.405%) was consistent with the original scale, yielding acceptable construct validity. The scale also discriminated well between midwives of different characteristics. Conclusions The Midwife Core Competency Scale has good reliability and validity and provides a useful tool for evaluating midwives' core competency. Larger samples from different areas should be considered for further scale improvement and wider application.

  17. FDA Critical Path Initiatives: Opportunities for Generic Drug Development

    OpenAIRE

    Lionberger, Robert A.

    2008-01-01

    FDA’s critical path initiative documents have focused on the challenges involved in the development of new drugs. Some of the focus areas identified apply equally to the production of generic drugs. However, there are scientific challenges unique to the development of generic drugs as well. In May 2007, FDA released a document “Critical Path Opportunities for Generic Drugs” that identified some of the specific challenges in the development of generic drugs. The key steps in generic product de...

  18. Effects of generic versus non-generic feedback on motor learning in children.

    Directory of Open Access Journals (Sweden)

    Suzete Chiviacowsky

    Non-generic feedback refers to a specific event and implies that performance is malleable, while generic feedback implies that task performance reflects an inherent ability. The present study examined the influences of generic versus non-generic feedback on motor performance and learning in 10-year-old children. In the first experiment, using soccer ball kicking at a target as the task, providing participants with generic feedback resulted in worse performance than providing non-generic feedback after both groups received negative feedback. The second experiment measured more permanent effects. Results of a retention test, performed one day after practicing a throwing task, showed that participants who received non-generic feedback during practice outperformed the generic feedback group after receiving a negative feedback statement. The findings demonstrate the importance of the wording of feedback. Even though different positive feedback statements may not have an immediate influence on performance, they can affect performance, and presumably individuals' motivation, when performance is (purportedly) poor. Feedback implying that performance is malleable, rather than due to an inherent ability, seems to have the potential to inoculate learners against setbacks, a situation frequently encountered in the context of motor performance and learning.

  19. [Analysis of generic drug supply in France].

    Science.gov (United States)

    Taboulet, F; Haramburu, F; Latry, Ph

    2003-09-01

    The list of generic medicines (LGM), published since 1997 by the Agence Française de Sécurité Sanitaire des Produits de Santé (AFSSaPS), the French medicines agency, covers a particular subset of the medicines reimbursed by the National Health Insurance (Social Security). The objectives of the present study were: i) to describe the components of this list, based on pharmaceutical, economic and therapeutic characteristics; ii) to study differences between generic and reference products (formulations, excipients, prices, etc.); iii) to analyze the information on excipients provided to health care professionals. The 21st version of the LGM (April 2001) was used. Therapeutic value was retrieved from the 2001 AFSSaPS report on the therapeutic value of 4490 reimbursed medicines. Information on excipients in the LGM and in the Vidal dictionary (the reference prescription book in France) was compared. The products included in the LGM represent 20% of all reimbursed medicines. The mean price differences between generics and their reference products vary between 30 and 50% for more than two thirds of the generic groups. The therapeutic value of the products of the LGM was judged important in 71% of cases (vs 63% for the 4409 assessed medicines) and insufficient in 13% of cases (vs 19%). Information on excipients is often missing and sometimes erroneous. Although the LGM is regularly revised and the generic market thus in perpetual change, this 2001 cross-sectional description of the market provides much information and raises some concerns.

  20. A lifeline to treatment: the role of Indian generic manufacturers in supplying antiretroviral medicines to developing countries.

    Science.gov (United States)

    Waning, Brenda; Diedrichsen, Ellen; Moon, Suerie

    2010-09-14

    Indian manufacturers of generic antiretroviral (ARV) medicines facilitated the rapid scale-up of HIV/AIDS treatment in developing countries through provision of low-priced, quality-assured medicines. The legal framework in India that facilitated such production, however, is changing with implementation of the World Trade Organization Agreement on Trade-Related Aspects of Intellectual Property Rights, and with intellectual property measures being discussed in regional and bilateral free trade agreement negotiations. Reliable quantitative estimates of the Indian role in global generic ARV supply are needed to understand the potential impacts of such measures on HIV/AIDS treatment in developing countries. We utilized transactional data containing 17,646 donor-funded purchases of ARV tablets made by 115 low- and middle-income countries from 2003 to 2008 to measure market share, purchase trends and prices of Indian-produced generic ARVs compared with those of non-Indian generic and brand ARVs. Indian generic manufacturers dominate the ARV market, accounting for more than 80% of annual purchase volumes. Among paediatric ARV and adult nucleoside and non-nucleoside reverse transcriptase inhibitor markets, Indian-produced generics accounted for 91% and 89% of 2008 global purchase volumes, respectively. From 2003 to 2008, the number of Indian generic manufacturers supplying ARVs increased from four to 10, while the number of Indian-manufactured generic products increased from 14 to 53. Ninety-six of 100 countries purchased Indian generic ARVs in 2008, including high HIV-burden sub-Saharan African countries. Indian-produced generic ARVs used in first-line regimens were consistently and considerably less expensive than non-Indian generic and innovator ARVs. Key ARVs newly recommended by the World Health Organization are three to four times more expensive than older regimens. Indian generic producers supply the majority of ARVs in developing countries. Future scale up using newly

  1. A lifeline to treatment: the role of Indian generic manufacturers in supplying antiretroviral medicines to developing countries

    Directory of Open Access Journals (Sweden)

    Waning Brenda

    2010-09-01

    Abstract Background Indian manufacturers of generic antiretroviral (ARV) medicines facilitated the rapid scale-up of HIV/AIDS treatment in developing countries through provision of low-priced, quality-assured medicines. The legal framework in India that facilitated such production, however, is changing with implementation of the World Trade Organization Agreement on Trade-Related Aspects of Intellectual Property Rights, and with intellectual property measures being discussed in regional and bilateral free trade agreement negotiations. Reliable quantitative estimates of the Indian role in global generic ARV supply are needed to understand the potential impacts of such measures on HIV/AIDS treatment in developing countries. Methods We utilized transactional data containing 17,646 donor-funded purchases of ARV tablets made by 115 low- and middle-income countries from 2003 to 2008 to measure market share, purchase trends and prices of Indian-produced generic ARVs compared with those of non-Indian generic and brand ARVs. Results Indian generic manufacturers dominate the ARV market, accounting for more than 80% of annual purchase volumes. Among paediatric ARV and adult nucleoside and non-nucleoside reverse transcriptase inhibitor markets, Indian-produced generics accounted for 91% and 89% of 2008 global purchase volumes, respectively. From 2003 to 2008, the number of Indian generic manufacturers supplying ARVs increased from four to 10, while the number of Indian-manufactured generic products increased from 14 to 53. Ninety-six of 100 countries purchased Indian generic ARVs in 2008, including high HIV-burden sub-Saharan African countries. Indian-produced generic ARVs used in first-line regimens were consistently and considerably less expensive than non-Indian generic and innovator ARVs. Key ARVs newly recommended by the World Health Organization are three to four times more expensive than older regimens. 
Conclusions Indian generic producers supply the majority of

  2. Generic scaling relation in the scalar $\phi^{4}$ model

    CERN Document Server

    Derkachov, S E

    1996-01-01

    The results of an analysis of the one-loop spectrum of anomalous dimensions of composite operators in the scalar $\phi^{4}$ model are presented. We give a rigorous constructive proof of the hypothesis on the hierarchical structure of the spectrum of anomalous dimensions: the naive sum of any two anomalous dimensions generates a limit point in the spectrum. Arguments in favor of the nonperturbative character of this result, and possible ways of generalizing it to other field theories, are briefly discussed.

  3. Transformer core

    NARCIS (Netherlands)

    Mehendale, A.; Hagedoorn, Wouter; Lötters, Joost Conrad

    2010-01-01

    A transformer core includes a stack of a plurality of planar core plates of a magnetically permeable material, which plates each consist of a first and a second sub-part that together enclose at least one opening. The sub-parts can be fitted together via contact faces that are located on either side

  4. Transformer core

    NARCIS (Netherlands)

    Mehendale, A.; Hagedoorn, Wouter; Lötters, Joost Conrad

    2008-01-01

    A transformer core includes a stack of a plurality of planar core plates of a magnetically permeable material, which plates each consist of a first and a second sub-part that together enclose at least one opening. The sub-parts can be fitted together via contact faces that are located on either side

  5. Application of Core Dynamics Modeling to Core-Mantle Interactions

    Science.gov (United States)

    Kuang, Weijia

    2003-01-01

    Observations have demonstrated that length-of-day (LOD) variation on decadal time scales results from the exchange of axial angular momentum between the solid mantle and the core. There are in general four core-mantle interaction mechanisms that couple the core and the mantle. Of these, three have been suggested as likely dominant for the decadal core-mantle angular momentum exchange: gravitational core-mantle coupling arising from density anomalies in the mantle and in the core (including the inner core), electromagnetic coupling arising from the Lorentz force in the electrically conducting lower mantle (e.g. the D''-layer), and topographic coupling arising from non-hydrostatic pressure acting on the core-mantle boundary (CMB) topography. In the past decades, most effort has gone into estimating the coupling torques from surface geomagnetic observations (a kinematic approach), which has provided insights into core dynamical processes. At the same time, it has also raised questions and concerns about approximations in those studies that may invalidate the corresponding conclusions. The most serious problem is perhaps the use of approximations that are inconsistent with dynamical processes in the core, such as inconsistencies between the core surface flow beneath the CMB and the CMB topography, and between the D''-layer electrical conductivity and the approximations of the toroidal field at the CMB. These inconsistencies can only be addressed with numerical core dynamics modeling. In the past few years, we applied our MoSST (Modular, Scalable, Self-consistent and Three-dimensional) core dynamics model to study core-mantle interactions together with geodynamo simulation, aiming at assessing the effect of the dynamical inconsistencies in the kinematic studies on core-mantle coupling torques. We focus on topographic and electromagnetic core-mantle couplings and find that, for the topographic coupling, the consistency between the core flow and the CMB topography is

  6. An Internet of Things Generic Reference Architecture

    DEFF Research Database (Denmark)

    Bhalerao, Dipashree M.; Riaz, Tahir; Madsen, Ole Brun

    2013-01-01

    , and keeping track of all these things for monitoring and controlling some information. IoT architecture is studied from software architecture, overall system architecture and network architecture point of view. Paper puts forward the requirements of software architecture along with, its component...... and deployment diagram, process and interface diagram at abstract level. Paper proposes the abstract generic IoT reference and concrete abstract generic IoT reference architectures. Network architecture is also put up as a state of the art. Paper shortly gives overviews of protocols used for IoT. Some...

  7. Generic Rigidity Matroids with Dilworth Truncations

    CERN Document Server

    Tanigawa, Shin-ichi

    2010-01-01

    We prove that the linear matroid that defines generic rigidity of $d$-dimensional body-rod-bar frameworks (i.e., structures consisting of disjoint bodies and rods mutually linked by bars) can be obtained from the union of ${d+1 \choose 2}$ graphic matroids by applying variants of Dilworth truncation $n_r$ times, where $n_r$ denotes the number of rods. This leads to an alternative proof of Tay's combinatorial characterizations of generic rigidity of rod-bar frameworks and that of identified body-hinge frameworks.

  8. The SENSEI Generic In Situ Interface

    Energy Technology Data Exchange (ETDEWEB)

    Ayachit, Utkarsh [Kitware, Inc., Clifton Park, NY (United States); Whitlock, Brad [Intelligent Light, Rutherford, NJ (United States); Wolf, Matthew [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Loring, Burlen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Lonie, David [Kitware, Inc., Clifton Park, NY (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-04-11

    The SENSEI generic in situ interface is an API that promotes code portability and reusability. From the simulation view, a developer can instrument their code with the SENSEI API and then make use of any number of in situ infrastructures. From the method view, a developer can write an in situ method using the SENSEI API and then expect it to run in any number of in situ infrastructures, or be invoked directly from a simulation code, with little or no modification. This paper presents the design principles underlying the SENSEI generic interface, along with some simplified coding examples.

  9. Associations between generic substitution and patients' attitudes, beliefs and experience

    DEFF Research Database (Denmark)

    Østergaard Rathe, Jette; Larsen, Pia Veldt; Andersen, Morten

    2013-01-01

    Abstract Background Generic substitution has been implemented in many countries, but knowledge about patients’ attitudes, beliefs and experiences is still sparse. Aim To assess associations between generic switching and patients’ attitudes, beliefs and experiences with previous generic switching...... on generic medicine and confidence in the healthcare system. Only prescriptions issued by the general practitioners were included. For each patient we focused on one purchase of a generically substitutable drug (index drug). Patients were identified by means of a dispensing database. Results Earlier generic...... switches within the index ATC code were statistically significantly associated with experience of a generic switch (adjusted OR 5.93, 95% CI 4.70 to 7.49). Having had more than 5 earlier switches within other ATC codes and having negative views on generic medicines reduced the odds of experiencing a generic...

  10. 78 FR 23743 - Proposed Information Collection; Comment Request; Generic Clearance for Questionnaire Pretesting...

    Science.gov (United States)

    2013-04-22

    ... Questionnaire Pretesting Research AGENCY: Census Bureau, Commerce. ACTION: Notice. SUMMARY: The Department of... of small-scale questionnaire pretesting activities under this generic clearance. A block of hours... research program will be used by the Census Bureau and survey sponsors to improve questionnaires...

  11. Microsatellite diversity and broad scale geographic structure in a model legume: building a set of nested core collection for studying naturally occurring variation in Medicago truncatula

    DEFF Research Database (Denmark)

    Ronfort, Joelle; Bataillon, Thomas; Santoni, Sylvain

    2006-01-01

    Conclusion The inferred stratification is discussed considering potential historical events like expansion, refuge history and admixture between neighbouring groups. Information on the allelic richness and the inferred population structure are used to build a nested core-collection. The set...... Background Exploiting genetic diversity requires previous knowledge of the extent and structure of the variation occurring in a species. Such knowledge can in turn be used to build a core-collection, i.e. a subset of accessions that aims at representing the genetic diversity of this species with a minimum of repetitiveness. We investigate the patterns of genetic diversity and population structure in a collection of 346 inbred lines representing the breadth of naturally occurring diversity in the Legume plant model Medicago truncatula using 13
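Core collections of the kind described (maximum diversity, minimum repetitiveness) are often assembled greedily, at each step adding the accession that contributes the most alleles not yet covered by the collection. A toy sketch of that strategy, with invented accessions and alleles (not the M. truncatula data):

```python
def build_core(accessions: dict, size: int) -> list:
    """Greedy core collection: repeatedly pick the accession that adds
    the most alleles not yet covered by the core."""
    core, covered = [], set()
    while len(core) < size:
        best = max((a for a in accessions if a not in core),
                   key=lambda a: len(accessions[a] - covered))
        core.append(best)
        covered |= accessions[best]
    return core

# Toy marker data: accession -> set of alleles observed at SSR loci
accessions = {
    "A": {"L1.1", "L2.1", "L3.1"},
    "B": {"L1.1", "L2.2"},
    "C": {"L1.2", "L2.1", "L3.2", "L4.1"},
    "D": {"L4.1"},
}
print(build_core(accessions, 2))  # → ['C', 'A']
```

"C" is chosen first because it carries the most alleles; "A" follows because it adds the most alleles that "C" lacks. A nested design, as in the study, simply records the order of addition so that any prefix of the list is itself a core collection.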

  12. Silicon nanotube field effect transistor with core-shell gate stacks for enhanced high-performance operation and area scaling benefits

    KAUST Repository

    Fahad, Hossain M.

    2011-10-12

    We introduce the concept of a silicon nanotube field effect transistor whose unique core-shell gate stacks help achieve full volume inversion by producing a surge in minority carrier concentration in the near vicinity of the ultrathin channel and, at the same time, rapid roll-off at the source and drain junctions, yielding velocity-saturation-induced higher drive current and enhanced performance per device with efficient real-estate consumption. The core-shell gate stacks also provide superior short-channel-effect control compared with the classical planar metal oxide semiconductor field effect transistor (MOSFET) and the gate-all-around nanowire FET. The proposed device offers true potential as an ideal blend for quantum ballistic transport study and device property control by a bottom-up approach, with high-density integration compatibility using a top-down state-of-the-art complementary metal oxide semiconductor flow. © 2011 American Chemical Society.

  13. Should Physicians be Encouraged to use Generic Names and to Prescribe Generic Drugs?

    Science.gov (United States)

    Riaz, Haris; Krasuski, Richard A

    2016-06-01

    While using brand names may seem like a trivial issue at the outset, it is inherently problematic. Cardiovascular drugs remain the drugs most commonly prescribed by physicians. Junior doctors are likely to internalize the practices of their seniors and to reproduce what they learn from their preceptors. Using generic names may be one way to facilitate prescription of generic drugs, which have a better cost profile and efficacy similar to that of the more expensive branded drugs. In this editorial, we outline several arguments for the importance of using generic names in academic discussions and clinical documentation.

  14. The importance of being first: evidence from Canadian generic pharmaceuticals.

    Science.gov (United States)

    Hollis, Aidan

    2002-12-01

    This paper uses pooled cross-section data on Canadian ethical drug sales to examine the effect of entry timing on sales of generic drugs. The data is for all drugs for which the first generic competitor entered during the years 1994-1997. It is found that the first generic entrant has a lasting competitive advantage: being first into the market appears to lead to an increase of around 30% in market share (among generics) over a period of at least 4 years. This finding has considerable implications for the current policy of allowing brandname drug companies to issue pseudo-generic equivalents as a preemptive strike against true generic competitors.

  15. Ice Cores

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Records of past temperature, precipitation, atmospheric trace gases, and other aspects of climate and environment derived from ice cores drilled on glaciers and ice...

  16. Core BPEL

    DEFF Research Database (Denmark)

    Hallwyl, Tim; Højsgaard, Espen

    extensions. Combined with the fact that the language definition does not provide a formal semantics, it is an arduous task to work formally with the language (e.g. to give an implementation). In this paper we identify a core subset of the language, called Core BPEL, which has fewer and simpler constructs, does not allow omissions, and does not contain ignorable elements. We do so by identifying syntactic sugar, including default values, and ignorable elements in WS-BPEL. The analysis results in a translation from the full language to the core subset. Thus, we reduce the effort needed for working formally with WS-BPEL, as one, without loss of generality, need only consider the much simpler Core BPEL. This report may also be viewed as an addendum to the WS-BPEL standard specification, which clarifies the WS-BPEL syntax and presents the essential elements of the language in a more concise way...
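
    The kind of desugaring such a translation performs can be sketched in miniature: making omitted defaults explicit so the core language need not handle omissions. A minimal illustration in Python over ElementTree, with a toy schema; the attribute names and default values here are assumptions for illustration, not the actual WS-BPEL schema or the authors' translation:

```python
# Sketch of "default-value" desugaring: every attribute the source omitted is
# written out explicitly, so a core-language consumer never sees omissions.
# Toy element/attribute names, not the real WS-BPEL schema.

import xml.etree.ElementTree as ET

DEFAULTS = {"suppressJoinFailure": "no", "exitOnStandardFault": "no"}  # assumed defaults

def desugar(elem):
    """Recursively insert default attribute values wherever they were omitted."""
    for attr, value in DEFAULTS.items():
        elem.attrib.setdefault(attr, value)   # keep explicit values untouched
    for child in elem:
        desugar(child)
    return elem

src = ET.fromstring('<process><sequence suppressJoinFailure="yes"/></process>')
desugar(src)
print(ET.tostring(src, encoding="unicode"))
```

    After desugaring, the explicit `suppressJoinFailure="yes"` on `sequence` survives, while the omitted attributes on both elements are filled with their defaults.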

  18. Core benefits

    National Research Council Canada - National Science Library

    Keith, Brian W

    2010-01-01

    This SPEC Kit explores the core employment benefits of retirement and of life, health, and other insurance: benefits that are typically decided by the parent institution and often have significant governmental regulation...

  19. Physics of Core-Collapse Supernovae in Three Dimensions: a Sneak Preview

    CERN Document Server

    Janka, H -Thomas; Summa, Alexander

    2016-01-01

    Nonspherical mass motions are a generic feature of core-collapse supernovae, and hydrodynamic instabilities play a crucial role for the explosion mechanism. First successful neutrino-driven explosions could be obtained with self-consistent, first-principle simulations in three spatial dimensions (3D). But 3D models tend to be less prone to explosion than corresponding axisymmetric (2D) ones. This has been explained by 3D turbulence leading to energy cascading from large to small spatial scales, inversely to the 2D case, thus disfavoring the growth of buoyant plumes on the largest scales. Unless the inertia to explode simply reflects a lack of sufficient resolution in relevant regions, it suggests that some important aspect may still be missing for robust and sufficiently energetic neutrino-powered explosions. Such deficits could be associated with progenitor properties like rotation, magnetic fields or pre-collapse perturbations, or with microphysics that could lead to an enhancement of neutrino heating behin...

  20. Generic tacrolimus in solid organ transplantation

    DEFF Research Database (Denmark)

    Taube, D; Jones, G; O'Beirne, J

    2014-01-01

    The availability of a wide range of immunosuppressive therapies has revolutionized the management of patients who have undergone solid organ transplantation (SOT). However, the cost of immunosuppressive drugs remains high. This situation has led to the development of generic equivalents, which...

  1. Green's Conjecture for the generic canonical curve

    OpenAIRE

    Teixidor-I-Bigas, Montserrat

    1998-01-01

    Green's Conjecture states the following : syzygies of the canonical model of a curve are simple up to the p^th stage if and only if the Clifford index of C is greater than p. We prove that the generic curve of genus g satisfies Green's conjecture.

  2. First-class rules and generic traversal

    NARCIS (Netherlands)

    Dolstra, E.; Visser, Eelco

    2002-01-01

    In this paper we present a functional language supporting first-class rules and generic traversal. This is achieved by generalizing the pattern matching constructs of standard functional languages. The case construct, which ties rules together and prevents their reuse, is replaced by separate, first-class rules...
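
    The combination of first-class rewrite rules with a generic traversal can be sketched as follows. This is a minimal Python analogue with names of my own choosing, not the language presented in the paper: a "rule" is just a function that returns a rewritten term, or None when it does not match, and a traversal combinator applies it at every node.

```python
# Minimal sketch: first-class rules + generic bottom-up traversal.
# A rule is an ordinary function term -> term-or-None, so rules can be
# passed around and reused independently of any case construct.

def bottomup(rule, term):
    """Apply `rule` at every node of `term`, children first."""
    if isinstance(term, tuple):                  # interior node: (operator, *children)
        op, *kids = term
        term = (op, *(bottomup(rule, k) for k in kids))
    result = rule(term)
    return term if result is None else result

def fold_add(term):
    """A first-class rule: constant-fold additions of integer leaves."""
    if isinstance(term, tuple) and term[0] == "add":
        _, a, b = term
        if isinstance(a, int) and isinstance(b, int):
            return a + b
    return None                                  # rule does not apply here

expr = ("add", ("add", 1, 2), 4)
print(bottomup(fold_add, expr))                  # -> 7
```

    Because `fold_add` is a plain value, it can be reused with other traversal strategies (top-down, innermost, ...) without rewriting the rule itself, which is the point of decoupling rules from the traversal.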

  3. On generic representation of implicit induction procedures

    NARCIS (Netherlands)

    Naidich, D.

    1996-01-01

    We develop a generic representation of implicit induction proof procedures within the cover set induction framework. Our work further develops the approach of cover set induction on propositional orderings. We show that in order to represent a substantially wide range of implicit induction procedures...

  4. Crystallization Kinetics within a Generic Modelling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist

    2013-01-01

    An existing generic modelling framework has been expanded with tools for kinetic model analysis. The analysis of kinetics is carried out within the framework, where kinetic constitutive models are collected, analysed and utilized for the simulation of crystallization operations. A modelling procedure is proposed to obtain information from kinetic model analysis of crystallization operations and to utilize it for faster evaluation of crystallization operations.

  5. Baldrige Theory into Practice: A Generic Model

    Science.gov (United States)

    Arif, Mohammed

    2007-01-01

    Purpose: The education system globally has moved from a push-based or producer-centric system to a pull-based or customer centric system. Malcolm Baldrige Quality Award (MBQA) model happens to be one of the latest additions to the pull based models. The purpose of this paper is to develop a generic framework for MBQA that can be used by…

  6. Maratti’s generic names for fungi

    NARCIS (Netherlands)

    Donk, M.A.

    1975-01-01

    The generic names for fungi used by Maratti in his ‘Flora romana’ must be accepted as validly published. Notes are given on the validly re-published names. Of these, Agaricum and Coralloides may cause some difficulties. Conservation of Fomes (Fr.) Fr. against Agaricum [Mich.] Maratti is proposed. To...

  7. On the Center of Generic Hecke Algebra

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The concepts of norm and cellular algebra are introduced, and the cellular basis is used to replace the Kazhdan-Lusztig basis. Thus a new basis for the center of the generic Hecke algebra associated with a finite Coxeter group is found. The new basis is described using the notion of cell datum of Graham and Lehrer and the notion of norm.

  9. Quality of generic medicines in South Africa

    DEFF Research Database (Denmark)

    Patel, Aarti; Gauld, Robin; Norris, Pauline;

    2012-01-01

    must be addressed to ensure that people use them with confidence. Campaigns to increase the uptake of generic medicines by consumers and providers of healthcare need to be informed by local norms and practices. This study sought to compare South African consumers' and healthcare providers' perceptions...

  10. Intermediates and Generic Convergence to Equilibria

    DEFF Research Database (Denmark)

    Freitas, Michael Marcondes de; Wiuf, Carsten; Feliu, Elisenda

    2016-01-01

    Known graphical conditions for the generic or global convergence to equilibria of the dynamical system arising from a reaction network are shown to be invariant under the so-called successive removal of intermediates, a systematic procedure to simplify the network, making the graphical conditions...

  11. Core Competence Development : paradigm and practical implementations

    OpenAIRE

    Koay, Ze Wei; E.Markov, Denis

    2011-01-01

    The theory of core competence has drawn a large amount of attention in the academic field as well as of practitioners in the corporate world. Theory asserts that long-term value creation and competitiveness of the corporation relies on full-scale exploitation and timely development of company Core Competences; business strategies should be built around the core competencies of a firm. Identification and exploitation of Core Competences as well as essential elements comprising Core Competences...

  12. Special Taxing Districts, Various taxing areas (CORE, TIF, Renaissance), updated as boundaires change, Published in 2011, 1:1200 (1in=100ft) scale, City of Bismarck.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Special Taxing Districts dataset, published at 1:1200 (1in=100ft) scale, was produced all or in part from Hardcopy Maps information as of 2011. It is described...

  13. Psychiatrists' decision making between branded and generic drugs.

    Science.gov (United States)

    Hamann, Johannes; Mendel, Rosmarie; Kissling, Werner; Leucht, Stefan

    2013-07-01

    To study psychiatrists' decision making between generic and branded antipsychotics or antidepressants, a hypothetical decision scenario involving choices between branded and generic drugs was presented to a sample of German psychiatrists. Factors influencing this decision were identified using a regression analysis. In total, n = 410 psychiatrists participated in the survey. Psychiatrists were more likely to choose branded drugs when imagining choosing the drug for themselves (vs. recommending a drug to a patient). In addition, psychiatrists were more likely to choose generic antidepressants than generic antipsychotics. Additional predictors for choosing a generic drug were a higher share of outpatients, less negative attitudes toward generics and higher uncertainty tolerance. In conclusion, psychiatrists' decision making in choosing between branded or generic antidepressants or antipsychotics is to a large extent influenced by vague attitudes towards properties of generics and branded drugs, as well as by "non-evidence based" factors such as uncertainty tolerance.

  14. Effect of curriculum changes to enhance generic skills proficiency of ...

    African Journals Online (AJOL)

    Effect of curriculum changes to enhance generic skills proficiency of 1st-year ... Feedback from these different evaluation methods identified specific needs in the ... positive effect on students' self-reported acquisition of generic learning skills.

  15. Generic Switching and Non-Persistence among Medicine Users

    DEFF Research Database (Denmark)

    Østergaard Rathe, Jette; Andersen, Morten; Jarbøl, Dorte Ejg;

    2015-01-01

    BACKGROUND: Generic substitution means that one medicinal product is replaced by another product containing the same active substance. It is strictly regulated with respect to its bioequivalence, and all products must have undergone appropriate studies. Although generic substitution is widely...

  16. Acoustic Source Localization via Time Difference of Arrival Estimation for Distributed Sensor Networks using Tera-scale Optical-Core Devices

    Energy Technology Data Exchange (ETDEWEB)

    Imam, Neena [ORNL; Barhen, Jacob [ORNL

    2009-01-01

    For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. These sensors rely heavily on battery-operated system components to achieve highly functional automation in signal and information processing. In order to keep communication requirements minimal, it is desirable to perform as much processing on the receiver platforms as possible. However, the complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be readily met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building block algorithms used in underwater source localization on the optical-core digital processing platform recently introduced by Lenslet Inc. This demonstration of considerably faster signal processing capability should be of substantial significance to the design and innovation of future generations of distributed sensor networks.
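
    The key building block named here, time-difference-of-arrival (TDOA) estimation, is conventionally computed by cross-correlating the signals received at two sensors. A minimal sketch on synthetic signals follows; the sample rate, pulse shape, and delay are illustrative assumptions, and this is plain NumPy rather than the optical-core implementation discussed in the abstract:

```python
# Sketch: estimate the TDOA between two sensors as the lag that maximizes
# their cross-correlation. Synthetic signals; not the optical-core pipeline.

import numpy as np

def tdoa_samples(x, y):
    """Return the delay (in samples) of y relative to x via cross-correlation."""
    corr = np.correlate(y, x, mode="full")       # all lags of y against x
    lags = np.arange(-(len(x) - 1), len(y))      # lag axis for mode="full"
    return lags[np.argmax(corr)]

fs = 8000                                        # assumed sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)
pulse = np.exp(-((t - 0.02) ** 2) / 1e-5)        # synthetic acoustic pulse
delay = 37                                       # true inter-sensor delay, samples
x = pulse
y = np.roll(pulse, delay)                        # second sensor hears it later
print(tdoa_samples(x, y))                        # -> 37
```

    In a real array, the estimated lag divided by the sample rate gives the arrival-time difference, and several pairwise TDOAs are then combined (e.g. by hyperbolic positioning) to localize the source; it is this correlation step, repeated across many sensor pairs, that dominates the computational cost the abstract targets.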

  17. Scaling up Kernel Grower Clustering Method for Large Data Sets via Core-sets%基于核集合的大数据快速Kernel Grower聚类方法

    Institute of Scientific and Technical Information of China (English)

    常亮; 邓小明; 郑碎武; 王永庆

    2008-01-01

    Kernel grower is a novel kernel clustering method proposed recently by Camastra and Verri. It shows good performance for various data sets and compares favorably with popular clustering algorithms. However, the main drawback of the method is its weak scaling ability in dealing with large data sets, which greatly restricts its application. In this paper, we propose a scaled-up kernel grower method using core-sets, which is significantly faster than the original method for large data clustering. Moreover, it can deal with very large data sets. Numerical experiments on benchmark data sets as well as synthetic data sets show the efficiency of the proposed method. The method is also applied to real image segmentation to illustrate its performance.
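
    The core-set machinery behind such speed-ups can be illustrated with the classical Badoiu-Clarkson iteration for the minimum enclosing ball (MEB): a (1+eps)-approximate ball is determined by only O(1/eps^2) points, independent of the data set size. This is a Euclidean sketch under that assumption, not the authors' kernelized algorithm:

```python
# Sketch of the Badoiu-Clarkson core-set iteration for the minimum enclosing
# ball: repeatedly pull the center toward the current farthest point. The
# points touched form a small "core-set" whose size depends only on eps.

import numpy as np

def meb_coreset(points, eps=0.01):
    """Approximate the MEB; returns (center, radius, core-set indices)."""
    c = points[0].copy()
    coreset = [0]
    for i in range(1, int(np.ceil(1.0 / eps ** 2))):
        d = np.linalg.norm(points - c, axis=1)
        far = int(np.argmax(d))               # farthest point joins the core-set
        coreset.append(far)
        c += (points[far] - c) / (i + 1)      # shrinking step toward it
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius, sorted(set(coreset))

rng = np.random.default_rng(0)
pts = rng.normal(size=(100_000, 2))           # 100k points in the plane
c, r, cs = meb_coreset(pts, eps=0.05)
print(len(cs))                                # far fewer points than 100k
```

    Each iteration scans the data once, but the number of iterations is fixed by `eps`, so the ball (and, in the kernelized setting, each cluster) is summarized by a handful of points rather than the whole data set.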

  18. Generic and product-specific health claim processes for functional foods across global jurisdictions.

    Science.gov (United States)

    Jew, Stephanie; Vanstone, Catherine A; Antoine, Jean-Michel; Jones, Peter J H

    2008-06-01

    Worldwide consumer interest in functional foods and their potential health benefits has been increasing over the past 10 y. To respond to this interest, regulatory bodies have developed guidelines for assessing health claims on functional foods. The objective of this article is to investigate the type and amount of evidence needed in various jurisdictions on a worldwide basis to substantiate both generic and product-specific health claims. Two types of health claims were examined using separate case studies. Analysis of generic health claims was highlighted by (n-3) fatty acids and their relation to heart health; whereas examination of product-specific health claims was conducted using probiotics and their association with gastrointestinal well-being. Results showed a common core for use of convincing high-quality human data, especially in the form of randomized controlled trials (RCT), but there was significant variability in the type and amount of scientific evidence needed to substantiate health claims, both generic and product specific, across different jurisdictions. Product-specific claims tended to use human RCT as the main basis for claims, whereas generic claims tended to base their statements on a wider spectrum of literature.

  19. 40 CFR 721.9973 - Zirconium dichlorides (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Zirconium dichlorides (generic). 721... Substances § 721.9973 Zirconium dichlorides (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substances identified generically as zirconium dichlorides (PMNs...

  20. 40 CFR 721.3080 - Substituted phosphate ester (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Substituted phosphate ester (generic... Substances § 721.3080 Substituted phosphate ester (generic). (a) Chemical substances and significant new uses subject to reporting. (1) The chemical substance identified generically as a substituted phosphate...

  1. 40 CFR 721.3110 - Polycarboxylic acid ester (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Polycarboxylic acid ester (generic... Substances § 721.3110 Polycarboxylic acid ester (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as a polycarboxylic acid...

  2. 40 CFR 721.8660 - Propionic acid methyl ester (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Propionic acid methyl ester (generic... Substances § 721.8660 Propionic acid methyl ester (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as a propionic acid methyl...

  3. 40 CFR 721.2155 - Alkoxyamino-alkyl-coumarin (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Alkoxyamino-alkyl-coumarin (generic... Substances § 721.2155 Alkoxyamino-alkyl-coumarin (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as...

  4. 40 CFR 721.535 - Halogenated alkane (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Halogenated alkane (generic). 721.535... Substances § 721.535 Halogenated alkane (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as halogenated alkane (PMN P-01-433) is...

  5. 40 CFR 721.10163 - Chloro fluoro alkane (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Chloro fluoro alkane (generic). 721... Substances § 721.10163 Chloro fluoro alkane (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as chloro fluoro alkane (PMN...

  6. 40 CFR 721.555 - Alkyl amino nitriles (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Alkyl amino nitriles (generic). 721... Substances § 721.555 Alkyl amino nitriles (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substances identified generically as alkyl amino nitriles (PMNs P-96...

  7. 40 CFR 721.5350 - Substituted nitrile (generic name).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Substituted nitrile (generic name... Substances § 721.5350 Substituted nitrile (generic name). (a) Chemical substances and significant new uses subject to reporting. (1) The chemical substance identified generically as a substituted nitrile (PMN P-83...

  8. 77 FR 60125 - Generic Drug Facilities, Sites and Organizations

    Science.gov (United States)

    2012-10-02

    ... HUMAN SERVICES Food and Drug Administration Generic Drug Facilities, Sites and Organizations AGENCY... Administration (FDA) is notifying generic drug facilities, and certain sites and organizations identified in a generic drug submission, that they must provide identification information to FDA. This information...

  9. 78 FR 22553 - Generic Drug Facilities, Sites, and Organizations

    Science.gov (United States)

    2013-04-16

    ... HUMAN SERVICES Food and Drug Administration Generic Drug Facilities, Sites, and Organizations AGENCY... announcing that the generic drug facility self-identification reporting period for fiscal year (FY) 2014 will begin on May 1, 2013, and close on June 1, 2013. Generic drug facilities, certain sites,...

  10. 40 CFR 721.5908 - Modified phenolic resin (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Modified phenolic resin (generic). 721... Substances § 721.5908 Modified phenolic resin (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as modified phenolic resin (PMN...

  11. 40 CFR 721.2673 - Aromatic epoxide resin (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Aromatic epoxide resin (generic). 721... Substances § 721.2673 Aromatic epoxide resin (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as aromatic epoxide resin (PMN...

  12. 40 CFR 721.5905 - Modified phenolic resin (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Modified phenolic resin (generic). 721... Substances § 721.5905 Modified phenolic resin (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as a modified phenolic resin...

  13. Generic Drugs: The Same Medicine for Less Money

    Science.gov (United States)

    Generic Drugs: The Same Medicine for Less Money What is a generic drug? A generic is a copy of a brand-name drug. A brand-name drug has a patent. When ... benefit to your health, and you will save money. The federal Food and Drug Administration (FDA) regulates both ...

  14. A developmental analysis of generic nouns in Southern Peruvian Quechua.

    Science.gov (United States)

    Mannheim, Bruce; Gelman, Susan A; Escalante, Carmen; Huayhua, Margarita; Puma, Rosalía

    2010-01-01

    Generic noun phrases (e.g., "Cats like to drink milk") are a primary means by which adults express generalizations to children, yet they pose a challenging induction puzzle for learners. Although prior research has established that English speakers understand and produce generic noun phrases by preschool age, little is known regarding the cross-cultural generality of generic acquisition. Southern Peruvian Quechua provides a valuable comparison because, unlike English, it is a highly inflected language in which generics are marked by the absence rather than the presence of any linguistic markers. Moreover, Quechua is spoken in a cultural context that differs markedly from the highly educated, middle-class contexts within which earlier research on generics was conducted. We presented participants from 5 age groups (3-6, 7-9, 10-12, 14-35, and 36-90 years of age) with two tasks that examined the ability to distinguish generic from non-generic utterances. In Study 1, even the youngest children understood generics as applying broadly to a category (like "all") and distinct from indefinite reference ("some"). However, there was a developmental lag before children understood that generics, unlike "all", can include exceptions. Study 2 revealed that generic interpretations are more frequent for utterances that (a) lack specifying markers and (b) are animate. Altogether, generic interpretations are found among the youngest participants, and may be a default mode of quantification. These data demonstrate the cross-cultural importance of generic information in linguistic expression.

  15. 40 CFR 721.10113 - Thioether epoxy (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Thioether epoxy (generic). 721.10113... Substances § 721.10113 Thioether epoxy (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as thioether epoxy (PMN P-04-547) is subject to...

  16. 40 CFR 721.324 - Alkoxylated acrylate polymer (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Alkoxylated acrylate polymer (generic... Substances § 721.324 Alkoxylated acrylate polymer (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as alkoxylated acrylate polymer...

  17. 40 CFR 721.9959 - Polyurethane polymer (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Polyurethane polymer (generic). 721... Substances § 721.9959 Polyurethane polymer (generic). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance identified generically as a polyurethane polymer (PMN P-01...

  18. Reentry Thermal Analysis of a Generic Crew Exploration Vehicle Structure

    Science.gov (United States)

    Ko, William L.; Gong, Leslie; Quinn, Robert D.

    2007-01-01

    Comparative studies were performed on the heat-shielding characteristics of honeycomb-core sandwich panels fabricated with different materials for possible use as wall panels for the proposed crew exploration vehicle. Graphite/epoxy sandwich panel was found to outperform aluminum sandwich panel under the same geometry due to superior heat-shielding qualities and lower material density. Also, representative reentry heat-transfer analysis was performed on the windward wall structures of a generic crew exploration vehicle. The Apollo low Earth orbit reentry trajectory was used to calculate the reentry heating rates. The generic crew exploration vehicle has a graphite/epoxy composite honeycomb sandwich exterior wall and an aluminum honeycomb sandwich interior wall, and is protected with the Apollo thermal protection system ablative material. In the thermal analysis computer program used, the TPS ablation effect was not yet included; however, the results from the nonablation heat-transfer analyses were used to develop a "virtual ablation" method to estimate the ablation heat loads and the thermal protection system recession thicknesses. Depending on the severity of the heating-rate time history, the virtual ablation period was found to last for 87 to 107 seconds and the ablation heat load was estimated to be in the range of 86 to 88 percent of the total heat load for the ablation time period. The thermal protection system recession thickness was estimated to be in the range of 0.08 to 0.11 inches. For the crew exploration vehicle zero-tilt and 18-degree-tilt stagnation points, thermal protection system thicknesses of h = {0.717, 0.733} inches were found to be adequate to keep the substructural composite sandwich temperature below the limit of 300 F.

  19. Towards Generic Interaction Styles for Product Design

    DEFF Research Database (Denmark)

    Buur, Jacob; Stienstra, Marcelle

    2008-01-01

    A growing uneasiness among users with the experience of current product user interfaces mounts pressure on interaction designers to innovate user interface conventions. In previous research we have shown that a study of the history of product interaction triggers a broader discussion of interaction qualities among designers in a team, and that the naming of interaction styles helps establish an aesthetics of interaction design. However, that research focused on one particular product field, namely industrial controllers, and it was yet to be proven whether interaction styles have generic traits across a wider range of interactive products. In this paper we report on five years of continued research into interaction styles for telephones, kitchen equipment, HiFi products and medical devices, and we show how it is indeed possible and beneficial to formulate a set of generic interaction styles.

  20. The Doppler peaks from a generic defect

    CERN Document Server

    Magueijo, J

    1996-01-01

    We investigate which of the exotic Doppler peak features found for textures and cosmic strings are generic novelties pertaining to defects. We find that the ``out of phase'' texture signature is an accident. Generic defects, when they generate a secondary peak structure similar to inflation, apply to it an additive shift. It is not necessary for this shift to be ``out of phase''. We also show which factors are responsible for the absence of secondary oscillations found for cosmic strings. Within this general analysis we finally consider the conditions under which topological defects and inflation can be confused. It is argued that only \\Omega=1 inflation and a defect with a horizon size coherence length have a chance to be confused. Any other inflationary or defect model always differ distinctly. (To appear in the proceedings of the XXXIth Moriond meeting, ``Microwave Background Anisotropies'')

  1. Towards Generic Models of Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Shaker, Mohammad; Abou-Zleikha, Mohamed

    2015-01-01

    Context personalisation is a flourishing area of research with many applications. Context personalisation systems usually employ a user model to predict the appeal of the context to a particular user given a history of interactions. Most of the models used are context-dependent and their applicability is usually limited to the system and the data used for model construction. Establishing models of user experience that are highly scalable while maintaining the performance constitutes an important research direction. In this paper, we propose generic models of user experience in computer games. We further examine whether generic features of player behaviour can be defined and used to boost the modelling performance. The accuracies obtained in both experiments indicate a promise for the proposed approach and suggest that game-independent player experience models can be built.

  2. Generic Data Pipelining Using ORAC-DR

    Science.gov (United States)

    Allan, Alasdair; Jenness, Tim; Economou, Frossie; Currie, Malcolm J.; Bly, Martin J.

    A generic data reduction pipeline is, perhaps, the holy grail for data reduction software. We present work which sets us firmly on the path towards this goal. ORAC-DR is an online data reduction pipeline written by the Joint Astronomy Center (JAC) and the UK Astronomy Technology Center (ATC) and distributed as part of the Starlink Software collection (SSC). It is intended to run with a minimum of observer interaction, and is able to handle data from many different instruments, including SCUBA, CGS4, UFTI, IRCAM and Michelle, with support for IRIS2 and UIST under development. Recent work by Starlink in collaboration with the JAC has resulted in an increase in the pipeline's flexibility, opening up the possibility that it could be used for truly generic data reduction for data from any imaging, and eventually spectroscopic, detector.

  3. Savannah River Site generic data base development

    Energy Technology Data Exchange (ETDEWEB)

    Blanton, C.H.; Eide, S.A.

    1993-06-30

    This report describes the results of a project to improve the generic component failure data base for the Savannah River Site (SRS). A representative list of components and failure modes for SRS risk models was generated by reviewing existing safety analyses and component failure data bases and from suggestions from SRS safety analysts. Then sources of data or failure rate estimates were identified and reviewed for applicability. A major source of information was the Nuclear Computerized Library for Assessing Reactor Reliability, or NUCLARR. This source includes an extensive collection of failure data and failure rate estimates for commercial nuclear power plants. A recent Idaho National Engineering Laboratory report on failure data from the Idaho Chemical Processing Plant was also reviewed. From these and other recent sources, failure data and failure rate estimates were collected for the components and failure modes of interest. This information was aggregated to obtain a recommended generic failure rate distribution (mean and error factor) for each component failure mode.

  4. Towards Generic Models of Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Shaker, Mohammad; Abou-Zleikha, Mohamed

    2015-01-01

    Most of the models used are context-dependent and their applicability is usually limited to the system and the data used for model construction. Establishing models of user experience that are highly scalable while maintaining the performance constitutes an important research direction. In this paper, we propose generic models of user experience in the computer games domain. We employ two datasets collected from players' interactions with two games from different genres where accurate models of player experience were previously built. We take the approach one step further by investigating the modelling mechanism's ability to generalise over the two datasets. We further examine whether generic features of player behaviour can be defined and used to boost the modelling performance. The accuracies obtained in both experiments indicate a promise for the proposed approach and suggest that game-independent player experience models can be built.

  5. Unsteady Pressures on a Generic Capsule Shape

    Science.gov (United States)

    Burnside, Nathan; Ross, James C.

    2015-01-01

    While developing the aerodynamic database for the Orion spacecraft, the low-speed flight regime (transonic and below) proved to be the most difficult to predict and measure accurately. The flow over the capsule heat shield in descent flight was particularly troublesome for both computational and experimental efforts due to its unsteady nature and uncertainty about the boundary layer state. The data described here were acquired as part of a study to improve the understanding of the overall flow around a generic capsule. The unsteady pressure measurements acquired on a generic capsule shape are presented along with a discussion about the effects of various flight conditions and heat-shield surface roughness on the resulting pressure fluctuations.

  6. Developing A Generic Optical Avionic Network

    DEFF Research Database (Denmark)

    Zhang, Jiang; An, Yi; Berger, Michael Stübert

    2011-01-01

    We propose a generic optical network design for future avionic systems in order to reduce the weight and power consumption of current networks on board. A three-layered network structure over a ring optical network topology is suggested, as it can provide full reconfiguration flexibility and support a wide range of avionic applications. Segregation can be made on different hierarchies according to system criticality and security requirements. The structure of each layer is discussed in detail. Two network configurations are presented, focusing on how to support different network services...

  7. [The patents game. Generic and biosimilar drugs].

    Science.gov (United States)

    Villamañán, E; González, D; Armada, E; Ruano, M; Álvarez-Sala, R; Herrero, A

    2016-01-01

    The protection provided by patents on medicines has a limited duration. The expiry of these patents allows copies of the drugs to be released, competing with the original. At first, the copies were identical to the original and known as generic drugs, but in recent years, with the marketing of biological therapies and the expiry of many of their patents, biosimilar drugs have also emerged. These are not exact copies of the original, but, like generic drugs, biosimilar drugs have to demonstrate equivalence to the reference drugs in quality, safety and efficacy. Nevertheless, despite their importance and contribution to the sustainability of the health system, doctors are sometimes unaware of the differences between them and of their impact in clinical and economic terms. An attempt is made to review and clarify certain aspects often unknown by physicians, despite their involvement in the use of these drugs. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  8. Generic Structure Potential of Christian Apologetics

    Directory of Open Access Journals (Sweden)

    Onwu Inya

    2012-01-01

    Religious texts have been examined by scholars from different theoretical standpoints. However, a close survey of the literature reveals that little attention has been paid to Christian apologetics from a linguistic perspective. Also, an examination of studies along the lines of Generic Structure Potential (henceforth GSP) shows that the genre status of Christian apologetics has not been indicated. This gap provides the motivation for this paper, which investigates the GSP of Christian apologetics. Twenty texts written by various key contemporary apologetic writers were purposively selected for the study. A generic structure potential catalogue was generated. The paper reveals that the elements of the GSP concertedly work to advance, argue for or defend the Christian belief system. The paper also suggests that the model could be applied to other forms of apologetic instances.

  9. Savannah River Site generic data base development

    Energy Technology Data Exchange (ETDEWEB)

    Blanchard , A.

    2000-01-04

    This report describes the results of a project to improve the generic component failure database for the Savannah River Site (SRS). Additionally, guidelines were developed further for more advanced applications of database values. A representative list of components and failure modes for SRS risk models was generated by reviewing existing safety analyses and component failure data bases and from suggestions from SRS safety analysts. Then sources of data or failure rate estimates were identified and reviewed for applicability. A major source of information was the Nuclear Computerized Library for Assessing Reactor Reliability, or NUCLARR. This source includes an extensive collection of failure data and failure rate estimates for commercial nuclear power plants. A recent Idaho National Engineering Laboratory report on failure data from the Idaho Chemical Processing Plant was also reviewed. From these and other recent sources, failure data and failure rate estimates were collected for the components and failure modes of interest. For each component failure mode, this information was aggregated to obtain a recommended generic failure rate distribution (mean and error factor based on a lognormal distribution). Results are presented in a table in this report. A major difference between this generic database and previous efforts is that failure rates are estimated from actual data (failure events) rather than from existing failure rate estimates; the effort was successful in that over 75% of the results are now based on actual data. Also included is a section on guidelines for more advanced applications of failure rate data.
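    The (mean, error factor) pairs that summarize each lognormal failure-rate distribution can be converted into the underlying lognormal parameters. A minimal sketch follows, assuming the common risk-assessment convention that the error factor is the ratio of the 95th percentile to the median; the report itself does not spell out this convention, and the example numbers are illustrative, not taken from the SRS tables:

    ```java
    // Sketch: converting a (mean, error factor) pair, the form used for
    // generic failure-rate distributions, into lognormal parameters.
    // Assumption (conventional, not from the report): the error factor EF
    // is the 95th-percentile-to-median ratio, so EF = exp(1.645 * sigma),
    // and the reported mean is the lognormal mean exp(mu + sigma^2 / 2).
    public class LognormalFromMeanEF {
        static final double Z95 = 1.645;  // standard normal 95th percentile

        static double[] parameters(double mean, double errorFactor) {
            double sigma = Math.log(errorFactor) / Z95;
            double mu = Math.log(mean) - 0.5 * sigma * sigma;
            return new double[] {mu, sigma};
        }

        public static void main(String[] args) {
            // e.g. a hypothetical pump fails to start:
            // mean 3e-3 per demand, error factor 10
            double[] p = parameters(3e-3, 10.0);
            double median = Math.exp(p[0]);
            double p95 = Math.exp(p[0] + Z95 * p[1]);
            System.out.printf("mu=%.3f sigma=%.3f median=%.2e p95=%.2e%n",
                    p[0], p[1], median, p95);
        }
    }
    ```

    Note that under this convention the median is always below the reported mean (by the factor exp(sigma²/2)), which is why aggregated generic estimates are usually quoted as a mean plus an error factor rather than a single point value.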

  10. Superrosy dependent groups having finitely satisfiable generics

    CERN Document Server

    Ealy, Clifton; Pillay, Anand

    2007-01-01

    We study a model theoretic context (finite thorn rank, NIP, with finitely satisfiable generics) which is a common generalization of groups of finite Morley rank and definably compact groups in o-minimal structures. We show that assuming thorn rank 1, the group is abelian-by-finite, and assuming thorn rank 2 the group is solvable by finite. Also a field is algebraically closed.

  11. Generic drugs: myths, facts, and limitations

    OpenAIRE

    Antonio Marzo; Elisabetta Porro; Anna Barassi

    2012-01-01

    Bioequivalence (BE) has always been an important pharmaceutical area, particularly (but not solely) in Mediterranean region, where the use of generic drugs is a relatively recent development. The lack of new therapeutic molecules has concentrated primary research in the hands of a few large pharmaceutical companies. For smaller companies, this has created opportunities for the development of new formulations of existing drugs (orodispersible tablets that dissolve in the mouth, extended-releas...

  12. Molten salts processes and generic simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, Toru; Minato, Kazuo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    Development of a dry separation process (pyrochemical process) using molten salts for spent-nuclear-fuel reprocessing requires a fairly complete fundamental database as well as a process simulation technique with wide applicability. The present report concerns recent progress and problems in this field, taking the co-electrodeposition behavior of UO₂ and PuO₂ in molten salts as an example, and using analytical simulation of local equilibrium combined with generic diffusion. (S. Ohno)

  13. The generic model of General Relativity

    Energy Technology Data Exchange (ETDEWEB)

    Tsamparlis, Michael, E-mail: mtsampa@phys.uoa.g [Department of Physics, Section Astrophysics Astronomy Mechanics, University of Athens, University of Athens, Zografos 15783, Athens (Greece)

    2009-10-01

    We develop a generic spacetime model in General Relativity from which all existing model results are produced under specific assumptions, depending on the case. We classify each type of possible assumption, especially the role of observers and that of symmetries, and discuss their role in the development of a model. We apply the results in a step by step approach to the case of a Bianchi I spacetime and a string fluid.

  14. Compactified String Theories -- Generic Predictions for Particle Physics

    CERN Document Server

    Acharya, Bobby Samir; Kumar, Piyush

    2012-01-01

    In recent years it has been realized that in string/$M$ theories compactified to four dimensions which satisfy cosmological constraints, it is possible to make some generic predictions for particle physics and dark matter: a non-thermal cosmological history before primordial nucleosynthesis, a scale of supersymmetry breaking which is "high" as in gravity mediation, scalar superpartners too heavy to be produced at the LHC (although gluino production is expected in many cases), and a significant fraction of dark matter in the form of axions. When the matter and gauge spectrum below the compactification scale is that of the MSSM, a robust prediction of about 125 GeV for the Higgs boson mass, predictions for various aspects of dark matter physics, as well as predictions for future precision measurements, can be made. As a prototypical example, $M$ theory compactified on a manifold of $G_2$ holonomy leads to a good candidate for our "string vacuum", with the TeV scale emerging from the Planck scale, a de Sitter va...

  15. Generic drug names and social welfare.

    Science.gov (United States)

    Lobo, Félix; Feldman, Roger

    2013-06-01

    This article studies how well International Nonproprietary Names (INNs), the "generic" names for pharmaceuticals, address the problems of imperfect information. Left in private hands, the identification of medicines leads to confusion and errors. Developed in the 1950s by the World Health Organization, INNs are a common, global, scientific nomenclature designed to overcome this failure. Taking stock after sixty years, we argue that the contribution of INNs to social welfare is paramount. They enhance public health by reducing errors and improving patient safety. They also contribute to economic efficiency by creating transparency as the foundation of competitive generic drug markets, reducing transaction costs, and favoring trade. The law in most countries requires manufacturers to designate pharmaceuticals with INNs in labeling and advertising. Generic substitution is also permitted or mandatory in many countries. But not all the benefits of INNs are fully realized because prescribers may not use them. We advocate strong incentives or even legally binding provisions to extend the use of INNs by prescribing physicians and dispensing pharmacists, but we do not recommend replacing brand names entirely with INNs. Instead, we propose dual use of brand names and INNs in prescribing, as in drug labeling.

  16. Generic superweak chaos induced by Hall effect.

    Science.gov (United States)

    Ben-Harush, Moti; Dana, Itzhack

    2016-05-01

    We introduce and study the "kicked Hall system" (KHS), i.e., charged particles periodically kicked in the presence of uniform magnetic (B) and electric (E) fields that are perpendicular to each other and to the kicking direction. We show that for resonant values of B and E and in the weak-chaos regime of sufficiently small nonintegrability parameter κ (the kicking strength), there exists a generic family of periodic kicking potentials for which the Hall effect from B and E significantly suppresses the weak chaos, replacing it by "superweak" chaos (SWC). This means that the system behaves as if the kicking strength were κ² rather than κ. For E=0, SWC is known to be a classical fingerprint of quantum antiresonance, but it occurs under much less generic conditions, in particular only for very special kicking potentials. Manifestations of SWC are a decrease in the instability of periodic orbits and a narrowing of the chaotic layers, relative to the ordinary weak-chaos case. Also, for global SWC, taking place on an infinite "stochastic web" in phase space, the chaotic diffusion on the web is much slower than the weak-chaos one. Thus, the Hall effect can be relatively stabilizing for small κ. In some special cases, the effect is shown to cause ballistic motion for almost all parameter values. The generic global SWC on stochastic webs in the KHS appears to be the two-dimensional closest analog to the Arnol'd web in higher dimensional systems.

  17. Generic legislation of new psychoactive drugs.

    Science.gov (United States)

    van Amsterdam, Jan; Nutt, David; van den Brink, Wim

    2013-03-01

    New psychoactive drugs (NPDs, new psychoactive substances) enter the market all the time. However, it takes several months to ban these NPDs and immediate action is generally not possible. Several European countries and drug enforcement officers insist on a faster procedure to ban NPDs. Introduction of generic legislation, in which clusters of psychotropic drugs are banned in advance, has been mentioned as a possible solution. Here we discuss the pros and cons of such an approach. First, generic legislation could unintentionally increase the expenditures of enforcement, black market practices, administrative burden and health risks for users. Second, it may have a negative impact on research and the development of new treatments. Third, due to the complexity of generic legislation, problems in the enforcement are anticipated due to lack of knowledge about the chemical nomenclature. Finally, various legal options are already available to ban the use, sale and trade of NPDs. We therefore conclude that the currently used scientific benefit-risk evaluation should be continued to limit the adverse health effects of NPDs. Only in emergency cases, where fatal incidents (may) occur, should this approach be overruled.

  18. Generic solar photovoltaic system dynamic simulation model specification

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Behnke, Michael Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Ryan Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-10-01

    This document is intended to serve as a specification for generic solar photovoltaic (PV) system positive-sequence dynamic models to be implemented by software developers and approved by the WECC MVWG for use in bulk system dynamic simulations in accordance with NERC MOD standards. Two specific dynamic models are included in the scope of this document. The first, a Central Station PV System model, is intended to capture the most important dynamic characteristics of large scale (> 10 MW) PV systems with a central Point of Interconnection (POI) at the transmission level. The second, a Distributed PV System model, is intended to represent an aggregation of smaller, distribution-connected systems that comprise a portion of a composite load that might be modeled at a transmission load bus.

  19. C3 generic workstation: Performance metrics and applications

    Science.gov (United States)

    Eddy, Douglas R.

    1988-01-01

    The large number of integrated dependent measures available on a command, control, and communications (C3) generic workstation under development are described. In this system, embedded communications tasks will manipulate workload to assess the effects of performance-enhancing drugs (sleep aids and decongestants), work/rest cycles, biocybernetics, and decision support systems on performance. Task performance accuracy and latency will be event coded for correlation with other measures of voice stress and physiological functioning. Sessions will be videotaped to score non-verbal communications. Physiological recordings include spectral analysis of EEG, ECG, vagal tone, and EOG. Subjective measurements include SWAT, fatigue, POMS and specialized self-report scales. The system will be used primarily to evaluate the effects on performance of drugs, work/rest cycles, and biocybernetic concepts. Performance assessment algorithms will also be developed, including those used with small teams. This system provides a tool for integrating and synchronizing behavioral and psychophysiological measures in a complex decision-making environment.

  20. Knowledge Base Grid: A Generic Grid Architecture for Semantic Web

    Institute of Scientific and Technical Information of China (English)

    WU ZhaoHui(吴朝晖); CHEN HuaJun(陈华钧); XU JieFeng(徐杰锋)

    2003-01-01

    The emergence of semantic web will result in an enormous amount of knowledge base resources on the web. In this paper, a generic Knowledge Base Grid Architecture (KB-Grid)for building large-scale knowledge systems on the semantic web is presented. KB-Grid suggests a paradigm that emphasizes how to organize, discover, utilize, and manage web knowledge base resources. Four principal components are under development: a semantic browser for retrieving and browsing semantically enriched information, a knowledge server acting as the web container for knowledge, an ontology server for managing web ontologies, and a knowledge base directory server acting as the registry and catalog of KBs. Also a referential model of knowledge service and the mechanisms required for semantic communication within KB-Grid are defined. To verify the design rationale underlying the KB-Grid, an implementation of Traditional Chinese Medicine(TCM) is described.

  1. Factorization and Resummation for Generic Hierarchies between Jets

    CERN Document Server

    Pietrulewicz, Piotr; Waalewijn, Wouter J

    2016-01-01

    Jets are an important probe to identify the hard interaction of interest at the LHC. They are routinely used in Standard Model precision measurements as well as in searches for new heavy particles, including jet substructure methods. In processes with several jets, one typically encounters hierarchies in the jet transverse momenta and/or dijet invariant masses. Large logarithms of the ratios of these kinematic jet scales in the cross section are at present primarily described by parton showers. We present a general factorization framework called SCET$_+$, which is an extension of Soft-Collinear Effective Theory (SCET) and allows for a systematic higher-order resummation of such kinematic logarithms for generic jet hierarchies. In SCET$_+$ additional intermediate soft/collinear modes are used to resolve jets arising from additional soft and/or collinear QCD emissions. The resulting factorized cross sections utilize collinear splitting amplitudes and soft gluon currents and fully capture spin and color correlat...

  2. Generic drugs in Brazil: known by many, used by few.

    Science.gov (United States)

    Bertoldi, Andréa D; Barros, Aluísio J D; Hallal, Pedro C

    2005-01-01

    This study evaluated knowledge and use of generic drugs in a population-based sample of adults from a southern Brazilian city. The outcomes were: the proportion of generics in total medicines used; theoretical and practical knowledge about generics; and strategies used to buy medicines on medical prescriptions. The recall period for drug utilization was 15 days. The proportion of generics in total medicines was 3.9%. While 86.0% knew that generics cost less and 70.0% that the quality is similar to brand name medicines, only 57.0% knew any packaging characteristics that distinguish generics from other medicines. The highest proportion of generic drug utilization was in the antimicrobial pharmacological group. A brand name medicine (with a brand similar to the generic name) was mistakenly classified as a generic through photos by 48.0% of the interviewees. Among subjects who bought medicines in the 15-day period, 18.9% reported buying a generic, but this result should be interpreted with caution, because the population frequently fails to differentiate between generics and other medicines.

  3. Core Java

    CERN Document Server

    Horstmann, Cay S

    2013-01-01

    Fully updated to reflect Java SE 7 language changes, Core Java™, Volume I—Fundamentals, Ninth Edition, is the definitive guide to the Java platform. Designed for serious programmers, this reliable, unbiased, no-nonsense tutorial illuminates key Java language and library features with thoroughly tested code examples. As in previous editions, all code is easy to understand, reflects modern best practices, and is specifically designed to help jumpstart your projects. Volume I quickly brings you up to speed on Java SE 7 core language enhancements, including the diamond operator, improved resource handling, and catching of multiple exceptions. All of the code examples have been updated to reflect these enhancements, and complete descriptions of new SE 7 features are integrated with insightful explanations of fundamental Java concepts.
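    The SE 7 enhancements named above (the diamond operator, improved resource handling via try-with-resources, and catching of multiple exception types in one clause) can be seen together in a small sketch. This example is illustrative only and is not code from the book:

    ```java
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.StringReader;
    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch of the Java SE 7 language enhancements:
    // diamond operator, try-with-resources, and multi-catch.
    public class Se7Features {
        public static void main(String[] args) {
            // Diamond operator: the type argument on the right is inferred.
            List<String> lines = new ArrayList<>();

            // try-with-resources: the reader is closed automatically on exit.
            try (BufferedReader in = new BufferedReader(new StringReader("a\nb"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    lines.add(line);
                }
            } catch (IOException | RuntimeException e) {  // multi-catch clause
                throw new IllegalStateException(e);
            }
            System.out.println(lines);  // prints [a, b]
        }
    }
    ```

    Before SE 7 the same code would have needed an explicit type argument on the `ArrayList`, a `finally` block to close the reader, and one `catch` block per exception type.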

  4. Identifying ELIXIR Core Data Resources.

    Science.gov (United States)

    Durinx, Christine; McEntyre, Jo; Appel, Ron; Apweiler, Rolf; Barlow, Mary; Blomberg, Niklas; Cook, Chuck; Gasteiger, Elisabeth; Kim, Jee-Hyub; Lopez, Rodrigo; Redaschi, Nicole; Stockinger, Heinz; Teixeira, Daniel; Valencia, Alfonso

    2016-01-01

    The core mission of ELIXIR is to build a stable and sustainable infrastructure for biological information across Europe. At the heart of this are the data resources, tools and services that ELIXIR offers to the life-sciences community, providing stable and sustainable access to biological data. ELIXIR aims to ensure that these resources are available long-term and that the life-cycles of these resources are managed such that they support the scientific needs of the life-sciences, including biological research. ELIXIR Core Data Resources are defined as a set of European data resources that are of fundamental importance to the wider life-science community and the long-term preservation of biological data. They are complete collections of generic value to life-science, are considered an authority in their field with respect to one or more characteristics, and show high levels of scientific quality and service. Thus, ELIXIR Core Data Resources are of wide applicability and usage. This paper describes the structures, governance and processes that support the identification and evaluation of ELIXIR Core Data Resources. It identifies key indicators which reflect the essence of the definition of an ELIXIR Core Data Resource and support the promotion of excellence in resource development and operation. It describes the specific indicators in more detail and explains their application within ELIXIR's sustainability strategy and science policy actions, and in capacity building, life-cycle management and technical actions. The identification process is currently being implemented and tested for the first time. The findings and outcome will be evaluated by the ELIXIR Scientific Advisory Board in March 2017. Establishing the portfolio of ELIXIR Core Data Resources and ELIXIR Services is a key priority for ELIXIR and publicly marks the transition towards a cohesive infrastructure.

  5. Assessment of knowledge and perceptions toward generic medicines among basic science undergraduate medical students at Aruba

    Science.gov (United States)

    Shankar, P. Ravi; Herz, Burton L.; Dubey, Arun K.; Hassali, Mohamed A.

    2016-01-01

    Objective: Use of generic medicines is important to reduce rising health-care costs. Proper knowledge and perception of medical students and doctors toward generic medicines are important. Xavier University School of Medicine in Aruba admits students from the United States, Canada, and other countries to the undergraduate medical (MD) program. The present study was conducted to study the knowledge and perception about generic medicines among basic science MD students. Materials and Methods: The cross-sectional study was conducted among first to fifth semester students during February 2015. A previously developed instrument was used. Basic demographic information was collected. Respondents' agreement with a set of statements was noted using a Likert-type scale. The calculated total score was compared among subgroups of respondents. A one-sample Kolmogorov–Smirnov test was used to assess normality of the distribution, an independent-samples t-test to compare total scores across dichotomous variables, and analysis of variance for the other comparisons. Results: Fifty-six of the 85 students (65.8%) participated. Around 55% of respondents were between 20 and 25 years of age and of American nationality. Only three respondents (5.3%) provided the correct value of the regulatory bioequivalence limits. The mean total score was 43.41 (maximum 60). There was no significant difference in scores among subgroups. Conclusions: There was a significant knowledge gap with regard to the regulatory bioequivalence limits for generic medicines. Respondents' level of knowledge about other aspects of generic medicines was good but could be improved. Studies among clinical students in the institution and in other Caribbean medical schools are required. Deficiencies were noted and we have strengthened learning about generic medicines during the basic science years. PMID:28031604

  6. Assessment of knowledge and perceptions toward generic medicines among basic science undergraduate medical students at Aruba.

    Science.gov (United States)

    Shankar, P Ravi; Herz, Burton L; Dubey, Arun K; Hassali, Mohamed A

    2016-10-01

    Use of generic medicines is important to reduce rising health-care costs. Proper knowledge and perception of medical students and doctors toward generic medicines are important. Xavier University School of Medicine in Aruba admits students from the United States, Canada, and other countries to the undergraduate medical (MD) program. The present study was conducted to study the knowledge and perception about generic medicines among basic science MD students. The cross-sectional study was conducted among first to fifth semester students during February 2015. A previously developed instrument was used. Basic demographic information was collected. Respondents' agreement with a set of statements was noted using a Likert-type scale. The calculated total score was compared among subgroups of respondents. A one-sample Kolmogorov-Smirnov test was used to assess normality of the distribution, an independent-samples t-test to compare total scores across dichotomous variables, and analysis of variance for the other comparisons. Fifty-six of the 85 students (65.8%) participated. Around 55% of respondents were between 20 and 25 years of age and of American nationality. Only three respondents (5.3%) provided the correct value of the regulatory bioequivalence limits. The mean total score was 43.41 (maximum 60). There was no significant difference in scores among subgroups. There was a significant knowledge gap with regard to the regulatory bioequivalence limits for generic medicines. Respondents' level of knowledge about other aspects of generic medicines was good but could be improved. Studies among clinical students in the institution and in other Caribbean medical schools are required. Deficiencies were noted and we have strengthened learning about generic medicines during the basic science years.

  7. Generic algorithms for high performance scalable geocomputing

    Science.gov (United States)

    de Jong, Kor; Schmitz, Oliver; Karssenberg, Derek

    2016-04-01

    During the last decade, the characteristics of computing hardware have changed a lot. For example, instead of a single general purpose CPU core, personal computers nowadays contain multiple cores per CPU and often general purpose accelerators, like GPUs. Additionally, compute nodes are often grouped together to form clusters or a supercomputer, providing enormous amounts of compute power. For existing earth simulation models to be able to use modern hardware platforms, their compute intensive parts must be rewritten. This can be a major undertaking and may involve many technical challenges. Compute tasks must be distributed over CPU cores, offloaded to hardware accelerators, or distributed to different compute nodes. And ideally, all of this should be done in such a way that the compute task scales well with the hardware resources. This presents two challenges: 1) how to make good use of all the compute resources and 2) how to make these compute resources available for developers of simulation models, who may not (want to) have the required technical background for distributing compute tasks. The first challenge requires the use of specialized technology (e.g.: threads, OpenMP, MPI, OpenCL, CUDA). The second challenge requires the abstraction of the logic handling the distribution of compute tasks from the model-specific logic, hiding the technical details from the model developer. To assist the model developer, we are developing a C++ software library (called Fern) containing algorithms that can use all CPU cores available in a single compute node (distributing tasks over multiple compute nodes will be done at a later stage). The algorithms are grid-based (finite difference) and include local and spatial operations such as convolution filters. The algorithms handle distribution of the compute tasks to CPU cores internally. In the resulting model the low-level details of how this is done is separated from the model-specific logic representing the modeled system
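    The separation described above (model-specific logic on top, distribution of compute tasks over CPU cores hidden underneath) can be sketched in miniature. Fern itself is a C++ library; the code below is a hypothetical Java analogue, not the Fern API, showing a grid-based focal-mean (convolution filter) operation whose rows are farmed out in parallel across all available cores while the model developer only supplies the grid:

    ```java
    import java.util.stream.IntStream;

    // Sketch (not the Fern API): a generic grid operation that hides
    // multi-core distribution from the model developer. Rows are processed
    // in parallel via a parallel stream; the algorithm itself is a 3x3
    // focal (convolution filter) mean, with border cells copied through.
    public class ParallelFocalMean {
        static double[][] focalMean(double[][] grid) {
            int rows = grid.length, cols = grid[0].length;
            double[][] out = new double[rows][cols];
            // Each row is an independent task distributed over CPU cores.
            IntStream.range(0, rows).parallel().forEach(r -> {
                for (int c = 0; c < cols; c++) {
                    if (r == 0 || c == 0 || r == rows - 1 || c == cols - 1) {
                        out[r][c] = grid[r][c];  // border: copy through
                    } else {
                        double sum = 0.0;
                        for (int dr = -1; dr <= 1; dr++)
                            for (int dc = -1; dc <= 1; dc++)
                                sum += grid[r + dr][c + dc];
                        out[r][c] = sum / 9.0;  // mean of the 3x3 window
                    }
                }
            });
            return out;
        }

        public static void main(String[] args) {
            double[][] g = new double[5][5];
            g[2][2] = 9.0;  // a single spike
            double[][] smoothed = focalMean(g);
            System.out.println(smoothed[2][2]);  // the spike is averaged to 1.0
        }
    }
    ```

    Because each output row depends only on the read-only input grid, the rows can be computed in any order on any core, which is exactly the property that lets a library make such operations scale without exposing threads to the model developer.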

  8. Conceptual study of advanced PWR core design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chang, Moon Hee; Kim, Keung Ku; Joo, Hyung Kuk; Kim, Young Il; Noh, Jae Man; Hwang, Dae Hyun; Kim, Taek Kyum; Yoo, Yon Jong

    1997-09-01

    The purpose of this project is to develop and verify core design concepts with enhanced safety and economy, and the associated methodologies for core analyses. From a study of the state-of-the-art of foreign advanced reactor cores, we developed core concepts such as a soluble-boron-free, highly convertible and enhanced-safety core loaded with semi-tight-lattice hexagonal fuel assemblies. To analyze this hexagonal core, we have developed and verified several neutronic and thermal-hydraulic (T/H) analysis methodologies. The HELIOS code was adopted as the assembly code, and the HEXFEM code was developed for hexagonal core analysis. Based on experimental data in hexagonal lattices and the COBRA-IV-I code, we developed a thermal-hydraulic analysis code for hexagonal lattices. Using the core analysis code systems developed in this project, we designed a 600 MWe core and studied the feasibility of the core concepts. Two additional scopes were covered in this project: a study on the operational strategies of a soluble-boron-free core and a conceptual design of a large-scale passive core. By using the axial BP zoning concept and a suitable design of control rods, this project showed that it was possible to design a soluble-boron-free core in a 600 MWe PWR. The results of the large-scale core design showed that passive concepts and daily load-follow operation could be practiced. (author). 15 refs., 52 tabs., 101 figs.

  9. Consumer choice between common generic and brand medicines in a country with a small generic market.

    Science.gov (United States)

    Fraeyman, Jessica; Peeters, Lies; Van Hal, Guido; Beutels, Philippe; De Meyer, Guido R Y; De Loof, Hans

    2015-04-01

    Generic medicines offer an opportunity for governments to contain pharmaceutical expenditures, since generics are generally 10%-80% lower in price than brand medicines. Belgium has a small generic market that takes up 15% of the total pharmaceutical market in packages sold. Our objective was to determine the knowledge of consumers about the different available packages of a common over-the-counter medicine (acetaminophen) with regard to price advantage, quality, and effectiveness in a country with a small generic market. We conducted an online survey in the general Flemish population using a questionnaire with 25 statements. The questionnaire also contained 2 informative interventions. First, we showed the price per package and per tablet that the patient would pay in the pharmacy. Second, we provided the respondent with general information about generic medication (equivalence, effectiveness, price, and recognition). Before and after the interventions, we probed for preferences and knowledge about the different packages. Multivariate logistic models were used to examine the independent effects of consumer characteristics on responses to the survey statements. We obtained a sample of 1,636 respondents. The general attitude towards generic medication was positive: only 5% would rather not use a generic. Nevertheless, only 17% of the respondents were able to recognize a generic medicine. Older consumers (aged 60 years and above) were more often confused about the different packages (OR = 2.59, 95% CI = 1.76-3.80, P ≤ 0.001). Consumers without a higher education degree tended to be more doubtful about the difference in effectiveness and quality between the different brands (OR = 0.59, 95% CI = 0.44-0.79, P ≤ 0.001). Consumer recognition of the name of the active substance of acetaminophen was poor. When different brands were displayed, possible price advantage seemed to be an important motive to switch to a cheaper brand. Consumers generally found medicines
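    The odds ratios above come from multivariate logistic models; for intuition, a Wald-type 95% confidence interval for a crude odds ratio can be computed directly from a 2x2 table. The counts below are made up for illustration and are not taken from the study.

```python
# Wald confidence interval for an odds ratio from a 2x2 table.
# The counts are hypothetical, not the survey's data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # a, b = outcome yes/no among exposed; c, d = among unexposed.
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 2.67 1.42 5.02
```

    A multivariate model adjusts such ratios for other covariates, but the interval construction on the log-odds scale is the same idea.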

  10. Generic Properties of Curvature Sensing through Vision and Touch

    Directory of Open Access Journals (Sweden)

    Birgitta Dresp-Langley

    2013-01-01

    Full Text Available Generic properties of curvature representations formed on the basis of vision and touch were examined as a function of the mathematical properties of curved objects. Virtual representations of the curves were shown on a computer screen for visual scaling by sighted observers (experiment 1). Their physical counterparts were placed in the two hands of blindfolded and congenitally blind observers for tactile scaling. The psychophysical data show that curvature representations in congenitally blind individuals, who never had any visual experience, and in sighted observers, who rely on vision most of the time, are statistically linked to the same mathematical properties of the curves. The perceived magnitude of object curvature, sensed through either vision or touch, is related by a mathematical power law, with similar exponents for the two sensory modalities, to the aspect ratio of the curves, a scale-invariant geometric property. This finding supports biologically motivated models of sensory integration suggesting a universal power law for the adaptive brain control and balance of motor responses to environmental stimuli from any sensory modality.
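    A power law M = k * A**b between perceived magnitude M and aspect ratio A, as reported above, is typically fitted by ordinary least squares in log-log space. The sketch below uses synthetic data and is not the authors' analysis code.

```python
# Fit M = k * A**b by linear least squares on log M = log k + b*log A.
# The data are synthetic, generated from a known power law, so the
# fit recovers the chosen parameters.
import math

def fit_power_law(aspect, magnitude):
    xs = [math.log(a) for a in aspect]
    ys = [math.log(m) for m in magnitude]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    k = math.exp(my - b * mx)
    return k, b

aspect = [0.5, 1.0, 2.0, 4.0]
magnitude = [2 * a ** 0.8 for a in aspect]  # exact law: k=2, b=0.8
k, b = fit_power_law(aspect, magnitude)
print(round(k, 3), round(b, 3))  # 2.0 0.8
```

    With real psychophysical data the log-log points scatter around the line, and the exponent b is the quantity compared across the visual and tactile conditions.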

  11. 21 CFR 880.2720 - Patient scale.

    Science.gov (United States)

    2010-04-01

    21 CFR 880.2720 Patient scale. (a) Identification. A patient scale is a device intended for medical purposes that is used to measure the weight of a patient who cannot stand on a scale. This generic device includes...

  12. Generic medicines: issues and relevance for global health.

    Science.gov (United States)

    Rana, Proteesh; Roy, Vandana

    2015-12-01

    Generic medicine is a pharmaceutical product which is bioequivalent to the innovator product in terms of dosage form, strength, route of administration, quality, safety, performance characteristics, and intended use. Generic medicines are a cornerstone for providing affordable medicines to patients. The major generic markets in the world include the United States of America, followed by the European Union, Canada, Japan, and Australia. The major suppliers of generic medicines, China and India, are showing tremendous growth in the generic medicine sector. There are many legal and regulatory issues, along with quality concerns, associated with the use of generic products. Lately, bilateral international agreements called free trade agreements, and delaying tactics by originator companies such as strategic patenting and litigation against generic manufacturers, have been a major setback for the generic medicine industry. These issues need to be addressed to optimize the use of generic medicines. The sustainability of the generic medicine sector is crucial for improving access to essential medicines worldwide. © 2015 Société Française de Pharmacologie et de Thérapeutique.

  13. What Is the Future of Generics in Transplantation?

    Science.gov (United States)

    van Gelder, Teun

    2015-11-01

    Generic immunosuppressive drugs are available in Europe, Canada, and the United States. Between countries, there are large differences in the market penetration of generic drugs in general, and of immunosuppressive drugs in particular. The registration criteria for generic immunosuppressive drugs are often criticized. However, it is unlikely that the criteria for registration of narrow therapeutic index drugs are going to change, and bioequivalence studies, performed in healthy volunteers, will remain the backbone of the registration process. It would be good if the registration authorities demanded that all generic variants of an innovator drug have the same pill appearance, to reduce errors and promote drug adherence. To allow for safe substitution, a number of criteria need to be fulfilled. Generic substitution should not be taken out of the hands of the treating physicians. Generic substitution can only be done safely if initiated by the prescriber, and in well-informed and prepared patients. Payers should refrain from forcing pharmacists to dispense generic drugs to patients on maintenance treatment with an innovator drug. Instead, together with transplant societies, they should design guidelines on how to implement generic immunosuppressive drugs in clinical practice. Substitutions must be followed by control visits to check whether the patient is taking the medication correctly and whether drug exposure remains stable. Inadvertent, uncontrolled substitutions from one generic to another, initiated outside the scope of the prescriber, must be avoided as they are unsafe. Repetitive subsequent generic substitutions result in minimal additional cost savings and carry an inherent risk of medication errors.

  14. Generic hierarchical engine for mask data preparation

    Science.gov (United States)

    Kalus, Christian K.; Roessl, Wolfgang; Schnitker, Uwe; Simecek, Michal

    2002-07-01

    Electronic layouts are usually flattened on their path from the hierarchical source downstream to the wafer. Mask data preparation has long been identified as a severe bottleneck. Data volumes are not only doubling every year along the ITRS roadmap; with the advent of optical proximity correction and phase-shifting masks, they are escalating to unmanageable heights. Hierarchical treatment is one of the most powerful means of keeping memory and CPU consumption in reasonable ranges. Only recently, however, has this technique acquired more public attention. Mask data preparation is the most critical area calling for a sound infrastructure to reduce the handling problem. Other applications, such as large-area simulation and manufacturing rule checking (MRC), are gaining more and more attention as well; they would all profit from a generic engine capable of efficiently treating hierarchical data. In this paper we present a generic engine for hierarchical treatment which solves the major problem: steady transitions along cell borders. Several alternatives exist for walking through the hierarchy tree; to date, they have not been thoroughly investigated. One is a bottom-up approach that treats cells starting with the most elementary cells. The other is a top-down approach, which lends itself to creating a new hierarchy tree. In addition, since the variety, degree of hierarchy and quality of layouts extend over a wide range, a generic engine has to take intelligent decisions when exploding the hierarchy tree. Several applications will be shown, in particular how far the limits can be pushed with the current hierarchical engine.
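    The two traversal orders discussed above can be sketched as plain tree walks; the Cell class and the cell names below are illustrative, not the engine's actual data model.

```python
# Bottom-up (post-order) vs top-down (pre-order) walks over a cell
# hierarchy, the two alternatives contrasted in the abstract.
class Cell:
    def __init__(self, name, children=()):
        self.name, self.children = name, list(children)

def bottom_up(cell, visit, out=None):
    # Post-order: elementary cells are treated before their parents.
    out = [] if out is None else out
    for child in cell.children:
        bottom_up(child, visit, out)
    out.append(visit(cell))
    return out

def top_down(cell, visit, out=None):
    # Pre-order: parents first, lending itself to building a new tree.
    out = [] if out is None else out
    out.append(visit(cell))
    for child in cell.children:
        top_down(child, visit, out)
    return out

root = Cell("chip", [Cell("block", [Cell("via")]), Cell("pad")])
print(bottom_up(root, lambda c: c.name))  # ['via', 'block', 'pad', 'chip']
print(top_down(root, lambda c: c.name))   # ['chip', 'block', 'via', 'pad']
```

    In a real mask-data engine each visit would process geometry once per cell definition rather than once per placement, which is where the memory and CPU savings of hierarchical treatment come from.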

  15. Generic device controller for accelerator control systems

    Energy Technology Data Exchange (ETDEWEB)

    Mariotti, R.; Buxton, W.; Frankel, R.; Hoff, L.

    1987-01-01

    A new distributed intelligence control system has become operational at the AGS for the transport, injection, and acceleration of heavy ions. A brief description of the functionality of the physical devices making up the system is given. An attempt has been made to integrate the accelerator-specific device interfacing into a standard microprocessor system, namely, the Universal Device Controller (UDC). The main goals for such a generic device controller are to provide local computing power, flexibility of configuration, and real-time event handling. The UDC assemblies and software are described. (LEW)

  16. A Generic Design Model for Evolutionary Algorithms

    Institute of Scientific and Technical Information of China (English)

    He Feng; Kang Li-shan; Chen Yu-ping

    2003-01-01

    A generic design model for evolutionary algorithms is proposed in this paper. The model, which is described in detail in UML, focuses on the key concepts and mechanisms in evolutionary algorithms. The model not only achieves separation of concerns and encapsulation of implementations by classification and abstraction of those concepts, it also has a flexible architecture due to the application of design patterns. As a result, the model is reusable, extendible, easy to understand, easy to use, and easy to test. A large number of experiments applying the model to solve many different problems adequately illustrate the generality and effectiveness of the model.
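    The separation of concerns the model advocates can be sketched as a generic loop that takes its operators as pluggable components. The operator choices below (tournament selection, bit-flip mutation, a onemax fitness) are our own illustrative defaults, not the paper's UML design.

```python
# Minimal evolutionary-algorithm skeleton: the loop is generic, and
# problem specifics enter only through the pluggable components.
import random

def evolve(fitness, init, mutate, select, generations=50, pop_size=20):
    pop = [init() for _ in range(pop_size)]
    for _ in range(generations):
        # Each offspring: select a parent, then apply variation.
        pop = [mutate(select(pop, fitness)) for _ in range(pop_size)]
    return max(pop, key=fitness)

# Example problem: maximize the number of 1-bits in a string.
random.seed(1)
n = 16
best = evolve(
    fitness=sum,
    init=lambda: [random.randint(0, 1) for _ in range(n)],
    mutate=lambda g: [b ^ (random.random() < 0.05) for b in g],
    select=lambda pop, f: max(random.sample(pop, 3), key=f),  # tournament
)
print(sum(best))
```

    Swapping in a different fitness, representation, or operator set changes only the arguments, not the loop, which is the reuse property the model aims for.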

  17. [Generic and biosimilar drug substitution: a panacea?].

    Science.gov (United States)

    Daly, M J; Guignard, B; Nendaz, M

    2015-10-14

    Drugs are the third largest source of expenditure under Switzerland's compulsory basic health insurance. Generics, the price of which should be at least 30 per cent less than the cost of the original drugs, can potentially allow substantial savings. Their approval requires bioequivalence studies and their use is safe, although some factors may influence patients' and physicians' acceptance. The increased substitution of biosimilar drugs for more expensive biotech drugs should allow further cost savings. In an attempt to extend the monopoly granted by the original drug patent, some pharmaceutical companies implement "evergreening" strategies including small modifications of the original substance for which the clinical benefit is not always demonstrated.

  18. Static aeroelastic analysis for generic configuration wing

    Science.gov (United States)

    Lee, IN; Miura, Hirokazu; Chargin, Mladen K.

    1991-01-01

    A static aeroelastic analysis capability that calculates flexible air loads for generic configuration wings was developed. It was made possible by integrating a finite element structural analysis code (MSC/NASTRAN) and a panel code for aerodynamic analysis based on linear potential flow theory. The framework already built into MSC/NASTRAN was used, and the aerodynamic influence coefficient matrix was computed externally and inserted into NASTRAN by means of a DMAP program. It was shown that the deformation and flexible air loads of an oblique wing configuration, including asymmetric wings, can be calculated reliably by this code at both subsonic and supersonic speeds.

  19. Holographic entanglement entropy on generic time slices

    Science.gov (United States)

    Kusuki, Yuya; Takayanagi, Tadashi; Umemoto, Koji

    2017-06-01

    We study the holographic entanglement entropy and mutual information for Lorentz boosted subsystems. In holographic CFTs at zero and finite temperature, we find that the mutual information diverges in a universal way when the end points of two subsystems are light-like separated. In Lifshitz and hyperscaling violating geometries dual to non-relativistic theories, we show that the holographic entanglement entropy is not well-defined for Lorentz boosted subsystems in general. This strongly suggests that in non-relativistic theories, we cannot make a real space factorization of the Hilbert space on a generic time slice, except for the constant time slice, as opposed to relativistic field theories.
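    For reference, the mutual information studied here is the standard combination of entanglement entropies, with each entropy computed holographically from the area of an extremal surface (the Ryu-Takayanagi/HRT prescription):

```latex
% Mutual information of two subsystems A and B:
I(A:B) \;=\; S_A + S_B - S_{A \cup B},
% with each entropy computed holographically from the area of the
% extremal surface \gamma_A anchored on the boundary region A:
S_A \;=\; \frac{\mathrm{Area}(\gamma_A)}{4 G_N}.
```

    The divergence noted above arises in I(A:B) as the separation between the end points of A and B becomes light-like.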

  20. Generalized Hausdorff measure for generic compact sets

    CERN Document Server

    Balka, Richárd

    2012-01-01

    Let $X$ be a Polish space. We prove that the generic compact set $K\subseteq X$ (in the sense of Baire category) is either finite or there is a continuous gauge function $h$ such that $0<\mathcal{H}^{h}(K)<\infty$, where $\mathcal{H}^h$ denotes the $h$-Hausdorff measure. This answers a question of C. Cabrelli, U. B. Darji, and U. M. Molter. Moreover, for every weak contraction $f\colon K\to X$ we have $\mathcal{H}^{h}(K\cap f(K))=0$. This is a measure theoretic analogue of a result of M. Elekes.
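    For reference, the $h$-Hausdorff measure appearing here is defined, for a gauge function $h$, by the standard limit of covering sums:

```latex
\mathcal{H}^{h}(K)
  \;=\; \lim_{\delta \to 0^{+}}
    \inf\Bigl\{ \sum_{i=1}^{\infty} h\bigl(\operatorname{diam} U_i\bigr)
      \;:\; K \subseteq \bigcup_{i=1}^{\infty} U_i,\;
      \operatorname{diam} U_i \le \delta \Bigr\}.
```

    Taking $h(t)=t^{s}$ recovers the usual $s$-dimensional Hausdorff measure.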

  1. Commercial Generic Bioprocessing Apparatus Science Insert - 03

    Science.gov (United States)

    Moreno, Nancy; Stodieck, Louis; Cushing, Paula; Stowe, Mark; Hamilton, Mary Ann; Werner, Ken

    2008-01-01

    Commercial Generic Bioprocessing Apparatus Science Insert - 03 (CSI-03) is the third set of investigations in the CSI program series. The CSI program provides the K-12 community opportunities to utilize the unique microgravity environment of the International Space Station as part of the regular classroom to encourage learning and interest in science, technology, engineering and math. CSI-03 will examine the complete life cycle of the painted lady butterfly and the ability of an orb weaving spider to spin a web, eat and remain healthy in space.

  2. Developing A Generic Optical Avionic Network

    DEFF Research Database (Denmark)

    Zhang, Jiang; An, Yi; Berger, Michael Stübert

    2011-01-01

    We propose a generic optical network design for future avionic systems in order to reduce the weight and power consumption of current networks on board. A three-layered network structure over a ring optical network topology is suggested, as it can provide full reconfiguration flexibility and support a wide range of avionic applications. Segregation can be made on different hierarchies according to system criticality and security requirements. The structure of each layer is discussed in detail. Two network configurations are presented, focusing on how to support different network services by such a network. Finally, three redundancy scenarios are discussed and compared.

  3. Generic 'du' in time and context

    DEFF Research Database (Denmark)

    Jensen, Torben Juel

    2017-01-01

    This article considers the way individual speakers respond to a macro-level process of language change: an increased use of the second-person pronoun du for generic reference. Real time panel studies show that life span change is much more common than is often assumed, particularly among adult speakers, and that variation also prevails within a micro-diachronic perspective (between conversations recorded within a few months, and even within the same conversation). The results emphasise that great caution should be exercised when taking the difference between two pieces of attested language use...

  4. Generic Wing-Body Aerodynamics Data Base

    Science.gov (United States)

    Holst, Terry L.; Olsen, Thomas H.; Kwak, Dochan (Technical Monitor)

    2001-01-01

    The wing-body aerodynamics data base consists of a series of CFD (Computational Fluid Dynamics) simulations about a generic wing-body configuration consisting of an ogive-circular-cylinder fuselage and a simple symmetric wing mid-mounted on the fuselage. Solutions have been obtained with Nonlinear Potential (P), Euler (E) and Navier-Stokes (N) solvers over a range of subsonic and transonic Mach numbers and angles of attack. In addition, each solution has been computed on a series of grids (coarse, medium, and fine) to permit an assessment of grid refinement errors.

  5. Generic Interfaces for Managing Web Data

    Directory of Open Access Journals (Sweden)

    Oleg Burlaca

    2005-05-01

    Full Text Available This paper discusses a generic user interface for managing web data that is incorporated in a content management system. The interface is created at run-time from a set of XML documents stored in a database. We accentuate the importance of the content analysis phase, which leads to a well-formed data model. Another important aspect is the use of context in the interface and of a hierarchical model to represent multiple relationships between hierarchy items. The proposed event model acts as glue between data management and application logic.

  6. A Generic Middleware Model for Smart Home

    Directory of Open Access Journals (Sweden)

    Madhusudanan J.

    2014-07-01

    Full Text Available A smart home is an emerging technology in which electronic devices are controlled automatically based on the occupants' activities. Pervasive computing plays a vital role in the smart home environment, providing computer-based services to human beings anywhere and anytime. However, studies of the smart home of the future have focused on providing middleware. The middleware acts as an interface between human beings and the smart devices. In this paper, we propose a generic middleware model for smart homes that enables interaction between human beings and devices, and also between the devices themselves, based on the context identified in the environment.

  7. Generic Patch Inference

    DEFF Research Database (Denmark)

    Andersen, Jesper; Lawall, Julia Laetitia

    2008-01-01

    A key issue in maintaining Linux device drivers is the need to update drivers in response to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spfind, that identifies common changes made … developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect …

  8. Magnetic Probing of Core Geodynamics

    Science.gov (United States)

    Voorhies, Coerte V.

    2004-01-01

    To better understand geomagnetic theory and observation, we can use spatial magnetic spectra for the main field and secular variation to test core dynamical hypotheses against seismology. The hypotheses lead to theoretical spectra which are fitted to observational spectra. Each fit yields an estimate of the radius of Earth's core and its uncertainty. If this agrees with the seismologic value, then the hypotheses pass the test. A new way to obtain theoretical spectra extends the hydromagnetic scale analysis of Benton to scale-variant field and flow. For narrow-scale flow and a dynamically weak field by the top of Earth's core, this yields a generalized Stevenson-McLeod spectrum for the core-source field, and a secular variation spectrum modulated by a cubic polynomial in spherical harmonic degree n. The former passes the tests. The latter passes many tests, but does not describe rapid dipole decline and quadrupole rebound; some tests suggest it is a bit hard, or rich in narrow-scale change. In a core geodynamo, motion of the fluid conductor does work against the Lorentz force. This converts kinetic into magnetic energy which, in turn, is lost to heat via Ohmic dissipation. In the analysis at length-scale l/k, if one presumes kinetic energy is converted on either eddy-overturning or magnetic free-decay time-scales, then Kolmogorov or other spectra in conflict with observational spectra can result. Instead, the rate at which work is done roughly balances the dissipation rate, which is consistent with small-scale flow. The conversion time-scale depends on dynamical constraints. These are summarized by the magneto-geostrophic vertical vorticity balance by the top of the core, which includes anisotropic effects of rotation, the magnetic field, and the core-mantle boundary. The resulting theoretical spectra for the core-source field and its SV are far more compatible with observation. The conversion time-scale of order 120 years is pseudo-scale-invariant. Magnetic spectra of other

  9. GALE: a generic open source extensible adaptation engine

    Science.gov (United States)

    De Bra, Paul; Knutov, Evgeny; Smits, David; Stash, Natalia; Ramos, Vinicius F. C.

    2013-06-01

    This paper motivates and describes GALE, the Generic Adaptation Language and Engine that came out of the GRAPPLE EU FP7 project. The main focus of the paper is the extensible nature of GALE. The purpose of this description is to illustrate how a single core adaptation engine can be used for different types of adaptation, applied to different types of information items and documents. We illustrate the adaptive functionality on some examples of hypermedia documents. In April 2012, David Smits defended the world's first adaptive PhD thesis on this topic. The thesis, available for download and direct adaptive access at http://gale.win.tue.nl/thesis, shows that a single source of information can serve different audiences and at the same time also allows more freedom of navigation than is possible in any paper or static hypermedia document. The same can be done for course texts, hyperfiction, encyclopedia, museum, or other cultural heritage websites, etc. We explain how to add functionality to GALE if desired, to adapt the system's behavior to whatever the application requires. This stresses our main objective: to provide a technological base for adaptive (hypermedia) system researchers on which they can build extensions for the specific research they have in mind.

  10. A Generic business rules validation system for ORACLE Applications

    CERN Document Server

    Martin, O F

    1997-01-01

    Picture this: you have just spent the remainder of your IT budget on a new software package for Human Resources. Despite its excellent functionality, it does not perform all of the complex validation that your old in-house-developed system did. How can you improve the standard software package given the following constraints: you cannot afford to pay the vendor for modifications to the package, and modifying the package yourself is out of the question. We describe a tool designed to implement the validation of complex business rules for any ORACLE database application - without incurring any modification to the user interface. It enhances your product's standard capabilities and improves data quality as soon as data has been entered or modified. Our initial implementation was for the ORACLE Human Resources package. Our tool consists of four independent components: a description of the data and its location, a set of rules (written in a simple pseudo-code), a generic on-line change detection system, a core en...

  11. Investigation of EAS cores

    Science.gov (United States)

    Shaulov, S. B.; Beyl, P. F.; Beysembaev, R. U.; Beysembaeva, E. A.; Bezshapov, S. P.; Borisov, A. S.; Cherdyntceva, K. V.; Chernyavsky, M. M.; Chubenko, A. P.; Dalkarov, O. D.; Denisova, V. G.; Erlykin, A. D.; Kabanova, N. V.; Kanevskaya, E. A.; Kotelnikov, K. A.; Morozov, A. E.; Mukhamedshin, R. A.; Nam, R. A.; Nesterova, N. M.; Nikolskaya, N. M.; Pavluchenko, V. P.; Piskal, V. V.; Puchkov, V. S.; Pyatovsky, S. E.; Ryabov, V. A.; Sadykov, T. Kh.; Schepetov, A. L.; Smirnova, M. D.; Stepanov, A. V.; Uryson, A. V.; Vavilov, Yu. N.; Vildanov, N. G.; Vildanova, L. I.; Zayarnaya, I. S.; Zhanceitova, J. K.; Zhukov, V. V.

    2017-06-01

    The development of nuclear-electromagnetic cascade models in air in the late forties showed the informational value of studying the cores of extensive air showers (EAS). These investigations were the main goal of different experiments which were carried out over many years by a variety of methods. Outcomes of such investigations obtained in the HADRON experiment, using an X-ray emulsion chamber (XREC) as a core detector, are considered. The Ne spectrum of EAS associated with γ-ray families, the spectra of γ-rays (hadrons) in EAS cores, and the Ne dependence of the muon number, ⟨Nμ⟩, in EAS with γ-ray families are obtained for the first time at energies of 10^15-10^17 eV with this method. A number of new effects were observed, namely, an abnormal scaling violation in hadron spectra which is fundamentally different from model predictions, an excess of the muon number in EAS associated with γ-ray families, and a penetrating component in EAS cores. It is supposed that the abnormal behavior of the γ-ray spectra and the Ne dependence of the muon number are explained by the emergence of a penetrating component in the 'knee' range of the primary cosmic-ray (PCR) spectrum. Nuclear and astrophysical explanations of the origin of the penetrating component are discussed. The necessity of considering the contribution of a single close cosmic-ray source to explain the PCR spectrum in the knee range is noted.

  13. Select Generic Dry-Storage Pilot Plant Design for Safeguards and Security by Design (SSBD) per Used Fuel Campaign

    Energy Technology Data Exchange (ETDEWEB)

    Demuth, Scott Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sprinkle, James K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-05-26

    In preparation for the year-end deliverable (Provide SSBD Best Practices for Generic Dry-Storage Pilot Scale Plant) of the Work Package (FT-15LA040501 - Safeguards and Security by Design for Extended Dry Storage), the initial step was to select a generic dry-storage pilot plant design for safeguards and security by design (SSBD). To be consistent with other DOE-NE Fuel Cycle Research and Development (FCR&D) activities, the Used Fuel Campaign was engaged for the selection of a design for this deliverable. For the Work Package FT-15LA040501, "Safeguards and Security by Design for Extended Dry Storage", SSBD will be initiated for the Generic Dry-Storage Pilot Scale Plant described by the layout of Reference 2. SSBD will consider aspects of the design that are impacted by domestic material control and accounting (MC&A), domestic security, and international safeguards.

  14. PERFORM 60 - Prediction of the effects of radiation for reactor pressure vessel and in-core materials using multi-scale modelling - 60 years foreseen plant lifetime

    Energy Technology Data Exchange (ETDEWEB)

    Leclercq, Sylvain, E-mail: sylvain.leclercq@edf.f [EDF R and D, Materials and Mechanics of Components, Avenue des Renardieres - Ecuelles, 77818 Moret sur Loing Cedex (France); Lidbury, David [SERCO Assurance - Walton House, 404 Faraday Street, Birchwood Park, Warrington, Cheshire WA3 6GA (United Kingdom); Van Dyck, Steven [SCK-CEN, Nuclear Material Science, Boeretang 200, BE, 2400 Mol (Belgium); Moinereau, Dominique [EDF R and D, Materials and Mechanics of Components, Avenue des Renardieres - Ecuelles, 77818 Moret sur Loing Cedex (France); Alamo, Ana [CEA Saclay, DEN/DSOE, 91191 Gif-sur-Yvette (France); Mazouzi, Abdou Al [EDF R and D, Materials and Mechanics of Components, Avenue des Renardieres - Ecuelles, 77818 Moret sur Loing Cedex (France)

    2010-11-01

    In nuclear power plants, materials may undergo degradation due to severe irradiation conditions that may limit their operational life. Utilities that operate these reactors need to quantify the ageing and the potential degradation of some essential structures of the power plant to ensure safe and reliable plant operation. So far, the material databases needed to take account of these degradations in the design and safe operation of installations have mainly relied on long-term irradiation programs in test reactors as well as on mechanical or corrosion testing in specialized hot cells. Continuous progress in the physical understanding of the phenomena involved in irradiation damage, and continuous progress in computer science, have now made possible the development of multi-scale numerical tools able to simulate the effects of irradiation on materials microstructure. A first step towards this goal has been successfully reached through the development of the RPV-2 and Toughness Module numerical tools by the scientific community created around the FP6 PERFECT project. These tools make it possible to simulate irradiation effects on the constitutive behaviour of the reactor pressure vessel low alloy steel, and also on its failure properties. Relying on the existing PERFECT Roadmap, the four-year Collaborative Project PERFORM 60 has as its main objective to develop multi-scale tools aimed at predicting the combined effects of irradiation and corrosion on internals (austenitic stainless steels) and also to improve existing ones for the RPV (bainitic steels). PERFORM 60 is based on two technical sub-projects: (i) RPV and (ii) internals. In addition to these technical sub-projects, the Users' Group and Training sub-project shall allow representatives of constructors, utilities, research organizations... from Europe, the USA and Japan to receive the information and training needed to form their own appraisal of the limits and potentialities of the developed tools. An important effort will also be made to teach


  16. Generic, scalable and decentralized fault detection for robot swarms

    Science.gov (United States)

    Christensen, Anders Lyhne; Timmis, Jon

    2017-01-01

    Robot swarms are large-scale multirobot systems with decentralized control, which means that each robot acts based only on local perception and on local coordination with neighboring robots. The decentralized approach to control confers a number of potential benefits. In particular, inherent scalability and robustness are often highlighted as key distinguishing features of robot swarms compared with systems that rely on traditional approaches to multirobot coordination. It has, however, been shown that swarm robotics systems are not always fault tolerant. To realize the robustness potential of robot swarms, it is thus essential to give systems the capacity to actively detect and accommodate faults. In this paper, we present a generic fault-detection system for robot swarms. We show how robots with limited and imperfect sensing capabilities are able to observe and classify the behavior of one another. In order to achieve this, the underlying classifier is an immune system-inspired algorithm that learns to distinguish between normal and abnormal behavior online. Through a series of experiments, we systematically assess the performance of our approach in a detailed simulation environment. In particular, we analyze our system’s capacity to correctly detect robots with faults, false positive rates, performance in a foraging task in which each robot exhibits a composite behavior, and performance under perturbations of the task environment. Results show that our generic fault-detection system is robust, that it is able to detect faults in a timely manner, and that it achieves a low false positive rate. The developed fault-detection system has the potential to enable long-term autonomy for robust multirobot systems, thus increasing the usefulness of robots for a diverse repertoire of upcoming applications in the area of distributed intelligent automation. PMID:28806756
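The online behavior classification described above can be illustrated with a deliberately simplified sketch: a nearest-sample novelty detector over per-robot behavior features. The paper's immune-inspired algorithm, its feature set and its thresholds are considerably more sophisticated; the feature values and threshold below are invented for illustration only.

```python
import math

class BehaviorClassifier:
    """Toy online anomaly detector, loosely in the spirit of fault detection
    for robot swarms (illustrative only; not the authors' algorithm).
    Each robot summarizes a neighbor's behavior as a feature vector; vectors
    far from every sample seen as 'normal' are flagged as potentially faulty."""

    def __init__(self, radius=0.15):
        self.radius = radius  # match threshold around stored normal samples
        self.normal = []      # memory of behavior vectors observed as normal

    def observe_normal(self, features):
        # Learning phase: store behaviors assumed to be fault-free.
        self.normal.append(tuple(features))

    def is_abnormal(self, features):
        # A behavior is abnormal if it matches no stored normal sample.
        return all(math.dist(features, n) > self.radius for n in self.normal)

# Features could be e.g. (mean speed, mean turn rate) over a time window.
clf = BehaviorClassifier(radius=0.15)
for sample in [(0.50, 0.10), (0.55, 0.12), (0.48, 0.09)]:
    clf.observe_normal(sample)

print(clf.is_abnormal((0.52, 0.11)))  # close to normal behavior -> False
print(clf.is_abnormal((0.05, 0.90)))  # e.g. a stuck wheel -> True
```

In a decentralized deployment, each robot would run its own instance over locally sensed neighbor data, which is what makes the scheme scale with swarm size.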

  17. Improved constraints on primordial blackholes and gravitational waves for a generic model of inflation

    CERN Document Server

    Choudhury, Sayantan

    2014-01-01

    In this article we provide a new closed relationship between the cosmic abundance of primordial gravitational waves and primordial black holes originating from initial inflationary perturbations, for a generic model of inflation where inflation occurs below the Planck scale. We have obtained a strict bound on the current abundance of primordial black holes from the Planck measurements, $9.99712\times 10^{-3}<\Omega_{PBH}h^{2}< 9.99736\times 10^{-3}$.

  18. Associations between generic substitution and patient-related factors

    DEFF Research Database (Denmark)

    Østergaard Rathe, Jette

    Associations between generic substitution and patient-related factors Jette Østergaard Rathe1, Pia V. Larsen1, Morten Andersen2, Janus L. Thomsen3, Maja S. Paulsen1, Jens Søndergaard1 1. Research Unit of General Practice, Institute of Public Health, University of Southern Denmark 2. Centre...... substitutable drug. Data were linked with a prescription database. Results We found no associations between generic substitution and, respectively, gender, age, drug group and polypharmacy. Earlier switches of the index drug are statistically significant associated with acceptance of generic substitution...... generics in the antiepileptic and antidepressant groups (antiepileptics OR 0.37 and antidepressants OR 0.53). Conclusion We did not find any patient-related factors associated with generic substitution; however, patients who have once experienced a generic substitution with a specific drug are more likely...

  19. [Strategies for pharmaceutical research and development. II. Generic drugs].

    Science.gov (United States)

    Kuchar, M

    1996-07-01

    When the patent protection is terminated, the original registered-mark preparation becomes a generic drug, which results in a decrease in its price as compared with the original pharmaceutical. The effects of changes in price relations are discussed from the viewpoint of the generic firms and the manufacturers of original preparations. The differences in insurance systems and in the legislative regulation of the registration of generic preparations can markedly influence the size of the share of generic drugs in the total consumption of drugs. The future development of generic drugs from a general viewpoint is discussed in relation to the contemporary extensive expiration of patent protection of drugs. The results obtained hitherto are summed up, and the topics for the present strategy of the development of generic drugs in the Research Institute for Pharmacy and Biochemistry, or in the Czech Republic, respectively, are discussed.

  20. Individual differences in children's and parents' generic language.

    Science.gov (United States)

    Gelman, Susan A; Ware, Elizabeth A; Kleinberg, Felicia; Manczak, Erika M; Stilwell, Sarah M

    2014-01-01

    Generics ("Dogs bark") convey important information about categories and facilitate children's learning. Two studies with parents and their 2- or 4-year-old children (N = 104 dyads) examined whether individual differences in generic language use are as follows: (a) stable over time, contexts, and domains, and (b) linked to conceptual factors. For both children and parents, individual differences in rate of generic production were stable across time, contexts, and domains, and parents' generic usage significantly correlated with that of their own children. Furthermore, parents' essentialist beliefs correlated with their own and their children's rates of generic frequency. These results indicate that generic language use exhibits substantial stability and may reflect individual differences in speakers' conceptual attitudes toward categories. © 2013 The Authors. Child Development © 2013 Society for Research in Child Development, Inc.

  1. MULTITASKS-GENERIC PLATFORM VIA WSN

    Directory of Open Access Journals (Sweden)

    Mahmoud Mezghani

    2011-08-01

    Full Text Available In recent years, the use of wireless sensor networks has spread into various areas (home automation, medical, industrial, …), giving rise to several applications such as control of energy consumption in the habitat, home entertainment systems, security systems for the intelligent home, and health care. Each application employed its own platform, which renders the overall system very complex. In this paper, we therefore propose a single platform for all applications, qualified to generate endless tasks. It covers many commands developed with a generic remote-control interface created in the C# language. This generic interface is very adaptable, adjusts itself to TinyOS operating system requirements, and can be accessed via the Internet using the 6LoWPAN protocol. Proper operation of this generic multi-tasking platform is validated on several levels: in the implementation of a proposed solution to control energy consumption in the smart home, where the suggested solution is based on scheduling techniques under resource constraints; in the automation of the habitat, in the overall context of the intelligent home, to improve quality of life and make it more comfortable; and in practical assistance, to set up monitoring of the patient and keep track of his state by the doctor, whether at home, en route to the hospital, or in the hospital.

  2. A Generic Hybrid Encryption System (HES

    Directory of Open Access Journals (Sweden)

    Ijaz Ali Shoukat

    2013-03-01

    Full Text Available This study proposes a Generic Hybrid Encryption System (HES) under a mutual committee of symmetric and asymmetric cryptosystems. Asymmetric (public key) cryptosystems are associated with several performance issues, such as computational inefficiency, memory wastage, energy consumption and employment limitations on bulky data sets, but they are quite secure and reliable for key exchange over insecure remote communication channels. Symmetric (private key) cryptosystems outperform them by roughly 100 times and have no such issues, but they cannot fulfill non-repudiation, detection of false modifications in the secret key, detection of fake modifications in the cipher text, or origin authentication of both parties while exchanging information. These contradictory issues can be overcome by utilizing hybrid encryption mechanisms (symmetric + asymmetric) to obtain the optimal benefits of both schemes. Several hybrid mechanisms are available with different logics, but our logic differs in infrastructural design, simplicity, computational efficiency and security as compared to prior hybrid encryption schemes. Some prior schemes are deficient in performance aspects, customer satisfaction, memory utilization or energy consumption, and some are vulnerable to forgery and password-guessing (session key recovery) attacks. We have made some functional and design-related changes to the existing Public Key Infrastructure (PKI) to achieve simplicity, optimal privacy and greater customer satisfaction by providing a Hybrid Encryption System (HES) able to fulfill the full set of standardized security constraints. No existing PKI-based generic hybrid encryption scheme manages all of the discussed issues in the way ours does.
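The hybrid pattern the abstract describes, an asymmetric cipher protecting a random session key while a fast symmetric cipher protects the bulk data, can be sketched with deliberately toy primitives. Everything below is illustrative only and NOT secure: textbook RSA with tiny primes stands in for the asymmetric leg, and a SHA-256 counter keystream stands in for the symmetric leg; none of it is part of the authors' HES.

```python
import hashlib
import secrets

# Tiny textbook RSA key pair (insecure demo parameters).
p, q, e = 61, 53, 17
n = p * q                          # public modulus
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent via modular inverse

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Symmetric leg: XOR against a SHA-256 counter keystream (toy cipher).
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Sender: choose a session key below n, RSA-encrypt it, encrypt the payload.
key_int = secrets.randbelow(n - 2) + 2
session_key = key_int.to_bytes(2, "big")
encrypted_key = pow(key_int, e, n)                 # asymmetric leg
ciphertext = keystream_xor(session_key, b"hybrid encryption demo")

# Receiver: RSA-decrypt the session key, then run the symmetric leg in reverse.
recovered_key = pow(encrypted_key, d, n).to_bytes(2, "big")
print(keystream_xor(recovered_key, ciphertext))    # b'hybrid encryption demo'
```

A production design would replace the toy pieces with, e.g., RSA-OAEP or an ECDH exchange for the key and an authenticated cipher such as AES-GCM for the payload, which is exactly the split that gives hybrid schemes both speed and secure key exchange.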

  3. Analytical Analysis of Generic Reusability: Weyuker's Properties

    Directory of Open Access Journals (Sweden)

    Parul Gandhi

    2012-03-01

    Full Text Available Reusability is the key concept in today's software development environment. Reusability can be achieved by the generic programming approach. C++ templates help us develop generic code, which results in reusable software modules, and also identify the effectiveness of this reuse strategy. Many researchers have already developed various reusability metrics [9] [7]. In this paper we emphasize evaluating reusability metrics against Weyuker's set of properties. Weyuker's list of properties has always been a point of reference, and several researchers have suggested it as a guiding tool in the identification of a good complexity measure. We have chosen some recently reported reusability metrics, Method Template Inheritance Factor (MTIF) and Attribute Template Inheritance Factor (ATIF), and evaluated them against Weyuker's set of principles. We divide our work into a two-step framework. In the first step the metrics are analytically evaluated against a formal list of Weyuker's properties, and in the second step we calculate the LOC metric value using three different programs designed with the template and inheritance features of object-oriented programming, observing that by using templates with the inheritance property we can reduce the number of lines of a project to a great extent.

  4. A Fast Generic Sequence Matching Algorithm

    CERN Document Server

    Musser, David R

    2008-01-01

    A string matching -- and more generally, sequence matching -- algorithm is presented that has a linear worst-case computing time bound, a low worst-case bound on the number of comparisons (2n), and sublinear average-case behavior that is better than that of the fastest versions of the Boyer-Moore algorithm. The algorithm retains its efficiency advantages in a wide variety of sequence matching problems of practical interest, including traditional string matching; large-alphabet problems (as in Unicode strings); and small-alphabet, long-pattern problems (as in DNA searches). Since it is expressed as a generic algorithm for searching in sequences over an arbitrary type T, it is well suited for use in generic software libraries such as the C++ Standard Template Library. The algorithm was obtained by adding to the Knuth-Morris-Pratt algorithm one of the pattern-shifting techniques from the Boyer-Moore algorithm, with provision for use of hashing in this technique. In situations in which a hash function or random a...
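The genericity the abstract emphasizes, searching over sequences of an arbitrary element type rather than only strings, can be illustrated with a plain generic Knuth-Morris-Pratt search. This is not Musser's exact hybrid (which adds a Boyer-Moore-style shift on top of KMP), just a minimal sketch of the "works on any sequence" idea:

```python
def kmp_search(text, pattern):
    """Generic Knuth-Morris-Pratt search over any indexable sequences whose
    elements support ==. Returns the index of the first match, or -1.
    Linear worst-case time, like the algorithm described above."""
    m = len(pattern)
    if m == 0:
        return 0
    # Failure table: length of the longest proper border of each prefix.
    fail = [0] * m
    k = 0
    for i in range(1, m):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan: each text element is examined a bounded number of times.
    k = 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == m:
            return i - m + 1
    return -1

print(kmp_search("ACGTACGTGACG", "GTGA"))   # DNA-style small alphabet -> 6
print(kmp_search([3, 1, 4, 1, 5, 9], [1, 5]))  # works on any sequence -> 3
```

The same function body serves strings, byte sequences, and lists of arbitrary comparable objects, which is the property that makes such algorithms a good fit for generic libraries like the C++ STL.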

  5. The future of generic HIV drugs in Rhode Island.

    Science.gov (United States)

    Lee, Jennifer Y; Reece, Rebecca; Montague, Brian; Rana, Aadia; Alexander-Scott, Nicole; Flanigan, Timothy

    2013-09-06

    The number of HIV-infected persons in the United States continues to increase and most patients with HIV will be on antiretroviral therapy (ART) for many decades. The introduction of generic antiretroviral medications has the potential for significant cost savings which may then be accompanied by improved access. State AIDS Drug Assistance Programs will be made more effective by the switch to generic ARTs. Cost savings and barriers to the introduction of generic ART are discussed.

  6. Bioequivalence studies: need for the reliability of generic drugs

    OpenAIRE

    Laosa, Olga; Centro de Farmacología Clínica, Departamento de Farmacología y Terapéutica, Facultad de Medicina, Universidad Autónoma de Madrid. Madrid, España. Médico especialista en Farmacología Clínica.; Guerra, Pedro; Centro de Farmacología Clínica, Departamento de Farmacología y Terapéutica, Facultad de Medicina, Universidad Autónoma de Madrid. Madrid, España. Médico especialista en Farmacología Clínica.; López-Durán, Jose Luis; Centro de Farmacología Clínica, Departamento de Farmacología y Terapéutica, Facultad de Medicina, Universidad Autónoma de Madrid. Madrid, España. Médico especialista en Farmacología Clínica.; Mosquera, Beatriz; Centro de Farmacología Clínica, Departamento de Farmacología y Terapéutica, Facultad de Medicina, Universidad Autónoma de Madrid. Madrid, España. Licenciada en Ciencias Químicas.; Frías, Jesús; Centro de Farmacología Clínica, Departamento de Farmacología y Terapéutica, Facultad de Medicina, Universidad Autónoma de Madrid. Madrid, España. Servicio de Farmacología Clínica, Hospital Universitario la Paz. Madrid, España. Médico especialista en Farmacología Clínica.

    2009-01-01

    A generic medicine is a pharmaceutical product containing an active ingredient already known and previously developed and invented by others. The cost of these generic or multisource products should be less than their counterparts original. The clinical effects and the risk-benefit balance of a medicine do not depend exclusively on the activity of a pharmacologically active substance. Demonstration of bioequivalence of generic medicine is of great importance. In Europe and the United States g...


  8. Understanding and perceptions of final-year Doctor of Pharmacy students about generic medicines in Karachi, Pakistan: a quantitative insight

    Directory of Open Access Journals (Sweden)

    Jamshed SQ

    2015-05-01

    Full Text Available Shazia Qasim Jamshed,1 Mohamad Izham Mohamad Ibrahim,2 Mohamad Azmi Hassali,3 Adheed Khalid Sharrad,4 Asrul Akmal Shafie,3 Zaheer-Ud-Din Babar5 1Pharmacy Practice, Kulliyyah of Pharmacy, International Islamic University Malaysia, Kuantan Campus, Pahang, Malaysia; 2College of Pharmacy, Qatar University, Doha, Qatar; 3Discipline of Social and Administrative Pharmacy, School of Pharmaceutical Sciences, Penang, Malaysia; 4College of Pharmacy, University of Basra, Basra, Iraq; 5School of Pharmacy, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand General objective: To evaluate the understanding and perceptions of generic medicines among final-year Doctor of Pharmacy students in Karachi, Pakistan. Methods: A 23-item survey instrument that included a question on bioequivalence limits and Likert-type scale questions regarding the understanding and perceptions of generic medicines among the students was administered. Cronbach’s alpha was found to be 0.62. Results: Responses were obtained from 236 final-year Doctor of Pharmacy students (n=85 from a publicly funded institute; n=151 from a privately funded institute). When comparing a brand-name medicine to a generic medicine, pharmacy students scored poorly on bioequivalence limits. More than 80% of the students incorrectly answered that all the products that are rated as generic equivalents are therapeutically equivalent to each other (P<0.04). Half of the students agreed that a generic medicine is bioequivalent to the brand-name medicine (P<0.001). With regard to quality, effectiveness, and safety, more than 75% of the students disagreed that generic medicines are of inferior quality and are less effective than brand-name medicines (P<0.001). More than 50% of the students disagreed that generic medicines produce more side effects than brand-name medicines (P<0.001). Conclusion: The current study identified a positive perception toward generic medicines but also gaps in
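The internal-consistency figure quoted above (Cronbach's alpha = 0.62) comes from the standard formula alpha = k/(k-1) · (1 − Σ item variances / variance of totals). A small self-contained sketch, using made-up Likert responses rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a k-item scale. `items` is a list of k lists,
    each holding one item's scores across the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across the k items.
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three Likert items scored by five respondents (invented numbers).
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [3, 3, 4, 2, 5],
]
print(round(cronbach_alpha(items), 2))  # -> 0.87
```

Values around 0.6, as reported here, are usually read as marginal internal consistency, which is consistent with the authors' modest-reliability caveat.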

  9. A generic hydroeconomic model to assess future water scarcity

    Science.gov (United States)

    Neverre, Noémie; Dumas, Patrice

    2015-04-01

    We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of the CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined based on a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine the localization of irrigated crops. Then, crop irrigation requirements are computed for the different stages of the growing season using the Allen (1998) method with Hargreaves potential evapotranspiration. The economic value of irrigation water is based on a yield-comparison approach between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Bondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists in building three-block inverse demand functions where the volume limits of the blocks evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water cost data. Then projected demands are confronted with future water availability.
Operating rules of the reservoirs and water allocation between demands are based on
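The point-expansion step used for the domestic sector can be illustrated with a single-block, constant-elasticity sketch: a demand curve is calibrated from one observed price-quantity point plus an elasticity taken from the literature. The paper's three-block inverse demand functions are richer, and the price, quantity and elasticity below are invented for illustration:

```python
def calibrate_demand(p0, q0, elasticity):
    """Point-expansion calibration of a constant-elasticity demand curve
    Q(P) = A * P**elasticity passing through the observed point (p0, q0)."""
    A = q0 / (p0 ** elasticity)
    demand = lambda p: A * p ** elasticity            # quantity at price p
    inverse = lambda q: (q / A) ** (1 / elasticity)   # value of the q-th unit
    return demand, inverse

# Assumed observation: 100 m3/capita/yr consumed at 0.50 $/m3,
# with a price elasticity of -0.3 (typical literature order of magnitude).
demand, inverse = calibrate_demand(0.50, 100.0, -0.3)
print(round(demand(0.50), 1))  # recovers the calibration point: 100.0
print(demand(1.0) < 100.0)     # raising the price lowers demand: True
```

In the full model, the inverse function plays the role of a marginal-value curve, so confronting it with simulated availability yields both allocations and the economic cost of scarcity.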

  10. Lattice Boltzmann methods applied to large-scale three-dimensional virtual cores constructed from digital optical borehole images of the karst carbonate Biscayne aquifer in southeastern Florida

    Science.gov (United States)

    Sukop, Michael; Cunningham, Kevin J.

    2014-01-01

    Digital optical borehole images at approximately 2 mm vertical resolution and borehole caliper data were used to create three-dimensional renderings of the distribution of (1) matrix porosity and (2) vuggy megaporosity for the karst carbonate Biscayne aquifer in southeastern Florida. The renderings based on the borehole data were used as input into Lattice Boltzmann methods to obtain intrinsic permeability estimates for this extremely transmissive aquifer, where traditional aquifer test methods may fail due to very small drawdowns and non-Darcian flow that can reduce apparent hydraulic conductivity. Variogram analysis of the borehole data suggests a nearly isotropic rock structure at lag lengths up to the nominal borehole diameter. A strong correlation between the diameter of the borehole and the presence of vuggy megaporosity in the data set led to a bias in the variogram where the computed horizontal spatial autocorrelation is strong at lag distances greater than the nominal borehole size. Lattice Boltzmann simulation of flow across a 0.4 × 0.4 × 17 m (2.72 m3 volume) parallel-walled column of rendered matrix and vuggy megaporosity indicates a high hydraulic conductivity of 53 m s−1. This value is similar to previous Lattice Boltzmann calculations of hydraulic conductivity in smaller limestone samples of the Biscayne aquifer. The development of simulation methods that reproduce dual-porosity systems with higher resolution and fidelity and that consider flow through horizontally longer renderings could provide improved estimates of the hydraulic conductivity and help to address questions about the importance of scale.
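The hydraulic conductivity reported above is a Darcy-law quantity: K = q / (dh/L), with q the simulated specific discharge, L the column length and dh the head drop driving the flow. A minimal post-processing sketch, with invented flux and head-drop values chosen only to land near the reported order of magnitude (they are not the paper's simulation outputs):

```python
def hydraulic_conductivity(flux_q, length, head_drop):
    """Darcy's law rearranged for K: specific discharge (m/s) divided by
    the hydraulic gradient dh/L (dimensionless). Returns K in m/s."""
    gradient = head_drop / length
    return flux_q / gradient

# A 17 m column carrying q = 0.031 m/s under a 0.01 m head drop (assumed):
K = hydraulic_conductivity(0.031, 17.0, 0.01)
print(round(K, 1))  # ~53 m/s, the order of magnitude reported for the aquifer
```

Extracting K this way from a pore-scale simulation sidesteps the field-testing problem the abstract mentions, where drawdowns are too small and flow too non-Darcian for conventional aquifer tests.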

  11. Geomagnetism of earth's core

    Science.gov (United States)

    Benton, E. R.

    1983-01-01

    Instrumentation, analytical methods, and research goals for understanding the behavior and source of geophysical magnetism are reviewed. Magsat, launched in 1979, collected global magnetometer data and identified the main terrestrial magnetic fields. The data has been treated by representing the curl-free field in terms of a scalar potential which is decomposed into a truncated series of spherical harmonics. Solutions to the Laplace equation then extend the field upward or downward from the measurement level through intervening spaces with no source. Further research is necessary on the interaction between harmonics of various spatial scales. Attempts are also being made to analytically model the main field and its secular variation at the core-mantle boundary. Work is also being done on characterizing the core structure, composition, thermodynamics, energetics, and formation, as well as designing a new Magsat or a tethered satellite to be flown on the Shuttle.
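The upward/downward continuation mentioned above follows from the radial dependence of the scalar potential: each degree-n term of an internal-source field scales as (a/r)^(n+1), so the radial field component scales as (a/r)^(n+2). A small sketch using standard values for Earth's surface and core radii (standard geomagnetism numbers, not taken from this abstract):

```python
def downward_continuation_factor(n, a=6371.2, c=3480.0):
    """Amplification of the degree-n radial field when an internal-source
    spherical harmonic model is continued from Earth's surface (radius a, km)
    down to the core-mantle boundary (radius c, km): (a/c)**(n + 2).
    Higher degrees are amplified fastest, which is why small-scale core
    field structure is hard to resolve from surface or satellite data."""
    return (a / c) ** (n + 2)

for n in (1, 8, 13):
    print(n, round(downward_continuation_factor(n), 1))
```

The rapidly growing factor with n is exactly the "interaction between harmonics of various spatial scales" problem: at the core-mantle boundary, crustal-field noise in the high degrees swamps the core signal.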


  13. Dynamics of core accretion

    Science.gov (United States)

    Nelson, Andrew F.; Ruffert, Maximilian

    2013-02-01

    We perform three-dimensional hydrodynamic simulations of gas flowing around a planetary core of mass Mpl = 10M⊕ embedded in a near Keplerian background flow, using a modified shearing box approximation. We assume an ideal gas behaviour following an equation of state with a fixed ratio of the specific heats, γ = 1.42, consistent with the conditions of a moderate-temperature background disc with solar composition. No radiative heating or cooling is included in the models. We employ a nested grid hydrodynamic code implementing the `Piecewise Parabolic Method' with as many as six fixed nested grids, providing spatial resolution on the finest grid comparable to the present-day diameters of Neptune and Uranus. We find that a strongly dynamically active flow develops such that no static envelope can form. The activity is not sensitive to plausible variations in the rotation curve of the underlying disc. It is sensitive to the thermodynamic treatment of the gas, as modelled by prescribed equations of state (either `locally isothermal' or `locally isentropic') and the temperature of the background disc material. The activity is also sensitive to the shape and depth of the core's gravitational potential, through its mass and gravitational softening coefficient. Each of these factors influences the magnitude and character of hydrodynamic feedback of the small-scale flow on the background, and we conclude that accurate modelling of such feedback is critical to a complete understanding of the core accretion process. The varying flow pattern gives rise to large, irregular eruptions of matter from the region around the core which return matter to the background flow: mass in the envelope at one time may not be found in the envelope at any later time. No net mass accretion into the envelope is observed over the course of the simulation and none is expected, due to our neglect of cooling. Except in cases of very rapid cooling however, as defined by locally isothermal or

  14. Associations between generic substitution and patient-related factors

    DEFF Research Database (Denmark)

    Østergaard Rathe, Jette

    for Pharmacoepidemiology, Karolinska Institutet, Department of Medicine Solna, Stockholm, Sweden 3. Danish Quality Unit of General Practice, Odense, Denmark Background Generic substitution means that chemically equivalent but less expensive drugs are dispensed in place of a brand name product. Although generic medicines...... was made on beliefs about medicine, views on generic medicine and confidence in the health care system. The study comprised 2476 patients (736 users of antidepressants, 795 users of antiepileptics and 945 users of other substitutable drugs). For each patient we focused on one purchase of a generically...

  15. Generic medicine pricing in Europe: current issues and future perspective.

    Science.gov (United States)

    Simoens, Steven

    2008-01-01

    This editorial discusses a number of trends affecting the pricing of generic medicines in Europe. With respect to pricing, recent evidence has emerged that European generic medicine manufacturers face competition from Indian manufacturers; that the price level of generic medicines varies substantially between European countries; and that generic medicine manufacturers engage in competition by discount rather than price competition in France, The Netherlands and the UK. These trends suggest that there may be scope for further reducing the prices of generic medicines in several countries. In relation to reference pricing, most European countries have incorporated market incentives within reference pricing systems with a view to promoting price competition. The European experience indicates that the generic medicines industry delivers competitive prices under a reference pricing system if demand-side policies are in place that stimulate physicians, pharmacists and patients to use generic medicines. Finally, caution needs to be exercised when focusing on the drivers of generic medicine pricing as these drivers not only vary between countries, but may also vary within a country. Manufacturers of originator and generic medicines do not take a single pricing approach following patent expiry, but vary their pricing strategy from molecule to molecule.

  16. Brand loyalty, patients and limited generic medicines uptake.

    Science.gov (United States)

    Costa-Font, Joan; Rudisill, Caroline; Tan, Stefanie

    2014-06-01

    The sluggish development of European generic drug markets depends heavily on demand side factors, and more specifically, patients' and doctors' loyalty to branded products. Loyalty to originator drugs, to the point where originator prices rise upon generic entry has been described as the 'generics paradox'. Originator loyalty can emerge for a plethora of reasons; including costs, perceptions about quality and physician advice. We know very little about the behavioural underpinnings of brand loyalty from the consumer or patient standpoint. This paper attempts to test the extent to which patients are brand loyal by drawing upon Spain's 2002 Health Barometer survey as it includes questions about consumer acceptance of generics in a country with exceptionally low generic uptake and substitution at the time of the study. Our findings suggest that at least 13% of the population would not accept generics as substitutes to the originator. These results confirm evidence of brand loyalty for a minority. Alongside high levels of awareness of generics, we find that low cost-sharing levels explain consumer brand loyalty but their impact on acceptance of generic substitution is very small. Higher cost-sharing and exempting fewer patients from cost-sharing have the potential to encourage generic acceptance.

  17. Towards Using a Generic Robot as Training Partner

    DEFF Research Database (Denmark)

    Sørensen, Anders Stengaard; Savarimuthu, Thiusius Rajeeth; Nielsen, Jacob

    2014-01-01

    In this paper, we demonstrate how a generic industrial robot can be used as a training partner for upper limb training. The motion path and human/robot interaction of a non-generic upper-arm training robot are transferred to a generic industrial robot arm, and we demonstrate that the robot arm can...... implement the same type of interaction, but can expand the training regime to include both upper arm and shoulder training. We compare the generic robot to two affordable but custom-built training robots, and outline interesting directions for future work based on these training robots....

  18. Pharmacy and generic substitution of antiepileptic drugs: missing in action?

    Science.gov (United States)

    Welty, Timothy E

    2007-06-01

    Generic substitution of antiepileptic drugs is an issue that is gathering a lot of attention in the neurology community but is not receiving much attention within pharmacy. Several proposals have been drafted that would restrict a pharmacist's decision-making in generic substitution. These proposals highlight concerns within the pharmacy community related to generic substitution. Careful consideration needs to be given to these issues by pharmacists and pharmacy professional organizations. Unless pharmacy as a profession takes strong positions in support of a pharmacist's ability to make decisions about pharmacotherapy and addresses many of the pharmacy-related problems of generic substitution, policies that negatively impact pharmacy will be established.

  19. A generic method for evaluating crowding in the emergency department

    DEFF Research Database (Denmark)

    Eiset, Andreas Halgreen; Erlandsen, Mogens; Møllekær, Anders Brøns;

    2016-01-01

    Background: Crowding in the emergency department (ED) has been studied intensively using complicated non-generic methods that may prove difficult to implement in a clinical setting. This study sought to develop a generic method to describe and analyse crowding from measurements readily available......, a 'carry over' effect was shown between shifts and days. Conclusions: The presented method offers an easy and generic way to get detailed insight into the dynamics of crowding in an ED. Keywords: Crowding, Emergency department, ED, Generic, Method, Model, Queue, Patient flow...

  20. Generic trajectory representation and trajectory following for wheeled robots

    DEFF Research Database (Denmark)

    Kjærgaard, Morten; Andersen, Nils Axel; Ravn, Ole

    2014-01-01

    This article presents the work towards a purely generic navigation solution for wheeled mobile robots motivated by the following goals: Generic: Works for different types of robots. Configurable: Parameters map to geometric properties of the robot. Predictable: Well defined where the robot...... will drive. Safe: Avoid fatal collisions. Based on a survey of existing methods and algorithms the article presents a generic way to represent constraints for different types of robots, a generic way to represent trajectories using Bézier curves, a method to convert the trajectory so it can be driven...
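    The entry above represents trajectories with Bézier curves. As a plain illustration of that representation (not the authors' implementation), a Bézier curve segment can be evaluated with de Casteljau's algorithm; the control-point values below are hypothetical.

```python
# Minimal sketch of Bézier-curve evaluation via de Casteljau's algorithm,
# one generic way to represent a trajectory segment for a wheeled robot.
# Illustration only; not the implementation from the article.

def de_casteljau(control_points, t):
    """Evaluate a Bézier curve at parameter t in [0, 1].

    control_points: list of (x, y) tuples defining the curve.
    """
    pts = [tuple(p) for p in control_points]
    # Repeatedly interpolate between consecutive points until one remains.
    while len(pts) > 1:
        pts = [
            ((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
            for (x0, y0), (x1, y1) in zip(pts, pts[1:])
        ]
    return pts[0]

# A cubic segment: endpoints (0, 0) and (1, 0), with the curve's shape set
# by the two interior control points (hypothetical values).
segment = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
midpoint = de_casteljau(segment, 0.5)
print(midpoint)  # (0.5, 0.75)
```

    Because the curve is a polynomial in t, sampling it densely gives a smooth, predictable path for the robot to follow, which matches the "predictable" design goal stated in the abstract.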

  1. Magnetic Probing of Core Geodynamics

    Science.gov (United States)

    Voorhies, Coerte

    2004-01-01

    To better understand geomagnetic theory and observation, we can use spatial magnetic spectra for the main field and secular variation to test core dynamical hypotheses against seismology. The hypotheses lead to theoretical spectra which are fitted to observational spectra. Each fit yields an estimate of the radius of Earth's core and its uncertainty. If this agrees with the seismologic value, then the hypotheses pass the test. A new way to obtain theoretical spectra extends the hydromagnetic scale analysis of Benton to scale-variant field and flow. For narrow-scale flow and a dynamically weak field by the top of Earth's core, this yields a JGR-PI, and a secular variation spectrum modulated by a cubic polynomial in spherical harmonic degree n. The former passes the tests. The latter passes many tests, but does not describe rapid dipole decline and quadrupole rebound; some tests suggest it is a bit hard, or rich in narrow-scale change. In a core geodynamo, motion of the fluid conductor does work against the Lorentz force. This converts kinetic into magnetic energy which, in turn, is lost to heat via Ohmic dissipation. In the analysis at length-scale 1/k, if one presumes kinetic energy is converted on either eddy-overturning or magnetic free-decay time-scales, then Kolmogorov or other spectra in conflict with observational spectra can result. Instead, the rate at which work is done roughly balances the dissipation rate, which is consistent with small-scale flow. The conversion time-scale depends on dynamical constraints. These are summarized by the magneto-geostrophic vertical vorticity balance by the top of the core, which includes anisotropic effects of rotation, the magnetic field, and the core-mantle boundary. The resulting theoretical spectra for the core-source field and its SV are far more compatible with observation. The conversion time-scale, of order 120 years, is pseudo-scale-invariant. Magnetic spectra of other planets may differ; however, if a transition to non

  2. Modeling generic aspects of ideal fibril formation

    CERN Document Server

    Michel, Denis

    2016-01-01

    Many different proteins self-aggregate into insoluble fibrils growing apically by reversible addition of elementary building blocks. But beyond this common principle, the modalities of fibril formation are very disparate, with various intermediate forms which can be reshuffled by minor modifications of physico-chemical conditions or amino-acid sequences. To bypass this complexity, the multifaceted phenomenon of fibril formation is reduced here to its most elementary principles, defined for a linear prototype of fibril. Selected generic features, including nucleation, elongation and conformational recruitment, are modeled using minimalist hypotheses and tools, by separating equilibrium from kinetic aspects and in vitro from in vivo conditions. These reductionist approaches allow us to bring out known and new rudiments, including the kinetic and equilibrium effects of nucleation, the dual influence of elongation on nucleation, the kinetic limitations on nucleation and fibril numbers and the accumulation of complexe...
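    The nucleation-elongation distinction described above can be sketched with a minimal mass-action model. The rate constants, nucleus size and the irreversibility of both steps are illustrative assumptions, not values or choices from the paper (which treats reversible addition).

```python
# Minimal sketch of a nucleation-elongation kinetic model for a linear
# fibril prototype: slow nucleation creates new fibril ends, fast apical
# elongation consumes free monomer. Parameters are hypothetical.

def simulate(m0=1.0, k_nuc=1e-4, k_elong=1.0, nucleus_size=2,
             dt=0.01, steps=10_000):
    """Euler integration of monomer depletion by nucleation and elongation.

    Returns (free monomer, fibril number concentration, polymerized mass).
    """
    m = m0          # free monomer concentration
    fibrils = 0.0   # number concentration of fibrils
    mass = 0.0      # monomer units locked into fibrils
    for _ in range(steps):
        nucleation = k_nuc * m ** nucleus_size   # slow creation of fibrils
        elongation = k_elong * m * fibrils       # fast apical growth
        consumed = (nucleus_size * nucleation + elongation) * dt
        m -= consumed
        mass += consumed
        fibrils += nucleation * dt
    return m, fibrils, mass

m, fibrils, mass = simulate()
# Mass conservation: free plus polymerized monomer equals the initial pool.
assert abs(m + mass - 1.0) < 1e-9
```

    Even this crude sketch reproduces one rudiment highlighted in the abstract: elongation and nucleation are coupled, because elongation depletes the monomer pool that nucleation depends on.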

  3. Making sense of a generic label

    DEFF Research Database (Denmark)

    Lassen, Inger

    2016-01-01

    Making sense of a generic label through linguistic context analysis: A study of genre (re)cognition among novices’ Considerable work has been done on written and spoken genres characterized by a high degree of ritualization with “predictable elements occurring in a predictable order” (Fairclough...... (Bhatia 2008) assist the analytical process? Inspired by Sommers and Saltz (2004), Bhatia (2008) and Tardy (2009), these research questions will be addressed on the basis of 55 exam papers written in January 2014 by 3rd year undergraduate students. The exam tested students’ competences in genre......) that the students had studied during the course leading up to the exam. Given the lack of situated cognition (Bawarshi and Reiff, 2010: 79) of one of these genres, the students were requested to produce arguments and justification for assigning the genres presented to them to two different genre colonies...

  4. Static aeroelastic analysis for generic configuration aircraft

    Science.gov (United States)

    Lee, IN; Miura, Hirokazu; Chargin, Mladen K.

    1987-01-01

    A static aeroelastic analysis capability that can calculate flexible air loads for generic-configuration aircraft was developed. It was made possible by integrating a finite element structural analysis code (MSC/NASTRAN) and an aerodynamic analysis panel code based on linear potential flow theory. The framework already built into MSC/NASTRAN was used, and the aerodynamic influence coefficient matrix is computed externally and inserted into NASTRAN by means of a DMAP program. It was shown that the deformation and flexible air loads of an oblique wing aircraft can be calculated reliably by this code at both subsonic and supersonic speeds. Preliminary results indicating the importance of flexibility in calculating air loads for this type of aircraft are presented.

  5. A Generic Solution Approach to Nurse Rostering

    DEFF Research Database (Denmark)

    Hansen, Anders Dohn; Mason, Andrew; Ryan, David

    , which is solved in a branch-and-price framework. Columns of the set partitioning problem are generated dynamically and branch-and-bound is used to enforce integrality. The column-generating subproblem is modeled in three stages that utilize the inherent structure of roster-lines. Some important features...... of the implementation are described. The implementation builds on the generic model, and hence the program can be set up for any problem that fits the model. The adaptation to a new problem is simple, as it requires only the input of a new problem definition. The solution method is internally adjusted according to the new...... definition. In this report, we present two different practical problems along with corresponding solutions. The approach captures all features of each problem and is efficient enough to provide optimal solutions. The solution time is still too large for the method to be immediately applicable in practice...
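    The set-partitioning model underlying this kind of rostering approach can be illustrated in miniature: each column is a candidate roster-line covering some shifts at some cost, and the task is to select columns so that every shift is covered exactly once at minimum total cost. The real system generates columns dynamically inside branch-and-price; the sketch below simply enumerates a tiny hand-made pool of columns (hypothetical data).

```python
# Illustrative brute-force solve of a tiny set-partitioning instance, the
# master-problem structure behind column-generation rostering. The shift
# names, columns and costs are invented for the example.

from itertools import combinations

SHIFTS = frozenset(["mon_day", "mon_night", "tue_day", "tue_night"])

# (covered shifts, cost) -- each tuple is one candidate roster-line.
COLUMNS = [
    (frozenset(["mon_day", "tue_day"]), 8),
    (frozenset(["mon_night", "tue_night"]), 10),
    (frozenset(["mon_day", "mon_night"]), 12),
    (frozenset(["tue_day", "tue_night"]), 5),
    (frozenset(["mon_day"]), 3),
]

def best_partition(columns, shifts):
    """Return (cost, chosen columns) of the cheapest exact partition."""
    best = None
    for r in range(1, len(columns) + 1):
        for combo in combinations(columns, r):
            covered = [s for col, _ in combo for s in col]
            # Exact partition: every shift covered once, no overlaps.
            if len(covered) == len(shifts) and set(covered) == shifts:
                cost = sum(c for _, c in combo)
                if best is None or cost < best[0]:
                    best = (cost, combo)
    return best

cost, chosen = best_partition(COLUMNS, SHIFTS)
print(cost)  # cheapest exact partition costs 17
```

    Enumeration is exponential in the number of columns, which is exactly why the report's approach prices out promising columns dynamically rather than listing them all up front.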

  6. [Generics: similarities, bioequivalence but no conformity].

    Science.gov (United States)

    Even-Adin, D; De Muylder, J A; Sternon, J

    2002-01-01

    The use of generic forms (GF) is presented as a potential source of budgetary savings in pharmaceutical expenses. Although not frequently prescribed in Belgium, they have gained new interest thanks to the recent introduction of 'reference reimbursement'. Marketing authorization of GF is controlled by European rules, but some questions about their identity with the original medications remain. Do similarities based only upon qualitative and quantitative composition in active molecules, pharmaceutical forms and bioavailability give us all the requested guarantees? Several kinds of discordance can appear; the major elements of non-conformity are the nature of the excipients, the contents of the package leaflet and the quality of the bioavailability studies. However, in economic terms, the development of GF in the drug market appears to constitute an unavoidable phenomenon.

  7. Crystallization Kinetics within a Generic Modeling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist V.

    2014-01-01

    A new and extended version of a generic modeling framework for analysis and design of crystallization operations is presented. The new features of this framework are described, with focus on development, implementation, identification, and analysis of crystallization kinetic models. Issues related to the modeling of various kinetic phenomena like nucleation, growth, agglomeration, and breakage are discussed in terms of model forms, model parameters, their availability and/or estimation, and their selection and application for specific crystallization operational scenarios under study. The advantages...... of employing a well-structured model library for storage, use/reuse, and analysis of the kinetic models are highlighted. Examples illustrating the application of the modeling framework for kinetic model discrimination related to simulation of specific crystallization scenarios and for kinetic model parameter...

  8. Generic Adaptively Secure Searchable Phrase Encryption

    Directory of Open Access Journals (Sweden)

    Kissel Zachary A.

    2017-01-01

    Full Text Available In recent years searchable symmetric encryption has seen a rapid increase in query expressiveness including keyword, phrase, Boolean, and fuzzy queries. With this expressiveness came increasingly complex constructions. Having these facts in mind, we present an efficient and generic searchable symmetric encryption construction for phrase queries. Our construction is straightforward to implement, and is proven secure under adaptively chosen query attacks (CQA2 in the random oracle model with an honest-but-curious adversary. To our knowledge, this is the first encrypted phrase search system that achieves CQA2 security. Moreover, we demonstrate that our document collection preprocessing algorithm allows us to extend a dynamic SSE construction so that it supports phrase queries. We also provide a compiler theorem which transforms any CQA2-secure SSE construction for keyword queries into a CQA2-secure SSE construction that supports phrase queries.
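    The construction above builds phrase queries on top of keyword queries via document preprocessing. As a plaintext illustration of the underlying idea only (deliberately omitting all of the encryption and the CQA2 security machinery), a positional inverted index answers a phrase query by intersecting the postings of consecutive words at shifted positions:

```python
# Plaintext analogue of phrase search over a keyword index: store
# word -> {doc_id -> positions}, then a phrase occurs at position p
# if its i-th word occurs at p + i. Illustration only; the paper's
# contribution is doing this obliviously over encrypted data.

from collections import defaultdict

def build_index(docs):
    """Map word -> {doc_id -> set of positions}."""
    index = defaultdict(lambda: defaultdict(set))
    for doc_id, text in docs.items():
        for pos, word in enumerate(text.split()):
            index[word][doc_id].add(pos)
    return index

def phrase_search(index, phrase):
    """Return doc_ids containing the words of `phrase` consecutively."""
    words = phrase.split()
    if not words:
        return set()
    results = set()
    for doc_id, positions in index.get(words[0], {}).items():
        # The phrase starts at p if word i appears at p + i for every i.
        if any(all(p + i in index.get(w, {}).get(doc_id, set())
                   for i, w in enumerate(words))
               for p in positions):
            results.add(doc_id)
    return results

docs = {1: "the quick brown fox", 2: "brown the quick"}
index = build_index(docs)
print(phrase_search(index, "the quick brown"))  # {1}
```

    The preprocessing step is what makes the extension to dynamic SSE plausible: the per-word position sets are exactly the kind of per-keyword payload a keyword-query SSE scheme already stores.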

  9. Generic small modular reactor plant design.

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Tom Goslee,; Cipiti, Benjamin B.; Jordan, Sabina Erteza; Baum, Gregory A.

    2012-12-01

    This report gives an overview of expected design characteristics, concepts, and procedures for small modular reactors. The purpose of this report is to provide those who are interested in reducing the cost and improving the safety of advanced nuclear power plants with a generic design that possesses enough detail in a non-sensitive manner to give merit to their conclusions. The report is focused on light water reactor technology, but does add details on what could be different in a more advanced design (see Appendix). Numerous reactor and facility concepts were used for inspiration (documented in the bibliography). The final design described here is conceptual and does not reflect any proposed concept or sub-systems, thus any details given here are only relevant within this report. This report does not include any design or engineering calculations.

  10. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial to improving a robot workcell. Design automation of multi-function fingers is in high demand in the robot industry, to overcome the current iterative, time-consuming and complex manual design process. However, the existing approaches to multi-function finger design automation are unable to fully meet the robot industry's needs. This paper proposes a generic approach to the design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked against existing approaches.

  11. Generic Skills from Qur'anic Perspective

    Directory of Open Access Journals (Sweden)

    Siddig Ahmad

    2012-06-01

    Full Text Available Generic skills are defined as a set of skills that are directly related to and needed in the working environment. Employers prefer to recruit officials who are competent in interpersonal communication, leadership, team work, and oral and written skills. They are reluctant to employ graduates lacking certain necessary skills. This reveals that there is a serious gap between the skills required by employers and the skills that graduates possess. Therefore, this research focuses on five aspects of generic skills, namely communication, team work, problem solving, lifelong learning and self-esteem. From the Qur'anic perspective, the same concepts have been used, with minor differences in terminology. A thematic approach is used when discussing these aspects from the Qur'an. The findings showed that the ways of effective communication are represented by the terms qawl sadid, qawl ma`ruf, qawl baligh, qawl maysur, qawl karim and qawl layyin. For collective work, ta`aruf and tafahum, as the pre-requisites, should be practiced via ta`awun and takaful. For problem solving, four methods are adapted from the Qur'an: reflection on the past, observation, demonstration and asking questions. For lifelong learning, the establishment of learning institutions and the self-motivation of learners are two pre-requisites for its accomplishment; they can be practiced through an open learning system, consultation and hands-on learning. Last but not least, personality development can be built up through physical, spiritual and mental training.

  12. Generic Opinion Mining System for Decision Support

    Directory of Open Access Journals (Sweden)

    Dr.P.G.Naik

    2016-04-01

    Full Text Available Social networking sites prove to be indispensable tools for decision making owing to the large repository of user views accumulated over time. Such real data can be exploited for various purposes, such as making buying decisions, analysing user views about a new product launched by a company, product promotion campaigns, the impact of policy decisions made by a political party on society, etc. In the current work the authors have proposed a generic model for feature-based polarity determination by sentiment analysis of tweets. The model has been implemented by the seamless integration of the R tool, XML, Java and Link Parser. A practical multistep system efficiently extracts data from tweet text, pre-processes the raw data to remove noise, and tags its polarity. Data used in the current study are derived from online feature-based product reviews collected from Twitter tweets. Link Parser version 4.1b is employed for parsing a natural sentence, which is broken into multiple tokens corresponding to nouns and adjectives before being stored in a persistent storage medium. The objectivity score is determined using the SentiWordNet 3.0 lexical resource, which is parsed using a tool implemented in Java. Linguistic hedges are handled using Zadeh's proposition, which modifies the final objectivity score. The objectivity score so computed provides the necessary guidelines for influencing decisions. The authors have tested the model for product purchase decisions for two different sets of products, smart phones and laptops, based on a predefined set of features. The model is generic and can be applied to any set of products evaluated on a predefined set of features.
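    The role of linguistic hedges in the pipeline above can be sketched with Zadeh's classical modifiers, where concentration ("very" squares a membership value) and dilation ("somewhat" takes its square root) reshape a word's polarity magnitude. The tiny lexicon and the exponent values below are illustrative assumptions; the actual system draws scores from SentiWordNet 3.0 and parses sentences with Link Parser.

```python
# Minimal sketch of hedge-aware polarity scoring in the spirit of the
# model above. Lexicon entries and hedge exponents are hypothetical.

LEXICON = {"good": 0.8, "fast": 0.6, "bad": -0.7, "slow": -0.5}
HEDGES = {"very": 2.0, "extremely": 3.0, "somewhat": 0.5, "slightly": 0.25}

def polarity(tokens):
    """Average hedge-modified polarity of the opinion words in `tokens`."""
    scores = []
    exponent = 1.0
    for token in tokens:
        word = token.lower()
        if word in HEDGES:
            exponent = HEDGES[word]   # hedge modifies the next opinion word
        elif word in LEXICON:
            raw = LEXICON[word]
            # Zadeh-style modifier: apply the exponent to the magnitude
            # while keeping the sign of the opinion word.
            scores.append((abs(raw) ** exponent) * (1 if raw > 0 else -1))
            exponent = 1.0
    return sum(scores) / len(scores) if scores else 0.0

print(polarity("the screen is very good".split()))   # ~0.64 (0.8 squared)
print(polarity("battery is somewhat slow".split()))  # negative, diluted
```

    Concentration pushes strong opinions toward the extremes while dilation pulls weak qualifiers toward neutrality, which is the effect the abstract attributes to the hedge-handling step.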

  13. Convergence of generic pronouns : Language contact and Faroese mann

    NARCIS (Netherlands)

    Knooihuizen, Remco

    2015-01-01

    Despite state-driven language policy against Danish linguistic influence, the Faroese language has borrowed the Danish generic pronoun mann 'one'. As in Danish, this pronoun varies with generically used tú 'you'. An analysis of the variation in Faroese shows that Faroese tú is used more often than i

  14. 40 CFR 721.10180 - Trifunctional acrylic ester (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Trifunctional acrylic ester (generic... Specific Chemical Substances § 721.10180 Trifunctional acrylic ester (generic). (a) Chemical substance and... acrylic ester (PMN P-04-692) is subject to reporting under this section for the significant new...

  15. 40 CFR 721.2825 - Alkyl ester (generic name).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Alkyl ester (generic name). 721.2825... Substances § 721.2825 Alkyl ester (generic name). (a) Chemical substance and significant new uses subject to reporting. (1) The chemical substance alkyl ester (PMN P-84-968) is subject to reporting under this...

  16. 17 CFR 230.135a - Generic advertising.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false Generic advertising. 230.135a... REGULATIONS, SECURITIES ACT OF 1933 General § 230.135a Generic advertising. (a) For the purposes only of section 5 of the Act, a notice, circular, advertisement, letter, sign, or other communication,...

  17. Generic medicines: Perceptions of Physicians in Basrah, Iraq

    Directory of Open Access Journals (Sweden)

    Adheed Khalid Sharrad

    2009-08-01

    Full Text Available Background: The use of cheaper generic medicines is a strategy promoted in many countries to reduce rising health care costs. The aim of this study was to explore factors affecting generic medicine prescribing by physicians in Basrah, Iraq. Methodology: A purposive sample of ten physicians practicing in Basrah was interviewed using a semi-structured interview guide. Results: Analysis of the interviews identified seven major themes: medicine prescribing practice, knowledge of the therapeutic equivalency of generic medicines, patients' acceptance of generic medicines, counterfeit medicines, drug information sources and the effect of drug advertising on medicine choices, brand substitution practice by community pharmacists, and finally strategies to improve generic medicine usefulness. Participants identified helpful strategies to increase generic prescribing, including physician and patient education on generic medicines; persuading physicians of the safety and efficacy of generic medicines; and educating senior medical students on generic prescribing. Conclusion: The data suggest that participants were enthusiastic about prescribing generic medicines. However, physicians insist that pharmacists should not be allowed to substitute generic drugs without the prior approval of doctors.

  18. Generic Attributes as Espoused Theory: The Importance of Context

    Science.gov (United States)

    Jones, Anna

    2009-01-01

    There has been considerable interest in generic attributes in higher education for over a decade, and yet while generic skills or attributes are an important aspect of policy, there is often a lack of consistency between beliefs about the importance of these skills and attributes and the degree to which they exist in teaching practice. There has been an…

  19. 40 CFR 721.10073 - Modified alkyl acrylamide (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Modified alkyl acrylamide (generic... Specific Chemical Substances § 721.10073 Modified alkyl acrylamide (generic). (a) Chemical substance and... acrylamide (PMN P-05-536) is subject to reporting under this section for the significant new uses described...

  20. 40 CFR 721.10127 - Alkenyl dimethyl betaine (generic).

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Alkenyl dimethyl betaine (generic... Specific Chemical Substances § 721.10127 Alkenyl dimethyl betaine (generic). (a) Chemical substance and... dimethyl betaine (PMN P-06-693) is subject to reporting under this section for the significant new...