WorldWideScience

Sample records for significantly outperform standard

  1. Do bilinguals outperform monolinguals?

    Directory of Open Access Journals (Sweden)

    Sejdi Sejdiu

    2016-11-01

    Full Text Available The relationship between second dialect acquisition and the psychological capacity of the learner is still a divisive topic that generates a lot of debate. A few researchers contend that the acquisition of a second dialect tends to improve cognitive abilities in some individuals, but at the same time it could hinder the same abilities in others. Currently, immersion is a common occurrence in some countries. In the recent past, it has increased significantly in popularity, which has caused parents, professionals, and researchers to question whether second language acquisition has a positive impact on cognitive development, encompassing psychological ability. In summary, the aim is to understand the effects of using a second language on the literacy aptitudes connected with the native language. Until recently, bilingualism was seen as a disadvantage, on the assumption that the presence of two languages would hinder or delay language development. However, recent studies have shown that bilinguals outperform monolinguals in tasks which require more attention.

  2. Smiling on the Inside: The Social Benefits of Suppressing Positive Emotions in Outperformance Situations.

    Science.gov (United States)

    Schall, Marina; Martiny, Sarah E; Goetz, Thomas; Hall, Nathan C

    2016-05-01

    Although expressing positive emotions is typically socially rewarded, in the present work, we predicted that people suppress positive emotions and thereby experience social benefits when outperformed others are present. We tested our predictions in three experimental studies with high school students. In Studies 1 and 2, we manipulated the type of social situation (outperformance vs. non-outperformance) and assessed suppression of positive emotions. In both studies, individuals reported suppressing positive emotions more in outperformance situations than in non-outperformance situations. In Study 3, we manipulated the social situation (outperformance vs. non-outperformance) as well as the videotaped person's expression of positive emotions (suppression vs. expression). The findings showed that when outperforming others, individuals were indeed evaluated more positively when they suppressed rather than expressed their positive emotions, and demonstrate the importance of the specific social situation with respect to the effects of suppression. © 2016 by the Society for Personality and Social Psychology, Inc.

  3. Automated Facial Coding Software Outperforms People in Recognizing Neutral Faces as Neutral from Standardized Datasets

    Directory of Open Access Journals (Sweden)

    Peter eLewinski

    2015-09-01

    Full Text Available Little is known about people’s accuracy of recognizing neutral faces as neutral. In this paper, I demonstrate the importance of knowing how well people recognize neutral faces. I contrasted human recognition scores of 100 typical, neutral front-up facial images with scores of an arguably objective judge – automated facial coding (AFC software. I hypothesized that the software would outperform humans in recognizing neutral faces because of the inherently objective nature of computer algorithms. Results confirmed this hypothesis. I provided the first-ever evidence that computer software (90% was more accurate in recognizing neutral faces than people were (59%. I posited two theoretical mechanisms, i.e. smile-as-a-baseline and false recognition of emotion, as possible explanations for my findings.

  4. Hip fracture risk assessment: artificial neural network outperforms conditional logistic regression in an age- and sex-matched case control study.

    Science.gov (United States)

    Tseng, Wo-Jan; Hung, Li-Wei; Shieh, Jiann-Shing; Abbod, Maysam F; Lin, Jinn

    2013-07-15

    Osteoporotic hip fractures, with their significant morbidity and excess mortality among the elderly, have imposed huge health and economic burdens on societies worldwide. In this age- and sex-matched case control study, we examined the risk factors of hip fractures and assessed the fracture risk by conditional logistic regression (CLR) and an ensemble artificial neural network (ANN). The performances of these two classifiers were compared. The study population consisted of 217 pairs (149 women and 68 men) of fractures and controls with an age older than 60 years. All the participants were interviewed with the same standardized questionnaire including questions on 66 risk factors in 12 categories. Univariate CLR analysis was initially conducted to examine the unadjusted odds ratio of all potential risk factors. The significant risk factors were then tested by multivariate analyses. For fracture risk assessment, the participants were randomly divided into modeling and testing datasets for 10-fold cross validation analyses. The predicting models built by CLR and ANN in the modeling datasets were applied to the testing datasets for generalization study. The performances, including discrimination and calibration, were compared with non-parametric Wilcoxon tests. In univariate CLR analyses, 16 variables reached statistical significance, and six of them remained significant in multivariate analyses, including low T score, low BMI, low MMSE score, milk intake, walking difficulty, and significant fall at home. For discrimination, ANN outperformed CLR in both the 16- and 6-variable analyses in the modeling and testing datasets. The significant risk factors of hip fracture are more personal than environmental. With adequate model construction, ANN may outperform CLR in both discrimination and calibration. ANN seems not to have been developed to its full potential, and efforts should be made to improve its performance.
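
    To make the comparison above concrete, the following minimal Python sketch runs 10-fold cross-validation of a logistic-regression baseline against a small neural network and scores both by area under the ROC curve. It is illustrative only: the data are synthetic, scikit-learn is assumed to be available, and plain logistic regression stands in for conditional logistic regression, which scikit-learn does not provide.

      # Hedged sketch: compare a logistic-regression baseline with a small
      # neural network by 10-fold cross-validated ROC AUC on synthetic data
      # standing in for the matched case-control questionnaire.
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import StratifiedKFold, cross_val_score

      # Fake risk-factor matrix: 434 subjects (217 pairs), 16 candidate predictors.
      X, y = make_classification(n_samples=434, n_features=16, n_informative=6,
                                 random_state=0)

      models = {
          "logistic regression": make_pipeline(StandardScaler(),
                                               LogisticRegression(max_iter=1000)),
          "neural network": make_pipeline(StandardScaler(),
                                          MLPClassifier(hidden_layer_sizes=(8,),
                                                        max_iter=2000, random_state=0)),
      }

      cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
      for name, model in models.items():
          aucs = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
          print(f"{name}: mean AUC = {aucs.mean():.3f} +/- {aucs.std():.3f}")

    The per-fold AUC arrays produced this way could then be compared with a non-parametric Wilcoxon test (scipy.stats.wilcoxon), in the spirit of the comparison reported above.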

  5. Weak-value measurements can outperform conventional measurements

    International Nuclear Information System (INIS)

    Magaña-Loaiza, Omar S; Boyd, Robert W; Harris, Jérémie; Lundeen, Jeff S

    2017-01-01

    In this paper we provide a simple, straightforward example of a specific situation in which weak-value amplification (WVA) clearly outperforms conventional measurement in determining the angular orientation of an optical component. We also offer a perspective reconciling the views of some theorists, who claim WVA to be inherently sub-optimal for parameter estimation, with the perspective of the many experimentalists and theorists who have used the procedure to successfully access otherwise elusive phenomena. (invited comment)

  6. Complementary Variety: When Can Cooperation in Uncertain Environments Outperform Competitive Selection?

    Directory of Open Access Journals (Sweden)

    Martin Hilbert

    2017-01-01

    Full Text Available Evolving biological and socioeconomic populations can sometimes increase their growth rate by cooperatively redistributing resources among their members. In unchanging environments, this simply comes down to reallocating resources to fitter types. In uncertain and fluctuating environments, cooperation cannot always outperform blind competitive selection. When can it? The conditions depend on the particular shape of the fitness landscape. The article derives a single measure that quantifies by how much an intervention in stochastic environments can possibly outperform the blind forces of natural selection. It is a multivariate and multilevel measure that essentially quantifies the amount of complementary variety between different population types and environmental states. The more complementary the fitness of types in different environmental states, the proportionally larger the potential benefit of strategic cooperation over competitive selection. With complementary variety, holding population shares constant will always outperform natural and market selection (including bet-hedging, portfolio management, and stochastic switching). The result can be used both to determine the acceptable cost of learning the details of a fitness landscape and to design multilevel classification systems of population types and environmental states that maximize population growth. Two empirical cases are explored, one from the evolving economy and the other one from migrating birds.

  7. Reciprocity Outperforms Conformity to Promote Cooperation.

    Science.gov (United States)

    Romano, Angelo; Balliet, Daniel

    2017-10-01

    Evolutionary psychologists have proposed two processes that could give rise to the pervasiveness of human cooperation observed among individuals who are not genetically related: reciprocity and conformity. We tested whether reciprocity outperformed conformity in promoting cooperation, especially when these psychological processes would promote a different cooperative or noncooperative response. To do so, across three studies, we observed participants' cooperation with a partner after learning (a) that their partner had behaved cooperatively (or not) on several previous trials and (b) that their group members had behaved cooperatively (or not) on several previous trials with that same partner. Although we found that people both reciprocate and conform, reciprocity has a stronger influence on cooperation. Moreover, we found that conformity can be partly explained by a concern about one's reputation, a finding that supports a reciprocity framework.

  8. A paclitaxel-loaded recombinant polypeptide nanoparticle outperforms Abraxane in multiple murine cancer models

    Science.gov (United States)

    Bhattacharyya, Jayanta; Bellucci, Joseph J.; Weitzhandler, Isaac; McDaniel, Jonathan R.; Spasojevic, Ivan; Li, Xinghai; Lin, Chao-Chieh; Chi, Jen-Tsan Ashley; Chilkoti, Ashutosh

    2015-08-01

    Packaging clinically relevant hydrophobic drugs into a self-assembled nanoparticle can improve their aqueous solubility, plasma half-life, tumour-specific uptake and therapeutic potential. To this end, here we conjugated paclitaxel (PTX) to recombinant chimeric polypeptides (CPs) that spontaneously self-assemble into ~60 nm near-monodisperse nanoparticles that increased the systemic exposure of PTX by sevenfold compared with free drug and twofold compared with the Food and Drug Administration-approved taxane nanoformulation (Abraxane). The tumour uptake of the CP-PTX nanoparticle was fivefold greater than free drug and twofold greater than Abraxane. In a murine cancer model of human triple-negative breast cancer and prostate cancer, CP-PTX induced near-complete tumour regression after a single dose in both tumour models, whereas at the same dose, no mice treated with Abraxane survived for >80 days (breast) and 60 days (prostate), respectively. These results show that a molecularly engineered nanoparticle with precisely engineered design features outperforms Abraxane, the current gold standard for PTX delivery.

  9. Stochastic gradient ascent outperforms gamers in the Quantum Moves game

    Science.gov (United States)

    Sels, Dries

    2018-04-01

    In a recent work on quantum state preparation, Sørensen and co-workers [Nature (London) 532, 210 (2016), 10.1038/nature17620] explore the possibility of using video games to help design quantum control protocols. The authors present a game called "Quantum Moves" (https://www.scienceathome.org/games/quantum-moves/) in which gamers have to move an atom from A to B by means of optical tweezers. They report that, "players succeed where purely numerical optimization fails." Moreover, by harnessing the player strategies, they can "outperform the most prominent established numerical methods." The aim of this Rapid Communication is to analyze the problem in detail and show that those claims are untenable. In fact, without any prior knowledge and starting from a random initial seed, a simple stochastic local optimization method finds near-optimal solutions which outperform all players. Counterdiabatic driving can even be used to generate protocols without resorting to numeric optimization. The analysis results in an accurate analytic estimate of the quantum speed limit which, apart from zero-point motion, is shown to be entirely classical in nature. The latter might explain why gamers are reasonably good at the game. A simple modification of the BringHomeWater challenge is proposed to test this hypothesis.
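
    As an illustration of the kind of stochastic local optimization the comment describes, a random initial protocol can be improved by keeping only perturbations that increase the objective. The sketch below uses an invented, easily optimized objective rather than the actual Quantum Moves control problem, and assumes only NumPy.

      # Toy stochastic ascent over a discretized control protocol: start from a
      # random seed and accept random perturbations that improve a fidelity-like
      # objective. The objective here is a placeholder, not the real physics.
      import numpy as np

      rng = np.random.default_rng(0)
      n_steps = 64                                       # time discretization
      target = np.sin(np.linspace(0, np.pi, n_steps))    # unknown "optimal" protocol

      def fidelity(protocol):
          # Placeholder objective: equals 1 at the optimum, decreases with distance.
          return 1.0 - np.mean((protocol - target) ** 2)

      protocol = rng.uniform(-1, 1, n_steps)             # random initial seed
      best = fidelity(protocol)
      for _ in range(20000):
          trial = protocol + rng.normal(scale=0.05, size=n_steps)
          f = fidelity(trial)
          if f > best:                                   # greedy acceptance
              protocol, best = trial, f

      print(f"final fidelity: {best:.4f}")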

  10. Deep Convolutional Neural Networks Outperform Feature-Based But Not Categorical Models in Explaining Object Similarity Judgments

    Science.gov (United States)

    Jozwik, Kamila M.; Kriegeskorte, Nikolaus; Storrs, Katherine R.; Mur, Marieke

    2017-01-01

    Recent advances in Deep convolutional Neural Networks (DNNs) have enabled unprecedentedly accurate computational models of brain representations, and present an exciting opportunity to model diverse cognitive functions. State-of-the-art DNNs achieve human-level performance on object categorisation, but it is unclear how well they capture human behavior on complex cognitive tasks. Recent reports suggest that DNNs can explain significant variance in one such task, judging object similarity. Here, we extend these findings by replicating them for a rich set of object images, comparing performance across layers within two DNNs of different depths, and examining how the DNNs’ performance compares to that of non-computational “conceptual” models. Human observers performed similarity judgments for a set of 92 images of real-world objects. Representations of the same images were obtained in each of the layers of two DNNs of different depths (8-layer AlexNet and 16-layer VGG-16). To create conceptual models, other human observers generated visual-feature labels (e.g., “eye”) and category labels (e.g., “animal”) for the same image set. Feature labels were divided into parts, colors, textures and contours, while category labels were divided into subordinate, basic, and superordinate categories. We fitted models derived from the features, categories, and from each layer of each DNN to the similarity judgments, using representational similarity analysis to evaluate model performance. In both DNNs, similarity within the last layer explains most of the explainable variance in human similarity judgments. The last layer outperforms almost all feature-based models. Late and mid-level layers outperform some but not all feature-based models. Importantly, categorical models predict similarity judgments significantly better than any DNN layer. Our results provide further evidence for commonalities between DNNs and brain representations. Models derived from visual features
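
    The model-evaluation step described above, representational similarity analysis, amounts to correlating a model layer's representational dissimilarity matrix (RDM) with the human similarity judgments. A minimal sketch follows, with random arrays standing in for the actual DNN activations and behavioral data; NumPy and SciPy are assumed.

      # Representational similarity analysis in miniature: build a model RDM from
      # layer activations and correlate it with human dissimilarity judgments.
      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      n_images, n_units = 92, 4096

      layer_activations = rng.normal(size=(n_images, n_units))   # stand-in for one DNN layer
      human_dissimilarity = rng.uniform(size=n_images * (n_images - 1) // 2)  # judged pairs

      # Model RDM: pairwise correlation distance between image representations,
      # in condensed upper-triangle form as returned by pdist.
      model_rdm = pdist(layer_activations, metric="correlation")

      rho, p = spearmanr(model_rdm, human_dissimilarity)
      print(f"model-human RDM correlation: rho = {rho:.3f} (p = {p:.3g})")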

  11. Using Outperformance Pay to Motivate Academics: Insiders' Accounts of Promises and Problems

    Science.gov (United States)

    Field, Laurie

    2015-01-01

    Many researchers have investigated the appropriateness of pay for outperformance (also called "merit-based pay" and "performance-based pay") for academics, but a review of this body of work shows that the voice of academics themselves is largely absent. This article is a contribution to addressing this gap, summarising the…

  12. Toward a biologically significant and usable standard for ozone that will also protect plants

    International Nuclear Information System (INIS)

    Paoletti, Elena; Manning, William J.

    2007-01-01

    Ozone remains an important phytotoxic air pollutant and is also recognized as a significant greenhouse gas. In North America, Europe, and Asia, incidence of high concentrations is decreasing, but background levels are steadily rising. There is a need to develop a biologically significant and usable standard for ozone. We compare the strengths and weaknesses of concentration-based, exposure-based and threshold-based indices, such as SUM60 and AOT40, and examine the O3 flux concept. We also present major challenges to the development of an air quality standard for ozone that has both biological significance and practicality in usage. - Current standards do not protect vegetation from ozone, but progress is being made

  13. Female Chess Players Outperform Expectations When Playing Men.

    Science.gov (United States)

    Stafford, Tom

    2018-03-01

    Stereotype threat has been offered as a potential explanation of differential performance between men and women in some cognitive domains. Questions remain about the reliability and generality of the phenomenon. Previous studies have found that stereotype threat is activated in female chess players when they are matched against male players. I used data from over 5.5 million games of international tournament chess and found no evidence of a stereotype-threat effect. In fact, female players outperform expectations when playing men. Further analysis showed no influence of degree of challenge, player age, or prevalence of female role models in national chess leagues on differences in performance when women play men versus when they play women. Though this analysis contradicts one specific mechanism of influence of gender stereotypes, the persistent differences between male and female players suggest that systematic factors do exist and remain to be uncovered.

  14. Physiological outperformance at the morphologically-transformed edge of the cyanobacteriosponge Terpios hoshinota (Suberitidae: Hadromerida) when confronting opponent corals.

    Directory of Open Access Journals (Sweden)

    Jih-Terng Wang

    Full Text Available Terpios hoshinota, an encrusting cyanosponge, is known as a strong substrate competitor of reef-building corals that kills encountered coral by overgrowth. Terpios outbreaks cause significant declines in living coral cover in Indo-Pacific coral reefs, with the damage usually lasting for decades. Recent studies show that there are morphological transformations at a sponge's growth front when confronting corals. Whether these morphological transformations at coral contacts are involved with physiological outperformance (e.g., higher metabolic activity or nutritional status) over other portions of Terpios remains equivocal. In this study, we compared the indicators of photosynthetic capability and nitrogen status of a sponge-cyanobacteria association at proximal, middle, and distal portions of opponent corals. Terpios tissues in contact with corals displayed significant increases in photosynthetic oxygen production (ca. 61%), the δ13C value (ca. 4%), free proteinogenic amino acid content (ca. 85%), and Gln/Glu ratio (ca. 115%) compared to middle and distal parts of the sponge. In contrast, the maximum quantum yield (Fv/Fm), which is the indicator usually used to represent the integrity of photosystem II, of cyanobacteria photosynthesis was low (0.256~0.319) and showed an inverse trend of higher values in the distal portion of the sponge that might be due to high and variable levels of cyanobacterial phycocyanin. The inconsistent results between photosynthetic oxygen production and Fv/Fm values indicated that maximum quantum yields might not be a suitable indicator to represent the photosynthetic function of the Terpios-cyanobacteria association. Our data conclusively suggest that Terpios hoshinota competes with opponent corals not only by the morphological transformation of the sponge-cyanobacteria association but also by physiological outperformance in accumulating resources for the battle.

  15. Proteome Profiling Outperforms Transcriptome Profiling for Coexpression Based Gene Function Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jing; Ma, Zihao; Carr, Steven A.; Mertins, Philipp; Zhang, Hui; Zhang, Zhen; Chan, Daniel W.; Ellis, Matthew J. C.; Townsend, R. Reid; Smith, Richard D.; McDermott, Jason E.; Chen, Xian; Paulovich, Amanda G.; Boja, Emily S.; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Rodland, Karin D.; Liebler, Daniel C.; Zhang, Bing

    2016-11-11

    Coexpression of mRNAs under multiple conditions is commonly used to infer cofunctionality of their gene products despite well-known limitations of this “guilt-by-association” (GBA) approach. Recent advancements in mass spectrometry-based proteomic technologies have enabled global expression profiling at the protein level; however, whether proteome profiling data can outperform transcriptome profiling data for coexpression based gene function prediction has not been systematically investigated. Here, we address this question by constructing and analyzing mRNA and protein coexpression networks for three cancer types with matched mRNA and protein profiling data from The Cancer Genome Atlas (TCGA) and the Clinical Proteomic Tumor Analysis Consortium (CPTAC). Our analyses revealed a marked difference in wiring between the mRNA and protein coexpression networks. Whereas protein coexpression was driven primarily by functional similarity between coexpressed genes, mRNA coexpression was driven by both cofunction and chromosomal colocalization of the genes. Functionally coherent mRNA modules were more likely to have their edges preserved in corresponding protein networks than functionally incoherent mRNA modules. Proteomic data strengthened the link between gene expression and function for at least 75% of Gene Ontology (GO) biological processes and 90% of KEGG pathways. A web application, Gene2Net (http://cptac.gene2net.org), developed based on the three protein coexpression networks, revealed novel gene-function relationships, such as linking ERBB2 (HER2) to lipid biosynthetic process in breast cancer, identifying PLG as a new gene involved in complement activation, and identifying AEBP1 as a new epithelial-mesenchymal transition (EMT) marker. Our results demonstrate that proteome profiling outperforms transcriptome profiling for coexpression based gene function prediction. Proteomics should be integrated, if not preferred, in gene function and human disease studies.
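
    The "guilt-by-association" idea evaluated above can be sketched in a few lines: build a gene-by-gene coexpression matrix from an expression table and predict a gene's annotation by a vote over its most strongly coexpressed neighbors. The data and labels below are synthetic placeholders, and the same code applies whether the rows are mRNA or protein profiles; only NumPy is assumed.

      # Minimal guilt-by-association sketch on a coexpression network.
      import numpy as np

      rng = np.random.default_rng(0)
      n_genes, n_samples = 200, 50
      expression = rng.normal(size=(n_genes, n_samples))             # placeholder profiles
      annotations = rng.choice(["pathway_A", "pathway_B"], n_genes)  # known labels

      coexpr = np.corrcoef(expression)      # gene-gene Pearson correlation matrix
      np.fill_diagonal(coexpr, -np.inf)     # ignore self-correlation

      def predict_function(gene_idx, k=10):
          """Majority vote over the k most strongly coexpressed genes."""
          neighbors = np.argsort(coexpr[gene_idx])[-k:]
          labels, counts = np.unique(annotations[neighbors], return_counts=True)
          return labels[np.argmax(counts)]

      print(predict_function(0))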

  16. Change in end-tidal carbon dioxide outperforms other surrogates for change in cardiac output during fluid challenge.

    Science.gov (United States)

    Lakhal, K; Nay, M A; Kamel, T; Lortat-Jacob, B; Ehrmann, S; Rozec, B; Boulain, T

    2017-03-01

    During fluid challenge, the volume expansion (VE)-induced increase in cardiac output (ΔVE CO) is seldom measured. In patients with shock undergoing strictly controlled mechanical ventilation and receiving VE, we assessed minimally invasive surrogates for ΔVE CO (measured by transthoracic echocardiography): fluid-induced increases in end-tidal carbon dioxide (ΔVE E'CO2); pulse (ΔVE PP), systolic (ΔVE SBP), and mean systemic blood pressure (ΔVE MBP); and femoral artery Doppler flow (ΔVE FemFlow). In the absence of arrhythmia, fluid-induced decrease in heart rate (ΔVE HR) and in pulse pressure respiratory variation (ΔVE PPV) were also evaluated. Areas under the receiver operating characteristic curves (AUCROCs) reflect the ability to identify a response to VE (ΔVE CO ≥15%). In 86 patients, ΔVE E'CO2 had an AUCROC of 0.82 [interquartile range 0.73-0.90], significantly higher than the AUCROCs for ΔVE PP, ΔVE SBP, ΔVE MBP, and ΔVE FemFlow (AUCROC=0.61-0.65; all comparisons significant). A ΔVE E'CO2 >1 mm Hg (>0.13 kPa) had good positive (5.0 [2.6-9.8]) and fair negative (0.29 [0.2-0.5]) likelihood ratios. The 16 patients with arrhythmia had relationships between ΔVE E'CO2 and ΔVE CO similar to those of patients with regular rhythm (r2=0.23 in both subgroups). In 60 patients with no arrhythmia, ΔVE E'CO2 (AUCROC=0.84 [0.72-0.92]) outperformed ΔVE HR (AUCROC=0.52 [0.39-0.66]) but not ΔVE PPV (AUCROC=0.73 [0.60-0.84], P=0.21). In the 45 patients with no arrhythmia who were receiving protective ventilation with low tidal volumes, ΔVE E'CO2 outperformed ΔVE PPV (AUCROC=0.86 [0.72-0.95] vs 0.66 [0.49-0.80], P=0.02). ΔVE E'CO2 outperformed ΔVE PP, ΔVE SBP, ΔVE MBP, ΔVE FemFlow, and ΔVE HR and, during protective ventilation, arrhythmia, or both, it also outperformed ΔVE PPV. A value of ΔVE E'CO2 >1 mm Hg (>0.13 kPa) indicated a likely response to VE. © The Author 2017. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com
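
    For readers less familiar with the diagnostic statistics quoted above, the positive and negative likelihood ratios for a threshold test (for example, an end-tidal CO2 rise of more than 1 mm Hg against a cardiac-output rise of at least 15%) follow directly from sensitivity and specificity. The Python lines below use invented counts purely to show the arithmetic; they are not the study's data.

      # Likelihood ratios from a hypothetical 2x2 table at a fixed threshold.
      tp, fp, fn, tn = 30, 8, 10, 38   # illustrative counts only

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      lr_positive = sensitivity / (1 - specificity)   # higher rules a response in
      lr_negative = (1 - sensitivity) / specificity   # lower rules a response out

      print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
      print(f"LR+ = {lr_positive:.1f}, LR- = {lr_negative:.2f}")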

  17. PROMIS PF CAT Outperforms the ODI and SF-36 Physical Function Domain in Spine Patients.

    Science.gov (United States)

    Brodke, Darrel S; Goz, Vadim; Voss, Maren W; Lawrence, Brandon D; Spiker, William Ryan; Hung, Man

    2017-06-15

    The Oswestry Disability Index v2.0 (ODI), SF36 Physical Function Domain (SF-36 PFD), and PROMIS Physical Function CAT v1.2 (PF CAT) questionnaires were prospectively collected from 1607 patients complaining of back or leg pain, visiting a university-based spine clinic. All questionnaires were collected electronically, using a tablet computer. The aim of this study was to compare the psychometric properties of the PROMIS PF CAT with the ODI and SF36 Physical Function Domain in the same patient population. Evidence-based decision-making is improved by using high-quality patient-reported outcomes measures. Prior studies have revealed the shortcomings of the ODI and SF36, commonly used in spine patients. The PROMIS Network has developed measures with excellent psychometric properties. The Physical Function domain, delivered by Computerized Adaptive Testing (PF CAT), performs well in the spine patient population, though to date direct comparisons with common measures have not been performed. Standard Rasch analysis was performed to directly compare the psychometrics of the PF CAT, ODI, and SF36 PFD. Spearman correlations were computed to examine the correlations of the three instruments. Time required for administration was also recorded. One thousand six hundred seven patients were administered all assessments. The time required to answer all items in the PF CAT, ODI, and SF-36 PFD was 44, 169, and 99 seconds, respectively. The ceiling and floor effects were excellent for the PF CAT (0.81%, 3.86%), while the ceiling effects were marginal and floor effects quite poor for the ODI (6.91% and 44.24%) and SF-36 PFD (5.97% and 23.65%). All instruments significantly correlated with each other. The PROMIS PF CAT outperforms the ODI and SF-36 PFD in the spine patient population and is highly correlated with them. It has better coverage, while taking less time to administer with fewer questions to answer. Level of Evidence: 2.

  18. Outperforming markets

    DEFF Research Database (Denmark)

    Nielsen, Christian; Rimmel, Gunnar; Yosano, Tadanori

    2015-01-01

    This article studies the effects of disclosure practices of Japanese IPO prospectuses on long-term stock performance and bid-ask spread, as a proxy for cost of capital, after a company is admitted to the stock exchange. A disclosure index methodology is applied to 120 IPO prospectuses from 2003....... Intellectual capital information leads to significantly better long-term performance against a reference portfolio, and is thus important to the capital market. Further, superior disclosure of IC reduces bid-ask spread in the long-term, indicating that such disclosures are important in an IPO setting. Analysts...

  19. The Development and Significance of Standards for Smoking-Machine Methodology

    Directory of Open Access Journals (Sweden)

    Baker R

    2014-12-01

    Full Text Available Bialous and Yach have recently published an article in Tobacco Control in which they claim that all smoking-machine standards stem from a method developed unilaterally by the tobacco industry within the Cooperation Centre for Scientific Research Relative to Tobacco (CORESTA). Using a few highly selective quotations from internal tobacco company memos, they allege, inter alia, that the tobacco industry has changed the method to suit its own needs, that because humans do not smoke like machines the standards are of little value, and that the tobacco industry has unjustifiably made health claims about low “tar” cigarettes. The objectives of this paper are to review the development of smoking-machine methodology and standards and the involvement of the relevant parties, outline the significance of the results, and explore the validity of Bialous and Yach's claims. The large volume of published scientific information on the subject together with other information in the public domain has been consulted. When this information is taken into account it becomes obvious that the very narrow and restricted literature base of Bialous and Yach's analysis has resulted in them, perhaps inadvertently, making factual errors, drawing wrong conclusions and writing inaccurate statements on many aspects of the subject. The first smoking-machine standard was specified by the Federal Trade Commission (FTC), a federal government agency in the USA, in 1966. The CORESTA Recommended Method, similar in many aspects to that of the FTC, was developed in the late 1960s and published in 1969. Small differences in the butt lengths, smoke collection and analytical procedures in methods used in various countries including Germany, Canada and the UK, developed later, resulted in about a 10% difference in smoke “tar” yields. These differences in methodology were harmonised in a common International Organisation for Standardisation (ISO) Standard Method in 1991, after a considerable amount

  20. A Mozart is not a Pavarotti: singers outperform instrumentalists on foreign accent imitation.

    Science.gov (United States)

    Christiner, Markus; Reiterer, Susanne Maria

    2015-01-01

    Recent findings have shown that people with higher musical aptitude were also better in oral language imitation tasks. However, whether singing capacity and instrument playing contribute differently to the imitation of speech has been ignored so far. Research has only recently started to recognize that instrumentalists develop quite distinct skills compared to vocalists. In the same vein, the role of the vocal motor system in language acquisition has been poorly investigated, as most studies (neurobiological and behavioral) favor examining speech perception. We set out to test whether the vocal motor system can influence the ability to learn, produce and perceive new languages by contrasting instrumentalists and vocalists. Therefore, we investigated 96 participants: 27 instrumentalists, 33 vocalists and 36 non-musicians/non-singers. They were tested for their ability to imitate foreign speech in an unknown language (Hindi) and a second language (English), and for their musical aptitude. Results revealed that both instrumentalists and vocalists have a higher ability to imitate unintelligible speech and foreign accents than non-musicians/non-singers. Within the musician group, vocalists outperformed instrumentalists significantly. First, adaptive plasticity for speech imitation is not reliant on audition alone but also on vocal-motor induced processes. Second, vocal flexibility of singers goes together with higher speech imitation aptitude. Third, vocal motor training, as in singers, may speed up foreign language acquisition processes.

  1. Implementation of standardized follow-up care significantly reduces peritonitis in children on chronic peritoneal dialysis.

    Science.gov (United States)

    Neu, Alicia M; Richardson, Troy; Lawlor, John; Stuart, Jayne; Newland, Jason; McAfee, Nancy; Warady, Bradley A

    2016-06-01

    The Standardizing Care to improve Outcomes in Pediatric End stage renal disease (SCOPE) Collaborative aims to reduce peritonitis rates in pediatric chronic peritoneal dialysis patients by increasing implementation of standardized care practices. To assess this, monthly care bundle compliance and annualized monthly peritonitis rates were evaluated from 24 SCOPE centers that were participating at collaborative launch and that provided peritonitis rates for the 13 months prior to launch. Changes in bundle compliance were assessed using either a logistic regression model or a generalized linear mixed model. Changes in average annualized peritonitis rates over time were illustrated using the latter model. In the first 36 months of the collaborative, 644 patients with 7977 follow-up encounters were included. The likelihood of compliance with follow-up care practices increased significantly (odds ratio 1.15, 95% confidence interval 1.10, 1.19). Mean monthly peritonitis rates significantly decreased from 0.63 episodes per patient year (95% confidence interval 0.43, 0.92) prelaunch to 0.42 (95% confidence interval 0.31, 0.57) at 36 months postlaunch. A sensitivity analysis confirmed that as mean follow-up compliance increased, peritonitis rates decreased, reaching statistical significance at 80% at which point the prelaunch rate was 42% higher than the rate in the months following achievement of 80% compliance. In its first 3 years, the SCOPE Collaborative has increased the implementation of standardized follow-up care and demonstrated a significant reduction in average monthly peritonitis rates. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.

  2. A generic standard for assessing and managing activities with significant risk to health and safety

    International Nuclear Information System (INIS)

    Wilde, T.S.; Sandquist, G.M.

    2005-01-01

    Some operations and activities in industry, business, and government can present an unacceptable risk to health and safety if not performed according to established safety practices and documented procedures. The nuclear industry has extensive experience and commitment to assessing and controlling such risks. This paper provides a generic standard based upon DOE Standard DOE-STD-3007-93, Nov 1993, Change Notice No. 1, Sep 1998. This generic standard can be used to assess practices and procedures employed by any industrial and government entity to ensure that an acceptable level of safety and control prevails for such operations. When any activity or operation is determined to involve significant risk to health and safety of workers or the public, the organization should adopt and establish an appropriate standard and methodology to ensure that adequate health and safety prevail. This paper uses DOE experience and standards to address activities with recognized potential for impact upon health and safety. Existing and future assessments of health and safety issues can be compared and evaluated against this generic standard for ensuring that proper planning, analysis, review, and approval have been made. (authors)

  3. Standard filtration practices may significantly distort planktonic microbial diversity estimates

    Directory of Open Access Journals (Sweden)

    Cory Cruz Padilla

    2015-06-01

    Full Text Available Fractionation of biomass by filtration is a standard method for sampling planktonic microbes. It is unclear how the taxonomic composition of filtered biomass changes depending on sample volume. Using seawater from a marine oxygen minimum zone, we quantified the 16S rRNA gene composition of biomass on a prefilter (1.6 μm pore-size) and a downstream 0.2 μm filter over sample volumes from 0.05 to 5 L. Significant community shifts occurred in both filter fractions, and were most dramatic in the prefilter community. Sequences matching Vibrionales decreased from ~40-60% of prefilter datasets at low volumes (0.05-0.5 L) to less than 5% at higher volumes, while groups such as the Chromatiales and Thiohalorhabdales followed opposite trends, increasing from minor representation to become the dominant taxa at higher volumes. Groups often associated with marine particles, including members of the Deltaproteobacteria, Planctomycetes and Bacteroidetes, were among those showing the greatest increase with volume (4- to 27-fold). Taxon richness (97% similarity clusters) also varied significantly with volume, and in opposing directions depending on filter fraction, highlighting potential biases in community complexity estimates. These data raise concerns for studies using filter fractionation for quantitative comparisons of aquatic microbial diversity, for example between free-living and particle-associated communities.

  4. Sex Differences in Spatial Memory in Brown-Headed Cowbirds: Males Outperform Females on a Touchscreen Task.

    Directory of Open Access Journals (Sweden)

    Mélanie F Guigueno

    Full Text Available Spatial cognition in females and males can differ in species in which there are sex-specific patterns in the use of space. Brown-headed cowbirds are brood parasites that show a reversal of sex-typical space use often seen in mammals. Female cowbirds search for, revisit and parasitize host nests, have a larger hippocampus than males, and have better memory than males for a rewarded location in an open spatial environment. In the current study, we tested female and male cowbirds in breeding and non-breeding conditions on a touchscreen delayed-match-to-sample task using both spatial and colour stimuli. Our goal was to determine whether sex differences in spatial memory in cowbirds generalize to all spatial tasks or are task-dependent. Both sexes performed better on the spatial than on the colour touchscreen task. On the spatial task, breeding males outperformed breeding females. On the colour task, females and males did not differ, but females performed better in breeding condition than in non-breeding condition. Although female cowbirds were observed to outperform males on a previous larger-scale spatial task, males performed better than females on a task testing spatial memory in the cowbirds' immediate visual field. Spatial abilities in cowbirds can favour males or females depending on the type of spatial task, as has been observed in mammals, including humans.

  5. Sex Differences in Spatial Memory in Brown-Headed Cowbirds: Males Outperform Females on a Touchscreen Task

    Science.gov (United States)

    Guigueno, Mélanie F.; MacDougall-Shackleton, Scott A.; Sherry, David F.

    2015-01-01

    Spatial cognition in females and males can differ in species in which there are sex-specific patterns in the use of space. Brown-headed cowbirds are brood parasites that show a reversal of sex-typical space use often seen in mammals. Female cowbirds search for, revisit and parasitize host nests, have a larger hippocampus than males, and have better memory than males for a rewarded location in an open spatial environment. In the current study, we tested female and male cowbirds in breeding and non-breeding conditions on a touchscreen delayed-match-to-sample task using both spatial and colour stimuli. Our goal was to determine whether sex differences in spatial memory in cowbirds generalize to all spatial tasks or are task-dependent. Both sexes performed better on the spatial than on the colour touchscreen task. On the spatial task, breeding males outperformed breeding females. On the colour task, females and males did not differ, but females performed better in breeding condition than in non-breeding condition. Although female cowbirds were observed to outperform males on a previous larger-scale spatial task, males performed better than females on a task testing spatial memory in the cowbirds’ immediate visual field. Spatial abilities in cowbirds can favour males or females depending on the type of spatial task, as has been observed in mammals, including humans. PMID:26083573

  6. Significant improvement of optical traps by tuning standard water immersion objectives

    International Nuclear Information System (INIS)

    Reihani, S Nader S; Mir, Shahid A; Richardson, Andrew C; Oddershede, Lene B

    2011-01-01

    Focused infrared lasers are widely used for micromanipulation and visualization of biological specimens. An inherent practical problem is that off-the-shelf commercial microscope objectives are designed for use with visible and not infrared wavelengths. Less aberration is introduced by water immersion objectives than by oil immersion ones, however, even water immersion objectives induce significant aberration. We present a simple method to reduce the spherical aberration induced by water immersion objectives, namely by tuning the correction collar of the objective to a value that is ∼ 10% lower than the physical thickness of the coverslip. This results in marked improvements in optical trapping strengths of up to 100% laterally and 600% axially from a standard microscope objective designed for use in the visible range. The results are generally valid for any water immersion objective with any numerical aperture

  7. Atomic-Layer-Deposited AZO Outperforms ITO in High-Efficiency Polymer Solar Cells

    KAUST Repository

    Kan, Zhipeng

    2018-05-11

    Tin-doped indium oxide (ITO) transparent conducting electrodes are widely used across the display industry, and are currently the cornerstone of photovoltaic device developments, taking a substantial share in the manufacturing cost of large-area modules. However, cost and supply considerations are set to limit the extensive use of indium for optoelectronic device applications and, in turn, alternative transparent conducting oxide (TCO) materials are required. In this report, we show that aluminum-doped zinc oxide (AZO) thin films grown by atomic layer deposition (ALD) are sufficiently conductive and transparent to outperform ITO as the cathode in inverted polymer solar cells. Reference polymer solar cells made with atomic-layer-deposited AZO cathodes, PCE10 as the polymer donor and PC71BM as the fullerene acceptor (model systems), reach power conversion efficiencies of ca. 10% (compared to ca. 9% with ITO-coated glass), without compromising other figures of merit. These ALD-grown AZO electrodes are promising for a wide range of optoelectronic device applications relying on TCOs.

  8. Atomic-Layer-Deposited AZO Outperforms ITO in High-Efficiency Polymer Solar Cells

    KAUST Repository

    Kan, Zhipeng; Wang, Zhenwei; Firdaus, Yuliar; Babics, Maxime; Alshareef, Husam N.; Beaujuge, Pierre

    2018-01-01

    Tin-doped indium oxide (ITO) transparent conducting electrodes are widely used across the display industry, and are currently the cornerstone of photovoltaic device developments, taking a substantial share in the manufacturing cost of large-area modules. However, cost and supply considerations are set to limit the extensive use of indium for optoelectronic device applications and, in turn, alternative transparent conducting oxide (TCO) materials are required. In this report, we show that aluminum-doped zinc oxide (AZO) thin films grown by atomic layer deposition (ALD) are sufficiently conductive and transparent to outperform ITO as the cathode in inverted polymer solar cells. Reference polymer solar cells made with atomic-layer-deposited AZO cathodes, PCE10 as the polymer donor and PC71BM as the fullerene acceptor (model systems), reach power conversion efficiencies of ca. 10% (compared to ca. 9% with ITO-coated glass), without compromising other figures of merit. These ALD-grown AZO electrodes are promising for a wide range of optoelectronic device applications relying on TCOs.

  9. Factoring local sequence composition in motif significance analysis.

    Science.gov (United States)

    Ng, Patrick; Keich, Uri

    2008-01-01

    We recently introduced a biologically realistic and reliable significance analysis of the output of a popular class of motif finders. In this paper we further improve our significance analysis by incorporating local base composition information. Relying on realistic biological data simulation, as well as on FDR analysis applied to real data, we show that our method is significantly better than the increasingly popular practice of using the normal approximation to estimate the significance of a finder's output. Finally we turn to leveraging our reliable significance analysis to improve the actual motif finding task. Specifically, endowing a variant of the Gibbs Sampler with our improved significance analysis we demonstrate that de novo finders can perform better than has been perceived. Significantly, our new variant outperforms all the finders reviewed in a recently published comprehensive analysis of the Harbison genome-wide binding location data. Interestingly, many of these finders incorporate additional information such as nucleosome positioning and the significance of binding data.

  10. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Full Text Available Water quality monitoring is prone to errors in its recording or measuring process. Monitoring of river water quality not only aims to capture water quality dynamics, but also to evaluate the data for river management and water pollution policy, in order to safeguard human health, sanitation requirements, and biodiversity preservation. Evaluation of water quality monitoring needs to start by identifying the important water quality parameters. This research aimed to identify the significant parameters by using two transformation or standardization methods on the water quality data: the river Water Quality Index, WQI (Indeks Kualitas Air Sungai, IKA) transformation or standardization method, and a transformation or standardization method with mean 0 and variance 1, so that the variability of water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. Principal Component Analysis, PCA (Analisa Komponen Utama, AKU), with the help of Scilab software, was used to process the secondary data on water quality parameters of the Gadjah Wong river in 2004-2013. The Scilab result was cross-examined with the result from the Excel-based Biplot Add-In software. The results showed that only 18 of the total 35 water quality parameters had passable data quality. The two transformation or standardization methods gave different types and numbers of significant parameters. With the mean 0, variance 1 transformation or standardization, the significant water quality parameters, relative to the mean concentration of each parameter, were TDS, SO4, EC, TSS, NO3N, COD, BOD5, Grease Oil and NH3N. With the river WQI transformation or standardization, the significant water quality parameters showed the level of
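
    The second of the two methods mentioned above, standardization to mean 0 and variance 1 followed by PCA, can be sketched as follows. NumPy is used here in place of Scilab, a random matrix stands in for the monitoring records (rows are samples, columns are water-quality parameters), and the loading inspection at the end mirrors how "significant" parameters are read off the leading components.

      # Mean-0/variance-1 standardization followed by PCA on placeholder data.
      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.normal(loc=50, scale=10, size=(120, 18))   # 120 samples x 18 parameters

      # Standardize each parameter to mean 0 and variance 1.
      z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)

      # PCA via eigendecomposition of the correlation matrix (covariance of z).
      corr = np.cov(z, rowvar=False)
      eigvals, eigvecs = np.linalg.eigh(corr)
      order = np.argsort(eigvals)[::-1]
      explained = eigvals[order] / eigvals.sum()
      loadings = eigvecs[:, order]

      print("variance explained by first 3 PCs:", np.round(explained[:3], 3))
      print("parameter with largest |loading| on PC1:", np.argmax(np.abs(loadings[:, 0])))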

  11. Native Honey Bees Outperform Adventive Honey Bees in Increasing Pyrus bretschneideri (Rosales: Rosaceae) Pollination.

    Science.gov (United States)

    Gemeda, Tolera Kumsa; Shao, Youquan; Wu, Wenqin; Yang, Huipeng; Huang, Jiaxing; Wu, Jie

    2017-12-05

    The foraging behavior of different bee species is a key factor influencing the pollination efficiency of different crops. Most pear species exhibit full self-incompatibility and thus depend entirely on cross-pollination. However, little is known about the pear visitation preferences of native Apis cerana (Fabricius; Hymenoptera: Apidae) and adventive Apis mellifera (L.; Hymenoptera: Apidae) in China. A comparative analysis was performed to explore the pear-foraging differences of these species under the natural conditions of pear growing areas. The results show significant variability in the pollen-gathering tendency of these honey bees. Compared to A. mellifera, A. cerana begins foraging at an earlier time of day and gathers a larger amount of pollen in the morning. Based on pollen collection data, A. mellifera shows variable preferences: vigorously foraging on pear on the first day of observation but collecting pollen from non-target floral resources on other experimental days. Conversely, A. cerana persists in pear pollen collection, without shifting preference to other competitive flowers. Therefore, A. cerana outperforms adventive A. mellifera with regard to pear pollen collection under natural conditions, which may lead to increased pear pollination. This study supports arguments in favor of further multiplication and maintenance of A. cerana for pear and other native crop pollination. Moreover, it is essential to develop alternative pollination management techniques to utilize A. mellifera for pear pollination. © The Author(s) 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. The development and significance of the DOE Safeguards and Security standards and criteria

    International Nuclear Information System (INIS)

    Toman, J.

    1987-01-01

    In October 1985, the DOE Assistant Secretary for Defense Programs created a task force to develop inspection standards and criteria for Safeguards and Security. These standards and criteria (S/C) would provide the DOE Inspection and Evaluation (I and E) teams with the guidance needed to assess the security posture of DOE's nuclear and other important facilities. The Lawrence Livermore National Laboratory was designated the lead management organization for the structuring, administration, and execution of the overall task force effort and appointed the Executive Secretary. The Office of Security Evaluations (OSE) became the responsible DOE organization, and its Director assumed the role of Chairman of the Task Force Executive Committee. At its peak, the Task Force consisted of approximately 200 people who were considered to be experts in eight major topical areas. The composition of the experts was almost evenly divided between DOE and contractor employees. The collective wisdom of these experts was used in a consensus process to develop the S/C that are now published in draft form. These S/C have been used in more than ten inspections since May 1986 with much success. This paper discusses the process used to achieve the desired end result and the significance of the Task Force's accomplishments

  13. Gender differences in primary and secondary education: Are girls really outperforming boys?

    Science.gov (United States)

    Driessen, Geert; van Langen, Annemarie

    2013-06-01

    A moral panic has broken out in several countries after recent studies showed that girls were outperforming boys in education. Commissioned by the Dutch Ministry of Education, the present study examines the position of boys and girls in Dutch primary education and in the first phase of secondary education over the past ten to fifteen years. On the basis of several national and international large-scale databases, the authors examined whether one can indeed speak of a gender gap, at the expense of boys. Three domains were investigated, namely cognitive competencies, non-cognitive competencies, and school career features. The results as expressed in effect sizes show that there are hardly any differences with regard to language and mathematics proficiency. However, the position of boys in terms of educational level and attitudes and behaviour is much more unfavourable than that of girls. Girls, on the other hand, score more unfavourably with regard to sector and subject choice. While the present situation in general does not differ very much from that of a decade ago, it is difficult to predict in what way the balances might shift in the years to come.

  14. Replacing gasoline with corn ethanol results in significant environmental problem-shifting.

    Science.gov (United States)

    Yang, Yi; Bae, Junghan; Kim, Junbeum; Suh, Sangwon

    2012-04-03

    Previous studies on the life-cycle environmental impacts of corn ethanol and gasoline focused almost exclusively on energy balance and greenhouse gas (GHG) emissions and largely overlooked the influence of regional differences in agricultural practices. This study compares the environmental impact of gasoline and E85 taking into consideration 12 different environmental impacts and regional differences among 19 corn-growing states. Results show that E85 does not outperform gasoline when a wide spectrum of impacts is considered. If the impacts are aggregated using weights developed by the National Institute of Standards and Technology (NIST), overall, E85 generates approximately 6% to 108% (23% on average) greater impact compared with gasoline, depending on where corn is produced, primarily because corn production induces significant eutrophication impacts and requires intensive irrigation. If GHG emissions from the indirect land use changes are considered, the differences increase to between 16% and 118% (33% on average). Our study indicates that replacing gasoline with corn ethanol may only result in shifting the net environmental impacts primarily toward increased eutrophication and greater water scarcity. These results suggest that the environmental criteria used in the Energy Independence and Security Act (EISA) be re-evaluated to include additional categories of environmental impact beyond GHG emissions.

  15. Do bilinguals outperform monolinguals?

    OpenAIRE

    Sejdi Sejdiu

    2016-01-01

    The relationship between second dialect acquisition and the psychological capacity of the learner is still a divisive topic that generates a lot of debate. A few researchers contend that the acquisition of the second dialect tends to improve the cognitive abilities in various individuals, but at the same time it could hinder the same abilities in other people. Currently, immersion is a common occurrence in some countries. In the recent past, it has significantly increased in its popularity, w...

  16. Cloud Computing Security Model with Combination of Data Encryption Standard Algorithm (DES) and Least Significant Bit (LSB)

    Science.gov (United States)

    Basri, M.; Mawengkang, H.; Zamzami, E. M.

    2018-03-01

    Limited storage capacity is one reason to switch to cloud storage. Confidentiality and security of data stored in the cloud are very important. One way to maintain the confidentiality and security of such data is to use cryptographic techniques. Data Encryption Standard (DES) is a block cipher algorithm that has been used as a standard symmetric encryption algorithm. DES produces 8 cipher blocks that are combined into one ciphertext, but this ciphertext is weak against brute-force attacks. Therefore, the last 8 cipher blocks are converted into 8 random images using the Least Significant Bit (LSB) algorithm, which hides the output of the DES algorithm so that it can later be merged back into one.
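
    A minimal sketch of the kind of scheme described above, simplified to a single DES block hidden in a single image, follows. It assumes the pycryptodome package for DES and NumPy for the pixel manipulation; the key, plaintext, and random "image" are placeholders, and DES in ECB mode appears only because that is what the abstract describes, not as a security recommendation.

      # Encrypt one 8-byte block with DES, then hide the ciphertext bits in the
      # least significant bits of image pixels and recover them again.
      import numpy as np
      from Crypto.Cipher import DES   # pycryptodome, assumed installed

      key = b"8bytekey"                                            # DES requires an 8-byte key
      block = DES.new(key, DES.MODE_ECB).encrypt(b"secret!!")      # 8-byte ciphertext

      rng = np.random.default_rng(0)
      image = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)  # placeholder cover image

      bits = np.unpackbits(np.frombuffer(block, dtype=np.uint8))   # 64 ciphertext bits
      flat = image.flatten()
      flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits          # overwrite pixel LSBs
      stego = flat.reshape(image.shape)

      # Extraction: read the LSBs back, repack into bytes, and decrypt.
      recovered = np.packbits(stego.flatten()[:bits.size] & 1).tobytes()
      assert DES.new(key, DES.MODE_ECB).decrypt(recovered) == b"secret!!"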

  17. Do School-Based Tutoring Programs Significantly Improve Student Performance on Standardized Tests?

    Science.gov (United States)

    Rothman, Terri; Henderson, Mary

    2011-01-01

    This study used a pre-post, nonequivalent control group design to examine the impact of an in-district, after-school tutoring program on eighth grade students' standardized test scores in language arts and mathematics. Students who had scored in the near-passing range on either the language arts or mathematics aspect of a standardized test at the…

  18. A clinically driven variant prioritization framework outperforms purely computational approaches for the diagnostic analysis of singleton WES data.

    Science.gov (United States)

    Stark, Zornitza; Dashnow, Harriet; Lunke, Sebastian; Tan, Tiong Y; Yeung, Alison; Sadedin, Simon; Thorne, Natalie; Macciocca, Ivan; Gaff, Clara; Oshlack, Alicia; White, Susan M; James, Paul A

    2017-11-01

    Rapid identification of clinically significant variants is key to the successful application of next generation sequencing technologies in clinical practice. The Melbourne Genomics Health Alliance (MGHA) variant prioritization framework employs a gene prioritization index based on clinician-generated a priori gene lists, and a variant prioritization index (VPI) based on rarity, conservation and protein effect. We used data from 80 patients who underwent singleton whole exome sequencing (WES) to test the ability of the framework to rank causative variants highly, and compared it against the performance of other gene and variant prioritization tools. Causative variants were identified in 59 of the patients. Using the MGHA prioritization framework the average rank of the causative variant was 2.24, with 76% ranked as the top priority variant, and 90% ranked within the top five. Using clinician-generated gene lists resulted in ranking causative variants an average of 8.2 positions higher than prioritization based on variant properties alone. This clinically driven prioritization approach significantly outperformed purely computational tools, placing a greater proportion of causative variants top or in the top 5 (permutation P-value=0.001). Clinicians included 40 of the 49 WES diagnoses in their a priori list of differential diagnoses (81%). The lists generated by PhenoTips and Phenomizer contained 14 (29%) and 18 (37%) of these diagnoses respectively. These results highlight the benefits of clinically led variant prioritization in increasing the efficiency of singleton WES data analysis and have important implications for developing models for the funding and delivery of genomic services.

  19. Do new wipe materials outperform traditional lead dust cleaning methods?

    Science.gov (United States)

    Lewis, Roger D; Ong, Kee Hean; Emo, Brett; Kennedy, Jason; Brown, Christopher A; Condoor, Sridhar; Thummalakunta, Laxmi

    2012-01-01

    Government guidelines have traditionally recommended the use of wet mopping, sponging, or vacuuming for removal of lead-contaminated dust from hard surfaces in homes. The emergence of new technologies for removal of dust, such as the electrostatic dry cloth and wet disposable cloths used on mop heads, provides an opportunity to evaluate their ability to remove lead compared with more established methods. The purpose of this study was to determine if relative differences exist between two new and two older methods for removal of lead-contaminated dust (LCD) from three wood surfaces characterized by different roughness or texture. Testing with standard leaded dust and measurement of the coefficient of friction were performed for each wipe material. Analysis of variance was used to evaluate the surface and cleaning methods. There were significant interactions between cleaning method and surface type (p = 0.007), and cleaning method was a significant factor in removal of lead. The coefficient of friction, which differed significantly among the three wipes, is likely to influence the cleaning action. Cleaning method appears to be more important than texture in LCD removal from hard surfaces. There are some small but important factors in cleaning LCD from hard surfaces, including the limited ability of a Swiffer mop to conform to curved surfaces and the efficiency of the wetted shop towel and vacuuming for cleaning all surface textures. The mean percentage reduction in lead dust achieved by the traditional methods (vacuuming and wet wiping) was greater and more consistent compared to the new methods (electrostatic dry cloth and wet Swiffer mop). Vacuuming and wet wiping achieved lead reductions of 92% ± 4% and 91% ± 4%, respectively, while the electrostatic dry cloth and wet Swiffer mops achieved lead reductions of only 89 ± 8% and 81 ± 17%, respectively.

  20. Extracting biologically significant patterns from short time series gene expression data

    Directory of Open Access Journals (Sweden)

    McGinnis Thomas

    2009-08-01

    Background: Time series gene expression data analysis is widely used to study the dynamics of various cell processes. Most of the time series data available today consist of only a few time points, making the application of standard clustering techniques difficult. Results: We developed two new algorithms that are capable of extracting biological patterns from short time point series gene expression data. The two algorithms, ASTRO and MiMeSR, are inspired by the rank-order-preserving framework and the minimum mean squared residue approach, respectively. However, ASTRO and MiMeSR differ from previous approaches in that they take advantage of the relatively small number of time points in order to reduce the problem from NP-hard to linear. Tested on well-defined short time expression data, we found that our approaches are robust to noise, as well as to random patterns, and that they can correctly detect the temporal expression profiles of relevant functional categories. Evaluation of our methods was performed using Gene Ontology (GO) annotations and chromatin immunoprecipitation (ChIP-chip) data. Conclusion: Our approaches generally outperform both standard clustering algorithms and algorithms designed specifically for clustering of short time series gene expression data. Both algorithms are available at http://www.benoslab.pitt.edu/astro/.
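
    For context, the minimum mean squared residue idea that MiMeSR draws on is usually formulated as in Cheng and Church's biclustering score (shown here for reference; the paper's exact objective may differ). For a submatrix with row set $I$ and column set $J$ of an expression matrix $(a_{ij})$,

    $$H(I,J)=\frac{1}{|I|\,|J|}\sum_{i\in I,\;j\in J}\big(a_{ij}-a_{iJ}-a_{Ij}+a_{IJ}\big)^{2},$$

    where $a_{iJ}$, $a_{Ij}$, and $a_{IJ}$ are the row, column, and overall means of the submatrix; a low $H(I,J)$ indicates a coherent expression pattern.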

  1. Clinically significant cardiopulmonary events and the effect of definition standardization on apnea of prematurity management.

    Science.gov (United States)

    Powell, M B F; Ahlers-Schmidt, C R; Engel, M; Bloom, B T

    2017-01-01

    To define the impact of care standardization on caffeine use and cardiorespiratory monitoring at neonatal intensive care unit (NICU) discharge, electronic records were abstracted for infants aged 24-36 weeks gestation with birth weights appropriate for gestational age. Infants who died, were transferred prior to discharge, had major pulmonary anomalies, required a home monitor for mechanical ventilation, or had a family history of sudden infant death syndrome were excluded. Data and records were used to indicate when the new definition of clinically significant cardiopulmonary events (CSCPEs) and concurrent education was implemented. Preimplementation and postimplementation cohorts were compared. Incidence fell from 74% diagnosed with apnea of prematurity at baseline to 49% diagnosed with CSCPE postimplementation. Standardized definitions and treatments reduced the use of caffeine and cardiorespiratory monitors upon NICU dismissal.

  2. Advanced GF(32) nonbinary LDPC coded modulation with non-uniform 9-QAM outperforming star 8-QAM.

    Science.gov (United States)

    Liu, Tao; Lin, Changyu; Djordjevic, Ivan B

    2016-06-27

    In this paper, we first describe a 9-symbol non-uniform signaling scheme based on a Huffman code, in which different symbols are transmitted with different probabilities. Using the Huffman procedure, a prefix code is designed to approach the optimal performance. Then, we introduce an algorithm to determine the optimal signal constellation sets for our proposed non-uniform scheme with the criterion of maximizing the constellation figure of merit (CFM). The proposed non-uniform polarization-multiplexed 9-QAM signaling scheme has the same spectral efficiency as the conventional 8-QAM. Additionally, we propose a specially designed GF(32) nonbinary quasi-cyclic LDPC code for the coded modulation system based on the 9-QAM non-uniform scheme. Further, we study the efficiency of our proposed non-uniform 9-QAM combined with nonbinary LDPC coding, and demonstrate by Monte Carlo simulation that the proposed GF(32) nonbinary LDPC coded 9-QAM scheme outperforms nonbinary LDPC coded uniform 8-QAM by at least 0.8 dB.
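
    The Huffman construction mentioned here can be sketched generically; the nine symbol probabilities below are placeholders (dyadic, so the prefix lengths come out exactly), not the optimized constellation probabilities from the paper:

```python
# Generic heapq-based Huffman prefix-code construction over 9 symbols with
# non-uniform (placeholder) probabilities.
import heapq
from itertools import count

def huffman_code(probs: dict) -> dict:
    tiebreak = count()
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

probs = {f"s{i}": p for i, p in enumerate(
    [0.25, 0.125, 0.125, 0.125, 0.125, 0.0625, 0.0625, 0.0625, 0.0625])}
codebook = huffman_code(probs)
# Prefix property holds: no codeword is a prefix of another; with dyadic
# probabilities the expected codeword length equals the source entropy (3 bits).
print(sorted(codebook.items(), key=lambda kv: len(kv[1])))
```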

  3. Multifunctional Cellulolytic Enzymes Outperform Processive Fungal Cellulases for Coproduction of Nanocellulose and Biofuels.

    Science.gov (United States)

    Yarbrough, John M; Zhang, Ruoran; Mittal, Ashutosh; Vander Wall, Todd; Bomble, Yannick J; Decker, Stephen R; Himmel, Michael E; Ciesielski, Peter N

    2017-03-28

    Producing fuels, chemicals, and materials from renewable resources to meet societal demands remains an important step in the transition to a sustainable, clean energy economy. The use of cellulolytic enzymes for the production of nanocellulose enables the coproduction of sugars for biofuels production in a format that is largely compatible with the process design employed by modern lignocellulosic (second generation) biorefineries. However, yields of enzymatically produced nanocellulose are typically much lower than those achieved by mineral acid production methods. In this study, we compare the capacity for coproduction of nanocellulose and fermentable sugars using two vastly different cellulase systems: the classical "free enzyme" system of the saprophytic fungus, Trichoderma reesei (T. reesei) and the complexed, multifunctional enzymes produced by the hot springs resident, Caldicellulosiruptor bescii (C. bescii). We demonstrate by comparative digestions that the C. bescii system outperforms the fungal enzyme system in terms of total cellulose conversion, sugar production, and nanocellulose production. In addition, we show by multimodal imaging and dynamic light scattering that the nanocellulose produced by the C. bescii cellulase system is substantially more uniform than that produced by the T. reesei system. These disparities in the yields and characteristics of the nanocellulose produced by these disparate systems can be attributed to the dramatic differences in the mechanisms of action of the dominant enzymes in each system.

  4. A studentized permutation test for three-arm trials in the 'gold standard' design.

    Science.gov (United States)

    Mütze, Tobias; Konietschke, Frank; Munk, Axel; Friede, Tim

    2017-03-15

    The 'gold standard' design for three-arm trials refers to trials with an active control and a placebo control in addition to the experimental treatment group. This trial design is recommended when ethically justifiable, as it allows the simultaneous comparison of experimental treatment, active control, and placebo. Parametric testing methods have been studied extensively in recent years. However, these methods often tend to be liberal or conservative when distributional assumptions are not met, particularly with small sample sizes. In this article, we introduce a studentized permutation test for testing non-inferiority and superiority of the experimental treatment compared with the active control in three-arm trials in the 'gold standard' design. The performance of the studentized permutation test for finite sample sizes is assessed in a Monte Carlo simulation study under various parameter constellations. Emphasis is put on whether the studentized permutation test meets the target significance level. For comparison purposes, commonly used Wald-type tests, which do not make any distributional assumptions, are included in the simulation study. The simulation study shows that the presented studentized permutation test for assessing non-inferiority in three-arm trials in the 'gold standard' design outperforms its competitors, for instance the test based on a quasi-Poisson model, for count data. The methods discussed in this paper are implemented in the R package ThreeArmedTrials, which is available on the Comprehensive R Archive Network (CRAN). Copyright © 2016 John Wiley & Sons, Ltd.
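
    A minimal two-sample sketch of the studentized permutation idea follows (simplified illustration only; the actual test addresses non-inferiority in a three-arm design and is implemented in the ThreeArmedTrials R package): the test statistic is studentized, i.e. divided by its estimated standard error, before being compared against its permutation distribution.

```python
# Simplified studentized permutation test for the difference in means between
# two arms; not the exact three-arm non-inferiority procedure of the paper.
import numpy as np

def welch_t(x: np.ndarray, y: np.ndarray) -> float:
    # Studentized statistic: mean difference divided by its estimated SE.
    se = np.sqrt(x.var(ddof=1) / x.size + y.var(ddof=1) / y.size)
    return (x.mean() - y.mean()) / se

def permutation_pvalue(x, y, n_perm=10000, seed=None):
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    observed = welch_t(x, y)
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # relabel arms at random
        hits += abs(welch_t(pooled[: x.size], pooled[x.size:])) >= abs(observed)
    return (hits + 1) / (n_perm + 1)

rng = np.random.default_rng(0)
treat, control = rng.normal(0.4, 1, 40), rng.normal(0.0, 1, 40)
print(permutation_pvalue(treat, control, n_perm=2000, seed=1))
```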

  5. Hydrological and environmental variables outperform spatial factors in structuring species, trait composition, and beta diversity of pelagic algae.

    Science.gov (United States)

    Wu, Naicheng; Qu, Yueming; Guse, Björn; Makarevičiūtė, Kristė; To, Szewing; Riis, Tenna; Fohrer, Nicola

    2018-03-01

    There has been increasing interest in algae-based bioassessment, and trait-based approaches in particular are increasingly suggested. However, the main drivers of species composition, trait composition, and beta diversity of algae communities, especially the contribution of hydrological variables, are less studied. To link species and trait composition to multiple factors (i.e., hydrological variables, local environmental variables, and spatial factors) that potentially control species occurrence/abundance, and to determine their relative roles in shaping species composition, trait composition, and beta diversities of pelagic algae communities, samples were collected from a German lowland catchment, where a well-proven ecohydrological model made it possible to predict long-term discharges at each sampling site. Both trait and species composition showed significant correlations with hydrological, environmental, and spatial variables, and variation partitioning revealed that the hydrological and local environmental variables outperformed spatial variables. A higher proportion of the variation in trait composition (57.0%) than in species composition (37.5%) could be explained by abiotic factors. Mantel tests showed that both species- and trait-based beta diversities were mostly related to hydrological and environmental heterogeneity, with hydrology contributing more than environmental variables, while purely spatial effects were less important. Our findings reveal the relative importance of hydrological variables in shaping the pelagic algae community and the spatial patterns of its beta diversities, emphasizing the need to include hydrological variables in long-term biomonitoring campaigns and in biodiversity conservation or restoration. A key implication for biodiversity conservation is that maintaining instream flow regimes and preserving a variety of habitats among rivers are of vital importance. However, further investigations at multiple spatial and temporal scales are greatly needed.

  6. Relative resilience to noise of standard and sequential approaches to measurement-based quantum computation

    Science.gov (United States)

    Gallagher, C. B.; Ferraro, A.

    2018-05-01

    A possible alternative to the standard model of measurement-based quantum computation (MBQC) is offered by the sequential model of MBQC—a particular class of quantum computation via ancillae. Although these two models are equivalent under ideal conditions, their relative resilience to noise in practical conditions is not yet known. We analyze this relationship for various noise models in the ancilla preparation and in the entangling-gate implementation. The comparison of the two models is performed utilizing both the gate infidelity and the diamond distance as figures of merit. Our results show that in the majority of instances the sequential model outperforms the standard one in regard to a universal set of operations for quantum computation. Further investigation is made into the performance of sequential MBQC in experimental scenarios, thus setting benchmarks for possible cavity-QED implementations.
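
    For reference, the two figures of merit used in the comparison are commonly defined as follows (textbook definitions, not reproduced from the paper itself). The average gate fidelity of a noisy channel $\mathcal{E}$ against a target unitary $U$, and the corresponding infidelity, are

    $$\bar{F}(\mathcal{E},U)=\int d\psi\,\langle\psi|\,U^{\dagger}\,\mathcal{E}\big(|\psi\rangle\langle\psi|\big)\,U\,|\psi\rangle,\qquad r(\mathcal{E},U)=1-\bar{F}(\mathcal{E},U),$$

    while the diamond distance between $\mathcal{E}$ and the ideal channel $\mathcal{U}(\rho)=U\rho U^{\dagger}$ is

    $$d_{\diamond}(\mathcal{E},\mathcal{U})=\tfrac{1}{2}\,\big\|\mathcal{E}-\mathcal{U}\big\|_{\diamond}=\tfrac{1}{2}\sup_{\rho}\big\|(\mathcal{E}\otimes\mathbb{1})(\rho)-(\mathcal{U}\otimes\mathbb{1})(\rho)\big\|_{1},$$

    with the supremum taken over density operators on the doubled Hilbert space.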

  7. Do Private Firms Outperform SOE Firms after Going Public in China Given their Different Governance Characteristics?

    Directory of Open Access Journals (Sweden)

    Shenghui Tong

    2013-06-01

    This study examines the characteristics of board structure that affect Chinese public firms' financial performance. Using a sample of 871 firms with 699 observations of previously private firms and 1,914 observations of previously state-owned enterprise (SOE) firms, we investigate the differences in corporate governance between publicly listed firms that were purely private firms before going public and listed firms that were SOEs before their initial public offerings (IPOs). Our main finding is that previously private firms outperform previously SOE firms in China after IPOs. In the wake of becoming listed firms, previously SOE firms may face difficulties adjusting to the professional business practices needed to build and extend competitive advantages. In addition, favorable policies and assistance from the government to the SOE firms might have triggered complacency, especially in the early years after listing. On the other hand, professional savvy and acumen, combined with efficiency and a favorable business climate created by the government, have probably led the previously private firms to increase their value more strongly and quickly.

  8. Importance of a species' socioecology: Wolves outperform dogs in a conspecific cooperation task.

    Science.gov (United States)

    Marshall-Pescini, Sarah; Schwarz, Jonas F L; Kostelnik, Inga; Virányi, Zsófia; Range, Friederike

    2017-10-31

    A number of domestication hypotheses suggest that dogs have acquired a more tolerant temperament than wolves, promoting cooperative interactions with humans and conspecifics. This selection process has been proposed to resemble the one responsible for our own greater cooperative inclinations in comparison with our closest living relatives. However, the socioecology of wolves and dogs, with the former relying more heavily on cooperative activities, predicts that at least with conspecifics, wolves should cooperate better than dogs. Here we tested similarly raised wolves and dogs in a cooperative string-pulling task with conspecifics and found that wolves outperformed dogs, despite comparable levels of interest in the task. Whereas wolves coordinated their actions so as to simultaneously pull the rope ends, leading to success, dogs pulled the ropes in alternate moments, thereby never succeeding. Indeed in dog dyads it was also less likely that both members simultaneously engaged in other manipulative behaviors on the apparatus. Different conflict-management strategies are likely responsible for these results, with dogs' avoidance of potential competition over the apparatus constraining their capacity to coordinate actions. Wolves, in contrast, did not hesitate to manipulate the ropes simultaneously, and once cooperation was initiated, rapidly learned to coordinate in more complex conditions as well. Social dynamics (rank and affiliation) played a key role in success rates. Results call those domestication hypotheses that suggest dogs evolved greater cooperative inclinations into question, and rather support the idea that dogs' and wolves' different social ecologies played a role in affecting their capacity for conspecific cooperation and communication. Published under the PNAS license.

  9. HINTS outperforms ABCD2 to screen for stroke in acute continuous vertigo and dizziness.

    Science.gov (United States)

    Newman-Toker, David E; Kerber, Kevin A; Hsieh, Yu-Hsiang; Pula, John H; Omron, Rodney; Saber Tehrani, Ali S; Mantokoudis, Georgios; Hanley, Daniel F; Zee, David S; Kattah, Jorge C

    2013-10-01

    younger than 60 years old (28.9%). HINTS stroke sensitivity was 96.5%, specificity was 84.4%, LR+ was 6.19, and LR- was 0.04 and did not vary by age. For any central lesion, sensitivity was 96.8%, specificity was 98.5%, LR+ was 63.9, and LR- was 0.03 for HINTS, and sensitivity was 99.2%, specificity was 97.0%, LR+ was 32.7, and LR- was 0.01 for HINTS "plus" (any new hearing loss added to HINTS). Initial MRIs were falsely negative in 15 of 105 (14.3%) infarctions; all but one was obtained before 48 hours after onset, and all were confirmed by delayed MRI. HINTS substantially outperforms ABCD2 for stroke diagnosis in ED patients with AVS. It also outperforms MRI obtained within the first 2 days after symptom onset. While HINTS testing has traditionally been performed by specialists, methods for empowering emergency physicians (EPs) to leverage this approach for stroke screening in dizziness should be investigated. © 2013 by the Society for Academic Emergency Medicine.
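
    As a quick check of the reported stroke figures, the likelihood ratios follow from the stated sensitivity and specificity via the usual definitions:

    $$LR^{+}=\frac{\text{sensitivity}}{1-\text{specificity}}=\frac{0.965}{1-0.844}\approx 6.2,\qquad LR^{-}=\frac{1-\text{sensitivity}}{\text{specificity}}=\frac{1-0.965}{0.844}\approx 0.04,$$

    consistent with the reported LR+ of 6.19 and LR- of 0.04.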

  10. Beyond the hype: deep neural networks outperform established methods using a ChEMBL bioactivity benchmark set.

    Science.gov (United States)

    Lenselink, Eelke B; Ten Dijke, Niels; Bongers, Brandon; Papadatos, George; van Vlijmen, Herman W T; Kowalczyk, Wojtek; IJzerman, Adriaan P; van Westen, Gerard J P

    2017-08-14

    The increase of publicly available bioactivity data in recent years has fueled and catalyzed research in chemogenomics, data mining, and modeling approaches. As a direct result, over the past few years a multitude of different methods have been reported and evaluated, such as target fishing, nearest neighbor similarity-based methods, and Quantitative Structure Activity Relationship (QSAR)-based protocols. However, such studies are typically conducted on different datasets, using different validation strategies, and different metrics. In this study, different methods were compared using one single standardized dataset obtained from ChEMBL, which is made available to the public, using standardized metrics (BEDROC and Matthews Correlation Coefficient). Specifically, the performance of Naïve Bayes, Random Forests, Support Vector Machines, Logistic Regression, and Deep Neural Networks was assessed using QSAR and proteochemometric (PCM) methods. All methods were validated using both a random split validation and a temporal validation, with the latter being a more realistic benchmark of expected prospective execution. Deep Neural Networks are the top performing classifiers, highlighting the added value of Deep Neural Networks over other more conventional methods. Moreover, the best method ('DNN_PCM') performed significantly better than average, scoring almost one standard deviation above the mean performance. Furthermore, Multi-task and PCM implementations were shown to improve performance over single task Deep Neural Networks. Conversely, target prediction performed almost two standard deviations under the mean performance. Random Forests, Support Vector Machines, and Logistic Regression performed around mean performance. Finally, using an ensemble of DNNs, alongside additional tuning, enhanced the relative performance by another 27% (compared with unoptimized 'DNN_PCM'). Here, a standardized set to test and evaluate different machine learning algorithms in the context of multi
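
    One of the standardized metrics named here, the Matthews Correlation Coefficient (MCC), can be computed as in the toy example below (the labels are made up, not ChEMBL data):

```python
# Toy MCC illustration; MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)).
from sklearn.metrics import matthews_corrcoef

y_true = [1, 1, 1, 0, 0, 0, 0, 1]   # actives (1) vs inactives (0)
y_pred = [1, 1, 0, 0, 0, 1, 0, 1]   # model predictions
print(f"MCC = {matthews_corrcoef(y_true, y_pred):.3f}")   # -> 0.500
```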

  11. Bayesian methods outperform parsimony but at the expense of precision in the estimation of phylogeny from discrete morphological data.

    Science.gov (United States)

    O'Reilly, Joseph E; Puttick, Mark N; Parry, Luke; Tanner, Alastair R; Tarver, James E; Fleming, James; Pisani, Davide; Donoghue, Philip C J

    2016-04-01

    Different analytical methods can yield competing interpretations of evolutionary history and, currently, there is no definitive method for phylogenetic reconstruction using morphological data. Parsimony has been the primary method for analysing morphological data, but there has been a resurgence of interest in the likelihood-based Mk-model. Here, we test the performance of the Bayesian implementation of the Mk-model relative to both equal and implied-weight implementations of parsimony. Using simulated morphological data, we demonstrate that the Mk-model outperforms equal-weights parsimony in terms of topological accuracy, and implied-weights performs the most poorly. However, the Mk-model produces phylogenies that have less resolution than parsimony methods. This difference in the accuracy and precision of parsimony and Bayesian approaches to topology estimation needs to be considered when selecting a method for phylogeny reconstruction. © 2016 The Authors.

  12. Telemetry Standards, RCC Standard 106-17. Chapter 8. Digital Data Bus Acquisition Formatting Standard

    Science.gov (United States)

    2017-07-01

    Fragmentary excerpt only, from Chapter 8 (Digital Data Bus Acquisition Formatting Standard) of Telemetry Standards, RCC Standard 106-17, July 2017. The recoverable text states that incorrect word count/message and illegal mode codes are not considered bus errors; Section 8.6.2 (Source Signal) specifies that the source of data is a signal conforming to the standard; and the chapter's acronym list includes FCS (frame check sequence), HDDR (high-density digital recording), MIL-STD (Military Standard), msb (most significant bit), and PCM (pulse code modulation).

  13. Standard format and content for a licensee physical security plan for the protection of special nuclear material of moderate or low strategic significance - January 1980

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This guide describes the information required in the physical security plan submitted as part of an application for a license to possess, use, or transport special nuclear material (SNM) of moderate strategic significance or 10 kg or more of SNM of low strategic significance and recommends a standard format for presenting the information in an orderly arrangement. This standard format will thus serve as an aid to uniformity and completeness in the preparation and review of the physical protection plan of the license application. This document can also be used as guidance by licensees possessing or transporting less than 10 kg of SNM of low strategic significance in understanding the intent and implementing the requirements of paragraphs 73.67(a), 73.67(f), and 73.67(g) of 10 CFR Part 73

  14. Improving a Power Line Communications Standard with LDPC Codes

    Directory of Open Access Journals (Sweden)

    Hsu Christine

    2007-01-01

    We investigate a power line communications (PLC) scheme that could be used to enhance the HomePlug 1.0 standard, specifically its ROBO mode, which provides modest throughput for the worst-case PLC channel. The scheme is based on using a low-density parity-check (LDPC) code in lieu of the concatenated Reed-Solomon and convolutional codes in ROBO mode. The PLC channel is modeled with multipath fading and Middleton's Class A noise. Clipping is introduced to mitigate the effect of impulsive noise. A simple and effective method is devised to estimate the variance of the clipped noise for LDPC decoding. Simulation results show that the proposed scheme outperforms the HomePlug 1.0 ROBO mode and has lower computational complexity. The proposed scheme also dispenses with the repetition of information bits in ROBO mode to gain time diversity, resulting in a 4-fold increase in physical layer throughput.
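
    The abstract does not spell out the authors' variance estimator, but the general idea can be illustrated as follows (a crude sketch under stated assumptions): impulsive noise is approximated by a two-component Gaussian mixture as a stand-in for Middleton Class A noise, clipped at a fixed threshold, and the variance of the clipped samples is then estimated empirically as the quantity a Gaussian-assumption LDPC decoder would use.

```python
# Illustration only: clipping impulsive noise and estimating the variance of
# the clipped noise. The mixture model and threshold are placeholders.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
impulsive = rng.random(n) < 0.01                   # 1% impulsive samples
noise = rng.normal(0.0, 1.0, n)                    # background component
noise[impulsive] = rng.normal(0.0, 20.0, impulsive.sum())   # impulsive component

clip_level = 3.0
clipped = np.clip(noise, -clip_level, clip_level)

print(f"raw noise variance     : {noise.var():.2f}")
print(f"clipped noise variance : {clipped.var():.2f}")   # value fed to the decoder
```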

  15. Who influenced inflation persistence in China? A comparative analysis of the standard CIA model and CIA model with endogenous money

    Directory of Open Access Journals (Sweden)

    Liao Ying

    2013-12-01

    In this paper, we examine the influencing factors of inflation persistence in China's economy using the DSGE approach. Two monetary DSGE models are estimated, namely, a standard CIA model and a CIA model with a Taylor rule. This article uses the Bayesian method to estimate the models, and the estimated and inferred results are credible because the Markov chain reaches convergence. The results show that the augmented model outperforms the standard CIA model in terms of capturing inflation persistence. Further studies show that inflation persistence mainly comes from the persistence of the money supply, while money supply uncertainty, the reaction coefficient of monetary growth to productivity, productivity persistence, and productivity uncertainty have a smaller impact on inflation persistence. Changes in monetary policy have little effect on inflation persistence.

  16. Difference in prognostic significance of maximum standardized uptake value on [18F]-fluoro-2-deoxyglucose positron emission tomography between adenocarcinoma and squamous cell carcinoma of the lung

    International Nuclear Information System (INIS)

    Tsutani, Yasuhiro; Miyata, Yoshihiro; Misumi, Keizo; Ikeda, Takuhiro; Mimura, Takeshi; Hihara, Jun; Okada, Morihito

    2011-01-01

    This study evaluates the prognostic significance of [18F]-fluoro-2-deoxyglucose positron emission tomography/computed tomography findings according to histological subtype in patients with completely resected non-small cell lung cancer. We examined 176 consecutive patients who had undergone preoperative [18F]-fluoro-2-deoxyglucose positron emission tomography/computed tomography imaging and curative surgical resection for adenocarcinoma (n=132) or squamous cell carcinoma (n=44). Maximum standardized uptake values for the primary lesions in all patients were calculated as the [18F]-fluoro-2-deoxyglucose uptake, and the surgical results were analyzed. The median values of maximum standardized uptake value for the primary tumors were 2.60 in patients with adenocarcinoma and 6.95 in patients with squamous cell carcinoma. Although disease-free survival did not differ between maximum standardized uptake value ≤6.95 and >6.95 (P=0.83) among patients with squamous cell carcinoma, 2-year disease-free survival rates were 93.9% for maximum standardized uptake value ≤3.7 and 52.4% for maximum standardized uptake value >3.7 (P<0.0001) among those with adenocarcinoma, and notably, 100 and 57.2%, respectively, in patients with Stage I adenocarcinoma (P<0.0001). On the basis of the multivariate Cox analyses of patients with adenocarcinoma, maximum standardized uptake value (P=0.008) was a significantly independent factor for disease-free survival, as was nodal metastasis (P=0.001). Maximum standardized uptake value of the primary tumor was a powerful prognostic determinant for patients with adenocarcinoma, but not with squamous cell carcinoma of the lung. (author)

  17. CT outperforms radiographs at a comparable radiation dose in the assessment for spondylolysis.

    Science.gov (United States)

    Fadell, Michael F; Gralla, Jane; Bercha, Istiaq; Stewart, Jaime R; Harned, Roger K; Ingram, James D; Miller, Angie L; Strain, John D; Weinman, Jason P

    2015-07-01

    Lumbar spondylolysis, a unilateral or bilateral fracture at pars interarticularis, is a common cause of low back pain in children. The initial imaging study in the diagnosis of lumbar spondylolysis has historically been lumbar spine radiographs; however, radiographs can be equivocal or false-negative. Definitive diagnosis can be achieved with computed tomography (CT), but its use has been limited due to the dose of ionizing radiation to the patient. By limiting the z-axis coverage to the relevant anatomy and optimizing the CT protocol, we are able to provide a definitive diagnosis of fractures of the pars interarticularis at comparable or lower radiation dose than commonly performed lumbar spine radiographs. As there is no gold standard for the diagnosis of spondylolysis besides surgery, we compared interobserver agreement and degree of confidence to determine which modality is preferable. Sixty-two patients with low back pain ages 5-18 years were assessed for the presence of spondylolysis. Forty-seven patients were evaluated by radiography and 15 patients were evaluated by limited field-of-view CT. Both radiographic and CT examinations were assessed anonymously in random order for the presence or absence of spondylolysis by six raters. Agreement was assessed among raters using a Fleiss Kappa statistic for multiple raters. CT provided a significantly higher level of agreement among raters than radiographs (P < 0.001). The overall Kappa for rater agreement with radiographs was 0.24, 0.34 and 0.40 for 2, 3 or 4 views, respectively, and 0.88 with CT. Interobserver agreement is significantly greater using limited z-axis coverage CT when compared with radiographs. Radiologist confidence improved significantly with CT compared to radiographs regardless of the number of views.
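
    The agreement statistic used here, Fleiss' kappa for multiple raters, can be computed as sketched below (the ratings are made-up placeholders, not the study data):

```python
# Fleiss' kappa for six raters scoring each exam as spondylolysis present (1)
# or absent (0); illustrative data only.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

ratings = np.array([          # rows = exams, columns = the six raters
    [1, 1, 1, 1, 1, 1],
    [0, 0, 0, 0, 0, 0],
    [1, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 0, 0],
    [1, 0, 1, 1, 0, 1],
])
table, _ = aggregate_raters(ratings)   # counts of raters per category per exam
print(f"Fleiss' kappa = {fleiss_kappa(table):.2f}")
```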

  18. CT outperforms radiographs at a comparable radiation dose in the assessment for spondylolysis

    Energy Technology Data Exchange (ETDEWEB)

    Fadell, Michael F.; Stewart, Jaime R.; Harned, Roger K.; Ingram, James D.; Miller, Angie L.; Strain, John D.; Weinman, Jason P. [Children's Hospital Colorado, Department of Radiology, Aurora, CO (United States); University of Colorado Hospital, Department of Radiology, Aurora, CO (United States)]; Gralla, Jane [University of Colorado Denver, Department of Pediatrics, Aurora, CO (United States)]; Bercha, Istiaq [Children's Hospital Colorado, Department of Radiology, Aurora, CO (United States)]

    2015-07-15

    Lumbar spondylolysis, a unilateral or bilateral fracture at pars interarticularis, is a common cause of low back pain in children. The initial imaging study in the diagnosis of lumbar spondylolysis has historically been lumbar spine radiographs; however, radiographs can be equivocal or false-negative. Definitive diagnosis can be achieved with computed tomography (CT), but its use has been limited due to the dose of ionizing radiation to the patient. By limiting the z-axis coverage to the relevant anatomy and optimizing the CT protocol, we are able to provide a definitive diagnosis of fractures of the pars interarticularis at comparable or lower radiation dose than commonly performed lumbar spine radiographs. As there is no gold standard for the diagnosis of spondylolysis besides surgery, we compared interobserver agreement and degree of confidence to determine which modality is preferable. Sixty-two patients with low back pain ages 5-18 years were assessed for the presence of spondylolysis. Forty-seven patients were evaluated by radiography and 15 patients were evaluated by limited field-of-view CT. Both radiographic and CT examinations were assessed anonymously in random order for the presence or absence of spondylolysis by six raters. Agreement was assessed among raters using a Fleiss Kappa statistic for multiple raters. CT provided a significantly higher level of agreement among raters than radiographs (P < 0.001). The overall Kappa for rater agreement with radiographs was 0.24, 0.34 and 0.40 for 2, 3 or 4 views, respectively, and 0.88 with CT. Interobserver agreement is significantly greater using limited z-axis coverage CT when compared with radiographs. Radiologist confidence improved significantly with CT compared to radiographs regardless of the number of views. (orig.)

  19. CT outperforms radiographs at a comparable radiation dose in the assessment for spondylolysis

    International Nuclear Information System (INIS)

    Fadell, Michael F.; Stewart, Jaime R.; Harned, Roger K.; Ingram, James D.; Miller, Angie L.; Strain, John D.; Weinman, Jason P.; Gralla, Jane; Bercha, Istiaq

    2015-01-01

    Lumbar spondylolysis, a unilateral or bilateral fracture at pars interarticularis, is a common cause of low back pain in children. The initial imaging study in the diagnosis of lumbar spondylolysis has historically been lumbar spine radiographs; however, radiographs can be equivocal or false-negative. Definitive diagnosis can be achieved with computed tomography (CT), but its use has been limited due to the dose of ionizing radiation to the patient. By limiting the z-axis coverage to the relevant anatomy and optimizing the CT protocol, we are able to provide a definitive diagnosis of fractures of the pars interarticularis at comparable or lower radiation dose than commonly performed lumbar spine radiographs. As there is no gold standard for the diagnosis of spondylolysis besides surgery, we compared interobserver agreement and degree of confidence to determine which modality is preferable. Sixty-two patients with low back pain ages 5-18 years were assessed for the presence of spondylolysis. Forty-seven patients were evaluated by radiography and 15 patients were evaluated by limited field-of-view CT. Both radiographic and CT examinations were assessed anonymously in random order for the presence or absence of spondylolysis by six raters. Agreement was assessed among raters using a Fleiss Kappa statistic for multiple raters. CT provided a significantly higher level of agreement among raters than radiographs (P < 0.001). The overall Kappa for rater agreement with radiographs was 0.24, 0.34 and 0.40 for 2, 3 or 4 views, respectively, and 0.88 with CT. Interobserver agreement is significantly greater using limited z-axis coverage CT when compared with radiographs. Radiologist confidence improved significantly with CT compared to radiographs regardless of the number of views. (orig.)

  20. The International Standards Organisation offshore structures standard

    International Nuclear Information System (INIS)

    Snell, R.O.

    1994-01-01

    The International Standards Organisation has initiated a program to develop a suite of ISO Codes and Standards for the Oil Industry. The Offshore Structures Standard is one of seven topics being addressed. The scope of the standard will encompass fixed steel and concrete structures, floating structures, Arctic structures and the site specific assessment of mobile drilling and accommodation units. The standard will use as base documents the existing recommended practices and standards most frequently used for each type of structure, and will develop them to incorporate best published and recognized practice and knowledge where it provides a significant improvement on the base document. Work on the Code has commenced under the direction of an internationally constituted sub-committee comprising representatives from most of the countries with a substantial offshore oil and gas industry. This paper outlines the background to the code and the format, content and work program

  1. Standard format and content for a licensee physical security plan for the protection of special nuclear material of moderate or low strategic significance (Revision 1, Feb. 1983)

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    This regulatory guide describes the information required in the physical security plan submitted as part of an application for a license to possess, use, or transport Special Nuclear Materials (SNM) of moderate strategic significance or 10 kg or more of SNM of low strategic significance and recommends a standard format for presenting the information in an orderly arrangement. This standard format will thus serve as an aid to uniformity and completeness in the preparation and review of the physical security plan of the license application. This document can also be used as guidance by licensees possessing or transporting less than 10 kg of SNM of low strategic significance in understanding the intent and implementing the requirements of paragraphs 73.67(a), 73.67(f), and 73.67(g) of 10 CFR Part 73

  2. Dynamic jump intensities and risk premia : Evidence from S&P500 returns and options

    NARCIS (Netherlands)

    Christoffersen, P.; Jacobs, K.; Ornthanalai, C.

    2012-01-01

    We build a new class of discrete-time models that are relatively easy to estimate using returns and/or options. The distribution of returns is driven by two factors: dynamic volatility and dynamic jump intensity. Each factor has its own risk premium. The models significantly outperform standard

  3. Association between the application of ISO 9001:2008 alone or in combination with health-specific standards and quality-related activities in Hungarian hospitals.

    Science.gov (United States)

    Dombrádi, Viktor; Csenteri, Orsolya Karola; Sándor, János; Godény, Sándor

    2017-04-01

    To investigate how International Organization for Standardization (ISO) 9001 and the Hungarian Health Care Standards (HHCS) certifications are associated with quality management, patient safety, patient rights and human resource management activities. A cross-sectional study was implemented using the 2009 Hungarian hospital survey's database. Hungary. Fifty-three general hospitals were included in the statistical analysis. No intervention was carried out in the study. The outcomes included the percentage of compliance in the dimensions of quality management, patient safety, patient rights, human resource management and the overall score for each hospital, and they were grouped according to the hospitals' certifications. Sixteen hospitals did not have either ISO 9001 or HHCS certifications, 19 had ISO 9001 certification only and 18 had both. Hospitals with ISO 9001 alone or in combination with the HHCS significantly outperformed hospitals with no certifications in terms of quality management and human resource management activities but not in terms of patient safety or patient rights activities. Combined, the two models provided the highest median levels in all cases. Nevertheless, no significant differences were observed when the hospitals with both certifications were compared with hospitals with ISO 9001 only. Although the combination of ISO 9001 and the HHCS showed the best results, the benefits were not decisive. Furthermore, although the HHCS include standards addressing patient safety, no direct association was found with regard to compliance. Thus, further investigation is required to understand this enigma. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  4. 32 CFR 651.39 - Significance.

    Science.gov (United States)

    2010-07-01

    Fragmentary regulatory excerpt from 32 CFR § 651.39 (Significance), Department of Defense, Department of the Army, Environmental Quality: Environmental Analysis of Army Actions (AR 200-2), Environmental Assessment. The recoverable text lists significance criteria such as "... existing pollution standards; cause water, air, noise, soil, or underground pollution; impair visibility ..." before breaking off at "§ 651.39 Significance. (a) If the...".

  5. Detecting Novelty and Significance

    Science.gov (United States)

    Ferrari, Vera; Bradley, Margaret M.; Codispoti, Maurizio; Lang, Peter J.

    2013-01-01

    Studies of cognition often use an “oddball” paradigm to study effects of stimulus novelty and significance on information processing. However, an oddball tends to be perceptually more novel than the standard, repeated stimulus as well as more relevant to the ongoing task, making it difficult to disentangle effects due to perceptual novelty and stimulus significance. In the current study, effects of perceptual novelty and significance on ERPs were assessed in a passive viewing context by presenting repeated and novel pictures (natural scenes) that either signaled significant information regarding the current context or not. A fronto-central N2 component was primarily affected by perceptual novelty, whereas a centro-parietal P3 component was modulated by both stimulus significance and novelty. The data support an interpretation that the N2 reflects perceptual fluency and is attenuated when a current stimulus matches an active memory representation and that the amplitude of the P3 reflects stimulus meaning and significance. PMID:19400680

  6. Self-directed learning can outperform direct instruction in the course of a modern German medical curriculum - results of a mixed methods trial.

    Science.gov (United States)

    Peine, Arne; Kabino, Klaus; Spreckelsen, Cord

    2016-06-03

    Modernised medical curricula in Germany (so-called "reformed study programs") rely increasingly on alternative self-instructed learning forms such as e-learning and curriculum-guided self-study. However, there is a lack of evidence that these methods can outperform conventional teaching methods such as lectures and seminars. This study was conducted in order to compare extant traditional teaching methods with new instruction forms in terms of learning effect and student satisfaction. In a randomised trial, 244 students of medicine in their third academic year were assigned to one of four study branches representing self-instructed learning forms (e-learning and curriculum-based self-study) and instructed learning forms (lectures and seminars). All groups participated in their respective learning module with standardised materials and instructions. Learning effect was measured with pre-test and post-test multiple-choice questionnaires. Student satisfaction and learning style were examined via self-assessment. Of 244 initial participants, 223 completed the respective module and were included in the study. In the pre-test, the groups showed relatively homogeneous scores. All students showed notable improvements compared with the pre-test results. Participants in the non-self-instructed learning groups reached scores of 14.71 (seminar) and 14.37 (lecture), while the groups of self-instructed learners reached higher scores of 17.23 (e-learning) and 15.81 (self-study). All groups improved significantly, including the e-learning group, whose self-assessment improved by 2.36. The study shows that students in modern study curricula learn better through modern self-instructed methods than through conventional methods. These methods should be used more, as they also show good levels of student acceptance and higher scores in personal self-assessment of knowledge.

  7. Analysis of the basic professional standards involving the work of psychologists in difficult and legally significant situations

    Directory of Open Access Journals (Sweden)

    Bogdanovich N. V.

    2016-06-01

    This article analyzes professional standards with respect to the scope of the psychologist's work with clients in difficult life situations and legally significant situations. The criteria chosen for the analysis were: how such work is reflected in the professional activities described, the grounds used for selecting professional activities oriented to a specific department, and the selection of a particular direction of the psychologist's activity (prevention, support, rehabilitation). It is shown that all five of the analyzed standards imply such situations, but only three of them ("Educational psychologist", "Psychologist in the social sphere", and "Specialist in rehabilitative work in the social sphere") describe the activities of the psychologist, while the remaining two ("Expert of bodies of guardianship and guardianship concerning minors" and "Specialist in working with families") are more organizational in nature. A conclusion is drawn about the compliance of the training programs developed by the Department of Legal Psychology and Law and Education with the requirements of the professional standards, and improvements to these programs are proposed.

  8. Pilot Evaluation of a Communication Skills Training Program for Psychiatry Residents Using Standardized Patient Assessment.

    Science.gov (United States)

    Ditton-Phare, Philippa; Sandhu, Harsimrat; Kelly, Brian; Kissane, David; Loughland, Carmel

    2016-10-01

    Mental health clinicians can experience difficulties communicating diagnostic information to patients and their families/carers, especially about distressing psychiatric disorders such as schizophrenia. There is evidence for the effectiveness of communication skills training (CST) for improving diagnostic discussions, particularly in specialties such as oncology, but only limited evidence exists about CST for psychiatry. This study evaluated a CST program specifically developed for psychiatry residents called ComPsych that focuses on conveying diagnostic and prognostic information about schizophrenia. The ComPsych program consists of an introductory lecture, module booklets for trainees, and exemplary skills videos, followed by small group role-plays with simulated patients (SPs) led by a trained facilitator. A standardized patient assessment (SPA) was digitally recorded pre- and post-training with a SP using a standardized scenario in a time-limited (15 min) period. Recorded SPAs were independently rated using a validated coding system (ComSkil) to identify frequency of skills used in five skills categories (agenda setting, checking, questioning, information organization, and empathic communication). Thirty trainees (15 males and 15 females; median age = 32) undertaking their vocational specialty training in psychiatry participated in ComPsych training and pre- and post-ComPsych SPAs. Skills increased post-training for agenda setting (d = -0.82), while questioning skills (d = 0.56) decreased. There were no significant differences in any other skills grouping, although checking, information organization, and empathic communication skills tended to increase post-training. A dose effect was observed for agenda setting, with trainees who attended more CST sessions outperforming those attending fewer. Findings support the generalization and translation of ComPsych CST to psychiatry.

  9. The Stock Performance of C. Everett Koop Award Winners Compared With the Standard & Poor's 500 Index.

    Science.gov (United States)

    Goetzel, Ron Z; Fabius, Raymond; Fabius, Dan; Roemer, Enid C; Thornton, Nicole; Kelly, Rebecca K; Pelletier, Kenneth R

    2016-01-01

    To explore the link between companies investing in the health and well-being programs of their employees and stock market performance. Stock performance of C. Everett Koop National Health Award winners (n = 26) was measured over time and compared with the average performance of companies comprising the Standard and Poor's (S&P) 500 Index. The Koop Award portfolio outperformed the S&P 500 Index. In the 14-year period tracked (2000-2014), Koop Award winners' stock values appreciated by 325% compared with the market average appreciation of 105%. This study supports prior and ongoing research demonstrating a higher market valuation--an affirmation of business success by Wall Street investors--of socially responsible companies that invest in the health and well-being of their workers when compared with other publicly traded firms.

  10. Prognostic significance of standardized uptake value on preoperative 18F-FDG PET/CT in patients with ampullary adenocarcinoma

    International Nuclear Information System (INIS)

    Choi, Hye Jin; Kang, Chang Moo; Lee, Woo Jung; Jo, Kwanhyeong; Lee, Jong Doo; Lee, Jae-Hoon; Ryu, Young Hoon

    2015-01-01

    The purpose of this study was to investigate the prognostic value of 18F-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) in patients with ampullary adenocarcinoma (AAC) after curative surgical resection. Fifty-two patients with AAC who had undergone 18F-FDG PET/CT and subsequent curative resections were retrospectively enrolled. The maximum standardized uptake value (SUVmax) and tumor-to-background ratio (TBR) were measured on 18F-FDG PET/CT in all patients. The prognostic significances of PET/CT parameters and clinicopathologic factors for recurrence-free survival (RFS) and overall survival (OS) were evaluated by univariate and multivariate analyses. Of the 52 patients, 19 (36.5%) experienced tumor recurrence during the follow-up period and 18 (35.8%) died. The 3-year RFS and OS were 62.3 and 61.5%, respectively. Preoperative CA19-9 level, tumor differentiation, presence of lymph node metastasis, SUVmax, and TBR were significant prognostic factors for both RFS and OS (p < 0.05) on univariate analyses, and patient age showed significance only for predicting RFS (p < 0.05). On multivariate analyses, SUVmax and TBR were independent prognostic factors for RFS, and tumor differentiation, SUVmax, and TBR were independent prognostic factors for OS. SUVmax and TBR on preoperative 18F-FDG PET/CT are independent prognostic factors for predicting RFS and OS in patients with AAC; patients with high SUVmax (>4.80) or TBR (>1.75) had poor survival outcomes. The role of and indications for adjuvant therapy after curative resection of AAC are still unclear. 18F-FDG uptake in the primary tumor could provide additive prognostic information for the decision-making process regarding adjuvant therapy. (orig.)

  11. Codon Deviation Coefficient: a novel measure for estimating codon usage bias and its statistical significance

    Directory of Open Access Journals (Sweden)

    Zhang Zhang

    2012-03-01

    Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions.
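
    The bootstrapping idea mentioned here can be sketched generically (a simplified stand-in for CDC, not the published formula: the bias statistic below ignores codon-position-specific composition, and all names are illustrative): the observed bias is compared against a null distribution obtained by resampling bases with the same overall composition.

```python
# Generic bootstrap significance test for a codon-usage-bias statistic.
import numpy as np
from collections import Counter

def codon_bias(seq: str) -> float:
    # Sum of squared deviations of observed codon frequencies from frequencies
    # expected under the sequence's own (position-independent) base composition.
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    obs = Counter(codons)
    base_freq = Counter(seq)
    total = len(seq)
    expected = {c: np.prod([base_freq[b] / total for b in c]) for c in obs}
    n = len(codons)
    return sum((obs[c] / n - expected[c]) ** 2 for c in obs)

def bootstrap_pvalue(seq: str, n_boot: int = 1000, seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    observed = codon_bias(seq)
    bases = np.array(list(seq))
    hits = sum(
        codon_bias("".join(rng.choice(bases, size=bases.size))) >= observed
        for _ in range(n_boot)
    )
    return (hits + 1) / (n_boot + 1)

seq = "ATGGCTGCTGCTGCTGCTAAATAA" * 10   # highly biased toy sequence
print(bootstrap_pvalue(seq, n_boot=200))
```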

  12. Predictive significance of standardized uptake value parameters of FDG-PET in patients with non-small cell lung carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Duan, X-Y.; Wang, W.; Li, M.; Li, Y.; Guo, Y-M. [PET-CT Center, The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, Shaanxi (China)]

    2015-02-03

    18F-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET)/computed tomography (CT) is widely used to diagnose and stage non-small cell lung cancer (NSCLC). The aim of this retrospective study was to evaluate the predictive ability of different FDG standardized uptake values (SUVs) in 74 patients with newly diagnosed NSCLC. 18F-FDG PET/CT scans were performed and different SUV parameters (SUVmax, SUVavg, SUVT/L, and SUVT/A) obtained, and their relationships with clinical characteristics were investigated. Meanwhile, correlation and multiple stepwise regression analyses were performed to determine the primary predictor of SUVs for NSCLC. Age, gender, and tumor size significantly affected SUV parameters. The mean SUVs of squamous cell carcinoma were higher than those of adenocarcinoma. Poorly differentiated tumors exhibited higher SUVs than well-differentiated ones. Further analyses based on the pathologic type revealed that the SUVmax, SUVavg, and SUVT/L of poorly differentiated adenocarcinoma tumors were higher than those of moderately or well-differentiated tumors. Among these four SUV parameters, SUVT/L was the primary predictor for tumor differentiation. However, in adenocarcinoma, SUVmax was the determining factor for tumor differentiation. Our results showed that these four SUV parameters had predictive significance related to NSCLC tumor differentiation; SUVT/L appeared to be most useful overall, but SUVmax was the best index for adenocarcinoma tumor differentiation.

  13. Predicting standard-dose PET image from low-dose PET and multimodal MR images using mapping-based sparse representation

    International Nuclear Information System (INIS)

    Wang, Yan; Zhou, Jiliu; Zhang, Pei; An, Le; Ma, Guangkai; Kang, Jiayin; Shi, Feng; Shen, Dinggang; Wu, Xi; Lalush, David S; Lin, Weili

    2016-01-01

    Positron emission tomography (PET) has been widely used in clinical diagnosis of diseases and disorders. Obtaining high-quality PET images requires a standard-dose radionuclide (tracer) injection into the human body, which inevitably increases the risk of radiation exposure. One possible solution to this problem is to predict the standard-dose PET image from its low-dose counterpart and its corresponding multimodal magnetic resonance (MR) images. Inspired by the success of patch-based sparse representation (SR) in super-resolution image reconstruction, we propose a mapping-based SR (m-SR) framework for standard-dose PET image prediction. Compared with conventional patch-based SR, our method uses a mapping strategy to ensure that the sparse coefficients, estimated from the multimodal MR images and low-dose PET image, can be applied directly to the prediction of the standard-dose PET image. As the mapping between multimodal MR images (or the low-dose PET image) and standard-dose PET images can be particularly complex, one step of mapping is often insufficient. To this end, an incremental refinement framework is proposed. Specifically, the predicted standard-dose PET image is further mapped to the target standard-dose PET image, and the SR is then performed again to predict a new standard-dose PET image. This procedure can be repeated to iteratively refine the prediction. Also, a patch-selection-based dictionary construction method is used to speed up the prediction process. The proposed method is validated on a human brain dataset. The experimental results show that our method can outperform benchmark methods in both qualitative and quantitative measures. (paper)
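
    The coupled-dictionary flavor of patch-based SR that this work builds on can be sketched as follows (an illustration under toy assumptions, not the paper's m-SR pipeline; all sizes and data are placeholders): dictionaries are learned on paired low-dose and standard-dose patches, a new low-dose patch is sparse-coded against the low-dose dictionary, and the same coefficients reconstruct a standard-dose patch from the paired dictionary.

```python
# Minimal coupled-dictionary sparse-representation sketch with scikit-learn.
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode

rng = np.random.default_rng(0)
n_patches, patch_dim, n_atoms = 500, 64, 32          # toy sizes

low_patches = rng.normal(size=(n_patches, patch_dim))          # stand-in data
std_patches = 2.0 * low_patches + 0.05 * rng.normal(size=low_patches.shape)

# Joint learning: stack paired patches so both dictionaries share coefficients.
joint = np.hstack([low_patches, std_patches])
dl = DictionaryLearning(n_components=n_atoms, transform_algorithm="omp",
                        transform_n_nonzero_coefs=5, max_iter=20, random_state=0)
dl.fit(joint)
D_low, D_std = dl.components_[:, :patch_dim], dl.components_[:, patch_dim:]

new_low = low_patches[:1]                                       # one test patch
codes = sparse_encode(new_low, D_low, algorithm="omp", n_nonzero_coefs=5)
predicted_std = codes @ D_std                                   # predicted patch
print(predicted_std.shape)                                      # (1, 64)
```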

  14. The Role of Instructional Quality within School Sectors: A Multi-Level Analysis

    Science.gov (United States)

    Miller, Saralyn J.

    2013-01-01

    On average, private school students outperform public school students on standardized tests. Research confirms these differences in student scores, but also shows that when student background characteristics are controlled, on average, public school students outperform private school students. Explaining achievement differences between sectors…

  15. Designing and standardization of Persian version of verbal fluency test among Iranian bilingual (Turkish-Persian adolescents

    Directory of Open Access Journals (Sweden)

    Ayyoub Malek

    2013-08-01

    BACKGROUND: The present study aims to design and standardize the verbal fluency test (VFT) among bilingual (Turkish-Persian) adolescents in Tabriz, Iran. METHODS: In the designing stage, 190 adolescents, selected randomly from among guidance and high school students in Tabriz, were classified into three age groups (11-12, 13-15, 16-18). The screening test, including 33 Persian letters and the three categories 'animal', 'fruit', and 'supermarket stuff', together with the SDQ, was administered to them. The results were the three letters 'M', 'D', and 'B' for phonological fluency, and the two categories 'animal' and 'supermarket stuff' for semantic fluency in the Persian language. In the standardization stage, the letters and categories specified in the designing stage were administered in the same order to 302 adolescents. Moreover, 28 adolescents diagnosed with ADHD were selected to estimate the discriminant validity of the VFT. RESULTS: Pearson correlation coefficients between test and retest for the three letters 'M', 'D', and 'B' in phonological fluency were estimated at 0.67, 0.66, and 0.75, respectively, and for the two categories 'animal' and 'supermarket stuff' at 0.80 and 0.65, respectively. All these values were significant (P < 0.01). The discriminant validity, estimated by comparing the scores of normal and ADHD adolescents, showed that the obtained t value was meaningful for all indices except the letter 'B'. The results of MANOVA between the two gender groups were significant at P < 0.05 for the three 'M', 'D', and 'B' variables of verbal fluency and for 'animal' semantic fluency. In both verbal and semantic fluency, the mean performance scores showed that females outperformed males. CONCLUSIONS: The findings of the current study indicated that VFT is reliable in the studied sample group, and has a valid psychometric

  16. Genome-wide identification of significant aberrations in cancer genome.

    Science.gov (United States)

    Yuan, Xiguo; Yu, Guoqiang; Hou, Xuchu; Shih, Ie-Ming; Clarke, Robert; Zhang, Junying; Hoffman, Eric P; Wang, Roger R; Zhang, Zhen; Wang, Yue

    2012-07-27

    Somatic Copy Number Alterations (CNAs) in human genomes are present in almost all human cancers. Systematic efforts to characterize such structural variants must effectively distinguish significant consensus events from random background aberrations. Here we introduce Significant Aberration in Cancer (SAIC), a new method for characterizing and assessing the statistical significance of recurrent CNA units. Three main features of SAIC include: (1) exploiting the intrinsic correlation among consecutive probes to assign a score to each CNA unit instead of single probes; (2) performing permutations on CNA units that preserve correlations inherent in the copy number data; and (3) iteratively detecting Significant Copy Number Aberrations (SCAs) and estimating an unbiased null distribution by applying an SCA-exclusive permutation scheme. We test and compare the performance of SAIC against four peer methods (GISTIC, STAC, KC-SMART, CMDS) on a large number of simulation datasets. Experimental results show that SAIC outperforms peer methods in terms of larger area under the Receiver Operating Characteristic (ROC) curve and increased detection power. We then apply SAIC to analyze structural genomic aberrations acquired in four real cancer genome-wide copy number data sets (ovarian cancer, metastatic prostate cancer, lung adenocarcinoma, glioblastoma). When compared with previously reported results, SAIC successfully identifies most SCAs known to be of biological significance and associated with oncogenes (e.g., KRAS, CCNE1, and MYC) or tumor suppressor genes (e.g., CDKN2A/B). Furthermore, SAIC identifies a number of novel SCAs in these copy number data that encompass tumor related genes and may warrant further studies. Supported by a well-grounded theoretical framework, SAIC has been developed and used to identify SCAs in various cancer copy number data sets, providing useful information to study the landscape of cancer genomes. Open-source and platform-independent SAIC software is available.
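
    As a rough illustration of the kind of permutation testing described, the sketch below scores each genomic unit by its mean copy-number amplitude across samples and compares the scores against a permutation null of maxima. The data, the score, and the permutation scheme are simplified assumptions; SAIC's probe-correlation-aware unit construction and SCA-exclusive permutation are not reproduced.

```python
# Generic max-based permutation test for recurrent copy-number aberration scores
# (illustrative only; not the SAIC algorithm).
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: rows = tumour samples, columns = genomic units,
# values = copy-number log-ratios.
cn = rng.normal(scale=0.3, size=(50, 200))
cn[:, 10] += 0.5                       # implant one recurrent amplification

observed = cn.mean(axis=0)             # per-unit recurrence score

n_perm = 1000
null_max = np.empty(n_perm)
for i in range(n_perm):
    # Permute unit positions independently within each sample to break recurrence
    # while keeping each sample's value distribution.
    permuted = np.apply_along_axis(rng.permutation, 1, cn)
    null_max[i] = permuted.mean(axis=0).max()

# Family-wise p-value for each unit against the null distribution of maxima.
p_values = (null_max[None, :] >= observed[:, None]).mean(axis=1)
print("most significant unit:", observed.argmax(), "p =", p_values[observed.argmax()])
```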

  17. Standard-based comprehensive detection of adverse drug reaction signals from nursing statements and laboratory results in electronic health records.

    Science.gov (United States)

    Lee, Suehyun; Choi, Jiyeob; Kim, Hun-Sung; Kim, Grace Juyun; Lee, Kye Hwa; Park, Chan Hee; Han, Jongsoo; Yoon, Dukyong; Park, Man Young; Park, Rae Woong; Kang, Hye-Ryun; Kim, Ju Han

    2017-07-01

    We propose 2 Medical Dictionary for Regulatory Activities-enabled pharmacovigilance algorithms, MetaLAB and MetaNurse, powered by a per-year meta-analysis technique and an improved subject sampling strategy. This study developed 2 novel algorithms, MetaLAB for laboratory abnormalities and MetaNurse for standard nursing statements, as significantly improved versions of our previous electronic health record (EHR)-based pharmacovigilance method, called CLEAR. Adverse drug reaction (ADR) signals from 117 laboratory abnormalities and 1357 standard nursing statements for all precautionary drugs (n = 101) were comprehensively detected by MetaLAB and MetaNurse and validated against SIDER (Side Effect Resource), covering 11,817 and 76,457 drug-ADR pairs, respectively. We demonstrate that MetaLAB (area under the curve, AUC = 0.61 ± 0.18) outperformed CLEAR (AUC = 0.55 ± 0.06) when we applied the same 470 drug-event pairs as the gold standard, as in our previous research. Receiver operating characteristic curves for 101 precautionary terms in the Medical Dictionary for Regulatory Activities Preferred Terms were obtained for MetaLAB and MetaNurse (0.69 ± 0.11; 0.62 ± 0.07), which complemented each other in terms of ADR signal coverage. Novel ADR signals discovered by MetaLAB and MetaNurse were successfully validated against spontaneous reports in the US Food and Drug Administration Adverse Event Reporting System database. The present study demonstrates the symbiosis of laboratory test results and nursing statements for ADR signal detection in terms of their system organ class coverage and performance profiles. Systematic discovery and evaluation of the wide spectrum of ADR signals using standard-based observational electronic health record data across many institutions will affect drug development and use, as well as postmarketing surveillance and regulation. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  18. The Stock Performance of C. Everett Koop Award Winners Compared With the Standard & Poor's 500 Index

    Science.gov (United States)

    Goetzel, Ron Z.; Fabius, Raymond; Fabius, Dan; Roemer, Enid C.; Thornton, Nicole; Kelly, Rebecca K.; Pelletier, Kenneth R.

    2016-01-01

    Objective: To explore the link between companies investing in the health and well-being programs of their employees and stock market performance. Methods: Stock performance of C. Everett Koop National Health Award winners (n = 26) was measured over time and compared with the average performance of companies comprising the Standard and Poor's (S&P) 500 Index. Results: The Koop Award portfolio outperformed the S&P 500 Index. In the 14-year period tracked (2000–2014), Koop Award winners’ stock values appreciated by 325% compared with the market average appreciation of 105%. Conclusions: This study supports prior and ongoing research demonstrating a higher market valuation—an affirmation of business success by Wall Street investors—of socially responsible companies that invest in the health and well-being of their workers when compared with other publicly traded firms. PMID:26716843

  19. Codon Deviation Coefficient: A novel measure for estimating codon usage bias and its statistical significance

    KAUST Repository

    Zhang, Zhang

    2012-03-22

    Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure, the Codon Deviation Coefficient (CDC), that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions. © 2012 Zhang et al; licensee BioMed Central Ltd.
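
    The sketch below illustrates, in simplified form, the idea of measuring codon usage bias against expectations derived from position-specific nucleotide composition and assessing it by resampling. The L1 distance, the resampling scheme, and the example sequence are assumptions made for illustration and do not reproduce the published CDC definition.

```python
# Simplified composition-corrected codon usage bias with a bootstrap p-value
# (illustrative only; not the published CDC algorithm).
import numpy as np
from collections import Counter
from itertools import product

def codon_bias(seq: str, n_boot: int = 1000, seed: int = 0) -> tuple[float, float]:
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    obs = Counter(codons)
    n = len(codons)

    # Expected codon frequencies from nucleotide composition at each codon position.
    pos_freq = [Counter(c[k] for c in codons) for k in range(3)]
    expected = {"".join(b): (pos_freq[0][b[0]] / n) * (pos_freq[1][b[1]] / n) * (pos_freq[2][b[2]] / n)
                for b in product("ACGT", repeat=3)}

    def distance(counter):
        # L1 distance between observed and composition-expected codon frequencies.
        return sum(abs(counter[c] / n - expected[c]) for c in expected)

    observed_d = distance(obs)

    # Bootstrap: resample codons under the composition-only (null) model.
    rng = np.random.default_rng(seed)
    keys = list(expected)
    probs = np.array([expected[k] for k in keys])
    probs = probs / probs.sum()
    boot_d = [distance(Counter(rng.choice(keys, size=n, p=probs))) for _ in range(n_boot)]
    p_value = float(np.mean(np.array(boot_d) >= observed_d))
    return observed_d, p_value

print(codon_bias("ATGGCTGCTGCTGCTGCTGCAGCAGCTGCTGCATAA"))
```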

  20. MATE standardization

    Science.gov (United States)

    Farmer, R. E.

    1982-11-01

    The MATE (Modular Automatic Test Equipment) program was developed to combat the proliferation of unique, expensive ATE within the Air Force. MATE incorporates a standard management approach and a standard architecture designed to implement a cradle-to-grave approach to the acquisition of ATE and to significantly reduce the life cycle cost of weapons systems support. These standards are detailed in the MATE Guides. The MATE Guides assist both the Air Force and Industry in implementing the MATE concept, and provide the necessary tools and guidance required for successful acquisition of ATE. The guides also provide the necessary specifications for industry to build MATE-qualifiable equipment. The MATE architecture provides standards for all key interfaces of an ATE system. The MATE approach to the acquisition and management of ATE has been jointly endorsed by the commanders of Air Force Systems Command and Air Force Logistics Command as the way of doing business in the future.

  1. Standards for radiation protection instrumentation: design of safety standards and testing procedures

    International Nuclear Information System (INIS)

    Meissner, Frank

    2008-01-01

    This paper describes, by means of examples, the role of safety standards for radiation protection and the associated testing and qualification procedures. The development and qualification of radiation protection instrumentation is a significant part of the work of TUV NORD SysTec, an independent expert organisation in Germany. The German Nuclear Safety Standards Commission (KTA) establishes regulations in the field of nuclear safety. The examples presented may be of importance for governments and nuclear safety authorities, for nuclear operators and for manufacturers worldwide. They demonstrate the advantage of standards in the design of radiation protection instrumentation for new power plants, in the upgrade of existing instrumentation to nuclear safety standards or in the application of safety standards to newly developed equipment. Furthermore, they show how authorities may proceed when safety standards for radiation protection instrumentation are not yet established or require updating. (author)

  2. Updating OSHA Standards Based on National Consensus Standards; Eye and Face Protection. Final rule.

    Science.gov (United States)

    2016-03-25

    On March 13, 2015, OSHA published in the Federal Register a notice of proposed rulemaking (NPRM) to revise its eye and face protection standards for general industry, shipyard employment, marine terminals, longshoring, and construction by updating the references to national consensus standards approved by the American National Standards Institute (ANSI). OSHA received no significant objections from commenters and therefore is adopting the amendments as proposed. This final rule updates the references in OSHA's eye and face standards to reflect the most recent edition of the ANSI/International Safety Equipment Association (ISEA) eye and face protection standard. It removes the oldest-referenced edition of the same ANSI standard. It also amends other provisions of the construction eye and face protection standard to bring them into alignment with OSHA's general industry and maritime standards.

  3. Prognostic significance of standardized uptake value and metabolic tumour volume on {sup 18}F-FDG PET/CT in oropharyngeal squamous cell carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Won; Roh, Jong-Lyel; Choi, Seung-Ho; Nam, Soon Yuhl [University of Ulsan College of Medicine, Department of Otolaryngology, Asan Medical Centre, Seoul (Korea, Republic of); Oh, Jungsu S.; Kim, Jae Seung [University of Ulsan College of Medicine, Department of Nuclear Medicine, Asan Medical Centre, Seoul (Korea, Republic of); Kim, Sang Yoon [University of Ulsan College of Medicine, Department of Otolaryngology, Asan Medical Centre, Seoul (Korea, Republic of); Biomedical Research Institute, Korea Institute of Science and Technology, Seoul (Korea, Republic of)

    2015-08-15

    Standardized uptake value (SUV) and metabolic tumour volume (MTV) measured by {sup 18}F-FDG PET/CT are emerging prognostic biomarkers in human solid cancers. However, their prognostic significance in oropharyngeal squamous cell carcinoma (OPSCC) has been investigated in only a few studies and with small cohorts. In the present study we evaluated the ability of SUV, MTV, and total lesion glycolysis (TLG) measured on pretreatment {sup 18}F-FDG PET/CT to predict recurrence and survival outcomes in OPSCC. The study included 221 patients with OPSCC who underwent pretreatment {sup 18}F-FDG PET/CT imaging and received definitive treatment at our tertiary referral centre. The PET imaging parameters SUV{sub max}, SUV{sub peak}, MTV and TLG were measured in primary tumours with focal {sup 18}F-FDG uptake. Clinical and imaging variables significantly associated with overall survival (OS) and disease-free survival (DFS) were identified by univariate and multivariate analyses using the Cox proportional hazards model. Overall 5-year OS and DFS rates were 72.0 % and 79.5 %, respectively, during a median follow-up of 61 months (range 18 - 122 months). The cut-off values of tumour SUV{sub max}, SUV{sub peak}, MTV and TLG for prediction of DFS were 7.55, 6.80, 11.06 mL and 78.56 g, respectively. Univariate analyses showed that age >60 years, advanced tumour stage, and high tumour SUV{sub max}, SUV{sub peak}, MTV and TLG were significantly associated with decreased OS and DFS (P < 0.05 each). Age, tumour SUV{sub max} and MTV remained independent variables for OS and DFS (P < 0.05 each) in the multivariate analyses. SUV{sub max} and MTV measured on pretreatment {sup 18}F-FDG PET/CT may be useful in predicting the clinical outcomes in OPSCC patients. This study investigated the clinical prognostic value of imaging parameters from pretreatment {sup 18}F-FDG PET/CT in 221 patients who underwent definitive treatment for oropharyngeal squamous cell carcinoma. High maximum standardized

  4. Why envy outperforms admiration.

    Science.gov (United States)

    van de Ven, Niels; Zeelenberg, Marcel; Pieters, Rik

    2011-06-01

    Four studies tested the hypothesis that the emotion of benign envy, but not the emotions of admiration or malicious envy, motivates people to improve themselves. Studies 1 to 3 found that only benign envy was related to the motivation to study more (Study 1) and to actual performance on the Remote Associates Task (which measures intelligence and creativity; Studies 2 and 3). Study 4 found that an upward social comparison triggered benign envy and subsequent better performance only when people thought self-improvement was attainable. When participants thought self-improvement was hard, an upward social comparison led to more admiration and no motivation to do better. Implications of these findings for theories of social emotions such as envy, social comparisons, and for understanding the influence of role models are discussed.

  5. The revised APTA code of ethics for the physical therapist and standards of ethical conduct for the physical therapist assistant: theory, purpose, process, and significance.

    Science.gov (United States)

    Swisher, Laura Lee; Hiller, Peggy

    2010-05-01

    In June 2009, the House of Delegates (HOD) of the American Physical Therapy Association (APTA) passed a major revision of the APTA Code of Ethics for physical therapists and the Standards of Ethical Conduct for the Physical Therapist Assistant. The revised documents will be effective July 1, 2010. The purposes of this article are: (1) to provide a historical, professional, and theoretical context for this important revision; (2) to describe the 4-year revision process; (3) to examine major features of the documents; and (4) to discuss the significance of the revisions from the perspective of the maturation of physical therapy as a doctoring profession. PROCESS OF REVISION: The process for revision is delineated within the context of history and the Bylaws of APTA. FORMAT, STRUCTURE, AND CONTENT OF REVISED CORE ETHICS DOCUMENTS: The revised documents represent a significant change in format, level of detail, and scope of application. Previous APTA Codes of Ethics and Standards of Ethical Conduct for the Physical Therapist Assistant have delineated very broad general principles, with specific obligations spelled out in the Ethics and Judicial Committee's Guide for Professional Conduct and Guide for Conduct of the Physical Therapist Assistant. In contrast to the current documents, the revised documents address all 5 roles of the physical therapist, delineate ethical obligations in organizational and business contexts, and align with the tenets of Vision 2020. The significance of this revision is discussed within historical parameters, the implications for physical therapists and physical therapist assistants, the maturation of the profession, societal accountability and moral community, potential regulatory implications, and the inclusive and deliberative process of moral dialogue by which changes were developed, revised, and approved.

  6. Invasive Acer negundo outperforms native species in non-limiting resource environments due to its higher phenotypic plasticity.

    Science.gov (United States)

    Porté, Annabel J; Lamarque, Laurent J; Lortie, Christopher J; Michalet, Richard; Delzon, Sylvain

    2011-11-24

    To identify the determinants of invasiveness, comparisons of traits of invasive and native species are commonly performed. Invasiveness is generally linked to higher values of reproductive, physiological and growth-related traits of the invasives relative to the natives in the introduced range. Phenotypic plasticity of these traits has also been cited to increase the success of invasive species but has been little studied in invasive tree species. In a greenhouse experiment, we compared ecophysiological traits between a species invasive in Europe, Acer negundo, and co-occurring early- and late-successional native species, under different light, nutrient availability and disturbance regimes. We also compared species of the same species groups in situ, in riparian forests. Under non-limiting resources, A. negundo seedlings showed higher growth rates than the native species. However, A. negundo displayed equivalent or lower photosynthetic capacities and nitrogen content per unit leaf area compared to the native species; these findings were observed both in the seedlings in the greenhouse experiment and in adult trees in situ. These physiological traits were mostly conservative across the different light, nutrient and disturbance environments. Overall, under non-limiting light and nutrient conditions, specific leaf area and total leaf area of A. negundo were substantially larger. The invasive species presented a higher plasticity in allocation to foliage, and therefore in growth, with increasing nutrient and light availability relative to the native species. The higher plasticity of the invasive species in foliage allocation in response to light and nutrient availability induced better growth in non-limiting resource environments. These results provide further insight into the invasiveness of A. negundo and suggest that such behaviour could explain the ability of A. negundo to outperform native tree species and contributes to its spread in European resource

  7. 18 CFR 415.42 - Technical standards.

    Science.gov (United States)

    2010-04-01

    ...) Standards used by state and local governments shall conform in principle to Commission standards but may.... Any significant difference shall be reviewed with and subject to approval by the Executive Director...

  8. A Novel Activated-Charcoal-Doped Multiwalled Carbon Nanotube Hybrid for Quasi-Solid-State Dye-Sensitized Solar Cell Outperforming Pt Electrode.

    Science.gov (United States)

    Arbab, Alvira Ayoub; Sun, Kyung Chul; Sahito, Iftikhar Ali; Qadir, Muhammad Bilal; Choi, Yun Seon; Jeong, Sung Hoon

    2016-03-23

    Highly conductive mesoporous carbon structures based on multiwalled carbon nanotubes (MWCNTs) and activated charcoal (AC) were synthesized by an enzymatic dispersion method. The synthesized carbon configuration consists of synchronized structures of highly conductive MWCNTs and a porous activated charcoal morphology. The proposed carbon structure was used as a counter electrode (CE) for quasi-solid-state dye-sensitized solar cells (DSSCs). The AC-doped MWCNT hybrid showed much enhanced electrocatalytic activity (ECA) toward the polymer gel electrolyte and revealed a charge transfer resistance (RCT) of 0.60 Ω, demonstrating a fast electron transport mechanism. The exceptional electrocatalytic activity and high conductivity of the AC-doped MWCNT hybrid CE are associated with its synchronized features of high surface area and electronic conductivity, which produce a higher interfacial reaction with the quasi-solid electrolyte. Morphological studies confirm the formation of an amorphous and conductive 3D carbon structure with a high density of CNT colloid. The abundant oxygen surface groups and defect-rich structure can entrap a large volume of quasi-solid electrolyte and provide multiple sites for the iodide/triiodide catalytic reaction. The resultant D719 DSSC composed of this novel hybrid CE, fabricated with polymer gel electrolyte, demonstrated an efficiency of 10.05% with a high fill factor (83%), outperforming the Pt electrode. Such facile synthesis of the CE, together with low cost and sustainability, supports the proposed DSSC structure to stand out as an efficient next-generation photovoltaic device.

  9. Human dental age estimation using third molar developmental stages: does a Bayesian approach outperform regression models to discriminate between juveniles and adults?

    Science.gov (United States)

    Thevissen, P W; Fieuws, S; Willems, G

    2010-01-01

    Dental age estimation methods based on the radiologically detected third molar developmental stages are implemented in forensic age assessments to discriminate between juveniles and adults, in particular when judging the age of young unaccompanied asylum seekers. Accurate and unbiased age estimates combined with appropriately quantified uncertainties are the required properties for accurate forensic reporting. In this study, a subset of 910 individuals uniformly distributed in age between 16 and 22 years was selected from an existing dataset collected by Gunst et al. containing 2,513 panoramic radiographs with known third molar developmental stages of Belgian Caucasian men and women. This subset was randomly split into a training set, used to develop a classical regression analysis and a Bayesian model for the multivariate distribution of the third molar developmental stages conditional on age, and a test set, used to assess the performance of both models. The aim of this study was to verify whether the Bayesian approach differentiates the age of maturity more precisely and removes the bias that systematically disadvantages young individuals through overestimation of their age. The Bayesian model discriminates subjects older than 18 years more appropriately and produces more meaningful prediction intervals, but does not strongly outperform the classical approaches.

  10. Maximum standard uptake value on pre-chemotherapeutic FDG-PET is a significant parameter for disease progression of newly diagnosed lymphoma

    International Nuclear Information System (INIS)

    Eo, Jae Seon; Lee, Won Woo; Chung, June Key; Lee, Myung Chul; Kim, Sang Eun

    2005-01-01

    F-18 FDG-PET is useful for detection and staging of lymphoma. We investigated the prognostic significance of the maximum standardized uptake value (maxSUV) on FDG-PET for newly diagnosed lymphoma patients before chemotherapy. Twenty-seven patients (male:female = 17:10; age 49 ± 19 years) with newly diagnosed lymphoma were enrolled. Nineteen patients had B-cell lymphoma, 6 Hodgkin's disease and 2 T-cell lymphoma. One patient was stage I, 9 stage II, 3 stage III, 1 stage IV and 13 others. All patients underwent FDG-PET before initiation of chemotherapy. MaxSUV values normalized to lean body weight were obtained for the main and largest lesion to represent each patient's maxSUV. Disease progression was defined as a total change of the chemotherapeutic regimen or the addition of a new chemotherapeutic agent during the follow-up period. The observation period was 389 ± 224 days. The value of maxSUV ranged from 3 to 18 (mean ± SD = 10.6 ± 4.4). Disease progression occurred in 6 patients. Using Cox proportional-hazards regression analysis, maxSUV was identified as a significant parameter for disease-progression-free survival (p = 0.044). Kaplan-Meier survival curve analysis revealed that the group with higher maxSUV (≥10.6, n = 5) had shorter disease-progression-free survival (median 299 days) than the group with lower maxSUV (<10.6, n = 22) (median 378 days, p = 0.0146). We found that maxSUV on pre-chemotherapeutic F-18 FDG-PET for newly diagnosed lymphoma patients is a significant parameter for disease progression. Lymphoma patients can be stratified before initiation of chemotherapy in terms of disease progression by a maxSUV cut-off of 10.6.
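
    A hedged sketch of the type of survival analysis described (Cox proportional hazards and Kaplan-Meier stratification by a maxSUV cut-off) is shown below using the lifelines package. Only the 10.6 cut-off comes from the abstract; the patient data are synthetic.

```python
# Illustrative Cox and Kaplan-Meier analysis on synthetic data (not the study's data).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(5)
n = 27
df = pd.DataFrame({
    "maxSUV": rng.uniform(3, 18, n),
    "days_to_event_or_censor": rng.integers(100, 800, n),
    "progressed": rng.integers(0, 2, n),          # 1 = disease progression observed
})

# Cox proportional-hazards regression of progression-free survival on maxSUV.
cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_event_or_censor", event_col="progressed")
cph.print_summary()

# Kaplan-Meier curves for the two groups split at maxSUV 10.6.
kmf = KaplanMeierFitter()
for is_high, grp in df.groupby(df["maxSUV"] >= 10.6):
    kmf.fit(grp["days_to_event_or_censor"], grp["progressed"],
            label=f"maxSUV >= 10.6: {is_high}")
    print(is_high, kmf.median_survival_time_)
```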

  11. Genome-wide identification of significant aberrations in cancer genome

    Directory of Open Access Journals (Sweden)

    Yuan Xiguo

    2012-07-01

    Full Text Available Abstract Background Somatic Copy Number Alterations (CNAs) in human genomes are present in almost all human cancers. Systematic efforts to characterize such structural variants must effectively distinguish significant consensus events from random background aberrations. Here we introduce Significant Aberration in Cancer (SAIC), a new method for characterizing and assessing the statistical significance of recurrent CNA units. Three main features of SAIC include: (1) exploiting the intrinsic correlation among consecutive probes to assign a score to each CNA unit instead of single probes; (2) performing permutations on CNA units that preserve correlations inherent in the copy number data; and (3) iteratively detecting Significant Copy Number Aberrations (SCAs) and estimating an unbiased null distribution by applying an SCA-exclusive permutation scheme. Results We test and compare the performance of SAIC against four peer methods (GISTIC, STAC, KC-SMART, CMDS) on a large number of simulation datasets. Experimental results show that SAIC outperforms peer methods in terms of larger area under the Receiver Operating Characteristic curve and increased detection power. We then apply SAIC to analyze structural genomic aberrations acquired in four real cancer genome-wide copy number data sets (ovarian cancer, metastatic prostate cancer, lung adenocarcinoma, glioblastoma). When compared with previously reported results, SAIC successfully identifies most SCAs known to be of biological significance and associated with oncogenes (e.g., KRAS, CCNE1, and MYC) or tumor suppressor genes (e.g., CDKN2A/B). Furthermore, SAIC identifies a number of novel SCAs in these copy number data that encompass tumor related genes and may warrant further studies. Conclusions Supported by a well-grounded theoretical framework, SAIC has been developed and used to identify SCAs in various cancer copy number data sets, providing useful information to study the landscape of cancer genomes

  12. Identification of significant features by the Global Mean Rank test.

    Science.gov (United States)

    Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph

    2014-01-01

    With the introduction of omics-technologies such as transcriptomics and proteomics, numerous methods for the reliable identification of significantly regulated features (genes, proteins, etc.) have been developed. Experimental practice requires these tests to successfully deal with conditions such as small numbers of replicates, missing values, non-normally distributed expression levels, and non-identical distributions of features. With the MeanRank test we aimed to develop a test that performs robustly under these conditions while scaling favorably with the number of replicates. The test proposed here is a global one-sample location test, which is based on the mean ranks across replicates, and internally estimates and controls the false discovery rate. Furthermore, missing data are accounted for without the need of imputation. In extensive simulations comparing MeanRank to other frequently used methods, we found that it performs well with small and large numbers of replicates, feature-dependent variance between replicates, and variable regulation across features, both on simulated data and on a recent two-color microarray spike-in dataset. The tests were then used to identify significant changes in the phosphoproteomes of cancer cells induced by the kinase inhibitors erlotinib and 3-MB-PP1 in two independently published mass spectrometry-based studies. MeanRank outperformed the other global rank-based methods applied in this study. Compared to the popular Significance Analysis of Microarrays and Linear Models for Microarray methods, MeanRank performed similarly or better. Furthermore, MeanRank exhibits more consistent behavior regarding the degree of regulation and is robust against the choice of preprocessing methods. MeanRank does not require any imputation of missing values, is easy to understand, and yields results that are easy to interpret. The software implementing the algorithm is freely available for academic and commercial use.
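
    The following is a simplified, one-sided sketch of a mean-rank location test with a permutation null and a Benjamini-Hochberg correction, intended only to convey the flavour of the approach; the published MeanRank test's internal false-discovery-rate estimation and handling of missing values are not reproduced, and all data are synthetic.

```python
# Simplified mean-rank test with permutation null and BH correction (illustrative only).
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(2)
# Hypothetical log2-ratios: rows = features (genes/proteins), columns = replicates.
data = rng.normal(size=(1000, 4))
data[:50] += 1.0                                   # 50 truly up-regulated features

# Rank features within each replicate, then average the ranks per feature.
ranks = np.apply_along_axis(rankdata, 0, data)
mean_rank = ranks.mean(axis=1)

# Null distribution of mean ranks by shuffling values within each replicate.
n_perm = 200
null = np.empty((n_perm, data.shape[0]))
for i in range(n_perm):
    shuffled = np.apply_along_axis(rng.permutation, 0, data)
    null[i] = np.apply_along_axis(rankdata, 0, shuffled).mean(axis=1)

# Empirical one-sided p-values and a crude Benjamini-Hochberg step-up adjustment.
p = (null >= mean_rank[None, :]).mean(axis=0)
order = np.argsort(p)
bh = p[order] * len(p) / (np.arange(len(p)) + 1)
adjusted = np.minimum.accumulate(bh[::-1])[::-1]
print("features passing FDR < 0.05:", int((adjusted < 0.05).sum()))
```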

  13. Combination of supervised and semi-supervised regression models for improved unbiased estimation

    DEFF Research Database (Denmark)

    Arenas-Garía, Jeronimo; Moriana-Varo, Carlos; Larsen, Jan

    2010-01-01

    In this paper we investigate the steady-state performance of semisupervised regression models adjusted using a modified RLS-like algorithm, identifying the situations where the new algorithm is expected to outperform standard RLS. By using an adaptive combination of the supervised and semisupervised models…
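
    For readers unfamiliar with the supervised baseline referred to above, a generic recursive least squares (RLS) update is sketched below; the forgetting factor and the synthetic data are assumptions, and the paper's adaptive combination of supervised and semi-supervised models is not reproduced.

```python
# Generic recursive least squares (RLS) filter (illustrative baseline only).
import numpy as np

def rls(X, d, lam=0.99, delta=100.0):
    """Adapt weights w so that x_n^T w tracks d_n, with forgetting factor lam."""
    n_features = X.shape[1]
    w = np.zeros(n_features)
    P = delta * np.eye(n_features)            # estimate of the inverse correlation matrix
    for x, target in zip(X, d):
        k = P @ x / (lam + x @ P @ x)         # gain vector
        e = target - w @ x                    # a priori error
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
    return w

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 3))
true_w = np.array([0.5, -1.0, 2.0])
d = X @ true_w + rng.normal(scale=0.1, size=500)
print(rls(X, d))                               # approaches true_w
```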

  14. 40 CFR 403.6 - National pretreatment standards: Categorical standards.

    Science.gov (United States)

    2010-07-01

    ... falls within that particular subcategory. If an existing Industrial User adds or changes a process or... best of my knowledge and belief, true, accurate, and complete. I am aware that there are significant... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS GENERAL PRE-TREAT-MENT REGULATIONS FOR EXIST-ING AND NEW...

  15. Profitability of simple technical trading rules of Chinese stock exchange indexes

    Science.gov (United States)

    Zhu, Hong; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing

    2015-12-01

    Although technical trading rules have been widely used by practitioners in financial markets, their profitability still remains controversial. We here investigate the profitability of moving average (MA) and trading range break (TRB) rules by using the Shanghai Stock Exchange Composite Index (SHCI) from May 21, 1992 through December 31, 2013 and the Shenzhen Stock Exchange Component Index (SZCI) from April 3, 1991 through December 31, 2013. The t-test is adopted to check whether the mean returns conditioned on the trading signals are significantly different from unconditioned returns, and whether the mean returns conditioned on the buy signals are significantly different from the mean returns conditioned on the sell signals. We find that TRB rules outperform MA rules and short-term variable moving average (VMA) rules outperform long-term VMA rules. By applying White's Reality Check test and accounting for data snooping effects, we find that the best trading rule outperforms the buy-and-hold strategy when transaction costs are not taken into consideration. Once transaction costs are included, trading profits are eliminated completely. Our analysis suggests that simple trading rules like MA and TRB cannot beat the standard buy-and-hold strategy for the Chinese stock exchange indexes.
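
    A minimal illustration of a moving-average crossover rule and the conditional-return t-tests described above is sketched below on a synthetic price series; the window lengths and data are assumptions, and neither the SHCI/SZCI data, the TRB rule, nor White's Reality Check is reproduced.

```python
# Moving-average trading rule with t-tests on conditional returns (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))   # synthetic index levels
returns = np.diff(np.log(prices))

short_w, long_w = 5, 50

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="valid")

ma_short = moving_average(prices, short_w)[long_w - short_w:]   # align both MAs
ma_long = moving_average(prices, long_w)
signal = np.where(ma_short > ma_long, 1, -1)                    # 1 = buy, -1 = sell

# The signal formed on day t applies to the next day's return.
aligned = returns[long_w - 1:]
n = min(len(signal) - 1, len(aligned))
buy_ret = aligned[:n][signal[:n] == 1]
sell_ret = aligned[:n][signal[:n] == -1]

t_buy = stats.ttest_1samp(buy_ret, returns.mean())              # buy vs unconditional mean
t_buy_sell = stats.ttest_ind(buy_ret, sell_ret, equal_var=False)  # buy vs sell
print(t_buy.statistic, t_buy_sell.statistic)
```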

  16. HDL-LDL Ratio: A Significant Predisposition to the Onset of ...

    African Journals Online (AJOL)

    The significance of the high-density lipoprotein/low-density lipoprotein (HDL-LDL) ratio as a predisposing factor to the onset of atherogenesis has been studied. A standard enzymatic method using a cholesterol kit was used to determine cholesterol. HDL was analysed using a standard HDL kit, and the LDL concentration was derived by a ...

  17. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  18. Isolating DNA from sexual assault cases: a comparison of standard methods with a nuclease-based approach

    Science.gov (United States)

    2012-01-01

    Background Profiling sperm DNA present on vaginal swabs taken from rape victims often contributes to identifying and incarcerating rapists. Large amounts of the victim’s epithelial cells contaminate the sperm present on swabs, however, and complicate this process. The standard method for obtaining relatively pure sperm DNA from a vaginal swab is to digest the epithelial cells with Proteinase K in order to solubilize the victim’s DNA, and to then physically separate the soluble DNA from the intact sperm by pelleting the sperm, removing the victim’s fraction, and repeatedly washing the sperm pellet. An alternative approach that does not require washing steps is to digest with Proteinase K, pellet the sperm, remove the victim’s fraction, and then digest the residual victim’s DNA with a nuclease. Methods The nuclease approach has been commercialized in a product, the Erase Sperm Isolation Kit (PTC Labs, Columbia, MO, USA), and five crime laboratories have tested it on semen-spiked female buccal swabs in a direct comparison with their standard methods. Comparisons have also been performed on timed post-coital vaginal swabs and evidence collected from sexual assault cases. Results For the semen-spiked buccal swabs, Erase outperformed the standard methods in all five laboratories and in most cases was able to provide a clean male profile from buccal swabs spiked with only 1,500 sperm. The vaginal swabs taken after consensual sex and the evidence collected from rape victims showed a similar pattern of Erase providing superior profiles. Conclusions In all samples tested, STR profiles of the male DNA fractions obtained with Erase were as good as or better than those obtained using the standard methods. PMID:23211019

  19. Digital Quantum Simulation of Spin Models with Circuit Quantum Electrodynamics

    OpenAIRE

    Salathé, Y.; Mondal, M.; Oppliger, M.; Heinsoo, J.; Kurpiers, P.; Potočnik, A.; Mezzacapo, Antonio; Las Heras García, Urtzi; Lamata Manuel, Lucas; Solano Villanueva, Enrique Leónidas; Filipp, S.; Wallraff, A.

    2015-01-01

    Systems of interacting quantum spins show a rich spectrum of quantum phases and display interesting many-body dynamics. Computing characteristics of even small systems on conventional computers poses significant challenges. A quantum simulator has the potential to outperform standard computers in calculating the evolution of complex quantum systems. Here, we perform a digital quantum simulation of the paradigmatic Heisenberg and Ising interacting spin models using a two transmon-qubit circuit...

  20. Advanced materials and processes for polymer solar cell devices

    DEFF Research Database (Denmark)

    Petersen, Martin Helgesen; Søndergaard, Roar; Krebs, Frederik C

    2010-01-01

    The rapidly expanding field of polymer and organic solar cells is reviewed in the context of materials, processes and devices that significantly deviate from the standard approach, which involves rigid glass substrates, indium-tin-oxide electrodes, spincoated layers of conjugated polymer/fullerene… Materials and processes are included that may be performing less than the current state-of-the-art in their present form but that may have the potential to outperform these pending a larger investment in effort.

  1. ExSTA: External Standard Addition Method for Accurate High-Throughput Quantitation in Targeted Proteomics Experiments.

    Science.gov (United States)

    Mohammed, Yassene; Pan, Jingxi; Zhang, Suping; Han, Jun; Borchers, Christoph H

    2018-03-01

    Targeted proteomics using MRM with stable-isotope-labeled internal-standard (SIS) peptides is the current method of choice for protein quantitation in complex biological matrices. Better quantitation can be achieved with the internal standard-addition method, where successive increments of the synthesized natural form (NAT) of the endogenous analyte are added to each sample, a response curve is generated, and the endogenous concentration is determined at the x-intercept. Internal NAT-addition, however, requires multiple analyses of each sample, resulting in increased sample consumption and analysis time. To compare the following three methods, an MRM assay for 34 high-to-moderate abundance human plasma proteins is used: classical internal SIS-addition; internal NAT-addition; and external NAT-addition, in which the response curve is generated in buffer using NAT and SIS peptides. Using endogenous-free chicken plasma, the accuracy is also evaluated. The internal NAT-addition outperforms the other two in precision and accuracy. However, the curves derived by internal vs. external NAT-addition differ by only ≈3.8% in slope, providing comparable accuracies and precision with good CV values. While the internal NAT-addition method may be "ideal", this new external NAT-addition can be used to determine the concentration of high-to-moderate abundance endogenous plasma proteins, providing a robust and cost-effective alternative for clinical analyses or other high-throughput applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
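
    The standard-addition calculation itself reduces to fitting a line through the response curve and reading off the x-intercept, as in the short sketch below; the added amounts and responses are made-up numbers, not data from the study.

```python
# Minimal standard-addition calculation: endogenous amount from the x-intercept.
import numpy as np

# Added NAT peptide amounts (fmol) and measured MRM response ratios (analyte / SIS).
added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
response = np.array([0.21, 0.43, 0.64, 1.07, 1.92])   # hypothetical responses

# Fit a straight line: response = slope * added + intercept.
slope, intercept = np.polyfit(added, response, 1)

# The endogenous amount is the magnitude of the x-intercept (where response would be zero).
endogenous = -intercept / slope
print(f"estimated endogenous amount: {endogenous:.2f} fmol")
```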

  2. Performance monitoring and error significance in patients with obsessive-compulsive disorder.

    Science.gov (United States)

    Endrass, Tanja; Schuermann, Beate; Kaufmann, Christan; Spielberg, Rüdiger; Kniesche, Rainer; Kathmann, Norbert

    2010-05-01

    Performance monitoring has consistently been found to be overactive in obsessive-compulsive disorder (OCD). The present study examines whether performance monitoring in OCD is adjusted with error significance. Therefore, errors in a flanker task were followed by neutral (standard condition) or punishment feedback (punishment condition). In the standard condition, patients had significantly larger error-related negativity (ERN) and correct-related negativity (CRN) amplitudes than controls. In the punishment condition, however, the groups did not differ in ERN and CRN amplitudes. While healthy controls showed an amplitude enhancement between the standard and punishment conditions, OCD patients showed no variation. In contrast, group differences were not found for the error positivity (Pe): both groups had larger Pe amplitudes in the punishment condition. Results confirm earlier findings of overactive error monitoring in OCD. The absence of a variation with error significance might indicate that OCD patients are unable to down-regulate their monitoring activity according to external requirements. Copyright 2010 Elsevier B.V. All rights reserved.

  3. 76 FR 75782 - Revising Standards Referenced in the Acetylene Standard

    Science.gov (United States)

    2011-12-05

    ... Determinations A. Legal Considerations B. Final Economic Analysis and Regulatory Flexibility Act Certification C... within the meaning of Section 652(8) when a significant risk of material harm exists in the workplace and the standard would substantially reduce or eliminate that workplace risk. This DFR will not reduce the...

  4. Comparisons of ANSI standards cited in the NRC standard review plan, NUREG-0800 and related documents

    International Nuclear Information System (INIS)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.

    1995-11-01

    This report provides the results of comparisons of the cited and latest versions of ANSI standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG 0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review

  5. Comparisons of ASTM standards cited in the NRC standard review plan, NUREG-0800 and related documents

    International Nuclear Information System (INIS)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.

    1995-10-01

    This report provides the results of comparisons of the cited and latest versions of ASTM standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG 0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review

  6. Routine Laboratory Blood Tests May Diagnose Significant Fibrosis in Liver Transplant Recipients with Chronic Hepatitis C: A 10 Year Experience.

    Science.gov (United States)

    Sheen, Victoria; Nguyen, Heajung; Jimenez, Melissa; Agopian, Vatche; Vangala, Sitaram; Elashoff, David; Saab, Sammy

    2016-03-28

    The aims of our study were to determine whether routine blood tests, the aspartate aminotransferase (AST) to Platelet Ratio Index (APRI) and Fibrosis 4 (Fib-4) scores, were associated with advanced fibrosis, and to create a novel model in liver transplant recipients with chronic hepatitis C virus (HCV). We performed a cross-sectional study of patients at The University of California at Los Angeles (UCLA) Medical Center who underwent liver transplantation for HCV. We used linear mixed effects models to analyze the association between fibrosis severity and individual biochemical markers, and mixed effects logistic regression to construct diagnostic models for advanced fibrosis (METAVIR F3-4). Cross-validation was used to estimate a receiver operating characteristic (ROC) curve for the prediction models and to estimate the area under the curve (AUC). The mean (± standard deviation [SD]) age of our cohort was 55 (±7.7) years, and almost three quarters were male. The mean (±SD) time from transplant to liver biopsy was 19.9 (±17.1) months. The mean (±SD) APRI and Fib-4 scores were 3 (±12) and 7 (±14), respectively. Increased fibrosis was associated with lower platelet count and alanine aminotransferase (ALT) values and higher total bilirubin and Fib-4 scores. We developed a model that takes into account age, gender, platelet count, ALT, and total bilirubin; this model outperformed APRI and Fib-4 with an AUC of 0.68 and identified significant fibrosis more reliably than the APRI and Fib-4 scores. This noninvasive calculation may be used clinically to identify liver transplant recipients with HCV with significant liver damage.
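
    For reference, the sketch below computes the APRI and FIB-4 scores using their commonly cited definitions; the example values are illustrative, and the study's own mixed-effects model is not reproduced.

```python
# Commonly used definitions of the APRI and FIB-4 scores (illustrative values only).
import math

def apri(ast_u_l: float, ast_uln_u_l: float, platelets_10e9_l: float) -> float:
    """AST-to-Platelet Ratio Index: (AST / upper limit of normal) / platelets x 100."""
    return (ast_u_l / ast_uln_u_l) / platelets_10e9_l * 100.0

def fib4(age_years: float, ast_u_l: float, alt_u_l: float, platelets_10e9_l: float) -> float:
    """FIB-4: (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Example: a hypothetical 55-year-old recipient with AST 80 U/L (ULN 40 U/L),
# ALT 60 U/L, and platelets 110 x 10^9/L.
print(round(apri(80, 40, 110), 2))       # ~1.82
print(round(fib4(55, 80, 60, 110), 2))   # ~5.16
```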

  7. A study on the effects of the CAFE standard on consumers

    International Nuclear Information System (INIS)

    Jun, Seung-Pyo; Yoo, Hyoung Sun; Kim, Ji-Hui

    2016-01-01

    In this paper, we analyzed how the CAFE standard has affected improvements in the fuel economy of vehicles, as examined in other preceding studies, but in addition, we also analyzed how these standards have affected the level of consumer interest in fuel economy. Our goal was to determine what effects the government intervention has had on consumers, and whether such intervention ought to be continued. The results showed that not only has the CAFE standard had a direct and significant impact on improving fuel economy and increasing the market share of fuel-efficient vehicles, it has also boosted the development of technologies for enhancing fuel economy and raised consumer interest in fuel economy, thus indirectly contributing to overcoming market failure. The significance of this study is that we used publically available observed data and analyzed the recent impact of the CAFE standard specifically with a focus on the behavior and strategies exhibited by consumers and automakers. Another significance of this study is that it extends our purview to examine the effects that the CAFE standard has had in other countries (Korea). - Highlights: •CAFE standards have raised consumer interest in fuel economy such as MPG. •CAFE standards had a significant impact on increasing fuel-efficient vehicles •Sales of HEVs are more significantly affected by CAFE standards than by WTI. •CAFE standards had a significant impact on a foreign vehicle market. •Analysis suggests the standards will continue to be necessary for market growth.

  8. 76 FR 75840 - Revising Standards Referenced in the Acetylene Standard

    Science.gov (United States)

    2011-12-05

    ... Flexibility Act Certification C. OMB Review Under the Paperwork Reduction Act of 1995 D. Federalism E. State... meaning of Section 652(8) when a significant risk of material harm exists in the workplace and the standard would substantially reduce or eliminate that workplace risk. This NPRM would not reduce the...

  9. Experimental Standards in Sustainability Transitions

    DEFF Research Database (Denmark)

    Hale, Lara Anne

    In this thesis I address how experimental standards are used in the new governance paradigm to further sustainability transitions. Focusing on the case of the Active House standard in the building sector, I investigate experimental standards in three research papers examining the following dynamics: (1) the relationship between commensuration and legitimacy in the formulation and diffusion of a standard’s specifications; (2) the role of awareness in standardizing green default rules to establish sustainable consumption in buildings; and (3) the significance of focus on humans in the development of technological standards for sustainable building. Launching from a critical realist social ontology, I collected ethnographic data on the Active House Alliance, its cofounder VELUX, and three of their demonstration building projects in Austria, Germany, and Belgium over the course of three years from 2013

  10. Accounting Standards: What Do They Mean?

    Science.gov (United States)

    Farley, Jerry B.

    1992-01-01

    Four recent and proposed changes in national school accounting standards have significant policy implications for colleges and universities. These changes address (1) standards regarding postemployment benefits other than pensions, (2) depreciation, (3) financial report format, and (4) contributions and pledges made to the school. Governing boards…

  11. Updating OSHA standards based on national consensus standards. final rule; confirmation of effective date.

    Science.gov (United States)

    2008-03-14

    OSHA is confirming the effective date of its direct final rule that revises a number of standards for general industry that refer to national consensus standards. The direct final rule states that it would become effective on March 13, 2008 unless OSHA receives significant adverse comment on these revisions by January 14, 2008. OSHA received no adverse comments by that date and, therefore, is confirming that the rule will become effective on March 13, 2008.

  12. A population study comparing screening performance of prototypes for depression and anxiety with standard scales

    Directory of Open Access Journals (Sweden)

    Christensen Helen

    2011-11-01

    Full Text Available Abstract Background Screening instruments for mental disorders need to be short, engaging, and valid. Current screening instruments are usually questionnaire-based and may be opaque to the user. A prototype approach, where individuals identify with a description of an individual with typical symptoms of depression, anxiety, social phobia or panic, may be a shorter, faster and more acceptable method for screening. The aim of the study was to evaluate the accuracy of four new prototype screeners for predicting depression and anxiety disorders and to compare their performance with existing scales. Methods Short and ultra-short prototypes were developed for Major Depressive Disorder (MDD), Generalised Anxiety Disorder (GAD), Panic Disorder (PD) and Social Phobia (SP). Prototypes were compared to typical short and ultra-short self-report screening scales, such as the Centre for Epidemiology Scale (CES-D) and the GAD-7, and their short forms. The Mini International Neuropsychiatric Interview (MINI), version 6, was used as the gold standard for obtaining clinical criteria through a telephone interview. From a population sample, 225 individuals who endorsed a prototype and 101 who did not were administered the MINI. Receiver operating characteristic (ROC) curves were plotted for the short and ultra-short prototypes and for the short and ultra-short screening scales. Results The study found that the rates of endorsement of the prototypes were commensurate with prevalence estimates. The short and ultra-short scales outperformed the short and ultra-short prototypes for every disorder except GAD, where the GAD prototype outperformed the GAD-7. Conclusions The findings suggest that people may be able to self-identify generalised anxiety more accurately than depression based on a description of a prototypical case. However, levels of identification were lower than expected. Considerable benefits from this method of screening may ensue if our prototypes can be

  13. ['Gold standard', not 'golden standard'

    NARCIS (Netherlands)

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same

  14. Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) significantly improve prostate cancer detection at initial biopsy in a total PSA range of 2-10 ng/ml.

    Science.gov (United States)

    Ferro, Matteo; Bruzzese, Dario; Perdonà, Sisto; Marino, Ada; Mazzarella, Claudia; Perruolo, Giuseppe; D'Esposito, Vittoria; Cosimato, Vincenzo; Buonerba, Carlo; Di Lorenzo, Giuseppe; Musi, Gennaro; De Cobelli, Ottavio; Chun, Felix K; Terracciano, Daniela

    2013-01-01

    Many efforts to reduce prostate-specific antigen (PSA) overdiagnosis and overtreatment have been made. To this aim, the Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) have been proposed as new, more specific biomarkers. We evaluated the ability of phi and PCA3 to identify prostate cancer (PCa) at initial prostate biopsy in men with total PSA in the range of 2-10 ng/ml. The performance of phi and PCA3 was evaluated in 300 patients undergoing first prostate biopsy. ROC curve analyses tested the accuracy (AUC) of phi and PCA3 in predicting PCa. Decision curve analyses (DCA) were used to compare the clinical benefit of the two biomarkers. We found that the AUC value of phi (0.77) was comparable to those of %p2PSA (0.76) and PCA3 (0.73), with no significant differences in pairwise comparisons (%p2PSA vs phi p = 0.673, %p2PSA vs PCA3 p = 0.417 and phi vs PCA3 p = 0.247). These three biomarkers significantly outperformed fPSA (AUC = 0.60), %fPSA (AUC = 0.62) and p2PSA (AUC = 0.63). At DCA, phi and PCA3 exhibited a very close net benefit profile up to a threshold probability of 25%, beyond which the phi index showed a higher net benefit than PCA3. Multivariable analysis showed that the addition of phi and PCA3 to the base multivariable model (age, PSA, %fPSA, DRE, prostate volume) increased predictive accuracy, whereas no model improved single-biomarker performance. Finally, we showed that subjects with active surveillance (AS)-compatible cancer had significantly lower phi and PCA3 values. In conclusion, phi and PCA3 comparably increase the accuracy in predicting the presence of PCa in the total PSA range of 2-10 ng/ml at initial biopsy, outperforming the currently used %fPSA.
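
    The phi index referred to above is commonly computed as [-2]proPSA divided by free PSA, multiplied by the square root of total PSA. The sketch below assumes that definition and the usual assay unit convention (p2PSA in pg/ml, free and total PSA in ng/ml); the example values are invented, and thresholds must come from the assay documentation.

```python
# Commonly cited Prostate Health Index calculation (assumed definition; illustrative values).
import math

def prostate_health_index(p2psa_pg_ml: float, free_psa_ng_ml: float, total_psa_ng_ml: float) -> float:
    """phi = (p2PSA / fPSA) * sqrt(tPSA), with p2PSA in pg/ml and fPSA, tPSA in ng/ml."""
    return (p2psa_pg_ml / free_psa_ng_ml) * math.sqrt(total_psa_ng_ml)

# Hypothetical patient: p2PSA 15 pg/ml, free PSA 0.5 ng/ml, total PSA 6 ng/ml.
print(round(prostate_health_index(15.0, 0.5, 6.0), 1))
```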

  15. Novel CO2 laser robotic controller outperforms experienced laser operators in tasks of accuracy and performance repeatability.

    Science.gov (United States)

    Wong, Yu-Tung; Finley, Charles C; Giallo, Joseph F; Buckmire, Robert A

    2011-08-01

    To introduce a novel method of combining robotics and the CO(2) laser micromanipulator to provide excellent precision and performance repeatability designed for surgical applications. Pilot feasibility study. We developed a portable robotic controller that appends to a standard CO(2) laser micromanipulator. The robotic accuracy and laser beam path repeatability were compared to six experienced users of the industry standard micromanipulator performing the same simulated surgical tasks. Helium-neon laser beam video tracking techniques were employed. The robotic controller demonstrated superiority over experienced human manual micromanipulator control in accuracy (laser path within 1 mm of idealized centerline), 97.42% (standard deviation [SD] 2.65%), versus 85.11% (SD 14.51%), P = .018; and laser beam path repeatability (area of laser path divergence on successive trials), 21.42 mm(2) (SD 4.35 mm(2) ) versus 65.84 mm(2) (SD 11.93 mm(2) ), P = .006. Robotic micromanipulator control enhances accuracy and repeatability for specific laser tasks. Computerized control opens opportunity for alternative user interfaces and additional safety features. Copyright © 2011 The American Laryngological, Rhinological, and Otological Society, Inc.

  16. How Often Is the Misfit of Item Response Theory Models Practically Significant?

    Science.gov (United States)

    Sinharay, Sandip; Haberman, Shelby J.

    2014-01-01

    Standard 3.9 of the Standards for Educational and Psychological Testing (1999) demands evidence of model fit when item response theory (IRT) models are fitted to test data. Hambleton and Han (2005) and Sinharay (2005) recommended assessing the practical significance of misfit of IRT models, but…

  17. Premise for Standardized Sepsis Models.

    Science.gov (United States)

    Remick, Daniel G; Ayala, Alfred; Chaudry, Irshad; Coopersmith, Craig M; Deutschman, Clifford; Hellman, Judith; Moldawer, Lyle; Osuchowski, Marcin

    2018-06-05

    Sepsis morbidity and mortality exact a toll on patients and contribute significantly to healthcare costs. Preclinical models of sepsis have been used to study disease pathogenesis and test new therapies, but divergent outcomes have been observed with the same treatment even when using the same sepsis model. Other disorders such as diabetes, cancer, malaria, obesity and cardiovascular diseases have used standardized preclinical models that allow laboratories to compare results. Standardized models accelerate the pace of research, and such models have been used to test new therapies or changes in treatment guidelines. The National Institutes of Health (NIH) has mandated that investigators increase data reproducibility and the rigor of scientific experiments, and it has also issued research funding announcements about the development and refinement of standardized models. Our premise is that refinement and standardization of preclinical sepsis models may accelerate the development and testing of potential therapeutics for human sepsis, as has been the case with preclinical models for other disorders. As a first step towards creating standardized models, we suggest (1) standardizing the technical aspects of the widely used cecal ligation and puncture model and (2) creating a list of appropriate organ injury and immune dysfunction parameters. Standardized sepsis models could enhance reproducibility and allow comparison of results between laboratories, and may accelerate our understanding of the pathogenesis of sepsis.

  18. Factors of Engagement: Professional Standards and the Library Science Internship

    Science.gov (United States)

    Dotson, Kaye B.; Dotson-Blake, Kylie P.

    2015-01-01

    In today's technological world, school librarians planning to be leaders should be ready to keep up with advances in standards significant to the profession. The professional standards, specifically American Association of School Librarians (AASL) Standards and International Society for Technology in Education (ISTE) Standards for Coaches offer…

  19. BUSINESS ETHICS STANDARDS AND HOTEL BUSINESS

    OpenAIRE

    Ivica Batinić

    2014-01-01

    By implementing certain standards in business, especially standards of business ethics, each entity in the hotel industry emphasizes its specificity and recognition, while giving the guest-consumer security and a guarantee that they will get the desired quality. In today's global world, business ethics has become an indispensable part of hotel business practices and a prerequisite for achieving business success. Business ethics receives strategic significance because ...

  20. Standard Wiggler magnets

    International Nuclear Information System (INIS)

    Winick, H.; Helm, R.H.

    1977-09-01

    Interest in Wiggler magnets (a close sequence of transverse fields with alternating polarity) to extend and enhance the spectrum of synchrotron radiation from electron storage rings has increased significantly during the past few years. Standard wigglers, i.e., wigglers in which interference effects on the spectrum of synchrotron radiation are not important, are considered. In standard wigglers the spectrum of synchrotron radiation has the same general shape as the spectrum from ring bending magnets. However, the critical energy of the wiggler spectrum may be different. The critical energy of the wiggler spectrum is given by ε_CW = ε_CB (B_W / B_B), where ε_CB is the critical energy from the bending magnets and B_W and B_B are the magnetic field strengths of the wiggler magnet and bending magnets, respectively. Since most electron storage rings operate with relatively low bending magnet fields (B_B ≤ 12 kG), even a modest wiggler magnet field (≤ 18 kG) can significantly increase the critical energy. Such magnets are planned for ADONE and SPEAR. Higher field (30 to 50 kG) superconducting magnets are planned at Brookhaven, Daresbury, and Novosibirsk to produce even larger increases in the critical energy. For some standard wigglers a further enhancement of the spectrum is produced due to the superposition of the radiation from the individual poles. Wiggler designs are discussed as well as the effect of wigglers on the synchrotron radiation spectrum and on the operation of storage rings
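
    As a quick worked illustration of the scaling relation above, the sketch below plugs in the field strengths quoted in the abstract; the absolute bending-magnet critical energy used here is a made-up placeholder, not a value from the text.

```python
# Toy calculation of the wiggler critical-energy scaling quoted above:
#   eps_CW = eps_CB * (B_W / B_B)
# The 3.0 keV bending-magnet critical energy is a hypothetical placeholder;
# the field strengths follow the example values in the abstract.
def wiggler_critical_energy(eps_cb_kev, b_wiggler_kg, b_bend_kg):
    """Critical energy of the wiggler spectrum, given the bending-magnet value."""
    return eps_cb_kev * (b_wiggler_kg / b_bend_kg)

eps_cb = 3.0  # keV, hypothetical bending-magnet critical energy
print(wiggler_critical_energy(eps_cb, 18.0, 12.0))  # 4.5 keV: an 18 kG wiggler gives a 1.5x increase
print(wiggler_critical_energy(eps_cb, 50.0, 12.0))  # 12.5 keV with a 50 kG superconducting wiggler
```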

  1. Radiation protection and shielding standards for the 1980s

    International Nuclear Information System (INIS)

    Trubey, D.K.

    1982-01-01

    The American Nuclear Society (ANS) is a standards-writing organization member of the American National Standards Institute (ANSI). The ANS Standards Committee has a subcommittee denoted ANS-6, Radiation Protection and Shielding, whose charge is to develop standards for radiation protection and shield design, to provide shielding information to other standards-writing groups, and to develop standard reference shielding data and test problems. This paper is a progress report of this subcommittee. Significant progress has been made since the last comprehensive report to the Society

  2. Better Metrics to Automatically Predict the Quality of a Text Summary

    Directory of Open Access Journals (Sweden)

    Judith D. Schlesinger

    2012-09-01

    Full Text Available In this paper we demonstrate a family of metrics for estimating the quality of a text summary relative to one or more human-generated summaries. The improved metrics are based on features automatically computed from the summaries to measure content and linguistic quality. The features are combined using one of three methods—robust regression, non-negative least squares, or canonical correlation, an eigenvalue method. The new metrics significantly outperform the previous standard for automatic text summarization evaluation, ROUGE.
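
    A minimal sketch of one of the combination methods named above, non-negative least squares; the feature matrix and human quality scores below are synthetic placeholders, not the authors' data or implementation.

```python
# Combine automatically computed summary-quality features into a single score
# via non-negative least squares (one of the three combination methods named
# in the abstract).  Features and "human" scores here are synthetic.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
X = rng.random((50, 4))                                   # 50 summaries x 4 content/linguistic features
y = X @ np.array([0.5, 0.2, 0.0, 0.3]) + 0.05 * rng.standard_normal(50)  # proxy human ratings

weights, residual = nnls(X, y)      # non-negative weight for each feature
predicted_quality = X @ weights     # combined metric for each summary
print("feature weights:", weights)
```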

  3. Fast neuromimetic object recognition using FPGA outperforms GPU implementations.

    Science.gov (United States)

    Orchard, Garrick; Martin, Jacob G; Vogelstein, R Jacob; Etienne-Cummings, Ralph

    2013-08-01

    Recognition of objects in still images has traditionally been regarded as a difficult computational problem. Although modern automated methods for visual object recognition have achieved steadily increasing recognition accuracy, even the most advanced computational vision approaches are unable to obtain performance equal to that of humans. This has led to the creation of many biologically inspired models of visual object recognition, among them the hierarchical model and X (HMAX) model. HMAX is traditionally known to achieve high accuracy in visual object recognition tasks at the expense of significant computational complexity. Increasing complexity, in turn, increases computation time, reducing the number of images that can be processed per unit time. In this paper we describe how the computationally intensive and biologically inspired HMAX model for visual object recognition can be modified for implementation on a commercial field-programmable gate array (FPGA), specifically the Xilinx Virtex 6 ML605 evaluation board with XC6VLX240T FPGA. We show that with minor modifications to the traditional HMAX model we can perform recognition on images of size 128 × 128 pixels at a rate of 190 images per second with a less than 1% loss in recognition accuracy in both binary and multiclass visual object recognition tasks.

  4. Vocational High School Effectiveness Standard ISO 9001:2008 for Achievement of Content Standards, Process Standards, and Graduate Competency Standards

    Directory of Open Access Journals (Sweden)

    Yeni Ratih Pratiwi

    2014-06-01

    Full Text Available Efektivitas Sekolah Menengah Kejuruan Berstandar ISO 9001:2008 terhadap Pencapaian Standar Isi, Standar Proses dan Standar Kompetensi Lulusan. Abstract: The purpose of this study was to determine differences in the effectiveness of achieving the content standards, process standards, and graduate competency standards between vocational high schools (SMK) that hold the ISO 9001:2008 standard and those that do not, in both public and private schools. Data were collected using a closed questionnaire with a Likert-scale model. Data were analysed with one-way ANOVA using SPSS. The results showed: (1) there is a difference in effectiveness between ISO-standardized public SMK and ISO-standardized private SMK (P = 0.001); (2) there are differences in effectiveness between ISO-standardized public SMK and public SMK not yet ISO-standardized (P = 0.000); (3) there are differences in effectiveness between ISO-standardized public SMK and private SMK not yet ISO-standardized (P = 0.000); (4) there are differences in effectiveness between ISO-standardized private SMK and public SMK not yet ISO-standardized (P = 0.015); (5) there are differences in effectiveness between ISO-standardized private SMK and private SMK not yet ISO-standardized (P = 0.000); (6) there is no difference in effectiveness between public SMK and private SMK that are not yet ISO-standardized. Key Words: vocational high school standards ISO 9001:2008, content standards, process standards, competency standards

  5. HMM-based lexicon-driven and lexicon-free word recognition for online handwritten Indic scripts.

    Science.gov (United States)

    Bharath, A; Madhvanath, Sriganesh

    2012-04-01

    Research for recognizing online handwritten words in Indic scripts is at its early stages when compared to Latin and Oriental scripts. In this paper, we address this problem specifically for two major Indic scripts--Devanagari and Tamil. In contrast to previous approaches, the techniques we propose are largely data driven and script independent. We propose two different techniques for word recognition based on Hidden Markov Models (HMM): lexicon driven and lexicon free. The lexicon-driven technique models each word in the lexicon as a sequence of symbol HMMs according to a standard symbol writing order derived from the phonetic representation. The lexicon-free technique uses a novel Bag-of-Symbols representation of the handwritten word that is independent of symbol order and allows rapid pruning of the lexicon. On handwritten Devanagari word samples featuring both standard and nonstandard symbol writing orders, a combination of lexicon-driven and lexicon-free recognizers significantly outperforms either of them used in isolation. In contrast, most Tamil word samples feature the standard symbol order, and the lexicon-driven recognizer outperforms the lexicon free one as well as their combination. The best recognition accuracies obtained for 20,000 word lexicons are 87.13 percent for Devanagari when the two recognizers are combined, and 91.8 percent for Tamil using the lexicon-driven technique.

  6. Does unconscious thought outperform conscious thought on complex decisions? A further examination

    Directory of Open Access Journals (Sweden)

    Todd J. Thorsteinson

    2009-04-01

    Full Text Available Two experiments examined the benefits of unconscious thought on complex decisions (Dijksterhuis, 2004). Experiment 1 attempted to replicate and extend past research by examining the effect of providing reasons prior to rating the options. Results indicated no significant differences between the conditions. Experiment 2 attempted to replicate the findings of Dijksterhuis, Bos, Nordgren, and van Baaren (2006) and determine if a memory aid could overcome the limitations of conscious thought on complex tasks. Results revealed that a memory aid improved decisions compared to the conscious thought condition. Participants in the unconscious thought condition did not perform significantly better than did participants in the conscious thought condition.

  7. The reform of accounting standards and audit pricing

    Directory of Open Access Journals (Sweden)

    Kai Zhu

    2012-06-01

    Full Text Available This paper focuses on the reform of accounting standards in China in 2007 and investigates its impact on equilibrium pricing in the audit market. We find that the concentration of the audit market and the probability of issuing modified audit opinions do not significantly change, but that audit fees increase significantly after the adoption of the new accounting standards in China. Deeper analysis suggests that (1) the implementation of the new IFRS-based Chinese Accounting Standards (CASs) has increased the market risk faced by listed firms and thus auditors' expected audit risk, causing an increase in audit fees, and (2) the degree of the increase in audit fees is positively related to the adjusted difference between net income according to the old CAS before 2007 and the new CAS after 2007. We thus conclude that the reform has had a significant impact on audit pricing in China.

  8. Improving Pharmacy Student Communication Outcomes Using Standardized Patients.

    Science.gov (United States)

    Gillette, Chris; Rudolph, Michael; Rockich-Winston, Nicole; Stanton, Robert; Anderson, H Glenn

    2017-08-01

    Objective. To examine whether standardized patient encounters led to an improvement in a student pharmacist-patient communication assessment compared to traditional active-learning activities within a classroom setting. Methods. A quasi-experimental study was conducted with second-year pharmacy students in a drug information and communication skills course. Student patient communication skills were assessed using a high-stakes communication assessment. Results. Two hundred and twenty students' data were included. Students were significantly more likely to have higher scores on the communication assessment when they had higher undergraduate GPAs, were female, and were taught using standardized patients. Similarly, students were significantly more likely to pass the assessment on the first attempt when they were female and when they were taught using standardized patients. Conclusion. Incorporating standardized patients within a communication course resulted in higher scores and higher first-time pass rates on a communication assessment than other active-learning methods.

  9. Development and prospects of standardization in the German municipal wastewater sector.

    Science.gov (United States)

    Freimuth, Claudia; Oelmann, Mark; Amann, Erwin

    2018-04-17

    Given the significance of wastewater treatment and disposal for society and the economy, together with the omnipresence of standards in the sector, we studied the development and prospects of the rules governing standardization in the German municipal wastewater sector. We thereby provide a detailed description of sector-specific committee-based standardization and significantly contribute to the understanding of this complex arena. We find that the German Association for Water, Wastewater and Waste (DWA) has significantly improved its rules on standardization over time by aligning them closer to the generally accepted superordinate standardization principles. However, by focusing on theoretical findings on committee decision-making and committee composition, we argue that there is still scope for improvement with respect to rule reading and rule compliance. We show that the incentives at work in standardization committees are manifold, whereas the representation of the different stakeholder groups' needs remains unbalanced. Due to vested interests and the potential strategic behavior of the various agents involved in standardization, rule compliance does not necessarily happen naturally. To this end, we claim that the implementation of monitoring mechanisms can be a significant contribution to the institutional design of standardization, and we briefly discuss the advantages and disadvantages of different schemes. Finally, we show that there is ample need for future research on the optimal design of such a scheme. Even though the analysis relates specifically to the DWA, our claims apply to a wide range of standards development organizations. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Telemetry Standards, RCC Standard 106-17. Chapter 3. Frequency Division Multiplexing Telemetry Standards

    Science.gov (United States)

    2017-07-01

    Chapter 3 of the Telemetry Standards, RCC Standard 106-17 (July 2017), covers the Frequency Division Multiplexing Telemetry Standards, including general FDM requirements and tables of constant-bandwidth FM subcarrier channels (e.g., Table 3-4, channels A–H with their deviation criteria).

  11. ARFI cut-off values and significance of standard deviation for liver fibrosis staging in patients with chronic liver disease.

    Science.gov (United States)

    Goertz, Ruediger S; Sturm, Joerg; Pfeifer, Lukas; Wildner, Dane; Wachter, David L; Neurath, Markus F; Strobel, Deike

    2013-01-01

    Acoustic radiation force impulse (ARFI) elastometry quantifies hepatic stiffness, and thus degree of fibrosis, non-invasively. Our aim was to analyse the diagnostic accuracy of ARFI cut-off values, and the significance of a defined limit of standard deviation (SD) as a potential quality parameter for liver fibrosis staging in patients with chronic liver diseases (CLD). 153 patients with CLD (various aetiologies) undergoing liver biopsy, and an additional 25 patients with known liver cirrhosis, were investigated. ARFI measurements were performed in the right hepatic lobe, and correlated with the histopathological Ludwig fibrosis score (inclusion criteria: at least 6 portal tracts). The diagnostic accuracy of cut-off values was analysed with respect to an SD limit of 30% of the mean ARFI value. The mean ARFI elastometry showed 1.95 ± 0.87 m/s (range 0.79-4.40) in 178 patients (80 female, 98 male, mean age: 52 years). The cut-offs were 1.25 m/s for F ≥ 2, 1.72 m/s for F ≥ 3 and 1.75 m/s for F = 4, with corresponding AUROCs of 80.7%, 86.2% and 88.7%, respectively. Exclusion of 31 patients (17.4%) with an SD higher than 30% of the mean ARFI improved the diagnostic accuracy: the AUROCs for F ≥ 2, F ≥ 3 and F = 4 were 86.1%, 91.2% and 91.5%, respectively. The diagnostic accuracy of ARFI can be improved by applying a maximum SD of 30% of the mean ARFI as a quality parameter, although this leads to the exclusion of a relevant number of patients. ARFI results with a high SD should be interpreted with caution.
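
    A minimal sketch of the quality rule described above: drop measurements whose standard deviation exceeds 30% of the mean ARFI value, then recompute the AUROC. All values below are synthetic, not the study data.

```python
# Apply the SD-based quality rule described above (exclude measurements whose
# SD exceeds 30% of the mean ARFI) and compare AUROCs.  Synthetic data only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
mean_arfi = rng.uniform(0.8, 4.4, size=178)             # m/s, per patient
sd_arfi = mean_arfi * rng.uniform(0.05, 0.5, size=178)  # per-patient SD
fibrosis_f2_plus = (mean_arfi + rng.normal(0, 0.6, 178)) > 1.25  # surrogate histology label

reliable = sd_arfi <= 0.30 * mean_arfi                  # SD limit of 30% of the mean
print("AUROC, all patients:    ", roc_auc_score(fibrosis_f2_plus, mean_arfi))
print("AUROC, reliable subset: ", roc_auc_score(fibrosis_f2_plus[reliable], mean_arfi[reliable]))
print("patients excluded:      ", int((~reliable).sum()), "of", len(mean_arfi))
```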

  12. Marketing Program Standardization: The Experience of TNCs in Poland

    Directory of Open Access Journals (Sweden)

    Mariusz Sagan

    2016-01-01

    Full Text Available The purpose of this study is to determine the rate of standardization of marketing programs in transnational corporations in the consumer goods market in Poland, which currently is one of the fastest growing markets in the world. An important research objective was to observe how Polish consumers adopt the marketing patterns and related lifestyles from countries of Western Europe and the USA. The empirical tests and data, collected in a sample survey of 35 transnational corporations and their 140 products, and analysed using varied methods of statistical inference, allowed the following conclusions to be formulated. The analyzed TNCs adopted a clear standardization strategy in the Polish market. Among the analyzed products, 2/3 of them have been entirely transferred from foreign markets into the Polish market. A detailed analysis has indicated that the standardization rate of the product and its items in the FMCG market in Poland is high and very high, and significantly higher than the pricing and advertising strategy standardization rates. The product standardization rate in the Polish market has been slightly higher than the rate in the developed countries, yet the pricing standardization has been significantly lower. The standardization of advertising strategies showed similar features.

  13. Significant Revisions to OSHA 29 CFR 1910.269.

    Science.gov (United States)

    Neitzel, Dennis K

    2015-06-01

    The updated OSHA 29 CFR 1910.269 requirements are significant for assisting employers in their efforts to protect their employees from electrical hazards. In addition, OSHA based these revisions on the latest consensus standards and improvements in electrical safety technology. Together, these revisions create a unified and up-to-date set of requirements to help employers more effectively establish safe work practices to protect their workers.

  14. Comparisons of ANS, ASME, AWS, and NFPA standards cited in the NRC standard review plan, NUREG-0800, and related documents

    International Nuclear Information System (INIS)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Spiesman, J.B.

    1995-11-01

    This report provides the results of comparisons of the cited and latest versions of ANS, ASME, AWS and NFPA standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG 0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review

  15. 36 CFR 292.13 - Standards.

    Science.gov (United States)

    2010-07-01

    ... operation, physical structures, or waste byproducts would not have significant adverse impacts on... include, but are not limited to, cement production, gravel extraction operations involving more than one... will conform with the following minimum standards: (1) Commercial development. (i) Stores, restaurants...

  16. Pattern and security requirements engineering-based establishment of security standards

    CERN Document Server

    Beckers, Kristian

    2015-01-01

    Security threats are a significant problem for information technology companies today. This book focuses on how to mitigate these threats by using security standards and provides ways to address associated problems faced by engineers caused by ambiguities in the standards. The security standards are analysed, fundamental concepts of the security standards presented, and the relations to the elementary concepts of security requirements engineering (SRE) methods explored. Using this knowledge, engineers can build customised methods that support the establishment of security standards. Standard

  17. Welfare standards in hospital mergers.

    Science.gov (United States)

    Katona, Katalin; Canoy, Marcel

    2013-08-01

    There is a broad literature on the consequences of applying different welfare standards in merger control. Total welfare is usually defined as the sum of consumer and provider surplus, i.e., potential external effects are not considered. The general result is then that consumer welfare is a more restrictive standard than total welfare, which is advantageous in certain situations. This relationship between the two standards is not necessarily true when the merger has significant external effects. We model mergers on hospital markets and allow for not-profit-maximizing behavior of providers and mandatory health insurance. Mandatory health insurance detaches the financial and consumption side of health care markets, and the concept consumer in merger control becomes non-evident. Patients not visiting the merging hospitals still are affected by price changes through their insurance premiums. External financial effects emerge on not directly affected consumers. We show that applying a restricted interpretation of consumer (neglecting externality) in health care merger control can reverse the relation between the two standards; consumer welfare standard can be weaker than total welfare. Consequently, applying the wrong standard can lead to both clearing socially undesirable and to blocking socially desirable mergers. The possible negative consequences of applying a simple consumer welfare standard in merger control can be even stronger when hospitals maximize quality and put less weight on financial considerations. We also investigate the implications of these results for the practice of merger control.

  18. Can motto-goals outperform learning and performance goals? Influence of goal setting on performance and affect in a complex problem solving task

    Directory of Open Access Journals (Sweden)

    Miriam S. Rohe

    2016-09-01

    Full Text Available In this paper, we bring together research on complex problem solving with that on motivational psychology about goal setting. Complex problems require motivational effort because of their inherent difficulties. Goal Setting Theory has shown with simple tasks that high, specific performance goals lead to better performance outcomes than do-your-best goals. However, in complex tasks, learning goals have proven more effective than performance goals. Based on the Zurich Resource Model (Storch & Krause, 2014), so-called motto-goals (e.g., "I breathe happiness") should activate a person's resources through positive affect. It was found that motto-goals are effective with unpleasant duties. Therefore, we tested the hypothesis that motto-goals outperform learning and performance goals in the case of complex problems. A total of N = 123 subjects participated in the experiment. Depending on their goal condition, subjects developed a personal motto, learning, or performance goal. This goal was adapted for the computer-simulated complex scenario Tailorshop, where subjects worked as managers in a small fictional company. Contrary to expectations, there was no main effect of goal condition on management performance. As hypothesized, motto goals led to higher positive and lower negative affect than the other two goal types. Even though positive affect decreased and negative affect increased in all three groups during Tailorshop completion, participants with motto goals reported the lowest rates of negative affect over time. Exploratory analyses investigated the role of affect in complex problem solving via mediational analyses and the influence of goal type on perceived goal attainment.

  19. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
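
    The abstract does not reproduce the recommended construction, so the sketch below shows only one common convention for making logistic coefficients comparable across predictors (multiplying each raw coefficient by the standard deviation of its predictor). It is an illustration under that assumption, not the approach proposed in the article, and the data are synthetic.

```python
# One common (partial) standardization of logistic regression coefficients:
# multiply each raw coefficient by the standard deviation of its predictor.
# Illustrative convention only; synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3)) * [1.0, 5.0, 0.2]          # predictors on very different scales
y = (X @ [0.8, 0.1, 2.0] + rng.logistic(size=200)) > 0   # binary outcome

model = LogisticRegression().fit(X, y)
b_raw = model.coef_.ravel()
b_std = b_raw * X.std(axis=0)                            # comparable across predictors
print("raw coefficients:         ", b_raw)
print("standardized coefficients:", b_std)
```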

  20. Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) significantly improve prostate cancer detection at initial biopsy in a total PSA range of 2-10 ng/ml.

    Directory of Open Access Journals (Sweden)

    Matteo Ferro

    Full Text Available Many efforts to reduce prostate specific antigen (PSA) overdiagnosis and overtreatment have been made. To this aim, Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) have been proposed as new more specific biomarkers. We evaluated the ability of phi and PCA3 to identify prostate cancer (PCa) at initial prostate biopsy in men with total PSA range of 2-10 ng/ml. The performance of phi and PCA3 were evaluated in 300 patients undergoing first prostate biopsy. ROC curve analyses tested the accuracy (AUC) of phi and PCA3 in predicting PCa. Decision curve analyses (DCA) were used to compare the clinical benefit of the two biomarkers. We found that the AUC value of phi (0.77) was comparable to those of %p2PSA (0.76) and PCA3 (0.73) with no significant differences in pairwise comparison (%p2PSA vs phi p = 0.673, %p2PSA vs. PCA3 p = 0.417 and phi vs. PCA3 p = 0.247). These three biomarkers significantly outperformed fPSA (AUC = 0.60), % fPSA (AUC = 0.62) and p2PSA (AUC = 0.63). At DCA, phi and PCA3 exhibited a very close net benefit profile until the threshold probability of 25%, then phi index showed higher net benefit than PCA3. Multivariable analysis showed that the addition of phi and PCA3 to the base multivariable model (age, PSA, %fPSA, DRE, prostate volume) increased predictive accuracy, whereas no model improved single biomarker performance. Finally we showed that subjects with active surveillance (AS) compatible cancer had significantly lower phi and PCA3 values (p<0.001 and p = 0.01, respectively). In conclusion, both phi and PCA3 comparably increase the accuracy in predicting the presence of PCa in total PSA range 2-10 ng/ml at initial biopsy, outperforming currently used %fPSA.
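
    The decision curve analysis mentioned above rests on the standard net-benefit formula NB(pt) = TP/n - (FP/n) * pt/(1 - pt); the sketch below applies it to synthetic risk predictions and is not the authors' analysis.

```python
# Net benefit used in decision curve analysis (DCA), evaluated at a few
# threshold probabilities.  The outcome labels and risk scores are synthetic.
import numpy as np

def net_benefit(y_true, risk, threshold):
    pred_pos = risk >= threshold
    tp = np.sum(pred_pos & (y_true == 1))
    fp = np.sum(pred_pos & (y_true == 0))
    n = len(y_true)
    return tp / n - (fp / n) * threshold / (1 - threshold)

rng = np.random.default_rng(3)
y = rng.integers(0, 2, 300)                            # biopsy outcome (synthetic)
risk = np.clip(0.3 * y + 0.7 * rng.random(300), 0, 1)  # hypothetical biomarker-based risk
for pt in (0.10, 0.20, 0.25, 0.30):
    print(f"threshold {pt:.2f}: net benefit {net_benefit(y, risk, pt):.3f}")
```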

  1. Mini vs standard percutaneous nephrolithotomy for renal stones: a comparative study.

    Science.gov (United States)

    ElSheemy, Mohammed S; Elmarakbi, Akram A; Hytham, Mohammed; Ibrahim, Hamdy; Khadgi, Sanjay; Al-Kandari, Ahmed M

    2018-03-16

    To compare the outcome of mini-percutaneous nephrolithotomy (Mini-PNL) versus standard-PNL for renal stones. A retrospective study was performed between March 2010 and May 2013 for patients treated by Mini-PNL or standard-PNL through 18 and 30 Fr tracts, respectively, using pneumatic lithotripsy. A semirigid ureteroscope (8.5/11.5 Fr) was used for Mini-PNL and a 24 Fr nephroscope for standard-PNL. Both groups were compared in stone-free rate (SFR), complications and operative time using Student's t, Mann-Whitney, Chi square or Fisher's exact tests as appropriate, in addition to logistic regression analysis. Mini-PNL (378) and standard-PNL (151) were nearly comparable in patient and stone criteria including stone burden (3.77 ± 2.21 vs 3.77 ± 2.43 cm², respectively). There was no significant difference in number of tracts or supracostal punctures. Mini-PNL had a longer operative time (68.6 ± 29.09 vs 60.49 ± 11.38 min; p = 0.434), significantly shorter hospital stay (2.43 ± 1.46 vs 4.29 ± 1.28 days) and a significantly higher rate of tubeless PNL (75.1 vs 4.6%). Complications were significantly higher with standard-PNL (7.9 vs 20.5%), as was SFR (89.9 vs 96%; p = 0.022). This significant difference was found with multiple stones and large stone burden (> 2 cm²), but the SFR was comparable between both groups with a single stone or stone burden ≤ 2 cm². Logistic regression analysis confirmed significantly higher complications and SFR with standard-PNL, but with significantly shorter operative time. Mini-PNL has a significantly lower SFR when compared to standard-PNL (but clinically comparable), with markedly reduced complications and hospital stay. Most cases can be performed tubeless. The significant difference in SFR was found with multiple stones or large stone burden (> 2 cm²), but not with single stones or stone burden ≤ 2 cm².

  2. DOE technical standards list: Department of Energy standards index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-05-01

    This Department of Energy (DOE) technical standards list (TSL) has been prepared by the Office of Nuclear Safety Policy and Standards (EH-31) on the basis of currently available technical information. Periodic updates of this TSL will be issued as additional information is received on standardization documents being issued, adopted, or canceled by DOE. This document was prepared for use by personnel involved in the selection and use of DOE technical standards and other Government and non-Government standards. This TSL provides listings of current DOE technical standards, non-Government standards that have been adopted by DOE, other standards-related documents in which DOE has a recorded interest, and canceled DOE technical standards. Information on new DOE technical standards projects, technical standards released for coordination, recently published DOE technical standards, and activities of non-Government standards bodies that may be of interest to DOE is published monthly in Standards Actions.

  3. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and
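
    The HAPPE software itself is not reproduced here; the sketch below only mirrors the kind of filtering, artifact-rejection, and re-referencing steps the abstract describes, using MNE-Python with a hypothetical file path and placeholder parameters.

```python
# Minimal EEG cleaning sketch analogous to the steps described above
# (filtering, artifact rejection, re-referencing).  NOT the HAPPE code;
# the file name, component indices, and cutoffs are placeholders.
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif("developmental_resting_raw.fif", preload=True)  # hypothetical file
raw.filter(l_freq=1.0, h_freq=100.0)             # band-pass filter

ica = ICA(n_components=20, random_state=97)      # ICA-based artifact removal
ica.fit(raw)
ica.exclude = [0, 1]                             # components judged artifactual (placeholder)
ica.apply(raw)

raw.set_eeg_reference("average")                 # re-reference to the average
raw.save("developmental_resting_clean_raw.fif", overwrite=True)
```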

  4. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Full Text Available Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  5. DOE technical standards list. Department of Energy standards index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    This document was prepared for use by personnel involved in the selection and use of DOE technical standards and other Government and non-Government standards. This TSL provides listings of current DOE technical standards, non-Government standards that have been adopted by DOE, other Government documents in which DOE has a recorded interest, and canceled DOE technical standards. Information on new DOE technical standards projects, technical standards released for coordination, recently published DOE technical standards, and activities of non-Government standards bodies that may be of interest to DOE is published monthly in Standards Actions.

  6. Non-standard work schedules, gender, and parental stress

    Directory of Open Access Journals (Sweden)

    Mariona Lozano

    2016-02-01

    Full Text Available Background: Working non-standard hours changes the temporal structure of family life, constraining the time that family members spend with one another and threatening individuals' well-being. However, the empirical research on the link between stress and non-standard schedules has provided mixed results. Some studies have indicated that working non-standard hours is harmful whereas others have suggested that working atypical hours might facilitate the balance between family and work. Moreover, there is some evidence that the association between stress and non-standard employment has different implications for men and women. Objective: This paper examines the association between non-standard work schedules and stress among dual-earner couples with children. Two research questions are addressed. First, do predictability of the schedule and time flexibility moderate the link between non-standard work hours and stress? Second, do non-standard schedules affect men's and women's perceptions of stress differently? Methods: We use a sample of 1,932 working parents from the Canadian 2010 General Social Survey, which includes a time-use diary. A sequential logit regression analysis stratified by gender is employed to model two types of result. First, we estimate the odds of being stressed versus not being stressed. Second, for all respondents feeling stressed, we estimate the odds of experiencing high levels versus moderate levels of stress. Results: Our analysis shows that the link between non-standard working hours and perceived stress differs between mothers and fathers. First, fathers with non-standard schedules appear more likely to experience stress than those working standard hours, although the results are not significant. Among mothers, having a non-standard schedule is associated with a significantly lower risk of experiencing stress. Second, the analysis focusing on the mediating role of flexibility and predictability indicates that

  7. Neutron cross section standards and instrumentation: Annual report

    International Nuclear Information System (INIS)

    1987-01-01

    This annual report from the National Bureau of Standards contains a summary of the results of the Neutron Cross Section Standards and Instrumentation Program. The technical measurements for the past year are given along with the proposed program and budget needs for the next three years. The neutron standards measurements have concentrated on the most important 235 U(n,f) cross section in the thermal to 20 MeV energy range along with the development of neutron detectors required for these measurements. The NBS measurements have made a significant contribution to the improvement in the understanding of this reaction. Measurements were performed with numerous neutron detectors at overlapping energies and at different neutron sources in order to reduce the systematic errors to achieve the required accuracy in this important neutron standard. Significant progress was also made in the development of a detector to utilize the 3 He(n,p) reaction as a standard in the eV to MeV energy region. Improvements in data acquisition systems as well as additional studies of advanced neutron sources were accomplished. Contacts with private industry were maintained and coordination of the neutron standards evaluation was continued. The report also includes biographical listings of the research staff along with copies of a few of our recent publications. 13 figs., 1 tab

  8. 77 FR 43542 - Cost Accounting Standards: Cost Accounting Standards 412 and 413-Cost Accounting Standards...

    Science.gov (United States)

    2012-07-25

    Federal Register notice from the Cost Accounting Standards Board, Office of Federal Procurement Policy, concerning Cost Accounting Standards 412 and 413 and the Cost Accounting Standards Pension Harmonization Rule, including the rule that revised Cost Accounting Standard (CAS) 412, "Composition and Measurement of Pension Cost," ...

  9. Trends in U.S. nuclear standards development

    International Nuclear Information System (INIS)

    Crowley, J.H.; Kaminski, R.S.

    1987-01-01

    Regulation of the U.S. nuclear power industry was extensive during the 1970s. Key to this situation has been the evolution in the 'interpretation' of the rules, regulations and consensus standards which have been incorporated into NRC guidance documents by reference or endorsement. The resulting increase in the number and complexity of LWR construction requirements has significantly increased the labor content of an LWR construction project. The authors believe that existing nuclear-related consensus standards should be reviewed with the objective of modifying the standards to improve the efficiency and productivity of engineering, craft, and non-manual personnel. (author)

  10. Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) Significantly Improve Prostate Cancer Detection at Initial Biopsy in a Total PSA Range of 2–10 ng/ml

    Science.gov (United States)

    Perdonà, Sisto; Marino, Ada; Mazzarella, Claudia; Perruolo, Giuseppe; D’Esposito, Vittoria; Cosimato, Vincenzo; Buonerba, Carlo; Di Lorenzo, Giuseppe; Musi, Gennaro; De Cobelli, Ottavio; Chun, Felix K.; Terracciano, Daniela

    2013-01-01

    Many efforts to reduce prostate specific antigen (PSA) overdiagnosis and overtreatment have been made. To this aim, Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) have been proposed as new more specific biomarkers. We evaluated the ability of phi and PCA3 to identify prostate cancer (PCa) at initial prostate biopsy in men with total PSA range of 2–10 ng/ml. The performance of phi and PCA3 were evaluated in 300 patients undergoing first prostate biopsy. ROC curve analyses tested the accuracy (AUC) of phi and PCA3 in predicting PCa. Decision curve analyses (DCA) were used to compare the clinical benefit of the two biomarkers. We found that the AUC value of phi (0.77) was comparable to those of %p2PSA (0.76) and PCA3 (0.73) with no significant differences in pairwise comparison (%p2PSA vs phi p = 0.673, %p2PSA vs. PCA3 p = 0.417 and phi vs. PCA3 p = 0.247). These three biomarkers significantly outperformed fPSA (AUC = 0.60), % fPSA (AUC = 0.62) and p2PSA (AUC = 0.63). At DCA, phi and PCA3 exhibited a very close net benefit profile until the threshold probability of 25%, then phi index showed higher net benefit than PCA3. Multivariable analysis showed that the addition of phi and PCA3 to the base multivariable model (age, PSA, %fPSA, DRE, prostate volume) increased predictive accuracy, whereas no model improved single biomarker performance. Finally we showed that subjects with active surveillance (AS) compatible cancer had significantly lower phi and PCA3 values (p<0.001 and p = 0.01, respectively). In conclusion, both phi and PCA3 comparably increase the accuracy in predicting the presence of PCa in total PSA range 2–10 ng/ml at initial biopsy, outperforming currently used %fPSA. PMID:23861782

  11. Disciplining standard-setting : Which approach to choose (if any)?

    NARCIS (Netherlands)

    Kanevskaia, Olia; Jacobs, Kai; Blind, Knut

    In the world of continuous globalization, standards play a crucial role in transnational economic development. Being the drivers of harmonization and innovation, standards do not only facilitate production and exchange in goods and services, but also carry significant policy implications and create

  12. Disciplining standard-setting : Which approach to choose (if any)

    NARCIS (Netherlands)

    Kanevskaia, Olia

    2017-01-01

    In the world of continuous globalization, standards play a crucial role in transnational economic development. Being the drivers of harmonization and innovation, standards do not only facilitate production and exchange in goods and services, but also carry significant policy implications and create

  13. Global multiplicity of dietary standards for trace elements.

    Science.gov (United States)

    Freeland-Graves, Jeanne H; Lee, Jane J

    2012-06-01

    Consistent guidelines across the world for dietary standards of trace elements remain elusive. Harmonization of dietary standards has been suggested by international agencies to facilitate consistency in food and nutrition policies and international trade. Yet significant barriers exist to standardize recommendations on a global basis, such as vast differences in geography, food availability and transport; cultural, social and economic constraints, and biological diversity. Simple commonality is precluded further by the variety of terminologies among countries and regions related to diet. Certain unions have created numerous nutritional descriptive categories for standards, while other large countries are limited to only a few. This paper will explore the global multiplicity of dietary standards and efforts for harmonization. Copyright © 2012 Elsevier GmbH. All rights reserved.

  14. Preservation Study for Ultra-Dilute VX Standards

    Science.gov (United States)

    Lawrence Livermore National Laboratory (LLNL) supplies ultra-dilute (10 µg/mL) chemical warfare agent (CWA) standards to the Environmental Response Laboratory Network (ERLN) laboratories to allow the use of authentic standards to assist in analyses required for a remediation event involving CWAs. For this reason, it is important to collect data regarding the shelf-lives of these standards. Instability of these standards has the potential to impact quality control in regional ERLN laboratories, resulting in data that are difficult to interpret. Thus, this study investigated the use of chemical stabilizers to increase the shelf-life of VX standards. VX standards with long shelf-lives are desirable, as a long shelf-life would significantly reduce the costs associated with synthesizing and resupplying the ERLN laboratories with VX.

  15. DOE technical standards list: Department of Energy standards index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    This technical standards list (TSL) was prepared for use by personnel involved in the selection and use of US DOE technical standards and other government and non-government standards. This TSL provides listings of current DOE technical standards, non-government standards that have been adopted by DOE, other government documents in which DOE has a recorded interest, and cancelled DOE technical standards. Standards are indexed by type in the appendices to this document. Definitions of and general guidance for the use of standards are also provided.

  16. Clinically significant discrepancies between sleep problems assessed by standard clinical tools and actigraphy

    Directory of Open Access Journals (Sweden)

    Kjersti Marie Blytt

    2017-10-01

    Full Text Available Abstract Background Sleep disturbances are widespread among nursing home (NH) patients and associated with numerous negative consequences. Identifying and treating them should therefore be of high clinical priority. No prior studies have investigated the degree to which sleep disturbances as detected by actigraphy and by the sleep-related items in the Cornell Scale for Depression in Dementia (CSDD) and the Neuropsychiatric Inventory – Nursing Home version (NPI-NH) provide comparable results. Such knowledge is highly needed, since both questionnaires are used in clinical settings and studies use the NPI-NH sleep item to measure sleep disturbances. For this reason, insight into their relative (dis)advantages is valuable. Method Cross-sectional study of 83 NH patients. Sleep was objectively measured with actigraphy for 7 days, and rated by NH staff with the sleep items in the CSDD and the NPI-NH, and results were compared. McNemar's tests were conducted to investigate whether there were significant differences between the pairs of relevant measures. Cohen's Kappa tests were used to investigate the degree of agreement between the pairs of relevant actigraphy, NPI-NH and CSDD measures. Sensitivity and specificity analyses were conducted for each of the pairs, and receiver operating characteristics (ROC) curves were designed as a plot of the true positive rate against the false positive rate for the diagnostic test. Results Proxy-raters reported sleep disturbances in 20.5% of patients assessed with NPI-NH, and 18.1% (difficulty falling asleep), 43.4% (multiple awakenings) and 3.6% (early morning awakenings) of patients had sleep disturbances assessed with CSDD. Our results showed significant differences (p<0.001) between actigraphy measures and proxy-rated sleep by the NPI-NH and CSDD. Sensitivity and specificity analyses supported these results. Conclusions Compared to actigraphy, proxy-raters clearly underreported NH patients' sleep disturbances as assessed
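
    A minimal sketch of the agreement statistics named above (Cohen's kappa and McNemar's test) for actigraphy versus proxy-rated sleep disturbance; the 0/1 vectors are synthetic stand-ins for the 83 patients, not the study data.

```python
# Agreement between actigraphy-detected sleep disturbance and proxy ratings:
# Cohen's kappa and McNemar's test on a 2x2 table.  Synthetic data only.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.contingency_tables import mcnemar

rng = np.random.default_rng(4)
actigraphy = rng.integers(0, 2, 83)                          # 1 = disturbed sleep
proxy = (actigraphy & (rng.random(83) < 0.4)).astype(int)    # raters who under-report

print("Cohen's kappa:", cohen_kappa_score(actigraphy, proxy))

table = np.array([[np.sum((actigraphy == 0) & (proxy == 0)), np.sum((actigraphy == 0) & (proxy == 1))],
                  [np.sum((actigraphy == 1) & (proxy == 0)), np.sum((actigraphy == 1) & (proxy == 1))]])
print(mcnemar(table, exact=True))
```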

  17. Support vector inductive logic programming outperforms the naive Bayes classifier and inductive logic programming for the classification of bioactive chemical compounds.

    Science.gov (United States)

    Cannon, Edward O; Amini, Ata; Bender, Andreas; Sternberg, Michael J E; Muggleton, Stephen H; Glen, Robert C; Mitchell, John B O

    2007-05-01

    We investigate the classification performance of circular fingerprints in combination with the Naive Bayes Classifier (MP2D), Inductive Logic Programming (ILP) and Support Vector Inductive Logic Programming (SVILP) on a standard molecular benchmark dataset comprising 11 activity classes and about 102,000 structures. The Naive Bayes Classifier treats features independently while ILP combines structural fragments, and then creates new features with higher predictive power. SVILP is a very recently presented method which adds a support vector machine after common ILP procedures. The performance of the methods is evaluated via a number of statistical measures, namely recall, specificity, precision, F-measure, Matthews Correlation Coefficient, area under the Receiver Operating Characteristic (ROC) curve and enrichment factor (EF). According to the F-measure, which takes both recall and precision into account, SVILP is for seven out of the 11 classes the superior method. The results show that the Bayes Classifier gives the best recall performance for eight of the 11 targets, but has a much lower precision, specificity and F-measure. The SVILP model on the other hand has the highest recall for only three of the 11 classes, but generally far superior specificity and precision. To evaluate the statistical significance of the SVILP superiority, we employ McNemar's test which shows that SVILP performs significantly (p < 5%) better than both other methods for six out of 11 activity classes, while being superior with less significance for three of the remaining classes. While previously the Bayes Classifier was shown to perform very well in molecular classification studies, these results suggest that SVILP is able to extract additional knowledge from the data, thus improving classification results further.
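
    For reference, the evaluation measures listed above can be computed as follows for a synthetic binary prediction; this only illustrates the metrics themselves, not the SVILP experiments.

```python
# The evaluation metrics named in the abstract (recall, specificity, precision,
# F-measure, Matthews correlation, ROC AUC) on a synthetic prediction.
import numpy as np
from sklearn.metrics import (precision_score, recall_score, f1_score,
                             matthews_corrcoef, roc_auc_score)

rng = np.random.default_rng(5)
y_true = rng.integers(0, 2, 1000)                   # active / inactive compounds (synthetic)
scores = 0.35 * y_true + 0.65 * rng.random(1000)    # classifier confidence
y_pred = (scores > 0.5).astype(int)

specificity = recall_score(1 - y_true, 1 - y_pred)  # recall on the negative class
print("precision  ", precision_score(y_true, y_pred))
print("recall     ", recall_score(y_true, y_pred))
print("specificity", specificity)
print("F-measure  ", f1_score(y_true, y_pred))
print("MCC        ", matthews_corrcoef(y_true, y_pred))
print("ROC AUC    ", roc_auc_score(y_true, scores))
```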

  18. Putting Customers First: Standards for Serving the American People.

    Science.gov (United States)

    Clinton, Bill; Gore, Al

    This document, part of the Clinton Administration's "Reinventing Government" initiative involving a long-term, significant revamping of the federal bureaucracy, presents a comprehensive set of published customer service standards for the United States Government. It presents more than 1,500 standards representing commitments from more…

  19. Innovation and reliability of atomic standards for PTTI applications

    Science.gov (United States)

    Kern, R.

    1981-01-01

    Innovation and reliability in hyperfine frequency standards and clock systems are discussed. Hyperfine standards are defined as those precision frequency sources and clocks which use a hyperfine atomic transition for frequency control and which have realized significant commercial production and acceptance (cesium, hydrogen, and rubidium atoms). References to other systems such as thallium and ammonia are excluded since these atomic standards have not been commercially exploited in this country.

  20. High Spatial Resolution Visual Band Imagery Outperforms Medium Resolution Spectral Imagery for Ecosystem Assessment in the Semi-Arid Brazilian Sertão

    Directory of Open Access Journals (Sweden)

    Ran Goldblatt

    2017-12-01

    Full Text Available Semi-arid ecosystems play a key role in global agricultural production, seasonal carbon cycle dynamics, and longer-run climate change. Because semi-arid landscapes are heterogeneous and often sparsely vegetated, repeated and large-scale ecosystem assessments of these regions have to date been impossible. Here, we assess the potential of high-spatial resolution visible band imagery for semi-arid ecosystem mapping. We use WorldView satellite imagery at 0.3–0.5 m resolution to develop a reference data set of nearly 10,000 labeled examples of three classes—trees, shrubs/grasses, and bare land—across 1000 km² of the semi-arid Sertão region of northeast Brazil. Using Google Earth Engine, we show that classification with low-spectral but high-spatial resolution input (WorldView) outperforms classification with the full spectral information available from Landsat 30 m resolution imagery as input. Classification with high spatial resolution input improves detection of sparse vegetation and distinction between trees and seasonal shrubs and grasses, two features which are lost at coarser spatial (but higher spectral) resolution input. Our total tree cover estimates for the study area disagree with recent estimates using other methods that may underestimate tree cover because they confuse trees with seasonal vegetation (shrubs and grasses). This distinction is important for monitoring seasonal and long-run carbon cycle and ecosystem health. Our results suggest that newer remote sensing products that promise high frequency global coverage at high spatial but lower spectral resolution may offer new possibilities for direct monitoring of the world's semi-arid ecosystems, and we provide methods that could be scaled to do so.
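
    A toy, scaled-down version of the classification setup described above: a random-forest classifier trained on labeled pixels using only three visible-band features, as with the WorldView input. The "pixels" are synthetic and this is not the authors' Google Earth Engine workflow.

```python
# Three-class land-cover classification from visible bands only (toy version
# of the setup in the abstract).  Labels and reflectances are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 3000
labels = rng.integers(0, 3, n)             # 0 = tree, 1 = shrub/grass, 2 = bare land
rgb = rng.normal(loc=labels[:, None] * np.array([0.05, 0.08, 0.10]),
                 scale=0.04, size=(n, 3))  # synthetic per-class reflectances

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, rgb, labels, cv=5).mean())
```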

  1. Precision phase estimation based on weak-value amplification

    Science.gov (United States)

    Qiu, Xiaodong; Xie, Linguo; Liu, Xiong; Luo, Lan; Li, Zhaoxue; Zhang, Zhiyou; Du, Jinglei

    2017-02-01

    In this letter, we propose a precision method for phase estimation based on the weak-value amplification (WVA) technique using a monochromatic light source. The anomalous WVA significantly suppresses the technical noise with respect to the intensity difference signal induced by the phase delay when the post-selection procedure comes into play. The phase measurement precision of this method is proportional to the weak value of a polarization operator over the experimental range. Our results compare well with wide-spectrum-light weak-measurement phase estimation and outperform the standard homodyne phase detection technique.
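
    For reference, the amplification in such schemes is governed by the standard weak-value expression below (a textbook formula, not reproduced from the abstract):

```latex
% Weak value of an observable \hat{A} for pre-selected state |\psi_i\rangle
% and post-selected state |\psi_f\rangle (standard definition):
A_w = \frac{\langle \psi_f | \hat{A} | \psi_i \rangle}{\langle \psi_f | \psi_i \rangle}
```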

  2. On the efficient numerical solution of lattice systems with low-order couplings

    International Nuclear Information System (INIS)

    Ammon, A.; Genz, A.; Hartung, T.; Jansen, K.; Volmer, J.; Leoevey, H.

    2015-10-01

    We apply the Quasi Monte Carlo (QMC) and recursive numerical integration methods to evaluate the Euclidean, discretized time path-integral for the quantum mechanical anharmonic oscillator and a topological quantum mechanical rotor model. For the anharmonic oscillator both methods outperform standard Markov Chain Monte Carlo methods and show a significantly improved error scaling. For the quantum mechanical rotor, however, we could not find a successful way to employ QMC. On the other hand, the recursive numerical integration method works extremely well for this model and shows an at least exponentially fast error scaling.
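
    A minimal sketch of the error-scaling comparison described above, contrasting plain Monte Carlo with a Sobol quasi-Monte Carlo sequence on a toy separable integral; it illustrates the QMC idea only and is not the lattice path-integral code.

```python
# Plain Monte Carlo vs. quasi-Monte Carlo (Sobol) on a toy 8-dimensional
# integral with a known exact value.  Sample sizes are powers of two, as
# recommended for Sobol sequences.
import numpy as np
from scipy.stats import qmc

dim = 8
exact = (np.e - 1.0) ** dim                     # integral of exp(x1+...+x8) over [0,1]^8
f = lambda x: np.exp(x.sum(axis=1))

rng = np.random.default_rng(7)
for n in (2 ** 10, 2 ** 14):
    mc_est = f(rng.random((n, dim))).mean()
    qmc_est = f(qmc.Sobol(d=dim, seed=7).random(n)).mean()
    print(f"n={n}: MC error {abs(mc_est - exact):.3f}, QMC error {abs(qmc_est - exact):.3f}")
```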

  3. Management Aspects of Implementing the New Effluent Air Monitoring Standard

    International Nuclear Information System (INIS)

    Glissmeyer, John A.; Davis, William E.

    2000-01-01

    The revision to ANSI/HPS N13.1, 'Sampling and Monitoring Releases of Airborne Radioactive Substances From the Stacks and Ducts of Nuclear Facilities,' went into effect in January 1999, replacing the 1969 version of the standard. There are several significant changes from the old version of the standard. The revised standard provides a new paradigm where representative air samples can be collected by extracting the sample from a single point in air streams where the contaminants are well mixed. The revised standard provides specific performance criteria and requirements for the various air sampling processes - program structure, sample extraction, transport, collection, effluent and sample flow measurement, and quality assurance. A graded approach to sampling is recommended with more stringent requirements for stacks with a greater potential to emit. These significant changes in the standard will impact the air monitoring programs at some sites and facilities. The impacts on the air monitor design, operation, maintenance, and quality control processes are discussed.

  4. Essential patents in industry standards : the case of UMTS

    NARCIS (Netherlands)

    Bekkers, R.N.A.; Bongard, R.; Nuvolari, A.

    2009-01-01

    We study the determinants of essential patents in industry standards. In particular, we assess the role of two main factors: the significance of the technological solution contained in the patent and the involvement of the applicant of the patent in the standardization process. To this end, we

  5. Characterization and Comparison of the 10-2 SITA-Standard and Fast Algorithms

    Directory of Open Access Journals (Sweden)

    Yaniv Barkana

    2012-01-01

    Full Text Available Purpose: To compare the 10-2 SITA-standard and SITA-fast visual field programs in patients with glaucoma. Methods: We enrolled 26 patients with open angle glaucoma with involvement of at least one paracentral location on the 24-2 SITA-standard field test. Each subject performed 10-2 SITA-standard and SITA-fast tests. Within 2 months this sequence of tests was repeated. Results: SITA-fast was 30% shorter than SITA-standard (5.5±1.1 vs 7.9±1.1 minutes, p<0.001). Mean MD was statistically significantly higher for SITA-standard compared with SITA-fast at the first visit (Δ=0.3 dB, p=0.017) but not the second visit. Inter-visit difference in MD or in number of depressed points was not significant for either program. Bland-Altman analysis showed that clinically significant variations can exist in individual instances between the 2 programs and between repeat tests with the same program. Conclusions: The 10-2 SITA-fast algorithm is significantly shorter than SITA-standard. The two programs have similar long-term variability. Average same-visit between-program and same-program between-visit sensitivity results were similar for the study population, but clinically significant variability was observed for some individual test pairs. Group inter- and intra-program test results may be comparable, but in the management of the individual patient field change should be verified by repeat testing.

  6. The Effect of Cooperative Learning with DSLM on Conceptual Understanding and Scientific Reasoning among Form Four Physics Students with Different Motivation Levels

    Directory of Open Access Journals (Sweden)

    M.S. Hamzah

    2010-11-01

    Full Text Available The purpose of this study was to investigate the effect of Cooperative Learning with a Dual Situated Learning Model (CLDSLM) and a Dual Situated Learning Model (DSLM) on (a) conceptual understanding (CU) and (b) scientific reasoning (SR) among Form Four students. The study further investigated the effect of the CLDSLM and DSLM methods on performance in conceptual understanding and scientific reasoning among students with different motivation levels. A quasi-experimental method with a 3 x 2 factorial design was applied in the study. The sample consisted of 240 students in six Form Four classes selected from three different schools, i.e. two classes from each school, with students randomly selected and assigned to the treatment groups. The results showed that students in the CLDSLM group outperformed their counterparts in the DSLM group—who, in turn, significantly outperformed other students in the traditional instructional method (T) group in scientific reasoning and conceptual understanding. Also, high-motivation (HM) students in the CLDSLM group significantly outperformed their counterparts in the T group in conceptual understanding and scientific reasoning. Furthermore, HM students in the CLDSLM group significantly outperformed their counterparts in the DSLM group in scientific reasoning but did not significantly outperform their counterparts on conceptual understanding. Also, the DSLM instructional method has significant positive effects on highly motivated students' (a) conceptual understanding and (b) scientific reasoning. The results also showed that low-motivation (LM) students in the CLDSLM group significantly outperformed their counterparts in the DSLM group and the T method group in scientific reasoning and conceptual understanding. However, the low-motivation students taught via the DSLM instructional method performed significantly higher than the low-motivation students taught via the T method in scientific reasoning. Nevertheless, they did not

  7. Status of conversion of DOE standards to non-Government standards

    Energy Technology Data Exchange (ETDEWEB)

    Moseley, H.L.

    1992-07-01

    One major goal of the DOE Technical Standards Program is to convert existing DOE standards into non-Government standards (NGS's) where possible. This means that a DOE standard may form the basis for a standards-writing committee to produce a standard in the same subject area using the non-Government standards consensus process. This report is a summary of the activities that have evolved to effect conversion of DOE standards to NGSs, and the status of current conversion activities. In some cases, all requirements in a DOE standard will not be incorporated into the published non-Government standard because these requirements may be considered too restrictive or too specific for broader application by private industry. If requirements in a DOE standard are not incorporated in a non-Government standard and the requirements are considered necessary for DOE program applications, the DOE standard will be revised and issued as a supplement to the non-Government standard. The DOE standard will contain only those necessary requirements not reflected by the non-Government standard. Therefore, while complete conversion of DOE standards may not always be realized, the Department's technical standards policy as stated in Order 1300.2A has been fully supported in attempting to make maximum use of the non-Government standard.

  9. Modular Power Standard for Space Explorations Missions

    Science.gov (United States)

    Oeftering, Richard C.; Gardner, Brent G.

    2016-01-01

    Future human space exploration will most likely be composed of assemblies of multiple modular spacecraft elements with interconnected electrical power systems. An electrical system composed of a standardized set of modular building blocks provides significant development, integration, and operational cost advantages. The modular approach can also provide the flexibility to configure power systems to meet the mission needs. A primary goal of the Advanced Exploration Systems (AES) Modular Power System (AMPS) project is to establish a Modular Power Standard that is needed to realize these benefits. This paper is intended to give the space exploration community a "first look" at the evolving Modular Power Standard and invite their comments and technical contributions.

  10. Instant standard concept for data standards development

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; Kulcsor, Istvan Zsolt; Roes, Jasper

    2013-01-01

    This paper presents the current results of an ongoing research about a new data standards development concept. The concept is called Instant Standard referring to the pressure that is generated by shrinking the length of the standardization process. Based on this concept it is estimated that the

  11. Next Generation Science Standards: Considerations for Curricula, Assessments, Preparation, and Implementation

    Science.gov (United States)

    Best, Jane; Dunlap, Allison

    2014-01-01

    This policy brief provides an overview of the Next Generation Science Standards (NGSS), discusses policy considerations for adopting or adapting the new standards, and presents examples from states considering or implementing the NGSS. Changing academic standards is a complex process that requires significant investments of time, money, and human…

  12. The MIMIC Method with Scale Purification for Detecting Differential Item Functioning

    Science.gov (United States)

    Wang, Wen-Chung; Shih, Ching-Lin; Yang, Chih-Chien

    2009-01-01

    This study implements a scale purification procedure onto the standard MIMIC method for differential item functioning (DIF) detection and assesses its performance through a series of simulations. It is found that the MIMIC method with scale purification (denoted as M-SP) outperforms the standard MIMIC method (denoted as M-ST) in controlling…

  13. An improved genetic algorithm for designing optimal temporal patterns of neural stimulation

    Science.gov (United States)

    Cassar, Isaac R.; Titus, Nathan D.; Grill, Warren M.

    2017-12-01

    Objective. Electrical neuromodulation therapies typically apply constant frequency stimulation, but non-regular temporal patterns of stimulation may be more effective and more efficient. However, the design space for temporal patterns is exceedingly large, and model-based optimization is required for pattern design. We designed and implemented a modified genetic algorithm (GA) intended to design optimal temporal patterns of electrical neuromodulation. Approach. We tested and modified standard GA methods for application to designing temporal patterns of neural stimulation. We evaluated each modification individually and all modifications collectively by comparing performance to the standard GA across three test functions and two biophysically-based models of neural stimulation. Main results. The proposed modifications of the GA significantly improved performance across the test functions and performed best when all were used collectively. The standard GA found patterns that outperformed fixed-frequency, clinically-standard patterns in biophysically-based models of neural stimulation, but the modified GA, in many fewer iterations, consistently converged to higher-scoring, non-regular patterns of stimulation. Significance. The proposed improvements to standard GA methodology reduced the number of iterations required for convergence and identified superior solutions.
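
    As context for the kind of search described above, the sketch below is a generic genetic algorithm over binary pulse-train patterns: tournament selection, one-point crossover, and bit-flip mutation against a toy objective. The pattern length, GA settings, and fitness function are illustrative assumptions; this is not the authors' modified GA or their biophysical models.

        # Generic GA over binary stimulation patterns (toy illustration only).
        import numpy as np

        rng = np.random.default_rng(0)
        n_bits, pop_size, n_gen, p_mut = 60, 40, 200, 0.02

        def fitness(pattern):
            # Toy objective: reward roughly 15 pulses with irregular spacing.
            pulses = pattern.sum()
            gaps = np.diff(np.flatnonzero(pattern)) if pulses > 1 else np.array([0.0])
            return -abs(pulses - 15) + 0.1 * gaps.std()

        pop = rng.integers(0, 2, size=(pop_size, n_bits))
        for _ in range(n_gen):
            scores = np.array([fitness(ind) for ind in pop])
            idx = rng.integers(0, pop_size, size=(pop_size, 2))        # tournament selection
            parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
            children = parents.copy()
            cuts = rng.integers(1, n_bits, size=pop_size // 2)         # one-point crossover
            for i, c in enumerate(cuts):
                a, b = 2 * i, 2 * i + 1
                children[a, c:], children[b, c:] = parents[b, c:], parents[a, c:]
            flip = rng.random(children.shape) < p_mut                  # bit-flip mutation
            pop = np.where(flip, 1 - children, children)

        best = pop[np.argmax([fitness(ind) for ind in pop])]
        print("best pattern:", best, "fitness:", fitness(best))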

  14. Status of conversion of NE standards to national consensus standards

    International Nuclear Information System (INIS)

    Jennings, S.D.

    1990-06-01

    One major goal of the Nuclear Standards Program is to convert existing NE standards into national consensus standards (where possible). This means that an NE standard may form the basis for a standards-writing committee to produce a standard in the same subject area using the national consensus process. This report is a summary of the activities that have evolved to effect conversion of NE standards to national consensus standards, and the status of current conversion activities. In some cases, all requirements in an NE standard will not be incorporated into the published national consensus standard because these requirements may be considered too restrictive or too specific for broader application by the nuclear industry. If these requirements are considered necessary for nuclear reactor program applications, the program standard will be revised and issued as a supplement to the national consensus standard. The supplemental program standard will contain only those necessary requirements not reflected by the national consensus standard. Therefore, while complete conversion of program standards may not always be realized, the standards policy has been fully supported in attempting to make maximum use of the national consensus standard. 1 tab

  15. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B, which identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality, and appendices of government and non-governmental human factors standards.

  16. The international development of forensic science standards - A review.

    Science.gov (United States)

    Wilson-Wilde, Linzi

    2018-04-16

    Standards establish specifications and procedures designed to ensure products, services and systems are safe, reliable and consistently perform as intended. Standards can be used in the accreditation of forensic laboratories or facilities and in the certification of products and services. In recent years there have been various international activities aiming at developing forensic science standards and guidelines. The most significant initiative currently underway within the global forensic community is the development of International Organization for Standardization (ISO) standards. This paper reviews the main bodies working on standards for forensic science, the processes used and the implications for accreditation. This paper specifically discusses the work of ISO Technical Committee TC272, the future TC272 work program for the development of forensic science standards and associated timelines. Also discussed are the lessons learnt to date in navigating the complex environment of multi-country stakeholder deliberations in standards development. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.

  17. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version of BEL based on a simple, though non-standard, data-blocking rule which uses a data block of every possible length. Consequently, the method involves no block selection and is also anticipated to exhibit better coverage performance. Its non-standard blocking scheme, however, induces non-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  18. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as the experience effect, which is modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when experience is incorporated. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
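
    The experience curve referred to above is conventionally written in the following form (a standard formulation given here for reference; the symbols are supplied for illustration and are not quoted from the analysis):

        P(x) = P_0 \left( \frac{x}{x_0} \right)^{-b}

    where P is the unit price (or cost), x is cumulative production, P_0 is the price at a reference cumulative production x_0, and b is the experience exponent; each doubling of cumulative production multiplies price by the progress ratio 2^{-b}.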

  19. Nondestructive testing standards and the ASME code

    International Nuclear Information System (INIS)

    Spanner, J.C.

    1991-04-01

    Nondestructive testing (NDT) requirements and standards are an important part of the ASME Boiler and Pressure Vessel Code. In this paper, the evolution of these requirements and standards is reviewed in the context of the unique technical and legal stature of the ASME Code. The coherent and consistent manner by which the ASME Code rules are organized is described, and the interrelationship between the various ASME Code sections, the piping codes, and the ASTM Standards is discussed. Significant changes occurred in ASME Sections V and XI during the 1980s, and these are highlighted along with projections and comments regarding future trends and changes in these important documents. 4 refs., 8 tabs

  20. A company perspective on software engineering standards

    International Nuclear Information System (INIS)

    Steer, R.W.

    1988-01-01

    Software engineering standards, as implemented via formal policies and procedures, have historically been used in the nuclear industry, especially for codes used in the design, analysis, or operation of the plant. Over the past two decades, a significant amount of software has been put in place to perform these functions, while the overall software life cycle has become better understood, more and different computer systems have become available, and industry has become increasingly aware of the advantages gained when these procedures are used in the development and maintenance of this large amount of software. The use of standards and attendant procedures is thus becoming increasingly important as more computerization is taking place, both in the design and the operation of the plant. It is difficult to categorize software used in activities related to nuclear plants in a simple manner. That difficulty is due to the diversity of those uses, with attendant diversity in the methods and procedures used in the production of the software, compounded by a changing business climate in which significant software engineering expertise is being applied to a broader range of applications on a variety of computing systems. The use of standards in the various phases of the production of software thus becomes more difficult as well. This paper discusses the various types of software and the importance of software standards in the development of each of them

  1. The Effect of Standardized Interviews on Organ Donation.

    Science.gov (United States)

    Corman Dincer, Pelin; Birtan, Deniz; Arslantas, Mustafa Kemal; Tore Altun, Gulbin; Ayanoglu, Hilmi Omer

    2018-03-01

    Organ donation is the most important stage for organ transplant. Studies reveal that the attitudes of families of brain-dead patients toward donation play a significant role in their decision. We hypothesized that supporting family awareness of the meaning of organ donation, including saving lives while losing a loved one, combined with information about brain death and the donation process, should be provided by intensive care unit physicians through standardized interviews and questionnaires to increase the donation rate. We retrospectively evaluated the final decisions of families of 52 brain-dead donors treated at our institution between 2014 and 2017. Data underwent descriptive analyses. The standard interview content was generated after literature search results were reviewed by the authors. Previously, we examined the impact of standardized interviews done by intensive care unit physicians with relatives of potential brain-dead donors regarding decisions to donate or reasons for refusing organ donation. After termination of that study, interviews were done according to the intensivist's orientation, resulting in significantly decreased donation rates. Standardized interviews were then started again, resulting in increased donation rates. Of 17 families who participated in standardized interviews, 5 families (29.4%) agreed to donate organs of their brain-dead relatives. In the other group of families, intensivists informed the families about donation without standardized interviews. In this group of 35 families, 5 families (14.3%) approved organ donation. The decision regarding whether to agree to organ donation differed significantly between the 2 family groups (P < .05). Standardized interviews informing families about the donation process thus resulted in an increased rate of organ donation compared with routine protocols.

  2. DO SOBER EYEWITNESSES OUTPERFORM ALCOHOL INTOXICATED EYEWITNESSES IN A LINEUP?

    Directory of Open Access Journals (Sweden)

    Claudia Fahlke

    2013-01-01

    Full Text Available Although alcohol intoxicated eyewitnesses are common, there are only a few studies in the area. The aim of the current study is to investigate how different doses of alcohol affect eyewitness lineup identification performance. The participants (N = 123) were randomly assigned to a 3 [Beverage: control (0.0 g/kg ethanol) vs. lower (0.4 g/kg ethanol) vs. higher alcohol dose (0.7 g/kg ethanol)] x 2 (Lineup: target-present vs. target-absent) between-subject design. Participants consumed two glasses of beverage at an even pace for 15 minutes. Five minutes after consumption the participants witnessed a film depicting a staged kidnapping. Seven days later, the participants returned to the laboratory and were asked to identify the culprit in a simultaneous lineup. The result showed that overall, the participants performed better than chance; however, their lineup performance was poor. There were no significant effects of alcohol intoxication with respect to performance, neither in target-present nor target-absent lineups. The study's results suggest that eyewitnesses who have consumed a lower (0.4 g/kg ethanol) or a higher (0.7 g/kg ethanol) dose of alcohol perform at the same level as sober eyewitnesses in a lineup. The results are discussed in relation to the alcohol myopia theory and suggestions for future research are made.

  3. Theorists reject challenge to standard model

    CERN Multimedia

    Adam, D

    2001-01-01

    Particle physicists are questioning results that appear to violate the Standard Model. There are concerns that there is not sufficient statistical significance and also charges that the comparison is being made with the 'most convenient' theoretical value for the muon's magnetic moment (1 page).

  4. P-Value, a true test of statistical significance? a cautionary note ...

    African Journals Online (AJOL)

    While it was not the intention of the founders of significance testing and hypothesis testing to have the two ideas intertwined as if they were complementary, the inconvenient marriage of the two practices into one coherent, convenient, incontrovertible and misinterpreted practice has dotted our standard statistics textbooks and ...

  5. Decommissioning standards

    International Nuclear Information System (INIS)

    Crofford, W.N.

    1980-01-01

    EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs

  6. The new CSA standard for leak detection

    Energy Technology Data Exchange (ETDEWEB)

    Pietsch, Ulli [TAU, Edmonton, Alberta, (Canada); Scott, Don [TransCanada Pipelines, Edmonton, Alberta, (Canada)

    2010-07-01

    Standards need to be updated regularly to reflect current technology and industry practices. This paper describes the new Canadian Standards Association (CSA) recommended practice for leak detection, 'Recommended Practice for Liquid Hydrocarbon Pipeline System Leak Detection', published as Annex E of the CSA Z662 Oil and Gas Pipeline Systems standard. The CSA formed a task force of industry experts and regulators for a period of 18 months to draft the new standard. Several comparisons were made with the American Petroleum Institute (API) recommended practice API 1130. The new annex introduces and defines the terms critical instrument, critical process, and dependent instrument. The most significant improvement made by the new Annex E is the new requirement that an operating company must develop a leak detection strategy. The writing of a leak detection manual is given high priority. The use of both Annex E and API 1130 is recommended.

  7. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE... In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...

  8. Using Objective Structured Clinical Examinations to Assess Intern Orthopaedic Physical Examination Skills: A Multimodal Didactic Comparison.

    Science.gov (United States)

    Phillips, Donna; Pean, Christian A; Allen, Kathleen; Zuckerman, Joseph; Egol, Kenneth

    Patient care is 1 of the 6 core competencies defined by the Accreditation Council for Graduate Medical Education (ACGME). The physical examination (PE) is a fundamental skill to evaluate patients and make an accurate diagnosis. The purpose of this study was to investigate 3 different methods to teach PE skills and to assess the ability to do a complete PE in a simulated patient encounter. Prospective, uncontrolled, observational. Northeastern academic medical center. A total of 32 orthopedic surgery residents participated and were divided into 3 didactic groups: Group 1 (n = 12) live interactive lectures, demonstration on standardized patients, and textbook reading; Group 2 (n = 11) video recordings of the lectures given to Group 1 and textbook reading alone; Group 3 (n = 9): 90-minute modules taught by residents to interns in near-peer format and textbook reading. The overall score for objective structured clinical examinations from the combined groups was 66%. There was a trend toward more complete PEs in Group 1 taught via live lectures and demonstrations compared to Group 2 that relied on video recording. Near-peer taught residents from Group 3 significantly outperformed Group 2 residents overall (p = 0.02), and trended toward significantly outperforming Group 1 residents as well, with significantly higher scores in the ankle (p = 0.02) and shoulder (p = 0.02) PE cases. This study found that orthopedic interns taught musculoskeletal PE skills by near-peers outperformed other groups overall. An overall score of 66% for the combined didactic groups suggests a baseline deficit in first-year resident musculoskeletal PE skills. The PE should continue to be taught and objectively assessed throughout residency to confirm that budding surgeons have mastered these fundamental skills before going into practice. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  9. Watermarking Techniques Using Least Significant Bit Algorithm for Digital Image Security Standard Solution- Based Android

    Directory of Open Access Journals (Sweden)

    Ari Muzakir

    2017-05-01

    Full Text Available The ease with which digital images can be distributed over the internet has both positive and negative sides, especially for the owner of the original image. On the positive side, the owner can quickly publish image files to sites around the world. The downside is that, without a copyright mark protecting the image, its ownership can easily be claimed by other parties. Watermarking is one solution for protecting copyright and verifying the origin of a digital image. With digital image watermarking, the copyright of the resulting image is protected by embedding additional information such as owner details and authenticity data. The least significant bit (LSB) algorithm is simple and easy to understand. Simulations carried out on an Android smartphone show that LSB watermarking cannot be detected by the naked human eye, meaning there is no visible difference between the original image and the watermarked image. The resulting image has dimensions of 640x480 with a bit depth of 32 bits. In addition, black-box testing was used to assess the device's (smartphone's) ability to process images with this application.
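
    The least significant bit technique mentioned above hides payload bits in the lowest bit of each pixel value, which changes the pixel by at most one intensity level and is therefore invisible to the naked eye. The sketch below is a minimal, generic illustration of LSB embedding and extraction with NumPy; it is not the Android application described in the record, and the payload text is made up.

        # Minimal LSB watermark embedding/extraction (illustration only).
        import numpy as np

        def embed(pixels, bits):
            flat = pixels.flatten()                     # flatten() returns a copy
            if bits.size > flat.size:
                raise ValueError("payload too long for cover image")
            flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
            return flat.reshape(pixels.shape)

        def extract(pixels, n_bits):
            return pixels.flatten()[:n_bits] & 1

        cover = np.random.default_rng(0).integers(0, 256, size=(480, 640), dtype=np.uint8)
        payload = np.unpackbits(np.frombuffer(b"owner-id: 0001", dtype=np.uint8))

        marked = embed(cover, payload)
        print(np.packbits(extract(marked, payload.size)).tobytes())       # b'owner-id: 0001'
        print(int(np.abs(marked.astype(int) - cover.astype(int)).max()))  # at most 1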

  10. Standard Model Higgs boson searches with the ATLAS detector

    Indian Academy of Sciences (India)

    The experimental results on the search for the Standard Model Higgs boson with 1 to 2 fb-1 of proton–proton collision data at √s = 7 TeV recorded by the ATLAS detector are presented and discussed. No significant excess of events is found with respect to the expectations from Standard Model processes, and the production ...

  11. Best in class: hot competition makes Canada an 'incredibly innovative environment' by global industry standards

    International Nuclear Information System (INIS)

    Jaremko, G.

    2000-01-01

    The highly innovative environment in the field service sector of the oil and natural gas industry and the intense competition generated by it are discussed. Despite the fact that Canada produces only 3.5 per cent of the world's oil and 7.0 per cent of its natural gas, Canada is a world leader in the development of field service systems and equipment. On a return on investment basis the field service sector outperformed the exploration and production sector, and while many of them are small compared to giants like Halliburton and Schlumberger, small field service companies frequently outperform the giants, if only because below 100 million dollars in revenues, investors expect a 25 to 30 per cent return on equity. Constant cost cutting and an eye on the bottom line, combined with products and services of high quality and the intense rivalry and competition, keep the industry constantly on its toes to do more with less, to come up with innovative business practices and to stay on the cutting edge of new technology. Progress by several of the field service companies, large and small, is reviewed by way of illustration

  12. Symplectic Attitude Estimation for Small Satellites

    National Research Council Canada - National Science Library

    Valpiani, James M; Palmer, Phillip L

    2006-01-01

    .... Symplectic numerical methods are applied to the Extended Kalman Filter (EKF) algorithm to give the SKF, which outperforms the standard EKF in the presence of nonlinearity and low measurement noise in the 1-D case...

  13. The identification of sites of biodiversity conservation significance: progress with the application of a global standard

    Directory of Open Access Journals (Sweden)

    M.N. Foster

    2012-08-01

    Full Text Available As a global community, we have a responsibility to ensure the long-term future of our natural heritage. As part of this, it is incumbent upon us to do all that we can to reverse the current trend of biodiversity loss, using all available tools at our disposal. One effective means is the safeguarding of those sites that are the highest global priority for the conservation of biodiversity, whether through formal protected areas, community managed reserves, multiple-use areas, or other means. This special issue of the Journal of Threatened Taxa examines the application of the Key Biodiversity Area (KBA) approach to identifying such sites. Given the global mandate expressed through policy instruments such as the Convention on Biological Diversity (CBD), the KBA approach can help countries meet obligations in an efficient and transparent manner. KBA methodology follows the well-established general principles of vulnerability and irreplaceability, and while it aims to be a globally standardized approach, it recognizes the fundamental need for the process to be led at local and national levels. In this series of papers the application of the KBA approach is explored in seven countries or regions: the Caribbean, Indo-Burma, Japan, Macedonia, Mediterranean Algeria, the Philippines and the Upper Guinea region of West Africa. This introductory article synthesizes some of the common main findings and provides a comparison of key summary statistics.

  14. A solar neutrino loophole: standard solar models

    Energy Technology Data Exchange (ETDEWEB)

    Rouse, C A [General Atomic Co., San Diego, Calif. (USA)

    1975-11-01

    The salient aspects of the existence theorem for a unique solution to a system of linear or nonlinear first-order ordinary differential equations are given and applied to the equilibrium stellar structure equations. It is shown that values of pressure, temperature, mass and luminosity are needed at one point - and for the sun, the logical point is the solar radius. It is concluded that since standard solar model calculations use split boundary conditions, a solar neutrino loophole still remains: solar model calculations that seek to satisfy the necessary condition for a unique solution to the solar structure equations suggest a solar interior quite different from that deduced in standard models. This, in turn, suggests a theory of formation and solar evolution significantly different from the standard theory.
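
    For reference, the equilibrium stellar structure equations referred to above are conventionally written as follows (standard textbook forms; the notation is supplied here and is not quoted from the abstract):

        \frac{dm}{dr} = 4\pi r^{2}\rho, \qquad
        \frac{dP}{dr} = -\frac{G m \rho}{r^{2}}, \qquad
        \frac{dL}{dr} = 4\pi r^{2}\rho\,\varepsilon

    together with a fourth equation for dT/dr whose form depends on whether energy transport is radiative or convective. Requiring values of P, T, m, and L at a single point (here the solar radius) is exactly the condition the existence theorem places on a unique solution, in contrast to the split boundary conditions of standard models.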

  15. [18F]FDG PET/CT outperforms [18F]FDG PET/MRI in differentiated thyroid cancer

    International Nuclear Information System (INIS)

    Vrachimis, Alexis; Wenning, Christian; Weckesser, Matthias; Stegger, Lars; Burg, Matthias Christian; Allkemper, Thomas; Schaefers, Michael

    2016-01-01

    To evaluate the diagnostic potential of PET/MRI with [18F]FDG in comparison to PET/CT in patients with differentiated thyroid cancer suspected or known to have dedifferentiated. The study included 31 thyroidectomized and remnant-ablated patients who underwent a scheduled [18F]FDG PET/CT scan and were then enrolled for a PET/MRI scan of the neck and thorax. The datasets (PET/CT, PET/MRI) were rated regarding lesion count, conspicuity, diameter and characterization. Standardized uptake values were determined for all [18F]FDG-positive lesions. Histology, cytology, and examinations before and after treatment served as the standards of reference. Of 26 patients with a dedifferentiated tumour burden, 25 were correctly identified by both [18F]FDG PET/CT and PET/MRI. Detection rates by PET/CT and PET/MRI were 97% (113 of 116 lesions) and 85% (99 of 116 lesions) for malignant lesions, and 100% (48 of 48 lesions) and 77% (37 of 48 lesions) for benign lesions, respectively. Lesion conspicuity was higher on PET/CT for both malignant and benign pulmonary lesions and in the overall rating for malignant lesions (p < 0.001). There was a difference between PET/CT and PET/MRI in the overall evaluation of malignant lesions (p < 0.01) and in the detection of pulmonary metastases (p < 0.001). Surgical evaluation revealed three malignant lesions missed by both modalities. PET/MRI additionally failed to detect 14 pulmonary metastases and 11 benign lesions. In patients with thyroid cancer and suspected or known dedifferentiation, [18F]FDG PET/MRI was inferior to low-dose [18F]FDG PET/CT for the assessment of pulmonary status. However, for the assessment of cervical status, [18F]FDG PET/MRI was equal to contrast-enhanced neck [18F]FDG PET/CT. Therefore, [18F]FDG PET/MRI combined with a low-dose CT scan of the thorax may provide an imaging solution when high-quality imaging is needed and high-energy CT is undesirable or the use of a contrast agent is contraindicated. (orig.)

  16. Social Accountability 8000 standard as a contemporary challenge in HR management

    Directory of Open Access Journals (Sweden)

    Agnieszka Michalak

    2015-12-01

    Full Text Available This study is aimed at presenting the requirements of the Social Accountability 8000 standard as the only certifiable standard in the corporate accountability area. Implementation of the standard is a significant challenge for many entities. This article discusses the areas where SA8000 implementation is most difficult and offers some practical tips useful for further standard implementation. Workplace relations and care for business ethics should be the foundation of any responsible business operation.

  17. Effluent standards

    Energy Technology Data Exchange (ETDEWEB)

    Geisler, G C [Pennsylvania State University (United States)

    1974-07-01

    At the conference there was considerable interest in research reactor standards and effluent standards in particular. On the program, this is demonstrated by the panel discussion on effluents, the paper on argon-41 measurements by Sims, and the summary paper by Ringle et al. on the activities of the ANS research reactor standards committee (ANS-15). As a result, a meeting was organized to discuss the proposed ANS standard on research reactor effluents (15.9). This was held on Tuesday evening and was attended by members of the ANS-15 committee who were present at the conference, participants in the panel discussion on the subject, and other interested attendees. Out of this meeting came a number of excellent suggestions for changes which will increase the utility of the standard, and a strong recommendation that the effluent standard (15.9) be combined with the effluent monitoring standard. It is expected that these suggestions and recommendations will be incorporated and a revised draft issued for comment early this summer. (author)

  18. Malaysian NDT standards

    International Nuclear Information System (INIS)

    Khazali Mohd Zin

    2001-01-01

    In order to become a developed country, Malaysia needs to develop her own national standards. It has been projected that by the year 2020 Malaysia will require about 8,000 standards (Department of Standards Malaysia). Currently, more than 2,000 Malaysian Standards have been gazetted by the government, which is considerably too low ahead of the year 2020. NDT standards have been identified by the standards working group as one of the areas in which to promote our national standards. In this paper the author describes the steps taken to establish Malaysia's very own NDT standards. The project starts with the establishment of radiographic standards. (Author)

  19. Standard guide for sampling radioactive tank waste

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This guide addresses techniques used to obtain grab samples from tanks containing high-level radioactive waste created during the reprocessing of spent nuclear fuels. Guidance on selecting appropriate sampling devices for waste covered by the Resource Conservation and Recovery Act (RCRA) is also provided by the United States Environmental Protection Agency (EPA) (1). Vapor sampling of the head-space is not included in this guide because it does not significantly affect slurry retrieval, pipeline transport, plugging, or mixing. 1.2 The values stated in inch-pound units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  20. Environmental assessment. Energy efficiency standards for consumer products

    Energy Technology Data Exchange (ETDEWEB)

    McSwain, Berah

    1980-06-01

    The Energy Policy and Conservation Act of 1975 requires DOE to prescribe energy efficiency standards for 13 consumer products. The Consumer Products Efficiency Standards (CPES) program covers: refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners (cooling and heat pumps), furnaces, dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers. This Environmental Assessment evaluates the potential environmental and socioeconomic impacts expected as a result of setting efficiency standards for all of the consumer products covered by the CPES program. DOE has proposed standards for eight of the products covered by the Program in a Notice of Proposed Rulemaking (NOPR). DOE expects to propose standards for home heating equipment, central air conditioners (heat pumps only), dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers in 1981. No significant adverse environmental or socioeconomic impacts have been found to result from instituting the CPES.

  1. An Enhanced Jaya Algorithm with a Two Group Adaption

    Directory of Open Access Journals (Sweden)

    Chibing Gong

    2017-01-01

    Full Text Available This paper proposes a novel performance-enhanced Jaya algorithm with a two-group adaption (E-Jaya). Two improvements are presented in E-Jaya. First, instead of using the best and the worst values as in the Jaya algorithm, E-Jaya separates all candidates into two groups, the better and the worse, based on their fitness values; the mean of the better group and the mean of the worse group are then used. Second, so that no new algorithm-specific parameters are added to E-Jaya, a novel adaptive method of dividing the two groups has been developed. Finally, twelve benchmark functions with different dimensionality, such as 40, 60, and 100, were evaluated using the proposed E-Jaya algorithm. The results show that E-Jaya significantly outperformed the Jaya algorithm in terms of solution accuracy. Additionally, E-Jaya was also compared with differential evolution (DE), self-adapting control parameters in differential evolution (jDE), the firefly algorithm (FA), and the standard particle swarm optimization 2011 (SPSO2011) algorithm. E-Jaya outperforms all of these algorithms.
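
    A simplified reading of the two-group update described above is sketched below on a toy minimization problem: the population is sorted by fitness, split into better and worse halves, and the halves' means replace the single best and worst solutions of the standard Jaya update. The fixed 50/50 split, the benchmark function, and all settings are assumptions; the paper's adaptive division rule is not reproduced.

        # Two-group, Jaya-style update (simplified illustration, not the published E-Jaya).
        import numpy as np

        def sphere(x):                          # toy benchmark to minimise
            return np.sum(x ** 2, axis=-1)

        rng = np.random.default_rng(0)
        pop_size, dim, iters, lo, hi = 30, 10, 500, -5.0, 5.0
        pop = rng.uniform(lo, hi, size=(pop_size, dim))

        for _ in range(iters):
            order = np.argsort(sphere(pop))
            better_mean = pop[order[:pop_size // 2]].mean(axis=0)
            worse_mean = pop[order[pop_size // 2:]].mean(axis=0)
            r1, r2 = rng.random((2, pop_size, dim))
            cand = pop + r1 * (better_mean - np.abs(pop)) - r2 * (worse_mean - np.abs(pop))
            cand = np.clip(cand, lo, hi)
            improved = sphere(cand) < sphere(pop)       # greedy replacement, as in Jaya
            pop[improved] = cand[improved]

        print("best objective value:", sphere(pop).min())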

  2. Section for Standard and Patents - Standardization and Patents

    International Nuclear Information System (INIS)

    Wojtowicz, S.; Trechcinski, R.; Rybka, M.; Ryszkowska, A.; Wardaszko, J.

    1997-01-01

    Full text: The most important tasks of the Section in 1996 were: preparation of national standards and program of future work on standards for nuclear instrumentation and electronic equipment in nuclear engineering; organization of activities and participation in the meetings of the Commissions for Standardization No 173 Microprocessor Systems, No 266 Nuclear Instrumentation; giving opinions and expertises on national and international standards for equipment in nuclear engineering; cooperation with the Commission for Standardization No 246 Radiological Protection; control of inventiveness activity; The quality of the technical products is being improved by: a) selection of the proper types of interface systems, technical coordination and quality control; b) creation of standards at a high technical level; The Section works mainly for the Polish Committee for Standardization, the National Atomic Energy Agency, Association of Polish Electrical Engineers and Research Institutes in Poland. The activity of the Section is useful for all national institutions where backplane busses and nuclear electronic equipment is produced or used. The Section participates in the following international organizations: IEC (International Electrotechnical Commission) TC 45 (Nuclear Instrumentation); ISO/IEC Joint Technical Committee ISO/IEC JTCl SC26 (Microprocessor systems); ESONE (European Studies on Norms for Electronics); The section takes part in popularization of nuclear technology and instrumentation in the following ways: distribution of standards and technical documentation to national institutions dealing with nuclear apparatus; collecting and distributing technical information from international organizations (e.g. ESONE); organization of technical and scientific, national and international conferences (New Generation Nuclear Power Plants - September 96, QNX in Real World - January 96); participation in the technical conference organized by the Polish Committee for

  3. Standard Model Higgs Searches at the Tevatron

    Energy Technology Data Exchange (ETDEWEB)

    Knoepfel, Kyle J.

    2012-06-01

    We present results from the search for a standard model Higgs boson using data corresponding to up to 10 fb^-1 of proton-antiproton collision data produced by the Fermilab Tevatron at a center-of-mass energy of 1.96 TeV. The data were recorded by the CDF and D0 detectors between March 2001 and September of 2011. A broad excess is observed in the range 105 < m_H < 145 GeV/c^2 with a global significance of 2.2 standard deviations relative to the background-only hypothesis.

  4. FACTORS AFFECTING THE COMPLIANCE OF MYANMAR NURSES IN PERFORMING STANDARD PRECAUTION

    Directory of Open Access Journals (Sweden)

    Sa Sa Aung

    2017-06-01

    Full Text Available Introduction: Exposure to pathogens is a serious issue for nurses. The literature indicates that standard precautions have not been performed consistently in nursing. The purpose of this study was to analyze the factors affecting the compliance of nurses in Myanmar in performing standard precautions. Methods: This study used a cross-sectional design. The sample included 34 nurses at Waibagi Specialist Hospital (SHW), Myanmar. The independent variables were the characteristics of the nurses, knowledge of standard precautions, and exposure to blood/body fluids and needle puncture wounds. The dependent variable was the performance of standard precautions. Data were analyzed using descriptive analysis and logistic regression. Results: Almost all respondents (91.18%) had good knowledge about standard precautions, and 73.5% of respondents showed good adherence in performing standard precautions. However, in practice nurses were not consistent in correctly recapping used needles. Nurse characteristics did not significantly affect adherence to standard precautions, with the following statistical test results: age (p = 0.97), gender (p = 1.00), religion (p = 0.72), education (p = 0.85), work experience at SHW (p = 0.84), education and training program (p = 0.71), knowledge (p = 0.76), and needle stick injury (p = 0.17). There was, however, a significant association between adherence to standard precautions and the incidence of needle stick injury (p = 0.01). Discussion: The barriers to applying standard precautions by Myanmar nurses can be reduced by providing basic training, supervision and improvement of standard operating procedures.

  5. To what extent have high schools in California been able to implement state-mandated nutrition standards?

    Science.gov (United States)

    Samuels, Sarah E; Bullock, Sally Lawrence; Woodward-Lopez, Gail; Clark, Sarah E; Kao, Janice; Craypo, Lisa; Barry, Jay; Crawford, Patricia B

    2009-09-01

    To determine extent and factors associated with implementation of California's school nutrition standards 1 year after standards became active. Information on competitive foods and beverages available in schools was collected from a representative sample of 56 public high schools in California. Adherence to nutrition standards was calculated for each item and summarized for each school by venue. The association between schools' sociodemographic characteristics and adherence to standards was determined by multivariate analysis. The majority of schools were adhering to the required beverage standards. None of the schools selling competitive foods were 100% adherent to the food standards. Adherence to both standards tended to be highest in food service venues. In univariate analyses, percent nonwhite enrollment, population density, percent free/reduced-price (FRP) meal eligibility, and school size were significantly correlated with the beverage adherence rate. Percent nonwhite enrollment and population density remained significant in the multivariate regression model. Percent nonwhite enrollment and percent FRP meal eligibility were significantly correlated with the food adherence rate in univariate analysis, but neither remained significant in the multiple regression model. California high schools are making progress toward implementation of the state nutrition standards. Beverage standards appear easier to achieve than nutrient-based food standards. Additional support is needed to provide schools with resources to implement and monitor these policies. Simpler standards and/or a reduction in the foods and beverages sold could better enable schools to achieve and monitor adherence.

  6. Standardization of positive controls in diagnostic immunohistochemistry

    DEFF Research Database (Denmark)

    Torlakovic, Emina E; Nielsen, Søren; Francis, Glenn

    2015-01-01

    Diagnostic immunohistochemistry (dIHC) has been practiced for several decades, with an ongoing expansion of applications for diagnostic use, and more recently for detection of prognostic and predictive biomarkers. However, standardization of practice has yet to be achieved, despite significant...

  7. Dynamic jump intensities and risk premiums

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Ornthanalai, Chayawat; Jacobs, Kris

    2012-01-01

    We build a new class of discrete-time models that are relatively easy to estimate using returns and/or options. The distribution of returns is driven by two factors: dynamic volatility and dynamic jump intensity. Each factor has its own risk premium. The models significantly outperform standard models without jumps when estimated on S&P500 returns. We find very strong support for time-varying jump intensities. Compared to the risk premium on dynamic volatility, the risk premium on the dynamic jump intensity has a much larger impact on option prices. We confirm these findings using joint...

  8. Cadmium ban spurs interest in zinc-nickel coating for corrosive aerospace environments

    Energy Technology Data Exchange (ETDEWEB)

    Bates, J. (Pure Coatings Inc., West Palm Beach, FL (United States))

    1994-02-01

    OSHA recently reduced the permissible exposure level for cadmium. The new standard virtually outlaws cadmium production and use, except in the most cost-insensitive applications. Aerospace manufacturers, which use cadmium extensively in coatings applications because of the material's corrosion resistance, are searching for substitutes. The most promising alternative found to date is a zinc-nickel alloy. Tests show that the alloy outperforms cadmium without generating associated toxicity issues. As a result, several major manufacturing and standards organizations have adopted the zinc-nickel compound as a standard cadmium replacement. The basis for revising the cadmium PEL -- which applies to occupational exposure in industrial, agricultural and maritime occupations -- is an official OSHA determination that employees exposed to cadmium under the existing PEL face significant health risks from lung cancer and kidney damage. In one of its principal uses, cadmium is electroplated to steel, where it acts as an anticorrosive agent.

  9. The Significance of Dewey's "Democracy and Education" for 21st-Century Education

    Science.gov (United States)

    Mason, Lance E.

    2017-01-01

    This paper explores the significance of Dewey's "Democracy and Education" for "21st-century education," a term used by proponents of curricular standardization and digital ubiquity in classrooms. Though these domains have distinct advocacy groups, they often share similar assumptions about the primary purposes of schooling as…

  10. Tissue-Based MRI Intensity Standardization: Application to Multicentric Datasets

    Directory of Open Access Journals (Sweden)

    Nicolas Robitaille

    2012-01-01

    Full Text Available Intensity standardization in MRI aims at correcting scanner-dependent intensity variations. Existing simple and robust techniques aim at matching the input image histogram onto a standard, while we think that standardization should aim at matching spatially corresponding tissue intensities. In this study, we present a novel automatic technique, called STI for STandardization of Intensities, which not only shares the simplicity and robustness of histogram-matching techniques, but also incorporates tissue spatial intensity information. STI uses joint intensity histograms to determine intensity correspondence in each tissue between the input and standard images. We compared STI to an existing histogram-matching technique on two multicentric datasets, Pilot E-ADNI and ADNI, by measuring the intensity error with respect to the standard image after performing nonlinear registration. The Pilot E-ADNI dataset consisted of 3 subjects each scanned in 7 different sites. The ADNI dataset consisted of 795 subjects scanned in more than 50 different sites. STI was superior to the histogram-matching technique, showing significantly better intensity matching for the brain white matter with respect to the standard image.
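
    The joint-histogram idea mentioned above can be illustrated in a few lines: within a tissue mask, a 2D histogram of (input intensity, standard intensity) pairs is built, and each input-intensity bin is mapped to the standard intensity it most often co-occurs with. The binning and the argmax correspondence rule below are assumptions for illustration and are not the STI method itself.

        # Toy joint-histogram intensity mapping between a registered input and standard image.
        import numpy as np

        def histogram_mapping(input_img, standard_img, mask, n_bins=256):
            a = input_img[mask].ravel()
            b = standard_img[mask].ravel()
            hist, a_edges, b_edges = np.histogram2d(a, b, bins=n_bins)
            b_centres = 0.5 * (b_edges[:-1] + b_edges[1:])
            lut = b_centres[np.argmax(hist, axis=1)]          # most frequent partner intensity
            idx = np.clip(np.digitize(input_img, a_edges) - 1, 0, n_bins - 1)
            return lut[idx]

        rng = np.random.default_rng(0)
        standard = rng.normal(100, 10, size=(64, 64))
        scanned = 0.5 * standard + 20 + rng.normal(0, 1, size=(64, 64))   # scanner-shifted copy
        mask = np.ones_like(standard, dtype=bool)
        mapped = histogram_mapping(scanned, standard, mask)
        print("mean abs error after mapping:", float(np.abs(mapped - standard).mean()))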

  11. Regulatory barriers blocking standardization of interoperability.

    Science.gov (United States)

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-07-12

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. The standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and Continua Health Alliance, are striving for this purpose. However, factors like medical device regulation, health policy, and market reality have placed non-technical barriers over the adoption of technical standards throughout the industry. These barriers have significantly impaired the motivations of consumer device vendors who desire to enter the personal health market and the overall success of the personal health industry ecosystem. In this paper, we present the effect that these barriers have had on the health ecosystem. This requires immediate action from policy makers and other stakeholders. The current regulatory policy needs to be updated to reflect the reality and demand of the consumer health industry. Our hope is that this paper will draw wide consensus amongst its readers, policy makers, and other stakeholders.

  12. Accounting standards

    NARCIS (Netherlands)

    Stellinga, B.; Mügge, D.

    2014-01-01

    The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed

  13. Value investing in emerging markets : local macroeconomic risk and extrapolation

    NARCIS (Netherlands)

    Kouwenberg, R.; Salomons, R.M.

    2003-01-01

    Our results confirm the profitability of value investing at the country level in emerging markets. A portfolio of countries with low price-to-book ratios significantly outperforms a portfolio of high price-to-book countries. Global risk factors cannot explain this outperformance. Next we measure a

  14. Motion Normalized Proportional Control for Improved Pattern Recognition-Based Myoelectric Control.

    Science.gov (United States)

    Scheme, Erik; Lock, Blair; Hargrove, Levi; Hill, Wendy; Kuruganti, Usha; Englehart, Kevin

    2014-01-01

    This paper describes two novel proportional control algorithms for use with pattern recognition-based myoelectric control. The systems were designed to provide automatic configuration of motion-specific gains and to normalize the control space to the user's usable dynamic range. Class-specific normalization parameters were calculated using data collected during classifier training and require no additional user action or configuration. The new control schemes were compared to the standard method of deriving proportional control using a one degree of freedom Fitts' law test for each of the wrist flexion/extension, wrist pronation/supination and hand close/open degrees of freedom. Performance was evaluated using the Fitts' law throughput value as well as more descriptive metrics including path efficiency, overshoot, stopping distance and completion rate. The proposed normalization methods significantly outperformed the incumbent method in every performance category for able bodied subjects (p < 0.001) and nearly every category for amputee subjects. Furthermore, one proposed method significantly outperformed both other methods in throughput (p < 0.0001), yielding 21% and 40% improvement over the incumbent method for amputee and able bodied subjects, respectively. The proposed control schemes represent a computationally simple method of fundamentally improving myoelectric control users' ability to elicit robust, and controlled, proportional velocity commands.
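
    A minimal sketch of the Fitts' law throughput metric used to compare the control schemes (the common convention of index of difficulty ID = log2(D/W + 1), with throughput taken as the mean of ID divided by movement time; the target distances, widths and times below are hypothetical, not the study data):

      import numpy as np

      def index_of_difficulty(distance: np.ndarray, width: np.ndarray) -> np.ndarray:
          # Shannon formulation of Fitts' index of difficulty, in bits
          return np.log2(distance / width + 1.0)

      def throughput(distance, width, movement_time) -> float:
          ids = index_of_difficulty(np.asarray(distance, float), np.asarray(width, float))
          return float(np.mean(ids / np.asarray(movement_time, float)))  # bits per second

      # Hypothetical targets: distances and widths in normalized control-space units, times in seconds
      d = [0.8, 0.5, 0.9, 0.6]
      w = [0.1, 0.2, 0.1, 0.15]
      t = [1.9, 1.1, 2.2, 1.3]
      print(f"throughput = {throughput(d, w, t):.2f} bits/s")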

  15. Chinese tallow trees (Triadica sebifera) from the invasive range outperform those from the native range with an active soil community or phosphorus fertilization.

    Science.gov (United States)

    Zhang, Ling; Zhang, Yaojun; Wang, Hong; Zou, Jianwen; Siemann, Evan

    2013-01-01

    Two mechanisms that have been proposed to explain success of invasive plants are unusual biotic interactions, such as enemy release or enhanced mutualisms, and increased resource availability. However, while these mechanisms are usually considered separately, both may be involved in successful invasions. Biotic interactions may be positive or negative and may interact with nutritional resources in determining invasion success. In addition, the effects of different nutrients on invasions may vary. Finally, genetic variation in traits between populations located in introduced versus native ranges may be important for biotic interactions and/or resource use. Here, we investigated the roles of soil biota, resource availability, and plant genetic variation using seedlings of Triadica sebifera in an experiment in the native range (China). We manipulated nitrogen (control or 4 g/m(2)), phosphorus (control or 0.5 g/m(2)), soil biota (untreated or sterilized field soil), and plant origin (4 populations from the invasive range, 4 populations from the native range) in a full factorial experiment. Phosphorus addition increased root, stem, and leaf masses. Leaf mass and height growth depended on population origin and soil sterilization. Invasive populations had higher leaf mass and growth rates than native populations did in fresh soil but they had lower, comparable leaf mass and growth rates in sterilized soil. Invasive populations had higher growth rates with phosphorus addition but native ones did not. Soil sterilization decreased specific leaf area in both native and exotic populations. Negative effects of soil sterilization suggest that soil pathogens may not be as important as soil mutualists for T. sebifera performance. Moreover, interactive effects of sterilization and origin suggest that invasive T. sebifera may have evolved more beneficial relationships with the soil biota. Overall, seedlings from the invasive range outperformed those from the native range, however

  16. Comparisons of significant parameters for a standard 20% enriched and FLIP 70% enriched TRIGA core

    International Nuclear Information System (INIS)

    Ringle, John C.; Anderson, Terrance V.; Johnson, Arthur G.

    1978-01-01

    A comparison is made between the 20% and 70% enriched cores. The initial start-up data for both cores show the FLIP core needs ∼3.8 times the ²³⁵U mass of the 20% core just to go critical. Operational configurations for both cores indicate a need for ∼33% additional fuel above initial critical for adequate maneuvering excess. The fuel element worths are higher in the central core locations for the 20% elements, while the peripheral element worths are about the same (with some thermal flux peaking in the FLIP peripheral elements). Pulsing comparisons of the two cores show significant differences in reactivity insertions and power peaks. (author)

  17. Passing the panda standard: a TAD off the mark?

    Science.gov (United States)

    Belton, Ben; Murray, Francis; Young, James; Telfer, Trevor; Little, David C

    2010-02-01

    Tilapia, a tropical freshwater fish native to Africa, is an increasingly important global food commodity. The World Wide Fund for Nature (WWF), a major environmental nongovernmental organization, has established stakeholder dialogues to formulate farm certification standards that promote "responsible" culture practices. As a preface to its "tilapia aquaculture dialogue," the WWF commissioned a review of potential certification issues, later published as a peer-reviewed article. This article contends that both the review and the draft certification standards subsequently developed fail to adequately integrate critical factors governing the relative sustainability of tilapia production and thereby miss more significant issues related to resource-use efficiency and the appropriation of ecosystem space and services. This raises a distinct possibility that subsequent certification will promote intensive systems of tilapia production that are far less ecologically benign than existing widely practiced semi-intensive alternatives. Given the likely future significance of this emergent standard, it is contended that a more holistic approach to certification is essential.

  18. State Standard-Setting Processes in Brief. State Academic Standards: Standard-Setting Processes

    Science.gov (United States)

    Thomsen, Jennifer

    2014-01-01

    Concerns about academic standards, whether created by states from scratch or adopted by states under the Common Core State Standards (CCSS) banner, have drawn widespread media attention and are at the top of many state policymakers' priority lists. Recently, a number of legislatures have required additional steps, such as waiting periods for…

  19. Testing non-standard CP violation in neutrino propagation

    International Nuclear Information System (INIS)

    Winter, Walter

    2009-01-01

    Non-standard physics which can be described by effective four-fermion interactions may be an additional source of CP violation in neutrino propagation. We discuss the detectability of such a CP violation at a neutrino factory. We assume the current baseline setup of the international design study of a neutrino factory (IDS-NF) for the simulation. We find that the CP violation from certain non-standard interactions is, in principle, detectable significantly below their current bounds - even if there is no CP violation in the standard oscillation framework. Therefore, a new physics effect might be misinterpreted as the canonical Dirac CP violation, and a possibly even more exciting effect might be missed.

  20. Standards in chestnut coppice system: cultural heritage or silvicultural requirement?

    Directory of Open Access Journals (Sweden)

    Manetti MC

    2012-12-01

    Full Text Available Standards in chestnut coppice system: cultural heritage or silvicultural requirement? This paper aims at evaluating the role of standards in chestnut coppices from a biological and functional perspective. In addition to a detailed analysis of Italian regulations on the issue, the technical definition of the term is analysed: (i) as for the functional role of standards; (ii) to assess whether the required functions are technically necessary and are actually being performed. In this context, the results of an experimental trial are reported. The goals of the trial were to assess the shoots' parameters, the stand productivity, and the dynamics of canopy cover in coppices with or without standards. In 2001, at harvesting operations in a coppice aged 30 with standards managed by the local community, two experimental plots of 2500 m2 each were established. The two theses being compared were: simple coppice and coppice with standards (100 standards per hectare). The released standards were qualified immediately after final harvesting. Sprouting ability, growth pattern and stool vitality were surveyed in March 2004 (at age 2), in May 2008 (at age 6) and in April 2010 (at age 8). First results highlighted statistically significant differences between the two theses. The high number of standards negatively affected both the vitality and the growth pattern of the stools. The simple coppice recorded lower shoot mortality and higher diametrical growth and canopy cover degree as well; the height growth was, by contrast, significantly lower. These results, although referring to a limited lifespan (1/3 of the rotation time) and to one site only, underline productive, ecological and environmental benefits and, as a consequence, suggest the widening of the experimental network and the development of new, more relevant and consistent rules, making the simple coppice acceptable as a possible silvicultural choice to be applied to chestnut coppices.

  1. Development of nuclear power standards and relevant system in China

    International Nuclear Information System (INIS)

    Cao Shudong

    2008-01-01

    By analyzing the history of nuclear power development and the status of nuclear power codes and standards in China, this paper points out the significance of, and the necessity for, accelerating the development of a nuclear power standards system in China, and puts forward the guiding ideology, development thoughts, working doctrine and development objectives. (authors)

  2. Explanation of nurse standard of external exposure acute radiation sickness

    International Nuclear Information System (INIS)

    Lu Xiuling; Jiang Enhai; Sun Feifei; Zhang Bin; Wang Xiaoguang; Wang Guilin

    2012-01-01

    The national occupational health standard 'Nurse Standard of External Exposure Acute Radiation Sickness' has been approved and issued by the Ministry of Health. Based on extensive research of the literature, the collection of data on personnel excessively exposed in previous nuclear and radiation accidents, and the specific situation in China, this standard was enacted according to the current national laws and regulations and the opinions of peer experts. It is mainly used for the care of patients with acute radiation sickness, and also has directive significance for the care of patients with iatrogenic acute radiation sickness due to hematopoietic stem cell transplantation pretreatment. To correctly carry out this standard and to reasonably implement nursing measures for patients with acute radiation sickness, the contents of this standard are interpreted in this article. (authors)

  3. Mixed deep learning and natural language processing method for fake-food image recognition and standardization to help automated dietary assessment.

    Science.gov (United States)

    Mezgec, Simon; Eftimov, Tome; Bucher, Tamara; Koroušić Seljak, Barbara

    2018-04-06

    The present study tested the combination of an established and a validated food-choice research method (the 'fake food buffet') with a new food-matching technology to automate the data collection and analysis. The methodology combines fake-food image recognition using deep learning and food matching and standardization based on natural language processing. The former is specific because it uses a single deep learning network to perform both the segmentation and the classification at the pixel level of the image. To assess its performance, measures based on the standard pixel accuracy and Intersection over Union were applied. Food matching firstly describes each of the recognized food items in the image and then matches the food items with their compositional data, considering both their food names and their descriptors. The final accuracy of the deep learning model trained on fake-food images acquired by 124 study participants and providing fifty-five food classes was 92·18 %, while the food matching was performed with a classification accuracy of 93 %. The present findings are a step towards automating dietary assessment and food-choice research. The methodology outperforms other approaches in pixel accuracy, and since it is the first automatic solution for recognizing the images of fake foods, the results could be used as a baseline for possible future studies. As the approach enables a semi-automatic description of recognized food items (e.g. with respect to FoodEx2), these can be linked to any food composition database that applies the same classification and description system.
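
    The two segmentation measures named above, pixel accuracy and Intersection over Union, can be sketched as follows (an illustrative re-implementation with toy label maps, not the study's evaluation code):

      import numpy as np

      def pixel_accuracy(pred: np.ndarray, truth: np.ndarray) -> float:
          # Fraction of pixels whose predicted class equals the ground-truth class
          return float((pred == truth).mean())

      def mean_iou(pred: np.ndarray, truth: np.ndarray, n_classes: int) -> float:
          ious = []
          for c in range(n_classes):
              inter = np.logical_and(pred == c, truth == c).sum()
              union = np.logical_or(pred == c, truth == c).sum()
              if union > 0:                      # ignore classes absent from both maps
                  ious.append(inter / union)
          return float(np.mean(ious))

      # Toy 4x4 label maps with two food classes plus background (class 0)
      truth = np.array([[0, 0, 1, 1], [0, 2, 2, 1], [0, 2, 2, 0], [0, 0, 0, 0]])
      pred  = np.array([[0, 1, 1, 1], [0, 2, 2, 1], [0, 2, 0, 0], [0, 0, 0, 0]])
      print(pixel_accuracy(pred, truth), mean_iou(pred, truth, n_classes=3))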

  4. Gradient plasticity crack tip characterization by means of the extended finite element method

    DEFF Research Database (Denmark)

    Martínez Pañeda, Emilio; Natarajan, S.; Bordas, S.

    2017-01-01

    of the displacement field is enriched with the stress singularity of the gradient-dominated solution. Results reveal that the proposed numerical methodology largely outperforms the standard finite element approach. The present work could have important implications for the use of microstructurally-motivated models in large scale...

  5. Telemetry Standards, RCC Standard 106-17, Chapter 4, Pulse Code Modulation Standards

    Science.gov (United States)

    2017-07-01

    Appendix 4-B, Citations. ...investigation can be found in a paper by J. L. Maury, Jr. and J. Styles, "Development of Optimum Frame Synchronization Codes for Goddard Space Flight Center"... Telemetry Standards, RCC Standard 106-17, Chapter 4, July 2017... Aeronautical Radio, Inc., Mark 33 Digital Information Transfer

  6. An empirical assessment of the impact of technical standards on the export of meat in Nigeria

    Directory of Open Access Journals (Sweden)

    Queeneth Odichi Ekeocha

    2017-10-01

    Full Text Available The study is an assessment of the impact of technical standards on meat export in Nigeria. A wide range of literature was reviewed in relation to meat standards, issues associated with standards compliance, the effects of SPS standards on food exports in developing countries, and the causes of the non-export of meat in Nigeria, amongst others. A survey method was used and a cross-tabulation analysis was made to ascertain the relationships among various variables and how significant they were in relation to food product standards. The findings of the study include, among others, that sanitary conditions for meat processing are a significant factor for meat export, that standards compliance is a step in the right direction towards agricultural export diversification, and that food standard compliance can create market access for meat exports. The study concluded that technical standards are very significant for meat exports in Nigeria. It therefore recommends, among others, that the government should invest in the productive capacity needed to meet SPS requirements for meat export, that standard abattoirs should be built and maintained, and that policymakers should re-think a flexible export diversification policy that could attract foreign investors and meat companies to Nigeria.

  7. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA)

    DEFF Research Database (Denmark)

    Muharemovic, O; Troelsen, A; Thomsen, M G

    2018-01-01

    INTRODUCTION: Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time...... imaging. Radiographers in the control group used a standard RSA protocol. RESULTS: At three months, radiographers in the case group significantly reduced (p .... No significant improvements were found in the control group at any time point. CONCLUSION: There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute...

  8. Study of the Standard Model Higgs boson decaying to taus at CMS

    CERN Document Server

    Botta, Valeria

    2017-01-01

    The most recent search for the Standard Model Higgs boson decaying to a pair of $\\tau$ leptons is performed using proton-proton collision events at a centre-of-mass energy of 13~TeV, recorded by the CMS experiment at the LHC. The full 2016 dataset, corresponding to an integrated luminosity of 35.9~fb$^{-1}$, has been analysed. The Higgs boson signal in the $\\tau^{+}\\tau^{-}$ decay mode is observed with a significance of 4.9 standard deviations, to be compared to an expected significance of 4.7 standard deviations. This measurement is the first observation of the Higgs boson decay into fermions by a single experiment.

  9. Nuclear standards

    International Nuclear Information System (INIS)

    Fichtner, N.; Becker, K.; Bashir, M.

    1981-01-01

    This compilation of all nuclear standards available to the authors by mid 1980 represents the third, carefully revised edition of a catalogue which was first published in 1975 as EUR 5362. In this third edition several changes have been made. The title has been condensed. The information has again been carefully up-dated, covering all changes regarding status, withdrawal of old standards, new projects, amendments, revisions, splitting of standards into several parts, combination of several standards into one, etc., as available to the authors by mid 1980. The speed with which information travels varies and requires in many cases rather tedious and cumbersome inquiries. Also, the classification scheme has been revised with the goal of better adjustment to changing situations and priorities. Whenever it turned out to be difficult to attribute a standard to a single subject category, multiple listings in all relevant categories have been made. As in previous editions, within the subcategories the standards are arranged by organization (in Category 2.1 by country) alphabetically and in ascending numerical order. It covers all relevant areas of power reactors, the fuel cycle, radiation protection, etc., from the basic laws and governmental regulations, regulatory guides, etc., all the way to voluntary industrial standards and codes of practice. (orig./HP)

  10. Standardization Documents

    Science.gov (United States)

    2011-08-01

    Specifications and Standards; Guide Specifications; CIDs; and NGSs. Federal Specifications; Commercial... national or international standardization document developed by a private sector association, organization, or technical society that plans... Maintain lessons learned. Examples: guidance for application of a technology; lists of options. Defense Handbook

  11. Comparison between old and new noise standards in Nagoya City

    Science.gov (United States)

    Asai, Atsushi; Mishina, Yoshiaki; Oishi, Yasaki; Ogura, Toshimitsu; Hayashi, Akinori; Omiya, Masaaki; Kuno, Kazuhiro

    2004-10-01

    The Japanese Environmental Agency (now the Ministry of the Environment) updated the environmental quality standards for noise in April 1999. The new standards replaced the median value of percentile level L50 for noise evaluation with the equivalent sound pressure level LAeq. The standards renewed the classification of areas and time sections. The most significant change was the introduction of category of artery-road-adjacent area. This report sets the range of the artery-road-adjacent area to 20 m or less from the applicable road to compare the new standards with the old, based on data collected in Nagoya City. The achieved rates for the new standards seem to be on the whole the same as those for the old standards. However, a detailed analysis reveals some differences, such as higher achieved rates in the artery-road-adjacent areas and lower achieved rates in the general areas for the new standards than for the old.

  12. The "Next Generation Science Standards" and the Earth and Space Sciences

    Science.gov (United States)

    Wysession, Michael E.

    2013-01-01

    In this article, Michael E. Wysession comments on the "Next Generation Science Standards" (NGSS), which are based on the recommendations of the National Research Council and represent a revolutionary step toward establishing modern, national K-12 science education standards. The NGSS involves significant changes from traditional…

  13. Standardized Testing Practices: Effect on Graduation and NCLEX® Pass Rates.

    Science.gov (United States)

    Randolph, Pamela K

    The use of standardized testing in pre-licensure nursing programs has been accompanied by conflicting reports of effective practices. The purpose of this project was to describe standardized testing practices in one state's nursing programs and discover if the use of a cut score or oversight of remediation had any effect on (a) first-time NCLEX® pass rates, (b) on-time graduation (OTG) or (c) the combination of (a) and (b). Administrators of 38 nursing programs in one Southwest state were sent surveys; surveys were returned by 34 programs (89%). Survey responses were compared to each program's NCLEX pass rate and on-time graduation rate; t-tests were conducted for significant differences associated with a required minimum score (cut score) and oversight of remediation. There were no significant differences in NCLEX pass or on-time graduation rates related to establishment of a cut score. There was a significant difference when the NCLEX pass rate and on-time graduation rate were combined (Outcome Index "OI"), with significantly higher program outcomes (P = .02) for programs without cut scores. There were no differences associated with faculty oversight of remediation. The results of this study do not support establishment of a cut score when implementing standardized testing. Copyright © 2016. Published by Elsevier Inc.

  14. Standardisation in standards

    International Nuclear Information System (INIS)

    McDonald, J. C.

    2012-01-01

    The following observations are offered by one who has served on national and international standards-writing committees and standards review committees. Service on working groups consists of either updating previous standards or developing new standards. The process of writing either type of document proceeds along similar lines. The first order of business is to recognise the need for developing or updating a standard and to identify the potential user community. It is also necessary to ensure that there is a required number of members willing to do the writing. A justification is required as to why a new standard should be developed, and this is written as a new work item proposal or a project initiation notification system form. This document must be filed officially and approved, and a search is then undertaken to ensure that the proposed new standard will not duplicate a standard that has already been published or is underway in another standards organisation. (author)

  15. Result of standard patch test in patients suspected of having allergic contact dermatitis.

    Science.gov (United States)

    Wongpiyabovorn, Jongkonnee; Puvabanditsin, Porntip

    2005-09-01

    Contact dermatitis is a common skin disease. The disease is diagnosed from a history of contact with a substance together with the distribution of the lesions. Up till now, the standard patch test has been one of the most reliable tests to identify and confirm the causative agent of allergic contact dermatitis. To determine the rate of positive standard patch tests and to identify the common allergens of contact dermatitis in Thailand, we performed the standard patch test in 129 patients suspected of having allergic contact dermatitis at the Department of Dermatology, King Chulalongkorn Memorial Hospital, Thailand, from June 1, 2003 to September 1, 2004. The rate of positive standard patch tests was 59.7% (n = 77/129). The 3 most common positive allergens were nickel sulfate (18.60%), cobalt chloride (17.05%) and fragrance mix (14.73%), respectively. The chance of a positive standard patch test significantly correlated with sex (female), an initial diagnosis of contact dermatitis and a history of housework (p = 0.017, p = 0.005 and p = 0.023, respectively), whereas there was no significant correlation between the chance of a positive standard patch test and the age of the patient, the location of the lesion, history of recurrence, history of atopy, or history of drug and food allergy. In addition, a history of metal allergy significantly correlated with the chance of a positive nickel sulfate or cobalt chloride result in the standard patch test (p = 0.017). In conclusion, this study demonstrated the prevalence of causative allergens of contact dermatitis in Thai patients using the standard patch test. Moreover, our data showed that the chance of a positive standard patch test was greater in patients who were women, were initially diagnosed with contact dermatitis, or had a history of housework or metal allergy.

  16. Climate-Specific Passive Building Standards

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Graham S. [Passive House Inst., Westford, MA (United States); Klingenberg, Katrin [Passive House Inst., Westford, MA (United States)

    2015-07-01

    Passive design principles (super insulation, airtight envelopes, elimination of thermal bridges, etc.) - pioneered in North America in the 70s and 80s and refined in Europe in the 90s - have proven to be universally effective in significantly reducing heating and cooling loads. However, a single, rigid performance metric developed in Germany has led to limited uptake of passive building principles in many regions of the United States. It has also, in many cases, promoted design decisions that had negative effects on economic feasibility and thermal comfort. This study's main objective is to validate (in a theoretical sense) verifiable, climate-specific passive standards and space conditioning criteria that retain ambitious, environmentally necessary energy reduction targets and are economically feasible; such standards provide designers an ambitious but achievable performance target on the path to zero.

  17. Comparison of Hospital standards with ISO principles and presentation of an appropriate model for Hospital standard development

    Directory of Open Access Journals (Sweden)

    bahram Delgoshai

    2005-02-01

    Conclusion: The method of writing evaluation items in the »B handout« differs significantly from what is done in an ISO audit. Research findings showed that, despite the leap forward that the development of the »B handout« represented at the national level, there are serious gaps in comprehensive hospital evaluation with respect to a hospital's system, means and content, because of a lack of sufficient experience in standard development based on ISO principles.

  18. Standardization of UV LED measurements

    Science.gov (United States)

    Eppeldauer, G. P.; Larason, T. C.; Yoon, H. W.

    2015-09-01

    Traditionally used source spectral-distribution or detector spectral-response based standards cannot be applied for accurate UV LED measurements. Since the CIE standardized rectangular-shape spectral response function for UV measurements cannot be realized with small spectral mismatch when using filtered detectors, the UV measurement errors can be several tens of percent or larger. The UV LEDs produce broadband radiation and both their peaks and spectral bandwidths can change significantly. The detectors used for the measurement of these LEDs also have different spectral bandwidths. In the discussed example, where LEDs with 365 nm peak are applied for fluorescent crack-recognition using liquid penetrant (non-destructive) inspection, the broadband radiometric LED (signal) measurement procedure is standardized. A UV LED irradiance-source was calibrated against an FEL lamp standard to determine its spectral irradiance. The spectral irradiance responsivity of a reference UV meter was also calibrated. The output signal of the reference UV meter was calculated from the spectral irradiance of the UV source and the spectral irradiance responsivity of the reference UV meter. From the output signal, both the integrated irradiance (in the reference plane of the reference meter) and the integrated responsivity of the reference meter were determined. Test UV meters, calibrated for integrated responsivity against the reference UV meter, can be used to determine the integrated irradiance from a field UV source. The obtained 5 % (k=2) measurement uncertainty can be decreased when meters with spectral response close to a constant value are selected.
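
    The calibration chain described above can be sketched numerically as follows (all spectra below are hypothetical Gaussians, not calibration data): the reference-meter signal is the LED spectral irradiance weighted by the meter's spectral irradiance responsivity and integrated over wavelength, and the integrated responsivity is that signal divided by the integrated irradiance.

      import numpy as np

      wl = np.linspace(330.0, 400.0, 141)     # wavelength grid, nm (0.5 nm steps)
      dw = wl[1] - wl[0]

      # Hypothetical LED spectral irradiance, W m^-2 nm^-1: Gaussian peaked at 365 nm
      E = 0.02 * np.exp(-0.5 * ((wl - 365.0) / 4.25) ** 2)

      # Hypothetical meter spectral irradiance responsivity, A / (W m^-2)
      s = 1.0e-6 * np.exp(-0.5 * ((wl - 365.0) / 20.0) ** 2)

      signal = np.sum(E * s) * dw    # reference meter output signal, A
      E_int = np.sum(E) * dw         # integrated irradiance, W m^-2
      s_int = signal / E_int         # integrated responsivity of the meter, A / (W m^-2)
      print(f"integrated irradiance = {E_int:.3f} W/m^2, integrated responsivity = {s_int:.3e} A/(W/m^2)")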

  19. Assessment of liquefaction-induced hazards using Bayesian networks based on standard penetration test data

    Science.gov (United States)

    Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan

    2018-05-01

    Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.
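
    A minimal sketch of two of the performance measures used in the comparison, the Brier score and the ROC AUC, computed with scikit-learn from hypothetical hazard probabilities and outcomes (not the study's data or its Bayesian-network implementation):

      from sklearn.metrics import brier_score_loss, roc_auc_score

      # Hypothetical predicted probabilities of, e.g., sand boils at surveyed sites,
      # and the corresponding observed outcomes (1 = hazard occurred, 0 = did not)
      y_true = [1, 0, 1, 1, 0, 0, 1, 0]
      y_prob = [0.85, 0.30, 0.65, 0.70, 0.40, 0.10, 0.55, 0.20]

      print("Brier score:", brier_score_loss(y_true, y_prob))   # lower is better
      print("AUC:", roc_auc_score(y_true, y_prob))              # higher is better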

  20. Communications standards

    CERN Document Server

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  1. MRI-Targeted or Standard Biopsy for Prostate-Cancer Diagnosis.

    Science.gov (United States)

    Kasivisvanathan, Veeru; Rannikko, Antti S; Borghi, Marcelo; Panebianco, Valeria; Mynderse, Lance A; Vaarala, Markku H; Briganti, Alberto; Budäus, Lars; Hellawell, Giles; Hindley, Richard G; Roobol, Monique J; Eggener, Scott; Ghei, Maneesh; Villers, Arnauld; Bladou, Franck; Villeirs, Geert M; Virdi, Jaspal; Boxler, Silvan; Robert, Grégoire; Singh, Paras B; Venderink, Wulphert; Hadaschik, Boris A; Ruffion, Alain; Hu, Jim C; Margolis, Daniel; Crouzet, Sébastien; Klotz, Laurence; Taneja, Samir S; Pinto, Peter; Gill, Inderbir; Allen, Clare; Giganti, Francesco; Freeman, Alex; Morris, Stephen; Punwani, Shonit; Williams, Norman R; Brew-Graves, Chris; Deeks, Jonathan; Takwoingi, Yemisi; Emberton, Mark; Moore, Caroline M

    2018-05-10

    Multiparametric magnetic resonance imaging (MRI), with or without targeted biopsy, is an alternative to standard transrectal ultrasonography-guided biopsy for prostate-cancer detection in men with a raised prostate-specific antigen level who have not undergone biopsy. However, comparative evidence is limited. In a multicenter, randomized, noninferiority trial, we assigned men with a clinical suspicion of prostate cancer who had not undergone biopsy previously to undergo MRI, with or without targeted biopsy, or standard transrectal ultrasonography-guided biopsy. Men in the MRI-targeted biopsy group underwent a targeted biopsy (without standard biopsy cores) if the MRI was suggestive of prostate cancer; men whose MRI results were not suggestive of prostate cancer were not offered biopsy. Standard biopsy was a 10-to-12-core, transrectal ultrasonography-guided biopsy. The primary outcome was the proportion of men who received a diagnosis of clinically significant cancer. Secondary outcomes included the proportion of men who received a diagnosis of clinically insignificant cancer. A total of 500 men underwent randomization. In the MRI-targeted biopsy group, 71 of 252 men (28%) had MRI results that were not suggestive of prostate cancer, so they did not undergo biopsy. Clinically significant cancer was detected in 95 men (38%) in the MRI-targeted biopsy group, as compared with 64 of 248 (26%) in the standard-biopsy group (adjusted difference, 12 percentage points; 95% confidence interval [CI], 4 to 20; P=0.005). MRI, with or without targeted biopsy, was noninferior to standard biopsy, and the 95% confidence interval indicated the superiority of this strategy over standard biopsy. Fewer men in the MRI-targeted biopsy group than in the standard-biopsy group received a diagnosis of clinically insignificant cancer (adjusted difference, -13 percentage points; 95% CI, -19 to -7; P<0.001). MRI, with or without targeted biopsy, was superior to standard biopsy in men with a clinical suspicion of prostate cancer who had not undergone biopsy previously. (Funded by the National Institute for
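
    As a rough check of the headline comparison (an unadjusted, normal-approximation sketch; the trial itself reports adjusted differences), the detection rates of clinically significant cancer in the two arms can be compared as follows:

      from math import sqrt

      def diff_in_proportions(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
          # Difference in proportions with a 95% Wald confidence interval
          p1, p2 = x1 / n1, x2 / n2
          diff = p1 - p2
          se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
          return diff, diff - z * se, diff + z * se

      # Numbers reported above: 95/252 in the MRI-targeted group vs 64/248 with standard biopsy
      diff, lo, hi = diff_in_proportions(95, 252, 64, 248)
      print(f"difference = {diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")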

  2. Globalization quickly increased need for moving from local to international standards

    Energy Technology Data Exchange (ETDEWEB)

    Cappelli, Cataldo

    2005-07-01

    'Standardization' has changed quickly in the past few years, owing to the globalization of the market, which needs international standards as important instruments for eliminating technical barriers to trade. The petroleum sector chose to move to international standards jointly processed by the International Organization for Standardization (ISO) and the American Petroleum Institute (API), which had played the role of historical reference. Taking into account that the oil industry wants only one standard used worldwide, Europe also decided to adopt these ISO standards as European standards. The result is even better considering that Russia and China also appear to be adopting these ISO documents as their national standards. The motto that ISO TC 67, 'Materials, equipment and offshore structures for petroleum, petrochemical and natural gas industries', adopted for its standardization activities is thus becoming ever more significant: 'Do it once, do it right, do it internationally'. Examples of such international standards used worldwide as national standards are ISO 11960:2004, 'Steel pipes for use as casing or tubing for wells', and ISO/DIS 3183, 'Steel pipe for pipeline transportation systems' (under preparation). Standardization has thus grown from a technical into a management tool, and countries are increasingly including standards in more areas of their legislation. (author)

  3. Standard setting: comparison of two methods.

    Science.gov (United States)

    George, Sanju; Haque, M Sayeed; Oyebode, Femi

    2006-09-14

    The outcome of assessments is determined by the standard-setting method used. There is a wide range of standard-setting methods and the two used most extensively in undergraduate medical education in the UK are the norm-reference and the criterion-reference methods. The aims of the study were to compare these two standard-setting methods for a multiple-choice question examination and to estimate the test-retest and inter-rater reliability of the modified Angoff method. The norm-reference method of standard-setting (mean minus 1 SD) was applied to the 'raw' scores of 78 4th-year medical students on a multiple-choice examination (MCQ). Two panels of raters also set the standard using the modified Angoff method for the same multiple-choice question paper on two occasions (6 months apart). We compared the pass/fail rates derived from the norm reference and the Angoff methods and also assessed the test-retest and inter-rater reliability of the modified Angoff method. The pass rate with the norm-reference method was 85% (66/78) and that by the Angoff method was 100% (78 out of 78). The percentage agreement between Angoff method and norm-reference was 78% (95% CI 69% - 87%). The modified Angoff method had an inter-rater reliability of 0.81-0.82 and a test-retest reliability of 0.59-0.74. There were significant differences in the outcomes of these two standard-setting methods, as shown by the difference in the proportion of candidates that passed and failed the assessment. The modified Angoff method was found to have good inter-rater reliability and moderate test-retest reliability.
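
    A minimal sketch of the two cut-score rules being compared, using illustrative scores and judge estimates rather than the study data:

      import numpy as np

      scores = np.array([62, 71, 55, 80, 67, 73, 49, 66, 77, 70], dtype=float)  # MCQ % scores

      # Norm-reference method: pass mark = cohort mean minus 1 SD of the raw scores
      norm_cut = scores.mean() - scores.std(ddof=1)
      pass_rate_norm = (scores >= norm_cut).mean()

      # Modified Angoff method: each rater estimates, per item, the probability that a
      # borderline candidate answers correctly; the cut score is the mean over raters of
      # the summed item probabilities (expressed here as a percentage of 10 items)
      angoff_item_probs = np.array([
          [0.6, 0.7, 0.5, 0.8, 0.6, 0.7, 0.4, 0.6, 0.7, 0.6],   # rater 1 (hypothetical)
          [0.5, 0.6, 0.6, 0.7, 0.5, 0.8, 0.5, 0.6, 0.6, 0.7],   # rater 2 (hypothetical)
      ])
      angoff_cut = angoff_item_probs.sum(axis=1).mean() / 10 * 100
      pass_rate_angoff = (scores >= angoff_cut).mean()

      print(f"norm-reference cut = {norm_cut:.1f}%, pass rate = {pass_rate_norm:.0%}")
      print(f"Angoff cut = {angoff_cut:.1f}%, pass rate = {pass_rate_angoff:.0%}")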

  4. Radiological Control Technician: Standardized technician Qualification Standard

    International Nuclear Information System (INIS)

    1992-10-01

    The Qualification Standard states and defines the knowledge and skill requirements necessary for successful completion of the Radiological Control Technician Training Program. The standard is divided into three phases: Phase I concerns RCT Academic training. There are 13 lessons associated with the core academics program and 19 lessons associated with the site academics program. The staff member should sign the appropriate blocks upon successful completion of the examination for that lesson or group of lessons. In addition, facility specific lesson plans may be added to meet the knowledge requirements in the Job Performance Measures (JPM) of the practical program. Phase II concerns RCT core/site practical (JPMs) training. There are thirteen generic tasks associated with the core practical program. Both the trainer/evaluator and student should sign the appropriate block upon successful completion of the JPM. In addition, facility specific tasks may be added or generic tasks deleted based on the results of the facility job evaluation. Phase III concerns the oral examination board; successful completion of the oral examination board is documented by the signature of the chairperson of the board. Upon completion of all of the standardized technician qualification requirements, final qualification is verified by the student and the manager of the Radiological Control Department and acknowledged by signatures on the qualification standard. The completed Qualification Standard shall be maintained as an official training record.

  5. Implementation Guidance Document: Properties and Types of Significant Photothermal Retinal Lesion Injuries

    Science.gov (United States)

    2017-04-17

    Properties and Types of... are above MPE yet well below an irradiance that could cause significant retinal injuries. Estimating RSI in Practice - Some Illustrative Examples

  6. Working out the standards for nuclear power aging management implementation (PLM Standards)

    International Nuclear Information System (INIS)

    Miyano, Hiroshi

    2008-01-01

    The background to the preparation of the standards, the preparation of standards for the development of nuclear power aging management technologies, the revision of the PLM (Product Lifecycle Management) standards, and the problems of the PLM standards are stated. The placement of social needs, the scheme, the standards system, the preparation of rules and standards, and their practical use are illustrated and explained by means of a road map. The relation between the safety regulations and the examination standards, and the development and preparation of the standards system, are outlined. Nuclear power plant aging management and maintenance control are governed by many rules and standards. The PLM standard defines the aging phenomena, extracts the corresponding measures, and reflects them in the usual maintenance flow under the long-term maintenance program. The new examination system combines the usual maintenance with maintenance based on aging management and the long-term maintenance program. The outline and construction of the PLM standards are explained, with notes and additional books. (S.Y.)

  7. Multi-level significance of vulnerability indicators. Case study: Eastern Romania

    Science.gov (United States)

    Stanga, I. C.; Grozavu, A.

    2012-04-01

    Vulnerability assessment aims, most frequently, to emphasize the internal fragility of a system compared to a reference standard, to similar systems, or in relation to a given hazard. Internal fragility, either biophysical or structural, may affect the capacity to predict, to prepare for, to cope with or to recover from a disaster. Thus, vulnerability is linked to resilience and adaptive capacity. From the local level to the global one, vulnerability factors and corresponding indicators differ, and their significance must be tested and validated in a well-structured conceptual and methodological framework. In this paper, the authors aim to show the real vulnerability of rural settlements in Eastern Romania using a multi-level approach. The research area, Tutova Hills, covers about 3421 sq. km and more than 200,000 inhabitants in 421 villages characterized by deficient accessibility, lack of endowments, subsistence agriculture, high pressure on the natural environment (especially on forest and soil resources), poverty and an aging population. Factors that could influence the vulnerability of these rural settlements were inventoried and assigned to groups through a cluster analysis: habitat and technical urban facilities, infrastructure, economic, social and demographic indicators, environmental quality, management of emergency situations, etc. Firstly, the main difficulty was to convert qualitative variables into quantitative indicators and to standardize all values to make mathematical and statistical processing of the data possible. Secondly, the great variability of vulnerability factors, their different measuring units and their high amplitude of variation require different methods of standardization in order to obtain values between zero (minimum vulnerability) and one (maximum vulnerability). Final vulnerability indicators were selected and integrated into a general scheme according to their significance as it resulted from an appropriate factor analysis: linear and
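
    The standardization step described above, rescaling each indicator to values between zero (minimum vulnerability) and one (maximum vulnerability), can be sketched with a simple min-max transformation (the indicators and values below are hypothetical):

      import numpy as np

      def min_max(values: np.ndarray, higher_is_more_vulnerable: bool = True) -> np.ndarray:
          # Rescale to [0, 1]; invert indicators where a high raw value means low vulnerability
          v = (values - values.min()) / (values.max() - values.min())
          return v if higher_is_more_vulnerable else 1.0 - v

      # Example: share of aged population (higher = more vulnerable) and a road accessibility
      # index (higher = less vulnerable) for five hypothetical villages
      aging = np.array([18.0, 25.0, 31.0, 22.0, 40.0])    # % population over 65
      access = np.array([0.9, 0.4, 0.6, 0.2, 0.5])        # accessibility index

      v_aging = min_max(aging)
      v_access = min_max(access, higher_is_more_vulnerable=False)
      print(np.round(v_aging, 2), np.round(v_access, 2))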

  8. Evaluating significance in linear mixed-effects models in R.

    Science.gov (United States)

    Luke, Steven G

    2017-08-01

    Mixed-effects models are being used ever more frequently in the analysis of experimental data. However, in the lme4 package in R the standards for evaluating significance of fixed effects in these models (i.e., obtaining p-values) are somewhat vague. There are good reasons for this, but as researchers who are using these models are required in many cases to report p-values, some method for evaluating the significance of the model output is needed. This paper reports the results of simulations showing that the two most common methods for evaluating significance, using likelihood ratio tests and applying the z distribution to the Wald t values from the model output (t-as-z), are somewhat anti-conservative, especially for smaller sample sizes. Other methods for evaluating significance, including parametric bootstrapping and the Kenward-Roger and Satterthwaite approximations for degrees of freedom, were also evaluated. The results of these simulations suggest that Type 1 error rates are closest to .05 when models are fitted using REML and p-values are derived using the Kenward-Roger or Satterthwaite approximations, as these approximations both produced acceptable Type 1 error rates even for smaller samples.
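
    A minimal, language-agnostic sketch of the likelihood ratio test discussed above (illustrative log-likelihoods, not output from lme4): twice the difference in log-likelihood between the full and reduced models is referred to a chi-squared distribution with degrees of freedom equal to the number of dropped fixed-effect parameters.

      from scipy.stats import chi2

      loglik_full = -1230.4     # hypothetical log-likelihood of the model with the fixed effect
      loglik_reduced = -1233.1  # hypothetical log-likelihood of the model without it
      df = 1                    # one fixed-effect parameter dropped

      lr_stat = 2.0 * (loglik_full - loglik_reduced)
      p_value = chi2.sf(lr_stat, df)
      print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.4f}")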

  9. Hearing protector performance and standard deviation.

    Science.gov (United States)

    Williams, W; Dillon, H

    2005-01-01

    The attenuation performance of a hearing protector is used to estimate the protected exposure level of the user. The aim is to reduce the exposed level to an acceptable value. Users should expect the attenuation to fall within a reasonable range of values around a norm. However, an analysis of extensive test data indicates that there is a negative relationship between attenuation performance and the standard deviation. This result is deduced using a variation in the method of calculating a single number rating of attenuation that is more amenable to drawing statistical inferences. As performance is typically specified as a function of the mean attenuation minus one or two standard deviations from the mean to ensure that greater than 50% of the wearer population are well protected, the implication of increasing standard deviation with decreasing attenuation found in this study means that a significant number of users are, in fact, experiencing over-protection. These users may be disinclined to use their hearing protectors because of an increased feeling of acoustic isolation. This problem is exacerbated in areas with lower noise levels.
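
    A worked sketch of the relationship described above, with hypothetical attenuation figures: the assumed protection is the mean attenuation minus one or two standard deviations, so a larger standard deviation reduces the assumed protection and pushes users toward higher-attenuation protectors.

      mean_attenuation_db = 25.0   # laboratory mean attenuation of the protector, dB (hypothetical)
      std_attenuation_db = 6.0     # standard deviation across test subjects, dB (hypothetical)
      exposure_level_db = 100.0    # unprotected workplace exposure level, dB(A) (hypothetical)

      for k in (1, 2):
          assumed_protection = mean_attenuation_db - k * std_attenuation_db
          protected_level = exposure_level_db - assumed_protection
          print(f"mean - {k} SD: assumed protection {assumed_protection:.0f} dB, "
                f"protected level {protected_level:.0f} dB(A)")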

  10. BUSINESS ETHICS STANDARDS AND HOTEL BUSINESS

    Directory of Open Access Journals (Sweden)

    Ivica Batinić

    2014-04-01

    Full Text Available By implementing certain standards in business, especially standards of business ethics, each entity in the hotel industry emphasizes its specificity and recognition, while giving the guest-consumer security and a guarantee of the desired quality. In today's global world, business ethics has become an indispensable part of hotel business practices and a prerequisite for achieving business success. Business ethics acquires strategic significance because it creates a system of governance based on ethical principles that enables the hotel to respond properly to the demands of all interest groups. The successful hotels will be precisely those that do not separate ethics from profitability, but successfully coordinate them in their business. Business ethics has a strong impact on the hotel business, and hotel management plays a major role in its implementation. Every responsible hotel management should, in accordance with the hotel's business philosophy, devise various ethical practices and codes of conduct for its employees, which will serve as an important standard for the business.

  11. Seminar on standards, standardization, quality control and interlaboratory test programmes

    Energy Technology Data Exchange (ETDEWEB)

    de Bievre, P. [Central Bureau for Nuclear Measurements, Geel (Belgium)

    1978-12-15

    The author gives an overview of the proper use of standards and the standardization of measurement procedures. Results of measurements obtained on the same instrument and on the same series of standards of different isotopic compositions are displayed.

  12. Improvement of Diagnostic Accuracy by Standardization in Diuretic Renal Scan

    International Nuclear Information System (INIS)

    Hyun, In Young; Lee, Dong Soo; Lee, Kyung Han; Chung, June Key; Lee, Myung Chul; Koh, Chang Soon; Kim, Kwang Myung; Choi, Hwang; Choi, Yong

    1995-01-01

    We evaluated the diagnostic accuracy of the diuretic renal scan with standardization in 45 children (107 hydronephrotic kidneys) with 91 diuretic assessments. Sensitivity was 100%, specificity was 78%, and accuracy was 84% in 49 hydronephrotic kidneys with standardization. For diuretic renal scans without standardization, sensitivity was 100%, specificity was 38%, and accuracy was 57% in 58 hydronephrotic kidneys. False-positive results were observed in 25 cases without standardization and in 8 cases with standardization. In diuretic renal scans without standardization, the causes of false-positive results included early injection of Lasix before mixing of radioactivity (10), extrarenal pelvis (6), and immature kidneys of neonates (3); with standardization, the causes of false-positive results were markedly dilated systems post-pyeloplasty (2), extrarenal pelvis (2), immature kidney of a neonate (1), severe renal dysfunction (2), and vesicoureteral reflux (1). In diuretic renal scans without standardization, false-positive results due to an inadequate study were common, but false-positive results due to an inadequate study were not found after standardization. False-positive results due to dilated pelvo-calyceal systems post-pyeloplasty, extrarenal pelvis, and immature kidneys of neonates were not resolved by standardization. In conclusion, standardization of the diuretic renal scan was useful in children with renal outflow tract obstruction because it significantly improved specificity.
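
    The reported figures follow from a standard two-by-two table; the sketch below uses counts reconstructed only to be consistent with the percentages quoted above for the standardized scans (13 true positives, no false negatives, 8 false positives and 28 true negatives in 49 kidneys), which is an assumption for illustration rather than the published table.

      def diagnostic_metrics(tp: int, fn: int, fp: int, tn: int):
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          accuracy = (tp + tn) / (tp + fn + fp + tn)
          return sensitivity, specificity, accuracy

      # Hypothetical counts for scans with standardization (8 false positives, as reported above)
      tp, fn, fp, tn = 13, 0, 8, 28
      sens, spec, acc = diagnostic_metrics(tp, fn, fp, tn)
      print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, accuracy {acc:.0%}")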

  13. Implementing the National Council of Teachers of Mathematics Standards: A slow process

    Directory of Open Access Journals (Sweden)

    Joseph M. Furner

    2004-10-01

    Full Text Available The purpose of this study was to look at inservice teachers’ pedagogical beliefs about the National Council of Teachers of Mathematics Standards (1989 & 2000). The Standards’ Belief Instrument (Zollman and Mason, 1992) was administered to teachers. An ANOVA was used to look for a significant difference between teachers with five years or less of experience teaching mathematics and those with more than five years of teaching experience. One expectation was that teachers who are recent graduates of teacher education programmes may have more training on the NCTM Standards. Although there were no statistically significant differences between the two groups, this study did support the expectation. Current training with in-service teachers shows that many of the teachers are familiar with neither the National Council of Teachers of Mathematics nor their Standards. It seems then from this study that the implementation process of the NCTM Standards, and perhaps any standards or best practices and new curriculum implementation, is very sluggish.

  14. JPEG2000 vs. full frame wavelet packet compression for smart card medical records.

    Science.gov (United States)

    Leehan, Joaquín Azpirox; Lerallut, Jean-Francois

    2006-01-01

    This paper describes a comparison among different compression methods to be used in the context of electronic health records in the newer version of "smart cards". The JPEG2000 standard is compared to a full-frame wavelet packet compression method at high (33:1 and 50:1) compression rates. Results show that the full-frame method outperforms the JPEG2K standard qualitatively and quantitatively.
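
    One common way to make such a comparison quantitative at a fixed compression ratio is the peak signal-to-noise ratio between the original and the decompressed image; the sketch below is a generic PSNR computation on a toy image and is an assumption, since the paper's exact quality measures are not listed here.

      import numpy as np

      def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
          # Peak signal-to-noise ratio in decibels for 8-bit images
          mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
          return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

      # Toy example: an 8-bit "image" and a slightly degraded reconstruction
      rng = np.random.default_rng(1)
      img = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
      recon = np.clip(img.astype(np.int16) + rng.integers(-3, 4, size=img.shape), 0, 255).astype(np.uint8)
      print(f"PSNR = {psnr(img, recon):.1f} dB")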

  15. Jointly-check iterative decoding algorithm for quantum sparse graph codes

    International Nuclear Information System (INIS)

    Jun-Hu, Shao; Bao-Ming, Bai; Wei, Lin; Lin, Zhou

    2010-01-01

    For quantum sparse graph codes with stabilizer formalism, the unavoidable girth-four cycles in their Tanner graphs greatly degrade the iterative decoding performance with a standard belief-propagation (BP) algorithm. In this paper, we present a jointly-check iterative algorithm suitable for decoding quantum sparse graph codes efficiently. Numerical simulations show that this modified method outperforms the standard BP algorithm with an obvious performance improvement. (general)

  16. The extended reciprocity: Strong belief outperforms persistence.

    Science.gov (United States)

    Kurokawa, Shun

    2017-05-21

    The existence of cooperation is a mysterious phenomenon and demands explanation, and direct reciprocity is one key potential explanation for the evolution of cooperation. Direct reciprocity allows cooperation to evolve for cooperators who switch their behavior on the basis of information about the opponent's behavior. A key issue for direct reciprocity is information deficiency. When the opponent's last move is unknown, how should players behave? One possibility is to choose cooperation with some default probability without using any further information. In fact, our previous paper (Kurokawa, 2016a) examined this strategy. However, there might be beneficial information other than the opponent's last move. A subsequent study of ours (Kurokawa, 2017) examined the strategy which uses the player's own last move when the opponent's last move is unknown, and revealed that referring to one's own move and trying to imitate it when information is absent is beneficial. Is there any other beneficial information? How about strong belief (i.e., having infinite memory and believing that the opponent's behavior is unchanged)? Here, we examine the evolution of strategies with strong belief. Analyzing the repeated prisoner's dilemma game and using evolutionarily stable strategy (ESS) analysis against an invasion by unconditional defectors, we find the strategy with strong belief is more likely to evolve than the strategy which does not use information other than the opponent's last move, and more likely to evolve than the strategy which uses not only the opponent's last move but also the player's own last move. Strong belief produces the extended reciprocity and facilitates the evolution of cooperation. Additionally, we consider the two-strategy game between strategies with strong belief and any strategy, and we consider the four-strategy game in which unconditional cooperators, unconditional defectors, pessimistic reciprocators with strong belief, and optimistic reciprocators with strong belief are present. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. US line-ups outperform UK line-ups

    OpenAIRE

    Seale-Carlisle, Travis M.; Mickes, Laura

    2016-01-01

    In the USA and the UK, many thousands of police suspects are identified by eyewitnesses every year. Unfortunately, many of those suspects are innocent, which becomes evident when they are exonerated by DNA testing, often after having been imprisoned for years. It is, therefore, imperative to use identification procedures that best enable eyewitnesses to discriminate innocent from guilty suspects. Although police investigators in both countries often administer line-up procedures, the details ...

  18. Data Assimilation in Integrated and Distributed Hydrological Models

    DEFF Research Database (Denmark)

    Zhang, Donghua

    processes and provide simulations in refined temporal and spatial resolutions. Recent developments in measurement and sensor technologies have significantly improved the coverage, quality, frequency and diversity of hydrological observations. Data assimilation provides a great potential in relation...... point of view, different assimilation methodologies and techniques have been developed or customized to better serve hydrological assimilation. From the application point of view, real data and real-world complex catchments are used with the focus of investigating the models’ improvements with data...... a variety of model uncertainty sources and scales. Next the groundwater head assimilation experiment was tested in a much more complex catchment with assimilation of biased real observations. In such cases, the bias-aware assimilation method significantly outperforms the standard assimilation method...

  19. Internal Fiber Structure of a High-Performing, Additively Manufactured Injection Molding Insert

    DEFF Research Database (Denmark)

    Hofstätter, Thomas; Baier, Sina; Trinderup, Camilla H.

    A standard mold is equipped with additively manufactured inserts in a rectangular shape produced with vat photo polymerization. While the lifetime compared to conventional materials such as brass, steel, and aluminum is reduced, the prototyping and design phase can be shortened significantly...... by using flexible and cost-effective additive manufacturing technologies. Higher production volumes still exceed the capability of additively manufactured inserts, which are overruled by the stronger performance of less-flexible but mechanically advanced materials. In this contribution, the internal...... structure of a high-performing, fiber-reinforced injection molding insert has been analyzed. The insert reached a statistically proven and reproducible lifetime of 4,500 shots, which significantly outperforms any other previously published additively manufactured inserts. Computer tomography, tensile tests...

  20. Accounting standards that appeal to the professional

    Directory of Open Access Journals (Sweden)

    Alain BURLAUD

    2016-12-01

    In order to observe, in a scientific manner, this evolution of the accounting standards, we conducted a content analysis of principal legislative accounting texts, international and national (France and Romania), supplemented by a lexicometric analysis. These analyses allowed us to conclude that the importance of professional judgment in accounting standards is lower at the national level than it is at the international level. However, we highlight a number of dangers related to an increased use of professional judgment: loss of comparability and transparency, increased risks for accounting professionals including auditors, and significant discrepancies in the use of professional judgment in individual or consolidated accounts.

  1. Training Standardization

    International Nuclear Information System (INIS)

    Agnihotri, Newal

    2003-01-01

    The article describes the benefits of, and the required process and recommendations for, implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the number of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation.

  2. Functional assays for analysis of variants of uncertain significance in BRCA2

    DEFF Research Database (Denmark)

    Guidugli, Lucia; Carreira, Aura; Caputo, Sandrine M

    2014-01-01

    Missense variants in the BRCA2 gene are routinely detected during clinical screening for pathogenic mutations in patients with a family history of breast and ovarian cancer. These subtle changes frequently remain of unknown clinical significance because of the lack of genetic information that may...... of uncertain significance analyzed, and describe a validation set of (genetically) proven pathogenic and neutral missense variants to serve as a golden standard for the validation of each assay. Guidelines are proposed to enable implementation of laboratory-based methods to assess the impact of the variant...

  3. A Note on Standard Deviation and Standard Error

    Science.gov (United States)

    Hassani, Hossein; Ghodsi, Mansoureh; Howell, Gareth

    2010-01-01

    Many students confuse the standard deviation and standard error of the mean and are unsure which, if either, to use in presenting data. In this article, we endeavour to address these questions and cover some related ambiguities about these quantities.
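
    A minimal numeric illustration of the distinction (Python, made-up data): the standard deviation describes the spread of the observations, while the standard error of the mean describes the uncertainty of the estimated mean.

        import math
        import statistics

        data = [4.2, 5.1, 3.8, 4.9, 5.4, 4.4, 4.7, 5.0]
        sd = statistics.stdev(data)             # sample standard deviation (spread of the data)
        se = sd / math.sqrt(len(data))          # standard error of the mean (precision of the mean)
        print(f"mean = {statistics.mean(data):.2f}, SD = {sd:.2f}, SE = {se:.2f}")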

  4. Globalization quickly increased need for moving from local to international standards

    Energy Technology Data Exchange (ETDEWEB)

    Cappelli, Cataldo

    2005-07-01

    'Standardization' has changed quickly in the past few years, due to the globalization of markets, which needs international standards as important instruments for eliminating technical barriers to trade. The 'Petroleum Sector' chose to move to international standards jointly processed by the International Organization for Standardization (ISO) and the American Petroleum Institute (API), which played the role of historical reference. Taking into account that the oil industry wants only one standard used worldwide, Europe also decided to adopt these ISO standards as European Standards. The result is even better considering that Russia and China also seem to be adopting these ISO documents as their national standards. The motto that ISO TC 67 'Materials, equipment and offshore structures for petroleum, petrochemical and natural gas industries' adopted for its standardization activities is thus becoming ever more significant: 'Do it once, do it right, do it internationally'. Examples of such international standards used worldwide as national standards are: ISO 11960:2004 - 'Steel pipes for use as casing or tubing for wells'; and ISO/DIS 3183 - 'Steel pipe for pipeline transportation systems' (under preparation). Standardization has thus grown from a technical into a management tool, and countries are moving to include standards in more areas of their legislation. (author)

  5. Adopting HLA standard for interdependency study

    International Nuclear Information System (INIS)

    Nan, Cen; Eusgeld, Irene

    2011-01-01

    In recent decades, modern Critical Infrastructure (CI) has become increasingly automated and interlinked as more and more resources and information are required to maintain its day-to-day operation. A system failure, or even just a service debilitation, of any CI may have significant adverse effects on other infrastructures it is connected/interconnected with. It is vital to study the interdependencies within and between CIs and to provide advanced modeling and simulation techniques in order to prevent, or at least minimize, these adverse effects. The key limitation of traditional mathematical models, such as complex network theory, is that they lack the capability to provide sufficient insight into the interrelationships between CIs, owing to the complexity of these systems. A comprehensive method, a hybrid approach combining various modeling/simulation techniques in a distributed simulation environment, is presented in this paper. High Level Architecture (HLA) is an open standard (IEEE standard 1516) supporting simulations composed of different simulation components, which can be regarded as the framework for implementing such a hybrid approach. The concept of adopting the HLA standard for interdependency studies is still under discussion by many researchers. Whether or not this HLA standard, or even the distributed simulation environment, is able to meet the desired model/simulation requirements needs to be carefully examined. This paper presents the results from our experimental test-bed, which recreates the architecture of a typical Electricity Power Supply System (EPSS) with its own Supervisory Control and Data Acquisition (SCADA) system, for the purpose of investigating the capabilities of the HLA technique as a standard for performing interdependency studies.

  6. Standard setting: Comparison of two methods

    Directory of Open Access Journals (Sweden)

    Oyebode Femi

    2006-09-01

    Full Text Available Abstract Background The outcome of assessments is determined by the standard-setting method used. There is a wide range of standard-setting methods and the two used most extensively in undergraduate medical education in the UK are the norm-reference and the criterion-reference methods. The aims of the study were to compare these two standard-setting methods for a multiple-choice question examination and to estimate the test-retest and inter-rater reliability of the modified Angoff method. Methods The norm-reference method of standard-setting (mean minus 1 SD) was applied to the 'raw' scores of 78 4th-year medical students on a multiple-choice examination (MCQ). Two panels of raters also set the standard using the modified Angoff method for the same multiple-choice question paper on two occasions (6 months apart). We compared the pass/fail rates derived from the norm-reference and the Angoff methods and also assessed the test-retest and inter-rater reliability of the modified Angoff method. Results The pass rate with the norm-reference method was 85% (66/78) and that by the Angoff method was 100% (78 out of 78). The percentage agreement between the Angoff method and the norm-reference method was 78% (95% CI 69%–87%). The modified Angoff method had an inter-rater reliability of 0.81–0.82 and a test-retest reliability of 0.59–0.74. Conclusion There were significant differences in the outcomes of these two standard-setting methods, as shown by the difference in the proportion of candidates that passed and failed the assessment. The modified Angoff method was found to have good inter-rater reliability and moderate test-retest reliability.
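
    The two cut-score rules compared above can be sketched in a few lines of Python; all scores and Angoff judgements below are invented for illustration.

        import statistics

        # Norm-reference rule: cut score = mean of the raw scores minus one standard deviation.
        raw_scores = [68, 74, 55, 81, 62, 70, 77, 59, 66, 73]
        cut_norm = statistics.mean(raw_scores) - statistics.stdev(raw_scores)
        print(f"norm-referenced cut score: {cut_norm:.1f}")
        print(f"pass rate: {sum(s >= cut_norm for s in raw_scores)}/{len(raw_scores)}")

        # Modified Angoff rule: each rater estimates, per item, the probability that a borderline
        # candidate answers correctly; the cut score is the mean of the raters' summed estimates.
        angoff_judgements = [              # rows = raters, columns = items
            [0.6, 0.7, 0.5, 0.8, 0.4],
            [0.5, 0.8, 0.6, 0.7, 0.5],
        ]
        cut_angoff = statistics.mean(sum(rater) for rater in angoff_judgements)
        print(f"Angoff cut score (out of {len(angoff_judgements[0])} items): {cut_angoff:.2f}")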

  7. Minimal extension of the standard model scalar sector

    International Nuclear Information System (INIS)

    O'Connell, Donal; Wise, Mark B.; Ramsey-Musolf, Michael J.

    2007-01-01

    The minimal extension of the scalar sector of the standard model contains an additional real scalar field with no gauge quantum numbers. Such a field does not couple to the quarks and leptons directly but rather through its mixing with the standard model Higgs field. We examine the phenomenology of this model focusing on the region of parameter space where the new scalar particle is significantly lighter than the usual Higgs scalar and has small mixing with it. In this region of parameter space most of the properties of the additional scalar particle are independent of the details of the scalar potential. Furthermore the properties of the scalar that is mostly the standard model Higgs can be drastically modified since its dominant branching ratio may be to a pair of the new lighter scalars

  8. The Economics of Standards and Standardization in Information and Communication Technologies

    DEFF Research Database (Denmark)

    Pedersen, Mogens Kuhn; Fomin, Vladislav V.

    2006-01-01

    processes in the field of ICT taking place? How and why do open standards differ from other types of standards? How may open standards influence ICT government policy and the reverse: How will government need to take action in the face of the international trend toward open standards in ICT?...

  9. Standardized gene nomenclature for the Brassica genus

    Directory of Open Access Journals (Sweden)

    King Graham J

    2008-05-01

    Full Text Available Abstract The genus Brassica (Brassicaceae, Brassiceae) is closely related to the model plant Arabidopsis, and includes several important crop plants. Against the background of ongoing genome sequencing, and in line with efforts to standardize and simplify description of genetic entities, we propose a standard systematic gene nomenclature system for the Brassica genus. This is based upon concatenating abbreviated categories, where these are listed in descending order of significance from left to right (i.e. genus – species – genome – gene name – locus – allele). Indicative examples are provided, and the considerations and recommendations for use are discussed, including outlining the relationship with functionally well-characterized Arabidopsis orthologues. A Brassica Gene Registry has been established under the auspices of the Multinational Brassica Genome Project that will enable management of gene names within the research community, and includes provisional allocation of standard names to genes previously described in the literature or in sequence repositories. The proposed standardization of Brassica gene nomenclature has been distributed to editors of plant and genetics journals and curators of sequence repositories, so that it can be adopted universally.

  10. Natural background approach to setting radiation standards

    International Nuclear Information System (INIS)

    Adler, H.I.; Federow, H.; Weinberg, A.M.

    1979-01-01

    The suggestion has often been made that an additional radiation exposure imposed on humanity as a result of some important activity, such as electricity generation, would be acceptable if the exposure was small compared to the natural background. In order to make this concept quantitative and objective, we propose that 'small compared with the natural background' be interpreted as the standard deviation (weighted by the exposed population) of the natural background. This use of the variation in natural background radiation is less arbitrary and requires fewer unfounded assumptions than some current approaches to standard-setting. The standard deviation is an easily calculated statistic that is small compared with the mean value for natural exposures of populations. It is an objectively determined quantity and its significance is generally understood. Its determination does not omit any of the pertinent data. When this method is applied to the population of the United States, it suggests that a dose of 20 mrem/year would be an acceptable standard. This is comparable to the 25 mrem/year suggested as the maximum allowable exposure to an individual from the complete uranium fuel cycle.
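
    A small Python sketch of the proposed quantity, a population-weighted standard deviation of natural background dose; the regional doses and populations below are invented numbers, not the US data behind the 20 mrem/year figure.

        import math

        doses_mrem = [80, 95, 110, 130, 160]            # mean natural background per region, mrem/year
        population = [30e6, 60e6, 50e6, 40e6, 20e6]     # people living in each region

        w_total = sum(population)
        mean = sum(d * w for d, w in zip(doses_mrem, population)) / w_total
        var = sum(w * (d - mean) ** 2 for d, w in zip(doses_mrem, population)) / w_total
        print(f"weighted mean = {mean:.1f} mrem/year, weighted SD = {math.sqrt(var):.1f} mrem/year")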

  11. A Standard Mammography Unit - Standard 3D Ultrasound Probe Fusion Prototype: First Results.

    Science.gov (United States)

    Schulz-Wendtland, Rüdiger; Jud, Sebastian M; Fasching, Peter A; Hartmann, Arndt; Radicke, Marcus; Rauh, Claudia; Uder, Michael; Wunderle, Marius; Gass, Paul; Langemann, Hanna; Beckmann, Matthias W; Emons, Julius

    2017-06-01

    The combination of different imaging modalities through the use of fusion devices promises significant diagnostic improvement for breast pathology. The aim of this study was to evaluate image quality and clinical feasibility of a prototype fusion device (fusion prototype) constructed from a standard tomosynthesis mammography unit and a standard 3D ultrasound probe using a new method of breast compression. Imaging was performed on 5 mastectomy specimens from patients with confirmed DCIS or invasive carcinoma (BI-RADS ™ 6). For the preclinical fusion prototype an ABVS system ultrasound probe from an Acuson S2000 was integrated into a MAMMOMAT Inspiration (both Siemens Healthcare Ltd) and, with the aid of a newly developed compression plate, digital mammogram and automated 3D ultrasound images were obtained. The quality of digital mammogram images produced by the fusion prototype was comparable to those produced using conventional compression. The newly developed compression plate did not influence the applied x-ray dose. The method was not more labour intensive or time-consuming than conventional mammography. From the technical perspective, fusion of the two modalities was achievable. In this study, using only a few mastectomy specimens, the fusion of an automated 3D ultrasound machine with a standard mammography unit delivered images of comparable quality to conventional mammography. The device allows simultaneous ultrasound - the second important imaging modality in complementary breast diagnostics - without increasing examination time or requiring additional staff.

  12. Treatment planning using tailored and standard cylindrical light diffusers for photodynamic therapy of the prostate

    International Nuclear Information System (INIS)

    Rendon, Augusto; Lilge, Lothar; Beck, J Christopher

    2008-01-01

    Interstitial photodynamic therapy (PDT) has seen a rebirth, partially prompted by the development of photosensitizers with longer absorption wavelengths that enable the treatment of larger tissue volumes. Here, we study whether using diffusers with customizable longitudinal emission profiles, rather than conventional ones with flat emission profiles, improves our ability to conform the light dose to the prostate. We present a modified Cimmino linear feasibility algorithm to solve the treatment planning problem, which improves upon previous algorithms by (1) correctly minimizing the cost function that penalizes deviations from the prescribed light dose, and (2) regularizing the inverse problem. Based on this algorithm, treatment plans were obtained under a variety of light delivery scenarios using 5-15 standard or tailored diffusers. The sensitivity of the resulting light dose distributions to uncertainties in the optical properties and the placement of diffusers was also studied. We find that tailored diffusers only marginally outperform conventional ones in terms of prostate coverage and rectal sparing. Furthermore, it is shown that small perturbations in optical properties can lead to large changes in the light dose distribution, but that those changes can be largely corrected with a simple light dose re-normalization. Finally, we find that prostate coverage is only minimally affected by small changes in diffuser placement. Our results suggest that prostate PDT is not likely to benefit from the use of tailored diffusers. Other locations with more complex geometries might see a greater improvement.
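
    For readers unfamiliar with Cimmino-type methods, the following Python sketch shows the basic simultaneous-projection iteration for a linear feasibility problem A @ x >= b with nonnegative unknowns (for example, diffuser source weights that must deliver at least the prescribed light dose in each voxel). It illustrates the generic iteration only, not the modified, regularized algorithm developed in the study above; the matrix, bounds and step size are made up.

        import numpy as np

        def cimmino(A, b, iters=500, lam=1.0):
            m, n = A.shape
            w = np.full(m, 1.0 / m)                      # equal constraint weights
            row_norm2 = np.sum(A * A, axis=1)
            x = np.zeros(n)
            for _ in range(iters):
                residual = b - A @ x                     # positive where a constraint is violated
                step = (w * np.maximum(residual, 0.0) / row_norm2) @ A
                x = np.maximum(x + lam * step, 0.0)      # simultaneous projection + nonnegativity
            return x

        rng = np.random.default_rng(0)
        A = rng.uniform(0.0, 1.0, size=(40, 6))          # dose per unit source strength (invented)
        b = np.full(40, 2.0)                             # prescribed minimum dose per voxel
        x = cimmino(A, b)
        print("source weights:", np.round(x, 2), "min dose:", round(float(np.min(A @ x)), 2))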

  13. The standards of Radiation Protection of IAEA

    International Nuclear Information System (INIS)

    Butragueno, J. L.

    2000-01-01

    Nuclear Safety and Radiation Protection are technological disciplines whose international character has been recognised from the very beginning. Safety culture and the defense-in-depth criterion likewise call for this international collaboration. The International Atomic Energy Agency, with headquarters in Vienna, is especially sensitive to this aspect, and a significant amount of resources has been dedicated to promoting closer international collaboration through two complementary programs: the Convention on Nuclear Safety and the Convention on Radioactive Waste Management, and the reconstruction of a great pyramid of standards that, starting with Fundamental Principles, is followed by a set of Basic Safety Standards and completed with Safety Requirements and additional technical information that provide practical ways to implement the Fundamental Principles. This article briefly describes the RASS Program of the IAEA (Radiation Safety Standards) and the work of the Technical Committees established to assist the Director General of the IAEA in this task. (Author)

  14. Safety implications of standardized continuous quality improvement programs in community pharmacy.

    Science.gov (United States)

    Boyle, Todd A; Ho, Certina; Mackinnon, Neil J; Mahaffey, Thomas; Taylor, Jeffrey M

    2013-06-01

    Standardized continuous quality improvement (CQI) programs combine Web-based technologies and standardized improvement processes, tools, and expectations to enable quality-related events (QREs) occurring in individual pharmacies to be shared with pharmacies in other jurisdictions. Because standardized CQI programs are still new to community pharmacy, little is known about how they impact medication safety. This research identifies key aspects of medication safety that change as a result of implementing a standardized CQI program. Fifty-three community pharmacies in Nova Scotia, Canada, adopted the SafetyNET-Rx standardized CQI program in April 2010. The Institute for Safe Medication Practices (ISMP) Canada's Medication Safety Self-Assessment (MSSA) survey was administered to these pharmacies before and 1 year into their use of the SafetyNET-Rx program. The nonparametric Wilcoxon signed-rank test was used to explore where changes in patient safety occurred as a result of SafetyNET-Rx use. Significant improvements occurred with quality processes and risk management, staff competence and education, and communication of drug orders and other information. Patient education, environmental factors, and the use of devices did not show statistically significant changes. As CQI programs are designed to share learning from QREs, it is reassuring to see that the largest improvements are related to quality processes, risk management, staff competence, and education.
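
    A minimal sketch of the paired test mentioned above (Python with SciPy); the before/after domain scores are invented, not SafetyNET-Rx data.

        from scipy.stats import wilcoxon

        before = [62, 55, 70, 58, 64, 49, 73, 60, 57, 66]   # hypothetical MSSA domain scores, baseline
        after  = [71, 63, 72, 66, 70, 58, 74, 69, 61, 75]   # same pharmacies after one year
        stat, p = wilcoxon(before, after)                   # paired, nonparametric
        print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")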

  15. Search for the Standard Model Higgs boson produced in the decay ...

    Indian Academy of Sciences (India)

    2012-10-06

    Oct 6, 2012 ... √s = 7 TeV. No evidence is found for a significant deviation from Standard Model expectations anywhere in the ZZ mass range considered in this analysis. An upper limit at 95% CL is placed on the product of the cross-section and decay branching ratio for the Higgs boson decaying with Standard Model-like ...

  16. Some consideration of Japanese standard man value

    International Nuclear Information System (INIS)

    Yoshizawa, Yasuo; Kusama, Tomoko

    1976-01-01

    Numerical values for the standard man or reference man are an important issue in the field of radiation protection and safety. The standard man values given by ICRP were obtained from European and North American adult data. For that reason, there are some theoretical problems in applying standard man values to the Japanese. The purpose of the present paper is to consider the differences between Japanese values and the standard man values. The standard man values are divided into three categories. The first category is the size and weight of the body or organs, the second is the values of elementary composition, and the third is the numerical factors related to metabolic kinetics. It is natural that some values of the second and the third categories differ little between Japanese and Europeans. On the other hand, there are some differences in the values of the first category, but these differences can be calculated by proportional allotment to body weight. The values concerning the thyroid gland and iodine metabolism are important for radiation protection. It has been foreseen that these values for the Japanese differ significantly from the standard man. A survey of past reports was carried out to search for normal values of the weight, iodine content, and iodine uptake rate of the thyroid in Japanese. The result of the survey showed that the weight of the thyroid is about 19 g for an adult male and 17 g for an adult female, that the iodine content is 12-22 mg, and that the iodine uptake rate (fw) is about 0.2. (auth.)

  17. FACTORS DESCRIBING STUDENTS´ PERCEPTION ON EDUCATION QUALITY STANDARDS

    Directory of Open Access Journals (Sweden)

    Lucie Vnoučková

    2017-12-01

    Full Text Available Education quality assurance is a necessity in today's competitive environment in university education. Quality assurance standards and strategies are used in most universities and higher education institutions. But the perception of quality standards is usually seen from the perspective of university management. This study aims to analyze and present students' perceptions of the measurement of education quality standards and to identify significant groups of students according to their preferences in education quality. Data were collected through student questionnaires and focus groups. Two-dimensional and multi-dimensional statistical methods were used to evaluate the results. The outputs show five groups of students based on their perception of education quality. Examination of students' interest in specific areas, subjects and courses leads to identification of factors which affect their preferences in education. The paper found five significant groups of quality as perceived by students: Quality receptionists, Business oriented, Expert innovators, Distance learners and Arrangement oriented. A limitation of the study is its narrow focus on one private university. This study may encourage other papers to develop and further test the impact of education quality on students' preferences for measurable improvements. The paper is an extension of the conference paper presented at the ERIE conference 2017.

  18. A suite of standards for radiation monitors and their revisions

    International Nuclear Information System (INIS)

    Noda, Kimio

    1991-01-01

    A suite of standards for radiation monitors applied in nuclear facilities in Japan was compiled mainly by health physicists at the Power Reactor and Nuclear Fuel Development Corporation (PNC) and the Japan Atomic Energy Research Institute (JAERI), and issued in 1971 as 'The Standard for Radiation Monitors'. PNC facilities such as the Reprocessing Plant and the Plutonium Fuel Fabrication Facility, as well as other nuclear industries, have applied the standard, and it has contributed to improving the practical maintainability and availability of the radiation monitors. Meanwhile, radiation monitors have progressed remarkably in their applications, and the scale of monitoring systems is growing. Furthermore, manufacturing techniques have progressed significantly, especially in the field of system concepts and electronics elements. This progress requires revision of the standards. 'The Standard for Radiation Monitors' has therefore been revised, considering the problems in practical application and data processing capability. Consideration was given to keeping old and new modules compatible. (author)

  19. Yoga & Cancer Interventions: A Review of the Clinical Significance of Patient Reported Outcomes for Cancer Survivors

    Directory of Open Access Journals (Sweden)

    S. Nicole Culos-Reed

    2012-01-01

    Full Text Available Limited research suggests yoga may be a viable gentle physical activity option with a variety of health-related quality of life, psychosocial and symptom management benefits. The purpose of this review was to determine the clinical significance of patient-reported outcomes from yoga interventions conducted with cancer survivors. A total of 25 published yoga intervention studies for cancer survivors from 2004–2011 had patient-reported outcomes, including quality of life, psychosocial or symptom measures. Thirteen of these studies met the necessary criteria to assess clinical significance. Clinical significance for each of the outcomes of interest was examined based on 1 standard error of the measurement, 0.5 standard deviation, and relative comparative effect sizes and their respective confidence intervals. This review describes in detail these patient-reported outcomes, how they were obtained, their relative clinical significance and implications for both clinical and research settings. Overall, clinically significant changes in patient-reported outcomes suggest that yoga interventions hold promise for improving cancer survivors' well-being. This research overview provides new directions for examining how clinical significance can provide a unique context for describing changes in patient-reported outcomes from yoga interventions. Researchers are encouraged to employ indices of clinical significance in the interpretation and discussion of results from yoga studies.
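
    The two distribution-based thresholds mentioned above (1 standard error of measurement and 0.5 standard deviation) can be sketched as follows; the scale SD, reliability and observed change are invented numbers, assuming the usual formula SEM = SD * sqrt(1 - reliability).

        baseline_sd = 12.0                  # SD of the quality-of-life scale at baseline (assumed)
        reliability = 0.85                  # test-retest reliability of the instrument (assumed)
        observed_mean_change = 6.5          # mean change after the yoga intervention (assumed)

        sem_threshold = baseline_sd * (1.0 - reliability) ** 0.5    # 1 SEM
        half_sd_threshold = 0.5 * baseline_sd                       # 0.5 SD
        print(f"1 SEM threshold  = {sem_threshold:.2f}")
        print(f"0.5 SD threshold = {half_sd_threshold:.2f}")
        print("clinically significant by both criteria:",
              observed_mean_change >= max(sem_threshold, half_sd_threshold))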

  20. Forecasting Tehran stock exchange volatility; Markov switching GARCH approach

    Science.gov (United States)

    Abounoori, Esmaiel; Elmi, Zahra (Mila); Nademi, Younes

    2016-03-01

    This paper evaluates several GARCH models regarding their ability to forecast volatility in the Tehran Stock Exchange (TSE). These include GARCH models with both Gaussian and fat-tailed residual conditional distributions, concerning their ability to describe and forecast volatility from a 1-day to a 22-day horizon. Results indicate that the AR(2)-MRSGARCH-GED model outperforms other models at the one-day horizon. The AR(2)-MRSGARCH-GED as well as the AR(2)-MRSGARCH-t models also outperform other models at the 5-day horizon. At the 10-day horizon, the three AR(2)-MRSGARCH models outperform the other models. Concerning the 22-day forecast horizon, results indicate no difference between MRSGARCH models and standard GARCH models. Regarding risk-management out-of-sample evaluation (95% VaR), a few models seem to provide reasonable and accurate VaR estimates at the 1-day horizon, with a coverage rate close to the nominal level. According to the risk-management loss functions, there is not a uniformly most accurate model.
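
    As a baseline reference for the regime-switching models compared above, the following Python sketch shows the plain GARCH(1,1) variance recursion and its multi-step variance forecast. The parameter values and return series are illustrative only, not estimates from TSE data, and no regime switching is included.

        import numpy as np

        def garch11_forecast(returns, omega, alpha, beta, horizon):
            """Filter conditional variances, then forecast variance 1..horizon steps ahead."""
            var = np.empty(len(returns))
            var[0] = np.var(returns)                      # initialize at the sample variance
            for t in range(1, len(returns)):
                var[t] = omega + alpha * returns[t - 1] ** 2 + beta * var[t - 1]
            uncond = omega / (1.0 - alpha - beta)         # unconditional variance
            h1 = omega + alpha * returns[-1] ** 2 + beta * var[-1]
            fc = [uncond + (alpha + beta) ** (k - 1) * (h1 - uncond) for k in range(1, horizon + 1)]
            return var, np.array(fc)

        rng = np.random.default_rng(1)
        returns = rng.standard_normal(500) * 0.01          # placeholder daily returns
        _, fc = garch11_forecast(returns, omega=1e-6, alpha=0.08, beta=0.90, horizon=22)
        print("volatility forecasts (1, 5, 10, 22 days):", np.sqrt(fc[[0, 4, 9, 21]]).round(4))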

  1. Noise robust automatic speech recognition with adaptive quantile based noise estimation and speech band emphasizing filter bank

    DEFF Research Database (Denmark)

    Bonde, Casper Stork; Graversen, Carina; Gregersen, Andreas Gregers

    2005-01-01

    and standard MFCC. AQBNE also outperforms the Aurora Baseline for the Medium Mismatch (MM) and Well Matched (WM) conditions. Though for all three conditions, the Aurora Advanced Frontend achieves superior performance, the AQBNE is still a relevant method to consider for small foot print applications....

  2. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for the US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. The potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. The objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  3. From free energy to expected energy: Improving energy-based value function approximation in reinforcement learning.

    Science.gov (United States)

    Elfwing, Stefan; Uchibe, Eiji; Doya, Kenji

    2016-12-01

    Free-energy based reinforcement learning (FERL) was proposed for learning in high-dimensional state and action spaces. However, the FERL method only really works well with binary, or close to binary, state input, where the number of active states is fewer than the number of non-active states. In the FERL method, the value function is approximated by the negative free energy of a restricted Boltzmann machine (RBM). In our earlier study, we demonstrated that the performance and the robustness of the FERL method can be improved by scaling the free energy by a constant that is related to the size of the network. In this study, we propose that RBM function approximation can be further improved by approximating the value function by the negative expected energy (EERL), instead of the negative free energy, as well as being able to handle continuous state input. We validate our proposed method by demonstrating that EERL: (1) outperforms FERL, as well as standard neural network and linear function approximation, for three versions of a gridworld task with high-dimensional image state input; (2) achieves new state-of-the-art results in stochastic SZ-Tetris in both model-free and model-based learning settings; and (3) significantly outperforms FERL and standard neural network function approximation for a robot navigation task with raw and noisy RGB images as state input and a large number of actions. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
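
    The two RBM-based value approximations contrasted above can be written down compactly. The sketch below evaluates the negative free energy (the FERL approximation) and the negative expected energy (the EERL approximation) for one binary state-action vector; the weights and input are random placeholders, not a trained network, and the scaling constant from the earlier study is omitted.

        import numpy as np

        def neg_free_energy(v, W, b_vis, b_hid):
            """-F(v) = b_vis.v + sum_j log(1 + exp(b_hid_j + (W^T v)_j))"""
            pre = b_hid + W.T @ v
            return b_vis @ v + np.sum(np.logaddexp(0.0, pre))

        def neg_expected_energy(v, W, b_vis, b_hid):
            """-<E(v,h)>_{h|v} = b_vis.v + sum_j sigmoid(pre_j) * pre_j, with pre = b_hid + W^T v"""
            pre = b_hid + W.T @ v
            p_hid = 1.0 / (1.0 + np.exp(-pre))            # posterior activation of hidden units
            return b_vis @ v + np.sum(p_hid * pre)

        rng = np.random.default_rng(0)
        n_vis, n_hid = 20, 8
        W = rng.normal(0.0, 0.1, size=(n_vis, n_hid))
        b_vis, b_hid = rng.normal(0.0, 0.1, n_vis), rng.normal(0.0, 0.1, n_hid)
        v = rng.integers(0, 2, n_vis).astype(float)       # binary state-action input
        print("FERL value estimate:", round(neg_free_energy(v, W, b_vis, b_hid), 4))
        print("EERL value estimate:", round(neg_expected_energy(v, W, b_vis, b_hid), 4))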

  4. Setting the standard: The IAEA safety standards set the global reference

    International Nuclear Information System (INIS)

    Williams, L.

    2003-01-01

    For the IAEA, setting and promoting standards for nuclear radiation, waste, and transport safety have been priorities from the start, rooted in the Agency's 1957 Statute. Today, a corpus of international standards is in place that national regulators and industries in many countries are applying, and more are being encouraged and assisted to follow them. Considerable work is done to keep the safety standards updated and authoritative. They cover five main areas: the safety of nuclear facilities; radiation protection and safety of radiation sources; safe management of radioactive waste; safe transport of radioactive material; and thematic safety areas, such as emergency preparedness or legal infrastructures. Overall, the safety standards reflect an international consensus on what constitutes a high level of safety for protecting people and the environment. All IAEA Member States can nominate experts for the Agency's standards committees and provide comments on draft standards. Through this ongoing cycle of review and feedback, the standards are refined, updated, and extended where needed.

  5. An Inventory Controlled Supply Chain Model Based on Improved BP Neural Network

    Directory of Open Access Journals (Sweden)

    Wei He

    2013-01-01

    Full Text Available Inventory control is a key factor for reducing supply chain cost and increasing customer satisfaction. However, prediction of inventory level is a challenging task for managers. As one of the widely used techniques for inventory control, the standard BP neural network has problems such as a low convergence rate and poor prediction accuracy. Aiming at these problems, a new fast-convergent BP neural network model for predicting inventory level is developed in this paper. By adding an error offset, this paper derives a new chain propagation rule and a new weight formula. This paper also applies the improved BP neural network model to predict the inventory level of an automotive parts company. The results show that the improved algorithm not only significantly outperforms the standard algorithm but also outperforms some other improved BP algorithms in both convergence rate and prediction accuracy.

  6. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a ''standard model''. The ''standard model'' consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the ''standard model'' to determine if the requirements of ''non-standard'' architectures can be met. Several possible extensions to the ''standard model'' are suggested, including software as well as hardware architectural features.

  7. ATLAS Z Excess in Minimal Supersymmetric Standard Model

    International Nuclear Information System (INIS)

    Lu, Xiaochuan; Terada, Takahiro

    2015-06-01

    Recently the ATLAS collaboration reported a 3 sigma excess in the search for events containing a dilepton pair from a Z boson and large missing transverse energy. Although the excess is not yet sufficiently significant, it is quite tempting to explain it with a well-motivated model beyond the standard model. In this paper we study the possibility that the minimal supersymmetric standard model (MSSM) accounts for this excess. In particular, we focus on the MSSM spectrum where the sfermions are heavier than the gauginos and Higgsinos. We show that the excess can be explained by a reasonable MSSM mass spectrum.

  8. The Dynamics of Standardization

    DEFF Research Database (Denmark)

    Brunsson, Nils; Rasche, Andreas; Seidl, David

    2012-01-01

    This paper suggests that when the phenomenon of standards and standardization is examined from the perspective of organization studies, three aspects stand out: the standardization of organizations, standardization by organizations and standardization as (a form of) organization. Following a comp...

  9. Impacts of optimum cost effective energy efficiency standards

    International Nuclear Information System (INIS)

    Brancic, A.B.; Peters, J.S.; Arch, M.

    1991-01-01

    Building Codes are increasingly required to be responsive to social and economic policy concerns. In 1990 the State of Connecticut passed An Act Concerning Global Warming, Public Act 90-219, which mandates the revision of the state building code to require that buildings and building elements be designed to provide optimum cost-effective energy efficiency over the useful life of the building. Further, such revision must meet the American Society of Heating, Refrigerating and Air Conditioning Engineers (ASHRAE) Standard 90.1 - 1989. As the largest electric energy supplier in Connecticut, Northeast Utilities (NU) sponsored a pilot study of the cost effectiveness of alternative building code standards for commercial construction. This paper reports on this study, which analyzed design and construction means, building elements, incremental construction costs, and energy savings to determine the optimum cost-effective building code standard. Findings are that ASHRAE 90.1 results in 21% energy savings and that alternative standards above it result in significant additional savings. Benefit/cost analysis showed that both are cost effective.

  10. Proposal and Evaluation of Subordinate Standard Solar Irradiance Spectra: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wilbert, Stefan [German Aerospace Center (DLR); Jessen, Wilko [German Aerospace Center (DLR); Gueymard, Chris [Solar Consulting Services; Polo, Jesus [CIEMAT; Bian, Zeqiang [China Meteorological Administration; Driesse, Anton [Photovoltaic Performance Labs; Marzo, Aitor [University of Antofagasta; Armstrong, Peter [Masdar Institute of Science & Technology; Vignola, Frank [University of Oregon; Ramirez, Lourdes [CIEMAT

    2018-04-12

    This paper introduces a concept for global tilted irradiance (GTI) subordinate standard spectra to supplement the current standard spectra used in solar photovoltaic applications as defined in ASTM G173 and IEC 60904. The proposed subordinate standard spectra correspond to atmospheric conditions and tilt angles that depart significantly from the main standard spectrum, and they can be used to more accurately represent various local conditions. For the definition of subordinate standard spectra for cases with an elevation 1.5 km above sea level, the question arises whether the air mass should be calculated including a pressure correction or not. This study focuses on the impact of the air mass used in standard spectra, and it uses data from 29 locations to examine which air mass is most appropriate for GTI and direct normal irradiance (DNI) spectra. Overall, it is found that the pressure-corrected air mass of 1.5 is most appropriate for DNI spectra. For GTI, a non-pressure-corrected air mass of 1.5 was found to be more appropriate.
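
    The pressure correction at issue is a simple scaling of the relative (geometric) air mass by station pressure; the site pressure below is a made-up example, not data from the 29 stations.

        p0 = 1013.25        # sea-level standard pressure, hPa
        p_site = 850.0      # station pressure at an elevated site, hPa (assumed)
        am_rel = 1.5        # relative air mass of the standard spectrum
        am_abs = am_rel * (p_site / p0)      # pressure-corrected (absolute) air mass
        print(f"pressure-corrected air mass: {am_abs:.2f}")   # about 1.26 instead of 1.5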

  11. Radiation protection standards for the occupational workers and the public

    International Nuclear Information System (INIS)

    Minkin, S.C.; Dickson, R.L.; Halford, D.K.

    1987-01-01

    Federal Regulations concerning radiation protection standards have been undergoing significant changes within the last decade. In addition to these changes, a proliferation in the number of Federal radiation standards has also occurred. A tabulation of these regulations aids in the understanding of which current standards apply to the nuclear industry with respect to environmental contamination and exposure to workers, and the public. Furthermore, most of the current regulations, proposed revisions, and proposed new rulings fall into several major categories. A tabulation of these categories illustrates common public, occupational, and environmental needs for which the DOE, NRC, and EPA have developed their specific radiation standards. Finally, risk based systems for radiation protection have been proposed by the DOE, NRC, and EPA, although these agencies are not entirely consistent in the application of this methodology. 2 tables

  12. Work-related falls among union carpenters in Washington State before and after the Vertical Fall Arrest Standard.

    Science.gov (United States)

    Lipscomb, Hester J; Li, Leiming; Dement, John

    2003-08-01

    Washington State enacted a change in their fall standard for the construction industry in 1991, preceding the Safety Standard for Fall Protection in the Construction Industry promulgated by Federal OSHA in 1994. We evaluated changes in the rate of falls from elevations and measures of severity among a large cohort of union carpenters after the fall standard change in Washington State, taking into account the temporal trends in their overall injury rates. There was a significant decrease in the rate of falls from height after the standard went into effect, even after adjusting for the overall decrease in work-related injuries among this cohort. Much of the decrease was immediate, likely representing the publicity surrounding fatal falls and subsequent promulgation of the standard. The greatest decrease was seen between 3 and 3(1/2) years after the standard went into effect. There was a significant reduction in mean paid lost days per event after the standard change and there was a significant reduction in mean cost per fall when adjusting for age and the temporal trend for costs among non-fall injuries. Through the use of observational methods we have demonstrated significant effects of the Washington State Vertical Fall Arrest Standard among carpenters in the absence of a control or comparison group. Without controlling for the temporal trend in overall injury rates, the rate of decline in falls appeared significantly greater, but the more pronounced, but delayed, decline was not seen. The analyses demonstrate potential error in failing to account for temporal patterns or assuming that a decline after an intervention is related to the intervention. Copyright 2003 Wiley-Liss, Inc.

  13. Making standards work

    OpenAIRE

    Stigzelius, Ingrid

    2009-01-01

    Social and environmental standards can function as tools for companies that want to improve their conduct in social and environmental areas in the supply chain. However, relatively little attention has been given to how the adoption of social and environmental standards may influence the actual business practices in the supply chain. The overall aim of this thesis is to examine the institutional context surrounding the adoption of social and environmental standards and how these standards inf...

  14. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards are a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
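
    A minimal sketch of how the discount rate enters the LCC comparison: the present value of annual energy-cost savings is weighed against the increase in first cost. All numbers (savings, lifetime, costs, rate) are invented for illustration and are not drawn from a DOE rulemaking.

        annual_savings = 120.0            # $/year in energy cost at the higher efficiency level
        lifetime_years = 15
        incremental_first_cost = 900.0    # $ extra purchase/installation cost
        discount_rate = 0.055             # assumed commercial discount rate

        pv_savings = sum(annual_savings / (1.0 + discount_rate) ** t
                         for t in range(1, lifetime_years + 1))
        print(f"PV of savings = ${pv_savings:.0f}, net LCC impact = ${pv_savings - incremental_first_cost:+.0f}")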

  15. Final Technical Report: Hydrogen Codes and Standards Outreach

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.

  16. The European Stroke Organisation Guidelines: a standard operating procedure

    DEFF Research Database (Denmark)

    Ntaios, George; Bornstein, Natan M; Caso, Valeria

    2015-01-01

    pace with this progress and driven by the strong determination of the European Stroke Organisation to further promote stroke management, education, and research, the European Stroke Organisation decided to delineate a detailed standard operating procedure for its guidelines. There are two important...... cornerstones in this standard operating procedure: The first is the implementation of the Grading of Recommendations Assessment, Development, and Evaluation methodology for the development of its Guideline Documents. The second one is the decision of the European Stroke Organisation to move from the classical...... and significant input from European Stroke Organisation members as well as methodologists and analysts, this document presents the official standard operating procedure for the development of the Guideline Documents of the European Stroke Organisation....

  17. CONVERGENCE OF INTERNATIONAL AUDIT STANDARDS AND AMERICAN AUDIT STANDARDS REGARDING SAMPLING

    Directory of Open Access Journals (Sweden)

    Chis Anca Oana

    2013-07-01

    Full Text Available Abstract: Sampling is widely used in market research, scientific analysis, market analysis, opinion polls and, not least, in the financial statement audit. We wonder what sampling actually is and how it appeared. Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Nowadays the technique is indispensable, as economic entities operate with sophisticated computer systems and large amounts of data. Economic globalization and the complexity of capital markets have made possible not only the harmonization of international accounting standards with national ones, but also the convergence of international accounting and auditing standards with the American regulations. International Standard on Auditing 530 and Statement on Auditing Standards 39 are the two main international and American reference standards on audit sampling. This article discusses the origin of audit sampling, giving a brief history of the method and different definitions from the literature review. The two standards are studied using Jaccard indicators in terms of the degree of similarity and dissimilarity concerning different issues. The Jaccard coefficient measures the degree of convergence of the international auditing standard (ISA 530) and the U.S. auditing standard (SAS 39). International auditing standards and American auditing standards both study the sampling problem, and the two regulations present common points with regard to accepted sampling techniques, factors influencing the audit sample, treatment of identified misstatements and the circumstances in which sampling is appropriate. The study shows that both standards agree on the application of statistical and non-statistical sampling in auditing and that sampling is appropriate for tests of details and of controls, the factors affecting audit sampling being audit risk, audit objectives and the population's characteristics.
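
    A tiny illustration of the Jaccard coefficient used above: the number of shared items divided by the number of items covered by either standard. The issue lists are invented placeholders, not the actual content of ISA 530 or SAS 39.

        isa_530 = {"statistical sampling", "non-statistical sampling", "tests of details",
                   "tests of controls", "projection of misstatements", "sampling risk"}
        sas_39 = {"statistical sampling", "non-statistical sampling", "tests of details",
                  "tests of controls", "sampling risk", "tolerable misstatement"}
        jaccard = len(isa_530 & sas_39) / len(isa_530 | sas_39)
        print(f"Jaccard similarity = {jaccard:.2f}")    # 5 shared / 7 distinct = 0.71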

  18. Searching for the New World Monetary Standard

    Directory of Open Access Journals (Sweden)

    Ishkhanov Aleksandr Vladimirovich

    2014-11-01

    Full Text Available The article considers the influence of the existing world currency system on international financial relations and carries out a retrospective analysis of the four currency systems that have existed. The need for a change of the world currency order is justified. The concept of a new international currency standard based on dividing the functions of money between separate financial instruments of one currency is offered. The functional connections between the financial instruments are revealed. A comparison of the functions of money with the independent tools of the new world currency is carried out; it is supposed that the tools are in fact fully capable of carrying out all the functions of money. Therefore, the new international currency standard is based on dividing these functions between separate tools and can be defined as the polytool standard. A general functional chart of the polytool world currency standard is presented, including the functional connections between the reserve tool, the reverse tool and credit, as well as their characteristics, which should determine the activity of the world reserve system. Prerequisites for replacing the Jamaican currency system with the alternative are proved; the most promising way of transition to the polytool standard is revealed; and the additional functions of the polytool standard are designated: stimulating issuers of the leading world currencies to abandon the harmful policy of competitive devaluation, and stimulating the integration of countries and the creation of collective currencies (currency zones and associations), which will significantly increase the financial stability of the world economy.

  19. Standard Clock in primordial density perturbations and cosmic microwave background

    International Nuclear Information System (INIS)

    Chen, Xingang; Namjoo, Mohammad Hossein

    2014-01-01

    Standard Clocks in the primordial epoch leave a special type of features in the primordial perturbations, which can be used to directly measure the scale factor of the primordial universe as a function of time a(t), thus discriminating between inflation and alternatives. We have started to search for such signals in the Planck 2013 data using the key predictions of the Standard Clock. In this Letter, we summarize the key predictions of the Standard Clock and present an interesting candidate example in Planck 2013 data. Motivated by this candidate, we construct and compute full Standard Clock models and use the more complete prediction to make more extensive comparison with data. Although this candidate is not yet statistically significant, we use it to illustrate how Standard Clocks appear in Cosmic Microwave Background (CMB) and how they can be further tested by future data. We also use it to motivate more detailed theoretical model building

  20. Accurate determination of arsenic in arsenobetaine standard solutions of BCR-626 and NMIJ CRM 7901-a by neutron activation analysis coupled with internal standard method.

    Science.gov (United States)

    Miura, Tsutomu; Chiba, Koichi; Kuroiwa, Takayoshi; Narukawa, Tomohiro; Hioki, Akiharu; Matsue, Hideaki

    2010-09-15

    Neutron activation analysis (NAA) coupled with an internal standard method was applied for the determination of As in the certified reference materials (CRMs) of arsenobetaine (AB) standard solutions to verify their certified values. Gold was used as an internal standard to compensate for differences in neutron exposure within an irradiation capsule and to improve the sample-to-sample repeatability. Application of the internal standard method also significantly improved the linearity of the calibration curve up to 1 µg of As. The analytical reliability of the proposed method was evaluated by k(0)-standardization NAA. The analytical results for As in the AB standard solutions BCR-626 and NMIJ CRM 7901-a were (499 ± 55) mg kg⁻¹ (k = 2) and (10.16 ± 0.15) mg kg⁻¹ (k = 2), respectively. These values were found to be 15-20% higher than the certified values. The between-bottle variation of BCR-626 was much larger than the expanded uncertainty of the certified value, although that of NMIJ CRM 7901-a was almost negligible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
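
    The idea of ratio-based calibration with an internal standard can be sketched as follows: the As signal is divided by the Au signal to cancel differences in neutron exposure between samples, and a calibration line of ratio versus As mass is fitted. Counts and masses below are invented, not the measured BCR-626 or NMIJ CRM 7901-a data.

        import numpy as np

        as_mass_ug = np.array([0.10, 0.25, 0.50, 0.75, 1.00])     # As in calibration standards, µg
        as_counts  = np.array([1050, 2610, 5180, 7890, 10300])    # As peak counts (invented)
        au_counts  = np.array([9800, 9950, 9900, 10100, 10050])   # Au internal-standard counts (invented)

        ratio = as_counts / au_counts
        slope, intercept = np.polyfit(as_mass_ug, ratio, 1)       # linear calibration of ratio vs mass

        sample_ratio = 7300 / 10020                               # unknown sample (invented counts)
        print(f"As found in sample: {(sample_ratio - intercept) / slope:.3f} µg")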

  1. Rough Standard Neutrosophic Sets: An Application on Standard Neutrosophic Information Systems

    Directory of Open Access Journals (Sweden)

    Nguyen Xuan Thao

    2016-12-01

    Full Text Available A rough fuzzy set is the result of the approximation of a fuzzy set with respect to a crisp approximation space. It is a mathematical tool for knowledge discovery in fuzzy information systems. In this paper, we introduce the concepts of rough standard neutrosophic sets and the standard neutrosophic information system, and give some results on knowledge discovery in standard neutrosophic information systems based on rough standard neutrosophic sets.

  2. Improving cerebellar segmentation with statistical fusion

    Science.gov (United States)

    Plassard, Andrew J.; Yang, Zhen; Prince, Jerry L.; Claassen, Daniel O.; Landman, Bennett A.

    2016-03-01

    The cerebellum is a somatotopically organized central component of the central nervous system well known to be involved in motor coordination and with increasingly recognized roles in cognition and planning. Recent work in multiatlas labeling has created methods that offer the potential for fully automated 3-D parcellation of the cerebellar lobules and vermis (which are organizationally equivalent to cortical gray matter areas). This work explores the trade-offs of using different statistical fusion techniques and post hoc optimizations in two datasets with distinct imaging protocols. We offer a novel fusion technique by extending the ideas of the Selective and Iterative Method for Performance Level Estimation (SIMPLE) to a patch-based performance model. We demonstrate the effectiveness of our algorithm, Non-Local SIMPLE, for segmentation of a mixed population of healthy subjects and patients with severe cerebellar anatomy. Under the first imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard segmentation techniques. In the second imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard techniques but is outperformed by a non-locally weighted vote with the deeper population of atlases available. This work advances the state of the art in open source cerebellar segmentation algorithms and offers the opportunity for routinely including cerebellar segmentation in magnetic resonance imaging studies that acquire whole brain T1-weighted volumes with approximately 1 mm isotropic resolution.
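
    For context, the baseline fusion step that performance-model methods such as SIMPLE build on can be sketched as per-voxel voting over registered atlas label maps; the sketch below shows plain majority voting and a globally weighted vote, not the SIMPLE or Non-Local SIMPLE performance models themselves. The label maps and weights are toy placeholders.

        import numpy as np

        def majority_vote(atlas_labels):
            """atlas_labels: (n_atlases, n_voxels) integer label maps -> fused (n_voxels,) labels."""
            n_labels = atlas_labels.max() + 1
            votes = np.zeros((n_labels, atlas_labels.shape[1]))
            for lab in range(n_labels):
                votes[lab] = (atlas_labels == lab).sum(axis=0)
            return votes.argmax(axis=0)

        def weighted_vote(atlas_labels, weights):
            """Each atlas contributes according to a pre-computed performance/similarity weight."""
            n_labels = atlas_labels.max() + 1
            votes = np.zeros((n_labels, atlas_labels.shape[1]))
            for lab in range(n_labels):
                votes[lab] = (weights[:, None] * (atlas_labels == lab)).sum(axis=0)
            return votes.argmax(axis=0)

        rng = np.random.default_rng(0)
        atlases = rng.integers(0, 3, size=(5, 12))        # 5 atlases, 12 voxels, 3 labels (toy)
        weights = np.array([0.9, 0.8, 0.4, 0.7, 0.3])     # e.g. estimated atlas performance
        print("majority :", majority_vote(atlases))
        print("weighted :", weighted_vote(atlases, weights))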

  3. [Study on standardization of cupping technique: elucidation on the establishment of the National Standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping].

    Science.gov (United States)

    Gao, Shu-zhong; Liu, Bing

    2010-02-01

    From the aspects of rationale, technique descriptions, core contents, problems and solutions, and the thinking behind the standard-setting process, this paper describes experiences in the establishment of the national standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping, focusing on the methodologies used in setting the cupping standard, the method selection and operating instructions for cupping standardization, and the characteristics of standards in traditional Chinese medicine. In addition, this paper states the scope of application and precautions for this cupping standardization, and explains tentative ideas for research on the standardized manipulation of acupuncture and moxibustion.

  4. Evaluating Varied Label Designs for Use with Medical Devices: Optimized Labels Outperform Existing Labels in the Correct Selection of Devices and Time to Select.

    Directory of Open Access Journals (Sweden)

    Laura Bix

    Full Text Available Effective standardization of medical device labels requires objective study of varied designs, yet insufficient empirical evidence exists regarding how practitioners utilize and view labeling. The objective was to measure the effect of graphic elements (boxing information, grouping information, symbol use and color coding) in order to optimize a label for comparison with those typical of commercial medical devices. Participants viewed 54 trials on a computer screen. Each trial comprised two labels that were identical with regard to graphics but differed in one piece of information (e.g., one indicated latex, the other did not). Participants were instructed to select the label meeting a given criterion (e.g., latex containing) as quickly as possible. Dependent variables were binary (correct selection) and continuous (time to correct selection). Eighty-nine healthcare professionals were recruited at Association of Surgical Technologists (AST) conferences and through a targeted e-mail to AST members. Symbol presence, color coding and grouping of critical pieces of information all significantly improved selection rates and sped time to correct selection (α = 0.05). Conversely, when critical information was graphically boxed, the probability of correct selection and time to selection were impaired (α = 0.05). Subsequently, responses from trials containing the optimal treatments (color coded, critical information grouped with symbols) were compared with two labels created based on a review of those commercially available. Optimal labels yielded a significantly higher probability of correct choice (P<0.0001; least squares mean [UCL, LCL]: 97.3% [98.4%, 95.5%]) than the two labels based on commercial designs (92.0% [94.7%, 87.9%] and 89.8% [93.0%, 85.3%]), as well as faster time to selection. Our study provides data regarding design factors, namely color coding, symbol use and grouping of critical information, that can be used to significantly enhance the performance of medical device labels.

  5. Differences in Faculty and Standardized Patient Scores on Professionalism for Second-Year Podiatric Medical Students During a Standardized Simulated Patient Encounter.

    Science.gov (United States)

    Mahoney, James M; Vardaxis, Vassilios; Anwar, Noreen; Hagenbucher, Jacob

    2018-03-01

    This study examined the differences between faculty and trained standardized patient (SP) evaluations on student professionalism during a second-year podiatric medicine standardized simulated patient encounter. Forty-nine second-year podiatric medicine students were evaluated for their professionalism behavior. Eleven SPs performed an assessment in real-time, and one faculty member performed a secondary assessment after observing a videotape of the encounter. Five domains were chosen for evaluation from a validated professionalism assessment tool. Significant differences were identified in the professionalism domains of "build a relationship" (P = .008), "gather information" (P = .001), and "share information" (P = .002), where the faculty scored the students higher than the SP for 24.5%, 18.9%, and 26.5% of the cases, respectively. In addition, the faculty scores were higher than the SP scores in all of the "gather information" subdomains; however, the difference in scores was significant only in the "question appropriately" (P = .001) and "listen and clarify" (P = .003) subdomains. This study showed that professionalism scores for second-year podiatric medical students during a simulated patient encounter varied significantly between faculty and SPs. Further consideration needs to be given to determine the source of these differences.

  6. Designing an evaluation framework for WFME basic standards for medical education.

    Science.gov (United States)

    Tackett, Sean; Grant, Janet; Mmari, Kristin

    2016-01-01

    To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.

  7. PET/CT detectability and classification of simulated pulmonary lesions using an SUV correction scheme

    Science.gov (United States)

    Morrow, Andrew N.; Matthews, Kenneth L., II; Bujenovic, Steven

    2008-03-01

    Positron emission tomography (PET) and computed tomography (CT) together are a powerful diagnostic tool, but imperfect image quality allows false positive and false negative diagnoses to be made by any observer despite experience and training. This work investigates PET acquisition mode, reconstruction method and a standard uptake value (SUV) correction scheme on the classification of lesions as benign or malignant in PET/CT images, in an anthropomorphic phantom. The scheme accounts for partial volume effect (PVE) and PET resolution. The observer draws a region of interest (ROI) around the lesion using the CT dataset. A simulated homogenous PET lesion of the same shape as the drawn ROI is blurred with the point spread function (PSF) of the PET scanner to estimate the PVE, providing a scaling factor to produce a corrected SUV. Computer simulations showed that the accuracy of the corrected PET values depends on variations in the CT-drawn boundary and the position of the lesion with respect to the PET image matrix, especially for smaller lesions. Correction accuracy was affected slightly by mismatch of the simulation PSF and the actual scanner PSF. The receiver operating characteristic (ROC) study resulted in several observations. Using observer drawn ROIs, scaled tumor-background ratios (TBRs) more accurately represented actual TBRs than unscaled TBRs. For the PET images, 3D OSEM outperformed 2D OSEM, 3D OSEM outperformed 3D FBP, and 2D OSEM outperformed 2D FBP. The correction scheme significantly increased sensitivity and slightly increased accuracy for all acquisition and reconstruction modes at the cost of a small decrease in specificity.
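
    A minimal sketch of the correction idea described above, under simplifying assumptions: a homogeneous lesion with the CT-drawn shape is blurred by a Gaussian approximation of the PET point spread function, and the resulting recovery coefficient rescales the measured SUV. The spherical ROI, voxel size, and PSF width below are invented for illustration and are not the phantom or scanner parameters used in the study.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def recovery_coefficient(roi_mask, psf_sigma_vox):
            ideal = roi_mask.astype(float)                 # unit-activity lesion with the CT-drawn shape
            blurred = gaussian_filter(ideal, psf_sigma_vox)  # simulate the PET PSF
            return blurred[roi_mask].mean()                # fraction of activity recovered inside the ROI

        def corrected_suv(measured_mean_suv, roi_mask, psf_sigma_vox):
            return measured_mean_suv / recovery_coefficient(roi_mask, psf_sigma_vox)

        # toy example: 10 mm sphere on a 2 mm grid, PSF FWHM ~7 mm -> sigma ~1.5 voxels
        z, y, x = np.ogrid[-16:16, -16:16, -16:16]
        sphere = (z**2 + y**2 + x**2) <= 5**2
        print(corrected_suv(2.4, sphere, psf_sigma_vox=1.5))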

  8. Do semantic standards lack quality? A survey among 34 semantic standards

    NARCIS (Netherlands)

    Folmer, E.J.A.; Oude Luttighuis, P.H.W.M.; Hillegersberg, J. van

    2011-01-01

    The adoption of standards to improve interoperability in the automotive, aerospace, shipbuilding and other sectors could save billions. While interoperability standards have been created for a number of industries, problems persist, suggesting a lack of quality of the standards themselves. The issue

  9. Do semantic standards lack quality? A survey among 34 semantic standards.

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; Oude Luttighuis, Paul; van Hillegersberg, Jos

    2011-01-01

    The adoption of standards to improve interoperability in the automotive, aerospace, shipbuilding and other sectors could save billions. While interoperability standards have been created for a number of industries, problems persist, suggesting a lack of quality of the standards themselves. The issue

  10. Requirements of quality standards

    International Nuclear Information System (INIS)

    Mueller, J.

    1977-01-01

    The lecture traces the development of nuclear standards, codes, and Federal regulations on quality assurance (QA) for nuclear power plants and associated facilities. The technical evolution of the last twelve years, especially in the area of nuclear technology, led to different activities and regulatory initiatives, and the present result is that several nations have their own homemade standards. The lecture discusses the former and especially the current activities in standard development, and gives a description of the requirements of QA standards used in the USA and Europe, especially Western Germany. Furthermore, the lecture attempts to give a comparison and an evaluation of the international quality standards from the author's viewpoint. Finally, the lecture presents an outlook on the future international implications of QA standards. There is an urgent need within the nuclear industry for simplification and standardization of QA standards. The relationship between the various standards and the applicability of the standards need clarification and better transparency. To point out these problems is the purpose of the lecture. (orig.) [de

  11. 78 FR 35559 - Updating OSHA Standards Based on National Consensus Standards; Signage

    Science.gov (United States)

    2013-06-13

    ...; Signage AGENCY: Occupational Safety and Health Administration (OSHA), Department of Labor. ACTION: Direct... signage standards by adding references to the latest versions of the American National Standards Institute... earlier ANSI standards, ANSI Z53.1-1967, Z35.1-1968 and Z35.2-1968, in its signage standards, thereby...

  12. Are Disposable and Standard Gonioscopy Lenses Comparable?

    Science.gov (United States)

    Lee, Bonny; Szirth, Bernard C; Fechtner, Robert D; Khouri, Albert S

    2017-04-01

    Gonioscopy is important in the evaluation and treatment of glaucoma. With increased scrutiny of acceptable sterilization processes for health care instruments, disposable gonioscopy lenses have recently been introduced. Single-time use lenses are theorized to decrease infection risk and eliminate the issue of wear and tear seen on standard, reusable lenses. However, patient care would be compromised if the quality of images produced by the disposable lens were inferior to those produced by the reusable lens. The purpose of this study was to compare the quality of images produced by disposable versus standard gonioscopy lenses. A disposable single mirror lens (Sensor Medical Technology) and a standard Volk G-1 gonioscopy lens were used to image 21 volunteers who were prospectively recruited for the study. Images of the inferior and temporal angles of each subject's left eye were acquired using a slit-lamp camera through the disposable and standard gonioscopy lens. In total, 74 images were graded using the Spaeth gonioscopic system and for clarity and quality. Clarity was scored as 1 or 2 and defined as either (1) all structures perceived or (2) all structures not perceived. Quality was scored as 1, 2, or 3, and defined as (1) all angle landmarks clear and well focused, (2) some angle landmarks clear, others blurred, or (3) angle landmarks could not be ascertained. The 74 images were divided into images taken with the disposable single mirror lens and images taken with the standard Volk G-1 gonioscopy lens. The clarity and quality scores for each of these 2 image groups were averaged and P-values were calculated. Average quality of images produced with the standard lens was 1.46±0.56 compared with 1.54±0.61 for those produced with the disposable lens (P=0.55). Average clarity of images produced with the standard lens was 1.47±0.51 compared with 1.49±0.51 (P=0.90) with the disposable lens. We conclude that there is no significant difference in quality of images

  13. Wavelength selection method with standard deviation: application to pulse oximetry.

    Science.gov (United States)

    Vazquez-Jaccaud, Camille; Paez, Gonzalo; Strojnik, Marija

    2011-07-01

    Near-infrared spectroscopy provides useful biological information after the radiation has penetrated the tissue, within the therapeutic window. One of the significant shortcomings of current applications of spectroscopic techniques to a live subject is that the subject may be uncooperative and the sample undergoes significant temporal variations due to the subject's health status, which, from a radiometric point of view, introduce measurement noise. We describe a novel wavelength selection method for monitoring, based on a standard deviation map, that offers low sensitivity to noise. It may be used with spectral transillumination, transmission, or reflection signals, including those corrupted by noise and unavoidable temporal effects. We apply it to the selection of two wavelengths for the case of pulse oximetry. Using spectroscopic data, we generate a map of standard deviation that we propose as a figure of merit in the presence of the noise introduced by the living subject. Even in the presence of diverse sources of noise, we identify four wavelength domains whose standard deviation is minimally sensitive to temporal noise, and two wavelength domains with low sensitivity to temporal noise.
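
    The figure of merit described above can be sketched in a few lines: the standard deviation of repeated spectra over time flags the wavelengths that are least sensitive to temporal (subject-induced) noise. The synthetic spectra, the wavelength-dependent noise model, and the "pick the two quietest wavelengths" rule below are illustrative assumptions, not the authors' data or selection criteria.

        import numpy as np

        rng = np.random.default_rng(1)
        wavelengths = np.linspace(600, 1000, 401)                      # nm
        noise_amp = 0.02 + 0.08 * np.abs(np.sin(wavelengths / 80.0))   # wavelength-dependent temporal noise
        spectra = (np.sin(wavelengths / 50.0)                          # 200 repeated measurements
                   + noise_amp * rng.standard_normal((200, 401)))

        std_map = spectra.std(axis=0)            # standard deviation per wavelength over time
        stable = np.argsort(std_map)[:2]         # two wavelengths least sensitive to temporal noise
        print(wavelengths[stable], std_map[stable])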

  14. 75 FR 72664 - System Personnel Training Reliability Standards

    Science.gov (United States)

    2010-11-26

    ...--Staffing). \\2\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693, 72 FR 16416 (Apr... on the North American bulk electric system are competent to perform those reliability-related tasks... PER-004-2 will achieve a significant improvement in the reliability of the Bulk- Power System and...

  15. Discovery of Highly Potent Tyrosinase Inhibitor, T1, with Significant Anti-Melanogenesis Ability by zebrafish in vivo Assay and Computational Molecular Modeling

    Science.gov (United States)

    Chen, Wang-Chuan; Tseng, Tien-Sheng; Hsiao, Nai-Wan; Lin, Yun-Lian; Wen, Zhi-Hong; Tsai, Chin-Chuan; Lee, Yu-Ching; Lin, Hui-Hsiung; Tsai, Keng-Chang

    2015-01-01

    Tyrosinase is involved in melanin biosynthesis, and the abnormal accumulation of melanin pigments leads to hyperpigmentation disorders that can be treated with depigmenting agents. A natural product, T1, bis(4-hydroxybenzyl)sulfide, isolated from the Chinese herbal plant Gastrodia elata, is a strong competitive inhibitor of mushroom tyrosinase (IC50 = 0.53 μM, Ki = 58 +/- 6 nM), outperforming kojic acid. Cell viability and melanin quantification assays demonstrate that 50 μM of T1 attenuates the melanin content of normal human melanocytes by about 20% without significant cell toxicity. Moreover, a zebrafish in vivo assay reveals that T1 effectively reduces melanogenesis with no adverse side effects. An acute oral toxicity study confirms that the T1 molecule is free of discernible toxicity in mice. Furthermore, molecular modeling demonstrates that coordination of the sulfur atom of T1 with the copper ions in the active site of tyrosinase is essential for mushroom tyrosinase inhibition and for the ability to diminish human melanin synthesis. These results indicate that T1 isolated from Gastrodia elata is a promising candidate for developing potent pharmacological and cosmetic skin-whitening agents.

  16. 29 CFR 500.132 - Applicable Federal standards: ETA and OSHA housing standards.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Applicable Federal standards: ETA and OSHA housing... Migrant Workers Housing Safety and Health § 500.132 Applicable Federal standards: ETA and OSHA housing... § 500.131, all migrant housing is subject to either the ETA standards or the OSHA standards, as follows...

  17. Increasing the statistical significance of entanglement detection in experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)

    2010-07-01

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.
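
    The central point above is that, once a Gaussian error model is assumed, the relevant figure is the violation measured in standard deviations, so an inequality with a smaller raw violation but a smaller error can be the statistically stronger test. The toy comparison below illustrates only this arithmetic; the values and bounds are invented and are not the experimental results.

        def significance(measured, classical_bound, std_error):
            # violation expressed in standard deviations under a Gaussian error model
            return (measured - classical_bound) / std_error

        # hypothetical four-qubit results for two Bell-type inequalities
        print(significance(measured=2.65, classical_bound=2.0, std_error=0.05))   # ~13 sigma
        print(significance(measured=3.10, classical_bound=2.0, std_error=0.12))   # ~9 sigma, despite larger violation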

  18. PLACE AND SIGNIFICANCE OF LATINISMS IN THE SLOVAK VOCABULARY

    Directory of Open Access Journals (Sweden)

    Katarína Karabová

    2014-09-01

    Full Text Available Vocabulary of any language undergoes a natural evolution. In many cases this centuries-long process is related to several factors, including the penetration of new words into the language lexis. Similarly, the historical development of the Slovak language and its enhancement can be observed by examining the adoption of words from other languages. At a time when Latin was the only official language as well as the language of scholars and religious institutions in the Hungarian Kingdom, the penetration of Latinisms into the lexis of the old Slovak was significant. This trend was still evident in the 18th and 19th centuries marked by the beginning revivalist efforts. Domestication of adopted words - that initially stood at the edge of the language standard - was significantly influenced by innovative trends and technologies. The study does not primarily examine penetration of foreign words from modern languages, but it aims to analyse the process of naturalisation of Latinisms in Slovak and their use at different language levels.

  19. Association of State Access Standards With Accessibility to Specialists for Medicaid Managed Care Enrollees.

    Science.gov (United States)

    Ndumele, Chima D; Cohen, Michael S; Cleary, Paul D

    2017-10-01

    Medicaid recipients have consistently reported less timely access to specialists than patients with other types of coverage. By 2018, state Medicaid agencies will be required by the Center for Medicare and Medicaid Services (CMS) to enact time and distance standards for managed care organizations to ensure an adequate supply of specialist physicians for enrollees; however, there have been no published studies of whether these policies have significant effects on access to specialty care. To compare ratings of access to specialists for adult Medicaid and commercial enrollees before and after the implementation of specialty access standards, we used Consumer Assessment of Healthcare Providers and Systems survey data to conduct a quasi-experimental difference-in-differences (DID) analysis of 20 163 nonelderly adult Medicaid managed care (MMC) enrollees and 54 465 commercially insured enrollees in 5 states adopting access standards, and 37 290 MMC enrollees in 5 matched states that previously adopted access standards. The outcome was reported access to specialty care in the previous 6 months. Seven thousand six hundred ninety-eight (69%) Medicaid enrollees and 28 423 (75%) commercial enrollees reported that it was always or usually easy to get an appointment with a specialist before the policy implementation (or at baseline), compared with 11 889 (67%) of Medicaid enrollees in states that had previously implemented access standards. Overall, there was no significant improvement in timely access to specialty services for MMC enrollees in the period following implementation of the standard(s) (adjusted difference-in-differences, -1.2 percentage points; 95% CI, -2.7 to 0.1), nor was there any impact of access standards on insurance-based disparities in access (0.6 percentage points; 95% CI, -4.3 to 5.4). There was heterogeneity across states, with 1 state that implemented both time and distance standards demonstrating significant improvements in access and reductions in disparities.
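
    The core contrast in a difference-in-differences design like this can be written down in a few lines: the change in the outcome for the adopting group minus the change for the comparison group. The group means below are invented (chosen only so the arithmetic reproduces the reported -1.2 percentage-point estimate), and the column names are illustrative assumptions, not the study's data.

        import pandas as pd

        df = pd.DataFrame({
            "adopting_state": [1, 1, 0, 0],
            "post_policy":    [0, 1, 0, 1],
            "pct_easy_access": [69.0, 68.5, 67.0, 67.7],   # invented illustrative group means
        })

        means = df.set_index(["adopting_state", "post_policy"])["pct_easy_access"]
        did = (means.loc[(1, 1)] - means.loc[(1, 0)]) - (means.loc[(0, 1)] - means.loc[(0, 0)])
        print(f"difference-in-differences estimate: {did:.1f} percentage points")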

  20. National and International Standardization (International Organization for Standardization and European Committee for Standardization Relevant for Sustainability in Construction

    Directory of Open Access Journals (Sweden)

    Renata Morbiducci

    2010-12-01

    Full Text Available Sustainability in construction has a short history in terms of principles, standardizations and applications. Starting with the Brundtland Report “Our Common Future”, a new awareness of resource deficits, climate impacts and social responsibility gave rise to the idea of sustainability in design and construction as well. Consequently, around 2000, the international and national organizations for standardization started to develop standards for the application of sustainable principles. This paper gives an overview of existing and planned standards, and examples of how to use them as a framework for the development of methods and tools for assessment.

  1. Search for the standard model Higgs boson produced in association with a W or a Z boson and decaying to bottom quarks

    Energy Technology Data Exchange (ETDEWEB)

    Chatrchyan, Serguei; et al.,

    2014-01-21

    A search for the standard model Higgs boson (H) decaying to b b-bar when produced in association with a weak vector boson (V) is reported for the following channels: W(mu nu)H, W(e nu)H, W(tau nu)H, Z(mu mu)H, Z(e e)H, and Z(nu nu)H. The search is performed in data samples corresponding to integrated luminosities of up to 5.1 inverse femtobarns at sqrt(s) = 7 TeV and up to 18.9 inverse femtobarns at sqrt(s) = 8 TeV, recorded by the CMS experiment at the LHC. An excess of events is observed above the expected background with a local significance of 2.1 standard deviations for a Higgs boson mass of 125 GeV, consistent with the expectation from the production of the standard model Higgs boson. The signal strength corresponding to this excess, relative to that of the standard model Higgs boson, is 1.0 +/- 0.5.

  2. 47 CFR 90.379 - ASTM E2213-03 DSRC Standard (ASTM-DSRC Standard).

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false ASTM E2213-03 DSRC Standard (ASTM-DSRC Standard... Communications Service (dsrcs) § 90.379 ASTM E2213-03 DSRC Standard (ASTM-DSRC Standard). Roadside Units... incorporated by reference: American Society for Testing and Materials (ASTM) E2213-03, “Standard Specification...

  3. Significant characteristics of the new maize hybrid Rubin-7

    Directory of Open Access Journals (Sweden)

    Jeličić Zora

    2003-01-01

    Full Text Available The Rubin-7 maize hybrid belongs to the FAO 700 maturity group. It is characterized by high kernel yield potential, which was confirmed during investigations by the Committee for Species. During the three-year monitoring period, from 1999 to 2001, the average kernel yield was 9.412 t/ha, 5% above the ZP 704 standard; this difference was highly statistically significant. Resistance to disease was high: Ustilago maydis 0.49, Fusarium spp. 0.13, and Exserohilum turcicum 1.25. Tolerance against Ostrinia nubilalis is 3-33. All of the above parameters and the agreeable phenotype of this hybrid indicate the value of Rubin-7.

  4. Normative significance of transnationalism?

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2010-01-01

    publications such as the Danish cartoons. It is argued that, although some of the usual arguments about free speech only or mainly apply domestically, many also apply transnationally; that standard arguments for multicultural recognition are difficult to apply transnationally; and that requirements of respect...

  5. Assessing clinical significance of treatment outcomes using the DASS-21.

    Science.gov (United States)

    Ronk, Fiona R; Korman, James R; Hooke, Geoffrey R; Page, Andrew C

    2013-12-01

    Standard clinical significance classifications are based on movement between the "dysfunctional" and "functional" distributions; however, this dichotomy ignores heterogeneity within the "dysfunctional" population. Based on the methodology described by Tingey, Lambert, Burlingame, and Hansen (1996), the present study sought to present a 3-distribution clinical significance model for the 21-item version of the Depression Anxiety Stress Scales (DASS-21; P. F. Lovibond & Lovibond, 1995) using data from a normative sample (n = 2,914), an outpatient sample (n = 1,000), and an inpatient sample (n = 3,964). DASS-21 scores were collected at pre- and post-treatment for both clinical samples, and patients were classified into 1 of 5 categories based on whether they had made a reliable change and whether they had moved into a different functional range. Evidence supported the validity of the 3-distribution model for the DASS-21, since inpatients who were classified as making a clinically significant change showed lower symptom severity, higher perceived quality of life, and higher clinician-rated functioning than those who did not make a clinically significant change. Importantly, results suggest that the new category of recovering is an intermediate point between recovered and making no clinically significant change. Inpatients and outpatients have different treatment goals and therefore use of the concept of clinical significance needs to acknowledge differences in what constitutes a meaningful change. (c) 2013 APA, all rights reserved.
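
    For orientation, the sketch below implements the classic two-distribution Jacobson-Truax machinery that three-distribution models such as this one build on: a reliable change index plus a cutoff between the dysfunctional and functional distributions determines the clinical-significance category. The reliability, means, and standard deviations are illustrative assumptions, not DASS-21 norms, and the three-distribution extension (which adds a second cutoff and the intermediate "recovering" category) is not shown.

        import math

        def rci(pre, post, sd_pre, reliability):
            # reliable change index; > 1.96 indicates reliable improvement on a symptom scale
            se_diff = sd_pre * math.sqrt(2) * math.sqrt(1 - reliability)
            return (pre - post) / se_diff

        def cutoff_c(mean_dys, sd_dys, mean_fun, sd_fun):
            # weighted midpoint between the dysfunctional and functional distributions
            return (sd_fun * mean_dys + sd_dys * mean_fun) / (sd_dys + sd_fun)

        def classify(pre, post, sd_pre, reliability, c):
            reliable = rci(pre, post, sd_pre, reliability) > 1.96
            crossed = pre > c >= post          # moved from the dysfunctional to the functional range
            if reliable and crossed:
                return "clinically significant change"
            if reliable:
                return "reliable improvement only"
            return "no reliable change"

        c = cutoff_c(mean_dys=25, sd_dys=10, mean_fun=8, sd_fun=6)
        print(classify(pre=28, post=7, sd_pre=10, reliability=0.9, c=c))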

  6. Standardization in smart grids. Introduction to IT-related methodologies, architectures and standards

    Energy Technology Data Exchange (ETDEWEB)

    Uslar, Mathias; Specht, Michael; Daenekas, Christian; Trefke, Joern; Rohjans, Sebastian; Gonzalez, Jose M.; Rosinger, Christine; Bleiker, Robert [OFFIS - Institut fuer Informatik, Oldenburg (Germany)

    2013-03-01

    Introduction to Standardization for Smart Grids. Presents a tutorial and best practice of Smart Grid Prototype Projects. Written by leading experts in the field. Besides the regulatory and market aspects, the technical level, dealing with knowledge from multiple disciplines and the aspects of technical system integration to achieve interoperability and integration, has been a strong focus in the Smart Grid. This topic is typically covered by means of (technical) standards for processes, data models, functions and communication links. Standardization is a key issue for Smart Grids due to the involvement of many different sectors along the value chain from generation to the appliances. The scope of the Smart Grid is broad; therefore, the standards landscape is unfortunately very large and complex. This is why the three European Standards Organizations ETSI, CEN and CENELEC created a so-called Joint Working Group (JWG). This was the first harmonized effort in Europe to bring together the needed disciplines and experts, delivering the final report in May 2011. After this approach proved useful, the Commission used Mandate M/490: Standardization Mandate to European Standardization Organizations (ESOs) to support European Smart Grid deployment. The focal point addressing the ESOs' response to M/490 will be the CEN, CENELEC and ETSI Smart Grids Coordination Group (SG-CG). Based on this mandate, meaningful standardization of architectures, use cases, communication technologies, data models and security standards takes place in the four existing working groups. This book provides an overview of the various building blocks and standards identified as the most prominent ones by the JWG report as well as by the first set of standards: IEC 61850 and CIM, IEC PAS 62559 for documenting Smart Grid use cases, security requirements from the SGIS groups, and an introduction on how to apply the Smart Grid Architecture Model SGAM for utilities. In addition

  7. Air kerma standardization for diagnostic radiology in a secondary standard laboratory

    International Nuclear Information System (INIS)

    Ramos, Manoel M.O.; Peixoto, J. Guilherme P.; Lopes, Ricardo T.

    2009-01-01

    The demand for calibration services and quality control in diagnostic radiology has grown in the country since the publication of governmental regulation 453, issued by the Brazilian Ministry of Health in 1998. At that time, to produce results that met the new legislation, many laboratories used different standards and radiation qualities, some of which could be inadequate. The international standards then available did not supply consistent radiation qualities or standardization for the different types of equipment in use. This situation changed with the publication of the new edition of the IEC 61267 standard, published in 2005. The objective of this work was to implement the standardization of air kerma for the unattenuated qualities (RQR) of IEC 61267 in the National Laboratory of Metrology of Ionizing Radiations (LNMRI) of the Institute of Radiation Protection and Dosimetry (IRD). Technical procedures were developed together with an uncertainty budget. Results of interlaboratory comparisons demonstrate that the quantity is standardized and internationally traceable. (author)

  8. In a niche of time: do specialty hospitals outperform general services hospitals?

    Science.gov (United States)

    Poole, LeJon; Davis, Jullet A; Gunby, Norris W

    2013-01-01

    Niche hospitals represent a growing segment in the health care industry. Niche facilities are primarily engaged in the treatment of cardiac or orthopedic conditions. The effectiveness of this strategy is of interest because niche hospitals focus on only the most profitable services. The purpose of this research was to assess the financial effectiveness of the niche strategy. We theorize that firm- and market-level factors, together with the strategy of the hospital (niche versus traditional), are associated with financial performance. This research used 2 data sources, the 2003 Medicare Cost Report and the 2003 Area Resource File. The sample was limited to for-profit, urban, nongovernmental hospitals (n = 995). The data were analyzed using hierarchical least squares regression. Financial performance was operationalized using the hospital's return on assets. The principal finding of this project is that niche hospitals had significantly higher performance than traditional facilities. From the organizational perspective, the niche strategy leads to better financial performance. From a societal perspective, the niche strategy provides increased focus and efficiencies through repetition. Despite the limited focus of this strategy, patients who can access these providers may experience better outcomes than patients in more traditional hospitals.

  9. Nuclear standardization development study

    International Nuclear Information System (INIS)

    Pan Jianjun

    2010-01-01

    The nuclear industry is an important part of national security and national economic development and a key area of new energy supported by the government. Nuclear standardization is an important driving force for nuclear industry development, a fundamental guarantee of safe nuclear production, and a valuable means of bringing China's nuclear industry technology to the world market. Nuclear standardization now faces a new development opportunity; it should implement strategies in standard system building, foreign standard research, company standard building, and the development of skilled personnel to meet the requirements of nuclear industry development. (author)

  10. Updating OSHA standards based on national consensus standards. Direct final rule.

    Science.gov (United States)

    2007-12-14

    In this direct final rule, the Agency is removing several references to consensus standards that have requirements that duplicate, or are comparable to, other OSHA rules; this action includes correcting a paragraph citation in one of these OSHA rules. The Agency also is removing a reference to American Welding Society standard A3.0-1969 ("Terms and Definitions") in its general-industry welding standards. This rulemaking is a continuation of OSHA's ongoing effort to update references to consensus and industry standards used throughout its rules.

  11. DCC DIFFUSE Standards Frameworks: A Standards Path through the Curation Lifecycle

    Directory of Open Access Journals (Sweden)

    Sarah Higgins

    2009-10-01

    Full Text Available DCC DIFFUSE Standards Frameworks aims to offer domain-specific advice on standards relevant to digital preservation and curation, to help curators identify which standards they should be using and where they can be appropriately implemented, to ensure authoritative digital material. The project uses the DCC Curation Lifecycle Model and Web 2.0 technology to visually present standards frameworks for a number of disciplines. The Digital Curation Centre (DCC) is actively working with different relevant organisations to present searchable frameworks of standards for a number of domains. These include digital repositories, records management, the geo-information sector, archives and the museum sector. Other domains, such as e-science, will shortly be investigated.

  12. An ecological method to understand agricultural standardization in peach orchard ecosystems.

    Science.gov (United States)

    Wan, Nian-Feng; Zhang, Ming-Yi; Jiang, Jie-Xian; Ji, Xiang-Yun; Hao-Zhang

    2016-02-22

    While the worldwide standardization of agricultural production has been advocated and recommended, relatively little research has focused on the ecological significance of such a shift. The ecological concerns stemming from the standardization of agricultural production may require new methodology. In this study, we concentrated on how ecological two-sidedness and ecological processes affect the standardization of agricultural production, which was divided into three phases (pre-, mid- and post-production), considering both the positive and negative effects of agricultural processes. We constructed evaluation indicator systems for the pre-, mid- and post-production phases, and we present a Standardization of Green Production Index (SGPI) based on the Full Permutation Polygon Synthetic Indicator (FPPSI) method, which we used to assess the superiority of three methods of standardized production for peaches. The values of SGPI for pre-, mid- and post-production were 0.121 (Level IV, "Excellent" standard), 0.379 (Level III, "Good" standard), and 0.769 × 10^-2 (Level IV, "Excellent" standard), respectively. Here we aimed to explore the integrated application of ecological two-sidedness and ecological processes in agricultural production. Our results are of use to decision-makers and ecologists focusing on eco-agriculture and to those farmers who hope to implement standardized agricultural production practices.

  13. Implementing the GISB standards in Canada - electronic gas trading

    International Nuclear Information System (INIS)

    Anderson, I.

    1999-01-01

    Standards promulgated by the Gas Industry Standards Board (GISB) in the United States, their objective and their applicability in Canada are discussed. The standards, while sponsored by an American trade organization, have had significant Canadian input and are considered applicable throughout North America, although implementation in Canada is voluntary. In developing the standards, the intent of the GISB was to develop business practice and electronic commerce standards for the natural gas industry. Despite voluntary application in Canada, Canadians are affected by the standards since some 50 per cent of Canadian gas is exported to U.S. consumers, and U.S. gas is imported for Canadian consumers in certain parts of the country. In fact, a Canadian GISB Implementation Task Force has been established to develop recommendations for Canadian implementation. The task force is broadly representative of the industry and published its report in March of 1997. It explains the nature of the standards and provides details about the definition of 'gas day', nomination schedules, accounting issues, electronic delivery mechanisms, capacity release, the standard unit of measure for nominations, confirmations, scheduling, measurement reports and invoicing. Questions regarding electronic contracting and the enforceability of electronic contracts have also been reviewed. Details are currently under consideration by a Working Group. The status of contracts under the Statute of Frauds, the Evidence Act and the Interpretation Act is reviewed, and legislative requirements in Canada to make electronic commerce legally enforceable are outlined. At present, electronic transactions would likely be enforceable provided they are preceded by a paper-based Electronic Commerce Trading Partner Agreement

  14. Regional and global significance of nuclear energy

    International Nuclear Information System (INIS)

    Schilling, H.D.

    1995-01-01

    Measures to combat poverty and improve the standard of living in countries of the Third World will inevitably boost global demand for energy, and energy conservation measures will not be able to offset this increase. Nuclear energy will regain significance in the framework of approaches adopted to resolve the energy problem, which is primarily an ecological problem created by an extremely large flow of materials. The extraordinarily high energy density of nuclear fuels can contribute to a marked reduction in the flow of materials; moreover, electric energy is an efficient substitute for primary energy forms. Thus nuclear electricity generation benefits the environment in two ways. Engineering goals in nuclear technology thus gain a service aspect, with progress in power plant engineering and design aiming not only at enhanced engineered safety, but also at regaining public acceptance of and confidence in nuclear power plant technology. (orig./UA) [de

  15. Adaptive Probabilistic Broadcasting over Dense Wireless Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Victor Gau

    2010-01-01

    Full Text Available We propose an idle probability-based broadcasting method, iPro, which employs an adaptive probabilistic mechanism to improve the performance of data broadcasting over dense wireless ad hoc networks. In multi-source one-hop broadcast scenarios, modeling and simulation results show that the proposed iPro significantly outperforms standard IEEE 802.11 under saturated conditions. Moreover, the results also show that, without estimating the number of competing nodes or changing the contention window size, the performance of the proposed iPro can still approach the theoretical bound. We further apply iPro to multi-hop broadcasting scenarios, and the experimental results show that, within the same elapsed time after broadcasting, the proposed iPro achieves significantly higher Packet-Delivery Ratios (PDRs) than traditional methods.
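
    The abstract does not spell out the mechanism, but the general shape of an idle-probability-based adaptive broadcast rule can be sketched as follows: a node keeps a running estimate of how often it senses the channel idle and maps that estimate to a rebroadcast probability. This is a schematic illustration only; the smoothing factor, the linear mapping, and the class interface are invented and are not the authors' algorithm.

        import random

        class IdleProbBroadcaster:
            def __init__(self, alpha=0.1, p_min=0.1, p_max=1.0):
                self.alpha, self.p_min, self.p_max = alpha, p_min, p_max
                self.idle_prob = 0.5                      # running estimate of channel idleness

            def observe_slot(self, channel_idle):
                sample = 1.0 if channel_idle else 0.0
                self.idle_prob = (1 - self.alpha) * self.idle_prob + self.alpha * sample

            def should_forward(self):
                # a busier channel (denser neighbourhood) lowers the rebroadcast probability
                p = self.p_min + (self.p_max - self.p_min) * self.idle_prob
                return random.random() < p

        node = IdleProbBroadcaster()
        for _ in range(100):
            node.observe_slot(channel_idle=random.random() < 0.3)   # mostly busy channel
        print(node.idle_prob, node.should_forward())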

  16. Significant-Loophole-Free Test of Bell's Theorem with Entangled Photons.

    Science.gov (United States)

    Giustina, Marissa; Versteegh, Marijn A M; Wengerowsky, Sören; Handsteiner, Johannes; Hochrainer, Armin; Phelan, Kevin; Steinlechner, Fabian; Kofler, Johannes; Larsson, Jan-Åke; Abellán, Carlos; Amaya, Waldimar; Pruneri, Valerio; Mitchell, Morgan W; Beyer, Jörn; Gerrits, Thomas; Lita, Adriana E; Shalm, Lynden K; Nam, Sae Woo; Scheidl, Thomas; Ursin, Rupert; Wittmann, Bernhard; Zeilinger, Anton

    2015-12-18

    Local realism is the worldview in which physical properties of objects exist independently of measurement and where physical influences cannot travel faster than the speed of light. Bell's theorem states that this worldview is incompatible with the predictions of quantum mechanics, as is expressed in Bell's inequalities. Previous experiments convincingly supported the quantum predictions. Yet, every experiment requires assumptions that provide loopholes for a local realist explanation. Here, we report a Bell test that closes the most significant of these loopholes simultaneously. Using a well-optimized source of entangled photons, rapid setting generation, and highly efficient superconducting detectors, we observe a violation of a Bell inequality with high statistical significance. The purely statistical probability of our results to occur under local realism does not exceed 3.74×10^{-31}, corresponding to an 11.5 standard deviation effect.

  17. Innovation Opportunities: An Overview of Standards and Platforms in the Video Game Industry

    OpenAIRE

    Laakso, Mikael; Nyman, Linus Morten

    2014-01-01

    The video game industry offers insights into the significance of standards and platforms. Furthermore, it shows examples of how new entrants can offer innovative services, while reducing their own risk, through bridging the boundaries between standards. Through an exploration of both past and present, this article aims to serve as a primer for understanding, firstly, the technological standards and platforms of the video game industry, and secondly, the recent innovations within the video gam...

  18. A Community Standard: Equivalency of Healthcare in Australian Immigration Detention.

    Science.gov (United States)

    Essex, Ryan

    2017-08-01

    The Australian government has long maintained that the standard of healthcare provided in its immigration detention centres is broadly comparable with health services available within the Australian community. Drawing on the literature from prison healthcare, this article examines (1) whether the principle of equivalency is being applied in Australian immigration detention and (2) whether this standard of care is achievable given Australia's current policies. This article argues that the principle of equivalency is not being applied and that this standard of health and healthcare will remain unachievable in Australian immigration detention without significant reform. Alternate approaches to addressing the well documented issues related to health and healthcare in Australian immigration detention are discussed.

  19. The best-interests standard as threshold, ideal, and standard of reasonableness.

    Science.gov (United States)

    Kopelman, L M

    1997-06-01

    The best-interests standard is a widely used ethical, legal, and social basis for policy and decision-making involving children and other incompetent persons. It is under attack, however, as self-defeating, individualistic, unknowable, vague, dangerous, and open to abuse. The author defends this standard by identifying its employment, first, as a threshold for intervention and judgment (as in child abuse and neglect rulings), second, as an ideal to establish policies or prima facie duties, and, third, as a standard of reasonableness. Criticisms of the best-interests standard are reconsidered after clarifying these different meanings.

  20. The European Community eco-management and audit regulations and the ISO standard 14001 for eco-management systems: significance and consequences for the eco-management of utilities

    International Nuclear Information System (INIS)

    Gudet, C.

    1996-01-01

    Various companies in the electricity industry have concerned themselves with the EMAS regulations and the private-sector standards BS 7750 and ISO 14001. In various pilot projects, investigations were carried out on the suitability of these management instruments for utilities. Several power plants in Holland and England have already instituted standards-compliant environmental management systems and have had some of them certified. The paper shows within which existing legal frameworks the new management instruments take effect and what elements they consist of. An example illustrates which areas of a utility are affected by the environmental management system and how it is integrated into the overall company organisation. (author) 4 figs., 11 refs

  1. STATEWIDE DATA STANDARDS TO SUPPORT CURRENT AND FUTURE STRATEGIC PUBLIC TRANSIT INVESTMENT

    Science.gov (United States)

    2018-04-01

    Significant progress has been made in recent years in reporting and using public transit service data. With the creation and widespread use of the General Transit Feed Specification (GTFS) data standard, there has been a significant increase in the f...
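
    As a small illustration of what consuming the GTFS standard looks like in practice, the sketch below counts scheduled trips per route from a feed's trips.txt file (route_id is a required GTFS field). The feed path is a placeholder, and the example is not part of the report.

        import csv
        from collections import Counter

        def trips_per_route(trips_txt_path):
            # trips.txt is a required GTFS file; each row carries a route_id
            with open(trips_txt_path, newline="", encoding="utf-8") as f:
                return Counter(row["route_id"] for row in csv.DictReader(f))

        counts = trips_per_route("gtfs_feed/trips.txt")       # placeholder path
        for route_id, n in counts.most_common(5):
            print(route_id, n)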

  2. Internal standardization in atomic-emission spectrometry using inductively coupled plasma

    International Nuclear Information System (INIS)

    Moore, G.L.

    1985-01-01

    The principle of internal standardization has been used in quantitative analytical emission spectroscopy since 1925 to minimize the errors arising from fluctuations in sample preparation, excitation-source conditions, and detection parameters. Although modern spectroscopic excitation sources are far more stable and electronic detection methods are more precise than before, the system for the introduction of the sample in spectrometric analysis using inductively coupled plasma (ICP) introduces significant errors, and internal standardization can still play a useful role in improving the overall precision of the analytical results. The criteria for the selection of the elements to be used as internal standards in arc and spark spectrographic analysis apply to a much lesser extent in ICP-spectrometric analysis. Internal standardization is recommended for use in routine ICP-simultaneous spectrometric analysis to improve its accuracy and precision and to provide a monitor for the reassurance of the analyst. However, the selection of an unsuitable reference element can result in misuse of the principle of internal standardization and, although internal standardization can be applied when a sequential monochromator is used, the main sources of error will not be minimized
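
    A minimal numerical sketch of the internal-standard principle described above: the analyte signal is ratioed to the internal-standard signal measured in the same solution, so fluctuations in sample introduction that affect both lines largely cancel, and the calibration is built on the ratios. All intensities and concentrations below are invented for illustration.

        import numpy as np

        conc_std = np.array([0.0, 0.5, 1.0, 2.0, 5.0])                  # analyte standards (mg/L)
        i_analyte = np.array([10.0, 260.0, 515.0, 1010.0, 2540.0])       # analyte line intensities
        i_internal = np.array([1000.0, 980.0, 1030.0, 995.0, 1015.0])    # internal-standard line intensities

        ratio = i_analyte / i_internal                     # ratio cancels common fluctuations
        slope, intercept = np.polyfit(conc_std, ratio, 1)  # ratio-based calibration line

        sample_ratio = 820.0 / 1005.0                      # unknown sample, same ratio treatment
        conc_sample = (sample_ratio - intercept) / slope
        print(f"sample concentration: {conc_sample:.2f} mg/L")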

  3. Creating standards: Creating illusions?

    DEFF Research Database (Denmark)

    Linneberg, Mai Skjøtt

    written standards may open up for the creation of illusions. These are created when written standards' content is not in accordance with the perception standard adopters and standard users have of the specific practice phenomenon's content. This general theoretical argument is exemplified by the specific...

  4. Invasive physiological indices to determine the functional significance of coronary stenosis

    Directory of Open Access Journals (Sweden)

    Firas R. AL-Obaidi

    2018-03-01

    Full Text Available Physiological measurements are now commonly used to assess coronary lesions in the cardiac catheterisation laboratory, and this practice is evidence-based and supported by clinical guidelines. Fractional flow reserve is currently the gold standard method to determine whether coronary lesions are functionally significant, and is used to guide revascularization. There are however several other physiological measurements that have been proposed as alternatives to the fractional flow reserve. This review aims to comprehensively discuss physiological indices that can be used in the cardiac catheterisation laboratory to determine the functional significance of coronary lesions. We will focus on their advantages and disadvantages, and the current evidence supporting their use. Keywords: Coronary physiology, Fractional flow reserve, Resting physiological indices, Coronary flow reserve
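
    As a concrete anchor for the gold-standard index discussed above, the sketch below computes fractional flow reserve as the ratio of mean distal coronary pressure (Pd) to mean aortic pressure (Pa) during maximal hyperaemia, with the commonly used cut-off of 0.80. The pressure traces are synthetic and purely illustrative.

        import numpy as np

        pa = 90 + 10 * np.sin(np.linspace(0, 20 * np.pi, 2000))            # aortic pressure trace (mmHg)
        pd = 0.74 * pa + np.random.default_rng(2).normal(0, 1, 2000)       # distal coronary pressure trace

        ffr = pd.mean() / pa.mean()                                        # FFR = mean Pd / mean Pa at hyperaemia
        print(f"FFR = {ffr:.2f} -> {'significant' if ffr <= 0.80 else 'not significant'}")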

  5. Tree-rings mirror management legacy: dramatic response of standard oaks to past coppicing in Central Europe.

    Directory of Open Access Journals (Sweden)

    Jan Altman

    Full Text Available BACKGROUND: Coppicing was one of the most important forest management systems in Europe, documented in prehistory as well as in the Middle Ages. However, coppicing was gradually abandoned by the mid-20th century, which has altered the ecosystem structure, diversity and function of coppice woods. METHODOLOGY/PRINCIPAL FINDINGS: Our aim was to disentangle the factors shaping the historical growth dynamics of oak standards (i.e., mature trees growing through several coppice cycles) in a former coppice-with-standards in Central Europe. Specifically, we tried to detect historical coppicing events from the tree-rings of oak standards, to link coppicing events with the recruitment of mature oaks, and to determine the effects of neighbouring trees on the stem increment of oak standards. Large peaks in radial growth found for the periods 1895-1899 and 1935-1939 matched historical records of coppice harvests. After coppicing, the number of newly recruited oak standards grew markedly in comparison with the preceding or following periods. The last significant recruitment of oak standards was after the 1930s, following the last regular coppicing event. The diameter increment of oak standards from 1953 to 2003 was negatively correlated with competition indices, suggesting that neighbouring trees (mainly resprouting coppiced Tilia platyphyllos) partly suppressed the growth of oak standards. Our results showed that improved light conditions following historical coppicing events caused significant pulses of radial growth and most probably maintained oak recruitment. CONCLUSIONS/SIGNIFICANCE: Our historical perspective carries important implications for oak management in Central Europe and elsewhere. Relatively intense cutting creating open canopy woodlands, either as in the coppicing system or in the form of selective cutting, is needed to achieve significant radial growth in mature oaks. It is also critical for the successful regeneration and long

  6. Standardization's role in revitalizing the nuclear option

    International Nuclear Information System (INIS)

    Ward, J.E.

    1986-01-01

    Considering the moribund status of the nuclear industry, something has to be done in the near term to reverse the decaying economics of nuclear power. Standardization can turn around nuclear economics in the short term and, in the longer term, can foster a significant return to nuclear power. In the short term the industry needs to take advantage of those current designs that have proved their worth by excellent operating records. These designs can be replicated, taking advantage of the complete status of the design and the construction techniques already in place. In the longer term it needs to develop preapproved designs and sites. Further, it must develop a discipline within the system of regulation, as well as within utility management, to accept a plant design as is. The industry cannot afford customized regulation or customized design. Traditional institutional structures may also be up for grabs as utilities struggle to be more cost-effective. Generating companies may play a significant role in the future of electric utilities. This kind of emphasis will also provide an impetus for the use of cost-effective, standardized designs that can be the catalyst for nuclear power's resurgence

  7. Wellhead and tree standards updated

    International Nuclear Information System (INIS)

    Dach, A.J. Jr.; Haeberle, T.

    1996-01-01

    Revisions in the API 6A, 17th Edition, have resolved a number of long-term problems and expanded its scope and coverage of wellhead and christmas tree equipment. The 17th Edition, Feb. 1, 1996, represents the state-of-the-art in international requirements for wellhead and christmas tree equipment. The design, materials, and quality control aspects of API 6A have all been improved with an emphasis on making the document more acceptable around the world. However, there are unresolved issues that raise many questions about the future direction of efforts aimed at international standardization of wellhead and christmas tree equipment. Unfortunately, these unresolved issues confuse both manufacturers and companies purchasing this equipment. This ultimately increases wellhead and christmas tree costs, so it is to everyone's advantage to resolve these issues. This article describes the significant revisions that are included in API 6A, 17th Edition. Also discussed are the regulatory, standardization, and customer acceptance issues that cloud the future of API 6A, 17th Edition

  8. Russian seismic standards and demands for equipment and their conformity with international standards

    International Nuclear Information System (INIS)

    Kaznovsky, S.; Ostretsov, I.

    1993-01-01

    The principle regulations of standard documents concerning seismic safety of NPPs and demands for reactor equipment conformity with international standards are presented in this report. General state of NPP safety standards is reviewed, with a special emphasis on the state of seismic design standards for NPP equipment and piping. Russian standards documents on seismic resistance of NPPs and requirements are compared to international ones

  9. Significance of FIZ Technik Databases in nuclear safety and environmental protection

    International Nuclear Information System (INIS)

    Das, N.K.

    1993-01-01

    The language of the abstracts of the FIZ Technik databases is primarily German (e.g. DOMA 80%, SDIM 70%). Furthermore, FIZ Technik offers licensed databases on engineering and technology, management, manufacturers, products, contacts, standards and specifications, geosciences and natural resources. The contents and structure of the databases are described in the FIZ Technik bluesheets and the database news. With some examples, the significance of the FIZ Technik databases DOMA, ZDEE, SDIM, SILI and MEDI for nuclear safety and environmental protection is shown. (orig.)

  10. Could Serum Laminin Replace Liver Biopsy as Gold Standard for Predicting Significant Fibrosis in Patients with Chronic Hepatitis B? Clinical and Histopathological Study

    OpenAIRE

    Abeer M. Hafez; Yasser S. Sheta; Mohamed H. Ibrahim; Shereen A. Elshazly

    2013-01-01

    Background: The prognosis and clinical treatment of chronic liver disease depends greatly on the progression of liver fibrosis, which has resulted from the loss of normal liver cell function due to disorganized over-accumulation of extra-cellular matrix (ECM) components in the liver. Liver biopsy has been considered the gold standard for staging and grading hepatic fibrosis and inflammation. However, the procedure is associated with complications such as bleeding, infection, damage to liver t...

  11. Multislice computed tomography: angiographic emulation versus standard assessment for detection of coronary stenoses

    Energy Technology Data Exchange (ETDEWEB)

    Schnapauff, Dirk; Hamm, Bernd; Dewey, Marc [Humboldt-Universitaet zu Berlin, Department of Radiology, Charite - Universitaetsmedizin Berlin, Chariteplatz 1, P.O. Box 10098, Berlin (Germany); Duebel, Hans-Peter; Baumann, Gert [Charite - Universitaetsmedizin Berlin, Department of Cardiology, Berlin (Germany); Scholze, Juergen [Charite - Universitaetsmedizin Berlin, Charite Outpatient Centre, Berlin (Germany)

    2007-07-15

    The present study investigated angiographic emulation of multislice computed tomography (MSCT) (catheter-like visualization) as an alternative approach to analyzing and visualizing findings in comparison with standard assessment. Thirty patients (120 coronary arteries) were randomly selected from 90 prospectively investigated patients with suspected coronary artery disease who underwent MSCT (16-slice scanner, 0.5 mm collimation, 400 ms rotation time) prior to conventional coronary angiography for comparison of both approaches. Sensitivity and specificity of angiographic emulation [81% (26/32) and 93% (82/88)] were not significantly different from those of standard assessment [88% (28/32) and 99% (87/88)], while the per-case analysis time was significantly shorter for angiographic emulation than for standard assessment (3.4 ± 1.5 vs 7.0 ± 2.5 min, P < 0.001). Both interventional and referring cardiologists preferred angiographic emulation over standard curved multiplanar reformations of MSCT coronary angiography for illustration, mainly because of improved overall lucidity and depiction of side branches (P < 0.001). In conclusion, angiographic emulation of MSCT reduces analysis time, yields a diagnostic accuracy comparable to that of standard assessment, and is preferred by cardiologists for visualization of results. (orig.)
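
    For orientation, the sensitivity and specificity quoted above follow from the standard definitions applied to the per-artery counts given in the abstract (a worked check, not additional study data):

        \text{sensitivity} = \frac{TP}{TP+FN} = \frac{26}{32} \approx 81\%, \qquad
        \text{specificity} = \frac{TN}{TN+FP} = \frac{82}{88} \approx 93\%

    The corresponding figures for standard assessment are 28/32 ≈ 88% and 87/88 ≈ 99%.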

  12. Dark matter, constrained minimal supersymmetric standard model, and lattice QCD.

    Science.gov (United States)

    Giedt, Joel; Thomas, Anthony W; Young, Ross D

    2009-11-13

    Recent lattice measurements have given accurate estimates of the quark condensates in the proton. We use these results to significantly improve the dark matter predictions in benchmark models within the constrained minimal supersymmetric standard model. The predicted spin-independent cross sections are at least an order of magnitude smaller than previously suggested and our results have significant consequences for dark matter searches.

  13. Comparative Study between Standard and Totally Tubeless Percutaneous Nephrolithotomy.

    Science.gov (United States)

    Yun, Sung Il; Lee, Yoon Hyung; Kim, Jae Soo; Cho, Sung Ryong; Kim, Bum Soo; Kwon, Joon Beom

    2012-11-01

    Several recent studies have reported the benefits of tubeless percutaneous nephrolithotomy (PNL). Postoperatively, tubeless PNL patients have an indwelling ureteral stent placed, which is often associated with stent-related morbidity. We have performed totally tubeless (tubeless and stentless) PNL in which no nephrostomy tube or ureteral stent is placed postoperatively. We evaluated the safety, effectiveness, and feasibility of totally tubeless PNL. From March 2008 to February 2012, 57 selected patients underwent standard or totally tubeless PNL. Neither a nephrostomy tube nor a ureteral stent was placed in the totally tubeless PNL group. We compared patient and stone characteristics, operation time, length of hospitalization, analgesia requirements, stone-free rate, blood loss, change in creatinine, and perioperative complications between the standard and totally tubeless PNL groups. There were no significant differences in preoperative patient characteristics, postoperative complications, or the stone-free rate between the two groups, but the totally tubeless PNL group showed a shorter hospitalization and a lesser analgesic requirement compared with the standard PNL group. Blood loss and change in creatinine were not significantly different between the two groups. Totally tubeless PNL appears to be a safe and effective alternative for the management of renal stone patients and is associated with a decrease in length of hospital stay.

  14. Current status of the MPEG-4 standardization effort

    Science.gov (United States)

    Anastassiou, Dimitris

    1994-09-01

    The Moving Pictures Experts Group (MPEG) of the International Standardization Organization has initiated a standardization effort, known as MPEG-4, addressing generic audiovisual coding at very low bit-rates (up to 64 kbits/s) with applications in videotelephony, mobile audiovisual communications, video database retrieval, computer games, video over Internet, remote sensing, etc. This paper gives a survey of the status of MPEG-4, including its planned schedule, and initial ideas about requirements and applications. A significant part of this paper summarizes an incomplete draft version of a 'requirements document', which presents specifications of desirable features at the video, audio, and system level of the forthcoming standard. Very low bit-rate coding algorithms are not described, because no endorsement of any particular algorithm, or class of algorithms, has yet been made by MPEG-4, and several seminars held concurrently with MPEG-4 meetings have not so far provided evidence that such high performance coding schemes are achievable.

  15. Task-induced frequency modulation features for brain-computer interfacing

    Science.gov (United States)

    Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz

    2017-10-01

    Objective. Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects’ intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects’ intents with an accuracy comparable to task-induced amplitude modulation. Approach. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. Main results. The frequency features alone do not significantly outperform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing conditions in healthy subjects, the joint space significantly outperforms either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Significance. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.
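
    As an illustration of the feature types compared in this abstract, the sketch below derives a band-power (amplitude) feature and a mean instantaneous-frequency feature per channel from band-pass filtered EEG epochs and evaluates them separately and as a joint space with a cross-validated linear classifier. This is a minimal, hypothetical reconstruction, not the authors' code: the synthetic data, sampling rate, frequency band and classifier choice are all assumptions.

        # Hedged sketch: amplitude vs. frequency-modulation features for BCI epochs.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        trials = rng.standard_normal((40, 8, 500))   # synthetic stand-in EEG: trials x channels x samples
        labels = np.repeat([0, 1], 20)               # two hypothetical classes

        def amp_freq_features(x, fs=250.0, band=(8.0, 13.0)):
            # Band-pass filter, then one amplitude and one frequency feature per channel.
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            filt = filtfilt(b, a, x, axis=-1)
            amp = np.log(np.var(filt, axis=-1))                        # log band power
            phase = np.unwrap(np.angle(hilbert(filt, axis=-1)), axis=-1)
            inst_freq = np.diff(phase, axis=-1) * fs / (2.0 * np.pi)   # instantaneous frequency (Hz)
            return amp, inst_freq.mean(axis=-1)

        amp, freq = amp_freq_features(trials)
        joint = np.hstack([amp, freq])               # joint amplitude + frequency space
        for name, X in (("amplitude", amp), ("frequency", freq), ("joint", joint)):
            acc = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5).mean()
            print(name, round(float(acc), 3))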

  16. Standard-E hydrogen monitoring system shop acceptance test procedure

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, T.C.

    1997-10-02

    The purpose of this report is to document that the Standard-E Hydrogen Monitoring Systems (SHMS-E), fabricated by Mid-Columbia Engineering (MCE) for installation on the Waste Tank Farms in the Hanford 200 Areas, are constructed as intended by the design. The ATP performance will verify proper system fabrication.

  17. Standard-E hydrogen monitoring system shop acceptance test report

    International Nuclear Information System (INIS)

    Schneider, T.C.

    1997-01-01

    The purpose of this report is to document that the Standard-E Hydrogen Monitoring Systems (SHMS-E), fabricated by Mid-Columbia Engineering (MCE) for installation on the Waste Tank Farms in the Hanford 200 Areas, are constructed as intended by the design. The ATP performance will verify proper system fabrication

  18. Secondary standards (non-activation) for neutron data measurements above 20 MeV

    International Nuclear Information System (INIS)

    Haight, R.C.

    1991-01-01

    In addition to H(n,p) scattering and 235,238 U(n,f) reactions, secondary standards for neutron flux determination may be useful for neutron energies above 20 MeV. For experiments where gamma rays are detected, reference gamma-ray production cross sections are relevant. For neutron-induced charged particle production, standard (n,p) and (n,alpha) cross sections would be helpful. Total cross section standards would serve to check the accuracy of these measurements. These secondary standards are desirable because they can be used with the same detector systems employed in measuring the quantities of interest. Uncertainties due to detector efficiency, geometrical effects, timing and length of flight paths can therefore be significantly reduced. Several secondary standards that do not depend on activation techniques are proposed. 14 refs

  19. Relationship of image magnification between periapical standard film and orthopantomogram

    International Nuclear Information System (INIS)

    Kim, Young Tae; Park, Tae Won

    1986-01-01

    The author studied the magnification ratio of tooth length in orthopantomograms against intraoral films taken with the standardized paralleling technique. In this study, intraoral radiographs and orthopantomograms were taken of 2 dry skulls and 36 adults (504 teeth). The obtained results were as follows: 1. In the dry skulls, the magnification ratio of the standard films was 4.6% to 5.9% and that of the Orthopantomograph 5 was 15.1% to 33.1%; the magnification ratio of the Orthopantomograph 5 relative to the standard films was 9.2% to 26.5%. 2. In the adults, the magnification ratio of the Orthopantomograph 5 relative to the standard films was 9.5% to 24.6%. 3. There was no significant difference in magnification between left and right. 4. Anterior teeth showed less magnification than posterior teeth. 5. Tooth length shown on the Orthopantomograph 5 was considered to be magnified by 15.4% to 31.3% relative to the actual tooth length.
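
    The magnification percentages quoted above follow the usual definition of radiographic magnification; written out (a textbook relation, not a formula quoted from the paper):

        M\,(\%) = \frac{L_{\text{image}} - L_{\text{object}}}{L_{\text{object}}} \times 100

    where L_image is the tooth length measured on the radiograph and L_object the actual tooth length (or, for the panoramic-to-standard-film ratios, the length measured on the standard film).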

  20. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE... In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...

  1. Ideal Standards, Acceptance, and Relationship Satisfaction: Latitudes of Differential Effects

    Directory of Open Access Journals (Sweden)

    Asuman Buyukcan-Tetik

    2017-09-01

    Full Text Available We examined whether the relations of consistency between ideal standards and perceptions of a current romantic partner with partner acceptance and relationship satisfaction level off, or decelerate, above a threshold. We tested our hypothesis using a 3-year longitudinal data set collected from heterosexual newlywed couples. We used two indicators of consistency: pattern correspondence (within-person correlation between ideal standards and perceived partner ratings) and mean-level match (difference between ideal standards score and perceived partner score). Our results revealed that pattern correspondence had no relation with partner acceptance, but a positive linear/exponential association with relationship satisfaction. Mean-level match had a significant positive association with actor’s acceptance and relationship satisfaction up to the point where perceived partner score equaled ideal standards score. Partner effects did not show a consistent pattern. The results suggest that the consistency between ideal standards and perceived partner attributes has a non-linear association with acceptance and relationship satisfaction, although the results were more conclusive for mean-level match.

  2. Audit Report The Procurement of Safety Class/Safety-Significant Items at the Savannah River Site

    International Nuclear Information System (INIS)

    2009-01-01

    The Department of Energy operates several nuclear facilities at its Savannah River Site, and several additional facilities are under construction. This includes the National Nuclear Security Administration's Tritium Extraction Facility (TEF) which is designated to help maintain the reliability of the U.S. nuclear stockpile. The Mixed Oxide Fuel Fabrication Facility (MOX Facility) is being constructed to manufacture commercial nuclear reactor fuel assemblies from weapon-grade plutonium oxide and depleted uranium. The Interim Salt Processing (ISP) project, managed by the Office of Environmental Management, will treat radioactive waste. The Department has committed to procuring products and services for nuclear-related activities that meet or exceed recognized quality assurance standards. Such standards help to ensure the safety and performance of these facilities. To that end, it issued Departmental Order 414.1C, Quality Assurance (QA Order). The QA Order requires the application of Quality Assurance Requirements for Nuclear Facility Applications (NQA-1) for nuclear-related activities. The NQA-1 standard provides requirements and guidelines for the establishment and execution of quality assurance programs during the siting, design, construction, operation, and decommissioning of nuclear facilities. These requirements, promulgated by the American Society of Mechanical Engineers, must be applied to 'safety-class' and 'safety-significant' structures, systems and components (SSCs). Safety-class SSCs are defined as those necessary to prevent exposure off site and to protect the public. Safety-significant SSCs are those whose failure could irreversibly impact worker safety such as a fatality, serious injury, or significant radiological or chemical exposure. Due to the importance of protecting the public, workers, and environment, we initiated an audit to determine whether the Department of Energy procured safety-class and safety-significant SSCs that met NQA-1 standards at

  3. Building Standards based Science Information Systems: A Survey of ISO and other standards

    Science.gov (United States)

    King, Todd; Walker, Raymond

    Science information systems began with individual researchers maintaining personal collections of data and managing them by using ad hoc, specialized approaches. Today information systems are an enterprise consisting of federated systems that manage and distribute both historical and contemporary data from distributed sources. Information systems have many components. Among these are metadata models, metadata registries, controlled vocabularies and ontologies which are used to describe entities and resources. Other components include services to exchange information and data, tools to populate the system and tools to utilize available resources. When constructing information systems today a variety of standards can be useful. The benefit of adopting standards is clear; it can shorten the design cycle, enhance software reuse and enable interoperability. We look at standards from the International Standards Organization (ISO), International Telecommunication Union (ITU), Organization for the Advancement of Structured Information Standards (OASIS), Internet Engineering Task Force (IETF), and American National Standards Institute (ANSI), which have influenced the development of information systems in the Heliophysics and Planetary sciences. No standard can solve the needs of every community. Individual disciplines often must fill the gap between general purpose standards and the unique needs of the discipline. To this end individual science disciplines are developing standards; examples include the International Virtual Observatory Alliance (IVOA), Planetary Data System (PDS)/International Planetary Data Alliance (IPDA), Dublin-Core Science, and the Space Physics Archive Search and Extract (SPASE) consortium. This broad survey of ISO and other standards provides some guidance for the development of information systems. The development of the SPASE data model is reviewed and provides some insights into the value of applying appropriate standards and is used to illustrate

  4. Lettuce (Lactuca sativa L. var. Sucrine) Growth Performance in Complemented Aquaponic Solution Outperforms Hydroponics

    Directory of Open Access Journals (Sweden)

    Boris Delaide

    2016-10-01

    Full Text Available Plant growth performance is optimized under hydroponic conditions. The comparison between aquaponics and hydroponics has attracted considerable attention recently, particularly regarding plant yield. However, previous research has not focused on the potential of using aquaponic solution complemented with mineral elements to commercial hydroponic levels in order to increase yield. For this purpose, lettuce plants were put into AeroFlo installations and exposed to hydroponic (HP), aquaponic (AP), or complemented aquaponic (CAP) solutions. The principal finding of this research was that AP and HP treatments exhibited similar (p > 0.05) plant growth, whereas the shoot weight of the CAP treatment showed a significant (p < 0.05) growth rate increase of 39% on average compared to the HP and AP treatments. Additionally, the root weight was similar (p > 0.05) in AP and CAP treatments, and both were significantly higher (p < 0.05) than that observed in the HP treatment. The results highlight the beneficial effect of recirculating aquaculture system (RAS) water on plant growth. The findings represent a further step toward developing decoupled aquaponic systems (i.e., two- or multi-loops) that have the potential to establish a more productive alternative to hydroponic systems. Microorganisms and dissolved organic matter are suspected to play an important role in RAS water for promoting plant root and shoot growth.

  5. Standard cross-section data

    International Nuclear Information System (INIS)

    Carlson, A.D.

    1984-01-01

    The accuracy of neutron cross-section measurement is limited by the uncertainty in the standard cross-section and the errors associated with using it. Any improvement in the standard immediately improves all cross-section measurements which have been made relative to that standard. Light element, capture and fission standards are discussed. (U.K.)

  6. Collaboration Between Multistakeholder Standards

    DEFF Research Database (Denmark)

    Rasche, Andreas; Maclean, Camilla

    Public interest in corporate social responsibility (CSR) has resulted in a wide variety of multistakeholder CSR standards in which companies can choose to participate. While such standards reflect collaborative governance arrangements between public and private actors, the market for corporate responsibility is unlikely to support a great variety of partly competing and overlapping standards. Increased collaboration between these standards would enhance both their impact and their adoption by firms. This report examines the nature, benefits, and shortcomings of existing multistakeholder standards...

  7. The IAEA radioactive waste safety standards programme

    International Nuclear Information System (INIS)

    Tourtellotte, James R.

    1995-01-01

    The IAEA is currently reviewing more than thirty publications in its Safety Series with a view toward consolidating and organizing information pertaining to radioactive waste. The effort is entitled the Radioactive Waste Safety Standards programme (RADWASS). RADWASS is a significant undertaking and may have far-reaching effects on radioactive waste management both in the international nuclear community and in individual nuclear States. This is because the IAEA envisions the development of a consensus on the final document. In this circumstance, the product of RADWASS may ultimately be regarded as an international norm against which future actions of Member States may be measured. The programme is organized into five subject areas - planning, pre-disposal, disposal, uranium and thorium waste management, and decommissioning - and into four document levels: safety fundamentals, safety standards, safety guides and safety practices. (author)

  8. Corporate Schooling Meets Corporate Media: Standards, Testing, and Technophilia

    Science.gov (United States)

    Saltman, Kenneth J.

    2016-01-01

    Educational publishing corporations and media corporations in the United States have been converging, especially through the promotion of standardization, testing, and for-profit educational technologies. Media and technology companies--including News Corp, Apple, and Microsoft--have significantly expanded their presence in public schools to sell…

  9. Significance of perfectionism in understanding different forms of insomnia

    Directory of Open Access Journals (Sweden)

    Totić-Poznanović Sanja

    2012-01-01

    Full Text Available Introduction. Studies consistently show a connection between perfectionism, as a multidimensional construct, and various psychological and psychopathological states and characteristics. However, studies that analyze the connection between this concept and sleep disturbances, especially modalities of insomnia, are rare. Objective. The aim of this study was to examine whether dimensions of perfectionism can explain different forms of insomnia: difficulties initiating sleep (insomnia early), difficulties during sleep (insomnia middle), waking in the early hours of the morning (insomnia late) and dissatisfaction with sleep quality (subjective insomnia). Methods. The sample consisted of 254 students of the School of Medicine in Belgrade. The predictive significance of nine perfectionism dimensions, measured by Frost’s and Hewitt’s and Flett’s scales of multidimensional perfectionism, for the four modalities of insomnia, measured by a structured questionnaire, was analyzed by the multiple linear regression method. Results. Perfectionism dimensions are significant predictors of each of the tested forms of insomnia. Doubt about actions significantly predicts initial insomnia; other-oriented perfectionism on the negative pole and socially prescribed perfectionism underlie the difficulties during sleep, while organization and parental criticism underlie late insomnia. Significant predictors of subjective insomnia are personal standards, organization, and other-oriented perfectionism on the negative pole. Three of the nine analyzed dimensions were not confirmed as significant: concern over mistakes, parental expectations and self-oriented perfectionism. Conclusion. Various aspects of perfectionism can be considered a vulnerability factor for understanding some forms of insomnia. Of all the forms of insomnia tested, perfectionism as a personality trait proved to be most significant for understanding subjective insomnia.
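
    A minimal sketch of the kind of analysis described (each insomnia modality regressed on the nine perfectionism dimensions by multiple linear regression). The column names and synthetic data are hypothetical placeholders, not the study's variables or results:

        # Hedged sketch of a multiple linear regression analysis per insomnia modality.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        dims = ["concern_over_mistakes", "personal_standards", "parental_expectations",
                "parental_criticism", "doubts_about_actions", "organization",
                "self_oriented", "other_oriented", "socially_prescribed"]
        outcomes = ["insomnia_early", "insomnia_middle", "insomnia_late", "subjective_insomnia"]
        df = pd.DataFrame(rng.normal(size=(254, len(dims) + len(outcomes))),
                          columns=dims + outcomes)        # synthetic stand-in data, n = 254

        for y in outcomes:
            fit = sm.OLS(df[y], sm.add_constant(df[dims])).fit()
            print(y, "R^2 =", round(fit.rsquared, 3))     # fit.summary() lists per-dimension p-values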

  10. Clinical evaluation of further-developed MRCP sequences in comparison with standard MRCP sequences

    International Nuclear Information System (INIS)

    Hundt, W.; Scheidler, J.; Reiser, M.; Petsch, R.

    2002-01-01

    The purpose of this study was the comparison of technically improved single-shot magnetic resonance cholangiopancreatography (MRCP) sequences with standard single-shot rapid acquisition with relaxation enhancement (RARE) and half-Fourier acquired single-shot turbo spin-echo (HASTE) sequences in evaluating the normal and abnormal biliary duct system. The bile duct system of 45 patients was prospectively investigated on a 1.5-T MRI system. The investigation was performed with RARE and HASTE MR cholangiography sequences with standard and high spatial resolutions, and with a delayed-echo half-Fourier RARE (HASTE) sequence. Findings of the improved MRCP sequences were compared with the standard MRCP sequences. The level of confidence in assessing the diagnosis was divided into five groups. The Wilcoxon signed-rank test at a level of p<0.05 was applied. In 15 patients no pathology was found. The MRCP showed stenoses of the bile duct system in 10 patients and choledocholithiasis and cholecystolithiasis in 16 patients. In 12 patients a dilatation of the bile duct system was found. Comparison of the low- and high spatial resolution sequences and the short and long TE times of the half-Fourier RARE (HASTE) sequence revealed no statistically significant differences regarding accuracy of the examination. The diagnostic confidence level in assessing normal or pathological findings for the high-resolution RARE and half-Fourier RARE (HASTE) was significantly better than for the standard sequences. For the delayed-echo half-Fourier RARE (HASTE) sequence no statistically significant difference was seen. The high-resolution RARE and half-Fourier RARE (HASTE) sequences had a higher confidence level, but there was no significant difference in diagnosis in terms of detection and assessment of pathological changes in the biliary duct system compared with standard sequences. (orig.)

  11. Water Phase Diagram Is Significantly Altered by Imidazolium Ionic Liquid

    DEFF Research Database (Denmark)

    Chaban, V. V.; Prezhdo, O. V.

    2014-01-01

    We report unusually large changes in the boiling temperature, saturated vapor pressure, and structure of the liquid-vapor interface for a range of 1-butyl-3-methylimidazolium tetrafluoroborate ([C4C1IM][BF4])-water mixtures. Even modest molar fractions of [C4C1IM][BF4] significantly affect the phase behavior of water, as represented, for instance, by strong negative deviations from Raoult's law, extending far beyond the standard descriptions. The investigation was carried out using classical molecular dynamics employing a specifically refined force field. The changes in the liquid-vapor interface and saturated...
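
    For orientation, the negative deviation from Raoult's law mentioned above means that the measured partial vapor pressure of water falls below the ideal-solution value; in the usual textbook notation (not a formula from the paper):

        p_{\mathrm{H_2O}} = \gamma_{\mathrm{H_2O}}\, x_{\mathrm{H_2O}}\, p^{*}_{\mathrm{H_2O}}, \qquad \gamma_{\mathrm{H_2O}} < 1

    where x is the mole fraction of water in the mixture, p* is the vapor pressure of pure water, and an activity coefficient γ below one expresses the negative deviation.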

  12. Rehearsal significantly improves immediate and delayed recall on the Rey Auditory Verbal Learning Test.

    Science.gov (United States)

    Hessen, Erik

    2011-10-01

    A repeated observation during memory assessment with the Rey Auditory Verbal Learning Test (RAVLT) is that patients who spontaneously employ a memory rehearsal strategy by repeating the word list more than once achieve better scores than patients who only repeat the word list once. This observation led to concern about the ability of the standard test procedure of the RAVLT and similar tests to elicit the best possible recall scores. The purpose of the present study was to test the hypothesis that a rehearsal recall strategy of repeating the word list more than once would result in improved recall scores on the RAVLT. We report on differences in outcome between standard administration and experimental administration on the Immediate and Delayed Recall measures of the RAVLT in 50 patients. The experimental administration resulted in significantly improved scores for all the variables employed. Additionally, it was found that patients who failed effort screening showed significantly poorer improvement on Delayed Recall compared with those who passed the effort screening. The general clear improvement both in raw scores and T-scores demonstrates that recall performance can be significantly influenced by the strategy of the patient or by small variations in instructions by the examiner.

  13. Standards for holdup measurement

    International Nuclear Information System (INIS)

    Zucker, M.S.

    1982-01-01

    Holdup measurements, needed for material balance, depend heavily on standards and on the interpretation of the calibration procedure. More than in other measurements, the calibration procedure using the standard becomes part of the standard. Standards practical for field use and calibration techniques have been developed. While accuracy in holdup measurements is comparatively poor, avoidance of bias is a necessary goal.

  14. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  15. Standards for the 21st-Century Learner: Comparisons with NETS and State Standards

    Science.gov (United States)

    Pappas, Marjorie

    2008-01-01

    The American Association of School Librarians (AASL) and the International Society for Technology in Education (ISTE) have both recently launched new standards. These are known as the "AASL Standards for the 21st-Century Learner" and "The National Educational Technology Standards for Students: The Next Generation" (NETS). The standards from each…

  16. Should we expect financial globalization to have significant effects on business cycles?

    OpenAIRE

    Iversen, Jens

    2009-01-01

    Empirical research suggests that financial globalization has insignificant effects on business cycles. Based on standard theoretical models it might be conjectured that the effects should be significant. I show that this conjecture is wrong. Theoretical effects of financial globalization can be determined to any level of precision by expanding the underlying artificial samples. In contrast, in the data the effects are imprecisely estimated because of short samples. I show that if the conclusi...

  17. Gas industry standards board: Legal considerations in the standard setting process

    Energy Technology Data Exchange (ETDEWEB)

    Mishkin, M.T.; Adelman, D.I.

    1994-01-01

    On December 23, 1993, the Federal Energy Regulatory Commission (FERC) issued Order 563, a Final Rule adopting the agreements of informal industry-wide working groups to standardize information relating to pipeline capacity release programs mandated under Order 636. Order 563 is noteworthy for its reliance upon the industry to develop consensus standards for Commission adoption. The industry's success in reaching agreements on key communication standards issues spawned recommendations from the working groups to continue the development and maintenance of industry-wide standards through a permanent Gas Industry Standards Board (GISB). This article examines legal issues bearing on GISB's potential role in the regulatory process. Specifically, this article addresses constitutional and statutory considerations relating to the FERC's authority to delegate certain responsibilities to a voluntary, industry sponsored and supported private body such as that taking shape within the gas industry.

  18. Standard Reference Tables -

    Data.gov (United States)

    Department of Transportation — The Standard Reference Tables (SRT) provide consistent reference data for the various applications that support Flight Standards Service (AFS) business processes and...

  19. [The significance of meat quality in marketing].

    Science.gov (United States)

    Kallweit, E

    1994-07-01

    Food quality in general, and meat quality in particular, are no longer evaluated only by means of objective quality traits; the entire production process is also gaining more attention from the modern consumer. Due to this development, quality programs were developed to define the majority of the processes in all production and marketing steps, which are in turn linked by contracts. Not all of these items are quality relevant; some are concessions to ethical principles (animal welfare, etc.). This is demonstrated by the example of Scharrel pork production. Price differentiation in the pork market is still influenced predominantly by quantitative carcass traits. On the European market, quality programs are still of minor significance. Premiums paid for high quality standards are more or less offset by higher production costs and the lower lean meat percentages that must be expected in stress-susceptible strains.

  20. Girls and STEM (Science, Technology, Engineering, and Mathematics) in Catholic Schools: A Mixed Methods Exploration of Interest, Confidence, and Perceptions of STEM

    Science.gov (United States)

    McKenna, Rachel Lynn-Pleis

    2016-01-01

    Over the past decade, there has been a considerable push in emphasizing STEM--an acronym standing for Science, Technology, Engineering, and Math--as an integral aspect of educational curriculums. Even though research suggests that females tend to outperform males in standardized testing in STEM areas, they remain underrepresented in STEM careers…

  1. Implications of Higgs searches on the four-generation standard model.

    Science.gov (United States)

    Kuflik, Eric; Nir, Yosef; Volansky, Tomer

    2013-03-01

    Within the four-generation standard model, the Higgs couplings to gluons and to photons deviate in a significant way from the predictions of the three-generation standard model. As a consequence, large departures in several Higgs production and decay channels are expected. Recent Higgs search results, presented by ATLAS, CMS, and CDF, hint at the existence of a Higgs boson with a mass around 125 GeV. Using these results and assuming such a Higgs boson, we derive exclusion limits on the four-generation standard model. For m(H)=125 GeV, the model is excluded above 99.95% confidence level. For 124.5 GeV≤m(H)≤127.5 GeV, an exclusion limit above 99% confidence level is found.

  2. Position paper on standardization

    International Nuclear Information System (INIS)

    1991-04-01

    The "NPOC Strategic Plan for Building New Nuclear Plants" creates a framework within which new standardized nuclear plants may be built. The Strategic Plan is an expression of the nuclear energy industry's serious intent to create the necessary conditions for new plant construction and operation. One of the key elements of the Strategic Plan is a comprehensive industry commitment to standardization: through design certification, combined license, first-of-a-kind engineering, construction, operation and maintenance of nuclear power plants. The NPOC plan proposes four stages of standardization in advanced light water reactors (ALWRs). The first stage is established by the ALWR Utility Requirements Document, which specifies owner/operator requirements at a functional level covering all elements of plant design and construction, and many aspects of operations and maintenance. The second stage of standardization is that achieved in the NRC design certification. This certification level includes requirements, design criteria and bases, functional descriptions and performance requirements for systems to assure plant safety. The third stage of standardization, commercial standardization, carries the design to a level of completion beyond that required for design certification to enable the industry to achieve potential increases in efficiency and economy. The final stage of standardization is enhanced standardization beyond design. A standardized approach is being developed in construction practices, operating, maintenance training, and procurement practices. This comprehensive standardization program enables the NRC to proceed with design certification with the confidence that standardization beyond the regulations will be achieved. This confidence should answer the question of design detail required for design certification, and demonstrate that the NRC should require no further regulatory review beyond that required by 10 CFR Part 52.

  3. International Electrotechnical Commission standards and French material control standards

    International Nuclear Information System (INIS)

    Furet, J.; Weill, J.

    1978-01-01

    This paper reports the international standards developed within IEC Subcommittee 45A (Nuclear Reactor Instrumentation) and the national standards elaborated by the Commissariat a l'Energie Atomique (CEA) Group of normalized control equipment, together with the degree to which they are applied in the basic design, calls for bids, and operation of nuclear power plants. (J.E. de C)

  4. Calibration of working standard ionization chambers and dose standardization

    International Nuclear Information System (INIS)

    Abd Elmahoud, A. A. B.

    2011-01-01

    Measurements were performed for the calibration of two working standard ionization chambers in the secondary standard dosimetry laboratory of Sudan. A 600 cc cylindrical Farmer-type chamber and an 1800 cc cylindrical radical-type radiation protection level ionization chamber were calibrated against a 1000 cc spherical reference standard ionization chamber. The chambers were calibrated with the X-ray narrow-spectrum series, with beam energies ranging from 33 to 116 keV, in addition to a 137Cs beam with 662 keV energy. The 0.6 cc and 0.3 cc therapy-level ionization chambers were used for dose standardization and beam output calibration of the cobalt-60 radiotherapy machine located at the National Cancer Institute, University of Gazira. For the beam output measurements of the 60Co radiotherapy machine, dosimetric measurements were performed in accordance with the relevant IAEA dosimetry protocols TRS-277 and TRS-398. The kinetic energy released per unit mass in air (air kerma) was obtained by multiplying the corrected electrometer reading (nC/min) by the calibration factor (Gy/nC) of the chamber given in the calibration certificate. The uncertainty of the air kerma measurements (combined uncertainty) was calculated for all ionization chambers; the calibration factors of these ionization chambers were then calculated by comparing the air kerma readings of the secondary standard ionization chamber to those of the radical and Farmer chambers. The calibration of the working standard ionization chambers yielded calibration factors ranging from 0.99 to 1.52 for the different radiation energies; these differences were due to chamber response and specification. The absorbed dose to water was calculated for the therapy ionization chambers using the two codes of practice, TRS-277 and TRS-398, as the beam output of the 60Co radiotherapy machine, and it can be used as a reference for future beam output calibration in radiotherapy dosimetry. The measurement of absorbed dose to water showed that the
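
    The air-kerma evaluation described above is, in essence, a single multiplication, and the working-standard calibration factor follows from a comparison with the reference chamber; in standard dosimetry notation (not quoted from the report):

        K_{\mathrm{air}} = N_K \cdot M_{\mathrm{corr}}, \qquad N_K^{\mathrm{working}} = \frac{K_{\mathrm{air}}^{\mathrm{reference}}}{M_{\mathrm{corr}}^{\mathrm{working}}}

    where M_corr is the corrected electrometer reading (here in nC/min) and N_K is the air-kerma calibration factor (Gy/nC), so the kerma rate is obtained in Gy/min.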

  5. Standards update -- 1995

    Energy Technology Data Exchange (ETDEWEB)

    Mason, J.D.

    1995-12-31

    What a year this has been! Not since 1986, when SGML was being finished, has there been so much activity in the SGML world. In ISO, there are new standards being completed and old ones (some of which are not really all that old) being revised. As you'll be hearing, there is lots of SGML activity in the applications world--particularly on the Internet--and that's causing other kinds of standards activity. WG8 divides its work into five "Rapporteur Groups" (or "RGs") for DSSSL, Font Description and Interchange, SGML, SPDL, and Hypermedia Languages. Since interest here is in DSSSL, SGML, and Hypermedia Languages, the author only mentions that the other groups have been active, too. The Fonts group has been doing amendments to its standards, ISO/IEC 9541 and ISO/IEC 10036. The Fonts group has been active in providing support for ISO/IEC 10646, the massive character coding standard that has drawn a lot of attention in the SGML world. The SPDL group has at long last finished its standard, the Standard Page Description Language (ISO/IEC 10180), and is about to publish it. More detailed discussions are given for activity in SGML, DSSSL, and Hypermedia Languages.

  6. Contrast of Korean industrial standard and overseas standard

    International Nuclear Information System (INIS)

    Kim, Jong Deuk

    2006-12-01

    This book introduces Korean Industrial Standards and overseas standards dealing with furniture, plasticizers, vulcanizing agents, gaskets, and steel pipes and tubes. It covers wooden furniture for offices, washing dressers, children's cribs, chairs and desks for students, chairs and desks made from synthetic resins, tricresyl phosphate, dibutyl phthalate, dioctyl phthalate, phtalic acid dieptil, V packing, vulcanization accelerator CBS(CZ), vulcanization accelerator MBT(M), vulcanization accelerator Zn BDC, steel pipe for heating furnaces and carbon steel pipe for high-pressure piping.

  7. Sensitivity of monthly streamflow forecasts to the quality of rainfall forcing: When do dynamical climate forecasts outperform the Ensemble Streamflow Prediction (ESP) method?

    Science.gov (United States)

    Tanguy, M.; Prudhomme, C.; Harrigan, S.; Smith, K. A.; Parry, S.

    2017-12-01

    Forecasting hydrological extremes is challenging, especially at lead times over 1 month for catchments with limited hydrological memory and variable climates. One simple way to derive monthly or seasonal hydrological forecasts is to use historical climate data to drive hydrological models using the Ensemble Streamflow Prediction (ESP) method. This gives a range of possible future streamflow given known initial hydrologic conditions alone. The degree of skill of ESP depends highly on the forecast initialisation month and catchment type. Using dynamic rainfall forecasts as driving data instead of historical data could potentially improve streamflow predictions. A lot of effort is being invested within the meteorological community to improve these forecasts. However, while recent progress shows promise (e.g. NAO in winter), the skill of these forecasts at monthly to seasonal timescales is generally still limited, and the extent to which they might lead to improved hydrological forecasts is an area of active research. Additionally, these meteorological forecasts are currently being produced at 1 month or seasonal time-steps in the UK, whereas hydrological models require forcings at daily or sub-daily time-steps. Keeping in mind these limitations of available rainfall forecasts, the objectives of this study are to find out (i) how accurate monthly dynamical rainfall forecasts need to be to outperform ESP, and (ii) how the method used to disaggregate monthly rainfall forecasts into daily rainfall time series affects results. For the first objective, synthetic rainfall time series were created by increasingly degrading observed data (proxy for a 'perfect forecast') from 0 % to ±50 % error. For the second objective, three different methods were used to disaggregate monthly rainfall data into daily time series. These were used to force a simple lumped hydrological model (GR4J) to generate streamflow predictions at a one-month lead time for over 300 catchments
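
    A minimal sketch of the experimental design described for objective (i): observed monthly rainfall is perturbed by a chosen error level to mimic an imperfect forecast and then disaggregated back to daily values, here simply in proportion to the observed daily pattern, before driving a hydrological model. The uniform error model, function names and synthetic data are illustrative assumptions, not the study's exact method.

        # Hedged sketch: degrade observed monthly rainfall by +/- err and disaggregate to daily values.
        import numpy as np

        def degrade_monthly(monthly_obs, error_frac, rng):
            # Perturb each monthly total by a random error within +/- error_frac.
            noise = rng.uniform(-error_frac, error_frac, size=monthly_obs.shape)
            return np.maximum(monthly_obs * (1.0 + noise), 0.0)

        def disaggregate(daily_obs, month_index, monthly_forecast):
            # Scale observed daily rainfall so each month sums to its forecast total.
            daily = daily_obs.copy()
            for m in np.unique(month_index):
                mask = month_index == m
                total = daily_obs[mask].sum()
                if total > 0:
                    daily[mask] *= monthly_forecast[m] / total
            return daily

        rng = np.random.default_rng(42)
        daily_obs = rng.gamma(shape=0.8, scale=4.0, size=365)        # synthetic daily rainfall (mm)
        month_index = np.repeat(np.arange(12), [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
        monthly_obs = np.array([daily_obs[month_index == m].sum() for m in range(12)])

        for err in (0.0, 0.1, 0.3, 0.5):                             # 0 % to +/-50 % error
            forcing = disaggregate(daily_obs, month_index, degrade_monthly(monthly_obs, err, rng))
            # `forcing` would then drive a lumped model such as GR4J at a daily time-step.
            print(f"error {err:.0%}: annual total {forcing.sum():.0f} mm")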

  8. Physical standards and valid calibration

    International Nuclear Information System (INIS)

    Smith, D.B.

    1975-01-01

    The desire for improved nuclear material safeguards has led to the development and use of a number of techniques and instruments for the nondestructive assay (NDA) of special nuclear material. Sources of potential bias in NDA measurements are discussed and methods of eliminating the effects of bias in assay results are suggested. Examples are given of instruments in which these methods have been successfully applied. The results of careful attention to potential sources of assay bias are a significant reduction in the number and complexity of standards required for valid instrument calibration and more credible assay results. (auth)

  9. Heat flow of standard depth

    International Nuclear Information System (INIS)

    Cull, J.P.

    1981-01-01

    Secular and long-term periodic changes in surface temperature cause perturbations to the geothermal gradient which may be significant to depths of at least 1000 m, and major corrections are required to determine absolute values of heat flow from the Earth's interior. However, detailed climatic models remain contentious and estimates of error in geothermal gradients differ widely. Consequently, regions of anomalous heat flow which could contain geothermal resources may be more easily resolved by measuring relative values at a standard depth (e.g. 100 m) so that all data are subject to similar corrections. (orig./ME)

  10. 77 FR 43196 - Minimum Internal Control Standards and Technical Standards

    Science.gov (United States)

    2012-07-24

    ... NATIONAL INDIAN GAMING COMMISSION 25 CFR Parts 543 and 547 Minimum Internal Control Standards [email protected] . SUPPLEMENTARY INFORMATION: Part 543 addresses minimum internal control standards (MICS) for Class II gaming operations. The regulations require tribes to establish controls and implement...

  11. Submillisievert standard-pitch CT pulmonary angiography with ultra-low dose contrast media administration: A comparison to standard CT imaging.

    Science.gov (United States)

    Suntharalingam, Saravanabavaan; Mikat, Christian; Stenzel, Elena; Erfanian, Youssef; Wetter, Axel; Schlosser, Thomas; Forsting, Michael; Nassenstein, Kai

    2017-01-01

    To evaluate the image quality and radiation dose of submillisievert standard-pitch CT pulmonary angiography (CTPA) with ultra-low dose contrast media administration in comparison to standard CTPA. One hundred patients (56 females, 44 males, mean age 69.6±15.4 years; median BMI: 26.6, IQR: 5.9) with suspected pulmonary embolism were examined with two different protocols (n = 50 each, group A: 80 kVp, ref. mAs 115, 25 ml of contrast medium; group B: 100 kVp, ref. mAs 150, 60 ml of contrast medium) using a dual-source CT equipped with automated exposure control. Objective and subjective image quality, radiation exposure, and the frequency of pulmonary embolism were evaluated. There was no significant difference in subjective image quality scores between the two groups regarding the pulmonary arteries (p = 0.776), and the interobserver agreement was excellent (group A: k = 0.9; group B: k = 1.0). Objective image analysis revealed that signal intensities (SI), signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) of the pulmonary arteries were equal or significantly higher in group B. There was no significant difference in the frequency of pulmonary embolism (p = 0.65). Using the low-dose and low-contrast-media protocol resulted in a radiation dose reduction of 71.8% (2.4 vs. 0.7 mSv). A protocol with reduced radiation dose and contrast agent volume can obtain sufficient image quality to exclude or diagnose pulmonary emboli while reducing radiation dose by approximately 71%.
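
    The objective image-quality metrics referred to above are conventionally defined as follows (standard definitions, not reproduced from the paper):

        \mathrm{SNR} = \frac{\mathrm{SI}_{\text{vessel}}}{\sigma_{\text{noise}}}, \qquad
        \mathrm{CNR} = \frac{\mathrm{SI}_{\text{vessel}} - \mathrm{SI}_{\text{background}}}{\sigma_{\text{noise}}}

    where SI is the mean attenuation in a region of interest and σ_noise is the standard deviation of attenuation in a homogeneous reference region.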

  12. Calibration of Flick standards

    International Nuclear Information System (INIS)

    Thalmann, Ruedi; Spiller, Jürg; Küng, Alain; Jusko, Otto

    2012-01-01

    Flick standards or magnification standards are widely used for an efficient and functional calibration of the sensitivity of form measuring instruments. The results of a recent measurement comparison were partially unsatisfactory and revealed problems related to the calibration of these standards. In this paper the influence factors for the calibration of Flick standards using roundness measurement instruments are discussed in detail, in particular the bandwidth of the measurement chain, residual form errors of the device under test, profile distortions due to the diameter of the probing element and questions related to the definition of the measurand. The different contributions are estimated using simulations and are experimentally verified. Also alternative methods to calibrate Flick standards are investigated. Finally the practical limitations of Flick standard calibration are shown and the usability of Flick standards both to calibrate the sensitivity of roundness instruments and to check the filter function of such instruments is analysed. (paper)

  13. [Roaming through methodology. XXXVIII. Common misconceptions involving standard deviation and standard error]

    NARCIS (Netherlands)

    Mokkink, H.G.A.

    2002-01-01

    Standard deviation and standard error have a clear mutual relationship, but at the same time they differ strongly in the type of information they supply. This can lead to confusion and misunderstandings. Standard deviation describes the variability in a sample of measures of a variable, for instance

  14. An integrated methodology to develop a standard for landslide early warning systems

    OpenAIRE

    Fathani, Teuku Faisal; Karnawati, Dwikorita; Wilopo, Wahyu

    2016-01-01

    Landslides are one of the most widespread and commonly occurring natural hazards. In regions of high vulnerability, these complex hazards can cause significant negative social and economic impacts. Considering the worldwide susceptibility to landslides, it is necessary to establish a standard for early warning systems specific to landslide disaster risk reduction. This standard would provide guidance in conducting landslide detection, prediction, interpretation, and response...

  15. An astrometric standard field in omega Cen

    Science.gov (United States)

    Wyse, Rosemary

    2003-07-01

    We propose to obtain a high-precision astrometric standard in a two-step procedure. First, we will create a ground-based astrometric standard field around omega Cen down to V=22 with a 3 mas accuracy in positions and better than 0.5 mas/yr in proper motions. This standard will be used to obtain precise absolute plate solutions for selected WFPC2 CCD frames and refine the self-calibrated mean distortion solution for the WFPC2 CCD chips. This will eliminate systematic errors inherent in the self-calibration techniques down to the rms=0.3 mas level, thus opening new opportunities to perform precision astrometry with WFPC2 alone or in combination with the other HST imaging instruments. We will also address the issue of the distortion's variation, which is of paramount significance for space astrometry such as that spearheaded by the HST or under development (SIM, GAIA). Second, all reduced WFPC2 CCD frames will be combined into two field catalogs (astrometric flat fields) of positions in omega Cen of unprecedented precision (s.e. = 0.1 mas) down to V=22 and will be available to the GO community and readily applicable to calibrating the ACS.

  16. [18F]FDG PET/CT outperforms [18F]FDG PET/MRI in differentiated thyroid cancer

    Energy Technology Data Exchange (ETDEWEB)

    Vrachimis, Alexis; Wenning, Christian; Weckesser, Matthias; Stegger, Lars [University Hospital Muenster, Department of Nuclear Medicine, Muenster (Germany); Burg, Matthias Christian; Allkemper, Thomas [University Hospital Muenster, Department of Clinical Radiology, Muenster (Germany); Schaefers, Michael [University Hospital Muenster, Department of Nuclear Medicine, Muenster (Germany); Westfaelische Wilhelms University Muenster, European Institute for Molecular Imaging, Muenster (Germany)

    2016-02-15

    To evaluate the diagnostic potential of PET/MRI with [18F]FDG in comparison to PET/CT in patients with differentiated thyroid cancer suspected or known to have dedifferentiated. The study included 31 thyroidectomized and remnant-ablated patients who underwent a scheduled [18F]FDG PET/CT scan and were then enrolled for a PET/MRI scan of the neck and thorax. The datasets (PET/CT, PET/MRI) were rated regarding lesion count, conspicuity, diameter and characterization. Standardized uptake values were determined for all [18F]FDG-positive lesions. Histology, cytology, and examinations before and after treatment served as the standards of reference. Of 26 patients with a dedifferentiated tumour burden, 25 were correctly identified by both [18F]FDG PET/CT and PET/MRI. Detection rates by PET/CT and PET/MRI were 97 % (113 of 116 lesions) and 85 % (99 of 116 lesions) for malignant lesions, and 100 % (48 of 48 lesions) and 77 % (37 of 48 lesions) for benign lesions, respectively. Lesion conspicuity was higher on PET/CT for both malignant and benign pulmonary lesions and in the overall rating for malignant lesions (p < 0.001). There was a difference between PET/CT and PET/MRI in overall evaluation of malignant lesions (p < 0.01) and detection of pulmonary metastases (p < 0.001). Surgical evaluation revealed three malignant lesions missed by both modalities. PET/MRI additionally failed to detect 14 pulmonary metastases and 11 benign lesions. In patients with thyroid cancer and suspected or known dedifferentiation, [18F]FDG PET/MRI was inferior to low-dose [18F]FDG PET/CT for the assessment of pulmonary status. However, for the assessment of cervical status, [18F]FDG PET/MRI was equal to contrast-enhanced neck [18F]FDG PET/CT. Therefore, [18F]FDG PET/MRI combined with a low-dose CT scan of the thorax may provide an imaging solution when high-quality imaging is needed and high-energy CT is undesirable or the use of a contrast

  17. Present status of standards relating to radiation control and protection

    International Nuclear Information System (INIS)

    Minami, Kentaro

    1996-01-01

    Japanese and international standards related to radiation control and radiation protection management are presented, focusing on how they were formed, their significance, their current status, and their mutual relationship. Japanese Industrial Standards (JIS) are quite useful in the field of atomic energy, as in other fields, for optimization and rationalization of management. JIS includes JIS Z 4001, Atomic Energy Terminology, which corresponds to the international standard ISO 921, Nuclear Glossary, and JIS Z 4005, Medical Radiation Terminology, covering about 500 terms, which corresponds to IEC 788, Medical Radiology - Terminology. The first standard regarding radiation protection, for the X-ray film badge in the field of personal dosimeters, was established in 1956. Currently, 36 JIS standards have been established in the field of radiation management dosimeters and 3 are in preparation. For radiation protective supplies, 9 JIS standards have been established so far. Before a JIS is proposed, investigations are conducted to improve, simplify, and standardize the standards for radiation dosimetric techniques, dosimeters, and dosimetric procedures. In this article, the results of material surface contamination monitoring and body surface monitoring conducted by the Atomic Energy Safety Association and the Radiation Dosimetry Association are reported, and ISO and IEC standards are also treated. (S.Y.)

  18. The impact of transport processes standardization on supply chain efficiency

    Directory of Open Access Journals (Sweden)

    Maciej Stajniak

    2016-03-01

    Full Text Available Background: Under continuous market competition, with attention focused on customer service level, lead times and supply flexibility, it is very important to analyse the efficiency of logistics processes. Analysis of supply chain efficiency is one of the fundamental elements of controlling analysis. Transport processes are a key process that provides the physical material flow through the supply chain. Therefore, in this article the authors focus on the efficiency of transport processes. Methods: The research was carried out in the second half of 2014 in 210 enterprises of the Wielkopolska Region. Observations and business practice studies conducted by the authors demonstrate a significant impact of process standardization on supply chain efficiency. Based on the research results, standard processes were developed that were assessed as necessary to standardize in business practice. Results: Based on these research results and observations, the authors developed standards for transport processes using BPMN notation. BPMN allows the authors to conduct multivariate simulation of these processes in further stages of the research. Conclusions: The developed standards are the initial stage of the research conducted by the authors on the assessment of transport process efficiency. A further research direction is to analyse how efficiently the transport process standards are used in business practice and their impact on the effectiveness of the entire supply chain.

  19. Development of a standardized differential-reflective bioassay for microbial pathogens

    Science.gov (United States)

    Wilhelm, Jay; Auld, J. R. X.; Smith, James E.

    2008-04-01

    This research examines standardizing a method for the rapid/semi-automated identification of microbial contaminants. It introduces a method suited to testing for food/water contamination, serology, urinalysis and saliva testing for any >1 micron sized molecule that can be effectively bound to an identifying marker with exclusivity. This optical biosensor method seeks to integrate the semi-manual distribution of a collected sample onto a "transparent" substrate array of binding sites that will then be applied to a standard optical data disk and run for analysis. The detection of most microbe species is possible on this platform because their relative scale is greater than the resolution of the standard-scale digital information on a standard CD or DVD. This paper explains the critical first stage in the advance of this detection concept. This work has concentrated on developing the necessary software component needed to perform highly sensitive small-scale recognition using the standard optical disk as a detection platform. Physical testing has made significant progress in demonstrating the ability to utilize a standard optical drive for the purposes of micro-scale detection through the exploitation of CIRC error correction. Testing has also shown a definable trend in the optimum scale and geometry of micro-arrayed attachment sites for the technology concept to be realized.

  20. A Generalizability Theory Approach to Standard Error Estimates for Bookmark Standard Settings

    Science.gov (United States)

    Lee, Guemin; Lewis, Daniel M.

    2008-01-01

    The bookmark standard-setting procedure is an item response theory-based method that is widely implemented in state testing programs. This study estimates standard errors for cut scores resulting from bookmark standard settings under a generalizability theory model and investigates the effects of different universes of generalization and error…

  1. 3G Standards

    DEFF Research Database (Denmark)

    Saugstrup, Dan; Henten, Anders

    2006-01-01

    Purpose – The main purpose of this paper is to analyze which standard/technology will win the 3G mobile markets. In addition, two sub topics are examined. First, which kind of victory will it be – will one technological solution be all-dominating or is co-existence more likely? Second, which....... Originality/value – The paper is based on the understanding that a vast array of different factors in a complex dynamic environment goes into the determination of the outcome of such standardization games. However, the battle between 3G standards has already reached a level, where relatively certain...... predictions can be made. And, the paper contributes with a methodologically based discussion concerning the outcome of the battle between 3G standards....

  2. International standards for monoclonal antibodies to support pre- and post-marketing product consistency: Evaluation of a candidate international standard for the bioactivities of rituximab.

    Science.gov (United States)

    Prior, Sandra; Hufton, Simon E; Fox, Bernard; Dougall, Thomas; Rigsby, Peter; Bristow, Adrian

    2018-01-01

    The intrinsic complexity and heterogeneity of therapeutic monoclonal antibodies is built into the biosimilarity paradigm where critical quality attributes are controlled in exhaustive comparability studies with the reference medicinal product. The long-term success of biosimilars will depend on reassuring healthcare professionals and patients of consistent product quality, safety and efficacy. With this aim, the World Health Organization has endorsed the need for public bioactivity standards for therapeutic monoclonal antibodies in support of current controls. We have developed a candidate international potency standard for rituximab that was evaluated in a multi-center collaborative study using participants' own qualified Fc-effector function and cell-based binding bioassays. Dose-response curve model parameters were shown to reflect similar behavior amongst rituximab preparations, albeit with some differences in potency. In the absence of a common reference standard, potency estimates were in poor agreement amongst laboratories, but the use of the candidate preparation significantly reduced this variability. Our results suggest that the candidate rituximab standard can support bioassay performance and improve data harmonization, which when implemented will promote consistency of rituximab products over their life-cycles. This data provides the first scientific evidence that a classical standardization exercise allowing traceability of bioassay data to an international standard is also applicable to rituximab. However, we submit that this new type of international standard needs to be used appropriately and its role not to be mistaken with that of the reference medicinal product.
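
    The harmonization effect described above comes from reporting each laboratory's result as a potency relative to the common standard rather than as an absolute value. A minimal sketch of such a relative-potency calculation is given below, assuming a four-parameter logistic dose-response model; the function names and the data are illustrative and are not taken from the collaborative study.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(dose, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** hill)

def relative_potency(dose, resp_test, resp_ref):
    """Potency of a test sample relative to the reference standard, estimated
    here simply as the ratio of fitted EC50 values (parallel curves assumed)."""
    p0 = [resp_ref.min(), resp_ref.max(), np.median(dose), 1.0]
    ref_fit, _ = curve_fit(four_pl, dose, resp_ref, p0=p0, maxfev=10000)
    test_fit, _ = curve_fit(four_pl, dose, resp_test, p0=p0, maxfev=10000)
    return ref_fit[2] / test_fit[2]   # lower EC50 of the test sample => potency > 1

# Illustrative data only (not from the study): the test sample needs a slightly
# higher dose than the reference to reach the same response.
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp_ref = four_pl(dose, 5.0, 100.0, 2.0, 1.2)
resp_test = four_pl(dose, 5.0, 100.0, 2.5, 1.2)
print(round(relative_potency(dose, resp_test, resp_ref), 2))   # ~0.8
```

    Reporting the ratio rather than a raw readout is what allows laboratories running different qualified bioassays to be compared on a common scale.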

  3. Defining Indicators and Standards for Tourism Impacts in Protected Areas: Cape Range National Park, Australia

    Science.gov (United States)

    Moore, Susan A.; Polley, Amanda

    2007-03-01

    Visitors’ perceptions of impacts and acceptable standards for environmental conditions can provide essential information for the sustainable management of tourist destinations, especially protected areas. To this end, visitor surveys were administered during the peak visitor season in Cape Range National Park, on the northwest coast of Western Australia and adjacent to the iconic Ningaloo Reef. The central focus was visitors’ perceptions regarding environmental conditions and standards for potential indicators. Conditions considered of greatest importance in determining visitors’ quality of experience included litter, inadequate disposal of human waste, presence of wildlife, levels of noise, and access to beach and ocean. Standards were determined, based on visitors’ perceptions, for a range of site-specific and non-site-specific indicators, with standards for facilities (e.g., acceptable number of parking bays, signs) and for negative environmental impacts (e.g., levels of littering, erosion) sought. The proposed standards varied significantly between sites for the facilities indicators; however, there was no significant difference between sites for environmental impacts. For the facilities, the standards proposed by visitors were closely related to the existing situation, suggesting that they were satisfied with the status quo. These results are considered in the context of current research interest in the efficacy of visitor-derived standards as a basis for protected area management.

  4. Defining indicators and standards for tourism impacts in protected areas: Cape Range National Park, Australia.

    Science.gov (United States)

    Moore, Susan A; Polley, Amanda

    2007-03-01

    Visitors' perceptions of impacts and acceptable standards for environmental conditions can provide essential information for the sustainable management of tourist destinations, especially protected areas. To this end, visitor surveys were administered during the peak visitor season in Cape Range National Park, on the northwest coast of Western Australia and adjacent to the iconic Ningaloo Reef. The central focus was visitors' perceptions regarding environmental conditions and standards for potential indicators. Conditions considered of greatest importance in determining visitors' quality of experience included litter, inadequate disposal of human waste, presence of wildlife, levels of noise, and access to beach and ocean. Standards were determined, based on visitors' perceptions, for a range of site-specific and non-site-specific indicators, with standards for facilities (e.g., acceptable number of parking bays, signs) and for negative environmental impacts (e.g., levels of littering, erosion) sought. The proposed standards varied significantly between sites for the facilities indicators; however, there was no significant difference between sites for environmental impacts. For the facilities, the standards proposed by visitors were closely related to the existing situation, suggesting that they were satisfied with the status quo. These results are considered in the context of current research interest in the efficacy of visitor-derived standards as a basis for protected area management.

  5. IAEA Safety Standards

    International Nuclear Information System (INIS)

    2016-09-01

    The IAEA Safety Standards Series comprises publications of a regulatory nature covering nuclear safety, radiation protection, radioactive waste management, the transport of radioactive material, the safety of nuclear fuel cycle facilities and management systems. These publications are issued under the terms of Article III of the IAEA’s Statute, which authorizes the IAEA to establish “standards of safety for protection of health and minimization of danger to life and property”. Safety standards are categorized into: • Safety Fundamentals, stating the basic objective, concepts and principles of safety; • Safety Requirements, establishing the requirements that must be fulfilled to ensure safety; and • Safety Guides, recommending measures for complying with these requirements for safety. For numbering purposes, the IAEA Safety Standards Series is subdivided into General Safety Requirements and General Safety Guides (GSR and GSG), which are applicable to all types of facilities and activities, and Specific Safety Requirements and Specific Safety Guides (SSR and SSG), which are for application in particular thematic areas. This booklet lists all current IAEA Safety Standards, including those forthcoming

  6. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  7. The illusory nature of standards

    DEFF Research Database (Denmark)

    Linneberg, Mai Skjøtt

    2011-01-01

    Purpose – The purpose of this paper is to investigate the implications of the paradoxical situation in which standard setters are placed when standardising human practice. Contrary to standards, human practices are ambiguous, heterogeneous, and highly context dependent; in contrast, standards...... creation is innate in the practice of standardisation and therefore the risk of creating untrustworthy standards is prevalent for standard setters. Originality/value – The paper provides a new understanding of standards and demonstrates the need to research standardization processes in depth and bring...

  8. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested including software as well as the hardware architectural features

  9. Alternate Assessments for Students with Significant Cognitive Disabilities: Participation Guidelines and Definitions. NCEO Report 406

    Science.gov (United States)

    Thurlow, Martha L.; Lazarus, Sheryl S.; Larson, Erik D.; Albus, Deb A.; Liu, Kristi K.; Kwong, Elena

    2017-01-01

    With the reauthorization of the Elementary and Secondary Education Act (ESEA) in 2015, renewed attention was paid to the importance of guidelines for participation in alternate assessments based on alternate achievement standards (AA-AAS) and to understanding who the students with significant cognitive disabilities are. The analyses…

  10. Proportions of maxillary anterior teeth relative to each other and to golden standard in tabriz dental faculty students.

    Science.gov (United States)

    Parnia, Fereydoun; Hafezeqoran, Ali; Mahboub, Farhang; Moslehifard, Elnaz; Koodaryan, Rodabeh; Moteyagheni, Rosa; Saleh Saber, Fariba

    2010-01-01

    Various methods are used to measure the size and form of the teeth, including the golden proportion, and the width-to-length ratio of central teeth, referred to as the golden standard. The aim of this study was to evaluate the occurrence of golden standard values and golden proportion in the anterior teeth. Photographs of 100 dentistry students (50 males and 50 females) were taken under standard conditions. The visible widths and lengths of maxillary right and left incisors were calculated and the ratios were compared with golden standard. Data were analyzed using SPSS 14 software. Review of the results of the means showed statistically significant differences between the width ratio of right lateral teeth to the central teeth width with golden proportion (Pmean differences showed that the mean difference between proportion of right laterals to centrals with golden proportion was significant (Pgolden proportion among maxillary incisors. The review of results of mean differences for single samples showed that the mean differences between the proportion of width-to-length of left and right central teeth was statistically significant by golden standard (Pgolden standard exists. In the evaluation of the width-to-width and width-to-length proportions of maxillary incisors no golden proportions and standards were detected, respectively.

  11. Stable aesthetic standards delusion: changing 'artistic quality' by elaboration.

    Science.gov (United States)

    Carbon, Claus-Christian; Hesslinger, Vera M

    2014-01-01

    The present study challenges the notion that judgments of artistic quality are based on stable aesthetic standards. We propose that such standards are a delusion and that judgments of artistic quality are the combined result of exposure, elaboration, and discourse. We ran two experiments using elaboration tasks based on the repeated evaluation technique in which different versions of the Mona Lisa had to be elaborated deeply. During the initial task either the version known from the Louvre or an alternative version owned by the Prado was elaborated; during the second task both versions were elaborated in a comparative fashion. After both tasks multiple blends of the two versions had to be evaluated concerning several aesthetic key variables. Judgments of artistic quality of the blends were significantly different depending on the initially elaborated version of the Mona Lisa, indicating experience-based aesthetic processing, which contradicts the notion of stable aesthetic standards.

  12. Effect of standards on new equipment design by new international standards and industry restraints

    Science.gov (United States)

    Endelman, Lincoln L.

    1991-01-01

    The use of international standards to further trade is one of the objectives of creating a standard. By making form, fit and function compatible, the free interchange of manufactured goods can be handled without hindrance. Unfortunately, by setting up standards that are peculiar to a particular country or district it is possible to exclude competition from a group of manufacturers. A major effort is now underway to develop international laser standards. In the May 1990 issue of Laser Focus World, Donald R. Johnson, the director of industrial technology services for the National Institute of Standards and Technology (NIST, formerly the National Bureau of Standards), is quoted as follows: "The common means of protectionism has been through certification for the market place." The article goes on to say "Mr. Johnson expects this tradition to continue and that the new European Community (EC) will demand not just safety standards but performance standards as well. . . . the American laser industry must move very quickly on this issue or risk being left behind the European standards bandwagon." The article continues that laser companies must get involved in the actual standards negotiating process if they are to have a say in future policy: "A single set of standards would reduce the need to repeatedly recalibrate products for different national markets." As a member of ISO TC-72 SC9 I am

  13. 77 FR 37587 - Updating OSHA Standards Based on National Consensus Standards; Head Protection

    Science.gov (United States)

    2012-06-22

    ... Z89.1-2003 as Appendix E, to the main text. Adds "ASTM E1164-02 Colorimetry--Standard Practice for... National complete citations for standards on Standards Referred to in This colorimetry, headforms, and...

  14. 77 FR 37617 - Updating OSHA Standards Based on National Consensus Standards; Head Protection

    Science.gov (United States)

    2012-06-22

    ... Z89.1-2003 as Appendix E, to the main text. Adds "ASTM E1164-02 Colorimetry--Standard Practice for... National complete citations for standards on Standards Referred to in This colorimetry, headforms, and...

  15. The Distance Standard Deviation

    OpenAIRE

    Edelmann, Dominic; Richards, Donald; Vogel, Daniel

    2017-01-01

    The distance standard deviation, which arises in distance correlation analysis of multivariate data, is studied as a measure of spread. New representations for the distance standard deviation are obtained in terms of Gini's mean difference and in terms of the moments of spacings of order statistics. Inequalities for the distance variance are derived, proving that the distance standard deviation is bounded above by the classical standard deviation and by Gini's mean difference. Further, it is ...
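
    The inequalities summarized above are easy to check numerically. The sketch below uses the usual sample (V-statistic) definitions: pairwise absolute differences, double-centred for the distance variance, and averaged directly for Gini's mean difference. It is an illustration only, not the exact estimators studied in the paper.

```python
import numpy as np

def distance_std(x):
    """Sample distance standard deviation of a univariate sample
    (V-statistic form: divide by n^2 and keep the zero diagonal)."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])                          # pairwise distances
    a = d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()  # double centring
    return np.sqrt((a ** 2).mean())

def gini_mean_difference(x):
    """Mean absolute difference over all ordered pairs (zero diagonal included)."""
    x = np.asarray(x, dtype=float)
    return np.abs(x[:, None] - x[None, :]).mean()

rng = np.random.default_rng(0)
x = rng.normal(size=500)
print(distance_std(x), x.std(), gini_mean_difference(x))
# Per the bounds quoted above, the first value should not exceed the other two.
```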

  16. The standardization debate: A conflation trap in critical care electroencephalography.

    Science.gov (United States)

    Ng, Marcus C; Gaspard, Nicolas; Cole, Andrew J; Hoch, Daniel B; Cash, Sydney S; Bianchi, Matt; O'Rourke, Deirdre A; Rosenthal, Eric S; Chu, Catherine J; Westover, M Brandon

    2015-01-01

    Persistent uncertainty over the clinical significance of various pathological continuous electroencephalography (cEEG) findings in the intensive care unit (ICU) has prompted efforts to standardize ICU cEEG terminology and an ensuing debate. We set out to understand the reasons for, and a satisfactory resolution to, this debate. We review the positions for and against standardization, and examine their deeper philosophical basis. We find that the positions for and against standardization are not fundamentally irreconcilable. Rather, both positions stem from conflating the three cardinal steps in the classic approach to EEG, which we term "description", "interpretation", and "prescription". Using real-world examples we show how this conflation yields muddled clinical reasoning and unproductive debate among electroencephalographers that is translated into confusion among treating clinicians. We propose a middle way that judiciously uses both standardized terminology and clinical reasoning to disentangle these critical steps and apply them in proper sequence. The systematic approach to ICU cEEG findings presented herein not only resolves the standardization debate but also clarifies clinical reasoning by helping electroencephalographers assign appropriate weights to cEEG findings in the face of uncertainty. Copyright © 2014 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  17. Consumers’ Perception on Standardized Advertizing and Localized Advertising of Multinational Companies in Smartphone Industry

    Directory of Open Access Journals (Sweden)

    Ran Liu

    2014-10-01

    Full Text Available This research analyzed the difference between standardized advertising and localized (adapted) advertising based on the perception of consumers from China and the U.S. Both qualitative and quantitative analyses were used to evaluate different marketing approaches in multiple international markets. The emphasis of this study is on evaluating the efficiency of advertising by assessing to what extent a standardized or a localized commercial enhances brand preference and consumers' likelihood to purchase. Quantitative analysis was conducted to identify the significance of the difference between the efficiency of standardized advertising and localized advertising in the smartphone industry, based on the perception of consumers from China and the United States. By testing the significance of the hypotheses on ad standardization and localization, some implications are suggested. The results show that it is more effective to implement a standardized ad rather than a localized ad in China. Although the sample data of this study were collected from China and the U.S., the qualitative analysis covers multiple nations from Asia to Europe and has meaningful empirical value for MNCs developing business in those countries.

  18. Two paths from lab to market: Product and standard

    Science.gov (United States)

    Knapp, Robert H.

    2018-01-01

    To shed light on the movement of sustainable technologies from basic science to widespread use, this chapter describes key aspects of the quite different paths followed by two important examples—photovoltaics (a product) and passive-house buildings (a standard). Discussion of photovoltaics includes the experience curve concept, the increasing significance of balance-of-system costs, and the importance of market heterogeneity (niches and sub-national markets) to the long-term trajectory of major cost reductions. Discussion of passive-houses highlights the array of technical developments needed for present-day energy efficient houses, and the relevance of "stretch" standards to the development of a market for very high-performance houses.
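
    For readers unfamiliar with the experience-curve concept invoked here for photovoltaics, the generic textbook form relates unit cost to cumulative production; the notation below is that standard form, not figures specific to the module-price data discussed in the chapter.

```latex
% Unit cost after cumulative production Q, relative to the cost C_0
% observed at a reference cumulative production Q_0:
C(Q) = C_0 \left( \frac{Q}{Q_0} \right)^{-b},
\qquad \text{learning rate} = 1 - 2^{-b}
```

    Each doubling of cumulative production lowers unit cost by the learning rate, which is why overall market size, niches and sub-national markets matter for the long-term cost trajectory.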

  19. Department of Energy Standards Index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    This TSL, intended for use in selecting and using DOE technical standards and other Government and non-Government standards, provides a listing of current and inactive DOE technical standards, non-Government standards adopted by DOE, other Government documents in which DOE has a recorded interest, and cancelled DOE technical standards.

  20. Standard Model Higgs Boson with the L3 Experiment at LEP

    CERN Document Server

    Achard, P.; Aguilar-Benitez, M.; Alcaraz, J.; Alemanni, G.; Allaby, J.; Aloisio, A.; Alviggi, M.G.; Anderhub, H.; Andreev, Valery P.; Anselmo, F.; Arefev, A.; Azemoon, T.; Aziz, T.; Baarmand, M.; Bagnaia, P.; Bajo, A.; Baksay, G.; Baksay, L.; Baldew, S.V.; Banerjee, S.; Banerjee, Sw.; Barczyk, A.; Barillere, R.; Bartalini, P.; Basile, M.; Batalova, N.; Battiston, R.; Bay, A.; Becattini, F.; Becker, U.; Behner, F.; Bellucci, L.; Berbeco, R.; Berdugo, J.; Berges, P.; Bertucci, B.; Betev, B.L.; Biasini, M.; Biglietti, M.; Biland, A.; Blaising, J.J.; Blyth, S.C.; Bobbink, G.J.; Bohm, A.; Boldizsar, L.; Borgia, B.; Bourilkov, D.; Bourquin, M.; Braccini, S.; Branson, J.G.; Brochu, F.; Buijs, A.; Burger, J.D.; Burger, W.J.; Cai, X.D.; Capell, M.; Cara Romeo, G.; Carlino, G.; Cartacci, A.; Casaus, J.; Cavallari, F.; Cavallo, N.; Cecchi, C.; Cerrada, M.; Chamizo, M.; Chang, Y.H.; Chemarin, M.; Chen, A.; Chen, G.; Chen, G.M.; Chen, H.F.; Chen, H.S.; Chiefari, G.; Cifarelli, L.; Cindolo, F.; Clare, I.; Clare, R.; Coignet, G.; Colino, N.; Costantini, S.; De la Cruz, B.; Cucciarelli, S.; Dai, T.S.; Van Dalen, J.A.; De Asmundis, R.; Deglon, P.; Debreczeni, J.; Degre, A.; Deiters, K.; Della Volpe, D.; Delmeire, E.; Denes, P.; DeNotaristefani, F.; De Salvo, A.; Diemoz, M.; Dierckxsens, M.; Van Dierendonck, D.; Dionisi, C.; Dittmar, M.; Doria, A.; Dova, M.T.; Duchesneau, D.; Duinker, P.; Echenard, B.; Eline, A.; El Mamouni, H.; Engler, A.; Eppling, F.J.; Ewers, A.; Extermann, P.; Falagan, M.A.; Falciano, S.; Favara, A.; Fay, J.; Fedin, O.; Felcini, M.; Ferguson, T.; Fesefeldt, H.; Fiandrini, E.; Field, J.H.; Filthaut, F.; Fisher, P.H.; Fisher, W.; Fisk, I.; Forconi, G.; Freudenreich, K.; Furetta, C.; Galaktionov, Iouri; Ganguli, S.N.; Garcia-Abia, Pablo; Gataullin, M.; Gentile, S.; Giagu, S.; Gong, Z.F.; Grenier, Gerald Jean; Grimm, O.; Gruenewald, M.W.; Guida, M.; Van Gulik, R.; Gupta, V.K.; Gurtu, A.; Gutay, L.J.; Haas, D.; Hatzifotiadou, D.; Hebbeker, T.; Herve, Alain; Hirschfelder, J.; Hofer, H.; Holzner, G.; Hou, S.R.; Hu, Y.; Jin, B.N.; Jones, Lawrence W.; de Jong, P.; Josa-Mutuberria, I.; Kafer, D.; Kaur, M.; Kienzle-Focacci, M.N.; Kim, J.K.; Kirkby, Jasper; Kittel, W.; Klimentov, A.; Konig, A.C.; Kopal, M.; Koutsenko, V.; Kraber, M.; Kraemer, R.W.; Krenz, W.; Kruger, A.; Kunin, A.; Ladron De Guevara, P.; Laktineh, I.; Landi, G.; Lebeau, M.; Lebedev, A.; Lebrun, P.; Lecomte, P.; Lecoq, P.; Le Coultre, P.; Lee, H.J.; Le Goff, J.M.; Leiste, R.; Levtchenko, P.; Li, C.; Likhoded, S.; Lin, C.H.; Lin, W.T.; Linde, F.L.; Lista, L.; Liu, Z.A.; Lohmann, W.; Longo, E.; Lu, Y.S.; Lubelsmeyer, K.; Luci, C.; Luckey, David; Luminari, L.; Lustermann, W.; Ma, W.G.; Malgeri, L.; Malinin, A.; Mana, C.; Mangeol, D.; Mans, J.; Martin, J.P.; Marzano, F.; Mazumdar, K.; McNeil, R.R.; Mele, S.; Merola, L.; Meschini, M.; Metzger, W.J.; Mihul, A.; Milcent, H.; Mirabelli, G.; Mnich, J.; Mohanty, G.B.; Muanza, G.S.; Muijs, A.J.M.; Musicar, B.; Musy, M.; Nagy, S.; Napolitano, M.; Nessi-Tedaldi, F.; Newman, H.; Niessen, T.; Nisati, A.; Kluge, Hannelies; Ofierzynski, R.; Organtini, G.; Palomares, C.; Pandoulas, D.; Paolucci, P.; Paramatti, R.; Passaleva, G.; Patricelli, S.; Paul, Thomas Cantzon; Pauluzzi, M.; Paus, C.; Pauss, F.; Pedace, M.; Pensotti, S.; Perret-Gallix, D.; Petersen, B.; Piccolo, D.; Pierella, F.; Piroue, P.A.; Pistolesi, E.; Plyaskin, V.; Pohl, M.; Pojidaev, V.; Postema, H.; Pothier, J.; Prokofev, D.O.; Prokofiev, D.; Quartieri, J.; Rahal-Callot, G.; Rahaman, M.A.; Raics, P.; Raja, N.; Ramelli, R.; Rancoita, 
P.G.; Ranieri, R.; Raspereza, A.; Razis, P.; Ren, D.; Rescigno, M.; Reucroft, S.; Riemann, S.; Riles, Keith; Roe, B.P.; Romero, L.; Rosca, A.; Rosier-Lees, S.; Roth, Stefan; Rosenbleck, C.; Roux, B.; Rubio, J.A.; Ruggiero, G.; Rykaczewski, H.; Sakharov, A.; Saremi, S.; Sarkar, S.; Salicio, J.; Sanchez, E.; Sanders, M.P.; Schafer, C.; Schegelsky, V.; Schmidt-Kaerst, S.; Schmitz, D.; Schopper, H.; Schotanus, D.J.; Schwering, G.; Sciacca, C.; Servoli, L.; Shevchenko, S.; Shivarov, N.; Shoutko, V.; Shumilov, E.; Shvorob, A.; Siedenburg, T.; Son, D.; Spillantini, P.; Steuer, M.; Stickland, D.P.; Stoyanov, B.; Straessner, A.; Sudhakar, K.; Sultanov, G.; Sun, L.Z.; Sushkov, S.; Suter, H.; Swain, J.D.; Szillasi, Z.; Tang, X.W.; Tarjan, P.; Tauscher, L.; Taylor, L.; Tellili, B.; Teyssier, D.; Timmermans, Charles; Ting, Samuel C.C.; Ting, S.M.; Tonwar, S.C.; Toth, J.; Tully, C.; Tung, K.L.; Uchida, Y.; Ulbricht, J.; Valente, E.; Van de Walle, R.T.; Veszpremi, V.; Vesztergombi, G.; Vetlitsky, I.; Vicinanza, D.; Viertel, G.; Villa, S.; Vivargent, M.; Vlachos, S.; Vodopianov, I.; Vogel, H.; Vogt, H.; Vorobev, I.; Vorobyov, A.A.; Wadhwa, M.; Wallraff, W.; Wang, M.; Wang, X.L.; Wang, Z.M.; Weber, M.; Wienemann, P.; Wilkens, H.; Wu, S.X.; Wynhoff, S.; Xia, L.; Xu, Z.Z.; Yamamoto, J.; Yang, B.Z.; Yang, C.G.; Yang, H.J.; Yang, M.; Yeh, S.C.; Zalite, A.; Zalite, Yu.; Zhang, Z.P.; Zhao, J.; Zhu, G.Y.; Zhu, R.Y.; Zhuang, H.L.; Zichichi, A.; Zilizi, G.; Zimmermann, B.; Zoller, M.

    2001-01-01

    Final results of the search for the Standard Model Higgs boson are presented for the data collected by the L3 detector at LEP at centre-of-mass energies up to about 209 GeV. These data are compared with the expectations of Standard Model processes for Higgs boson masses up to 120 GeV. A lower limit on the mass of the Standard Model Higgs boson of 112.0 GeV is set at the 95% confidence level. The most significant high mass candidate is an Hνν̄ event. It has a reconstructed Higgs mass of 115 GeV and it was recorded at √s = 206.4 GeV.

  1. XML — an opportunity for data standards in the geosciences

    Science.gov (United States)

    Houlding, Simon W.

    2001-08-01

    Extensible markup language (XML) is a recently introduced meta-language standard on the Web. It provides the rules for development of metadata (markup) standards for information transfer in specific fields. XML allows development of markup languages that describe what information is rather than how it should be presented. This allows computer applications to process the information in intelligent ways. In contrast hypertext markup language (HTML), which fuelled the initial growth of the Web, is a metadata standard concerned exclusively with presentation of information. Besides its potential for revolutionizing Web activities, XML provides an opportunity for development of meaningful data standards in specific application fields. The rapid endorsement of XML by science, industry and e-commerce has already spawned new metadata standards in such fields as mathematics, chemistry, astronomy, multi-media and Web micro-payments. Development of XML-based data standards in the geosciences would significantly reduce the effort currently wasted on manipulating and reformatting data between different computer platforms and applications and would ensure compatibility with the new generation of Web browsers. This paper explores the evolution, benefits and status of XML and related standards in the more general context of Web activities and uses this as a platform for discussion of its potential for development of data standards in the geosciences. Some of the advantages of XML are illustrated by a simple, browser-compatible demonstration of XML functionality applied to a borehole log dataset. The XML dataset and the associated stylesheet and schema declarations are available for FTP download.
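
    As a concrete illustration of markup that describes what the data is rather than how to display it, the sketch below invents a tiny borehole-log fragment and processes it with a standard XML parser. The element and attribute names are hypothetical and are not the schema used in the paper's demonstration dataset.

```python
import xml.etree.ElementTree as ET

# Hypothetical borehole-log markup: the tags say what each value *is*
# (hole id, depth interval, lithology); presentation is left to a stylesheet.
doc = """
<boreholeLog hole="BH-01" depthUnits="m">
  <interval from="0.0"  to="3.2"  lithology="clay"/>
  <interval from="3.2"  to="11.5" lithology="sand"/>
  <interval from="11.5" to="14.0" lithology="granite"/>
</boreholeLog>
"""

root = ET.fromstring(doc)
sand_thickness = sum(
    float(i.get("to")) - float(i.get("from"))
    for i in root.iter("interval")
    if i.get("lithology") == "sand"
)
print(f"{root.get('hole')}: {sand_thickness:.1f} m of sand logged")
```

    Because the markup carries meaning rather than layout, the same file can be validated against a schema, restyled with a stylesheet, or processed programmatically as above without reformatting.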

  2. A study on the implementation of the performance of electrical products standards

    Energy Technology Data Exchange (ETDEWEB)

    Berkowitz, M A

    1984-03-01

    The Steering Committee on Electrical Products (SCOPEP) has developed a number of performance standards for selected electrical products and product attributes, under the direction of the Canadian Standards Association. Although the development of these standards shows that a successful standards writing process is in place, initiation and implementation have been achieved by one or more stakeholder groups spearheading the requirement for the standard. As a result, SCOPEP has been unable to achieve its full potential in its role of developing performance standards for electrical products. A study was undertaken with the aim of developing a strategic plan for SCOPEP which would be supported by marketing and other strategies. The major elements of the strategy include: selecting those products for which the benefits of a standards program significantly outweigh the costs; agreeing on the use of a framework which rigorously evaluates the cost-benefits of developing performance standards; producing a critical mass of products through the selection process so that revenues from label fees will ultimately balance program costs and support a full-time SCOPEP secretariat; aggressively marketing the standards in all sectors, using a balanced push-and-pull strategy; and developing a protagonist attitude so that standards programs implemented are credible to all SCOPEP stakeholders. Detailed cost-benefit tables are appended. 11 figs., 18 tabs.

  3. 77 FR 56421 - Standards of Performance for Petroleum Refineries; Standards of Performance for Petroleum...

    Science.gov (United States)

    2012-09-12

    ... Parts 9 and 60 Standards of Performance for Petroleum Refineries; Standards of Performance for Petroleum...-9672-3] RIN 2060-AN72 Standards of Performance for Petroleum Refineries; Standards of Performance for Petroleum Refineries for Which Construction, Reconstruction, or Modification Commenced After May 14, 2007...

  4. 106-17 Telemetry Standards Digitized Audio Telemetry Standard Chapter 5

    Science.gov (United States)

    2017-07-01

    Digitized Audio Telemetry Standard 5.1 General This chapter defines continuously variable slope delta (CVSD) modulation as the standard for digitizing...audio signal. The CVSD modulator is, in essence, a 1-bit analog-to-digital converter. The output of this 1-bit encoder is a serial bit stream, where
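
    A minimal sketch of the 1-bit encoding idea described above is given below. The step-size bounds, run length and decay factor are illustrative placeholders, not parameters taken from the IRIG 106-17 chapter.

```python
import numpy as np

def cvsd_encode(samples, min_step=10.0, max_step=1280.0, run_length=3, decay=0.98):
    """Toy continuously variable slope delta (CVSD) encoder: one bit per sample."""
    bits = np.zeros(len(samples), dtype=np.uint8)
    estimate = 0.0
    step = min_step
    history = []
    for i, x in enumerate(samples):
        bit = 1 if x >= estimate else 0          # 1-bit comparison against the estimate
        bits[i] = bit
        history = (history + [bit])[-run_length:]
        if len(history) == run_length and len(set(history)) == 1:
            step = min(step * 1.5, max_step)     # slope overload: grow the step
        else:
            step = max(step * decay, min_step)   # otherwise let the step decay
        estimate += step if bit else -step       # integrate toward the input
    return bits

# Example: one cycle of a 1 kHz tone sampled at 16 kHz becomes a serial bit stream.
t = np.arange(16) / 16000.0
print(cvsd_encode(1000.0 * np.sin(2 * np.pi * 1000.0 * t)))
```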

  5. Management system - correlation study between new IAEA standards and the market standards

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Dirceu Paulo de [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Ipero, SP (Brazil)], e-mail: dirceupo@hotmail.com; Zouain, Desiree Moraes [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)], e-mail: dmzouain@ipen.br

    2009-07-01

    In order to answer the growing concern of society with respect of the aspects that affect the quality of life, international and national regulatory bodies have developed standards that enable organizations to establish management systems for quality, environment and sustainable development, health, safety and social responsibility, among other functions. Within this context it is necessary to structure an integrated management system that promotes interests compatibility of several distinct and complementary functions involved. Considering this vision of the management system integration, the International Atomic Energy Agency (IAEA) decided to review the structure of safety standards on Quality Assurance - code and guides 50-C/SGQ1/ 14:1996, publishing, in 2006, IAEA GS-R-3 and IAEA GS-G-3.1 standards, enlarging the management approach of the previous standards, including the possibility of integrating the functions foremost mentioned. This paper presents the results about a correlation study between IAEA management system standards - IAEA GS-R-3: 2006, IAEA GS-G-3.1: 2006 and IAEA DS349 rev. 2007, this latter still a draft standard, with those market management system standards on quality - ISO 9001:2008, environmental - ISO 14001:2004, and occupational health and safety - BS OHSAS 18001:2007, identifying gaps, redundancies and complementarities among their requirements and guidances. The purpose of the study is to provide subsidies that could contribute to the structuring of a management system to nuclear facilities that satisfies, in an integrated manner, the common and complementary requirements and guidances of IAEA and market standards. (author)

  6. Management system - correlation study between new IAEA standards and the market standards

    International Nuclear Information System (INIS)

    Oliveira, Dirceu Paulo de; Zouain, Desiree Moraes

    2009-01-01

    In order to answer the growing concern of society with respect of the aspects that affect the quality of life, international and national regulatory bodies have developed standards that enable organizations to establish management systems for quality, environment and sustainable development, health, safety and social responsibility, among other functions. Within this context it is necessary to structure an integrated management system that promotes interests compatibility of several distinct and complementary functions involved. Considering this vision of the management system integration, the International Atomic Energy Agency (IAEA) decided to review the structure of safety standards on Quality Assurance - code and guides 50-C/SGQ1/ 14:1996, publishing, in 2006, IAEA GS-R-3 and IAEA GS-G-3.1 standards, enlarging the management approach of the previous standards, including the possibility of integrating the functions foremost mentioned. This paper presents the results about a correlation study between IAEA management system standards - IAEA GS-R-3: 2006, IAEA GS-G-3.1: 2006 and IAEA DS349 rev. 2007, this latter still a draft standard, with those market management system standards on quality - ISO 9001:2008, environmental - ISO 14001:2004, and occupational health and safety - BS OHSAS 18001:2007, identifying gaps, redundancies and complementarities among their requirements and guidances. The purpose of the study is to provide subsidies that could contribute to the structuring of a management system to nuclear facilities that satisfies, in an integrated manner, the common and complementary requirements and guidances of IAEA and market standards. (author)

  7. The Role of Standardization in Improving the Effectiveness of Integrated Risk Management

    OpenAIRE

    Ciocoiu, Carmen Nadia; Dobrea, Razvan Catalin

    2010-01-01

    The need for standardization in risk management is justified by the efforts to develop and introduce, during the last few years, integrated risk management frameworks within organizations. The financial crisis has underscored the fact that significant improvements in risk management organizations and capabilities are required. The business community and also the experts recognize that risk management standards have an important role in improving the effectiveness of integrated risk man...

  8. Mixtures of genetically modified wheat lines outperform monocultures

    OpenAIRE

    Zeller, Simon L; Kalinina, Olena; Flynn, Dan F B; Schmid, Bernhard

    2012-01-01

    Biodiversity research shows that diverse plant communities are more stable and productive than monocultures. Similarly, populations in which genotypes with different pathogen resistance are mixed may have lower pathogen levels and thus higher productivity than genetically uniform populations. We used genetically modified (GM) wheat as a model system to test this prediction, because it allowed us to use genotypes that differed only in the trait pathogen resistance but were otherwise identical....

  9. Mixtures of genetically modified wheat lines outperform monocultures.

    Science.gov (United States)

    Zeller, Simon L; Kalinina, Olena; Flynn, Dan F B; Schmid, Bernhard

    2012-09-01

    Biodiversity research shows that diverse plant communities are more stable and productive than monocultures. Similarly, populations in which genotypes with different pathogen resistance are mixed may have lower pathogen levels and thus higher productivity than genetically uniform populations. We used genetically modified (GM) wheat as a model system to test this prediction, because it allowed us to use genotypes that differed only in the trait pathogen resistance but were otherwise identical. We grew three such genotypes or lines in monocultures or two-line mixtures. Phenotypic measurements were taken at the level of individual plants and of entire plots (population level). We found that resistance to mildew increased with both GM richness (0, 1, or 2 Pm3 transgenes with different resistance specificities per plot) and GM concentration (0%, 50%, or 100% of all plants in a plot with a Pm3 transgene). Plots with two transgenes had 34.6% less mildew infection and as a consequence 7.3% higher seed yield than plots with one transgene. We conclude that combining genetic modification with mixed cropping techniques could be a promising approach to increase sustainability and productivity in agricultural systems, as the fitness cost of stacking transgenes within individuals may thus be avoided.

  10. Artistic Tasks Outperform Nonartistic Tasks for Stress Reduction

    Science.gov (United States)

    Abbott, Kayleigh A.; Shanahan, Matthew J.; Neufeld, Richard W. J.

    2013-01-01

    Art making has been documented as an effective stress reduction technique. In this between-subjects experimental study, possible mechanisms of stress reduction were examined in a sample of 52 university students randomly assigned to one of four conditions generated by factorially crossing Activity Type (artistic or nonartistic) with Coping…

  11. Quality of semantic standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert

    2012-01-01

    Little scientific literature addresses the issue of quality of semantic standards, albeit a problem with high economic and social impact. Our problem survey, including 34 semantic Standard Setting Organizations (SSOs), gives evidence that quality of standards can be improved, but for improvement a

  12. Non-perturbative effective interactions in the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Arbuzov, Boris A. [Moscow Lomonosov State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2014-07-01

    This monograph is devoted to the nonperturbative dynamics in the Standard Model (SM), the basic theory of all fundamental interactions in nature except gravity. The Standard Model is divided into two parts: quantum chromodynamics (QCD) and the electroweak theory (EWT) are well-defined renormalizable theories in which perturbation theory is valid. However, for an adequate description of the real physics, nonperturbative effects are inevitable. This book describes how these nonperturbative effects may be obtained in the framework of spontaneous generation of effective interactions. A well-known example of such an effective interaction is provided by the famous Nambu-Jona-Lasinio effective interaction. A spontaneous generation of this interaction in the framework of QCD is also described, and the method is applied to other effective interactions in QCD and EWT. The method is based on N.N. Bogolyubov's conception of compensation equations. As a result we then describe the principal features of the Standard Model, e.g. the Higgs sector, and significant nonperturbative effects, including recent results obtained at the LHC and the Tevatron.
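
    For reference, the Nambu-Jona-Lasinio interaction mentioned above is, in its familiar two-flavour textbook form, a chirally symmetric four-fermion coupling; the expression below is that generic form, not the compensation-equation result derived in the monograph.

```latex
\mathcal{L}_{\mathrm{NJL}} =
\bar{\psi}\,\bigl(i\gamma^{\mu}\partial_{\mu} - m_{0}\bigr)\,\psi
+ G\Bigl[\,(\bar{\psi}\psi)^{2} + (\bar{\psi}\, i\gamma_{5}\vec{\tau}\,\psi)^{2}\Bigr]
```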

  13. Effects of national accounting standards convergence to international accounting standards on foreign direct investment

    Directory of Open Access Journals (Sweden)

    Asieh Farazandehnia

    2015-09-01

    Full Text Available One of the most important factors in attracting foreign investors to invest on the Tehran Stock Exchange is having transparent accounting rules and regulations. When there is some consistency between national accounting standards and international accounting standards, we may, at least, expect foreign investors to have a better understanding of financial statements. In 2006, there were some changes to Iranian national accounting standards in an attempt to make them closer to international accounting standards. In this study, we select the information of 153 firms five years before and after this regulation and study the effect of convergence from national accounting standards to international accounting standards on foreign direct investment. Using some statistical tests, the study has determined that there was no meaningful relationship between foreign direct investment before and after the change in accounting standards. In addition, there was no difference in information quality before and after the change in accounting standards. However, there was some meaningful relationship between information quality and foreign direct investment.

  14. Standardized acquisition, storing and provision of 3D enabled spatial data

    Science.gov (United States)

    Wagner, B.; Maier, S.; Peinsipp-Byma, E.

    2017-05-01

    In the area of working with spatial data, in addition to the classic two-dimensional geometrical data (maps, aerial images, etc.), the need for three-dimensional spatial data (city models, digital elevation models, etc.) is increasing. Due to this increased demand, the acquisition, storage and provision of 3D enabled spatial data in Geographic Information Systems (GIS) is more and more important. Existing proprietary solutions quickly reach their limits during data exchange and data delivery to other systems. They generate a large workload, which is very costly. However, it is noticeable that these expenses and costs can generally be reduced significantly by using standards. The aim of this research is therefore to develop a concept in the field of three-dimensional spatial data that runs on existing standards whenever possible. In this research, military image analysts are the preferred user group of the system. To achieve the objective of the widest possible use of standards for spatial 3D data, existing standards, proprietary interfaces and standards under discussion have been analyzed. Since the Fraunhofer IOSB GIS used here already uses and supports OGC (Open Geospatial Consortium) and NATO STANAG (NATO Standardization Agreement) standards for the most part, special attention was paid to their standards. The most promising standard is the OGC standard 3DPS (3D Portrayal Service) with its occurrences W3DS (Web 3D Service) and WVS (Web View Service). A demo system was created using a standardized workflow from data acquisition through storage to provision, showing the benefit of our approach.

  15. Midupper Arm Circumference Outperforms Weight-Based Measures of Nutritional Status in Children with Diarrhea.

    Science.gov (United States)

    Modi, Payal; Nasrin, Sabiha; Hawes, Meagan; Glavis-Bloom, Justin; Alam, Nur H; Hossain, M Iqbal; Levine, Adam C

    2015-07-01

    Undernutrition contributes to 45% of all deaths in children children with diarrhea and possible dehydration. This study assessed the validity of different measures of undernutrition in children with diarrhea. A prospective cohort study was conducted at an urban hospital in Bangladesh. Children children for screening, of which 1025 were eligible, 850 were enrolled, and 721 had complete data for analysis. Anthropometric measurements, including weight-for-age z score (WAZ), weight-for-length z score (WLZ), midupper arm circumference (MUAC), and midupper arm circumference z score (MUACZ), were calculated pre- and posthydration in all patients. Measurements were evaluated for their ability to correctly identify undernutrition in children with varying degrees of dehydration. Of the 721 patients with full data for analysis, the median percent dehydration was 4%. Of the 4 measures evaluated, MUAC and MUACZ demonstrated 92-94% agreement pre- and posthydration compared with 69-76% for WAZ and WLZ. Although each 1% change in hydration status was found to change weight-for-age by 0.0895 z scores and weight-for-length by 0.1304 z scores, MUAC and MUACZ were not significantly affected by dehydration status. Weight-based measures misclassified 12% of children with severe underweight and 14% with severe acute malnutrition (SAM) compared with only 1-2% for MUAC and MUACZ. MUAC and MUACZ were the most accurate predictors of undernutrition in children with diarrhea. WAZ and WLZ were significantly affected by dehydration status, leading to the misdiagnosis of many patients on arrival with severe underweight and SAM. This trial was registered at clinicaltrials.gov as NCT02007733. © 2015 American Society for Nutrition.
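
    The coefficients quoted above suggest a simple back-of-the-envelope correction when a weight-based index has been measured on arrival in a dehydrated child. The sketch below uses the 0.0895 z-score-per-percent figure from the abstract; the example child and the function name are hypothetical.

```python
def rehydrated_waz(waz_on_arrival, percent_dehydration, z_per_percent=0.0895):
    """Approximate weight-for-age z score after rehydration, given the value
    measured while dehydrated. Uses the study's estimate that each 1% of
    dehydration lowers WAZ by about 0.0895 z scores (0.1304 for WLZ)."""
    return waz_on_arrival + z_per_percent * percent_dehydration

# Hypothetical child: WAZ of -3.2 on arrival with the study's median 4% dehydration
# corresponds to roughly -2.8 once rehydrated, i.e. above the usual -3 cut-off
# for severe underweight.
print(round(rehydrated_waz(-3.2, 4), 2))
```

    MUAC, by contrast, needs no such correction, which is part of why it classified children more reliably in this setting.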

  16. Task-induced frequency modulation features for brain-computer interfacing.

    Science.gov (United States)

    Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz

    2017-10-01

    Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects' intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects' intents with an accuracy comparable to task-induced amplitude modulation. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. The frequency features alone do not significantly out-perform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing in healthy subjects the joint space significantly out-performs either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.
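
    The two feature families compared above can be sketched for a single epoch as follows: an amplitude-modulation feature (log band power) and a frequency-modulation feature (mean instantaneous frequency from the analytic signal), concatenated into a joint feature vector. The band edges, filter order and feature choices are illustrative and are not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def amp_and_freq_features(epoch, fs, band=(8.0, 13.0)):
    """epoch: array of shape (n_channels, n_samples). Returns the amplitude
    features followed by the frequency features, one of each per channel."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    narrow = filtfilt(b, a, epoch, axis=-1)          # band-limit each channel
    analytic = hilbert(narrow, axis=-1)

    # Amplitude modulation: log power of the band-limited signal.
    amp = np.log(np.mean(np.abs(analytic) ** 2, axis=-1))

    # Frequency modulation: mean instantaneous frequency (Hz) within the band.
    phase = np.unwrap(np.angle(analytic), axis=-1)
    inst_freq = np.diff(phase, axis=-1) * fs / (2.0 * np.pi)
    return np.concatenate([amp, inst_freq.mean(axis=-1)])   # joint feature vector

# Example: 3 channels of 2 s simulated EEG at 250 Hz.
rng = np.random.default_rng(0)
features = amp_and_freq_features(rng.standard_normal((3, 500)), fs=250)
print(features.shape)   # (6,)
```

    Feature vectors of this kind can then be fed to any standard classifier; the joint space simply stacks both feature families instead of choosing one.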

  17. Risk factors of significant pain syndrome 90 days after minor thoracic injury: trajectory analysis.

    Science.gov (United States)

    Daoust, Raoul; Emond, Marcel; Bergeron, Eric; LeSage, Natalie; Camden, Stéphanie; Guimont, Chantal; Vanier, Laurent; Chauny, Jean-Marc

    2013-11-01

    The objective was to identify the risk factors of clinically significant pain at 90 days in patients with minor thoracic injury (MTI) discharged from the emergency department (ED). A prospective, multicenter, cohort study was conducted in four Canadian EDs from November 2006 to November 2010. All consecutive patients aged 16 years or older with MTI were eligible at discharge from EDs. They underwent standardized clinical and radiologic evaluations at 1 and 2 weeks, followed by standardized telephone interviews at 30 and 90 days. A pain trajectory model characterized groups of patients with different pain evolutions and ascertained specific risk factors in each group through multivariate analysis. In this cohort of 1,132 patients, 734 were eligible for study inclusion. The authors identified a pain trajectory that characterized 18.2% of the study population experiencing clinically significant pain (>3 of 10) at 90 days after a MTI. Multivariate modeling found two or more rib fractures, smoking, and initial oxygen saturation below 95% to be predictors of this group of patients. To the authors' knowledge, this is the first prospective study of trajectory modeling to detect risk factors associated with significant pain at 90 days after MTI. These factors may help in planning specific treatment strategies and should be validated in another prospective cohort. © 2013 by the Society for Academic Emergency Medicine.

  18. Wireless installation standard

    International Nuclear Information System (INIS)

    Lim, Hwang Bin

    2007-12-01

    This is divided into six parts which are radio regulation law on securing of radio resource, use of radio resource, protection of radio resource, radio regulation enforcement ordinance with securing, distribution and assignment of radio regulation, radio regulation enforcement regulation on utility of radio resource and technical qualification examination, a wireless installation regulation of technique standard and safety facility standard, radio regulation such as certification regulation of information communicative machines and regulation of radio station on compliance of signal security, radio equipment in radio station, standard frequency station and emergency communication.

  19. Standardization of nuclear power plants in the United States: recent regulatory developments

    International Nuclear Information System (INIS)

    Cowan, B.Z.; Tourtellotte, J.R.

    1992-01-01

    On April 18, 1989, the United States (U.S.) Nuclear Regulatory Commission (NRC) amended the regulations governing the process for licensing nuclear power plants in the United States to provide for issuance of early site permits, standard design certifications and combined construction permits and operating licenses for nuclear power reactors. The new regulations are designed to achieve early resolution of licensing issues and facilitate standardization of nuclear power plants in the United States. The program for design standardization is central to efforts mounted by the U.S. government and industry to ensure that there will be a next generation of nuclear power facilities in the U.S. The most significant changes are provisions for certification of standard designs and for issuance prior to start of construction of combined licenses which incorporate a construction permit and an operating license with conditions. Such certifications and combined licenses must contain tests, inspections and analyses, and acceptance criteria, which are necessary and sufficient to provide reasonable assurance that the facility has been constructed and will operate in accordance with the combined license. A number of significant implementation issues have arisen. In addition a major court case brought by several anti-nuclear groups is pending, challenging NRC authority to issue combined licenses. It is the goal of the U.S. nuclear industry to have the first of the next generation of standardized nuclear power plants ordered, licensed, constructed and on-line by the year 2000. (author)

  20. Standards, the users perspective

    International Nuclear Information System (INIS)

    Nason, W.D.

    1993-01-01

    The term standard has little meaning until put into the proper context. What is being standardized? What are the standard conditions to be applied? The list of questions that arise goes on and on. In this presentation, answers to these questions are considered in the interest of providing a basic understanding of what might be useful to the electrical power industry in the way of standards, as well as the limitations on their application. 16 figs

  1. The Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    Suggested Readings: Aspects of Quantum Chromodynamics/A Pich, arXiv:hep-ph/0001118. - The Standard Model of Electroweak Interactions/A Pich, arXiv:hep-ph/0502010. - The Standard Model of Particle Physics/A Pich. The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.

  2. Do health care workforce, population, and service provision significantly contribute to the total health expenditure? An econometric analysis of Serbia.

    Science.gov (United States)

    Santric-Milicevic, M; Vasic, V; Terzic-Supic, Z

    2016-08-15

    In times of austerity, the availability of econometric health knowledge assists policy-makers in understanding and balancing health expenditure with health care plans within fiscal constraints. The objective of this study is to explore whether the health workforce supply of the public health care sector, population number, and utilization of inpatient care significantly contribute to total health expenditure. The dependent variable is the total health expenditure (THE) in Serbia from the years 2003 to 2011. The independent variables are the number of health workers employed in the public health care sector, population number, and inpatient care discharges per 100 population. The statistical analyses include the quadratic interpolation method, natural logarithm and differentiation, and multiple linear regression analyses. The level of significance is set at a pre-specified P-value threshold. Total health expenditure increased by 1.21 standard deviations with an increase of 1 standard deviation in the health workforce growth rate. Furthermore, this rate decreased by 1.12 standard deviations with an increase of 1 standard deviation in the (negative) population growth rate. Finally, the growth rate increased by 0.38 standard deviations with an increase of 1 standard deviation in the growth rate of inpatient care discharges per 100 population (P < 0.001). The study results demonstrate that the government has been making an effort to strongly control health budget growth. Exploring causality relationships between health expenditure and health workforce is important for countries that are trying to consolidate their public health finances and achieve universal health coverage at the same time.
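
    As a rough sketch of the standardized-coefficient regression reported above (the data file and column names are assumptions, not the authors' dataset), the growth rates can be z-scored so that each coefficient is expressed in standard deviations:

      # Minimal sketch: regress the standardized growth rate of total health
      # expenditure on standardized growth rates of the three predictors.
      import pandas as pd
      import statsmodels.api as sm

      df = pd.read_csv("serbia_health_2003_2011.csv")     # hypothetical annual series
      growth = df[["the", "workforce", "population", "discharges"]].pct_change().dropna()
      z = (growth - growth.mean()) / growth.std(ddof=0)   # standardize: coefficients are then in SD units
      X = sm.add_constant(z[["workforce", "population", "discharges"]])
      print(sm.OLS(z["the"], X).fit().summary())          # beta weights analogous to the reported 1.21, -1.12, 0.38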

  3. Preparation of high purity plutonium oxide for radiochemistry instrument calibration standards and working standards

    International Nuclear Information System (INIS)

    Wong, A.S.; Stalnaker, N.D.

    1997-04-01

    Due to the lack of suitable high level National Institute of Standards and Technology (NIST) traceable plutonium solution standards from the NIST or commercial vendors, the CST-8 Radiochemistry team at Los Alamos National Laboratory (LANL) has prepared instrument calibration standards and working standards from a well-characterized plutonium oxide. All the aliquoting steps were performed gravimetrically. When a 241 Am standardized solution obtained from a commercial vendor was compared to these calibration solutions, the results agreed to within 0.04% for the total alpha activity. The aliquots of the plutonium standard solutions and dilutions were sealed in glass ampules for long term storage

  4. Significance testing in ridge regression for genetic data

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2011-09-01

    Full Text Available Abstract Background Technological developments have increased the feasibility of large scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
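
    One common way to attach approximate p-values to ridge coefficients, and to build a p-value trace over increasing shrinkage, is a Wald-type test based on the covariance of the ridge estimator. The sketch below illustrates that general idea only; it is not a reimplementation of the authors' exact test.

      # Approximate Wald-type significance testing for ridge regression coefficients
      # (illustrative sketch; uses the classical covariance of the ridge estimator).
      import numpy as np
      from scipy import stats

      def ridge_pvalues(X, y, k):
          n, p = X.shape
          W = np.linalg.inv(X.T @ X + k * np.eye(p))      # (X'X + kI)^-1
          beta = W @ X.T @ y                              # ridge estimates
          H = X @ W @ X.T                                 # hat matrix
          df = n - np.trace(2 * H - H @ H.T)              # effective residual degrees of freedom
          resid = y - X @ beta
          s2 = resid @ resid / df                         # residual variance
          cov = s2 * W @ (X.T @ X) @ W                    # covariance of the ridge estimator
          z = beta / np.sqrt(np.diag(cov))
          return beta, 2 * stats.norm.sf(np.abs(z))       # two-sided p-values

      def pvalue_trace(X, y, shrinkage_values):
          # -log10(p) for every coefficient as the shrinkage parameter increases
          return np.array([-np.log10(ridge_pvalues(X, y, k)[1]) for k in shrinkage_values])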

  5. The Future of Geospatial Standards

    Science.gov (United States)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds

  6. Reducing the variation in animal models by standardizing the gut microbiota

    DEFF Research Database (Denmark)

    Ellekilde, Merete; Hufeldt, Majbritt Ravn; Hansen, Camilla Hartmann Friis

    2011-01-01

    , a large proportion of laboratory animals are used to study such diseases, but inter-individual variation in these animal models leads to the need for larger group sizes to reach statistical significance and adequate power. By standardizing the microbial and immunological status of laboratory animals we...... mice changed the glucose tolerance without affecting weight or mucosal immunity. Further investigations concerning the mechanisms of how the GM influences disease development are necessary, but based on these results it seems reasonable to assume that by manipulating the GM we may produce animal models...... may therefore be able to produce animals with a more standardized response and less variation. This would lead to more precise results and a reduced number of animals needed for statistical significance. Denaturing gradient gel electrophoresis (DGGE) - a culture independent approach separating PCR...

  7. Cytomegalovirus neutralization by hyperimmune and standard intravenous immunoglobulin preparations.

    Science.gov (United States)

    Planitzer, Christina B; Saemann, Marcus D; Gajek, Hartwig; Farcet, Maria R; Kreil, Thomas R

    2011-08-15

    Cytomegalovirus (CMV) remains one of the most important pathogens after transplantation, potentially leading to CMV disease, allograft dysfunction, acute and chronic rejection, and opportunistic infections. Immunoglobulin G (IgG) preparations with high antibody titers against CMV are a valuable adjunctive prevention and treatment option for clinicians, and apart from standard intravenous immunoglobulin (IVIG), CMV hyperimmune preparations are available. The CMV antibody titer of these preparations is typically determined by enzyme-linked immunosorbent assay (ELISA), which is also used for the selection of high-titer plasma donors for the production of the CMV hyperimmune product. However, CMV ELISA titers do not necessarily correlate with CMV antibody function, which is determined by virus neutralization tests. CMV antibody titers were determined by both ELISA and virus neutralization assay, and the IgG subclass distribution was compared between a CMV hyperimmune preparation licensed in Europe and standard IVIG preparations. Although the expected high CMV IgG ELISA antibody titers were confirmed for three lots of the CMV hyperimmune preparation, the functionally more relevant CMV neutralizing antibody titers were significantly higher for 31 lots of standard IVIG preparations. Moreover, considerably lower IgG3 levels were found for the CMV hyperimmune preparation compared with standard IVIG preparations. The higher functional CMV neutralization titers of standard IVIG preparations and the better availability of these preparations suggest that these products could be a valuable alternative to the CMV hyperimmune preparation.

  8. Quality standards in 480 pancreatic resections: a prospective observational study

    Directory of Open Access Journals (Sweden)

    Francisco Javier Herrera-Cabezón

    2015-03-01

    Full Text Available Pancreatic resection is a standard procedure for the treatment of periampullary tumors. Morbidity and mortality are high, and quality standards are scarce in our setting. International classifications of complications (Clavien-Dindo) and those specific for pancreatectomies (ISGPS) allow adequate case comparisons. The goals of our work are to describe the morbidity and mortality of 480 pancreatectomies using the international ISGPS and Clavien-Dindo classifications, to help establish a quality standard in our setting, and to compare the results of CPD with reconstruction by pancreaticogastrostomy (155 cases) versus pancreaticojejunostomy (177 cases). We report 480 resections including 337 duodenopancreatectomies, 116 distal pancreatectomies, 11 total pancreatectomies, 10 central pancreatectomies, and 6 enucleations. Results for duodenopancreatectomy include: 62% morbidity (Clavien ≥ III 25.9%), 12.3% reinterventions, and 3.3% overall mortality. For reconstruction by pancreaticojejunostomy: 71.2% morbidity (Clavien ≥ III 34.4%), 17.5% reinterventions, and 3.3% mortality. For reconstruction by pancreaticogastrostomy: 51% morbidity (Clavien ≥ III 15.4%), 6.4% reinterventions, and 3.2% mortality. Differences are significant except for mortality. We conclude that our series meets quality criteria as compared to other groups. Reconstruction with pancreaticogastrostomy significantly reduces the number and severity of complications, as well as pancreatic fistula and reintervention rates.

  9. Decline eccentric squats increases patellar tendon loading compared to standard eccentric squats.

    Science.gov (United States)

    Kongsgaard, M; Aagaard, P; Roikjaer, S; Olsen, D; Jensen, M; Langberg, H; Magnusson, S P

    2006-08-01

    Recent studies have shown excellent clinical results using eccentric squat training on a 25° decline board to treat patellar tendinopathy. It remains unknown why therapeutic management of patellar tendinopathy using decline eccentric squats offers superior clinical efficacy compared to standard horizontal eccentric squats. This study aimed to compare electromyography activity, patellar tendon strain, and joint angle kinematics during standard and decline eccentric squats. Thirteen subjects performed unilateral eccentric squats on a flat and a 25° decline surface. During the squats, electromyography activity was obtained in eight representative muscles, and ankle, knee and hip joint goniometry was obtained. Additionally, patellar tendon strain was measured in vivo using ultrasonography as subjects maintained a unilateral isometric 90° knee angle squat position on either the flat or the 25° decline surface. Patellar tendon strain was significantly greater in the squat position on the decline surface compared to the standard surface. The stop angles of the ankle and hip joints were significantly smaller during the decline squats than during the standard squats. The use of a 25° decline board increases the load and the strain of the patellar tendon during unilateral eccentric squats. This finding likely explains previous reports of superior clinical efficacy of decline eccentric squats in the rehabilitative management of patellar tendinopathy.

  10. International standards for radiation protection

    International Nuclear Information System (INIS)

    Ambrosi, P.

    2011-01-01

    International standards for radiation protection are issued by many bodies. These bodies differ to a large extent in their organisation, in the way the members are designated and in the way the international standards are authorised by the issuing body. Large differences also exist in the relevance of the international standards. One extreme is that the international standards are mandatory in the sense that no conflicting national standard may exist, the other extreme is that national and international standards conflict and there is no need to resolve that conflict. Between these extremes there are some standards or documents of relevance, which are not binding by any formal law or contract but are de facto binding due to the scientific reputation of the issuing body. This paper gives, for radiation protection, an overview of the main standards issuing bodies, the international standards or documents of relevance issued by them and the relevance of these documents. (authors)

  11. Fluorescence of ceramic color standards

    International Nuclear Information System (INIS)

    Koo, Annette; Clare, John F.; Nield, Kathryn M.; Deadman, Andrew; Usadi, Eric

    2010-01-01

    Fluorescence has been found in color standards available for use in calibration and verification of color measuring instruments. The fluorescence is excited at wavelengths below about 600 nm and emitted above 700 nm, within the response range of silicon photodiodes, but at the edge of the response of most photomultipliers and outside the range commonly scanned in commercial colorimeters. The degree of fluorescence on two of a set of 12 glossy ceramic tiles is enough to introduce significant error when those tiles have been calibrated in one mode of measurement and are used in another. We report the nature of the fluorescence and the implications for color measurement.

  12. The emergence of Southern standards in agricultural value chains: a new trend in sustainability governance?

    NARCIS (Netherlands)

    Schouten, A.M.; Bitzer, V.C.

    2015-01-01

    The objective of this paper is to understand and trace the emergence of Southern standards in global agricultural value chains. While the trend towards private standards established by developed country or ‘Northern’ actors has received significant attention in the literature, recently an emergent

  13. Low-dose vaporized cannabis significantly improves neuropathic pain.

    Science.gov (United States)

    Wilsey, Barth; Marcotte, Thomas; Deutsch, Reena; Gouaux, Ben; Sakai, Staci; Donaghe, Haylee

    2013-02-01

    We conducted a double-blind, placebo-controlled, crossover study evaluating the analgesic efficacy of vaporized cannabis in subjects, the majority of whom were experiencing neuropathic pain despite traditional treatment. Thirty-nine patients with central and peripheral neuropathic pain underwent a standardized procedure for inhaling medium-dose (3.53%), low-dose (1.29%), or placebo cannabis with the primary outcome being visual analog scale pain intensity. Psychoactive side effects and neuropsychological performance were also evaluated. Mixed-effects regression models demonstrated an analgesic response to vaporized cannabis. There was no significant difference between the 2 active dose groups' results (P > .7). The number needed to treat (NNT) to achieve 30% pain reduction was 3.2 for placebo versus low-dose, 2.9 for placebo versus medium-dose, and 25 for medium- versus low-dose. As these NNTs are comparable to those of traditional neuropathic pain medications, cannabis has analgesic efficacy with the low dose being as effective a pain reliever as the medium dose. Psychoactive effects were minimal and well tolerated, and neuropsychological effects were of limited duration and readily reversible within 1 to 2 hours. Vaporized cannabis, even at low doses, may present an effective option for patients with treatment-resistant neuropathic pain. The analgesia obtained from a low dose of delta-9-tetrahydrocannabinol (1.29%) in patients, most of whom were experiencing neuropathic pain despite conventional treatments, is a clinically significant outcome. In general, the effect sizes on cognitive testing were consistent with this minimal dose. As a result, one might not anticipate a significant impact on daily functioning. Published by Elsevier Inc.
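
    The number needed to treat quoted above is the reciprocal of the absolute difference in responder proportions (here, the proportion achieving at least 30% pain reduction), so the reported values can be inverted to recover the implied differences in response rates; a small arithmetic check:

      # NNT = 1 / (p_active - p_placebo); inverting the reported NNTs gives the
      # implied absolute differences in responder rates.
      for label, nnt in [("placebo vs low dose", 3.2),
                         ("placebo vs medium dose", 2.9),
                         ("medium vs low dose", 25.0)]:
          print(label, "-> absolute difference in responder rate ~", round(1.0 / nnt, 2))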

  14. Swedish snus and the GothiaTek® standard

    Directory of Open Access Journals (Sweden)

    Ringberger Tommy

    2011-05-01

    Full Text Available Abstract Some smokeless tobacco products, such as Swedish snus, are today considered to be associated with substantially fewer health hazards than cigarettes. This risk differential has contributed to the scientific debate about the possibilities of harm reduction within the tobacco area. Although current manufacturing methods for snus build on those that were introduced more than a century ago, the low levels of unwanted substances in modern Swedish snus are largely due to improvements in production techniques and selection of raw materials in combination with several programs for quality assurance and quality control. These measures have been successively introduced during the past 30-40 years. In the late 1990s they formed the basis for a voluntary quality standard for Swedish snus named GothiaTek®. In recent years the standard has been accepted by the members of the trade organization European Smokeless Tobacco Council (ESTOC), so it has now evolved into an industrial standard for all smokeless tobacco products in Europe. The initial impetus for these production changes was quality problems related to microbial activity and formation of ammonia and nitrite in the finished products. Other contributing factors were that snus came under the jurisdiction of the Swedish Food Act in 1971, and concerns that emerged in the 1960s and 1970s about health effects of tobacco, and the significance of agrochemical residues and other potential toxicants in foodstuffs. This paper summarizes the historical development of the manufacture of Swedish snus, describes the chemical composition of modern snus, and gives the background and rationale for the GothiaTek® standard, including the selection of constituents for which the standard sets limits. The paper also discusses the potential future of this voluntary standard in relation to current discussions about tobacco harm reduction and regulatory science in tobacco control.

  15. Photon and proton activation analysis of iron and steel standards using the internal standard method coupled with the standard addition method

    International Nuclear Information System (INIS)

    Masumoto, K.; Hara, M.; Hasegawa, D.; Iino, E.; Yagi, M.

    1997-01-01

    The internal standard method coupled with the standard addition method has been applied to photon activation analysis and proton activation analysis of minor elements and trace impurities in various types of iron and steel samples issued by the Iron and Steel Institute of Japan (ISIJ). Samples and standard addition samples were dissolved so that the internal standard and the elements to be determined mixed homogeneously, and were then solidified as a silica gel to give a similar matrix composition and geometry. Cerium and yttrium were used as internal standards in photon and proton activation, respectively. In photon activation, a 20 MeV electron beam was used for bremsstrahlung irradiation to reduce matrix activity and nuclear interference reactions, and the results were compared with those of 30 MeV irradiation. In proton activation, iron was removed by the MIBK extraction method after dissolving the samples, to reduce the radioactivity of 56Co produced from iron via the 56Fe(p,n)56Co reaction. The results of proton and photon activation analysis were in good agreement with the standard values of ISIJ. (author)

  16. Wavelength standards in the infrared

    CERN Document Server

    Rao, KN

    2012-01-01

    Wavelength Standards in the Infrared is a compilation of wavelength standards suitable for use with high-resolution infrared spectrographs, including both emission and absorption standards. The book presents atomic line emission standards of argon, krypton, neon, and xenon. These atomic line emission standards are from the deliberations of Commission 14 of the International Astronomical Union, which is the recognized authority for such standards. The text also explains the techniques employed in determining spectral positions in the infrared. One of the techniques used includes the grating con

  17. A randomized controlled trial of a brief versus standard group parenting program for toddler aggression.

    Science.gov (United States)

    Tully, Lucy A; Hunt, Caroline

    2017-05-01

    Physical aggression (PA) in the toddler years is common and developmentally normal; however, longitudinal research shows that frequent PA is highly stable and associated with long-term negative outcomes. Significant research has demonstrated the efficacy of parenting interventions for reducing externalizing behavior in children, yet their typical length may overburden families, leading to low participation rates and high attrition rates. To increase the reach of parenting interventions and impact on the prevalence of externalizing behavior problems, brief interventions are needed. This RCT compared a standard (8 session) group Triple P to a brief (3 session) discussion group and a waitlist control for reducing toddler PA, dysfunctional parenting and related aspects of parent functioning. Sixty-nine self-referred families of toddlers with PA were randomized to the respective conditions. At post-assessment, families in the standard intervention had significantly lower levels of observed child aversive behavior, mother reports of PA and dysfunctional parenting, and higher levels of mother- and partner-rated behavioral self-efficacy than the waitlist control. Families in the standard intervention also had significantly lower levels of mother-rated dysfunctional parenting than the brief intervention, and the brief intervention had significantly lower levels of mother-rated dysfunctional parenting than the waitlist control. There were no significant group differences at post-assessment for measures of parental negative affect or satisfaction with the partner relationship. By 6 month follow-up, families in the brief and standard intervention did not differ significantly on any measure. The implications of the findings for the delivery of brief parenting interventions are discussed. Aggr. Behav. 43:291-303, 2017. © 2016 Wiley Periodicals, Inc.

  18. Preliminary Efficacy of Adapted Responsive Teaching for Infants at Risk of Autism Spectrum Disorder in a Community Sample

    Directory of Open Access Journals (Sweden)

    Grace T. Baranek

    2015-01-01

    Full Text Available This study examined the (a) feasibility of enrolling 12-month-olds at risk of ASD from a community sample into a randomized controlled trial, (b) subsequent utilization of community services, and (c) potential of a novel parent-mediated intervention to improve outcomes. The First Year Inventory was used to screen and recruit 12-month-old infants at risk of ASD to compare the effects of 6–9 months of Adapted Responsive Teaching (ART) versus referral to early intervention and monitoring (REIM). Eighteen families were followed for ~20 months. Assessments were conducted before randomization, after treatment, and at 6-month follow-up. Utilization of community services was highest for the REIM group. ART significantly outperformed REIM on parent-reported and observed measures of child receptive language with good linear model fit. Multiphase growth models had better fit for more variables, showing the greatest effects in the active treatment phase, where ART outperformed REIM on parental interactive style (less directive), child sensory responsiveness (less hyporesponsive), and adaptive behavior (increased communication and socialization). This study demonstrates the promise of a parent-mediated intervention for improving developmental outcomes for infants at risk of ASD in a community sample and highlights the utility of earlier identification for access to community services earlier than standard practice.

  19. Astrophysical neutrinos flavored with beyond the Standard Model physics

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Rasmus W.; Ackermann, Markus; Winter, Walter [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Lechner, Lukas [Vienna Univ. of Technology (Austria). Dept. of Physics; Kowalski, Marek [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2017-07-15

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of the next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation such as significant sterile neutrino production at the source, effective operators modifying the neutrino propagation at high energies, dark matter interactions in neutrino propagation, or non-standard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will allow to efficiently test and discriminate models. More detailed information can be obtained from additional observables such as the energy-dependence of the effect, fraction of electron antineutrinos at the Glashow resonance, or number of tau neutrino events.

  20. Impersonating the Standard Model Higgs boson: alignment without decoupling

    International Nuclear Information System (INIS)

    Carena, Marcela; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E.M.

    2014-01-01

    In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. Moreover, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the m_A−tan β parameter space.

  1. Astrophysical neutrinos flavored with beyond the Standard Model physics

    International Nuclear Information System (INIS)

    Rasmussen, Rasmus W.; Ackermann, Markus; Winter, Walter; Lechner, Lukas; Kowalski, Marek; Humboldt-Universitaet, Berlin

    2017-07-01

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of the next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation such as significant sterile neutrino production at the source, effective operators modifying the neutrino propagation at high energies, dark matter interactions in neutrino propagation, or non-standard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will allow to efficiently test and discriminate models. More detailed information can be obtained from additional observables such as the energy-dependence of the effect, fraction of electron antineutrinos at the Glashow resonance, or number of tau neutrino events.

  2. [Standardization of names in prescriptions of traditional Chinese medicines].

    Science.gov (United States)

    Li, Chao-Feng; Zhang, Yu-Jun; Fan, Dong-He; Zhang, Meng-Jie; Bai, Xue; Yang, Wen-Hua; Qi, Shu-Ya; Zhang, Zhi-Jie; Xue, Chun-Miao; Mao, Liu-Ying; Cao, Jun-Ling

    2017-01-01

    Chinese medicine prescriptions are a type of medical document written by doctors after they have assessed the patient's condition through syndrome differentiation. Chinese medicine prescriptions are also the basis for pharmacy personnel to dispense medicines and guide patients in using drugs. They have legal, technical and economic significance. Chinese medicine prescriptions contain information such as drug names, quantities and usage. Whether the names of drugs in Chinese medicine prescriptions are standardized or not is directly related to the safety and efficacy of the drugs. At present, nonstandard clinical prescriptions are frequently seen. With "Chinese medicine prescription", "names of drug in Chinese medicine prescription" and "standards of Chinese medicine prescription" as key words, the author searched CNKI, Wanfang and other databases, and consulted nearly 100 publications, so as to summarize the current naming of drugs in traditional Chinese medicine prescriptions, analyze the reasons for non-standard naming, and give suggestions, in the expectation of standardizing the names of drugs used in traditional Chinese medicine prescriptions. Copyright© by the Chinese Pharmaceutical Association.

  3. Current situation of International Organization for Standardization/Technical Committee 249 international standards of traditional Chinese medicine.

    Science.gov (United States)

    Liu, Yu-Qi; Wang, Yue-Xi; Shi, Nan-Nan; Han, Xue-Jie; Lu, Ai-Ping

    2017-05-01

    To review the current situation and progress of traditional Chinese medicine (TCM) international standards, standard projects and proposals in International Organization for Standardization (ISO)/Technical Committee (TC) 249. ISO/TC 249 standards and standard projects on the ISO website were searched, and information on new standard proposals was collected from the ISO/TC 249 National Mirror Committee in China. Then all the available data were summarized in 5 closely related items, including proposed time, proposed country, assigned working group (WG), current stage and classification. In ISO/TC 249, there were 2 international standards, 18 standard projects and 24 new standard proposals proposed in 2014. These 44 standard subjects have increased year by year since 2011. Twenty-nine of them were proposed by China, 15 were assigned to WG 4, 36 were in the preliminary and preparatory stages, and 8 were categorized into 4 fields, 7 groups and sub-groups based on the International Classification for Standards. A rapid and steady development of international standardization in TCM can be observed in ISO/TC 249.

  4. Developing an ANSI [American National Standards Institute] standard for semitrailers used to transport radioactive materials

    International Nuclear Information System (INIS)

    Gregory, P.

    1990-01-01

    A proposed new American National Standards Institute (ANSI) standard has been prepared which establishes requirements for the design, fabrication, and maintenance of semitrailers used in the highway transport of weight-concentrated radioactive loads. A weight-concentrated load is any payload which exceeds 1,488 kilograms per lineal meter (1,000 lb/ft) over any portion of the semitrailer. The proposed standard also provides detailed procedures for in-service inspections, as well as requirements for testing and quality assurance. The standard addresses only semitrailers, excluding the tractor. Trailers already in service may be certified as complying with the standard if they meet the requirements of the standard. This standard is intended to provide guidance and acceptance criteria needed to establish a uniform minimum level of performance for the designer, manufacturer, owner, operator, and shipper. This standard is not intended to apply to special, non-routine shipments of a one-time or occasional nature which require special permitting. The background and history of the standard are traced and a summary discussion of the standard is provided in this article

  5. Weston Standard battery

    CERN Multimedia

    This is a Weston AOIP standard battery with its calibration certificate (1956). Inside, the glassware forms an "H". Its name comes from the English-born American chemist Edward Weston. A standard is the materialization of a given quantity whose value is known with great accuracy.

  6. The German radiation protection standards

    International Nuclear Information System (INIS)

    Becker, Klaus; Neider, Rudolf

    1977-01-01

    The German Standards Institute (DIN Deutsches Institut fuer Normung, Berlin) is engaged in health physics standards development in the following committees. The Nuclear Standards Committee (NKe), which deals mainly with nuclear science and technology, the fuel cycle, and radiation protection techniques. The Radiology Standards Committee (FNR), whose responsibilities are traditionally the principles of radiation protection and dosimetry, applied medical dosimetry, and medical health physics. The German Electrotechnical Commission (DKE), which is concerned mostly with instrumentation standards. The Material Testing Committee (FNM), which is responsible for radiation protection in nonmedical radiography. The current body of over one hundred standards and draft standards was established to supplement the Federal German radiation protection legislation, because voluntary standards can deal in more detail with the specific practical problems. The number of standards is steadily expanding due to the vigorous efforts of about thirty working groups, consisting of essentially all leading German experts of this field. Work is supported by the industry and the Federal Government. A review of the present status and future plans, and of the international aspects with regard to European and world (ISO, etc.) standards will be presented

  7. SOFG: Standards requirements

    International Nuclear Information System (INIS)

    Gerganov, T.; Grigorov, S.; Kozhukharov, V.; Brashkova, N.

    2005-01-01

    It is well-known that Solid Oxide Fuel Cells will have industrial application in the near future. In this context, the standardization of SOFC materials and SOFC systems is of high priority. In the present study, attention is focused on the methods for physical and chemical characterization of the materials used to fabricate SOFC components, and on the requirements for testing single SOFC cells. The status of the CEN, ISO, ASTM (ANSI, ASSN) and JIS classes of standards has been verified. Standards regarding the test methods for physical-chemical characterization of vitreous materials (as the sealing SOFC component), ceramic materials (as electrode and electrolyte components, including alternative materials used) and metallic materials (interconnect components) are the subject of the overview. It is established that electrical, mechanical, surface and interfacial phenomena, chemical durability and thermal corrosion behaviour are the key areas for standardization of the materials for SOFC components.

  8. Revision of the ASME nuclear quality assurance standard and its historical background

    International Nuclear Information System (INIS)

    Suzuki, Tetsuya

    2009-01-01

    ASME NQA-1-2008 'Quality Assurance Requirements for Nuclear Facility Applications' will be endorsed by US NRC by the end of 2009. This standard will apply to design, construction and operation of nuclear power plants newly erected in USA. It is important to Japanese vendors developing nuclear business in USA. Historical background, significance of revision and main revised points of the ASME nuclear quality assurance standard are described in the present paper. (T. Tanaka)

  9. Gender-Specific Effects of Artificially Induced Gender Beliefs in Mental Rotation

    Science.gov (United States)

    Heil, Martin; Jansen, Petra; Quaiser-Pohl, Claudia; Neuburger, Sarah

    2012-01-01

    Men outperform women in the Mental Rotation Test (MRT) by about one standard deviation. The present study replicated a gender belief account [Moe, A., & Pazzaglia, F. (2006). Following the instructions! Effects of gender beliefs in mental rotation. Learning and Individual Differences, 16, 369-377.] for (part of) this effect. A sample of 300…

  10. Neonatal therapeutic hypothermia outside of standard guidelines: a survey of U.S. neonatologists.

    Science.gov (United States)

    Burnsed, Jennifer; Zanelli, Santina A

    2017-11-01

    Therapeutic hypothermia is the standard of care in term infants with moderate-to-severe hypoxic-ischaemic encephalopathy (HIE). The goal of this survey was to explore the attitudes of U.S. neonatologists caring for infants with HIE who fall outside of current guidelines. A case-based survey was administered to members of the Section on Neonatal-Perinatal Medicine of the American Academy of Pediatrics. A total of 447 responses were analysed, a response rate of 19%. We found significant variability amongst U.S. neonatologists with regard to the use of therapeutic hypothermia for infants with HIE who fall outside standard inclusion criteria. Scenarios with the most variability included HIE in a late preterm infant and HIE following a postnatal code. Provision of therapeutic hypothermia outside of standard guidelines was not influenced by number of years in practice, neonatal intensive care unit (NICU) type, or NICU size. Significant variability in practice exists when caring for infants with HIE who do not meet standard inclusion criteria, emphasizing the need for continued and rigorous research in this area. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  11. Accident analysis for aircraft crash into hazardous facilities: DOE standard

    International Nuclear Information System (INIS)

    1996-10-01

    This standard provides the user with sufficient information to evaluate and assess the significance of aircraft crash risk on facility safety without expending excessive effort where it is not required. It establishes an approach for performing a conservative analysis of the risk posed by a release of hazardous radioactive or chemical material resulting from an aircraft crash into a facility containing significant quantities of such material. This can establish whether a facility has a significant potential for an aircraft impact and whether this has the potential for producing significant offsite or onsite consequences. General implementation guidance, screening and evaluation guidelines, and methodologies for the evaluations are included

  12. Dental Hygiene Faculty Calibration Using Two Accepted Standards for Calculus Detection: A Pilot Study.

    Science.gov (United States)

    Santiago, Lisa J; Freudenthal, Jacqueline J; Peterson, Teri; Bowen, Denise M

    2016-08-01

    Faculty calibration studies for calculus detection use two different standards for examiner evaluation, yet the only therapeutic modality that can be used for nonsurgical periodontal treatment is scaling/root debridement or planing. In this study, a pretest-posttest design was used to assess the feasibility of faculty calibration for calculus detection using two accepted standards: that established by the Central Regional Dental Testing Service, Inc. (CRDTS; readily detectible calculus) and the gold standard for scaling/root debridement (root roughness). Four clinical dental hygiene faculty members out of five possible participants at Halifax Community College agreed to participate. The participants explored calculus on the 16 assigned teeth (64 surfaces) of four patients. Calculus detection scores were calculated before and after training. Kappa averages using CRDTS criteria were 0.561 at pretest and 0.631 at posttest. Kappa scores using the scaling/root debridement or planing standard were 0.152 at pretest and 0.271 at posttest. The scores indicated improvement from moderate (Kappa=0.41-0.60) to substantial agreement (Kappa=0.61-0.80) following training using the CRDTS standard. Although this result differed qualitatively and Kappas were significantly different from 0, the differences for pre- to post-Kappas for patient-rater dyads using CRDTS were not statistically significant (p=0.778). There was no difference (p=0.913) in Kappa scores pre- to post-training using the scaling/root debridement standard. Despite the small number of participants in this study, the results indicated that training to improve interrater reliability to substantial agreement was feasible using the CRDTS standard but not using the gold standard. The difference may have been due to greater difficulty in attaining agreement regarding root roughness. Future studies should include multiple training sessions with patients using the same standard for scaling/root debridement used for
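
    The Kappa statistic used in this study is ordinary chance-corrected inter-rater agreement; the sketch below shows how a pre/post training comparison against a reference examiner could be computed. The rater calls are made up for illustration, and scikit-learn is an assumed dependency, not the study's software.

      # Chance-corrected agreement between a faculty rater and the reference
      # examiner on 64 tooth surfaces (1 = calculus detected, 0 = not detected).
      from sklearn.metrics import cohen_kappa_score

      reference  = [1, 0, 1, 1, 0, 0, 1, 0] * 8   # reference-standard calls, 64 surfaces
      rater_pre  = [1, 0, 0, 1, 0, 1, 1, 0] * 8   # faculty member before training
      rater_post = [1, 0, 1, 1, 0, 0, 1, 1] * 8   # faculty member after training

      print("pre-training kappa: ", cohen_kappa_score(reference, rater_pre))
      print("post-training kappa:", cohen_kappa_score(reference, rater_post))
      # Bands used in the study: 0.41-0.60 moderate, 0.61-0.80 substantial agreement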

  13. Predictors of awareness of standard drink labelling and drinking guidelines to reduce negative health effects among Australian drinkers.

    Science.gov (United States)

    Coomber, Kerri; Jones, Sandra C; Martino, Florentine; Miller, Peter G

    2017-03-01

    This study examined rates of awareness of standard drink labelling and drinking guidelines among Australian adult drinkers. Demographic predictors of these two outcomes were also explored. Online survey panel participants aged 18-45 years (n = 1061; mean age = 33.2 years) completed an online survey assessing demographics, alcohol consumption patterns, awareness of standard drink labels and the National Health and Medical Research Council (NHMRC) guidelines, and support for more detailed labels. The majority (80%) of participants had seen standard drink labels on alcohol products, with younger drinkers, those from a regional/rural location and high-risk drinkers significantly more likely to have seen such labelling. Most respondents estimated at or below the maximum number of drinks stipulated in the NHMRC guidelines. However, their estimates of the levels for male drinkers were significantly higher than for female drinkers. High-risk drinkers were significantly less likely to provide accurate estimates, while those who had seen the standard drink logo were significantly more likely to provide accurate estimates of drinking levels to reduce the risk of long-term harms only. Just under three-quarters of respondents supported the inclusion of more information on labels regarding guidelines to reduce negative health effects. The current standard drink labelling approach fails to address high-risk drinkers. The inclusion of information about NHMRC guidelines on alcohol labels, and placing standard drink labelling on the front of products, could improve awareness of what constitutes a standard drink and safe levels of consumption among Australian drinkers. [Kerri Coomber, Sandra C. Jones, Florentine Martino, Peter G. Miller. Predictors of awareness of standard drink labelling and drinking guidelines to reduce negative health effects among Australian drinkers. Drug Alcohol Rev 2017;36:200-209]. © 2016 Australasian Professional Society on Alcohol and other Drugs.

  14. Technical standards in nuclear area

    International Nuclear Information System (INIS)

    Grimberg, M.

    1978-01-01

    Technical standardization in the nuclear area is discussed, and the competence of CNEN in the pursuit of standardization is analysed. The process of drawing up technical standards is explained, and some kinds of technical standards are discussed. (author) [pt

  15. Developing and enforcing internal information systems standards: InduMaker’s Standards Management Process

    Directory of Open Access Journals (Sweden)

    Claudia Loebbecke

    2016-01-01

    Full Text Available It is widely agreed that standards provide numerous benefits when available and enforced. Company-internal Information Systems (IS) management procedures and solutions, in the following coined IS ‘standards’, allow for harmonizing operations between company units, locations and even different service providers. However, many companies lack an organized process for defining and managing internal IS standards, which causes uncertainties and delays in decision making, planning, and design processes. In this case study of the globally operating InduMaker (anonymized company name), an established manufacturing supplier, we look into the company-internal management of IS standards. Theoretically grounded in the organizational and IS-focused literature on business process modelling and business process commoditization, we describe and investigate InduMaker’s newly developed Standard Management Process (SMP) for defining and managing company-internal business and IS standards, with which the multinational aims to offer clear answers to business and IT departments about existing IS standards, their degree of obligation, applicability, and scope at any time.

  16. Governing through standards

    DEFF Research Database (Denmark)

    Brøgger, Katja

    This abstract addresses the ways in which new education standards have become integral to new modes of education governance. The paper explores the role of standards in accelerating the shift from national to transnational governance in higher education. Drawing on the case of higher education...

  17. DROUGHT FORECASTING BASED ON MACHINE LEARNING OF REMOTE SENSING AND LONG-RANGE FORECAST DATA

    Directory of Open Access Journals (Sweden)

    J. Rhee

    2016-06-01

    Full Text Available The reduction of drought impacts may be achieved through sustainable drought management and proactive measures against drought disaster. Accurate and timely provision of drought information is essential. In this study, drought forecasting models were developed to provide high-resolution drought information, based on drought indicators, for ungauged areas. The developed models predict the drought indices of the 6-month Standardized Precipitation Index (SPI6) and the 6-month Standardized Precipitation Evapotranspiration Index (SPEI6). An interpolation method based on multiquadric splines was tested, as well as three machine learning models (Decision Tree, Random Forest, and Extremely Randomized Trees), to enhance the provision of drought initial conditions based on remote sensing data, since the initial condition is one of the most important factors for drought forecasting. Machine learning-based methods performed better than the interpolation method for both classification and regression, and the methods using climatology data outperformed the methods using the long-range forecast. The model based on climatological data and the machine learning method performed best overall.
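
    A minimal sketch of the tree-ensemble step described above, learning SPI6 from remote-sensing predictors; the file name and feature columns are hypothetical, not the study's actual data:

      # Train an Extremely Randomized Trees regressor to predict SPI6 from
      # remote-sensing features and report held-out error.
      import pandas as pd
      from sklearn.ensemble import ExtraTreesRegressor
      from sklearn.metrics import mean_absolute_error
      from sklearn.model_selection import train_test_split

      df = pd.read_csv("drought_training_table.csv")      # hypothetical predictor table
      features = ["ndvi_anomaly", "lst_anomaly", "soil_moisture", "precip_estimate"]
      X_train, X_test, y_train, y_test = train_test_split(
          df[features], df["spi6"], test_size=0.25, random_state=0)

      model = ExtraTreesRegressor(n_estimators=500, random_state=0)
      model.fit(X_train, y_train)
      print("held-out MAE:", mean_absolute_error(y_test, model.predict(X_test)))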

  18. The Standard Model

    International Nuclear Information System (INIS)

    Sutton, Christine

    1994-01-01

    The initial evidence from Fermilab for the long awaited sixth ('top') quark puts another rivet in the already firm structure of today's Standard Model of physics. Analysis of the Fermilab CDF data gives a top mass of 174 GeV with an error of ten per cent either way. This falls within the mass band predicted by the sum total of world Standard Model data and underlines our understanding of physics in terms of six quarks and six leptons. In this specially commissioned overview, physics writer Christine Sutton explains the Standard Model

  19. International radiofrequency standards

    International Nuclear Information System (INIS)

    Lincoln, J.

    2001-01-01

    Of the various radiofrequency standards in use around the world, many are based on or similar to the Guidelines published by ICNIRP (The International Commission on Non-ionising Radiation Protection). This organisation is a working group operating in co-operation with the Environmental Health division of the World Health Organisation (WHO). This paper presents a very brief overview of current international standards, beginning with a summary of the salient points of the ICNIRP Guidelines. It should be remembered that these are guidelines only and do not exist as a separate standard. Copyright (2001) Australasian Radiation Protection Society Inc

  20. GENERIC QUALITY STANDARDS VS. SPECIFIC QUALITY STANDARDS: THE CASE OF HIGHER EDUCATION

    Directory of Open Access Journals (Sweden)

    Laila El Abbadi

    2011-06-01

    Full Text Available Quality as a new requirement for the field of higher education leads institutions to seek to satisfy generic or specific quality standards imposed directly or indirectly by their customers. The aim of this study is to compare ISO9001, as a generic quality standard, with the Code of Practice of the Quality Assurance Agency for Higher Education (QAA), as a specific quality standard. A correlation matrix is drawn and correlation rates are calculated to show similarities and differences between them. This paper shows, first, that ISO9001 and the QAA Code of Practice are compatible. Second, implementing a quality management system in accordance with ISO9001 requirements can constitute an adequate framework for the application of the QAA Code of Practice requirements. Third, to make the ISO9001 requirements closer to a specific quality standard in the field of higher education, it is recommended to complement this standard with requirements specific to the field of higher education.

  1. PRAXIOCENTRALISM IN THE PROFESSIONAL STANDARD OF THE TEACHER

    Directory of Open Access Journals (Sweden)

    L. Y. Monakhova

    2017-01-01

    position of praxiocentralism in the professional standard are highlighted; criteria and indicators for evaluating the labour activity of the teacher from the point of view of its efficiency and effectiveness are given. Practical significance. This study contributes to the solution of theoretical and practical problems of correlating the existing Federal educational state standards (FESS) and professional standards. This is especially important due to the possibility of subsequent approval and implementation into pedagogical practice of the next generation of FESS developed on the basis of professional standards.

  2. European standards for composite construction

    NARCIS (Netherlands)

    Stark, J.W.B.

    2000-01-01

    The European Standards Organisation (CEN) has planned to develop a complete set of harmonized European building standards. This set includes standards for composite steel and concrete buildings and bridges. The Eurocodes, being the design standards, form part of this total system of European

  3. IEEE [Institute of Electrical and Electronics Engineers] standards and nuclear software quality engineering

    International Nuclear Information System (INIS)

    Daughtrey, T.

    1988-01-01

    Significant new nuclear-specific software standards have recently been adopted under the sponsorship of the American Nuclear Society and the American Society of Mechanical Engineers. The interest of the US Nuclear Regulatory Commission has also been expressed through its issuance of NUREG/CR-4640. These efforts all indicate a growing awareness of the need for thorough, referenceable expressions of the way to build in and evaluate quality in nuclear software. A broader professional perspective can be seen in the growing number of software engineering standards sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Computer Society. This family of standards represents a systematic effort to capture professional consensus on quality practices throughout the software development life cycle. The only omission, the implementation phase, is covered by accepted American National Standards Institute standards or de facto standards for programming languages.

  4. Accounting and Co-constructing: The Development of a Standard for Electronic Health Records

    DEFF Research Database (Denmark)

    Bossen, Claus

    2011-01-01

    care services, cooperation between staff and make patient records immediately accessible to distributed actors. Investments also aim at making health care services more accountable and integrated, and at increasing quality and efficiency. This paper analyses a Danish national standard...... for electronic health records on the basis of an application prototype test built on that standard. The analysis shows that inscribed in the standard is an ambition to make staff and health care services more accountable at the cost of more work, loss of overview and fragmentation of patient cases. Significantly......, despite the standard having been conceived and developed in a process of co-construction with clinicians, clinicians did not find it adequate for their work. The analysis argues this was the result of the model of work embedded in the standard coming from a stance from without practice. Subsequently...

  5. Performance Evaluation of Five Turbidity Sensors in Three Primary Standards

    Science.gov (United States)

    Snazelle, Teri T.

    2015-10-28

    deviation of 0.51 percent for the operating range, which was limited to 0.01–1600 NTU at the time of this report. Test results indicated an average percent error of 19.81 percent in the three standards for the EXO turbidity sensor and 9.66 percent for the YSI 6136. The significant variability in sensor performance in the three primary standards suggests that although all three types are accepted as primary calibration standards, they are not interchangeable, and sensor results in the three types of standards are not directly comparable.

  6. Implementing the Next Generation Science Standards: Impacts on Geoscience Education

    Science.gov (United States)

    Wysession, M. E.

    2014-12-01

    This is a critical time for the geoscience community. The Next Generation Science Standards (NGSS) have been released and are now being adopted by states (a dozen states and Washington, DC, at the time of writing this), with dramatic implications for national K-12 science education. Curriculum developers and textbook companies are working hard to construct educational materials that match the new standards, which emphasize a hands-on, practice-based approach that focuses on working directly with primary data and other forms of evidence. While the set of 8 science and engineering practices of the NGSS lend themselves well to the observation-oriented approach of much of the geosciences, there is currently not a sufficient number of geoscience educational modules and activities geared toward the K-12 levels, and geoscience research organizations need to be mobilizing their education & outreach programs to meet this need. It is a rare opportunity that will not come again in this generation. There are other significant issues surrounding the implementation of the NGSS. The NGSS involves a year of Earth and space science at the high school level, but there is not a sufficient workforce of geoscience teachers to meet this need. The form and content of the geoscience standards are also very different from past standards, moving away from a memorization and categorization approach and toward a complex Earth Systems Science approach. Combined with the shift toward practice-based teaching, significant professional development will therefore be required for the existing K-12 geoscience education workforce. How the NGSS are to be assessed is another significant question, with an NRC report providing some guidance but leaving many questions unanswered. There is also an uneasy relationship between the NGSS and the Common Core of math and English, and the recent push-back against the Common Core in many states may impact the implementation of the NGSS.

  7. Experimental use of two standard tick collection methods to evaluate the relative effectiveness of several plant-derived and synthetic repellents against Ixodes scapularis and Amblyomma americanum (Acari: Ixodidae).

    Science.gov (United States)

    Schulze, Terry L; Jordan, Robert A; Dolan, Marc C

    2011-12-01

    We used two standard tick collection methods to test the relative effectiveness of two natural product compounds (nootkatone and carvacrol, classified as an eremophilene sesquiterpene and a monoterpene, respectively, that are derived from botanical sources) with commercially-available plant-derived (EcoSMART Organic Insect Repellent, comprised of plant essential oils) and permethrin-based (Repel Permanone) repellents against Ixodes scapularis Say and Amblyomma americanum (L.). Cloth drags were equally effective in sampling both species of host-seeking nymphs, whereas CO2 traps attracted primarily A. americanum. All four repellents performed well on drags, with nootkatone and Repel Permanone (100% repelled through 14 d) slightly more effective than carvacrol and EcoSMART (90.7% and 97.7% repelled at 14 d, respectively) at repelling I. scapularis nymphs. Although the same trend in percent repellency was noted in the CO2 trap trial against both A. americanum nymphs and adults, EcoSMART outperformed Permanone in repelling A. americanum nymphs after 14 d in the drag trial. Generally, the effectiveness of all repellents tested declined over time. Tick drags and CO2 traps were rapid, inexpensive, and easy to use in determining the relative effectiveness of repellents in the field.

  8. INTERDEPENDENCIES OF THE INTERNAL / MANAGERIAL CONTROL STANDARD NO. 6 - ORGANIZATIONAL STRUCTURE

    Directory of Open Access Journals (Sweden)

    Ionut-Cosmin BĂLOI

    2015-06-01

    Full Text Available Our initiative to analyse the internal control standard dealing with the organizational structure stems from observations on the significance of these essential aspects of modern management and on the sensitivity with which this standard is treated in most of the public institutions considered representative of the Oltenia region. Although the administrators of public institutions strive to optimize their systems of internal/managerial control, they frequently face difficulties arising from misunderstanding of these standards, which are only vaguely explained, for example in guidelines and other documents. The hypothesis of our study is that most public institutions face gaps in understanding, interpreting, adapting and implementing an effective model of organizational structure, and that the causes lie in the lack of an interdependent, correlated approach to the pillars that support the internal/managerial control system: the 25 standards required by Romanian legislation. Our study critically describes the superficial approach found in the self-evaluation reports of the public institutions, considering only the conformity of the organizational structure and the four standards that we regard as inextricably related to this internal/managerial control standard. From a methodological point of view, our study tests the correlation between the level of compliance with these standards and the functionality of the system they compose in the public organizations that we investigated.

  9. Development of job standards for clinical nutrition therapy for dyslipidemia patients.

    Science.gov (United States)

    Kang, Min-Jae; Seo, Jung-Sook; Kim, Eun-Mi; Park, Mi-Sun; Woo, Mi-Hye; Ju, Dal-Lae; Wie, Gyung-Ah; Lee, Song-Mi; Cha, Jin-A; Sohn, Cheong-Min

    2015-04-01

    Dyslipidemia has significantly contributed to the increase of death and morbidity rates related to cardiovascular diseases. Clinical nutrition services provided by dietitians have been reported to have a positive effect on relieving medical symptoms and reducing further medical costs. However, there is a lack of research identifying key competencies and job standards for clinical dietitians caring for patients with dyslipidemia. Therefore, the purpose of this study was to analyze the job components of clinical dietitians and develop standards for professional practice to provide effective nutrition management for dyslipidemia patients. The current status of clinical nutrition therapy for dyslipidemia patients in hospitals with 300 or more beds was studied. Duty tasks and task elements of the nutrition care process for dyslipidemia clinical dietitians were then derived using the developing a curriculum (DACUM) analysis method. The developed job standards were pretested in order to evaluate job performance, difficulty, and the standards themselves. As a result, the job standard included four jobs, 18 tasks, and 53 task elements, and the specific job description includes 73 basic services and 26 recommended services. When clinical dietitians managing dyslipidemia patients performed their practice according to this job standard for 30 patients, the job performance rate was 68.3%. Therefore, the job standards for clinical nutrition services for dyslipidemia patients proposed in this study can be effectively used by hospitals.

  10. Legal consequences of standard setting for competitive athletes with cardiovascular abnormalities.

    Science.gov (United States)

    Weistart, J C

    1985-12-01

    This paper addresses the issue of whether establishing consensus standards for the treatment of particular medical conditions increases a physician's exposure to legal liability. The conclusion reached is that the legal effects of standard setting, rather than representing a significant threat of liability, should be seen as beneficial to the medical profession. A fundamental point is that the legal test for liability is entirely dependent on the medical profession's definition of what constitutes adequate care. The law incorporates the standard of care defined by the medical profession and does not impose an external norm. In the absence of formally stated standards, the process of defining relevant medical criteria will involve a great deal of uncertainty. Outcomes of legal contests will be affected by such extraneous factors as the relative experience of the lawyers involved, their access to knowledgeable expert witnesses, and their strategic decisions with respect to tactics and procedures. Establishment of formal standards has the salutary effect of limiting the influence of these factors and thus reducing the randomness of the results reached. Formal standards also have the advantage of being easily replicated in unrelated proceedings and thereby contribute to the development of a consistent, evenly applied rule of liability. Finally, even if formal standards are either more or less progressive than the actual state of medical practice, there is relatively little risk that they will produce untoward results.

  11. Automotive Technology Skill Standards

    Science.gov (United States)

    Garrett, Tom; Asay, Don; Evans, Richard; Barbie, Bill; Herdener, John; Teague, Todd; Allen, Scott; Benshoof, James

    2009-01-01

    The standards in this document are for Automotive Technology programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school automotive program. Minimally, the student will complete a three-year program to achieve all standards. Although these exit-level standards are designed…

  12. Do state renewable portfolio standards promote in-state renewable generation

    International Nuclear Information System (INIS)

    Yin, Haitao; Powers, Nicholas

    2010-01-01

    Several US states have passed renewable portfolio standard (RPS) policies in order to encourage investment in renewable energy technologies. Existing research on their effectiveness has either employed a cross-sectional approach or has ignored heterogeneity among RPS policies. In this paper, we introduce a new measure for the stringency of an RPS that explicitly accounts for some RPS design features that may have a significant impact on the strength of an RPS. We also investigate the impacts of renewable portfolio standards on in-state renewable electricity development using panel data and our new measure of RPS stringency, and compare the results with those obtained when alternative measures are used. Using our new measure, the results suggest that RPS policies have had a significant and positive effect on in-state renewable energy development, a finding which is masked when design differences among RPS policies are ignored. We also find that another important design feature, allowing 'free trade' of RECs, can significantly weaken the impact of an RPS. These results should prove instructive to policy makers, whether considering the development of a federal-level RPS or the development or redesign of a state-level RPS. (author)
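
    The record describes a panel-data estimate of the effect of RPS stringency on in-state renewable generation, including the weakening effect of REC free trade. A minimal sketch of a two-way fixed-effects regression in that general spirit, assuming a hypothetical data file, column names, and interaction term; this is not the authors' actual specification or stringency measure.

```python
# Hypothetical sketch: two-way fixed-effects panel regression of in-state
# renewable generation share on an RPS stringency measure, with an interaction
# for REC free trade. File and column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

# One row per state-year: state, year, renew_share, rps_stringency, rec_free_trade (0/1)
df = pd.read_csv("rps_panel.csv")

model = smf.ols(
    "renew_share ~ rps_stringency + rps_stringency:rec_free_trade"
    " + C(state) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["state"]})  # cluster SEs by state

print(model.summary())
```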

  13. Factors influencing incidence of acute grade 2 morbidity in conformal and standard radiation treatment of prostate cancer

    International Nuclear Information System (INIS)

    Hanks, Gerald E.; Schultheiss, Timothy E.; Hunt, Margie A.; Epstein, Barry

    1995-01-01

    Purpose: The fundamental hypothesis of conformal radiation therapy is that tumor control can be increased by using conformal treatment techniques that allow a higher tumor dose while maintaining an acceptable level of complications. To test this hypothesis, it is necessary first to estimate the incidence of morbidity for both standard and conformal fields. In this study, we examine factors that influence the incidence of acute grade 2 morbidity in patients treated with conformal and standard radiation treatment for prostate cancer. Methods and Materials: Two hundred and forty-seven consecutive patients treated with the conformal technique are combined with and compared to 162 consecutive patients treated with standard techniques. The conformal technique includes special immobilization by a cast, careful identification of the target volume in three dimensions, localization of the inferior border of the prostate using the retrograde urethrogram, and individually shaped portals that conform to the Planning Target Volume (PTV). Univariate analysis compares differences in the incidence of RTOG-EORTC grade 2 acute morbidity by technique, T stage, age, irradiated volume, and dose. Multivariate logistic regression includes these same variables. Results: In nearly all categories, the conformal treatment group experienced significantly fewer acute grade 2 complications than the standard treatment group. Only volume (prostate ± whole pelvis) and technique (conformal vs. standard) were significantly related to incidence of morbidity on multivariate analysis. When dose is treated as a continuous variable (rather than being dichotomized into two levels), a trend is observed on multivariate analysis, but it does not reach significance. The incidence of acute grade 2 morbidity in patients 65 years or older is significantly reduced by use of the conformal technique. Conclusion: The conformal technique is associated with fewer grade 2 acute toxicities for all patients. This
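
    The multivariate analysis mentioned above is a logistic regression of grade 2 acute morbidity on technique, T stage, age, irradiated volume, and dose. A minimal sketch of such a model, assuming a hypothetical data file and column coding; it reflects the variables named in the abstract, not the study's actual dataset.

```python
# Hypothetical sketch: multivariate logistic regression of grade 2 acute
# morbidity (0/1) on technique, T stage, age, irradiated volume, and dose.
# File and column names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("prostate_morbidity.csv")
# assumed columns: grade2 (0/1), technique ('conformal'/'standard'),
# t_stage, age (years), volume ('prostate'/'pelvis'), dose (Gy, continuous)

model = smf.logit(
    "grade2 ~ C(technique) + C(t_stage) + age + C(volume) + dose",
    data=df,
).fit()

print(model.summary())        # coefficients on the log-odds scale
print(np.exp(model.params))   # odds ratios
```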

  14. Difficulties Using Standardized Tests to Identify the Receptive Expressive Gap in Bilingual Children's Vocabularies.

    Science.gov (United States)

    Gibson, Todd A; Oller, D Kimbrough; Jarmulowicz, Linda

    2018-03-01

    Receptive standardized vocabulary scores have been found to be much higher than expressive standardized vocabulary scores in children with Spanish as L1, learning L2 (English) in school (Gibson et al., 2012). Here we present evidence suggesting the receptive-expressive gap may be harder to evaluate than previously thought because widely used standardized tests may not offer comparable normed scores. Furthermore, monolingual Spanish-speaking children tested in Mexico and monolingual English-speaking children tested in the US showed other, yet different, statistically significant discrepancies between receptive and expressive scores. Results suggest that comparisons across widely used standardized tests in attempts to assess a receptive-expressive gap are precarious.

  15. Leukoaraiosis significantly worsens driving performance of ordinary older drivers.

    Directory of Open Access Journals (Sweden)

    Kimihiko Nakano

    Full Text Available BACKGROUND: Leukoaraiosis is defined as extracellular space caused mainly by atherosclerotic or demyelinated changes in the brain tissue and is commonly found in the brains of healthy older people. A significant association between leukoaraiosis and traffic crashes was reported in our previous study; however, the reason for this is still unclear. METHOD: This paper presents a comprehensive evaluation of driving performance in ordinary older drivers with leukoaraiosis. First, the degree of leukoaraiosis was examined in 33 participants, who underwent an actual-vehicle driving examination on a standard driving course, and a driver skill rating was also collected while the driver carried out a paced auditory serial addition test, which is a calculating task given verbally. At the same time, a steering entropy method was used to estimate steering operation performance. RESULTS: The experimental results indicated that a normal older driver with leukoaraiosis was readily affected by external disturbances and made more operation errors and steered less smoothly than one without leukoaraiosis during driving; at the same time, their steering skill significantly deteriorated. CONCLUSIONS: Leukoaraiosis worsens the driving performance of older drivers because of their increased vulnerability to distraction.
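
    The steering entropy method cited above is commonly formulated as predicting each steering-angle sample from the three preceding samples and computing the entropy of the binned prediction errors relative to a baseline scale alpha. A minimal sketch under those assumptions; the sampling interval, bin edges, and synthetic data below are illustrative and may differ from the procedure actually used in the study.

```python
# Hypothetical sketch of a steering entropy calculation: predict each steering
# sample by second-order extrapolation from the three preceding samples, bin the
# prediction errors into nine bins scaled by a baseline alpha (90th percentile of
# baseline error), and take the base-9 entropy of the bin frequencies.
import numpy as np

def prediction_errors(theta):
    """Second-order prediction errors for a series of steering-angle samples."""
    theta = np.asarray(theta, dtype=float)
    pred = theta[2:-1] + (theta[2:-1] - theta[1:-2]) + 0.5 * (
        (theta[2:-1] - theta[1:-2]) - (theta[1:-2] - theta[:-3])
    )
    return theta[3:] - pred

def steering_entropy(theta, alpha):
    """Base-9 entropy of prediction errors binned relative to baseline alpha."""
    e = prediction_errors(theta)
    inner = np.array([-5, -2.5, -1, -0.5, 0.5, 1, 2.5, 5]) * alpha
    edges = np.concatenate(([-np.inf], inner, [np.inf]))
    counts, _ = np.histogram(e, bins=edges)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p) / np.log(9)).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    baseline = np.cumsum(rng.normal(0.0, 0.20, 600))    # baseline (undistracted) trace
    alpha = np.percentile(np.abs(prediction_errors(baseline)), 90)
    distracted = np.cumsum(rng.normal(0.0, 0.35, 600))  # trace during a secondary task
    print(f"steering entropy: {steering_entropy(distracted, alpha):.3f}")
```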

  16. Radiological Evaluation Standards in the Radiology Department of Shahid Beheshti Hospital (RAH YASUJ Based on Radiology standards in 92

    Directory of Open Access Journals (Sweden)

    A َKalantari

    2014-08-01

    Full Text Available Background & aim: The performance and safety of radiology personnel's work are among the most important factors in increasing the quality and quantity of imaging services. This study aimed to evaluate compliance with radiological standards in the radiology department of Shahid Beheshti Hospital of Yasuj, Iran, in 2013. Methods: This cross-sectional study was based on 118 randomly selected radiographs and a rating list; assessment against the radiology standards was performed twice. Data were analyzed using descriptive statistics. Results: Cassette selection was in accordance with standard practice in 87.3% of cases, patient positioning in 76.3%, member (limb) positioning in 87.3%, central ray alignment in 83.9%, and tube-to-patient distance in 68.6%. Statistically significant relationships (p<0.05) were observed between view and cassette, view and SID, sex and patient position, grid and central ray, grid and SID, request and patient positioning, and density and patient and member position. Conclusions: Staff and students performed at a high level, but protection practices were in poor condition. Therefore, in order to promote protection, education and periodic monitoring should be carried out continuously.

  17. 75 FR 64684 - Cost Accounting Standards: Elimination of the Exemption From Cost Accounting Standards for...

    Science.gov (United States)

    2010-10-20

    ... cost accounting standards governing the measurement, assignment, and allocation of costs to contracts... Accounting Standards: Elimination of the Exemption From Cost Accounting Standards for Contracts Executed and... and Budget (OMB), Office of Federal Procurement Policy, Cost Accounting Standards Board. ACTION...

  18. Frequency standards

    CERN Document Server

    Riehle, Fritz

    2006-01-01

    Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards
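
    A standard figure of merit when characterising frequency standards is the Allan deviation of fractional-frequency data. As a rough illustration (not an example from the book), a minimal sketch of the non-overlapping Allan deviation, assuming synthetic white-frequency-noise data:

```python
# Hypothetical sketch: non-overlapping Allan deviation of fractional-frequency
# samples y, a common stability measure for frequency standards.
# sigma_y^2(tau) = 0.5 * mean( (ybar_{i+1} - ybar_i)^2 ), where the ybar_i are
# averages of y over consecutive windows of m samples (tau = m * tau_0).
import numpy as np

def allan_deviation(y, m):
    """Allan deviation at averaging factor m for fractional-frequency samples y."""
    y = np.asarray(y, dtype=float)
    n_blocks = len(y) // m
    ybar = y[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    diffs = np.diff(ybar)
    return float(np.sqrt(0.5 * np.mean(diffs ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = rng.normal(0.0, 1e-12, 10_000)  # synthetic white-frequency-noise data
    for m in (1, 10, 100):
        print(f"m = {m:4d}  sigma_y = {allan_deviation(y, m):.3e}")
```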

  19. 75 FR 15440 - Guidance for Industry on Standards for Securing the Drug Supply Chain-Standardized Numerical...

    Science.gov (United States)

    2010-03-29

    ...] Guidance for Industry on Standards for Securing the Drug Supply Chain--Standardized Numerical... industry entitled ``Standards for Securing the Drug Supply Chain-Standardized Numerical Identification for... the Drug Supply Chain-Standardized Numerical Identification for Prescription Drug Packages.'' In the...

  20. Repeated Interaction in Standard Setting

    NARCIS (Netherlands)

    Larouche, Pierre; Schütt, Florian

    2016-01-01

    As part of the standard-setting process, certain patents become essential. This may allow the owners of these standard-essential patents to hold up implementers of the standard, who can no longer turn to substitute technologies. However, many real-world standards evolve over time, with several