WorldWideScience

Sample records for large-scale high-stakes assessments

  1. Stereotype Threat, Inquiring about Test Takers' Race and Gender, and Performance on Low-Stakes Tests in a Large-Scale Assessment. Research Report. ETS RR-15-02

    Science.gov (United States)

    Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent

    2015-01-01

    This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…

  2. The Mediating Role of Textbooks in High-Stakes Assessment Reform

    Science.gov (United States)

    Leung, Ching Yin; Andrews, Stephen

    2012-01-01

    Whenever high-stakes assessment/curriculum reforms take place, new textbooks appear on the market. These textbooks inevitably play a significant mediating role in the implementation of any reform and on teaching and learning. This paper reports on a small-scale study which attempts to investigate the role of textbooks in the mediation of a…

  3. Large Stroke High Fidelity PZN-PT Single-Crystal "Stake" Actuator.

    Science.gov (United States)

    Huang, Yu; Xia, Yuexue; Lin, Dian Hua; Yao, Kui; Lim, Leong Chew

    2017-10-01

    A new piezoelectric actuator design, called the "Stake" actuator, is proposed and demonstrated in this paper. As an example, the stake actuator is made of four d32-mode PZN-5.5%PT single crystals (SCs), each of 25 mm (L) × 8 mm (W) × 0.4 mm (T) in dimensions, bonded with the aid of polycarbonate edge guide-cum-stiffeners into a square-pipe configuration for improved bending and twisting strengths, and capped with top and bottom pedestals made of 1.5-mm-thick anodized aluminum. The resultant stake actuator measured 9 mm × 9 mm × 28 mm. The hollow structure is a key design feature, which optimizes SC usage efficiency and lowers the overall cost of the actuator. The displacement-voltage responses, blocking forces, and resonance characteristics of the fabricated stake actuator, as well as load and temperature effects, are measured and discussed. Since d32 is negative for [011]-poled SC, the "Stake" actuator contracts in the axial direction when a positive-polarity field is applied to the crystals. Biased drive is thus recommended when extensional displacement is desired. The SC stake actuator has negligible hysteresis (0.13%) when driven up to +300 V (i.e., 0.75 kV/mm), which is close to the rhombohedral-to-orthorhombic transformation field (ERO) of 0.85 kV/mm of the SC used. The stake actuator displays a stroke of [Formula: see text] (at +300 V) despite its small overall dimensions, and has a blocking force of 114 N. The fabricated SC d32 stake actuator displays more than 30% larger axial strain than state-of-the-art PZT stack actuators of comparable length, as well as moderate blocking forces. Such actuators are thus ideal for applications where large displacements with simple open-loop control are preferred.
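
    The reported drive conditions allow a quick plausibility check of the stroke figure. A minimal back-of-envelope sketch, assuming an illustrative d32 magnitude for [011]-poled PZN-PT (the coefficient below is an assumption, not a value from the paper):

    ```python
    # Free-stroke estimate for a d32-mode actuator: strain = d32 * E,
    # stroke = strain * L. The d32 value is an assumed, illustrative figure.
    d32 = -1400e-12   # m/V, assumed magnitude for PZN-5.5%PT
    V = 300.0         # drive voltage, V
    t = 0.4e-3        # crystal thickness, m  (field E = V/t = 0.75 kV/mm)
    L = 25e-3         # active crystal length, m

    E = V / t                    # electric field, V/m
    strain = d32 * E             # transverse strain (negative: contraction)
    stroke = strain * L          # axial displacement, m
    print(f"E = {E/1e6:.2f} kV/mm, stroke = {stroke*1e6:.1f} um")
    ```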

  4. The Measurement Invariance of the Student Opinion Survey across English and non-English Language Learner Students within the Context of Low- and High-Stakes Assessments

    Directory of Open Access Journals (Sweden)

    Jason C. Immekus

    2016-09-01

    Student effort on large-scale assessments has important implications for the interpretation and use of scores to guide decisions. Within the United States, English Language Learners (ELLs) generally are outperformed on large-scale assessments by non-ELLs, prompting research to examine factors associated with test performance. There is a gap in the literature regarding the test-taking motivation of ELLs compared to non-ELLs and whether existing measures have similar psychometric properties across groups. The Student Opinion Survey (SOS; Sundre, 2007) was designed to be administered after completion of a large-scale assessment to operationalize students' test-taking motivation. Based on data obtained on 5,257 (41.8% ELL) 10th grade students, the purpose of this study was to test the measurement invariance of the SOS across ELLs and non-ELLs based on completion of low- and high-stakes assessments. Preliminary item analyses supported the removal of two SOS items (Items 3 and 7), which resulted in improved internal consistency for each of the two SOS subscales: Importance and Effort. A subsequent multi-sample confirmatory factor analysis (MCFA) supported the measurement invariance of the scale's two-factor model across language groups, indicating it met strict factorial invariance (Meredith, 1993). A follow-up latent means analysis found that ELLs had higher effort on both the low- and high-stakes assessments, with a small effect size. Effect size estimates indicated negligible differences on the importance factor. Although the instrument can be expected to function similarly across diverse language groups, which may have direct utility for test users and for research into factors associated with large-scale test performance, continued research is recommended. Implications for SOS use in applied and research settings are discussed.
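
    The item-level step in this record (dropping Items 3 and 7 to improve internal consistency) can be illustrated with a generic Cronbach's-alpha check. This is a sketch on simulated data, not the study's actual analysis; the column names are hypothetical:

    ```python
    import numpy as np
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for a block of item scores (rows = examinees)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Hypothetical SOS-style item columns on a 1-5 Likert scale.
    rng = np.random.default_rng(0)
    df = pd.DataFrame(rng.integers(1, 6, size=(500, 10)),
                      columns=[f"sos_{i}" for i in range(1, 11)])

    full = cronbach_alpha(df)
    reduced = cronbach_alpha(df.drop(columns=["sos_3", "sos_7"]))
    print(f"alpha, all items: {full:.3f}; without items 3 and 7: {reduced:.3f}")
    ```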

  5. High-Stakes Educational Testing and Democracy--Antagonistic or Symbiotic Relationship?

    Science.gov (United States)

    Ydesen, Christian

    2014-01-01

    This article argues that high-stakes educational testing, along with the attendant questions of power, education access, education management and social selection, cannot be considered in isolation from society at large. Thus, high-stakes testing practices bear numerous implications for democratic conditions in society. For decades, advocates of…

  6. Split or Steal? Cooperative Behavior When the Stakes Are Large

    NARCIS (Netherlands)

    M.J. van den Assem (Martijn); D. van Dolder (Dennie); R.H. Thaler (Richard)

    2012-01-01

    We examine cooperative behavior when large sums of money are at stake, using data from the television game show Golden Balls. At the end of each episode, contestants play a variant on the classic prisoner's dilemma for large and widely ranging stakes averaging over $20,000. Cooperation…
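
    The endgame's payoff rule is a weak prisoner's dilemma: stealing weakly dominates splitting, which is what makes the observed cooperation interesting. A minimal encoding of the show's rule:

    ```python
    def split_or_steal(jackpot: float, a: str, b: str) -> tuple[float, float]:
        """Golden Balls endgame payoffs: split/split halves the jackpot,
        steal/split gives the stealer everything, steal/steal gives nothing."""
        if a == "split" and b == "split":
            return jackpot / 2, jackpot / 2
        if a == "steal" and b == "split":
            return jackpot, 0.0
        if a == "split" and b == "steal":
            return 0.0, jackpot
        return 0.0, 0.0  # both steal

    print(split_or_steal(20_000, "split", "steal"))  # (0.0, 20000)
    ```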

  7. Validating High-Stakes Testing Programs.

    Science.gov (United States)

    Kane, Michael

    2002-01-01

    Makes the point that the interpretations and use of high-stakes test scores rely on policy assumptions about what should be taught and the content standards and performance standards that should be applied. The assumptions built into an assessment need to be subjected to scrutiny and criticism if a strong case is to be made for the validity of the…

  8. Reconsidering the Impact of High-stakes Testing

    Directory of Open Access Journals (Sweden)

    Henry Braun

    2004-01-01

    Over the last fifteen years, many states have implemented high-stakes tests as part of an effort to strengthen accountability for schools, teachers, and students. Predictably, there has been vigorous disagreement regarding the contributions of such policies to increasing test scores and, more importantly, to improving student learning. A recent study by Amrein and Berliner (2002a) has received a great deal of media attention. Employing various databases covering the period 1990-2000, the authors conclude that there is no evidence that states that implemented high-stakes tests demonstrated improved student achievement on various external measures such as performance on the SAT, ACT, AP, or NAEP. In a subsequent study in which they conducted a more extensive analysis of state policies (Amrein & Berliner, 2002b), they reach a similar conclusion. However, both their methodology and their findings have been challenged by a number of authors. In this article, we undertake an extended reanalysis of one component of Amrein and Berliner (2002a). We focus on the performance of states, over the period 1992 to 2000, on the NAEP mathematics assessments for grades 4 and 8. In particular, we compare the performance of the high-stakes testing states, as designated by Amrein and Berliner, with the performance of the remaining states (conditioning, of course, on a state's participation in the relevant NAEP assessments). For each grade, when we examine the relative gains of states over the period, we find that the comparisons strongly favor the high-stakes testing states. Moreover, the results cannot be accounted for by differences between the two groups of states with respect to changes in the percent of students excluded from NAEP over the same period. On the other hand, when we follow a particular cohort (grade 4, 1992 to grade 8, 1996, or grade 4, 1996 to grade 8, 2000), we find the comparisons slightly favor the low-stakes testing states, although the discrepancy can…
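
    The core comparison described here (relative 1992-2000 NAEP gains for high-stakes versus other states) can be sketched as follows. The data are synthetic stand-ins, and Welch's t-test is only one reasonable choice, not necessarily the article's method:

    ```python
    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(1)
    # Synthetic stand-in for state-level NAEP math means, 1992 and 2000.
    n_states = 44
    df = pd.DataFrame({
        "high_stakes": rng.random(n_states) < 0.4,
        "score_1992": rng.normal(218, 5, n_states),
    })
    df["score_2000"] = df["score_1992"] + rng.normal(7, 3, n_states)
    df["gain"] = df["score_2000"] - df["score_1992"]

    hs = df.loc[df["high_stakes"], "gain"]
    other = df.loc[~df["high_stakes"], "gain"]
    t, p = stats.ttest_ind(hs, other, equal_var=False)  # Welch's t-test
    print(f"mean gain: high-stakes {hs.mean():.1f}, other {other.mean():.1f} (p={p:.2f})")
    ```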

  9. Raising the stakes: How students' motivation for mathematics associates with high- and low-stakes test achievement.

    Science.gov (United States)

    Simzar, Rahila M; Martinez, Marcela; Rutherford, Teomara; Domina, Thurston; Conley, AnneMarie M

    2015-04-01

    This study uses data from an urban school district to examine the relation between students' motivational beliefs about mathematics and high- versus low-stakes math test performance. We use ordinary least squares and quantile regression analyses and find that the association between students' motivation and test performance differs based on the stakes of the exam. Students' math self-efficacy and performance avoidance goal orientation were the strongest predictors for both exams; however, students' math self-efficacy was more strongly related to achievement on the low-stakes exam. Students' motivational beliefs had a stronger association at the low-stakes exam proficiency cutoff than they did at the high-stakes passing cutoff. Lastly, the negative association between performance avoidance goals and high-stakes performance showed a decreasing trend across the achievement distribution, suggesting that performance avoidance goals are more detrimental for lower achieving students. These findings help parse out the ways motivation influences achievement under different stakes.
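
    A minimal sketch of the OLS-plus-quantile-regression strategy the abstract describes, using statsmodels on synthetic data; the variable names and coefficients are assumptions:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 1000
    df = pd.DataFrame({
        "self_efficacy": rng.normal(0, 1, n),
        "perf_avoid": rng.normal(0, 1, n),
    })
    df["score"] = 50 + 6 * df["self_efficacy"] - 3 * df["perf_avoid"] + rng.normal(0, 10, n)

    # OLS for the conditional mean; quantile regression across the distribution
    # to see whether the motivation-achievement link varies by achievement level.
    ols = smf.ols("score ~ self_efficacy + perf_avoid", df).fit()
    print("OLS:", ols.params["perf_avoid"].round(2))
    for q in (0.1, 0.5, 0.9):
        qr = smf.quantreg("score ~ self_efficacy + perf_avoid", df).fit(q=q)
        print(f"q={q}:", qr.params["perf_avoid"].round(2))
    ```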

  10. High School Students with Learning Disabilities: Mathematics Instruction, Study Skills, and High Stakes Tests

    Science.gov (United States)

    Steele, Marcee M.

    2010-01-01

    This article reviews characteristics of high school students with learning disabilities and presents instructional modifications and study skills to help them succeed in algebra and geometry courses and on high stakes mathematics assessments.

  11. The Impact of High Stakes Testing: The Australian Story

    Science.gov (United States)

    Klenowski, Val; Wyatt-Smith, Claire

    2012-01-01

    High stakes testing in Australia was introduced in 2008 by way of the National Assessment Program--Literacy and Numeracy (NAPLAN). Currently, every year all students in Years 3, 5, 7 and 9 are assessed on the same days using national tests in Reading, Writing, Language Conventions (Spelling, Grammar and Punctuation) and Numeracy. In 2010 the…

  12. High-stakes educational testing and democracy

    DEFF Research Database (Denmark)

    Ydesen, Christian

    2014-01-01

    This article investigates the relation between high-stakes educational testing and democracy, drawing on the experiences of 20th-century high-stakes educational testing practices in the Danish history of education. The article presents various concepts of democracy using leading propositions within the field of education. Then a sample of relevant historic case studies is examined in light of these definitions. Among other things, the article concludes that a combination of different evaluation technologies – some formative and some summative – might be the safest way to go from a democratic…

  13. Small- and large-stakes risk aversion: implications of concavity calibration for decision theory

    NARCIS (Netherlands)

    Cox, J.C.; Sadiraj, V.

    2006-01-01

    A growing literature reports the conclusions that: (a) expected utility theory does not provide a plausible theory of risk aversion for both small-stakes and large-stakes gambles; and (b) this decision theory should be replaced with an alternative theory characterized by loss aversion. This paper…

  14. The impact of high-stakes, state-mandated student performance assessment on 10th grade English, mathematics, and science teachers' instructional practices

    Science.gov (United States)

    Vogler, Kenneth E.

    The purpose of this study was to determine if the public release of student results on high-stakes, state-mandated performance assessments influences instructional practices, and if so in what manner. The research focused on changes in teachers' instructional practices and factors that may have influenced such changes since the public release of high-stakes, state-mandated student performance assessment scores. The data for this study were obtained from a 54-question survey instrument given to a stratified random sample of teachers teaching at least one section of 10th grade English, mathematics, or science in an academic public high school within Massachusetts. Two hundred and fifty-seven (257) teachers, or 62% of the total sample, completed the survey instrument. An analysis of the data found that teachers are making changes in their instructional practices. The data show notable increases in the use of open-response questions, creative/critical thinking questions, problem-solving activities, use of rubrics or scoring guides, writing assignments, and inquiry/investigation. Teachers also have decreased the use of multiple-choice and true-false questions, textbook-based assignments, and lecturing. Also, the data show that teachers felt that changes made in their instructional practices were most influenced by an "interest in helping my students attain MCAS assessment scores that will allow them to graduate high school" and by an "interest in helping my school improve student (MCAS) assessment scores." Finally, mathematics teachers and teachers with 13-19 years of experience report making significantly more changes than did others. It may be interpreted from the data that the use of state-mandated student performance assessments and the high stakes attached to this type of testing program contributed to changes in teachers' instructional practices. The changes in teachers' instructional practices have included increases in the use of instructional practices deemed…

  15. Measuring Motivation in Low-Stakes Assessments. Research Report. ETS RR-15-19

    Science.gov (United States)

    Finn, Bridgid

    2015-01-01

    There is a growing concern that when scores from low-stakes assessments are reported without considering student motivation as a construct of interest, biased conclusions about how much students know will result. Low motivation is a problem particularly relevant to low-stakes testing scenarios, which may be low stakes for the test taker but have…

  16. Inquiry-Based Instruction and High Stakes Testing

    Science.gov (United States)

    Cothern, Rebecca L.

    Science education is a key to economic success for a country in terms of promoting advances in national industry and technology and maximizing competitive advantage in a global marketplace. The December 2010 Program for International Student Assessment (PISA) ranked the United States 23rd of 65 countries in science. That dismal standing in science proficiency impedes the ability of American school graduates to compete in the global marketplace. Furthermore, the implementation of high stakes testing in science, mandated from 2007 under the No Child Left Behind (NCLB) Act, has created an additional need for educators to find effective science pedagogy. Research has shown that inquiry-based science instruction is one of the predominant science instructional methods. Inquiry-based instruction is a multifaceted teaching method with its theoretical foundation in constructivism. A correlational survey research design was used to determine the relationship between levels of inquiry-based science instruction and student performance on a standardized state science test. A self-report survey, using a Likert-type scale, was completed by 26 fifth grade teachers. Participants' responses were analyzed and grouped as high, medium, or low level inquiry instruction. The unit of analysis for the achievement variable was the student scale score average from the state science test. Spearman's Rho correlation data showed a positive relationship between the level of inquiry-based instruction and student achievement on the state assessment. The findings can assist teachers and administrators by providing additional research on the benefits of the inquiry-based instructional method. Implications for positive social change include increases in student proficiency and decision-making skills related to science policy issues, which can help make them more competitive in the global marketplace.
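
    The analysis reduces to a rank correlation between the survey-derived inquiry level and each teacher's mean student scale score. A minimal sketch with made-up values:

    ```python
    from scipy import stats

    # Hypothetical teacher-level data: inquiry level (1 = low, 2 = medium,
    # 3 = high) and mean state science scale score of each teacher's students.
    inquiry_level = [1, 1, 2, 3, 2, 3, 1, 2, 3, 2]
    mean_scale_score = [395, 401, 410, 433, 418, 441, 388, 415, 429, 407]

    rho, p = stats.spearmanr(inquiry_level, mean_scale_score)
    print(f"Spearman's rho = {rho:.2f} (p = {p:.3f})")
    ```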

  17. A Better Leveled Playing Field for Assessing Satisfactory Job Performance of Superintendents on the Basis of High-Stakes Testing Outcomes

    Science.gov (United States)

    Young, I. Phillip; Cox, Edward P.; Buckman, David G.

    2014-01-01

    To assess satisfactory job performance of superintendents on the basis of school districts' high-stakes testing outcomes, existing teacher models were reviewed and critiqued as potential options for retrofit. For these models, specific problems were identified relative to the choice of referent groups. An alternate referent group (statewide…

  18. New Possibilities for High-Resolution, Large-Scale Ecosystem Assessment of the World's Semi-Arid Regions

    Science.gov (United States)

    Burney, J. A.; Goldblatt, R.

    2016-12-01

    Understanding drivers of land use change - and in particular, levels of ecosystem degradation - in semi-arid regions is of critical importance because these agroecosystems (1) are home to the world's poorest populations, almost all of whom depend on agriculture for their livelihoods, (2) play a critical role in the global carbon and climate cycles, and (3) have in many cases seen dramatic changes in temperature and precipitation, relative to global averages, over the past several decades. However, assessing ecosystem health (or, conversely, degradation) presents a difficult measurement problem. Established methods are very labor intensive and rest on detailed questionnaires and field assessments. High-resolution satellite imagery has a unique role in semi-arid ecosystem assessment in that it can be used for rapid (or repeated) and very simple measurements of tree and shrub density, an excellent overall indicator of dryland ecosystem health. Because trees and large shrubs are more sparse in semi-arid regions, sub-meter resolution imagery in conjunction with automated image analysis can be used to assess density differences at high spatial resolution without expensive and time-consuming ground-truthing. This could be used down to the farm level, for example, to better assess the larger-scale ecosystem impacts of different management practices, to assess compliance with REDD+ carbon offset protocols, or to evaluate implementation of conservation goals. Here we present results comparing spatial and spectral remote sensing methods for semi-arid ecosystem assessment across new data sources, using the Brazilian Sertão as an example, and the implications for large-scale use in semi-arid ecosystem science.
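
    One plausible way to automate the tree/shrub-density measurement sketched here is to threshold a vegetation index and count connected components. The index, threshold, and resolution below are assumptions for illustration, not the authors' pipeline:

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(3)
    # Stand-in for a sub-meter vegetation-index tile (e.g., NDVI); real input
    # would come from pansharpened high-resolution satellite imagery.
    ndvi = rng.normal(0.15, 0.05, size=(512, 512))
    ndvi[100:108, 200:208] += 0.5   # a synthetic "tree crown"

    canopy = ndvi > 0.40                       # simple woody-cover threshold
    labels, n_crowns = ndimage.label(canopy)   # connected components ~ crowns
    pixel_area_m2 = 0.25                       # e.g., 0.5 m ground resolution
    area_ha = canopy.size * pixel_area_m2 / 1e4
    print(f"{n_crowns} crowns detected, ~{n_crowns / area_ha:.1f} per ha")
    ```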

  19. School-based assessments in high-stakes examinations in Bhutan: a question of trust? : exploring inconsistencies between external exam scores, school-based assessments, detailed teacher ratings, and student self-ratings

    NARCIS (Netherlands)

    Luyten, Johannes W.; Dolkar, Dechen

    2010-01-01

    This study explores the validity of school-based assessments when they serve to supplement scores on central tests in high-stakes examinations. The school-based continuous assessment (CA) marks are compared to the marks scored on the central written Bhutan Certificate of Secondary Education (BCSE)

  20. Student Motivation in Low-Stakes Assessment Contexts: An Exploratory Analysis in Engineering Mechanics

    Science.gov (United States)

    Musekamp, Frank; Pearce, Jacob

    2016-01-01

    The goal of this paper is to examine the relationship of student motivation and achievement in low-stakes assessment contexts. Using Pearson product-moment correlations and hierarchical linear regression modelling to analyse data on 794 tertiary students who undertook a low-stakes engineering mechanics assessment (along with the questionnaire of…

  1. Implications of Fuzziness for the Practical Management of High-Stakes Risks

    Directory of Open Access Journals (Sweden)

    Mark Jablonowski

    2010-04-01

    High-stakes (dangerous, catastrophic) risks take on a wider profile as progress unfolds. What are the impacts of technological and social change on the risk landscape? Due to the complexities and dynamics involved, we can only answer these questions approximately. By using the concept of fuzziness, we can formalize our imprecision about high-stakes risks, and therefore place their management on a stronger footing. We review here the impacts of fuzziness, i.e., knowledge imperfection, on high-stakes risk management, including its implementation via computationally intelligent decision aids.
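
    A minimal sketch of the fuzziness idea: represent "catastrophic loss" as a fuzzy set, so imprecise knowledge enters the risk assessment as a membership degree rather than a crisp cutoff. The triangular form and parameters are illustrative, not from the paper:

    ```python
    import numpy as np

    def triangular(x: np.ndarray, a: float, b: float, c: float) -> np.ndarray:
        """Triangular membership function rising from a to peak b, falling to c."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

    loss = np.linspace(0, 100, 501)            # loss magnitude, arbitrary units
    mu_catastrophic = triangular(loss, 40, 80, 100)

    # Degree to which a loss of 70 counts as "catastrophic" under imprecision:
    print(triangular(np.array([70.0]), 40, 80, 100))  # ~0.75
    ```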

  2. Examining a Public Montessori School’s Response to the Pressures of High-Stakes Accountability

    Directory of Open Access Journals (Sweden)

    Corrie Rebecca Block

    2015-11-01

    In order to succeed in the current school assessment and accountability era, a public Montessori school is expected to achieve high student scores on standardized assessments. A problem for a public Montessori elementary school is how to make sense of the school's high-stakes assessment scores in terms of its unique educational approach. This case study examined a public Montessori elementary school's efforts as the school implemented the Montessori Method within the accountability era. The research revealed the ways the principal, teachers, and parents on the school council modified Montessori practices, curriculum, and assessment procedures based on test scores. A quality Montessori education is designed to offer children opportunities to develop both cognitive skills and affective components, such as student motivation and socio-emotional skills, that will serve them beyond their public school experiences. Sadly, the high-stakes testing environment influences so much of public education today. When quality education was measured through only one narrow measure of success, the result in this school was clearly a restriction of priorities to the areas that were easily assessed.

  3. Politics in evaluation: Politically responsive evaluation in high stakes environments.

    Science.gov (United States)

    Azzam, Tarek; Levine, Bret

    2015-12-01

    The role of politics has often been discussed in evaluation theory and practice. The political context can have major effects on the evaluation design, approach, and methods. Politics also has the potential to influence the decisions made from the evaluation findings. The current study focuses on the influence of the political context on stakeholder decision making. Utilizing a simulation scenario, this study compares stakeholder decision making in high- and low-stakes evaluation contexts. Findings suggest that high-stakes political environments are more likely than low-stakes environments to lead to reduced reliance on technically appropriate measures and increased dependence on measures that better reflect the broader political environment.

  4. Examining a Public Montessori School's Response to the Pressures of High-Stakes Accountability

    Science.gov (United States)

    Block, Corrie Rebecca

    2015-01-01

    A public Montessori school is expected to demonstrate high student scores on standardized assessments to succeed in the current school accountability era. A problem for a public Montessori elementary school is how to make sense of the school's high-stakes assessment scores in terms of Montessori's unique educational approach. This case study…

  5. Social stakes of the reversibility in the deep storage of high level radioactive wastes

    International Nuclear Information System (INIS)

    Heriard-Dubreuil, G.; Schieber, C.; Schneider, T.

    1998-06-01

    This document proposes a study of the conditions surrounding the introduction of reversibility into the deep storage of high-level wastes at the international scale, as well as a reflection on the associated social stakes. In France, the law of December 30, 1991 concerning research on radioactive wastes prescribes "the study of the possibilities of retrievable or non-retrievable storage in deep geological deposits". The analysis of the social stakes associated with reversibility emphasizes the necessity of preventing irreversible consequences, of attending to the reversibility of choices, and of preserving the autonomy of future generations. Thus, to elaborate a more satisfactory solution between deep disposal and surface storage, a concept of deep storage capable of gradual evolution is defined. (A.L.B.)

  6. Toward Instructional Leadership: Principals' Perceptions of Large-Scale Assessment in Schools

    Science.gov (United States)

    Prytula, Michelle; Noonan, Brian; Hellsten, Laurie

    2013-01-01

    This paper describes a study of the perceptions that Saskatchewan school principals have regarding large-scale assessment reform and their perceptions of how assessment reform has affected their roles as principals. The findings revealed that large-scale assessments, especially provincial assessments, have affected the principal in Saskatchewan…

  7. Modeling Student Motivation and Students’ Ability Estimates From a Large-Scale Assessment of Mathematics

    Directory of Open Access Journals (Sweden)

    Carlos Zerpa

    2011-09-01

    When large-scale assessments (LSA) do not hold personal stakes for students, students may not put forth their best effort. Low-effort examinee behaviors (e.g., guessing, omitting items) result in an underestimate of examinee abilities, which is a concern when using results of LSA to inform educational policy and planning. The purpose of this study was to explore the relationship between examinee motivation as defined by expectancy-value theory, student effort, and examinee mathematics abilities. A principal components analysis was used to examine the data from Grade 9 students (n = 43,562) who responded to a self-report questionnaire on their attitudes and practices related to mathematics. The results suggested a two-component model where the components were interpreted as task-values in mathematics and student effort. Next, a hierarchical linear model was implemented to examine the relationship between examinee component scores and their estimated ability on a LSA. The results of this study provide evidence that motivation, as defined by expectancy-value theory, and student effort partially explain student ability estimates and may have implications for the information that gets transferred to testing organizations, school boards, and teachers when assessing students' Grade 9 mathematics learning.
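
    A compact sketch of the two-step pipeline described (principal components of the questionnaire items, then a hierarchical model of ability on the component scores), on synthetic data with students nested in schools; names and effect sizes are assumptions:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    n = 2000
    items = pd.DataFrame(rng.normal(size=(n, 8)),
                         columns=[f"q{i}" for i in range(1, 9)])

    # Step 1: principal components of the attitude/practice items (the study
    # retained a two-component solution: task-value and effort).
    pcs = PCA(n_components=2).fit_transform(items)
    df = pd.DataFrame(pcs, columns=["task_value", "effort"])
    df["school"] = rng.integers(0, 50, n)
    df["ability"] = 0.3 * df["task_value"] + 0.4 * df["effort"] + rng.normal(0, 1, n)

    # Step 2: hierarchical (mixed) model with students nested in schools.
    hlm = smf.mixedlm("ability ~ task_value + effort", df, groups=df["school"]).fit()
    print(hlm.params[["task_value", "effort"]])
    ```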

  8. Correlates of cooperation in a one-shot high-stakes televised prisoners' dilemma.

    Directory of Open Access Journals (Sweden)

    Maxwell N Burton-Chellew

    Explaining cooperation between non-relatives is a puzzle for both evolutionary biology and the social sciences. In humans, cooperation is often studied in a laboratory setting using economic games such as the prisoners' dilemma. However, such experiments are sometimes criticized for being played for low stakes and by unrepresentative student samples. Golden Balls is a televised game show that uses the prisoners' dilemma, with a diverse range of participants, often playing for very large stakes. We use this non-experimental dataset to investigate the factors that influence cooperation when "playing" for considerably larger stakes than found in economic experiments. The game show has earlier stages that allow for an analysis of lying and voting decisions. We found that contestants were sensitive to the stakes involved, cooperating less when the stakes were larger in both absolute and relative terms. We also found that older contestants were more likely to cooperate, that liars received less cooperative behavior, but only if they told a certain type of lie, and that physical contact was associated with reduced cooperation, whereas laughter and promises were reliable signals or cues of cooperation, but were not necessarily detected.
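
    Analyses of this kind typically regress the binary cooperate/steal decision on stakes and contestant covariates. A hedged logistic-regression sketch on synthetic data; the paper's exact specification is not reproduced here:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    n = 574  # hypothetical number of contestant decisions
    df = pd.DataFrame({
        "stake": rng.lognormal(8, 1.5, n),              # pot size, pounds
        "age": rng.integers(18, 75, n),
        "promised": (rng.random(n) < 0.5).astype(int),  # made an explicit promise
    })
    logit_p = 0.5 - 0.15 * np.log(df["stake"]) + 0.02 * df["age"] + 0.8 * df["promised"]
    df["cooperate"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    fit = smf.logit("cooperate ~ np.log(stake) + age + promised", df).fit(disp=False)
    print(fit.params)  # negative stake coefficient ~ less cooperation at high stakes
    ```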

  9. Why Has High-Stakes Testing So Easily Slipped into Contemporary American Life?

    Science.gov (United States)

    Nichols, Sharon L.; Berliner, David C.

    2008-01-01

    High-stakes testing is the practice of attaching important consequences to standardized test scores, and it is the engine that drives the No Child Left Behind (NCLB) Act. The rationale for high-stakes testing is that the promise of rewards and the threat of punishments will cause teachers to work more effectively, students to be more motivated,…

  10. Learning to Label: Socialisation, Gender, and the Hidden Curriculum of High-Stakes Testing

    Science.gov (United States)

    Booher-Jennings, Jennifer

    2008-01-01

    Although high-stakes tests play an increasing role in students' schooling experiences, scholars have not examined these tests as sites for socialisation. Drawing on qualitative data collected at an American urban primary school, this study explores what educators teach students about motivation and effort through high-stakes testing, how students…

  11. Small Stakes Risk Aversion in the Laboratory

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Lau, Morten I.; Ross, Don

    2017-01-01

    Evidence of risk aversion in laboratory settings over small stakes leads to a priori implausible levels of risk aversion over large stakes under certain assumptions. One core assumption in statements of this calibration puzzle is that small-stakes risk aversion is observed over all levels of wealth...
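
    The calibration logic can be made concrete numerically. Assuming, as the puzzle does, an expected-utility agent who rejects a 50-50 lose-$100/gain-$110 gamble at every wealth level, concavity bounds how fast marginal utility must fall; the following is an approximate sketch after Rabin (2000):

    ```python
    # Approximate numeric sketch of the calibration argument (after Rabin, 2000).
    # Rejecting a 50-50 lose-$100/gain-$110 gamble at every wealth level implies
    #   u'(w + 110) <= (100/110) * u'(w - 100),
    # so marginal utility shrinks by ~10/11 with every ~$210 step up in wealth.
    ratio = 100 / 110
    block = 210.0
    mu = 1.0        # normalize u'(current wealth) = 1
    upside = 0.0    # bound on u(w + G) - u(w) as G grows without limit
    for _ in range(500):
        upside += block * mu
        mu *= ratio
    print(f"utility of an arbitrarily large gain is bounded by ~{upside:.0f}")
    # Since losing $L costs at least L utils (u' >= 1 below w), the agent must
    # reject any 50-50 gamble risking more than this bound: implausibly averse.
    ```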

  12. Fuel pin integrity assessment under large scale transients

    International Nuclear Information System (INIS)

    Dutta, B.K.

    2006-01-01

    The integrity of fuel rods under normal, abnormal and accident conditions is an important consideration during fuel design of advanced nuclear reactors. The fuel matrix and the sheath form the first barrier to prevent the release of radioactive materials into the primary coolant. An understanding of the fuel and clad behaviour under different reactor conditions, particularly under the beyond-design-basis accident scenario leading to large scale transients, is always desirable to assess the inherent safety margins in fuel pin design and to plan for the mitigation of the consequences of accidents, if any. The severe accident conditions are typically characterized by energy deposition rates far exceeding the heat removal capability of the reactor coolant system. This may lead to clad failure due to fission gas pressure at high temperature, large-scale pellet-clad interaction and clad melting. The fuel rod performance is affected by many interdependent complex phenomena involving extremely complex material behaviour. The versatile experimental database available in this area has led to the development of powerful analytical tools to characterize fuel under extreme scenarios
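
    The clad-failure mode from fission gas pressure mentioned here is often screened with a thin-wall hoop-stress estimate, sigma = P*r/t. A worked sketch with illustrative numbers, not values from the record:

    ```python
    # Thin-wall hoop-stress screen for clad loading by fission gas pressure.
    # All numbers are illustrative assumptions for a worked example.
    P = 8.0e6      # internal fission-gas pressure, Pa
    r = 4.1e-3     # clad mean radius, m
    t = 0.4e-3     # clad wall thickness, m

    sigma = P * r / t
    print(f"hoop stress ~ {sigma/1e6:.0f} MPa")
    # Compare against the temperature-dependent yield/burst strength of the clad.
    ```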

  13. Hiding behind High-Stakes Testing: Meritocracy, Objectivity and Inequality in U.S. Education

    Science.gov (United States)

    Au, Wayne

    2013-01-01

    This paper analyses how high-stakes, standardised testing became the policy tool in the U.S. that it is today and discusses its role in advancing an ideology of meritocracy that fundamentally masks structural inequalities related to race and economic class. This paper first traces the early history of high-stakes testing within the U.S. context,…

  14. Curricular constraints, high-stakes testing and the reality of reform in high school science classrooms

    Science.gov (United States)

    Coble, Jennifer

    Through a series of open-ended interviews, this study investigated the beliefs of six third year high school science teachers about how they implement science education reform ideals in their practice and the contextual challenges they face as they attempt to implement reform. The teachers argue that the lack of connection between their curricula and students' lives serves as a significant obstacle to their use of more inquiry-based and student-centered strategies. In their science classes that are not subject to a high stakes exam, the teachers shared instances where they engage students in inquiry by reframing the focus of their curricula away from decontextualized factual information and onto how the information relates to human experience. In their science classes subject to a high stakes test, however, the teachers confessed to feeling they had no choice but to utilize more teacher-centered strategies focused on information transmission. This study provides an in depth analysis of how the presence of high stakes tests discourages teachers from utilizing reform based teaching strategies within high school science classrooms.

  15. How the Internet Will Help Large-Scale Assessment Reinvent Itself

    Directory of Open Access Journals (Sweden)

    Randy Elliot Bennett

    2001-02-01

    Large-scale assessment in the United States is undergoing enormous pressure to change. That pressure stems from many causes. Depending upon the type of test, the issues precipitating change include an outmoded cognitive-scientific basis for test design; a mismatch with curriculum; the differential performance of population groups; a lack of information to help individuals improve; and inefficiency. These issues provide a strong motivation to reconceptualize both the substance and the business of large-scale assessment. At the same time, advances in technology, measurement, and cognitive science are providing the means to make that reconceptualization a reality. The thesis of this paper is that the largest facilitating factor will be technological, in particular the Internet. In the same way that it is already helping to revolutionize commerce, education, and even social interaction, the Internet will help revolutionize the business and substance of large-scale assessment.

  16. Raising the Stakes: High-Stakes Testing and the Attack on Public Education in New York

    Science.gov (United States)

    Hursh, David

    2013-01-01

    Over the last almost two decades, high-stakes testing has become increasingly central to New York's schools. In the 1990s, the State Department of Education began requiring that secondary students pass five standardized exams to graduate. In 2002, the federal No Child Left Behind Act required students in grades three through eight to take math and…

  17. High-Stakes and Non-Stakes Testing States and the Transfer of Knowledge to Students' Advanced Placement Test, Advanced Placement U.S. History Test, and SAT Exam Scores

    Science.gov (United States)

    Lessler, Karen Jean

    2010-01-01

    The Federal education policy No Child Left Behind Act (NCLB) has initiated high-stakes testing among U.S. public schools. The premise of the NCLB initiative is that all students reach proficiency in reading and math by 2014. Under NCLB, individual state education departments were required to implement annual assessments in grades three through eight…

  18. Implementing Assessment Engineering in the Uniform Certified Public Accountant (CPA) Examination

    Science.gov (United States)

    Burke, Matthew; Devore, Richard; Stopek, Josh

    2013-01-01

    This paper describes efforts to bring principled assessment design to a large-scale, high-stakes licensure examination by employing the frameworks of Assessment Engineering (AE), the Revised Bloom's Taxonomy (RBT), and Cognitive Task Analysis (CTA). The Uniform CPA Examination is practice-oriented and focuses on the skills of accounting. In…

  19. Re-analysis of NAEP Math and Reading Scores in States with and without High-stakes Tests: Response to Rosenshine

    Directory of Open Access Journals (Sweden)

    Audrey Amrein-Beardsley

    2003-08-01

    Here we address the criticism of our NAEP analyses by Rosenshine (2003). On the basis of his thoughtful critique, we redid some of the analyses on which he focused. Our findings contradict his. This is no fault of his, the reasons for which are explained in this paper. Our findings do support our position that high-stakes tests do not do much to improve academic achievement. The extent to which states with high-stakes tests outperform states without high-stakes tests is, at best, indeterminable. Using 1994-1998 NAEP reading and 1996-2000 NAEP math data and accounting for NAEP exemption rates for the same years, we found that states with high-stakes tests are not outperforming states without high-stakes tests in reading in the 4th grade or math in the 8th grade at a statistically significant level. States with high-stakes tests are, however, outperforming states without high-stakes tests in math in the 4th grade at a statistically significant level. Our findings also support our earlier stance that states with high-stakes tests are exempting more students from participating in the NAEP than are states without high-stakes tests. This is more prevalent the more recent the NAEP test administration. This is illustrated in the tables below.
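
    The exemption-rate point suggests a simple check: regress state NAEP gains on the high-stakes indicator while holding the change in exemption rates fixed. A sketch on synthetic stand-in data, not the article's actual model:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    # Synthetic stand-in for state-level records: NAEP gain, high-stakes flag,
    # and the percentage-point change in the share of students exempted.
    n = 44
    df = pd.DataFrame({
        "high_stakes": (rng.random(n) < 0.4).astype(int),
        "d_exempt": rng.normal(1.0, 0.8, n),
    })
    df["gain"] = 5 + 1.5 * df["high_stakes"] + 0.5 * df["d_exempt"] + rng.normal(0, 2, n)

    # Does a high-stakes "advantage" survive once exemption changes are held fixed?
    fit = smf.ols("gain ~ high_stakes + d_exempt", df).fit()
    print(fit.params)
    print("p-value, high_stakes:", fit.pvalues["high_stakes"].round(3))
    ```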

  20. Stakes Matter in Ultimatum Games

    DEFF Research Database (Denmark)

    Andersen, Steffen; Ertaç, Seda; Gneezy, Uri

    2011-01-01

    One of the most robust findings in experimental economics is that individuals in one-shot ultimatum games reject unfair offers. Puzzlingly, rejections have been found robust to substantial increases in stakes. By using a novel experimental design that elicits frequent low offers and uses much larger stakes than in the literature, we are able to examine stakes' effects over ranges of data that are heretofore unexplored. Our main result is that proportionally equivalent offers are less likely to be rejected with high stakes. In fact, our paper is the first to present evidence that as stakes…

  1. Mindfulness, anxiety, and high-stakes mathematics performance in the laboratory and classroom.

    Science.gov (United States)

    Bellinger, David B; DeCaro, Marci S; Ralston, Patricia A S

    2015-12-01

    Mindfulness enhances emotion regulation and cognitive performance. A mindful approach may be especially beneficial in high-stakes academic testing environments, in which anxious thoughts disrupt cognitive control. The current studies examined whether mindfulness improves the emotional response to anxiety-producing testing situations, freeing working memory resources, and improving performance. In Study 1, we examined performance in a high-pressure laboratory setting. Mindfulness indirectly benefited math performance by reducing the experience of state anxiety. This benefit occurred selectively for problems that required greater working memory resources. Study 2 extended these findings to a calculus course taken by undergraduate engineering majors. Mindfulness indirectly benefited students' performance on high-stakes quizzes and exams by reducing their cognitive test anxiety. Mindfulness did not impact performance on lower-stakes homework assignments. These findings reveal an important mechanism by which mindfulness benefits academic performance, and suggest that mindfulness may help attenuate the negative effects of test anxiety.
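
    The "indirect benefit via reduced anxiety" claim is classic mediation. A bare-bones sketch of the indirect-effect logic on synthetic data; published analyses would typically bootstrap a confidence interval around the a*b product:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    n = 300
    df = pd.DataFrame({"mindfulness": rng.normal(0, 1, n)})
    df["anxiety"] = -0.4 * df["mindfulness"] + rng.normal(0, 1, n)
    df["math"] = -0.5 * df["anxiety"] + rng.normal(0, 1, n)

    # Two-regression mediation: a = mindfulness -> anxiety,
    # b = anxiety -> math (holding mindfulness fixed); indirect effect = a*b.
    a = smf.ols("anxiety ~ mindfulness", df).fit().params["mindfulness"]
    b = smf.ols("math ~ anxiety + mindfulness", df).fit().params["anxiety"]
    print(f"indirect effect of mindfulness via anxiety: {a * b:.3f}")
    ```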

  2. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    …probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the…

  3. Does high-stakes testing increase cultural capital among low-income and racial minority students?

    Directory of Open Access Journals (Sweden)

    Won-Pyo Hong

    2008-03-01

    This article draws on research from Texas and Chicago to examine whether high-stakes testing enables low-income and racial minority students to acquire cultural capital. While students' performance on state or district tests rose after the implementation of high-stakes testing and accountability policies in Texas and Chicago in the 1990s, several studies indicate that these policies seemed to have had deleterious effects on curriculum, instruction, the percentage of students excluded from the tests, and student dropout rates. As a result, the policies seemed to have had mixed effects on students' opportunities to acquire embodied and institutionalized cultural capital. These findings are consistent with the work of Shepard (2000), Darling-Hammond (2004a), and others who have written of the likely negative repercussions of high-stakes testing and accountability policies.

  4. High Stakes Testing and Its Impact on Rural Schools.

    Science.gov (United States)

    Hodges, V. Pauline

    2002-01-01

    The movement to standardization and high-stakes testing has been driven by ideological and political concerns and has adversely affected teaching/learning, democratic discourse, and educational equity. Rural schools are hit harder because of geographic isolation and insufficient staff and resources. Testing used for purposes other than measuring…

  5. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    Science.gov (United States)

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  6. Negotiating the terrain of high-stakes accountability in science teaching

    Science.gov (United States)

    Aronson, Isaak

    Teachers interact with their students on behalf of the entire educational system. The aim of this study is to explore how biology teachers understand and construct their practice in a high-stakes accountability environment that is likely to be riddled with tensions. By critically questioning the technical paradigms of accountability, this study challenges the fundamental assumptions of accountability. Such a critical approach may help teachers develop empowerment strategies that can free them from the de-skilling effects of the educational accountability system. This interpretive case study of a high school in Maryland is grounded in three streams of research literature: quality science instruction based on scientific inquiry, the effects of educational accountability on the curriculum, and the influence of policy on classroom practice with a specific focus on how teachers balance competing tensions. This study theoretically occurs at the intersection of educational accountability and pedagogy. In terms of data collection, I conduct two interviews with all six biology teachers in the school. I observe each teacher for at least fifteen class periods. I review high-stakes accountability policy documents from the federal, state, and district levels of the education system. Three themes emerge from the research. The first theme, "re-defining science teaching," captures how deeply accountability structures have penetrated the science curriculum. The second theme, "the pressure mounts," explores how high-stakes accountability in science has increased the stress placed on teachers. The third theme, "teaching-in-between," explores how teachers compromise between accountability mandates and their own understandings of quality teaching. Together, the three themes shed light on the current high-stakes climate in which teachers currently work. This study's findings inform the myriad paradoxes at all levels of the educational system. As Congress and advocacy groups battle over…

  7. The Rise of High-Stakes Educational Testing in Denmark (1920-1970)

    DEFF Research Database (Denmark)

    Ydesen, Christian

    The Rise of High-Stakes Educational Testing in Denmark (1920-1970) is an attempt to determine why and how tests rose to prominence in an educational system that used to rely on qualitative tests and teacher evaluations. The study addresses the important issues of how testing interacts with and influences an educational system, and which common factors are involved in implementing testing in an educational system. The study is based on three relatively unknown case studies – illustrious examples of high-stakes educational testing practices in the Danish public school system. The first case… to 1959. The third case study examines the testing of Greenlandic children during the preparation scheme in the Greenlandic educational system from 1961 to 1976.

  8. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  9. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of the flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insura...

  10. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four data sets of large-scale map data. Errors of map data were used for a risk assessment of decision-making about the localization of objects, e.g. for land-use planning in the realization of investments. An analysis was performed for a large statistical sample set of shift vectors of control points, which were identified with the position errors of these points (errors of map data).
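
    The shift-vector analysis reduces to summary statistics of control-point displacements. A minimal sketch with illustrative values:

    ```python
    import numpy as np

    # Control-point shift vectors: surveyed "true" coordinates minus map
    # coordinates, in metres. Values are illustrative.
    dx = np.array([0.12, -0.08, 0.31, -0.22, 0.05, 0.17])
    dy = np.array([-0.04, 0.15, -0.19, 0.08, -0.27, 0.02])

    shift = np.hypot(dx, dy)                  # length of each shift vector
    rmse = np.sqrt(np.mean(dx**2 + dy**2))    # planimetric RMSE of the map data
    print(f"mean shift {shift.mean():.2f} m, RMSE {rmse:.2f} m")
    # A siting decision with clearances comparable to the RMSE carries
    # appreciable risk of locating an object on the wrong side of a boundary.
    ```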

  11. Large-Scale Assessment, Rationality, and Scientific Management: The Case of No Child Left Behind

    Science.gov (United States)

    Roach, Andrew T.; Frank, Jennifer

    2007-01-01

    This article examines the ways in which NCLB and the movement towards large-scale assessment systems are based on Weber's concept of formal rationality and tradition of scientific management. Building on these ideas, the authors use Ritzer's McDonaldization thesis to examine some of the core features of large-scale assessment and accountability…

  12. Estimating the Effectiveness of Special Education Using Large-Scale Assessment Data

    Science.gov (United States)

    Ewing, Katherine Anne

    2009-01-01

    The inclusion of students with disabilities in large scale assessment and accountability programs has provided new opportunities to examine the impact of special education services on student achievement. Hanushek, Kain, and Rivkin (1998, 2002) evaluated the effectiveness of special education programs by examining students' gains on a large-scale…

  13. "I Like to Read, but I Know I'm Not Good at It": Children's Perspectives on High-Stakes Testing in a High-Poverty School

    Science.gov (United States)

    Dutro, Elizabeth; Selland, Makenzie

    2012-01-01

    A significant body of research articulates concerns about the current emphasis on high-stakes testing as the primary lever of education reform in the United States. However, relatively little research has focused on how children make sense of the assessment policies in which they are centrally located. In this article, we share analyses of…

  14. The Effects of High-Stakes Testing Policy on Arts Education

    Science.gov (United States)

    Baker, Richard A., Jr.

    2012-01-01

    This study examined high-stakes test scores for 37,222 eighth grade students enrolled in music and/or visual arts classes and those students not enrolled in arts courses. Students enrolled in music had significantly higher mean scores than those not enrolled in music (p < 0.001). Results for visual arts and dual arts were not as…

  15. Using Assessment to Drive the Reform of Schooling: Time to Stop Pursuing the Chimera?

    Science.gov (United States)

    Torrance, Harry

    2011-01-01

    Internationally, over the last 20-30 years, changing the procedures and processes of assessment has come to be seen, by many educators as well as policy-makers, as a way to frame the curriculum and drive the reform of schooling. Such developments have often been manifested in large scale, high stakes testing programmes. At the same time…

  16. Small Stakes Risk Aversion in the Laboratory

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Lau, Morten; Ross, Don

    Evidence of risk aversion in laboratory settings over small stakes leads to a priori implausible levels of risk aversion over large stakes under certain assumptions. One core assumption in standard statements of this calibration puzzle is that individuals define utility over terminal wealth, and that terminal wealth is defined as the sum of extra-lab wealth and any wealth accumulated in the lab. This assumption is often used in Expected Utility Theory, as well as in popular alternatives such as Rank-Dependent Utility theory. Another core assumption is that the small-stakes risk aversion is observed over all levels of wealth, or over a "sufficiently large" range of wealth. Although this second assumption is often viewed as self-evident from the vast experimental literature showing risk aversion over laboratory stakes, it actually requires that lab wealth be varied for a given subject as one takes…

  17. Large-scale model-based assessment of deer-vehicle collision risk.

    Directory of Open Access Journals (Sweden)

    Torsten Hothorn

    Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and, as browsers of palatable trees, have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining…
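
    A baseline version of the count-model idea: a Poisson GLM for collisions per municipality with road length as exposure. The paper's model additionally allows nonlinear terms and spatial heterogeneity; the data and covariates below are synthetic:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(8)
    n = 400  # hypothetical municipalities
    df = pd.DataFrame({
        "browsing": rng.uniform(0, 1, n),         # browsing intensity index
        "forest_share": rng.uniform(0.1, 0.7, n),
        "road_km": rng.uniform(5, 80, n),         # road length at risk (exposure)
    })
    mu = df["road_km"] * np.exp(-2 + 1.2 * df["browsing"] + 0.8 * df["forest_share"])
    df["collisions"] = rng.poisson(mu)

    # Poisson count model with log(road length) as an offset.
    fit = smf.glm("collisions ~ browsing + forest_share",
                  data=df, family=sm.families.Poisson(),
                  offset=np.log(df["road_km"])).fit()
    print(fit.params)
    ```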

  18. Social Perception of Hydrogen Technologies: The View of Spanish Stakeholders

    International Nuclear Information System (INIS)

    Ferri Anglada, S.

    2013-01-01

    This technical report presents an overview of the social perception and vision of a sample of Spanish stakeholders on hydrogen technologies. The study is based on the implementation of a survey combining both quantitative and qualitative data. An ad hoc electronic survey was designed to collect views and perceptions on several key factors regarding this innovative energy alternative. The group of experts participating in the study (N=130) comes mainly from research centers, universities and private companies. The survey addresses three major themes: expert views, social acceptability, and contextual factors of hydrogen technologies. The aim is to capture both the current and the future scene as viewed by the experts on hydrogen technologies, identifying key factors in terms of changes, uncertainties, obstacles and opportunities. The objective is to identify potential key features for the introduction, development, promotion, implementation, and large-scale deployment of a highly successful energy proposal in countries such as Iceland, one of the pioneers in basing its economy on hydrogen technologies. To conclude, this report illustrates the positive engagement of a sample of Spanish stakeholders towards hydrogen technologies, which may prove vital in the transition towards the Hydrogen Economy in Spain. (Author)

  19. Analysis of environmental impact assessment for large-scale X-ray medical equipments

    International Nuclear Information System (INIS)

    Fu Jin; Pei Chengkai

    2011-01-01

    Based on an Environmental Impact Assessment (EIA) project, this paper elaborates the basic analysis essentials of EIA for the sales project of large-scale X-ray medical equipment, and provides the analysis procedure for environmental impact and the dose estimation method under normal and accident conditions. The key points of EIA for the sales project of large-scale X-ray medical equipment include the determination of pollution factors and management limit values according to the project's actual situation, and the utilization of various assessment and prediction methods, such as analogy, actual measurement and calculation, to analyze, monitor, calculate and predict the pollution under normal and accident conditions. (authors)
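
    Dose estimation for shielded X-ray installations is often approximated with an inverse-square law plus exponential attenuation, D = D0 * exp(-mu*x) / r^2 (a narrow-beam approximation). A worked sketch with illustrative parameters, not values from the paper:

    ```python
    import numpy as np

    # Point-source, narrow-beam estimate of dose rate behind shielding.
    # All parameter values are illustrative assumptions for a worked example.
    D0 = 50.0      # unshielded dose rate at 1 m, uGy/h, for a given workload
    mu = 2750.0    # linear attenuation coefficient of lead near 80 keV, 1/m
    x = 0.002      # lead-equivalent barrier thickness, m (2 mm Pb)
    r = 3.0        # distance from tube to occupied area, m

    D = D0 * np.exp(-mu * x) / r**2
    print(f"estimated dose rate: {D:.3f} uGy/h")  # compare to the management limit
    ```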

  20. A conceptual analysis of standard setting in large-scale assessments

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1994-01-01

    Elements of arbitrariness in the standard setting process are explored, and an alternative to the use of cut scores is presented. The first part of the paper analyzes the use of cut scores in large-scale assessments, discussing three different functions: (1) cut scores define the qualifications used…

  1. The creation of a pedagogy of promise: Examples of educational excellence in high-stakes science classrooms

    Science.gov (United States)

    McCollough, Cherie A.

    The current reform movement in education has two forces that appear contradictory in nature. The first is an emphasis on rigor and accountability that is assessed through high-stakes testing. The second is the recommendation to have student-centered approaches to teaching and learning, especially those that emphasize inquiry methodology and constructivist pedagogy. Literature reports that current reform efforts involving accountability through high-stakes tests are detrimental to student learning and are contradictory to student-centered teaching approaches. However, by focusing attention on those teachers who "teach against the grain" and raise the achievement levels of students from diverse backgrounds, instructional strategies and personal characteristics of exemplary teachers can be identified. This mixed-methods research study investigated four exemplary urban high school science teachers in high-stakes (TAKS) tested science classrooms. Classroom observations, teacher and student interviews, pre-/post-content tests and the Constructivist Learning Environment Survey (CLES) (Johnson & McClure, 2004) provided the main data sources. The How People Learn (National Research Council, 2000) theoretical framework provided evidence of elements of inquiry-based, student-centered teaching. Descriptive case analysis (Yin, 1994) and quantitative analysis of pre/post tests and the CLES revealed the following results. First, all participating teachers included elements of learner-centeredness, knowledge-centeredness, assessment-centeredness and community-centeredness in their teaching as recommended by the National Research Council (2000), thus creating student-centered classroom environments. Second, by establishing a climate of caring where students felt supported and motivated to learn, teachers managed tensions resulting from the incorporation of student-centered elements and the accountability-based instructional mandates outlined by their school district and state.

  2. Assessment of renewable energy resources potential for large scale and standalone applications in Ethiopia

    NARCIS (Netherlands)

    Tucho, Gudina Terefe; Weesie, Peter D.M.; Nonhebel, Sanderine

    2014-01-01

    This study aims to determine the contribution of renewable energy to large scale and standalone applications in Ethiopia. The assessment starts by determining the present energy system and the available potentials. Subsequently, the contribution of the available potentials for large scale and…

  3. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost… and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous…

  4. High Stakes Trigger the Use of Multiple Memories to Enhance the Control of Attention

    Science.gov (United States)

    Reinhart, Robert M.G.; Woodman, Geoffrey F.

    2014-01-01

    We can more precisely tune attention to highly rewarding objects than other objects in our environment, but how our brains do this is unknown. After a few trials of searching for the same object, subjects' electrical brain activity indicated that they handed off the memory representations used to control attention from working memory to long-term memory. However, when a large reward was possible, the neural signature of working memory returned as subjects recruited working memory to supplement the cognitive control afforded by the representations accumulated in long-term memory. The amplitude of this neural signature of working memory predicted the magnitude of the subsequent behavioral reward-based attention effects across tasks and individuals, showing the ubiquity of this cognitive reaction to high-stakes situations. PMID:23448876

  5. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications.

  6. Understanding the Reading Attributes and Their Cognitive Relationships on a High-Stakes Biology Assessment

    Science.gov (United States)

    Rawlusyk, Kevin James

    Test items used to assess learners' knowledge on high-stakes science examinations contain contextualized questions that unintentionally assess reading skill along with conceptual knowledge. Therefore, students who are not proficient readers cannot comprehend the text within a test item well enough to demonstrate their level of science knowledge effectively. The purpose of this quantitative study was to understand what reading attributes were required to successfully answer the Biology 30 Diploma Exam. Furthermore, the research sought to understand the cognitive relationships among the reading attributes through quantitative analysis structured by the Attribute Hierarchy Model (AHM). The research consisted of two phases: (1) Cognitive development, where the cognitive attributes of the Biology 30 Exam were specified and hierarchy structures were developed; and (2) Psychometric analysis, which statistically tested the attribute hierarchies using the Hierarchy Consistency Index (HCI) and calculated attribute probabilities. Phase one used the January 2011 Biology 30 Diploma Exam, while phase two accessed archival data for the 9,985 examinees who took the assessment on January 24th, 2011. Phase one identified ten specific reading attributes, of which five were identified as unique subsets of vocabulary, two were identified as reading visual representations, and three corresponded to general reading skills. Four hierarchical cognitive models were proposed and then analyzed using the HCI as a mechanism to explain the relationship among the attributes. Model A had the highest HCI value (0.337), indicating an overall poor data fit, yet for the top-achieving examinees the model had an excellent fit with an HCI value of 0.888, and for examinees that scored over 60% there was a moderate fit (HCI = 0.592). Linear regressions of the attribute probability estimates suggest that there is a cognitive relationship among six of the ten reading attributes (R² = 0.958 and 0…

  7. The Role of Policy Assumptions in Validating High-stakes Testing Programs.

    Science.gov (United States)

    Kane, Michael

    L. Cronbach has made the point that for validity arguments to be convincing to diverse audiences, they need to be based on assumptions that are credible to these audiences. The interpretations and uses of high stakes test scores rely on a number of policy assumptions about what should be taught in schools, and more specifically, about the content…

  8. Large scale modulation of high frequency acoustic waves in periodic porous media.

    Science.gov (United States)

    Boutin, Claude; Rallu, Antoine; Hans, Stephane

    2012-12-01

    This paper deals with the description of the modulation at large scale of high frequency acoustic waves in gas saturated periodic porous media. High frequencies mean local dynamics at the pore scale and therefore absence of scale separation in the usual sense of homogenization. However, although the pressure is spatially varying in the pores (according to periodic eigenmodes), the mode amplitude can present a large scale modulation, thereby introducing another type of scale separation to which the asymptotic multi-scale procedure applies. The approach is first presented on a periodic network of inter-connected Helmholtz resonators. The equations governing the modulations carried by periodic eigenmodes, at frequencies close to their eigenfrequency, are derived. The number of cells on which the carrying periodic mode is defined is therefore a parameter of the modeling. In a second part, the asymptotic approach is developed for periodic porous media saturated by a perfect gas. Using the "multicells" periodic condition, one obtains the family of equations governing the amplitude modulation at large scale of high frequency waves. The significant differences between modulations of simple and multiple modes are evidenced and discussed. The features of the modulation (anisotropy, width of frequency band) are also analyzed.

  9. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are pervasively used in large-scale assessments such as Trends in International Math and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of student open-ended responses, open-ended items become a valid and reliable tool to assess students' knowledge integration ability.
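
    For readers unfamiliar with the Rasch Partial Credit Model mentioned above, the category probabilities it assigns can be computed directly. A minimal sketch, with illustrative (not study-derived) step difficulties:

    ```python
    # Category probabilities under the Rasch Partial Credit Model (PCM).
    # theta: examinee ability; deltas: step difficulties for one item
    # (the values below are illustrative, not taken from the study).
    import numpy as np

    def pcm_probs(theta: float, deltas: np.ndarray) -> np.ndarray:
        """P(X = k | theta) for k = 0..m under the Partial Credit Model."""
        # Cumulative sums of (theta - delta_j); category 0 has an empty sum (0).
        steps = np.concatenate(([0.0], np.cumsum(theta - deltas)))
        expsum = np.exp(steps - steps.max())  # subtract max for numerical stability
        return expsum / expsum.sum()

    print(pcm_probs(0.5, np.array([-1.0, 0.0, 1.2])))  # 4 score categories (0-3)
    ```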

  10. Teacher and headmaster attitudes towards benchmarking and high-stakes testing in adult teaching in Denmark

    DEFF Research Database (Denmark)

    Petersen, Karen Bjerg

    Based on research, surveys and interviews, the paper traces teacher and headmaster attitudes towards the introduction of benchmarking and high-stakes language testing in the wake of a neo-liberal education policy in adult teaching for migrants in Denmark in the 2000s. The findings show that the majority of teachers and headmasters reject benchmarking. Meanwhile, according to both headmasters and language teachers, the introduction of high-stakes language testing has had an immense impact on the organization, content and quality of adult language teaching. On the one side teachers do not necessarily … students, reduced use of both project work and non-test-related activities, and stressful working conditions.

  11. "It's Important for Them to Know Who They Are": Teachers' Efforts to Sustain Students' Cultural Competence in an Age of High-Stakes Testing

    Science.gov (United States)

    Zoch, Melody

    2017-01-01

    This article examines how four urban elementary teachers designed their literacy instruction in ways that sought to sustain students' cultural competence--maintaining their language and cultural practices while also gaining access to more dominant ones--amid expectations to prepare students for high-stakes testing. A large part of their teaching…

  12. What does social research say about high-stakes tests?

    Directory of Open Access Journals (Sweden)

    Rafael Feito Alonso

    2017-03-01

    Full Text Available High-stakes tests, which students must pass in order to gain a secondary education certificate, have aroused a lot of controversy wherever they have been implemented. Especially in the USA, these tests have produced a dramatic shrinking of school knowledge, as teaching becomes focused on the questions posed by the tests themselves. At the same time, there has been a critical modification of the learning processes, because these tests encourage students to pay more attention to factual knowledge, which is far removed from hands-on learning, debating in class or working in teams. In spite of certain discrepancies, by and large, research casts serious doubt on whether these tests are conducive to better academic performance. Nevertheless, the dropout rate is on the rise. More often than not, school principals have been found preventing struggling students from sitting the exams. A review of the research into this matter allows us to put into context the debate surrounding the external exams introduced by the current education law (LOMCE: Law for the Improvement of Educational Quality), passed by the parliamentary majority of the Popular Party in Spain in 2013.

  13. LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS

    International Nuclear Information System (INIS)

    O'Brien, James E.

    2010-01-01

    Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam, while consuming electricity and high-temperature process heat. When coupled to an advanced high temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a 'hydrogen economy.' The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.
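
    The reported figures can be sanity-checked with a back-of-envelope energy balance. A sketch, assuming the hydrogen flow is quoted at normal conditions and that the 15 kW figure refers to electrical input; standard hydrogen properties are used:

    ```python
    # Back-of-envelope check of the reported 15 kW / 5500 L/hr figures.
    # Assumptions (not stated in the abstract): flow quoted at normal
    # conditions, H2 density 0.0899 kg/m^3, lower heating value 120 MJ/kg.
    flow_m3_per_h = 5.5       # 5500 L/hr
    rho_h2 = 0.0899           # kg/m^3 at normal conditions
    lhv_h2 = 120.0e6          # J/kg (lower heating value)

    chemical_power_kw = flow_m3_per_h * rho_h2 * lhv_h2 / 3600.0 / 1000.0
    print(f"{chemical_power_kw:.1f} kW of H2 (LHV)")  # ~16.5 kW

    # Exceeding the ~15 kW electrical input is plausible for high-temperature
    # electrolysis, where part of the energy is supplied as process heat.
    ```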

  14. Explore the Usefulness of Person-Fit Analysis on Large-Scale Assessment

    Science.gov (United States)

    Cui, Ying; Mousavi, Amin

    2015-01-01

    The current study applied the person-fit statistic, l[subscript z], to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by l[subscript z], were removed. The…
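
    The l_z statistic referred to above is the standardized log-likelihood person-fit index of Drasgow, Levine and Williams (1985). A minimal sketch for dichotomous items, assuming item response probabilities have already been obtained from a fitted IRT model:

    ```python
    import numpy as np

    def lz_statistic(responses: np.ndarray, probs: np.ndarray) -> float:
        """Standardized log-likelihood person-fit statistic l_z.

        responses: 0/1 vector for one examinee; probs: model-implied
        P(correct) for each item at the examinee's ability estimate.
        Large negative values flag misfitting (aberrant) response patterns.
        """
        p, u = probs, responses
        l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))        # observed
        e_l0 = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))      # expected
        var_l0 = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)     # variance
        return (l0 - e_l0) / np.sqrt(var_l0)

    # Illustrative: a pattern that misses easy items but gets hard ones right.
    print(lz_statistic(np.array([0, 0, 0, 1, 1]),
                       np.array([0.9, 0.8, 0.7, 0.3, 0.2])))  # strongly negative
    ```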

  15. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  16. Students' Attitudes toward High-Stakes Testing and Its Effect on Educational Decisions

    Science.gov (United States)

    Moran, Aldo Alfredo

    2010-01-01

    With the recent increase in accountability due to No Child Left Behind, graduation rates and drop-out rates are important indicators of how well a school district is performing. High-stakes testing scores are at the forefront of a school's success and recognition as a school that is preparing and graduating students to meet society's challenging…

  17. Agro-fuels, a cartography of stakes

    International Nuclear Information System (INIS)

    2008-09-01

    This document proposes a dashboard of the main issues regarding agro-fuels. Nine sheets propose basic information and data on these issues: 1- agro-fuel production and consumption in the world (ethanol, vegetable oils, perspective for demand in the transport sector), 2- energy efficiency and greenhouse gas emissions (energy assessments and greenhouse effect of agro-fuels, discrepancies of results between first-generation European agro-fuels, case of agro-fuels produced in Southern countries), 3- needed surfaces in Europe (land use and cultivable areas for agro-fuel production in Europe and in France, competition between food and energy crops), 4- deforestation in the South (relationship between agriculture, deforestation and agro-fuels, between deforestation and greenhouse gas emissions), 5- impacts on biodiversity (use of pesticides and fertilizers, large scale cultivations and single-crop farming, cultivation of fallow land and permanent meadows, deforestation in the South, relationship between agro-fuels and GMOs), 6- impacts on water, soil and air (water quality and availability, soil erosion, compaction and fertility loss, air quality), 7- food-related and social stakes (issue of food security, social impacts of agro-fuel production with pressure on family agriculture and issues of land property), 8- public supports and economic efficiency (public promotion of agro-fuels, agro-fuel and oil prices, assessment of the 'avoided' CO2 ton), and 9- perspectives for second-generation agro-fuels (definitions and processes, benefits with respect to first-generation fuels, possible impacts on the environment, barriers to their development)

  18. Reproducible, large-scale production of thallium-based high-temperature superconductors

    International Nuclear Information System (INIS)

    Gay, R.L.; Stelman, D.; Newcomb, J.C.; Grantham, L.F.; Schnittgrund, G.D.

    1990-01-01

    This paper reports on the development of a large scale spray-calcination technique generic to the preparation of ceramic high-temperature superconductor (HTSC) powders. Among the advantages of the technique is that of producing uniformly mixed metal oxides on a fine scale. Production of both yttrium and thallium-based HTSCs has been demonstrated using this technique. In the spray calciner, solutions of the desired composition are atomized as a fine mist into a hot gas. Evaporation and calcination are instantaneous, yielding an extremely fine, uniform oxide powder. The calciner is 76 cm in diameter and can produce metal oxide powder at relatively large rates (approximately 100 g/h) without contamination

  19. Secondary Analysis of Large-Scale Assessment Data: An Alternative to Variable-Centred Analysis

    Science.gov (United States)

    Chow, Kui Foon; Kennedy, Kerry John

    2014-01-01

    International large-scale assessments are now part of the educational landscape in many countries and often feed into major policy decisions. Yet, such assessments also provide data sets for secondary analysis that can address key issues of concern to educators and policymakers alike. Traditionally, such secondary analyses have been based on a…

  20. Selfish play increases during high-stakes NBA games and is rewarded with more lucrative contracts.

    Science.gov (United States)

    Uhlmann, Eric Luis; Barnes, Christopher M

    2014-01-01

    High-stakes team competitions can present a social dilemma in which participants must choose between concentrating on their personal performance and assisting teammates as a means of achieving group objectives. We find that despite the seemingly strong group incentive to win the NBA title, cooperative play actually diminishes during playoff games, negatively affecting team performance. Thus team cooperation decreases in the very high stakes contexts in which it is most important to perform well together. Highlighting the mixed incentives that underlie selfish play, personal scoring is rewarded with more lucrative future contracts, whereas assisting teammates to score is associated with reduced pay due to lost opportunities for personal scoring. A combination of misaligned incentives and psychological biases in performance evaluation bring out the "I" in "team" when cooperation is most critical.

  1. Selfish play increases during high-stakes NBA games and is rewarded with more lucrative contracts.

    Directory of Open Access Journals (Sweden)

    Eric Luis Uhlmann

    Full Text Available High-stakes team competitions can present a social dilemma in which participants must choose between concentrating on their personal performance and assisting teammates as a means of achieving group objectives. We find that despite the seemingly strong group incentive to win the NBA title, cooperative play actually diminishes during playoff games, negatively affecting team performance. Thus team cooperation decreases in the very high stakes contexts in which it is most important to perform well together. Highlighting the mixed incentives that underlie selfish play, personal scoring is rewarded with more lucrative future contracts, whereas assisting teammates to score is associated with reduced pay due to lost opportunities for personal scoring. A combination of misaligned incentives and psychological biases in performance evaluation bring out the "I" in "team" when cooperation is most critical.

  2. The Complex and Unequal Impact of High Stakes Accountability on Untested Social Studies

    Science.gov (United States)

    Pace, Judith L.

    2011-01-01

    This article contributes to research on the impact of high stakes accountability on social studies teaching where it is "not" tested by the state, and addresses the question of what is happening in middle and higher performing versus struggling schools (Wills, 2007). The author presents complex findings from a qualitative study in five…

  3. The Disproportionate Erosion of Local Control: Urban School Boards, High-Stakes Accountability, and Democracy

    Science.gov (United States)

    Trujillo, Tina M.

    2013-01-01

    This case study of an urban school board's experiences under high-stakes accountability demonstrates how the district leaders eschewed democratic governance processes in favor of autocratic behaviors. They possessed narrowly defined goals for teaching and learning that emphasized competitive, individualized means of achievement. Their decision…

  4. Linking Large-Scale Reading Assessments: Comment

    Science.gov (United States)

    Hanushek, Eric A.

    2016-01-01

    E. A. Hanushek points out in this commentary that applied researchers in education have only recently begun to appreciate the value of international assessments, even though there are now 50 years of experience with these. Until recently, these assessments have been stand-alone surveys that have not been linked, and analysis has largely focused on…

  5. Philosophical Questions about Teaching Philosophy: What's at Stake in High School Philosophy Education?

    Science.gov (United States)

    Norris, Trevor

    2015-01-01

    What is at stake in high school philosophy education, and why? Why is it a good idea to teach philosophy at this level? This essay seeks to address some issues that arose in revising the Ontario grade 12 philosophy curriculum documents, significant insights from philosophy teacher education, and some early results of recent research funded by the…

  6. Group Differences in Test-Taking Behaviour: An Example from a High-Stakes Testing Program

    Science.gov (United States)

    Stenlund, Tova; Eklöf, Hanna; Lyrén, Per-Erik

    2017-01-01

    This study investigated whether different groups of test-takers vary in their reported test-taking behaviour in a high-stakes test situation. A between-group design (N = 1129) was used to examine whether high and low achievers, as well as females and males, differ in their use of test-taking strategies, and in level of reported test anxiety and…

  7. Large-scale runoff generation - parsimonious parameterisation using high-resolution topography

    Science.gov (United States)

    Gong, L.; Halldin, S.; Xu, C.-Y.

    2011-08-01

    World water resources have primarily been analysed by global-scale hydrological models in the last decades. Runoff generation in many of these models is based on process formulations developed at catchment scales. The division between slow runoff (baseflow) and fast runoff is primarily governed by slope and spatial distribution of effective water storage capacity, both acting at very small scales. Many hydrological models, e.g. VIC, account for the spatial storage variability in terms of statistical distributions; such models have generally proved to perform well. The statistical approaches, however, use the same runoff-generation parameters everywhere in a basin. The TOPMODEL concept, on the other hand, links the effective maximum storage capacity with real-world topography. Recent availability of global high-quality, high-resolution topographic data makes TOPMODEL attractive as a basis for a physically-based runoff-generation algorithm at large scales, even if its assumptions are not valid in flat terrain or for deep groundwater systems. We present a new runoff-generation algorithm for large-scale hydrology based on TOPMODEL concepts intended to overcome these problems. The TRG (topography-derived runoff generation) algorithm relaxes the TOPMODEL equilibrium assumption so baseflow generation is not tied to topography. TRG only uses the topographic index to distribute average storage to each topographic index class. The maximum storage capacity is proportional to the range of topographic index and is scaled by one parameter. The distribution of storage capacity within large-scale grid cells is obtained numerically through topographic analysis. The new topography-derived distribution function is then inserted into a runoff-generation framework similar to VIC's. Different basin parts are parameterised by different storage capacities, and different shapes of the storage-distribution curves depend on their topographic characteristics. The TRG algorithm is driven by the…
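
    The topographic index at the heart of TOPMODEL-type schemes is ln(a/tan β): upslope contributing area per unit contour length over local slope. A minimal sketch computing it on grids, assuming flow accumulation has already been derived from a DEM:

    ```python
    import numpy as np

    def topographic_index(upslope_area: np.ndarray, slope_rad: np.ndarray,
                          cell_size: float) -> np.ndarray:
        """TOPMODEL topographic index ln(a / tan(beta)).

        upslope_area: flow-accumulation grid in m^2 (assumed precomputed
        from a DEM); slope_rad: local slope in radians; cell_size: grid
        resolution in m (converts area to area per unit contour length).
        """
        a = upslope_area / cell_size                    # specific catchment area, m
        tan_beta = np.maximum(np.tan(slope_rad), 1e-6)  # avoid division by zero on flats
        return np.log(a / tan_beta)

    # Cells with a high index (large drained area, gentle slope) saturate first.
    ```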

  8. Is the Physical Being Taken out of Physical Education? On the Possible Effects of High-Stakes Testing on an Embattled Profession's Curriculum Goals

    Science.gov (United States)

    Seymour, Clancy; Garrison, Mark

    2015-01-01

    Building on recent discussions regarding how current national standards for physical education promote cognitive outcomes over physical outcomes, the authors explore how a new era in high-stakes testing is also contributing to an emphasis on the cognitive, over the physical. While high-stakes testing has been linked to reducing the amount of…

  9. Figuring out How to Be a Teacher in a High-Stakes Context: A Case Study of First-Year Teachers' Conceptual and Practical Development

    Science.gov (United States)

    Brown, Christopher P.; Bay-Borelli, Debra E.; Scott, Jill

    2015-01-01

    High-stakes education reforms across the United States and the globe continue to alter the landscape of teaching and teacher education. One key but understudied aspect of this reform process is the experiences of first-year teachers, particularly those who participated in these high-stakes education systems as students and as a…

  10. Guidance for Large-scale Implementation of Alternate Wetting and Drying: A Biophysical Suitability Assessment

    Science.gov (United States)

    Sander, B. O.; Wassmann, R.; Nelson, A.; Palao, L.; Wollenberg, E.; Ishitani, M.

    2014-12-01

    The alternate wetting and drying (AWD) technology for rice production not only saves 15-30% of irrigation water but also reduces methane emissions by up to 70%. AWD is defined by periodic drying and re-flooding of a rice field. Due to its high mitigation potential and the simplicity of the practice, AWD has gained a lot of attention in recent years. The Climate and Clean Air Coalition (CCAC) has put AWD high on its agenda and funds a project to guide implementation of this technology in Vietnam, Bangladesh and Colombia. One crucial activity is a biophysical suitability assessment for AWD in the three countries. For this, we analyzed rainfall and soil data as well as potential evapotranspiration to assess whether the water balance allows practicing AWD or whether precipitation is too high for rice fields to fall dry. In my talk I will outline key factors for a successful large-scale implementation of AWD with a focus on the biophysical suitability assessment. The seasonal suitability maps that we generated highlight priority areas for AWD implementation and guide policy makers to informed decisions about meaningful investments in infrastructure and extension work.

  11. Negotiating the Literacy Block: Constructing Spaces for Critical Literacy in a High Stakes Setting

    Science.gov (United States)

    Paugh, Patricia; Carey, Jane; King-Jackson, Valerie; Russell, Shelley

    2007-01-01

    This article focuses on the evolution of the classroom literacy block as a learning space where teachers and students renegotiated activities for independent vocabulary and word work within a high-stakes reform environment. When a second grade classroom teacher and literacy support specialist decided to co-teach, they invited all students in the…

  12. Academically Buoyant Students Are Less Anxious about and Perform Better in High-Stakes Examinations

    Science.gov (United States)

    Putwain, David W.; Daly, Anthony L.; Chamberlain, Suzanne; Sadreddini, Shireen

    2015-01-01

    Background: Prior research has shown that test anxiety is negatively related to academic buoyancy, but it is not known whether test anxiety is an antecedent or outcome of academic buoyancy. Furthermore, it is not known whether academic buoyancy is related to performance on high-stakes examinations. Aims: To test a model specifying reciprocal…

  13. "I'm Just Going through the Motions": High-Stakes Accountability and Teachers' Access to Intrinsic Rewards

    Science.gov (United States)

    Rooney, Erin

    2015-01-01

    This article explores teachers' experiences under high-stakes accountability and shows how the narrowing of curriculum depleted teachers' intrinsic work rewards. The article analyzes data from an ethnographic study of teachers' work in two high-poverty urban public schools. The study shows that as instructional mandates emphasized a narrowed…

  14. Optimizing Prediction Using Bayesian Model Averaging: Examples Using Large-Scale Educational Assessments.

    Science.gov (United States)

    Kaplan, David; Lee, Chansoon

    2018-01-01

    This article provides a review of Bayesian model averaging as a means of optimizing the predictive performance of common statistical models applied to large-scale educational assessments. The Bayesian framework recognizes that in addition to parameter uncertainty, there is uncertainty in the choice of models themselves. A Bayesian approach to addressing the problem of model uncertainty is the method of Bayesian model averaging. Bayesian model averaging searches the space of possible models for a set of submodels that satisfy certain scientific principles and then averages the coefficients across these submodels weighted by each model's posterior model probability (PMP). Using the weighted coefficients for prediction has been shown to yield optimal predictive performance according to certain scoring rules. We demonstrate the utility of Bayesian model averaging for prediction in education research with three examples: Bayesian regression analysis, Bayesian logistic regression, and a recently developed approach for Bayesian structural equation modeling. In each case, the model-averaged estimates are shown to yield better prediction of the outcome of interest than any submodel based on predictive coverage and the log-score rule. Implications for the design of large-scale assessments when the goal is optimal prediction in a policy context are discussed.
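
    A common approximation to the posterior model probabilities described above weights each submodel by exp(-BIC/2). A minimal sketch for linear regression submodels, assuming a pandas DataFrame; the column names are hypothetical:

    ```python
    # Minimal BIC-based Bayesian model averaging sketch for linear regression.
    from itertools import combinations

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def bma_coefficients(df: pd.DataFrame, outcome: str, predictors: list[str]):
        """Average coefficients over all submodels, weighted by approximate PMPs."""
        fits = []
        # Enumerate every non-empty subset of predictors as a candidate submodel.
        for k in range(1, len(predictors) + 1):
            for subset in combinations(predictors, k):
                formula = f"{outcome} ~ {' + '.join(subset)}"
                fits.append(smf.ols(formula, data=df).fit())
        bics = np.array([f.bic for f in fits])
        w = np.exp(-0.5 * (bics - bics.min()))  # shift by min BIC for stability
        pmp = w / w.sum()                       # approximate posterior model probs
        # A predictor absent from a submodel contributes a coefficient of zero.
        averaged = {p: float(sum(wi * f.params.get(p, 0.0)
                                 for wi, f in zip(pmp, fits)))
                    for p in predictors}
        return averaged, pmp
    ```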

  15. The impact of high-stakes school admission exams on study achievements: quasi-experimental evidence from Slovakia

    Czech Academy of Sciences Publication Activity Database

    Federičová, Miroslava; Münich, Daniel

    2017-01-01

    Vol. 30, No. 4 (2017), pp. 1069-1092. ISSN 0933-1433. Institutional support: Progres-Q24. Keywords: high-stakes exams; students' motivation; achievement. Subject RIV: AH - Economics. OECD field: Applied Economics, Econometrics. Impact factor: 1.136 (2016)

  16. Markets, Managerialism and Teachers' Work: The Invisible Hand of High Stakes Testing in England

    Science.gov (United States)

    Stevenson, Howard; Wood, Phil

    2013-01-01

    High stakes testing has been long established in the English school system. In this article, we seek to demonstrate how testing has become pivotal to securing the neo-liberal restructuring of schools, that commenced during the Thatcher era, and is reaching a critical point at the current time. Central to this project has been the need to assert…

  17. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues…

  18. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  19. How to Measure and Explain Achievement Change in Large-Scale Assessments: A Rejoinder

    Science.gov (United States)

    Hickendorff, Marian; Heiser, Willem J.; van Putten, Cornelis M.; Verhelst, Norman D.

    2009-01-01

    In this rejoinder, we discuss substantive and methodological validity issues of large-scale assessments of trends in student achievement, commenting on the discussion paper by Van den Heuvel-Panhuizen, Robitzsch, Treffers, and Koller (2009). We focus on methodological challenges in deciding what to measure, how to measure it, and how to foster…

  20. How much is too much assessment? Insight into assessment-driven student learning gains in large-scale undergraduate microbiology courses.

    Science.gov (United States)

    Wang, Jack T H; Schembri, Mark A; Hall, Roy A

    2013-01-01

    Designing and implementing assessment tasks in large-scale undergraduate science courses is a labor-intensive process subject to increasing scrutiny from students and quality assurance authorities alike. Recent pedagogical research has provided conceptual frameworks for teaching introductory undergraduate microbiology, but has yet to define best-practice assessment guidelines. This study assessed the applicability of Biggs' theory of constructive alignment in designing consistent learning objectives, activities, and assessment items that aligned with the American Society for Microbiology's concept-based microbiology curriculum in MICR2000, an introductory microbiology course offered at the University of Queensland, Australia. By improving the internal consistency in assessment criteria and increasing the number of assessment items explicitly aligned to the course learning objectives, the teaching team was able to efficiently provide adequate feedback on numerous assessment tasks throughout the semester, which contributed to improved student performance and learning gains. When comparing the constructively aligned 2011 offering of MICR2000 with its 2010 counterpart, students obtained higher marks in both coursework assignments and examinations as the semester progressed. Students also valued the additional feedback provided, as student rankings for course feedback provision increased in 2011 and assessment and feedback was identified as a key strength of MICR2000. By designing MICR2000 using constructive alignment and iterative assessment tasks that followed a common set of learning outcomes, the teaching team was able to effectively deliver detailed and timely feedback in a large introductory microbiology course. This study serves as a case study for how constructive alignment can be integrated into modern teaching practices for large-scale courses.

  1. Lessons from a large-scale assessment: Results from conceptual inventories

    Directory of Open Access Journals (Sweden)

    Beth Thacker

    2014-07-01

    Full Text Available We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of physics education research (PER)-informed materials and instructional methods into a department where most instruction had previously been traditional and a significant number of faculty were hesitant, ambivalent, or even resistant to the introduction of such reforms. Data were collected in all of the sections of both the large algebra- and calculus-based introductory courses for a number of years employing commonly used conceptual inventories. Results from a small PER-informed, inquiry-based, laboratory-based class are also reported. Results suggest that when PER-informed materials are introduced in the labs and recitations, independent of the lecture style, there is an increase in students' conceptual inventory gains. There is also an increase in the results on conceptual inventories if PER-informed instruction is used in the lecture. The highest conceptual inventory gains were achieved by the combination of PER-informed lectures and laboratories in large class settings and by the hands-on, laboratory-based, inquiry-based course taught in a small class setting.
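
    Conceptual inventory gains in PER are conventionally expressed as Hake's normalized gain; whether this study used exactly that definition is an assumption. A minimal sketch:

    ```python
    def normalized_gain(pre_pct: float, post_pct: float) -> float:
        """Hake's normalized gain <g> = (post - pre) / (100 - pre).

        A standard PER metric for pre/post conceptual inventory scores
        in percent; whether this study used exactly this definition is
        an assumption.
        """
        return (post_pct - pre_pct) / (100.0 - pre_pct)

    print(normalized_gain(35.0, 55.0))  # 0.31 of the possible gain was realized
    ```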

  2. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850°C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100°C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  3. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, and installation of large scale hydrogen production plants will be needed. In this context, development of low cost large scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the currently available electrolysis modules was compiled. A review of the large scale electrolysis plants that have been installed in the world was also carried out. The main projects related to large scale electrolysis were also listed. The economics of large scale electrolysers is discussed, and the influence of energy prices on the hydrogen production cost by large scale electrolysis was evaluated. (authors)

  4. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    Science.gov (United States)

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software…

  5. Large-scale runoff generation – parsimonious parameterisation using high-resolution topography

    Directory of Open Access Journals (Sweden)

    L. Gong

    2011-08-01

    Full Text Available World water resources have primarily been analysed by global-scale hydrological models in the last decades. Runoff generation in many of these models is based on process formulations developed at catchment scales. The division between slow runoff (baseflow) and fast runoff is primarily governed by slope and spatial distribution of effective water storage capacity, both acting at very small scales. Many hydrological models, e.g. VIC, account for the spatial storage variability in terms of statistical distributions; such models have generally proved to perform well. The statistical approaches, however, use the same runoff-generation parameters everywhere in a basin. The TOPMODEL concept, on the other hand, links the effective maximum storage capacity with real-world topography. Recent availability of global high-quality, high-resolution topographic data makes TOPMODEL attractive as a basis for a physically-based runoff-generation algorithm at large scales, even if its assumptions are not valid in flat terrain or for deep groundwater systems. We present a new runoff-generation algorithm for large-scale hydrology based on TOPMODEL concepts intended to overcome these problems. The TRG (topography-derived runoff generation) algorithm relaxes the TOPMODEL equilibrium assumption so baseflow generation is not tied to topography. TRG only uses the topographic index to distribute average storage to each topographic index class. The maximum storage capacity is proportional to the range of topographic index and is scaled by one parameter. The distribution of storage capacity within large-scale grid cells is obtained numerically through topographic analysis. The new topography-derived distribution function is then inserted into a runoff-generation framework similar to VIC's. Different basin parts are parameterised by different storage capacities, and different shapes of the storage-distribution curves depend on their topographic characteristics. The TRG algorithm…

  6. The Contradictions of High-Stakes Accountability "Success": A Case Study of Focused Leadership and Performance Agency

    Science.gov (United States)

    Black, William R.

    2008-01-01

    This article seeks to advance the discussion of the availability of contemporary notions of school leadership for school leaders working within high-stakes accountability reform environment that produce discourses of urgency and legitimize practices of performance that implicitly favour centralized, neo-Tayloristic managerial approaches. Drawing…

  7. Large-scale assessment of benthic communities across multiple marine protected areas using an autonomous underwater vehicle.

    Science.gov (United States)

    Ferrari, Renata; Marzinelli, Ezequiel M; Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F; Byrne, Maria; Malcolm, Hamish A; Williams, Stefan B; Steinberg, Peter D

    2018-01-01

    Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate 'no-take' and 'general-use' (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5-10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful to document changes in marine communities and identify adequate scales to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid to high taxonomic resolution may be necessary to determine zoning effects on key taxa. Third, we provide an example of statistical analyses and…

  8. Achievement goal orientation and situational motivation for a low-stakes test of content knowledge.

    Science.gov (United States)

    Waskiewicz, Rhonda A

    2012-05-10

    To determine the extent of the relationship between students' inherent motivation to achieve in a doctor of pharmacy program and their motivation to achieve on a single low-stakes test of content knowledge. The Attitude Toward Learning Questionnaire (ATL) was administered to 66 third-year pharmacy students at the beginning of the spring 2011 semester, and the Student Opinion Scale (SOS) was administered to the same group immediately following completion of the Pharmacy Curricular Outcomes Assessment (PCOA). Significant differences were found in performance approach and work avoidance based on situational motivation scores. Situational motivation was also found to be directly correlated with performance and mastery approaches and inversely correlated with work avoidance. Criteria were met for predicting importance and effort from performance and mastery approaches and work avoidance scores of pharmacy students. The ability to predict pharmacy students' motivation to perform on a low-stakes standardized test of content knowledge increases the test's usefulness as a measure of curricular effectiveness.

  9. High-stakes conflicts and the link between theory and practice : celebrating the work of Ellen Giebels

    NARCIS (Netherlands)

    Oostinga, Miriam S.D.; Rispens, Sonja; Taylor, Paul J.; Ufkes, Elze G.

    2018-01-01

    In this tribute to the 2012 recipient of the IACM's Jeffrey Rubin's Theory-to-Practice Award, we celebrate the work of Ellen Giebels. We highlight her groundbreaking research on influence tactics in crisis negotiations and other high-stakes conflict situations, showing how her focus on theoretical…

  10. High-Stakes Conflicts and the Link between Theory and Practice : Celebrating the Work of Ellen Giebels

    NARCIS (Netherlands)

    Oostinga, Miriam S.D.; Rispens, Sonja; Taylor, Paul J.; Ufkes, Elze G.

    2018-01-01

    In this tribute to the 2012 recipient of the IACM's Jeffrey Rubin's Theory-to-Practice Award, we celebrate the work of Ellen Giebels. We highlight her groundbreaking research on influence tactics in crisis negotiations and other high-stakes conflict situations, showing how her focus on theoretical…

  11. Large-scale DNA Barcode Library Generation for Biomolecule Identification in High-throughput Screens.

    Science.gov (United States)

    Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio

    2017-10-24

    High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing for the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
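
    A minimal sketch of the naive rejection-sampling baseline that such frameworks improve on: draw random sequences and keep those meeting GC-content, homopolymer, blacklist and minimum-Hamming-distance constraints. The thresholds and the blacklist entry (an EcoRI site) are illustrative; the all-pairs distance check is the O(n²) bottleneck that makes naive generation slow at scale.

    ```python
    import random

    def hamming(a: str, b: str) -> int:
        return sum(x != y for x, y in zip(a, b))

    def generate_barcodes(n: int, length: int = 12, gc_bounds=(0.4, 0.6),
                          max_homopolymer: int = 3, min_dist: int = 3,
                          blacklist=("GAATTC",)) -> list[str]:
        """Naive rejection-sampling barcode generator (illustrative thresholds).

        The all-pairs Hamming check makes this O(n^2); scalable frameworks
        replace it with smarter constructions.
        """
        barcodes: list[str] = []
        while len(barcodes) < n:
            seq = "".join(random.choice("ACGT") for _ in range(length))
            gc = (seq.count("G") + seq.count("C")) / length
            if not gc_bounds[0] <= gc <= gc_bounds[1]:
                continue  # GC content out of range
            if any(base * (max_homopolymer + 1) in seq for base in "ACGT"):
                continue  # homopolymer run too long
            if any(bad in seq for bad in blacklist):
                continue  # contains a blacklisted subsequence
            if any(hamming(seq, b) < min_dist for b in barcodes):
                continue  # too close to an already accepted barcode
            barcodes.append(seq)
        return barcodes

    print(generate_barcodes(5))
    ```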

  12. Large-scale assessment of flood risk and the effects of mitigation measures along the Elbe River

    NARCIS (Netherlands)

    de Kok, Jean-Luc; Grossmann, M.

    2010-01-01

    The downstream effects of flood risk mitigation measures and the necessity to develop flood risk management strategies that are effective on a basin scale call for a flood risk assessment methodology that can be applied at the scale of a large river. We present an example of a rapid flood risk…

  13. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require fast turnaround when processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment based on market forces. We will present how we enabled high-tolerance computing in order to achieve large-scale computing as well as operational cost savings.

  14. Policy Implications for Continuous Employment Decisions of High School Principals: An Alternative Methodological Approach for Using High-Stakes Testing Outcomes

    Science.gov (United States)

    Young, I. Phillip; Fawcett, Paul

    2013-01-01

    Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions, and specific flaws are noted when these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specifically addressed are…

  15. Let's Poem: The Essential Guide to Teaching Poetry in a High-Stakes, Multimodal World (Middle through High School). Language & Literacy Practitioners Bookshelf

    Science.gov (United States)

    Dressman, Mark

    2010-01-01

    This cutting-edge guide presents multiple approaches to teaching poetry at the middle and high school levels. The author provides field-tested activities with detailed how-to instructions, as well as advice for how educators can "justify" their teaching within a high-stakes curriculum environment. "Let's Poem" will show pre- and inservice teachers…

  16. Comprehensive large-scale assessment of intrinsic protein disorder.

    Science.gov (United States)

    Walsh, Ian; Giollo, Manuel; Di Domenico, Tomás; Ferrari, Carlo; Zimmermann, Olav; Tosatto, Silvio C E

    2015-01-15

    Intrinsically disordered regions are key for the function of numerous proteins. Due to the difficulties in experimental disorder characterization, many computational predictors have been developed, with various disorder flavors. Their performance is generally measured on small sets derived mainly from experimentally solved structures, e.g. Protein Data Bank (PDB) chains. MobiDB has only recently started to collect disorder annotations from multiple experimental structures. MobiDB annotates disorder for UniProt sequences, allowing us to conduct the first large-scale assessment of fast disorder predictors on 25 833 different sequences with X-ray crystallographic structures. In addition to a comprehensive ranking of predictors, this analysis produced the following interesting observations. (i) The predictors cluster according to their disorder definition, with a consensus giving more confidence. (ii) Previous assessments appear over-reliant on data annotated at the PDB chain level, and performance is lower on entire UniProt sequences. (iii) Long disordered regions are harder to predict. (iv) Depending on the structural and functional types of the proteins, differences in prediction performance of up to 10% are observed. The datasets are available at http://mobidb.bio.unipd.it/lsd. Supplementary data are available at Bioinformatics online.

  17. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  18. The impact of high-stakes school admission exams on study achievements: quasi-experimental evidence from Slovakia

    Czech Academy of Sciences Publication Activity Database

    Federičová, Miroslava; Münich, Daniel

    2017-01-01

    Vol. 30, No. 4 (2017), pp. 1069-1092. ISSN 0933-1433. R&D Projects: GA ČR(CZ) GBP402/12/G130. Institutional support: RVO:67985998. Keywords: high-stakes exams; students' motivation; achievement. Subject RIV: AH - Economics. OECD field: Applied Economics, Econometrics. Impact factor: 1.136, year: 2016

  19. A Case Study of Co-Teaching in an Inclusive Secondary High-Stakes World History I Classroom

    Science.gov (United States)

    van Hover, Stephanie; Hicks, David; Sayeski, Kristin

    2012-01-01

    In order to provide increasing support for students with disabilities in inclusive classrooms in high-stakes testing contexts, some schools have implemented co-teaching models. This qualitative case study explores how 1 special education teacher (Anna) and 1 general education history teacher (John) make sense of working together in an inclusive…

  20. Measuring large-scale social networks with high resolution.

    Directory of Open Access Journals (Sweden)

    Arkadiusz Stopczynski

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years: the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data types measured and the technical infrastructure in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper is concluded with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection.

  1. Air pollutant dispersion from a large semi-enclosed stadium in an urban area: high-resolution CFD modeling versus full-scale measurements

    NARCIS (Netherlands)

    Hooff, van T.A.J.; Blocken, B.J.E.; Seppelt, R.; Voinov, A.A.; Lange, S.; Bankamp, D.

    2012-01-01

    High-resolution CFD simulations and full-scale measurements have been performed to assess the dispersion of air pollutants (CO2) from the large semi-enclosed Amsterdam ArenA football stadium. The dispersion process is driven by natural ventilation by the urban wind flow and by buoyancy,

  2. Approaches to large scale unsaturated flow in heterogeneous, stratified, and fractured geologic media

    International Nuclear Information System (INIS)

    Ababou, R.

    1991-08-01

    This report develops a broad review and assessment of quantitative modeling approaches and data requirements for large-scale subsurface flow in radioactive waste geologic repositories. The data review includes discussions of controlled field experiments, existing contamination sites, and site-specific hydrogeologic conditions at Yucca Mountain. Local-scale constitutive models for the unsaturated hydrodynamic properties of geologic media are analyzed, with particular emphasis on the effect of structural characteristics of the medium. The report further reviews and analyzes large-scale hydrogeologic spatial variability from aquifer data, unsaturated soil data, and fracture network data gathered from the literature. Finally, various modeling strategies for large-scale flow simulations are assessed, including direct high-resolution simulation and coarse-scale simulation based on auxiliary hydrodynamic models such as the single equivalent continuum and the dual-porosity continuum. The roles of anisotropy, fracturing, and broad-band spatial variability are emphasized. 252 refs

  3. Large-scale assessment of olfactory preferences and learning in Drosophila melanogaster: behavioral and genetic components

    Directory of Open Access Journals (Sweden)

    Elisabetta Versace

    2015-09-01

    In the Evolve and Resequence method (E&R), experimental evolution and genomics are combined to investigate evolutionary dynamics and the genotype-phenotype link. As with other genomic approaches, this method requires many replicates with large population sizes, which imposes severe restrictions on the analysis of behavioral phenotypes. Aiming to use E&R for investigating the evolution of behavior in Drosophila, we have developed a simple and effective method to assess spontaneous olfactory preferences and learning in large samples of fruit flies using a T-maze. We tested this procedure on (a) a large wild-caught population and (b) 11 isofemale lines of Drosophila melanogaster. Compared to previous methods, this procedure reduces the environmental noise and allows for the analysis of large population samples. Consistent with previous results, we show that flies have a preference for orange vs. apple odor. With our procedure, wild-derived flies exhibit olfactory learning in the absence of previous laboratory selection. Furthermore, we find genetic differences in olfactory learning, with relatively high heritability. We propose this large-scale method as an effective tool for E&R and genome-wide association studies on olfactory preferences and learning.

  4. Sophisticated Epistemologies of Physics versus High-Stakes Tests: How Do Elite High School Students Respond to Competing Influences about How to Learn Physics?

    Science.gov (United States)

    Yerdelen-Damar, Sevda; Elby, Andrew

    2016-01-01

    This study investigates how elite Turkish high school physics students claim to approach learning physics when they are simultaneously (i) engaged in a curriculum that led to significant gains in their epistemological sophistication and (ii) subject to a high-stakes college entrance exam. Students reported taking surface (rote) approaches to…

  5. On-line transient stability assessment of large-scale power systems by using ball vector machines

    International Nuclear Information System (INIS)

    Mohammadi, M.; Gharehpetian, G.B.

    2010-01-01

    In this paper, a ball vector machine (BVM) is used for on-line transient stability assessment of large-scale power systems. To classify the system transient security status, a BVM has been trained for all contingencies. The proposed BVM-based security assessment algorithm requires very little training time and memory in comparison with artificial neural networks (ANN), support vector machines (SVM) and other machine-learning-based algorithms. In addition, the proposed algorithm uses fewer support vectors (SVs) and is therefore faster than existing algorithms for on-line applications. A key step in applying any machine learning method is feature selection. In this paper, a new Decision Tree (DT) based feature selection technique is presented. The proposed BVM-based algorithm has been applied to the New England 39-bus power system. The simulation results show the effectiveness and stability of the proposed method for on-line transient stability assessment of large-scale power systems. The proposed feature selection algorithm has been compared with different feature selection algorithms, and the simulation results demonstrate its effectiveness.
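
    The two-stage pipeline described above (tree-based feature selection, then a kernel classifier for secure/insecure labeling) can be sketched roughly as follows. Since BVMs have no widely available open-source implementation, an RBF-kernel SVM stands in for the classifier here, and the synthetic features and labels are assumptions standing in for real contingency simulation data.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 30))        # e.g. bus voltages, angles, flows
    y = (X[:, 0] + 0.5 * X[:, 3] - X[:, 7] > 0).astype(int)  # secure / insecure

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # decision-tree-based feature selection: keep the most important features
    tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
    top = np.argsort(tree.feature_importances_)[::-1][:5]

    # kernel classifier trained on the reduced feature set
    clf = SVC(kernel="rbf").fit(X_tr[:, top], y_tr)
    print("selected features:", top)
    print("test accuracy:", accuracy_score(y_te, clf.predict(X_te[:, top])))
    ```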

  6. Impact of tissue atrophy on high-pass filtered MRI signal phase-based assessment in large-scale group-comparison studies: A simulation study

    Science.gov (United States)

    Schweser, Ferdinand; Dwyer, Michael G.; Deistung, Andreas; Reichenbach, Jürgen R.; Zivadinov, Robert

    2013-10-01

    The assessment of abnormal accumulation of tissue iron in the basal ganglia nuclei and in white matter plaques using the gradient-echo magnetic resonance signal phase has become a research focus in many neurodegenerative diseases such as multiple sclerosis or Parkinson's disease. A common and natural approach is to calculate the mean high-pass-filtered phase of previously delineated brain structures. Unfortunately, the interpretation of such an analysis requires caution: in this paper we demonstrate that regional gray matter atrophy, which is concomitant with many neurodegenerative diseases, may itself directly result in a phase shift seemingly indicative of increased iron concentration, even without any real change in the tissue iron concentration. Although this effect is relatively small, results of large-scale group comparisons may be driven by anatomical changes rather than by changes in the iron concentration.

  7. A continental-scale hydrology and water quality model for Europe: Calibration and uncertainty of a high-resolution large-scale SWAT model

    Science.gov (United States)

    Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.

    2015-05-01

    A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe, groundwater quantity, and in particular quality, has come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economies of the eastern European bloc of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability and the calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
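
    The calibration-and-uncertainty procedure mentioned above can be illustrated with a minimal behavioural-sampling loop in the spirit of GLUE/SUFI-2-type approaches. The toy linear-reservoir model, the parameter range, and the NSE acceptance threshold are illustrative assumptions, not SWAT components.

    ```python
    import numpy as np

    def linear_reservoir(rain, k):
        """Toy runoff model: storage drains at rate k per time step."""
        s, q = 0.0, []
        for r in rain:
            s += r
            out = k * s
            s -= out
            q.append(out)
        return np.array(q)

    def nse(sim, obs):
        # Nash-Sutcliffe efficiency: 1 is a perfect fit
        return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    rng = np.random.default_rng(1)
    rain = rng.gamma(2.0, 2.0, size=365)
    obs = linear_reservoir(rain, 0.35) + rng.normal(0, 0.2, size=365)

    samples = rng.uniform(0.05, 0.95, size=2000)          # prior for k
    scores = np.array([nse(linear_reservoir(rain, k), obs) for k in samples])
    behavioural = samples[scores > 0.7]                   # retained parameter sets
    print("best k:", samples[scores.argmax()])
    print("95% uncertainty band for k:", np.percentile(behavioural, [2.5, 97.5]))
    ```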

  8. Balancing Tensions in Educational Policy Reforms: Large-Scale Implementation of Assessment for Learning in Norway

    Science.gov (United States)

    Hopfenbeck, Therese N.; Flórez Petour, María Teresa; Tolo, Astrid

    2015-01-01

    This study investigates how different stakeholders in Norway experienced a government-initiated, large-scale policy implementation programme on "Assessment for Learning" ("AfL"). Data were collected through 58 interviews with stakeholders in charge of the policy; Ministers of Education and members of the Directorate of…

  9. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment.

    Science.gov (United States)

    Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will yield results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.

  10. The use of test scores from large-scale assessment surveys: psychometric and statistical considerations

    Directory of Open Access Journals (Sweden)

    Henry Braun

    2017-11-01

    Background: Economists are making increasing use of measures of student achievement obtained through large-scale survey assessments such as NAEP, TIMSS, and PISA. The construction of these measures, employing plausible value (PV) methodology, is quite different from that of the more familiar test scores associated with assessments such as the SAT or ACT. These differences have important implications both for utilization and interpretation. Although much has been written about PVs, it appears that there are still misconceptions about whether and how to employ them in secondary analyses. Methods: We address a range of technical issues, including those raised in a recent article that was written to inform economists using these databases. First, an extensive review of the relevant literature was conducted, with particular attention to key publications that describe the derivation and psychometric characteristics of such achievement measures. Second, a simulation study was carried out to compare the statistical properties of estimates based on the use of PVs with those based on other, commonly used methods. Results: It is shown, through both theoretical analysis and simulation, that under fairly general conditions appropriate use of PVs yields approximately unbiased estimates of model parameters in regression analyses of large-scale survey data. The superiority of the PV methodology is particularly evident when measures of student achievement are employed as explanatory variables. Conclusions: The PV methodology used to report student test performance in large-scale surveys remains the state of the art for secondary analyses of these databases.
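
    The standard way to use plausible values in secondary analysis is to run the analysis once per PV and combine the results with Rubin's rules, so that the between-PV variance enters the standard error. A minimal sketch follows; the five slope estimates and their sampling variances are made-up numbers for illustration.

    ```python
    import numpy as np

    def combine_plausible_values(estimates, variances):
        """Combine per-PV regression estimates using Rubin's rules."""
        estimates, variances = np.asarray(estimates), np.asarray(variances)
        m = len(estimates)
        point = estimates.mean()
        within = variances.mean()               # average sampling variance
        between = estimates.var(ddof=1)         # variance across the PVs
        total = within + (1 + 1 / m) * between  # add imputation variance
        return point, np.sqrt(total)

    # e.g. slope of achievement on SES, estimated once per plausible value
    betas = [0.412, 0.398, 0.420, 0.405, 0.415]
    sampling_vars = [0.031**2, 0.030**2, 0.032**2, 0.031**2, 0.030**2]
    beta, se = combine_plausible_values(betas, sampling_vars)
    print(f"combined estimate {beta:.3f} (SE {se:.3f})")
    ```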

  11. Differences Across Levels in the Language of Agency and Ability in Rating Scales for Large-Scale Second Language Writing Assessments

    OpenAIRE

    Anderson Salena Sampson

    2017-01-01

    While large-scale language and writing assessments benefit from a wealth of literature on the reliability and validity of specific tests and rating procedures, there is comparatively less literature that explores the specific language of second language writing rubrics. This paper provides an analysis of the language of performance descriptors for the public versions of the TOEFL and IELTS writing assessment rubrics, with a focus on linguistic agency encoded by agentive verbs and language of ...

  12. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combination of remote sensing observations and model outputs has grown, scientists are increasingly burdened with both the necessity and the complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets, such as Landsat, MODIS, MERRA, and NGA, are stored in this system in a write-once/read-many file system. High-performance virtual machines are deployed and scaled according to the individual scientist's requirements, specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS), enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the necessary exascale architectures required for future systems.
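
    The spatiotemporal-index idea (a relational table mapping space-time queries to stored chunk locations) can be sketched as follows. SQLite stands in for the relational database, and the table schema and HDFS paths are hypothetical, invented for the sketch.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE chunk_index (
        path TEXT, t0 REAL, t1 REAL,
        lat0 REAL, lat1 REAL, lon0 REAL, lon1 REAL)""")

    # register each stored NetCDF chunk with its time span and bounding box
    chunks = [("hdfs:///merra/1980/01.nc", 0, 31, -90, 90, -180, 180),
              ("hdfs:///merra/1980/02.nc", 31, 59, -90, 90, -180, 180)]
    con.executemany("INSERT INTO chunk_index VALUES (?,?,?,?,?,?,?)", chunks)

    def locate(t0, t1, lat0, lat1, lon0, lon1):
        """Map a space-time query to the chunk paths that intersect it."""
        rows = con.execute("""SELECT path FROM chunk_index
            WHERE t1 >= ? AND t0 <= ? AND lat1 >= ? AND lat0 <= ?
              AND lon1 >= ? AND lon0 <= ?""",
            (t0, t1, lat0, lat1, lon0, lon1))
        return [r[0] for r in rows]

    print(locate(10, 40, 30, 60, -10, 20))
    ```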

  13. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    Science.gov (United States)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application's environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding endpoint management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
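
    The core of the architecture described above is a subscription-based event filter: consumers register predicates and only matching events are forwarded, which cuts traffic close to the source. The sketch below illustrates that idea only; the class, the event schema, and the predicates are invented for the sketch and are not the paper's API.

    ```python
    class EventFilterBus:
        """Minimal subscription-based event filter: consumers register
        predicates, and only matching events are forwarded to them."""
        def __init__(self):
            self.subscribers = []

        def subscribe(self, predicate, handler):
            self.subscribers.append((predicate, handler))

        def publish(self, event):
            for predicate, handler in self.subscribers:
                if predicate(event):      # filter close to the event source
                    handler(event)

    bus = EventFilterBus()
    # a debugging tool that only wants high-latency RPC events
    bus.subscribe(lambda e: e["type"] == "rpc" and e["latency_ms"] > 100,
                  lambda e: print("slow RPC:", e))

    bus.publish({"type": "rpc", "latency_ms": 250, "node": "n17"})
    bus.publish({"type": "rpc", "latency_ms": 3, "node": "n02"})  # filtered out
    ```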

  14. Comparison of the large-scale radon risk map for southern Belgium with results of high resolution surveys

    International Nuclear Information System (INIS)

    Zhu, H.-C.; Charlet, J.M.; Poffijn, A.

    2000-01-01

    A large-scale radon survey consisting of long-term measurements in about 5200 single-family houses in the southern part of Belgium was carried out from 1995 to 1999. A radon risk map for the region was produced using geostatistical and GIS approaches. Some communes or villages situated within high-risk areas were chosen for detailed surveys. A high-resolution radon survey with about 330 measurements was performed in half of the commune of Burg-Reuland. Comparison of radon maps on quite different scales shows that the general Rn risk map exhibits a pattern similar to that of the radon map for the detailed study area. Another detailed radon survey, in the village of Hatrival, situated in a high-radon area, found a very high proportion of houses with elevated radon concentrations. The results of this detailed survey are comparable to the expectation for high-risk areas on the large-scale radon risk map. The good correspondence between the findings of the general risk map and the analysis of the limited detailed surveys suggests that the large-scale radon risk map is likely reliable. (author)

  15. A probabilistic assessment of large scale wind power development for long-term energy resource planning

    Science.gov (United States)

    Kennedy, Scott Warren

    A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable
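
    The probabilistic interaction between wind availability, demand, and conventional outages can be illustrated with a small Monte Carlo sketch of loss-of-load counting and effective load-carrying capability. All numbers below (demand profile, unit count, outage rate, wind distribution) are toy assumptions, and hourly draws are treated as independent, which a real capacity study would not do.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    hours = 8760
    demand = 900 + 150 * rng.standard_normal(hours)          # MW, toy profile
    conventional = 110 * rng.binomial(10, 0.95, hours)       # 10 units, forced outages
    wind = 300 * rng.beta(2, 5, hours)                       # intermittent output

    def lole(supply, demand):
        """Loss-of-load expectation: hours per year with a shortfall."""
        return np.sum(supply < demand)

    base = lole(conventional, demand)
    with_wind = lole(conventional + wind, demand)
    print(f"LOLE without wind: {base} h/yr, with wind: {with_wind} h/yr")

    # effective load-carrying capability: extra constant load the system can
    # serve with wind while keeping LOLE at its original level
    for elcc in range(0, 301, 5):
        if lole(conventional + wind, demand + elcc) > base:
            print("ELCC of the wind plant ~", elcc - 5, "MW")
            break
    ```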

  16. Scale interaction in a mixing layer. The role of the large-scale gradients

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and with a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  17. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    Science.gov (United States)

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays the groundwork for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  18. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff, therefore, provides a foundation for approaching European hydrology with respect to observed patterns on large scales, and with regard to the ability of models to capture these. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of a model's strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also enable the detection of shortcomings in model assumptions and thus a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large-scale

  19. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Directory of Open Access Journals (Sweden)

    Xiangyun Xiao

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from the Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), the experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.

  20. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Science.gov (United States)

    Xiao, Xiangyun; Zhang, Wei; Zou, Xiufen

    2015-01-01

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.
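
    The split-and-parallelize idea can be sketched roughly as below: the gene set is partitioned into modules and each module's regulators are fitted independently in parallel. For brevity, a linear least-squares fit of dx/dt on expression levels stands in for the paper's ODE-based optimization, the module assignment is arbitrary rather than derived from network modularity, and pool.map is synchronous, whereas the paper's communication scheme is asynchronous.

    ```python
    import numpy as np
    from multiprocessing import Pool

    def fit_module(args):
        """Infer regulators of one module by least squares on dx/dt ~ A x."""
        module_genes, X, dXdt = args
        # each row of A: regulatory weights for one gene, over all genes
        A, *_ = np.linalg.lstsq(X.T, dXdt[module_genes].T, rcond=None)
        return module_genes, A.T

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n_genes, n_samples = 200, 50
        X = rng.normal(size=(n_genes, n_samples))        # expression levels
        dXdt = rng.normal(size=(n_genes, n_samples))     # estimated derivatives

        modules = np.array_split(np.arange(n_genes), 8)  # exploit modularity
        with Pool() as pool:
            results = pool.map(fit_module, [(m, X, dXdt) for m in modules])

        A = np.zeros((n_genes, n_genes))
        for genes, block in results:                     # reassemble network
            A[genes] = block
        print("inferred network shape:", A.shape)
    ```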

  1. Modelling high Reynolds number wall-turbulence interactions in laboratory experiments using large-scale free-stream turbulence.

    Science.gov (United States)

    Dogan, Eda; Hearst, R Jason; Ganapathisubramani, Bharathram

    2017-03-13

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted on a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to 'simulate' high Reynolds number wall-turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows, as they demonstrate that these can be achieved at typical laboratory scales. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'.

  2. Teaching under the New Taylorism: High-Stakes Testing and the Standardization of the 21st Century Curriculum

    Science.gov (United States)

    Au, Wayne

    2011-01-01

    The application of the principles of scientific management within the structure, organization, and curriculum of public schools in the US became dominant during the early 1900s. Based upon research evidence from the modern-day era of high-stakes testing in US public education, the fundamental logics guiding scientific management have resurfaced…

  3. A Phenotype Classification of Internet Use Disorder in a Large-Scale High-School Study

    Directory of Open Access Journals (Sweden)

    Katajun Lindenberg

    2018-04-01

    Internet Use Disorder (IUD) affects numerous adolescents worldwide, and (Internet) Gaming Disorder, a specific subtype of IUD, has recently been included in DSM-5 and ICD-11. Epidemiological studies have identified prevalence rates of up to 5.7% among adolescents in Germany. However, little is known about the development of risk during adolescence and its association with education. The aims of this study were to: (a) identify a clinically relevant latent profile in a large-scale high-school sample; (b) estimate prevalence rates of IUD for distinct age groups; and (c) investigate associations with gender and education. N = 5387 adolescents from 41 schools in Germany, aged 11-21, were assessed using the Compulsive Internet Use Scale (CIUS). Latent profile analyses showed five profile groups with differences in CIUS response patterns, age and school type. IUD was found in 6.1% and high-risk Internet use in 13.9% of the total sample. Two peaks were found in prevalence rates, indicating the highest risk of IUD in the age groups 15-16 and 19-21. Prevalence did not differ significantly between boys and girls. High-level education schools showed the lowest (4.9%) and vocational secondary schools the highest prevalence rate (7.8%). The differences between school types could not be explained by academic level.

  4. Use of large-scale acoustic monitoring to assess anthropogenic pressures on Orthoptera communities.

    Science.gov (United States)

    Penone, Caterina; Le Viol, Isabelle; Pellissier, Vincent; Julien, Jean-François; Bas, Yves; Kerbiriou, Christian

    2013-10-01

    Biodiversity monitoring at large spatial and temporal scales is greatly needed in the context of global changes. Although insects are a species-rich group and are important for ecosystem functioning, they have been largely neglected in conservation studies and policies, mainly due to technical and methodological constraints. Sound detection, a nondestructive method, is easily applied within a citizen-science framework and could be an interesting solution for insect monitoring. However, it has not yet been tested at a large scale. We assessed the value of a citizen-science program in which Orthoptera species (Tettigoniidae) were monitored acoustically along roads. We used Bayesian model-averaging analyses to test whether we could detect widely known patterns of anthropogenic effects on insects, such as the negative effects of urbanization or intensive agriculture on Orthoptera populations and communities. We also examined site-abundance correlations between years and estimated the biases in species detection to evaluate and improve the protocol. Urbanization and intensive agricultural landscapes negatively affected Orthoptera species richness, diversity, and abundance. This finding is consistent with results of previous studies of Orthoptera, vertebrates, carabids, and butterflies. The average mass of communities decreased as urbanization increased. The dispersal ability of communities increased as the percentage of agricultural land and, to a lesser extent, urban area increased. Despite changes in abundances over time, we found significant correlations between yearly abundances. We identified biases linked to the protocol (e.g., car speed or temperature) that can easily be accounted for in analyses. We argue that acoustic monitoring of Orthoptera along roads offers several advantages for assessing Orthoptera biodiversity at large spatial and temporal extents, particularly in a citizen-science framework.

  5. Probing high scale physics with top quarks at the Large Hadron Collider

    Science.gov (United States)

    Dong, Zhe

    With the Large Hadron Collider (LHC) running at the TeV scale, we expect to find deviations from the Standard Model in the experiments and to understand the origin of these deviations. Being the heaviest elementary particle observed so far, with a mass at the electroweak scale, the top quark is a powerful probe for new phenomena of high-scale physics at the LHC. We therefore concentrate on studying high-scale physics phenomena involving top quark pair production or decay at the LHC. In this thesis, we study the discovery potential of string resonances decaying to the t/tbar final state and examine the possibility of observing baryon-number-violating top-quark production or decay at the LHC. We point out that string resonances for a string scale below 4 TeV can be detected via the t/tbar channel, by reconstructing center-of-mass frame kinematics of the resonances from either the t/tbar semi-leptonic decay or recent techniques for identifying highly boosted tops. For the study of baryon-number-violating processes, using a model-independent effective approach and focusing on operators with minimal mass dimension, we find that the corresponding effective coefficients could be directly probed at the LHC already with an integrated luminosity of 1 inverse femtobarn at 7 TeV, and further constrained with 30 (100) inverse femtobarns at 7 (14) TeV.

  6. Large-scale Assessment Yields Evidence of Minimal Use of Reasoning Skills in Traditionally Taught Classes

    Science.gov (United States)

    Thacker, Beth

    2017-01-01

    Large-scale assessment data from Texas Tech University yielded evidence that most students taught traditionally in large lecture classes with online homework and predominantly multiple choice question exams, when asked to answer free-response (FR) questions, did not support their answers with logical arguments grounded in physics concepts. In addition to a lack of conceptual understanding, incorrect and partially correct answers lacked evidence of the ability to apply even lower level reasoning skills in order to solve a problem. Correct answers, however, did show evidence of at least lower level thinking skills as coded using a rubric based on Bloom's taxonomy. With the introduction of evidence-based instruction into the labs and recitations of the large courses and in a small, completely laboratory-based, hands-on course, the percentage of correct answers with correct explanations increased. The FR format, unlike other assessment formats, allowed assessment of both conceptual understanding and the application of thinking skills, clearly pointing out weaknesses not revealed by other assessment instruments, and providing data on skills beyond conceptual understanding for course and program assessment. Supported by National Institutes of Health (NIH) Challenge grant #1RC1GM090897-01.

  7. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will yield results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632

  8. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Computing speed is a significant issue in large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme is proposed for flood simulation. To realize fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based high-performance computing method using OpenACC was adopted to parallelize the shallow water model. An unstructured data management method is presented to control the data transport between the GPU and the CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, exploiting the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and thus holds promise for dynamic inundation risk identification and disaster assessment.
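
    The reason shallow water solvers map well to GPUs is that every cell update depends only on its neighbors. The toy below shows that per-cell structure with a 1D, structured-grid, Lax-Friedrichs step on the CPU using numpy; it is only a stand-in for intuition, not the paper's 2D unstructured Godunov scheme or its OpenACC offload, and the dam-break setup with periodic boundaries is an assumption for the sketch.

    ```python
    import numpy as np

    g = 9.81

    def flux(h, hu):
        u = hu / np.maximum(h, 1e-8)
        return np.array([hu, hu * u + 0.5 * g * h * h])

    def step(h, hu, dx, dt):
        """One Lax-Friedrichs finite-volume step; every cell is updated
        independently from its neighbors, which is exactly the data
        parallelism a GPU offload exploits."""
        U = np.array([h, hu])
        F = flux(h, hu)
        Um, Up = np.roll(U, 1, axis=1), np.roll(U, -1, axis=1)
        Fm, Fp = np.roll(F, 1, axis=1), np.roll(F, -1, axis=1)
        Unew = 0.5 * (Um + Up) - dt / (2 * dx) * (Fp - Fm)
        return Unew[0], Unew[1]

    h = np.where(np.arange(400) < 200, 2.0, 1.0)   # dam-break initial state
    hu = np.zeros_like(h)
    for _ in range(100):
        h, hu = step(h, hu, dx=1.0, dt=0.2)        # CFL ~ 0.9, stable
    print("max depth after 100 steps:", h.max())
    ```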

  9. Assessing large-scale wildlife responses to human infrastructure development.

    Science.gov (United States)

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future.
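
    The functional-response idea (species density as a function of distance to infrastructure, aggregated over a region) can be sketched as follows. The logistic curve, the half-effect distance, and the toy landscape are assumptions for illustration, not the fitted response curves from the paper.

    ```python
    import numpy as np

    def density_response(dist_m, half_effect=500.0):
        """Relative density as a function of distance to infrastructure
        (logistic shape; density approaches 1 far from any road)."""
        return 1.0 / (1.0 + np.exp(-(dist_m - half_effect) / 150.0))

    # toy landscape: distance of each 1-km2 cell to the nearest road
    rng = np.random.default_rng(3)
    distances = rng.uniform(0, 5000, size=10000)

    relative_density = density_response(distances)
    decline = 1 - relative_density.mean()   # region-wide predicted decline
    print(f"predicted population decline across the region: {decline:.1%}")
    ```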

  10. Local governance of energy. Clarification of stakes and illustration by spatial planning

    International Nuclear Information System (INIS)

    Saujot, Mathieu; Ruedinger, Andreas; Guerry, Anais

    2014-01-01

    As the energy transition implies major societal transformations, the authors developed an analysis framework around the main questions raised by local governance: the role of the different levels of local communities in the definition and implementation of strategies, the key stakes in the division of competencies between the State and local communities, and the stakes regarding spatial planning in this context. The authors first address the issue of the relevance of the different territorial scales in a context of evolving energy policies. They propose an overview of this issue with reference to the debate on the local governance of the transition. They discuss lessons learned from decentralisation in other fields of local policy action, notably urban planning and spatial planning

  11. Failure Impact Assessment for Large-Scale Landslides Located Near Human Settlement: Case Study in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Ming-Chien Chung

    2018-05-01

    In 2009, Typhoon Morakot caused over 680 deaths and more than 20,000 landslides in Taiwan. From 2010 to 2015, the Central Geological Survey of the Ministry of Economic Affairs identified 1047 potential large-scale landslides in Taiwan, 103 of which could affect human settlements. This paper presents an analytical procedure that can be applied to assess the possible impact of a landslide collapse on nearby settlements. In this paper, existing technologies, including interpretation of remote sensing images, hydrogeological investigation, and numerical analysis, are integrated to evaluate potential failure scenarios and the landslide scale of a specific case: the Xinzhuang landslide. GeoStudio and RAMMS analysis models and hazard classification produced the following results: (1) evaluation of the failure mechanisms and the influence zones of large-scale landslides; (2) assessment of the migration and accumulation of the landslide mass after failure; and (3) a landslide hazard and evacuation map. The results of the case study show that this analytical procedure can quantitatively estimate potential threats to human settlements. Furthermore, it can be applied to other villages and used as a reference in disaster prevention and evacuation planning.

  12. Material versatility using replica molding for large-scale fabrication of high aspect-ratio, high density arrays of nano-pillars

    International Nuclear Information System (INIS)

    Li, Y; Menon, C; Ng, H W; Gates, B D

    2014-01-01

    Arrays of high aspect-ratio (AR) nano-pillars have attracted considerable interest for various applications, such as solar cells, surface acoustic sensors, tissue engineering, bio-inspired adhesives and anti-reflective surfaces. Each application may require a different structural material, which can vary in the required chemical composition and mechanical properties. In this paper, a low-cost fabrication procedure is proposed for large-scale, high-AR and high-density arrays of nano-pillars. The proposed method enables the replication of a master with high fidelity, using the resulting replica molds multiple times, and preparing arrays of nano-pillars in a variety of different materials. As an example applied to bio-inspired dry adhesion, polymeric arrays of nano-pillars are prepared in this work. Thermoset and thermoplastic nano-pillar arrays are examined using an atomic force microscope to assess their adhesion strength and its uniformity. The results indicate the proposed method is robust and can be used to reliably prepare nano-structures with a high AR. (paper)

  13. Differences Across Levels in the Language of Agency and Ability in Rating Scales for Large-Scale Second Language Writing Assessments

    Directory of Open Access Journals (Sweden)

    Anderson Salena Sampson

    2017-12-01

    While large-scale language and writing assessments benefit from a wealth of literature on the reliability and validity of specific tests and rating procedures, there is comparatively less literature that explores the specific language of second language writing rubrics. This paper provides an analysis of the language of performance descriptors for the public versions of the TOEFL and IELTS writing assessment rubrics, with a focus on linguistic agency encoded by agentive verbs and language of ability encoded by the modal verbs 'can' and 'cannot'. While the IELTS rubrics feature more agentive verbs than the TOEFL rubrics, both pairs of rubrics feature uneven syntax across the band or score descriptors, with either more agentive verbs for the highest scores, more nominalization for the lowest scores, or language of ability exclusively in the lowest scores. These patterns mirror similar patterns in the language of college-level classroom-based writing rubrics, but they differ from patterns seen in performance descriptors for some large-scale admissions tests. It is argued that the lack of syntactic congruity across performance descriptors in the IELTS and TOEFL rubrics may reflect a bias in how actual student performances at different levels are characterized.

  14. Strengthening the abilities of French-speaking NGOs. Post-2012 climate stakes. Adaptation - Energy - Deforestation, France - Africa - Canada

    International Nuclear Information System (INIS)

    Creach, Morgane; Margot, Stephanie; Connor, Richard; Angerand, Sylvain

    2007-10-01

    The first part of this report discusses possible international responses to the challenge of adaptation to climate change (presentation of the main notions, discussion of the United Nations Framework Convention on Climate Change, and the international stakes of adaptation). The second part discusses the perspectives for access to energy in African countries: a description of the African energy context, the application of existing climate change mitigation mechanisms to the field of energy, and the stakes for post-2012 negotiations. The next part addresses the stakes of avoided deforestation: definitions and key figures, the direct and underlying causes of deforestation, an assessment of the cost of slowing it down or stopping it, and the stakes and conflicting interests around 'avoided deforestation'. The last part reports on the 'post-2012 climate stakes' workshop, which addressed these same topics (access to energy in African countries, adaptation to climate change, avoided deforestation)

  15. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, or provide future scenarios of water resources. For a better understanding of hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach combining large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant across time-scales (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
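
    The scale-by-scale downscaling idea can be sketched with PyWavelets (assumed installed): decompose predictor and predictand into wavelet levels, fit one link per level, and reconstruct. The synthetic series, the db4 wavelet, the decomposition depth, and the per-level linear regression are all assumptions for illustration, not the paper's fitted model.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(7)
    n = 512                                      # months of data
    slp = rng.normal(size=n)                     # large-scale predictor (e.g. SLP index)
    flow = 0.8 * slp + rng.normal(0, 0.5, n)     # local predictand (streamflow)

    wavelet, level = "db4", 4
    c_slp = pywt.wavedec(slp, wavelet, level=level)
    c_flow = pywt.wavedec(flow, wavelet, level=level)

    # fit one linear link per time-scale (wavelet level), then reconstruct
    c_pred = []
    for cs, cf in zip(c_slp, c_flow):
        b, a = np.polyfit(cs, cf, 1)             # scale-dependent regression
        c_pred.append(b * cs + a)

    flow_hat = pywt.waverec(c_pred, wavelet)[:n]
    r = np.corrcoef(flow, flow_hat)[0, 1]
    print(f"correlation of downscaled vs. observed flow: {r:.2f}")
    ```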

  16. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skill than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skill and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production implies having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  17. High-Stakes Schooling: What We Can Learn from Japan's Experiences with Testing, Accountability, and Education Reform

    Science.gov (United States)

    Bjork, Christopher

    2015-01-01

    If there is one thing that describes the trajectory of American education, it is this: more high-stakes testing. In the United States, the debates surrounding this trajectory can be so fierce that it feels like we are in uncharted waters. As Christopher Bjork reminds us in this study, however, we are not the first to make testing so central to…

  18. Assessment of disturbance at three spatial scales in two large tropical reservoirs

    Directory of Open Access Journals (Sweden)

    Letícia de Morais

    2016-12-01

    Large reservoirs are an increasingly common feature across tropical landscapes because of their importance for water supply, flood control and hydropower, but their ecological conditions are infrequently evaluated. Our objective was to assess the range of disturbances for two large tropical reservoirs and their influences on benthic macroinvertebrates. We tested three hypotheses: (i) a wide variation in the level of environmental disturbance can be observed among sites in the reservoirs; (ii) the two reservoirs would exhibit different degrees of disturbance; and (iii) the magnitude of disturbance would influence the structure and composition of benthic assemblages. For each reservoir, we assessed land use (macroscale), physical habitat structure (mesoscale), and water quality (microscale). We sampled 40 sites in the littoral zones of both Três Marias and São Simão Reservoirs (Minas Gerais, Brazil). At the macroscale, we measured cover percentages of land use categories in buffer areas at each site, where each buffer was a circular arc of 250 m. At the mesoscale, we assessed the presence of human disturbances in the riparian and drawdown zones at the local (site) scale. At the microscale, we assessed water quality at each macroinvertebrate sampling station using the Micro Disturbance Index (MDI). To evaluate anthropogenic disturbance at each site, we calculated an integrated disturbance index (IDI) from a buffer disturbance index (BDI) and a local disturbance index (LDI). For each site, we calculated richness and abundance of benthic macroinvertebrates, Chironomidae genera richness, abundance and percent Chironomidae individuals, abundance and percent EPT individuals, richness and percent EPT taxa, abundance and percent resistant individuals, and abundance and percent non-native individuals. We also evaluated the influence of disturbance on benthic macroinvertebrate assemblages at the entire-reservoir scale. The BDI, LDI and IDI had significantly
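
    The record does not give the formulas behind the BDI, LDI and IDI; the sketch below only illustrates how site-level indices of this kind could be assembled, assuming weighted land-use sums, a checklist proportion and a Euclidean combination:

    ```python
    # Illustrative sketch only: the record does not give the BDI/LDI/IDI
    # formulas, so weighted sums and a Euclidean combination are assumed here.
    import numpy as np

    def buffer_disturbance_index(landuse_fracs, weights):
        """Weighted sum of land-use cover fractions in the 250 m buffer."""
        return sum(weights[k] * landuse_fracs.get(k, 0.0) for k in weights)

    def local_disturbance_index(disturbance_flags):
        """Proportion of local human-disturbance indicators present at a site."""
        return float(np.mean(disturbance_flags))

    def integrated_disturbance_index(bdi, ldi):
        """One possible integration: distance from the no-disturbance origin."""
        return float(np.hypot(bdi, ldi))

    site = {"urban": 0.30, "pasture": 0.50, "forest": 0.20}
    weights = {"urban": 1.0, "pasture": 0.5}        # assumed severity weights
    bdi = buffer_disturbance_index(site, weights)
    ldi = local_disturbance_index([1, 0, 1, 1, 0])  # e.g. 5 checklist items
    print(round(integrated_disturbance_index(bdi, ldi), 3))
    ```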

  19. The Contribution of International Large-Scale Assessments to Educational Research: Combining Individual and Institutional Data Sources

    Science.gov (United States)

    Strietholt, Rolf; Scherer, Ronny

    2018-01-01

    The present paper aims to discuss how data from international large-scale assessments (ILSAs) can be utilized and combined, even with other existing data sources, in order to monitor educational outcomes and study the effectiveness of educational systems. We consider different purposes of linking data, namely, extending outcomes measures,…

  20. Coverage of the migrant population in large-scale assessment surveys. Experiences from PIAAC in Germany

    Directory of Open Access Journals (Sweden)

    Débora B. Maehler

    2017-03-01

    Background: European countries, and especially Germany, are currently strongly affected by human migration flows, with the result that integration has become a challenge. Only very little empirical evidence on topics such as labor market participation and processes of social integration of migrant subpopulations is available to date from large-scale population surveys. The present paper provides an overview of the representation of the migrant population in the German Programme for the International Assessment of Adult Competencies (PIAAC) sample and evaluates reasons for the under-coverage of this population. Methods: We examine outcome rates and reasons for nonresponse among the migrant population based on sampling frame data, and we also examine paradata from the interviewers' contact protocols to evaluate time patterns for the successful contacting of migrants. Results and Conclusions: This is the first time that results of this kind have been presented for a large-scale assessment in educational research. These results are also discussed in the context of future PIAAC cycles. Overall, they confirm the expectation in the literature that factors such as language problems result in lower contact and response rates among migrants.

  1. Towards large scale stochastic rainfall models for flood risk assessment in trans-national basins

    Science.gov (United States)

    Serinaldi, F.; Kilsby, C. G.

    2012-04-01

    While extensive research has been devoted to rainfall-runoff modelling for risk assessment in small and medium size watersheds, less attention has been paid, so far, to large-scale trans-national basins, where flood events have severe societal and economic impacts with magnitudes quantified in billions of Euros. As an example, in the April 2006 flood events along the Danube basin at least 10 people lost their lives and up to 30 000 people were displaced, with overall damages estimated at more than half a billion Euros. In this context, refined analytical methods are fundamental to improve the risk assessment and, in turn, the design of structural and non-structural measures of protection, such as hydraulic works and insurance/reinsurance policies. Since flood events are mainly driven by exceptional rainfall events, suitable characterization and modelling of the space-time properties of rainfall fields is a key issue for performing a reliable flood risk analysis based on alternative precipitation scenarios to be fed into a new generation of large-scale rainfall-runoff models. Ultimately, this approach should be extended to a global flood risk model. However, as the need for rainfall models able to account for and simulate the spatio-temporal properties of rainfall fields over large areas is rather new, the development of new rainfall simulation frameworks is a challenging task that involves overcoming the drawbacks of existing modelling schemes (devised for smaller spatial scales) while keeping their desirable properties. In this study, we critically summarize the most widely used approaches for rainfall simulation. Focusing on stochastic approaches, we stress the importance of introducing suitable climate forcings in these simulation schemes in order to account for the physical coherence of rainfall fields over wide areas. Based on preliminary considerations, we suggest a modelling framework relying on the Generalized Additive Models for Location, Scale and Shape (GAMLSS)
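
    GAMLSS models are typically fitted in R; as a language-consistent stand-in, this sketch illustrates the underlying idea of a climate-informed stochastic rainfall generator in which a gamma distribution's mean depends on a large-scale predictor through a log link (coefficients and the SLP index are invented):

    ```python
    # Sketch of a climate-informed stochastic rainfall generator: the gamma
    # mean depends on a large-scale predictor (a stand-in SLP index). Purely
    # illustrative; GAMLSS proper would also let scale/shape vary.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 480
    slp_index = rng.standard_normal(n)            # stand-in SLP covariate

    # Assumed log-link dependence of the mean on the covariate.
    beta0, beta1, shape = np.log(80.0), -0.25, 2.0
    mean = np.exp(beta0 + beta1 * slp_index)      # mm/month
    scale = mean / shape                          # gamma: mean = shape * scale

    rain = rng.gamma(shape, scale)                # one synthetic realization
    print(rain[:5].round(1))
    ```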

  2. Large-scale application of highly-diluted bacteria for Leptospirosis epidemic control.

    Science.gov (United States)

    Bracho, Gustavo; Varela, Enrique; Fernández, Rolando; Ordaz, Barbara; Marzoa, Natalia; Menéndez, Jorge; García, Luis; Gilling, Esperanza; Leyva, Richard; Rufín, Reynaldo; de la Torre, Rubén; Solis, Rosa L; Batista, Niurka; Borrero, Reinier; Campa, Concepción

    2010-07-01

    Leptospirosis is a zoonotic disease of major importance in the tropics, where incidence peaks in rainy seasons. Natural disasters represent a big challenge to Leptospirosis prevention strategies, especially in endemic regions. Vaccination is an effective option but of reduced effectiveness in emergency situations. Homeoprophylactic interventions might help to control epidemics by using highly-diluted pathogens to induce protection on a short time scale. We report the results of a very large-scale homeoprophylaxis (HP) intervention against Leptospirosis in a dangerous epidemic situation in three provinces of Cuba in 2007. Forecast models were used to estimate possible trends of disease incidence. A homeoprophylactic formulation was prepared from dilutions of four circulating strains of Leptospirosis. This formulation was administered orally to 2.3 million persons at high risk in an epidemic in a region affected by natural disasters. The data from surveillance were used to measure the impact of the intervention by comparison with historical trends and non-intervention regions. After the homeoprophylactic intervention a significant decrease of the disease incidence was observed in the intervention regions. No such modifications were observed in non-intervention regions. In the intervention region the incidence of Leptospirosis fell below the historic median. This observation was independent of rainfall. The homeoprophylactic approach was associated with a large reduction of disease incidence and control of the epidemic. The results suggest the use of HP as a feasible tool for epidemic control; further research is warranted.

  3. Multilevel stakeholder consensus building in radioactive waste management

    International Nuclear Information System (INIS)

    Dreimanis, Andrejs

    2008-01-01

    Full text: Our society's increased demands regarding quality of life, global security and environmental safety, as well as the basic ethical principle of equity, have advanced our attitude towards recent proposals to develop shared multinational projects in the use of nuclear energy technologies, in particular: a) siting of shared deep repositories for the safe disposal of high-level radioactive waste (RW) and spent nuclear fuel. In turn, arranging multinational facilities requires gaining a more complex consensus among all involved parties. Method: We propose an interdisciplinary synergetic approach to multilevel consensus building for the siting and construction of shared multinational repositories for RW deep disposal, based on the self-organization (SO) of various stakeholders, on chaos and fuzziness concepts, and on Ashby's principle of requisite variety. In the siting of a multinational repository an essential novel component of stakeholder consensus building appears, namely reaching political, social, economic and ecological consent among international partners, in addition to solving the whole set of intra-national consensus building items. An entire partnering country is considered a national stakeholder, represented by the national government, simultaneously seeking upward (international) and downward (intra-national) consensus in a psychologically stressed environment with possibly diverse political, economic and social interests. Main Results: The following inferences about building multilevel consensus are developed: 1) the basis of the synergetic approach to stakeholder interaction is informational SO, forming a knowledge-creating stakeholder community via cooperation and competition among individuals, public bodies/groups, companies and institutions; 2) building international stakeholder consensus could be promoted by activating and diversifying multilateral interactions between intra- and international stakeholders

  4. What is at stake in multi-scale approaches

    International Nuclear Information System (INIS)

    Jamet, Didier

    2008-01-01

    Full text of publication follows: Multi-scale approaches amount to analyzing physical phenomena at small space and time scales in order to model their effects at larger scales. This approach is very general in physics and engineering; one of the best examples of its success is certainly statistical physics, which allows one to recover classical thermodynamics and to determine its limits of application. Getting access to small-scale information aims at reducing model uncertainty, but it has a cost: fine-scale models may be more complex than larger-scale models, and their resolution may require the development of specific and possibly expensive methods, numerical simulation techniques and experiments. For instance, in applications related to nuclear engineering, the application of computational fluid dynamics instead of cruder models is a formidable engineering challenge because it requires resorting to high-performance computing. Likewise, in two-phase flow modeling, the techniques of direct numerical simulation, where all the interfaces are tracked individually and all turbulence scales are captured, are getting mature enough to be considered for averaged modeling purposes. However, resolving small-scale problems is a necessary step, but it is not sufficient in a multi-scale approach. An important modeling challenge is to determine how to treat small-scale data in order to get relevant information for larger-scale models. For some applications, such as single-phase turbulence or transfers in porous media, this up-scaling approach is known and is now used rather routinely. However, in two-phase flow modeling, the up-scaling approach is not as mature, and specific issues must be addressed that raise fundamental questions. This will be discussed and illustrated. (author)

  5. The Kyoto protocol: assessment and perspectives. Towards a new regime commensurate with the climate stakes

    International Nuclear Information System (INIS)

    Gautier, Celia

    2012-01-01

    This report proposes an analysis of the transition from the 'before-2012' climate regime to the 'post-2020' regime. It first gives an overview of the international stakes and context (lack of ambition in climate policy, the perspective of an international agreement from 2020). Then, the authors recall the history and achievements of the Kyoto protocol, which is the basis of the present climate policy regime. They propose an assessment of the actions performed by countries during the first period of the protocol, and focus on the elements of the present climate regime that are to be safeguarded. They analyse the weaknesses of the present regime and propose possible improvements for the future post-2020 climate regime.

  6. Diving in or Guarding the Tower: Mina Shaughnessy's Resistance and Capitulation to High-Stakes Writing Tests at City College

    Science.gov (United States)

    Molloy, Sean

    2012-01-01

    Mina Shaughnessy continues to exert powerful influences over Basic Writing practices, discourses and pedagogy thirty-five years after her death: Basic Writing remains in some ways trapped by Shaughnessy's legacy in what Min-Zhan Lu labeled as essentialism, accommodationism and linguistic innocence. High-stakes writing tests, a troubling hallmark…

  7. High Stakes Principalship--Sleepless Nights, Heart Attacks and Sudden Death Accountabilities: Reading Media Representations of the United States Principal Shortage.

    Science.gov (United States)

    Thomson, Pat; Blackmore, Jill; Sachs, Judyth; Tregenza, Karen

    2003-01-01

    Subjects a corpus of predominantly United States news articles to deconstructive narrative analysis and finds that the dominant media representation of principals' work is one of long hours, low salary, high stress, and sudden death from high stakes accountabilities. Notes that the media picture may perpetuate the problem, and that it is at odds…

  8. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data, a critical step before higher-level time series and velocity products can be reliably generated. Leveraging an advanced hybrid-cloud computing science data system for large-scale processing, machine learning approaches were added for automated analysis of various quality metrics. Machine learning-based feature training, cross-validation, and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics and improve the production quality of geodetic data products.
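
    As a minimal illustration of this kind of QA pipeline (not ARIA's actual code), the sketch below cross-validates a classifier that triages products from a few per-product quality metrics; the metric names, labels and data are all invented:

    ```python
    # Sketch of ML-based QA triage for SAR product quality metrics, with
    # cross-validation; features, labels and data are invented for illustration.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n = 500
    # Hypothetical per-product metrics: mean coherence, fraction of
    # unwrapping errors, residual ramp magnitude.
    X = np.column_stack([
        rng.uniform(0.1, 0.9, n),
        rng.uniform(0.0, 0.2, n),
        rng.exponential(0.5, n),
    ])
    # Hypothetical analyst labels: 1 = usable, 0 = reject.
    y = ((X[:, 0] > 0.4) & (X[:, 1] < 0.1)).astype(int)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"5-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```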

  9. Rater Training to Support High-Stakes Simulation-Based Assessments

    Science.gov (United States)

    Feldman, Moshe; Lazzara, Elizabeth H.; Vanderbilt, Allison A.; DiazGranados, Deborah

    2012-01-01

    Competency-based assessment and an emphasis on obtaining higher-level outcomes that reflect physicians' ability to demonstrate their skills has created a need for more advanced assessment practices. Simulation-based assessments provide medical education planners with tools to better evaluate the 6 Accreditation Council for Graduate Medical…

  10. Social Perception of Hydrogen Technologies: The View of Spanish Stakeholders; Percepcion Social de las Tecnologias del Hidrogeno. La Vision de los Stakeholders Espanoles

    Energy Technology Data Exchange (ETDEWEB)

    Ferri Anglada, S.

    2013-07-01

    This technical report presents an overview of the social perception and vision of a sample of Spanish stakeholders regarding hydrogen technologies. The study is based on the implementation of a survey combining both quantitative and qualitative data. An ad hoc electronic survey was designed to collect views and perceptions on several key factors regarding this innovative energy alternative. The group of experts participating in the study (N=130) comes mainly from research centers, universities and private companies. The survey addresses three major themes: expert views, social acceptability, and contextual factors of hydrogen technologies. The aim is to capture both the current and the future scene as viewed by the experts on hydrogen technologies, identifying key factors in terms of changes, uncertainties, obstacles and opportunities. The objective is to identify potential key features for the introduction, development, promotion, implementation, and large-scale deployment of an energy proposal that has been highly successful in countries such as Iceland, one of the pioneers in basing its economy on hydrogen technologies. To conclude, this report illustrates the positive engagement of a sample of Spanish stakeholders towards hydrogen technologies, which may prove vital in the transition towards the Hydrogen Economy in Spain. (Author)

  11. International Large-Scale Assessment Studies and Educational Policy-Making in Chile: Contexts and Dimensions of Influence

    Science.gov (United States)

    Cox, Cristián; Meckes, Lorena

    2016-01-01

    Since the 1990s, Chile has participated in all major international large-scale assessment studies (ILSAs) of the IEA and OECD, as well as the regional ones conducted by UNESCO in Latin America, after it had been involved in the very first international Science Study in 1970-1971. This article examines the various ways in which these studies have…

  12. Standard Errors for National Trends in International Large-Scale Assessments in the Case of Cross-National Differential Item Functioning

    Science.gov (United States)

    Sachse, Karoline A.; Haag, Nicole

    2017-01-01

    Standard errors computed according to the operational practices of international large-scale assessment studies such as the Programme for International Student Assessment's (PISA) or the Trends in International Mathematics and Science Study (TIMSS) may be biased when cross-national differential item functioning (DIF) and item parameter drift are…

  13. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
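
    A common way to quantify such a modulating influence, sketched here on a synthetic signal, is to low-pass filter the velocity into a large-scale component, extract the envelope of the residual small-scale part via the Hilbert transform, and correlate the two; the cutoff frequency and the signal itself are assumptions:

    ```python
    # Sketch of the amplitude-modulation diagnostic: correlate the large-scale
    # velocity with the envelope of the small-scale fluctuations. Synthetic
    # signal; the cutoff separating the scales is an assumption.
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    rng = np.random.default_rng(3)
    fs, n = 1000.0, 2 ** 14
    t = np.arange(n) / fs
    large = np.sin(2 * np.pi * 2.0 * t)                       # large-scale motion
    small = (1 + 0.5 * large) * rng.standard_normal(n) * 0.2  # modulated small scales
    u = large + small

    b, a = butter(4, 10.0 / (fs / 2), btype="low")            # assumed 10 Hz cutoff
    u_large = filtfilt(b, a, u)
    u_small = u - u_large

    envelope = np.abs(hilbert(u_small))
    env_large = filtfilt(b, a, envelope)                      # low-pass the envelope
    r = np.corrcoef(u_large, env_large)[0, 1]
    print(f"amplitude-modulation coefficient R = {r:.2f}")
    ```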

  14. Modeling the impact of large-scale energy conversion systems on global climate

    International Nuclear Information System (INIS)

    Williams, J.

    There are three energy options which could satisfy a projected energy requirement of about 30 TW: the solar, nuclear and (to a lesser extent) coal options. Climate models can be used to assess the impact of large-scale deployment of these options. The impact of waste heat has been assessed using energy balance models and general circulation models (GCMs). Results suggest that the impacts are significant when the heat input is very high, and studies of more realistic scenarios are required. Energy balance models, radiative-convective models and a GCM have been used to study the impact of doubling the atmospheric CO2 concentration. State-of-the-art models estimate a surface temperature increase of 1.5-3.0 °C with large amplification near the poles, but much uncertainty remains. Very few model studies have been made of the impact of particles on global climate; more information on the characteristics of particle input is required. The impact of large-scale deployment of solar energy conversion systems has received little attention, but model studies suggest that large-scale changes in surface characteristics associated with such systems (surface heat balance, roughness and hydrological characteristics, and ocean surface temperature) could have significant global climatic effects. (Auth.)

  15. Using a Procedure Based on Item Response Theory to Evaluate Classification Consistency Indices in the Practice of Large-Scale Assessment

    Directory of Open Access Journals (Sweden)

    Shanshan Zhang

    2017-09-01

    In spite of the growing interest in methods for evaluating classification consistency (CC) indices, few studies have applied these methods in the practice of large-scale educational assessment. In addition, only a few studies have considered the influence of practical factors, for example the examinee ability distribution, the cut score location and the score scale, on the performance of CC indices. Using Lee's newly developed procedure based on item response theory (IRT), the main purpose of this study is to investigate the performance of CC indices when practical factors are taken into consideration. A simulation study and an empirical study were conducted under comprehensive conditions. Results suggested that with a negatively skewed ability distribution, the CC indices were larger than with other distributions. Interactions occurred among ability distribution, cut score location, and score scale. Consequently, Lee's IRT procedure is reliable for use in large-scale educational assessment, although reported indices should be treated with caution because testing conditions can vary considerably.
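
    The quantity at stake is easy to approximate by simulation. The sketch below is a simplified stand-in for, not a reproduction of, Lee's procedure: it estimates a CC index under a 2PL IRT model as the probability that two independent parallel administrations classify an examinee on the same side of an assumed raw-score cut:

    ```python
    # Simplified simulation of a classification consistency (CC) index under a
    # 2PL IRT model; item parameters, cut score and sample sizes are assumed.
    import numpy as np

    rng = np.random.default_rng(4)
    n_items, n_examinees = 40, 5000
    a = rng.lognormal(0.0, 0.3, n_items)          # discriminations
    b = rng.normal(0.0, 1.0, n_items)             # difficulties
    theta = rng.normal(0.0, 1.0, n_examinees)     # ability distribution
    cut = 24                                      # assumed raw-score cut

    def sum_score(theta):
        p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))
        return (rng.uniform(size=p.shape) < p).sum(axis=1)

    pass1 = sum_score(theta) >= cut               # first administration
    pass2 = sum_score(theta) >= cut               # independent parallel form
    cc = np.mean(pass1 == pass2)
    print(f"classification consistency: {cc:.3f}")
    ```

    Swapping the normal theta draw for a skewed distribution lets one explore the kind of sensitivity to the ability distribution that the study reports.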

  16. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.

  17. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property in the event of accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates, for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  18. Co-Cure-Ply Resins for High Performance, Large-Scale Structures

    Data.gov (United States)

    National Aeronautics and Space Administration — Large-scale composite structures are commonly joined by secondary bonding of molded-and-cured thermoset components. This approach may result in unpredictable joint...

  19. SQDFT: Spectral Quadrature method for large-scale parallel O(N) Kohn-Sham calculations at high temperature

    Science.gov (United States)

    Suryanarayana, Phanish; Pratapa, Phanisri P.; Sharma, Abhiraj; Pask, John E.

    2018-03-01

    We present SQDFT: a large-scale parallel implementation of the Spectral Quadrature (SQ) method for O(N) Kohn-Sham Density Functional Theory (DFT) calculations at high temperature. Specifically, we develop an efficient and scalable finite-difference implementation of the infinite-cell Clenshaw-Curtis SQ approach, in which results for the infinite crystal are obtained by expressing quantities of interest as bilinear forms or sums of bilinear forms, which are then approximated by spatially localized Clenshaw-Curtis quadrature rules. We demonstrate the accuracy of SQDFT by showing systematic convergence of energies and atomic forces with respect to SQ parameters to reference diagonalization results, and convergence with discretization to established planewave results, for both metallic and insulating systems. We further demonstrate that SQDFT achieves excellent strong and weak parallel scaling on computer systems consisting of tens of thousands of processors, with near perfect O(N) scaling with system size and wall times as low as a few seconds per self-consistent field iteration. Finally, we verify the accuracy of SQDFT in large-scale quantum molecular dynamics simulations of aluminum at high temperature.
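
    The Clenshaw-Curtis rule at the heart of the SQ approach is straightforward to construct. This sketch builds generic nodes and weights on [-1, 1] from the standard cosine-series formula (even n only); it illustrates the quadrature itself, not SQDFT's spatially localized implementation:

    ```python
    # Generic Clenshaw-Curtis nodes and weights on [-1, 1]; exact for
    # polynomials of degree <= n. Even n assumed for the weight formula.
    import numpy as np

    def clenshaw_curtis(n):
        assert n % 2 == 0 and n >= 2
        k = np.arange(n + 1)
        x = np.cos(np.pi * k / n)              # Chebyshev extreme points
        w = np.zeros(n + 1)
        for j in range(n + 1):
            total = 0.0
            for m in range(1, n // 2 + 1):
                bm = 1.0 if m == n // 2 else 2.0
                total += bm * np.cos(2.0 * m * np.pi * j / n) / (4.0 * m * m - 1.0)
            cj = 1.0 if j in (0, n) else 2.0
            w[j] = (cj / n) * (1.0 - total)
        return x, w

    x, w = clenshaw_curtis(16)
    print(w @ np.exp(x))   # ~ integral of e^x over [-1, 1] = e - 1/e ~ 2.3504
    ```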

  20. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter with recording on a laptop. The results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety of the seismic effect was evaluated against the permissible value of vibration velocity. For cases where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.
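
    The safety check itself reduces to comparing the peak particle velocity of a recorded trace against a permissible limit. A minimal sketch on a synthetic record follows; the sampling rate, waveform and the 2 cm/s limit are illustrative assumptions, not the study's values:

    ```python
    # Sketch: peak particle velocity (PPV) check against an assumed limit.
    import numpy as np

    rng = np.random.default_rng(5)
    fs = 1000.0                                   # sampling rate, Hz (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)
    # Synthetic ground-velocity record (cm/s): decaying oscillation + noise.
    v = (1.5 * np.exp(-3.0 * t) * np.sin(2 * np.pi * 12.0 * t)
         + 0.05 * rng.standard_normal(t.size))

    ppv = np.max(np.abs(v))
    limit = 2.0                                   # assumed permissible value, cm/s
    print(f"PPV = {ppv:.2f} cm/s -> {'OK' if ppv <= limit else 'exceeds limit'}")
    ```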

  1. Assessment of clean development mechanism potential of large-scale energy efficiency measures in heavy industries

    International Nuclear Information System (INIS)

    Hayashi, Daisuke; Krey, Matthias

    2007-01-01

    This paper assesses the clean development mechanism (CDM) potential of large-scale energy efficiency measures in selected heavy industries (iron and steel, cement, aluminium, pulp and paper, and ammonia), taking India and Brazil as examples of CDM project host countries. We have chosen two criteria for identifying the CDM potential of each energy efficiency measure: (i) the emission reduction volume (in CO2e) that can be expected from the measure and (ii) the likelihood of the measure passing the additionality test of the CDM Executive Board (EB) when submitted as a proposed CDM project activity. The paper shows that the CDM potential of large-scale energy efficiency measures strongly depends on the project-specific and country-specific context. In particular, technologies for the iron and steel industry (coke dry quenching (CDQ), top pressure recovery turbine (TRT), and basic oxygen furnace (BOF) gas recovery), the aluminium industry (point feeder prebake (PFPB) smelter), and the pulp and paper industry (continuous digester technology) offer promising CDM potential

  2. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in the application of a code for performance assessment, and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale physical hydrology model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  3. French government to trim direct stake in Total

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that the French government has decided to slash its direct stake in partly state owned oil company Total to 5% from 31.7%, a surprise move expected to raise 10 billion francs ($1.8 billion). At the same time, other state owned entities will be asked to boost their combined 2.2% stake in Total to 10%, leaving the government with a net 15% interest in Total vs. the current 34%. Initially, state owned insurance companies Groupe des Assurances Nationales and Assurances Generale de France will be asked to hike their stakes in Total, but others could be asked to join if needed to meet the 10% target. The government the its phase-down of participation in Total, established in 1924 to manage French interests in Iraq Petroleum Co., was prompted by the evolution of the oil context, which differs greatly from what had prompted a significant stake of the state in Total's capital

  4. Revising the potential of large-scale Jatropha oil production in Tanzania: An economic land evaluation assessment

    International Nuclear Information System (INIS)

    Segerstedt, Anna; Bobert, Jans

    2013-01-01

    Following up the rather sobering results of the biofuels boom in Tanzania, we analyze the preconditions that would make large-scale oil production from the feedstock Jatropha curcas viable. We do this by employing an economic land evaluation approach; first, we estimate the physical land suitability and the necessary inputs to reach certain amounts of yields. Subsequently, we estimate costs and benefits for different input-output levels. Finally, to incorporate the increased awareness of sustainability in the export sector, we introduce also certification criteria. Using data from an experimental farm in Kilosa, we find that high yields are crucial for the economic feasibility and that they can only be obtained on good soils at high input rates. Costs of compliance with certification criteria depend on site specific characteristics such as land suitability and precipitation. In general, both domestic production and (certified) exports are too expensive to be able to compete with conventional diesel/rapeseed oil from the EU. Even though the crop may have potential for large scale production as a niche product, there is still a lot of risk involved and more experimental research is needed. - Highlights: ► We use an economic land evaluation analysis to reassess the potential of large-scale Jatropha oil. ► High yields are possible only at high input rates and for good soil qualities. ► Production costs are still too high to break even on the domestic and export market. ► More research is needed to stabilize yields and improve the oil content. ► Focus should be on broadening our knowledge-base rather than promoting new Jatropha investments
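
    The economic core of such a land evaluation is a discounted cost-benefit calculation per suitability class. This sketch computes the net present value of one hectare under assumed yield build-up, price and cost figures (all invented, not the study's data):

    ```python
    # Sketch of the cost-benefit core of an economic land evaluation: NPV of a
    # plantation hectare under assumed yields, prices and costs (illustrative).
    import numpy as np

    def npv(cash_flows, rate):
        years = np.arange(len(cash_flows))
        return np.sum(np.asarray(cash_flows) / (1.0 + rate) ** years)

    years = 20
    # Assumed seed-yield build-up (t/ha): none for 3 years, ramp, then plateau.
    yield_t_ha = np.r_[np.zeros(3), np.linspace(1.0, 4.0, 5),
                       np.full(years - 8, 4.0)]
    price_per_t = 220.0          # assumed seed price, USD/t
    establishment = 900.0        # year-0 establishment cost, USD/ha
    annual_cost = 350.0          # inputs + labour at a high-input rate, USD/ha

    cash = yield_t_ha * price_per_t - annual_cost
    cash[0] -= establishment
    print(f"NPV/ha at 10%: {npv(cash, 0.10):.0f} USD")
    ```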

  5. Applying Kane's Validity Framework to a Simulation Based Assessment of Clinical Competence

    Science.gov (United States)

    Tavares, Walter; Brydges, Ryan; Myre, Paul; Prpic, Jason; Turner, Linda; Yelle, Richard; Huiskamp, Maud

    2018-01-01

    Assessment of clinical competence is complex and inference based. Trustworthy and defensible assessment processes must have favourable evidence of validity, particularly where decisions are considered high stakes. We aimed to organize, collect and interpret validity evidence for a high stakes simulation based assessment strategy for certifying…

  6. Free Global Dsm Assessment on Large Scale Areas Exploiting the Potentialities of the Innovative Google Earth Engine Platform

    Science.gov (United States)

    Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.

    2017-05-01

    The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In particular, in this work, the geometric accuracy of the two most used nearly-global free DSMs (SRTM and ASTER) has been evaluated on the territories of four American States (Colorado, Michigan, Nevada, Utah) and one Italian Region (Trentino Alto-Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphology, land covers and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated through computation of standard statistical parameters, both at global scale (considering the whole State/Region) and as a function of terrain morphology using several slope classes. The geometric accuracy in terms of standard deviation and NMAD ranges, for SRTM, from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the GEE potential for performing DSM assessment on a global scale.

  7. FREE GLOBAL DSM ASSESSMENT ON LARGE SCALE AREAS EXPLOITING THE POTENTIALITIES OF THE INNOVATIVE GOOGLE EARTH ENGINE PLATFORM

    Directory of Open Access Journals (Sweden)

    A. Nascetti

    2017-05-01

    The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In particular, in this work, the geometric accuracy of the two most used nearly-global free DSMs (SRTM and ASTER) has been evaluated on the territories of four American States (Colorado, Michigan, Nevada, Utah) and one Italian Region (Trentino Alto-Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphology, land covers and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated through computation of standard statistical parameters, both at global scale (considering the whole State/Region) and as a function of terrain morphology using several slope classes. The geometric accuracy in terms of standard deviation and NMAD ranges, for SRTM, from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the GEE potential for performing DSM assessment on a global scale.
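
    Both versions of this record rely on the same two accuracy metrics. The sketch below computes the standard deviation and NMAD of DSM-minus-reference elevation differences, overall and per slope class, on synthetic errors; the slope-class breaks are assumptions:

    ```python
    # Sketch of the two DSM accuracy metrics used: standard deviation and
    # NMAD of elevation differences, overall and per slope class.
    import numpy as np

    def nmad(errors):
        """Normalized median absolute deviation (robust sigma estimate)."""
        return 1.4826 * np.median(np.abs(errors - np.median(errors)))

    rng = np.random.default_rng(6)
    dh = rng.normal(0.0, 4.0, 10_000) + rng.standard_t(3, 10_000)  # synthetic errors
    slope = rng.uniform(0.0, 60.0, 10_000)                         # degrees

    print(f"all: std={dh.std():.2f} m, NMAD={nmad(dh):.2f} m")
    for lo, hi in [(0, 10), (10, 25), (25, 60)]:                   # assumed classes
        sel = (slope >= lo) & (slope < hi)
        print(f"slope {lo}-{hi} deg: std={dh[sel].std():.2f} m, "
              f"NMAD={nmad(dh[sel]):.2f} m")
    ```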

  8. Assessment of climate change impacts on rainfall using large scale ...

    Indian Academy of Sciences (India)

    Many of the applied techniques in water resources management can be directly or indirectly influenced by … is based on large-scale climate signal data around the world. In order … predictand relationships are often very complex. … constraints to solve the optimization problem. … social, and environmental sustainability.

  9. Characterization of laser-induced plasmas as a complement to high-explosive large-scale detonations

    Directory of Open Access Journals (Sweden)

    Clare Kimblin

    2017-09-01

    Experimental investigations into the characteristics of laser-induced plasmas indicate that LIBS provides a relatively inexpensive and easily replicable laboratory technique to isolate and measure reactions germane to understanding aspects of high-explosive detonations under controlled conditions. Spectral signatures and derived physical parameters following laser ablation of aluminum, graphite and laser-sparked air are examined as they relate to those observed following detonation of high explosives and as they relate to shocked air. Laser-induced breakdown spectroscopy (LIBS) reliably correlates reactions involving atomic Al and aluminum monoxide (AlO) with respect to both emission spectra and temperatures, as compared to small- and large-scale high-explosive detonations. Atomic Al and AlO resulting from laser ablation and a cited small-scale study decay within ∼10⁻⁵ s, roughly 100 times faster than the Al and AlO decay rates (∼10⁻³ s) observed following the large-scale detonation of an Al-encased explosive. Temperatures and species produced in laser-sparked air are compared to those produced with laser-ablated graphite in air. With graphite present, CN is dominant relative to N2+. In studies where the height of the ablating laser's focus was altered relative to the surface of the graphite substrate, CN concentration was found to decrease with laser focus below the graphite surface, indicating that laser intensity is a critical factor in the production of CN via reactive nitrogen.

  10. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    The geotechnical challenges for safe slope design in large-scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden-to-ore ratio and therefore dramatically improve the economics of the operation, while large-scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.
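
    A minimal sketch of the kind of integration advocated here: join daily rainfall with slope-displacement monitoring and flag days on which both exceed alert thresholds (data and thresholds are invented):

    ```python
    # Sketch: combine daily rainfall with displacement rates and flag days
    # where both exceed assumed alert thresholds. Illustrative data only.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(7)
    days = pd.date_range("2015-01-01", periods=120, freq="D")
    df = pd.DataFrame({
        "rain_mm": rng.gamma(0.6, 8.0, days.size),               # precipitation
        "disp_mm_day": np.abs(rng.normal(0.4, 0.5, days.size)),  # prism monitoring
    }, index=days)

    RAIN_ALERT, DISP_ALERT = 25.0, 1.5            # assumed thresholds
    alerts = df[(df.rain_mm > RAIN_ALERT) & (df.disp_mm_day > DISP_ALERT)]
    print(f"{len(alerts)} combined-alert days")
    ```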

  11. Enabling High Performance Large Scale Dense Problems through KBLAS

    KAUST Repository

    Abdelfattah, Ahmad

    2014-05-04

    KBLAS (KAUST BLAS) is a small library that provides highly optimized BLAS routines on systems accelerated with GPUs. KBLAS is entirely written in CUDA C and targets NVIDIA GPUs with compute capability 2.0 (Fermi) or higher. The current focus is on level-2 BLAS routines, namely the general matrix-vector multiplication (GEMV) kernel and the symmetric/hermitian matrix-vector multiplication (SYMV/HEMV) kernel. KBLAS provides these two kernels in all four precisions (s, d, c, and z), with support for multi-GPU systems. Through advanced optimization techniques that target latency hiding and pushing memory bandwidth to the limit, KBLAS outperforms state-of-the-art kernels by 20-90%. Competitors include CUBLAS-5.5, MAGMABLAS-1.4.0, and CULA R17. The SYMV/HEMV kernel from KBLAS has been adopted by NVIDIA and should appear in CUBLAS-6.0. KBLAS has been used in large-scale simulations of multi-object adaptive optics.
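
    KBLAS itself is CUDA C; as a language-consistent illustration of what the level-2 GEMV operation computes (y := alpha*A*x + beta*y), this sketch calls the reference BLAS through SciPy's bindings and checks it against plain NumPy:

    ```python
    # Level-2 GEMV (y := alpha*A@x + beta*y) via SciPy's reference BLAS
    # bindings; a CPU illustration of the operation KBLAS accelerates on GPUs.
    import numpy as np
    from scipy.linalg.blas import dgemv

    rng = np.random.default_rng(8)
    n = 2048
    A = np.asfortranarray(rng.standard_normal((n, n)))  # column-major for BLAS
    x = rng.standard_normal(n)
    y = rng.standard_normal(n)

    out = dgemv(alpha=1.0, a=A, x=x, beta=0.5, y=y)
    assert np.allclose(out, A @ x + 0.5 * y)
    ```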

  12. Auditing for Score Inflation Using Self-Monitoring Assessments: Findings from Three Pilot Studies

    Science.gov (United States)

    Koretz, Daniel; Jennings, Jennifer L.; Ng, Hui Leng; Yu, Carol; Braslow, David; Langi, Meredith

    2016-01-01

    Test-based accountability often produces score inflation. Most studies have evaluated inflation by comparing trends on a high-stakes test and a lower stakes audit test. However, Koretz and Beguin (2010) noted weaknesses of audit tests and suggested self-monitoring assessments (SMAs), which incorporate audit items into high-stakes tests. This…

  13. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes, using numerical simulations of MHD turbulence for Pm = 20 (SSD) and for Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.
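
    A building block of such shell-to-shell analyses is binning spectral energy into wavenumber shells. The sketch below does this for synthetic velocity and magnetic fields; the full transfer functions additionally require the nonlinear interaction terms:

    ```python
    # Bin kinetic and magnetic energy into wavenumber shells from Fourier
    # fields. Synthetic data; computing T(k, k') itself needs the nonlinear terms.
    import numpy as np

    n = 64
    rng = np.random.default_rng(9)
    u = rng.standard_normal((3, n, n, n))          # velocity components
    b = rng.standard_normal((3, n, n, n))          # magnetic components

    k1d = np.fft.fftfreq(n) * n
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    shells = np.rint(np.sqrt(kx**2 + ky**2 + kz**2)).astype(int)

    def shell_spectrum(field):
        e = 0.5 * sum(np.abs(np.fft.fftn(c) / c.size) ** 2 for c in field)
        return np.bincount(shells.ravel(), weights=e.ravel())

    Eu, Eb = shell_spectrum(u), shell_spectrum(b)
    print(Eu[:5], Eb[:5])
    ```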

  14. A large scale field experiment in the Amazon basin (LAMBADA/BATERISTA)

    NARCIS (Netherlands)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C.

    1995-01-01

    A description is given of a large-scale field experiment planned in the Amazon basin, aimed at assessing the large-scale balances of energy, water and carbon dioxide. The embedding of this experiment in global change programmes is described, viz. the Biospheric Aspects of the Hydrological Cycle

  15. CO2 capture and sequestration. Technological and social stakes in France

    International Nuclear Information System (INIS)

    Minh, Ha-Duong; Naceur, Chaabane

    2010-01-01

    An industrial technology already tested in Norway, North America and Algeria, CO2 capture and sequestration (CCS) consists in collecting carbon dioxide and injecting it into deep geological traps. This solution, which contributes to the fight against climate change, has aroused growing interest in France in the wake of the Grenelle Environnement meetings. At a time when big research and demonstration programs are being launched everywhere in Europe, this book proposes for the first time a status report on the knowledge gathered so far by the specialists of the IPG (Institut de Physique du Globe), the BRGM (Bureau de Recherches Géologiques et Minières), the IFP (French Petroleum Institute), and the CNRS (National Center for Scientific Research). It takes stock of the stakes of this new technology in France. Beyond the technical discussions between experts, the book deals with the stakes of external communication and open public debate. The points of view of the different intervening parties (research organizations, environmental non-governmental organizations, the European lobby (Zero Emission Platform), citizens, journalists and companies) are compared. A large part of the book aims at shedding light on the question of the social acceptability of this technology. In addition to a synthesis of the available literature, it presents and analyses two participation instruments: a dialogue workshop and a geographical information web site. Contents: 1 - scientific stakes of CO2 geologic sequestration; 2 - technical stakes; 3 - economical stakes; 4 - risks and public opinion; 5 - social acceptability and territorial planning, the wind energy experience; 6 - the point of view of the Réseau Action Climat-France (RAC-F); 7 - citizens' recommendations; 8 - the comeback of coal on the international energy scene; 9 - some consensus from a 'dialogue workshop': the social acceptability of CCS; 10 - bibliographic synthesis about the social acceptability of CCS; 11 - METSTOR, interactive mapping at

  16. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module, where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  17. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land 'grabbing' and historical large… sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in 'formal' large-scale and 'informal' small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land… commands a higher wage than 'formal' large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role

  18. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they concern species limitations (market, biological and technological), site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  19. Assessing Landscape Scale Wildfire Exposure for Highly Valued Resources in a Mediterranean Area

    Science.gov (United States)

    Alcasena, Fermín J.; Salis, Michele; Ager, Alan A.; Arca, Bachisio; Molina, Domingo; Spano, Donatella

    2015-05-01

    We used a fire simulation modeling approach to assess landscape-scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km² located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland-urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. We then linked fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around HVR. The results highlighted a large variation in burn probability and fire intensity in the vicinity of HVRs, and allowed us to identify the areas most exposed to wildfires and thus to a higher potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and the historically based ignition pattern were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning and people's safety in the vicinity of HVRs.

  20. Assessing landscape scale wildfire exposure for highly valued resources in a Mediterranean area.

    Science.gov (United States)

    Alcasena, Fermín J; Salis, Michele; Ager, Alan A; Arca, Bachisio; Molina, Domingo; Spano, Donatella

    2015-05-01

    We used a fire simulation modeling approach to assess landscape-scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km² located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland-urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. We then linked fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around HVR. The results highlighted a large variation in burn probability and fire intensity in the vicinity of HVRs, and allowed us to identify the areas most exposed to wildfires and thus to a higher potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and the historically based ignition pattern were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning and people's safety in the vicinity of HVRs.
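
    A core output behind both versions of this record is the burn probability map: the fraction of simulated fires that burn each pixel. The sketch below mimics this with random circular fire perimeters purely for illustration (an MTT simulator such as FlamMap produces the real ones):

    ```python
    # Sketch of a burn-probability map: fraction of simulated fires burning
    # each pixel. Random disks stand in for simulated fire perimeters.
    import numpy as np

    rng = np.random.default_rng(10)
    n, n_fires = 200, 5000
    yy, xx = np.mgrid[0:n, 0:n]
    burn_count = np.zeros((n, n))

    for _ in range(n_fires):
        cx, cy = rng.uniform(0, n, 2)          # ignition location
        r = rng.gamma(2.0, 6.0)                # final fire-size proxy
        burn_count += ((xx - cx) ** 2 + (yy - cy) ** 2) <= r ** 2

    burn_probability = burn_count / n_fires
    print(f"max burn probability: {burn_probability.max():.3f}")
    ```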

  1. Meteorological impact assessment of possible large scale irrigation in Southwest Saudi Arabia

    NARCIS (Netherlands)

    Maat, ter H.W.; Hutjes, R.W.A.; Ohba, R.; Ueda, H.; Bisselink, B.; Bauer, T.

    2006-01-01

    On continental to regional scales, feedbacks between land-use and land-cover change and climate have been widely documented over the past 10-15 years. In the present study we explore the possibility that vegetation changes over much smaller areas may also affect local precipitation regimes. Large scale

  2. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population growth and economic development would strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Because irrigation management in response to subseasonal variability in weather and crop growth varies by region and crop, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoff simulated by the land surface sub-model is input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
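
    The Markov Chain Monte Carlo calibration of region-specific parameters can be illustrated with a generic random-walk Metropolis sampler; the simulator interface, Gaussian error model, and step size below are assumptions for illustration, not the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(theta, simulate, observed, sigma=1.0):
    # Gaussian error model between simulated and observed yields (assumed).
    resid = observed - simulate(theta)
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis(simulate, observed, theta0, n_iter=5000, step=0.05):
    """Random-walk Metropolis over region-specific crop parameters."""
    theta = np.asarray(theta0, dtype=float)
    logp = log_likelihood(theta, simulate, observed)
    chain = [theta.copy()]
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.shape)
        logp_prop = log_likelihood(prop, simulate, observed)
        if np.log(rng.random()) < logp_prop - logp:   # accept/reject
            theta, logp = prop, logp_prop
        chain.append(theta.copy())
    return np.array(chain)
```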

  3. Multilevel Latent Class Analysis for Large-Scale Educational Assessment Data: Exploring the Relation between the Curriculum and Students' Mathematical Strategies

    Science.gov (United States)

    Fagginger Auer, Marije F.; Hickendorff, Marian; Van Putten, Cornelis M.; Béguin, Anton A.; Heiser, Willem J.

    2016-01-01

    A first application of multilevel latent class analysis (MLCA) to educational large-scale assessment data is demonstrated. This statistical technique addresses several of the challenges that assessment data offers. Importantly, MLCA allows modeling of the often ignored teacher effects and of the joint influence of teacher and student variables.…

  4. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  5. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  6. RWE sells Nafta stake

    International Nuclear Information System (INIS)

    Janoska, J.

    2004-01-01

    At year-end 2000, state-owned Slovensky plynarensky priemysel (SPP) signed a Memorandum of Understanding that set the conditions for the German concern RWE to purchase a 40 % stake in Nafta Gbely. This partnership agreement was meant to grant RWE participation in the management of the gas storage operator, which is controlled by SPP, and allow RWE to increase the use of Nafta's capacities. But in the three years since then, these objectives were not met. RWE representatives were not appointed to the Nafta Board and not a single cubic meter of RWE gas was stored at Nafta. RWE denied that it was considering leaving Nafta. Control of Nafta and SPP gradually passed to RWE's major competitors. The attitude of RWE only changed last week, when it sold its stake in Nafta to Ruhrgas under favourable conditions. Although Ruhrgas already more or less controlled Nafta via SPP, it paid RWE 62.22 million Eur for its stake. This represents a price per share of about 12.44 Eur, more than RWE paid over two years ago and about double the market price. One of the possible reasons why RWE decided to leave the company is, apart from uncertainty surrounding future participation in the company management, uncertainty regarding whether there is a profit to be made on future dividends. Another reason may be the joint operation of both rivals in a number of companies. And so the Nafta trade may be part of the establishment of areas of influence.

  7. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input into the catchment models. A long-term simulation of this combined system enables the derivation of very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data at 528 stations covering not only Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large scale approach overcomes the several
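
    With a 10,000-year synthetic series, flood quantiles can be read directly from the empirical distribution of annual maxima rather than extrapolated from a fitted distribution; a minimal sketch (variable names assumed):

```python
import numpy as np

def flood_quantile(daily_q, days_per_year=365, return_period=100.0):
    """Empirical T-year flood from a long simulated daily discharge series.

    With 10,000 simulated years, the 100-year flood can be taken directly
    from the ranked annual maxima instead of fitting a distribution.
    """
    n_years = len(daily_q) // days_per_year
    amax = daily_q[: n_years * days_per_year].reshape(n_years, -1).max(axis=1)
    prob_non_exceedance = 1.0 - 1.0 / return_period
    return np.quantile(amax, prob_non_exceedance)
```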

  8. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of 1000s of processors, to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  9. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  10. High-Speed Interrogation for Large-Scale Fiber Bragg Grating Sensing.

    Science.gov (United States)

    Hu, Chenyuan; Bai, Wei

    2018-02-24

    A high-speed interrogation scheme for large-scale fiber Bragg grating (FBG) sensing arrays is presented. This technique employs parallel computing and pipeline control to modulate the incident light and demodulate the reflected sensing signal. One electro-optic modulator (EOM) and one semiconductor optical amplifier (SOA) were used to generate a phase delay to filter the reflected spectrum from multiple candidate FBGs with the same optical path difference (OPD). Experimental results showed that the fastest interrogation delay time for the proposed method was only about 27.2 μs for a single FBG interrogation, and the system scanning period was limited only by the optical transmission delay in the sensing fiber, owing to the multiple simultaneous central wavelength calculations. Furthermore, the proposed FPGA-based technique had a verified FBG wavelength demodulation stability of ±1 pm without averaging.

  11. Large Scale Evapotranspiration Estimates: An Important Component in Regional Water Balances to Assess Water Availability

    Science.gov (United States)

    Garatuza-Payan, J.; Yepez, E. A.; Watts, C.; Rodriguez, J. C.; Valdez-Torres, L. C.; Robles-Morua, A.

    2013-05-01

    Water security can be defined as the reliable supply, in quantity and quality, of water to help sustain future populations and maintain ecosystem health and productivity. Water security is rapidly declining in many parts of the world due to population growth, drought, climate change, salinity, pollution, land use change, over-allocation and over-utilization, among other issues. Governmental offices (such as the Comision Nacional del Agua in Mexico, CONAGUA) require and conduct studies to estimate reliable water balances at regional or continental scales in order to provide reasonable assessments of the amount of water that can be provided (from surface or ground water sources) to supply all human needs while maintaining natural vegetation, on an operational basis and, more importantly, under disturbances such as droughts. Large-scale estimates of evapotranspiration (ET), a critical component of the water cycle, are needed for a better comprehension of the hydrological cycle at large scales; in most water balances ET is left as the residual. For operational purposes, such water balance estimates cannot rely on ET measurements, since these do not exist; they should be simple and require the least ground information possible, information that is often scarce or does not exist at all. Given this limitation, the use of remotely sensed data to estimate ET could supplement the lack of ground information, particularly in remote regions. In this study, a simple method based on the Makkink equation is used to estimate ET for large areas at high spatial resolution (1 km). The Makkink model used here is forced with three remotely sensed datasets. First, the model uses solar radiation estimates obtained from the Geostationary Operational Environmental Satellite (GOES); second, the model uses an Enhanced Vegetation Index (EVI) obtained from the Moderate-resolution Imaging Spectroradiometer (MODIS), normalized to get an estimate of vegetation amount and land use, which was
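
    For reference, the Makkink model estimates reference evapotranspiration from solar radiation and air temperature alone; a commonly tabulated form is ET = 0.65 (Δ/(Δ+γ)) (Rs/λ). A sketch (the coefficient and constants are the standard textbook values, not necessarily those used in this study):

```python
import numpy as np

def makkink_et(rs_mj, t_air_c, gamma=0.066):
    """Makkink reference evapotranspiration (mm/day).

    rs_mj   : daily solar radiation [MJ m-2 day-1], e.g. from GOES
    t_air_c : mean air temperature [deg C]
    gamma   : psychrometric constant [kPa/K] (assumed sea-level value)
    """
    # Slope of the saturation vapour pressure curve [kPa/K] (FAO-56 form).
    es = 0.6108 * np.exp(17.27 * t_air_c / (t_air_c + 237.3))
    delta = 4098.0 * es / (t_air_c + 237.3) ** 2
    lam = 2.45  # latent heat of vaporization [MJ/kg]
    return 0.65 * delta / (delta + gamma) * rs_mj / lam
```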

  12. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube made with a large-scale hollow capsule, and long-length claddings, were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacture of the large-scale mother tube, with dimensions of 32 mm OD, 21 mm ID, and 2 m length, was successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, a manufacturing process for mother tubes using large-scale hollow capsules is promising. (author)

  13. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows that, so far, the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  14. A long-term, continuous simulation approach for large-scale flood risk assessments

    Science.gov (United States)

    Falter, Daniela; Schröter, Kai; Viet Dung, Nguyen; Vorogushyn, Sergiy; Hundecha, Yeshewatesfa; Kreibich, Heidi; Apel, Heiko; Merz, Bruno

    2014-05-01

    The Regional Flood Model (RFM) is a process-based model cascade developed for flood risk assessments of large-scale basins. RFM consists of four model parts: the rainfall-runoff model SWIM, a 1D channel routing model, a 2D hinterland inundation model and the flood loss estimation model for residential buildings FLEMOps+r. The model cascade recently underwent a proof-of-concept study at the Elbe catchment (Germany) to demonstrate that flood risk assessments based on a continuous simulation approach, including rainfall-runoff, hydrodynamic and damage estimation models, are feasible for large catchments. The results of this study indicated that uncertainties are significant, especially for hydrodynamic simulations, basically as a consequence of low data quality and the disregard of dike breaches. Therefore, RFM was applied with a refined hydraulic model setup for the Elbe tributary Mulde. The study area, the Mulde catchment, comprises about 6,000 km2 and 380 river-km. The inclusion of more reliable information on overbank cross-sections and dikes considerably improved the results. For the application of RFM to flood risk assessments, long-term climate input data are needed to drive the model chain. This model input was provided by a multi-site, multi-variate weather generator that produces sets of synthetic meteorological data reproducing the current climate statistics. The data set comprises 100 realizations of 100 years of meteorological data. With the proposed continuous simulation approach of RFM, we simulated a virtual period of 10,000 years covering the entire flood risk chain, including hydrological, 1D/2D hydrodynamic and flood damage estimation models. This provided a record of around 2,000 inundation events affecting the study area, with spatially detailed information on inundation depths and damage to residential buildings at a resolution of 100 m. This serves as the basis for a spatially consistent flood risk assessment for the Mulde catchment presented in

  15. Sodium-immersed self-cooled electromagnetic pump design and development of a large-scale coil for high temperature

    International Nuclear Information System (INIS)

    Oto, Akihiro; Naohara, Nobuyuki; Ishida, Masayoshi; Katsuki, Kenji; Kumazawa, Ryouji

    1995-01-01

    A sodium-immersed, self-cooled electromagnetic (EM) pump was recently studied as a prospective innovative technology to simplify a fast breeder reactor plant system. An EM pump for primary-pump service was designed, and its structural concept and system performance were clarified. For the flow control method, a constant voltage/frequency method was preferable from the point of view of pump performance and efficiency. The insulation life was tested on a large-scale coil at high temperature as part of the development of a large-capacity EM pump. No mechanical or electrical damage was observed, and the insulation performance was quite good. The insulation system could also be applied to large-scale coils.

  16. The Limits and Possibilities of International Large-Scale Assessments. Education Policy Brief. Volume 9, Number 2, Spring 2011

    Science.gov (United States)

    Rutkowski, David J.; Prusinski, Ellen L.

    2011-01-01

    The staff of the Center for Evaluation & Education Policy (CEEP) at Indiana University is often asked about how international large-scale assessments influence U.S. educational policy. This policy brief is designed to provide answers to some of the most frequently asked questions encountered by CEEP researchers concerning the three most popular…

  17. Spatiotemporally enhancing time-series DMSP/OLS nighttime light imagery for assessing large-scale urban dynamics

    Science.gov (United States)

    Xie, Yanhua; Weng, Qihao

    2017-06-01

    Accurate, up-to-date, and consistent information on urban extents is vital for numerous applications central to urban planning, ecosystem management, and environmental assessment and monitoring. However, current large-scale urban extent products are not uniform with respect to definition, spatial resolution, temporal frequency, and thematic representation. This study aimed to enhance, spatiotemporally, time-series DMSP/OLS nighttime light (NTL) data for detecting large-scale urban changes. The enhanced NTL time series from 1992 to 2013 were first generated by implementing global inter-calibration, vegetation-based spatial adjustment, and urban archetype-based temporal modification. The dataset was then used for updating and backdating urban changes for the contiguous U.S.A. (CONUS) and China by using the Object-based Urban Thresholding method (the NTL-OUT method; Xie and Weng, 2016b). The results showed that the updated urban extents were reasonably accurate, with city-scale RMSE (root mean square error) of 27 km2 and Kappa of 0.65 for CONUS, and 55 km2 and 0.59 for China. The backdated urban extents yielded similar accuracy, with RMSE of 23 km2 and Kappa of 0.63 for CONUS, and 60 km2 and 0.60 for China. The accuracy assessment further revealed that the spatial enhancement greatly improved the accuracy of urban updating and backdating by significantly reducing RMSE and slightly increasing Kappa values. The temporal enhancement also reduced RMSE and improved the spatial consistency between estimated and reference urban extents. Although the utilization of enhanced NTL data successfully detected urban size change, relatively low locational accuracy of the detected urban changes was observed. It is suggested that the proposed methodology would be more effective for updating and backdating global urban maps if further fusion of NTL data with higher spatial resolution imagery were implemented.
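
    The global inter-calibration step for DMSP/OLS NTL time series is commonly implemented as a polynomial regression of each sensor-year against a reference sensor-year over radiometrically stable pixels; a generic sketch (the reference choice and polynomial order are assumptions, not necessarily those of this study):

```python
import numpy as np

def intercalibrate(dn_target, dn_reference, degree=2):
    """Fit DN_ref ~ poly(DN_target) over stable lit pixels and return a
    function that maps a whole target image onto the reference scale."""
    mask = (dn_target > 0) & (dn_reference > 0)   # ignore background pixels
    coeffs = np.polyfit(dn_target[mask], dn_reference[mask], degree)
    def apply(image):
        calibrated = np.polyval(coeffs, image)
        return np.clip(calibrated, 0, 63)         # DMSP/OLS DN range is 0-63
    return apply
```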

  18. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of preparatory, inter-related steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA, and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  19. A large-scale peer teaching programme - acceptance and benefit.

    Science.gov (United States)

    Schuetz, Elisabeth; Obirei, Barbara; Salat, Daniela; Scholz, Julia; Hann, Dagmar; Dethleffsen, Kathrin

    2017-08-01

    performance in first assessments. 94% of the students participating in tutorials offered in the study year 2013/14 rated the tutorials as "excellent" or "good". An objective benefit has been shown by a significant increase in re-assessment scores, with an effect size between the medium and large magnitudes, for participants of tutorials compared to non-participants in the years 2012, 2013 and 2014. In addition, significantly higher pass rates on re-assessments were observed. Acceptance, utilisation and benefit of the assessed peer teaching programme are high. Beyond the support of students, a contribution to the individualisation of studies and teaching is made. Further studies are necessary to investigate possible influences of large-scale peer teaching programmes, for example on the reduction of study length and drop-out rates, as well as additional effects on academic achievements.

  20. The relationship between small-scale and large-scale ionospheric electron density irregularities generated by powerful HF electromagnetic waves at high latitudes

    Directory of Open Access Journals (Sweden)

    E. D. Tereshchenko

    2006-11-01

    Satellite radio beacons were used in June 2001 to probe the ionosphere modified by a radio beam produced by the EISCAT high-power, high-frequency (HF) transmitter located near Tromsø (Norway). Amplitude scintillations and variations of the phase of 150- and 400-MHz signals from Russian navigational satellites passing over the modified region were observed at three receiver sites. In several papers it has been stressed that, in the polar ionosphere, thermal self-focusing on striations during ionospheric modification is the main mechanism resulting in the formation of large-scale (hundreds of meters to kilometers) nonlinear structures aligned along the geomagnetic field (the magnetic zenith effect). It has also been claimed that the maximum effects caused by small-scale (tens of meters) irregularities detected in satellite signals are also observed in the direction parallel to the magnetic field. Contrary to those studies, the present paper shows that the maximum in amplitude scintillations does not correspond strictly to the magnetic zenith direction, because high-latitude drifts typically cause a considerable anisotropy of small-scale irregularities in the plane perpendicular to the geomagnetic field, resulting in a deviation of the amplitude-scintillation peak relative to the minimum angle between the line of sight to the satellite and the direction of the geomagnetic field lines. The variance of the logarithmic relative amplitude fluctuations is considered here, which is a useful quantity in such studies. The experimental values of the variance are compared with model calculations, and good agreement has been found. It is also shown from the experimental data that in most satellite passes a variance maximum occurs at a minimum in the phase fluctuations, indicating that the artificial excitation of large-scale irregularities is at a minimum when the excitation of small-scale irregularities is at a maximum.

  1. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale change in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed of which kind of attitude is appropriate, from an ethical point of view, when dealing with large-scale changes like these. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  2. How International Large-Scale Skills Assessments Engage with National Actors: Mobilising Networks through Policy, Media and Public Knowledge

    Science.gov (United States)

    Hamilton, Mary

    2017-01-01

    This paper examines how international, large-scale skills assessments (ILSAs) engage with the broader societies they seek to serve and improve. It looks particularly at the discursive work that is done by different interest groups and the media through which the findings become part of public conversations and are translated into usable form in…

  3. Evaluation of creep-fatigue crack growth for large-scale FBR reactor vessel and NDE assessment

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Young Sang; Kim, Jong Bum; Kim, Seok Hun; Yoo, Bong

    2001-03-01

    Creep-fatigue crack growth contributes to the failure of FBR reactor vessels under high-temperature conditions. In the design stage of a reactor vessel, crack growth evaluation is very important to ensure structural safety and to set up the in-service inspection strategy. In this study, a creep-fatigue crack growth evaluation has been performed for semi-elliptical surface cracks subjected to thermal loading. The thermal stress analysis of a large-scale FBR reactor vessel has been carried out for the load conditions. The distributions of axial, radial, hoop, and Von Mises stresses were obtained for the loading conditions. At the maximum points of the axial and hoop stress, longitudinal and circumferential surface cracks (i.e. PTS crack, NDE short crack and shallow long crack) were postulated. Using the maximum and minimum values of the stresses, the creep-fatigue growth of the postulated cracks was simulated. The crack growth rate of the circumferential cracks is greater than that of the longitudinal cracks. The total crack growth of the largest PTS crack is very small after 427 cycles, so the structural integrity of a large-scale reactor can be maintained over the plant life. The depth growth of the shallow long crack is faster than that of the NDE short crack. In the ISI of a large-scale FBR reactor vessel, ultrasonic inspection is beneficial for detecting shallow circumferential cracks.
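
    The specific crack growth law used in the evaluation is not reproduced here; as a generic illustration, a Paris-type relation da/dN = C (ΔK)^m can be integrated cycle by cycle from the stress range between the maximum and minimum stresses mentioned above (all constants below are hypothetical):

```python
import numpy as np

def grow_crack(a0, n_cycles, delta_sigma, C=1e-12, m=3.0, Y=1.12):
    """Integrate da/dN = C * (dK)^m cycle by cycle.

    a0          : initial crack depth [m]
    delta_sigma : stress range sigma_max - sigma_min [MPa]
    Y           : geometry factor for a shallow surface crack (assumed)
    Returns the crack depth after n_cycles.
    """
    a = a0
    for _ in range(n_cycles):
        delta_k = Y * delta_sigma * np.sqrt(np.pi * a)  # [MPa sqrt(m)]
        a += C * delta_k ** m
    return a

# 427 thermal cycles, as in the evaluation above (values illustrative):
print(grow_crack(a0=2e-3, n_cycles=427, delta_sigma=200.0))
```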

  4. LARGE-SCALE COMMERCIAL INVESTMENTS IN LAND: SEEKING ...

    African Journals Online (AJOL)

    extent of large-scale investment in land or to assess its impact on the people in recipient countries. ... favorable lease terms, apparently based on a belief that this is necessary to ... Harm to the rights of local occupiers of land can result from a dearth ... applies to a self-identified group based on the group's traditions.

  5. Comparing the Effectiveness of Self-Paced and Collaborative Frame-of-Reference Training on Rater Accuracy in a Large-Scale Writing Assessment

    Science.gov (United States)

    Raczynski, Kevin R.; Cohen, Allan S.; Engelhard, George, Jr.; Lu, Zhenqiu

    2015-01-01

    There is a large body of research on the effectiveness of rater training methods in the industrial and organizational psychology literature. Less has been reported in the measurement literature on large-scale writing assessments. This study compared the effectiveness of two widely used rater training methods--self-paced and collaborative…

  6. Integrated numerical platforms for environmental dose assessments of large tritium inventory facilities

    International Nuclear Information System (INIS)

    Castro, P.; Ardao, J.; Velarde, M.; Sedano, L.; Xiberta, J.

    2013-01-01

    In connection with a prospective new scenario of large-inventory tritium facilities [KATRIN at TLK, CANDUs, ITER, EAST, others coming], the dosimetric limits prescribed by ICRP-60 for tritium committed doses are under discussion, requiring, in parallel, that the highly conservative assessments be refined in many respects. Precise Lagrangian computations of dosimetric cloud evolution after standardized (normal/incidental/SBO) tritium cloud emissions can today be matched numerically to real-time meteorological data and to pattern data at diverse scales for prompt/early and chronic tritium dose assessments. Trends toward integrated numerical platforms for environmental dose assessments of large tritium inventory facilities are under development.

  7. "Large"- vs small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998; Phys. Rev. Fluids 2, 062601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular those related to the discovery of (very) large-scale motions. The goals of the paper are as follows: first, to better characterise the physics of the control and assess which external contributions (vortices, forcing, wall motion) are actually needed; then, to investigate the optimal parameters; and, finally, to determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution also addresses the potential effect of the naturally occurring large-scale motions on frictional drag, and gives indications of the physical processes by which drag reduction may be possible at all Reynolds numbers.

  8. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; et al.

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four-telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and to span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic-variance-limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  9. Using GRACE Satellite Gravimetry for Assessing Large-Scale Hydrologic Extremes

    Directory of Open Access Journals (Sweden)

    Alexander Y. Sun

    2017-12-01

    Global assessment of the spatiotemporal variability in terrestrial total water storage anomalies (TWSA) in response to hydrologic extremes is critical for water resources management. Using TWSA derived from the Gravity Recovery and Climate Experiment (GRACE) satellites, this study systematically assessed the skill of the TWSA-climatology (TC) approach and the breakpoint (BP) detection method for identifying large-scale hydrologic extremes. The TC approach calculates standardized anomalies by using the mean and standard deviation of the GRACE TWSA corresponding to each month. In the BP detection method, empirical mode decomposition (EMD) is first applied to identify the mean return period of TWSA extremes, and then a statistical procedure is used to identify the actual occurrence times of abrupt changes (i.e., BPs) in TWSA. Both detection methods were demonstrated on basin-averaged TWSA time series for the world's 35 largest river basins. A nonlinear event coincidence analysis measure was applied to cross-examine abrupt changes detected by these methods with those detected by the Standardized Precipitation Index (SPI). Results show that our EMD-assisted BP procedure is a promising tool for identifying hydrologic extremes using GRACE TWSA data. Abrupt changes detected by the BP method coincide well with those of the SPI anomalies and with documented hydrologic extreme events. Event timings obtained by the TC method were ambiguous for a number of the river basins studied, probably because the GRACE data record is still too short to derive a long-term climatology. The BP approach demonstrates a robust wet-dry anomaly detection capability, which will be important for applications with the upcoming GRACE Follow-On mission.
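
    The TC approach described above amounts to standardizing each month's TWSA against the long-term mean and standard deviation of the same calendar month; a minimal pandas sketch (the series name is assumed):

```python
import pandas as pd

def tc_standardized_anomaly(twsa: pd.Series) -> pd.Series:
    """Standardize basin-averaged TWSA against its monthly climatology.

    twsa: monthly series indexed by a DatetimeIndex. Values beyond,
    say, +/-2 would flag unusually wet or dry conditions.
    """
    grouped = twsa.groupby(twsa.index.month)
    return (twsa - grouped.transform("mean")) / grouped.transform("std")
```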

  10. Assessing Human Modifications to Floodplains using Large-Scale Hydrogeomorphic Floodplain Modeling

    Science.gov (United States)

    Morrison, R. R.; Scheel, K.; Nardi, F.; Annis, A.

    2017-12-01

    Human modifications to floodplains for water resource and flood management purposes have significantly transformed river-floodplain connectivity dynamics in many watersheds. Bridges, levees, reservoirs, shifts in land use, and other hydraulic engineering works have altered flow patterns and caused changes in the timing and extent of floodplain inundation processes. These hydrogeomorphic changes have likely resulted in negative impacts to aquatic habitat and ecological processes. The availability of large-scale topographic datasets at high resolution provides an opportunity for detecting anthropogenic impacts by means of geomorphic mapping. We have developed and are implementing a methodology for comparing a hydrogeomorphic floodplain mapping technique to hydraulically-modeled floodplain boundaries to estimate floodplain loss due to human activities. Our hydrogeomorphic mapping methodology assumes that river valley morphology intrinsically includes information on flood-driven erosion and depositional phenomena. We use a digital elevation model-based algorithm to identify the floodplain as the area of the fluvial corridor lying below water reference levels, which are estimated using a simplified hydrologic model. Results from our hydrogeomorphic method are compared to hydraulically-derived flood zone maps and spatial datasets of levee-protected areas to explore where water management features, such as levees, have changed floodplain dynamics and landscape features. Parameters associated with commonly used F-index functions are quantified and analyzed to better understand how floodplain areas have been reduced within a basin. Preliminary results indicate that the hydrogeomorphic floodplain model is useful for quickly delineating floodplains at large watershed scales, but further analyses are needed to understand the caveats for using the model in determining floodplain loss due to levees. We plan to continue this work by exploring the spatial dependencies of the F
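
    The core of such a hydrogeomorphic delineation can be sketched as a terrain threshold: a cell is floodplain if its elevation above the nearest channel falls below a water reference level estimated from a simple hydrologic scaling. The scaling law and coefficients below are placeholders, not the authors' calibrated values:

```python
import numpy as np

def hydrogeomorphic_floodplain(hand, drainage_area_km2, a=0.1, b=0.4):
    """Flag floodplain cells from height-above-nearest-drainage (HAND).

    hand              : elevation above the nearest channel cell [m]
    drainage_area_km2 : contributing area of that channel cell [km2]
    a, b              : coefficients of the water-level scaling
                        h_ref = a * A**b (illustrative values)
    """
    h_ref = a * np.power(drainage_area_km2, b)
    return hand <= h_ref
```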

  11. Personality Assessment Inventory scale characteristics and factor structure in the assessment of alcohol dependency.

    Science.gov (United States)

    Schinka, J A

    1995-02-01

    Individual scale characteristics and the inventory structure of the Personality Assessment Inventory (PAI; Morey, 1991) were examined by conducting internal consistency and factor analyses of item and scale score data from a large group (N = 301) of alcohol-dependent patients. Alpha coefficients, mean inter-item correlations, and corrected item-total scale correlations for the sample paralleled values reported by Morey for a large clinical sample. Minor differences in the scale factor structure of the inventory from Morey's clinical sample were found. Overall, the findings support the use of the PAI in the assessment of personality and psychopathology of alcohol-dependent patients.
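
    The internal consistency statistics reported above (alpha coefficients, corrected item-total correlations) are straightforward to compute from an item-response matrix; a sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) score matrix for one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])
```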

  12. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
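
    The hierarchical idea of shrinking a noisy sample covariance toward a structured target to curb overfitting can be illustrated with a conjugate-style estimator, S_hat = (n S + v T)/(n + v); this is an analogous shrinkage sketch, not the authors' exact model:

```python
import numpy as np

def shrunk_covariance(X, nu=10.0):
    """Shrink the sample covariance toward its diagonal.

    X  : (n_samples, p) data matrix, p possibly large
    nu : pseudo-count controlling shrinkage strength (assumed prior weight)
    """
    n = X.shape[0]
    S = np.cov(X, rowvar=False)
    target = np.diag(np.diag(S))          # structured target: diagonal
    return (n * S + nu * target) / (n + nu)
```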

  13. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background assurances that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting neither has the political system's competence to make decisions, nor can it judge successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, on three points, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the organization form, and (3) the theoretical conception of large-scale research and policy consulting. (orig.)

  14. A new method for large-scale assessment of change in ecosystem functioning in relation to land degradation

    Science.gov (United States)

    Horion, Stephanie; Ivits, Eva; Verzandvoort, Simone; Fensholt, Rasmus

    2017-04-01

    Ongoing pressures on European land are manifold, with extreme climate events and non-sustainable use of land resources being among the most important drivers altering the functioning of ecosystems. The protection and conservation of European natural capital is one of the key objectives of the 7th Environmental Action Plan (EAP). The EAP stipulates that European land must be managed in a sustainable way by 2020, and the UN Sustainable Development Goals define a Land Degradation Neutral world as one of their targets. This implies that land degradation (LD) assessment of European ecosystems must be performed repeatedly, allowing for the assessment of the current state of LD as well as of changes compared to a baseline adopted by the UNCCD for the objective of land degradation neutrality. However, scientifically robust methods are still lacking for large-scale assessment of LD and repeated, consistent mapping of the state of terrestrial ecosystems. Historical land degradation assessments based on various methods exist, but the methods are generally non-replicable or difficult to apply at continental scale (Allan et al. 2007). The current lack of research methods applicable at large spatial scales is notably caused by the non-robust definition of LD, the scarcity of field data on LD, and the complex interplay of the processes driving LD (Vogt et al., 2011). Moreover, the link between LD and changes in land use (how land use changes relate to changes in vegetation productivity and ecosystem functioning) is not straightforward. In this study we used the segmented trend method developed by Horion et al. (2016) for large-scale systematic assessment of hotspots of change in ecosystem functioning in relation to LD. This method alleviates shortcomings of the widely used linear trend model, which neither accounts for abrupt change nor adequately captures actual changes in ecosystem functioning (de Jong et al. 2013; Horion et al. 2016). Here we present a new methodology for

  15. A climatological analysis of high-precipitation events in Dronning Maud Land, Antarctica, and associated large-scale atmospheric conditions

    NARCIS (Netherlands)

    Welker, Christoph; Martius, Olivia; Froidevaux, Paul; Reijmer, Carleen H.; Fischer, Hubertus

    2014-01-01

    The link between high precipitation in Dronning Maud Land (DML), Antarctica, and the large-scale atmospheric circulation is investigated using ERA-Interim data for 1979-2009. High-precipitation events are analyzed at Halvfarryggen situated in the coastal region of DML and at Kohnen Station located

  16. Economic games on the internet: the effect of $1 stakes.

    Science.gov (United States)

    Amir, Ofra; Rand, David G; Gal, Ya'akov Kobi

    2012-01-01

    Online labor markets such as Amazon Mechanical Turk (MTurk) offer an unprecedented opportunity to run economic game experiments quickly and inexpensively. Using MTurk, we recruited 756 subjects and examined their behavior in four canonical economic games, with two payoff conditions each: a stakes condition, in which subjects' earnings were based on the outcome of the game (maximum earnings of $1); and a no-stakes condition, in which subjects' earnings were unaffected by the outcome of the game. Our results demonstrate that economic game experiments run on MTurk are comparable to those run in laboratory settings, even when using very low stakes.

  17. The Climate Potentials and Side-Effects of Large-Scale terrestrial CO2 Removal - Insights from Quantitative Model Assessments

    Science.gov (United States)

    Boysen, L.; Heck, V.; Lucht, W.; Gerten, D.

    2015-12-01

    Terrestrial carbon dioxide removal (tCDR) through dedicated biomass plantations is considered one climate engineering (CE) option if implemented at large scale. While the risks and costs are supposed to be small, the effectiveness depends strongly on the spatial and temporal scales of implementation. Based on simulations with a dynamic global vegetation model (LPJmL), we comprehensively assess the effectiveness, biogeochemical side-effects and trade-offs from an earth-system-analytic perspective. We analyzed systematic land-use scenarios in which all, 25%, or 10% of natural and/or agricultural areas are converted to tCDR plantations, assuming that biomass plantations are established once the 2°C target is crossed in a business-as-usual climate change trajectory. The resulting tCDR potentials in year 2100 include the net accumulated annual biomass harvests and changes in all land carbon pools. We find that only the most spatially excessive, and thus undesirable, scenario would be capable of restoring the 2°C target by 2100 under continuing high emissions (with a cooling of 3.02°C). Large-scale biomass plantations covering 1.1-4.2 Gha would produce a climate reduction potential of 0.8-1.4°C. tCDR plantations at smaller scales do not build up enough biomass over the period considered, and their potential to reduce global warming drops substantially, to no more than 0.5-0.6°C. Finally, we demonstrate that the (non-economic) costs for the Earth system include negative impacts on the water cycle and on ecosystems, which are already under pressure due to both land use change and climate change. Overall, tCDR may lead to a further transgression of land- and water-related planetary boundaries while not being able to reverse the crossing of the planetary boundary for climate change. tCDR could still be considered in the near-future mitigation portfolio if implemented on small scales on wisely chosen areas.

  18. A method for the assessment of the visual impact caused by the large-scale deployment of renewable-energy facilities

    International Nuclear Information System (INIS)

    Rodrigues, Marcos; Montanes, Carlos; Fueyo, Norberto

    2010-01-01

    The production of energy from renewable sources requires a significantly larger use of the territory compared with conventional (fossil and nuclear) sources. For large penetrations of renewable technologies, such as wind power, the overall visual impact at the national level can be substantial, and may prompt public reaction. This study develops a methodology for the assessment of the visual impact that can be used to measure and report the level of impact caused by several renewable technologies (wind farms, solar photovoltaic plants or solar thermal ones), both at the local and regional (e.g. national) scales. Applications are shown to several large-scale, hypothetical scenarios of wind and solar-energy penetration in Spain, and also to the vicinity of an actual, single wind farm.

  19. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied due to their application potential in human behavior prediction and recommendation, and in the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and a high possibility for prediction. Furthermore, a scale-free mobility model with two essential ingredients, preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
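
    A power-law decay such as the one reported for activity degree is typically quantified with the continuous maximum-likelihood estimator alpha = 1 + n / sum(ln(x_i / x_min)) of Clauset, Shalizi & Newman (2009); a sketch:

```python
import numpy as np

def powerlaw_alpha(x, x_min):
    """Continuous maximum-likelihood estimate of a power-law exponent.

    alpha_hat = 1 + n / sum(log(x_i / x_min)) over all x_i >= x_min.
    """
    x = np.asarray(x, dtype=float)
    tail = x[x >= x_min]
    return 1.0 + tail.size / np.sum(np.log(tail / x_min))
```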

  20. A moderated mediated path analysis of factors influencing student performance on a standardized high-stakes science test

    Science.gov (United States)

    Pelkey, Ramona K.

    Gender, ethnicity, family economic status, reading score, mathematics score, and number of science semesters successfully completed were examined for their contributory role to a student's science score on a high-stakes, high school exit examination. Path analysis and analysis of variance procedures were used to quantify each variable's influence on science score. Gender, ethnicity, and family economic status were found to be moderators while reading proved to mediate within the model. The path model was created using a calibration sample and cross-validated using a hold-out validation sample. Bootstrapping was used to verify the goodness of fit of the model. A predictive equation explained 66% (R2 = .66) of the variance in observed TAKS science score.

  1. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
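
    Operationally, matrix sampling partitions the item pool into disjoint forms so that each student takes only one form while the program still covers the whole pool; a minimal sketch (a real design would also balance content coverage and difficulty across forms):

```python
import random

def matrix_sample(item_ids, n_forms, seed=0):
    """Randomly divide an item pool into n_forms disjoint test forms."""
    rng = random.Random(seed)
    pool = list(item_ids)
    rng.shuffle(pool)
    return [pool[i::n_forms] for i in range(n_forms)]

forms = matrix_sample(range(60), n_forms=4)
# Each student is administered one 15-item form; together the four
# forms cover all 60 items in the pool.
```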

  2. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  3. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s^-1 (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader in this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  4. Large scale high strain-rate tests of concrete

    Directory of Open Access Journals (Sweden)

    Kiefer R.

    2012-08-01

    This work presents the stages of development of some innovative equipment, based on Hopkinson bar techniques, for performing large scale dynamic tests of concrete specimens. The activity is centered at the recently upgraded HOPLAB facility, which is basically a split Hopkinson bar with a total length of approximately 200 m and with bar diameters of 72 mm. Through pre-tensioning and suddenly releasing a steel cable, force pulses of up to 2 MN, 250 μs rise time and 40 ms duration can be generated and applied to the specimen tested. The dynamic compression loading has first been treated and several modifications in the basic configuration have been introduced. Twin incident and transmitter bars have been installed with strong steel plates at their ends where large specimens can be accommodated. A series of calibration and qualification tests has been conducted and the first real tests on concrete cylindrical specimens of 20 cm diameter and up to 40 cm length have commenced. Preliminary results from the analysis of the recorded signals indicate proper Hopkinson bar testing conditions and reliable functioning of the facility.
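
    As context for how such recorded signals are typically reduced, the following sketch applies the standard one-dimensional split-Hopkinson relations (strain rate from the reflected pulse, stress from the transmitted pulse). The bar constants and specimen dimensions are illustrative assumptions, not necessarily HOPLAB's processing chain.

```python
import numpy as np

# A minimal sketch of classical split-Hopkinson bar data reduction,
# assuming 1-D wave theory; all constants are illustrative.
E_bar = 210e9              # bar Young's modulus, Pa (steel)
c0 = 5000.0                # bar wave speed, m/s
A_bar = np.pi * 0.036**2   # bar cross-section for 72 mm diameter, m^2
A_spec = np.pi * 0.10**2   # specimen cross-section for 20 cm diameter, m^2
L_spec = 0.40              # specimen length, m

def specimen_response(eps_refl, eps_trans, dt):
    """Strain rate, strain and stress histories from the reflected and
    transmitted strain-gauge pulses (NumPy arrays sampled every dt)."""
    strain_rate = -2.0 * c0 / L_spec * eps_refl
    strain = np.cumsum(strain_rate) * dt
    stress = E_bar * (A_bar / A_spec) * eps_trans
    return strain_rate, strain, stress
```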

  5. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    The power balancing problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  6. A Model for Predicting Student Performance on High-Stakes Assessment

    Science.gov (United States)

    Dammann, Matthew Walter

    2010-01-01

    This research study examined the use of student achievement on reading and math state assessments to predict success on the science state assessment. Multiple regression analysis was utilized to test the prediction for all students in grades 5 and 8 in a mid-Atlantic state. The prediction model developed from the analysis explored the combined…
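
    A minimal sketch of the kind of model described, ordinary least-squares regression of science scores on reading and math scores, assuming hypothetical column names and a hypothetical data file:

```python
# Hedged sketch: multiple regression predicting science assessment scores
# from reading and math scores. File and column names are assumptions.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("state_assessment_grade5.csv")  # hypothetical file
X, y = df[["reading_score", "math_score"]], df["science_score"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print(model.coef_, model.intercept_)
print(model.score(X_te, y_te))  # R^2 on held-out students
```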

  7. Can mixed assessment methods make biology classes more equitable?

    Science.gov (United States)

    Cotner, Sehoya; Ballen, Cissy J

    2017-01-01

    Many factors have been proposed to explain the attrition of women in science, technology, engineering and math fields, among them the lower performance of women in introductory courses resulting from deficits in incoming preparation. We focus on the impact of mixed methods of assessment, which minimize the impact of high-stakes exams and reward other methods of assessment such as group participation, low-stakes quizzes and assignments, and in-class activities. We hypothesized that these mixed methods would benefit individuals who otherwise underperform on high-stakes tests. Here, we analyze gender-based performance trends in nine large (N > 1000 students) introductory biology courses in fall 2016. Females underperformed on exams compared to their male counterparts, a difference that does not exist for the other methods of assessment that compose the course grade. Further, we analyzed three case studies of courses that transitioned their grading schemes to either de-emphasize or emphasize exams as a proportion of total course grade. We demonstrate that the shift away from an exam emphasis benefits female students, thereby closing gaps in overall performance. Further, the exam performance gap itself is reduced when the exams contribute less to the overall course grade. We discuss testable predictions that follow from our hypothesis, and advocate for the use of mixed methods of assessment (possibly as part of an overall shift to active learning techniques). We conclude by challenging the student deficit model, and suggest a course deficit model as explanatory of these performance gaps, whereby the microclimate of the classroom can either raise or lower barriers to success for underrepresented groups in STEM.
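
    A hedged sketch of the descriptive comparison underlying such trends, mean score by gender for each assessment type. The data layout and M/F gender coding are assumptions for illustration.

```python
# Sketch: compare mean performance by gender across assessment types.
# The file and column names ("assessment_type", "gender", "percent_score")
# are hypothetical.
import pandas as pd

grades = pd.read_csv("course_grades.csv")  # one row per student and assessment
gap = (grades
       .groupby(["assessment_type", "gender"])["percent_score"]
       .mean()
       .unstack("gender"))
gap["male_minus_female"] = gap["M"] - gap["F"]
print(gap)  # e.g. exams may show a gap that quizzes/assignments do not
```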

  8. Algorithm and Application of Gcp-Independent Block Adjustment for Super Large-Scale Domestic High Resolution Optical Satellite Imagery

    Science.gov (United States)

    Sun, Y. S.; Zhang, L.; Xu, B.; Zhang, Y.

    2018-04-01

    Accurate positioning of optical satellite imagery without ground control is a precondition for remote sensing applications and for small/medium-scale mapping of large areas abroad or with large volumes of images. In this paper, addressing the geometric features of optical satellite imagery and building on RFM least-squares block adjustment and the Alternating Direction Method of Multipliers (ADMM), a widely used optimization method for constrained problems, we propose a GCP-independent block adjustment method for super-large-scale domestic high-resolution optical satellite imagery, GISIBA (GCP-Independent Satellite Imagery Block Adjustment), which is easy to parallelize and highly efficient. In this method, virtual "average" control points are constructed to solve the rank-defect problem and to support qualitative and quantitative analysis in block adjustment without ground control. The test results show that the horizontal and vertical accuracies of multi-covered and multi-temporal satellite images are better than 10 m and 6 m, respectively. Meanwhile, the mosaicking problem between adjacent areas in large-area DOM production can be solved if public geographic information data are introduced as horizontal and vertical constraints in the block adjustment process. Finally, through experiments with GF-1 and ZY-3 satellite images over several typical test areas, the reliability, accuracy and performance of the developed procedure are presented and analyzed.

  9. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  10. Economic games on the internet: the effect of $1 stakes.

    Directory of Open Access Journals (Sweden)

    Ofra Amir

    Online labor markets such as Amazon Mechanical Turk (MTurk) offer an unprecedented opportunity to run economic game experiments quickly and inexpensively. Using MTurk, we recruited 756 subjects and examined their behavior in four canonical economic games, with two payoff conditions each: a stakes condition, in which subjects' earnings were based on the outcome of the game (maximum earnings of $1); and a no-stakes condition, in which subjects' earnings were unaffected by the outcome of the game. Our results demonstrate that economic game experiments run on MTurk are comparable to those run in laboratory settings, even when using very low stakes.

  11. Staking solutions to tube vibration problems (developed by Technos et Compagnie - FRANCE)

    International Nuclear Information System (INIS)

    Hewitt, E.W.; Bizard, A.; Horn, M.J.

    1989-01-01

    Electric generating plant steam surface condensers have been prone to vibration-induced tube failures. One common and effective method for stopping this vibration has been to insert stakes into the bundle to provide additional support. Stakes have been fabricated of a variety of rigid and semi-rigid materials of fixed dimensions. Installation difficulties and problems of incomplete tube support have been associated with this approach. New developments in the application of plastic technology have offered another approach. Stakes made of plastic tubes, which are flattened by evacuation at the time of manufacture, may now be easily inserted into the tube bundle. After insertion, the vacuum is released and the memory of the plastic causes the stakes to expand and assume their original form. The spring force of the plastic cradles the adjacent condenser tubes and stops the vibration. Developed for Electricite de France (EDF), the stakes are currently installed in 19 units of the French utility system, and two units in the United States.

  12. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    Science.gov (United States)

    Jensen, Tue V.; Pinson, Pierre

    2017-11-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  14. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
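
    A simplified CPU analogue may clarify the per-pixel linked-list idea: on the GPU this is typically a head-pointer buffer plus a node pool, but a dictionary of lists conveys the same access pattern. All names and fields below are illustrative, not the thesis's implementation.

```python
# Sketch: each screen pixel stores the pathline segments projecting onto it,
# so filtering and color-coding can run per pixel without re-reading the
# original large dataset.
from collections import defaultdict

framebuffer = defaultdict(list)  # pixel (x, y) -> list of segment records

def insert_segment(x, y, depth, seed_id, attribute):
    """Record one rasterized pathline segment at a pixel."""
    framebuffer[(x, y)].append({"depth": depth, "seed": seed_id, "attr": attribute})

def shade(x, y, keep):
    """Return only the segments passing the user's filter, sorted front to back;
    a real renderer would blend colors here."""
    return sorted((s for s in framebuffer[(x, y)] if keep(s)),
                  key=lambda s: s["depth"])
```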

  15. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    Science.gov (United States)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It permits defining the behaviour of several components assembled to process a flow of data, using BIT. Test cases are defined in a way that they are simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.

  16. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    Science.gov (United States)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increased resolution of models in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall, it provides a new "tool" for climate scientists to run multi-model experiments.

  17. Obtaining high-resolution stage forecasts by coupling large-scale hydrologic models with sensor data

    Science.gov (United States)

    Fries, K. J.; Kerkez, B.

    2017-12-01

    We investigate how "big" quantities of distributed sensor data can be coupled with a large-scale hydrologic model, in particular the National Water Model (NWM), to obtain hyper-resolution forecasts. The recent launch of the NWM provides a great example of how growing computational capacity is enabling a new generation of massive hydrologic models. While the NWM spans an unprecedented spatial extent, there remain many questions about how to improve forecast at the street-level, the resolution at which many stakeholders make critical decisions. Further, the NWM runs on supercomputers, so water managers who may have access to their own high-resolution measurements may not readily be able to assimilate them into the model. To that end, we ask the question: how can the advances of the large-scale NWM be coupled with new local observations to enable hyper-resolution hydrologic forecasts? A methodology is proposed whereby the flow forecasts of the NWM are directly mapped to high-resolution stream levels using Dynamical System Identification. We apply the methodology across a sensor network of 182 gages in Iowa. Of these sites, approximately one third have shown to perform well in high-resolution flood forecasting when coupled with the outputs of the NWM. The quality of these forecasts is characterized using Principal Component Analysis and Random Forests to identify where the NWM may benefit from new sources of local observations. We also discuss how this approach can help municipalities identify where they should place low-cost sensors to most benefit from flood forecasts of the NWM.

  18. Wind power: Areva acquires a 51% stake in Multibrid

    International Nuclear Information System (INIS)

    2007-01-01

    AREVA announced the acquisition of a 51% stake in Multibrid, a designer and manufacturer of multi-megawatt off-shore wind turbines based in Germany. With this acquisition, AREVA has entered into a joint venture with Prokon Nord, a German off-shore wind turbine and biomass plant developer and current owner of Multibrid. This transaction values Multibrid at euro 150 million. AREVA plans to rapidly further develop Multibrid's activities by giving the company access to its industrial resources, financial base and international commercial network. In return, Multibrid will provide AREVA with its leading-edge technology which, developed for 5 MW turbines, can achieve a very high output while reducing operating costs thanks to a simplified maintenance system. With this stake in Multibrid, AREVA aims to increase its presence on the offshore wind market that meets land settlement requirements and that should grow significantly in the years to come (from 300 MW in Europe today to an expected 1400 MW by 2011). As an exclusive supplier of Prokon Nord, Multibrid will participate in projects such as Borkum West (30 MW), the first offshore project in Germany, Borkum West 2 (400 MW), and Cote d'Albatre (105 MW), the first offshore wind farm project in France. The stake in Multibrid strengthens AREVA's strategic positioning on the CO2-free energy market, thanks to complementary solutions ranging from nuclear technologies to renewables. A number of recent achievements illustrate this strategy: - bio-energy (crucial energy supply in numerous rural areas): delivery of turnkey biomass power plants; ongoing construction of 10 plants in India, Thailand and Brazil; future development plans in fast-growing regions, such as Latin America; - wind power: Multibrid adds to the Group's stake in REpower and to its partnership with Suzlon for which AREVA is the number one supplier of transmission and distribution solutions for wind power; - hydrogen and fuel cells: design and manufacture of

  19. Increasing condom use and declining STI prevalence in high-risk MSM and TGs: evaluation of a large-scale prevention program in Tamil Nadu, India.

    Science.gov (United States)

    Subramanian, Thilakavathi; Ramakrishnan, Lakshmi; Aridoss, Santhakumar; Goswami, Prabuddhagopal; Kanguswami, Boopathi; Shajan, Mathew; Adhikary, Rajat; Purushothaman, Girish Kumar Chethrapilly; Ramamoorthy, Senthil Kumar; Chinnaswamy, Eswaramurthy; Veeramani, Ilaya Bharathy; Paranjape, Ramesh Shivram

    2013-09-17

    This paper presents an evaluation of Avahan, a large-scale HIV prevention program that was implemented using peer-mediated strategies, condom distribution and sexually transmitted infection (STI) clinical services among high-risk men who have sex with men (HR-MSM) and male-to-female transgender persons (TGs) in Tamil Nadu, one of six high-prevalence states in southern India. Two rounds of large-scale cross-sectional bio-behavioural surveys among HR-MSM and TGs and routine program monitoring data were used to assess changes in program coverage, condom use and prevalence of STIs (including HIV) and their association with program exposure. The Avahan program for HR-MSM and TGs in Tamil Nadu was significantly scaled up, and contacts by peer educators reached 77 percent of the estimated denominator by the end of the program's fourth year. Exposure to the program increased between the two rounds of surveys for both HR-MSM (from 66 percent to 90 percent; AOR = 4.6) and TGs. The program in Tamil Nadu achieved high coverage, resulting in improved condom use by HR-MSM with their regular and commercial male partners. Declining STI prevalence and stable HIV prevalence reflect the positive effects of the prevention strategy. Outcomes from the program logic model indicate the effectiveness of the program for HR-MSM and TGs in Tamil Nadu.

  20. Characterizing Temperature Variability and Associated Large Scale Meteorological Patterns Across South America

    Science.gov (United States)

    Detzer, J.; Loikith, P. C.; Mechoso, C. R.; Barkhordarian, A.; Lee, H.

    2017-12-01

    South America's climate varies considerably owing to its large geographic range and diverse topographical features. Spanning the tropics to the mid-latitudes and from high peaks to tropical rainforest, the continent experiences an array of climate and weather patterns. Due to this considerable spatial extent, assessing temperature variability at the continent scale is particularly challenging. It is well documented in the literature that temperatures have been increasing across portions of South America in recent decades, and while there have been many studies that have focused on precipitation variability and change, temperature has received less scientific attention. Therefore, a more thorough understanding of the drivers of temperature variability is critical for interpreting future change. First, k-means cluster analysis is used to identify four primary modes of temperature variability across the continent, stratified by season. Next, composites of large scale meteorological patterns (LSMPs) are calculated for months assigned to each cluster. Initial results suggest that LSMPs, defined using meteorological variables such as sea level pressure (SLP), geopotential height, and wind, are able to identify synoptic scale mechanisms important for driving temperature variability at the monthly scale. Some LSMPs indicate a relationship with known recurrent modes of climate variability. For example, composites of geopotential height suggest that the Southern Annular Mode is an important, but not necessarily dominant, component of temperature variability over southern South America. This work will be extended to assess the drivers of temperature extremes across South America.
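
    A minimal sketch of the clustering step using scikit-learn on flattened monthly temperature-anomaly maps; the file name, array shape and the choice of k = 4 mirror the description but are otherwise assumptions:

```python
# Hedged sketch: k-means on monthly temperature-anomaly maps, one flattened
# map per month; the .npy file is hypothetical.
import numpy as np
from sklearn.cluster import KMeans

anoms = np.load("monthly_t2m_anomalies.npy")  # shape (n_months, ny * nx)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(anoms)

# Months sharing a label can then be composited (e.g. with SLP and
# geopotential height fields) to reveal the associated LSMPs.
for k in range(4):
    print(k, (labels == k).sum(), "months in cluster")
```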

  1. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  2. Emotion at Stake—The Role of Stake Size and Emotions in a Power-to-Take Game Experiment in China with a Comparison to Europe

    Directory of Open Access Journals (Sweden)

    Ronald Bosman

    2017-03-01

    This paper experimentally investigates how monetary incentives and emotions influence behavior in a two-player power-to-take game (PTTG). In this game, one player can claim any part of the other's endowment (take rate), and the second player can respond by destroying any part of his or her own endowment. The experiment is run in China. We further compare our findings with the behavior of two European subject pools. Our results give new insights regarding emotion regulation. Even though stake size does not appear to matter for take rates and destruction rates, it does matter for the reaction function of the responder regarding the take rate. When stakes are high, there is less destruction for low and intermediate take rates, and more destruction for high take rates, compared to relatively low stakes. Under low incentives, 'hot' anger-type emotions are important for destruction, while 'cool' contempt becomes prominent under high monetary incentives. These results suggest emotion regulation in the high-stake condition. Moreover, emotions are found to fully mediate the impact of the take rate on destruction when stakes are low, whereas they only partially do so if stakes are high. Comparing the low-stakes data for China with existing European data, we find similarities in behavior, emotions and emotion intensities, as well as the full mediation of the take rate by emotions. We find some differences related to the type of emotions that are important for destruction. Whereas anger and joy are important in both, in addition, irritation and fear play a role in China, while this holds for contempt in the EU.
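
    For concreteness, a minimal payoff function for the PTTG under the standard parameterisation (the taker receives the take-rate share of whatever the responder leaves undestroyed); the stake sizes in the example call are illustrative:

```python
# Sketch of PTTG payoffs: take rate t is chosen first, then the responder
# destroys a fraction d of his or her own endowment.
def pttg_payoffs(endow_taker, endow_responder, t, d):
    """Return (taker payoff, responder payoff) for take rate t and
    destruction fraction d, both in [0, 1]."""
    remaining = endow_responder * (1 - d)
    return endow_taker + t * remaining, (1 - t) * remaining

print(pttg_payoffs(10.0, 10.0, t=0.8, d=0.5))  # -> (14.0, 1.0)
```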

  3. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the fault-free operation of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors that determine the reliability of integrated circuits, an analysis of existing methods, and a model for evaluating the fault-free operation of LSI and VLSI circuits. The main part describes a proposed algorithm and a program for analyzing the fault rate in LSI and VLSI circuits.

  4. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…
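
    A hedged sketch of chained-equations-style imputation for a matrix-sampled questionnaire, using scikit-learn's IterativeImputer as a generic stand-in for the study's specific imputation models; the data file is hypothetical and planned missingness is coded as NaN:

```python
# Sketch: impute questionnaire columns a student never saw (missing by
# design under matrix sampling) from the columns they did answer.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

responses = np.load("context_questionnaire.npy")  # hypothetical, NaN = not administered
completed = IterativeImputer(max_iter=10, random_state=0).fit_transform(responses)
# The completed matrix could then feed the conditioning model used to draw
# plausible values, which is where imputation choices matter most.
```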

  5. Five hundred years of gridded high-resolution precipitation reconstructions over Europe and the connection to large-scale circulation

    Energy Technology Data Exchange (ETDEWEB)

    Pauling, Andreas [University of Bern, Institute of Geography, Bern (Switzerland); Luterbacher, Juerg; Wanner, Heinz [University of Bern, Institute of Geography, Bern (Switzerland); National Center of Competence in Research (NCCR) in Climate, Bern (Switzerland); Casty, Carlo [University of Bern, Climate and Environmental Physics Institute, Bern (Switzerland)

    2006-03-15

    We present seasonal precipitation reconstructions for European land areas (30°W to 40°E / 30-71°N; given on a 0.5° × 0.5° grid) covering the period 1500-1900 together with gridded reanalysis from 1901 to 2000 (Mitchell and Jones 2005). Principal component regression techniques were applied to develop this dataset. A large variety of long instrumental precipitation series, precipitation indices based on documentary evidence and natural proxies (tree-ring chronologies, ice cores, corals and a speleothem) that are sensitive to precipitation signals were used as predictors. Transfer functions were derived over the 1901-1983 calibration period and applied to 1500-1900 in order to reconstruct the large-scale precipitation fields over Europe. The performance (quality estimation based on unresolved variance within the calibration period) of the reconstructions varies over centuries, seasons and space. Highest reconstructive skill was found for winter over central Europe and the Iberian Peninsula. Precipitation variability over the last half millennium reveals both large interannual and decadal fluctuations. Applying running correlations, we found major non-stationarities in the relation between large-scale circulation and regional precipitation. For several periods during the last 500 years, we identified key atmospheric modes for southern Spain/northern Morocco and central Europe as representations of two precipitation regimes. Using scaled composite analysis, we show that precipitation extremes over central Europe and southern Spain are linked to distinct pressure patterns. Due to its high spatial and temporal resolution, this dataset allows detailed studies of regional precipitation variability for all seasons, impact studies on different time and space scales, comparisons with high-resolution climate models as well as analysis of connections with regional temperature reconstructions. (orig.)
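
    A compact sketch of principal component regression as described: calibrate the proxy predictors against the leading PCs of the instrumental field, then project back in time. Function and variable names are illustrative assumptions.

```python
# Hedged sketch of principal component regression (PCR) for field
# reconstruction; array layouts are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def pcr_reconstruct(field_cal, proxies_cal, proxies_past, n_pc=10):
    """field_cal: (calibration years, grid cells) instrumental precipitation;
    proxies_cal / proxies_past: (years, predictors) proxy matrices."""
    pca = PCA(n_components=n_pc).fit(field_cal)
    scores_cal = pca.transform(field_cal)               # PC time series
    reg = LinearRegression().fit(proxies_cal, scores_cal)
    scores_past = reg.predict(proxies_past)             # PCs back in time
    return pca.inverse_transform(scores_past)           # reconstructed grids
```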

  6. Strategic Environmental Assessment and Environmental Auditing in Large-scale Public Infrastructure Construction: the case of Qinghai-Tibet Railway

    NARCIS (Netherlands)

    He, G.; Zhang, L.; Lu, Y.

    2009-01-01

    Large-scale public infrastructure projects have featured in China’s modernization course since the early 1980s. During the early stages of China’s rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however,

  7. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example, Constant Effort Sites [CES])…

  8. Application of plant metabonomics in quality assessment for large-scale production of traditional Chinese medicine.

    Science.gov (United States)

    Ning, Zhangchi; Lu, Cheng; Zhang, Yuxin; Zhao, Siyu; Liu, Baoqin; Xu, Xuegong; Liu, Yuanyan

    2013-07-01

    The curative effects of traditional Chinese medicines are principally based on the synergic effect of their multi-targeting, multi-ingredient preparations, in contrast to modern pharmacology and drug development that often focus on a single chemical entity. Therefore, methods employing a few markers or pharmacologically active constituents to assess the quality and authenticity of the complex preparations face a number of severe challenges. Metabonomics can provide an effective platform for complex sample analysis, and it has also been applied to the quality analysis of traditional Chinese medicine. Metabonomics enables comprehensive assessment of complex traditional Chinese medicines or herbal remedies and classification of samples of diverse biological statuses, origins, or qualities, by means of chemometrics. Identification, processing, and pharmaceutical preparation are the main procedures in the large-scale production of Chinese medicinal preparations. Through complete scans, plant metabonomics addresses some of the shortfalls of single-marker analyses and presents considerable potential to become a sharp tool for traditional Chinese medicine quality assessment.

  9. Post-examination interpretation of objective test data: monitoring and improving the quality of high-stakes examinations--a commentary on two AMEE Guides.

    Science.gov (United States)

    Tavakol, Mohsen; Dennick, Reg

    2012-01-01

    As great emphasis is rightly placed upon the importance of assessment to judge the quality of our future healthcare professionals, it is appropriate not only to choose the most appropriate assessment method, but also to continually monitor the quality of the tests themselves, in the hope that we may continually improve the process. This article stresses the importance of quality control mechanisms in the exam cycle and briefly outlines some of the key psychometric concepts including reliability measures, factor analysis, generalisability theory and item response theory. The importance of such analyses for the standard setting procedures is emphasised. This article also accompanies two new AMEE Guides in Medical Education (Tavakol M, Dennick R. Post-examination Analysis of Objective Tests: AMEE Guide No. 54 and Tavakol M, Dennick R. 2012. Post examination analysis of objective test data: Monitoring and improving the quality of high stakes examinations: AMEE Guide No. 66) which provide the reader with practical examples of analysis and interpretation, in order to help develop valid and reliable tests.
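
    As one concrete example of the post-examination statistics discussed, the sketch below computes Cronbach's alpha, a standard internal-consistency reliability measure, from an examinee-by-item score matrix. This is a generic illustration, not code from the Guides.

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / total variance)
def cronbach_alpha(scores):
    """scores: 2-D array, rows = examinees, columns = items (e.g. 0/1)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Example with a small synthetic 0/1 response matrix
rng = np.random.default_rng(0)
print(cronbach_alpha(rng.integers(0, 2, size=(200, 40))))
```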

  10. Large scale debris-flow hazard assessment: a geotechnical approach and GIS modelling

    Directory of Open Access Journals (Sweden)

    G. Delmonaco

    2003-01-01

    A deterministic distributed model has been developed for large-scale debris-flow hazard analysis in the basin of the River Vezza (Tuscany Region, Italy). This area (51.6 km²) was affected by over 250 landslides, classified as debris/earth flows, mainly involving the metamorphic geological formations outcropping in the area and triggered by the pluviometric event of 19 June 1996. In recent decades landslide hazard and risk analysis has been favoured by the development of GIS techniques that permit the generalisation, synthesis and modelling of stability conditions in large-scale investigations (>1:10 000). In this work, we report the main results of applying a geotechnical model coupled with a hydrological model for debris-flow hazard assessment. The analysis was developed through the following steps: a landslide inventory map derived from aerial photo interpretation and direct field survey; generation of a database and digital maps; elaboration of a DTM and derived themes (e.g. a slope angle map); definition of a superficial soil thickness map; geotechnical soil characterisation through back-analysis of test slopes and laboratory tests; inference of the influence of precipitation, for distinct return times, on ponding time and pore pressure generation; implementation of a slope stability model (infinite slope model); and generalisation of the safety factor for estimated rainfall events with different return times. This approach has allowed the identification of potential source areas of debris-flow triggering for precipitation events with estimated return times of 10, 50, 75 and 100 years. The model shows a dramatic decrease in safety conditions for the simulation related to a rainfall event with a 75-year return time, corresponding to an estimated cumulated daily intensity of 280-330 mm. This value can be considered the hydrological triggering threshold.
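
    A hedged sketch of the infinite-slope safety factor that such models generalise over rainfall scenarios; the formulation below is the common textbook form with a pore-pressure term, which may differ in detail from the paper's implementation:

```python
import math

# FS = [c' + (gamma*z*cos^2(beta) - u) * tan(phi')] / [gamma*z*sin(beta)*cos(beta)]
# with c' effective cohesion (Pa), phi' effective friction angle (deg),
# gamma soil unit weight (N/m^3), z soil depth (m), beta slope angle (deg),
# u pore pressure (Pa). Values below are illustrative only.
def safety_factor(c_eff, phi_eff_deg, gamma, z, beta_deg, u):
    beta, phi = math.radians(beta_deg), math.radians(phi_eff_deg)
    resisting = c_eff + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# FS < 1 marks a potential debris-flow source cell
print(safety_factor(c_eff=2e3, phi_eff_deg=30, gamma=18e3, z=1.0,
                    beta_deg=35, u=5e3))  # ~0.72
```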

  11. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source of the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
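
    A compact illustration of the DFA fluctuation function whose local scaling exponent the authors analyse; the windowing here is simplified (non-overlapping windows, linear detrending) relative to the paper's sliding local estimates:

```python
import numpy as np

# Sketch of detrended fluctuation analysis (DFA-1) on a numeric sequence,
# e.g. a GC-content profile of a chromosome.
def dfa_fluctuation(x, scales):
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        msq = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)  # slope of log F vs log s gives the scaling exponent
```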

  12. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Based on experience in operating and developing a large-scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases.

  13. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Harrington, Kathleen; Marriange, Tobias; Aamir, Ali; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  14. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  15. Methods for assessing the socioeconomic impacts of large-scale resource developments: implications for nuclear repository siting

    International Nuclear Information System (INIS)

    Murdock, S.H.; Leistritz, F.L.

    1983-03-01

    This report provides an overview of the major methods presently available for assessing the socioeconomic impacts of large-scale resource developments, including discussion of the implications and applications of such methods for nuclear-waste-repository siting. The report: (1) summarizes the conceptual approaches underlying, and the methodological alternatives for, the conduct of impact assessments in each substantive area, and enumerates the advantages and disadvantages of each alternative; (2) describes factors related to the impact-assessment process, impact events, and the characteristics of rural areas that affect the magnitude and distribution of impacts and the assessment of impacts in each area; (3) provides a detailed review of the methodologies actually used in impact assessment for each area, describes advantages and problems encountered in the use of each method, and identifies the frequency of use and the general level of acceptance of each technique; and (4) summarizes the implications of each area of projection for the repository-siting process and the applicability of the methods to the special and standard features of repositories, and makes general recommendations concerning specific methods and procedures that should be incorporated in assessments for siting areas.

  16. A large scale field experiment in the Amazon Basin (Lambada/Bateristca)

    Energy Technology Data Exchange (ETDEWEB)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C. [Winand Staring Centre, Wageningen (Netherlands)

    1994-12-31

    A description is given of a large-scale field experiment planned in the Amazon Basin, aiming to assess the large-scale balances of energy, water and CO2. The background for this experiment, and its embedding in the global change programmes of IGBP/BAHC and WCRP/GEWEX, is described. A proposal by four European groups aimed at designing the experiment with the help of mesoscale models is described and a possible European input to this experiment is suggested. 24 refs., 1 app.

  17. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale of 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  18. Multi-scale properties of large eddy simulations: correlations between resolved-scale velocity-field increments and subgrid-scale quantities

    Science.gov (United States)

    Linkmann, Moritz; Buzzicotti, Michele; Biferale, Luca

    2018-06-01

    We provide analytical and numerical results concerning multi-scale correlations between the resolved velocity field and the subgrid-scale (SGS) stress tensor in large eddy simulations (LES). Following previous studies for Navier-Stokes equations, we derive the exact hierarchy of LES equations governing the spatio-temporal evolution of velocity structure functions of any order. The aim is to assess the influence of the subgrid model on the inertial range intermittency. We provide a series of predictions, within the multifractal theory, for the scaling of correlations involving the SGS stress and we compare them against numerical results from high-resolution Smagorinsky LES and from a-priori filtered data generated from direct numerical simulations (DNS). We find that LES data generally agree very well with filtered DNS results and with the multifractal prediction for all leading terms in the balance equations. Discrepancies are measured for some of the sub-leading terms involving cross-correlations between resolved velocity increments and the SGS tensor or the SGS energy transfer, suggesting that there must be room to improve the SGS modelling to further extend the inertial range properties for any fixed LES resolution.
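
    For readers unfamiliar with the notation, the sketch below evaluates the basic object of such hierarchies, the p-th order velocity structure function S_p(r) = ⟨(δu(r))^p⟩, on a one-dimensional periodic velocity record; the data are a random stand-in, not simulation output:

```python
import numpy as np

# Sketch: longitudinal velocity increments and structure functions on a
# 1-D periodic record (np.roll implements the wrap-around).
def structure_function(u, r, p):
    du = np.roll(u, -r) - u          # velocity increment at separation r
    return np.mean(du ** p)

u = np.random.randn(4096)            # stand-in for a velocity sample
for r in (1, 4, 16, 64):
    print(r, structure_function(u, r, 2), structure_function(u, r, 3))
```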

  19. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators face a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes placed in big rooms, managers can easily confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? The article introduces ELFms in the large-scale cluster system and, furthermore, proposes a way to realize automatic management of large-scale cluster systems. (authors)

  20. An efficient implementation of 3D high-resolution imaging for large-scale seismic data with GPU/CPU heterogeneous parallel computing

    Science.gov (United States)

    Xu, Jincheng; Liu, Wei; Wang, Jin; Liu, Linong; Zhang, Jianfeng

    2018-02-01

    De-absorption pre-stack time migration (QPSTM) compensates for the absorption and dispersion of seismic waves by introducing an effective Q parameter, thereby making it an effective tool for 3D, high-resolution imaging of seismic data. Although the optimal aperture obtained via stationary-phase migration reduces the computational cost of 3D QPSTM and yields 3D stationary-phase QPSTM, the associated computational efficiency is still the main problem in the processing of 3D, high-resolution images for real large-scale seismic data. In the current paper, we proposed a division method for large-scale, 3D seismic data to optimize the performance of stationary-phase QPSTM on clusters of graphics processing units (GPU). Then, we designed an imaging point parallel strategy to achieve an optimal parallel computing performance. Afterward, we adopted an asynchronous double buffering scheme for multi-stream to perform the GPU/CPU parallel computing. Moreover, several key optimization strategies of computation and storage based on the compute unified device architecture (CUDA) were adopted to accelerate the 3D stationary-phase QPSTM algorithm. Compared with the initial GPU code, the implementation of the key optimization steps, including thread optimization, shared memory optimization, register optimization and special function units (SFU), greatly improved the efficiency. A numerical example employing real large-scale, 3D seismic data showed that our scheme is nearly 80 times faster than the CPU-QPSTM algorithm. Our GPU/CPU heterogeneous parallel computing framework significantly reduces the computational cost and facilitates 3D high-resolution imaging for large-scale seismic data.

  1. Rotation invariant fast features for large-scale recognition

    Science.gov (United States)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF [1] while producing large-scale retrieval results that are comparable to SIFT [2]. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  2. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1126 deaths and displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. However, adaptive behaviour towards flood risk reduction and the interaction between the government, insurers, and individuals has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed, including agent representatives for the administrative stakeholders of European Member States, insurer and reinsurer markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach, this study is a first contribution to overcome the limitations of traditional large-scale flood risk models in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.
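
    A toy sketch of the household-level decision loop such an Agent-Based Model iterates, with expected damage, measure cost and an insurer premium discount as the coupling channel; every number and rule here is an illustrative assumption, not the study's calibrated model:

```python
import random

# Sketch: each household agent invests in a protective measure when the
# expected benefit (avoided damage plus insurer discount) exceeds its cost.
class Household:
    def __init__(self, damage_if_flooded, p_flood):
        self.damage, self.p = damage_if_flooded, p_flood
        self.protected = False

    def decide(self, measure_cost, damage_reduction, premium_discount):
        expected_benefit = self.p * self.damage * damage_reduction + premium_discount
        if not self.protected and expected_benefit > measure_cost:
            self.protected = True

agents = [Household(random.uniform(1e4, 1e5), 0.01) for _ in range(1000)]
for a in agents:
    a.decide(measure_cost=500.0, damage_reduction=0.4, premium_discount=150.0)
print(sum(a.protected for a in agents), "of 1000 households invest in protection")
```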

  3. The energy stakes - After Fukushima. 2. ed.

    International Nuclear Information System (INIS)

    Iacona, Estelle; Taine, Jean; Tamain, Bernard

    2012-01-01

    The energy question today is worldwide and depends on major geopolitical stakes (demography, development, water, health, environment, research, risks). Energy must be universally produced and distributed while minimizing pollution, nuclear risks and CO2 emissions. This new edition of 'The Energy Stakes' is fully updated and addresses some of the main questions that any responsible citizen should ask. It comprises three parts dealing with: a comprehensive review of the energy question in most countries of the world, the constraints and challenges to be taken up to manage energy in an optimal way, and a prospective study of the control of energy consumption and of the existing technical solutions. (J.S)

  4. Magma viscosity estimation based on analysis of erupted products. Potential assessment for large-scale pyroclastic eruptions

    International Nuclear Information System (INIS)

    Takeuchi, Shingo

    2010-01-01

    After the formulation of guidelines for volcanic hazards in site evaluation for nuclear installations (e.g. JEAG4625-2009), it is necessary to establish appropriate methods to assess the potential for large-scale pyroclastic eruptions at long-dormant volcanoes, one of the most hazardous volcanic phenomena for the safety of the installations. In considering volcanic dormancy, magma eruptability is an important concept. Magma eruptability is dominantly controlled by magma viscosity, which can be estimated from petrological analysis of erupted materials. Therefore, viscosity estimates for magmas erupted in past eruptions should provide important information for assessing future activity at hazardous volcanoes. In order to show the importance of magma viscosity in the concept of magma eruptability, this report overviews dike propagation processes from a magma chamber and the nature of magma viscosity. Magma viscosities at the pre-eruptive conditions of magma chambers were compiled based on previous petrological studies of past eruptions in Japan. There are only 16 examples of eruptions at 9 volcanoes satisfying the data requirements for magma viscosity estimation. Estimated magma viscosities range from 10² to 10⁷ Pa·s for basaltic to rhyolitic magmas. Most examples fall below the dike propagation limit of magma viscosity (ca. 10⁶ Pa·s) estimated from a dike propagation model. Magmas more viscous (ca. 10⁷ Pa·s) than the dike propagation limit are considered to lose eruptability, the ability to form dikes and initiate eruptions. However, in some cases, small precursory eruptions of less viscous magmas commonly occurred just before climactic eruptions of the highly viscous magmas, suggesting that the precursory dike propagation by the less viscous magmas induced the following eruptions of the highly viscous magmas (ca. 10⁷ Pa·s). (author)

  5. Nuclear-pumped lasers for large-scale applications

    International Nuclear Information System (INIS)

    Anderson, R.E.; Leonard, E.M.; Shea, R.F.; Berggren, R.R.

    1989-05-01

    Efficient initiation of large-volume chemical lasers may be achieved by neutron induced reactions which produce charged particles in the final state. When a burst mode nuclear reactor is used as the neutron source, both a sufficiently intense neutron flux and a sufficiently short initiation pulse may be possible. Proof-of-principle experiments are planned to demonstrate lasing in a direct nuclear-pumped large-volume system; to study the effects of various neutron absorbing materials on laser performance; to study the effects of long initiation pulse lengths; to demonstrate the performance of large-scale optics and the beam quality that may be obtained; and to assess the performance of alternative designs of burst systems that increase the neutron output and burst repetition rate. 21 refs., 8 figs., 5 tabs

  6. The assessment of the readiness of five countries to implement child maltreatment prevention programs on a large scale.

    Science.gov (United States)

    Mikton, Christopher; Power, Mick; Raleva, Marija; Makoae, Mokhantso; Al Eissa, Majid; Cheah, Irene; Cardia, Nancy; Choo, Claire; Almuneef, Maha

    2013-12-01

    This study aimed to systematically assess the readiness of five countries - Brazil, the Former Yugoslav Republic of Macedonia, Malaysia, Saudi Arabia, and South Africa - to implement evidence-based child maltreatment prevention programs on a large scale. To this end, it applied a recently developed method called Readiness Assessment for the Prevention of Child Maltreatment, based on two parallel 100-item instruments. The first measures the knowledge, attitudes, and beliefs of key informants concerning child maltreatment prevention; the second, completed by child maltreatment prevention experts using all available data in the country, produces a more objective assessment of readiness. The instruments cover all of the main aspects of readiness including, for instance, availability of scientific data on the problem, legislation and policies, will to address the problem, and material resources. Key informant scores ranged from 31.2/100 (Brazil) to 45.8/100 (the Former Yugoslav Republic of Macedonia), and expert scores from 35.2/100 (Brazil) to 56/100 (Malaysia). Major gaps identified in almost all countries included a lack of professionals with the skills, knowledge, and expertise to implement evidence-based child maltreatment programs and of institutions to train them; inadequate funding, infrastructure, and equipment; the extreme rarity of outcome evaluations of prevention programs; and the lack of national prevalence surveys of child maltreatment. In sum, the five countries are in a low to moderate state of readiness to implement evidence-based child maltreatment prevention programs on a large scale. Such an assessment of readiness - the first of its kind - allows gaps to be identified and then addressed to increase the likelihood of program success. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Local-scale high-resolution atmospheric dispersion model using large-eddy simulation. LOHDIM-LES

    International Nuclear Information System (INIS)

    Nakayama, Hiromasa; Nagai, Haruyasu

    2016-03-01

    We developed the LOcal-scale High-resolution atmospheric DIspersion Model using Large-Eddy Simulation (LOHDIM-LES). This dispersion model is based on LES, which is effective at reproducing the unsteady behaviour of turbulent flows and plume dispersion. The basic equations are the continuity equation, the Navier-Stokes equations, and the scalar conservation equation. Buildings and local terrain variability are resolved by high-resolution grids of a few meters, and their turbulence effects are represented by an immersed boundary method. In simulating atmospheric turbulence, boundary-layer flows are generated by a recycling turbulent inflow technique in a driver region set up upstream of the main analysis region. These turbulent inflow data are imposed at the inlet of the main analysis region. By this approach, the LOHDIM-LES can provide detailed information on wind velocities and plume concentration in the investigated area. (author)
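
    The abstract names the governing equations without writing them out. In a standard filtered, incompressible LES formulation (a generic rendering, not necessarily the exact closure used in LOHDIM-LES) they read

$$
\frac{\partial \bar{u}_i}{\partial x_i} = 0, \qquad
\frac{\partial \bar{u}_i}{\partial t} + \frac{\partial \bar{u}_i \bar{u}_j}{\partial x_j}
= -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
+ \nu \frac{\partial^2 \bar{u}_i}{\partial x_j \partial x_j}
- \frac{\partial \tau_{ij}}{\partial x_j}, \qquad
\frac{\partial \bar{c}}{\partial t} + \frac{\partial \bar{u}_j \bar{c}}{\partial x_j}
= \frac{\partial}{\partial x_j}\!\left(D \frac{\partial \bar{c}}{\partial x_j}\right)
- \frac{\partial q_j}{\partial x_j},
$$

    where overbars denote grid filtering, $\tau_{ij}$ and $q_j$ are the subgrid-scale stress and scalar flux (closed by a subgrid-scale model such as Smagorinsky's), and $\bar{c}$ is the plume concentration.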

  8. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
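
    The optimisation step described here, choosing how many measurements to take in each house category for a target precision, can be sketched with Neyman allocation, one standard answer to exactly this problem. The strata, variances, and precision target below are invented for illustration; a pilot project would supply the real values.

```python
import math

# Pilot-project estimates per house category (stratum): size N_h and
# dose-rate standard deviation s_h. All values are illustrative only.
strata = {
    "wooden, no basement": {"N": 40_000, "s": 18.0},   # s in nGy/h
    "wooden, basement":    {"N": 25_000, "s": 25.0},
    "concrete/brick":      {"N": 35_000, "s": 12.0},
}
target_se = 1.0   # desired standard error of the regional mean (nGy/h)

N = sum(st["N"] for st in strata.values())
total_Ns = sum(st["N"] * st["s"] for st in strata.values())

# Under Neyman allocation (finite-population correction ignored),
# Var(mean) = (sum_h N_h s_h)^2 / (N^2 n), so:
n = (total_Ns / N) ** 2 / target_se ** 2
print(f"total measurements needed: {math.ceil(n)}")
for name, st in strata.items():
    n_h = n * st["N"] * st["s"] / total_Ns   # proportional to N_h * s_h
    print(f"  {name}: {math.ceil(n_h)}")
```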

  9. The effect of $1, $5 and $10 stakes in an online dictator game.

    Science.gov (United States)

    Raihani, Nichola J; Mace, Ruth; Lamba, Shakti

    2013-01-01

    The decision rules underpinning human cooperative behaviour are often investigated under laboratory conditions using monetary incentives. A major concern with this approach is that stake size may bias subjects' decisions. This concern is particularly acute in online studies, where stakes are often far lower than those used in laboratory or field settings. We address this concern by conducting a Dictator Game using Amazon Mechanical Turk. In this two-player game, one player (the dictator) determines the division of an endowment between himself and the other player. We recruited subjects from India and the USA to play an online Dictator Game. Dictators received endowments of $1, $5 or $10. We collected two batches of data over two consecutive years. We found that players from India were less generous when playing with a $10 stake. By contrast, the effect of stake size among players from the USA was very small. This study indicates that the effects of stake size on decision making in economic games may vary across populations.

  10. Predators on private land: broad-scale socioeconomic interactions influence large predator management

    Directory of Open Access Journals (Sweden)

    Hayley S. Clements

    2016-06-01

    Full Text Available The proliferation of private land conservation areas (PLCAs) is placing increasing pressure on conservation authorities to effectively regulate their ecological management. Many PLCAs depend on tourism for income, and charismatic large mammal species are considered important for attracting international visitors. Broad-scale socioeconomic factors therefore have the potential to drive fine-scale ecological management, creating a systemic scale mismatch that can reduce long-term sustainability in cases where economic and conservation objectives are not perfectly aligned. We assessed the socioeconomic drivers and outcomes of large predator management on 71 PLCAs in South Africa. Owners of PLCAs that are stocking free-roaming large predators identified revenue generation as influencing most or all of their management decisions, and rated profit generation as a more important objective than did the owners of PLCAs that did not stock large predators. Ecotourism revenue increased with increasing lion (Panthera leo) density, which created a potential economic incentive for stocking lion at high densities. Despite this potential mismatch between economic and ecological objectives, lion densities were sustainable relative to available prey. Regional-scale policy guidelines for free-roaming lion management were ecologically sound. By contrast, policy guidelines underestimated the area required to sustain cheetah (Acinonyx jubatus), which occurred at unsustainable densities relative to available prey. Evidence of predator overstocking included predator diet supplementation and frequent reintroduction of game. We conclude that effective facilitation of conservation on private land requires consideration of the strong and not necessarily beneficial multiscale socioeconomic factors that influence private land management.

  11. How much is our fairness worth? The effect of raising stakes on offers by Proposers and minimum acceptable offers in Dictator and Ultimatum Games.

    Directory of Open Access Journals (Sweden)

    Julie Novakova

    Full Text Available BACKGROUND: The aim of this study was to determine whether people respond differently to low and high stakes in Dictator and Ultimatum Games. We assumed that if we raised the stakes high enough, we would observe more self-orientated behavior, because fairness would become too costly in spite of the possible risk of a higher punishment. METHODS: A questionnaire was completed by a sample of 524 university students of biology. A mixed linear model was used to test the relation between the amount at stake (CZK 20, 200, 2,000, 20,000 and 200,000, i.e., approximately $1-$10,000) and the shares, as well as the subjects' gender and the design of the study (single vs. multiple games for different amounts). RESULTS: We discovered a significant relationship between the amount at stake and the minimum acceptable offer in the Ultimatum Game and the proposed shares in both Ultimatum and Dictator Games (p = 0.001, p < 0.001, p = 0.0034). Whether subjects played a single game or several games with different amounts at stake did not influence the relation between the stakes and the offered and minimum acceptable shares. Women proved significantly more generous than men in their offers in the Dictator Game (p = 0.007). CONCLUSION: Our results suggest that people's behavior in the Dictator and Ultimatum Games depends on the amount at stake. As stakes rose, the players tended to lower their relative proposed shares, as well as their relative minimum acceptable offers. We propose that the Responders' sense of equity and fair play depends on the stakes because of the costs of maintaining fairness. However, our results also suggest that the price of fairness is very high and that it is very difficult, probably even impossible, to buy the transition of Homo sociologicus into Homo economicus.
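
    The mixed linear model the authors describe can be sketched with statsmodels. The column names and exact model formula below are assumptions for illustration, not the authors' code; the essential ingredient is a random intercept per subject to account for repeated decisions across stake levels.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per decision; column names are assumed, not the authors':
#   subject_id, stake (CZK), offered_share (= offer / stake),
#   gender, design ("single" or "multiple" games per subject)
df = pd.read_csv("ultimatum_dictator.csv")
df["log_stake"] = np.log10(df["stake"])   # stakes span 4 orders of magnitude

model = smf.mixedlm(
    "offered_share ~ log_stake + gender + design",
    data=df,
    groups=df["subject_id"],   # random intercept per subject
)
result = model.fit()
print(result.summary())
```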

  12. Assessing the sustainable construction of large construction companies in Malaysia

    Science.gov (United States)

    Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nasrun, Mohd Nawi Mohd

    2016-08-01

    Given the increasing concern over sustainability issues in construction project delivery, this paper assesses the extent of sustainable construction among Malaysian large contractors, in order to ascertain the level of the industry's impacts on both the environment and society. Sustainable construction refers to the construction industry's responsibility to use finite resources efficiently while also reducing construction impacts on both humans and the environment throughout the phases of construction. This study used proportionate stratified random sampling and obtained a sample of 172 responding contractors out of 708 administered questionnaires. Data were collected from large contractors in the eleven states of peninsular Malaysia. Using a five-level rating scale (1 = Very Low; 2 = Low; 3 = Moderate; 4 = High; 5 = Very High) to describe the level of sustainable construction of Malaysian contractors, based on previous studies, statistical analysis reveals that the environmental, social and economic sustainability of Malaysian large contractors is high.

  13. Can mixed assessment methods make biology classes more equitable?

    Directory of Open Access Journals (Sweden)

    Sehoya Cotner

    Full Text Available Many factors have been proposed to explain the attrition of women in science, technology, engineering and math fields, among them the lower performance of women in introductory courses resulting from deficits in incoming preparation. We focus on the impact of mixed methods of assessment, which minimizes the impact of high-stakes exams and rewards other methods of assessment such as group participation, low-stakes quizzes and assignments, and in-class activities. We hypothesized that these mixed methods would benefit individuals who otherwise underperform on high-stakes tests. Here, we analyze gender-based performance trends in nine large (N > 1000 students) introductory biology courses in fall 2016. Females underperformed on exams compared to their male counterparts, a difference that does not exist with the other methods of assessment that compose the course grade. Further, we analyzed three case studies of courses that transitioned their grading schemes to either de-emphasize or emphasize exams as a proportion of total course grade. We demonstrate that the shift away from an exam emphasis benefits female students, thereby closing gaps in overall performance. Further, the exam performance gap itself is reduced when the exams contribute less to the overall course grade. We discuss testable predictions that follow from our hypothesis, and advocate for the use of mixed methods of assessment (possibly as part of an overall shift to active learning techniques). We conclude by challenging the student deficit model, and suggest a course deficit model as explanatory of these performance gaps, whereby the microclimate of the classroom can either raise or lower barriers to success for underrepresented groups in STEM.

  14. Methods for large-scale international studies on ICT in education

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.; Voogt, Joke; Knezek, G.A.

    2008-01-01

    International comparative assessment is a research method for describing and analyzing educational processes and outcomes. Such assessments are used to 'describe the status quo' in educational systems from an international comparative perspective. This chapter reviews different large-scale international

  15. High-Stakes Testing in the Warm Heart of Africa:The Challenges and Successes of the Malawi National Examinations Board

    Directory of Open Access Journals (Sweden)

    Elias Chakwera

    2004-06-01

    Full Text Available In the United States, tests are held to high standards of quality. In developing countries such as Malawi, psychometricians must deal with these same high standards as well as several additional pressures such as widespread cheating, test administration difficulties due to challenging landscapes and poor resources, difficulties in reliably scoring performance assessments, and extreme scrutiny from political parties and the popular press. The purposes of this paper are to (a) familiarize the measurement community in the US with Malawi's assessment programs, (b) discuss some of the unique challenges inherent in such a program, (c) compare testing conditions and test administration formats between Malawi and the US, and (d) provide suggestions for improving large-scale testing in countries such as the US and Malawi. By learning how a small country instituted and supports its current testing programs, a broader perspective on resolving current measurement problems throughout the world will emerge.

  16. A Statistical Model for Hourly Large-Scale Wind and Photovoltaic Generation in New Locations

    DEFF Research Database (Denmark)

    Ekstrom, Jussi; Koivisto, Matti Juhani; Mellin, Ilkka

    2017-01-01

    The analysis of large-scale wind and photovoltaic (PV) energy generation is of vital importance in power systems where their penetration is high. This paper presents a modular methodology to assess the power generation and volatility of a system consisting of both PV plants (PVPs) and wind power...... of new PVPs and WPPs in system planning. The model is verified against hourly measured wind speed and solar irradiance data from Finland. A case study assessing the impact of the geographical distribution of the PVPs and WPPs on aggregate power generation and its variability is presented....

  17. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    Science.gov (United States)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10-year objective of establishing the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5-year program were derived from a set of 5-year objectives deduced from the 10-year objective. The report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 x 10^8 peak W/year of $0.50 cells was projected for the year 1985. The development of other photovoltaic conversion systems was assigned to longer-range development roles. The status of the technology developments and the applicability of solar arrays to particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5-year phase of the program is $268.5M.

  18. COP21: defense stakes

    International Nuclear Information System (INIS)

    Coldefy, Alain; Hulot, Nicolas; Aichi, Leila; Tertrais, Bruno; Paillard, Christophe-Alexandre; Piodi, Jerome; Regnier, Serge; Volpi, Jean-Luc; Descleves, Emmanuel; Garcin, Thierry; Granholm, Niklas; Wedin, Lars; Pouvreau, Ana; Henninger, Laurent

    2015-01-01

    The 21st Conference of the Parties (COP21) to the UN Framework Convention took place in Paris between November 30 and December 11, 2015. The challenge was to reach a universal agreement on fighting global warming and controlling the carbon footprint of human activities. This topic is at the core of the Defense Ministry's preoccupations. This special dossier takes stock of the defense issues linked with global warming. The dossier comprises 13 papers dealing with: 1 - COP21: defense stakes (Coldefy, A.); 2 - Warfare climate, a chance for peace (Hulot, N.); 3 - COP21 and defense (Aichi, L.); 4 - A war climate? (Tertrais, B.); 5 - Challenges the World has to face in the 21. century (Paillard, C.A.); 6 - Desertification: a time bomb in the heart of Sahel (Piodi, J.); 7 - The infrastructure department of defense in the fight against climate disturbance (Regnier, S.); 8 - Fight against global warming, a chance for the forces? (Volpi, J.L.); 9 - Sea and sustainable development (Descleves, E.); 10 - Rationales of Arctic's surrounding powers (Garcin, T.); 11 - Arctic: strategic stake (Granholm, N.; Wedin, L.); 12 - Strategic impact of Turkey's new energy choices (Pouvreau, A.); 13 - Climate and war: a brief historical outlook (Henninger, L.)

  19. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  20. Pharmacy students' test-taking motivation-effort on a low-stakes standardized test.

    Science.gov (United States)

    Waskiewicz, Rhonda A

    2011-04-11

    To measure third-year pharmacy students' level of motivation while completing the Pharmacy Curriculum Outcomes Assessment (PCOA) administered as a low-stakes test, in order to better understand the use of the PCOA as a measure of student content knowledge. Student motivation was manipulated through an incentive (i.e., a personal letter from the dean) and through a process of statistical motivation filtering. Data were analyzed to determine any differences between the experimental and control groups in PCOA test performance, motivation to perform well, and test performance after filtering for low motivation-effort. Incentivizing students diminished the need for filtering PCOA scores for low effort. Where filtering was used, performance scores improved, providing a more realistic measure of aggregate student performance. To ensure that PCOA scores are an accurate reflection of student knowledge, incentivizing and/or filtering for low motivation-effort among pharmacy students should be considered fundamental best practice when the PCOA is administered as a low-stakes test.
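
    Motivation filtering is operationally simple: drop examinees whose self-reported effort falls below a cutoff before aggregating scores. A minimal sketch follows; the effort scale, cutoff, and column names are illustrative assumptions rather than the study's actual procedure.

```python
import pandas as pd

# Assumed columns: pcoa_score (scaled score) and effort, a self-reported
# test-taking-effort score (e.g., 10-50 on an SOS-style effort subscale).
df = pd.read_csv("pcoa_results.csv")

EFFORT_CUTOFF = 30   # illustrative threshold, not a published norm

raw_mean = df["pcoa_score"].mean()
filtered = df[df["effort"] >= EFFORT_CUTOFF]

print(f"unfiltered mean: {raw_mean:.1f}  (n={len(df)})")
print(f"filtered mean:   {filtered['pcoa_score'].mean():.1f}  (n={len(filtered)})")
# The filtered mean is typically higher, giving a less distorted picture
# of what students know when the test carries no personal stakes.
```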

  1. Large-scale hydrological simulations using the soil water assessment tool, protocol development, and application in the danube basin.

    Science.gov (United States)

    Pagliero, Liliana; Bouraoui, Fayçal; Willems, Patrick; Diels, Jan

    2014-01-01

    The Water Framework Directive of the European Union requires member states to achieve good ecological status of all water bodies. A harmonized pan-European assessment of water resources availability and quality, as affected by various management options, is necessary for a successful implementation of European environmental legislation. In this context, we developed a methodology to predict surface water flow at the pan-European scale using available datasets. Among the hydrological models available, the Soil Water Assessment Tool was selected because its characteristics make it suitable for large-scale applications with limited data requirements. This paper presents the results for the Danube pilot basin. The Danube Basin is one of the largest European watersheds, covering approximately 803,000 km² and portions of 14 countries. The modeling data used included land use and management information, a detailed soil parameters map, and high-resolution climate data. The Danube Basin was divided into 4663 subwatersheds with an average size of 179 km². A modeling protocol is proposed to cope with the problems of hydrological regionalization from gauged to ungauged watersheds and of overparameterization and identifiability, which are usually present during calibration. The protocol involves a cluster analysis for the determination of hydrological regions and multiobjective calibration using a combination of manual and automated calibration. The proposed protocol was successfully implemented, with the modeled discharges capturing well the overall hydrological behavior of the basin. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
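
    The protocol's first step, grouping subwatersheds into hydrological regions so that calibrated parameters can be transferred from gauged to ungauged catchments, can be sketched with a k-means cluster analysis. The attribute names and the number of clusters below are assumptions for illustration; the abstract does not specify them.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per subwatershed, with physiographic/climatic attributes
# (assumed names) derived from the soil, land-use, and climate inputs.
df = pd.read_csv("danube_subwatersheds.csv")
features = ["mean_elevation", "mean_slope", "annual_precip",
            "forest_fraction", "soil_awc"]

X = StandardScaler().fit_transform(df[features])  # equal weight per attribute
df["region"] = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

# Parameters calibrated at gauged subwatersheds can then be transferred
# to ungauged subwatersheds belonging to the same hydrological region.
print(df.groupby("region").size())
```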

  2. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving the large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparing gridded precipitation products against each other, alongside ground observations, provides another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithm (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria at various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA shows appealing spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
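
    The gauge-versus-grid comparison underlying such a study reduces to a handful of standard verification statistics. A sketch of computing them per product is below; the file and column names are assumed for illustration.

```python
import numpy as np
import pandas as pd

def verify(obs: np.ndarray, est: np.ndarray) -> dict:
    """Standard verification statistics for gridded estimates vs. gauges."""
    err = est - obs
    return {
        "bias": err.mean(),                   # systematic over/underestimate
        "rmse": np.sqrt((err ** 2).mean()),   # typical error magnitude
        "corr": np.corrcoef(obs, est)[0, 1],  # pattern agreement
    }

# Rows are (station, period) match-ups; per-product columns are assumed names.
df = pd.read_csv("precip_matchups.csv")
for product in ["cangrd", "ncep", "watch", "anusplin", "capa"]:
    stats = verify(df["gauge"].to_numpy(), df[product].to_numpy())
    print(product, {k: round(v, 2) for k, v in stats.items()})
```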

  3. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the 1970s and 1980s a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinklers with a total area of 6400 ha were installed in the Poznan province; the average size of a sprinkler reached 95 ha. In 1989 there were 98 sprinklers covering more than 10,130 ha. The study was conducted in 1986-1998 on 7 large sprinklers with areas ranging from 230 to 520 hectares. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinklers suffered significant or total devastation. Land of the State Farms held by the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This was accompanied by changes in crop structure and demand structure and by an increase in operating costs; electricity prices also tripled. In practice, the operation of large-scale irrigation encountered barriers and system limitations of all kinds, including supply difficulties and high levels of equipment failure that discouraged rational use of the available sprinklers. A survey of the local area showed the current status of the remaining irrigation infrastructure. The adopted scheme for restructuring Polish agriculture was not the best solution, causing massive destruction of assets previously invested in the sprinkler systems.

  4. Technical Design Report for large-scale neutrino detectors prototyping and phased performance assessment in view of a long-baseline oscillation experiment

    CERN Document Server

    De Bonis, I.; Duchesneau, D.; Pessard, H.; Bordoni, S.; Ieva, M.; Lux, T.; Sanchez, F.; Jipa, A.; Lazanu, I.; Calin, M.; Esanu, T.; Ristea, O.; Ristea, C.; Nita, L.; Efthymiopoulos, I.; Nessi, M.; Asfandiyarov, R.; Blondel, A.; Bravar, A.; Cadoux, F.; Haesler, A.; Karadzhov, Y.; Korzenev, A.; Martin, C.; Noah, E.; Ravonel, M.; Rayner, M.; Scantamburlo, E.; Bayes, R.; Soler, F.J.P.; Nuijten, G.A.; Loo, K.; Maalampi, J.; Slupecki, M.; Trzaska, W.H.; Campanelli, M.; Blebea-Apostu, A.M.; Chesneanu, D.; Gomoiu, M.C; Mitrica, B.; Margineanu, R.M.; Stanca, D.L.; Colino, N.; Gil-Botella, I.; Novella, P.; Palomares, C.; Santorelli, R.; Verdugo, A.; Karpikov, I.; Khotjantsev, A.; Kudenko, Y.; Mefodiev, A.; Mineev, O.; Ovsiannikova, T.; Yershov, N.; Enqvist, T.; Kuusiniemi, P.; De La Taille, C.; Dulucq, F.; Martin-Chassard, G.; Andrieu, B.; Dumarchez, J.; Giganti, C.; Levy, J.-M.; Popov, B.; Robert, A.; Agostino, L.; Buizza-Avanzini, M.; Dawson, J.; Franco, D.; Gorodetzky, P.; Kryn, D.; Patzak, T.; Tonazzo, A.; Vannucci, F.; Bésida, O.; Bolognesi, S.; Delbart, A.; Emery, S.; Galymov, V.; Mazzucato, E.; Vasseur, G.; Zito, M.; Bogomilov, M.; Tsenov, R.; Vankova-Kirilova, G.; Friend, M.; Hasegawa, T.; Nakadaira, T.; Sakashita, K.; Zambelli, L.; Autiero, D.; Caiulo, D.; Chaussard, L.; Déclais, Y.; Franco, D.; Marteau, J.; Pennacchio, E.; Bay, F.; Cantini, C.; Crivelli, P.; Epprecht, L.; Gendotti, A.; Di Luise, S.; Horikawa, S.; Murphy, S.; Nikolics, K.; Periale, L.; Regenfus, C.; Rubbia, A.; Sgalaberna, D.; Viant, T.; Wu, S.; Sergiampietri, F.; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2014-01-01

    In June 2012, an Expression of Interest for a long-baseline experiment (LBNO, CERN-SPSC-EOI-007) was submitted to the CERN SPSC and is presently under review. LBNO considers three types of neutrino detector technologies: a double-phase liquid argon (LAr) TPC and a magnetised iron detector as far detectors. For the near detector, a high-pressure gas TPC embedded in a calorimeter and a magnet is the baseline design. A mandatory milestone in view of any future long-baseline experiment is a concrete prototyping effort towards the envisioned large-scale detectors, and an accompanying campaign of measurements aimed at assessing the systematic errors that will affect their intended physics programme. Following encouraging feedback from the 108th SPSC on the technology choices, we have defined as a priority the construction and operation of a $6\times 6\times 6$m$^3$ (active volume) double-phase liquid argon (DLAr) demonstrator, and a parallel development of the technologies necessary for large magnetised MIN...

  5. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties, such as star formation rate, color, and gas fraction. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). One-halo conformity can be readily explained by mutual interactions among galaxies within a halo. However, recent observations have further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though they do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity: they have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and now happen to reside within a ~5 Mpc sphere. Second, galaxies in the strong tidal fields induced by large-scale structure also seem to give rise to the large-scale conformity, as the strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  6. Understanding Business Interests in International Large-Scale Student Assessments: A Media Analysis of "The Economist," "Financial Times," and "Wall Street Journal"

    Science.gov (United States)

    Steiner-Khamsi, Gita; Appleton, Margaret; Vellani, Shezleen

    2018-01-01

    The media analysis is situated in the larger body of studies that explore the varied reasons why different policy actors advocate for international large-scale student assessments (ILSAs) and adds to the research on the fast advance of the global education industry. The analysis of "The Economist," "Financial Times," and…

  7. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF6, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
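
    A common way to write the conditioned quantity (our notation; the paper's conventions may differ) is

$$
D_{LL}(r \,|\, U) = \left\langle \left[ u_L(\mathbf{x}+\mathbf{r}) - u_L(\mathbf{x}) \right]^2 \,\middle|\, U \right\rangle,
$$

    where $u_L$ is the velocity component along the separation vector $\mathbf{r}$ and $U$ is the instantaneous large-scale velocity. Binning $D_{LL}$ by $U$ at fixed $r$, as done across the nine flows, exposes any large-scale dependence, which would be absent under strict small-scale universality.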

  8. Comparison of Multi-Scale Digital Elevation Models for Defining Waterways and Catchments Over Large Areas

    Science.gov (United States)

    Harris, B.; McDougall, K.; Barry, M.

    2012-07-01

    Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single-catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for the analysis of larger data sets and also provide a consistent tool for the creation and analysis of waterways over extensive areas. However, such analyses are rarely developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the Wivenhoe catchment, 543 km2, and a detailed 13 km2 area within it), including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results indicate that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
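
    Waterway delineation from a DEM typically starts with a flow-direction pass. A compact D8 sketch (steepest-descent choice among the eight neighbours) on a numpy grid is shown below; it is purely illustrative of the kind of processing whose sensitivity to DEM scale the paper examines.

```python
import numpy as np

def d8_flow_directions(dem: np.ndarray) -> np.ndarray:
    """Return, per interior cell, the index 0-7 of the steepest downslope
    neighbour (or -1 for pits/flats), following the classic D8 scheme."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    dist = np.array([np.hypot(di, dj) for di, dj in offsets])
    rows, cols = dem.shape
    direction = np.full((rows, cols), -1, dtype=int)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            # Elevation drop per unit distance towards each neighbour.
            drops = np.array([dem[i, j] - dem[i + di, j + dj]
                              for di, dj in offsets]) / dist
            if drops.max() > 0:
                direction[i, j] = int(drops.argmax())
    return direction

dem = np.array([[5., 5., 5., 5.],
                [5., 4., 3., 5.],
                [5., 3., 2., 5.],
                [5., 5., 1., 5.]])
print(d8_flow_directions(dem))
```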

  9. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    Science.gov (United States)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
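
    The core of SSEBop is a per-pixel ET fraction scaled between hot and cold reference temperatures, multiplied by reference ET. The numpy rendering below is schematic: the reference-temperature inputs are assumed, and the operational implementation includes details omitted here.

```python
import numpy as np

def ssebop_eta(ts, tc, dt, eto, k=1.0):
    """Schematic SSEBop: actual ET from land-surface temperature.
    ts  - observed land-surface temperature (K), e.g. Landsat thermal band
    tc  - cold/wet reference temperature (K) for the scene or pixel
    dt  - predefined hot-minus-cold temperature difference (K)
    eto - reference ET (mm) from gridded weather data
    k   - coefficient relating reference ET to a well-watered surface
    """
    etf = (tc + dt - ts) / dt          # 1 at the cold edge, 0 at the hot edge
    etf = np.clip(etf, 0.0, 1.05)      # tolerate slight overshoot
    return etf * k * eto

ts = np.array([300.0, 310.0, 318.0])   # cool/wet -> hot/dry pixels
print(ssebop_eta(ts, tc=298.0, dt=20.0, eto=6.0))   # mm per pixel
```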

  10. Chirping for large-scale maritime archaeological survey

    DEFF Research Database (Denmark)

    Grøn, Ole; Boldreel, Lars Ole

    2014-01-01

    Archaeological wrecks exposed on the sea floor are mapped using side-scan and multibeam techniques, whereas the detection of submerged archaeological sites, such as Stone Age settlements, and wrecks, partially or wholly embedded in sea-floor sediments, requires the application of high-resolution ...... the present state of this technology, it appears well suited to large-scale maritime archaeological mapping....

  11. Deterministic patterned growth of high-mobility large-crystal graphene: a path towards wafer scale integration

    Science.gov (United States)

    Miseikis, Vaidotas; Bianco, Federica; David, Jérémy; Gemmi, Mauro; Pellegrini, Vittorio; Romagnoli, Marco; Coletti, Camilla

    2017-06-01

    We demonstrate rapid deterministic (seeded) growth of large single-crystals of graphene by chemical vapour deposition (CVD) utilising pre-patterned copper substrates with chromium nucleation sites. Arrays of graphene single-crystals as large as several hundred microns are grown with a periodicity of up to 1 mm. The graphene is transferred to target substrates using aligned and contamination-free semi-dry transfer. The high quality of the synthesised graphene is confirmed by Raman spectroscopy and transport measurements, demonstrating room-temperature carrier mobility of 21,000 cm^2 V^-1 s^-1 when transferred on top of hexagonal boron nitride. By tailoring the nucleation of large single-crystals according to the desired device geometry, it will be possible to produce complex device architectures based on single-crystal graphene, thus paving the way to the adoption of CVD graphene in wafer-scale fabrication.

  12. The Rights and Responsibility of Test Takers When Large-Scale Testing Is Used for Classroom Assessment

    Science.gov (United States)

    van Barneveld, Christina; Brinson, Karieann

    2017-01-01

    The purpose of this research was to identify conflicts in the rights and responsibility of Grade 9 test takers when some parts of a large-scale test are marked by teachers and used in the calculation of students' class marks. Data from teachers' questionnaires and students' questionnaires from a 2009-10 administration of a large-scale test of…

  14. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  15. The impact of language and high-stakes testing policies on elementary school English language learners in Arizona.

    Directory of Open Access Journals (Sweden)

    Wayne E. Wright

    2006-05-01

    Full Text Available This article reports the results of a survey of third-grade teachers of English Language Learners (ELLs) in Arizona regarding school language and accountability policies—Proposition 203, which restricts bilingual education and mandates sheltered English immersion; the federal No Child Left Behind Act of 2001 (NCLB); and Arizona LEARNS, the state's high-stakes testing and accountability program. The instrument, consisting of 126 survey questions plus an open-ended interview question, was designed to obtain teachers' views, to ascertain the impact of these policies, and to explore their effectiveness in improving the education of ELL students. The survey was administered via telephone to 40 teacher participants from different urban, rural and reservation schools across the state. Each participant represents the elementary school in their respective school district with the largest population of ELL students. Analyses of both quantitative and qualitative data reveal that these policies have mostly resulted in confusion in schools throughout the state over what is and is not allowed and over what constitutes quality instruction for ELLs, that there is little evidence that such policies have led to improvements in the education of ELL students, and that these policies may be causing more harm than good. Specifically, teachers report they have been given little to no guidance on what constitutes sheltered English immersion, and provide evidence that most ELL students in their schools are receiving mainstream sink-or-swim instruction. In terms of accountability, while the overwhelming majority of teachers support the general principle, they believe that high-stakes tests are inappropriate for ELLs, and participants provided evidence that the focus on testing is leading to instructional practices for ELLs which fail to meet their unique linguistic and academic needs. The article concludes with suggestions for needed changes to improve the quality of

  16. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large-scale cross-hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones, and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous, and this heterogeneity is not smoothed out even over scales of hundreds of meters. The results of the interpretation validate the hypothesis of the major fracture zones A, B and H; little evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie within a range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually; this could be improved by specifically testing the high-hydraulic-conductivity regions. The Piezomac and single-hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response, either because there is no hydraulic connection or because there is a connection but a response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  17. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  18. Experimental Investigation of a Large-Scale Low-Boom Inlet Concept

    Science.gov (United States)

    Hirt, Stefanie M.; Chima, Rodrick V.; Vyas, Manan A.; Wayman, Thomas R.; Conners, Timothy R.; Reger, Robert W.

    2011-01-01

    A large-scale low-boom inlet concept was tested in the NASA Glenn Research Center 8- by 6-foot Supersonic Wind Tunnel. The purpose of this test was to assess inlet performance, stability and operability at various Mach numbers and angles of attack. During this effort, two models were tested: a dual-stream inlet designed to mimic potential aircraft flight hardware integrating a high-flow bypass stream, and a single-stream inlet designed to study a configuration with a zero-degree external cowl angle and to permit surface visualization of the vortex-generator flow on the internal centerbody surface. During the course of the test, the low-boom inlet concept was demonstrated to have high recovery, excellent buzz margin, and high operability. This paper provides an overview of the setup, a brief comparison of the dual-stream and single-stream inlet results, and an examination of the dual-stream inlet characteristics.

  19. High-Efficiency, Multijunction Solar Cells for Large-Scale Solar Electricity Generation

    Science.gov (United States)

    Kurtz, Sarah

    2006-03-01

    A solar cell with an infinite number of materials (matched to the solar spectrum) has a theoretical efficiency limit of 68%. If sunlight is concentrated, this limit increases to about 87%. These theoretical limits are calculated using basic physics and are independent of the details of the materials. In practice, the challenge of achieving high efficiency depends on identifying materials that can effectively use the solar spectrum. Impressive progress has been made, with the current efficiency record being 39%. Today's solar market is also showing impressive progress, but is still hindered by high prices. One strategy for reducing cost is to use lenses or mirrors to focus the light on small solar cells. In this case, the system cost is dominated by the cost of the relatively inexpensive optics. The value of the optics increases with the efficiency of the solar cell. Thus, a concentrator system made with 35%-40%-efficient solar cells is expected to deliver 50% more power at a similar cost when compared with a system using 25%-efficient cells. Today's markets are showing an opportunity for large concentrator systems that didn't exist 5-10 years ago. Efficiencies may soon pass 40% and ultimately may reach 50%, providing a pathway to improved performance and decreased cost. Many companies are currently investigating this technology for large-scale electricity generation. The presentation will cover the basic physics and more practical considerations in achieving high efficiency, as well as describing the current status of the concentrator industry. This work has been authored by an employee of the Midwest Research Institute under Contract No. DE-AC36-99GO10337 with the U.S. Department of Energy. The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, worldwide license to publish or reproduce the published form of this work, or allow

  20. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Science.gov (United States)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large-scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden-to-ore ratio and therefore dramatically improve the economics of the operation, while large-scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system which was developed over a number of years, as well as the advantages it offers, through a specific application. The case study illustrates how the high production slopes of a mine that exceed depths of 100-120 m were successfully mined with an average displacement rate of 10-20 mm/day, approaching a slow-to-moderate landslide velocity. Monitoring data of the past four years are included in the database and can be analyzed to produce valuable results. Time-series correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.
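
    A minimal pandas sketch of the kind of time-series correlation such a database supports is shown below; the file and column names and the lag window are illustrative assumptions.

```python
import pandas as pd

# Daily monitoring records; assumed columns: date, displacement_mm, precip_mm.
df = (pd.read_csv("mine_monitoring.csv", parse_dates=["date"])
        .set_index("date")
        .asfreq("D"))

# Slope movement often responds to rainfall with a delay: correlate the
# daily displacement rate against precipitation lagged by 0-14 days.
rate = df["displacement_mm"].diff()
for lag in range(15):
    r = rate.corr(df["precip_mm"].shift(lag))
    print(f"lag {lag:2d} d: r = {r:+.2f}")
```

    The lag with the strongest correlation hints at how quickly the slope responds to rainfall, which in turn can inform alert thresholds for the monitoring programme.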

  1. Evaluating high risks in large-scale projects using an extended VIKOR method under a fuzzy environment

    Directory of Open Access Journals (Sweden)

    S. Ebrahimnejad

    2012-04-01

    Full Text Available The complexity of large-scale projects has led to numerous risks in their life cycle. This paper presents a new risk evaluation approach for ranking the high risks in large-scale projects and improving the performance of these projects. It is based on fuzzy set theory, an effective tool for handling uncertainty, and on an extended VIKOR method, one of the well-known multiple criteria decision-making (MCDM) methods. The proposed decision-making approach integrates knowledge and experience acquired from professional experts, since they perform the risk identification as well as the subjective judgments of the performance ratings for high risks in terms of conflicting criteria, including probability, impact, quickness of reaction toward risk, event measure quantity and event capability criteria. The most notable difference between the proposed VIKOR method and its traditional version is the use of fuzzy decision-matrix data to calculate the ranking index without the need to ask the experts. Finally, the proposed approach is illustrated with a real case study in an Iranian power plant project, and the associated results are compared with two well-known decision-making methods under a fuzzy environment.
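
    For context, the classical (crisp) VIKOR ranking index can be computed in a few lines; the paper's extension replaces the crisp decision matrix with fuzzy numbers. The risk alternatives, scores, and weights below are invented for illustration.

```python
import numpy as np

# Rows: risk alternatives; columns: criteria (probability, impact, ...).
# Scores are scaled so that higher means a more severe risk; values and
# weights are illustrative only.
X = np.array([[0.7, 0.9, 0.4, 0.6, 0.5],
              [0.5, 0.6, 0.8, 0.3, 0.7],
              [0.9, 0.4, 0.5, 0.8, 0.6]])
w = np.array([0.3, 0.3, 0.15, 0.15, 0.1])

f_star  = X.max(axis=0)                      # best value per criterion
f_minus = X.min(axis=0)                      # worst value per criterion
d = (f_star - X) / (f_star - f_minus)        # normalized distance to the best

S = (w * d).sum(axis=1)                      # group utility per alternative
R = (w * d).max(axis=1)                      # individual regret per alternative
v = 0.5                                      # weight of the majority strategy
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

print("Q (smaller = closer to the ideal):", Q.round(3))
```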

  2. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    The subroutine package "ATLAS" has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines: basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  3. Large-scale circulation departures related to wet episodes in northeast Brazil

    Science.gov (United States)

    Sikdar, D. N.; Elsner, J. B.

    1985-01-01

    Large-scale circulation features are presented as they relate to wet spells over northeast Brazil (Nordeste) during the rainy season (March and April) of 1979. The rainy season is divided into dry and wet periods; the FGGE and geostationary satellite data were averaged, and mean and departure fields of basic variables and cloudiness were studied. Analysis of the seasonal mean circulation features shows: sea-level easterlies beneath upper-level westerlies; weak meridional winds; high relative humidity over the Amazon basin; and relatively dry conditions over the South Atlantic Ocean. A fluctuation was found in the large-scale circulation features on time scales of a few weeks or so over Nordeste and the South Atlantic sector. Even the subtropical high SLPs show large departures during wet episodes, implying a short-period oscillation in the Southern Hemisphere Hadley circulation.

  4. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  5. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  7. Application of soft x-ray laser interferometry to study large-scale-length, high-density plasmas

    International Nuclear Information System (INIS)

    Wan, A.S.; Barbee, T.W., Jr.; Cauble, R.

    1996-01-01

    We have employed a Mach-Zehnder interferometer, using a Ne-like Y x-ray laser at 155 Angstrom as the probe source, to study large-scale-length, high-density colliding plasmas and exploding foils. The measured density profile of counter-streaming high-density colliding plasmas falls in between the profiles calculated with the radiation hydrodynamic code LASNEX using collisionless and fluid approximations. We have also simultaneously measured the local gain and electron density of a Y x-ray laser amplifier. Measured gains in the amplifier were found to be between 10 and 20 cm⁻¹, similar to predictions and indicating that refraction is the major cause of signal loss in long line focus lasers. Images showed that high gain was produced in spots with dimensions of ∼10 μm, which we believe is caused by intensity variations in the optical drive laser. Measured density variations were smooth on the 10-μm scale, so that temperature variations were likely the cause of the localized gain regions. We are now using the interferometry technique as a mechanism to validate and benchmark our numerical codes used for the design and analysis of high-energy-density physics experiments. 11 refs., 6 figs.

  8. Symptom assessment in early psychosis: The use of well-established rating scales in clinical high-risk and recent-onset populations

    OpenAIRE

    Fulford, Daniel; Pearson, Rahel; Stuart, Barbara K.; Fisher, Melissa; Mathalon, Daniel H.; Vinogradov, Sophia; Loewy, Rachel L.

    2014-01-01

    Symptom assessment in early psychosis research typically relies on scales validated in chronic schizophrenia samples. Our goal was to inform investigators who are selecting symptom scales for early psychosis research. We described measure characteristics, baseline scores, and scale inter-relationships in clinical high-risk (CHR) and recent-onset psychotic disorder (RO) samples using the Positive and Negative Syndrome Scale, Brief Psychiatric Rating Scale, Scale for the Assessment of Positive ...

  9. Preliminary design study of a large scale graphite oxidation loop

    International Nuclear Information System (INIS)

    Epel, L.G.; Majeski, S.J.; Schweitzer, D.G.; Sheehan, T.V.

    1979-08-01

    A preliminary design study of a large scale graphite oxidation loop was performed in order to assess feasibility and to estimate capital costs. The nominal design operates at 50 atmospheres helium and 1800 F with a graphite specimen 30 inches long and 10 inches in diameter. It was determined that a simple single-walled design was not practical at this time because of a lack of commercially available thick-walled high-temperature alloys. Two alternative concepts, at reduced operating pressure, were investigated. Both were found to be readily fabricable to operate at 1800 F, and capital cost estimates for these are included. A design concept which is outside the scope of this study was briefly considered.

  10. Testing on a Large Scale Running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes

    CERN Document Server

    Burckhart-Chromek, Doris; Adragna, P; Alexandrov, L; Amorim, A; Armstrong, S; Badescu, E; Baines, J T M; Barros, N; Beck, H P; Bee, C; Blair, R; Bogaerts, J A C; Bold, T; Bosman, M; Caprini, M; Caramarcu, C; Ciobotaru, M; Comune, G; Corso-Radu, A; Cranfield, R; Crone, G; Dawson, J; Della Pietra, M; Di Mattia, A; Dobinson, Robert W; Dobson, M; Dos Anjos, A; Dotti, A; Drake, G; Ellis, Nick; Ermoline, Y; Ertorer, E; Falciano, S; Ferrari, R; Ferrer, M L; Francis, D; Gadomski, S; Gameiro, S; Garitaonandia, H; Gaudio, G; George, S; Gesualdi-Mello, A; Gorini, B; Green, B; Haas, S; Haberichter, W N; Hadavand, H; Haeberli, C; Haller, J; Hansen, J; Hauser, R; Hillier, S J; Höcker, A; Hughes-Jones, R E; Joos, M; Kazarov, A; Kieft, G; Klous, S; Kohno, T; Kolos, S; Korcyl, K; Kordas, K; Kotov, V; Kugel, A; Landon, M; Lankford, A; Leahu, L; Leahu, M; Lehmann-Miotto, G; Le Vine, M J; Liu, W; Maeno, T; Männer, R; Mapelli, L; Martin, B; Masik, J; McLaren, R; Meessen, C; Meirosu, C; Mineev, M; Misiejuk, A; Morettini, P; Mornacchi, G; Müller, M; Garcia-Murillo, R; Nagasaka, Y; Negri, A; Padilla, C; Pasqualucci, E; Pauly, T; Perera, V; Petersen, J; Pope, B; Albuquerque-Portes, M; Pretzl, K; Prigent, D; Roda, C; Ryabov, Yu; Salvatore, D; Schiavi, C; Schlereth, J L; Scholtes, I; Sole-Segura, E; Seixas, M; Sloper, J; Soloviev, I; Spiwoks, R; Stamen, R; Stancu, S; Strong, S; Sushkov, S; Szymocha, T; Tapprogge, S; Teixeira-Dias, P; Torres, R; Touchard, F; Tremblet, L; Ünel, G; Van Wasen, J; Vandelli, W; Vaz-Gil-Lopes, L; Vermeulen, J C; von der Schmitt, H; Wengler, T; Werner, P; Wheeler, S; Wickens, F; Wiedenmann, W; Wiesmann, M; Wu, X; Yasu, Y; Yu, M; Zema, F; Zobernig, H; Computing In High Energy and Nuclear Physics

    2006-01-01

    The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second level trigger and event filter operations. This large number of PCs will only be purchased shortly before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run online functionality tests in July 2005 over a period of 5 weeks on a farm increasing stepwise in size from 100 to 700 dual PC nodes. The interplay between the control and monitoring software and the event readout, event building and trigger software was exercised for the first time as an integrated system on this large scale. Also new was running the trigger selection algorithms in the online environment and in the event filter processing tasks on a larger scale. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently to this large PC cluster via peer-to-peer software. T...

  12. Depth and breadth: Bridging the gap between scientific inquiry and high-stakes testing with diverse junior high school students

    Science.gov (United States)

    Kang, Jee Sun Emily

    This study explored how inquiry-based teaching and learning processes occurred in two teachers' diverse 8th grade Physical Science classrooms in a Program Improvement junior high school within the context of high-stakes standardized testing. Instructors for the courses examined included not only the two 8th grade science teachers, but also graduate fellows from a nearby university. Research was drawn from inquiry-based instruction in science education, the achievement gap, and the high stakes testing movement, as well as situated learning theory to understand how opportunities for inquiry were negotiated within the diverse classroom context. Transcripts of taped class sessions; student work samples; interviews of teachers and students; and scores from the California Standards Test in science were collected and analyzed. Findings indicated that the teachers provided structured inquiry in order to support their students in learning about forces and to prepare them for the standardized test. Teachers also supported students in generating evidence-based explanations, connecting inquiry-based investigations with content on forces, proficiently using science vocabulary, and connecting concepts about forces to their daily lives. Findings from classroom data revealed constraints to student learning: students' limited language proficiency, peer counter culture, and limited time. Supports were evidenced as well: graduate fellows' support during investigations, teachers' guided questioning, standardized test preparation, literacy support, and home-school connections. There was no statistical difference in achievement on the Forces Unit test or science standardized test between classes with graduate fellows and without fellows. There was also no statistical difference in student performance between the two teachers' classrooms, even though their teaching styles were very different. However, there was a strong correlation between students' achievement on the chapter test and

  13. Huntsman takes a stake in Chemplex

    International Nuclear Information System (INIS)

    Wood, A.

    1993-01-01

    Huntsman Chemical (Salt Lake City) has bought a 50% stake in Australian styrenics maker Chemplex (Melbourne) from Consolidated Press Holdings (Sydney). Huntsman stepped in after a previous acquisition plan by South Africa's Sentrachem (Johannesburg) broke down because of a failure to agree on price. Chemplex has two production locations near Melbourne: West Footscray, with capacity for 100,000 m.t./year of styrene, plus polystyrene, phenol, and acetone; and Dandenong, with production of acrylonitrile butadiene styrene and latex. The company was originally Monsanto Australia, before being acquired by Consolidated Press in 1988. The deal will give Huntsman its first major production position in the Asia/Pacific region, apart from a 50% stake in a 25,000-m.t./year polystyrene plant in Taiwan, with Grand Pacific Petrochemical (Taipei) as a partner. In 1991, Huntsman abandoned plans to invest in a 25,000-m.t./year polystyrene plant in Thailand with Mitsubishi Corp. and Toa (Bangkok). Huntsman Chemical has annual revenues of $1.3 billion

  14. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  15. Methodology for a GIS-based damage assessment for researchers following large scale disasters

    Science.gov (United States)

    Crawford, Patrick Shane

    research field. Along with visually mapping the data, geometric calculations can be conducted on the data to give the viewer more information about the damage. In Chapter 4, a tornado damage contour for Moore, Oklahoma following the May 20, 2013 tornado is shown. This damage contour was created in GIS based on the Enhanced Fujita (EF) damage scale, and gives the viewer an easily understood picture of the extent and distribution of the tornado. This thesis aims to describe a foundational groundwork for activities that are performed in the GIS-based damage assessment procedure and provide uses for the damage assessment as well as research being conducted on how to use the data collected from these assessments. This will allow researchers to conduct a highly adaptable, rapid GIS-based damage assessment of their own.
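
    The geometric calculations mentioned above are simple to script once damage features are digitized. As an illustration (our sketch, not code from the thesis), the planar area of a digitized damage polygon can be computed with the shoelace formula, assuming the vertices are already projected to a metric coordinate system; the footprint coordinates below are hypothetical.

      import numpy as np

      def polygon_area(coords):
          """Planar area of a simple polygon via the shoelace formula.

          coords: (N, 2) vertices in metres, ordered around the boundary.
          """
          x, y = np.asarray(coords, dtype=float).T
          # Pair each vertex with the next one, wrapping around at the end.
          return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

      # Hypothetical footprint of one damaged structure (projected CRS, metres).
      footprint = [(0.0, 0.0), (12.0, 0.0), (12.0, 9.0), (0.0, 9.0)]
      print(polygon_area(footprint))  # 108.0 square metres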

  16. Potential Impact of Large Scale Abstraction on the Quality of Shallow ...

    African Journals Online (AJOL)

    PRO

    Significant increase in crop production would not, however, be ... sounding) using Geonics EM34-3 and Abem SAS300C Terrameter to determine the aquifer (fresh water lens) ... Final report on environmental impact assessment of large scale.

  17. Facile Large-scale synthesis of stable CuO nanoparticles

    Science.gov (United States)

    Nazari, P.; Abdollahi-Nejand, B.; Eskandari, M.; Kohnehpoushi, S.

    2018-04-01

    In this work, a novel approach to synthesizing CuO nanoparticles is introduced. A sequential corrosion and detaching process is proposed for the growth and dispersion of CuO nanoparticles at the optimum pH value of eight. The produced CuO nanoparticles were six nm (±2 nm) in diameter and spherical in shape, with high crystallinity and uniformity in size. With this method, large-scale production of CuO nanoparticles (120 grams in an experimental batch) from Cu micro-particles was achieved, which may meet the market criteria for large-scale production of CuO nanoparticles.

  18. Hierarchical cultural values predict success and mortality in high-stakes teams

    Science.gov (United States)

    Anicich, Eric M.; Swaab, Roderick I.; Galinsky, Adam D.

    2015-01-01

    Functional accounts of hierarchy propose that hierarchy increases group coordination and reduces conflict. In contrast, dysfunctional accounts claim that hierarchy impairs performance by preventing low-ranking team members from voicing their potentially valuable perspectives and insights. The current research presents evidence for both the functional and dysfunctional accounts of hierarchy within the same dataset. Specifically, we offer empirical evidence that hierarchical cultural values affect the outcomes of teams in high-stakes environments through group processes. Experimental data from a sample of expert mountain climbers from 27 countries confirmed that climbers expect that a hierarchical culture leads to improved team coordination among climbing teams, but impaired psychological safety and information sharing compared with an egalitarian culture. An archival analysis of 30,625 Himalayan mountain climbers from 56 countries on 5,104 expeditions found that hierarchy both elevated and killed in the Himalayas: Expeditions from more hierarchical countries had more climbers reach the summit, but also more climbers die along the way. Importantly, we established the role of group processes by showing that these effects occurred only for group, but not solo, expeditions. These findings were robust to controlling for environmental factors, risk preferences, expedition-level characteristics, country-level characteristics, and other cultural values. Overall, this research demonstrates that endorsing cultural values related to hierarchy can simultaneously improve and undermine group performance. PMID:25605883

  19. Design Optimization and Fatigue Analysis of Laser Stake Welded Connections

    Science.gov (United States)

    2008-06-01

    is ultimately envisioned that laser welding will be as common in the shipyard as other processes such -- as MIG, TIG and SMAW. Laser stake- welding of...input from conventional welding techniques can be detrimental to the polymer matrix composite material. In comparison, the laser welding process allows...more discrete frequencies. In the laser welding process , the photons are targeted on the work piece surface which needs to be welded . Highly

  20. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  1. Optimal Selection of AC Cables for Large Scale Offshore Wind Farms

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; Chen, Zhe

    2014-01-01

    The investment in large scale offshore wind farms is high, and the electrical system makes a significant contribution to the total cost. As one of the key components, the cost of the connection cables strongly affects the initial investment. The development of cable manufacturing provides a vast ... and systematic way for the optimal selection of cables in large scale offshore wind farms ...

  2. Experimental facilities for large-scale and full-scale study of hydrogen accidents

    Energy Technology Data Exchange (ETDEWEB)

    Merilo, E.; Groethe, M.; Colton, J. [SRI International, Poulter Laboratory, Menlo Park, CA (United States); Chiba, S. [SRI Japan, Tokyo (Japan)

    2007-07-01

    This paper summarizes some of the work performed at SRI International over the past 5 years that addresses safety issues for the hydrogen-based economy. Researchers at SRI International have conducted experiments at the Corral Hollow Experiment Site (CHES) near Livermore, California to obtain fundamental data on hydrogen explosions for risk assessment. In particular, large-scale hydrogen tests were conducted using homogeneous mixtures of hydrogen in volumes from 5.3 m³ to 300 m³ to represent scenarios involving fuel cell vehicles as well as transport and storage facilities. Experiments have focused on unconfined deflagrations of hydrogen and air, and detonations of hydrogen in a semi-open space to measure free-field blast effects; the use of blast walls as a mitigation technique; turbulent enhancement of hydrogen combustion due to obstacles within the mixture, and determination of when deflagration-to-detonation transition occurs; the effect of confined hydrogen releases and explosions that could originate from an interconnecting hydrogen pipeline; and, large and small accidental releases of hydrogen. The experiments were conducted to improve the prediction of hydrogen explosions and the capabilities for performing risk assessments, and to develop mitigation techniques. Measurements included hydrogen concentration; flame speed; blast overpressure; heat flux; and, high-speed, standard, and infrared video. The data collected in these experiments is used to correlate computer models and to facilitate the development of codes and standards. This work contributes to better safety technology by evaluating the effectiveness of different blast mitigation techniques. 13 refs., 13 figs.

  3. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  4. Potential climatic impacts and reliability of large-scale offshore wind farms

    International Nuclear Information System (INIS)

    Wang Chien; Prinn, Ronald G

    2011-01-01

    The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such a large-scale usage of intermittent wind power would have to be carefully assessed, in addition to the need to lower the high current unit wind power costs. Our previous study (Wang and Prinn 2010 Atmos. Chem. Phys. 10 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the later study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines which was not entirely offset by the concurrent reduction of mean wind kinetic energy. We found that the perturbation of the large-scale deployment of offshore wind turbines to the global climate is relatively small compared to the case of land

  5. ability in Large Scale Land Acquisitions in Kenya

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    Kenya's national planning strategy, Vision 2030. Agriculture, natural resource exploitation, and infrastruc- ... sitions due to high levels of poverty and unclear or insecure land tenure rights in Kenya. Inadequate social ... lease to a private company over the expansive Yala Swamp to undertake large-scale irrigation farming.

  6. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time efficient and airborne geophysical data acquisition ... platforms (e.g. SkyTEM) have made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored...

  7. COMPARISON OF MULTI-SCALE DIGITAL ELEVATION MODELS FOR DEFINING WATERWAYS AND CATCHMENTS OVER LARGE AREAS

    Directory of Open Access Journals (Sweden)

    B. Harris

    2012-07-01

    Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, rarely are they developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines definition of waterways and catchments over an area of approximately 25,000 km² to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the 543 km² Wivenhoe catchment and a detailed 13 km² area within it), including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high resolution Lidar based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
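
    Flow routing is the step that turns a DEM into waterways, and its core is easy to state in code. The sketch below (an illustration of the standard D8 method, not the paper's implementation) assigns each interior cell of a small NumPy grid to its steepest downslope neighbour; the 20 m cell size mirrors the resolution discussed above.

      import numpy as np

      # D8 neighbour offsets: E, SE, S, SW, W, NW, N, NE
      OFFSETS = [(0, 1), (1, 1), (1, 0), (1, -1),
                 (0, -1), (-1, -1), (-1, 0), (-1, 1)]

      def d8_flow_direction(dem, cell_size=20.0):
          """Index (0-7) of the steepest downslope neighbour per interior cell;
          -1 marks pits and flat cells, which need separate treatment."""
          rows, cols = dem.shape
          direction = np.full((rows, cols), -1, dtype=int)
          for r in range(1, rows - 1):
              for c in range(1, cols - 1):
                  best, best_slope = -1, 0.0
                  for k, (dr, dc) in enumerate(OFFSETS):
                      dist = cell_size * (2 ** 0.5 if dr and dc else 1.0)
                      slope = (dem[r, c] - dem[r + dr, c + dc]) / dist
                      if slope > best_slope:
                          best, best_slope = k, slope
                  direction[r, c] = best
          return direction

      dem = np.array([[5., 4., 3.],
                      [4., 3., 2.],
                      [3., 2., 1.]])
      print(d8_flow_direction(dem))  # centre cell drains SE (index 1)

    Accumulating flow along these directions and thresholding the result yields the stream network; the flat-area caveat in the abstract corresponds to the -1 cells here.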

  8. An eigenfunction method for reconstruction of large-scale and high-contrast objects.

    Science.gov (United States)

    Waag, Robert C; Lin, Feng; Varslot, Trond K; Astheimer, Jeffrey P

    2007-07-01

    A multiple-frequency inverse scattering method that uses eigenfunctions of a scattering operator is extended to image large-scale and high-contrast objects. The extension uses an estimate of the scattering object to form the difference between the scattering by the object and the scattering by the estimate of the object. The scattering potential defined by this difference is expanded in a basis of products of acoustic fields. These fields are defined by eigenfunctions of the scattering operator associated with the estimate. In the case of scattering objects for which the estimate is radial, symmetries in the expressions used to reconstruct the scattering potential greatly reduce the amount of computation. The range of parameters over which the reconstruction method works well is illustrated using calculated scattering by different objects. The method is applied to experimental data from a 48-mm diameter scattering object with tissue-like properties. The image reconstructed from measurements has, relative to a conventional B-scan formed using a low f-number at the same center frequency, significantly higher resolution and less speckle, implying that small, high-contrast structures can be demonstrated clearly using the extended method.
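
    In schematic form (our hedged reading of the abstract, not the paper's exact equations), each iteration expands the residual scattering potential in products of acoustic fields:

        \delta V(\mathbf{r}) = V_{\mathrm{object}}(\mathbf{r}) - V_{\mathrm{estimate}}(\mathbf{r}) \approx \sum_{m,n} c_{mn}\, \psi_m(\mathbf{r})\, \psi_n(\mathbf{r}),

    where the fields ψ_m are defined by eigenfunctions of the scattering operator associated with the estimate and the coefficients c_mn are determined from the multiple-frequency scattering data. For a radial estimate, symmetries among the ψ_m greatly reduce the computation, as noted above.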

  9. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are described from testing the resistance of materials to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  10. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10⁶ cores and sustained performance over ∼2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios. (paper)
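
    The dynamic load-balancing idea can be made concrete in one dimension: move the domain boundaries so that each process owns roughly the same number of particles. The sketch below is a generic illustration (not OSIRIS code; the function name and setup are ours).

      import numpy as np

      def rebalance_1d(particle_x, n_ranks, x_min, x_max):
          """Boundaries giving each of n_ranks processes ~equal particle counts."""
          xs = np.sort(particle_x)
          # Indices splitting the sorted particles into n_ranks equal chunks.
          splits = (np.arange(1, n_ranks) * len(xs)) // n_ranks
          return np.concatenate(([x_min], xs[splits], [x_max]))

      rng = np.random.default_rng(0)
      x = rng.power(3.0, size=100_000)   # particles clustered near x = 1
      print(rebalance_1d(x, 4, 0.0, 1.0))
      # Slabs narrow near x = 1, where the particle density peaks.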

  11. Experimental simulation of microinteractions in large scale explosions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, X.; Luo, R.; Yuen, W.W.; Theofanous, T.G. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    This paper presents data and analysis of recent experiments conducted in the SIGMA-2000 facility to simulate microinteractions in large scale explosions. Specifically, the fragmentation behavior of a high temperature molten steel drop under high pressure (beyond critical) conditions are investigated. The current data demonstrate, for the first time, the effect of high pressure in suppressing the thermal effect of fragmentation under supercritical conditions. The results support the microinteractions idea, and the ESPROSE.m prediction of fragmentation rate. (author)

  12. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology Enables practitioners to study distributed large scale dimensional metrology independently Includes specific examples of the development of new system prototypes

  13. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    Science.gov (United States)

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging has high potential to support image based computer aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response, which faces two major challenges: the "big data" challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for micro-anatomic objects. To reduce query response time, we propose cost-based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
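
    The partition-merge approach maps directly onto MapReduce: the map phase buckets each object into the grid partition it falls in, and the reduce phase evaluates the spatial predicate within each partition. A toy sketch (ours, with point proximity standing in for the real polygon predicates; the tile size and names are assumptions):

      from collections import defaultdict

      TILE = 100.0  # partition (tile) size in image coordinates

      def map_phase(objects):
          """Emit (tile_key, object) pairs; objects are (id, x, y) points."""
          for obj in objects:
              _, x, y = obj
              yield (int(x // TILE), int(y // TILE)), obj

      def reduce_phase(pairs, radius=5.0):
          """Within each tile, report object pairs closer than radius."""
          tiles = defaultdict(list)
          for key, obj in pairs:
              tiles[key].append(obj)
          for members in tiles.values():
              for i, (ia, xa, ya) in enumerate(members):
                  for ib, xb, yb in members[i + 1:]:
                      if (xa - xb) ** 2 + (ya - yb) ** 2 <= radius ** 2:
                          yield ia, ib

      nuclei = [(0, 12.0, 14.0), (1, 15.0, 14.5), (2, 340.0, 80.0)]
      print(list(reduce_phase(map_phase(nuclei))))  # [(0, 1)]

    A production system additionally replicates objects that straddle tile boundaries into neighbouring tiles and de-duplicates matches in the merge step, which is where the cost-based optimization against data skew comes in.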

  14. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  15. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computational complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna selection processing time of the proposed method does not increase with the number of selected antennas, while the computational complexity of the conventional exhaustive search method increases significantly when large-scale antennas are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
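
    The MVDR beamformer named above has the standard closed form w = R⁻¹a / (aᴴR⁻¹a). The sketch below pairs it with a deliberately simple selection rule (largest channel norms) as a stand-in for the paper's joint optimization; all names and dimensions are our assumptions.

      import numpy as np

      def select_antennas(H, k):
          """Keep the k transmit antennas (columns of H) with largest norm."""
          return np.argsort(np.linalg.norm(H, axis=0))[-k:]

      def mvdr_weights(R, a):
          """Minimum variance distortionless response beamformer."""
          Ri_a = np.linalg.solve(R, a)
          return Ri_a / (a.conj() @ Ri_a)

      rng = np.random.default_rng(1)
      H = (rng.normal(size=(4, 64)) + 1j * rng.normal(size=(4, 64))) / np.sqrt(2)
      sel = select_antennas(H, 8)               # 8 of 64 transmit antennas
      Hs = H[:, sel]
      R = Hs @ Hs.conj().T + 0.1 * np.eye(4)    # covariance + diagonal loading
      a = np.ones(4, dtype=complex)             # hypothetical steering vector
      w = mvdr_weights(R, a)
      print(abs(w.conj() @ a))                  # distortionless response: 1.0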

  16. Assessing large-scale weekly cycles in meteorological variables: a review

    Directory of Open Access Journals (Sweden)

    A. Sanchez-Lorenzo

    2012-07-01

    Several studies have claimed to have found significant weekly cycles of meteorological variables appearing over large domains, which can hardly be related to urban effects exclusively. Nevertheless, there is still an ongoing scientific debate about whether these large-scale weekly cycles exist, as some other studies fail to reproduce them with statistical significance. In addition to the lack of positive proof for the existence of these cycles, their possible physical explanations have been controversially discussed during the last years. In this work we review the main results on this topic published during the last two decades, including a summary of the existence or non-existence of significant weekly weather cycles across different regions of the world, mainly over the US, Europe and Asia. In addition, some shortcomings of common statistical methods for analyzing weekly cycles are listed. Finally, we present a brief summary of the proposed causes of the weekly cycles, focusing on aerosol-cloud-radiation interactions and their impact on meteorological variables as a result of the weekly cycles of anthropogenic activities, and possible directions for future research.

  17. Effects of large-scale deforestation on precipitation in the monsoon regions: remote versus local effects.

    Science.gov (United States)

    Devaraju, N; Bala, Govindasamy; Modak, Angshuman

    2015-03-17

    In this paper, using idealized climate model simulations, we investigate the biogeophysical effects of large-scale deforestation on monsoon regions. We find that the remote forcing from large-scale deforestation in the northern middle and high latitudes shifts the Intertropical Convergence Zone southward. This results in a significant decrease in precipitation in the Northern Hemisphere monsoon regions (East Asia, North America, North Africa, and South Asia) and moderate precipitation increases in the Southern Hemisphere monsoon regions (South Africa, South America, and Australia). The magnitude of the monsoonal precipitation changes depends on the location of deforestation, with remote effects showing a larger influence than local effects. The South Asian Monsoon region is affected the most, with 18% decline in precipitation over India. Our results indicate that any comprehensive assessment of afforestation/reforestation as climate change mitigation strategies should carefully evaluate the remote effects on monsoonal precipitation alongside the large local impacts on temperatures.

  19. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorization in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
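
    The tile low-rank format can be illustrated with a truncated SVD of a single off-diagonal tile: keep only as many singular values as the accuracy requirement demands. A minimal sketch (ours, not a HiCMA kernel):

      import numpy as np

      def compress_tile(A, eps):
          """Truncated-SVD compression of one tile to relative accuracy eps."""
          U, s, Vt = np.linalg.svd(A, full_matrices=False)
          rank = max(1, int(np.sum(s > eps * s[0])))
          return U[:, :rank] * s[:rank], Vt[:rank]   # A ~= U @ Vt, low rank

      # A smooth kernel on two well-separated point clusters produces a
      # data-sparse (numerically low-rank) off-diagonal tile.
      x = np.linspace(0.0, 1.0, 256)
      y = np.linspace(10.0, 11.0, 256)
      tile = 1.0 / np.abs(x[:, None] - y[None, :])   # 256 x 256 dense tile
      U, Vt = compress_tile(tile, 1e-8)
      print(U.shape[1], np.linalg.norm(tile - U @ Vt) / np.linalg.norm(tile))

    Storing U and Vt costs 2 · 256 · rank entries instead of 256², which is where order-of-magnitude memory savings come from when the admissible rank is small.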

  20. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
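
    For scale, the textbook conversion from proper motion μ to transverse velocity (a standard relation, not specific to this record) is

        v_t\,[\mathrm{km\,s^{-1}}] \simeq 4.74\;\mu\,[\mathrm{arcsec\,yr^{-1}}]\;d\,[\mathrm{pc}],

    so a galaxy at d = 10 Mpc moving transversely at 500 km s⁻¹ has μ ≈ 500/(4.74 × 10⁷) ≈ 10⁻⁵ arcsec yr⁻¹, i.e. roughly 10 μas yr⁻¹, which illustrates why Gaia-class astrometry is needed for such measurements.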

  1. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size (e.g., in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  2. No large scale curvature perturbations during the waterfall phase transition of hybrid inflation

    International Nuclear Information System (INIS)

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2011-01-01

    In this paper the possibility of generating large scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical backreactions to terminate inflation. If one considers only the classical evolution of the system, we show that the highly blue-tilted entropy perturbations induce highly blue-tilted large scale curvature perturbations during the waterfall phase transition which dominate over the original adiabatic curvature perturbations. However, we show that the quantum backreactions of the waterfall field inhomogeneities produced during the phase transition dominate completely over the classical backreactions. The cumulative quantum backreactions of very small scale tachyonic modes terminate inflation very efficiently and shut off the curvature perturbation evolution during the waterfall phase transition. This indicates that the standard hybrid inflation model is safe under large scale curvature perturbations during the waterfall phase transition.

  3. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space
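
    Schematically (our rendering of the result; the paper fixes the precise conventions), the Lagrangian-space statement is

        \lim_{q \to 0} \frac{1}{P(q)} \left\langle \delta(\mathbf{q})\, \mathcal{O}(\mathbf{k}_1, t_1) \cdots \mathcal{O}(\mathbf{k}_N, t_N) \right\rangle_{\mathrm{Lagrangian}} = 0,

    i.e. a long-wavelength mode acts on the short modes as a uniform displacement, which drops out entirely when observables are tracked at fixed Lagrangian (initial) positions - whether or not the times coincide and whether or not multiple-streaming is present.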

  4. Electron drift in a large scale solid xenon

    International Nuclear Information System (INIS)

    Yoo, J.; Jaskierny, W.F.

    2015-01-01

    A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. It is thus demonstrated that the electron drift speed in large scale solid xenon is a factor of two faster than in the liquid phase.
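
    As a quick consistency check on the quoted numbers (our arithmetic, not the paper's), the transit time across the 8.0 cm drift region is t = L / v_d:

        t_{\mathrm{liquid}} = \frac{8.0\ \mathrm{cm}}{0.193\ \mathrm{cm}/\mu\mathrm{s}} \approx 41\ \mu\mathrm{s}, \qquad t_{\mathrm{solid}} = \frac{8.0\ \mathrm{cm}}{0.397\ \mathrm{cm}/\mu\mathrm{s}} \approx 20\ \mu\mathrm{s},

    consistent with the stated factor-of-two faster drift in the solid phase.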

  5. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    International Nuclear Information System (INIS)

    Williams, Paul T.; Yin, Shengjun; Klasky, Hilda B.; Bass, Bennett Richard

    2011-01-01

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite

  6. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
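
    The vehicles-as-autonomous-processes design can be sketched in a few lines: each vehicle keeps its own route and independently re-plans when the TMC broadcasts new link times. This is an illustrative toy (class and method names are ours, not the prototype's).

      import heapq

      class Vehicle:
          """Toy agent that re-plans its route when link times change."""
          def __init__(self, vid, origin, dest, graph):
              self.vid, self.node, self.dest, self.graph = vid, origin, dest, graph
              self.route = self.plan()

          def plan(self):
              # Dijkstra shortest path on the current link travel times.
              dist, prev, pq = {self.node: 0.0}, {}, [(0.0, self.node)]
              while pq:
                  d, u = heapq.heappop(pq)
                  if u == self.dest:
                      break
                  if d > dist.get(u, float("inf")):
                      continue
                  for v, w in self.graph[u]:
                      nd = d + w
                      if nd < dist.get(v, float("inf")):
                          dist[v], prev[v] = nd, u
                          heapq.heappush(pq, (nd, v))
              path, u = [], self.dest
              while u != self.node:
                  path.append(u)
                  u = prev[u]
              return path[::-1]

          def on_advisory(self, graph):
              self.graph = graph        # TMC broadcast of updated link times
              self.route = self.plan()  # independent route re-selection

      graph = {"A": [("B", 5.0), ("C", 2.0)], "B": [("D", 1.0)],
               "C": [("D", 8.0)], "D": []}
      print(Vehicle(1, "A", "D", graph).route)  # ['B', 'D'] via the faster path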

  7. Open TG-GATEs: a large-scale toxicogenomics database

    Science.gov (United States)

    Igarashi, Yoshinobu; Nakatsu, Noriyuki; Yamashita, Tomoya; Ono, Atsushi; Ohno, Yasuo; Urushidani, Tetsuro; Yamada, Hiroshi

    2015-01-01

    Toxicogenomics focuses on assessing the safety of compounds using gene expression profiles. Gene expression signatures from large toxicogenomics databases are expected to perform better than small databases in identifying biomarkers for the prediction and evaluation of drug safety based on a compound's toxicological mechanisms in animal target organs. Over the past 10 years, the Japanese Toxicogenomics Project consortium (TGP) has been developing a large-scale toxicogenomics database consisting of data from 170 compounds (mostly drugs) with the aim of improving and enhancing drug safety assessment. Most of the data generated by the project (e.g. gene expression, pathology, lot number) are freely available to the public via Open TG-GATEs (Toxicogenomics Project-Genomics Assisted Toxicity Evaluation System). Here, we provide a comprehensive overview of the database, including both gene expression data and metadata, with a description of experimental conditions and procedures used to generate the database. Open TG-GATEs is available from http://toxico.nibio.go.jp/english/index.html. PMID:25313160

  8. Synthesizing large-scale pyroclastic flows: Experimental design, scaling, and first results from PELE

    Science.gov (United States)

    Lube, G.; Breard, E. C. P.; Cronin, S. J.; Jones, J.

    2015-03-01

    The pyroclastic flow eruption large-scale experiment (PELE) is a large-scale facility for experimental studies of pyroclastic density currents (PDCs). It is used to generate high-energy currents involving 500-6500 m³ of natural volcanic material and air that achieve velocities of 7-30 m s⁻¹, flow thicknesses of 2-4.5 m, and runouts of >35 m. The experimental PDCs are synthesized by a controlled "eruption column collapse" of ash-lapilli suspensions onto an instrumented channel. The first set of experiments is documented here and used to elucidate the main flow regimes that influence PDC dynamic structure. Four phases are identified: (1) mixture acceleration during eruption column collapse, (2) column-slope impact, (3) PDC generation, and (4) ash cloud diffusion. The currents produced are fully turbulent flows and scale well to natural PDCs, including small to large scales of turbulent transport. PELE is capable of generating short, pulsed, and sustained currents over periods of several tens of seconds, and dilute surge-like PDCs through to highly concentrated pyroclastic flow-like currents. The surge-like variants develop a basal <0.05 m thick regime of saltating/rolling particles and shifting sand waves, capped by a 2.5-4.5 m thick turbulent suspension that grades upward to lower particle concentrations. Resulting deposits include stratified dunes, wavy and planar laminated beds, and thin ash cloud fall layers. Concentrated currents segregate into a dense basal underflow of <0.6 m thickness that remains aerated. This is capped by an upper ash cloud surge (1.5-3 m thick) with 10⁰ to 10⁻⁴ vol % particles. Their deposits include stratified, massive, normally and reversely graded beds, lobate fronts, and laterally extensive veneer facies beyond channel margins.

  9. Balancing detail and scale in assessing transparency to improve the governance of agricultural commodity supply chains

    Science.gov (United States)

    Godar, Javier; Suavet, Clément; Gardner, Toby A.; Dawkins, Elena; Meyfroidt, Patrick

    2016-03-01

    To date, assessments of the sustainability of agricultural commodity supply chains have largely relied on some combination of macro-scale footprint accounts, detailed life-cycle analyses and fine-scale traceability systems. Yet these approaches are limited in their ability to support the sustainability governance of agricultural supply chains, whether because they are intended for coarser-grained analyses, do not identify individual actors, or are too costly to be implemented in a consistent manner for an entire region of production. Here we illustrate some of the advantages of a complementary middle-ground approach that balances detail and scale of supply chain transparency information by combining consistent country-wide data on commodity production at the sub-national (e.g. municipal) level with per shipment customs data to describe trade flows of a given commodity covering all companies and production regions within that country. This approach can support supply chain governance in two key ways. First, enhanced spatial resolution of the production regions that connect to individual supply chains allows for a more accurate consideration of geographic variability in measures of risk and performance that are associated with different production practices. Second, identification of key actors that operate within a specific supply chain, including producers, traders, shippers and consumers can help discriminate coalitions of actors that have shared stake in a particular region, and that together are capable of delivering more cost-effective and coordinated interventions. We illustrate the potential of this approach with examples from Brazil, Indonesia and Colombia. We discuss how transparency information can deepen understanding of the environmental and social impacts of commodity production systems, how benefits are distributed among actors, and some of the trade-offs involved in efforts to improve supply chain sustainability. We then discuss the challenges and

  10. Compact Representation of High-Dimensional Feature Vectors for Large-Scale Image Recognition and Retrieval.

    Science.gov (United States)

    Zhang, Yu; Wu, Jianxin; Cai, Jianfei

    2016-05-01

    In large-scale visual recognition and image retrieval tasks, feature vectors, such as the Fisher vector (FV) or the vector of locally aggregated descriptors (VLAD), have achieved state-of-the-art results. However, the combination of large numbers of examples and high-dimensional vectors necessitates dimensionality reduction, in order to bring storage and CPU costs down to a reasonable range. In spite of the popularity of various feature compression methods, this paper shows that feature (dimension) selection is a better choice for high-dimensional FV/VLAD than feature (dimension) compression methods such as product quantization. We show that strong correlation among the feature dimensions in the FV and the VLAD may not exist, which renders feature selection a natural choice. We also show that many dimensions in FV/VLAD are noise; discarding them via feature selection is better than compressing noisy and useful dimensions together with feature compression methods. To choose features, we propose an efficient importance sorting algorithm covering both the supervised and unsupervised cases, for visual recognition and image retrieval, respectively. Combined with 1-bit quantization, feature selection achieves both higher accuracy and lower computational cost than feature compression methods, such as product quantization, on the FV and the VLAD image representations.
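
    A minimal sketch of the selection-then-quantize pipeline described above. The importance score used here (variance of class-conditional means) is an illustrative stand-in for the paper's importance sorting criterion, which is not reproduced in the abstract.

```python
# Supervised feature (dimension) selection followed by 1-bit quantization.
import numpy as np

def select_and_quantize(X, y, keep):
    # X: (n_samples, d) FV/VLAD vectors, y: integer class labels
    classes = np.unique(y)
    class_means = np.stack([X[y == c].mean(axis=0) for c in classes])
    importance = class_means.var(axis=0)       # discriminative spread (assumed score)
    idx = np.argsort(importance)[::-1][:keep]  # keep top-ranked dimensions
    X_bits = (X[:, idx] > 0).astype(np.uint8)  # 1-bit quantization by sign
    return X_bits, idx

X = np.random.randn(100, 8192)                 # toy high-dimensional vectors
y = np.random.randint(0, 10, size=100)
X_bits, kept = select_and_quantize(X, y, keep=1024)
print(X_bits.shape, kept[:5])                  # compact binary codes
```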

  11. Large-scale coherent structures of suspended dust concentration in the neutral atmospheric surface layer: A large-eddy simulation study

    Science.gov (United States)

    Zhang, Yangyue; Hu, Ruifeng; Zheng, Xiaojing

    2018-04-01

    Dust particles can remain suspended in the atmospheric boundary layer, and their motions are primarily determined by turbulent diffusion and gravitational settling. Little is known about the spatial organization of suspended dust concentration and how turbulent coherent motions contribute to the vertical transport of dust particles. Numerous studies in recent years have revealed that the large- and very-large-scale motions found in the logarithmic region of laboratory-scale turbulent boundary layers also exist in the high Reynolds number atmospheric boundary layer, but their influence on dust transport is still unclear. In this study, numerical simulations of dust transport in a neutral atmospheric boundary layer, based on an Eulerian modeling approach and the large-eddy simulation technique, are performed to investigate the coherent structures of dust concentration. The instantaneous fields confirm the existence of very long meandering streaks of dust concentration, with alternating high- and low-concentration regions. A strong negative correlation between the streamwise velocity and concentration and a mild positive correlation between the vertical velocity and concentration are observed. The spatial length scales and inclination angles of the concentration structures are determined and compared with their flow counterparts. The conditionally averaged fields vividly depict that high- and low-concentration events are accompanied by a pair of counter-rotating quasi-streamwise vortices, with a downwash inside the low-concentration region and an upwash inside the high-concentration region. Quadrant analysis indicates that the vertical dust transport is closely related to the large-scale roll modes, and that ejections in high-concentration regions are the major mechanism for the upward motion of dust particles.
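
    The quadrant analysis mentioned above decomposes the vertical turbulent flux w'c' according to the signs of the velocity and concentration fluctuations, so that ejection- and sweep-like events can be compared. A minimal sketch, with synthetic series standing in for the LES fields:

```python
# Quadrant decomposition of the vertical dust flux w'c'.
# Synthetic, weakly correlated series stand in for LES output.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=100_000)            # vertical velocity fluctuations
c = 0.3 * w + rng.normal(size=100_000)  # concentration, mildly coupled to w

wf, cf = w - w.mean(), c - c.mean()
flux = wf * cf                          # instantaneous kinematic flux
total = flux.sum()
quadrants = {
    "Q1 w'>0, c'>0 (upward, dust-laden)": (wf > 0) & (cf > 0),
    "Q2 w'>0, c'<0":                      (wf > 0) & (cf < 0),
    "Q3 w'<0, c'<0":                      (wf < 0) & (cf < 0),
    "Q4 w'<0, c'>0":                      (wf < 0) & (cf > 0),
}
for name, mask in quadrants.items():
    print(f"{name}: {flux[mask].sum() / total:+.2f} of total flux")
```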

  12. Assessment Profile of Malaysia: High-Stakes External Examinations Dominate

    Science.gov (United States)

    Ong, Saw Lan

    2010-01-01

    Malaysia is a federation of 13 states located in South-east Asia. The country consists of two geographical regions: Peninsular Malaysia (also known as West Malaysia) and Malaysian Borneo (also known as East Malaysia), separated by the South China Sea. The educational administration in Malaysia is highly centralised, with four hierarchical levels:…

  13. Large-scale nuclear energy from the thorium cycle

    International Nuclear Information System (INIS)

    Lewis, W.B.; Duret, M.F.; Craig, D.S.; Veeder, J.I.; Bain, A.S.

    1973-02-01

    The thorium fuel cycle in CANDU (Canada Deuterium Uranium) reactors challenges breeders and fusion as the simplest means of meeting the world's large-scale demands for energy for centuries. Thorium oxide fuel allows high power density with excellent neutron economy. The combination of thorium fuel with an organic coolant promises easy maintenance and high availability of the whole plant. The total fuelling cost, including charges on the inventory, is estimated to be attractively low. (author) [fr]

  14. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  15. Large-Scale Fabrication of Silicon Nanowires for Solar Energy Applications.

    Science.gov (United States)

    Zhang, Bingchang; Jie, Jiansheng; Zhang, Xiujuan; Ou, Xuemei; Zhang, Xiaohong

    2017-10-11

    The development of silicon (Si) materials during the past decades has boosted the prosperity of the modern semiconductor industry. In comparison with bulk Si materials, Si nanowires (SiNWs) possess superior structural, optical, and electrical properties and have attracted increasing attention in solar energy applications. To achieve practical applications of SiNWs, both large-scale synthesis of SiNWs at low cost and rational design of energy conversion devices with high efficiency are prerequisites. This review focuses on the recent progress in large-scale production of SiNWs, as well as the construction of high-efficiency SiNW-based solar energy conversion devices, including photovoltaic devices and photo-electrochemical cells. Finally, the outlook and challenges in this emerging field are presented.

  16. Approach for growth of high-quality and large protein crystals

    Energy Technology Data Exchange (ETDEWEB)

    Matsumura, Hiroyoshi, E-mail: matsumura@chem.eng.osaka-u.ac.jp [Graduate School of Engineering, Osaka University, Suita, Osaka 565-0871 (Japan); JST (Japan); SOSHO Inc., Osaka 541-0053 (Japan); Sugiyama, Shigeru; Hirose, Mika; Kakinouchi, Keisuke; Maruyama, Mihoko; Murai, Ryota [Graduate School of Engineering, Osaka University, Suita, Osaka 565-0871 (Japan); JST (Japan); Adachi, Hiroaki; Takano, Kazufumi [Graduate School of Engineering, Osaka University, Suita, Osaka 565-0871 (Japan); JST (Japan); SOSHO Inc., Osaka 541-0053 (Japan); Murakami, Satoshi [JST (Japan); SOSHO Inc., Osaka 541-0053 (Japan); Graduate School of Bioscience and Biotechnology, Tokyo Institute of Technology, Nagatsuta, Midori-ku, Yokohama 226-8501 (Japan); Mori, Yusuke; Inoue, Tsuyoshi [Graduate School of Engineering, Osaka University, Suita, Osaka 565-0871 (Japan); JST (Japan); SOSHO Inc., Osaka 541-0053 (Japan)

    2011-01-01

    Three crystallization methods for growing large high-quality protein crystals, i.e. crystallization in the presence of a semi-solid agarose gel, top-seeded solution growth (TSSG) and a large-scale hanging-drop method, have previously been presented. In this study the effectiveness of crystallization in the presence of a semi-solid agarose gel has been further evaluated by crystallizing additional proteins in the presence of 2.0% (w/v) agarose gel, resulting in complete gelification with high mechanical strength. In TSSG the seed crystals are hung by a seed holder protruding from the top of the growth vessel to prevent polycrystallization. In the large-scale hanging-drop method, a cut pipette tip was used to maintain large-scale droplets consisting of protein–precipitant solution. Here a novel crystallization method that combines TSSG and the large-scale hanging-drop method is reported. A large single crystal of lysozyme was obtained by this method.

  17. Best Known Problem Solving Strategies in "High-Stakes" Assessments

    Science.gov (United States)

    Hong, Dae S.

    2011-01-01

    In its mathematics standards, the National Council of Teachers of Mathematics (NCTM) states that problem solving is an integral part of all mathematics learning and that exposure to problem solving strategies should be embedded across the curriculum. Furthermore, by high school, students should be able to use, decide on and invent a wide range of strategies.…

  18. Low-Temperature Soft-Cover Deposition of Uniform Large-Scale Perovskite Films for High-Performance Solar Cells.

    Science.gov (United States)

    Ye, Fei; Tang, Wentao; Xie, Fengxian; Yin, Maoshu; He, Jinjin; Wang, Yanbo; Chen, Han; Qiang, Yinghuai; Yang, Xudong; Han, Liyuan

    2017-09-01

    Large-scale high-quality perovskite thin films are crucial to producing high-performance perovskite solar cells. However, for perovskite films fabricated by solvent-rich processes, film uniformity can be disrupted by convection during thermal evaporation of the solvent. Here, a scalable low-temperature soft-cover deposition (LT-SCD) method is presented, in which the thermal convection-induced defects in perovskite films are eliminated through a strategy of surface tension relaxation. Compact, homogeneous, and convection-induced-defect-free perovskite films are obtained on an area of 12 cm², which enables a power conversion efficiency (PCE) of 15.5% on a solar cell with an area of 5 cm². This is the highest efficiency at this large cell area. A PCE of 15.3% is also obtained on a flexible perovskite solar cell deposited on a polyethylene terephthalate substrate, owing to the advantage of the presented low-temperature processing. Hence, the present LT-SCD technology provides a new non-spin-coating route to the deposition of large-area uniform perovskite films for both rigid and flexible perovskite devices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Environment and host as large-scale controls of ectomycorrhizal fungi.

    Science.gov (United States)

    van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I

    2018-06-06

    Explaining the large-scale diversity of soil organisms that drive biogeochemical processes-and their responses to environmental change-is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is-to our knowledge-unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.

  20. The Perceived Value of Maths and Academic Self-Efficacy in the Appraisal of Fear Appeals Used Prior to a High-Stakes Test as Threatening or Challenging

    Science.gov (United States)

    Putwain, David William; Symes, Wendy

    2014-01-01

    Previous work has examined how messages communicated to students prior to high-stakes exams, that emphasise the importance of avoiding failure for subsequent life trajectory, may be appraised as threatening. In two studies, we extended this work to examine how students may also appraise such messages as challenging or disregard them as being of…

  1. Large-scale circulation departures related to wet episodes in north-east Brazil

    Science.gov (United States)

    Sikdar, Dhirendra N.; Elsner, James B.

    1987-01-01

    Large scale circulation features are presented as related to wet spells over northeast Brazil (Nordeste) during the rainy season (March and April) of 1979. The rainy season is divided into dry and wet periods; the FGGE and geostationary satellite data were averaged, and mean and departure fields of basic variables and cloudiness were studied. Analysis of seasonal mean circulation features shows: easterlies at the lowest levels beneath upper-level westerlies; weak meridional winds; high relative humidity over the Amazon basin; and relatively dry conditions over the South Atlantic Ocean. A fluctuation was found in the large scale circulation features on time scales of a few weeks or so over Nordeste and the South Atlantic sector. Even the sea-level pressures (SLPs) of the subtropical High show large departures during wet episodes, implying a short-period oscillation in the Southern Hemisphere Hadley circulation.

  2. Study on sandstorm PM10 exposure assessment in the large-scale region: a case study in Inner Mongolia.

    Science.gov (United States)

    Wang, Hongmei; Lv, Shihai; Diao, Zhaoyan; Wang, Baolu; Zhang, Han; Yu, Caihong

    2018-04-12

    The current exposure-effect curves describing sandstorm PM10 exposure and its health effects are drawn roughly from the outdoor concentration (OC), which ignores the exposure levels at people's actual activity sites. The main objective of this work is to develop a novel approach to quantify human PM10 exposure weighted by socio-categorized micro-environment activities and times (SCMEATW) during strong sandstorm periods, which can be used to assess exposure profiles over a large-scale region. The population categories used in SCMEATW were obtained by questionnaire investigation, and representatives of each category were tracked and recorded during a large sandstorm. The average exposure levels were then estimated by SCMEATW. Furthermore, geographic information system (GIS) techniques were used not only to simulate the outdoor concentration spatially but also to create human exposure outlines in a visualized map, which helps in understanding the risk to different types of people. Additionally, exposure-response curves describing the odds of acute outpatient visits due to sandstorms were formed with SCMEATW, and the differences between SCMEATW and OC were compared. Results indicated that the odds of acute outpatient visits were related to PM10 exposure from SCMEATW, at a level lower than that of OC. Some types of people, such as herdsmen and people walking outdoors during a strong sandstorm, are at greater risk than office workers. Our findings provide more understanding of how people's actual activities shape their exposure levels; in particular, they provide a tool to understand sandstorm PM10 exposure at large spatial scales, which can help in performing risk assessments for different population categories regionally.
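
    At its core, SCMEATW is a time-weighted average of micro-environment concentrations over a person's day. A minimal sketch follows; the hours and PM10 levels are invented for illustration and are not the study's data.

```python
# Time-weighted exposure: sum of (hours in micro-environment x PM10 level)
# divided by total hours, per population category.
def time_weighted_exposure(schedule):
    # schedule: list of (hours_spent, pm10_concentration in ug/m3)
    total_h = sum(h for h, _ in schedule)
    return sum(h * c for h, c in schedule) / total_h

herdsman = [(8, 900.0), (4, 400.0), (12, 150.0)]  # outdoors, semi-open, indoors
office   = [(1, 900.0), (2, 300.0), (21, 120.0)]
print(f"herdsman: {time_weighted_exposure(herdsman):.0f} ug/m3")  # ~442
print(f"office:   {time_weighted_exposure(office):.0f} ug/m3")    # ~168
```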

  3. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind, and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale linearly with Mach number, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of the Mach number. Inhomogeneous nearly…

  4. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  5. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The Subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  6. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed at the first workshop on energy management for large scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  7. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
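
    To make the combinatorial-explosion point concrete, the simplest counterpoint to a multi-hypothesis tracker is gated nearest-neighbour assignment, which commits to one hypothesis per track. The sketch below illustrates that trade-off; it is not code from the report.

```python
# Gated nearest-neighbour data association: each track claims at most one
# detection within a distance gate, so the hypothesis count stays linear.
import numpy as np

def assign(tracks, detections, gate=5.0):
    # tracks, detections: (n, 2) pixel positions; returns {track: detection}
    assignments, used = {}, set()
    for ti, t in enumerate(tracks):
        d = np.linalg.norm(detections - t, axis=1)
        d[list(used)] = np.inf            # each detection used at most once
        j = int(np.argmin(d))
        if d[j] < gate:                   # gating prunes distant candidates
            assignments[ti] = j
            used.add(j)
    return assignments

tracks = np.array([[10.0, 10.0], [40.0, 42.0]])
dets = np.array([[11.0, 9.5], [39.0, 43.0], [80.0, 80.0]])
print(assign(tracks, dets))               # -> {0: 0, 1: 1}
```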

  8. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, by entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  9. Using IT to Assess IT: Towards Greater Authenticity in Summative Performance Assessment

    Science.gov (United States)

    Newhouse, C. Paul

    2011-01-01

    An applied Information Technology (IT) course that is assessed using pen and paper may sound incongruous, but it is symptomatic of the state of high-stakes assessment in jurisdictions such as Western Australia. Whereas technology has permeated most aspects of modern life, including schooling, and more has been demanded of education systems in terms…

  10. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as control supporting tools for daily operation and performance prediction of central solar heating plants. Finally, the CSDHP technology is put into perspective with respect to alternatives, and a short discussion on the barriers and breakthrough of the technology is given....

  11. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with fine spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on classic Boolean visibility, which is usually determined within GIS, but also on so-called extended viewsheds, which aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
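
    The primitive underlying all of the viewshed products above is a Boolean line-of-sight test against the surface model. A minimal grid-based sketch follows; the DEM, observer height and cell size are invented for illustration.

```python
# Boolean line-of-sight on a gridded surface model: sample the terrain
# along the sight line and check that no cell rises above it.
import numpy as np

def visible(dem, obs, tgt, obs_h=1.6, cell=1.0):
    # obs, tgt: (row, col) grid indices; obs_h: eye height above ground (m)
    n = int(max(abs(tgt[0] - obs[0]), abs(tgt[1] - obs[1]))) * 4
    rows = np.linspace(obs[0], tgt[0], n)
    cols = np.linspace(obs[1], tgt[1], n)
    dist = np.hypot(rows - obs[0], cols - obs[1]) * cell
    z0 = dem[obs] + obs_h
    sight = z0 + (dem[tgt] - z0) * dist / dist[-1]      # straight sight line
    terrain = dem[rows.round().astype(int), cols.round().astype(int)]
    return bool(np.all(terrain[1:-1] <= sight[1:-1]))   # nothing blocks it

dem = np.zeros((50, 50)); dem[20:23, :] = 10.0          # a 10 m ridge
print(visible(dem, (5, 25), (45, 25)))                  # False: ridge blocks
```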

  12. A high-speed transmission method for large-scale marine seismic prospecting systems

    International Nuclear Information System (INIS)

    KeZhu, Song; Ping, Cao; JunFeng, Yang; FuMing, Ruan

    2012-01-01

    A marine seismic prospecting system is a kind of data acquisition and transmission system with large-scale coverage and synchronous multi-node acquisition. In this kind of system, data transmission is a fundamental and difficult technique. In this paper, a high-speed data-transmission method is proposed, its implications and limitations are discussed, and conclusions are drawn. The method we propose has obvious advantages over traditional techniques with respect to long-distance operation, high speed, and real-time transmission. A marine seismic system with four streamers, each 6000 m long and capable of supporting up to 1920 channels, was designed and built based on this method. The effective transmission baud rate of this system was found to reach up to 240 Mbps, while the minimum sampling interval time was as short as 0.25 ms. This system was found to achieve a good synchronization: 83 ns. Laboratory and in situ experiments showed that this marine-prospecting system could work correctly and robustly, which verifies the feasibility and validity of the method proposed in this paper. In addition to the marine seismic applications, this method can also be used in land seismic applications and certain other transmission applications such as environmental or engineering monitoring systems. (paper)

  13. A high-speed transmission method for large-scale marine seismic prospecting systems

    Science.gov (United States)

    KeZhu, Song; Ping, Cao; JunFeng, Yang; FuMing, Ruan

    2012-12-01

    A marine seismic prospecting system is a kind of data acquisition and transmission system with large-scale coverage and synchronous multi-node acquisition. In this kind of system, data transmission is a fundamental and difficult technique. In this paper, a high-speed data-transmission method is proposed, its implications and limitations are discussed, and conclusions are drawn. The method we propose has obvious advantages over traditional techniques with respect to long-distance operation, high speed, and real-time transmission. A marine seismic system with four streamers, each 6000 m long and capable of supporting up to 1920 channels, was designed and built based on this method. The effective transmission baud rate of this system was found to reach up to 240 Mbps, while the minimum sampling interval time was as short as 0.25 ms. This system was found to achieve a good synchronization: 83 ns. Laboratory and in situ experiments showed that this marine-prospecting system could work correctly and robustly, which verifies the feasibility and validity of the method proposed in this paper. In addition to the marine seismic applications, this method can also be used in land seismic applications and certain other transmission applications such as environmental or engineering monitoring systems.
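
    The headline numbers above can be cross-checked with simple arithmetic. The sketch below assumes 24-bit samples, which is our assumption; the paper's actual word size and framing overhead are not given here.

```python
# Back-of-the-envelope payload check for the quoted system parameters.
channels = 1920             # maximum channel count, from the abstract
sample_interval = 0.25e-3   # s, minimum sampling interval quoted
bits_per_sample = 24        # assumed ADC word size (not stated above)

payload = channels * bits_per_sample / sample_interval  # bits per second
print(f"raw payload ~ {payload / 1e6:.0f} Mbps vs 240 Mbps quoted")
# ~184 Mbps of raw samples, leaving headroom for headers and sync words.
```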

  14. Large Scale Computing and Storage Requirements for High Energy Physics

    International Nuclear Information System (INIS)

    Gerber, Richard A.; Wasserman, Harvey

    2010-01-01

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes…

  15. TIMPs of parasitic helminths - a large-scale analysis of high-throughput sequence datasets.

    Science.gov (United States)

    Cantacessi, Cinzia; Hofmann, Andreas; Pickering, Darren; Navarro, Severine; Mitreva, Makedonka; Loukas, Alex

    2013-05-30

    Tissue inhibitors of metalloproteases (TIMPs) are a multifunctional family of proteins that orchestrate extracellular matrix turnover, tissue remodelling and other cellular processes. In parasitic helminths, such as hookworms, TIMPs have been proposed to play key roles in the host-parasite interplay, including invasion of and establishment in the vertebrate animal hosts. Currently, knowledge of helminth TIMPs is limited to a small number of studies on canine hookworms, whereas no information is available on the occurrence of TIMPs in other parasitic helminths causing neglected diseases. In the present study, we conducted a large-scale investigation of TIMP proteins of a range of neglected human parasites including the hookworm Necator americanus, the roundworm Ascaris suum, the liver flukes Clonorchis sinensis and Opisthorchis viverrini, as well as the schistosome blood flukes. This entailed mining available transcriptomic and/or genomic sequence datasets for the presence of homologues of known TIMPs, predicting secondary structures of defined protein sequences, systematic phylogenetic analyses and assessment of differential expression of genes encoding putative TIMPs in the developmental stages of A. suum, N. americanus and Schistosoma haematobium which infect the mammalian hosts. A total of 15 protein sequences with high homology to known eukaryotic TIMPs were predicted from the complement of sequence data available for parasitic helminths and subjected to in-depth bioinformatic analyses. Supported by the availability of gene manipulation technologies such as RNA interference and/or transgenesis, this work provides a basis for future functional explorations of helminth TIMPs and, in particular, of their role/s in fundamental biological pathways linked to long-term establishment in the vertebrate hosts, with a view towards the development of novel approaches for the control of neglected helminthiases.

  16. A non-destructive approach for assessing decay in preservative treated wood

    NARCIS (Netherlands)

    Machek, L.; Edlund, M.L.; Sierra-Alvarez, R.; Militz, H.

    2004-01-01

    This study investigated the suitability of the non-destructive vibration-impulse excitation technique to assess the attack of preservative-treated wood in contact with the ground. Small stakes (10×5×100 mm³) of treated and untreated Scots pine sapwood were exposed to decay in laboratory-scale…

  17. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications to the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  18. Not Driven by High-Stakes Tests: Exploring Science Assessment and College Readiness of Students from an Urban Portfolio Community High School

    Science.gov (United States)

    Fleshman, Robin Earle

    This case study seeks to explore three research questions: (1) What science teaching and learning processes, perspectives, and cultures exist within the science classroom of an urban portfolio community high school? (2) In what ways does the portfolio-based approach prepare high school students of color for college level science coursework, laboratory work, and assessment? (3) Are portfolio community high school students of color college ready? Is there a relationship between students' science and mathematics performance and college readiness? The overarching objectives of the study are to learn, understand, and describe an urban portfolio community high school as it relates to science assessment and college readiness; to understand how the administration, teachers, and alumni perceive the use of portfolios in science learning and assessment; and to understand how alumni view their preparation and readiness for college and college science coursework, laboratory work, and assessments. The theoretical framework of this study encompasses four theories: critical theory, contextual assessment, self-regulated learning, and ethic of care. Because the urban high school studied partnered with a community-based organization (CBO), it identifies as a community school. Therefore, I provide context regarding the concept, culture, and services of community schools. Case study is the research design I used to explore in-depth this urban portfolio community high school, which involved mixed methods for data collection and analysis. In total, six alumni/current college students, five school members (administrators and teachers), and three CBO members (administrators, including myself) participated in the study. In addition to school artefacts and student portfolios collected, classroom and portfolio panel presentation observations and 13 semi-structured interviews were conducted to understand the portfolio-based approach as it pertains to science learning and assessment and college…

  19. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10⁵ and 10⁸ and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  20. The Managing Emergencies in Paediatric Anaesthesia global rating scale is a reliable tool for simulation-based assessment in pediatric anesthesia crisis management.

    Science.gov (United States)

    Everett, Tobias C; Ng, Elaine; Power, Daniel; Marsh, Christopher; Tolchard, Stephen; Shadrina, Anna; Bould, Matthew D

    2013-12-01

    The use of simulation-based assessments for high-stakes physician examinations remains controversial. The Managing Emergencies in Paediatric Anaesthesia course uses simulation to teach evidence-based management of anesthesia crises to trainee anesthetists in the United Kingdom (UK) and Canada. In this study, we investigated the feasibility and reliability of custom-designed scenario-specific performance checklists and a global rating scale (GRS) assessing readiness for independent practice. After research ethics board approval, subjects were videoed managing simulated pediatric anesthesia crises in a single Canadian teaching hospital. Each subject was randomized to two of six different scenarios. All 60 scenarios were subsequently rated by four blinded raters (two in the UK, two in Canada) using the checklists and GRS. The actual and predicted reliability of the tools was calculated for different numbers of raters using the intraclass correlation coefficient (ICC) and the Spearman-Brown prophecy formula. Average measures ICCs ranged from 'substantial' to 'near perfect' (P ≤ 0.001). The reliability of the checklists and the GRS was similar. Single measures ICCs showed more variability than average measures ICCs. At least two raters would be required to achieve acceptable reliability. We have established the reliability of a GRS to assess the management of simulated crisis scenarios in pediatric anesthesia, and this tool is feasible within the setting of a research study. The global rating scale allows raters to make a judgement regarding a participant's readiness for independent practice. These tools may be used in future research examining simulation-based assessment. © 2013 John Wiley & Sons Ltd.
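
    The Spearman-Brown prophecy formula used above predicts the reliability of the mean of k raters from a single-rater reliability r1: r_k = k·r1 / (1 + (k-1)·r1). A short illustration; the single-measures ICC below is an assumed value, not the study's result.

```python
# Spearman-Brown prophecy: predicted reliability of k averaged raters.
def spearman_brown(r1, k):
    return k * r1 / (1 + (k - 1) * r1)

r1 = 0.55                           # assumed single-measures ICC
for k in (1, 2, 3, 4):
    print(k, round(spearman_brown(r1, k), 2))
# With r1 = 0.55, two raters already push reliability above 0.7,
# in line with the conclusion that at least two raters are required.
```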

  1. Active power reserves evaluation in large scale PVPPs

    DEFF Research Database (Denmark)

    Crăciun, Bogdan-Ionut; Kerekes, Tamas; Sera, Dezso

    2013-01-01

    The present trend of investing in renewable ways of producing electricity, to the detriment of conventional fossil fuel-based plants, will lead to a point where these plants have to provide ancillary services and contribute to overall grid stability. Photovoltaic (PV) power has the fastest growth among all renewable energies and has managed to reach high penetration levels, creating instabilities which at the moment are corrected by conventional generation. This paradigm will change in future scenarios where most of the power is supplied by large scale renewable plants and parts of the ancillary services have to be shared by the renewable plants. The main focus of the proposed paper is to technically and economically analyze the possibility of having active power reserves in large scale PV power plants (PVPPs) without any auxiliary storage equipment. The provided reserves should...
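
    One storage-free way to hold such reserves, consistent with the premise above, is delta-power control: the plant tracks a setpoint a fixed margin below the available maximum-power-point (MPP) production, and the margin is the upward reserve. Whether this matches the paper's exact scheme is our assumption; the numbers are illustrative.

```python
# Delta-power control: curtail output by a fixed fraction of the available
# MPP production so the plant can ramp up on request.
def dispatch(p_available_mw, reserve_fraction=0.1):
    p_set = p_available_mw * (1.0 - reserve_fraction)  # curtailed output
    reserve = p_available_mw - p_set                   # upward reserve held
    return p_set, reserve

for p_mpp in (60.0, 45.0, 20.0):   # MW available as irradiance varies
    p, r = dispatch(p_mpp)
    print(f"MPP {p_mpp:5.1f} MW -> dispatch {p:5.1f} MW, reserve {r:4.1f} MW")
```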

  2. Relationships between high-stakes clinical skills exam scores and program director global competency ratings of first-year pediatric residents

    Science.gov (United States)

    Langenau, Erik E.; Pugliano, Gina; Roberts, William L.

    2011-01-01

    Background Responding to mandates from the Accreditation Council for Graduate Medical Education (ACGME) and American Osteopathic Association (AOA), residency programs have developed competency-based assessment tools. One such tool is the American College of Osteopathic Pediatricians (ACOP) program directors’ annual report. High-stakes clinical skills licensing examinations, such as the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), also assess competency in several clinical domains. Objective The purpose of this study is to investigate the relationships between program director competency ratings of first-year osteopathic residents in pediatrics and COMLEX-USA Level 2-PE scores from 2005 to 2009. Methods The sample included all 94 pediatric first-year residents who took COMLEX-USA Level 2-PE and whose training was reviewed by the ACOP for approval of training between 2005 and 2009. Program director competency ratings and COMLEX-USA Level 2-PE scores (domain and component) were merged and analyzed for relationships. Results Biomedical/biomechanical domain scores were positively correlated with overall program director competency ratings. Humanistic domain scores were not significantly correlated with overall program director competency ratings, but did show moderate correlation with ratings for interpersonal and communication skills. The six ACGME or seven AOA competencies assessed empirically by the ACOP program directors’ annual report could not be recovered by principal component analysis; instead, three factors were identified, accounting for 86% of the variance between competency ratings. Discussion A few significant correlations were noted between COMLEX-USA Level 2-PE scores and program director competency ratings. Exploring relationships between different clinical skills assessments is inherently difficult because of the heterogeneity of tools used and overlap of constructs within the AOA and ACGME core competencies.

  3. Relationships between high-stakes clinical skills exam scores and program director global competency ratings of first-year pediatric residents

    Directory of Open Access Journals (Sweden)

    Erik E. Langenau

    2011-09-01

    Full Text Available Responding to mandates from the Accreditation Council for Graduate Medical Education (ACGME) and American Osteopathic Association (AOA), residency programs have developed competency-based assessment tools. One such tool is the American College of Osteopathic Pediatricians (ACOP) program directors’ annual report. High-stakes clinical skills licensing examinations, such as the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), also assess competency in several clinical domains. The purpose of this study is to investigate the relationships between program director competency ratings of first-year osteopathic residents in pediatrics and COMLEX-USA Level 2-PE scores from 2005 to 2009. The sample included all 94 pediatric first-year residents who took COMLEX-USA Level 2-PE and whose training was reviewed by the ACOP for approval of training between 2005 and 2009. Program director competency ratings and COMLEX-USA Level 2-PE scores (domain and component) were merged and analyzed for relationships. Biomedical/biomechanical domain scores were positively correlated with overall program director competency ratings. Humanistic domain scores were not significantly correlated with overall program director competency ratings, but did show moderate correlation with ratings for interpersonal and communication skills. The six ACGME or seven AOA competencies assessed empirically by the ACOP program directors’ annual report could not be recovered by principal component analysis; instead, three factors were identified, accounting for 86% of the variance between competency ratings. A few significant correlations were noted between COMLEX-USA Level 2-PE scores and program director competency ratings. Exploring relationships between different clinical skills assessments is inherently difficult because of the heterogeneity of tools used and overlap of constructs within the AOA and ACGME core competencies.
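
    The principal component analysis step reported in both records reduces to an eigendecomposition of the ratings covariance with a cumulative-variance cut-off at the 86% figure quoted. A sketch follows; the rating matrix is random placeholder data, so the factor count it prints is illustrative only.

```python
# PCA on a residents-by-competencies rating matrix: how many factors are
# needed to account for ~86% of the variance?
import numpy as np

ratings = np.random.default_rng(1).normal(size=(94, 7))  # placeholder data
ratings -= ratings.mean(axis=0)
cov = np.cov(ratings, rowvar=False)              # 7x7 covariance
eigvals = np.linalg.eigvalsh(cov)[::-1]          # descending eigenvalues
explained = np.cumsum(eigvals) / eigvals.sum()   # cumulative variance share
n_factors = int(np.searchsorted(explained, 0.86) + 1)
print(explained.round(2), "->", n_factors, "factors for 86% variance")
```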

  4. Large scale mapping: an empirical comparison of pixel-based and ...

    African Journals Online (AJOL)

    In the past, large scale mapping was carried out using precise ground survey methods. Later, a paradigm shift in data collection using medium- to low-resolution and, recently, high-resolution images brought to bear the problems of accurate data analysis and fitness-for-purpose challenges. Using high resolution satellite images ...

  5. Computing the universe: how large-scale simulations illuminate galaxies and dark energy

    Science.gov (United States)

    O'Shea, Brian

    2015-04-01

    High-performance and large-scale computing is absolutely essential to understanding astronomical objects such as stars, galaxies, and the cosmic web. This is because these are structures that operate on physical, temporal, and energy scales that cannot be reasonably approximated in the laboratory, and whose complexity and nonlinearity often defy analytic modeling. In this talk, I show how the growth of computing platforms over time has facilitated our understanding of astrophysical and cosmological phenomena, focusing primarily on galaxies and large-scale structure in the Universe.

  6. High Arctic Nitrous Oxide Emissions Found on Large Spatial Scales

    Science.gov (United States)

    Wilkerson, J. P.; Sayres, D. S.; Dobosy, R.; Anderson, J. G.

    2017-12-01

    As the planet warms, greenhouse gas emissions from thawing permafrost can potentially increase the net radiative forcing of the climate system. However, knowledge about Arctic N2O emissions is particularly sparse. Increasing evidence suggests emissions from permafrost thaw may be a significant natural source of N2O. This evidence, though, is either based on lab experiments or in situ chamber studies, which have extremely limited spatial coverage. Consequently, it has not been confirmed to what extent these high emissions are representative of broader Arctic regions. Using an airborne eddy covariance flux technique, we measured N2O fluxes over large regions of Alaska in August 2013. From these measurements, we directly show that large areas of this Arctic region have high N2O emissions.
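
    At its core, the eddy covariance estimate is the mean product of the fluctuating vertical wind and the fluctuating N2O mixing ratio over a flight leg. A minimal sketch with synthetic series standing in for the aircraft data; all numbers are invented.

```python
# Eddy covariance flux estimate: F = mean(w' * c') over a flight leg.
import numpy as np

rng = np.random.default_rng(2)
n = 60_000                                   # e.g. a 50-min leg at 20 Hz
w = rng.normal(0.0, 0.5, n)                  # vertical wind (m/s)
n2o = 330.0 + 0.02 * w + rng.normal(0.0, 0.1, n)  # mixing ratio (ppb)

flux = np.mean((w - w.mean()) * (n2o - n2o.mean()))  # kinematic flux
print(f"w'c' ~ {flux:.4f} ppb m/s")          # > 0 implies net upward emission
```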

  7. Symptom assessment in early psychosis: the use of well-established rating scales in clinical high-risk and recent-onset populations.

    Science.gov (United States)

    Fulford, Daniel; Pearson, Rahel; Stuart, Barbara K; Fisher, Melissa; Mathalon, Daniel H; Vinogradov, Sophia; Loewy, Rachel L

    2014-12-30

    Symptom assessment in early psychosis research typically relies on scales validated in chronic schizophrenia samples. Our goal was to inform investigators who are selecting symptom scales for early psychosis research. We described measure characteristics, baseline scores, and scale inter-relationships in clinical high-risk (CHR) and recent-onset psychotic disorder (RO) samples using the Positive and Negative Syndrome Scale, Brief Psychiatric Rating Scale, Scale for the Assessment of Positive Symptoms, and Scale for the Assessment of Negative Symptoms; for the CHR group only, we included the Scale of Prodromal Symptoms (SOPS). For investigators selecting symptom measures in intervention or longitudinal studies, we also examined the relationship of symptom scales with psychosocial functioning. In both samples, symptom subscales in the same domain, across measures, were moderately to highly intercorrelated. Within all measures, positive symptoms were not correlated with negative symptoms, but disorganized symptoms overlapped with both positive and negative symptoms. Functioning was significantly related to negative and disorganized, but not positive, symptoms in both samples on most measures. Findings suggest strong overlap in symptom severity ratings among the most common scales. In recent-onset samples, each has strengths and weaknesses. In CHR samples, they appear to add little information above and beyond the SOPS. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  8. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    DEFF Research Database (Denmark)

    Jensen, Tue Vissing; Pinson, Pierre

    2017-01-01

    We describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven...... to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation....

  9. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise, for example, in LNG marine transportation accidents, or in liquid-cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small-scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large-scale situations. More recently, there has been a trend towards larger experiments and some of these have produced explosions of moderately high efficiency. Although the occurrence of such large-scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single-drop experiments. In the last two years several theoretical models for large-scale explosions have appeared which attempt a self-contained explanation of at least some stages of such high-yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full…

  10. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.

  11. Large scale particle image velocimetry with helium filled soap bubbles

    Energy Technology Data Exchange (ETDEWEB)

    Bosbach, Johannes; Kuehn, Matthias; Wagner, Claus [German Aerospace Center (DLR), Institute of Aerodynamics and Flow Technology, Goettingen (Germany)

    2009-03-15

    The application of particle image velocimetry (PIV) to measurement of flows on large scales is a challenging necessity especially for the investigation of convective air flows. Combining helium filled soap bubbles as tracer particles with high power quality switched solid state lasers as light sources allows conducting PIV on scales of the order of several square meters. The technique was applied to mixed convection in a full scale double aisle aircraft cabin mock-up for validation of computational fluid dynamics simulations. (orig.)

  12. Large scale particle image velocimetry with helium filled soap bubbles

    Science.gov (United States)

    Bosbach, Johannes; Kühn, Matthias; Wagner, Claus

    2009-03-01

    The application of Particle Image Velocimetry (PIV) to the measurement of flows on large scales is a challenging necessity, especially for the investigation of convective air flows. Combining helium-filled soap bubbles as tracer particles with high-power quality-switched solid-state lasers as light sources allows conducting PIV on scales of the order of several square meters. The technique was applied to mixed convection in a full-scale double-aisle aircraft cabin mock-up for validation of Computational Fluid Dynamics simulations.
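
    Whatever the tracer and illumination hardware, the core PIV evaluation step is a cross-correlation of interrogation windows between two frames; the displacement of the correlation peak, divided by the frame interval, gives the local velocity. A minimal FFT-based sketch (integer-pixel accuracy only; window extraction, sub-pixel peak fitting and outlier rejection are omitted, and the window arrays are assumed inputs):

      import numpy as np

      def piv_displacement(win_a, win_b):
          # Estimate the pixel displacement between two interrogation windows
          # via FFT-based circular cross-correlation.
          fa = np.fft.rfft2(win_a - win_a.mean())
          fb = np.fft.rfft2(win_b - win_b.mean())
          corr = np.fft.irfft2(fa.conj() * fb, s=win_a.shape)
          corr = np.fft.fftshift(corr)
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          return np.array(peak) - np.array(corr.shape) // 2  # (dy, dx) in pixels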

  13. Transient characteristics of current lead losses for the large scale high-temperature superconducting rotating machine

    International Nuclear Information System (INIS)

    Le, T. D.; Kim, J. H.; Park, S. I.; Kim, D. J.; Kim, H. M.; Lee, H. G.; Yoon, Y. S.; Jo, Y. S.; Yoon, K. Y.

    2014-01-01

    To minimize the heat loss of the current leads for a high-temperature superconducting (HTS) rotating machine, the conductor properties and the lead geometry - such as length, cross section, and cooling surface area - are among the significant factors that must be selected. An optimal lead for a large-scale HTS rotating machine has been presented previously. Continuing this work, this paper further reduces the heat loss of the HTS part according to a different model. It also determines the simplifying conditions for an evaluation of the transient characteristics of the main flux-flow loss and eddy-current loss during the charging and discharging period.

  14. Technological stakes of LHC, the large superconducting collider planned at CERN

    International Nuclear Information System (INIS)

    Lebrun, P.

    1991-01-01

    The LHC large superconducting particle collider project is presented, with particular emphasis on its major technological requirements and returns, mostly in the domains of high-field electromagnets, superfluid helium cryogenics, and integration of such advanced techniques in a large machine. The corresponding cooperation and technological transfer to European laboratories and industries are briefly discussed [fr]

  15. IP over optical multicasting for large-scale video delivery

    Science.gov (United States)

    Jin, Yaohui; Hu, Weisheng; Sun, Weiqiang; Guo, Wei

    2007-11-01

    In IPTV systems, multicasting will play a crucial role in the delivery of high-quality video services, since it can significantly improve bandwidth efficiency. However, the scalability and the signal quality of current IPTV can barely compete with existing broadcast digital TV systems, because it is difficult to implement large-scale multicasting with end-to-end guaranteed quality of service (QoS) in a packet-switched IP network. The China 3TNet project aimed to build a high-performance broadband trial network to support large-scale concurrent streaming media and interactive multimedia services. The innovative idea of 3TNet is that an automatically switched optical network (ASON) with the capability of dynamic point-to-multipoint (P2MP) connections replaces the conventional IP multicasting network in the transport core, while the edge remains an IP multicasting network. In this paper, we introduce the network architecture and discuss challenges in such IP-over-optical multicasting for video delivery.

  16. Research on precision grinding technology of large scale and ultra thin optics

    Science.gov (United States)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism errors of large-scale, ultra-thin optics have an important influence on the subsequent polishing efficiency and accuracy. In order to realize high-precision grinding of these ductile elements, a low-deformation vacuum chuck was designed first, which was used for clamping the optics with high supporting rigidity over the full aperture. The optics was then plane-ground under vacuum adsorption. After machining, the vacuum system was turned off and the form error of the optics was measured on-machine using a displacement sensor after elastic restitution. The flatness was converged to high accuracy by compensation machining, whose trajectories were integrated with the measurement result. To obtain high parallelism, the optics was turned over and compensation-ground using the form error of the vacuum chuck. Finally, a grinding experiment on a large-scale, ultra-thin fused silica optic with an aperture of 430 mm × 430 mm × 10 mm was performed. The best P-V flatness of the optic was below 3 μm, and the parallelism was below 3″. This machining technique has been applied in batch grinding of large-scale, ultra-thin optics.
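
    The compensation step can be pictured as turning the on-machine form-error map into an extra removal map from which the grinding trajectories are generated. A schematic sketch under simplifying assumptions (uniform removal rate, error map in µm already registered to machine coordinates; not the authors' algorithm):

      import numpy as np

      def compensation_removal(form_error_um):
          # Additional depth to grind at each point (positive error = high spot)
          # so that the surface converges toward a flat reference plane.
          e = np.asarray(form_error_um, dtype=float)
          return e - e.min()  # grind high spots down to the lowest point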

  17. High-Energy Physics Strategies and Future Large-Scale Projects

    CERN Document Server

    Zimmermann, F

    2015-01-01

    We sketch the current European and international strategies and possible future facilities. In the near term the High Energy Physics (HEP) community will fully exploit the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). Post-LHC options include a linear e+e- collider in Japan (ILC) or at CERN (CLIC), as well as circular lepton or hadron colliders in China (CepC/SppC) and Europe (FCC). We conclude with linear and circular acceleration approaches based on crystals, and some perspectives for the far future of accelerator-based particle physics.

  18. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  19. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    Science.gov (United States)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

    Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over the redshift interval 0.4<z<1.7. Destiny will be used in its third year as a high resolution, wide-field imager to conduct a weak lensing survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  20. Special education and large-scale assessments in the municipality of Sobral (CE)

    Directory of Open Access Journals (Sweden)

    Ana Paula Lima Barbosa Cardoso

    2012-11-01

    Full Text Available This article aims to discuss and analyze the participation of students with disabilities in public schools of the city of Sobral-CE in the large-scale assessments developed in that context. It is a case study with a qualitative approach, conducted within the Department of Education and in two municipal schools, those with the highest and the lowest IDEB results (2009). Data were collected through document analysis, interviews and observation, and examined using content analysis. The theoretical framework discusses large-scale evaluation in the Brazilian context in conjunction with the literature on the assessment of teaching for students with disabilities. We describe the general educational landscape of Sobral and also present data on special education. The research results discuss two cases of large-scale evaluation that occurred in that municipality: the municipal evaluation and Prova Brasil. Regarding the first, the subjects affirm the participation of students with disabilities through a mechanism that prevents their results from affecting those of other students; they are called "children of the shore." In Prova Brasil, the subjects again reported the participation of these students in the national testing, criticizing the appropriateness of that instrument for assessing this particular student body and suggesting the need to develop more "relevant" ones. Finally, it appears that large-scale evaluation calls into question the process of schooling experienced by pupils with disabilities in Sobral-CE, showing the challenges and difficulties of the school inclusion proposals in that context.

  1. Large-Scale Traveling Weather Systems in Mars’ Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-10-01

    Between late fall and early spring, Mars’ middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone in late winter and early spring is present in the western hemisphere, owing to orographic influences from the Tharsis highlands, and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences amongst the simulations and these are presented.

  2. Large-Scale Traveling Weather Systems in Mars Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-01-01

    Between late fall and early spring, Mars' middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone in late winter and early spring is present in the western hemisphere, owing to orographic influences from the Tharsis highlands, and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences amongst the simulations and these are presented.

  3. Study of multi-functional precision optical measuring system for large scale equipment

    Science.gov (United States)

    Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi

    2017-10-01

    The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. The measurement of geometric parameters, such as size, attitude and position, therefore requires a measurement system with high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as laser trackers, total stations and photogrammetry systems, mostly have a single function, require repeated station moves and have other shortcomings. A laser tracker needs to work with a cooperative target, and it can hardly meet the requirements of measurement in extreme environments. A total station is mainly used for outdoor surveying and mapping, and it is hard pressed to achieve the accuracy demanded in industrial measurement. A photogrammetry system can achieve wide-range multi-point measurement, but the measuring range is limited and the station must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can work both by scanning the measurement path and by tracking and measuring a cooperative target. The system is based on several key technologies, such as absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of complex mechanical systems and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures measurement with high accuracy, and the two-dimensional angle measuring module provides precision angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability of large-scale and high-end equipment.
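
    Instruments of this class (one absolute distance plus two orthogonal angles) reduce each 3D coordinate measurement to a spherical-to-Cartesian conversion; at long range the angular terms dominate the error budget, since an angle error δθ displaces the point laterally by roughly d·δθ. A minimal sketch with hypothetical names:

      import numpy as np

      def polar_to_xyz(distance, azimuth, elevation):
          # Convert (absolute distance, horizontal angle, vertical angle),
          # angles in radians, to Cartesian coordinates of the target point.
          x = distance * np.cos(elevation) * np.cos(azimuth)
          y = distance * np.cos(elevation) * np.sin(azimuth)
          z = distance * np.sin(elevation)
          return np.array([x, y, z])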

  4. The Development of a Formative and a Reflective Scale for the Assessment of On-Line Store Usability

    Directory of Open Access Journals (Sweden)

    Timo Christophersen

    2008-10-01

    Full Text Available In usability research, the difference between formative and reflective measurement models for the assessment of latent variables has largely been ignored. As a consequence, many usability scales are misspecified. This might result in reduced scale validity because of the elimination of important usability facets during scale development. The aim of the current study was to develop a questionnaire for the evaluation of on-line store usability (UFOS-V2) that includes both a formative and a reflective scale. 378 subjects participated in a laboratory experimental study. Each participant visited two out of 35 on-line stores. The usability of and intention to buy from both stores were assessed. In addition, actual purchase behaviour was observed by tying the subjects' reward to the decision to buy. In a two-construct PLS structural equation model, the formative usability scale was used as a predictor of the reflective usability measure. Results indicate that the formative usability scale UFOS-V2f forms a valid set of items for the user-based assessment of on-line store usability. The reflective usability scale shows high internal consistency. Positive relationships to intention and decision to buy confirm high scale validity.

  5. Magnetic Properties of Large-Scale Nanostructured Graphene Systems

    DEFF Research Database (Denmark)

    Gregersen, Søren Schou

    The on-going progress in two-dimensional (2D) materials and nanostructure fabrication motivates the study of altered and combined materials. Graphene, the most studied material of the 2D family, displays unique electronic and spintronic properties. Exceptionally high electron mobilities, which surpass those in conventional materials such as silicon, make graphene a very interesting material for high-speed electronics. Simultaneously, long spin-diffusion lengths and spin lifetimes make graphene an eligible spin-transport channel. In this thesis, we explore fundamental features of nanostructured graphene systems using large-scale modeling techniques. Graphene perforations, or antidots, have received substantial interest in the prospect of opening large band gaps in the otherwise gapless graphene. Motivated by recent improvements of fabrication processes, such as forming graphene antidots and layer...

  6. Large-scale digitizer system (LSD) for charge and time digitization in high-energy physics experiments

    International Nuclear Information System (INIS)

    Althaus, R.F.; Kirsten, F.A.; Lee, K.L.; Olson, S.R.; Wagner, L.J.; Wolverton, J.M.

    1976-10-01

    A large-scale digitizer (LSD) system for acquiring charge and time-of-arrival particle data from high-energy-physics experiments has been developed at the Lawrence Berkeley Laboratory. The objective in this development was to significantly reduce the cost of instrumenting large-detector arrays which, for the 4π-geometry of colliding-beam experiments, are proposed with an order of magnitude increase in channel count over previous detectors. In order to achieve the desired economy (approximately $65 per channel), a system was designed in which a number of control signals for conversion, for digitization, and for readout are shared in common by all the channels in each 128-channel bin. The overall-system concept and the distribution of control signals that are critical to the 10-bit charge resolution and to the 12-bit time resolution are described. Also described is the bit-serial transfer scheme, chosen for its low component and cabling costs

  7. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.

  8. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom

    2017-11-22

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.

  9. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom; Femiani, John; Wonka, Peter; Mitra, Niloy J.

    2017-01-01

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.
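
    The fusion step is posed as a binary integer program that globally balances sources of error. As a generic illustration of that problem class (toy costs and a one-hot constraint per footprint; not BigSUR's actual formulation), a small 0-1 selection program can be solved with scipy:

      import numpy as np
      from scipy.optimize import milp, LinearConstraint

      # Toy binary program: pick exactly one candidate interpretation per
      # building footprint, minimizing a fused error cost (placeholder data).
      costs = np.array([0.7, 0.2, 0.9, 0.4, 0.5, 0.3])  # 2 footprints x 3 options
      one_hot = LinearConstraint(np.kron(np.eye(2), np.ones(3)), lb=1, ub=1)
      res = milp(c=costs, constraints=one_hot, integrality=np.ones_like(costs))
      print(res.x.reshape(2, 3))  # selected option per footprint (0/1 entries)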

  10. Solar wind fluctuations at large scale: A comparison between low and high solar activity conditions

    International Nuclear Information System (INIS)

    Bavassano, B.; Bruno, R.

    1991-01-01

    The influence of the Sun's activity cycle on solar wind fluctuations at time scales from 1 hour to 3 days in the inner heliosphere (0.3 to 1 AU) is investigated. Hourly averages of plasma and magnetic field data from the Helios spacecraft are used. Since fluctuations behave quite differently with changing scale, the analysis is performed separately for two different ranges in time scale. Between 1 and 6 hours, Alfvenic fluctuations and pressure-balanced structures are extensively observed. At low solar activity and close to 0.3 AU, Alfvenic fluctuations are more frequent than pressure-balanced structures. This predominance, however, weakens with rising solar activity and radial distance, to the point that the two exchange roles, in terms of occurrence rate, at the maximum of the cycle close to 1 AU. On the other hand, in all cases Alfvenic fluctuations have a larger amplitude than pressure-balanced structures. On the whole, the Alfvenic contribution to the solar wind energy spectrum is dominant under all solar activity conditions. At scales from 0.5 to 3 days the most important feature is the growth, as the solar wind expansion develops, of strong positive correlations between magnetic and thermal pressures. These structures are progressively built up by the interaction between different wind flows. This effect is more pronounced at low than at high activity. Our findings support the conclusion that the solar cycle evolution of the large-scale velocity pattern is the factor governing the observed variations.

  11. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    Science.gov (United States)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for 1/2 and about 78% for 19/20 of the time when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
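
    The classifier stage pairs boosted weak learners with Haar-like features. A toy sketch with scikit-learn (random placeholder data stands in for the million-feature Haar matrix; this illustrates the boosting principle, not the production system):

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(5000, 200))   # placeholder Haar-like feature matrix
      y = rng.integers(0, 2, size=5000)  # placeholder labels: 1 = vehicle

      # AdaBoost over decision stumps (scikit-learn's default weak learner)
      clf = AdaBoostClassifier(n_estimators=100).fit(X, y)
      print(clf.score(X, y))             # training accuracy on the toy data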

  12. Large-Scale Assessment of Change in Student Achievement: Dutch Primary School Students' Results on Written Division in 1997 and 2004 as an Example

    Science.gov (United States)

    van den Heuvel-Panhuizen, Marja; Robitzsch, Alexander; Treffers, Adri; Koller, Olaf

    2009-01-01

    This article discusses large-scale assessment of change in student achievement and takes the study by Hickendorff, Heiser, Van Putten, and Verhelst (2009) as an example. This study compared the achievement of students in the Netherlands in 1997 and 2004 on written division problems. Based on this comparison, they claim that there is a performance…

  13. Large-scale synthesis of high-purity well-aligned carbon nanotubes using pyrolysis of iron(II) phthalocyanine and acetylene

    Science.gov (United States)

    Liu, B. C.; Lee, T. J.; Lee, S. H.; Park, C. Y.; Lee, C. J.

    2003-08-01

    Well-aligned carbon nanotubes (CNTs) with high purity have been produced by pyrolysis of iron(II) phthalocyanine and acetylene at 800 °C. The synthesized CNTs have a length of 75 μm and diameters ranging from 20 to 60 nm. The CNTs have a bamboo-like structure and exhibit good crystallinity of graphite sheets. The growth rate of the CNTs increased rapidly with the addition of C2H2. Our results demonstrate that the proposed growth method is suitable for large-scale synthesis of high-purity well-aligned CNTs on various substrates.

  14. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  15. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  16. The influences of implementing state-mandated science assessment on teacher practice

    Science.gov (United States)

    Katzmann, Jason Matthew

    Four high school Biology teachers, two novice and two experienced, participated in a year-and-a-half case study. By utilizing a naturalistic paradigm, the four individuals were studied in their natural environment, their classrooms. Data sources included three semi-structured interviews, classroom observation field notes, and classroom artifacts. Through cross-case analysis and a constant comparative methodology, coding nodes were combined and refined, resulting in the final themes for discussion. The following research question was investigated: what is the impact of high-stakes testing on high school Biology teachers' instructional planning, instructional practices and classroom assessments? Seven final themes were realized: Assessment, CSAP, Planning, Pressure, Standards, Teaching and Time. Each theme was developed and discussed utilizing each participant's voice. Trustworthiness of this study was established via five avenues: triangulation of data sources, credibility, transferability, dependability and confirmability. A model of the influences of high-stakes testing on teacher practice was developed to describe the seven themes (Figure 5). This model serves as an illustration of the complex nature of teacher practice and the influences upon it. The four participants in this study were influenced by high-stakes assessment. It influenced their instructional decisions, assessment practices, use of time and planning decisions, and decreased the amount of inquiry that occurred in the classroom. Implications of this research and future research directions are described.

  17. Validated assessment scales for the lower face.

    Science.gov (United States)

    Narins, Rhoda S; Carruthers, Jean; Flynn, Timothy C; Geister, Thorin L; Görtelmeyer, Roman; Hardas, Bhushan; Himmrich, Silvia; Jones, Derek; Kerscher, Martina; de Maio, Maurício; Mohrmann, Cornelia; Pooth, Rainer; Rzany, Berthold; Sattler, Gerhard; Buchner, Larry; Benter, Ursula; Breitscheidel, Lusine; Carruthers, Alastair

    2012-02-01

    Aging in the lower face leads to lines, wrinkles, depression of the corners of the mouth, and changes in lip volume and lip shape, with increased sagging of the skin of the jawline. Refined, easy-to-use, validated, objective standards assessing the severity of these changes are required in clinical research and practice. To establish the reliability of eight lower face scales assessing nasolabial folds, marionette lines, upper and lower lip fullness, lip wrinkles (at rest and dynamic), the oral commissure and jawline, aesthetic areas, and the lower face unit. Four 5-point rating scales were developed to objectively assess upper and lower lip wrinkles, oral commissures, and the jawline. Twelve experts rated identical lower face photographs of 50 subjects in two separate rating cycles using eight 5-point scales. Inter- and intrarater reliability of responses was assessed. Interrater reliability was substantial or almost perfect for all lower face scales, aesthetic areas, and the lower face unit. Intrarater reliability was high for all scales, areas and the lower face unit. Our rating scales are reliable tools for valid and reproducible assessment of the aging process in lower face areas. © 2012 by the American Society for Dermatologic Surgery, Inc. Published by Wiley Periodicals, Inc.

  18. Application of Satellite Solar-Induced Chlorophyll Fluorescence to Understanding Large-Scale Variations in Vegetation Phenology and Function Over Northern High Latitude Forests

    Science.gov (United States)

    Jeong, Su-Jong; Schimel, David; Frankenberg, Christian; Drewry, Darren T.; Fisher, Joshua B.; Verma, Manish; Berry, Joseph A.; Lee, Jung-Eun; Joiner, Joanna

    2016-01-01

    This study evaluates the large-scale seasonal phenology and physiology of vegetation over northern high-latitude forests (40 deg - 55 deg N) during spring and fall by using remote sensing of solar-induced chlorophyll fluorescence (SIF), the normalized difference vegetation index (NDVI) and an observation-based estimate of gross primary productivity (GPP) from 2009 to 2011. Based on phenology estimated from GPP, the growing season determined by the SIF time-series is shorter than the growing season length determined solely using NDVI. This is mainly due to the period of high NDVI values being extended, compared to SIF, by about 46 days (+/-11 days), indicating a large-scale seasonal decoupling of physiological activity and changes in greenness in the fall. In addition to phenological timing, mean seasonal NDVI and SIF have different responses to temperature changes throughout the growing season. We observed that both NDVI and SIF increased linearly with temperature throughout the spring. However, in the fall, although NDVI responded linearly to temperature increases, SIF and GPP did not increase linearly with temperature, implying a seasonal hysteresis of SIF and GPP in response to temperature changes across boreal ecosystems throughout their growing season. Seasonal hysteresis of vegetation at large scales is consistent with the known phenomenon that light limits boreal forest ecosystem productivity in the fall. Our results suggest that continuing measurements from satellite remote sensing of both SIF and NDVI can help to understand the differences between, and the information carried by, seasonal variations in vegetation structure and greenness versus physiology at large scales across the critical boreal regions.

  19. Large scale, highly dense nanoholes on metal surfaces by underwater laser assisted hydrogen etching near nanocrystalline boundary

    Energy Technology Data Exchange (ETDEWEB)

    Lin Dong; Zhang, Martin Yi; Ye Chang; Liu Zhikun; Liu, C. Richard [School of Industrial Engineering and Birck Nanotechnology Center, Purdue University, West Lafayette, IN 47906 (United States); Cheng, Gary J., E-mail: gjcheng@purdue.edu [School of Industrial Engineering and Birck Nanotechnology Center, Purdue University, West Lafayette, IN 47906 (United States)

    2012-03-01

    A new method to generate large-scale, highly dense nanoholes is presented in this paper. Under pulsed laser irradiation in water, hydrogen etching is introduced to form high-density nanoholes on the surfaces of AISI 4140 steel and Ti. In order to achieve a higher nanohole density, laser shock peening (LSP) followed by recrystallization is used for grain refinement. It is found that the nanohole density does not increase until recrystallization of the substructures after laser shock peening. The mechanism of nanohole generation is studied in detail. This method can also be applied to generate nanoholes on other materials subject to the hydrogen etching effect.

  20. Large scale, highly dense nanoholes on metal surfaces by underwater laser assisted hydrogen etching near nanocrystalline boundary

    International Nuclear Information System (INIS)

    Lin Dong; Zhang, Martin Yi; Ye Chang; Liu Zhikun; Liu, C. Richard; Cheng, Gary J.

    2012-01-01

    A new method to generate large-scale, highly dense nanoholes is presented in this paper. Under pulsed laser irradiation in water, hydrogen etching is introduced to form high-density nanoholes on the surfaces of AISI 4140 steel and Ti. In order to achieve a higher nanohole density, laser shock peening (LSP) followed by recrystallization is used for grain refinement. It is found that the nanohole density does not increase until recrystallization of the substructures after laser shock peening. The mechanism of nanohole generation is studied in detail. This method can also be applied to generate nanoholes on other materials subject to the hydrogen etching effect.

  1. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  2. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  3. Environmental implications of large-scale adoption of wind power: a scenario-based life cycle assessment

    International Nuclear Information System (INIS)

    Arvesen, Anders; Hertwich, Edgar G

    2011-01-01

    We investigate the potential environmental impacts of a large-scale adoption of wind power to meet up to 22% of the world’s growing electricity demand. The analysis builds on life cycle assessments of generic onshore and offshore wind farms, meant to represent average conditions for global deployment of wind power. We scale unit-based findings to estimate aggregated emissions of building, operating and decommissioning wind farms toward 2050, taking into account changes in the electricity mix in manufacturing. The energy scenarios investigated are the International Energy Agency’s BLUE scenarios. We estimate 1.7–2.6 Gt CO2-eq climate change, 2.1–3.2 Mt N-eq marine eutrophication, 9.2–14 Mt NMVOC photochemical oxidant formation, and 9.5–15 Mt SO2-eq terrestrial acidification impact category indicators due to global wind power in 2007–50. Assuming lifetimes 5 yr longer than reference, the total climate change indicator values are reduced by 8%. In the BLUE Map scenario, construction of new capacity contributes 64%, and repowering of existing capacity 38%, to total cumulative greenhouse gas emissions. The total emissions of wind electricity range between 4% and 14% of the direct emissions of the replaced fossil-fueled power plants. For all impact categories, the indirect emissions of displaced fossil power are larger than the total emissions caused by wind power.

  4. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality...

  5. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  6. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32, The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. Only fragments of the scanned report documentation page survive; they indicate the test was expanded to reduce the spread in the LSGT 50% gap value, and refer to the worst charges, such as those with the highest or lowest densities and the largest re-pressed...

  7. Cluster galaxy dynamics and the effects of large-scale environment

    Science.gov (United States)

    White, Martin; Cohn, J. D.; Smit, Renske

    2010-11-01

    Advances in observational capabilities have ushered in a new era of multi-wavelength, multi-physics probes of galaxy clusters and ambitious surveys are compiling large samples of cluster candidates selected in different ways. We use a high-resolution N-body simulation to study how the influence of large-scale structure in and around clusters causes correlated signals in different physical probes and discuss some implications this has for multi-physics probes of clusters (e.g. richness, lensing, Compton distortion and velocity dispersion). We pay particular attention to velocity dispersions, matching galaxies to subhaloes which are explicitly tracked in the simulation. We find that not only do haloes persist as subhaloes when they fall into a larger host, but groups of subhaloes retain their identity for long periods within larger host haloes. The highly anisotropic nature of infall into massive clusters, and their triaxiality, translates into an anisotropic velocity ellipsoid: line-of-sight galaxy velocity dispersions for any individual halo show large variance depending on viewing angle. The orientation of the velocity ellipsoid is correlated with the large-scale structure, and thus velocity outliers correlate with outliers caused by projection in other probes. We quantify this orientation uncertainty and give illustrative examples. Such a large variance suggests that velocity dispersion estimators will work better in an ensemble sense than for any individual cluster, which may inform strategies for obtaining redshifts of cluster members. We similarly find that the ability of substructure indicators to find kinematic substructures is highly viewing angle dependent. While groups of subhaloes which merge with a larger host halo can retain their identity for many Gyr, they are only sporadically picked up by substructure indicators. We discuss the effects of correlated scatter on scaling relations estimated through stacking, both analytically and in the simulations

  8. Contribution of large scale coherence to wind turbine power: A large eddy simulation study in periodic wind farms

    Science.gov (United States)

    Chatterjee, Tanmoy; Peet, Yulia T.

    2018-03-01

    Length scales of eddies involved in the power generation of infinite wind farms are studied by analyzing the spectra of the turbulent flux of mean kinetic energy (MKE) from large eddy simulations (LES). Large-scale structures with an order of magnitude bigger than the turbine rotor diameter (D ) are shown to have substantial contribution to wind power. Varying dynamics in the intermediate scales (D -10 D ) are also observed from a parametric study involving interturbine distances and hub height of the turbines. Further insight about the eddies responsible for the power generation have been provided from the scaling analysis of two-dimensional premultiplied spectra of MKE flux. The LES code is developed in a high Reynolds number near-wall modeling framework, using an open-source spectral element code Nek5000, and the wind turbines have been modelled using a state-of-the-art actuator line model. The LES of infinite wind farms have been validated against the statistical results from the previous literature. The study is expected to improve our understanding of the complex multiscale dynamics in the domain of large wind farms and identify the length scales that contribute to the power. This information can be useful for design of wind farm layout and turbine placement that take advantage of the large-scale structures contributing to wind turbine power.

  9. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  10. Large-scale, high-definition Ground Penetrating Radar prospection in archaeology

    Science.gov (United States)

    Trinks, I.; Kucera, M.; Hinterleitner, A.; Löcker, K.; Nau, E.; Neubauer, W.; Zitz, T.

    2012-04-01

    A future demand on professional archaeological prospection will be its ability to cover large areas in a time- and cost-efficient manner with very high spatial resolution and accuracy. The objective of the Ludwig Boltzmann Institute for Archaeological Prospection and Virtual Archaeology (LBI ArchPro), established in Vienna in 2010, in collaboration with its eight European partner organisations, is the advancement of state-of-the-art archaeological sciences. The application and specific further development of remote sensing, geophysical prospection and virtual reality applications, as well as of novel integrated interpretation approaches dedicated to non-invasive spatial archaeology combining near-surface prospection methods with advanced computer science, is crucial for modern archaeology. Within the institute's research programme, different areas for distinct case studies in Austria, Germany, Norway, Sweden and the UK have been selected as the basis for the development and testing of new concepts for efficient and universally applicable tools for spatial, non-invasive archaeology. In terms of geophysical prospection, the investigation of entire archaeological landscapes for the exploration and protection of Europe's buried cultural heritage requires new measurement devices, which are fast, accurate and precise. Therefore the further development of motorized, multichannel survey systems and advanced navigation solutions is required. The use of motorized measurement devices for archaeological prospection implicates several technological and methodological challenges. The latest multichannel Ground Penetrating Radar (GPR) arrays, mounted in front of or towed behind motorized survey vehicles, permit large-scale GPR prospection surveys with unprecedented spatial resolution. In particular the motorized 16 channel 400 MHz MALÅ Imaging Radar Array (MIRA) used by the LBI ArchPro in combination with latest automatic data positioning and navigation solutions permits the reliable high

  11. Large Scale Computing and Storage Requirements for High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years.

  12. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  13. Centralized manure digestion. Selection of locations and estimation of costs of large-scale manure storage application

    International Nuclear Information System (INIS)

    1995-03-01

    A study to assess the possibilities and the consequences of the use of existing Dutch large-scale manure silos at centralised anaerobic digestion plants (CAD plants) for manure and energy-rich organic wastes was carried out. Reconstruction of these large-scale manure silos into digesters for a CAD plant is not self-evident, due to the high height/diameter ratio of these silos and the extra investments that have to be made for additional facilities for roofing, insulation, mixing and heating. From the results of an inventory and selection of large-scale manure silos with a storage capacity above 1,500 m3, it appeared that there are 21 locations in The Netherlands that qualify for realisation of a CAD plant with a processing capacity of 100 m3 of biomass (80% manure, 20% additives) per day. These locations are found in particular in the 'shortage areas' for manure fertilisation in the Dutch provinces of Groningen and Drenthe. Three of these 21 locations with large-scale silos are considered the most suitable for realisation of a large-scale CAD plant. The selection is based on an optimal scale for a CAD plant of 300 m3 of material (80% manure, 20% additives) to be processed per day and the most suitable consuming markets for the biogas produced at the CAD plant. The three locations are at Middelharnis, Veendam, and Klazinaveen. Applying the conditions used in this study and accounting for all costs for transport of manure, additives and end product, including the costs for the storage facilities, a break-even operation might be realised at a minimum income for the additives of approximately 50 Dutch guilders per m3 (including TAV). This income price is considerably lower than the prevailing costs for tipping or processing of organic wastes in The Netherlands. This study revealed that a break-even exploitation of a large-scale CAD plant for the processing of manure with energy-rich additives is possible. (Abstract Truncated)

  14. In Praise of Assessment (Done Right)

    Science.gov (United States)

    Marshall, Kim

    2018-01-01

    High-stakes testing gets a lot of criticism, for good reason. But, when done right, assessment can be a valuable tool for educators and students. Kim Marshall describes how different types of assessments can improve learning by revealing learning problems in real time, improving student retention through the "retrieval effect," and…

  15. Large-scale self-assembled zirconium phosphate smectic layers via a simple spray-coating process

    Science.gov (United States)

    Wong, Minhao; Ishige, Ryohei; White, Kevin L.; Li, Peng; Kim, Daehak; Krishnamoorti, Ramanan; Gunther, Robert; Higuchi, Takeshi; Jinnai, Hiroshi; Takahara, Atsushi; Nishimura, Riichi; Sue, Hung-Jue

    2014-04-01

    The large-scale assembly of asymmetric colloidal particles is used in creating high-performance fibres. A similar concept is extended to the manufacturing of thin films of self-assembled two-dimensional crystal-type materials with enhanced and tunable properties. Here we present a spray-coating method to manufacture thin, flexible and transparent epoxy films containing zirconium phosphate nanoplatelets self-assembled into a lamellar arrangement aligned parallel to the substrate. The self-assembled mesophase of zirconium phosphate nanoplatelets is stabilized by epoxy pre-polymer and exhibits rheology favourable towards large-scale manufacturing. The thermally cured film forms a mechanically robust coating and shows excellent gas barrier properties at both low and high humidity levels as a result of the highly aligned and overlapping arrangement of nanoplatelets. This work shows that the large-scale ordering of high aspect ratio nanoplatelets is easier to achieve than previously thought and may have implications in the technological applications for similar materials.

  16. Using Agent-Based Models to Optimize Large-Scale Networks for Large System Inventories

    Science.gov (United States)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent-Based Models (ABM) to optimize the handling capability of large-scale networks for large system inventories and to implement strategies for reducing capital expenses. The models used in this paper rely either on computational algorithms or on procedural implementations developed in Matlab to simulate agent-based models, using computing clusters that provide the high performance needed to run the programs in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.
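
    As a flavor of the approach (a toy in Python rather than the paper's Matlab implementation), each inventory node can be treated as an agent with a simple reorder rule and the network stepped forward in time; aggregate statistics such as total orders placed then feed the cost optimization:

      import random

      class InventoryAgent:
          # Toy agent: holds stock, serves random demand, reorders below a threshold.
          def __init__(self, stock=100, reorder_point=30, order_qty=80):
              self.stock, self.reorder_point = stock, reorder_point
              self.order_qty, self.orders = order_qty, 0

          def step(self):
              self.stock = max(0, self.stock - random.randint(0, 10))  # daily demand
              if self.stock < self.reorder_point:                      # reorder rule
                  self.stock += self.order_qty
                  self.orders += 1

      random.seed(1)
      agents = [InventoryAgent() for _ in range(50)]   # 50 nodes in the network
      for _ in range(365):                             # one simulated year
          for agent in agents:
              agent.step()
      print(sum(agent.orders for agent in agents))     # total orders placed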

  17. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 KW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in the removal of toxic organic chemicals from contaminated water and the disinfection of various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. Estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation will discuss experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications
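
    Because 1 Gy equals 1 J/kg, the calorimetric dose estimate follows directly from the specific heat of water, D ≈ c_p·ΔT. A minimal sketch of that arithmetic (sensor handling and flow corrections omitted):

      CP_WATER = 4186.0  # J/(kg*K), specific heat of water near room temperature

      def absorbed_dose_gray(t_in_c, t_out_c):
          # Absorbed dose (Gy) of a flowing water stream from its temperature rise.
          return CP_WATER * (t_out_c - t_in_c)

      print(absorbed_dose_gray(20.0, 21.2))  # a 1.2 K rise corresponds to ~5 kGy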

  18. Energy Analysis of Cascade Heating with High Back-Pressure Large-Scale Steam Turbine

    Directory of Open Access Journals (Sweden)

    Zhihua Ge

    2018-01-01

    To reduce the exergy loss caused by the high-grade extraction steam of the traditional heating mode of a combined heat and power (CHP) generating unit, a high back-pressure cascade heating technology for two jointly constructed large-scale steam turbine power generating units is proposed. Unit 1 makes full use of the exhaust steam heat from the high back-pressure turbine, while Unit 2 retains the original heating mode of extraction-steam condensation, which significantly reduces the flow rate of high-grade extraction steam. Typical 2 × 350 MW supercritical CHP units in northern China were selected as the object of study. The boundary conditions for heating were determined based on the actual climatic conditions and heating demands. A model to analyze the performance of the high back-pressure cascade heating supply units under off-design operating conditions was developed. The load distributions between high back-pressure exhaust steam direct supply and extraction steam heating supply were described under various conditions, based on which the heating efficiency of the CHP units with the high back-pressure cascade heating system was analyzed. The design heating load and maximum heating supply load were determined as well. The results indicate that the average coal consumption rate during the heating season is 205.46 g/kWh for the design heating load after the retrofit, about 51.99 g/kWh lower than that of the traditional heating mode. A coal consumption rate of 199.07 g/kWh can be achieved for the maximum heating load. Significant energy savings and CO2 emission reductions are obtained.
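
    The quoted savings translate directly into tonnage once a seasonal generation figure is assumed. A back-of-the-envelope Python sketch follows; the seasonal output below is a hypothetical placeholder, not a number from the study.

      rate_retrofit = 205.46   # g coal per kWh, design load after retrofit
      rate_saving = 51.99      # g/kWh below the traditional heating mode
      print(rate_retrofit + rate_saving)  # -> 257.45 g/kWh, traditional mode

      season_output_kwh = 1.0e9           # assumed seasonal generation
      coal_saved_tonnes = rate_saving * season_output_kwh / 1e6
      print(coal_saved_tonnes)            # -> 51990 t of coal per season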

  19. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and the computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from the intensive computational requirements of detailed modeling investigations of real-world reservoirs. This paper presents the application of a massively parallel computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating the flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance
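
    The computational kernel the abstract refers to — one large sparse linear solve per Newton iteration — can be sketched in serial with SciPy. The tridiagonal stand-in Jacobian and random residual below are placeholders for the real discretized flow equations; the parallel TOUGH2 code distributes this work across processors rather than solving it on one node.

      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import bicgstab

      n = 100_000  # toy size; the study's system has ~3.2 million equations
      main = 4.0 * np.ones(n)
      off = -1.0 * np.ones(n - 1)
      J = sp.diags([off, main, off], [-1, 0, 1], format="csr")  # stand-in Jacobian
      R = np.random.default_rng(0).normal(size=n)               # stand-in residual

      dx, info = bicgstab(J, -R)               # Newton update: J dx = -R
      print(info, np.linalg.norm(J @ dx + R))  # info == 0 means convergence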

  20. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  1. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  2. Mediterranean Thermohaline Response to Large-Scale Winter Atmospheric Forcing in a High-Resolution Ocean Model Simulation

    Science.gov (United States)

    Cusinato, Eleonora; Zanchettin, Davide; Sannino, Gianmaria; Rubino, Angelo

    2018-04-01

    Large-scale circulation anomalies over the North Atlantic and Euro-Mediterranean regions described by dominant climate modes, such as the North Atlantic Oscillation (NAO), the East Atlantic pattern (EA), the East Atlantic/Western Russian (EAWR) and the Mediterranean Oscillation Index (MOI), significantly affect interannual-to-decadal climatic and hydroclimatic variability in the Euro-Mediterranean region. However, whereas previous studies assessed the impact of such climate modes on air-sea heat and freshwater fluxes in the Mediterranean Sea, the propagation of these atmospheric forcing signals from the surface toward the interior and the abyss of the Mediterranean Sea remains unexplored. Here, we use a high-resolution ocean model simulation covering the 1979-2013 period to investigate spatial patterns and time scales of the Mediterranean thermohaline response to winter forcing from NAO, EA, EAWR and MOI. We find that these modes significantly imprint on the thermohaline properties in key areas of the Mediterranean Sea through a variety of mechanisms. Typically, density anomalies induced by all modes remain confined in the upper 600 m depth and remain significant for up to 18-24 months. One of the clearest propagation signals refers to the EA in the Adriatic and northern Ionian seas: There, negative EA anomalies are associated to an extensive positive density response, with anomalies that sink to the bottom of the South Adriatic Pit within a 2-year time. Other strong responses are the thermally driven responses to the EA in the Gulf of Lions and to the EAWR in the Aegean Sea. MOI and EAWR forcing of thermohaline properties in the Eastern Mediterranean sub-basins seems to be determined by reinforcement processes linked to the persistency of these modes in multiannual anomalous states. Our study also suggests that NAO, EA, EAWR and MOI could critically interfere with internal, deep and abyssal ocean dynamics and variability in the Mediterranean Sea.

  3. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
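
    The dimension-reduction-plus-clustering step can be sketched with scikit-learn; ordinary (unsupervised) kernel PCA stands in here for the supervised variant used in the study, and the moisture-flux matrix is synthetic.

      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(42)
      X = rng.normal(size=(500, 200))  # 500 flood days x 200 gridded flux features

      Z = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3).fit_transform(X)
      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
      print(np.bincount(labels))       # cluster sizes in the low-dimensional space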

  4. Status of large scale wind turbine technology development abroad

    Institute of Scientific and Technical Information of China (English)

    Ye LI; Lei DUAN

    2016-01-01

    To facilitate the development of large-scale (multi-megawatt) wind turbines in China, the foreign efforts and achievements in the area are reviewed and summarized. Not only the popular horizontal-axis wind turbines on land but also offshore wind turbines, vertical-axis wind turbines, airborne wind turbines, and shrouded wind turbines are discussed. The purpose of this review is to provide a comprehensive commentary and assessment of the basic working principles, economic aspects, and environmental impacts of these turbines.

  5. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  6. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components, in addition to bringing new innovation to new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) system for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE’s newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including (i) CAD to PART software, (ii) selection of energy source, (iii

  7. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large-scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large-scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large-scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large-scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  8. Large-area perovskite nanowire arrays fabricated by large-scale roll-to-roll micro-gravure printing and doctor blading

    Science.gov (United States)

    Hu, Qiao; Wu, Han; Sun, Jia; Yan, Donghang; Gao, Yongli; Yang, Junliang

    2016-02-01

    Organic-inorganic hybrid halide perovskite nanowires (PNWs) show great potential for applications in electronic and optoelectronic devices such as solar cells, field-effect transistors and photodetectors. It is very meaningful to fabricate ordered, large-area PNW arrays and thereby greatly accelerate their application and commercialization in electronic and optoelectronic devices. Herein, highly oriented and ultra-long methylammonium lead iodide (CH3NH3PbI3) PNW array thin films were fabricated by large-scale roll-to-roll (R2R) micro-gravure printing and doctor blading in ambient environments (humidity ~45%, temperature ~28 °C), which produced PNW lengths as long as 15 mm. Furthermore, photodetectors based on these PNWs were successfully fabricated on both silicon oxide (SiO2) and flexible polyethylene terephthalate (PET) substrates and showed moderate performance. This study provides low-cost, large-scale techniques for fabricating large-area PNW arrays with great potential applications in flexible electronic and optoelectronic devices.

  9. Assessment, Student Learning and Classroom Practice: A Review

    Science.gov (United States)

    Amua-Sekyi, Ekua Tekyiwa

    2016-01-01

    Assessment in its various forms has always been a central part of educational practice. Evidence gleaned from the empirical literature suggests that assessment, especially high-stakes external assessment, has an effect on how teachers teach and, consequently, how students learn. Through focus group discussions, this paper draws upon the experiences of…

  10. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  11. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  12. Thermal interaction in crusted melt jets with large-scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Sugiyama, Ken-ichiro; Sotome, Fuminori; Ishikawa, Michio [Hokkaido Univ., Sapporo (Japan). Faculty of Engineering

    1998-01-01

    The objective of the present study is to experimentally observe thermal interaction capable of triggering due to entrainment or entrapment in crusted melt jets with 'large-scale structure'. The present experiment was carried out by dropping 100 grams of molten zinc and molten tin, a mass sufficient to generate large-scale structures in the melt jets. The experimental results show that thermal interaction of the entrapment type occurs in molten-zinc jets with low probability, and thermal interaction of the entrainment type occurs in molten-tin jets with high probability. The difference in thermal interaction between molten zinc and molten tin may be attributed to differences in their kinematic viscosity and melting point. (author)

  13. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  14. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  15. Hybrid Reynolds-Averaged/Large Eddy Simulation of a Cavity Flameholder; Assessment of Modeling Sensitivities

    Science.gov (United States)

    Baurle, R. A.

    2015-01-01

    Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. The cases simulated corresponded to those used to examine this flowfield experimentally using particle image velocimetry. A variety of turbulence models were used for the steady-state Reynolds-averaged simulations which included both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged / large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken to formally assess the performance of the hybrid Reynolds-averaged / large eddy simulation modeling approach in a flowfield of interest to the scramjet research community. The numerical errors were quantified for both the steady-state and scale-resolving simulations prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results showed a high degree of variability when comparing the predictions obtained from each turbulence model, with the non-linear eddy viscosity model (an explicit algebraic stress model) providing the most accurate prediction of the measured values. The hybrid Reynolds-averaged/large eddy simulation results were carefully scrutinized to ensure that even the coarsest grid had an acceptable level of resolution for large eddy simulation, and that the time-averaged statistics were acceptably accurate. The autocorrelation and its Fourier transform were the primary tools used for this assessment. The statistics extracted from the hybrid simulation strategy proved to be more accurate than the Reynolds-averaged results obtained using the linear eddy viscosity models. However, there was no predictive improvement noted over the results obtained from the explicit
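
    The statistical-adequacy check mentioned above can be sketched in a few lines of NumPy: estimate the autocorrelation of a probe signal, integrate it for an integral time scale, and compare the record length against that scale. The synthetic signal and sampling interval below are stand-ins for a velocity series extracted from the hybrid simulation.

      import numpy as np

      rng = np.random.default_rng(0)
      dt = 1e-5                          # sample interval, s (illustrative)
      u = rng.normal(size=8192)
      u = np.convolve(u, np.ones(50) / 50, mode="same")  # add some correlation

      f = u - u.mean()
      acf = np.correlate(f, f, mode="full")[f.size - 1:]
      acf /= acf[0]                      # normalized autocorrelation
      T_int = acf[:200].sum() * dt       # crude integral time scale
      psd = np.abs(np.fft.rfft(f)) ** 2  # its Fourier-transform counterpart

      peak_hz = np.fft.rfftfreq(f.size, d=dt)[psd.argmax()]
      print(T_int, u.size * dt / T_int, peak_hz)  # record length in integral scales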

  16. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  17. Large-scale Estimates of Leaf Area Index from Active Remote Sensing Laser Altimetry

    Science.gov (United States)

    Hopkinson, C.; Mahoney, C.

    2016-12-01

    Leaf area index (LAI) is a key parameter that describes the spatial distribution of foliage within forest canopies, which in turn controls numerous relationships between the ground, canopy, and atmosphere. The retrieval of LAI has demonstrated success by in-situ (digital) hemispherical photography (DHP) and airborne laser scanning (ALS) data; however, field and ALS acquisitions are often spatially limited (100s of km²) and costly. Large-scale (>1000s of km²) retrievals have been demonstrated by optical sensors; however, accuracies remain uncertain due to the sensor's inability to penetrate the canopy. The spaceborne Geoscience Laser Altimeter System (GLAS) provides a possible solution for retrieving large-scale derivations whilst simultaneously penetrating the canopy. LAI retrieved by multiple DHP from 6 Australian sites, representing a cross-section of Australian ecosystems, were employed to model ALS LAI, which in turn were used to infer LAI from GLAS data at 5 other sites. An optimally filtered GLAS dataset was then employed in conjunction with a host of supplementary data to build a Random Forest (RF) model to infer predictions (and uncertainties) of LAI at a 250 m resolution across the forested regions of Australia. Predictions were validated against ALS-based LAI from 20 sites (R² = 0.64, RMSE = 1.1 m² m⁻²); MODIS-based LAI were also assessed against these sites (R² = 0.30, RMSE = 1.78 m² m⁻²) to demonstrate the strength of GLAS-based predictions. The large-scale nature of the current predictions was also leveraged to demonstrate large-scale relationships of LAI with other environmental characteristics, such as canopy height, elevation, and slope. The need for such wide-scale quantification of LAI is key in the assessment and modification of forest management strategies across Australia. Such work also assists Australia's Terrestrial Ecosystem Research Network in fulfilling their government-issued mandates.
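
    The Random Forest step is standard supervised regression; a sketch with scikit-learn follows, in which the predictors (height, elevation, slope, and so on) and the GLAS-derived LAI targets are simulated rather than taken from the study.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.metrics import r2_score, mean_squared_error
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(7)
      X = rng.normal(size=(2000, 5))  # stand-ins for height, elevation, slope...
      y = 2.0 + X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=2000)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
      pred = rf.predict(X_te)
      print(r2_score(y_te, pred), mean_squared_error(y_te, pred) ** 0.5)  # R2, RMSE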

  18. Environmental Impacts of Large Scale Biochar Application Through Spatial Modeling

    Science.gov (United States)

    Huber, I.; Archontoulis, S.

    2017-12-01

    In an effort to study the environmental (emissions, soil quality) and production (yield) impacts of biochar application at regional scales, we coupled the APSIM-Biochar model with the pSIMS parallel platform. So far the majority of biochar research has been concentrated on lab-to-field studies to advance scientific knowledge. Regional-scale assessments are highly needed to assist decision making. The overall objective of this simulation study was to identify areas in the USA that gain the most environmentally from biochar application, as well as areas for which our model predicts a notable yield increase due to the addition of biochar. We present the modifications in both the APSIM-Biochar and pSIMS components that were necessary to facilitate these large-scale model runs across several regions in the United States at a resolution of 5 arcminutes. This study uses the AgMERRA global climate data set (1980-2010) and the Global Soil Dataset for Earth Systems modeling as a basis for creating its simulations, as well as local management operations for maize and soybean cropping systems and different biochar application rates. The regional-scale simulation analysis is in progress. Preliminary results showed that the model predicts that high-quality soils (particularly those common to Iowa cropping systems) do not receive much, if any, production benefit from biochar. However, soils with low soil organic matter (<0.5%) do get a noteworthy yield increase of around 5-10% in the best cases. We also found N2O emissions to be spatially and temporally specific: they increase in some areas and decrease in others due to biochar application. In contrast, we found increases in soil organic carbon and plant-available water in all soils (top 30 cm) due to biochar application. The magnitude of these increases (% change from the control) was larger in soils with low organic matter (below 1.5%) and smaller in soils with high organic matter (above 3%), and also dependent on biochar

  19. Applicability of vector processing to large-scale nuclear codes

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.

    1982-03-01

    To meet the growing trend of computational requirements in JAERI, introduction of a high-speed computer with vector processing capability (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes for pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed with respect to large-scale nuclear codes by examining the following items: 1) The present feature of computational load in JAERI is analyzed by compiling the computer utilization statistics. 2) Vector processing efficiency is estimated for the ten most heavily used nuclear codes by analyzing their dynamic behavior when run on a scalar machine. 3) Vector processing efficiency is measured for five other nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1. 4) The effectiveness of applying a high-speed vector processor to nuclear codes is evaluated by taking account of the characteristics of JAERI jobs. Problems of vector processors are also discussed from the viewpoints of code performance and ease of use. (author)

  20. Large-Scale Nanophotonic Solar Selective Absorbers for High-Efficiency Solar Thermal Energy Conversion.

    Science.gov (United States)

    Li, Pengfei; Liu, Baoan; Ni, Yizhou; Liew, Kaiyang Kevin; Sze, Jeff; Chen, Shuo; Shen, Sheng

    2015-08-19

    An omnidirectional nanophotonic solar selective absorber is fabricated on a large scale using a template-stripping method. The nanopyramid nickel structure achieves an average absorptance of 95% at wavelengths below 1.3 μm and a low emittance of less than 10% at wavelengths >2.5 μm.

  1. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  2. A review of large-scale solar heating systems in Europe

    International Nuclear Information System (INIS)

    Fisch, M.N.; Guigas, M.; Dalenback, J.O.

    1998-01-01

    Large-scale solar applications benefit from the effect of scale. Compared to small solar domestic hot water (DHW) systems for single-family houses, the solar heat cost can be cut by at least a third. The most interesting projects for replacing fossil fuels and reducing CO2 emissions are solar systems with seasonal storage in combination with gas or biomass boilers. In the framework of the EU-APAS project Large-scale Solar Heating Systems, thirteen existing plants in six European countries have been evaluated. The yearly solar gains of the systems are between 300 and 550 kWh per m² of collector area. The investment cost of solar plants with short-term storage varies from 300 up to 600 ECU per m². Systems with seasonal storage show investment costs twice as high. Results of studies concerning the market potential for solar heating plants, taking new collector concepts and industrial production into account, are presented. Site-specific studies and predesign of large-scale solar heating plants in six European countries for housing developments show a 50% cost reduction compared to existing projects. The cost-benefit ratio for the planned systems with long-term storage is between 0.7 and 1.5 ECU per kWh per year. (author)

  3. Google Street View as an alternative method to car surveys in large-scale vegetation assessments.

    Science.gov (United States)

    Deus, Ernesto; Silva, Joaquim S; Catry, Filipe X; Rocha, Miguel; Moreira, Francisco

    2015-10-01

    Car surveys (CS) are a common method for assessing the distribution of alien invasive plants. Google Street View (GSV), a free-access web technology where users may experience a virtual travel along roads, has been suggested as a cost-effective alternative to car surveys. We tested if we could replicate the results from a countrywide survey conducted by car in Portugal using GSV as a remote sensing tool, aiming at assessing the distribution of Eucalyptus globulus Labill. wildlings on roadsides adjacent to eucalypt stands. Georeferenced points gathered along CS were used to create road transects visible as lines overlapping the road in GSV environment, allowing surveying the same sampling areas using both methods. This paper presents the results of the comparison between the two methods. Both methods produced similar models of plant abundance, selecting the same explanatory variables, in the same hierarchical order of importance and depicting a similar influence on plant abundance. Even though the GSV model had a lower performance and the GSV survey detected fewer plants, additional variables collected exclusively with GSV improved model performance and provided a new insight into additional factors influencing plant abundance. The survey using GSV required ca. 9 % of the funds and 62 % of the time needed to accomplish the CS. We conclude that GSV may be a cost-effective alternative to CS. We discuss some advantages and limitations of GSV as a survey method. We forecast that GSV may become a widespread tool in road ecology, particularly in large-scale vegetation assessments.

  4. Putting the Focus on Student Engagement: The Benefits of Performance-Based Assessment

    Science.gov (United States)

    Barlowe, Avram; Cook, Ann

    2016-01-01

    For more than two decades, the New York Performance Standards Consortium, a coalition of 38 public high schools, has steered clear of high-stakes testing, which superficially assesses student learning. Instead, the consortium's approach relies on performance-based assessments--essays, research papers, science experiments, and high-level mathematical…

  5. Secure File Allocation and Caching in Large-scale Distributed Systems

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Mei, Alessandro; Jajodia, Sushil

    2012-01-01

    In this paper, we present a file allocation and caching scheme that guarantees high assurance, availability, and load balancing in a large-scale distributed file system that can support dynamic updates of authorization policies. The scheme uses fragmentation and replication to store files with high security requirements in a system composed of a majority of low-security servers. We develop mechanisms to fragment files, to allocate them into multiple servers, and to cache them as close as possible to their readers while preserving the security requirements of the files, providing load-balancing and reducing the delay of read operations. The system offers a trade-off between performance and security that is dynamically tunable according to the current level of threat. We validate our mechanisms with extensive simulations in an Internet-like network.

  6. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches to getting the best performance out of hybrid cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment based on market forces.

  7. Large-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.

    2012-12-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  8. Just enough inflation. Power spectrum modifications at large scales

    International Nuclear Information System (INIS)

    Cicoli, Michele; Downes, Sean

    2014-07-01

    We show that models of 'just enough' inflation, where the slow-roll evolution lasted only 50-60 e-foldings, feature modifications of the CMB power spectrum at large angular scales. We perform a systematic and model-independent analysis of any possible non-slow-roll background evolution prior to the final stage of slow-roll inflation. We find a high degree of universality, since most common backgrounds like fast-roll evolution, matter- or radiation-dominance give rise to a power loss at large angular scales and a peak together with an oscillatory behaviour at scales around the value of the Hubble parameter at the beginning of slow-roll inflation. Depending on the value of the equation-of-state parameter, different pre-inflationary epochs lead instead to an enhancement of power at low l, and so seem disfavoured by recent observational hints of a lack of CMB power at l ≲ 40. We also comment on the importance of initial conditions and the possibility of having multiple pre-inflationary stages.

  9. Volume measurement study for large scale input accountancy tank

    International Nuclear Information System (INIS)

    Uchikoshi, Seiji; Watanabe, Yuichi; Tsujino, Takeshi

    1999-01-01

    The Large Scale Tank Calibration (LASTAC) facility, including an experimental tank which has the same volume and structure as the input accountancy tank of the Rokkasho Reprocessing Plant (RRP), was constructed at the Nuclear Material Control Center of Japan. Demonstration experiments have been carried out to evaluate the precision of solution volume measurement and to establish a procedure for highly accurate pressure measurement in a large-scale tank with a dip-tube bubbler probe system, to be applied to the input accountancy tank of the RRP. Solution volume in a tank is determined by substituting the solution level into the calibration function obtained in advance, which expresses the relation between the solution level and its volume in the tank. Therefore, precise solution volume measurement needs a precise calibration function that is determined carefully. The LASTAC calibration experiments using pure water showed good reproducibility. (J.P.N.)
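
    The calibration-function idea reduces to fitting level-versus-volume pairs and then inverting a measured level into a volume. A minimal Python sketch with invented data points follows; a low-order polynomial is assumed here, whereas the real function is determined from careful calibration runs.

      import numpy as np

      level_mm = np.array([100, 400, 800, 1200, 1600, 2000])    # dip-tube levels
      volume_l = np.array([310, 1255, 2512, 3770, 5028, 6290])  # metered volumes

      calib = np.poly1d(np.polyfit(level_mm, volume_l, deg=2))  # calibration fn

      measured_level = 950.0
      print(calib(measured_level))  # solution volume inferred from the level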

  10. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times before reaching a stable state, where Heaps' law still exists with the disappearance of strict Zipf's law. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease.
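
    Both scalings can be measured from any event stream. In the sketch below, random draws from a heavy-tailed distribution stand in for the sequence of affected locations; the slopes of the two log-log fits estimate the Zipf and Heaps exponents.

      import numpy as np
      from collections import Counter

      rng = np.random.default_rng(0)
      seq = rng.zipf(a=2.0, size=50_000)  # synthetic event stream

      # Zipf: frequency versus rank
      freqs = np.array(sorted(Counter(seq).values(), reverse=True))
      ranks = np.arange(1, freqs.size + 1)
      print(np.polyfit(np.log(ranks[:100]), np.log(freqs[:100]), 1)[0])

      # Heaps: number of distinct items versus sequence length
      seen, heaps = set(), []
      for x in seq:
          seen.add(x)
          heaps.append(len(seen))
      n = np.arange(1, len(heaps) + 1)
      print(np.polyfit(np.log(n), np.log(heaps), 1)[0])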

  11. The Validity of Value-Added Estimates from Low-Stakes Testing Contexts: The Impact of Change in Test-Taking Motivation and Test Consequences

    Science.gov (United States)

    Finney, Sara J.; Sundre, Donna L.; Swain, Matthew S.; Williams, Laura M.

    2016-01-01

    Accountability mandates often prompt assessment of student learning gains (e.g., value-added estimates) via achievement tests. The validity of these estimates has been questioned when performance on tests is low stakes for students. To assess the effects of motivation on value-added estimates, we assigned students to one of three test consequence…

  12. Large-scale risk assessment of polycyclic aromatic hydrocarbons in shoreline sediments from Saudi Arabia: Environmental legacy after twelve years of the Gulf war oil spill

    Energy Technology Data Exchange (ETDEWEB)

    Bejarano, Adriana C., E-mail: ABejarano@researchplanning.co [Research Planning Inc., 1121 Park St., Columbia, SC 29201 (United States); Michel, Jacqueline [Research Planning Inc., 1121 Park St., Columbia, SC 29201 (United States)

    2010-05-15

    A large-scale assessment of polycyclic aromatic hydrocarbons (PAHs) from the 1991 Gulf War oil spill was performed for 2002-2003 sediment samples (n = 1679) collected from habitats along the shoreline of Saudi Arabia. Benthic sediment toxicity was characterized using the Equilibrium Partitioning Sediment Benchmark Toxic Unit approach for 43 PAHs (ESBTU FCV,43). Samples were assigned to risk categories according to ESBTU FCV,43 values: no-risk (≤1), low (>1-≤2), low-medium (>2-≤3), medium (>3-≤5) and high-risk (>5). Sixty-seven percent of samples had ESBTU FCV,43 > 1, indicating potential adverse ecological effects. Sediments from the 0-30 cm layer from tidal flats, and the >30-<60 cm layer from heavily oiled halophytes and mangroves, had a high frequency of high-risk samples. No-risk samples were characterized by chrysene enrichment and depletion of lighter molecular weight PAHs, while high-risk samples showed little oil weathering and PAH patterns similar to 1993 samples. Sediments north of Safaniya were not likely to pose adverse ecological effects, contrary to sediments south of Tanaqib. Landscape and geomorphology have played a role in the distribution and persistence in sediments of oil from the Gulf War. - Risk assessment of PAHs in shoreline sediments 12 years after the Gulf War oil spill.

  13. Large-scale risk assessment of polycyclic aromatic hydrocarbons in shoreline sediments from Saudi Arabia: Environmental legacy after twelve years of the Gulf war oil spill

    International Nuclear Information System (INIS)

    Bejarano, Adriana C.; Michel, Jacqueline

    2010-01-01

    A large-scale assessment of polycyclic aromatic hydrocarbons (PAHs) from the 1991 Gulf War oil spill was performed for 2002-2003 sediment samples (n = 1679) collected from habitats along the shoreline of Saudi Arabia. Benthic sediment toxicity was characterized using the Equilibrium Partitioning Sediment Benchmark Toxic Unit approach for 43 PAHs (ESBTU FCV,43). Samples were assigned to risk categories according to ESBTU FCV,43 values: no-risk (≤1), low (>1-≤2), low-medium (>2-≤3), medium (>3-≤5) and high-risk (>5). Sixty-seven percent of samples had ESBTU FCV,43 > 1, indicating potential adverse ecological effects. Sediments from the 0-30 cm layer from tidal flats, and the >30-<60 cm layer from heavily oiled halophytes and mangroves, had a high frequency of high-risk samples. No-risk samples were characterized by chrysene enrichment and depletion of lighter molecular weight PAHs, while high-risk samples showed little oil weathering and PAH patterns similar to 1993 samples. Sediments north of Safaniya were not likely to pose adverse ecological effects, contrary to sediments south of Tanaqib. Landscape and geomorphology have played a role in the distribution and persistence in sediments of oil from the Gulf War. - Risk assessment of PAHs in shoreline sediments 12 years after the Gulf War oil spill.
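
    The risk categorization used in both versions of this record translates directly into code; the sample values below are invented.

      def risk_category(esbtu):
          """Map an ESBTU(FCV,43) value to the study's risk class."""
          if esbtu <= 1:
              return "no-risk"
          if esbtu <= 2:
              return "low"
          if esbtu <= 3:
              return "low-medium"
          if esbtu <= 5:
              return "medium"
          return "high-risk"

      for v in (0.4, 1.5, 2.7, 4.0, 8.2):
          print(v, risk_category(v))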

  14. A large-scale soil-structure interaction experiment: Part I design and construction

    International Nuclear Information System (INIS)

    Tang, H.T.; Tang, Y.K.; Wall, I.B.; Lin, E.

    1987-01-01

    In the simulated earthquake experiments (SIMQUAKE) sponsored by EPRI, the detonation of vertical arrays of explosives propagated wave motions through the ground to the model structures. Although such a simulation can provide information about dynamic soil-structure interaction (SSI) characteristics in a strong-motion environment, it lacks the seismic wave scattering characteristics needed for studying seismic input to the soil-structure system and the effect of different kinds of wave composition on the soil-structure response. To supplement the inadequacy of the simulated earthquake SSI experiment, the Electric Power Research Institute (EPRI) and the Taiwan Power Company (Taipower) jointly sponsored a large-scale SSI experiment in the field. The objectives of the experiment are: (1) to obtain an actual database of strong-motion-earthquake-induced responses in a soft-soil environment, which will substantiate predictive and design SSI models; and (2) to assess the dynamic response and margins of nuclear power plant reactor containment internal components relating to actual earthquake-induced excitation. These objectives are accomplished by recording and analyzing data from two instrumented, scaled-down (1/4- and 1/12-scale) reinforced concrete containments sited in a high-seismicity region in Taiwan where a strong-motion seismic array network is located

  15. Multivariate statistics high-dimensional and large-sample approximations

    CERN Document Server

    Fujikoshi, Yasunori; Shimizu, Ryoichi

    2010-01-01

    A comprehensive examination of high-dimensional analysis of multivariate methods and their real-world applications Multivariate Statistics: High-Dimensional and Large-Sample Approximations is the first book of its kind to explore how classical multivariate methods can be revised and used in place of conventional statistical tools. Written by prominent researchers in the field, the book focuses on high-dimensional and large-scale approximations and details the many basic multivariate methods used to achieve high levels of accuracy. The authors begin with a fundamental presentation of the basic

  16. REIONIZATION ON LARGE SCALES. I. A PARAMETRIC MODEL CONSTRUCTED FROM RADIATION-HYDRODYNAMIC SIMULATIONS

    International Nuclear Information System (INIS)

    Battaglia, N.; Trac, H.; Cen, R.; Loeb, A.

    2013-01-01

    We present a new method for modeling inhomogeneous cosmic reionization on large scales. Utilizing high-resolution radiation-hydrodynamic simulations with 2048³ dark matter particles, 2048³ gas cells, and 17 billion adaptive rays in a L = 100 Mpc h⁻¹ box, we show that the density and reionization redshift fields are highly correlated on large scales (≳1 Mpc h⁻¹). This correlation can be statistically represented by a scale-dependent linear bias. We construct a parametric function for the bias, which is then used to filter any large-scale density field to derive the corresponding spatially varying reionization redshift field. The parametric model has three free parameters that can be reduced to one free parameter when we fit the two bias parameters to simulation results. We can differentiate degenerate combinations of the bias parameters by combining results for the global ionization histories and the correlation length between ionized regions. Unlike previous semi-analytic models, the evolution of the reionization redshift field in our model is directly compared cell by cell against simulations and performs well in all tests. Our model maps the high-resolution, intermediate-volume radiation-hydrodynamic simulations onto lower-resolution, larger-volume N-body simulations (≳2 Gpc h⁻¹) in order to make mock observations and theoretical predictions
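
    The mapping the abstract describes — filtering a density field with a scale-dependent linear bias b(k) to obtain a reionization-redshift field — can be sketched with NumPy FFTs. The parametric form and parameter values of b(k) below are placeholders, not the fitted values from the paper.

      import numpy as np

      n, box = 128, 100.0                     # grid cells per side, Mpc/h
      rng = np.random.default_rng(1)
      delta = rng.normal(size=(n, n, n))      # stand-in density contrast field

      k = 2 * np.pi * np.fft.fftfreq(n, d=box / n)
      kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
      kmag = np.sqrt(kx**2 + ky**2 + kz**2)
      kmag[0, 0, 0] = 1.0                     # avoid division by zero at k = 0

      b0, k0, alpha = 0.6, 0.2, 1.0           # hypothetical bias parameters
      bias = b0 / (1.0 + kmag / k0) ** alpha  # assumed parametric b(k)

      dz = np.real(np.fft.ifftn(bias * np.fft.fftn(delta)))  # z_re fluctuation
      z_re = 8.0 + dz                         # around an assumed mean redshift
      print(z_re.mean(), z_re.std())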

  17. Towards a large scale high energy cosmic neutrino undersea detector

    Energy Technology Data Exchange (ETDEWEB)

    Azoulay, R.; Berthier, R. [CEA Centre d'Etudes de Cadarache, 13 - Saint-Paul-lez-Durance (France). Direction des Sciences de la Matiere; Arpesella, C. [Centre National de la Recherche Scientifique (CNRS), 13 - Marseille (France). Centre de Physique Theorique] [and others]

    1997-06-01

    The ANTARES collaboration proposes to study high-energy cosmic neutrinos by using a deep-sea Cherenkov detector. The potential interest of such a study for astrophysicists and particle physicists is developed. The different origins of cosmic neutrinos are reviewed. In order to observe the flux of neutrinos from extra-galactic sources with relevant statistics, a km-scale detector is necessary. The feasibility of such a detector is studied. A variety of technical problems have been solved. Some of them are standard for particle physicists: choice of photomultipliers, monitoring, trigger, electronics, data acquisition, detector optimization. Others are more specific to sea science and engineering, particularly: detector deployment in the deep sea, data transmission through optical cables, bio-fouling, and the effect of sea currents. The solutions are presented, and the sea engineering part involving detector installation will be tested near the French coast. It is scheduled to build a reduced-scale demonstrator within the next 2 years. (A.C.) 50 refs.

  18. Towards a large scale high energy cosmic neutrino undersea detector

    International Nuclear Information System (INIS)

    Azoulay, R.; Berthier, R.; Arpesella, C.

    1997-06-01

    The ANTARES collaboration proposes to study high-energy cosmic neutrinos by using a deep-sea Cherenkov detector. The potential interest of such a study for astrophysicists and particle physicists is developed. The different origins of cosmic neutrinos are reviewed. In order to observe the flux of neutrinos from extra-galactic sources with relevant statistics, a km-scale detector is necessary. The feasibility of such a detector is studied. A variety of technical problems have been solved. Some of them are standard for particle physicists: choice of photomultipliers, monitoring, trigger, electronics, data acquisition, detector optimization. Others are more specific to sea science and engineering, particularly: detector deployment in the deep sea, data transmission through optical cables, bio-fouling, and the effect of sea currents. The solutions are presented, and the sea engineering part involving detector installation will be tested near the French coast. It is scheduled to build a reduced-scale demonstrator within the next 2 years. (A.C.)

  19. Design of an omnidirectional single-point photodetector for large-scale spatial coordinate measurement

    Science.gov (United States)

    Xie, Hongbo; Mao, Chensheng; Ren, Yongjie; Zhu, Jigui; Wang, Chao; Yang, Lei

    2017-10-01

    In high-precision, large-scale coordinate measurement, one commonly used approach to determining the coordinates of a target point is to utilize the spatial trigonometric relationships between multiple laser transmitter stations and the target point. A light-receiving device at the target point is the key element in large-scale coordinate measurement systems. To ensure high-resolution and highly sensitive spatial coordinate measurement, a high-performance and miniaturized omnidirectional single-point photodetector (OSPD) is greatly desired. We report a design of OSPD using an aspheric lens, which achieves an enhanced reception angle of -5 deg to 45 deg in the vertical and 360 deg in the horizontal. As the heart of our OSPD, the aspheric lens is designed in a geometric model and optimized with LightTools software, which enables the reflection of a wide-angle incident light beam onto the single-point photodiode. The performance of the home-made OSPD is characterized at working distances from 1 to 13 m and further analyzed using the developed geometric model. The experimental and analytic results verify that our device is highly suitable for large-scale coordinate metrology. The developed device also holds great potential in various applications such as omnidirectional vision sensors, indoor global positioning systems, and optical wireless communication systems.

  20. Large scale particle simulations in a virtual memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Million, R.; Wagner, J.S.; Tajima, T.

    1983-01-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence reduce the required computing time. (orig.)
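
    The locality trick the abstract describes — a nominal amount of sorting so that spatially adjacent particles sit near each other in memory — looks roughly like this in NumPy; the array sizes and the 1-D grid are illustrative only.

      import numpy as np

      rng = np.random.default_rng(0)
      n_part, n_cells = 1_000_000, 1024
      x = rng.uniform(0.0, 1.0, size=n_part)  # particle positions
      v = rng.normal(size=n_part)             # particle velocities

      cell = np.minimum((x * n_cells).astype(np.int64), n_cells - 1)
      order = np.argsort(cell, kind="stable") # sort particles by grid cell
      x, v, cell = x[order], v[order], cell[order]

      # Charge accumulation now walks memory nearly sequentially:
      rho = np.bincount(cell, minlength=n_cells).astype(float)
      print(rho[:5])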