WorldWideScience

Sample records for large-scale writing assessment

  1. Differences Across Levels in the Language of Agency and Ability in Rating Scales for Large-Scale Second Language Writing Assessments

    OpenAIRE

    Anderson Salena Sampson

    2017-01-01

    While large-scale language and writing assessments benefit from a wealth of literature on the reliability and validity of specific tests and rating procedures, there is comparatively less literature that explores the specific language of second language writing rubrics. This paper provides an analysis of the language of performance descriptors for the public versions of the TOEFL and IELTS writing assessment rubrics, with a focus on linguistic agency encoded by agentive verbs and language of ...

  2. Differences Across Levels in the Language of Agency and Ability in Rating Scales for Large-Scale Second Language Writing Assessments

    Directory of Open Access Journals (Sweden)

    Anderson Salena Sampson

    2017-12-01

    Full Text Available While large-scale language and writing assessments benefit from a wealth of literature on the reliability and validity of specific tests and rating procedures, there is comparatively less literature that explores the specific language of second language writing rubrics. This paper provides an analysis of the language of performance descriptors for the public versions of the TOEFL and IELTS writing assessment rubrics, with a focus on linguistic agency encoded by agentive verbs and language of ability encoded by modal verbs can and cannot. While the IELTS rubrics feature more agentive verbs than the TOEFL rubrics, both pairs of rubrics feature uneven syntax across the band or score descriptors with either more agentive verbs for the highest scores, more nominalization for the lowest scores, or language of ability exclusively in the lowest scores. These patterns mirror similar patterns in the language of college-level classroom-based writing rubrics, but they differ from patterns seen in performance descriptors for some large-scale admissions tests. It is argued that the lack of syntactic congruity across performance descriptors in the IELTS and TOEFL rubrics may reflect a bias in how actual student performances at different levels are characterized.

  3. Comparing the Effectiveness of Self-Paced and Collaborative Frame-of-Reference Training on Rater Accuracy in a Large-Scale Writing Assessment

    Science.gov (United States)

    Raczynski, Kevin R.; Cohen, Allan S.; Engelhard, George, Jr.; Lu, Zhenqiu

    2015-01-01

    There is a large body of research on the effectiveness of rater training methods in the industrial and organizational psychology literature. Less has been reported in the measurement literature on large-scale writing assessments. This study compared the effectiveness of two widely used rater training methods--self-paced and collaborative…

  4. Large-Scale Direct-Writing of Aligned Nanofibers for Flexible Electronics.

    Science.gov (United States)

    Ye, Dong; Ding, Yajiang; Duan, Yongqing; Su, Jiangtao; Yin, Zhouping; Huang, Yong An

    2018-05-01

    Nanofibers/nanowires usually exhibit exceptionally low flexural rigidities and remarkable tolerance against mechanical bending, showing superior advantages in flexible electronics applications. Electrospinning is regarded as a powerful process for producing this 1D nanostructure; however, it is only able to produce chaotic fibers that are incompatible with the well-patterned microstructures in flexible electronics. Electro-hydrodynamic (EHD) direct-writing technology enables large-scale deposition of highly aligned nanofibers in an additive, noncontact, real-time-adjustable, and individually controllable manner on rigid or flexible, planar or curved substrates, making it rather attractive for the fabrication of flexible electronics. In this Review, the ground-breaking research progress in the field of EHD direct-writing technology is summarized, including a brief chronology of EHD direct-writing techniques, basic principles and alignment strategies, and applications in flexible electronics. Finally, future prospects are suggested to advance flexible electronics based on orderly arranged EHD direct-written fibers. This technology overcomes the resolution and ink-viscosity limitations of conventional inkjet printing and represents a major advance in the manufacturing of flexible electronics. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Materials for Assessing the Writing Skill

    Science.gov (United States)

    Nimehchisalem, Vahid

    2010-01-01

    This paper reviews the issues of concern in writing scale development in English as a Second Language (ESL) settings with an intention to provide a useful guide for researchers or writing teachers who wish to develop or adapt valid, reliable and efficient writing scales considering their present assessment situations. With a brief discussion on the…

  6. Materials for Assessing the Writing Skill

    Directory of Open Access Journals (Sweden)

    Vahid Nimehchisalem

    2010-07-01

    Full Text Available This paper reviews the issues of concern in writing scale development in English as a Second Language (ESL) settings with an intention to provide a useful guide for researchers or writing teachers who wish to develop or adapt valid, reliable and efficient writing scales considering their present assessment situations. With a brief discussion on the rationale behind writing scales, the author considers the process of scale development by breaking it into three phases of design, operationalization and administration. The issues discussed in the first phase include analyzing the samples, deciding on the type of scale and ensuring the validity of its design. Phase two encompasses setting the scale criteria, operationalization of definitions, setting a numerical value, assigning an appropriate weight for each trait, and accounting for validity and reliability. The final phase comprises recommendations on how a writing scale should be used.

  7. Secondary Students' Writing Achievement Goals: Assessing the Mediating Effects of Mastery and Performance Goals on Writing Self-Efficacy, Affect, and Writing Achievement

    Science.gov (United States)

    Yilmaz Soylu, Meryem; Zeleny, Mary G.; Zhao, Ruomeng; Bruning, Roger H.; Dempsey, Michael S.; Kauffman, Douglas F.

    2017-01-01

    The two studies reported here explored the factor structure of the newly constructed Writing Achievement Goal Scale (WAGS), and examined relationships among secondary students' writing achievement goals, writing self-efficacy, affect for writing, and writing achievement. In the first study, 697 middle school students completed the WAGS. A confirmatory factor analysis revealed a good fit for this data with a three-factor model that corresponds with mastery, performance approach, and performance avoidance goals. The results of Study 1 were an indication for the researchers to move forward with Study 2, which included 563 high school students. The secondary students completed the WAGS, as well as the Self-efficacy for Writing Scale, and the Liking Writing Scale. Students also self-reported grades for writing and for language arts courses. Approximately 6 weeks later, students completed a statewide writing assessment. We tested a theoretical model representing relationships among Study 2 variables using structural equation modeling including students' responses to the study scales and students' scores on the statewide assessment. Results from Study 2 revealed a good fit between a model depicting proposed relationships among the constructs and the data. Findings are discussed relative to achievement goal theory and writing. PMID:28878707

  8. Secondary Students' Writing Achievement Goals: Assessing the Mediating Effects of Mastery and Performance Goals on Writing Self-Efficacy, Affect, and Writing Achievement

    Directory of Open Access Journals (Sweden)

    Meryem Yilmaz Soylu

    2017-08-01

    Full Text Available The two studies reported here explored the factor structure of the newly constructed Writing Achievement Goal Scale (WAGS), and examined relationships among secondary students' writing achievement goals, writing self-efficacy, affect for writing, and writing achievement. In the first study, 697 middle school students completed the WAGS. A confirmatory factor analysis revealed a good fit for this data with a three-factor model that corresponds with mastery, performance approach, and performance avoidance goals. The results of Study 1 were an indication for the researchers to move forward with Study 2, which included 563 high school students. The secondary students completed the WAGS, as well as the Self-efficacy for Writing Scale, and the Liking Writing Scale. Students also self-reported grades for writing and for language arts courses. Approximately 6 weeks later, students completed a statewide writing assessment. We tested a theoretical model representing relationships among Study 2 variables using structural equation modeling including students' responses to the study scales and students' scores on the statewide assessment. Results from Study 2 revealed a good fit between a model depicting proposed relationships among the constructs and the data. Findings are discussed relative to achievement goal theory and writing.

  9. Toward Instructional Leadership: Principals' Perceptions of Large-Scale Assessment in Schools

    Science.gov (United States)

    Prytula, Michelle; Noonan, Brian; Hellsten, Laurie

    2013-01-01

    This paper describes a study of the perceptions that Saskatchewan school principals have regarding large-scale assessment reform and their perceptions of how assessment reform has affected their roles as principals. The findings revealed that large-scale assessments, especially provincial assessments, have affected the principal in Saskatchewan…

  10. How the Internet Will Help Large-Scale Assessment Reinvent Itself

    Directory of Open Access Journals (Sweden)

    Randy Elliot Bennett

    2001-02-01

    Full Text Available Large-scale assessment in the United States is undergoing enormous pressure to change. That pressure stems from many causes. Depending upon the type of test, the issues precipitating change include an outmoded cognitive-scientific basis for test design; a mismatch with curriculum; the differential performance of population groups; a lack of information to help individuals improve; and inefficiency. These issues provide a strong motivation to reconceptualize both the substance and the business of large-scale assessment. At the same time, advances in technology, measurement, and cognitive science are providing the means to make that reconceptualization a reality. The thesis of this paper is that the largest facilitating factor will be technological, in particular the Internet. In the same way that it is already helping to revolutionize commerce, education, and even social interaction, the Internet will help revolutionize the business and substance of large-scale assessment.

  11. Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies

    Science.gov (United States)

    Nielsen, Kristen

    2014-01-01

    Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices.…

  12. Discourse Approaches to Writing Assessment.

    Science.gov (United States)

    Connor, Ulla; Mbaye, Aymerou

    2002-01-01

    Discusses assessment of English-as-a-Foreign/Second-Language (EFL/ESL) writing. Suggests there is a considerable gap between current practices in writing assessment and criteria suggested by advances in knowledge of discourse structure. Illustrates this by contrasting current practices in the scoring of two major EFL/ESL writing tests with…

  13. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    Report date: March 2011. Title: Assessments of Selected Large-Scale Projects. The assessed NASA projects range from probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the…

  14. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ('the Task') and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  15. Triangulating Teacher Perception, Classroom Observations, and Student Work to Evaluate Secondary Writing Programs

    Science.gov (United States)

    Henderson, Daphne Carr; Rupley, William H.; Nichols, Janet Alys; Nichols, William Dee; Rasinski, Timothy V.

    2018-01-01

    Current professional development efforts in writing at the secondary level have not resulted in student improvement on large-scale writing assessments. To maximize funding resources and instructional time, school leaders need a way to determine professional development content for writing teachers that aligns with specific student outcomes. The…

  16. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of the flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insura...

  17. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    Full Text Available This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four data sets of large-scale map data. Errors of map data were used for a risk assessment of decision-making about the localization of objects, e.g. for land-use planning in the realization of investments. An analysis was performed for a large statistical sample set of shift vectors of control points, which were identified with the position errors of these points (errors of map data).
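
    The record does not reproduce the error statistics used. As a rough, hypothetical illustration of how positional accuracy can be summarized from shift vectors of control points (synthetic data and an assumed tolerance, not the study's values), a sketch:

    ```python
    import numpy as np

    # Hypothetical shift vectors (dx, dy) in metres between control points'
    # map positions and their reference (surveyed) positions.
    rng = np.random.default_rng(0)
    shifts = rng.normal(scale=0.15, size=(500, 2))  # synthetic sample

    # Shift-vector lengths, identified with the position errors of the points.
    lengths = np.hypot(shifts[:, 0], shifts[:, 1])

    mean_error = lengths.mean()
    rmse = np.sqrt((lengths ** 2).mean())
    tolerance = 0.30  # assumed tolerance in metres for large-scale map content
    exceedance_rate = (lengths > tolerance).mean()  # crude proxy for decision risk

    print(f"mean error: {mean_error:.3f} m, RMSE: {rmse:.3f} m, "
          f"share of points over {tolerance} m: {exceedance_rate:.1%}")
    ```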

  18. Large-Scale Assessment, Rationality, and Scientific Management: The Case of No Child Left Behind

    Science.gov (United States)

    Roach, Andrew T.; Frank, Jennifer

    2007-01-01

    This article examines the ways in which NCLB and the movement towards large-scale assessment systems are based on Weber's concept of formal rationality and tradition of scientific management. Building on these ideas, the authors use Ritzer's McDonaldization thesis to examine some of the core features of large-scale assessment and accountability…

  19. Estimating the Effectiveness of Special Education Using Large-Scale Assessment Data

    Science.gov (United States)

    Ewing, Katherine Anne

    2009-01-01

    The inclusion of students with disabilities in large scale assessment and accountability programs has provided new opportunities to examine the impact of special education services on student achievement. Hanushek, Kain, and Rivkin (1998, 2002) evaluated the effectiveness of special education programs by examining students' gains on a large-scale…

  20. Techniques for motivating students to write, for teaching writing and for systematizing writing assessment

    OpenAIRE

    Küçükal, Şerife

    1990-01-01

    Ankara : Faculty of Letters and the Institute of Economics and Social Science of Bilkent Univ., 1990. Thesis (Master's) -- Bilkent University, 1990. Includes bibliographical references. The purpose of this study is to investigate the suggestions that experts in the field of teaching composition have for motivating students to write, teaching writing and assessing writing and the ways that these suggestions could be used in Turkish EFL Hazirlik classes for elementary level students. ...

  1. Fuel pin integrity assessment under large scale transients

    International Nuclear Information System (INIS)

    Dutta, B.K.

    2006-01-01

    The integrity of fuel rods under normal, abnormal and accident conditions is an important consideration during fuel design of advanced nuclear reactors. The fuel matrix and the sheath form the first barrier to prevent the release of radioactive materials into the primary coolant. An understanding of the fuel and clad behaviour under different reactor conditions, particularly under the beyond-design-basis accident scenario leading to large scale transients, is always desirable to assess the inherent safety margins in fuel pin design and to plan for the mitigation of the consequences of accidents, if any. The severe accident conditions are typically characterized by energy deposition rates far exceeding the heat removal capability of the reactor coolant system. This may lead to clad failure due to fission gas pressure at high temperature, large-scale pellet-clad interaction and clad melting. The fuel rod performance is affected by many interdependent complex phenomena involving extremely complex material behaviour. The versatile experimental database available in this area has led to the development of powerful analytical tools to characterize fuel under extreme scenarios.

  2. Analysis of environmental impact assessment for large-scale X-ray medical equipments

    International Nuclear Information System (INIS)

    Fu Jin; Pei Chengkai

    2011-01-01

    Based on an Environmental Impact Assessment (EIA) project, this paper elaborates the basic analysis essentials of EIA for the sales project of large-scale X-ray medical equipment, and provides the analysis procedure for environmental impact and the dose estimation method under normal and accident conditions. The key points of EIA for the sales project of large-scale X-ray medical equipment include the determination of pollution factors and management limit values according to the project's actual situation, and the utilization of various methods of assessment and prediction, such as analogy, actual measurement and calculation, to analyze, monitor, calculate and predict the pollution under normal and accident conditions. (authors)

  3. Improving Undergraduates’ Argumentative Group Essay Writing through Self-assessment

    Directory of Open Access Journals (Sweden)

    Yong Mei Fung

    2015-10-01

    Full Text Available When writing an argumentative essay, writers develop and evaluate arguments to embody, initiate, or simulate various kinds of interpersonal and textual interaction for reader consideration (Wu & Allison, 2003). This is quite challenging for English as a second language (ESL) learners. To improve the quality of their writing, students need to review their draft throughout the writing process. This study aimed to investigate the effect of self-assessment in group writing and how group work improves students’ writing ability. An intact class comprising 22 first-year undergraduates participated in the study. Data were collected from pre- and post-treatment writing tests, semi-structured interview and reflection entries. The results revealed that self-assessment has a significant effect on students’ writing performance. Group work also enhanced social and cognitive development of the students. This study provides insights into the use of self-assessment in writing class to develop learner autonomy and improve writing ability. Keywords: Argumentative essay, Self-assessment, Learner autonomy, Group writing, ESL learners

  4. A conceptual analysis of standard setting in large-scale assessments

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1994-01-01

    Elements of arbitrariness in the standard setting process are explored, and an alternative to the use of cut scores is presented. The first part of the paper analyzes the use of cut scores in large-scale assessments, discussing three different functions: (1) cut scores define the qualifications used

  5. Assessment of renewable energy resources potential for large scale and standalone applications in Ethiopia

    NARCIS (Netherlands)

    Tucho, Gudina Terefe; Weesie, Peter D.M.; Nonhebel, Sanderine

    2014-01-01

    This study aims to determine the contribution of renewable energy to large scale and standalone application in Ethiopia. The assessment starts by determining the present energy system and the available potentials. Subsequently, the contribution of the available potentials for large scale and

  6. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  7. Writing apprehension and academic procrastination among graduate students.

    Science.gov (United States)

    Onwuegbuzie, A J; Collins, K M

    2001-04-01

    Academic procrastination has been associated with both fear of failure and task aversiveness. Researchers have reported that most undergraduate and graduate students delay academic tasks. Among the latter, a large proportion report procrastination in writing term papers. Such procrastination may originate from and lead to anxiety about writing, so the present purpose was to investigate the relationship between scores on Daly and Miller's 1975 Writing Apprehension Test and on the two dimensions, i.e., fear of failure and task aversiveness, of Solomon and Rothblum's 1984 Procrastination Assessment Scale-Students. Participants were 135 graduate students of varied disciplinary backgrounds. Correlations between writing apprehension and academic procrastination stemmed from fear of failure (.29) and task aversiveness (.41). Implications are discussed.

  8. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are pervasively used in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool to assess students' knowledge integration ability.
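
    The record names the Rasch Partial Credit Model without restating it. A minimal sketch of the category probabilities under standard notation (person ability theta, item step difficulties delta_k; illustrative only, not the authors' code):

    ```python
    import numpy as np

    def pcm_probabilities(theta, deltas):
        """Rasch Partial Credit Model category probabilities for one item.

        theta  : person ability (scalar)
        deltas : step difficulties delta_1..delta_m (sequence of length m)
        Returns probabilities for scores 0..m.
        """
        # Cumulative sums sum_{k<=x}(theta - delta_k), with 0 for score 0.
        numerators = np.exp(np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas)))))
        return numerators / numerators.sum()

    # Example: a 3-category open-ended item (scores 0, 1, 2) and a mid-ability student.
    print(pcm_probabilities(theta=0.5, deltas=[-0.8, 1.2]))
    ```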

  9. Establishing Peer Mentor-Led Writing Groups in Large First-Year Courses

    Science.gov (United States)

    Marcoux, Sarah; Marken, Liv; Yu, Stan

    2012-01-01

    This paper describes the results of a pilot project designed to improve students' academic writing in a large (200-student) first-year Agriculture class at the University of Saskatchewan. In collaboration with the course's professor, the Writing Centre coordinator and a summer student designed curriculum for four two-hour Writing Group sessions…

  10. A Heuristic Tool for Teaching Business Writing: Self-Assessment, Knowledge Transfer, and Writing Exercises

    Science.gov (United States)

    Ortiz, Lorelei A.

    2013-01-01

    To teach effective business communication, instructors must target students’ current weaknesses in writing. One method for doing so is by assigning writing exercises. When used heuristically, writing exercises encourage students to practice self-assessment, self-evaluation, active learning, and knowledge transfer, all while reinforcing the basics…

  11. From reading to writing: Evaluating the Writer's Craft as a means of assessing school student writing

    Directory of Open Access Journals (Sweden)

    Pauline Sangster, Graeme Trousdale & Charles Anderson

    2012-06-01

    Full Text Available This article reports on part of a study investigating a new writing assessment, the Writer's Craft, which requires students to read a stimulus passage and then write a continuation adopting the style of the original. The article provides a detailed analysis of stimulus passages employed within this assessment scheme and students' written continuations of these passages. The findings reveal that this is a considerably more challenging assessment writing task than has previously been recognised; and that questions arise concerning the nature of the stimulus passages and the extent to which the assessment criteria captured what the students had achieved in their writing. The implications of these findings are discussed and recommendations are made.

  12. Explore the Usefulness of Person-Fit Analysis on Large-Scale Assessment

    Science.gov (United States)

    Cui, Ying; Mousavi, Amin

    2015-01-01

    The current study applied the person-fit statistic l_z to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by l_z, were removed. The…
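
    The record does not define l_z. A common formulation is the standardized log-likelihood of a dichotomous response pattern under an IRT model; a small sketch under that assumption (synthetic probabilities, not the study's data):

    ```python
    import numpy as np

    def lz_statistic(responses, p_correct):
        """Standardized log-likelihood person-fit statistic l_z.

        responses : 0/1 scored item responses for one examinee
        p_correct : model-implied probabilities of a correct response
                    (e.g. from an IRT model at the examinee's ability estimate)
        """
        u = np.asarray(responses, dtype=float)
        p = np.asarray(p_correct, dtype=float)
        log_lik = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
        expected = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
        variance = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
        return (log_lik - expected) / np.sqrt(variance)

    # Large negative values flag misfitting (aberrant) response patterns.
    print(lz_statistic([1, 0, 1, 1, 0], [0.9, 0.2, 0.7, 0.8, 0.4]))
    ```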

  13. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  14. THE ADAPTATION TO TURKISH OF THE WRITING ATTITUDE SCALE (WAS): THE STUDY OF VALIDITY AND RELIABILITY

    OpenAIRE

    GÖÇER, Ali

    2016-01-01

    The purpose of this study is to adapt the Writing Attitude Scale (WAS), developed by Marcia et al. (1984) to measure writing anxiety, into Turkish, and to examine the reliability and validity of the adapted scale. The Writing Attitude Scale (WAS) was first translated into Turkish, and an equivalence analysis of the English and Turkish forms of the scale was carried out through readings by three English teachers/lecturers. The...

  15. Organic Chemistry YouTube Writing Assignment for Large Lecture Classes

    Science.gov (United States)

    Franz, Annaliese K.

    2012-01-01

    This work describes efforts to incorporate and evaluate the use of a YouTube writing assignment in large lecture classes to personalize learning and improve conceptual understanding of chemistry through peer- and self-explanation strategies. Although writing assignments can be a method to incorporate peer- and self-explanation strategies, this…

  16. Designing Academic Writing Analytics for Civil Law Student Self-Assessment

    Science.gov (United States)

    Knight, Simon; Buckingham Shum, Simon; Ryan, Philippa; Sándor, Ágnes; Wang, Xiaolong

    2018-01-01

    Research into the teaching and assessment of student writing shows that many students find academic writing a challenge to learn, with legal writing no exception. Improving the availability and quality of timely formative feedback is an important aim. However, the time-consuming nature of assessing writing makes it impractical for instructors to…

  17. Laser direct writing of micro- and nano-scale medical devices

    Science.gov (United States)

    Gittard, Shaun D; Narayan, Roger J

    2010-01-01

    Laser-based direct writing of materials has undergone significant development in recent years. The ability to modify a variety of materials at small length scales and using short production times provides laser direct writing with unique capabilities for fabrication of medical devices. In many laser-based rapid prototyping methods, microscale and submicroscale structuring of materials is controlled by computer-generated models. Various laser-based direct write methods, including selective laser sintering/melting, laser machining, matrix-assisted pulsed-laser evaporation direct write, stereolithography and two-photon polymerization, are described. Their use in fabrication of microstructured and nanostructured medical devices is discussed. Laser direct writing may be used for processing a wide variety of advanced medical devices, including patient-specific prostheses, drug delivery devices, biosensors, stents and tissue-engineering scaffolds. PMID:20420557

  18. A synthesis of mathematics writing: Assessments, interventions, and surveys

    Directory of Open Access Journals (Sweden)

    Sarah R. Powell

    2017-02-01

    Full Text Available Mathematics standards in the United States describe communication as an essential part of mathematics. One outlet for communication is writing. To understand the mathematics writing of students, we conducted a synthesis to evaluate empirical research about mathematics writing. We identified 29 studies that included a mathematics-writing assessment, intervention, or survey for students in 1st through 12th grade. All studies were published between 1991 and 2015. The majority of assessments required students to write explanations to mathematical problems, and fewer than half scored student responses according to a rubric. Approximately half of the interventions involved the use of mathematics journals as an outlet for mathematics writing. Few intervention studies provided explicit direction on how to write in mathematics, and a small number of investigations provided statistical evidence of intervention efficacy. From the surveys, the majority of students expressed enjoyment when writing in mathematics settings but teachers reported using mathematics writing rarely. Across studies, findings indicate mathematics writing is used for a variety of purposes, but the quality of the studies is variable and more empirical research is needed.

  19. Secondary Analysis of Large-Scale Assessment Data: An Alternative to Variable-Centred Analysis

    Science.gov (United States)

    Chow, Kui Foon; Kennedy, Kerry John

    2014-01-01

    International large-scale assessments are now part of the educational landscape in many countries and often feed into major policy decisions. Yet, such assessments also provide data sets for secondary analysis that can address key issues of concern to educators and policymakers alike. Traditionally, such secondary analyses have been based on a…

  20. Measuring Students' Writing Ability on a Computer-Analytic Developmental Scale: An Exploratory Validity Study

    Science.gov (United States)

    Burdick, Hal; Swartz, Carl W.; Stenner, A. Jackson; Fitzgerald, Jill; Burdick, Don; Hanlon, Sean T.

    2013-01-01

    The purpose of the study was to explore the validity of a novel computer-analytic developmental scale, the Writing Ability Developmental Scale. On the whole, collective results supported the validity of the scale. It was sensitive to writing ability differences across grades and sensitive to within-grade variability as compared to human-rated…

  1. The Word Writing CAFE: Assessing Student Writing for Complexity, Accuracy, and Fluency

    Science.gov (United States)

    Leal, Dorothy J.

    2005-01-01

    The Word Writing CAFE is a new assessment tool designed for teachers to evaluate objectively students' word-writing ability for fluency, accuracy, and complexity. It is designed to be given to the whole class at one time. This article describes the development of the CAFE and provides directions for administering and scoring it. The author also…

  2. A Self-assessment Checklist for Undergraduate Students’ Argumentative Writing

    Directory of Open Access Journals (Sweden)

    Vahid Nimehchisalem

    2014-02-01

    Full Text Available With a growing emphasis on students’ ability to assess their own written works in teaching English as a Second Language (ESL writing courses, self-assessment checklists are today regarded as useful tools. These checklists can help learners diagnose their own weaknesses and improve their writing performance. This necessitates development of checklists that guide the learners in assessing their own writing. In this study, a self-assessment checklist was developed for undergraduate students in an ESL context to help them with their argumentative essays. This paper presents the related literature and theories, based on which the checklist was developed. The checklist is described and its potential theoretical and practical implications in ESL writing classes are discussed. Further research is necessary to refine the checklist through focus group studies with lecturers and students.

  3. Linking Large-Scale Reading Assessments: Comment

    Science.gov (United States)

    Hanushek, Eric A.

    2016-01-01

    E. A. Hanushek points out in this commentary that applied researchers in education have only recently begun to appreciate the value of international assessments, even though there are now 50 years of experience with these. Until recently, these assessments have been stand-alone surveys that have not been linked, and analysis has largely focused on…

  4. Employing Picture Description to Assess the Students' Descriptive Paragraph Writing

    Directory of Open Access Journals (Sweden)

    Ida Ayu Mega Cahyani

    2018-03-01

    Full Text Available Writing is considered an important skill in the learning process which needs to be mastered by students. However, in the teaching and learning process at schools or universities, the assessment of writing skill is often not the focus of the learning process and is administered inappropriately. In this present study, the researcher assessed the descriptive paragraph writing ability of students through picture description, employing an ex post facto research design. The present study was intended to answer the research problem dealing with the extent of the students’ achievement in descriptive paragraph writing ability as assessed through picture description. The samples under the study were 40 students determined by means of a random sampling technique with a lottery system. The data were collected by administering picture description as the research instrument. The obtained data were analyzed using a norm-referenced measure of five standard values. The results of the data analysis showed that 67.50% of the samples were successful in writing a descriptive paragraph, while 32.50% were unsuccessful, as assessed by the picture description test.

  5. Critical thinking evaluation in reflective writing: Development and testing of Carter Assessment of Critical Thinking in Midwifery (Reflection).

    Science.gov (United States)

    Carter, Amanda G; Creedy, Debra K; Sidebotham, Mary

    2017-11-01

    Develop and test a tool designed for use by academics to evaluate pre-registration midwifery students' critical thinking skills in reflective writing. A descriptive cohort design was used with a random sample (n = 100) of archived student reflective writings based on a clinical event or experience during 2014 and 2015. A staged model for tool development was used to develop a fifteen-item scale involving item generation; mapping of draft items to critical thinking concepts and expert review to test content validity; inter-rater reliability testing; pilot testing of the tool on 100 reflective writings; and psychometric testing. Item scores were analysed for mean, range and standard deviation. Internal reliability, content and construct validity were assessed. Expert review of the tool revealed a high content validity index score of 0.98. Using two independent raters to establish inter-rater reliability, good absolute agreement of 72% was achieved with a Kappa coefficient K = 0.43 (p … critical thinking in reflective writing. Validation with large diverse samples is warranted. Reflective practice is a key learning and teaching strategy in undergraduate Bachelor of Midwifery programmes and essential for safe, competent practice. There is the potential to enhance critical thinking development by assessing reflective writing with the CACTiM (reflection) tool to provide formative and summative feedback to students and inform teaching strategies. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  6. Assessing and Improving L2 Graduate Students' Popular Science and Academic Writing in an Academic Writing Course

    Science.gov (United States)

    Rakedzon, Tzipora; Baram-Tsabari, Ayelet

    2017-01-01

    This paper reports a study using a quasi-experimental design to examine whether an academic writing course in English can improve graduate students' academic and popular science writing skills. To address this issue, we designed pre- and post-assessment tasks, an intervention assessment task and a scoring rubric. The pre- and post-assessment tasks…

  7. Fluency or Accuracy - Two Different ‘Colours’ in Writing Assessment

    Directory of Open Access Journals (Sweden)

    Listyani Listyani

    2017-01-01

    Full Text Available Fluency and accuracy: these two concerns have long competed for teachers' attention at the tertiary level. In the case of writing, the two remain debatable and have always attracted the attention of both lecturers and students. These language production measures divide lecturers' attention: should they be faithful to fluency of ideas, or to grammatical and language accuracy, when correcting students' essays? This paper presents the classical yet never-ending dilemma within the area of writing assessment, a debate that remains interesting to follow. Data were gained from close observation of documents, that is, 21 students' essays, and from interviews with 2 students of Academic Writing in Semester II, 2015-2016. Four writing lecturers were also interviewed for their intellectual and critical opinions on these dilemmatic problems in assessing writing. Discussion results of an FGD (Forum Group Discussion) involving all writing lecturers at the English Education Study Program of the Faculty of Language and Literature of Satya Wacana Christian University, held in June 2016, were also included as a source of data. Hopefully, this paper gives a little more “colour” to the area of writing assessment and a little enlightenment for other writing lecturers.   DOI: https://doi.org/10.24071/llt.2016.190201

  8. Improving report writing by peer assessment using Coursera

    DEFF Research Database (Denmark)

    Christiansen, Henrik Lehrmann

    2015-01-01

    Report writing is a general engineering competence, and learning how to write a good report should therefore be part of any university engineering education. Active learning methods are well known to be effective in supporting student learning; hence they should preferably also be used for teaching report writing. In the case of report writing, active learning could include peer evaluation, which is what is investigated in this paper. This paper presents a case study from the Technical University of Denmark. A course on mobile communication was redesigned to include peer evaluation as a tool for improving report writing skills. The peer evaluation process was automated using the e-learning tool Coursera. What was investigated was the improvement in report writing as well as the consistency and quality of the peer-assessed grades.

  9. A Self-Assessment Checklist for Undergraduate Students' Argumentative Writing

    Science.gov (United States)

    Nimehchisalem, Vahid; Chye, David Yoong Soon; Jaswant Singh, Sheena Kaur A/P; Zainuddin, Siti Zaidah; Norouzi, Sara; Khalid, Sheren

    2014-01-01

    With a growing emphasis on students' ability to assess their own written works in teaching English as a Second Language (ESL) writing courses, self-assessment checklists are today regarded as useful tools. These checklists can help learners diagnose their own weaknesses and improve their writing performance. This necessitates development of…

  10. Assessing Self-Regulated Strategies for School Writing: Cross-Cultural Validation of a Triadic Measure

    Science.gov (United States)

    Malpique, Anabela Abreu; Veiga Simão, Ana Margarida

    2015-01-01

    This study reports on the construction of a questionnaire to assess ninth-grade students' use of self-regulated strategies for school writing tasks. Exploratory and confirmatory factorial analyses were conducted to validate the factor structure of the instrument. The initial factor analytic stage (n = 296) revealed a 13-factor scale, accounting…

  11. Optimizing Prediction Using Bayesian Model Averaging: Examples Using Large-Scale Educational Assessments.

    Science.gov (United States)

    Kaplan, David; Lee, Chansoon

    2018-01-01

    This article provides a review of Bayesian model averaging as a means of optimizing the predictive performance of common statistical models applied to large-scale educational assessments. The Bayesian framework recognizes that in addition to parameter uncertainty, there is uncertainty in the choice of models themselves. A Bayesian approach to addressing the problem of model uncertainty is the method of Bayesian model averaging. Bayesian model averaging searches the space of possible models for a set of submodels that satisfy certain scientific principles and then averages the coefficients across these submodels weighted by each model's posterior model probability (PMP). Using the weighted coefficients for prediction has been shown to yield optimal predictive performance according to certain scoring rules. We demonstrate the utility of Bayesian model averaging for prediction in education research with three examples: Bayesian regression analysis, Bayesian logistic regression, and a recently developed approach for Bayesian structural equation modeling. In each case, the model-averaged estimates are shown to yield better prediction of the outcome of interest than any submodel based on predictive coverage and the log-score rule. Implications for the design of large-scale assessments when the goal is optimal prediction in a policy context are discussed.
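
    The record describes averaging submodel coefficients weighted by posterior model probabilities (PMPs). A toy sketch using the common BIC approximation to the PMPs over all predictor subsets (synthetic data; not the authors' implementation):

    ```python
    import itertools
    import numpy as np
    import statsmodels.api as sm

    # Synthetic data standing in for assessment outcomes and candidate predictors.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 3))
    y = 2.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=300)

    predictors = [0, 1, 2]
    models, bics = [], []
    for k in range(len(predictors) + 1):
        for subset in itertools.combinations(predictors, k):
            design = sm.add_constant(X[:, list(subset)]) if subset else np.ones((len(y), 1))
            fit = sm.OLS(y, design).fit()
            models.append((subset, fit))
            bics.append(fit.bic)

    # BIC-based approximation to the posterior model probabilities.
    bics = np.array(bics)
    weights = np.exp(-0.5 * (bics - bics.min()))
    pmp = weights / weights.sum()

    # Model-averaged coefficient per predictor (0 where a predictor is excluded).
    averaged = np.zeros(len(predictors))
    for (subset, fit), w in zip(models, pmp):
        for pos, j in enumerate(subset):
            averaged[j] += w * fit.params[pos + 1]  # +1 skips the intercept
    print(averaged)
    ```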

  12. Investigating the Effect of Using Self-Assessment on Iranian EFL Learners' Writing

    Science.gov (United States)

    Heidarian, Nakisa

    2016-01-01

    This study investigated the effect of using self-assessment on Iranian EFL learners' writing. The purpose of this study was to demonstrate whether the use of self-assessment as an assessment method was influential in developing learners' English writing performance generally and their writing processes specifically. The participants of this study consisted of…

  13. How to Measure and Explain Achievement Change in Large-Scale Assessments: A Rejoinder

    Science.gov (United States)

    Hickendorff, Marian; Heiser, Willem J.; van Putten, Cornelis M.; Verhelst, Norman D.

    2009-01-01

    In this rejoinder, we discuss substantive and methodological validity issues of large-scale assessments of trends in student achievement, commenting on the discussion paper by Van den Heuvel-Panhuizen, Robitzsch, Treffers, and Koller (2009). We focus on methodological challenges in deciding what to measure, how to measure it, and how to foster…

  14. How much is too much assessment? Insight into assessment-driven student learning gains in large-scale undergraduate microbiology courses.

    Science.gov (United States)

    Wang, Jack T H; Schembri, Mark A; Hall, Roy A

    2013-01-01

    Designing and implementing assessment tasks in large-scale undergraduate science courses is a labor-intensive process subject to increasing scrutiny from students and quality assurance authorities alike. Recent pedagogical research has provided conceptual frameworks for teaching introductory undergraduate microbiology, but has yet to define best-practice assessment guidelines. This study assessed the applicability of Biggs' theory of constructive alignment in designing consistent learning objectives, activities, and assessment items that aligned with the American Society for Microbiology's concept-based microbiology curriculum in MICR2000, an introductory microbiology course offered at the University of Queensland, Australia. By improving the internal consistency in assessment criteria and increasing the number of assessment items explicitly aligned to the course learning objectives, the teaching team was able to efficiently provide adequate feedback on numerous assessment tasks throughout the semester, which contributed to improved student performance and learning gains. When comparing the constructively aligned 2011 offering of MICR2000 with its 2010 counterpart, students obtained higher marks in both coursework assignments and examinations as the semester progressed. Students also valued the additional feedback provided, as student rankings for course feedback provision increased in 2011 and assessment and feedback was identified as a key strength of MICR2000. By designing MICR2000 using constructive alignment and iterative assessment tasks that followed a common set of learning outcomes, the teaching team was able to effectively deliver detailed and timely feedback in a large introductory microbiology course. This study serves as a case study for how constructive alignment can be integrated into modern teaching practices for large-scale courses.

  15. Lessons from a large-scale assessment: Results from conceptual inventories

    Directory of Open Access Journals (Sweden)

    Beth Thacker

    2014-07-01

    Full Text Available We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) into a department where most instruction has previously been traditional and a significant number of faculty are hesitant, ambivalent, or even resistant to the introduction of such reforms. Data were collected in all of the sections of both the large algebra- and calculus-based introductory courses for a number of years employing commonly used conceptual inventories. Results from a small PER-informed, inquiry-based, laboratory-based class are also reported. Results suggest that when PER-informed materials are introduced in the labs and recitations, independent of the lecture style, there is an increase in students’ conceptual inventory gains. There is also an increase in the results on conceptual inventories if PER-informed instruction is used in the lecture. The highest conceptual inventory gains were achieved by the combination of PER-informed lectures and laboratories in large class settings and by the hands-on, laboratory-based, inquiry-based course taught in a small class setting.
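
    The record reports conceptual inventory gains without stating the gain measure. Assuming the normalized gain convention common in physics education research (an assumption, not stated in the record), a minimal sketch:

    ```python
    import numpy as np

    def normalized_gain(pre_percent, post_percent):
        """Class-average normalized gain <g> = (post - pre) / (100 - pre),
        assuming the convention often used with conceptual inventories."""
        pre = np.mean(pre_percent)
        post = np.mean(post_percent)
        return (post - pre) / (100.0 - pre)

    # Hypothetical pre/post inventory percentages for one course section.
    print(normalized_gain([35, 42, 50, 38], [58, 66, 71, 60]))
    ```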

  16. Assessing the Cyborg Center: Assemblage-Based, Feminist Frameworks toward Socially Just Writing Center Assessments

    Science.gov (United States)

    Andersen, Erin M.

    2017-01-01

    This dissertation will broaden the purview of recent scholarship pertaining to socially just writing assessments by making connections among assemblage theory and materialism, studies of ecological and anti-racist assessments, and studies of writing center work, to ground theoretical conversations in everyday practices. Focusing on systemic…

  17. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4 % of the world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen. Installation of large scale hydrogen production plants will be needed. In this context, development of low cost large scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, an European network and center of expertise on hydrogen and fuel cells, has performed for its members a study in 2005 to evaluate the potential of large scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared. Then, a state of art of the electrolysis modules currently available was made. A review of the large scale electrolysis plants that have been installed in the world was also realized. The main projects related to large scale electrolysis were also listed. Economy of large scale electrolysers has been discussed. The influence of energy prices on the hydrogen production cost by large scale electrolysis was evaluated. (authors)

  18. The Implementation of Continuous Assessment in Writing Classes ...

    African Journals Online (AJOL)

    The Implementation of Continuous Assessment in Writing Classes of Jimma ... the typical nature of summative tests was used to grade students' performance. ... Key words: Continuous assessment, Formative assessment, Summative ...

  19. Content and Alignment of State Writing Standards and Assessments as Predictors of Student Writing Achievement: An Analysis of 2007 National Assessment of Educational Progress Data

    Science.gov (United States)

    Troia, Gary A.; Olinghouse, Natalie G.; Zhang, Mingcai; Wilson, Joshua; Stewart, Kelly A.; Mo, Ya; Hawkins, Lisa

    2018-01-01

    We examined the degree to which content of states' writing standards and assessments (using measures of content range, frequency, balance, and cognitive complexity) and their alignment were related to student writing achievement on the 2007 National Assessment of Educational Progress (NAEP), while controlling for student, school, and state…

  20. The Effects of Portfolio Assessment on Writing of EFL Students

    Science.gov (United States)

    Nezakatgoo, Behzad

    2011-01-01

    The primary focus of this study was to determine the effect of portfolio assessment on final examination scores of EFL students' writing skill. To determine the impact of portfolio-based writing assessment 40 university students who enrolled in composition course were initially selected and divided randomly into two experimental and control…

  1. Large-scale assessment of flood risk and the effects of mitigation measures along the Elbe River

    NARCIS (Netherlands)

    de Kok, Jean-Luc; Grossmann, M.

    2010-01-01

    The downstream effects of flood risk mitigation measures and the necessity to develop flood risk management strategies that are effective on a basin scale call for a flood risk assessment methodology that can be applied at the scale of a large river. We present an example of a rapid flood risk

  2. Large-scale model-based assessment of deer-vehicle collision risk.

    Directory of Open Access Journals (Sweden)

    Torsten Hothorn

    Full Text Available Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and as browsers of palatable trees have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining
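
    The authors' spatial model is not reproduced in the record. As a generic illustration of count regression for collision numbers against environmental covariates (synthetic municipality-level data and a plain Poisson GLM rather than the paper's approach), a sketch:

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Synthetic municipality-level covariates (all hypothetical).
    rng = np.random.default_rng(7)
    n = 400
    roads = rng.uniform(5, 80, n)        # km of road per municipality
    forest = rng.uniform(0.1, 0.7, n)    # forest cover share
    browsing = rng.uniform(0.0, 1.0, n)  # browsing intensity index
    rate = np.exp(-1.0 + 0.02 * roads + 1.2 * forest + 0.8 * browsing)
    collisions = rng.poisson(rate)       # observed deer-vehicle collision counts

    X = sm.add_constant(np.column_stack([roads, forest, browsing]))
    model = sm.GLM(collisions, X, family=sm.families.Poisson()).fit()
    print(model.summary().tables[1])  # coefficients on the log-rate scale

    # Fitted counts can be turned into a simple per-municipality risk index,
    # e.g. by normalising predictions to the observed total.
    risk_index = model.fittedvalues / model.fittedvalues.sum()
    print(risk_index[:5])
    ```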

  3. Writing assessment in higher education: Making the framework work

    NARCIS (Netherlands)

    Callies, M.; Zaytseva, E.; Present-Thomas, R.L.

    2013-01-01

    The importance of appropriate assessment methods for academic writing skills in higher education has received increasing attention in SLA research in recent years. Despite this, there is still relatively little understanding of how academic writing skills develop at the most advanced levels of

  4. Comprehensive large-scale assessment of intrinsic protein disorder.

    Science.gov (United States)

    Walsh, Ian; Giollo, Manuel; Di Domenico, Tomás; Ferrari, Carlo; Zimmermann, Olav; Tosatto, Silvio C E

    2015-01-15

    Intrinsically disordered regions are key for the function of numerous proteins. Due to the difficulties in experimental disorder characterization, many computational predictors have been developed with various disorder flavors. Their performance is generally measured on small sets mainly from experimentally solved structures, e.g. Protein Data Bank (PDB) chains. MobiDB has only recently started to collect disorder annotations from multiple experimental structures. MobiDB annotates disorder for UniProt sequences, allowing us to conduct the first large-scale assessment of fast disorder predictors on 25 833 different sequences with X-ray crystallographic structures. In addition to a comprehensive ranking of predictors, this analysis produced the following interesting observations. (i) The predictors cluster according to their disorder definition, with a consensus giving more confidence. (ii) Previous assessments appear over-reliant on data annotated at the PDB chain level and performance is lower on entire UniProt sequences. (iii) Long disordered regions are harder to predict. (iv) Depending on the structural and functional types of the proteins, differences in prediction performance of up to 10% are observed. The datasets are available from Web site at URL: http://mobidb.bio.unipd.it/lsd. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. A Quality Approach to Writing Assessment.

    Science.gov (United States)

    Andrade, Joanne; Ryley, Helen

    1992-01-01

    A Colorado elementary school began its Total Quality Management work about a year ago after several staff members participated in an IBM Leadership Training Program addressing applications of Deming's theories. The school's new writing assessment has increased collegiality and cross-grade collaboration. (MLH)

  6. On-line transient stability assessment of large-scale power systems by using ball vector machines

    International Nuclear Information System (INIS)

    Mohammadi, M.; Gharehpetian, G.B.

    2010-01-01

    In this paper, a ball vector machine (BVM) has been used for on-line transient stability assessment of large-scale power systems. To classify the system transient security status, a BVM has been trained for all contingencies. The proposed BVM-based security assessment algorithm requires very little training time and space in comparison with artificial neural networks (ANN), support vector machines (SVM) and other machine-learning-based algorithms. In addition, the proposed algorithm has fewer support vectors (SV) and is therefore faster than existing algorithms for on-line applications. A key step in applying any machine learning method is feature selection. In this paper, a new decision tree (DT) based feature selection technique is presented. The proposed BVM-based algorithm has been applied to the New England 39-bus power system. The simulation results show the effectiveness and stability of the proposed method for on-line transient stability assessment of large-scale power systems. The proposed feature selection algorithm has been compared with several other feature selection algorithms, and the simulation results demonstrate its effectiveness.
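    Ball vector machines are not available in mainstream machine-learning libraries, so the sketch below illustrates only the two-stage idea of the abstract (decision-tree-based feature selection followed by a kernel classifier that labels operating points secure or insecure), with an ordinary SVM standing in for the BVM. Input files and parameter values are hypothetical.

```python
# Two-stage sketch: DT-based feature selection, then a kernel classifier.
# An SVC stands in for the ball vector machine, which is not available in
# common libraries. The .npy inputs and hyperparameters are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X = np.load("operating_points.npy")   # hypothetical: rows = pre-fault operating points
y = np.load("security_labels.npy")    # hypothetical: 1 = secure, 0 = insecure

clf = make_pipeline(
    SelectFromModel(DecisionTreeClassifier(max_depth=5, random_state=0)),
    SVC(kernel="rbf", C=10.0, gamma="scale"),
)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated classification accuracy:", scores.mean())
```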

  7. Writing to and reading from a nano-scale crossbar memory based on memristors

    International Nuclear Information System (INIS)

    Vontobel, Pascal O; Robinett, Warren; Kuekes, Philip J; Stewart, Duncan R; Straznicky, Joseph; Stanley Williams, R

    2009-01-01

    We present a design study for a nano-scale crossbar memory system that uses memristors with symmetrical but highly nonlinear current-voltage characteristics as memory elements. The memory is non-volatile since the memristors retain their state when un-powered. In order to address the nano-wires that make up this nano-scale crossbar, we use two coded demultiplexers implemented using mixed-scale crossbars (in which CMOS-wires cross nano-wires and in which the crosspoint junctions have one-time configurable memristors). This memory system does not utilize the kind of devices (diodes or transistors) that are normally used to isolate the memory cell being written to and read from in conventional memories. Instead, special techniques are introduced to perform the writing and the reading operation reliably by taking advantage of the nonlinearity of the type of memristors used. After discussing both writing and reading strategies for our memory system in general, we focus on a 64 × 64 memory array and present simulation results that show the feasibility of these writing and reading procedures. Besides simulating the case where all device parameters assume exactly their nominal value, we also simulate the much more realistic case where the device parameters stray around their nominal value: we observe a degradation in margins, but writing and reading are still feasible. These simulation results are based on a device model for memristors derived from measurements of fabricated devices in nano-scale crossbars using Pt and Ti nano-wires and using oxygen-depleted TiO2 as the switching material.

  8. Assessing Technical Writing in Institutional Contexts: Using Outcomes-Based Assessment for Programmatic Thinking.

    Science.gov (United States)

    Carter, Michael; Anson, Chris M.; Miller, Carolyn R.

    2003-01-01

    Notes that technical writing instruction often operates in isolation from other components of students' communication education. Argues for altering this isolation by moving writing instruction to a place of increased programmatic perspective, which may be attained through a means of assessment based on educational outcomes. Discusses two models…

  9. Assessing Children's Writing Products: The Role of Curriculum Based Measures

    Science.gov (United States)

    Dockrell, Julie E.; Connelly, Vincent; Walter, Kirsty; Critten, Sarah

    2015-01-01

    The assessment of children's writing raises technical and practical challenges. In this paper we examine the potential use of a curriculum based measure for writing (CBM-W) to assess the written texts of pupils in Key Stage 2 (M age 107 months, range 88 to 125). Two hundred and thirty six Year three, five and six pupils completed a standardized…

  10. Large-scale Ising-machines composed of magnetic neurons

    Science.gov (United States)

    Mizushima, Koichi; Goto, Hayato; Sato, Rie

    2017-10-01

    We propose Ising-machines composed of magnetic neurons, that is, magnetic bits in a recording track. In large-scale machines, the sizes of both neurons and synapses need to be reduced, and neat and smart connections among neurons are also required to achieve all-to-all connectivity among them. These requirements can be fulfilled by adopting magnetic recording technologies such as race-track memories and skyrmion tracks because the area of a magnetic bit is almost two orders of magnitude smaller than that of static random access memory, which has normally been used as a semiconductor neuron, and the smart connections among neurons are realized by using the read and write methods of these technologies.
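    For readers unfamiliar with what such a machine computes, the sketch below anneals a generic Ising energy in software. It illustrates only the optimization problem that an Ising machine solves, not the proposed magnetic implementation; the random couplings and cooling schedule are arbitrary choices for illustration.

```python
# Software annealer for a generic Ising energy
# H = -1/2 * sum_ij J_ij s_i s_j - sum_i h_i s_i.
# Illustrates the problem an Ising machine solves; says nothing about device physics.
import numpy as np

rng = np.random.default_rng(0)
n = 64
J = rng.normal(size=(n, n))
J = (J + J.T) / 2.0                      # symmetric couplings
np.fill_diagonal(J, 0.0)
h = rng.normal(size=n)
s = rng.choice([-1, 1], size=n)          # random initial spin configuration

def energy(s):
    return -0.5 * s @ J @ s - h @ s

T = 2.0
for _ in range(20000):
    i = rng.integers(n)
    dE = 2.0 * s[i] * (J[i] @ s + h[i])  # energy change if spin i flips
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]
    T = max(0.01, T * 0.9997)            # simple geometric cooling

print("final energy:", energy(s))
```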

  11. Balancing Tensions in Educational Policy Reforms: Large-Scale Implementation of Assessment for Learning in Norway

    Science.gov (United States)

    Hopfenbeck, Therese N.; Flórez Petour, María Teresa; Tolo, Astrid

    2015-01-01

    This study investigates how different stakeholders in Norway experienced a government-initiated, large-scale policy implementation programme on "Assessment for Learning" ("AfL"). Data were collected through 58 interviews with stakeholders in charge of the policy; Ministers of Education and members of the Directorate of…

  12. Writing-to-Learn

    Science.gov (United States)

    Balachandran, Shreedevi; Venkatesaperumal, Ramesh; Clara, Jothi; Shukri, Raghda K.

    2014-01-01

    Objectives: The objectives of this study were to assess the attitude of Omani nursing students towards writing-to-learn (WTL) and its relationship to demographic variables, self-efficacy and the writing process. Methods: A cross-sectional design was used to evaluate attitudes towards WTL by Sultan Qaboos University nursing students. A convenience sample of 106 students was used and data collected between October 2009 and March 2010. A modified version of the WTL attitude scale developed by Dobie and Poirrier was used to collect the data. Descriptive and inferential statistics were used for analysis. Results: Senior and junior students had more positive attitudes to WTL than mid-level students, who tended to have negative attitudes towards writing. Although 52.8% of students had negative attitudes towards the writing process, the median was higher for attitudes to the writing process compared to the median for self-efficacy. There was a positive correlation between self-efficacy and writing process scores. Conclusion: Overall, students had negative attitudes towards WTL. Attitudes are learnt or formed through previous experiences. The incorporation of WTL strategies into teaching can transform students’ negative attitudes towards writing into positive ones. PMID:24516740

  13. Clinical Reasoning in the Assessment and Intervention Planning for Writing Disorder

    Science.gov (United States)

    Harrison, Gina L.; McManus, Kelly L.

    2017-01-01

    The incidence of writing disorder is as common as reading disorder, but it is frequently under-identified and rarely targeted for intervention. Increasing clinical understanding on various subtypes of writing disorder through assessment guided by data-driven decision making may alleviate this disparity for students with writing disorders. The…

  14. Writing, Evaluating and Assessing Data Response Items in Economics.

    Science.gov (United States)

    Trotman-Dickenson, D. I.

    1989-01-01

    Describes some of the problems in writing data response items in economics for use by A Level and General Certificate of Secondary Education (GCSE) students. Examines the experience of two series of workshops on writing items, evaluating them and assessing responses from schools. Offers suggestions for producing packages of data response items as…

  15. The use of test scores from large-scale assessment surveys: psychometric and statistical considerations

    Directory of Open Access Journals (Sweden)

    Henry Braun

    2017-11-01

    Full Text Available Background: Economists are making increasing use of measures of student achievement obtained through large-scale survey assessments such as NAEP, TIMSS, and PISA. The construction of these measures, employing plausible value (PV) methodology, is quite different from that of the more familiar test scores associated with assessments such as the SAT or ACT. These differences have important implications both for utilization and interpretation. Although much has been written about PVs, it appears that there are still misconceptions about whether and how to employ them in secondary analyses. Methods: We address a range of technical issues, including those raised in a recent article that was written to inform economists using these databases. First, an extensive review of the relevant literature was conducted, with particular attention to key publications that describe the derivation and psychometric characteristics of such achievement measures. Second, a simulation study was carried out to compare the statistical properties of estimates based on the use of PVs with those based on other, commonly used methods. Results: It is shown, through both theoretical analysis and simulation, that under fairly general conditions appropriate use of PVs yields approximately unbiased estimates of model parameters in regression analyses of large-scale survey data. The superiority of the PV methodology is particularly evident when measures of student achievement are employed as explanatory variables. Conclusions: The PV methodology used to report student test performance in large-scale surveys remains the state-of-the-art for secondary analyses of these databases.
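    For readers new to plausible values, the sketch below shows the standard combination rule the abstract alludes to: run the analysis once per PV, average the estimates, and add the between-PV variance to the average sampling variance. It is a minimal illustration with hypothetical column names; operational analyses would also apply survey weights and replicate-weight variance estimation.

```python
# Minimal sketch of combining a regression across M plausible values
# (Rubin-style rules). Column names are hypothetical; weights are omitted.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ilsa_student_file.csv")          # hypothetical extract
pv_cols = ["pv1", "pv2", "pv3", "pv4", "pv5"]

estimates, variances = [], []
for pv in pv_cols:
    fit = smf.ols(f"{pv} ~ ses", data=df).fit()    # same model, one PV at a time
    estimates.append(fit.params["ses"])
    variances.append(fit.bse["ses"] ** 2)

m = len(pv_cols)
point = np.mean(estimates)                          # combined point estimate
within = np.mean(variances)                         # average sampling variance
between = np.var(estimates, ddof=1)                 # variance across PVs
total_se = np.sqrt(within + (1 + 1 / m) * between)  # combined standard error
print(f"SES effect on achievement: {point:.3f} (SE {total_se:.3f})")
```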

  16. Evaluating undergraduate nursing students' self-efficacy and competence in writing: Effects of a writing intensive intervention.

    Science.gov (United States)

    Miller, Louise C; Russell, Cynthia L; Cheng, An-Lin; Skarbek, Anita J

    2015-05-01

    While professional nurses are expected to communicate clearly, these skills are often not explicitly taught in undergraduate nursing education. In this research study, writing self-efficacy and writing competency were evaluated in 52 nontraditional undergraduate baccalaureate completion students in two distance-mediated 16-week capstone courses. The intervention group (n = 44) experienced various genres and modalities of written assignments set in the context of evidence-based nursing practice; the comparison group (n = 8) received the usual undergraduate writing curriculum instruction. Self-efficacy, measured by the Post Secondary Writerly Self-Efficacy Scale, indicated significant improvements for all self-efficacy items (all p's = 0.00). Writing competency, assessed in the intervention group using a primary trait scoring rubric (6+1 Trait Writing Model® of Instruction and Assessment), showed significant differences in competency improvement on five of seven items. This pilot study demonstrated that writing skills can improve in nontraditional undergraduate students with guided instruction. Further investigation with larger, culturally diverse samples is indicated to validate these results. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Contributory Role of Collaborative Assessment in Improving Critical Thinking and Writing

    Directory of Open Access Journals (Sweden)

    Mansoor Fahim

    2014-01-01

    Full Text Available Instilling critical thinking skills in language learners’ minds and enhancing second language writing are gaining momentum in the field of English language teaching. Though some approaches to meeting these objectives have been proposed, a review of the literature revealed that the contributory role of collaborative assessment in fostering critical thinking and second language writing has been overlooked to date. Thus, this study was conducted to delve into this issue. To this aim, two intact intermediate EFL classes in Iran, each containing 18 learners, were included in the study; they were randomly assigned to Student-Student (S-S) collaborative assessment and Teacher-Student (T-S) collaborative assessment groups. After receiving six sessions of treatment, both groups wrote an essay on an IELTS topic. Results of the study indicated that collaborative assessment, regardless of its type, has the potential to foster critical thinking and writing proficiency. Further, it came to light that the S-S collaborative assessment group significantly outperformed the T-S collaborative assessment group in terms of gains in critical thinking and writing proficiency. In light of the findings, it is suggested that language teachers involve learners in collaborative assessment processes; some suggestions are also offered to indicate fruitful avenues for further research.

  18. Students’ Perception toward the Implementation of Peer-Assessment in Writing; Before and After Revision

    Directory of Open Access Journals (Sweden)

    Husni Mubarok

    2017-10-01

    Full Text Available This study aimed to elaborate students’ perceptions of the implementation of peer assessment in a writing class, both before and after revision. Writing is one of the skills that students should master in order to reach a higher level of literacy. It is a productive skill that asks students to arrange words and organize them into a text that readers can understand. The success of writing is determined by the writing process itself, from planning and first-draft writing to revising and editing. One strategy used in teaching writing is peer assessment, which is an important part of the writing process because peers provide feedback and suggestions during review. The subjects of this research were second-semester students of the English Education Department of UNISNU Jepara, and the study was conducted in the even semester. In total, 37 students of the English Education Department took part as respondents. The research design was qualitative and measured students’ perceptions of the implementation of peer assessment in writing before and after revision. The results showed that before revision, students had negative perceptions of their own writing. After revision, they had positive perceptions of the peer assessment strategy, including its usefulness and meaningfulness, the nature and reality of the feedback, precision, validity, fairness, and personal goal-setting. In addition, the score after revision (7.9) was higher than the score before revision (6.62), indicating an improvement in students’ scores after revision.

  19. Questions Arising from the Assessment of EFL Narrative Writing

    Science.gov (United States)

    Yi, Yong

    2013-01-01

    This article questions how narrative writing is assessed, seeking to understand what we test, what we value, and why. It uses a single anomalous case that arose in the course of my recent PhD thesis to highlight the issues, asking if sufficient attention is being given to the value of emotional content in a piece of writing in comparison to its…

  20. Use of large-scale acoustic monitoring to assess anthropogenic pressures on Orthoptera communities.

    Science.gov (United States)

    Penone, Caterina; Le Viol, Isabelle; Pellissier, Vincent; Julien, Jean-François; Bas, Yves; Kerbiriou, Christian

    2013-10-01

    Biodiversity monitoring at large spatial and temporal scales is greatly needed in the context of global changes. Although insects are a species-rich group and are important for ecosystem functioning, they have been largely neglected in conservation studies and policies, mainly due to technical and methodological constraints. Sound detection, a nondestructive method, is easily applied within a citizen-science framework and could be an interesting solution for insect monitoring. However, it has not yet been tested at a large scale. We assessed the value of a citizen-science program in which Orthoptera species (Tettigoniidae) were monitored acoustically along roads. We used Bayesian model-averaging analyses to test whether we could detect widely known patterns of anthropogenic effects on insects, such as the negative effects of urbanization or intensive agriculture on Orthoptera populations and communities. We also examined site-abundance correlations between years and estimated the biases in species detection to evaluate and improve the protocol. Urbanization and intensive agricultural landscapes negatively affected Orthoptera species richness, diversity, and abundance. This finding is consistent with results of previous studies of Orthoptera, vertebrates, carabids, and butterflies. The average mass of communities decreased as urbanization increased. The dispersal ability of communities increased as the percentage of agricultural land and, to a lesser extent, urban area increased. Despite changes in abundances over time, we found significant correlations between yearly abundances. We identified biases linked to the protocol (e.g., car speed or temperature) that can easily be accounted for in analyses. We argue that acoustic monitoring of Orthoptera along roads offers several advantages for assessing Orthoptera biodiversity at large spatial and temporal extents, particularly in a citizen science framework. © 2013 Society for Conservation Biology.

  1. Large-scale Assessment Yields Evidence of Minimal Use of Reasoning Skills in Traditionally Taught Classes

    Science.gov (United States)

    Thacker, Beth

    2017-01-01

    Large-scale assessment data from Texas Tech University yielded evidence that most students taught traditionally in large lecture classes with online homework and predominantly multiple choice question exams, when asked to answer free-response (FR) questions, did not support their answers with logical arguments grounded in physics concepts. In addition to a lack of conceptual understanding, incorrect and partially correct answers lacked evidence of the ability to apply even lower level reasoning skills in order to solve a problem. Correct answers, however, did show evidence of at least lower level thinking skills as coded using a rubric based on Bloom's taxonomy. With the introduction of evidence-based instruction into the labs and recitations of the large courses and in a small, completely laboratory-based, hands-on course, the percentage of correct answers with correct explanations increased. The FR format, unlike other assessment formats, allowed assessment of both conceptual understanding and the application of thinking skills, clearly pointing out weaknesses not revealed by other assessment instruments, and providing data on skills beyond conceptual understanding for course and program assessment. Supported by National Institutes of Health (NIH) Challenge grant #1RC1GM090897-01.

  2. New Possibilities for High-Resolution, Large-Scale Ecosystem Assessment of the World's Semi-Arid Regions

    Science.gov (United States)

    Burney, J. A.; Goldblatt, R.

    2016-12-01

    Understanding drivers of land use change - and in particular, levels of ecosystem degradation - in semi-arid regions is of critical importance because these agroecosystems (1) are home to the world's poorest populations, almost all of whom depend on agriculture for their livelihoods, (2) play a critical role in the global carbon and climate cycles, and (3) have in many cases seen dramatic changes in temperature and precipitation, relative to global averages, over the past several decades. However, assessing ecosystem health (or, conversely, degradation) presents a difficult measurement problem. Established methods are very labor intensive and rest on detailed questionnaires and field assessments. High-resolution satellite imagery has a unique role in semi-arid ecosystem assessment in that it can be used for rapid (or repeated) and very simple measurements of tree and shrub density, an excellent overall indicator for dryland ecosystem health. Because trees and large shrubs are more sparse in semi-arid regions, sub-meter resolution imagery in conjunction with automated image analysis can be used to assess density differences at high spatial resolution without expensive and time-consuming ground-truthing. This could be used down to the farm level, for example, to better assess the larger-scale ecosystem impacts of different management practices, to assess compliance with REDD+ carbon offset protocols, or to evaluate implementation of conservation goals. Here we present results comparing spatial and spectral remote sensing methods for semi-arid ecosystem assessment across new data sources, using the Brazilian Sertão as an example, and the implications for large-scale use in semi-arid ecosystem science.

  3. Implementing an Online Writing Assessment Strategy for Gerontology

    Science.gov (United States)

    Brown, Pamela S.; Hanks, Roma S.

    2008-01-01

    Assessment of student learning is a growing concern for programs in gerontology. This report focuses on the conception, design, funding, and implementation of an innovative online workshop to assess and improve writing skills of students enrolled in distance-learning gerontology classes. The approach is multidisciplinary and involves a…

  4. Diagnostic and Value-Added Assessment of Business Writing

    Science.gov (United States)

    Fraser, Linda; Harich, Katrin; Norby, Joni; Brzovic, Kathy; Rizkallah, Teeanna; Loewy, Dana

    2005-01-01

    To assess students' business writing abilities upon entry into the business program and exit from the capstone course, a multitiered assessment package was developed that measures students' achievement of specific learning outcomes and provides "value-added" scores. The online segment of the test measures five competencies across three process…

  5. Assessing large-scale wildlife responses to human infrastructure development.

    Science.gov (United States)

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future.
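    The functional response curves themselves are reported in the paper; the sketch below only illustrates the general idea of a road-effect zone, using a hypothetical exponential density-reduction curve and the 1.5 km distance mentioned above. All parameter values are invented.

```python
# Hypothetical exponential density-reduction curve of the kind used to
# describe road-effect zones. Parameters are invented for illustration;
# the study fitted empirical response curves for birds and mammals.
import numpy as np

def relative_density(d_km, reduction_at_road=0.6, e_folding_km=0.8):
    """Relative density at distance d_km from infrastructure (1.0 = unaffected)."""
    return 1.0 - reduction_at_road * np.exp(-d_km / e_folding_km)

# Mean reduction of individuals within a 1.5 km zone, assuming (for
# simplicity) equal area in each distance band.
d = np.linspace(0.0, 1.5, 301)
reduction = 1.0 - relative_density(d)
print(f"mean density reduction within 1.5 km: {reduction.mean():.1%}")
```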

  6. Failure Impact Assessment for Large-Scale Landslides Located Near Human Settlement: Case Study in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Ming-Chien Chung

    2018-05-01

    Full Text Available In 2009, Typhoon Morakot caused over 680 deaths and more than 20,000 landslides in Taiwan. From 2010 to 2015, the Central Geological Survey of the Ministry of Economic Affairs identified 1047 potential large-scale landslides in Taiwan, of which 103 may have affected human settlements. This paper presents an analytical procedure that can be applied to assess the possible impact of a landslide collapse on nearby settlements. In this paper, existing technologies, including interpretation of remote sensing images, hydrogeological investigation, and numerical analysis, are integrated to evaluate potential failure scenarios and the landslide scale of a specific case: the Xinzhuang landslide. GeoStudio and RAMMS analysis modes and hazard classification produced the following results: (1) evaluation of the failure mechanisms and the influence zones of large-scale landslides; (2) assessment of the migration and accumulation of the landslide mass after failure; and (3) a landslide hazard and evacuation map. The results of the case study show that this analytical procedure can quantitatively estimate potential threats to human settlements. Furthermore, it can be applied to other villages and used as a reference in disaster prevention and evacuation planning.

  7. Writing to the Common Core: Teachers' Responses to Changes in Standards and Assessments for Writing in Elementary Schools

    Science.gov (United States)

    Wilcox, Kristen Campbell; Jeffery, Jill V.; Gardner-Bixler, Andrea

    2016-01-01

    This multiple case study investigated how the Common Core State Standards (CCSS) for writing and teacher evaluation system based in part on CCSS assessments might be influencing writing instruction in elementary schools. The sample included nine schools: Six achieved above-predicted performance on English Language Arts (ELA) as well as prior ELA…

  8. Extending the Principles of Intensive Writing to Large Macroeconomics Classes

    Science.gov (United States)

    Docherty, Peter; Tse, Harry; Forman, Ross; McKenzie, Jo

    2010-01-01

    The authors report on the design and implementation of a pilot program to extend the principles of intensive writing outlined by W. Lee Hansen (1998), Murray S. Simpson and Shireen E. Carroll (1999) and David Carless (2006) to large macroeconomics classes. The key aspect of this program was its collaborative nature, with staff from two specialist…

  9. Fast‐writing E‐beam for defining large arrays of nano‐holes

    DEFF Research Database (Denmark)

    Højlund-Nielsen, Emil; Clausen, Jeppe Sandvik; Christiansen, Alexander Bruun

    2013-01-01

    Efficient nanoscale patterning of large areas is required for sub-wavelength optics. For example, 200 nm periodic structures are often too small to be made with standard UV- and DUV-equipment. Still, the final product must be made at an economic cost. Here we use a fast-writing strategy described in [1], where electron beam lithography (EBL) with a focused Gaussian beam is used to define shapes directly. The serial technique is optimized for speed and pattern fidelity to a maximum writing speed of around 30 min/cm² for 200 nm periods in 2D lattices. The overall costs in terms of machine time

  10. The Association between Audit Business Scale Advantage and Audit Quality of Asset Write-downs

    Directory of Open Access Journals (Sweden)

    Ziye Zhao

    2008-06-01

    We contribute to the literature with the following findings. First, auditors’ business scale is positively related to the return relevance of write-downs. Second, auditors with a business scale advantage (ABSA) not only enhance the relevance between impairments and economic variables but also weaken the relation between impairments and managerial variables; however, the results appear in only a few of the firm-specific variables. Third, results are mixed when we test the ABSA effect on the price-relevance and persistence dimensions. Fourth, the ABSA effect is stronger when the complexity of asset write-downs requires some inside information to comprehend the nature of the action. Adding to the main finding, we also found that the ABSA effect became weaker when ABSA was proxied by auditors’ raw business-scale data rather than by the top five auditors in business scale. Taken together, our results show that the ABSA effect does exist in the auditing of asset write-downs, although with weak evidence. Our results also indicate rational auditor choice based on quality of service in China's audit market. We identified some unique factors arising from stakeholders’ cooperative structuring actions in China's audit market as potential explanations for this market rationality.

  11. Large area nano-patterning /writing on gold substrate using dip - pen nanolithography (DPN)

    Science.gov (United States)

    Saini, Sudhir Kumar; Vishwakarma, Amit; Agarwal, Pankaj B.; Pesala, Bala; Agarwal, Ajay

    2014-10-01

    Dip-pen nanolithography (DPN) is utilized to pattern a large-area (50 μm × 50 μm) gold substrate for application in fabricating nano-gratings. For nano-writing, a 16-MHA ink-coated AFM tip was prepared using a double-dipping procedure. The gold substrate was fabricated on a thermally grown SiO2 substrate by depositing a ~5 nm titanium layer followed by ~30 nm of gold using DC pulsed sputtering. The gratings were designed with a period of 800 nm and a 25% duty cycle. Acquired AFM images indicate that as the AFM tip proceeds with nano-writing, the line width decreases from 190 nm to 100 nm, probably due to depletion of 16-MHA molecules on the AFM tip as writing proceeds.

  12. Contextualize Technical Writing Assessment to Better Prepare Students for Workplace Writing: Student-Centered Assessment Instruments

    Science.gov (United States)

    Yu, Han

    2008-01-01

    To teach students how to write for the workplace and other professional contexts, technical writing teachers often assign writing tasks that reflect real-life communication contexts, a teaching approach that is grounded in the field's contextualized understanding of genre. This article argues to fully embrace contextualized literacy and better…

  13. Assessment of disturbance at three spatial scales in two large tropical reservoirs

    Directory of Open Access Journals (Sweden)

    Letícia de Morais

    2016-12-01

    Full Text Available Large reservoirs are an increasingly common feature across tropical landscapes because of their importance for water supply, flood control and hydropower, but their ecological conditions are infrequently evaluated. Our objective was to assess the range of disturbances for two large tropical reservoirs and their influences on benthic macroinvertebrates. We tested three hypotheses: (i) a wide variation in the level of environmental disturbance can be observed among sites in the reservoirs; (ii) the two reservoirs would exhibit a different degree of disturbance level; and (iii) the magnitude of disturbance would influence the structure and composition of benthic assemblages. For each reservoir, we assessed land use (macroscale), physical habitat structure (mesoscale), and water quality (microscale). We sampled 40 sites in the littoral zones of both Três Marias and São Simão Reservoirs (Minas Gerais, Brazil). At the macroscale, we measured cover percentages of land use categories in buffer areas at each site, where each buffer was a circular arc of 250 m. At the mesoscale, we assessed the presence of human disturbances in the riparian and drawdown zones at the local (site) scale. At the microscale, we assessed water quality at each macroinvertebrate sampling station using the Micro Disturbance Index (MDI). To evaluate anthropogenic disturbance of each site, we calculated an integrated disturbance index (IDI) from a buffer disturbance index (BDI) and a local disturbance index (LDI). For each site, we calculated richness and abundance of benthic macroinvertebrates, Chironomidae genera richness, abundance and percent Chironomidae individuals, abundance and percent EPT individuals, richness and percent EPT taxa, abundance and percent resistant individuals, and abundance and percent non-native individuals. We also evaluated the influence of disturbance on benthic macroinvertebrate assemblages at the entire-reservoir scale. The BDI, LDI and IDI had significantly

  14. Self-Regulation through Portfolio Assessment in Writing Classrooms

    Science.gov (United States)

    Mak, Pauline; Wong, Kevin M.

    2018-01-01

    Portfolio assessment (PA) is promulgated as a useful tool to promote learning through assessment. While the benefits of PA are well documented, there is a lack of empirical research on how students' self-regulation can be effectively fostered in writing classrooms, and how the use of PA can develop students' self-regulated capacities. This…

  15. The Contribution of International Large-Scale Assessments to Educational Research: Combining Individual and Institutional Data Sources

    Science.gov (United States)

    Strietholt, Rolf; Scherer, Ronny

    2018-01-01

    The present paper aims to discuss how data from international large-scale assessments (ILSAs) can be utilized and combined, even with other existing data sources, in order to monitor educational outcomes and study the effectiveness of educational systems. We consider different purposes of linking data, namely, extending outcomes measures,…

  16. Coverage of the migrant population in large-scale assessment surveys. Experiences from PIAAC in Germany

    Directory of Open Access Journals (Sweden)

    Débora B. Maehler

    2017-03-01

    Full Text Available Background: European countries, and especially Germany, are currently very much affected by human migration flows, with the result that the task of integration has become a challenge. Only very little empirical evidence on topics such as labor market participation and processes of social integration of migrant subpopulations is available to date from large-scale population surveys. The present paper provides an overview of the representation of the migrant population in the German Programme for the International Assessment of Adult Competencies (PIAAC) sample and evaluates reasons for the under-coverage of this population. Methods: We examine outcome rates and reasons for nonresponse among the migrant population based on sampling frame data, and we also examine paradata from the interviewers’ contact protocols to evaluate time patterns for the successful contacting of migrants. Results and Conclusions: This is the first time that results of this kind have been presented for a large-scale assessment in educational research. These results are also discussed in the context of future PIAAC cycles. Overall, they confirm the expectations in the literature that factors such as language problems result in lower contact and response rates among migrants.

  17. Towards large scale stochastic rainfall models for flood risk assessment in trans-national basins

    Science.gov (United States)

    Serinaldi, F.; Kilsby, C. G.

    2012-04-01

    While extensive research has been devoted to rainfall-runoff modelling for risk assessment in small and medium-size watersheds, less attention has been paid, so far, to large-scale trans-national basins, where flood events have severe societal and economic impacts with magnitudes quantified in billions of Euros. As an example, in the April 2006 flood events along the Danube basin at least 10 people lost their lives and up to 30 000 people were displaced, with overall damages estimated at more than half a billion Euros. In this context, refined analytical methods are fundamental to improve the risk assessment and, in turn, the design of structural and non-structural measures of protection, such as hydraulic works and insurance/reinsurance policies. Since flood events are mainly driven by exceptional rainfall events, suitable characterization and modelling of space-time properties of rainfall fields is a key issue for performing a reliable flood risk analysis based on alternative precipitation scenarios to be fed into a new generation of large-scale rainfall-runoff models. Ultimately, this approach should be extended to a global flood risk model. However, as the need for rainfall models able to account for and simulate spatio-temporal properties of rainfall fields over large areas is rather new, the development of new rainfall simulation frameworks is a challenging task that involves overcoming the drawbacks of the existing modelling schemes (devised for smaller spatial scales) while keeping their desirable properties. In this study, we critically summarize the most widely used approaches for rainfall simulation. Focusing on stochastic approaches, we stress the importance of introducing suitable climate forcings in these simulation schemes in order to account for the physical coherence of rainfall fields over wide areas. Based on preliminary considerations, we suggest a modelling framework relying on the Generalized Additive Models for Location, Scale

  18. A review of sensing technologies for small and large-scale touch panels

    Science.gov (United States)

    Akhtar, Humza; Kemao, Qian; Kakarala, Ramakrishna

    2017-06-01

    A touch panel is an input device for human computer interaction. It consists of a network of sensors, a sampling circuit and a micro controller for detecting and locating a touch input. Touch input can come from either finger or stylus depending upon the type of touch technology. These touch panels provide an intuitive and collaborative workspace so that people can perform various tasks with the use of their fingers instead of traditional input devices like keyboard and mouse. Touch sensing technology is not new. At the time of this writing, various technologies are available in the market and this paper reviews the most common ones. We review traditional designs and sensing algorithms for touch technology. We also observe that due to its various strengths, capacitive touch will dominate the large-scale touch panel industry in years to come. In the end, we discuss the motivation for doing academic research on large-scale panels.

  19. Writing and Pseudo-Writing from Internet-Based Sources: Implications for Learning and Assessment

    Science.gov (United States)

    Skaar, Håvard

    2015-01-01

    In recent years, plagiarism has been on the increase across the Western world. This article identifies Internet access as a contributory cause of this trend and addresses the implications of readily available Internet sources for the teaching and assessment of writing in schools. The basis for the article is a previous study showing a wide…

  20. International Large-Scale Assessment Studies and Educational Policy-Making in Chile: Contexts and Dimensions of Influence

    Science.gov (United States)

    Cox, Cristián; Meckes, Lorena

    2016-01-01

    Since the 1990s, Chile has participated in all major international large-scale assessment studies (ILSAs) of the IEA and OECD, as well as the regional ones conducted by UNESCO in Latin America, after it had been involved in the very first international Science Study in 1970-1971. This article examines the various ways in which these studies have…

  1. Developing and assessing EFL students’ writing skills via a class-blog

    Directory of Open Access Journals (Sweden)

    Eleni Daskalogiannaki

    2012-02-01

    Full Text Available This paper presents the implementation and the positive findings of a study that merges blog use and portfolio development for teaching and assessing writing. More specifically, it investigates whether a class blog can be integrated into the Greek EFL teaching context as an effective means to engage learners in process writing and as a form of e-portfolio, where they can keep track of their writing development. It also examines blog use for enhancing students’ motivation, interaction, participation and learning. The study followed a project-based approach and was conducted in a state Junior High School in Greece. Data was collected over a 4-month period via a questionnaire as well as from analyzing students’ writing samples and teacher’s observations of whole-class behavior during blogging. The findings reveal that the blog encouraged students to approach writing as a cognitive process of constant modification, motivated them to write more and better in various writing genres, and helped them become competent, autonomous and critical writers.

  2. Standard Errors for National Trends in International Large-Scale Assessments in the Case of Cross-National Differential Item Functioning

    Science.gov (United States)

    Sachse, Karoline A.; Haag, Nicole

    2017-01-01

    Standard errors computed according to the operational practices of international large-scale assessment studies such as the Programme for International Student Assessment's (PISA) or the Trends in International Mathematics and Science Study (TIMSS) may be biased when cross-national differential item functioning (DIF) and item parameter drift are…

  3. Using Procedure Based on Item Response Theory to Evaluate Classification Consistency Indices in the Practice of Large-Scale Assessment

    Directory of Open Access Journals (Sweden)

    Shanshan Zhang

    2017-09-01

    Full Text Available In spite of the growing interest in methods of evaluating classification consistency (CC) indices, only a few studies have applied these methods in the practice of large-scale educational assessment. In addition, only a few studies have considered the influence of practical factors, for example, the examinee ability distribution, the cut score location and the score scale, on the performance of CC indices. Using the newly developed Lee's procedure based on item response theory (IRT), the main purpose of this study is to investigate the performance of CC indices when practical factors are taken into consideration. A simulation study and an empirical study were conducted under comprehensive conditions. Results suggested that with a negatively skewed ability distribution, the CC indices were larger than with other distributions. Interactions occurred among ability distribution, cut score location, and score scale. Consequently, Lee's IRT procedure is reliable for use in large-scale educational assessment, although the reported indices should be interpreted with caution because testing conditions can vary considerably.
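    Lee's procedure computes classification consistency analytically from IRT-model-implied score distributions; as a simplified stand-in, the sketch below estimates a CC index by Monte Carlo under a 2PL model, classifying simulated examinees on two parallel administrations and computing their agreement. Item parameters, the ability distribution and the cut score are hypothetical, and the ability distribution can be skewed to mimic the conditions studied.

```python
# Simplified Monte Carlo stand-in for an IRT-based classification consistency
# index (the operational Lee procedure is analytic, not simulation-based).
# Item parameters, ability distribution and cut score are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_items, n_examinees = 40, 100_000
a = rng.uniform(0.8, 2.0, n_items)          # 2PL discriminations
b = rng.normal(0.0, 1.0, n_items)           # 2PL difficulties
theta = rng.normal(0.0, 1.0, n_examinees)   # abilities (replace to vary skewness)
cut = 24                                    # raw-score cut

def simulate_raw_scores(theta):
    p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b)))   # response probabilities
    return (rng.random(p.shape) < p).sum(axis=1)

# Two independent 'parallel' administrations for the same examinees.
pass1 = simulate_raw_scores(theta) >= cut
pass2 = simulate_raw_scores(theta) >= cut
print(f"estimated classification consistency: {np.mean(pass1 == pass2):.3f}")
```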

  4. Large-scale assessment of benthic communities across multiple marine protected areas using an autonomous underwater vehicle.

    Science.gov (United States)

    Ferrari, Renata; Marzinelli, Ezequiel M; Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F; Byrne, Maria; Malcolm, Hamish A; Williams, Stefan B; Steinberg, Peter D

    2018-01-01

    Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate 'no-take' and 'general-use' (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5-10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful to document changes in marine communities and identify adequate scales to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid to high taxonomic resolution may be necessary to determine zoning effects on key taxa. Third, we provide an example of statistical analyses and

  5. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards from a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.
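    The correlations developed in the Phoenix report are not reproduced in this record. To show how a flame height to fire diameter ratio scales with a nondimensional heat release rate, the sketch below uses Heskestad's widely cited pool-fire correlation as a stand-in, with an assumed heat release rate; treat the numbers as illustrative only, not as the report's results.

```python
# Illustration of flame-height scaling with the nondimensional heat release
# rate Q*, using Heskestad's correlation L/D = 3.7 * Q***(2/5) - 1.02 as a
# stand-in. This is not the correlation developed in the Phoenix report.
import math

def q_star(Q_kW, D_m, rho=1.2, cp=1.0, T_amb=293.0, g=9.81):
    """Nondimensional heat release rate for a pool fire of diameter D_m."""
    return Q_kW / (rho * cp * T_amb * math.sqrt(g * D_m) * D_m**2)

def flame_height_m(Q_kW, D_m):
    """Mean flame height from Heskestad's correlation."""
    return D_m * (3.7 * q_star(Q_kW, D_m) ** 0.4 - 1.02)

D = 21.0      # m, one of the Phoenix test diameters
Q = 2.0e6     # kW, heat release rate assumed purely for illustration
print(f"Q* = {q_star(Q, D):.2f}, estimated mean flame height = {flame_height_m(Q, D):.0f} m")
```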

  6. A Study on Critical Thinking Assessment System of College English Writing

    Science.gov (United States)

    Dong, Tian; Yue, Lu

    2015-01-01

    This research attempts to discuss the validity of introducing the evaluation of students' critical thinking skills (CTS) into the assessment system of college English writing through an empirical study. In this paper, 30 College English Test Band 4 (CET-4) writing samples were collected and analyzed. Students' CTS and the final scores of collected…

  7. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter, with the records stored on a laptop. The results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented, and the maximum velocities of the Earth’s surface vibrations are determined. The seismic safety evaluation was carried out against the permissible value of vibration velocity. For cases in which permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.

  8. Assessing writing ability in primary education: on the evaluation of text quality and text complexity

    NARCIS (Netherlands)

    Feenstra, Hiske

    2014-01-01

    Writing is a complex ability, and measuring writing ability is a notoriously complex task. The assessment of writing ability is complicated by the multi-faceted nature of this productive language ability on one hand, and the difficulty of evaluating writing performances on the other hand. In this

  9. Assessment of clean development mechanism potential of large-scale energy efficiency measures in heavy industries

    International Nuclear Information System (INIS)

    Hayashi, Daisuke; Krey, Matthias

    2007-01-01

    This paper assesses clean development mechanism (CDM) potential of large-scale energy efficiency measures in selected heavy industries (iron and steel, cement, aluminium, pulp and paper, and ammonia) taking India and Brazil as examples of CDM project host countries. We have chosen two criteria for identification of the CDM potential of each energy efficiency measure: (i) emission reductions volume (in CO2e) that can be expected from the measure and (ii) likelihood of the measure passing the additionality test of the CDM Executive Board (EB) when submitted as a proposed CDM project activity. The paper shows that the CDM potential of large-scale energy efficiency measures strongly depends on the project-specific and country-specific context. In particular, technologies for the iron and steel industry (coke dry quenching (CDQ), top pressure recovery turbine (TRT), and basic oxygen furnace (BOF) gas recovery), the aluminium industry (point feeder prebake (PFPB) smelter), and the pulp and paper industry (continuous digester technology) offer promising CDM potential

  10. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale, hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  11. Assessment of climate change impacts on rainfall using large scale ...

    Indian Academy of Sciences (India)

    Many of the applied techniques in water resources management can be directly or indirectly influenced by ... is based on large-scale climate signals data around the world. In order ... predictand relationships are often very complex. ... constraints to solve the optimization problem. ... social, and environmental sustainability.

  12. Examining Dimensions of Self-Efficacy for Writing

    Science.gov (United States)

    Bruning, Roger; Dempsey, Michael; Kauffman, Douglas F.; McKim, Courtney; Zumbrunn, Sharon

    2013-01-01

    A multifactor perspective on writing self-efficacy was examined in 2 studies. Three factors were proposed--self-efficacy for writing ideation, writing conventions, and writing self-regulation--and a scale constructed to reflect these factors. In Study 1, middle school students (N = 697) completed the Self-Efficacy for Writing Scale (SEWS), along…

  13. Teaching General Education Students How to Write Scientific Arguments Using Real Earth Data

    Science.gov (United States)

    Kelly, G. J.; Prothero, W. A.

    2003-12-01

    Writing activities can improve student understanding of scientific content and processes. We have studied student writing to identify the challenges that students face in composing scientific arguments and to clarify features that constitute quality in scientific writing. We have applied argumentation analysis for the assessment of students' use of evidence in a general education oceanography course. Argumentation analysis refers to the systematic examination of ways that conclusions are supported with evidence. The student writers were supported by an interactive CD-ROM, "Our Dynamic Planet," which provided students with "point and click" access to real earth data and allowed them to solve many problems associated with plate tectonics. Plate boundary types (using quakes, volcanoes, elevation profiles, and heat flow) and plate motion can be determined (seafloor age, island ages/hot spots) with this technology. First, we discuss the structure of scientific argument and how this structure can be made accessible to undergraduate students. Second, we present examples of argumentation analysis applied to student writing. These examples demonstrate how use of large scale geological data sets can be used to support student writing. Third, we present results from a series of studies to show ways that students adhere to the genre conventions of geological writing through use of theoretical claims, multiple lines of evidence, and cohesive terms. These results, combined with our evidence-based orientation to instruction, formed the basis for modifications in the course instruction. These instructional modifications include providing detailed examples of data based observations and interpretations, heuristics for assessing other students' arguments, and quick write exercises with similar but simplified writing tasks. More information about the CD-ROM may be found at http://oceanography.geol.ucsb.edu/.

  14. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques based on models other than Newton's: a lower-order model, Broyden's method, and a higher-order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, the Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and compute a step from a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
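    As a concrete illustration of the trade-off described above, the following hedged sketch (plain numpy, not the Sandia solvers discussed in the report) solves a small nonlinear system with both Newton's method and Broyden's rank-one secant update; the test function, tolerances and starting point are assumptions chosen for the example.

```python
# Illustrative sketch (not the Sandia solvers): Newton vs. Broyden's method
# on a small nonlinear system F(x) = 0, using numpy only.
import numpy as np

def F(x):
    # Example system: intersect a circle with an exponential curve.
    return np.array([x[0]**2 + x[1]**2 - 4.0,
                     np.exp(x[0]) + x[1] - 1.0])

def fd_jacobian(F, x, h=1e-7):
    """Finite-difference Jacobian, used when an analytic Jacobian is unavailable."""
    n = x.size
    J = np.zeros((n, n))
    f0 = F(x)
    for j in range(n):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (F(xp) - f0) / h
    return J

def newton(F, x0, tol=1e-10, maxit=50):
    x = x0.copy()
    for _ in range(maxit):
        f = F(x)
        if np.linalg.norm(f) < tol:
            break
        x = x + np.linalg.solve(fd_jacobian(F, x), -f)
    return x

def broyden(F, x0, tol=1e-10, maxit=100):
    """Broyden's 'good' method: the Jacobian approximation is updated from
    secant information only, so F never needs to be re-differentiated."""
    x = x0.copy()
    B = fd_jacobian(F, x)          # one initial approximation
    f = F(x)
    for _ in range(maxit):
        if np.linalg.norm(f) < tol:
            break
        s = np.linalg.solve(B, -f)
        x_new = x + s
        f_new = F(x_new)
        y = f_new - f
        B = B + np.outer(y - B @ s, s) / (s @ s)   # rank-one secant update
        x, f = x_new, f_new
    return x

x0 = np.array([1.0, 1.0])
print("Newton :", newton(F, x0))
print("Broyden:", broyden(F, x0))
```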

  15. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.
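    A minimal sketch of the bookkeeping behind such diagnostics: binning Fourier-space kinetic energy into spherical wavenumber shells, the building block from which shell-to-shell transfers and fluxes are assembled. The random field and grid size below are placeholders, not the authors' simulation data.

```python
# Minimal sketch: bin the spectral kinetic energy of a periodic velocity field
# into spherical wavenumber shells (illustration only, not the authors' code).
import numpy as np

def shell_spectrum(u, v, w):
    """u, v, w: velocity components on an N^3 periodic grid."""
    n = u.shape[0]
    uh = np.fft.fftn(u) / n**3
    vh = np.fft.fftn(v) / n**3
    wh = np.fft.fftn(w) / n**3
    # Energy per Fourier mode (up to normalization conventions)
    energy = 0.5 * (np.abs(uh)**2 + np.abs(vh)**2 + np.abs(wh)**2)

    k1d = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)
    shells = np.rint(kmag).astype(int)          # shell index of each mode

    spectrum = np.bincount(shells.ravel(), weights=energy.ravel())
    return spectrum                              # E(k) for k = 0, 1, 2, ...

# Example on a small random field (a real run would use MHD simulation output)
rng = np.random.default_rng(0)
n = 32
u, v, w = (rng.standard_normal((n, n, n)) for _ in range(3))
Ek = shell_spectrum(u, v, w)
print(Ek[:8])
```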

  16. A large scale field experiment in the Amazon basin (LAMBADA/BATERISTA)

    NARCIS (Netherlands)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C.

    1995-01-01

    A description is given of a large-scale field experiment planned in the Amazon basin, aimed at assessing the large-scale balances of energy, water and carbon dioxide. The embedding of this experiment in global change programmes is described, viz. the Biospheric Aspects of the Hydrological Cycle

  17. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  18. Framework for Students’ Online Collaborative Writing

    DEFF Research Database (Denmark)

    Sørensen, Birgitte Holm; Levinsen, Karin Tweddell; Holm, Madeleine Rygner

    2016-01-01

    The paper focuses on collaborative writing in Google Docs and presents a framework for how students can develop methods for collaborations that include human and non-human actors. The paper is based on the large-scale research and development project Students’ Digital Production and Students...... shows that teachers do not introduce or refer the students to online collaborative strategies, roles or communications. The students’ online collaborative writing is entirely within the students’ domain. On this basis, the paper focuses on how teachers’ awareness and articulation of the students’ online...... collaborative writing within a framework can qualify students´ methods to collaborate online with the intention to improve their learning results. In relation to this, the paper explores how digital technologies may act as co-participants in collaboration, production and reflection. Moreover, the framework...

  19. Meteorological impact assessment of possible large scale irrigation in Southwest Saudi Arabia

    NARCIS (Netherlands)

    Maat, ter H.W.; Hutjes, R.W.A.; Ohba, R.; Ueda, H.; Bisselink, B.; Bauer, T.

    2006-01-01

    On continental to regional scales, feedbacks between land-use and land-cover change and climate have been widely documented over the past 10-15 years. In the present study we explore the possibility that vegetation changes over much smaller areas may also affect local precipitation regimes. Large scale

  20. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population increases and economic developments would intensively affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, there are few studies that dynamically account for changes in water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Moreover, irrigation management in response to subseasonal variability in weather and crop response varies for each region and each crop. To deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimations. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model was based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoffs simulated by the land surface sub-model were input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, was input as irrigation water to the land surface sub-model. The timing and amount of irrigation water were simulated at a daily time step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
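    The following is a hedged sketch of the kind of MCMC calibration mentioned above: a random-walk Metropolis sampler fitting one hypothetical region-specific parameter to observed yields. The toy crop response, parameter name and prior are assumptions for illustration only, not the Sakurai et al. or H08 code.

```python
# Hedged sketch of region-specific parameter calibration by MCMC: a simple
# Metropolis sampler fitting one hypothetical parameter (e.g. an
# irrigation-efficiency factor) to observed yields. The "crop model" is a toy.
import numpy as np

rng = np.random.default_rng(42)

def toy_crop_model(theta, forcing):
    """Hypothetical response of yield to the parameter theta."""
    return 3.0 * (1.0 - np.exp(-theta * forcing))

forcing = rng.uniform(0.5, 2.0, size=30)                       # e.g. seasonal water supply
obs = toy_crop_model(1.2, forcing) + rng.normal(0, 0.1, 30)    # synthetic "observations"

def log_posterior(theta, sigma=0.1):
    if theta <= 0.0 or theta > 10.0:                  # flat prior on (0, 10]
        return -np.inf
    resid = obs - toy_crop_model(theta, forcing)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis
n_iter, step = 5000, 0.1
chain = np.empty(n_iter)
theta, lp = 0.5, log_posterior(0.5)
for i in range(n_iter):
    prop = theta + step * rng.standard_normal()
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:          # accept/reject
        theta, lp = prop, lp_prop
    chain[i] = theta

burn = chain[1000:]
print(f"posterior mean = {burn.mean():.3f}, 95% CI = "
      f"({np.percentile(burn, 2.5):.3f}, {np.percentile(burn, 97.5):.3f})")
```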

  1. Multilevel Latent Class Analysis for Large-Scale Educational Assessment Data: Exploring the Relation between the Curriculum and Students' Mathematical Strategies

    Science.gov (United States)

    Fagginger Auer, Marije F.; Hickendorff, Marian; Van Putten, Cornelis M.; Béguin, Anton A.; Heiser, Willem J.

    2016-01-01

    A first application of multilevel latent class analysis (MLCA) to educational large-scale assessment data is demonstrated. This statistical technique addresses several of the challenges that assessment data offers. Importantly, MLCA allows modeling of the often ignored teacher effects and of the joint influence of teacher and student variables.…

  2. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas; surveys varied subject areas and reports on individual results of research in the field; shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  3. Writing Assessment's "Debilitating Inheritance": Behaviorism's Dismissal of Experience

    Science.gov (United States)

    Wilson, Maja Joiwind

    2013-01-01

    In this project, I examine the legacy of behaviorism's dismissal of experience on contemporary writing assessment theory and practice within the field of composition studies. I use an archival study of John B. Watson's letters to Robert Mearns Yerkes to establish behaviorism's systematic denial of experience and its related constructs: mind,…

  4. The Effect Of Problem Based Learning And Self-Assessment On Students’ Writing Competency And Self-Regulated Learning

    Directory of Open Access Journals (Sweden)

    Suyoga Dharma I Putu

    2018-01-01

    This experimental study aimed at investigating the effect of Problem Based Learning (PBL) and self-assessment (SA) on students’ writing competency and self-regulated learning in Tabanan Regency. This research applied a 2x2 factorial design. 96 students were selected as the sample through random sampling. Data were collected by a test (writing competency) and a questionnaire (self-regulation). Students’ writings were scored with an analytical scoring rubric. The obtained data were analyzed statistically by MANOVA at the 5% significance level. This research discovers that: 1) there is a significant effect of PBL, occurring simultaneously and separately, on students’ writing competency and self-regulated learning; 2) there is a significant effect of SA, occurring simultaneously and separately, on students’ writing competency and self-regulated learning; 3) there is a significant interaction between teaching model and assessment type on students’ writing competency and self-regulated learning, which occurs simultaneously; 4) there is no significant interaction between teaching model and assessment type on students’ writing competency; and 5) there is a significant interaction between teaching model and assessment type on students’ self-regulated learning. These results imply that PBL and SA should be applied in the instruction process as a way to improve the quality of students’ writing competency and self-regulated learning.
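    For readers unfamiliar with the analysis, a minimal sketch of a 2x2 factorial MANOVA in Python follows; it assumes the statsmodels package and uses hypothetical column names (writing, selfreg, teaching, assessment) and simulated scores, not the study's data.

```python
# Hedged sketch of a 2x2 factorial MANOVA with statsmodels on hypothetical data;
# variable names and effect sizes are assumptions, not the authors' data set.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
n = 96
df = pd.DataFrame({
    "teaching":   rng.choice(["PBL", "conventional"], n),
    "assessment": rng.choice(["self", "teacher"], n),
})
# Simulated outcomes with a small PBL effect on both dependent variables
boost = (df["teaching"] == "PBL").astype(float)
df["writing"] = 70 + 5 * boost + rng.normal(0, 8, n)
df["selfreg"] = 3.2 + 0.4 * boost + rng.normal(0, 0.6, n)

mv = MANOVA.from_formula(
    "writing + selfreg ~ C(teaching) * C(assessment)", data=df)
print(mv.mv_test())   # Wilks' lambda, Pillai's trace, etc. for each effect
```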

  5. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and co-variance between the variables. They are used as input into catchment models. A long-term simulation of this combined system enables the derivation of very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained with 53 years of observation data at 528 stations covering not only all of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes the several
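    A minimal sketch of the final step of such a continuous-simulation approach: estimating return levels empirically from a long series of simulated annual maximum discharges. The lognormal series below stands in for the weather-generator-driven model output.

```python
# Minimal sketch of deriving flood quantiles from a long continuous simulation:
# given (synthetic) annual maximum discharges at one gauge, estimate return
# levels empirically via plotting positions.
import numpy as np

rng = np.random.default_rng(7)
annual_max = rng.lognormal(mean=6.0, sigma=0.5, size=10_000)   # 10,000 simulated years

def return_level(annual_max, T):
    """Empirical T-year return level via the Weibull plotting position."""
    x = np.sort(annual_max)
    n = x.size
    exceed_prob = 1.0 - np.arange(1, n + 1) / (n + 1.0)   # P(X > x_(i))
    return np.interp(1.0 / T, exceed_prob[::-1], x[::-1])

for T in (10, 100, 1000):
    print(f"{T:>5}-year flood ≈ {return_level(annual_max, T):8.1f} m³/s")
```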

  6. The story of a narrative: Teaching and assessing English writing in a township school

    Directory of Open Access Journals (Sweden)

    Caroline Akinyeye

    2016-05-01

    The new language curriculum in South Africa recommends that extended writing be taught through a combination of text-based (or genre) and process approaches. This article reports on a study of the teaching and assessment of narrative writing in English as a first additional language (FAL) at a time of curriculum change. The setting is a Cape Flats township school. In focusing on a story written by a Grade 9 learner and assessed by her teacher, the study sought evidence of the use of text-based and process approaches. The theoretical frame is informed by genre theory, which draws on Systemic Functional Linguistics and social constructivist approaches to language learning. A qualitative research paradigm was used. Data obtained for this case study included the learner’s writing, interviews with the teacher, and classroom observation. The study finds very little evidence of a scaffolded approach to the teaching and assessment of writing, and explores the constraints on the realisation of the curriculum cycle in English FAL. These relate to the teacher’s understanding of writing as well as to material conditions in township schools.

  7. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  8. Influence of Writing Ability and Computation Skill on Mathematics Writing

    Science.gov (United States)

    Powell, Sarah R.; Hebert, Michael A.

    2016-01-01

    Mathematics standards expect students to communicate about mathematics using oral and written methods, and some high-stakes assessments ask students to answer mathematics questions by writing. Assumptions about mathematics communication via writing include (a) students possess writing skill, (b) students can transfer this writing skill to…

  9. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  10. Career writing: a creative, expressive and reflective approach to qualitative assessment and guidance

    NARCIS (Netherlands)

    dr. Frans Meijers; Reinekke Lengelle

    2015-01-01

    Career Writing is a narrative approach to qualitative career assessment whereby client (or student) groups use creative, reflective, and expressive forms of writing to foster an internal dialogue about career. It is intended to help individuals construct a career identity by uncovering life themes,

  11. A Three-Year Reflective Writing Program as Part of Introductory Pharmacy Practice Experiences

    Science.gov (United States)

    Vaughn, Jessica; Kerr, Kevin; Zielenski, Christopher; Toppel, Brianna; Johnson, Lauren; McCauley, Patrina; Turner, Christopher J.

    2013-01-01

    Objectives. To implement and evaluate a 3-year reflective writing program incorporated into introductory pharmacy practice experiences (IPPEs) in the first through third years of a doctor of pharmacy (PharmD) program. Design. Reflective writing was integrated into 6 IPPE courses to develop students’ lifelong learning skills. In their writing, students were required to self-assess their performance in patient care activities, identify and describe how they would incorporate learning opportunities, and then evaluate their progress. Practitioners, faculty members, and fourth-year PharmD students served as writing preceptors. Assessment. The success of the writing program was assessed by reviewing class performance and surveying writing preceptors’ opinions regarding students’ achievement of program objectives. Class pass rates averaged greater than 99% over the 8 years of the program, and the large majority of the writing preceptors reported that student learning objectives were met. A support pool of 99 writing preceptors was created. Conclusions. A 3-year reflective writing program improved pharmacy students’ reflection and reflective writing skills. PMID:23788811

  12. Learning Science through Writing: Associations with Prior Conceptions of Writing and Perceptions of a Writing Program

    Science.gov (United States)

    Ellis, Robert A.; Taylor, Charlotte E.; Drury, Helen

    2007-01-01

    Students in a large undergraduate biology course were expected to write a scientific report as a key part of their course design. This study investigates the quality of learning arising from the writing experience and how it relates to the quality of students' preconceptions of learning through writing and their perceptions of their writing…

  13. A long-term, continuous simulation approach for large-scale flood risk assessments

    Science.gov (United States)

    Falter, Daniela; Schröter, Kai; Viet Dung, Nguyen; Vorogushyn, Sergiy; Hundecha, Yeshewatesfa; Kreibich, Heidi; Apel, Heiko; Merz, Bruno

    2014-05-01

    The Regional Flood Model (RFM) is a process-based model cascade developed for flood risk assessments of large-scale basins. RFM consists of four model parts: the rainfall-runoff model SWIM, a 1D channel routing model, a 2D hinterland inundation model and the flood loss estimation model for residential buildings FLEMOps+r. The model cascade recently underwent a proof-of-concept study at the Elbe catchment (Germany) to demonstrate that flood risk assessments based on a continuous simulation approach, including rainfall-runoff, hydrodynamic and damage estimation models, are feasible for large catchments. The results of this study indicated that uncertainties are significant, especially for hydrodynamic simulations. This was basically a consequence of low data quality and disregarding dike breaches. Therefore, RFM was applied with a refined hydraulic model setup for the Elbe tributary Mulde. The study area Mulde catchment comprises about 6,000 km2 and 380 river-km. The inclusion of more reliable information on overbank cross-sections and dikes considerably improved the results. For the application of RFM for flood risk assessments, long-term climate input data is needed to drive the model chain. This model input was provided by a multi-site, multi-variate weather generator that produces sets of synthetic meteorological data reproducing the current climate statistics. The data set comprises 100 realizations of 100 years of meteorological data. With the proposed continuous simulation approach of RFM, we simulated a virtual period of 10,000 years covering the entire flood risk chain including hydrological, 1D/2D hydrodynamic and flood damage estimation models. This provided a record of around 2,000 inundation events affecting the study area, with spatially detailed information on inundation depths and damage to residential buildings at a resolution of 100 m. This serves as the basis for a spatially consistent flood risk assessment for the Mulde catchment presented in
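    As an illustration of how such a simulated event set can be summarized, the sketch below computes a damage exceedance curve and the expected annual damage from hypothetical event losses; the event counts and loss distribution are placeholders, not RFM output.

```python
# Illustrative sketch only: turn a long simulated event set (as produced by a
# model chain like RFM) into a damage exceedance curve and the expected annual
# damage (EAD). Event losses below are random placeholders.
import numpy as np

rng = np.random.default_rng(3)
n_years = 10_000
# Hypothetical event set: 2,000 damaging events scattered over 10,000 years
event_year = rng.integers(0, n_years, size=2_000)
event_loss = rng.pareto(2.5, size=2_000) * 1e6        # damage in EUR

annual_loss = np.bincount(event_year, weights=event_loss, minlength=n_years)

ead = annual_loss.mean()                               # expected annual damage
sorted_loss = np.sort(annual_loss)[::-1]               # largest loss first
exceed_prob = np.arange(1, n_years + 1) / (n_years + 1)

def loss_at_return_period(T):
    return np.interp(1.0 / T, exceed_prob, sorted_loss)

print(f"expected annual damage ≈ {ead:,.0f} EUR")
for T in (50, 200):
    print(f"{T}-year loss ≈ {loss_at_return_period(T):,.0f} EUR")
```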

  14. Mathematical writing

    CERN Document Server

    Vivaldi, Franco

    2014-01-01

    This book teaches the art of writing mathematics, an essential, and difficult, skill for any mathematics student. The book begins with an informal introduction to basic writing principles and a review of the essential dictionary for mathematics. Writing techniques are developed gradually, from the small to the large: words, phrases, sentences and paragraphs, ending with short compositions. These may represent the introduction of a concept, the abstract of a presentation or the proof of a theorem. Along the way the student will learn how to establish a coherent notation, mix words and symbols effectively, write neat formulae, and structure a definition. Some elements of logic and all common methods of proof are featured, including various versions of induction and existence proofs. The book concludes with advice on specific aspects of thesis writing (choosing a title, composing an abstract, compiling a bibliography), illustrated by a large number of real-life examples. Many exercises are included; over 150...

  15. The Limits and Possibilities of International Large-Scale Assessments. Education Policy Brief. Volume 9, Number 2, Spring 2011

    Science.gov (United States)

    Rutkowski, David J.; Prusinski, Ellen L.

    2011-01-01

    The staff of the Center for Evaluation & Education Policy (CEEP) at Indiana University is often asked about how international large-scale assessments influence U.S. educational policy. This policy brief is designed to provide answers to some of the most frequently asked questions encountered by CEEP researchers concerning the three most popular…

  16. Improving Process Writing with the Use of Authentic Assessment

    Science.gov (United States)

    bin Abdul Aziz, Muhammad Noor; Yusoff, Nurahimah Mohd

    2016-01-01

    The paper discusses how process writing is improved with the use of authentic assessment in an English Language classroom. Eleven primary school children from Year 4 in a rural school in Sabah are the participants of the study. Data were collected by observing them during the English Language lessons and at the end of the series of…

  17. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computing resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation across thousands of nodes in small spatial and computational units. A relational database system is used to manage data connections and the task queue for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.

  18. Spatiotemporally enhancing time-series DMSP/OLS nighttime light imagery for assessing large-scale urban dynamics

    Science.gov (United States)

    Xie, Yanhua; Weng, Qihao

    2017-06-01

    Accurate, up-to-date, and consistent information of urban extents is vital for numerous applications central to urban planning, ecosystem management, and environmental assessment and monitoring. However, current large-scale urban extent products are not uniform with respect to definition, spatial resolution, temporal frequency, and thematic representation. This study aimed to enhance, spatiotemporally, time-series DMSP/OLS nighttime light (NTL) data for detecting large-scale urban changes. The enhanced NTL time series from 1992 to 2013 were firstly generated by implementing global inter-calibration, vegetation-based spatial adjustment, and urban archetype-based temporal modification. The dataset was then used for updating and backdating urban changes for the contiguous U.S.A. (CONUS) and China by using the Object-based Urban Thresholding method (i.e., NTL-OUT method, Xie and Weng, 2016b). The results showed that the updated urban extents were reasonably accurate, with city-scale RMSE (root mean square error) of 27 km2 and Kappa of 0.65 for CONUS, and 55 km2 and 0.59 for China, respectively. The backdated urban extents yielded similar accuracy, with RMSE of 23 km2 and Kappa of 0.63 in CONUS, while 60 km2 and 0.60 in China. The accuracy assessment further revealed that the spatial enhancement greatly improved the accuracy of urban updating and backdating by significantly reducing RMSE and slightly increasing Kappa values. The temporal enhancement also reduced RMSE, and improved the spatial consistency between estimated and reference urban extents. Although the utilization of enhanced NTL data successfully detected urban size change, relatively low locational accuracy of the detected urban changes was observed. It is suggested that the proposed methodology would be more effective for updating and backdating global urban maps if further fusion of NTL data with higher spatial resolution imagery was implemented.
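    The two accuracy measures quoted above can be reproduced in a few lines; the sketch below computes city-scale area RMSE and per-pixel Cohen's kappa for hypothetical urban masks (the arrays are simulated, not the study's maps).

```python
# Minimal sketch of the two accuracy measures used above: RMSE of estimated
# city-level urban area, and Cohen's kappa for per-pixel agreement between an
# estimated and a reference urban mask. Inputs are hypothetical arrays.
import numpy as np

def rmse(estimated_area, reference_area):
    d = np.asarray(estimated_area, float) - np.asarray(reference_area, float)
    return np.sqrt(np.mean(d ** 2))

def cohens_kappa(pred_mask, ref_mask):
    """pred_mask, ref_mask: boolean arrays (True = urban)."""
    p = np.asarray(pred_mask).ravel()
    r = np.asarray(ref_mask).ravel()
    po = np.mean(p == r)                                  # observed agreement
    pe = (np.mean(p) * np.mean(r)                         # chance agreement
          + np.mean(~p) * np.mean(~r))
    return (po - pe) / (1.0 - pe)

# Toy example
rng = np.random.default_rng(0)
ref = rng.random((200, 200)) < 0.2
pred = ref ^ (rng.random((200, 200)) < 0.05)              # flip ~5% of pixels
print("kappa =", round(cohens_kappa(pred, ref), 3))
print("area RMSE =", rmse([120, 85, 240], [110, 90, 260]), "km2")
```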

  19. A Scale for the Assessment of Attitudes and Knowledge Regarding Sexuality in the Aged.

    Science.gov (United States)

    White, Charles B.

    This paper presents the Aging Sexuality Knowledge and Attitudes Scale (ASKAS), an instrument designed to assess the particular aspects of sexual knowledge and attitudes as they relate to the aged. Development of ASKAS items drew on a survey of existing physiological research on sexuality in older adults and a review of social-psychological writing on…

  20. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  1. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  2. The New Interface for Writing

    Science.gov (United States)

    Hadi-Tabassum, Samina

    2014-01-01

    Schools are scrambling to prepare their students for the writing assessments aligned with the Common Core tests. In some states, writing has not been assessed for more than a decade. Yet, with the use of computerized grading of students' writing, many teachers are wondering how best to prepare students for the writing assessments,…

  3. How International Large-Scale Skills Assessments Engage with National Actors: Mobilising Networks through Policy, Media and Public Knowledge

    Science.gov (United States)

    Hamilton, Mary

    2017-01-01

    This paper examines how international, large-scale skills assessments (ILSAs) engage with the broader societies they seek to serve and improve. It looks particularly at the discursive work that is done by different interest groups and the media through which the findings become part of public conversations and are translated into usable form in…

  4. LARGE-SCALE COMMERCIAL INVESTMENTS IN LAND: SEEKING ...

    African Journals Online (AJOL)

    extent of large-scale investment in land or to assess its impact on the people in recipient countries ... favorable lease terms, apparently based on a belief that this is necessary to ... Harm to the rights of local occupiers of land can result from a dearth ... applies to a self-identified group based on the group's traditions.

  5. Using the Rasch measurement model to design a report writing assessment instrument.

    Science.gov (United States)

    Carlson, Wayne R

    2013-01-01

    This paper describes how the Rasch measurement model was used to develop an assessment instrument designed to measure student ability to write law enforcement incident and investigative reports. The ability to write reports is a requirement of all law enforcement recruits in the state of Michigan and is a part of the state's mandatory basic training curriculum, which is promulgated by the Michigan Commission on Law Enforcement Standards (MCOLES). Recently, MCOLES conducted research to modernize its training and testing in the area of report writing. A structured validation process was used, which included: a) an examination of the job tasks of a patrol officer, b) input from content experts, c) a review of the professional research, and d) the creation of an instrument to measure student competency. The Rasch model addressed several measurement principles that were central to construct validity, which were particularly useful for assessing student performances. Based on the results of the report writing validation project, the state established a legitimate connectivity between the report writing standard and the essential job functions of a patrol officer in Michigan. The project also produced an authentic instrument for measuring minimum levels of report writing competency, which generated results that are valid for inferences of student ability. Ultimately, the state of Michigan must ensure the safety of its citizens by licensing only those patrol officers who possess a minimum level of core competency. Maintaining the validity and reliability of both the training and testing processes can ensure that the system for producing such candidates functions as intended.
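    For readers unfamiliar with the model, a hedged sketch of the dichotomous Rasch formulation that underlies instruments of this kind follows; the item difficulties and responses are invented for illustration and are not drawn from the MCOLES instrument.

```python
# Hedged illustration of the dichotomous Rasch model underlying this kind of
# instrument: P(correct) = exp(theta - b) / (1 + exp(theta - b)), plus a simple
# maximum-likelihood estimate of a student's ability theta given calibrated
# item difficulties b (all values below are made up).
import numpy as np

def rasch_prob(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def estimate_ability(responses, difficulties, n_iter=50):
    """Newton-Raphson MLE of theta for a 0/1 response vector."""
    theta = 0.0
    for _ in range(n_iter):
        p = rasch_prob(theta, difficulties)
        grad = np.sum(responses - p)          # d logL / d theta
        info = np.sum(p * (1.0 - p))          # Fisher information
        theta += grad / info
    return theta

difficulties = np.array([-1.5, -0.5, 0.0, 0.7, 1.4])   # hypothetical item calibrations
responses = np.array([1, 1, 1, 0, 0])                   # one student's scored items
print("estimated ability:", round(estimate_ability(responses, difficulties), 2))
```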

  6. The 2008-2009 Pennsylvania System of School Assessment Handbook for Assessment Coordinators: Writing, Reading and Mathematics, Science

    Science.gov (United States)

    Pennsylvania Department of Education, 2010

    2010-01-01

    This handbook describes the responsibilities of district and school assessment coordinators in the administration of the Pennsylvania System of School Assessment (PSSA). This updated guidebook contains the following sections: (1) General Assessment Guidelines for All Assessments; (2) Writing Specific Guidelines; (3) Reading and Mathematics…

  7. Faculty Feelings as Writers: Relationship with Writing Genres, Perceived Competences, and Values Associated to Writing

    Science.gov (United States)

    del Pilar Gallego Castaño, Liliana; Castelló Badia, Montserrat; Badia Garganté, Antoni

    2016-01-01

    This study attempts to relate faculty feelings towards writing with writing genres, perceived competences and values associated to writing. 67 foreign languages faculty in Colombia and Spain voluntarily filled in a four-section on-line questionnaire entitled "The Writing Feelings Questionnaire." All the sections were Likert Scale type.…

  8. "Large"- vs Small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998 and Phys. Rev. Fluids 2, 62601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular also related to the discovery of (very) large-scale motions. The goals of the paper are as follows: First, we want to better characterise the physics of the control, and assess what external contributions (vortices, forcing, wall motion) are actually needed. Then, we investigate the optimal parameters and, finally, determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution will also address the potential effect of the naturally occurring large-scale motions on frictional drag, and give indications on the physical processes for potential drag reduction possible at all Reynolds numbers.

  9. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. With the aim of a better understanding of hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictands: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) at a monthly time step. This approach
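    A minimal sketch of the wavelet multiresolution step, assuming the PyWavelets package: a synthetic monthly series is split into an approximation and detail components tied to different bands of time scales, which can then be related to large-scale predictors scale by scale.

```python
# Sketch (assuming the PyWavelets package) of a discrete wavelet multiresolution
# decomposition: a synthetic monthly streamflow series is split into components,
# each tied to a band of time scales. Wavelet choice and level are assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(600)                                   # 50 years of monthly values
flow = (np.sin(2 * np.pi * t / 12)                   # annual cycle
        + 0.5 * np.sin(2 * np.pi * t / 90)           # ~7.5-year oscillation
        + 0.3 * rng.standard_normal(t.size))

level = 5
coeffs = pywt.wavedec(flow, "db4", level=level)

# Reconstruct each component separately by zeroing all other coefficient sets
components = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(kept, "db4")[: flow.size])

# components[0] is the smooth approximation (lowest frequencies);
# components[1:] are details from coarse to fine time scales.
print([np.round(np.var(c), 3) for c in components])
```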

  10. Dynamic Assessment of Writing: The Impact of Implicit/Explicit Mediations on L2 Learners' Internalization of Writing Skills and Strategies

    Science.gov (United States)

    Alavi, Sayyed Mohammad; Taghizadeh, Mahboubeh

    2014-01-01

    Dynamic assessment is a procedure in which development is simultaneously assessed and improved with regard to the individual's or group's Zone of Proximal Development (ZPD; Lantolf & Poehner, 2004). This study aimed to follow dynamic assessment and investigate the impact of three types of implicit and explicit feedback on the essay writing of…

  11. Large-scale assessment of olfactory preferences and learning in Drosophila melanogaster: behavioral and genetic components

    Directory of Open Access Journals (Sweden)

    Elisabetta Versace

    2015-09-01

    In the Evolve and Resequence method (E&R), experimental evolution and genomics are combined to investigate evolutionary dynamics and the genotype-phenotype link. Like other genomic approaches, this method requires many replicates with large population sizes, which imposes severe restrictions on the analysis of behavioral phenotypes. Aiming to use E&R for investigating the evolution of behavior in Drosophila, we have developed a simple and effective method to assess spontaneous olfactory preferences and learning in large samples of fruit flies using a T-maze. We tested this procedure on (a) a large wild-caught population and (b) 11 isofemale lines of Drosophila melanogaster. Compared to previous methods, this procedure reduces the environmental noise and allows for the analysis of large population samples. Consistent with previous results, we show that flies have a preference for orange vs. apple odor. With our procedure, wild-derived flies exhibit olfactory learning in the absence of previous laboratory selection. Furthermore, we find genetic differences in olfactory learning with relatively high heritability. We propose this large-scale method as an effective tool for E&R and genome-wide association studies on olfactory preferences and learning.

  12. Using GRACE Satellite Gravimetry for Assessing Large-Scale Hydrologic Extremes

    Directory of Open Access Journals (Sweden)

    Alexander Y. Sun

    2017-12-01

    Global assessment of the spatiotemporal variability in terrestrial total water storage anomalies (TWSA) in response to hydrologic extremes is critical for water resources management. Using TWSA derived from the Gravity Recovery and Climate Experiment (GRACE) satellites, this study systematically assessed the skill of the TWSA-climatology (TC) approach and the breakpoint (BP) detection method for identifying large-scale hydrologic extremes. The TC approach calculates standardized anomalies by using the mean and standard deviation of the GRACE TWSA corresponding to each month. In the BP detection method, empirical mode decomposition (EMD) is first applied to identify the mean return period of TWSA extremes, and then a statistical procedure is used to identify the actual occurrence times of abrupt changes (i.e., BPs) in TWSA. Both detection methods were demonstrated on basin-averaged TWSA time series for the world’s 35 largest river basins. A nonlinear event coincidence analysis measure was applied to cross-examine abrupt changes detected by these methods with those detected by the Standardized Precipitation Index (SPI). Results show that our EMD-assisted BP procedure is a promising tool for identifying hydrologic extremes using GRACE TWSA data. Abrupt changes detected by the BP method coincide well with those of the SPI anomalies and with documented hydrologic extreme events. Event timings obtained by the TC method were ambiguous for a number of the river basins studied, probably because the GRACE data record is too short to derive a long-term climatology at this time. The BP approach demonstrates a robust wet-dry anomaly detection capability, which will be important for applications with the upcoming GRACE Follow-On mission.
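    The TC approach reduces to a monthly standardization, as in the following sketch on a synthetic basin-averaged TWSA series (the data and the -2 standard-deviation threshold are assumptions for illustration).

```python
# Minimal sketch of the TWSA-climatology (TC) approach described above: each
# month's basin-averaged TWSA is standardized against the mean and standard
# deviation of all values from the same calendar month. Data here are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
dates = pd.date_range("2003-01-01", "2016-12-01", freq="MS")
twsa = pd.Series(
    5 * np.sin(2 * np.pi * dates.month / 12) + rng.normal(0, 2, dates.size),
    index=dates, name="twsa_cm")

clim_mean = twsa.groupby(twsa.index.month).transform("mean")
clim_std = twsa.groupby(twsa.index.month).transform("std")
tc_anomaly = (twsa - clim_mean) / clim_std

# Flag months more than 2 standard deviations below climatology as dry extremes
dry_extremes = tc_anomaly[tc_anomaly < -2.0]
print(dry_extremes.head())
```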

  13. Personality Assessment Inventory scale characteristics and factor structure in the assessment of alcohol dependency.

    Science.gov (United States)

    Schinka, J A

    1995-02-01

    Individual scale characteristics and the inventory structure of the Personality Assessment Inventory (PAI; Morey, 1991) were examined by conducting internal consistency and factor analyses of item and scale score data from a large group (N = 301) of alcohol-dependent patients. Alpha coefficients, mean inter-item correlations, and corrected item-total scale correlations for the sample paralleled values reported by Morey for a large clinical sample. Minor differences in the scale factor structure of the inventory from Morey's clinical sample were found. Overall, the findings support the use of the PAI in the assessment of personality and psychopathology of alcohol-dependent patients.
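    The internal-consistency statistics reported here are straightforward to compute; the sketch below derives Cronbach's alpha and corrected item-total correlations from a simulated items-by-respondents matrix, not the PAI data.

```python
# Sketch of the internal-consistency statistics mentioned above: Cronbach's
# alpha and corrected item-total correlations for one scale, computed from a
# simulated respondents-by-items score matrix.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, shape (n_respondents, n_items)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1.0) * (1.0 - item_var / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    items = np.asarray(items, float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Simulated 8-item scale for 301 respondents sharing a common factor
rng = np.random.default_rng(0)
factor = rng.standard_normal(301)
items = factor[:, None] + rng.standard_normal((301, 8))
print("alpha =", round(cronbach_alpha(items), 3))
print("item-total r:", np.round(corrected_item_total(items), 2))
```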

  14. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    Science.gov (United States)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations have grown dramatically over the past few years with the highest-fidelity models reaching down to 1.5 KM global resolutions. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS) that combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of data stored include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  15. Large scale integration of flexible non-volatile, re-addressable memories using P(VDF-TrFE) and amorphous oxide transistors

    International Nuclear Information System (INIS)

    Gelinck, Gerwin H; Cobb, Brian; Van Breemen, Albert J J M; Myny, Kris

    2015-01-01

    Ferroelectric polymers and amorphous metal oxide semiconductors have emerged as important materials for re-programmable non-volatile memories and high-performance, flexible thin-film transistors, respectively. However, realizing sophisticated transistor memory arrays has proven to be a challenge, and reliable writing to and reading from such a large-scale memory has thus far not been demonstrated. Here, we report the integration of ferroelectric, P(VDF-TrFE), transistor memory arrays with thin-film circuitry that can address each individual memory element in the array. n-type indium gallium zinc oxide is used as the active channel material in both the memory and logic thin-film transistors. The maximum process temperature is 200 °C, allowing plastic films to be used as the substrate material. The technology was scaled up to 150 mm wafer size, and offers good reproducibility, high device yield and low device variation. This forms the basis for the successful demonstration of memory arrays, read and write circuitry, and the integration of these. (paper)

  16. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    Science.gov (United States)

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  17. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background guarantees which are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither has the competence assigned to the political system to make decisions, nor can it judge successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: 1) external control, 2) the organization form, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  18. A new method for large-scale assessment of change in ecosystem functioning in relation to land degradation

    Science.gov (United States)

    Horion, Stephanie; Ivits, Eva; Verzandvoort, Simone; Fensholt, Rasmus

    2017-04-01

    Ongoing pressures on European land are manifold, with extreme climate events and non-sustainable use of land resources being amongst the most important drivers altering the functioning of ecosystems. The protection and conservation of European natural capital is one of the key objectives of the 7th Environmental Action Plan (EAP). The EAP stipulates that European land must be managed in a sustainable way by 2020, and the UN Sustainable Development Goals define a land-degradation-neutral world as one of the targets. This implies that land degradation (LD) assessment of European ecosystems must be performed repeatedly, allowing for the assessment of the current state of LD as well as changes compared to a baseline adopted by the UNCCD for the objective of land degradation neutrality. However, scientifically robust methods are still lacking for large-scale assessment of LD and repeated consistent mapping of the state of terrestrial ecosystems. Historical land degradation assessments based on various methods exist, but the methods are generally non-replicable or difficult to apply at continental scale (Allan et al. 2007). The current lack of research methods applicable at large spatial scales is notably caused by the non-robust definition of LD, the scarcity of field data on LD, and the complex interplay of the processes driving LD (Vogt et al., 2011). Moreover, the link between LD and changes in land use (how land-use changes relate to changes in vegetation productivity and ecosystem functioning) is not straightforward. In this study we used the segmented trend method developed by Horion et al. (2016) for large-scale systematic assessment of hotspots of change in ecosystem functioning in relation to LD. This method alleviates shortcomings of the widely used linear trend model, which neither accounts for abrupt change nor adequately captures the actual changes in ecosystem functioning (de Jong et al. 2013; Horion et al. 2016). Here we present a new methodology for
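    A toy sketch of a single-breakpoint segmented trend fit conveys the idea: each candidate break splits the series into two least-squares lines, and the best split is compared with a single linear trend. It is an illustration of the general approach, not a re-implementation of the Horion et al. (2016) method.

```python
# Illustrative sketch of a segmented (breakpoint) trend fit on an NDVI-like
# time series: one candidate breakpoint is chosen by least squares and the fit
# is compared with a single linear trend. Series and parameters are invented.
import numpy as np

def linear_sse(t, y):
    coef = np.polyfit(t, y, 1)
    return np.sum((y - np.polyval(coef, t)) ** 2)

def best_breakpoint(t, y, margin=5):
    """Return (break_index, segmented_SSE) for the best single break."""
    best = (None, np.inf)
    for b in range(margin, len(t) - margin):
        sse = linear_sse(t[:b], y[:b]) + linear_sse(t[b:], y[b:])
        if sse < best[1]:
            best = (b, sse)
    return best

# Synthetic series: gradual greening that collapses abruptly after year 20
rng = np.random.default_rng(2)
t = np.arange(30, dtype=float)
y = 0.01 * t + np.where(t >= 20, -0.15, 0.0) + rng.normal(0, 0.02, t.size)

b, sse_seg = best_breakpoint(t, y)
print(f"break at year {t[b]:.0f}; segmented SSE {sse_seg:.4f} "
      f"vs linear SSE {linear_sse(t, y):.4f}")
```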

  19. Development of the Write Process for Pipeline-Ready Heavy Oil

    Energy Technology Data Exchange (ETDEWEB)

    Lee Brecher; Charles Mones; Frank Guffey

    2009-03-07

    Work completed under this program advances the goal of demonstrating Western Research Institute's (WRI's) WRITE™ process for upgrading heavy oil at field scale. MEG Energy Corporation (MEG), located in Calgary, Alberta, Canada, supported efforts at WRI to develop the WRITE™ process as an oil sands field-upgrading technology through this Task 51 Jointly Sponsored Research project. The project consisted of six tasks: (1) optimization of the distillate recovery unit (DRU), (2) demonstration and design of a continuous coker, (3) conceptual design and cost estimate for a commercial facility, (4) design of a WRITE™ pilot plant, (5) hydrotreating studies, and (6) establishing a petroleum analysis laboratory. WRITE™ is a heavy oil and bitumen upgrading process that produces residuum-free, pipeline-ready oil from heavy material whose undiluted density and viscosity exceed prevailing pipeline specifications. WRITE™ uses two processing stages to achieve low- and high-temperature conversion of heavy oil or bitumen. The first-stage DRU operates at mild thermal cracking conditions, yielding a light overhead product and a heavy residuum or bottoms material. These bottoms flow to the second-stage continuous coker, which operates at severe pyrolysis conditions, yielding light pyrolyzate and coke. The combined pyrolyzate and mildly cracked overhead streams form WRITE™'s synthetic crude oil (SCO) production. The main objectives of this project were to (1) complete testing and analysis at bench scale with the DRU and continuous coker reactors and provide results to MEG for process evaluation and scale-up determinations, and (2) complete a technical and economic assessment of WRITE™ technology to determine its viability. The DRU test program was completed and a processing envelope developed. These results were used for process assessment and for scale-up. Tests in the continuous coker were intended to

  20. Malaysian Tertiary Level ESL Students’ Perceptions toward Teacher Feedback, Peer Feedback and Self-assessment in their Writing

    Directory of Open Access Journals (Sweden)

    Kayatri Vasu

    2016-09-01

    In Malaysia, teacher feedback is highly preferred by students, who often believe that teachers know best. Teacher feedback shows them their teacher’s idea of an ideal piece of writing. However, excessive dependence on teachers adds to their workload. Therefore, teachers are increasingly promoting two other alternative methods that are gradually gaining importance. These methods are peer feedback and self-assessment. This study investigates ESL students’ perceptions toward teacher feedback, peer feedback and self-assessment in the students’ writing process. Questionnaires, adapted from instruments in the literature, were administered to 107 randomly selected students in a private local university in Malaysia. Students found feedback given on the content and organization of their writing more useful than feedback provided on their vocabulary and grammar. It was also found that students perceived feedback from the teacher, peers and self-assessment as all highly useful. Additionally, the results indicated that while there was no significant difference (p > .05) between the students’ perceptions toward teacher feedback and self-assessment, both were perceived as significantly more useful (p < .001) than peer feedback. The students also perceived explicit feedback as significantly more useful (p < .001) than implicit feedback. The results of this study have implications for English language learning-teaching practitioners and researchers. They shed light on the options preferred by students in revising their writing in ESL writing classrooms. Future research on the effects of teacher feedback, peer feedback and self-assessment on students’ writing performance will provide better insight into the preferred methods in ESL writing classrooms in similar settings.

  1. Study on two-dimensional POISSON design of large-scale FFAG magnet

    International Nuclear Information System (INIS)

    Ouyang Huafu

    2006-01-01

    In order to decrease the edge effect of the field, the designed magnetic field distribution in a large-scale FFAG magnet is realized by both the trim coil and the shape of the magnet pole-face. Through two-dimensional POISSON simulations, the distribution of the current and the position of the trim coil and the shape of the magnet pole are determined. In order to facilitate the POISSON design, two codes are written to automatically adjust the current and position of the trim coil and the shape of the magnet pole-face that appear in the POISSON input file. With the two codes, the efficiency of POISSON simulations is improved and the mistakes which might occur in writing and adjusting the POISSON input file manually can be avoided. (authors)
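
    The record above describes helper codes that rewrite the POISSON input file automatically. As a loose illustration of that idea (not the actual codes, and not the real POISSON input syntax), the hedged Python sketch below rewrites a keyword = value pair in a plain-text input file; the file name, keyword, and line format are hypothetical.

    import re
    from pathlib import Path

    def set_parameter(path, keyword, new_value):
        """Replace 'keyword = <number>' in the file, keeping everything else intact."""
        text = Path(path).read_text()
        pattern = rf"({re.escape(keyword)}\s*=\s*)[-+0-9.eE]+"
        updated, count = re.subn(pattern, rf"\g<1>{new_value}", text)
        if count == 0:
            raise ValueError(f"{keyword!r} not found in {path}")
        Path(path).write_text(updated)

    # Example: nudge a hypothetical trim-coil current before the next simulation run.
    Path("magnet.in").write_text("TRIM_CURRENT = 125.0\nPOLE_GAP = 4.2\n")
    set_parameter("magnet.in", "TRIM_CURRENT", 127.5)
    print(Path("magnet.in").read_text())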

  2. Vocabulary and Writing in a First and Second Language

    DEFF Research Database (Denmark)

    Albrechtsen, Dorte; Haastrup, Kirsten; Henriksen, Birgit

    Book description: Vocabulary and Writing in a First and Second Language is based on a large-scale empirical study. The innovative feature of the research was that the same students were asked to do the same tasks in both languages while reporting their thinking as they went along. Furthermore, ... the relationship between the skills and describe the level of development for individual learners within the three areas. In all cases, statistical and qualitative analyses are offered, the latter being based on the learners' own 'think-aloud' reports. Both researchers and teachers of language will find this in-depth approach useful in understanding the processes of both first and second language performance.

  3. A method for the assessment of the visual impact caused by the large-scale deployment of renewable-energy facilities

    International Nuclear Information System (INIS)

    Rodrigues, Marcos; Montanes, Carlos; Fueyo, Norberto

    2010-01-01

    The production of energy from renewable sources requires a significantly larger use of the territory compared with conventional (fossil and nuclear) sources. For large penetrations of renewable technologies, such as wind power, the overall visual impact at the national level can be substantial, and may prompt public reaction. This study develops a methodology for the assessment of the visual impact that can be used to measure and report the level of impact caused by several renewable technologies (wind farms, solar photovoltaic plants or solar thermal ones), both at the local and regional (e.g. national) scales. Applications are shown to several large-scale, hypothetical scenarios of wind and solar-energy penetration in Spain, and also to the vicinity of an actual, single wind farm.

  4. Approaches to large scale unsaturated flow in heterogeneous, stratified, and fractured geologic media

    International Nuclear Information System (INIS)

    Ababou, R.

    1991-08-01

    This report develops a broad review and assessment of quantitative modeling approaches and data requirements for large-scale subsurface flow in a radioactive waste geologic repository. The data review includes discussions of controlled field experiments, existing contamination sites, and site-specific hydrogeologic conditions at Yucca Mountain. Local-scale constitutive models for the unsaturated hydrodynamic properties of geologic media are analyzed, with particular emphasis on the effect of structural characteristics of the medium. The report further reviews and analyzes large-scale hydrogeologic spatial variability from aquifer data, unsaturated soil data, and fracture network data gathered from the literature. Finally, various modeling strategies toward large-scale flow simulations are assessed, including direct high-resolution simulation, and coarse-scale simulation based on auxiliary hydrodynamic models such as single equivalent continuum and dual-porosity continuum. The roles of anisotropy, fracturing, and broad-band spatial variability are emphasized. 252 refs

  5. Processfolio: Uniting Academic Literacies and Critical Emancipatory Action Research for Practitioner-Led Inquiry into EAP Writing Assessment

    Science.gov (United States)

    Pearson, Jayne

    2017-01-01

    This article reports on the design and implementation of an alternative form of writing assessment in a UK English for Academic Purposes (EAP) pre-sessional course. The assessment, termed processfolio, was a response to research inquiry into how writing assessment in a local context negated student agency and inculcated disempowering models of…

  6. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Full Text Available Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
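
    As an illustration of the matrixed design this record describes (dividing an item pool into different test forms), here is a minimal Python sketch; the item pool, number of forms, and random seed are invented for illustration and are not taken from the paper.

    import random

    def matrix_sample(items, n_forms, seed=0):
        """Randomly assign each item to exactly one of n_forms booklets."""
        rng = random.Random(seed)
        shuffled = items[:]                  # copy so the caller's list is untouched
        rng.shuffle(shuffled)
        forms = [[] for _ in range(n_forms)]
        for i, item in enumerate(shuffled):
            forms[i % n_forms].append(item)  # round-robin keeps form lengths balanced
        return forms

    item_pool = [f"item_{k:02d}" for k in range(1, 31)]     # 30 hypothetical items
    for f, booklet in enumerate(matrix_sample(item_pool, n_forms=3), start=1):
        print(f"Form {f}: {len(booklet)} items ->", booklet[:4], "...")

    Each student sits only one form, which is why testing time per student drops while curriculum coverage across the cohort is maintained.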

  7. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  8. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  9. Dual Rubrics and the Process of Writing: Assessment and Best Practices in a Developmental English Course

    Science.gov (United States)

    Pireh, Diane Flanegan

    2014-01-01

    This article presents strategies for using two types of essay-writing rubrics in a developmental English class of students transitioning into college-level writing. One checklist rubric is student-facing, designed to serve as a guide for students throughout the writing process and as a self-assessment tool. The other checklist rubric is…

  10. Sub-surface laser nanostructuring in stratified metal/dielectric media: a versatile platform towards flexible, durable and large-scale plasmonic writing

    International Nuclear Information System (INIS)

    Siozios, A; Bellas, D V; Lidorikis, E; Patsalas, P; Kalfagiannis, N; Cranton, W M; Koutsogeorgis, D C; Bazioti, C; Dimitrakopulos, G P; Vourlias, G

    2015-01-01

    Laser nanostructuring of pure ultrathin metal layers or ceramic/metal composite thin films has emerged as a promising route for the fabrication of plasmonic patterns with applications in information storage, cryptography, and security tagging. However, the environmental sensitivity of pure Ag layers and the complexity of ceramic/metal composite film growth hinder the implementation of this technology to large-scale production, as well as its combination with flexible substrates. In the present work we investigate an alternative pathway, namely, starting from non-plasmonic multilayer metal/dielectric layers, whose growth is compatible with large scale production such as in-line sputtering and roll-to-roll deposition, which are then transformed into plasmonic templates by single-shot UV-laser annealing (LA). This entirely cold, large-scale process leads to a subsurface nanoconstruction involving plasmonic Ag nanoparticles (NPs) embedded in a hard and inert dielectric matrix on top of both rigid and flexible substrates. The subsurface encapsulation of Ag NPs provides durability and long-term stability, while the cold character of LA suits the use of sensitive flexible substrates. The morphology of the final composite film depends primarily on the nanocrystalline character of the dielectric host and its thermal conductivity. We demonstrate the emergence of a localized surface plasmon resonance, and its tunability depending on the applied fluence and environmental pressure. The results are well explained by theoretical photothermal modeling. Overall, our findings qualify the proposed process as an excellent candidate for versatile, large-scale optical encoding applications. (paper)

  11. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  12. The Effects of Differing Response Criteria on the Assessment of Writing Competence.

    Science.gov (United States)

    Winters, Lynn

    The purpose of this study was to investigate the relative validities of four essay scoring systems, reflecting alternative conceptualizations of the writing process, for identifying "competent" writers. Each rater was trained in two of the four scoring systems: General Impression Scoring (GI), Diederich Expository Scale (DES), CSE…

  13. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  14. Effects of an expressive writing intervention on cancer-related distress in Danish breast cancer survivors

    DEFF Research Database (Denmark)

    Jensen-Johansen, Mikael Birkelund; Christensen, Søren; Valdimarsdottir, Heiddis

    2013-01-01

    Objective: To examine the effects of an expressive writing intervention (EWI) on cancer-related distress, depressive symptoms, and mood in women treated for early stage breast cancer. Methods: A nationwide sample of 507 Danish women who had recently completed treatment for primary breast cancer were randomly assigned to three 20-min home-based writing exercises, one week apart, focusing on either emotional disclosure (EWI group) or a non-emotional topic (control group). Cancer-related distress [Impact of Event Scale (IES)], depressive symptoms (Beck Depression Inventory—Short Form), and negative (37-item Profile of Mood States) and positive mood (Passive Positive Mood Scale) were assessed at baseline and at 3 and 9 months post-intervention. Choice of writing topic (cancer versus other), alexithymia (20-item Toronto Alexithymia Scale), and social constraints (Social Constraints Scale) were

  15. Guidance for Large-scale Implementation of Alternate Wetting and Drying: A Biophysical Suitability Assessment

    Science.gov (United States)

    Sander, B. O.; Wassmann, R.; Nelson, A.; Palao, L.; Wollenberg, E.; Ishitani, M.

    2014-12-01

    The alternate wetting and drying (AWD) technology for rice production not only saves 15-30% of irrigation water, it also reduces methane emissions by up to 70%. AWD is defined by periodic drying and re-flooding of a rice field. Due to its high mitigation potential and the simplicity of executing this practice, AWD has gained a lot of attention in recent years. The Climate and Clean Air Coalition (CCAC) has put AWD high on its agenda and funds a project to guide implementation of this technology in Vietnam, Bangladesh and Colombia. One crucial activity is a biophysical suitability assessment for AWD in the three countries. For this, we analyzed rainfall and soil data as well as potential evapotranspiration to assess whether the water balance allows practicing AWD or whether precipitation is too high for rice fields to fall dry. In my talk I will outline key factors for a successful large-scale implementation of AWD with a focus on the biophysical suitability assessment. The seasonal suitability maps that we generated highlight priority areas for AWD implementation and guide policy makers to informed decisions about meaningful investments in infrastructure and extension work.
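
    To make the kind of water-balance screening described above concrete, the following hedged Python sketch flags a month as climatically suitable for AWD when rainfall minus potential evapotranspiration and percolation leaves only a small surplus; the monthly values, percolation term, and threshold are hypothetical placeholders, not the criteria used in the CCAC project.

    def awd_suitable(rainfall_mm, pet_mm, percolation_mm=100.0, max_surplus_mm=50.0):
        """Return True if the monthly water surplus is small enough for fields to dry."""
        surplus = rainfall_mm - (pet_mm + percolation_mm)   # simple monthly balance
        return surplus <= max_surplus_mm

    # Hypothetical monthly climatology (mm) for one rice-growing season.
    season = [("Jun", 180.0, 130.0), ("Jul", 320.0, 120.0), ("Aug", 150.0, 125.0)]
    for month, rain, pet in season:
        flag = "suitable" if awd_suitable(rain, pet) else "not suitable"
        print(f"{month}: rainfall={rain:.0f} mm, PET={pet:.0f} mm -> {flag} for AWD")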

  16. What Are They Thinking? Automated Analysis of Student Writing about Acid-Base Chemistry in Introductory Biology

    Science.gov (United States)

    Haudek, Kevin C.; Prevost, Luanna B.; Moscarella, Rosa A.; Merrill, John; Urban-Lurain, Mark

    2012-01-01

    Students' writing can provide better insight into their thinking than can multiple-choice questions. However, resource constraints often prevent faculty from using writing assessments in large undergraduate science courses. We investigated the use of computer software to analyze student writing and to uncover student ideas about chemistry in an…

  17. A Critical Review of the IELTS Writing Test

    Science.gov (United States)

    Uysal, Hacer Hande

    2010-01-01

    Administered at local centres in 120 countries throughout the world, IELTS (International English Language Testing System) is one of the most widely used large-scale ESL tests that also offers a direct writing test component. Because of its popularity and its use for making critical decisions about test takers, it is crucial to draw attention to…

  18. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
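
    The spatiotemporal index described in this record maps a query to the data locations that hold the matching chunks. The Python sketch below shows the general idea with an in-memory SQLite table; the table layout, variable name, and file paths are hypothetical and are not the NCCS schema.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE chunk_index (
        variable TEXT, t_start TEXT, t_end TEXT,
        lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL,
        location TEXT)""")
    conn.executemany(
        "INSERT INTO chunk_index VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
        [("T2M", "2015-01-01", "2015-01-31", 30, 60, -130, -60, "/data/merra/t2m_201501.nc"),
         ("T2M", "2015-02-01", "2015-02-28", 30, 60, -130, -60, "/data/merra/t2m_201502.nc")])

    # Which chunks intersect a given point and date?
    rows = conn.execute(
        """SELECT location FROM chunk_index
           WHERE variable = ? AND t_start <= ? AND t_end >= ?
             AND lat_min <= ? AND lat_max >= ? AND lon_min <= ? AND lon_max >= ?""",
        ("T2M", "2015-01-15", "2015-01-15", 45.0, 45.0, -100.0, -100.0)).fetchall()
    print([r[0] for r in rows])   # -> ['/data/merra/t2m_201501.nc']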

  19. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors which determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for the analysis of fault rate in LSI and VLSI circuits.

  20. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…

  1. The Effect of Dialogue Journal Writing on EFL Students' Writing Skill

    Directory of Open Access Journals (Sweden)

    Ali Gholami Mehrdad

    2008-02-01

    Full Text Available Despite the role writing plays in learning a foreign language, many students do not show much interest in taking an active part in writing classes (Myint, 1997). Thus different activities have been proposed to motivate students to write, one of which is dialogue journal writing, and the present work tries to investigate the possible effect(s) of such activity on the writing ability of a group of English students at Islamic Azad University, Hamedan branch. To do this, 50 students obtaining 1 and 2 on the TWE scale on the structure section of a TOEFL test were selected and randomly assigned to experimental and control groups. After some introductory sessions, the students were asked to write paragraphs on a weekly schedule and hand them in to be corrected. In the experimental group the students were, furthermore, asked to keep journals and hand them in. After 4 months, the students in both groups took part in a writing exam in which they had to write two paragraphs on the topics given. The comparison of the means at p

  2. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850 °C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100 °C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  3. Application of plant metabonomics in quality assessment for large-scale production of traditional Chinese medicine.

    Science.gov (United States)

    Ning, Zhangchi; Lu, Cheng; Zhang, Yuxin; Zhao, Siyu; Liu, Baoqin; Xu, Xuegong; Liu, Yuanyan

    2013-07-01

    The curative effects of traditional Chinese medicines are principally based on the synergistic effect of their multi-targeting, multi-ingredient preparations, in contrast to modern pharmacology and drug development, which often focus on a single chemical entity. Therefore, methods that employ a few markers or pharmacologically active constituents to assess the quality and authenticity of these complex preparations face a number of severe challenges. Metabonomics can provide an effective platform for complex sample analysis, and it has also been reported to be applicable to the quality analysis of traditional Chinese medicine. Metabonomics enables comprehensive assessment of complex traditional Chinese medicines or herbal remedies and, by means of chemometrics, the classification of samples of diverse biological status, origin, or quality. Identification, processing, and pharmaceutical preparation are the main procedures in the large-scale production of Chinese medicinal preparations. Through complete scans, plant metabonomics addresses some of the shortfalls of single analyses and presents considerable potential to become a sharp tool for traditional Chinese medicine quality assessment. Georg Thieme Verlag KG Stuttgart · New York.

  4. Large scale debris-flow hazard assessment: a geotechnical approach and GIS modelling

    Directory of Open Access Journals (Sweden)

    G. Delmonaco

    2003-01-01

    Full Text Available A deterministic distributed model has been developed for large-scale debris-flow hazard analysis in the basin of the River Vezza (Tuscany Region, Italy). This area (51.6 km²) was affected by over 250 landslides, classified as debris/earth flows mainly involving the metamorphic geological formations outcropping in the area and triggered by the pluviometric event of 19 June 1996. In the last decades landslide hazard and risk analysis have been favoured by the development of GIS techniques permitting the generalisation, synthesis and modelling of stability conditions on a large-scale investigation (>1:10 000). In this work, the main results derived from the application of a geotechnical model coupled with a hydrological model for debris-flow hazard assessment are reported. The analysis has been developed through the following steps: landslide inventory map derived from aerial photo interpretation and direct field survey, generation of a database and digital maps, elaboration of a DTM and derived themes (i.e. slope angle map), definition of a superficial soil thickness map, geotechnical soil characterisation through implementation of a back-analysis on test slopes and laboratory test analysis, inference of the influence of precipitation, for distinct return times, on ponding time and pore pressure generation, implementation of a slope stability model (infinite slope model) and generalisation of the safety factor for estimated rainfall events with different return times. Such an approach has allowed the identification of potential source areas of debris-flow triggering for precipitation events with estimated return times of 10, 50, 75 and 100 years. The model shows a dramatic decrease of safety conditions for the simulation related to a 75-year return time rainfall event. It corresponds to an estimated cumulated daily intensity of 280–330 mm. This value can be considered the hydrological triggering
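
    The slope stability model named in this record is the classical infinite slope model. As a minimal, hedged Python sketch of that formulation (parameter values are invented, not the back-analysed values for the Vezza basin), the factor of safety drops below 1 when a saturating rainfall event raises the water table ratio:

    import math

    def infinite_slope_fs(c_kpa, phi_deg, slope_deg, z_m, m,
                          gamma_soil=19.0, gamma_water=9.81):
        """FS = [c + (gamma*z - m*gamma_w*z) * cos^2(beta) * tan(phi)]
                / [gamma * z * sin(beta) * cos(beta)]"""
        beta, phi = math.radians(slope_deg), math.radians(phi_deg)
        resisting = c_kpa + (gamma_soil * z_m - m * gamma_water * z_m) \
            * math.cos(beta) ** 2 * math.tan(phi)
        driving = gamma_soil * z_m * math.sin(beta) * math.cos(beta)
        return resisting / driving

    for m in (0.0, 1.0):   # dry slope versus fully saturated soil column
        fs = infinite_slope_fs(c_kpa=5.0, phi_deg=32.0, slope_deg=35.0, z_m=1.5, m=m)
        print(f"water table ratio m={m:.1f} -> FS={fs:.2f}")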

  5. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based, high-performance computing method using the OpenACC application was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transportation between the GPU and CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.

  6. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
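
    Since the record relies on the scaling exponent of Detrended Fluctuation Analysis (DFA), a compact Python sketch of first-order DFA is given below. It is applied to a random 0/1 sequence standing in for a numerically coded DNA sequence (e.g. GC = 1, AT = 0, which is an assumption of this illustration, not a detail from the paper).

    import numpy as np

    def dfa_exponent(x, scales):
        """Return the DFA scaling exponent (slope of log F(s) versus log s)."""
        y = np.cumsum(x - np.mean(x))            # integrated, mean-removed profile
        fluctuations = []
        for s in scales:
            f2 = []
            for w in range(len(y) // s):
                seg = y[w * s:(w + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
                f2.append(np.mean((seg - trend) ** 2))
            fluctuations.append(np.sqrt(np.mean(f2)))
        slope, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
        return slope

    rng = np.random.default_rng(0)
    signal = rng.integers(0, 2, size=20000).astype(float)   # uncorrelated 0/1 series
    print(f"DFA exponent ~ {dfa_exponent(signal, [16, 32, 64, 128, 256, 512]):.2f}"
          "  (about 0.5 for white noise; long-range patchiness pushes it higher)")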

  7. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases

  8. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  9. Methods for assessing the socioeconomic impacts of large-scale resource developments: implications for nuclear repository siting

    International Nuclear Information System (INIS)

    Murdock, S.H.; Leistritz, F.L.

    1983-03-01

    An overview of the major methods presently available for assessing the socioeconomic impacts of large-scale resource developments is provided, including discussion of the implications and applications of such methods for nuclear-waste-repository siting. The report: (1) summarizes conceptual approaches underlying, and methodological alternatives for, the conduct of impact assessments in each substantive area, and then enumerates advantages and disadvantages of each alternative; (2) describes factors related to the impact-assessment process, impact events, and the characteristics of rural areas that affect the magnitude and distribution of impacts and the assessment of impacts in each area; (3) provides a detailed review of those methodologies actually used in impact assessment for each area, describes advantages and problems encountered in the use of each method, and identifies the frequency of use and the general level of acceptance of each technique; and (4) summarizes the implications of each area of projection for the repository-siting process and the applicability of the methods for each area to the special and standard features of repositories, and makes general recommendations concerning specific methods and procedures that should be incorporated in assessments for siting areas

  10. A large scale field experiment in the Amazon Basin (Lambada/Bateristca)

    Energy Technology Data Exchange (ETDEWEB)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C. [Winand Staring Centre, Wageningen (Netherlands)

    1994-12-31

    A description is given of a large-scale field experiment planned in the Amazon Basin, aiming to assess the large-scale balances of energy, water and CO₂. The background for this experiment and its embedding in the global change programmes IGBP/BAHC and WCRP/GEWEX are described. A proposal by four European groups aimed at designing the experiment with the help of mesoscale models is described and a possible European input to this experiment is suggested. 24 refs., 1 app.

  11. Investigating the Practices of Assessment Methods in Amharic Language Writing Skill Context: The Case of Selected Higher Education in Ethiopia

    Science.gov (United States)

    Tesfay, Hailay

    2017-01-01

    This study aims to investigate how instructors of Amharic language writing skills in Ethiopian higher education practice assessment methods in the writing skill context. It was also intended to explore their viewpoints about the practicality of implementing assessment methods in Amharic writing courses. In order to achieve the goals of this study,…

  12. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale 120h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  13. Trait Based Assessment on Teaching Writing Skill for EFL Learners

    Science.gov (United States)

    Asrobi, Maman; Prasetyaningrum, Ari

    2017-01-01

    This study was conducted in order to investigate the effectiveness of trait-based assessment in teaching writing skills to EFL learners. Designed as a pre-experimental study with a one-group pretest and posttest design, it examined 20 second-semester students of the English Department of "Hamzanwadi University" in the academic year…

  14. Measuring the Impact of Rater Negotiation in Writing Performance Assessment

    Science.gov (United States)

    Trace, Jonathan; Janssen, Gerriet; Meier, Valerie

    2017-01-01

    Previous research in second language writing has shown that when scoring performance assessments even trained raters can exhibit significant differences in severity. When raters disagree, using discussion to try to reach a consensus is one popular form of score resolution, particularly in contexts with limited resources, as it does not require…

  15. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the manager carries a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system easily fall into disorder, since thousands of nodes are housed in big rooms and managers can easily confuse one machine with another. How can accurate management of a large-scale cluster system be carried out effectively? This article introduces ELFms for the large-scale cluster system and, furthermore, proposes an approach to realizing automatic management of the large-scale cluster system. (authors)

  16. A probabilistic assessment of large scale wind power development for long-term energy resource planning

    Science.gov (United States)

    Kennedy, Scott Warren

    A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable
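
    A toy version of the probabilistic interaction described in this abstract (stochastic wind output meeting stochastic demand) can be sketched in a few lines of Python; every distribution, capacity, and the crude capacity-credit rule below are hypothetical placeholders, not the thesis model.

    import numpy as np

    rng = np.random.default_rng(42)
    n_hours = 8760
    demand_mw = rng.normal(900.0, 150.0, n_hours).clip(min=400.0)   # hourly load
    wind_mw = 300.0 * rng.beta(2.0, 5.0, n_hours)                   # 300 MW wind farm

    served_by_wind = np.minimum(wind_mw, demand_mw)      # wind energy actually used
    avoided_fossil_mwh = served_by_wind.sum()
    peak_hours = np.argsort(demand_mw)[-100:]            # 100 highest-demand hours
    capacity_credit_mw = wind_mw[peak_hours].mean()      # crude capacity credit

    print(f"Expected avoided fossil generation: {avoided_fossil_mwh:,.0f} MWh/yr")
    print(f"Approximate capacity credit: {capacity_credit_mw:.0f} MW of 300 MW installed")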

  17. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1126 deaths and displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. However, adaptive behaviour towards flood risk reduction and the interaction between the government, insurers, and individuals has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed including agent representatives for the administrative stakeholders of European Member states, insurer and reinsurer markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach this study is a first contribution to overcoming the limitations of traditional large-scale flood risk models, in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.
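
    A minimal flavour of the household-level decision loop such an agent-based model contains is sketched below in Python; the behavioural rule, probabilities, and subsidy term are invented for illustration and do not reproduce the European model described in the record.

    import random

    class Household:
        def __init__(self, rng):
            self.rng = rng
            self.protected = False
            self.risk_perception = rng.uniform(0.0, 0.3)

        def step(self, flood_probability, subsidy=0.0):
            # Experiencing a flood raises risk perception; otherwise it decays.
            if self.rng.random() < flood_probability:
                self.risk_perception = min(1.0, self.risk_perception + 0.4)
            else:
                self.risk_perception *= 0.9
            # Invest if perceived risk outweighs the (possibly subsidised) cost.
            if not self.protected and self.risk_perception > 0.5 - subsidy:
                self.protected = True

    rng = random.Random(1)
    households = [Household(rng) for _ in range(1000)]
    for year in range(30):
        for h in households:
            h.step(flood_probability=0.02, subsidy=0.1)
    uptake = sum(h.protected for h in households) / len(households)
    print(f"Share of households protected after 30 years: {uptake:.0%}")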

  18. INTERACTIONIST DYNAMIC ASSESSMENT IN ACADEMIC PERSUASIVE WRITING: A CASE OF TWO EFL LEARNERS

    Directory of Open Access Journals (Sweden)

    Zahra kheradmand Saadi

    2017-09-01

    Full Text Available This study investigated the effects of interactionist dynamic assessment on improving the academic persuasive writing of two Iranian EFL learners majoring in English Language and Literature. Qualitative analysis of the interactions between the mediator and learners and of the drafts written by the learners indicated that the use of different types of mediation was effective in developing the learners’ persuasive writing. In addition to factors such as the individual, time, and language feature, which were shown to be integral in determining mediation, assessment of the two cases showed that factors such as the mediator’s role, the learners’ responsiveness to mediation, and agency were important in specifying mediation.

  19. Observing writing processes of struggling adult writers with collaborative writing

    Directory of Open Access Journals (Sweden)

    Afra Sturm

    2016-10-01

    Full Text Available This study investigated how struggling adult writers solve a writing task and what they know about writing and themselves as writers. The writing process of the adult writers was examined by combining three elements: the observation of collaborative writing tasks, analyses of their written texts, and structured individual interviews that included both retrospective and prospective parts. This methodical approach provides productive tools to assess writing processes and writing knowledge of struggling adult writers. The triangulation of data from the different sources is visualized in a case study. Findings from the case study suggest both similarities and differences between struggling adult and younger writers. Concerning the writing process of both groups, planning and revision play a limited role. However, alongside these similar limitations in their writing process, struggling adult writers distinguish themselves from their young counterparts through their relatively extensive knowledge about themselves as writers.

  20. Nuclear-pumped lasers for large-scale applications

    International Nuclear Information System (INIS)

    Anderson, R.E.; Leonard, E.M.; Shea, R.F.; Berggren, R.R.

    1989-05-01

    Efficient initiation of large-volume chemical lasers may be achieved by neutron induced reactions which produce charged particles in the final state. When a burst mode nuclear reactor is used as the neutron source, both a sufficiently intense neutron flux and a sufficiently short initiation pulse may be possible. Proof-of-principle experiments are planned to demonstrate lasing in a direct nuclear-pumped large-volume system; to study the effects of various neutron absorbing materials on laser performance; to study the effects of long initiation pulse lengths; to demonstrate the performance of large-scale optics and the beam quality that may be obtained; and to assess the performance of alternative designs of burst systems that increase the neutron output and burst repetition rate. 21 refs., 8 figs., 5 tabs

  1. Assessment and Intervention in Overcoming Writing Difficulties: An Illustration From the Self-Regulated Strategy Development Model.

    Science.gov (United States)

    Graham, Steve; Harris, Karen R

    1999-07-01

    The progress of a 12-year-old boy with learning disabilities and severe writing difficulties is followed from initial assessment through instruction in strategies for planning, revising, and managing the composing process. A validated instructional model, Self-Regulated Strategy Development (SRSD), was used to teach these processes. With SRSD, writing strategies are explicitly taught in combination with procedures for regulating the use of these strategies, the writing process, and any undesirable behaviors that may impede performance. Recommendations are offered to speech-language pathologists for applying the SRSD model to children experiencing writing difficulties.

  2. The assessment of the readiness of five countries to implement child maltreatment prevention programs on a large scale.

    Science.gov (United States)

    Mikton, Christopher; Power, Mick; Raleva, Marija; Makoae, Mokhantso; Al Eissa, Majid; Cheah, Irene; Cardia, Nancy; Choo, Claire; Almuneef, Maha

    2013-12-01

    This study aimed to systematically assess the readiness of five countries - Brazil, the Former Yugoslav Republic of Macedonia, Malaysia, Saudi Arabia, and South Africa - to implement evidence-based child maltreatment prevention programs on a large scale. To this end, it applied a recently developed method called Readiness Assessment for the Prevention of Child Maltreatment, based on two parallel 100-item instruments. The first measures the knowledge, attitudes, and beliefs concerning child maltreatment prevention of key informants; the second, completed by child maltreatment prevention experts using all available data in the country, produces a more objective assessment of readiness. The instruments cover all of the main aspects of readiness including, for instance, availability of scientific data on the problem, legislation and policies, will to address the problem, and material resources. Key informant scores ranged from 31.2 (Brazil) to 45.8/100 (the Former Yugoslav Republic of Macedonia) and expert scores from 35.2 (Brazil) to 56/100 (Malaysia). Major gaps identified in almost all countries included a lack of professionals with the skills, knowledge, and expertise to implement evidence-based child maltreatment programs and of institutions to train them; inadequate funding, infrastructure, and equipment; extreme rarity of outcome evaluations of prevention programs; and a lack of national prevalence surveys of child maltreatment. In sum, the five countries are in a low to moderate state of readiness to implement evidence-based child maltreatment prevention programs on a large scale. Such an assessment of readiness - the first of its kind - allows gaps to be identified and then addressed to increase the likelihood of program success. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Writing orthotic device for the management of writer's cramp

    Directory of Open Access Journals (Sweden)

    Narayanasarma V. Singam

    2013-01-01

    Full Text Available Background: Oral therapies and chemodenervation procedures are often unrewarding in the treatment of focal, task-specific hand disorders such as writer's cramp or primary writing tremor. Methods: A portable writing orthotic device was evaluated on fifteen consecutively recruited writer's cramp and primary writing tremor subjects. We measured overall impairment at baseline and after two weeks of at-home use with the Writer's Cramp Rating Scale (range = 0-8, higher is worse) and writing quality and comfort with a visual analog scale (range = 0-10). Results: Compared to a regular pen, the writing orthotic device improved the Writer's Cramp Rating Scale scores at first test (p=0.001) and re-test (p=0.005) as well as writing quality and device comfort in writer's cramp subjects. Benefits were sustained at two weeks. Primary writing tremor subjects demonstrated no improvements. Conclusions: Writing orthotic devices exploiting a muscle-substitution strategy may yield immediate benefits in patients with writer's cramp.

  4. The Write Stuff: Teaching the Introductory Public Relations Writing Course.

    Science.gov (United States)

    King, Cynthia M.

    2001-01-01

    Outlines an introductory public relations writing course. Presents course topics and objectives, and assignments designed to meet them. Provides a sample grading rubric and evaluates major public relations writing textbooks. Discusses learning and assessment strategies. (SR)

  5. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
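
    The optimisation step described here (choosing the number of measurements from pilot-study variances for a target precision) is commonly done with Neyman allocation. The hedged Python sketch below illustrates the calculation; the strata, population shares, standard deviations, and precision target are hypothetical, not the Norwegian survey figures.

    import math

    def neyman_allocation(strata, margin, z=1.96):
        """strata: {name: (population_share, std_dev)}; margin: target half-width
        of the 95% confidence interval for the overall mean dose rate."""
        weighted = sum(w * s for w, s in strata.values())
        n_total = math.ceil((z * weighted / margin) ** 2)   # finite-population correction ignored
        per_stratum = {name: math.ceil(n_total * w * s / weighted)
                       for name, (w, s) in strata.items()}
        return n_total, per_stratum

    strata = {                     # population share, std dev of dose rate (hypothetical units)
        "wooden houses": (0.55, 20.0),
        "concrete houses": (0.35, 35.0),
        "basement dwellings": (0.10, 50.0),
    }
    n, per_stratum = neyman_allocation(strata, margin=2.0)
    print(f"Total measurements needed: ~{n}")
    print(per_stratum)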

  6. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  7. Methods for large-scale international studies on ICT in education

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.; Voogt, Joke; Knezek, G.A.

    2008-01-01

    International comparative assessment is a research method applied for describing and analyzing educational processes and outcomes. They are used to ‘describe the status quo’ in educational systems from an international comparative perspective. This chapter reviews different large scale international

  8. Technical Writing Redesign and Assessment: A Pilot Study

    Science.gov (United States)

    Winter, Gaye Bush

    2010-01-01

    The purpose of this study was to compare scores on writing assignments from traditional, fully online courses in technical writing to pilot, hybrid courses at a southern university. A total of 232 students' assignments were compared in this study. All writing assignments were scored by six trained instructors of English using the same five point…

  9. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  10. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    Science.gov (United States)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10 year objective of a program to establish the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5 year program were derived from a set of 5 year objectives deduced from the 10 year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10⁸ peak W/year of $0.50 cells was projected for the year 1985. The developments of other photovoltaic conversion systems were assigned to longer range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5 year phase of the program is $268.5M.

  11. The relationship between automatic assessment of oral proficiency and other indicators of first year students' linguistic abilities

    CSIR Research Space (South Africa)

    De Wet, Febe

    2012-11-01

    Full Text Available Academic literacy proficiency is key to the success of a student at university. Currently, the large-scale assessment of language proficiency, particularly at higher education levels, is dominated by reading and writing tests because listening...

  12. THE EFFECT OF SCAFFOLDING AND PORTFOLIO ASSESSMENT ON JORDANIAN EFL LEARNERS’ WRITING

    Directory of Open Access Journals (Sweden)

    Ruba Fahmi Bataineh

    2016-07-01

    Full Text Available This study examines the potential effect of scaffolding-based instruction and portfolio-based assessment on Jordanian EFL tenth grade students’ overall writing performance and their performance on the sub-skills of focus, development, organization, conventions and word choice. The study uses a quasi-experimental, experimental/control group, pre-/posttest design. In the experimental group, 15 female tenth grade students from the North-Eastern Badia Directorate of Education (Jordan) were taught to generate ideas, structure, draft, and edit their written pieces using agency scaffolding, the scaffolding principles of contextual support, continuity, intersubjectivity, flow, contingency and handover, and a slightly adapted version of Hamp-Lyons and Condon’s (2000) Portfolio Model of collection, selection and reflection. A control group of 28 students were instructed conventionally per the guidelines of the teacher’s book. Using descriptive statistics and ANCOVA to analyze the students’ scores on the pre- and the posttests, the results showed that the group taught through scaffolding-based instruction and portfolio-based assessment outperformed the control group (at α ≤ 0.05) in their overall writing performance and in their performance on the five writing sub-skills.
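
    The analysis named in this record is an ANCOVA on posttest scores with the pretest as covariate. A hedged Python sketch of that model (with a fabricated data frame, since the study's data are not reproduced here) could look like this, assuming the statsmodels formula interface:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    group = np.array(["experimental"] * 15 + ["control"] * 28)      # 15 + 28 students
    pre = rng.normal(60, 8, group.size)
    post = pre + rng.normal(5, 4, group.size) + np.where(group == "experimental", 6, 0)
    df = pd.DataFrame({"group": group, "pre": pre, "post": post})

    model = smf.ols("post ~ pre + C(group)", data=df).fit()   # ANCOVA as a linear model
    print(sm.stats.anova_lm(model, typ=2))                    # group effect adjusted for pretest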

  13. Writing Editorials.

    Science.gov (United States)

    Pappas, Marjorie L.

    2003-01-01

    Presents a thematic unit for middle schools on editorial writing, or persuasive writing, based on the Pathways Model for information skills lessons. Includes assessing other editorials; student research process journals; information literacy and process skills; and two lesson plans that involve library media specialists as well as teachers. (LRW)

  14. Assessing Writing: A Review of the Main Trends

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Salmani Nodoushan

    2014-10-01

    Full Text Available As a language skill, writing has had, still has and will continue to have an important role in shaping the scientific structure of human life in that it is the medium through which scientific content is stored, retained, and transmitted. It has therefore been a major concern for writing teachers and researchers to find a reliable method for evaluating and ensuring quality writing. This paper addresses the different approaches to scoring writing and classifies them into a priori scoring systems (including holistic and analytic scoring), and a posteriori trait-based scoring systems (including primary-trait and multiple-trait scoring).

  15. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  16. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products along with ground observations provide another avenue for investigating how precipitation uncertainty would affect the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing performance in terms of spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
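
    As an illustration of the kind of verification the record above describes, the sketch below compares a gridded precipitation series against a gauge series at daily and monthly aggregations using bias, RMSE and correlation. The data and the choice of scores are synthetic assumptions for illustration, not the study's datasets or criteria.

```python
# Hedged sketch: verifying a gridded precipitation product against a gauge
# record at two temporal aggregations (all values synthetic).
import numpy as np
import pandas as pd

def verification_scores(gauge: pd.Series, gridded: pd.Series) -> dict:
    err = gridded - gauge
    return {
        "bias": err.mean(),
        "rmse": float(np.sqrt((err ** 2).mean())),
        "corr": gauge.corr(gridded),
    }

dates = pd.date_range("2000-01-01", periods=3 * 365, freq="D")
rng = np.random.default_rng(0)
gauge = pd.Series(rng.gamma(0.4, 5.0, dates.size), index=dates)          # mm/day
gridded = (gauge * 0.9 + rng.normal(0, 1.0, dates.size)).clip(lower=0)   # biased, noisy product

print("daily:  ", verification_scores(gauge, gridded))
print("monthly:", verification_scores(gauge.resample("MS").sum(),
                                      gridded.resample("MS").sum()))
```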

  17. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  18. Understanding Business Interests in International Large-Scale Student Assessments: A Media Analysis of "The Economist," "Financial Times," and "Wall Street Journal"

    Science.gov (United States)

    Steiner-Khamsi, Gita; Appleton, Margaret; Vellani, Shezleen

    2018-01-01

    The media analysis is situated in the larger body of studies that explore the varied reasons why different policy actors advocate for international large-scale student assessments (ILSAs) and adds to the research on the fast advance of the global education industry. The analysis of "The Economist," "Financial Times," and…

  19. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF6, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
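
    The record above conditions second-order structure functions on the instantaneous large-scale velocity. A minimal sketch of that idea, on synthetic one-dimensional data with a running-mean filter standing in for the true large-scale velocity, is given below; the window length, lag and binning are illustrative assumptions, not the study's processing.

```python
# Illustrative sketch (not the paper's analysis): estimating the second-order
# structure function D(r) = <[u(x+r) - u(x)]^2> conditioned on a large-scale
# velocity signal obtained with a simple running-mean filter.
import numpy as np

def conditional_structure_function(u, r, window=256, n_bins=5):
    """Return large-scale velocity bin centres and D(r) within each bin."""
    u = np.asarray(u, dtype=float)
    kernel = np.ones(window) / window
    u_large = np.convolve(u, kernel, mode="same")      # crude large-scale signal

    du2 = (u[r:] - u[:-r]) ** 2                        # squared increments at lag r
    u_cond = u_large[:-r]                              # large-scale value at base point

    edges = np.quantile(u_cond, np.linspace(0.0, 1.0, n_bins + 1))
    centres, d_r = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (u_cond >= lo) & (u_cond <= hi)
        centres.append(0.5 * (lo + hi))
        d_r.append(du2[mask].mean())
    return np.array(centres), np.array(d_r)

# Synthetic signal: small-scale noise whose amplitude is modulated by the
# large-scale component, so D(r) should grow with the large-scale velocity.
rng = np.random.default_rng(0)
x = np.arange(200_000)
large = np.sin(2 * np.pi * x / 20_000)
u = large + 0.1 * (1.0 + 0.5 * large) * rng.standard_normal(x.size)

centres, d_r = conditional_structure_function(u, r=10)
print(np.round(d_r / d_r.mean(), 2))    # D(r) relative to its unconditional mean
```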

  20. Using Portfolio to Assess Rural Young Learners' Writing Skills in English Language Classroom

    Science.gov (United States)

    Aziz, Muhammad Noor Abdul; Yusoff, Nurahimah Mohd.

    2015-01-01

    This study aimed at discussing the benefits of portfolio assessment in assessing students' writing skills. The study explores the use of authentic assessment in the classroom. Eleven primary school children from Year 4 in a rural school in Sabah participated in this study. Data were collected by observing them during the English Language lessons…

  1. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    Science.gov (United States)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat ( 16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data, is creating never imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field-scales.
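
    The record above relies on Google's Earth Engine to process the Landsat archive at scale. The sketch below shows only the general data-assembly pattern with the Earth Engine Python API (filtering a Landsat collection and exporting an annual composite); it does not implement the SSEBop model, and the collection ID, extent, dates and export settings are illustrative assumptions.

```python
# Hedged sketch of the Earth Engine data-assembly pattern only; the ET model
# itself is not implemented here.
import ee

ee.Initialize()

aoi = ee.Geometry.Rectangle([-115.0, 36.0, -114.0, 37.0])   # example extent

landsat = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
           .filterBounds(aoi)
           .filterDate("2020-01-01", "2021-01-01"))

print("scenes found:", landsat.size().getInfo())

annual_composite = landsat.median().clip(aoi)               # simple annual composite

task = ee.batch.Export.image.toDrive(
    image=annual_composite,
    description="annual_landsat_composite",
    region=aoi,
    scale=30,
    maxPixels=1e9,
)
task.start()
```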

  2. The Rights and Responsibility of Test Takers When Large-Scale Testing Is Used for Classroom Assessment

    Science.gov (United States)

    van Barneveld, Christina; Brinson, Karieann

    2017-01-01

    The purpose of this research was to identify conflicts in the rights and responsibility of Grade 9 test takers when some parts of a large-scale test are marked by teachers and used in the calculation of students' class marks. Data from teachers' questionnaires and students' questionnaires from a 2009-10 administration of a large-scale test of…

  3. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  4. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords : algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  5. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.

  6. Large Scale Evapotranspiration Estimates: An Important Component in Regional Water Balances to Assess Water Availability

    Science.gov (United States)

    Garatuza-Payan, J.; Yepez, E. A.; Watts, C.; Rodriguez, J. C.; Valdez-Torres, L. C.; Robles-Morua, A.

    2013-05-01

    Water security can be defined as the reliable supply, in quantity and quality, of water to help sustain future populations and maintain ecosystem health and productivity. Water security is rapidly declining in many parts of the world due to population growth, drought, climate change, salinity, pollution, land use change, over-allocation and over-utilization, among other issues. Governmental offices (such as the Comision Nacional del Agua in Mexico, CONAGUA) require and conduct studies to estimate reliable water balances at regional or continental scales in order to provide reasonable assessments of the amount of water that can be provided (from surface or ground water sources) to supply all the human needs while maintaining natural vegetation, on an operational basis and, more important, under disturbances such as droughts. Large-scale estimates of evapotranspiration (ET), a critical component of the water cycle, are needed for a better comprehension of the hydrological cycle at large scales; in most water balances ET is left as the residual. For operational purposes, such water balance estimates cannot rely on ET measurements, since these do not exist; they should be simple and require the least ground information possible, information that is often scarce or does not exist at all. Given this limitation, the use of remotely sensed data to estimate ET could supplement the lack of ground information, particularly in remote regions. In this study, a simple method based on the Makkink equation is used to estimate ET for large areas at high spatial resolutions (1 km). The Makkink model used here is forced using three remotely sensed datasets. First, the model uses solar radiation estimates obtained from the Geostationary Operational Environmental Satellite (GOES); second, the model uses an Enhanced Vegetation Index (EVI) obtained from the Moderate-resolution Imaging Spectroradiometer (MODIS) normalized to get an estimate for vegetation amount and land use which was
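
    For orientation, the sketch below implements a generic Makkink-type reference ET estimate of the kind the record describes, using the textbook FAO-56 expressions for the saturation vapour pressure slope and the psychrometric constant; the coefficient and example inputs are standard illustrative values, not the study's calibration or satellite forcings.

```python
# Illustrative Makkink-type reference ET estimate (mm/day); inputs are
# solar radiation Rs (MJ m-2 day-1) and air temperature (degC).
import numpy as np

def makkink_et(rs_mj, t_air_c, pressure_kpa=101.3, c=0.65):
    """Reference ET (mm/day) from solar radiation and air temperature."""
    # slope of the saturation vapour pressure curve (kPa/degC), FAO-56 form
    es = 0.6108 * np.exp(17.27 * t_air_c / (t_air_c + 237.3))
    delta = 4098.0 * es / (t_air_c + 237.3) ** 2
    gamma = 0.000665 * pressure_kpa          # psychrometric constant (kPa/degC)
    lam = 2.45                               # latent heat of vaporisation (MJ/kg)
    return c * delta / (delta + gamma) * rs_mj / lam

# e.g. a clear summer day in a semi-arid region
print(round(makkink_et(rs_mj=28.0, t_air_c=32.0), 2), "mm/day")
```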

  7. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  8. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  9. Scientific writing: a randomized controlled trial comparing standard and on-line instruction

    Directory of Open Access Journals (Sweden)

    Phadtare Amruta

    2009-05-01

    Full Text Available Abstract Background Writing plays a central role in the communication of scientific ideas and is therefore a key aspect in researcher education, ultimately determining the success and long-term sustainability of their careers. Despite the growing popularity of e-learning, we are not aware of any existing study comparing on-line vs. traditional classroom-based methods for teaching scientific writing. Methods Forty-eight participants from a medical, nursing and physiotherapy background from the US and Brazil were randomly assigned to two groups (n = 24 per group): an on-line writing workshop group (on-line group), in which participants used virtual communication, Google Docs and standard writing templates, and a standard writing guidance training (standard group), where participants received standard instruction without the aid of virtual communication and writing templates. Two outcomes were evaluated: manuscript quality, assessed as the primary outcome using the scores obtained on a six-subgroup quality scale (SSQS), and satisfaction, measured with a Likert scale. To control for observer variability, inter-observer reliability was assessed using Fleiss's kappa. A post-hoc analysis comparing rates of communication between mentors and participants was performed. Nonparametric tests were used to assess intervention efficacy. Results Excellent inter-observer reliability among three reviewers was found, with an Intraclass Correlation Coefficient (ICC) agreement = 0.931882 and ICC consistency = 0.932485. The on-line group had better overall manuscript quality (p = 0.0017; SSQSavg score 75.3 ± 14.21, ranging from 37 to 94) compared to the standard group (47.27 ± 14.64, ranging from 20 to 72). Participant satisfaction was higher in the on-line group (4.3 ± 0.73) compared to the standard group (3.09 ± 1.11) (p = 0.001). The standard group also had fewer communication events compared to the on-line group (0.91 ± 0.81 vs. 2.05 ± 1.23; p = 0.0219). Conclusion Our protocol
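
    The sketch below illustrates the kind of nonparametric two-group comparison reported above (a Mann-Whitney U test on manuscript-quality scores); the group sizes match the trial, but the scores themselves are simulated for illustration and are not the trial's data.

```python
# Hedged sketch: nonparametric comparison of two groups of quality scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
online = rng.normal(75, 14, 24)      # simulated manuscript quality, on-line group
standard = rng.normal(47, 15, 24)    # simulated manuscript quality, standard group

u_stat, p_value = stats.mannwhitneyu(online, standard, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.4f}")
print(f"medians: online {np.median(online):.1f}, standard {np.median(standard):.1f}")
```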

  10. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  11. Potential Impact of Large Scale Abstraction on the Quality of Shallow ...

    African Journals Online (AJOL)

    PRO

    Significant increase in crop production would not, however, be ... sounding) using Geonics EM34-3 and Abem SAS300C Terrameter to determine the aquifer (fresh water lens) ..... Final report on environmental impact assessment of large scale.

  12. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  13. Can Computers Make the Grade in Writing Exams?

    Science.gov (United States)

    Hadi-Tabassum, Samina

    2014-01-01

    Schools are scrambling to prepare students for the writing assessments aligned to the Common Core State Standards. In some states, writing has not been assessed for over a decade. Yet, with the use of computerized grading of the student's writing, many teachers are wondering how to best prepare students for the writing assessments that will…

  14. Writing and Science Literacy

    Science.gov (United States)

    Weiss-Magasic, Coleen

    2012-01-01

    Writing activities are a sure way to assess and enhance students' science literacy. Sometimes the author's students use technical writing to communicate their lab experiences, just as practicing scientists do. Other times, they use creative writing to make connections to the topics they're learning. This article describes both types of writing…

  15. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  16. Guidelines for writing an argumentative essay

    OpenAIRE

    Aleksandra Egurnova

    2014-01-01

    The guidelines below are intended for teachers, professors, students, and the public at large who are interested in the issues of English writing culture. They provide a detailed plan for completing the writing task–writing an argumentative essay.

  17. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by the labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data, a critical step before higher-level time series and velocity products can be reliably generated. Initially leveraging an advanced hybrid cloud-computing science data system for large-scale processing, we augmented machine learning approaches for automated analysis of various quality metrics. Machine learning-based training of features, cross-validation and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics and to improve the production quality of geodetic data products.
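
    The sketch below illustrates one plausible form of such a QA pipeline: labelled quality metrics for each product feed a classifier that is cross-validated before being applied at scale. The feature names, labels and classifier choice are assumptions for illustration, not the ARIA implementation.

```python
# Hedged sketch: cross-validated supervised screening of SAR QA metrics
# (all features and labels are synthetic).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 500
# toy QA metrics per interferogram: mean coherence, unwrapping error rate, noise
X = np.column_stack([
    rng.uniform(0, 1, n),        # mean coherence
    rng.uniform(0, 0.3, n),      # fraction of unwrapping errors
    rng.normal(0, 1, n),         # residual noise level
])
y = (X[:, 0] > 0.4) & (X[:, 1] < 0.15)   # synthetic "good product" labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y.astype(int), cv=5)
print("cross-validated accuracy:", scores.round(3))
```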

  18. A large-scale peer teaching programme - acceptance and benefit.

    Science.gov (United States)

    Schuetz, Elisabeth; Obirei, Barbara; Salat, Daniela; Scholz, Julia; Hann, Dagmar; Dethleffsen, Kathrin

    2017-08-01

    Involving students in the delivery of university teaching through peer-assisted learning formats is common practice. Publications on this topic exclusively focus on strictly defined situations within the curriculum and selected target groups. This study, in contrast, presents and evaluates a large-scale, structured and quality-assured peer teaching programme, which offers diverse and targeted courses throughout the preclinical part of the medical curriculum. The large-scale peer teaching programme consists of subject-specific and interdisciplinary tutorials that address all scientific, physiological and anatomic subjects of the preclinical curriculum as well as tutorials with contents exceeding the formal curriculum. In the study year 2013/14 a total of 1,420 lessons were offered as part of the programme. Paper-based evaluations were conducted over the full range of courses. Acceptance and benefit of this peer teaching programme were evaluated in a retrospective study covering the period 2012 to 2014. Usage of tutorials by students who commenced their studies in 2012/13 (n=959) was analysed from 2012 till 2014. Based on the results of 13 first assessments in the preclinical subjects anatomy, biochemistry and physiology, the students were assigned to one of five groups. These groups were compared according to participation in the tutorials. To investigate the benefit of tutorials of the peer teaching programme, the results of biochemistry re-assessments of participants and non-participants of tutorials in the years 2012 till 2014 (n=188, 172 and 204, respectively) were compared using Kolmogorov-Smirnov and Chi-square tests as well as the effect size Cohen's d. Almost 70 % of the students attended the voluntary additional programme during their preclinical studies. The students participating in the tutorials had achieved different levels of proficiency in first assessments. The acceptance of different kinds of tutorials appears to correlate with their

  19. Kindergarten Predictors of Third Grade Writing

    Science.gov (United States)

    Kim, Young-Suk; Al Otaiba, Stephanie; Wanzek, Jeanne

    2015-01-01

    The primary goal of the present study was to examine the relations of kindergarten transcription, oral language, word reading, and attention skills to writing skills in third grade. Children (N = 157) were assessed on their letter writing automaticity, spelling, oral language, word reading, and attention in kindergarten. Then, they were assessed on writing in third grade using three writing tasks – one narrative and two expository prompts. Children’s written compositions were evaluated in terms of writing quality (the extent to which ideas were developed and presented in an organized manner). Structural equation modeling showed that kindergarten oral language and lexical literacy skills (i.e., word reading and spelling) independently predicted third grade narrative writing quality, and kindergarten literacy skill uniquely predicted third grade expository writing quality. In contrast, attention and letter writing automaticity were not directly related to writing quality in either narrative or expository genre. These results are discussed in light of theoretical and practical implications. PMID:25642118

  20. Modeling the impact of large-scale energy conversion systems on global climate

    International Nuclear Information System (INIS)

    Williams, J.

    There are three energy options which could satisfy a projected energy requirement of about 30 TW and these are the solar, nuclear and (to a lesser extent) coal options. Climate models can be used to assess the impact of large-scale deployment of these options. The impact of waste heat has been assessed using energy balance models and general circulation models (GCMs). Results suggest that the impacts are significant when the heat input is very high and studies of more realistic scenarios are required. Energy balance models, radiative-convective models and a GCM have been used to study the impact of doubling the atmospheric CO2 concentration. State-of-the-art models estimate a surface temperature increase of 1.5-3.0 °C with large amplification near the poles, but much uncertainty remains. Very few model studies have been made of the impact of particles on global climate; more information on the characteristics of particle input is required. The impact of large-scale deployment of solar energy conversion systems has received little attention, but model studies suggest that large-scale changes in surface characteristics associated with such systems (surface heat balance, roughness and hydrological characteristics and ocean surface temperature) could have significant global climatic effects. (Auth.)
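
    As a worked illustration of the numbers quoted above (not taken from the cited studies), the standard logarithmic CO2 forcing expression combined with a plausible range of climate sensitivity parameters reproduces a warming of roughly 1.5-3 °C for a doubling of CO2.

```python
# Worked illustration: radiative forcing for a CO2 doubling and the resulting
# equilibrium warming for a range of climate sensitivity parameters.
import numpy as np

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W m-2) of CO2 relative to a reference concentration."""
    return 5.35 * np.log(c_ppm / c0_ppm)

delta_f = co2_forcing(560.0)                    # doubling: about 3.7 W m-2
for lam in (0.4, 0.5, 0.8):                     # sensitivity, K per (W m-2)
    print(f"lambda = {lam:.1f} K/(W m-2)  ->  dT = {lam * delta_f:.1f} K")
```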

  1. Exploring Associations among Writing Self-Perceptions, Writing Abilities, and Native Language of English-Spanish Two-Way Immersion Students

    Science.gov (United States)

    Neugebauer, Sabina R.; Howard, Elizabeth R.

    2015-01-01

    The current study, with 409 fourth graders in two-way immersion programs, explored the writing self-perceptions of native English and native Spanish speakers and the relationship between self-perceptions and writing performance. An adapted version of the Writer Self-Perception Scale (WSPS) was administered along with a writing task. Native English…

  2. Gender and Ethnic Group Differences on the GMAT Analytical Writing Assessment.

    Science.gov (United States)

    Bridgeman, Brent; McHale, Frederick

    Gender and ethnic group differences on the Analytical Writing Assessment that is part of the Graduate Management Admission Test were evaluated. Data from the first operational administration for 36,583 examinees in October 1994 were used. Standardized differences from the White male reference group were computed separately for men and women in…

  3. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results of testing the material resistance to non-ductile fracture are described. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  4. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high work-load and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright

  5. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  6. Large-area nanoimprinting on various substrates by reconfigurable maskless laser direct writing

    KAUST Repository

    Lee, Daeho

    2012-08-10

    Laser-assisted, one-step direct nanoimprinting of metal and semiconductor nanoparticles (NPs) was investigated to fabricate submicron structures including mesh, line, nanopillar and nanowire arrays. Master molds were fabricated with high-speed (200 mm s⁻¹) laser direct writing (LDW) of negative or positive photoresists on Si wafers. The fabrication was completely free of lift-off or reactive ion etching processes. Polydimethylsiloxane (PDMS) stamps fabricated from master molds replicated nanoscale structures (down to 200 nm) with no or negligible residual layers on various substrates. The low temperature and pressure used for nanoimprinting enabled direct nanofabrication on flexible substrates. With the aid of high-speed LDW, wafer-scale 4-inch direct nanoimprinting was demonstrated. © 2012 IOP Publishing Ltd.

  7. Reflective writing: the student nurse's perspective on reflective writing and poetry writing.

    Science.gov (United States)

    Coleman, Dawn; Willis, Diane S

    2015-07-01

    Reflective writing is a mandatory part of nurse education but how students develop their skills and use reflection as part of their experiential learning remains relatively unknown. Understanding reflective writing in all forms from the perspective of a student nurse is therefore important. To explore the use of reflective writing and the use of poetry in pre-registered nursing students. A qualitative design was employed to explore reflective writing in pre-registered nursing students. A small university in Scotland. BSc (Hons) Adult and Mental Health Pre-registration Student Nurses. Two focus groups were conducted with 10 student nurses during March 2012. Data was analysed thematically using the framework of McCarthy (1999). Students found the process of reflective writing daunting but valued it over time. Current educational methods, such as assessing reflective accounts, often lead to the 'narrative' being watered down and the student feeling judged. Despite this, reflection made students feel responsible for their own learning and research on the topic. Some students felt the use of models of reflection constricting, whilst poetry freed up their expression allowing them to demonstrate the compassion for their patient under their care. Poetry writing gives students the opportunity for freedom of expression, personal satisfaction and a closer connection with their patients, which the more formal approach to reflective writing did not offer. There is a need for students to have a safe and supportive forum in which to express and have their experiences acknowledged without the fear of being judged. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  8. Modeling Student Motivation and Students’ Ability Estimates From a Large-Scale Assessment of Mathematics

    Directory of Open Access Journals (Sweden)

    Carlos Zerpa

    2011-09-01

    Full Text Available When large-scale assessments (LSA) do not hold personal stakes for students, students may not put forth their best effort. Low-effort examinee behaviors (e.g., guessing, omitting items) result in an underestimate of examinee abilities, which is a concern when using results of LSA to inform educational policy and planning. The purpose of this study was to explore the relationship between examinee motivation as defined by expectancy-value theory, student effort, and examinee mathematics abilities. A principal components analysis was used to examine the data from Grade 9 students (n = 43,562) who responded to a self-report questionnaire on their attitudes and practices related to mathematics. The results suggested a two-component model where the components were interpreted as task-values in mathematics and student effort. Next, a hierarchical linear model was implemented to examine the relationship between examinee component scores and their estimated ability on a LSA. The results of this study provide evidence that motivation, as defined by the expectancy-value theory, and student effort partially explain student ability estimates and may have implications for the information that gets transferred to testing organizations, school boards, and teachers when assessing students’ Grade 9 mathematics learning.
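
    A minimal sketch of that two-step analysis pattern (components extracted from questionnaire items, then a mixed model of ability on the component scores with students grouped in schools) is given below; all data, component interpretations and the grouping variable are synthetic assumptions, not the study's dataset.

```python
# Hedged sketch: PCA on questionnaire items followed by a mixed-effects model
# of ability on the component scores, with a random intercept per school.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
n_students, n_items = 2000, 10
items = rng.normal(0.0, 1.0, (n_students, n_items))     # synthetic questionnaire responses

pca = PCA(n_components=2)
components = pca.fit_transform(items)                    # interpreted as task value, effort

df = pd.DataFrame({
    "ability": 0.3 * components[:, 0] + 0.2 * components[:, 1]
               + rng.normal(0.0, 1.0, n_students),
    "task_value": components[:, 0],
    "effort": components[:, 1],
    "school": rng.integers(0, 50, n_students),
})

model = smf.mixedlm("ability ~ task_value + effort", df, groups=df["school"])
print(model.fit().summary())
```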

  9. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  10. Writing-to-Learn: Attitudes of nursing students at Sultan Qaboos University.

    Science.gov (United States)

    Balachandran, Shreedevi; Venkatesaperumal, Ramesh; Clara, Jothi; Shukri, Raghda K

    2014-02-01

    The objectives of this study were to assess the attitude of Omani nursing students towards writing-to-learn (WTL) and its relationship to demographic variables, self-efficacy and the writing process. A cross-sectional design was used to evaluate attitudes towards WTL by Sultan Qaboos University nursing students. A convenience sample of 106 students was used and data collected between October 2009 and March 2010. A modified version of the WTL attitude scale developed by Dobie and Poirrier was used to collect the data. Descriptive and inferential statistics were used for analysis. Senior and junior students had more positive attitudes to WTL than mid-level students who tended to have negative attitudes towards writing. Although 52.8% students had negative attitudes towards the writing process, the median was higher for attitudes to the writing process compared to the median for self-efficacy. There was a positive correlation between self-efficacy and writing process scores. Overall, students had negative attitudes towards WTL. Attitudes are learnt or formed through previous experiences. The incorporation of WTL strategies into teaching can transform students' negative attitudes towards writing into positive ones.

  11. Assessing large-scale weekly cycles in meteorological variables: a review

    Directory of Open Access Journals (Sweden)

    A. Sanchez-Lorenzo

    2012-07-01

    Full Text Available Several studies have claimed to have found significant weekly cycles of meteorological variables appearing over large domains, which can hardly be related to urban effects exclusively. Nevertheless, there is still an ongoing scientific debate about whether these large-scale weekly cycles exist or not, and some other studies fail to reproduce them with statistical significance. In addition to the lack of positive proof for the existence of these cycles, their possible physical explanations have been controversially discussed during the last years. In this work we review the main results on this topic published during the past two decades, including a summary of the existence or non-existence of significant weekly weather cycles across different regions of the world, mainly over the US, Europe and Asia. In addition, some shortcomings of common statistical methods for analyzing weekly cycles are listed. Finally, a brief summary of supposed causes of the weekly cycles, focusing on aerosol-cloud-radiation interactions and their impact on meteorological variables as a result of the weekly cycles of anthropogenic activities, and possible directions for future research are presented.

  12. Free Global Dsm Assessment on Large Scale Areas Exploiting the Potentialities of the Innovative Google Earth Engine Platform

    Science.gov (United States)

    Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.

    2017-05-01

    The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In particular, in this work, the geometric accuracy of the two most used nearly-global free DSMs (SRTM and ASTER) has been evaluated on the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino Alto-Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphology, land cover and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated by computing standard statistical parameters, both at the global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. The geometric accuracy in terms of standard deviation and NMAD ranges, for SRTM, from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE to perform DSM assessment on a global scale.
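
    For reference, the two accuracy measures named above can be computed from the elevation differences between a candidate DSM and the reference DSM as in the sketch below; the differences used here are synthetic, with a few gross outliers added to show why NMAD is a robust counterpart to the standard deviation.

```python
# Hedged sketch: standard deviation and NMAD of DSM elevation differences.
import numpy as np

def dsm_accuracy(dh):
    """Standard deviation and NMAD of elevation differences dh (metres)."""
    dh = np.asarray(dh, dtype=float)
    std = dh.std(ddof=1)
    nmad = 1.4826 * np.median(np.abs(dh - np.median(dh)))   # robust to outliers
    return std, nmad

rng = np.random.default_rng(3)
dh = rng.normal(0.5, 4.0, 10_000)              # well-behaved differences
dh[:100] += rng.normal(0, 60.0, 100)           # a few gross outliers (e.g. steep slopes)
std, nmad = dsm_accuracy(dh)
print(f"std = {std:.1f} m, NMAD = {nmad:.1f} m")
```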

  13. FREE GLOBAL DSM ASSESSMENT ON LARGE SCALE AREAS EXPLOITING THE POTENTIALITIES OF THE INNOVATIVE GOOGLE EARTH ENGINE PLATFORM

    Directory of Open Access Journals (Sweden)

    A. Nascetti

    2017-05-01

    Full Text Available The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In particular, in this work, the geometric accuracy of the two most used nearly-global free DSMs (SRTM and ASTER) has been evaluated on the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino Alto-Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphology, land cover and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated by computing standard statistical parameters, both at the global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. The geometric accuracy in terms of standard deviation and NMAD ranges, for SRTM, from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE to perform DSM assessment on a global scale.

  14. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (e.g., in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  15. Studying performation: the arrangement of speech, calculation and writing acts within dispositifs : Carbon accounting for strategizing in a large corporation

    OpenAIRE

    Le Breton , Morgane; Aggeri , Franck

    2016-01-01

    International audience; This paper aims at proposing an analytical framework for the performation process, that is, performation through speech, calculation and writing acts connected within a “dispositif”. This analytical framework is put into practice in a case study of a large French corporation which has built a low-carbon strategy based on carbon accounting tools. We have found that the low-carbon strategy is performed through carbon accounting tools since speech, calculation and writing acts ar...

  16. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    International Nuclear Information System (INIS)

    Williams, Paul T.; Yin, Shengjun; Klasky, Hilda B.; Bass, Bennett Richard

    2011-01-01

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite

  17. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasp processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural, catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff, therefore, provides a foundation to approach European hydrology with respect to observed patterns on large scales, and with regard to the ability of models to capture these. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also enable the detection of shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large-scale

  18. Open TG-GATEs: a large-scale toxicogenomics database

    Science.gov (United States)

    Igarashi, Yoshinobu; Nakatsu, Noriyuki; Yamashita, Tomoya; Ono, Atsushi; Ohno, Yasuo; Urushidani, Tetsuro; Yamada, Hiroshi

    2015-01-01

    Toxicogenomics focuses on assessing the safety of compounds using gene expression profiles. Gene expression signatures from large toxicogenomics databases are expected to perform better than small databases in identifying biomarkers for the prediction and evaluation of drug safety based on a compound's toxicological mechanisms in animal target organs. Over the past 10 years, the Japanese Toxicogenomics Project consortium (TGP) has been developing a large-scale toxicogenomics database consisting of data from 170 compounds (mostly drugs) with the aim of improving and enhancing drug safety assessment. Most of the data generated by the project (e.g. gene expression, pathology, lot number) are freely available to the public via Open TG-GATEs (Toxicogenomics Project-Genomics Assisted Toxicity Evaluation System). Here, we provide a comprehensive overview of the database, including both gene expression data and metadata, with a description of experimental conditions and procedures used to generate the database. Open TG-GATEs is available from http://toxico.nibio.go.jp/english/index.html. PMID:25313160

  19. Natural language processing in an intelligent writing strategy tutoring system.

    Science.gov (United States)

    McNamara, Danielle S; Crossley, Scott A; Roscoe, Rod

    2013-06-01

    The Writing Pal is an intelligent tutoring system that provides writing strategy training. A large part of its artificial intelligence resides in the natural language processing algorithms to assess essay quality and guide feedback to students. Because writing is often highly nuanced and subjective, the development of these algorithms must consider a broad array of linguistic, rhetorical, and contextual features. This study assesses the potential for computational indices to predict human ratings of essay quality. Past studies have demonstrated that linguistic indices related to lexical diversity, word frequency, and syntactic complexity are significant predictors of human judgments of essay quality but that indices of cohesion are not. The present study extends prior work by including a larger data sample and an expanded set of indices to assess new lexical, syntactic, cohesion, rhetorical, and reading ease indices. Three models were assessed. The model reported by McNamara, Crossley, and McCarthy (Written Communication 27:57-86, 2010) including three indices of lexical diversity, word frequency, and syntactic complexity accounted for only 6% of the variance in the larger data set. A regression model including the full set of indices examined in prior studies of writing predicted 38% of the variance in human scores of essay quality with 91% adjacent accuracy (i.e., within 1 point). A regression model that also included new indices related to rhetoric and cohesion predicted 44% of the variance with 94% adjacent accuracy. The new indices increased accuracy but, more importantly, afford the means to provide more meaningful feedback in the context of a writing tutoring system.
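
    The sketch below mirrors the modelling pattern described in the record (regressing human essay scores on linguistic indices, then reporting variance explained and adjacent accuracy); the indices, weights and score range are synthetic assumptions rather than the Writing Pal's actual features or data.

```python
# Hedged sketch: predicting human essay scores from linguistic indices and
# reporting R^2 plus "adjacent accuracy" (predictions within one point).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)
n = 1000
X = rng.normal(0, 1, (n, 5))        # e.g. lexical diversity, word frequency, ...
scores = np.clip(np.round(3.5 + X @ np.array([0.5, -0.3, 0.4, 0.2, 0.1])
                          + rng.normal(0, 0.8, n)), 1, 6)

X_tr, X_te, y_tr, y_te = train_test_split(X, scores, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)

r2 = model.score(X_te, y_te)
pred = np.clip(np.round(model.predict(X_te)), 1, 6)
adjacent = np.mean(np.abs(pred - y_te) <= 1)
print(f"variance explained (R^2): {r2:.2f}, adjacent accuracy: {adjacent:.0%}")
```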

  20. Is it differences in language skills and working memory that account for girls being better at writing than boys?

    Directory of Open Access Journals (Sweden)

    Lorna Bourke

    2012-03-01

    Full Text Available Girls are more likely to outperform boys in the development of writing skills. This study considered gender differences in language and working memory skills as a possible explanation for the differential rates of progress. Sixty-seven children (31 males and 36 females; M age 57.30 months) participated. Qualitative differences in writing progress were examined using a writing assessment scale from the Early Years Foundation Stage Profile (EYFSP). Quantitative measures of writing: number of words, diversity of words, number of phrases/sentences and grammatical complexity of the phrases/sentences were also analysed. The children were also assessed on tasks measuring their language production and comprehension skills and the visuo-spatial, phonological, and central executive components of working memory. The results indicated that the boys were more likely to perform significantly less well than the girls on all measures of writing except the grammatical complexity of sentences. Initially, no significant differences were found on any of the measures of language ability. Further, no significant differences were found between the genders on the capacity and efficiency of their working memory functioning. However, hierarchical regressions revealed that the individual differences in gender and language ability, more specifically spoken language comprehension, predicted performance on the EYFSP writing scale. This finding accords well with the literature that suggests that language skills can mediate the variance in boys’ and girls’ writing ability.

  1. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  2. Environment and host as large-scale controls of ectomycorrhizal fungi.

    Science.gov (United States)

    van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I

    2018-06-06

    Explaining the large-scale diversity of soil organisms that drive biogeochemical processes, and their responses to environmental change, is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is, to our knowledge, unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.

  3. Reflective Writing for Medical Students on the Surgical Clerkship: Oxymoron or Antidote?

    Science.gov (United States)

    Liu, Geoffrey Z; Jawitz, Oliver K; Zheng, Daniel; Gusberg, Richard J; Kim, Anthony W

    2016-01-01

    Reflective writing has emerged as a solution to declining empathy during clinical training. However, the role for reflective writing has not been studied in a surgical setting. The aim of this proof-of-concept study was to assess receptivity to a reflective-writing intervention among third-year medical students on their surgical clerkship. The intervention was a 1-hour, peer-facilitated writing workshop, and the study employed a pre-post intervention design. Subjects were surveyed on their experience 4 weeks before participation in the intervention and immediately afterwards. Surveys assessed student receptivity to reflective writing as well as self-perceived empathy, writing habits, and communication behaviors using a Likert-response scale. Quantitative responses were analyzed using paired t tests and linear regression; qualitative responses were analyzed using an iterative consensus model. The setting was Yale-New Haven Hospital, a tertiary care academic center. All medical students of Yale School of Medicine rotating on their surgical clerkship during a 9-month period (74 in total) were eligible, and 25 students completed the study. The proportion of students desiring more opportunities for reflective writing increased from 32% to 64%, and the proportion receptive to a mandatory writing workshop increased from 16% to 40%; both differences were significant (p = 0.003 and p = 0.001). In all, 88% of students reported new insight as a result of the workshop, and 39% reported a more positive impression of the surgical profession after participation. Overall, the workshop was well received by students and improved student attitudes toward reflective writing and the surgical profession. Larger studies are required to validate the effect of this workshop on objective empathy measures. This study demonstrates how reflective writing can be incorporated into a presurgical curriculum.

  4. Study on sandstorm PM10 exposure assessment in the large-scale region: a case study in Inner Mongolia.

    Science.gov (United States)

    Wang, Hongmei; Lv, Shihai; Diao, Zhaoyan; Wang, Baolu; Zhang, Han; Yu, Caihong

    2018-04-12

    The current exposure-effect curves describing sandstorm PM10 exposure and its health effects are drawn roughly from the outdoor concentration (OC), which ignores the exposure levels at people's practical activity sites. The main objective of this work is to develop a novel approach to quantify human PM10 exposure using socio-categorized micro-environment activities-time weighting (SCMEATW) during strong sandstorm periods, which can be used to assess exposure profiles in a large-scale region. Types of people for SCMEATW were obtained by questionnaire investigation. Different types of representatives were tracked and recorded during a big sandstorm, and their average exposure levels were estimated by SCMEATW. Furthermore, a geographic information system (GIS) technique was used not only to simulate the outdoor concentration spatially but also to create human exposure outlines in a visualized map, which could help in understanding the risk to different types of people. Additionally, exposure-response curves describing the acute outpatient rate odds during sandstorms were formed by SCMEATW, and the differences between SCMEATW and OC were compared. Results indicated that acute outpatient rate odds were related to PM10 exposure from SCMEATW, at a level less than that of OC. Some types of people, such as herdsmen and those walking outdoors during a strong sandstorm, are at greater risk than office workers. Our findings provide more understanding of the influence of human practical activities on exposure levels; in particular, they provide a tool to understand sandstorm PM10 exposure spatially at large scale, which might help in performing risk assessments for different categories of the population regionally.
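
    The activities-time weighting described above amounts to a time-weighted average of micro-environment concentrations; the sketch below illustrates that arithmetic with placeholder concentrations and durations, which are assumptions for illustration rather than the study's measurements.

      # Time-weighted average PM10 exposure across micro-environments
      # (placeholder numbers only, not the study's measurements).
      microenvironments = {
          # name: (PM10 concentration in ug/m^3, hours spent during the sandstorm day)
          "outdoors":       (2500.0, 3.0),
          "home_indoors":   (400.0, 15.0),
          "office_indoors": (250.0, 6.0),
      }

      total_hours = sum(hours for _, hours in microenvironments.values())
      exposure = sum(conc * hours for conc, hours in microenvironments.values()) / total_hours
      print(f"time-weighted PM10 exposure: {exposure:.0f} ug/m^3")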

  5. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  6. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological...... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  7. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation with science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  8. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
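
    As a hedged, generic point of reference (not one of the algorithms evaluated in the report), the sketch below shows a global-nearest-neighbour association step that assigns detections to predicted track positions by solving a gated linear assignment problem; keeping a single hypothesis in this way avoids the combinatorial explosion of multi-hypothesis tracking at the cost of discarding alternatives.

      # Generic global-nearest-neighbour association step (illustrative baseline only):
      # assign detections to predicted track positions via gated linear assignment.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      def associate(tracks, detections, gate=5.0):
          """tracks, detections: (N, 2) / (M, 2) arrays of predicted and observed positions."""
          cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
          rows, cols = linear_sum_assignment(cost)
          # Reject pairs whose distance exceeds the gate; these become missed or new targets.
          return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]

      tracks = np.array([[0.0, 0.0], [10.0, 10.0], [20.0, 5.0]])
      detections = np.array([[0.5, -0.2], [19.4, 5.3], [40.0, 40.0]])
      print(associate(tracks, detections))   # -> [(0, 0), (2, 1)]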

  9. Composition Medium Comparability in a Direct Writing Assessment of Non-Native English Speakers

    Directory of Open Access Journals (Sweden)

    Edward W. Wolfe

    2004-01-01

    The Test of English as a Foreign Language (TOEFL) contains a direct writing assessment, and examinees are given the option of composing their responses at a computer terminal using a keyboard or composing their responses in handwriting. This study sought to determine whether performance on a direct writing assessment is comparable for examinees when given the choice to compose essays in handwriting versus word processing. We examined this relationship, controlling for English language proficiency and several demographic characteristics of examinees, using linear models. We found a weak two-way interaction between composition medium and English language proficiency, with examinees with weaker English language scores performing better on handwritten essays and examinees with better English language scores performing comparably across the two testing media. We also observed predictable differences associated with geographic region, native language, gender, and age.
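
    A minimal sketch of the kind of linear model described above, with a composition-medium by proficiency interaction, is given below; the data frame is synthetic and the variable names are placeholders, not the TOEFL data or the exact model specification used in the study.

      # Linear model with a composition-medium x proficiency interaction
      # on synthetic data (placeholders, not TOEFL data).
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n = 500
      df = pd.DataFrame({
          "medium": rng.choice(["handwritten", "word_processed"], n),
          "proficiency": rng.normal(0, 1, n),   # standardized English proficiency
      })
      interaction = np.where(df.medium == "handwritten", -0.3 * df.proficiency, 0.0)
      df["essay_score"] = 3.5 + 0.8 * df.proficiency + interaction + rng.normal(0, 0.5, n)

      fit = smf.ols("essay_score ~ C(medium) * proficiency", df).fit()
      print(fit.params)   # the interaction term captures the medium-by-proficiency effect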

  10. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, by entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests, using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  11. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large scale observations hint towards possible modifications to the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”), and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  12. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10⁵ and 10⁸ and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  13. A large-scale application of the Kalman alignment algorithm to the CMS tracker

    International Nuclear Information System (INIS)

    Widl, E; Fruehwirth, R

    2008-01-01

    The Kalman alignment algorithm has been specifically developed to cope with the demands that arise from the specifications of the CMS Tracker. The algorithmic concept is based on the Kalman filter formalism and is designed to avoid the inversion of large matrices. Most notably, the algorithm strikes a balance between conventional global and local track-based alignment algorithms, by restricting the computation of alignment parameters not only to alignable objects hit by the same track, but also to all other alignable objects that are significantly correlated. Nevertheless, this feature also comes with various trade-offs: mechanisms are needed that determine which alignable objects are significantly correlated and that keep track of these correlations. Due to the large number of alignable objects involved at each update (at least compared to local alignment algorithms), the time spent retrieving and writing alignment parameters, as well as the required user memory, becomes a significant factor. The large-scale test presented here applies the Kalman alignment algorithm to the (misaligned) CMS Tracker barrel, and demonstrates the feasibility of the algorithm in a realistic scenario. It is shown that both the computation time and the amount of required user memory are within reasonable bounds, given the available computing resources, and that the obtained results are satisfactory.
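
    The algorithm builds on the Kalman filter formalism and avoids inverting large matrices; the sketch below shows a plain Kalman measurement update (not the CMS implementation), in which only the small innovation covariance is inverted, never the full parameter covariance.

      # Plain Kalman measurement update: only the small (k x k) innovation
      # covariance S is inverted, never the full (n x n) parameter covariance C.
      import numpy as np

      def kalman_update(x, C, H, m, V):
          """x: parameter estimate (n,), C: its covariance (n, n),
             H: measurement model (k, n), m: measurement (k,), V: its covariance (k, k)."""
          r = m - H @ x                      # residual (innovation)
          S = H @ C @ H.T + V                # innovation covariance, small
          K = C @ H.T @ np.linalg.inv(S)     # gain
          return x + K @ r, C - K @ H @ C

      # Toy example: two alignment-like parameters, one scalar measurement per update.
      x, C = np.zeros(2), np.eye(2)
      H, m, V = np.array([[1.0, 0.5]]), np.array([0.3]), np.array([[0.01]])
      x, C = kalman_update(x, C, H, m, V)
      print(x)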

  14. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacture of the large-scale mother tube, with dimensions of 32 mm OD, 21 mm ID, and 2 m length, was successfully carried out using a large-scale hollow capsule; this mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) The long length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, a manufacturing process for mother tubes using large-scale hollow capsules is promising. (author)

  15. Development of medical writing in India: Past, present and future

    Science.gov (United States)

    Sharma, Suhasini

    2017-01-01

    Pharmaceutical medical writing has grown significantly in India in the last couple of decades. It includes preparing regulatory, safety, and publication documents as well as educational and communication material related to health and health-care products. Medical writing requires medical understanding, knowledge of drug development and the regulatory and safety domains, understanding of research methodologies, and awareness of relevant regulations and guidelines. It also requires the ability to analyze, interpret, and present biomedical scientific data in the required format and good writing skills. Medical writing is the fourth most commonly outsourced clinical development activity, and its global demand has steadily increased due to rising cost pressures on the pharmaceutical industry. India has the unique advantages of a large workforce of science graduates and medical professionals trained in English and lower costs, which make it a suitable destination for outsourcing medical writing services. However, the current share of India in global medical writing business is very small. This industry in India faces some real challenges, such as the lack of depth and breadth in domain expertise, inadequate technical writing skills, high attrition rates, and paucity of standardized training programs as well as quality assessment tools. Focusing our time, attention, and resources to address these challenges will help the Indian medical writing industry gain its rightful share in the global medical writing business. PMID:28194338

  16. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  17. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  18. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  19. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.

  20. Special education and large-scale assessment in the municipality of Sobral (CE)

    Directory of Open Access Journals (Sweden)

    Ana Paula Lima Barbosa Cardoso

    2012-11-01

    This article aims to discuss and analyze the participation of students with disabilities in public schools of the city of Sobral-CE in the large-scale assessments developed in that context. It is a case study with a qualitative approach, conducted within the Department of Education and in two municipal schools, those with the highest and lowest IDEB results (2009). The data collection instruments were document analysis, interviews and observation, followed by content analysis. The theoretical framework discusses large-scale evaluation in the Brazilian context in conjunction with the literature on the evaluation of teaching for students with disabilities. We describe the general educational landscape of Sobral and also present data on special education. The research results discuss two cases of large-scale evaluation that occurred in that municipality: the municipal evaluation and Prova Brasil. Regarding the first, the subjects affirm the participation of students with disabilities through a mechanism that prevents their results from affecting those of other students; they are called "children of the shore." In Prova Brasil, the subjects again reported the participation of these students in national testing, criticizing the appropriateness of that instrument to assess this particular student body and suggesting the need to develop more "relevant" ones. Finally, it appears that large-scale evaluation calls into question the process of schooling experienced by pupils with disabilities in Sobral-CE, showing the challenges and difficulties of the school inclusion proposals in that context.

  1. The Psychology of Writing Development--And Its Implications for Assessment

    Science.gov (United States)

    Camp, Heather

    2012-01-01

    This article reviews key developmental theories that have been adopted by writing development researchers over the last fifty years. It describes how researchers have translated these theories into definitions of writing development capable of influencing curricular design and interpretations of student writing and explores the implications for…

  2. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many public key infrastructures (PKIs) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face numerous challenges, particularly when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network including multi-domain PKI infrastructures.

  3. Large-Scale Assessment of Change in Student Achievement: Dutch Primary School Students' Results on Written Division in 1997 and 2004 as an Example

    Science.gov (United States)

    van den Heuvel-Panhuizen, Marja; Robitzsch, Alexander; Treffers, Adri; Koller, Olaf

    2009-01-01

    This article discusses large-scale assessment of change in student achievement and takes the study by Hickendorff, Heiser, Van Putten, and Verhelst (2009) as an example. This study compared the achievement of students in the Netherlands in 1997 and 2004 on written division problems. Based on this comparison, they claim that there is a performance…

  4. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  5. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  6. Stereotype Threat, Inquiring about Test Takers' Race and Gender, and Performance on Low-Stakes Tests in a Large-Scale Assessment. Research Report. ETS RR-15-02

    Science.gov (United States)

    Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent

    2015-01-01

    This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…

  7. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  8. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  9. Environmental implications of large-scale adoption of wind power: a scenario-based life cycle assessment

    International Nuclear Information System (INIS)

    Arvesen, Anders; Hertwich, Edgar G

    2011-01-01

    We investigate the potential environmental impacts of a large-scale adoption of wind power to meet up to 22% of the world’s growing electricity demand. The analysis builds on life cycle assessments of generic onshore and offshore wind farms, meant to represent average conditions for global deployment of wind power. We scale unit-based findings to estimate aggregated emissions of building, operating and decommissioning wind farms toward 2050, taking into account changes in the electricity mix in manufacturing. The energy scenarios investigated are the International Energy Agency’s BLUE scenarios. We estimate 1.7–2.6 Gt CO2-eq climate change, 2.1–3.2 Mt N-eq marine eutrophication, 9.2–14 Mt NMVOC photochemical oxidant formation, and 9.5–15 Mt SO2-eq terrestrial acidification impact category indicators due to global wind power in 2007–50. Assuming lifetimes 5 yr longer than reference, the total climate change indicator values are reduced by 8%. In the BLUE Map scenario, construction of new capacity contributes 64%, and repowering of existing capacity 38%, to total cumulative greenhouse gas emissions. The total emissions of wind electricity range between 4% and 14% of the direct emissions of the replaced fossil-fueled power plants. For all impact categories, the indirect emissions of displaced fossil power are larger than the total emissions caused by wind power.
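
    The scaling of unit-based findings to aggregate emissions amounts to multiplying per-kWh life cycle factors by cumulative generation; the sketch below illustrates the arithmetic with placeholder numbers chosen for illustration only, not the factors or scenario data used in the study.

      # Illustrative arithmetic: scale a per-kWh life cycle emission factor by
      # cumulative wind generation to get aggregate emissions (placeholder numbers).
      ghg_per_kwh_g = 12.0                 # assumed life cycle factor, g CO2-eq per kWh
      cumulative_generation_twh = 150_000  # assumed cumulative wind generation 2007-50, TWh

      grams = ghg_per_kwh_g * cumulative_generation_twh * 1e9      # 1 TWh = 1e9 kWh
      print(f"aggregate emissions: {grams / 1e15:.2f} Gt CO2-eq")  # 1 Gt = 1e15 g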

  10. Assessing Human Modifications to Floodplains using Large-Scale Hydrogeomorphic Floodplain Modeling

    Science.gov (United States)

    Morrison, R. R.; Scheel, K.; Nardi, F.; Annis, A.

    2017-12-01

    Human modifications to floodplains for water resource and flood management purposes have significantly transformed river-floodplain connectivity dynamics in many watersheds. Bridges, levees, reservoirs, shifts in land use, and other hydraulic engineering works have altered flow patterns and caused changes in the timing and extent of floodplain inundation processes. These hydrogeomorphic changes have likely resulted in negative impacts to aquatic habitat and ecological processes. The availability of large-scale topographic datasets at high resolution provide an opportunity for detecting anthropogenic impacts by means of geomorphic mapping. We have developed and are implementing a methodology for comparing a hydrogeomorphic floodplain mapping technique to hydraulically-modeled floodplain boundaries to estimate floodplain loss due to human activities. Our hydrogeomorphic mapping methodology assumes that river valley morphology intrinsically includes information on flood-driven erosion and depositional phenomena. We use a digital elevation model-based algorithm to identify the floodplain as the area of the fluvial corridor laying below water reference levels, which are estimated using a simplified hydrologic model. Results from our hydrogeomorphic method are compared to hydraulically-derived flood zone maps and spatial datasets of levee protected-areas to explore where water management features, such as levees, have changed floodplain dynamics and landscape features. Parameters associated with commonly used F-index functions are quantified and analyzed to better understand how floodplain areas have been reduced within a basin. Preliminary results indicate that the hydrogeomorphic floodplain model is useful for quickly delineating floodplains at large watershed scales, but further analyses are needed to understand the caveats for using the model in determining floodplain loss due to levees. We plan to continue this work by exploring the spatial dependencies of the F
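
    The core hydrogeomorphic step described above, flagging DEM cells that lie below an estimated water reference surface, can be sketched with synthetic arrays as follows; the resolution, elevations and reference level are illustrative assumptions, not the datasets or hydrologic model used in this work.

      # Flag DEM cells lying below a water reference surface (synthetic arrays).
      import numpy as np

      rng = np.random.default_rng(3)
      dem = rng.uniform(95.0, 110.0, size=(100, 100))   # synthetic terrain elevations (m)
      water_reference = np.full_like(dem, 100.0)        # simplified water level surface (m)

      floodplain_mask = dem <= water_reference          # hydrogeomorphic floodplain cells
      cell_area_m2 = 30.0 * 30.0                        # assumed 30 m grid resolution
      print(f"floodplain area: {floodplain_mask.sum() * cell_area_m2 / 1e6:.2f} km^2")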

  11. The Climate Potentials and Side-Effects of Large-Scale terrestrial CO2 Removal - Insights from Quantitative Model Assessments

    Science.gov (United States)

    Boysen, L.; Heck, V.; Lucht, W.; Gerten, D.

    2015-12-01

    Terrestrial carbon dioxide removal (tCDR) through dedicated biomass plantations is considered one climate engineering (CE) option if implemented at large scale. While the risks and costs are supposed to be small, the effectiveness depends strongly on the spatial and temporal scales of implementation. Based on simulations with a dynamic global vegetation model (LPJmL), we comprehensively assess the effectiveness, biogeochemical side-effects and tradeoffs from an earth system-analytic perspective. We analyzed systematic land-use scenarios in which all, 25%, or 10% of natural and/or agricultural areas are converted to tCDR plantations, including the assumption that biomass plantations are established once the 2°C target is crossed in a business-as-usual climate change trajectory. The resulting tCDR potentials in year 2100 include the net accumulated annual biomass harvests and changes in all land carbon pools. We find that only the most spatially excessive, and thus undesirable, scenario would be capable of restoring the 2°C target by 2100 under continuing high emissions (with a cooling of 3.02°C). Large-scale biomass plantations covering areas between 1.1 and 4.2 Gha would produce a climate reduction potential of 0.8–1.4°C. tCDR plantations at smaller scales do not build up enough biomass over the considered period, and the potential global warming reductions are substantially lowered to no more than 0.5–0.6°C. Finally, we demonstrate that the (non-economic) costs for the Earth system include negative impacts on the water cycle and on ecosystems, which are already under pressure due to both land use change and climate change. Overall, tCDR may lead to a further transgression of land- and water-related planetary boundaries while not being able to set back the crossing of the planetary boundary for climate change. tCDR could still be considered in the near-future mitigation portfolio if implemented on small scales on wisely chosen areas.

  12. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo......
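
    The underlying computation is simply the distribution of width/height proportions; a toy sketch on invented image dimensions (not the 11 databases analyzed in the study) is shown below.

      # Width/height proportions of a toy set of image dimensions.
      import numpy as np

      dims = np.array([(1200, 900), (800, 1000), (1600, 1200), (500, 500), (1414, 1000)])
      proportions = dims[:, 0] / dims[:, 1]   # width / height

      print("median proportion:", np.median(proportions))
      print("share exactly square:", np.mean(np.isclose(proportions, 1.0)))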

  13. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32 (DTIC): The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. Only fragments of the abstract are recoverable; they concern reducing the spread in the LSGT 50% gap value and the worst charges, such as those with the highest or lowest densities and the largest re-pressed...

  14. Evaluation of creep-fatigue crack growth for large-scale FBR reactor vessel and NDE assessment

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Young Sang; Kim, Jong Bum; Kim, Seok Hun; Yoo, Bong

    2001-03-01

    Creep-fatigue crack growth contributes to the failure of FBR reactor vessels under high temperature conditions. In the design stage of the reactor vessel, crack growth evaluation is very important to ensure structural safety and to set up the in-service inspection strategy. In this study, creep-fatigue crack growth evaluation has been performed for semi-elliptical surface cracks subjected to thermal loading. The thermal stress analysis of a large-scale FBR reactor vessel has been carried out for the load conditions, and the distributions of axial, radial, hoop, and Von Mises stresses were obtained. At the maximum points of the axial and hoop stresses, longitudinal and circumferential surface cracks (i.e. a PTS crack, an NDE short crack and a shallow long crack) were postulated. Using the maximum and minimum values of the stresses, the creep-fatigue crack growth of the postulated cracks was simulated. The crack growth rate of circumferential cracks is greater than that of longitudinal cracks. The total crack growth of the largest PTS crack is very small after 427 cycles, so the structural integrity of a large-scale reactor vessel can be maintained for the plant life. The crack depth growth of the shallow long crack is faster than that of the NDE short crack. In the ISI of the large-scale FBR reactor vessel, ultrasonic inspection is beneficial for detecting shallow circumferential cracks.

  15. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  16. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters of a scale of 1000s of processors, to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  17. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Directory of Open Access Journals (Sweden)

    Xiangyun Xiao

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from the Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), the experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.

  18. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Science.gov (United States)

    Xiao, Xiangyun; Zhang, Wei; Zou, Xiufen

    2015-01-01

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.
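
    A hedged sketch of the splitting-plus-parallel idea follows: genes are partitioned into modules and each module's linear ODE coefficients are estimated from time-series finite differences, with modules dispatched asynchronously to worker processes. The linear model, module assignment and synthetic data are simplifying assumptions for illustration, not the authors' algorithm or datasets.

      # Split genes into modules and fit each module's linear ODE coefficients
      # (dx/dt ~ A x) from finite differences, dispatching modules asynchronously.
      import numpy as np
      from multiprocessing import Pool

      def fit_module(args):
          genes, expr, dt = args                     # expr: (timepoints, all_genes)
          dxdt = np.diff(expr[:, genes], axis=0) / dt
          X = expr[:-1, genes]
          # Least-squares estimate of the module's small interaction matrix
          A, *_ = np.linalg.lstsq(X, dxdt, rcond=None)
          return genes, A.T

      if __name__ == "__main__":
          rng = np.random.default_rng(4)
          expr = rng.random((50, 30))                 # synthetic expression: 50 timepoints, 30 genes
          modules = np.array_split(np.arange(30), 5)  # modularity assumption: 5 subnetworks
          with Pool(4) as pool:
              jobs = [pool.apply_async(fit_module, ((list(m), expr, 1.0),)) for m in modules]
              fitted = [j.get() for j in jobs]        # asynchronous collection of module fits
          print([A.shape for _, A in fitted])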

  19. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  20. Ultra-large scale AFM of lipid droplet arrays: investigating the ink transfer volume in dip pen nanolithography

    International Nuclear Information System (INIS)

    Förste, Alexander; Pfirrmann, Marco; Sachs, Johannes; Gröger, Roland; Walheim, Stefan; Brinkmann, Falko; Hirtz, Michael; Fuchs, Harald; Schimmel, Thomas

    2015-01-01

    There are only a few quantitative studies commenting on the writing process in dip-pen nanolithography with lipids. Lipids are important carrier ink molecules for the delivery of bio-functional patterns in bio-nanotechnology. In order to better understand and control the writing process, more information on the transfer of lipid material from the tip to the substrate is needed. The dependence of the transferred ink volume on the dwell time of the tip on the substrate was investigated by topography measurements with an atomic force microscope (AFM) that is characterized by an ultra-large scan range of 800 × 800 μm². For this purpose, arrays of dots of the phospholipid 1,2-dioleoyl-sn-glycero-3-phosphocholine were written onto planar glass substrates and the resulting pattern was imaged by large scan area AFM. Two writing regimes were identified, characterized by either a steady decline or a constant ink volume transfer per dot feature. For the steady-state ink transfer, a linear relationship between the dwell time and the dot volume was determined, characterized by a flow rate of about 16 femtoliters per second. A dependence of the ink transport on the length of pauses before and in between writing the structures was observed and should be taken into account during pattern design when aiming at the best writing homogeneity. The ultra-large scan range of the utilized AFM allowed for a simultaneous study of the entire preparation area of almost 1 mm², yielding good statistical results. (paper)

  1. Ultra-large scale AFM of lipid droplet arrays: investigating the ink transfer volume in dip pen nanolithography

    Science.gov (United States)

    Förste, Alexander; Pfirrmann, Marco; Sachs, Johannes; Gröger, Roland; Walheim, Stefan; Brinkmann, Falko; Hirtz, Michael; Fuchs, Harald; Schimmel, Thomas

    2015-05-01

    There are only a few quantitative studies commenting on the writing process in dip-pen nanolithography with lipids. Lipids are important carrier ink molecules for the delivery of bio-functional patterns in bio-nanotechnology. In order to better understand and control the writing process, more information on the transfer of lipid material from the tip to the substrate is needed. The dependence of the transferred ink volume on the dwell time of the tip on the substrate was investigated by topography measurements with an atomic force microscope (AFM) that is characterized by an ultra-large scan range of 800 × 800 μm². For this purpose, arrays of dots of the phospholipid 1,2-dioleoyl-sn-glycero-3-phosphocholine were written onto planar glass substrates and the resulting pattern was imaged by large scan area AFM. Two writing regimes were identified, characterized by either a steady decline or a constant ink volume transfer per dot feature. For the steady-state ink transfer, a linear relationship between the dwell time and the dot volume was determined, characterized by a flow rate of about 16 femtoliters per second. A dependence of the ink transport on the length of pauses before and in between writing the structures was observed and should be taken into account during pattern design when aiming at the best writing homogeneity. The ultra-large scan range of the utilized AFM allowed for a simultaneous study of the entire preparation area of almost 1 mm², yielding good statistical results.
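
    The reported linear relationship between dwell time and dot volume can be recovered with an ordinary least-squares fit; the sketch below uses invented data points placed near the reported ~16 fL/s scale, not the measured values.

      # Linear fit of dot volume vs dwell time in the steady-state writing regime
      # (invented data points, not the measured values).
      import numpy as np

      dwell_time_s = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
      dot_volume_fl = np.array([9.0, 17.0, 33.0, 65.0, 129.0])   # femtoliters (synthetic)

      flow_rate, offset = np.polyfit(dwell_time_s, dot_volume_fl, 1)   # V = q*t + V0
      print(f"ink flow rate ~ {flow_rate:.1f} fL/s, offset {offset:.1f} fL")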

  2. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  3. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  4. Laser direct writing using submicron-diameter fibers.

    Science.gov (United States)

    Tian, Feng; Yang, Guoguang; Bai, Jian; Xu, Jianfeng; Hou, Changlun; Liang, Yiyong; Wang, Kaiwei

    2009-10-26

    In this paper, a novel direct writing technique using submicron-diameter fibers is presented. The submicron-diameter fiber probe serves as a tightly confined point source, and it adopts a micro-touch mode in the writing process. The energy distribution of the direct-writing model is analyzed by the three-dimensional finite-difference time-domain (FDTD) method. Experiments demonstrate that submicron-diameter fiber direct writing has several advantages: a simple process, 350 nm resolution (below the 442 nm writing wavelength), a large writing area, and controllable line width. In addition, by altering the writing direction of lines, complex submicron patterns can be fabricated.

  5. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in the last years as a new tool to improve the traditional, stationary based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activities. We investigate for individual sites the exceedance probability in which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
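
    The dimensionality-reduction and clustering step described above can be sketched as follows on synthetic "moisture flux" fields; standard (unsupervised) kernel PCA stands in here for the supervised variant used by the authors, and all data and parameter choices are placeholders.

      # Kernel PCA of flattened pre-flood flux fields followed by clustering
      # (synthetic data; unsupervised kernel PCA used for illustration).
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)
      fields = rng.normal(size=(120, 400))       # 120 flood events x flattened flux field

      kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3)
      low_dim = kpca.fit_transform(fields)       # low-dimensional representation

      clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(low_dim)
      print(np.bincount(clusters))               # events per circulation cluster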

  6. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    Full Text Available The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.

  7. Status of large scale wind turbine technology development abroad?

    Institute of Scientific and Technical Information of China (English)

    Ye LI; Lei DUAN

    2016-01-01

    To facilitate large scale (multi-megawatt) wind turbine development in China, the foreign efforts and achievements in the area are reviewed and summarized. Not only the popular horizontal-axis wind turbines on land but also offshore wind turbines, vertical-axis wind turbines, airborne wind turbines, and shrouded wind turbines are discussed. The purpose of this review is to provide a comprehensive commentary and assessment of the basic working principles, economic aspects, and environmental impacts of these turbines.

  8. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  9. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generated by kerosene pool fires, probably because the fire module of the code is a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  10. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  11. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  12. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  13. Large-scale Estimates of Leaf Area Index from Active Remote Sensing Laser Altimetry

    Science.gov (United States)

    Hopkinson, C.; Mahoney, C.

    2016-12-01

    Leaf area index (LAI) is a key parameter that describes the spatial distribution of foliage within forest canopies, which in turn controls numerous relationships between the ground, canopy, and atmosphere. LAI has been retrieved successfully from in-situ (digital) hemispherical photography (DHP) and airborne laser scanning (ALS) data; however, field and ALS acquisitions are often spatially limited (100's of km2) and costly. Large-scale (>1000's of km2) retrievals have been demonstrated with optical sensors; however, accuracies remain uncertain due to the sensors' inability to penetrate the canopy. The spaceborne Geoscience Laser Altimeter System (GLAS) offers a possible solution, providing large-scale retrievals while penetrating the canopy. LAI retrieved by multiple DHP at 6 Australian sites, representing a cross-section of Australian ecosystems, was employed to model ALS LAI, which in turn was used to infer LAI from GLAS data at 5 other sites. An optimally filtered GLAS dataset was then employed in conjunction with a host of supplementary data to build a Random Forest (RF) model to infer predictions (and uncertainties) of LAI at 250 m resolution across the forested regions of Australia. Predictions were validated against ALS-based LAI from 20 sites (R2=0.64, RMSE=1.1 m2m-2); MODIS-based LAI was also assessed against these sites (R2=0.30, RMSE=1.78 m2m-2) to demonstrate the strength of the GLAS-based predictions. The large-scale nature of the predictions was also leveraged to demonstrate relationships of LAI with other environmental characteristics, such as canopy height, elevation, and slope. Such wide-scale quantification of LAI is key to assessing and modifying forest management strategies across Australia. The work also assists Australia's Terrestrial Ecosystem Research Network in fulfilling its government-issued mandates.
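
    A Random Forest model relating GLAS-derived and ancillary predictors to LAI, validated with R2 and RMSE, is the core of the workflow above. The sketch below shows that general pattern with scikit-learn on synthetic data; predictor names and values are assumptions for illustration, not the study's inputs.

```python
# Sketch: Random Forest regression of LAI from illustrative predictors,
# evaluated with R^2 and RMSE as in the abstract. Synthetic data only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score, mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 6))   # e.g. canopy height, elevation, slope, ... (assumed)
y = 2.0 + 0.8 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(scale=0.5, size=5000)  # pseudo-LAI

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

pred = rf.predict(X_test)
print("R2  :", round(r2_score(y_test, pred), 2))
print("RMSE:", round(mean_squared_error(y_test, pred) ** 0.5, 2), "m2 m-2")
```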

  14. A Bayesian computational model for online character recognition and disability assessment during cursive eye writing

    Directory of Open Access Journals (Sweden)

    Julien eDiard

    2013-11-01

    Full Text Available This research involves a novel apparatus, in which the user is presented with an illusion-inducing visual stimulus. The user perceives illusory movement that can be followed by the eye, so that smooth pursuit eye movements can be sustained in arbitrary directions. Thus, free-flow trajectories of any shape can be traced. In other words, coupled with an eye-tracking device, this apparatus enables "eye writing", which appears to be an original object of study. We adapt a previous model of reading and writing to this context. We describe a probabilistic model called the Bayesian Action-Perception for Eye On-Line model (BAP-EOL). It encodes probabilistic knowledge about isolated letter trajectories, their size, high-frequency components of the produced trajectory, and pupil diameter. We show how Bayesian inference, in this single model, can be used to solve several tasks, like letter recognition and novelty detection (i.e., recognizing when a presented character is not part of the learned database). We are interested in the potential use of the eye writing apparatus by motor impaired patients: the final task we solve by Bayesian inference is disability assessment (i.e., measuring and tracking the evolution of motor characteristics of produced trajectories). Preliminary experimental results are presented, which illustrate the method, showing the feasibility of character recognition in the context of eye writing. We then show experimentally how a model of the unknown character can be used to detect trajectories that are likely to be new symbols, and how disability assessment can be performed by opportunistically observing characteristics of fine motor control, as letters are being traced. Experimental analyses also help identify specificities of eye writing, as compared to handwriting, and the resulting technical challenges.
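
    The BAP-EOL model performs letter recognition by Bayesian inference over trajectory features. As a loose illustration of that idea (and not the authors' model), here is a toy Gaussian classifier with a uniform prior over three letters; the features and their per-letter parameters are hypothetical.

```python
# Sketch: toy Bayesian letter recognition from trajectory features.
# Hypothetical features (e.g. trajectory size, high-frequency energy) and
# per-letter Gaussian parameters, assumed learned beforehand.
import numpy as np

letters = ["a", "b", "c"]
params = {
    "a": (np.array([1.0, 0.2]), np.array([0.2, 0.05])),
    "b": (np.array([1.5, 0.4]), np.array([0.3, 0.10])),
    "c": (np.array([0.8, 0.1]), np.array([0.2, 0.05])),
}

def log_gaussian(x, mu, sigma):
    # Sum of independent 1-D Gaussian log-densities over the feature vector.
    return np.sum(-0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi)))

def posterior(features):
    """P(letter | features) under a uniform prior over letters."""
    log_post = np.array([log_gaussian(features, *params[l]) for l in letters])
    log_post -= log_post.max()              # numerical stability
    p = np.exp(log_post)
    return dict(zip(letters, p / p.sum()))

print(posterior(np.array([0.95, 0.18])))    # most posterior mass on "a"
# Novelty detection could flag trajectories whose best per-letter
# likelihood falls below a threshold learned from the training letters.
```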

  15. An Analysis on Effects of Story Mapping in Writing Short Stories in EFL Classes, Iraqi Case

    Directory of Open Access Journals (Sweden)

    Emine Bala

    2017-06-01

    Full Text Available This study investigates how much story map graphic organizers contribute to fostering the writing of short stories. Eighteen EFL students from the foundation year were randomly chosen and given eight writing courses. First, the writing teacher provided a topic to the students for each course and asked them to write three short stories about the given topics. In the following two lessons, the instructor introduced graphic organizers and taught the elements of a short story to the students. Later, they were given another three topics for the following three courses to create short stories using story map graphic organizers created by the writing teacher. Then, the researcher randomly selected two of their first and second pieces and developed a scale to assess the students' first and second products. The results were classified by story elements and presented as percentages in two tables.

  16. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  17. Improving patient outcomes through registered dietitian order writing.

    Science.gov (United States)

    Roberts, Susan R

    2013-10-01

    Traditionally, registered dietitians (RDs) have not had order writing privileges in most patient-care facilities and rely on physicians to implement their recommendations. Research has demonstrated that this model results in a high percentage of RD recommendations not being ordered. Timely nutrition interventions are important due to the prevalence of malnutrition in the hospital setting, and when RD recommendations are implemented, important outcomes improve. In addition, several studies have demonstrated that when RDs have order writing privileges, which gives greater assurance that interventions will occur and will be timely, outcomes improve: better nutrition status, better management of electrolytes and glycemic control, reaching goal calories sooner, less inappropriate parenteral nutrition use, cost savings, and fewer errors with electronic order entry. The process for implementation and the outcomes of an RD order writing program at one large, urban, tertiary medical center are described. The program has been successful, but the implementation process required multiple years and ongoing monitoring through data collection to ensure success. RDs interested in order writing privileges must consider federal and state regulations, their individual scope of practice (relevant training and competency assessment), and how to obtain approval from the appropriate hospital governing committees. RDs who obtain order writing privileges must understand that "with privilege comes responsibility" and should plan to conduct outcomes research to promote the value and acceptance of RD order writing by regulatory agencies at all levels and by hospital leaders such as physicians and administrators.

  18. Google Street View as an alternative method to car surveys in large-scale vegetation assessments.

    Science.gov (United States)

    Deus, Ernesto; Silva, Joaquim S; Catry, Filipe X; Rocha, Miguel; Moreira, Francisco

    2015-10-01

    Car surveys (CS) are a common method for assessing the distribution of alien invasive plants. Google Street View (GSV), a free-access web technology where users may experience a virtual travel along roads, has been suggested as a cost-effective alternative to car surveys. We tested if we could replicate the results from a countrywide survey conducted by car in Portugal using GSV as a remote sensing tool, aiming at assessing the distribution of Eucalyptus globulus Labill. wildlings on roadsides adjacent to eucalypt stands. Georeferenced points gathered along CS were used to create road transects visible as lines overlapping the road in GSV environment, allowing surveying the same sampling areas using both methods. This paper presents the results of the comparison between the two methods. Both methods produced similar models of plant abundance, selecting the same explanatory variables, in the same hierarchical order of importance and depicting a similar influence on plant abundance. Even though the GSV model had a lower performance and the GSV survey detected fewer plants, additional variables collected exclusively with GSV improved model performance and provided a new insight into additional factors influencing plant abundance. The survey using GSV required ca. 9 % of the funds and 62 % of the time needed to accomplish the CS. We conclude that GSV may be a cost-effective alternative to CS. We discuss some advantages and limitations of GSV as a survey method. We forecast that GSV may become a widespread tool in road ecology, particularly in large-scale vegetation assessments.

  19. The Effect of Portfolio Assessment on Learning Idioms in Writing

    Directory of Open Access Journals (Sweden)

    Abdorreza Tahriri

    2014-04-01

    Full Text Available The present study sought to investigate the effect of portfolio assessment on the idiom competence of Iranian EFL learners. For the purpose of this study, 30 students at the upper-intermediate level of English proficiency took part, chosen through convenience sampling from a language institute in Rasht, Iran. They were randomly divided into experimental and control groups. A TOEFL test and a test of idioms were given to the students to ensure their homogeneity in terms of language proficiency and knowledge of idioms, respectively. The experimental group was asked to create a portfolio and put their writing samples, in which idioms were used, into it. They were involved in a process of self- and peer-assessment, and the teacher also provided them with feedback and comments. The control group, however, received a traditional kind of instruction: they used the idioms in their writing without receiving any comments and delivered it to their teacher to be scored. The treatment lasted for 10 sessions and a post-test was administered at the end. Independent samples t-tests were used to analyze the data gathered from the pretests and the posttest. The findings indicated that there was a statistically significant difference between the two groups in terms of idiom knowledge, and the portfolio was found to improve students' knowledge of idioms. The results of this study have some implications for teaching and learning idioms.

  20. An Ecological Approach to Understanding Assessment for Learning in Support of Student Writing Achievement

    Directory of Open Access Journals (Sweden)

    Bronwen Cowie

    2018-02-01

    Full Text Available In this paper, we report on a project conducted in a New Zealand primary school that aimed to enhance the writing achievement of primary school boys who were achieving just below the national standard for their age or level through the use of peer feedback and information and communication technologies (ICTs). The project involved a teacher collaborative inquiry approach where all seven teachers in the school and the school principal participated to achieve the project aim. We adopt an ecological approach as a lens to offer a holistic and comprehensive view of how peer assessment and use of ICTs can be facilitated to improve writing achievement. Data were collected through teacher interviews and written reflections of practice and student learning, teacher analysis of student work, team meeting notes, classroom observations, and student focus group interviews. Findings from the thematic analysis of textual data illustrate the potential of adopting an ecological approach to consider how teacher classroom practices are shaped by the school, community, and wider policy context. At the classroom level, our ecological analysis highlighted a productive synergy between commonplace writing pedagogy strategies and assessment for learning (AfL) practices as part of teacher orchestration of an ensemble of interdependent routines, tools, and activities. Diversity, redundancy, and local adaptations of resources to provide multiple pathways and opportunities—social and material and digital—emerged as important in fostering peer assessment and ICT use in support of writing achievement. Importantly, these practices were made explicit and taken up across the school and in the parent community because of whole staff involvement in the project. The wider policy context allowed for and supported teachers developing more effective pedagogy to impact student learning outcomes. We propose that an ecological orientation offers the field a productive insight into the

  1. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has yet to be clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before a stable state is reached in which Heaps' law still holds while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early stage of a pandemic disease.
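
    To make the two scalings concrete, the sketch below computes an empirical rank-frequency (Zipf) table and a vocabulary-growth (Heaps) curve from a short token sequence; plain word tokens stand in here for the epidemic data of the study, purely as an assumed illustration.

```python
# Sketch: empirical Zipf (rank-frequency) and Heaps (growth of distinct
# items) curves from a token sequence. Word tokens used as placeholder data.
from collections import Counter

tokens = ("to be or not to be that is the question "
          "whether tis nobler in the mind to suffer").split()

# Zipf: frequency versus rank.
freqs = sorted(Counter(tokens).values(), reverse=True)
for rank, f in enumerate(freqs[:5], start=1):
    print(f"rank {rank}: frequency {f}")

# Heaps: number of distinct items seen after n tokens.
seen, growth = set(), []
for tok in tokens:
    seen.add(tok)
    growth.append(len(seen))
print("Heaps curve:", growth)
```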

  2. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  3. Large Scale Laser Two-Photon Polymerization Structuring for Fabrication of Artificial Polymeric Scaffolds for Regenerative Medicine

    International Nuclear Information System (INIS)

    Malinauskas, M.; Purlys, V.; Zukauskas, A.; Rutkauskas, M.; Danilevicius, P.; Paipulas, D.; Bickauskaite, G.; Gadonas, R.; Piskarskas, A.; Bukelskis, L.; Baltriukiene, D.; Bukelskiene, V.; Sirmenis, R.; Gaidukeviciute, A.; Sirvydis, V.

    2010-01-01

    We present a femtosecond Laser Two-Photon Polymerization (LTPP) system for large scale three-dimensional structuring for applications in tissue engineering. The direct laser writing system enables fabrication of artificial polymeric scaffolds over a large area (up to cm in lateral size) with sub-micrometer resolution, which could find practical applications in biomedicine and surgery. An Yb:KGW femtosecond laser oscillator (Pharos, Light Conversion Co. Ltd.) is used as the irradiation source (75 fs, 515 nm (frequency doubled), 80 MHz). The sample is mounted on wide-range linear motor driven stages with 10 nm sample positioning resolution (XY--ALS130-100, Z--ALS130-50, Aerotech, Inc.). These stages guarantee an overall travelling range of 100 mm in the X and Y directions and 50 mm in the Z direction and support linear scanning speeds up to 300 mm/s. By moving the sample three-dimensionally, the position of the laser focus in the photopolymer is changed and one is able to write complex 3D (three-dimensional) structures. An illumination system and CMOS camera enable online process monitoring. Control of all equipment is automated via custom-made computer software ''3D-Poli'' specially designed for LTPP applications. Structures can be imported from computer aided design STereoLithography (stl) files or programmed directly. The system can be used for rapid LTPP structuring in various photopolymers (SZ2080, AKRE19, PEG-DA-258) which are known to be suitable for bio-applications. Microstructured scaffolds can be produced on different substrates such as glass, plastic and metal. In this paper, we present polymeric scaffolds microfabricated over a large area and the growth of adult rabbit myogenic stem cells on them. The results obtained show the polymeric scaffolds to be applicable for cell growth. They exhibit potential for use as an artificial pericardium in an experimental model in the future.

  4. Large Scale Laser Two-Photon Polymerization Structuring for Fabrication of Artificial Polymeric Scaffolds for Regenerative Medicine

    Science.gov (United States)

    Malinauskas, M.; Purlys, V.; Žukauskas, A.; Rutkauskas, M.; Danilevičius, P.; Paipulas, D.; Bičkauskaitė, G.; Bukelskis, L.; Baltriukienė, D.; Širmenis, R.; Gaidukevičiutė, A.; Bukelskienė, V.; Gadonas, R.; Sirvydis, V.; Piskarskas, A.

    2010-11-01

    We present a femtosecond Laser Two-Photon Polymerization (LTPP) system for large scale three-dimensional structuring for applications in tissue engineering. The direct laser writing system enables fabrication of artificial polymeric scaffolds over a large area (up to cm in lateral size) with sub-micrometer resolution, which could find practical applications in biomedicine and surgery. An Yb:KGW femtosecond laser oscillator (Pharos, Light Conversion Co. Ltd.) is used as the irradiation source (75 fs, 515 nm (frequency doubled), 80 MHz). The sample is mounted on wide-range linear motor driven stages with 10 nm sample positioning resolution (XY—ALS130-100, Z—ALS130-50, Aerotech, Inc.). These stages guarantee an overall travelling range of 100 mm in the X and Y directions and 50 mm in the Z direction and support linear scanning speeds up to 300 mm/s. By moving the sample three-dimensionally, the position of the laser focus in the photopolymer is changed and one is able to write complex 3D (three-dimensional) structures. An illumination system and CMOS camera enable online process monitoring. Control of all equipment is automated via custom-made computer software "3D-Poli" specially designed for LTPP applications. Structures can be imported from computer aided design STereoLithography (stl) files or programmed directly. The system can be used for rapid LTPP structuring in various photopolymers (SZ2080, AKRE19, PEG-DA-258) which are known to be suitable for bio-applications. Microstructured scaffolds can be produced on different substrates such as glass, plastic and metal. In this paper, we present polymeric scaffolds microfabricated over a large area and the growth of adult rabbit myogenic stem cells on them. The results obtained show the polymeric scaffolds to be applicable for cell growth. They exhibit potential for use as an artificial pericardium in an experimental model in the future.

  5. Large scale particle simulations in a virtual memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Million, R.; Wagner, J.S.; Tajima, T.

    1983-01-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence reduce the required computing time. (orig.)
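
    The key idea above is to sort particles so that neighbors in space are also neighbors in memory before the memory-intensive charge-accumulation step. A hedged NumPy sketch of that pattern follows; the array sizes and the 1-D geometry are assumptions for illustration, not the paper's setup.

```python
# Sketch: keep physically adjacent particles adjacent in memory by sorting
# on cell index before charge accumulation, reducing random access to
# (slow) virtual memory.
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_cells = 1_000_000, 4096
x = rng.uniform(0.0, 1.0, n_particles)          # particle positions in [0, 1)

cell = (x * n_cells).astype(np.int64)           # cell index of each particle
order = np.argsort(cell, kind="stable")         # a nominal amount of sorting
x, cell = x[order], cell[order]

# Charge accumulation now walks memory nearly sequentially.
charge = np.zeros(n_cells)
np.add.at(charge, cell, 1.0)                    # unit-charge deposition
print(charge[:5])
```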

  6. Improving Undergraduates' Argumentative Group Essay Writing through Self-Assessment

    Science.gov (United States)

    Fung, Yong Mei; Mei, Hooi Chee

    2015-01-01

    When writing an argumentative essay, writers develop and evaluate arguments to embody, initiate, or simulate various kinds of interpersonal and textual interaction for reader consideration (Wu & Allison, 2003). This is quite challenging for English as a second language (ESL) learners. To improve the quality of their writing, students need to…

  7. Rubrics: Heuristics for Developing Writing Strategies

    Science.gov (United States)

    De La Paz, Susan

    2009-01-01

    Rubrics are an integral part of many writing programs, and they represent elements of good writing in essays, stories, poems, as well as other genres and forms of text. Although it is possible to use rubrics to teach students about the processes underlying effective writing, a more common practice is to use rubrics as a means of assessment, after…

  8. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Science.gov (United States)

    Dong, Xianlei; Bollen, Johan

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.
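
    As a rough illustration of relating online search behavior to a confidence series, the sketch below fits a linear regression of a synthetic consumer-confidence index on a basket of synthetic monthly search volumes; the study's actual C3I construction is more involved, and all names here are placeholders.

```python
# Sketch: linear regression of a confidence series on monthly search
# volumes for a basket of query terms. Synthetic data only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
months, n_terms = 120, 8
search_volume = rng.normal(size=(months, n_terms))          # Google Trends-like inputs
confidence = (100
              + search_volume @ rng.normal(size=n_terms)    # assumed linear signal
              + rng.normal(scale=2, size=months))           # noise

model = LinearRegression().fit(search_volume, confidence)
print("in-sample R^2:", round(model.score(search_volume, confidence), 2))
```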

  9. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Directory of Open Access Journals (Sweden)

    Xianlei Dong

    Full Text Available Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.

  10. The Implementation of Self-Assessment in Writing Class: A Case Study at STBA LIA Jakarta

    Science.gov (United States)

    Purwanti, Theresia Tuti

    2015-01-01

    Self-assessment has become a means of realizing the goals of learner-centered education. It is conducted to help students grow to be independent learners. With regard to this point, this case study is aimed at investigating the implementation of the self-assessment as a learning tool in writing class. Its purpose is to examine students' reactions…

  11. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  12. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with planning such experiments with respect to their limitations and the requirements for a good transfer of the results obtained to an actual vessel. At the same time, the possibilities of small-scale model experiments are also analysed, mostly in connection with transferring results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  13. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  14. Large-scale dynamic compaction of natural salt

    International Nuclear Information System (INIS)

    Hansen, F.D.; Ahrens, E.H.

    1996-01-01

    A large-scale dynamic compaction demonstration of natural salt was successfully completed. About 40 m³ of salt were compacted in three 2-m lifts by dropping a 9,000-kg weight from a height of 15 m in a systematic pattern to achieve the desired compaction energy. To enhance compaction, 1 wt% water was added to the relatively dry mine-run salt. The average compacted mass fractional density was 0.90 of natural intact salt, and in situ nitrogen permeabilities averaged 9×10⁻¹⁴ m². This established the viability of dynamic compaction for placing salt shaft seal components. The demonstration also provided compacted salt parameters needed for shaft seal system design and performance assessments of the Waste Isolation Pilot Plant.
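
    For orientation, the per-drop energy implied by the figures quoted above (a 9,000-kg weight dropped 15 m) can be worked out directly; the sketch below is just that arithmetic and assumes standard gravity.

```python
# Sketch: per-drop compaction energy from the weight and drop height quoted
# in the record; total delivered energy then scales with the drop pattern.
m, g, h = 9000.0, 9.81, 15.0        # kg, m/s^2 (standard gravity, assumed), m
energy_per_drop = m * g * h         # joules
print(f"{energy_per_drop / 1e6:.2f} MJ per drop")   # ~1.32 MJ
```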

  15. Assessing the Overall Value of an Online Writing Community

    Science.gov (United States)

    Mohapatra, Sanjay; Mohanty, Sukriti

    2017-01-01

    The advent of the internet has brought changes to many existing stable business models. With the technological shift, the concept of community writing has undergone several changes. Using a sample of 181 participants, it was found that online communities have, of late, been greatly impacted by technology. Community writing involves amalgamation of…

  16. Technology-enhanced writing therapy for people with aphasia: results of a quasi-randomized waitlist controlled study.

    Science.gov (United States)

    Marshall, Jane; Caute, Anna; Chadd, Katie; Cruice, Madeline; Monnelly, Katie; Wilson, Stephanie; Woolf, Celia

    2018-05-10

    Acquired writing impairment, or dysgraphia, is common in aphasia. It affects both handwriting and typing, and may recover less well than other aphasic symptoms. Dysgraphia is an increasing priority for intervention, particularly for those wishing to participate in online written communication. Effective dysgraphia treatment studies have been reported, but many did not target, or did not achieve, improvements in functional writing. Functional outcomes might be promoted by therapies that exploit digital technologies, such as voice recognition and word prediction software. This study evaluated the benefits of technology-enhanced writing therapy for people with acquired dysgraphia. It aimed to explore the impact of therapy on a functional writing activity, and to examine whether treatment remediated or compensated for the writing impairment. The primary question was: Does therapy improve performance on a functional assessment of writing; and, if so, do gains occur only when writing is assisted by technology? Secondary measures examined whether therapy improved unassisted written naming, functional communication, mood and quality of life. The study employed a quasi-randomized waitlist controlled design. A total of 21 people with dysgraphia received 12 h of writing therapy either immediately or after a 6-week delay. The primary outcome measure was a functional assessment of writing, which was administered in handwriting and on a computer with assistive technology enabled. Secondary measures were: The Boston Naming Test (written version), Communication Activities of Daily Living-2, Visual Analogue Mood Scales (Sad question), and the Assessment of Living with Aphasia. Analyses of variance (ANOVA) were used to examine change on the outcome measures over two time points, between which the immediate group had received therapy but the delayed group had not. Pre-therapy, post-therapy and follow-up scores on the measures were also examined for all participants. Time × group

  17. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for wave heights below Hs = 0.5 m as the water sank into the voids between the stones on the crest. For low overtopping scale effects

  18. A Heuristic Approach to Author Name Disambiguation in Bibliometrics Databases for Large-scale Research Assessments

    NARCIS (Netherlands)

    D'Angelo, C.A.; Giuffrida, C.; Abramo, G.

    2011-01-01

    National exercises for the evaluation of research activity by universities are becoming regular practice in ever more countries. These exercises have mainly been conducted through the application of peer-review methods. Bibliometrics has not been able to offer a valid large-scale alternative because

  19. Peer-editing Practice in the Writing Classroom: Benefits and Drawbacks

    Directory of Open Access Journals (Sweden)

    Ann Rosnida Md. Deni

    2011-01-01

    Full Text Available Small scale studies have shown that peer-editing is beneficial to students as it increases their awareness of the complex process of writing, improves their knowledge of and skills in writing, and helps them become more autonomous in learning. Teachers too may benefit from peer-editing as this practice discloses invaluable information on students' writing weaknesses and strengths, and on teachers' teaching effectiveness. This is a small scale study conducted on fifteen first-year degree students majoring in Tourism to view the usefulness of peer-editing practice in enhancing their writing skills. Retrospective notes were taken to record students' receptiveness and reaction towards peer-editing practice; students' writing samples and peer-editing questionnaires were analyzed to view students' revisions and comments; and an open-ended questionnaire was distributed to identify students' perceptions of peer-editing practice in the writing classroom. Analysis of the data gathered revealed that peer-editing practice benefitted both the teacher and most of her students as it disclosed important information that could improve her teaching of writing and her students' writing practices. Data analysis also, however, disclosed that peer-editing practice may have adverse effects on students' motivation and improvement in writing if it is not deployed properly.

  20. Properties of large-scale methane/hydrogen jet fires

    Energy Technology Data Exchange (ETDEWEB)

    Studer, E. [CEA Saclay, DEN, LTMF Heat Transfer and Fluid Mech Lab, 91 - Gif-sur-Yvette (France); Jamois, D.; Leroy, G.; Hebrard, J. [INERIS, F-60150 Verneuil En Halatte (France); Jallais, S. [Air Liquide, F-78350 Jouy En Josas (France); Blanchetiere, V. [GDF SUEZ, 93 - La Plaine St Denis (France)

    2009-12-15

    A future economy based on reduction of carbon-based fuels for power generation and transportation may consider hydrogen as a possible energy carrier. Extensive and widespread use of hydrogen might require a pipeline network. The alternatives might be the use of the existing natural gas network or the design of a dedicated network. Whatever the solution, mixing hydrogen with natural gas will substantially modify the consequences of accidents. The French National Research Agency (ANR) funded project called HYDROMEL focuses on these critical questions. Within this project large-scale jet fires have been studied experimentally and numerically. The main characteristics of these flames, including visible length, radiation fluxes and blowout, have been assessed. (authors)

  1. Effects of an expressive writing intervention on cancer-related distress in Danish breast cancer survivors - results from a nationwide randomized clinical trial.

    Science.gov (United States)

    Jensen-Johansen, M B; Christensen, S; Valdimarsdottir, H; Zakowski, S; Jensen, A B; Bovbjerg, D H; Zachariae, R

    2013-07-01

    To examine the effects of an expressive writing intervention (EWI) on cancer-related distress, depressive symptoms, and mood in women treated for early stage breast cancer. A nationwide sample of 507 Danish women who had recently completed treatment for primary breast cancer were randomly assigned to three 20-min home-based writing exercises, one week apart, focusing on either emotional disclosure (EWI group) or a non-emotional topic (control group). Cancer-related distress [Impact of Event Scale (IES)], depressive symptoms (Beck Depression Inventory-Short Form), and negative (37-item Profile of Mood States) and positive mood (Passive Positive Mood Scale) were assessed at baseline and at 3 and 9 months post-intervention. Choice of writing topic (cancer versus other), alexithymia (20-item Toronto Alexithymia Scale), and social constraints (Social Constraints Scale) were included as possible moderators. Significant reductions in psychological symptoms were seen in both groups over time. Writing topic moderated effects on IES, with women writing about other themes showing greater reductions in cancer-related avoidance than women writing about their cancer. Fewer depressive symptoms and higher levels of positive mood were seen 3 months post-intervention in women writing about their cancer when compared with the control group. Difficulties describing feelings and externally oriented thinking (20-item Toronto Alexithymia Scale) moderated effects on positive mood and IES-total, while no moderating effects were found for social constraints. In concordance with the majority of previous results with cancer patients, no main effects of EWI were found for cancer-related distress, depressive symptoms, and mood. Moderator analyses suggested that choice of writing topic and ability to process emotional experiences should be studied further. Copyright © 2012 John Wiley & Sons, Ltd.

  2. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  3. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. Described are the cell structure of the galaxy distribution in the Universe and principles of mathematical modelling of the galaxy distribution. Images of cell structures, obtained after reprocessing with the computer, are given. Three hypotheses - vortical, entropic and adiabatic - suggesting various processes for the origin of galaxies and galaxy clusters are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a method for directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the properties of the disturbances at the pre-galaxy stage. The discussion of problems pertaining to the hot gas contained in galaxy clusters, and to the interactions within galaxy clusters and with the intergalactic medium, is recognized as a notable contribution to the development of theoretical and observational cosmology.

  4. The Politics of Writing, Writing Politics: Virginia Woolf’s A [Virtual] Room of One’s Own

    Directory of Open Access Journals (Sweden)

    Tegan Zimmerman

    2012-12-01

    Full Text Available This article revisits A Room of One’s Own, Virginia Woolf’s foundational 1929 text on women’s writing. I examine from a feminist materialist perspective the relevance of Woolf’s notion of a “room” in our globalized and technological twenty-first century. I first review Woolf’s position on the material conditions necessary for women writers in her own time and then the applicability of her thinking for contemporary women writers on a global scale. I emphasize that the politics of writing, and in particular writing by women, that Woolf puts forth gives feminists the necessary tools to reevaluate and rethink women’s writing both online and offline. I therefore argue that Woolf’s traditional work on materiality can be updated and developed to further inform what is now, in the twenty-first century, an urgent need for women writers, a feminist philosophy of sexual difference in relation to technology, and an e-feminism of online spaces and women’s online writing.

  5. Project Administration Techniques for Successful Classroom Collaborative Writing.

    Science.gov (United States)

    Kryder, LeeAnne Giannone

    1991-01-01

    Focuses on the collaborative writing done for a large report or proposal over a period of several weeks or months in a business writing course. Discusses short-term writing projects and nonwriting tasks for project administration, meeting management, student/instructor conference, project planning and time estimates, and oral presentations. (PRA)

  6. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
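
    A CPU-side toy analogue of the per-pixel linked-list idea described above is sketched below, with attribute filtering and front-to-back ordering at "shading" time; the pixel coordinates, depths, and attribute values are made up for illustration and this is not the thesis implementation (which runs on the GPU).

```python
# Sketch: per-pixel lists of projected pathline segments with deferred,
# attribute-filtered "shading". Pure-Python analogue of the GPU structure.
from collections import defaultdict

# Each projected pathline segment carries (depth, attribute value, pathline id).
per_pixel = defaultdict(list)

def insert(pixel, depth, value, line_id):
    per_pixel[pixel].append((depth, value, line_id))

def shade(pixel, value_min):
    """Filter segments by attribute, then order front to back by depth."""
    segs = [s for s in per_pixel[pixel] if s[1] >= value_min]
    return sorted(segs)                      # tuples sort by depth first

insert((10, 20), depth=0.7, value=3.2, line_id=1)
insert((10, 20), depth=0.3, value=0.9, line_id=2)
print(shade((10, 20), value_min=1.0))        # only the segment that passes the filter
```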

  7. The role of research-article writing motivation and self-regulatory strategies in explaining research-article abstract writing ability.

    Science.gov (United States)

    Lin, Ming-Chia; Cheng, Yuh-Show; Lin, Sieh-Hwa; Hsieh, Pei-Jung

    2015-04-01

    The purpose of the study was to investigate the effects of research-article writing motivation and use of self-regulatory writing strategies in explaining second language (L2) research-article abstract writing ability, alongside the L2 literacy effect. Four measures were administered: an L2 literacy test, a research abstract performance assessment, and inventories of writing motivation and strategy. Participants were L2 graduate students in Taiwan (N=185; M age=25.8 yr., SD=4.5, range=22-53). Results of structural equation modeling showed a direct effect of motivation on research-article writing ability, but no direct effect of strategy or indirect effect of motivation via strategy on research-article writing ability, with L2 literacy controlled. The findings suggest research-article writing instruction should address writing motivation, besides L2 literacy.

  8. Promoting linguistic complexity, greater message length and ease of engagement in email writing in people with aphasia: initial evidence from a study utilizing assistive writing software.

    Science.gov (United States)

    Thiel, Lindsey; Sage, Karen; Conroy, Paul

    2017-01-01

    Improving email writing in people with aphasia could enhance their ability to communicate, promote interaction and reduce isolation. Spelling therapies have been effective in improving single-word writing. However, there has been limited evidence on how to achieve changes to everyday writing tasks such as email writing in people with aphasia. One potential area that has been largely unexplored in the literature is the potential use of assistive writing technologies, despite some initial evidence that assistive writing software use can lead to qualitative and quantitative improvements to spontaneous writing. This within-participants case series design study aimed to investigate the effects of using assistive writing software to improve email writing in participants with dysgraphia related to aphasia. Eight participants worked through a hierarchy of writing tasks of increasing complexity within broad topic areas that incorporate the spheres of writing need of the participants: writing for domestic needs, writing for social needs and writing for business/administrative needs. Through completing these tasks, participants had the opportunity to use the various functions of the software, such as predictive writing, word banks and text to speech. Therapy also included training and practice in basic computer and email skills to encourage increased independence. Outcome measures included email skills, keyboard skills, email writing and written picture description tasks, and a perception of disability assessment. Four of the eight participants showed statistically significant improvements to spelling accuracy within emails when using the software. At a group level there was a significant increase in word length with the software; while four participants showed noteworthy changes to the range of word classes used. Enhanced independence in email use and improvements in participants' perceptions of their writing skills were also noted. This study provided some initial evidence

  9. Characterizing Temperature Variability and Associated Large Scale Meteorological Patterns Across South America

    Science.gov (United States)

    Detzer, J.; Loikith, P. C.; Mechoso, C. R.; Barkhordarian, A.; Lee, H.

    2017-12-01

    South America's climate varies considerably owing to its large geographic range and diverse topographical features. Spanning the tropics to the mid-latitudes and from high peaks to tropical rainforest, the continent experiences an array of climate and weather patterns. Due to this considerable spatial extent, assessing temperature variability at the continent scale is particularly challenging. It is well documented in the literature that temperatures have been increasing across portions of South America in recent decades, and while there have been many studies that have focused on precipitation variability and change, temperature has received less scientific attention. Therefore, a more thorough understanding of the drivers of temperature variability is critical for interpreting future change. First, k-means cluster analysis is used to identify four primary modes of temperature variability across the continent, stratified by season. Next, composites of large scale meteorological patterns (LSMPs) are calculated for months assigned to each cluster. Initial results suggest that LSMPs, defined using meteorological variables such as sea level pressure (SLP), geopotential height, and wind, are able to identify synoptic scale mechanisms important for driving temperature variability at the monthly scale. Some LSMPs indicate a relationship with known recurrent modes of climate variability. For example, composites of geopotential height suggest that the Southern Annular Mode is an important, but not necessarily dominant, component of temperature variability over southern South America. This work will be extended to assess the drivers of temperature extremes across South America.
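
    The workflow above pairs k-means clustering of temperature fields with composites of large scale meteorological fields for the months assigned to each cluster. A hedged sketch of that two-step pattern on synthetic arrays follows; the cluster count, grid size, and use of SLP as the composited field are illustrative assumptions.

```python
# Sketch: k-means on monthly temperature-anomaly fields, then composite a
# large-scale field (here SLP) over the months in each cluster.
# Arrays are synthetic placeholders for reanalysis data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
n_months, n_grid = 480, 2000
temp_anom = rng.normal(size=(n_months, n_grid))   # stand-in temperature anomalies
slp = rng.normal(size=(n_months, n_grid))         # stand-in sea level pressure

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(temp_anom)

composites = {k: slp[labels == k].mean(axis=0) for k in range(4)}
for k, comp in composites.items():
    print(f"cluster {k}: {np.sum(labels == k)} months, composite SLP shape {comp.shape}")
```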

  10. Predictors of writing competence in 4- to 7-year-old children.

    Science.gov (United States)

    Dunsmuir, Sandra; Blatchford, Peter

    2004-09-01

    This longitudinal study sought to improve understanding of the factors at home and school that influence children's attainment and progress in writing between the ages of 4 and 7 years. Its aims were: (i) to investigate the relationship between home variables and writing development in preschool children; (ii) to determine associations between child characteristics and writing development; and (iii) to conduct an analysis of the areas of continuity and discontinuity between variables at home and at school, and their influences on subsequent writing development. Sixty children attending four urban primary schools participated in this study. Semi-structured interviews, questionnaires, observation schedules and standardized assessments were used. Writing samples were collected each term. Associations between measures and continuity over time were assessed using multiple regression analysis. Preschool variables that were found to be significantly associated with writing proficiency at school entry included mother's educational level, family size, parental assessment of writing and a measure of home writing. Child characteristics, skills and competencies were measured at school entry, and those found to be significantly associated with writing at 7 years included season of birth, vocabulary score, pre-reading skills, handwriting and proficiency in writing one's name. The only preschool variable that maintained its significant relationship to writing at 7 years was home writing. Teacher assessments of pupil attitudes to writing were consistently found to be significantly associated with writing competence. This comprehensive study explored the complex interaction of cognitive, affective and contextual processes involved in learning to write, and identified specific features of successful writers. Results are discussed in relation to educational policy and practice issues.
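
    The associations described above are assessed with multiple regression. As a hedged illustration of that kind of analysis, here is a minimal ordinary least squares fit with statsmodels; the data are synthetic and the predictor names are made up for illustration, not the study's variables.

```python
# Sketch: multiple regression of age-7 writing scores on home and child
# variables, in the spirit of the record. Synthetic, illustrative data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 60
X = np.column_stack([
    rng.integers(0, 4, n),        # mother's educational level (coded, assumed)
    rng.integers(1, 5, n),        # family size (assumed)
    rng.normal(size=n),           # home-writing measure (assumed)
    rng.normal(size=n),           # vocabulary score at school entry (assumed)
])
writing_age7 = 10 + 1.5 * X[:, 2] + 2.0 * X[:, 3] + rng.normal(scale=2, size=n)

model = sm.OLS(writing_age7, sm.add_constant(X)).fit()
print(model.summary())
```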

  11. The use of soil moisture - remote sensing products for large-scale groundwater modeling and assessment

    NARCIS (Netherlands)

    Sutanudjaja, E.H.

    2012-01-01

    In this thesis, the possibilities of using spaceborne remote sensing for large-scale groundwater modeling are explored. We focus on a soil moisture product called European Remote Sensing Soil Water Index (ERS SWI, Wagner et al., 1999) - representing the upper profile soil moisture. As a test-bed, we

  12. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
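    A minimal numerical sketch of ecological diffusion in one dimension, assuming the form u_t = d²(μ(x)u)/dx² with a spatially varying motility μ(x); the habitat pattern and parameter values below are invented for illustration and are not the authors' model.

```python
# Hedged sketch: explicit finite differences for 1-D ecological diffusion,
# u_t = d^2(mu(x) u)/dx^2, with piecewise-constant motility mu(x).
import numpy as np

nx, dx, dt, nsteps = 200, 1.0, 0.1, 500
x = np.arange(nx) * dx
mu = np.where((x // 20) % 2 == 0, 0.5, 2.0)     # alternating habitat types (assumed values)
u = np.exp(-0.01 * (x - x.mean()) ** 2)         # initial population density

for _ in range(nsteps):
    f = mu * u
    lap = (np.roll(f, -1) - 2 * f + np.roll(f, 1)) / dx**2   # periodic boundary
    u = u + dt * lap

print("total population (approximately conserved):", u.sum() * dx)
```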

  13. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  14. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  15. AUSERA: Large-Scale Automated Security Risk Assessment of Global Mobile Banking Apps

    OpenAIRE

    Chen, Sen; Meng, Guozhu; Su, Ting; Fan, Lingling; Xue, Yinxing; Liu, Yang; Xu, Lihua; Xue, Minhui; Li, Bo; Hao, Shuang

    2018-01-01

    Contemporary financial technology (FinTech) that enables cashless mobile payment has been widely adopted by financial institutions, such as banks, due to its convenience and efficiency. However, FinTech has also made massive and dynamic transactions susceptible to security risks. Given large financial losses caused by such vulnerabilities, regulatory technology (RegTech) has been developed, but more comprehensive security risk assessment is specifically desired to develop robust, scalable, an...

  16. Scaling up: Assessing social impacts at the macro-scale

    International Nuclear Information System (INIS)

    Schirmer, Jacki

    2011-01-01

    Social impacts occur at various scales, from the micro-scale of the individual to the macro-scale of the community. Identifying the macro-scale social changes that result from an impacting event is a common goal of social impact assessment (SIA), but is challenging as multiple factors simultaneously influence social trends at any given time, and there are usually only a small number of cases available for examination. While some methods have been proposed for establishing the contribution of an impacting event to macro-scale social change, they remain relatively untested. This paper critically reviews methods recommended to assess macro-scale social impacts, and proposes and demonstrates a new approach. The 'scaling up' method involves developing a chain of logic linking change at the individual/site scale to the community scale. It enables a more problematised assessment of the likely contribution of an impacting event to macro-scale social change than previous approaches. The use of this approach in a recent study of change in dairy farming in south east Australia is described.

  17. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5KW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  18. The SEA-change Model in Information Literacy: Assessing Information Literacy Development with Reflective Writing

    Directory of Open Access Journals (Sweden)

    Barbara Anne Sen

    2014-07-01

    Full Text Available Reflective writing is a key professional skill, and the University of Sheffield Information School seeks to develop this skill in our students through the use of reflective assessments. Reflection has been used as a means of supporting Information Literacy development in the Higher Education context and recent pedagogical IL frameworks highlight the important role of reflection. This paper presents an analysis of Undergraduate students' reflective writing on one module. The writing is mapped against two models of reflection to understand the nature and depth of the students' reflection and through this understand their Information Literacy development, with the overall aim of improving the teaching and learning experience for the future. Key findings are that students did reflect deeply and identified a number of ways in which they felt their IL had developed (e.g. developing a knowledge of specialist sources), ways they could have improved their information literacy practices (e.g. through storing information in a more organised fashion), and ways that we could improve our teaching (e.g. by providing appropriate scaffolding for the activities).

  19. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) in large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy by evaluating each category's discrimination. Finally, we propose the online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Class (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
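    The latent-category step rests on latent semantic analysis; the sketch below shows a generic LSA via truncated SVD on a hypothetical region-by-visual-word count matrix. It is not the authors' actual representation or pipeline, only the underlying idea.

```python
# Generic sketch of latent semantic analysis via truncated SVD, in the spirit of
# the latent-category learning step described above. The region-by-visual-word
# count matrix is random stand-in data, not the paper's representation.
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(1)
counts = rng.poisson(lam=1.0, size=(1000, 500))   # image regions x visual words

svd = TruncatedSVD(n_components=20, random_state=0)
latent = svd.fit_transform(counts)                # each region projected onto 20 latent categories

# A simple proxy for "which latent category a region belongs to":
assignments = np.argmax(np.abs(latent), axis=1)
print(np.bincount(assignments, minlength=20))
```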

  20. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  1. Peer-Formativity: A Framework for Academic Writing

    Science.gov (United States)

    Murray, Rowena; Thow, Morag

    2014-01-01

    The system currently deployed to assess research outputs in higher education can influence what, how and for whom academics write; for some it may determine whether or not they write at all. This article offers a framework for negotiating this performative context--the writing meeting. This framework uses the established theoretical underpinning…

  2. Analysis of the applicability of fracture mechanics on the basis of large scale specimen testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Polachova, H.; Sulc, J.; Anikovskij, V.; Dragunov, Y.; Rivkin, E.; Filatov, V.

    1988-01-01

    The paper deals with the verification of fracture mechanics calculations for WWER reactor pressure vessels by large-scale model testing performed on the large testing machine ZZ 8000 (maximum load of 80 MN) in the Skoda Concern. The results of testing a large set of large-scale test specimens with surface crack-type defects are presented. The nominal thickness of the specimens was 150 mm with defect depths between 15 and 100 mm, the testing temperature varying between -30 and +80 °C (i.e., in the temperature interval of T_ko ± 50 °C). Specimens with a scale of 1:8 and 1:12 were also tested, as well as standard (CT and TPB) specimens. Comparisons of the test results and calculations suggest some conservatism of calculations (especially for small defects) based on Linear Elastic Fracture Mechanics, according to the Nuclear Reactor Pressure Vessel Codes which use the fracture mechanics values from J_IC testing. On the basis of the large-scale tests the "Defect Analysis Diagram" was constructed and recommended for brittle fracture assessment of reactor pressure vessels. (author). 7 figs., 2 tabs., 3 refs

  3. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described in detail the large-scale communication architecture of the IoT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  4. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  5. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take hydrogen explosion phenomena into account in risk management. Thus combustion modelling in large-scale geometries is one of the remaining severe accident safety issues. At present there is no combustion model which can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Therefore the major attention in model development has to be paid to the adoption of existing approaches, or the creation of new ones, capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reactor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of the numerical simulations are presented together with comparisons, critical discussions and conclusions. (authors)

  6. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutronic sensor during three different running modes: impulses, fluctuations and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and its brief description are given. The results obtained till now in that domain are presented. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a fluctuation linear channel with an automatic commutation of scales is described and the results of the tests are given. In this large scale channel, the method of data processing is analogue. - To become independent of the problems generated by the use of an analogue processing of the fluctuation signal, a digital method of data processing is tested. The validity of that method is demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr

  7. Digital selective growth of a ZnO nanowire array by large scale laser decomposition of zinc acetate.

    Science.gov (United States)

    Hong, Sukjoon; Yeo, Junyeob; Manorotkul, Wanit; Kang, Hyun Wook; Lee, Jinhwan; Han, Seungyong; Rho, Yoonsoo; Suh, Young Duk; Sung, Hyung Jin; Ko, Seung Hwan

    2013-05-07

    We develop a digital direct writing method for ZnO NW micro-patterned growth on a large scale by selective laser decomposition of zinc acetate. For ZnO NW growth, by replacing the bulk heating with the scanning focused laser as a fully digital local heat source, zinc acetate crystallites can be selectively activated as a ZnO seed pattern to grow ZnO nanowires locally on a larger area. Together with the selective laser sintering process of metal nanoparticles, more than 10,000 UV sensors have been demonstrated on a 4 cm × 4 cm glass substrate to develop all-solution processible, all-laser mask-less digital fabrication of electronic devices including active layer and metal electrodes without any conventional vacuum deposition, photolithographic process, premade mask, high temperature and vacuum environment.

  8. The European Union Solidarity Fund: An Important Tool in the Recovery After Large-Scale Natural Disasters

    Directory of Open Access Journals (Sweden)

    Maria IONCICĂ

    2016-03-01

    Full Text Available This paper analyses the situation of the European Union Solidarity Fund as an important tool in the recovery after large-scale natural disasters. In the last millennium, the European Union countries have faced climate change, which has led to events with disastrous consequences. There are several ex-post financial ways to respond to the challenges posed by large-scale natural disasters, among which the EU Solidarity Fund, government funds, budget reallocation, donor assistance, and domestic and/or external credit. The EU Solidarity Fund was created in 2002, after the massive floods in Central Europe, as an expression of the solidarity of EU countries. Romania has received financial assistance from the EU Solidarity Fund after the occurrence of major natural disasters, regional disasters and neighbouring-country disasters. The assessment of large-scale natural disasters in the EU is very important, and in order to analyse whether there is a concentration of large-scale natural disasters in the EU we used the Gini coefficient. In the paper, statistical analysis and the correlation between several indicators were used to study the financial impacts of large-scale natural disasters in Europe, and especially in Romania.
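    The Gini coefficient mentioned above can be computed directly; a minimal sketch follows, with purely illustrative loss figures rather than the paper's data.

```python
# Minimal sketch of the Gini coefficient used to gauge whether large-scale
# disaster losses are concentrated in a few countries. Losses are hypothetical.
import numpy as np

def gini(values):
    """Gini coefficient of a 1-D array of non-negative values."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    cum = np.cumsum(v)
    # Equivalent to the mean-absolute-difference formulation.
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

losses = [120.0, 15.0, 430.0, 60.0, 5.0, 980.0]   # hypothetical losses per country
print(round(gini(losses), 3))
```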

  9. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    ERDC/CHL CHETN-I-88, April 2016. Approved for public release; distribution is unlimited. This technical note describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. A detailed discussion of the original LSTF features and capabilities can be... The purpose of these upgrades was to increase

  10. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on individual scale have been extensively studied due to the application potential on human behavior prediction and recommendation, and control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both websites browse and mobile towers visit. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flight. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and a high possibility for prediction. Furthermore, a scale-free featured mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
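    For the power-law behaviour reported above, the exponent of a dwelling-time distribution can be estimated with the standard continuous maximum-likelihood formula of Clauset et al.; the sample below is synthetic and the cutoff x_min is an assumption, not a value from the study.

```python
# Hedged sketch: maximum-likelihood estimate of a power-law exponent for
# dwelling times, alpha = 1 + n / sum(ln(x_i / x_min)) (continuous case).
import numpy as np

def powerlaw_alpha(x, xmin):
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

rng = np.random.default_rng(2)
dwell = 1.0 * (1 - rng.random(10_000)) ** (-1.0 / 1.5)   # Pareto-like sample with true alpha ~ 2.5
print(round(powerlaw_alpha(dwell, xmin=1.0), 2))
```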

  11. Integrated numerical platforms for environmental dose assessments of large tritium inventory facilities

    International Nuclear Information System (INIS)

    Castro, P.; Ardao, J.; Velarde, M.; Sedano, L.; Xiberta, J.

    2013-01-01

    In connection with the prospect of new large-inventory tritium facilities [KATRIN at TLK, CANDUs, ITER, EAST, and others to come], the dosimetric limits prescribed by ICRP-60 for tritium committed doses are under discussion, requiring in parallel that today's highly conservative assessments be surmounted by refining dosimetric assessments in many respects. Precise Lagrangian computations of dosimetric cloud evolution after standardized (normal/incidental/SBO) tritium cloud emissions can today be numerically matched to real-time meteorological data and to pattern data at diverse scales, for prompt/early and chronic tritium dose assessments. The paper discusses the trend towards integrated numerical platforms for the environmental dose assessment of large tritium inventory facilities, which are under development.

  12. Writing to Learn and Learning to Write across the Disciplines: Peer-to-Peer Writing in Introductory-Level MOOCs

    Directory of Open Access Journals (Sweden)

    Denise K. Comer

    2014-11-01

    Full Text Available This study aimed to evaluate how peer-to-peer interactions through writing impact student learning in introductory-level massive open online courses (MOOCs) across disciplines. This article presents the results of a qualitative coding analysis of peer-to-peer interactions in two introductory-level MOOCs: English Composition I: Achieving Expertise and Introduction to Chemistry. Results indicate that peer-to-peer interactions in writing through the forums and through peer assessment enhance learner understanding, link to course learning objectives, and generally contribute positively to the learning environment. Moreover, because forum interactions and peer review occur in written form, our research contributes to open distance learning (ODL) scholarship by highlighting the importance of writing to learn as a significant pedagogical practice that should be encouraged more in MOOCs across disciplines.

  13. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  14. Model development to acceptability-assessment of large scale power plants for electricity generation

    International Nuclear Information System (INIS)

    Schubert, Katharina

    2013-01-01

    An approach to the specific assessment of large power plants is presented. This approach is intended to support the decision as to which kind of nuclear, fossil or renewable installation minimizes unacceptable consequences for the environment, the economy, and society. The tool ACCEPPT, which is currently under development for this purpose, allows a comprehensible and quantitative assessment of the reasonableness of unintended side-effects of different power plant types. The flexible design of the tool elements 'frame conditions' and 'system technology' supports a dynamic acceptability assessment that takes the particular context and plant configuration into consideration. Thus, current conditions as well as development scenarios can be used for the evaluation. Finally, the comprehensible acceptability results are intended to contribute to overcoming acceptance problems in society. (orig.)

  15. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  16. Potential climatic impacts and reliability of large-scale offshore wind farms

    International Nuclear Information System (INIS)

    Wang Chien; Prinn, Ronald G

    2011-01-01

    The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such a large-scale usage of intermittent wind power would have to be carefully assessed, in addition to the need to lower the high current unit wind power costs. Our previous study (Wang and Prinn 2010 Atmos. Chem. Phys. 10 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the later study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines which was not entirely offset by the concurrent reduction of mean wind kinetic energy. We found that the perturbation of the large-scale deployment of offshore wind turbines to the global climate is relatively small compared to the case of land

  17. Anxiety as It Pertains to EFL Writing Ability and Performance

    Science.gov (United States)

    Nodoushan, Mohammad Ali Salmani

    2015-01-01

    This paper reports the results of a study conducted to find (a) the impact of anxiety on EFL learners' writing performance, and (b) the relationship between anxiety and foreign language writing ability. 137 (N = 137) EFL learners took the Foreign Language Classroom Anxiety Scale (FLCAS), the Oxford Placement Test (OPT), and a writing task on a…

  18. Large-scale control site selection for population monitoring: an example assessing Sage-grouse trends

    Science.gov (United States)

    Fedy, Bradley C.; O'Donnell, Michael; Bowen, Zachary H.

    2015-01-01

    Human impacts on wildlife populations are widespread and prolific and understanding wildlife responses to human impacts is a fundamental component of wildlife management. The first step to understanding wildlife responses is the documentation of changes in wildlife population parameters, such as population size. Meaningful assessment of population changes in potentially impacted sites requires the establishment of monitoring at similar, nonimpacted, control sites. However, it is often difficult to identify appropriate control sites in wildlife populations. We demonstrated use of Geographic Information System (GIS) data across large spatial scales to select biologically relevant control sites for population monitoring. Greater sage-grouse (Centrocercus urophasianus; hereafter, sage-grouse) are negatively affected by energy development, and monitoring of sage-grouse populations within energy development areas is necessary to detect population-level responses. We used population data (1995–2012) from an energy development area in Wyoming, USA, the Atlantic Rim Project Area (ARPA), and GIS data to identify control sites that were not impacted by energy development for population monitoring. Control sites were surrounded by similar habitat and were within similar climate areas to the ARPA. We developed nonlinear trend models for both the ARPA and control sites and compared long-term trends from the 2 areas. We found little difference between the ARPA and control site trends over time. This research demonstrated an approach for control site selection across large landscapes and can be used as a template for similar impact-monitoring studies. It is important to note that identification of changes in population parameters between control and treatment sites is only the first step in understanding the mechanisms that underlie those changes. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.

  19. Predators on private land: broad-scale socioeconomic interactions influence large predator management

    Directory of Open Access Journals (Sweden)

    Hayley S. Clements

    2016-06-01

    Full Text Available The proliferation of private land conservation areas (PLCAs) is placing increasing pressure on conservation authorities to effectively regulate their ecological management. Many PLCAs depend on tourism for income, and charismatic large mammal species are considered important for attracting international visitors. Broad-scale socioeconomic factors therefore have the potential to drive fine-scale ecological management, creating a systemic scale mismatch that can reduce long-term sustainability in cases where economic and conservation objectives are not perfectly aligned. We assessed the socioeconomic drivers and outcomes of large predator management on 71 PLCAs in South Africa. Owners of PLCAs that are stocking free-roaming large predators identified revenue generation as influencing most or all of their management decisions, and rated profit generation as a more important objective than did the owners of PLCAs that did not stock large predators. Ecotourism revenue increased with increasing lion (Panthera leo) density, which created a potential economic incentive for stocking lion at high densities. Despite this potential mismatch between economic and ecological objectives, lion densities were sustainable relative to available prey. Regional-scale policy guidelines for free-roaming lion management were ecologically sound. By contrast, policy guidelines underestimated the area required to sustain cheetah (Acinonyx jubatus), which occurred at unsustainable densities relative to available prey. Evidence of predator overstocking included predator diet supplementation and frequent reintroduction of game. We conclude that effective facilitation of conservation on private land requires consideration of the strong and not necessarily beneficial multiscale socioeconomic factors that influence private land management.

  20. A Numeric Scorecard Assessing the Mental Health Preparedness for Large-Scale Crises at College and University Campuses: A Delphi Study

    Science.gov (United States)

    Burgin, Rick A.

    2012-01-01

    Large-scale crises continue to surprise, overwhelm, and shatter college and university campuses. While the devastation to physical plants and persons is often evident and is addressed with crisis management plans, the number of emotional casualties left in the wake of these large-scale crises may not be apparent and are often not addressed with…

  1. Coupled Large Scale Hydro-mechanical Modelling for cap-rock Failure Risk Assessment of CO2 Storage in Deep Saline Aquifers

    International Nuclear Information System (INIS)

    Rohmer, J.; Seyedi, D.M.

    2010-01-01

    This work presents a numerical strategy of large scale hydro-mechanical simulations to assess the risk of damage in cap-rock formations during a CO2 injection process. The proposed methodology is based on the development of a sequential coupling between a multiphase fluid flow code (TOUGH2) and a hydro-mechanical calculation code (Code-Aster) that enables us to perform coupled hydro-mechanical simulation at a regional scale. The likelihood of different cap-rock damage mechanisms can then be evaluated based on the results of the coupled simulations. A scenario based approach is proposed to take into account the effect of the uncertainty of model parameters on damage likelihood. The developed methodology is applied for the cap-rock failure analysis of the deep aquifer of the Dogger formation in the context of the Paris basin multilayered geological system as a demonstration example. The simulation is carried out at a regional scale (100 km) considering an industrial mass injection rate of CO2 of 10 Mt/y. The assessment of the stress state after 10 years of injection is conducted through the developed sequential coupling. Two failure mechanisms have been taken into account, namely the tensile fracturing and the shear slip reactivation of pre-existing fractures. To deal with the large uncertainties due to sparse data on the layer formations, a scenario based strategy is undertaken. It consists in defining a first reference modelling scenario considering the mean values of the hydro-mechanical properties for each layer. A sensitivity analysis is then carried out and shows the importance of both the initial stress state and the reservoir hydraulic properties on the cap-rock failure tendency. On this basis, a second scenario denoted 'critical' is defined so that the most influential model parameters are taken in their worst configuration. None of these failure criteria is activated for the considered conditions. At a phenomenological level, this study points out three key
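    As a conceptual illustration of the two failure mechanisms considered (tensile fracturing and shear slip on pre-existing fractures), the sketch below screens a single stress state against a tensile cutoff and a Coulomb slip criterion. The parameters and stresses are invented, and this is not the authors' implementation.

```python
# Conceptual sketch (not the paper's code): screening a stress state from a
# coupled flow/mechanics run against the two cap-rock failure modes named above.
# Compression is taken positive; all numbers are illustrative (MPa).
def tensile_failure(sigma3_eff, tensile_strength):
    """Tensile fracturing if the minimum effective stress becomes more tensile
    than the rock can sustain."""
    return sigma3_eff < -tensile_strength

def shear_reactivation(tau, sigma_n_eff, cohesion, friction_coeff):
    """Coulomb slip on a pre-existing fracture: |tau| exceeds frictional resistance."""
    return abs(tau) > cohesion + friction_coeff * sigma_n_eff

# Example stress state after injection (hypothetical values):
sigma_n_eff = 12.0   # effective normal stress on the fracture plane, MPa
tau = 6.5            # shear stress on the plane, MPa
sigma3_eff = 4.0     # minimum effective principal stress, MPa

print("tensile failure:", tensile_failure(sigma3_eff, tensile_strength=2.0))
print("shear reactivation:", shear_reactivation(tau, sigma_n_eff, cohesion=0.0, friction_coeff=0.6))
```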

  2. An integrated assessment of a large-scale biodiesel production in Italy: Killing several birds with one stone?

    International Nuclear Information System (INIS)

    Russi, Daniela

    2008-01-01

    Biofuels are often presented as a contribution towards the solution of the problems related to our strong dependency on fossil fuels, i.e. greenhouse effect, energy dependency, urban pollution, besides being a way to support rural development. In this paper, an integrated assessment approach is employed to discuss the social desirability of a large-scale biodiesel production in Italy, taking into account social, environmental and economic factors. The conclusion is that the advantages in terms of reduction of greenhouse gas emissions, energy dependency and urban pollution would be very modest. The small benefits would not be enough to offset the huge costs in terms of land requirement: if the target of the European Directive 2003/30/EC were reached (5.75% of the energy used for transport by 2010) the equivalent of about one-third of the Italian agricultural land would be needed. The consequences would be a considerable increase in food imports and large environmental impacts in the agricultural phase. Also, since biodiesel must be de-taxed in order to make it competitive with oil-derived diesel, the Italian energy revenues would be reduced. In the end, rural development remains the only sound reason to promote biodiesel, but even for this objective other strategies look more advisable, like supporting organic agriculture. (author)

  3. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of shortages of precipitation is irrigation. In the seventies and eighties of the last century a number of large-scale sprinklers were built in Wielkopolska. At the end of the 1970s, 67 sprinklers with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler reached 95 ha. In 1989 there were 98 sprinklers, and the area equipped with them was more than 10 130 ha. The study was conducted on 7 large sprinklers with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinklers have undergone significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs. There has also been a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers and limitations - system solutions, supply difficulties, and high levels of equipment failure - which do not encourage rational use of the available sprinklers. A survey of the local area was carried out to show the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  4. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords: sol-gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method.

  5. Psychiatric/ psychological forensic report writing.

    Science.gov (United States)

    Young, Gerald

    Approaches to forensic report writing in psychiatry, psychology, and related mental health disciplines have moved from an organization, content, and stylistic framework to considering ethical and other codes, evidentiary standards, and practice considerations. The first part of the article surveys different approaches to forensic report writing, including that of forensic mental health assessment and psychiatric ethics. The second part deals especially with psychological ethical approaches. The American Psychological Association's Ethical Principles and Code of Conduct (2002) provide one set of principles on which to base forensic report writing. The U.S. Federal Rules of Evidence (2014) and related state rules provide another basis. The American Psychological Association's Specialty Guidelines for Forensic Psychology (2013) provide a third source. Some work has expanded the principles in ethics codes; and, in the third part of this article, these additions are applied to forensic report writing. Other work that could help with the question of forensic report writing concerns the 4 Ds in psychological injury assessments (e.g., conduct oneself with Dignity, avoid the adversary Divide, get the needed reliable Data, Determine interpretations and conclusions judiciously). One overarching ethical principle that is especially applicable in forensic report writing is to be comprehensive, scientific, and impartial. As applied to forensic report writing, the overall principle that applies is that the work process and product should reflect integrity in its ethics, law, and science. Four principles that derive from this meta-principle concern: Competency and Communication; Procedure and Protection; Dignity and Distance; and Data Collection and Determination. The standards or rules associated with each of these principles are reviewed. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  6. TRAVEL WRITING: AN APPLICATION OF WRITING WORKSHOP TO ENHANCE STUDENTS’S CREATIVE WRITING

    Directory of Open Access Journals (Sweden)

    Prayudias Margawati

    2014-10-01

    Full Text Available Writing is often assumed to be an uneasy skill to either learn or teach. Students find it difficult to develop ideas in writing, while many teachers are ready with the materials but unsure of the appropriate ways to teach. This paper intends to describe and discuss a method of teaching writing, namely the writing workshop, to improve students' writing skills through travel writing. The writing workshop proposed by Calkins, which consists of a mini lesson, work time, peer conferring and/or response groups, share sessions, and a publication celebration, is applied in the writing class for methodological purposes. In the mini lesson, the teacher offers something to the class that is meant to introduce a writing strategy, done at the beginning of the workshop. During work time, students start their new piece of writing. The teacher moves among the students, conferring with them while checking their work. Peer conferences or response groups provide a forum for students to talk about works in progress. When students work in a group, one of them can coordinate the group's needs during the work time. A share session may take various forms; one possible way is for each group to share its writing process with the other students. At the end of the writing class, student writers come together to publish and/or celebrate their final work. The publication could take the form of a portfolio, students' diary, blog, or other formats. The travel writing genre is chosen as it can develop students' creativity in describing or narrating their own stories during, say, a holiday, or about things they used to see on the way home weekly or monthly. Furthermore, travel writing as a product of creative writing teaches readers about values, characteristics, and ways of life. Last but not least, a professional writing teacher should set up the writing workshop components in a variety of ways to achieve an effective running of the class.

  7. Investigating IELTS Academic Writing Task 2 : Relationships between cognitive writing processes, text quality, and working memory

    NARCIS (Netherlands)

    Révész, Andrea; Michel, Marije; Lee, MinJin

    2017-01-01

    This project examined the cognitive processes and online behaviours of second language writers while performing IELTS Academic Writing Test Task 2, and the ways in which the online behaviours of test-takers relate to the quality of the text produced. An additional aim was to assess whether writing

  8. Causal inference between bioavailability of heavy metals and environmental factors in a large-scale region.

    Science.gov (United States)

    Liu, Yuqiong; Du, Qingyun; Wang, Qi; Yu, Huanyun; Liu, Jianfeng; Tian, Yu; Chang, Chunying; Lei, Jing

    2017-07-01

    The causation between the bioavailability of heavy metals and environmental factors is generally obtained from field experiments at local scales at present, and lacks sufficient evidence from large scales. However, inferring causation between the bioavailability of heavy metals and environmental factors across large-scale regions is challenging, because the conventional correlation-based approaches used for causation assessments across large-scale regions can, at the expense of actual causation, result in spurious insights. In this study, a general approach framework, Intervention calculus when the directed acyclic graph (DAG) is absent (IDA) combined with the backdoor criterion (BC), was introduced to identify causation between the bioavailability of heavy metals and the potential environmental factors across large-scale regions. We take the Pearl River Delta (PRD) in China as a case study. The causal structures and effects were identified based on the concentrations of heavy metals (Zn, As, Cu, Hg, Pb, Cr, Ni and Cd) in soil (0-20 cm depth) and vegetable (lettuce) and 40 environmental factors (soil properties, extractable heavy metals and weathering indices) in 94 samples across the PRD. Results show that the bioavailability of heavy metals (Cd, Zn, Cr, Ni and As) was causally influenced by soil properties and soil weathering factors, whereas no causal factor impacted the bioavailability of Cu, Hg and Pb. No latent factor was found between the bioavailability of heavy metals and environmental factors. The causation between the bioavailability of heavy metals and environmental factors in field experiments is consistent with that on a large scale. The IDA combined with the BC provides a powerful tool to identify causation between the bioavailability of heavy metals and environmental factors across large-scale regions. Causal inference in a large system with dynamic changes has great implications for system-based risk management. Copyright © 2017 Elsevier Ltd. All
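    The effect-estimation idea behind the backdoor criterion can be illustrated with a simple regression adjustment: once a valid adjustment set is known, the coefficient of the exposure in a regression of the outcome on the exposure plus the adjustment set estimates the causal effect under linear-Gaussian assumptions. The sketch below uses simulated data and hand-picked variables; the paper itself applies the IDA algorithm to learn the causal structure first, which is not reproduced here.

```python
# Conceptual sketch only: backdoor adjustment via linear regression, with a
# hypothetical confounder (soil pH), exposure (soil Cd) and outcome (vegetable Cd).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 500
soil_ph = rng.normal(6.5, 0.5, n)                                  # adjustment set Z
soil_cd = 0.8 * soil_ph + rng.normal(0, 0.3, n)                    # exposure X
veg_cd = 0.5 * soil_cd - 0.2 * soil_ph + rng.normal(0, 0.3, n)     # outcome Y

X = np.column_stack([soil_cd, soil_ph])
effect = LinearRegression().fit(X, veg_cd).coef_[0]
print("estimated causal effect of soil Cd on vegetable Cd:", round(effect, 2))   # ~0.5
```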

  9. Increasing Skills in Writing Literature Study on Research-Based Learning Through Authentical Assessment Lecturing in Innovation Class of Social Science Learning

    Directory of Open Access Journals (Sweden)

    Naniek Sulistya Wardani

    2017-08-01

    Full Text Available The purpose of this study is to determine whether the improvement of literature review writing skills in research-based learning can be pursued through authentic assessment in the lectures of the Innovation of IPS Learning course for PGSD students. This is a classroom action research study using the spiral model of C. Kemmis and Robin McTaggart. The research procedure uses 2 cycles, each consisting of 3 stages, namely (1) action planning, (2) implementation of action and observation, and (3) reflection. The subjects of the study were all 27 students of PGSD Class 2014 E taking the Innovation of IPS Learning course, consisting of 7 male students and 20 female students. Data collection techniques used observation and product assessment. The data analysis technique is a percentage technique that compares literature review writing skills through authentic assessment in the IPS lectures between cycles. The results show that there is an improvement in literature review writing skills in the lectures on IPS learning innovation, pursued through authentic assessment. This is evident from the improvement in writing skills reaching the expected level from cycle 1 to cycle 2, i.e., from 62.14% of the 27 students to 72.60% of all students in cycle 2. Writing skills in research-based learning comprise the skills to express the idea of the problem, organize facts, concepts and principles, and use correct spelling (EYD) and grammar. The authentic assessment consists of a connection aspect, a reflection aspect, and a feedback aspect.

  10. Large-scale hydrological simulations using the soil water assessment tool, protocol development, and application in the danube basin.

    Science.gov (United States)

    Pagliero, Liliana; Bouraoui, Fayçal; Willems, Patrick; Diels, Jan

    2014-01-01

    The Water Framework Directive of the European Union requires member states to achieve good ecological status of all water bodies. A harmonized pan-European assessment of water resources availability and quality, as affected by various management options, is necessary for a successful implementation of European environmental legislation. In this context, we developed a methodology to predict surface water flow at the pan-European scale using available datasets. Among the hydrological models available, the Soil Water Assessment Tool was selected because its characteristics make it suitable for large-scale applications with limited data requirements. This paper presents the results for the Danube pilot basin. The Danube Basin is one of the largest European watersheds, covering approximately 803,000 km² and portions of 14 countries. The modeling data used included land use and management information, a detailed soil parameters map, and high-resolution climate data. The Danube Basin was divided into 4663 subwatersheds of an average size of 179 km². A modeling protocol is proposed to cope with the problems of hydrological regionalization from gauged to ungauged watersheds and overparameterization and identifiability, which are usually present during calibration. The protocol involves a cluster analysis for the determination of hydrological regions and multiobjective calibration using a combination of manual and automated calibration. The proposed protocol was successfully implemented, with the modeled discharges capturing well the overall hydrological behavior of the basin. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
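    The abstract does not name the objective functions used in the multiobjective calibration; as one plausible ingredient, the Nash-Sutcliffe efficiency commonly applied to discharge calibration is sketched below with made-up flow values.

```python
# Illustrative sketch: Nash-Sutcliffe efficiency, a common discharge-calibration
# objective (its use here is an assumption, not stated in the abstract).
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = [120.0, 95.0, 210.0, 180.0, 75.0, 60.0]   # observed discharge, m3/s (invented)
sim = [110.0, 100.0, 190.0, 185.0, 80.0, 70.0]  # simulated discharge, m3/s (invented)
print(round(nash_sutcliffe(obs, sim), 3))
```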

  11. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
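    A toy version of the multi-criteria evaluation such a siting tool performs: normalised criterion rasters are combined with user-defined weights into a suitability surface. The rasters and weights below are invented, and the tool's actual optimization algorithm is not reproduced.

```python
# Minimal sketch of a user-weighted multi-criteria overlay for solar siting.
import numpy as np

rng = np.random.default_rng(4)
solar_resource = rng.uniform(0, 1, size=(100, 100))   # normalised criterion rasters (stand-ins)
slope_penalty = rng.uniform(0, 1, size=(100, 100))
grid_proximity = rng.uniform(0, 1, size=(100, 100))

weights = {"solar": 0.5, "slope": 0.2, "grid": 0.3}    # user-defined weights, sum to 1
suitability = (weights["solar"] * solar_resource
               + weights["slope"] * (1 - slope_penalty)
               + weights["grid"] * grid_proximity)

best = np.unravel_index(np.argmax(suitability), suitability.shape)
print("highest-scoring cell:", best, round(float(suitability[best]), 3))
```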

  12. Causal inference between bioavailability of heavy metals and environmental factors in a large-scale region

    International Nuclear Information System (INIS)

    Liu, Yuqiong; Du, Qingyun; Wang, Qi; Yu, Huanyun; Liu, Jianfeng; Tian, Yu; Chang, Chunying; Lei, Jing

    2017-01-01

    The causation between the bioavailability of heavy metals and environmental factors is generally obtained from field experiments at local scales at present, and lacks sufficient evidence from large scales. However, inferring causation between the bioavailability of heavy metals and environmental factors across large-scale regions is challenging, because the conventional correlation-based approaches used for causation assessments across large-scale regions can, at the expense of actual causation, result in spurious insights. In this study, a general approach framework, Intervention calculus when the directed acyclic graph (DAG) is absent (IDA) combined with the backdoor criterion (BC), was introduced to identify causation between the bioavailability of heavy metals and the potential environmental factors across large-scale regions. We take the Pearl River Delta (PRD) in China as a case study. The causal structures and effects were identified based on the concentrations of heavy metals (Zn, As, Cu, Hg, Pb, Cr, Ni and Cd) in soil (0–20 cm depth) and vegetable (lettuce) and 40 environmental factors (soil properties, extractable heavy metals and weathering indices) in 94 samples across the PRD. Results show that the bioavailability of heavy metals (Cd, Zn, Cr, Ni and As) was causally influenced by soil properties and soil weathering factors, whereas no causal factor impacted the bioavailability of Cu, Hg and Pb. No latent factor was found between the bioavailability of heavy metals and environmental factors. The causation between the bioavailability of heavy metals and environmental factors in field experiments is consistent with that on a large scale. The IDA combined with the BC provides a powerful tool to identify causation between the bioavailability of heavy metals and environmental factors across large-scale regions. Causal inference in a large system with dynamic changes has great implications for system-based risk management. - Causation between the

  13. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries - Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  14. New Jersey City University's College of Education Writing Assessment Program: Profile of a Local Response to a Systemic Problem

    Science.gov (United States)

    Fisch, Audrey

    2017-01-01

    This profile presents New Jersey City University's Writing Assessment Program from its creation in 2002 to its elimination in 2017. The program arose as an attempt to raise the writing skills of the diverse, first generation teacher certification candidates in the College of Education. Despite political missteps, the program gained greater…

  15. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    Science.gov (United States)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It makes it possible to define, using BIT, the behaviour of several components assembled to process a flow of data. Test cases are defined in a way that is simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definitions supplied by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
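
    The authors' framework is not reproduced here, but the idea of a virtual component, i.e. wrapping a chain of data-flow components so that built-in test cases can exercise the assembled flow, can be sketched as follows. Class and stage names are illustrative assumptions, not the authors' API.

```python
from typing import Callable, Sequence

class VirtualComponent:
    """Treats a chain of data-flow components as a single testable unit.
    Each stage is a callable taking the upstream output; names are illustrative only."""
    def __init__(self, stages: Sequence[Callable]):
        self.stages = list(stages)

    def process(self, data):
        for stage in self.stages:          # push the datum through the flow
            data = stage(data)
        return data

    def run_built_in_tests(self, cases):
        """Built-in test hook: each case is (input, expected_output)."""
        return [(inp, self.process(inp) == expected) for inp, expected in cases]

# Example flow: parse -> filter -> aggregate, exercised end-to-end as one virtual component
flow = VirtualComponent([
    lambda s: [int(x) for x in s.split(",")],   # parse
    lambda xs: [x for x in xs if x >= 0],       # drop negatives
    sum,                                        # aggregate
])
print(flow.run_built_in_tests([("1,2,-3,4", 7), ("0,0", 0)]))
```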

  16. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; hide

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  17. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo = Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio = Ti⊥/Ti||). Electron anisotropy effects are known to be of no help in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each lateral length is 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island to enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and enables even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2 and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  18. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  19. Single-spot e-beam lithography for defining large arrays of nano-holes

    DEFF Research Database (Denmark)

    Højlund-Nielsen, Emil; Greibe, Tine; Mortensen, N. Asger

    2014-01-01

    V prototype EBL system for speed and pattern fidelity to a minimum writing time of around 30 min/cm2 for 200 nm periods in 2D lattices. The machine time and feasibility of the method are assessed in terms of the trade-off between high current and large writing field. © 2014 Elsevier B.V. All rights reserved....

  20. Multi-scale properties of large eddy simulations: correlations between resolved-scale velocity-field increments and subgrid-scale quantities

    Science.gov (United States)

    Linkmann, Moritz; Buzzicotti, Michele; Biferale, Luca

    2018-06-01

    We provide analytical and numerical results concerning multi-scale correlations between the resolved velocity field and the subgrid-scale (SGS) stress tensor in large eddy simulations (LES). Following previous studies for the Navier-Stokes equations, we derive the exact hierarchy of LES equations governing the spatio-temporal evolution of velocity structure functions of any order. The aim is to assess the influence of the subgrid model on inertial range intermittency. We provide a series of predictions, within the multifractal theory, for the scaling of correlations involving the SGS stress, and we compare them against numerical results from high-resolution Smagorinsky LES and from a priori filtered data generated from direct numerical simulations (DNS). We find that LES data generally agree very well with filtered DNS results and with the multifractal prediction for all leading terms in the balance equations. Discrepancies are measured for some of the sub-leading terms involving cross-correlations between resolved velocity increments and the SGS tensor or the SGS energy transfer, suggesting that there must be room to improve the SGS modelling to further extend the inertial range properties for any fixed LES resolution.
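
    As a small illustration of the basic quantity involved (not of the paper's LES hierarchy or its multifractal predictions), the sketch below estimates longitudinal velocity structure functions S_p(r) from a 1-D periodic velocity record; the sampling assumptions are hypothetical.

```python
import numpy as np

def structure_functions(u, dx, orders=(2, 3), max_sep=None):
    """Estimate structure functions S_p(r) = <(u(x+r) - u(x))**p> from a 1-D velocity
    signal sampled with spacing dx, assuming a periodic domain."""
    n = u.size
    max_sep = max_sep or n // 2
    seps = np.arange(1, max_sep)
    S = {p: np.empty(seps.size) for p in orders}
    for i, s in enumerate(seps):
        du = np.roll(u, -s) - u          # increments at separation s * dx (periodic wrap)
        for p in orders:
            S[p][i] = np.mean(du ** p)
    return seps * dx, S

# Synthetic signal standing in for a velocity record
x = np.linspace(0, 2 * np.pi, 1024, endpoint=False)
u = np.sin(x) + 0.1 * np.random.default_rng(0).normal(size=x.size)
r, S = structure_functions(u, dx=x[1] - x[0])
print(S[2][:5])   # second-order structure function at the smallest separations
```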

  1. A Study of Students’ Assessment in Writing Skills of the English Language

    Directory of Open Access Journals (Sweden)

    Muhammad Javed

    2013-07-01

    This paper evaluates and assesses students' competency in writing skills in the English language at the secondary school level, focusing on five major content areas: word completion, sentence making/syntax, comprehension, tenses/grammar and handwriting. The target population was male and female students of grade 10 from urban and rural secondary schools in the public and private sectors. Forty (40) secondary schools of District Bahawalnagar, Pakistan were selected using stratified sampling. A sample consisting of 440 students (11 students from each school) was randomly selected using a table of random numbers. An achievement test consisting of different items was developed to assess the students' competency and capability in the sub-skills of writing, namely word completion, sentence making/syntax, comprehension, tenses/grammar and handwriting. Mean scores and standard deviations were used to analyze the students' proficiency in each sub-skill. The t-test was applied to make comparisons on the basis of gender, urban/rural location, and public versus private sector. The overall performance of all students was better in comprehension than in the other sub-skills, namely word completion, sentence making/syntax, tenses/grammar and handwriting. The analysis, based on t-values, revealed no significant difference between the performance of male and female students or between students of public and private schools, whereas there was a significant difference between the performance of urban and rural students.
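
    For readers unfamiliar with the statistics used, the following sketch reproduces the type of analysis described (group means, standard deviations, and an independent-samples t-test) on invented scores; the numbers are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical writing-test scores for two groups (e.g. urban vs. rural students)
urban = np.array([62, 71, 58, 66, 74, 69, 60, 65])
rural = np.array([55, 60, 52, 58, 63, 57, 54, 59])

print("urban: mean %.1f, SD %.1f" % (urban.mean(), urban.std(ddof=1)))
print("rural: mean %.1f, SD %.1f" % (rural.mean(), rural.std(ddof=1)))

# Independent-samples t-test, the comparison used in the study
t, p = stats.ttest_ind(urban, rural, equal_var=True)
print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.05 would suggest a significant difference
```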

  2. A guide of scientific writing in English

    International Nuclear Information System (INIS)

    Han, Bang Geun

    1987-10-01

    This book serves as a reference for writing English-language papers. It covers the use of letters and punctuation, the use of articles, similar word phrases and verbs used in scientific writing, auxiliary verbs, nouns closely related to scientific writing, expressions for experimental tools and equipment, expressions for chemicals, how to write numbers, adjectives and pronouns related to numbers, how to form plurals, and expressions for multiples, surface area, depth, width, time, period, temperature and humidity. It also adds expressions about sensible assessment, statistics, deviation, signs and abbreviations, and how to write letters in English.

  3. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)
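
    For orientation, one common parameterization of a linearly perturbed flat FRW metric in conformal time is shown below; sign and gauge conventions vary across the literature, and this is not necessarily the form adopted by the authors.

```latex
ds^2 = a^2(\eta)\left[ -(1+2A)\,d\eta^2 - 2B_i\,dx^i\,d\eta
       + \left(\delta_{ij} + h_{ij}\right)dx^i\,dx^j \right]
```

    Here A is a scalar perturbation, B_i carries scalar and vector parts, and h_ij carries scalar (trace), vector, and tensor parts.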

  4. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. It establishes a finite element model of the top flange connection system with the finite element analysis software MSC.Marc/Mentat and analyzes its fatigue strain, carries out load simulation of the flange fatigue working condition with the Bladed software, acquires the flange fatigue load spectrum with the rain-flow counting method, and finally performs fatigue analysis of the top flange with the fatigue analysis software MSC.Fatigue and Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis of large-scale wind turbine generators and possess practical engineering value.
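
    The final step quoted above, Palmgren-Miner linear cumulative damage, is simple enough to sketch directly: the damage from each rain-flow load bin is n_i/N_i, and failure is predicted when the sum reaches one. The S-N curve parameters and cycle counts below are hypothetical, not values from the paper.

```python
import numpy as np

def miner_damage(cycle_counts, stress_amplitudes, sn_coeff, sn_exponent):
    """Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i).
    N_i is taken from a Basquin-type S-N curve N = sn_coeff * S**(-sn_exponent).
    Failure is predicted when D >= 1."""
    n = np.asarray(cycle_counts, dtype=float)
    S = np.asarray(stress_amplitudes, dtype=float)
    N_allow = sn_coeff * S ** (-sn_exponent)   # allowable cycles at each amplitude
    return float(np.sum(n / N_allow))

# Example: three load bins from a rain-flow count (hypothetical numbers, stress in MPa)
D = miner_damage([1e5, 2e4, 5e2], [40.0, 80.0, 160.0], sn_coeff=1e12, sn_exponent=3.0)
print(f"cumulative damage D = {D:.3f}")
```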

  5. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time hydrological conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional shallow water model based on an unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
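
    The paper's solver is a Godunov-type finite volume scheme with a wet/dry front treatment; as a much simpler stand-in, the sketch below advances the 1-D shallow water equations one step with a Lax-Friedrichs flux on a periodic grid. It illustrates the finite volume update only and omits the robustness features the paper adds.

```python
import numpy as np

def shallow_water_step(h, hu, dx, g=9.81, cfl=0.45):
    """One explicit finite-volume step for the 1-D shallow water equations on a
    periodic domain, using a Lax-Friedrichs flux (simpler than a Godunov-type
    Riemann solver and without any wet/dry handling)."""
    u = hu / h
    c = np.abs(u) + np.sqrt(g * h)                 # characteristic speed
    dt = cfl * dx / c.max()

    U = np.vstack([h, hu])                         # conserved variables
    F = np.vstack([hu, hu * u + 0.5 * g * h**2])   # physical fluxes

    Up, Fp = np.roll(U, -1, axis=1), np.roll(F, -1, axis=1)
    flux = 0.5 * (F + Fp) - 0.5 * (dx / dt) * (Up - U)   # interface flux F_{i+1/2}
    U_new = U - dt / dx * (flux - np.roll(flux, 1, axis=1))
    return U_new[0], U_new[1], dt

# Dam-break style initial condition on 200 cells
x = np.linspace(0.0, 1.0, 200, endpoint=False)
h = np.where(x < 0.5, 2.0, 1.0)
hu = np.zeros_like(h)
h, hu, dt = shallow_water_step(h, hu, dx=x[1] - x[0])
```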

  6. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    Recent trends in large-scale simulations of fusion plasmas and processing plasmas are briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  7. Personal Identification and the Assessment of the Psychophysiological State While Writing a Signature

    Directory of Open Access Journals (Sweden)

    Pavel Lozhnikov

    2015-08-01

    This article discusses the problem of user identification and psychophysiological state assessment while writing a signature using a graphics tablet. The solution includes the creation of templates containing handwritten signature features, acquired simultaneously with the hidden registration of physiological parameters of the person being tested. A description of heart rate variability at different time points is used as the physiological parameter. As a result, a signature template is automatically generated for the psychophysiological states of an identified person. The problem of user identification and psychophysiological state assessment is thus solved depending on the registered value of the physiological parameter.

  8. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. Inhomogeneous nearly

  9. Use of Overhead Transparencies in Collaborative Business Writing.

    Science.gov (United States)

    Barker, Randolph T.; And Others

    1991-01-01

    Asserts that small group collaborative writing exercises that produce overhead transparencies for large class critique can be an effective method for teaching letter and memorandum construction. Offers a five-step process for encouraging individual and collaborative writing skills. (PRA)

  10. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
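
    The PHM framework itself is not described in enough detail here to reproduce; as a generic illustration of one ingredient, detecting a performance fault as a statistical anomaly in a monitored metric, the sketch below flags samples that deviate strongly from their recent history. The window size, threshold, and synthetic trace are assumptions, not project values.

```python
import numpy as np

def detect_performance_faults(samples, window=50, threshold=4.0):
    """Flag time steps whose metric (e.g. per-iteration wall time) deviates strongly
    from its recent history, as a stand-in for the kind of anomaly detection a
    performance-health framework might perform."""
    samples = np.asarray(samples, dtype=float)
    flags = []
    for i in range(window, samples.size):
        hist = samples[i - window:i]
        mu, sigma = hist.mean(), hist.std() + 1e-12
        if abs(samples[i] - mu) / sigma > threshold:
            flags.append(i)          # candidate performance fault / contention event
    return flags

# Synthetic trace: steady iteration times with a slowdown injected at step 300
trace = np.random.default_rng(1).normal(1.0, 0.02, 400)
trace[300:320] += 0.5
print(detect_performance_faults(trace))
```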

  11. Colorado Student Assessment Program: 2001 Released Passages, Items, and Prompts. Grade 4 Reading and Writing, Grade 4 Lectura y Escritura, Grade 5 Mathematics and Reading, Grade 6 Reading, Grade 7 Reading and Writing, Grade 8 Mathematics, Reading and Science, Grade 9 Reading, and Grade 10 Mathematics and Reading and Writing.

    Science.gov (United States)

    Colorado State Dept. of Education, Denver.

    This document contains released reading comprehension passages, test items, and writing prompts from the Colorado Student Assessment Program for 2001. The sample questions and prompts are included without answers or examples of student responses. Test materials are included for: (1) Grade 4 Reading and Writing; (2) Grade 4 Lectura y Escritura…

  12. Control protocol: large scale implementation at the CERN PS complex - a first assessment

    International Nuclear Information System (INIS)

    Abie, H.; Benincasa, G.; Coudert, G.; Davydenko, Y.; Dehavay, C.; Gavaggio, R.; Gelato, G.; Heinze, W.; Legras, M.; Lustig, H.; Merard, L.; Pearson, T.; Strubin, P.; Tedesco, J.

    1994-01-01

    The Control Protocol is a model-based, uniform access procedure from a control system to accelerator equipment. It was proposed at CERN about five years ago and prototypes were developed in the following years. More recently, this procedure has been finalized and implemented on a large scale in the PS Complex. More than 300 pieces of equipment are now using this protocol in normal operation and another 300 are under implementation. These include power converters, vacuum systems, beam instrumentation devices, RF equipment, etc. This paper describes how the single general procedure is applied to the different kinds of equipment. The advantages obtained are also discussed. (orig.)

  13. A feasibility assessment for incorporating of passive RHRS into large scale active PWR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S O; Sub, S Y; Kim, Y S; Chang, M H; Park, J K [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of)

    1996-12-01

    A feasibility study was carried out on the possible incorporation of a passive RHRS (Residual Heat Removal System) into a large-scale active PWR plant. Four kinds of system configurations were considered. For each case its performance and impacts on plant safety, cost, licensing, operation and maintenance were evaluated. The evaluation identified a PRHRS with a gravity feed tank as the most probable design concept. However, considering the rearrangement of structures and pipe routing inside and outside containment, it is concluded that implementation of the PRHRS concept in well-developed active plants is not desirable at present. (author). 6 refs, 7 figs, 1 tab.

  14. A feasibility assessment for incorporating of passive RHRS into large scale active PWR

    International Nuclear Information System (INIS)

    Kim, S.O.; Sub, S.Y.; Kim, Y.S.; Chang, M.H.; Park, J.K.

    1996-01-01

    A feasibility study was carried out on the possible incorporation of a passive RHRS (Residual Heat Removal System) into a large-scale active PWR plant. Four kinds of system configurations were considered. For each case its performance and impacts on plant safety, cost, licensing, operation and maintenance were evaluated. The evaluation identified a PRHRS with a gravity feed tank as the most probable design concept. However, considering the rearrangement of structures and pipe routing inside and outside containment, it is concluded that implementation of the PRHRS concept in well-developed active plants is not desirable at present. (author). 6 refs, 7 figs, 1 tab

  15. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  16. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^−11/5 form at large scales, to a steeper approximate k^−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  17. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^−11/5 form at large scales, to a steeper approximate k^−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  18. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = −0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, “over-damped” LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
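
    For reference, the static Smagorinsky closure mentioned above (with the quoted constant c_s = 0.1) builds the subgrid eddy viscosity from the resolved strain rate. This is the textbook form, quoted here for orientation rather than taken from the paper.

```latex
\nu_t = (c_s \Delta)^2\,\lvert\bar{S}\rvert, \qquad
\lvert\bar{S}\rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}, \qquad
\bar{S}_{ij} = \tfrac{1}{2}\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right)
```

    The dynamic model replaces the fixed c_s with a coefficient computed on the fly from a test filter, which is why it can adapt better when the fluctuations become anisotropic.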

  19. Revising the potential of large-scale Jatropha oil production in Tanzania: An economic land evaluation assessment

    International Nuclear Information System (INIS)

    Segerstedt, Anna; Bobert, Jans

    2013-01-01

    Following up the rather sobering results of the biofuels boom in Tanzania, we analyze the preconditions that would make large-scale oil production from the feedstock Jatropha curcas viable. We do this by employing an economic land evaluation approach; first, we estimate the physical land suitability and the necessary inputs to reach certain amounts of yields. Subsequently, we estimate costs and benefits for different input-output levels. Finally, to incorporate the increased awareness of sustainability in the export sector, we introduce also certification criteria. Using data from an experimental farm in Kilosa, we find that high yields are crucial for the economic feasibility and that they can only be obtained on good soils at high input rates. Costs of compliance with certification criteria depend on site specific characteristics such as land suitability and precipitation. In general, both domestic production and (certified) exports are too expensive to be able to compete with conventional diesel/rapeseed oil from the EU. Even though the crop may have potential for large scale production as a niche product, there is still a lot of risk involved and more experimental research is needed. - Highlights: ► We use an economic land evaluation analysis to reassess the potential of large-scale Jatropha oil. ► High yields are possible only at high input rates and for good soil qualities. ► Production costs are still too high to break even on the domestic and export market. ► More research is needed to stabilize yields and improve the oil content. ► Focus should be on broadening our knowledge-base rather than promoting new Jatropha investments

  20. Assessment of small versus large hydro-power developments - a Norwegian case study

    Energy Technology Data Exchange (ETDEWEB)

    Bakken, Tor Haakon; Harby, Atle

    2010-07-01

    The era of new, large hydro-power development projects seems to be over in Norway. Partly as a response to this, a large number of applications for the development of small-scale hydro power projects of up to 10 MW overflow the Water Resources and Energy Directorate, resulting in an extensive development of small tributaries and water courses in Norway. This study has developed a framework for the assessment and comparison of many small versus one large hydro-power project, based on a multi-criteria analysis (MCA) approach, and further tested this approach on planned or developed projects in the Helgeland region, Norway. Multi-criteria analysis is a decision-support tool aimed at providing a systematic approach for the comparison of various alternatives with often non-commensurable and conflicting attributes. At the same time, the technique enables complex problems and various alternatives to be assessed in a transparent and simple way. The MCA software was in our case equipped with two overall criteria (objectives), each with a number of sub-criteria: production, with sub-criteria such as volume of energy production, installed capacity (effect), storage capacity and economic profit; and environmental impacts, with sub-criteria such as fishing interests, biodiversity and protection of unexploited nature. The data used in the case study are based on the planned development of Vefsna (large project), with the energy/effect production estimated and the environmental impacts identified as part of the feasibility studies (the project never reached the authorities' licensing system with a formal EIA). The small-scale hydro-power projects used for comparison are based on realized projects in the Helgeland region and a number of proposed projects, upscaled to the size of the proposed Vefsna development. The results from the study indicate that a large number of small-scale hydro-power projects need to be implemented in order to balance the volume of produced electricity/effect from one

  1. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g−1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C ► The preparation is simple, effective and eco-friendly. ► The MgO nanocrystals yielded in situ promote graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  2. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were solved with the GPU-based algorithm to test the performance of the package. Comparison of the calculation results between the solver executed on a single CPU and the one on the GPU shows that the GPU version runs about 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
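
    The GPU kernels themselves are not shown in the abstract; the semi-implicit Fourier update they accelerate can, however, be sketched on the CPU with NumPy for the Allen-Cahn case. The time step, mobility, and gradient coefficient below are illustrative choices, not the paper's parameters.

```python
import numpy as np

def allen_cahn_step(phi, dt=0.05, M=1.0, kappa=1.0, dx=1.0):
    """One semi-implicit Fourier (spectral) update of the Allen-Cahn equation
    d(phi)/dt = -M * (phi**3 - phi - kappa * laplacian(phi)),
    treating the stiff gradient-energy term implicitly in Fourier space.
    CPU/NumPy sketch of the kind of scheme the paper accelerates on GPU."""
    n = phi.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2

    nonlinear = phi**3 - phi                     # bulk driving force, treated explicitly
    phi_hat = np.fft.fft2(phi) - dt * M * np.fft.fft2(nonlinear)
    phi_hat /= (1.0 + dt * M * kappa * k2)       # implicit gradient-energy term
    return np.real(np.fft.ifft2(phi_hat))

# Random initial microstructure on a 128 x 128 periodic grid
phi = 0.1 * (np.random.default_rng(0).random((128, 128)) - 0.5)
for _ in range(200):
    phi = allen_cahn_step(phi)
```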

  3. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.

  4. Clinical writing and the documentary construction of schizophrenia.

    Science.gov (United States)

    Barrett, R J

    1988-09-01

    Psychiatric practice involves writing as much as it involves talking. This study examines the interpretive processes of reading, writing and interviewing which are central to the clinical interaction. It is part of a broader ethnographic study of an Australian psychiatric hospital (which specializes in the treatment of patients with a diagnosis of schizophrenia). The paper examines two major types of written assessment of patients--the admission assessment and the 'complete work-up.' Writing is analyzed as performance, thereby focusing on the transformations that are effected in patients, their perceptions of their schizophrenia, and their total identity. One crucial transformation is from 'person suffering from schizophrenia' to 'schizophrenic.' The paper aims to show that as much as psychiatry is a 'talking cure' it is also a 'writing cure.'

  5. Large-Scale, Parallel, Multi-Sensor Data Fusion in the Cloud

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.

    2012-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To efficiently assemble such decade-scale datasets in a timely manner, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. "SciReduce" is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, in which simple tuples (keys & values) are passed between the map and reduce functions, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Thus, SciReduce uses the native datatypes (geolocated grids, swaths, and points) that geo-scientists are familiar with. We are deploying within Sci
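
    SciReduce's actual interface is not documented in this abstract, so the sketch below only illustrates the general map/reduce pattern it describes: map each granule of named numeric arrays to partial statistics, then reduce the partials into a merged product. The bundle keys and the synthetic data are hypothetical and are not the SciReduce API.

```python
from multiprocessing import Pool
import numpy as np

def map_granule(granule):
    """Per-granule partial sums for a bias statistic between two retrievals."""
    diff = granule["airs_temp"] - granule["model_temp"]   # hypothetical bundle keys
    return {"sum": float(diff.sum()), "count": int(diff.size)}

def reduce_stats(a, b):
    """Merge two partial results; associative, so it can be applied in any order."""
    return {"sum": a["sum"] + b["sum"], "count": a["count"] + b["count"]}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    granules = [{"airs_temp": rng.normal(250, 5, 1000),
                 "model_temp": rng.normal(250.2, 5, 1000)} for _ in range(8)]
    with Pool(4) as pool:                      # parallel "map" over granules
        partials = pool.map(map_granule, granules)
    total = partials[0]
    for p in partials[1:]:                     # serial "reduce" of the partials
        total = reduce_stats(total, p)
    print("mean bias:", total["sum"] / total["count"])
```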

  6. Thermal power generation projects ``Large Scale Solar Heating``; EU-Thermie-Projekte ``Large Scale Solar Heating``

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the ``Large-Scale Solar Heating`` programme for a Europe-wide development of the technology. The subsequent demonstration programme was judged favourably by the experts but was not immediately (1996) accepted for financial support. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved, which had been applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and which mainly serves technology transfer. (orig.) [German original, translated] The aim of this project is the preparation of a ``Large Scale Solar Heating`` priority programme with which the technology is to be developed further throughout Europe. The demonstration programme developed from it was assessed positively by the reviewers but could not be accepted for funding immediately (1996). In November 1997 the EU Commission then granted, at short notice, a further 1.5 million ECU of funding, with which an updated project proposal can be realised. A smaller project, applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and serving above all technology transfer, had already been approved in mid-1997. (orig.)

  7. Factors Affecting the Rate of Penetration of Large-Scale Electricity Technologies: The Case of Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    James R. McFarland; Howard J. Herzog

    2007-05-14

    This project falls under the Technology Innovation and Diffusion topic of the Integrated Assessment of Climate Change Research Program. The objective was to better understand the critical variables that affect the rate of penetration of large-scale electricity technologies in order to improve their representation in integrated assessment models. We conducted this research in six integrated tasks. In our first two tasks, we identified potential factors that affect penetration rates through discussions with modeling groups and through case studies of historical precedent. In the next three tasks, we investigated in detail three potential sets of critical factors: industrial conditions, resource conditions, and regulatory/environmental considerations. Research to assess the significance and relative importance of these factors involved the development of a microeconomic, system dynamics model of the US electric power sector. Finally, we implemented the penetration rate models in an integrated assessment model. While the focus of this effort is on carbon capture and sequestration technologies, much of the work will be applicable to other large-scale energy conversion technologies.
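
    Integrated assessment models often represent the penetration of a new technology with an S-shaped (logistic) diffusion curve whose parameters the factors studied here would modulate. The sketch below is a generic logistic curve with invented parameters, not the project's system dynamics model.

```python
import numpy as np

def logistic_penetration(t, capacity, growth_rate, t_mid):
    """Logistic (S-shaped) market-penetration curve commonly used to represent
    technology diffusion; parameters here are purely illustrative."""
    return capacity / (1.0 + np.exp(-growth_rate * (t - t_mid)))

years = np.arange(2010, 2061)
# e.g. CCS deployment saturating at 200 GW with an inflection around 2040 (invented numbers)
deployed = logistic_penetration(years, capacity=200.0, growth_rate=0.15, t_mid=2040)
print(deployed[::10])   # deployment every 10 years
```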

  8. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
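
    As a minimal illustration of the searching stage of the pipeline described above, the sketch below ranks stored feature vectors by cosine similarity to a query descriptor; real large-scale systems replace the exhaustive scan with approximate indexing (hashing, trees, inverted files). The feature dimensions and data are synthetic.

```python
import numpy as np

def retrieve(query_vec, database, k=5):
    """Rank database feature vectors by cosine similarity to a query descriptor
    and return the indices and scores of the top-k matches."""
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    q = query_vec / np.linalg.norm(query_vec)
    sims = db @ q
    top = np.argsort(-sims)[:k]
    return top, sims[top]

# Toy database of 10,000 image descriptors with 256 dimensions
rng = np.random.default_rng(0)
features = rng.normal(size=(10_000, 256))
indices, scores = retrieve(features[42], features, k=5)
print(indices, scores)   # the query itself should rank first with similarity ~1.0
```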

  9. Does Automated Feedback Improve Writing Quality?

    Science.gov (United States)

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  10. Large-scale volcanism associated with coronae on Venus

    Science.gov (United States)

    Roberts, K. Magee; Head, James W.

    1993-01-01

    The formation and evolution of coronae on Venus are thought to be the result of mantle upwellings against the crust and lithosphere and subsequent gravitational relaxation. A variety of other features on Venus have been linked to processes associated with mantle upwelling, including shield volcanoes on large regional rises such as Beta, Atla and Western Eistla Regiones and extensive flow fields such as Mylitta and Kaiwan Fluctus near the Lada Terra/Lavinia Planitia boundary. Of these features, coronae appear to possess the smallest amounts of associated volcanism, although volcanism associated with coronae has only been qualitatively examined. An initial survey of coronae based on recent Magellan data indicated that only 9 percent of all coronae are associated with substantial amounts of volcanism, including interior calderas or edifices greater than 50 km in diameter and extensive, exterior radial flow fields. Sixty-eight percent of all coronae were found to have lesser amounts of volcanism, including interior flooding and associated volcanic domes and small shields; the remaining coronae were considered deficient in associated volcanism. It is possible that coronae are related to mantle plumes or diapirs that are lower in volume or in partial melt than those associated with the large shields or flow fields. Regional tectonics or variations in local crustal and thermal structure may also be significant in determining the amount of volcanism produced from an upwelling. It is also possible that flow fields associated with some coronae are sheet-like in nature and may not be readily identified. If coronae are associated with volcanic flow fields, then they may be a significant contributor to plains formation on Venus, as they number over 300 and are widely distributed across the planet. As a continuation of our analysis of large-scale volcanism on Venus, we have reexamined the known population of coronae and assessed quantitatively the scale of volcanism associated

  11. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  12. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
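
    To see why the graph regularizer dominates the cost that the prototype approximation removes, the sketch below runs plain graph-based label propagation, which materializes the full n x n affinity matrix. It is a standard baseline, not the PVM algorithm itself, and all data are synthetic.

```python
import numpy as np

def label_propagation(X, y, n_classes, sigma=1.0, alpha=0.9, iters=50):
    """Plain graph-based label propagation: builds the full n x n affinity matrix,
    which is exactly the cost a prototype-based approximation avoids.
    y holds class indices for labeled points and -1 for unlabeled ones."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)      # pairwise squared distances
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    P = W / W.sum(axis=1, keepdims=True)                      # row-normalised transition matrix

    Y = np.zeros((n, n_classes))
    labeled = y >= 0
    Y[labeled, y[labeled]] = 1.0
    F = Y.copy()
    for _ in range(iters):
        F = alpha * P @ F + (1 - alpha) * Y                   # propagate, keep pulling toward labels
    return F.argmax(axis=1)

# Two Gaussian blobs, one labeled point per class
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y = np.full(100, -1)
y[0], y[50] = 0, 1
labels = label_propagation(X, y, n_classes=2)
print(labels[:5], labels[-5:])   # first blob labeled 0, second blob labeled 1
```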

  13. Experimental facilities for large-scale and full-scale study of hydrogen accidents

    Energy Technology Data Exchange (ETDEWEB)

    Merilo, E.; Groethe, M.; Colton, J. [SRI International, Poulter Laboratory, Menlo Park, CA (United States); Chiba, S. [SRI Japan, Tokyo (Japan)

    2007-07-01

    This paper summarizes some of the work performed at SRI International over the past five years that addresses safety issues for the hydrogen-based economy. Researchers at SRI International have conducted experiments at the Corral Hollow Experiment Site (CHES) near Livermore, California, to obtain fundamental data on hydrogen explosions for risk assessment. In particular, large-scale hydrogen tests were conducted using homogeneous mixtures of hydrogen in volumes from 5.3 m³ to 300 m³ to represent scenarios involving fuel cell vehicles as well as transport and storage facilities. Experiments have focused on unconfined deflagrations of hydrogen and air, and detonations of hydrogen in a semi-open space to measure free-field blast effects; the use of blast walls as a mitigation technique; turbulent enhancement of hydrogen combustion due to obstacles within the mixture, and determination of when deflagration-to-detonation transition occurs; the effect of confined hydrogen releases and explosions that could originate from an interconnecting hydrogen pipeline; and large and small accidental releases of hydrogen. The experiments were conducted to improve the prediction of hydrogen explosions and the capabilities for performing risk assessments, and to develop mitigation techniques. Measurements included hydrogen concentration; flame speed; blast overpressure; heat flux; and high-speed, standard, and infrared video. The data collected in these experiments are used to correlate computer models and to facilitate the development of codes and standards. This work contributes to better safety technology by evaluating the effectiveness of different blast mitigation techniques. 13 refs., 13 figs.

  14. Investigation of Writing Strategies, Writing Apprehension, and Writing Achievement among Saudi EFL-Major Students

    Science.gov (United States)

    Al Asmari, AbdulRahman

    2013-01-01

    The aim of this study is to investigate the use of writing strategies in reducing writing apprehension and to uncover its effect on EFL students' writing achievement. It also attempts to explore associations between foreign language apprehension, writing achievement and writing strategies. The primary aims of the study were to explore the…

  15. University writing

    Directory of Open Access Journals (Sweden)

    Miguel Zabalza Beraza

    2013-01-01

    Writing in the university is a basic necessity and a long-range educational purpose. One of the basic characteristics of the university context is that it requires writing both as a tool of communication and as a source of intellectual stimulation. After establishing the basic features of academic writing, this article analyzes the role of writing for students (writing to learn) and for teachers (writing to plan, to reflect, to document what has been done). The article also discusses the contributions of writing for students and teachers together: writing to investigate. Finally, going beyond writing as an academic tool, we conclude with a more playful and creative position: writing for pleasure and enjoyment.

  16. A large-scale study of epilepsy in Ecuador: methodological aspects.

    Science.gov (United States)

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    The methodology of a large-scale study of epilepsy, carried out in a highland area in northern Ecuador, South America, and covering a population of 72,121 people, is presented. The study was carried out in two phases. The first, cross-sectional phase consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified in screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for the diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in this community. The second, longitudinal phase assessed the capacity of non-specialist care to treat epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in untreated patients using two standard anti-epileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, a rural doctor, a neurologist, an anthropologist, and a psychologist. Standardised, reproducible instruments and methods were used. This study was carried out through co-operation between the medical profession, political agencies and the pharmaceutical industry, at an international level. We consider this a model for further large-scale studies of this type.

  17. Transfrontier consequences to the population of Greece of large scale nuclear accidents: a preliminary assessment

    International Nuclear Information System (INIS)

    Kollas, J.G.; Catsaros, Nicolas.

    1985-06-01

    In this report the consequences to the population of Greece from hypothetical large scale nuclear accidents at the Kozlodui (Bulgaria) nuclear power station are estimated under some simplifying assumptions. Three different hypothetical accident scenarios - the most serious for pressurized water reactors - are examined. The analysis is performed by the current Greek version of code CRAC2 and includes health and economic consequences to the population of Greece. (author)

  18. Using PELA to Predict International Business Students' English Writing Performance with Contextualised English Writing Workshops as Intervention Program

    Science.gov (United States)

    Wong, Caroline; Delante, Nimrod Lawsin; Wang, Pengji

    2017-01-01

    This study examines the effectiveness of Post-Entry English Language Assessment (PELA) as a predictor of international business students' English writing performance and academic performance. An intervention involving the implementation of contextualised English writing workshops was embedded in a specific business subject targeted at students who…

  19. DNA barcoding at riverscape scales: Assessing biodiversity among fishes of the genus Cottus (Teleostei) in northern Rocky Mountain streams

    Science.gov (United States)

    Michael K. Young; Kevin S. McKelvey; Kristine L. Pilgrim; Michael K. Schwartz

    2013-01-01

    There is growing interest in broad-scale biodiversity assessments that can serve as benchmarks for identifying ecological change. Genetic tools have been used for such assessments for decades, but spatial sampling considerations have largely been ignored. Here, we demonstrate how intensive sampling efforts across a large geographical scale can influence identification...

  20. Towards a more explicit writing pedagogy: The complexity of teaching argumentative writing

    Directory of Open Access Journals (Sweden)

    Jacqui Dornbrack

    2014-04-01

    Full Text Available Advances in technology, changes in communication practices, and the imperatives of the workplace have led to the repositioning of the role of writing in the global context. This has implications for the teaching of writing in schools. This article focuses on the argumentative essay, which is a high-stakes genre. A sample of work from one Grade 10 student identified as high performing in a township school in Cape Town (South Africa) is analysed. Drawing on the work of Ormerod and Ivanic, who argue that writing practices can be inferred from material artifacts, as well as critical discourse analysis, we show that the argumentative genre is complex, especially for novice first additional language English writers. This complexity is confounded by the conflation of the process and genre approaches in the Curriculum and Assessment Policy Statement (CAPS) document. Based on the analysis, we discuss the implications of planning, particularly in relation to thinking and reasoning, the need to read in order to write argument, and how social and school capital are insufficient without explicit instruction in the conventions of this complex genre. These findings present some insights into the particular input needed to improve writing pedagogy for specific genres.

  1. Recent Regional Climate State and Change - Derived through Downscaling Homogeneous Large-scale Components of Re-analyses

    Science.gov (United States)

    Von Storch, H.; Klehmet, K.; Geyer, B.; Li, D.; Schubert-Frisius, M.; Tim, N.; Zorita, E.

    2015-12-01

    Global re-analyses suffer from inhomogeneities, as they process data from networks under development. However, the large-scale component of such re-analyses is mostly homogeneous; additional observational data add in most cases to a better description of regional details and less so to large-scale states. Therefore, the concept of downscaling may be applied to homogeneously complement the large-scale state of the re-analyses with regional detail - wherever the condition of homogeneity of the large scales is fulfilled. Technically this can be done by using a regional climate model, or a global climate model, which is constrained on the large scale by spectral nudging. This approach has been developed and tested for the region of Europe, and a skillful representation of regional risks - in particular marine risks - was identified. While the data density in Europe is considerably better than in most other regions of the world, even here insufficient spatial and temporal coverage limits risk assessments. Therefore, downscaled data sets are frequently used by off-shore industries. We have run this system also in regions with reduced or absent data coverage, such as the Lena catchment in Siberia, the Yellow Sea/Bo Hai region in East Asia, and Namibia and the adjacent Atlantic Ocean. A global (large-scale constrained) simulation has also been run. It turns out that spatially detailed reconstruction of the state and change of climate over the past three to six decades is doable for any region of the world. The different data sets are archived and may be freely used for scientific purposes. Of course, before application, a careful analysis of the quality for the intended application is needed, as sometimes unexpected changes in the quality of the description of large-scale driving states prevail.

  2. Writing Biomedical Manuscripts Part II: Standard Elements and ...

    African Journals Online (AJOL)

    Several reasons account for rejection or delay of manuscripts submitted to ... You need to have results sorted out early as the rest of what you will write is largely ... follow the universal rules of writing and those of the target journal rules while ...

  3. Healing Classrooms: Therapeutic Possibilities in Academic Writing

    Science.gov (United States)

    Batzer, Benjamin

    2016-01-01

    This article asks us to consider what the process of healing and composition pedagogy have to learn from each other. More specifically, it identifies how the therapeutic potential of writing, which has been largely neglected in the academy in recent years, can influence the ways we teach transferable writing skills. The article considers how…

  4. Assessment of narrative writing by Persian-speaking students with hearing impairments.

    Science.gov (United States)

    Zamani, P; Soleymani, Z; Mousavi, S M; Akbari, N

    2018-02-16

    Previous studies have highlighted that narrative skill is critical to the development of literacy skills by children. Children with cochlear implants (CI) and hearing aids (HA) may have problems in narrative development compared to peers with healthy hearing (HH). There are no exact data about the narrative writing ability of Persian-speaking students who are hearing-impaired. This study was undertaken to compare the microstructure and macrostructure scores for narrative writing of Persian-speaking students who are hearing-impaired and peers with HH. This was a cross-sectional descriptive-analytical study. The subjects were recruited from elementary schools in the city of Tehran. A total of 144 elementary school students participated. The written narratives were elicited using a wordless pictorial storybook. Three-way ANOVA with post hoc adjusted Bonferroni tests was applied to determine the main effects and interactions of the variables on the microstructure and macrostructure components of narrative writing. No significant differences were observed in the macrostructure components of narrative writing between hearing-impaired and HH students. Analysis of the factors showed that the 4th grade HH students scored significantly highest, and the 3rd grade HA students significantly lowest, on the microstructure components of narrative writing. The findings revealed that hearing-impaired students, similarly to their HH peers, can transmit the main idea (macrostructure) of narrative writing, but show critical difficulties when using complete grammatical elements (microstructures) to form sentences to convey that idea in the narrative. © 2018 John Wiley & Sons Ltd.
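
    The analysis described above (a three-way ANOVA with Bonferroni-adjusted post hoc comparisons) can be sketched in a few lines of Python. This is only an illustration on synthetic data; the column names (hearing, grade, gender, score) and effect sizes are assumptions, not the study's data.

```python
# Minimal sketch: three-way ANOVA plus Bonferroni-adjusted pairwise tests.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multitest import multipletests
from scipy import stats

rng = np.random.default_rng(0)
n = 144
df = pd.DataFrame({
    "hearing": rng.choice(["HH", "CI", "HA"], n),
    "grade": rng.choice(["3rd", "4th", "5th"], n),
    "gender": rng.choice(["F", "M"], n),
})
# Synthetic microstructure score with a small hearing-group effect.
effect = df["hearing"].map({"HH": 2.0, "CI": 0.5, "HA": 0.0})
df["score"] = 10 + effect + rng.normal(0, 2, n)

# Three-way ANOVA: main effects and all interactions.
model = smf.ols("score ~ C(hearing) * C(grade) * C(gender)", data=df).fit()
print(anova_lm(model, typ=2))

# Bonferroni-adjusted pairwise t-tests between hearing groups.
groups = ["HH", "CI", "HA"]
pairs = [(a, b) for i, a in enumerate(groups) for b in groups[i + 1:]]
pvals = [stats.ttest_ind(df.loc[df.hearing == a, "score"],
                         df.loc[df.hearing == b, "score"]).pvalue
         for a, b in pairs]
reject, p_adj, _, _ = multipletests(pvals, method="bonferroni")
for (a, b), p, r in zip(pairs, p_adj, reject):
    print(f"{a} vs {b}: adjusted p = {p:.3f}, significant = {r}")
```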

  5. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Full Text Available Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical applications, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the sufficient training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are available.
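
    A hedged sketch of the boosting idea the abstract builds on: discrete AdaBoost reweights training samples so that each new weak learner focuses on previously misclassified cases. Decision stumps stand in for the RVM base learners purely for illustration; the Spark and RVM-specific machinery of the paper is not reproduced.

```python
# Discrete AdaBoost with decision stumps as weak learners (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
y_pm = np.where(y == 1, 1, -1)             # labels in {-1, +1}

n_rounds = 25
w = np.full(len(y), 1.0 / len(y))          # uniform sample weights
learners, alphas = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)
    stump.fit(X, y, sample_weight=w)
    pred = np.where(stump.predict(X) == 1, 1, -1)
    err = np.sum(w * (pred != y_pm)) / np.sum(w)
    err = np.clip(err, 1e-10, 1 - 1e-10)   # avoid division by zero / log(0)
    alpha = 0.5 * np.log((1 - err) / err)  # learner weight
    w *= np.exp(-alpha * y_pm * pred)      # upweight misclassified samples
    w /= w.sum()
    learners.append(stump)
    alphas.append(alpha)

# Weighted vote of all weak learners.
agg = sum(a * np.where(m.predict(X) == 1, 1, -1) for a, m in zip(alphas, learners))
print("training accuracy:", np.mean(np.sign(agg) == y_pm))
```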

  6. ENHANCING WRITING SKILL THROUGH WRITING PROCESS APPROACH

    Directory of Open Access Journals (Sweden)

    M. Zaini Miftah

    2015-03-01

    Full Text Available The study is aimed at developing the implementation of the Writing Process Approach (WPA) to enhance students' skill in writing essays. The study employed Classroom Action Research. The subjects of the study were 15 university students enrolled in the writing class. The data were gained from writing tasks, observation and field notes. The findings show that the implementation of WPA with the proper model procedures developed can enhance the students' skill in writing essays. Before the strategy was implemented, the percentage of students achieving a score greater than or equal to C (56-70) was 40.00% (6 students of the class). After the strategy was implemented in Cycle I, it rose to 60.00% (9 students of the class), but this result did not meet the criteria of success set up in the study. In Cycle II it increased further to 86.67% (13 students of the class). Thus, the enhancement of the students' skill in writing essays can be reached, but it should follow the proper model procedures of the implementation of WPA developed. Keywords: writing process approach, writing skill, essay writing

  7. ENHANCING WRITING SKILL THROUGH WRITING PROCESS APPROACH

    OpenAIRE

    M. Zaini Miftah

    2015-01-01

    The study is aimed at developing the implementation of Writing Process Approach (WPA) to enhance the students’ skill in writing essay. The study employed Classroom Action Research. The subjects of the study were 15 university students enrolled in the writing class. The data were gained from writing task, observation and field notes. The findings show that the implementation of WPA with the proper model procedures developed can enhance the students’ skill in writing essay. Before the strategy ...

  8. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approach over traditional approaches using simulations and OMICS data analysis.
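
    The paper's Bayesian hierarchical estimator is not reproduced here; the sketch below only illustrates the underlying problem it addresses: with many variables and few samples, the sample covariance overfits, and a regularized estimator (Ledoit-Wolf shrinkage is used as a stand-in) reduces that variance.

```python
# Sample covariance vs. shrinkage estimate in a p >> n setting (illustrative only).
import numpy as np
from sklearn.covariance import LedoitWolf, empirical_covariance

rng = np.random.default_rng(1)
p, n = 200, 50                         # many variables, few samples (OMICS-like)
true_cov = np.diag(np.linspace(1.0, 3.0, p))
X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)

emp = empirical_covariance(X)
lw = LedoitWolf().fit(X)

def frob_error(est):
    # Relative Frobenius-norm error against the known ground-truth covariance.
    return np.linalg.norm(est - true_cov) / np.linalg.norm(true_cov)

print("relative error, sample covariance :", round(frob_error(emp), 3))
print("relative error, Ledoit-Wolf       :", round(frob_error(lw.covariance_), 3))
```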

  9. Large-scale renewable energy project barriers: Environmental impact assessment streamlining efforts in Japan and the EU

    International Nuclear Information System (INIS)

    Schumacher, Kim

    2017-01-01

    Environmental Impact Assessment (EIA) procedures have been identified as a major barrier to renewable energy (RE) development with regard to large-scale projects (LS-RE). However, EIA laws have also been neglected by many decision-makers, who have been underestimating their impact on RE development and the stifling potential they possess. As a consequence, apart from acknowledging the shortcomings of the systems currently in place, few governments currently have concrete plans to reform their EIA laws. By looking at recent EIA streamlining efforts in two industrialized regions that underwent major transformations in their energy sectors, this paper assesses how such reform efforts can help balance environmental protection and climate change mitigation with socio-economic challenges. The paper fills this gap by identifying the strengths and weaknesses of the Japanese EIA law and contrasting it with the recently revised EIA Directive of the European Union (EU). This enables the identification of the regulatory provisions that impact RE development the most and the determination of how structured EIA law reforms would affect domestic RE project development. The main focus lies on the evaluation of regulatory streamlining efforts in the Japanese and EU contexts through the application of a mixed-methods approach, consisting of in-depth literary and legal reviews, followed by a comparative analysis and a series of semi-structured interviews. Highlighting several legal inconsistencies, in combination with the views of EIA professionals, academics and law- and policymakers, allowed for a more comprehensive assessment of which streamlining elements of the reformed EU EIA Directive and the proposed Japanese EIA framework modifications could either promote or stifle further RE deployment. - Highlights: •Performs an in-depth review of EIA reforms in OECD territories •First paper to compare Japan and the European

  10. Large-scale diversity of slope fishes: pattern inconsistency between multiple diversity indices.

    Science.gov (United States)

    Gaertner, Jean-Claude; Maiorano, Porzia; Mérigot, Bastien; Colloca, Francesco; Politou, Chrissi-Yianna; Gil De Sola, Luis; Bertrand, Jacques A; Murenu, Matteo; Durbec, Jean-Pierre; Kallianiotis, Argyris; Mannini, Alessandro

    2013-01-01

    Large-scale studies focused on the diversity of continental slope ecosystems are still rare, usually restricted to a limited number of diversity indices and mainly based on the empirical comparison of heterogeneous local data sets. In contrast, we investigate large-scale fish diversity on the basis of multiple diversity indices and using 1454 standardized trawl hauls collected throughout the upper and middle slope of the whole northern Mediterranean Sea (36°3'- 45°7' N; 5°3'W - 28°E). We have analyzed (1) the empirical relationships between a set of 11 diversity indices in order to assess their degree of complementarity/redundancy and (2) the consistency of spatial patterns exhibited by each of the complementary groups of indices. Regarding species richness, our results contrast with both the traditional view based on the hump-shaped theory for bathymetric patterns and the commonly admitted hypothesis of a large-scale decreasing trend correlated with a similar gradient of primary production in the Mediterranean Sea. More generally, we found that the components of slope fish diversity we analyzed did not always show a consistent pattern of distribution according either to depth or to spatial areas, suggesting that they are not driven by the same factors. These results, which stress the need to extend the number of indices traditionally considered in diversity monitoring networks, could provide a basis for rethinking not only the methodological approach used in monitoring systems, but also the definition of priority zones for protection. Finally, our results call into question the feasibility of properly investigating large-scale diversity patterns using a widespread approach in ecology based on the compilation of pre-existing heterogeneous and disparate data sets, in particular when focusing on indices that are very sensitive to sampling design standardization, such as species richness.
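
    As a rough illustration of the index-comparison idea (not the paper's 11 indices or its trawl data), the sketch below computes three common diversity indices per sample from synthetic species-abundance counts and checks how strongly they agree.

```python
# Compare several diversity indices computed on the same abundance data.
import numpy as np

rng = np.random.default_rng(2)
n_hauls, n_species = 60, 40
counts = rng.poisson(lam=rng.gamma(1.0, 2.0, size=(n_hauls, n_species)))

def richness(x):                       # number of species present
    return np.sum(x > 0)

def shannon(x):                        # Shannon entropy H'
    p = x[x > 0] / x.sum()
    return -np.sum(p * np.log(p))

def simpson(x):                        # Gini-Simpson index 1 - sum(p^2)
    p = x / x.sum()
    return 1.0 - np.sum(p ** 2)

idx = np.array([[richness(h), shannon(h), simpson(h)] for h in counts if h.sum() > 0])
print("correlation matrix (richness, Shannon, Simpson):")
print(np.round(np.corrcoef(idx.T), 2))
```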

  11. Large-scale seismic signal analysis with Hadoop

    Science.gov (United States)

    Addair, T. G.; Dodge, D. A.; Walter, W. R.; Ruppert, S. D.

    2014-05-01

    In seismology, waveform cross correlation has been used for years to produce high-precision hypocenter locations and for sensitive detectors. Because correlated seismograms generally are found only at small hypocenter separation distances, correlation detectors have historically been reserved for spotlight purposes. However, many regions have been found to produce large numbers of correlated seismograms, and there is growing interest in building next-generation pipelines that employ correlation as a core part of their operation. In an effort to better understand the distribution and behavior of correlated seismic events, we have cross correlated a global dataset consisting of over 300 million seismograms. This was done using a conventional distributed cluster, and required 42 days. In anticipation of processing much larger datasets, we have re-architected the system to run as a series of MapReduce jobs on a Hadoop cluster. In doing so we achieved a factor of 19 performance increase on a test dataset. We found that fundamental algorithmic transformations were required to achieve the maximum performance increase. Whereas in the original IO-bound implementation, we went to great lengths to minimize IO, in the Hadoop implementation where IO is cheap, we were able to greatly increase the parallelism of our algorithms by performing a tiered series of very fine-grained (highly parallelizable) transformations on the data. Each of these MapReduce jobs required reading and writing large amounts of data. But, because IO is very fast, and because the fine-grained computations could be handled extremely quickly by the mappers, the net was a large performance gain.
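
    The computational kernel that the Hadoop pipeline parallelizes is waveform cross correlation. A minimal NumPy sketch of normalized cross-correlation between two synthetic seismograms is shown below; the MapReduce tiling and IO layers described in the abstract are not reproduced.

```python
# Normalized cross-correlation of two waveforms (the core kernel, illustrative only).
import numpy as np

def normalized_xcorr(a, b):
    """Normalized cross-correlation; a peak near +1 indicates near-duplicate waveforms."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full")

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)
template = np.exp(-t) * np.sin(2 * np.pi * 3 * t)            # synthetic event waveform
shifted = np.roll(template, 120) + 0.1 * rng.normal(size=t.size)

cc = normalized_xcorr(shifted, template)
lag = cc.argmax() - (len(template) - 1)                       # sample offset of best match
print(f"max correlation {cc.max():.2f} at lag {lag} samples")
```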

  12. Learning Potential in Narrative Writing: Measuring the Psychometric Properties of an Assessment Tool.

    Science.gov (United States)

    Gurgel, Léia G; de Oliveira, Mônica M C; Joly, Maria C R A; Reppold, Caroline T

    2017-01-01

    Objective: The Computerized and Dynamic Writing Test (TIDE) is designed to examine the learning potential of adolescents in narrative writing. This was a validation study of the TIDE based on its internal structure. Learning potential is responsible for cognitive modifiability according to the Theory of Cognitive Structural Modifiability (CSM) developed by Feuerstein. Method: The study included 304 participants between 10 and 17 years of age from schools in the South of Brazil. The data collection involved student groups that were divided according to age and school grade. Each participant responded to the TIDE for an average of 50 min in the school's computer lab. The participants' selection criteria were: being regularly enrolled in the fifth to eighth grade and providing an informed consent form signed by a responsible caregiver. The exclusion criteria included: neurological problems, having been held back in school for two or more years, not cooperating, not completing the test for any reason and physical conditions impeding the assessment. Results: The Kendall test indicated agreement between two evaluators, who corrected the participants' first and second texts that resulted from applying the TIDE. The TIDE is divided into three modules. Factor analysis was applied to the first module (pre-test), which revealed a division into two factors, and to the second module (instructional module), which was divided into three factors. The reliability of the TIDE items was verified using Cronbach's Alpha, with coefficients >0.7. The analysis of the third module (post-test) was based on McNemar's Test and showed statistically significant results that demonstrated an evolution in the participants' learning potential. Conclusion: The TIDE proved to be valid and is considered a relevant tool for speech, language, hearing, psychological and educational assessment. The original nature of the tool presented here is highlighted, based on the dynamic assessment method, offering data on a

  13. Learning Potential in Narrative Writing: Measuring the Psychometric Properties of an Assessment Tool

    Directory of Open Access Journals (Sweden)

    Léia G. Gurgel

    2017-05-01

    Full Text Available Objective: The Computerized and Dynamic Writing Test (TIDE) is designed to examine the learning potential of adolescents in narrative writing. This was a validation study of the TIDE based on its internal structure. Learning potential is responsible for cognitive modifiability according to the Theory of Cognitive Structural Modifiability (CSM) developed by Feuerstein. Method: The study included 304 participants between 10 and 17 years of age from schools in the South of Brazil. The data collection involved student groups that were divided according to age and school grade. Each participant responded to the TIDE for an average of 50 min in the school's computer lab. The participants' selection criteria were: being regularly enrolled in the fifth to eighth grade and providing an informed consent form signed by a responsible caregiver. The exclusion criteria included: neurological problems, having been held back in school for two or more years, not cooperating, not completing the test for any reason and physical conditions impeding the assessment. Results: The Kendall test indicated agreement between two evaluators, who corrected the participants' first and second texts that resulted from applying the TIDE. The TIDE is divided into three modules. Factor analysis was applied to the first module (pre-test), which revealed a division into two factors, and to the second module (instructional module), which was divided into three factors. The reliability of the TIDE items was verified using Cronbach's Alpha, with coefficients >0.7. The analysis of the third module (post-test) was based on McNemar's Test and showed statistically significant results that demonstrated an evolution in the participants' learning potential. Conclusion: The TIDE proved to be valid and is considered a relevant tool for speech, language, hearing, psychological and educational assessment. The original nature of the tool presented here is highlighted, based on the dynamic assessment method, offering data
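
    The reliability figure quoted for the TIDE items (Cronbach's alpha > 0.7) can be illustrated with a short sketch. The data below are synthetic placeholders; only the formula is of interest.

```python
# Cronbach's alpha for a set of item scores (synthetic data, illustrative only).
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(size=(304, 1))                       # shared ability
scores = latent + 0.8 * rng.normal(size=(304, 10))       # 10 correlated items
print("Cronbach's alpha:", round(cronbach_alpha(scores), 2))
```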

  14. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  15. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  16. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities which lead to the formation of large eddies is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  17. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  18. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  19. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    Science.gov (United States)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  20. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Full Text Available Scaled-down models are widely used for experimental investigations of large structures due to the limitations in the capacities of testing facilities along with the expense of the experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e. geometry, loading and properties) between the model and a large structural element such as those present in the huge existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, which represent the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is to be made. The results are compared with those obtained from the full-scale structure numerical computations. The effect of the scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
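
    The similitude bookkeeping described can be illustrated as follows: once the three fundamental scale factors (mass, length, time) are chosen, the model-to-prototype ratio of any derived quantity follows from its dimensional formula M^a L^b T^c. The numerical values below are arbitrary examples, not the paper's choices.

```python
# Derived scale factors from the three fundamental similitude scale factors.
LAMBDA_M, LAMBDA_L, LAMBDA_T = 1.0e-3, 0.1, 0.316   # example choices (mass, length, time)

def scale_factor(a, b, c):
    """Scale factor of a quantity with dimensions M^a L^b T^c."""
    return (LAMBDA_M ** a) * (LAMBDA_L ** b) * (LAMBDA_T ** c)

derived = {
    "force    (M L T^-2)":    (1, 1, -2),
    "stress   (M L^-1 T^-2)": (1, -1, -2),
    "moment   (M L^2 T^-2)":  (1, 2, -2),
    "velocity (L T^-1)":      (0, 1, -1),
}
for name, dims in derived.items():
    print(f"{name:24s} lambda = {scale_factor(*dims):.3e}")
```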

  1. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of 10 nm or so and a wall thickness of a few graphenes. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on large scale by the simple reaction between glucose and Mg at 550 °C, which exhibit superior electrochemical performance to graphite. Highlights: • Hollow graphitic carbon nanospheres (HGCNSs) were prepared on large scale at 550 °C. • The preparation is simple, effective and eco-friendly. • The in situ yielded MgO nanocrystals promote the graphitization. • The HGCNSs exhibit superior electrochemical performance to graphite.

  2. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crucial in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, it is noted that large-scale impact events continue to affect the local geology of the planets. 92 references

  3. Recent Developments in Language Assessment and the Case of Four Large-Scale Tests of ESOL Ability

    Science.gov (United States)

    Stoynoff, Stephen

    2009-01-01

    This review article surveys recent developments and validation activities related to four large-scale tests of L2 English ability: the iBT TOEFL, the IELTS, the FCE, and the TOEIC. In addition to describing recent changes to these tests, the paper reports on validation activities that were conducted on the measures. The results of this research…

  4. Source-Based Tasks in Writing Independent and Integrated Essays

    Directory of Open Access Journals (Sweden)

    Javad Gholami

    2017-07-01

    Full Text Available Integrated writing tasks have gained considerable attention in ESL and EFL writing assessment and are frequently needed and used in academic settings and daily life. However, they are very rarely practiced and promoted in writing classes. This paper explored the effects of source-based writing practice on EFL learners’ composing abilities and investigated the probable differences between those tasks and independent writing ones in improving Iranian EFL learners’ essay writing abilities. To this end, a quasi-experimental design was implemented to gauge EFL learners’ writing improvements using a pretest-posttest layout. Twenty female learners taking a TOEFL iBT preparation course were randomly divided into an only-writing group with just independent writing instruction and essay practice, and a hybrid-writing-approach group receiving instruction and practice on independent writing plus source-based essay writing for ten sessions. Based on the findings, the participants with hybrid writing practice outperformed their counterparts in integrated essay tests. Their superior performance was not observed in the case of traditional independent writing tasks. The present study calls for incorporating more source-based writing tasks in writing courses.

  5. The writing approaches of secondary students.

    Science.gov (United States)

    Lavelle, Ellen; Smith, Jennifer; O'Ryan, Leslie

    2002-09-01

    Research with college students has supported a model of writing approaches that defines the relationship between a writer and writing task along a deep and surface process continuum (Biggs, 1988). Based on that model, Lavelle (1993) developed the Inventory of Processes in College Composition which reflects students' motives and strategies as related to writing outcomes. It is also important to define the approaches of secondary students to better understand writing processes at that level, and development in written composition. This study was designed to define the writing approaches of secondary students by factor analysing students' responses to items regarding writing beliefs and writing strategies, and to compare the secondary approaches to those of college students. A related goal was to explore the relationships of the secondary writing approaches to perceived self-regulatory efficacy for writing (Zimmerman & Bandura, 1994), writing preferences, and writing outcomes. The initial, factor analytic phase involved 398 junior level high school students (11th grade) enrolled in a mandatory language arts class at each of three large Midwestern high schools (USA). Then, 49 junior level students enrolled in two language arts classes participated as subjects in the second phase. Classroom teachers administered the Inventory of Processes in College Composition (Lavelle, 1993), which contained 72 true-or-false items regarding writing beliefs and strategies, during regular class periods. Data were factor analysed and the structure compared to that of college students. In the second phase, the new inventory, Inventory of Processes in Secondary Composition, was administered in conjunction with the Perceived Self-Regulatory Efficacy for Writing Inventory (Zimmerman & Bandura, 1994), and a writing preferences survey. A writing sample and grade in Language Arts classes were obtained and served as outcome variables. The factor structure of secondary writing reflected three

  6. The Chinese version of the Myocardial Infarction Dimensional Assessment Scale (MIDAS): Mokken scaling

    Directory of Open Access Journals (Sweden)

    Watson Roger

    2012-01-01

    Full Text Available Abstract Background Hierarchical scales are very useful in clinical practice due to their ability to discriminate precisely between individuals, and the original English version of the Myocardial Infarction Dimensional Assessment Scale has been shown to contain a hierarchy of items. The purpose of this study was to analyse a Mandarin Chinese translation of the Myocardial Infarction Dimensional Assessment Scale for a hierarchy of items according to the criteria of Mokken scaling. Data from 180 Chinese participants who completed the Chinese translation of the Myocardial Infarction Dimensional Assessment Scale were analysed using the Mokken Scaling Procedure and the 'R' statistical programme, using the diagnostics available in these programmes. Correlation between the Mandarin Chinese items and a Chinese translation of the Short Form (36) Health Survey was also analysed. Findings Fifteen items from the Mandarin Chinese Myocardial Infarction Dimensional Assessment Scale were retained in a strong and reliable Mokken scale; invariant item ordering was not evident, and the Mokken-scaled items of the Chinese Myocardial Infarction Dimensional Assessment Scale correlated with the Short Form (36) Health Survey. Conclusions Items from the Mandarin Chinese Myocardial Infarction Dimensional Assessment Scale form a Mokken scale, and this offers further insight into how the items of the Myocardial Infarction Dimensional Assessment Scale relate to the measurement of health-related quality of life in people with a myocardial infarction.
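
    At the heart of Mokken scaling is Loevinger's scalability coefficient H. The sketch below computes H for dichotomous items on synthetic data; it is only an illustration of the statistic, since the MIDAS items are polytomous and were analysed with the dedicated Mokken software, which this toy code does not replace.

```python
# Loevinger's scale coefficient H for dichotomous items (illustrative only).
import numpy as np

def loevinger_H(X):
    """X: (n_respondents, k_items) binary matrix. Returns the scale H coefficient."""
    X = np.asarray(X, dtype=float)
    p = X.mean(axis=0)                               # item popularities
    cov = np.cov(X, rowvar=False, ddof=0)
    num, den = 0.0, 0.0
    k = X.shape[1]
    for i in range(k):
        for j in range(i + 1, k):
            cov_max = min(p[i], p[j]) - p[i] * p[j]  # max covariance given the margins
            num += cov[i, j]
            den += cov_max
    return num / den

rng = np.random.default_rng(5)
theta = rng.normal(size=500)                          # latent trait
difficulty = np.linspace(-1.5, 1.5, 8)
X = (theta[:, None] + rng.normal(0, 1, (500, 8)) > difficulty).astype(int)
print("scale H:", round(loevinger_H(X), 2))           # Mokken convention: H >= 0.5 is a strong scale
```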

  7. Laser deposition and direct-writing of thermoelectric misfit cobaltite thin films

    Science.gov (United States)

    Chen, Jikun; Palla-Papavlu, Alexandra; Li, Yulong; Chen, Lidong; Shi, Xun; Döbeli, Max; Stender, Dieter; Populoh, Sascha; Xie, Wenjie; Weidenkaff, Anke; Schneider, Christof W.; Wokaun, Alexander; Lippert, Thomas

    2014-06-01

    A two-step process combining pulsed laser deposition of calcium cobaltite thin films and a subsequent laser-induced forward transfer of micro-pixels is demonstrated as a direct-writing approach to micro-scale thin-film structures for potential applications in thermoelectric micro-devices. To achieve the desired thermoelectric properties of the cobaltite thin film, the laser-induced plasma properties have been characterized utilizing plasma mass spectrometry, establishing a direct correlation with the corresponding film composition and structure. The introduction of a platinum sacrificial layer when growing the oxide thin film enables a damage-free laser transfer of calcium cobaltite, thereby preserving the film composition and crystallinity as well as the shape integrity of the as-transferred pixels. The demonstrated direct-writing approach simplifies the fabrication of micro-devices and provides a large degree of flexibility in designing and fabricating fully functional thermoelectric micro-devices.

  8. Scaling earthquake ground motions for performance-based assessment of buildings

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single-degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
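
    Method (3) above, first-mode-period scaling, amounts to a one-line computation once response spectra are available: scale each record so that its spectral acceleration at the first-mode period matches the target. The sketch below uses a toy spectrum; real applications would use computed 5%-damped response spectra.

```python
# First-mode-period scaling of a ground motion to a target spectral acceleration.
import numpy as np

def first_mode_scale_factor(periods, sa_record, T1, sa_target):
    """Scale factor = target Sa(T1) / record Sa(T1), by interpolation over the spectrum."""
    sa_at_T1 = np.interp(T1, periods, sa_record)
    return sa_target / sa_at_T1

periods = np.linspace(0.05, 4.0, 200)
sa_record = 0.8 * np.exp(-periods) + 0.05          # toy record spectrum (g)
T1, sa_target = 1.0, 0.45                          # building period (s) and target Sa (g)

k = first_mode_scale_factor(periods, sa_record, T1, sa_target)
print(f"scale factor: {k:.2f}")                    # multiply the ground-motion record by k
```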

  9. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For evacuation of casualties, the use of a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  10. Using gamification to develop academic writing skills in dental undergraduate students.

    Science.gov (United States)

    El Tantawi, Maha; Sadaf, Shazia; AlHumaid, Jehan

    2018-02-01

    To assess the satisfaction of first-year dental students with gamification and its effect on perceived and actual improvement of academic writing. Two first-year classes of dental undergraduate students were recruited for the study which extended over 4 months and ended in January 2015. A pre-intervention assessment of students' academic writing skills was performed using criteria to evaluate writing. The same criteria were used to evaluate the final writing assignment after the intervention. Students' satisfaction with game aspects was assessed. The per cent change in writing score was regressed on scores of satisfaction with game aspects controlling for gender. Perceived improvement in writing was also assessed. Data from 87 (94.6%) students were available for analysis. Students' overall satisfaction with the gamified experience was modest [mean (SD) = 5.9 (2.1)] and so was their overall perception of improvement in writing [mean (SD) = 6.0 (2.2)]. The per cent score of the first assignment was 35.6 which improved to 80 in the last assignment. Satisfaction with playing the game was significantly associated with higher percentage of improvement in actual writing skills [regression coefficient (95% confidence interval) = 21.1 (1.9, 40.2)]. Using gamification in an obligatory course for first-year dental students was associated with an improvement in academic writing skills although students' satisfaction with game aspects was modest and their willingness to use gamification in future courses was minimal. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
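
    The reported regression (percent improvement in writing score regressed on satisfaction with playing the game, controlling for gender) can be sketched as an ordinary least-squares model. The data and variable names below are synthetic placeholders, not the study's data.

```python
# OLS regression of percent improvement on satisfaction, controlling for gender.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 87
df = pd.DataFrame({
    "satisfaction": rng.uniform(1, 10, n),           # satisfaction with playing the game
    "gender": rng.choice(["F", "M"], n),
})
df["pct_improvement"] = 40 + 2.0 * df["satisfaction"] + rng.normal(0, 25, n)

model = smf.ols("pct_improvement ~ satisfaction + C(gender)", data=df).fit()
print(model.params)
print(model.conf_int())                              # 95% confidence intervals
```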

  11. Technical Design Report for large-scale neutrino detectors prototyping and phased performance assessment in view of a long-baseline oscillation experiment

    CERN Document Server

    De Bonis, I.; Duchesneau, D.; Pessard, H.; Bordoni, S.; Ieva, M.; Lux, T.; Sanchez, F.; Jipa, A.; Lazanu, I.; Calin, M.; Esanu, T.; Ristea, O.; Ristea, C.; Nita, L.; Efthymiopoulos, I.; Nessi, M.; Asfandiyarov, R.; Blondel, A.; Bravar, A.; Cadoux, F.; Haesler, A.; Karadzhov, Y.; Korzenev, A.; Martin, C.; Noah, E.; Ravonel, M.; Rayner, M.; Scantamburlo, E.; Bayes, R.; Soler, F.J.P.; Nuijten, G.A.; Loo, K.; Maalampi, J.; Slupecki, M.; Trzaska, W.H.; Campanelli, M.; Blebea-Apostu, A.M.; Chesneanu, D.; Gomoiu, M.C; Mitrica, B.; Margineanu, R.M.; Stanca, D.L.; Colino, N.; Gil-Botella, I.; Novella, P.; Palomares, C.; Santorelli, R.; Verdugo, A.; Karpikov, I.; Khotjantsev, A.; Kudenko, Y.; Mefodiev, A.; Mineev, O.; Ovsiannikova, T.; Yershov, N.; Enqvist, T.; Kuusiniemi, P.; De La Taille, C.; Dulucq, F.; Martin-Chassard, G.; Andrieu, B.; Dumarchez, J.; Giganti, C.; Levy, J.-M.; Popov, B.; Robert, A.; Agostino, L.; Buizza-Avanzini, M.; Dawson, J.; Franco, D.; Gorodetzky, P.; Kryn, D.; Patzak, T.; Tonazzo, A.; Vannucci, F.; Bésida, O.; Bolognesi, S.; Delbart, A.; Emery, S.; Galymov, V.; Mazzucato, E.; Vasseur, G.; Zito, M.; Bogomilov, M.; Tsenov, R.; Vankova-Kirilova, G.; Friend, M.; Hasegawa, T.; Nakadaira, T.; Sakashita, K.; Zambelli, L.; Autiero, D.; Caiulo, D.; Chaussard, L.; Déclais, Y.; Franco, D.; Marteau, J.; Pennacchio, E.; Bay, F.; Cantini, C.; Crivelli, P.; Epprecht, L.; Gendotti, A.; Di Luise, S.; Horikawa, S.; Murphy, S.; Nikolics, K.; Periale, L.; Regenfus, C.; Rubbia, A.; Sgalaberna, D.; Viant, T.; Wu, S.; Sergiampietri, F.; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2014-01-01

    In June 2012, an Expression of Interest for a long-baseline experiment (LBNO, CERN-SPSC-EOI-007) was submitted to the CERN SPSC and is presently under review. LBNO considers three types of neutrino detector technologies: a double-phase liquid argon (LAr) TPC and a magnetised iron detector as far detectors. For the near detector, a high-pressure gas TPC embedded in a calorimeter and a magnet is the baseline design. A mandatory milestone in view of any future long-baseline experiment is a concrete prototyping effort towards the envisioned large-scale detectors, and an accompanying campaign of measurements aimed at assessing the systematic errors that will affect their intended physics programme. Following encouraging feedback from the 108th SPSC on the technology choices, we have defined as a priority the construction and operation of a $6\times 6\times 6$ m$^3$ (active volume) double-phase liquid argon (DLAr) demonstrator, and a parallel development of the technologies necessary for large magnetised MIN...

  12. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large-scale structures is considered within a model with a string on a toroidal space-time. Firstly, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large-scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large-scale distribution and with the theory of a Cantorian space-time.

  13. We learn to write by reading, but writing can make you smarter

    Directory of Open Access Journals (Sweden)

    Stephen Krashen

    2008-04-01

    Full Text Available My goal in this paper is to make two points: Writing style does not come from writing or from direct instruction, but from reading. Actual writing can help us solve problems and can make us smarter. Writing Style Comes from Reading: A substantial amount of research strongly suggests that we learn to write by reading. To be more precise, we acquire writing style, the special language of writing, by reading. Hypothesizing that writing style comes from reading, not from writing or instruction, is consistent with what is known about language acquisition: Most of language acquisition takes place subconsciously, not through deliberate study, and it is a result of input (comprehension), not output (production) (Krashen, 1982).

  14. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    Science.gov (United States)

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  15. Engaging Sources through Reading-Writing Connections across the Disciplines

    Science.gov (United States)

    Carillo, Ellen C.

    2016-01-01

    This essay argues that what might otherwise be considered "plagiarism" in student writing is a symptom of the difficulties students encounter in their reading and writing, moments in which students' inabilities to critically assess, read, and respond to sources through the act of writing come to the surface. Expanding the context within…

  16. Strategic Planning Tools for Large-Scale Technology-Based Assessments

    Science.gov (United States)

    Koomen, Marten; Zoanetti, Nathan

    2018-01-01

    Education systems are increasingly being called upon to implement new technology-based assessment systems that generate efficiencies, better meet changing stakeholder expectations, or fulfil new assessment purposes. These assessment systems require coordinated organisational effort to implement and can be expensive in time, skill and other…

  17. Solution approach for a large scale personnel transport system for a large company in Latin America

    Energy Technology Data Exchange (ETDEWEB)

    Garzón-Garnica, Eduardo-Arturo; Caballero-Morales, Santiago-Omar; Martínez-Flores, José-Luis

    2017-07-01

    The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico where many routes and vehicles are currently used to service 525 points. The routing system proposed can be applied to many cities in the Latin-American region. Design/methodology/approach: This system was modelled as a VRP, considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on different-sized sets of service points. As the size of the instances was increased, the performance of the heuristic method was assessed in comparison with the results of an exact algorithm, and the results remained very close. When the size of the instance was full-scale and the exact algorithm took too much time to solve the problem, the heuristic algorithm provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between the two solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban distribution of the cities in Latin America is unique compared to other regions of the world. The general layout of large cities in this region includes a small town center, usually antique, and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. The use of a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.

  18. Solution approach for a large scale personnel transport system for a large company in Latin America

    International Nuclear Information System (INIS)

    Garzón-Garnica, Eduardo-Arturo; Caballero-Morales, Santiago-Omar; Martínez-Flores, José-Luis

    2017-01-01

    The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico where many routes and vehicles are currently used to service 525 points. The routing system proposed can be applied to many cities in the Latin-American region. Design/methodology/approach: This system was modelled as a VRP, considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on different-sized sets of service points. As the size of the instances was increased, the performance of the heuristic method was assessed in comparison with the results of an exact algorithm, and the results remained very close. When the size of the instance was full-scale and the exact algorithm took too much time to solve the problem, the heuristic algorithm provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between the two solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban distribution of the cities in Latin America is unique compared to other regions of the world. The general layout of large cities in this region includes a small town center, usually antique, and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. The use of a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.

  19. Solution approach for a large scale personnel transport system for a large company in Latin America

    Directory of Open Access Journals (Sweden)

    Eduardo-Arturo Garzón-Garnica

    2017-10-01

    Full Text Available Purpose: The present paper focuses on the modelling and solution of a large-scale personnel transportation system in Mexico where many routes and vehicles are currently used to service 525 points. The routing system proposed can be applied to many cities in the Latin-American region. Design/methodology/approach: This system was modelled as a VRP, considering the use of real-world transit times and the fact that routes start at the farthest point from the destination center. Experiments were performed on different-sized sets of service points. As the size of the instances was increased, the performance of the heuristic method was assessed in comparison with the results of an exact algorithm, and the results remained very close. When the size of the instance was full-scale and the exact algorithm took too much time to solve the problem, the heuristic algorithm provided a feasible solution. Supported by the validation with smaller-scale instances, where the difference between the two solutions was close to 6%, the full-scale solution obtained with the heuristic algorithm was considered to be within that same range. Findings: The proposed modelling and solving method provided a solution that would produce significant savings in the daily operation of the routes. Originality/value: The urban distribution of the cities in Latin America is unique compared to other regions of the world. The general layout of large cities in this region includes a small town center, usually antique, and a somewhat disordered outer region. The lack of vehicle-centered urban planning poses distinct challenges for vehicle routing problems in the region. The use of a heuristic VRP combined with the results of an exact VRP allowed an improved routing plan specific to the requirements of the region to be obtained.
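
    A hedged sketch of a simple constructive VRP heuristic (nearest neighbour with a per-route capacity limit) is shown below to illustrate the kind of heuristic routing the abstract refers to; the paper's own heuristic, exact algorithm and real transit-time data are not reproduced.

```python
# Nearest-neighbour route construction with a capacity limit (illustrative only).
import numpy as np

rng = np.random.default_rng(7)
n_points, capacity = 30, 8                 # service points and per-route capacity
points = rng.uniform(0, 100, size=(n_points, 2))
depot = np.array([50.0, 50.0])

def dist(a, b):
    return float(np.linalg.norm(a - b))

unvisited = set(range(n_points))
routes = []
while unvisited:
    route, load, pos = [], 0, depot
    while unvisited and load < capacity:
        nxt = min(unvisited, key=lambda i: dist(pos, points[i]))  # nearest remaining point
        route.append(nxt)
        unvisited.remove(nxt)
        load += 1
        pos = points[nxt]
    routes.append(route)

total = sum(dist(depot, points[r[0]]) + dist(points[r[-1]], depot) +
            sum(dist(points[r[i]], points[r[i + 1]]) for i in range(len(r) - 1))
            for r in routes)
print(f"{len(routes)} routes, total distance {total:.1f}")
```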

  20. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165 Ondrejov, The Czech Republic; e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria; e-mail: schroll@solobskh.ac.at.

  1. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
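
    The abstract does not spell out the computational procedure. Purely as a hedged illustration of what a sensitivity derivative means in this setting, the sketch below differentiates the static displacements of a tiny two-degree-of-freedom spring assembly (standing in for a framed structure) with respect to a stiffness parameter using central finite differences; the model and all numbers are invented.

```python
"""Illustrative only: finite-difference sensitivity of static displacements
u(p) solving K(p) u = f with respect to a stiffness parameter p."""
import numpy as np

def stiffness(p):
    # Two springs in series: spring 1 has stiffness p, spring 2 is fixed at 2.0.
    k1, k2 = p, 2.0
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]])

def displacements(p, f):
    return np.linalg.solve(stiffness(p), f)

def sensitivity(p, f, h=1e-6):
    # Central-difference approximation of du/dp.
    return (displacements(p + h, f) - displacements(p - h, f)) / (2 * h)

if __name__ == "__main__":
    f = np.array([0.0, 1.0])      # unit load on the free DOF
    p = 3.0
    print("u     =", displacements(p, f))
    print("du/dp =", sensitivity(p, f))
```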

  2. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large-scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1,125,000 elements in 2D and 128,000 elements in 3D on a shared-memory computer consisting of Sun UltraSparc IV CPUs.

  3. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  4. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used, yielding an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography and PASS ≥2 had a median NIHSS score of 17 (interquartile range=6), as opposed to a median NIHSS score of 6 (interquartile range=5) for PASS <2. The PASS scale matched the performance of other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
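
    As a hedged illustration of the scoring rule summarized above (not code from the study), the sketch below counts how many of the three NIHSS-derived items are abnormal and applies the ≥2 cut point; the convention that any nonzero item score counts as abnormal is an assumption made here for illustration.

```python
"""Illustrative PASS scoring: three NIHSS-derived items are each scored as
normal (0) or abnormal (1), and a total of >= 2 flags a suspected ELVO."""

def pass_score(loc_questions: int, gaze: int, arm_weakness: int) -> int:
    """Each argument is the raw NIHSS item score; any nonzero value counts as abnormal."""
    items = (loc_questions, gaze, arm_weakness)
    return sum(1 for item in items if item > 0)

def suspect_elvo(score: int, cut_point: int = 2) -> bool:
    return score >= cut_point

if __name__ == "__main__":
    score = pass_score(loc_questions=1, gaze=2, arm_weakness=0)
    print(score, suspect_elvo(score))   # 2 True -> triage toward a thrombectomy-capable center
```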

  5. Writing in turbulent air.

    Science.gov (United States)

    Bominaar, Jeroen; Pashtrapanska, Mira; Elenbaas, Thijs; Dam, Nico; ter Meulen, Hans; van de Water, Willem

    2008-04-01

    We describe a scheme of molecular tagging velocimetry in air in which nitric oxide (NO) molecules are created out of O2 and N2 molecules in the focus of a strong laser beam. The NO molecules are visualized a short time later by laser-induced fluorescence. The precision of molecular tagging velocimetry of gas flows is affected by the gradual blurring of the written patterns through molecular diffusion. In the case of turbulent flows, molecular diffusion poses a fundamental limit on the resolution of the smallest scales in the flow. We study the diffusion of written patterns in detail for our tagging scheme, which at short (μs) delay times is slightly anomalous due to local heating by absorption of laser radiation. We show that our experiments agree with a simple convection-diffusion model that allows us to estimate the temperature rise upon writing. Molecular tagging can be a highly nonlinear process, which affects the art of writing. We find that our tagging scheme is (only) quadratic in the intensity of the writing laser.
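
    The authors' convection-diffusion model is not given here. As a minimal, hedged illustration of the diffusion-limited blurring the abstract refers to, the sketch below spreads a Gaussian written line of initial width sigma0 as sigma(t) = sqrt(sigma0^2 + 2 D t); the diffusivity and widths are rough, assumed values, not measurements from the paper.

```python
"""Minimal sketch (not the authors' model): diffusive blurring of a written NO line.
A Gaussian profile of initial standard deviation sigma0 spreads as
sigma(t) = sqrt(sigma0^2 + 2*D*t)."""
import math

def blurred_width(sigma0_m: float, diffusivity_m2_s: float, delay_s: float) -> float:
    return math.sqrt(sigma0_m**2 + 2.0 * diffusivity_m2_s * delay_s)

if __name__ == "__main__":
    sigma0 = 50e-6   # 50 micrometre written line width (assumed)
    D = 2.0e-5       # approximate diffusivity of NO in air at room temperature, m^2/s (assumed)
    for delay in (1e-6, 10e-6, 100e-6):   # microsecond-scale delays, as in the tagging scheme
        print(f"delay {delay*1e6:6.1f} us -> width {blurred_width(sigma0, D, delay)*1e6:6.1f} um")
```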

  6. We learn to write by reading, but writing can make you smarter

    Directory of Open Access Journals (Sweden)

    Stephen Krashen

    2008-04-01

    Full Text Available My goal in this paper is to make two points: 1. Writing style does not come from writing or from direct instruction, but from reading. 2. Actual writing can help us solve problems and can make us smarter. Writing Style Comes from Reading: A substantial amount of research strongly suggests that we learn to write by reading. To be more precise, we acquire writing style, the special language of writing, by reading. Hypothesizing that writing style comes from reading, not from writing or instruction, is consistent with what is known about language acquisition: most of language acquisition takes place subconsciously, not through deliberate study, and it is a result of input (comprehension), not output (production) (Krashen, 1982). Thus, if you write a page a day, your writing style or your command of mechanics will not improve. On the other hand, other good things may result from your writing, as we shall see in the second section of this paper.

  7. Assessing Programming Costs of Explicit Memory Localization on a Large Scale Shared Memory Multiprocessor

    Directory of Open Access Journals (Sweden)

    Silvio Picano

    1992-01-01

    Full Text Available We present detailed experimental work involving a commercially available large-scale shared-memory multiple instruction stream-multiple data stream (MIMD) parallel computer having a software-controlled cache coherence mechanism. To make effective use of such an architecture, the programmer is responsible for designing the program's structure to match the underlying multiprocessor's capabilities. We describe the techniques used to exploit our multiprocessor (the BBN TC2000) on a network simulation program, showing the resulting performance gains and the associated programming costs. We show that an efficient implementation relies heavily on the user's ability to explicitly manage the memory system.

  8. Large scale waste combustion projects. A study of financial structures and sensitivities

    International Nuclear Information System (INIS)

    Brandler, A.

    1993-01-01

    The principal objective of the study was to determine the key contractual and financial aspects of large scale energy-from-waste projects, and to provide the necessary background information on financing to appreciate the approach lenders take when they consider financing waste combustion projects. An integral part of the study has been the preparation of a detailed financial model, incorporating all major financing parameters, to assess the economic and financial viability of typical waste combustion projects. (author)

  9. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue in large-scale flood simulations for real-time response in disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation required. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
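
    The paper's 2D unstructured, GPU-parallel scheme is not reproduced here. The hedged sketch below only illustrates the structure of a Godunov-type finite-volume update, reduced to one dimension on a flat bed with a Rusanov (local Lax-Friedrichs) flux and a dam-break initial condition; all parameters are illustrative.

```python
"""Hedged 1D stand-in for a Godunov-type shallow-water finite-volume update."""
import numpy as np

g = 9.81

def physical_flux(h, hu):
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def rusanov_step(h, hu, dx, dt):
    # Interface fluxes between cells i and i+1 (local Lax-Friedrichs / Rusanov).
    fL = np.array([physical_flux(h[i], hu[i]) for i in range(len(h) - 1)])
    fR = np.array([physical_flux(h[i + 1], hu[i + 1]) for i in range(len(h) - 1)])
    uL, uR = hu[:-1] / h[:-1], hu[1:] / h[1:]
    smax = np.maximum(np.abs(uL) + np.sqrt(g * h[:-1]),
                      np.abs(uR) + np.sqrt(g * h[1:]))
    qL = np.stack([h[:-1], hu[:-1]], axis=1)
    qR = np.stack([h[1:], hu[1:]], axis=1)
    flux = 0.5 * (fL + fR) - 0.5 * smax[:, None] * (qR - qL)
    # Update interior cells; boundary cells are held fixed (crude boundary treatment).
    h_new, hu_new = h.copy(), hu.copy()
    h_new[1:-1]  -= dt / dx * (flux[1:, 0] - flux[:-1, 0])
    hu_new[1:-1] -= dt / dx * (flux[1:, 1] - flux[:-1, 1])
    return h_new, hu_new

if __name__ == "__main__":
    n, dx = 200, 1.0
    h = np.where(np.arange(n) < n // 2, 2.0, 1.0)   # dam-break initial condition
    hu = np.zeros(n)
    t, t_end = 0.0, 10.0
    while t < t_end:
        dt = min(0.4 * dx / np.max(np.abs(hu / h) + np.sqrt(g * h)), t_end - t)  # CFL condition
        h, hu = rusanov_step(h, hu, dx, dt)
        t += dt
    print("max depth:", h.max(), "min depth:", h.min())
```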

  10. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  11. Methodology for a GIS-based damage assessment for researchers following large scale disasters

    Science.gov (United States)

    Crawford, Patrick Shane

    research field. Along with visually mapping the data, geometric calculations can be conducted on the data to give the viewer more information about the damage. In Chapter 4, a tornado damage contour for Moore, Oklahoma, following the May 20, 2013 tornado is shown. This damage contour was created in GIS based on the Enhanced Fujita (EF) damage scale and gives the viewer an easily understood picture of the extent and distribution of the tornado damage. This thesis aims to lay foundational groundwork for the activities performed in the GIS-based damage assessment procedure, to present uses for the damage assessment, and to describe research on how to use the data collected from these assessments. This will allow researchers to conduct a highly adaptable, rapid GIS-based damage assessment of their own.

  12. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. As data sets grow, much larger memory and longer runtimes are required for cluster identification. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long time, is required before running the affinity propagation algorithm, since the algorithm clusters data based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the construction of the similarity matrix and the affinity propagation algorithm. A shared-memory architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm because of its large memory size and great computing capacity. An appropriate data partitioning and reduction scheme is designed to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
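
    As a hedged, serial illustration of the two stages described above (not the paper's parallel MPI or shared-memory implementation), the sketch below builds a pairwise similarity matrix and then runs scikit-learn's affinity propagation on it; the toy "gene" data are simulated.

```python
"""Serial sketch of: (1) build the similarity matrix, (2) run affinity propagation."""
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.metrics.pairwise import euclidean_distances

rng = np.random.default_rng(0)
# Toy stand-in for expression profiles: 60 "genes" x 10 conditions, 3 underlying groups.
data = np.vstack([rng.normal(loc=c, scale=0.3, size=(20, 10)) for c in (0.0, 1.0, 2.0)])

# Stage 1: similarity matrix (negative squared Euclidean distance, a common choice).
similarity = -euclidean_distances(data, squared=True)

# Stage 2: affinity propagation on the precomputed similarities.
model = AffinityPropagation(affinity="precomputed", random_state=0)
labels = model.fit_predict(similarity)
print("clusters found:", len(model.cluster_centers_indices_))
```

    In the paper, stage 1 is the part run on the shared-memory machine and stage 2 on the distributed system; here both run serially, just to show the data flow.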

  13. What Are They Thinking? Automated Analysis of Student Writing about Acid–Base Chemistry in Introductory Biology

    Science.gov (United States)

    Haudek, Kevin C.; Prevost, Luanna B.; Moscarella, Rosa A.; Merrill, John; Urban-Lurain, Mark

    2012-01-01

    Students’ writing can provide better insight into their thinking than can multiple-choice questions. However, resource constraints often prevent faculty from using writing assessments in large undergraduate science courses. We investigated the use of computer software to analyze student writing and to uncover student ideas about chemistry in an introductory biology course. Students were asked to predict acid–base behavior of biological functional groups and to explain their answers. Student explanations were rated by two independent raters. Responses were also analyzed using SPSS Text Analysis for Surveys and a custom library of science-related terms and lexical categories relevant to the assessment item. These analyses revealed conceptual connections made by students, student difficulties explaining these topics, and the heterogeneity of student ideas. We validated the lexical analysis by correlating student interviews with the lexical analysis. We used discriminant analysis to create classification functions that identified seven key lexical categories that predict expert scoring (interrater reliability with experts = 0.899). This study suggests that computerized lexical analysis may be useful for automatically categorizing large numbers of student open-ended responses. Lexical analysis provides instructors unique insights into student thinking and a whole-class perspective that are difficult to obtain from multiple-choice questions or reading individual responses. PMID:22949425
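
    The study used SPSS Text Analysis for Surveys and a custom lexical library. As a hedged stand-in built with open-source tools, the sketch below counts hits from a small invented lexicon of science-related categories in each response and fits a linear discriminant analysis against placeholder expert ratings; the lexicon, responses, and ratings are illustrative, not the study's materials.

```python
"""Illustrative lexical-category counting plus discriminant analysis."""
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical lexical categories (the study used a much larger custom library).
LEXICON = {
    "protonation": {"proton", "protonated", "deprotonated", "h+"},
    "charge":      {"charge", "charged", "positive", "negative"},
    "ph":          {"ph", "acidic", "basic"},
}

def category_counts(response):
    words = [w.strip(".,").lower() for w in response.split()]
    return [sum(w in terms for w in words) for terms in LEXICON.values()]

responses = [
    "The amine group gains a proton and carries a positive charge at acidic pH.",
    "The carboxyl group is deprotonated, losing a proton, and carries a negative charge at high pH.",
    "It becomes charged when the pH changes.",
    "The pH makes the solution acidic.",
    "The group stays the same no matter the pH.",
    "Nothing happens to it.",
]
expert_scores = [2, 2, 1, 1, 0, 0]      # placeholder expert ratings

X = np.array([category_counts(r) for r in responses])
clf = LinearDiscriminantAnalysis().fit(X, expert_scores)
print(clf.predict(X))   # classification functions over the lexical categories
```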

  14. AUTHORIAL VOICE IN ISLAMIC COLLEGE ENGLISH DEPARTMENT STUDENTS’ ARGUMENTATIVE WRITING

    Directory of Open Access Journals (Sweden)

    Nur Afifi

    2014-11-01

    Full Text Available While considered elusive and abstract, authorial voice is paramount in English writing. Unfortunately, many Indonesian EFL learners find it highly challenging to show their voice in their writing. The importance of voice is even greater in argumentative writing, since this kind of writing requires an obvious stance from the writer. This study investigates the authorial voice students expressed in their argumentative writing. The purpose of the study is to gain a picture of students' writing ability, especially with regard to authorial voice, in order to map the road for guiding subsequent writing classes. The object of the study is the argumentative writing produced by English department students at one Indonesian State College of Islamic Studies in their Writing III course. Using Hyland's (2008) interactional model of voice, the data analysis shows that the authorial presence in the essays is at position 2 on a 0-4 scale, which means the reader senses a somewhat weak presence of authorial voice in the essays. This result confirms the findings of previous studies that EFL learners, especially those from an 'interdependent' cultural background, tend to find authorial voice difficult when writing English essays.

  15. Reflective Blogfolios in the Language Classroom: Impact on EFL Tertiary Students’ Argumentative Writing Skills and Ways of Knowing

    Directory of Open Access Journals (Sweden)

    Ammar Abdullah Mahmoud Ismial

    2016-10-01

    Full Text Available The emerging paradigm shift in educational contexts from walled classroom environments to virtual, hybrid, blended, and lately personal learning environments has brought about vast changes in foreign language classroom practices. Numerous calls for experimenting with new instructional treatments to enhance students' language performance in these new learning environments have been voiced by researchers and language educators in different settings. The current study investigated the impact of using reflective blogfolios in teaching argumentation to EFL tertiary students on their argumentative essay writing skills and ways of knowing. The study also investigated the relationship between students' ways of knowing and their argumentative writing capabilities. The participants were fifty-one EFL tertiary students in the Emirati context. Two assessment instruments were used: a ways-of-knowing scale and a rubric for assessing EFL students' argumentative writing skills. Results of the study indicated that using reflective blogfolios in the foreign language classroom brought about significant changes in EFL tertiary students' argumentative writing skills and their ways of knowing. Results also indicated that connected ways of knowing were better predictors of EFL tertiary students' argumentative writing performance than separate ways of knowing. Details of the instructional intervention, the assessment instruments, results of the study, implications for foreign language instruction in virtual learning environments, and suggestions for further research are discussed. Keywords: Reflective blogfolios, argumentative writing skills, ways of knowing

  16. Passionate Writing

    DEFF Research Database (Denmark)

    Borgström, Benedikte

    With care of writing as a method of inquiry, this paper engages in academic writing as responsible knowledge development drawing on emotion, thought, and reason. The aim of the paper is to better understand emancipatory knowledge development. Bodily experiences and responses shape academic writing, and there are possibilities for responsible academic writing in that iterative process. I propose that academic writing can be seen as offering possibilities of passionate as well as passive writing.

  17. Green Routing Fuel Saving Opportunity Assessment: A Case Study on California Large-Scale Real-World Travel Data

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Lei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Holden, Jacob [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gonder, Jeffrey D [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric W [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-07-31

    New technologies, such as connected and automated vehicles, have attracted more and more researchers working to improve the energy efficiency and environmental impact of current transportation systems. The green routing strategy instructs a vehicle to select the most fuel-efficient route before it departs, benefiting the current transportation system by identifying the greenest route and the fuel-saving opportunity it offers. This paper introduces an evaluation framework for estimating the benefits of green routing based on large-scale, real-world travel data. The framework can quantify fuel savings by estimating the fuel consumption of actual routes and comparing it to that of routes procured by navigation systems. A route-based fuel consumption estimation model, considering road traffic conditions, functional class, and road grade, is proposed and used in the framework. An experiment using a large-scale data set from the California Household Travel Survey global positioning system trajectory database indicates that 31% of actual routes have fuel-saving potential, with cumulative estimated fuel savings of 12%.
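
    NREL's route-based fuel consumption model is not reproduced here. The hedged sketch below uses an invented per-segment consumption function of traffic-adjusted speed, grade, and length to compare an "actual" route with an alternative, simply to illustrate the comparison the framework performs; every coefficient and route is made up.

```python
"""Toy green-routing comparison: estimate per-route fuel use from segment data."""
from dataclasses import dataclass

@dataclass
class Segment:
    length_km: float
    speed_kph: float      # prevailing (traffic-adjusted) speed
    grade_pct: float      # positive = uphill

def segment_fuel_l(seg: Segment) -> float:
    # Placeholder model: a speed-dependent base rate (L/km) plus a grade penalty.
    base = 0.05 + 0.4 / max(seg.speed_kph, 5.0)        # inefficient at very low speeds
    grade_penalty = max(seg.grade_pct, 0.0) * 0.004    # extra fuel per % uphill grade
    return (base + grade_penalty) * seg.length_km

def route_fuel_l(route):
    return sum(segment_fuel_l(s) for s in route)

actual = [Segment(3.0, 20, 0.0), Segment(5.0, 35, 2.0), Segment(4.0, 50, 0.0)]
alternative = [Segment(4.0, 60, 0.0), Segment(6.0, 70, 1.0)]

fa, fb = route_fuel_l(actual), route_fuel_l(alternative)
print(f"actual {fa:.2f} L, alternative {fb:.2f} L, saving {100 * (fa - fb) / fa:.1f}%")
```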

  18. Using tracking software for writing instruction

    Directory of Open Access Journals (Sweden)

    Sane M. Yagi

    2011-08-01

    Full Text Available Writing is a complex skill that is hard to teach. Although the written product is what is usually evaluated in language teaching, the process of giving thought to linguistic form is fascinating in its own right. For almost forty years, language teachers have found it more effective to help learners with the writing process than with the written product, since it is in the process that the sources of writing problems can be found. Despite the controversy that post-process approaches have raised about process writing, information technology has lately offered tools that can shed new light on how writing takes place. Software that records keyboard, mouse, and screen activities is capable of unraveling mysteries of the writing process. Technology has given teachers and learners the option of examining the writing process as it unfolds, enabling them to diagnose strategy as well as wording problems, and thus empowering teachers to guide learners individually in how to think about each of their trouble spots in the context of a specific product of writing. With these advances in information technology, metacognitive awareness and strategy training begin to acquire new dimensions of meaning. Technology lays open aspects of the writing process, offering unprecedented insight into creative text production as well. This paper attempts to explain how tracking software can influence writing instruction. It briefly examines the process and post-process approaches to assess their viability, explains the concept of tracking software, proposes the methodology needed for adopting this technology, and then discusses the pedagogical implications of these issues.

  19. Relating beliefs in writing skill malleability to writing performance: The mediating role of achievement goals and self-efficacy

    Directory of Open Access Journals (Sweden)

    Teresa Limpo

    2017-10-01

    Full Text Available It is well established that students’ beliefs in skill malleability influence their academic performance. Specifically, thinking of ability as an incremental (vs. fixed) trait is associated with better outcomes. Though this was shown across many domains, little research exists into these beliefs in the writing domain and into the mechanisms underlying their effects on writing performance. The aim of this study was twofold: to gather evidence on the validity and reliability of instruments to measure beliefs in skill malleability, achievement goals, and self-efficacy in writing; and to test a path-analytic model specifying beliefs in writing skill malleability to influence writing performance, via goals and self-efficacy. For that, 196 Portuguese students in Grades 7-8 filled in the instruments and wrote an opinion essay that was assessed for writing performance. Confirmatory factor analyses supported instruments’ validity and reliability. Path analysis revealed direct effects from beliefs in writing skill malleability to mastery goals (β = .45); from mastery goals to self-efficacy for conventions, ideation, and self-regulation (β = .27, .42, and .42, respectively); and from self-efficacy for self-regulation to writing performance (β = .16); along with indirect effects from beliefs in writing skill malleability to self-efficacy for self-regulation via mastery goals (β = .19), and from mastery goals to writing performance via self-efficacy for self-regulation (β = .07). Overall, students’ mastery goals and self-efficacy for self-regulation seem to be key factors underlying the link between beliefs in writing skill malleability and writing performance. These findings highlight the importance of attending to motivation-related components in the teaching of writing.
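
    As a hedged illustration of the kind of path-analytic chain reported above (not the authors' analysis), the sketch below estimates a three-step chain of standardized regression coefficients on simulated data and multiplies them to obtain an indirect effect; the variable names and effect sizes are only loosely modeled on the abstract.

```python
"""Bare-bones path model as a chain of standardized regressions on simulated data:
beliefs -> mastery goals -> self-efficacy (self-regulation) -> writing performance."""
import numpy as np

rng = np.random.default_rng(42)
n = 196
beliefs = rng.normal(size=n)
mastery = 0.45 * beliefs + rng.normal(scale=0.9, size=n)
self_eff = 0.42 * mastery + rng.normal(scale=0.9, size=n)
performance = 0.16 * self_eff + rng.normal(scale=1.0, size=n)

def std_beta(x, y):
    # Standardized regression coefficient for a single predictor equals Pearson r.
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

b1 = std_beta(beliefs, mastery)        # beliefs -> mastery goals
b2 = std_beta(mastery, self_eff)       # mastery goals -> self-efficacy for self-regulation
b3 = std_beta(self_eff, performance)   # self-efficacy -> writing performance
print(f"direct paths: {b1:.2f}, {b2:.2f}, {b3:.2f}")
print(f"indirect effect of beliefs on self-efficacy via mastery goals: {b1 * b2:.2f}")
```

    Multiplying the first two paths reproduces the logic behind the reported indirect effect (.45 × .42 ≈ .19), which is the point the sketch is meant to make concrete.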

  20. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationships between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
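
    As a hedged illustration of the idea (not the authors' algorithm), the sketch below bins a graph layout into grid cells and picks a drawing style per cell according to nodal density; the thresholds and style names are invented.

```python
"""Pick a drawing style per region of a layout based on how many nodes fall in it."""
from collections import defaultdict

def choose_styles(node_positions, cell_size=1.0, sparse_max=5, medium_max=20):
    """node_positions: {node_id: (x, y)} from some layout. Returns a style per grid cell."""
    cells = defaultdict(list)
    for node, (x, y) in node_positions.items():
        cells[(int(x // cell_size), int(y // cell_size))].append(node)

    styles = {}
    for cell, nodes in cells.items():
        if len(nodes) <= sparse_max:
            styles[cell] = "node-link"            # draw every node and edge
        elif len(nodes) <= medium_max:
            styles[cell] = "clustered-node-link"  # collapse subtrees into meta-nodes
        else:
            styles[cell] = "density-map"          # render only an aggregate density view
    return styles

if __name__ == "__main__":
    import random
    random.seed(1)
    layout = {f"n{i}": (random.uniform(0, 3), random.uniform(0, 3)) for i in range(200)}
    for cell, style in sorted(choose_styles(layout).items()):
        print(cell, style)
```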