WorldWideScience

Sample records for science data-management challenge

  1. The Office of Science Data-Management Challenge

    Energy Technology Data Exchange (ETDEWEB)

    Mount, Richard P.; /SLAC

    2005-10-10

    Science--like business, national security, and even everyday life--is becoming more and more data intensive. In some sciences the data-management challenge already exceeds the compute-power challenge in its needed resources. Leadership in applying computing to science will necessarily require both world-class computing and world-class data management. The Office of Science program needs a leadership-class capability in scientific data management. Currently two-thirds of Office of Science research and development in data management is left to the individual scientific programs. About $18M/year is spent by the programs on data-management research and development targeted at their most urgent needs. This is to be compared with the $9M/year spent on data management by DOE computer science. This highly mission-directed approach has been effective, but only in meeting the highest-priority needs of individual programs. A coherent, leadership-class program of data management is clearly warranted by the scale and nature of the Office of Science programs. More directly, much of the Office of Science portfolio is in desperate need of such a program; without it, data management could easily become the primary bottleneck to scientific progress within the next five years. When grouped into simulation-intensive science, experiment/observation-intensive science, and information-intensive science, the Office of Science programs show striking commonalities in their data-management needs. Not just research and development but also packaging and hardening, as well as maintenance and support, are required. Meeting these needs is a medium- to long-term effort requiring a well-planned program of evolving investment. We propose an Office of Science Data-Management Program at an initial scale of $32M/year of new funding. The program should be managed by a Director charged with creating and maintaining a forward-looking approach to multiscience data-management challenges. The program

  2. Data management challenges in analysis and synthesis in the ecosystem sciences.

    Science.gov (United States)

    Specht, A; Guru, S; Houghton, L; Keniger, L; Driver, P; Ritchie, E G; Lai, K; Treloar, A

    2015-11-15

    Open-data has created an unprecedented opportunity with new challenges for ecosystem scientists. Skills in data management are essential to acquire, manage, publish, access and re-use data. These skills span many disciplines and require trans-disciplinary collaboration. Science synthesis centres support analysis and synthesis through collaborative 'Working Groups' where domain specialists work together to synthesise existing information to provide insight into critical problems. The Australian Centre for Ecological Analysis and Synthesis (ACEAS) served a wide range of stakeholders, from scientists to policy-makers to managers. This paper investigates the level of sophistication in data management in the ecosystem science community through the lens of the ACEAS experience, and identifies the important factors required to enable us to benefit from this new data-world and produce innovative science. ACEAS promoted the analysis and synthesis of data to solve transdisciplinary questions, and promoted the publication of the synthesised data. To do so, it provided support in many of the key skillsets required. Analysis and synthesis in multi-disciplinary and multi-organisational teams, and publishing data were new for most. Data were difficult to discover and access, and to make ready for analysis, largely due to lack of metadata. Data use and publication were hampered by concerns about data ownership and a desire for data citation. A web portal was created to visualise geospatial datasets to maximise data interpretation. By the end of the experience there was a significant increase in appreciation of the importance of a Data Management Plan. It is extremely doubtful that the work would have occurred or data delivered without the support of the Synthesis centre, as few of the participants had the necessary networks or skills. It is argued that participation in the Centre provided an important learning opportunity, and has resulted in improved knowledge and understanding

  3. Research challenges for energy data management (panel)

    DEFF Research Database (Denmark)

    Pedersen, Torben Bach; Lehner, Wolfgang

    2013-01-01

    This panel paper aims at initiating discussion at the Second International Workshop on Energy Data Management (EnDM 2013) about the important research challenges within Energy Data Management. The authors are the panel organizers, extra panelists will be recruited before the workshop...

  4. Scientific data management challenges, technology and deployment

    CERN Document Server

    Rotem, Doron

    2010-01-01

    Dealing with the volume, complexity, and diversity of data currently being generated by scientific experiments and simulations often causes scientists to waste productive time. Scientific Data Management: Challenges, Technology, and Deployment describes cutting-edge technologies and solutions for managing and analyzing vast amounts of data, helping scientists focus on their scientific goals. The book begins with coverage of efficient storage systems, discussing how to write and read large volumes of data without slowing the simulation, analysis, or visualization processes. It then focuses on the efficient data movement and management of storage spaces and explores emerging database systems for scientific data. The book also addresses how to best organize data for analysis purposes, how to effectively conduct searches over large datasets, how to successfully automate multistep scientific process workflows, and how to automatically collect metadata and lineage information. This book provides a comprehensive u...

  5. Sedimentary Geology Context and Challenges for Cyberinfrastructure Data Management

    Science.gov (United States)

    Chan, M. A.; Budd, D. A.

    2014-12-01

    A cyberinfrastructure data management system for sedimentary geology is crucial to multiple facets of interdisciplinary Earth science research, as sedimentary systems form the deep-time framework for many geoscience communities. The breadth and depth of the sedimentary field spans research on the processes that form, shape and affect the Earth's sedimentary crust and distribute resources such as hydrocarbons, coal, and water. The sedimentary record is used by Earth scientists to explore questions such as the continental crust evolution, dynamics of Earth's past climates and oceans, evolution of the biosphere, and the human interface with Earth surface processes. Major challenges to a data management system for sedimentary geology are the volume and diversity of field, analytical, and experimental data, along with many types of physical objects. Objects include rock samples, biological specimens, cores, and photographs. Field data runs the gamut from discrete location and spatial orientation to vertical records of bed thickness, textures, color, sedimentary structures, and grain types. Ex situ information can include geochemistry, mineralogy, petrophysics, chronologic, and paleobiologic data. All data types cover multiple order-of-magnitude scales, often requiring correlation of the multiple scales with varying degrees of resolution. The stratigraphic framework needs dimensional context with locality, time, space, and depth relationships. A significant challenge is that physical objects represent discrete values at specific points, but measured stratigraphic sections are continuous. In many cases, field data is not easily quantified, and determining uncertainty can be difficult. Despite many possible hurdles, the sedimentary community is anxious to embrace geoinformatic resources that can provide better tools to integrate the many data types, create better search capabilities, and equip our communities to conduct high-impact science at unprecedented levels.

  6. Social Water Science Data: Dimensions, Data Management, and Visualization

    Science.gov (United States)

    Jones, A. S.; Horsburgh, J. S.; Flint, C.; Jackson-Smith, D.

    2016-12-01

    Water systems are increasingly conceptualized as coupled human-natural systems, with growing emphasis on representing the human element in hydrology. However, social science data and associated considerations may be unfamiliar and intimidating to many hydrologic researchers. Monitoring social aspects of water systems involves expanding the range of data types typically used in hydrology and appreciating nuances in datasets that are well known to social scientists, but less understood by hydrologists. We define social water science data as any information representing the human aspects of a water system. We present a scheme for classifying these data, highlight an array of data types, and illustrate data management considerations and challenges unique to social science data. This classification scheme was applied to datasets generated as part of iUTAH (innovative Urban Transitions and Arid region Hydro-sustainability), an interdisciplinary water research project based in Utah, USA that seeks to integrate and share social and biophysical water science data. As the project deployed cyberinfrastructure for baseline biophysical data, cyberinfrastructure for analogous social science data was necessary. As a particular case of social water science data, we focus in this presentation on social science survey data. These data are often interpreted through the lens of the original researcher and are typically presented to interested parties in static figures or reports. To provide more exploratory and dynamic communication of these data beyond the individual or team who collected the data, we developed a web-based, interactive viewer to visualize social science survey responses. This interface is applicable for examining survey results that show human motivations and actions related to environmental systems and as a useful tool for participatory decision-making. It also serves as an example of how new data sharing and visualization tools can be developed once the

  7. Data Provenance and Data Management in eScience

    CERN Document Server

    Bai, Quan; Giugni, Stephen; Williamson, Darrell; Taylor, John

    2013-01-01

    eScience allows scientific research to be carried out in highly distributed environments. The complex nature of the interactions in an eScience infrastructure, which often involves a range of instruments, data, models, applications, people and computational facilities, suggests there is a need for data provenance and data management (DPDM). The W3C Provenance Working Group defines the provenance of a resource as a “record that describes entities and processes involved in producing and delivering or otherwise influencing that resource”. It has been widely recognised that provenance is a critical issue to enable sharing, trust, authentication and reproducibility of eScience process.   Data Provenance and Data Management in eScience identifies the gaps between DPDM foundations and their practice within eScience domains including clinical trials, bioinformatics and radio astronomy. The book covers important aspects of fundamental research in DPDM including provenance representation and querying. It also expl...
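    As a concrete illustration of the W3C PROV model quoted above, the following plain-Python sketch (all identifiers hypothetical) arranges a provenance record into the entity/activity/agent structure and the relations between them; it is an editorial example loosely following the PROV-JSON layout, not code from the book.

    # A PROV-style provenance record: which dataset was generated,
    # by which process, using which input, and by whom.
    provenance_record = {
        "entity": {
            "ex:calibrated_spectrum": {"prov:type": "dataset"},
            "ex:raw_spectrum": {"prov:type": "dataset"},
        },
        "activity": {
            "ex:calibration_run_42": {"prov:startTime": "2013-01-01T10:00:00Z"},
        },
        "agent": {
            "ex:alice": {"prov:type": "prov:Person"},
        },
        "wasGeneratedBy": {
            "_:g1": {"prov:entity": "ex:calibrated_spectrum",
                     "prov:activity": "ex:calibration_run_42"},
        },
        "used": {
            "_:u1": {"prov:activity": "ex:calibration_run_42",
                     "prov:entity": "ex:raw_spectrum"},
        },
        "wasAssociatedWith": {
            "_:a1": {"prov:activity": "ex:calibration_run_42",
                     "prov:agent": "ex:alice"},
        },
    }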

  8. Data Management and Preservation Planning for Big Science

    Directory of Open Access Journals (Sweden)

    Juan Bicarregui

    2013-06-01

    ‘Big Science’ – that is, science involving large collaborations with dedicated facilities, large data volumes and multinational investments – is often seen as different when it comes to data management and preservation planning. Big Science handles its data differently from other disciplines and has data management problems that are qualitatively different from other disciplines. In part, these differences arise from the quantities of data involved, but possibly more importantly from the cultural, organisational and technical distinctiveness of these academic cultures. Consequently, the data management systems are typically and rationally bespoke, but this means that the planning for data management and preservation (DMP) must also be bespoke. These differences are such that ‘just read and implement the OAIS specification’ is reasonable Data Management and Preservation (DMP) advice, but this bald prescription can and should be usefully supported by a methodological ‘toolkit’, including overviews, case-studies and costing models to provide guidance on developing best practice in DMP policy and infrastructure for these projects, as well as considering OAIS validation, audit and cost modelling. In this paper, we build on previous work with the LIGO collaboration to consider the role of DMP planning within these big science scenarios, and discuss how to apply current best practice. We discuss the result of the MaRDI-Gross project (Managing Research Data Infrastructures – Big Science), which has been developing a toolkit to provide guidelines on the application of best practice in DMP planning within big science projects. This is targeted primarily at projects’ engineering managers, but it is also intended to help funders collaborate on DMP plans which satisfy the requirements imposed on them.

  9. Archive & Data Management Activities for ISRO Science Archives

    Science.gov (United States)

    Thakkar, Navita; Moorthi, Manthira; Gopala Krishna, Barla; Prashar, Ajay; Srinivasan, T. P.

    2012-07-01

    For other AO payloads, users can view the metadata, and the data is available through an FTP site. This same archival and dissemination strategy will be extended to the next moon mission, Chandrayaan-2. ASTROSAT is going to be the first multi-wavelength astronomical mission for which the data is archived at ISSDC. It consists of five astronomical payloads that would allow simultaneous multi-wavelength observations of astronomical objects from X-ray to ultraviolet (UV). It is planned to archive the data sets in FITS format. The ASTROSAT archive will reside in the Archive Layer at ISSDC. Browse access to the archive will be available through the ISDA (Indian Science Data Archive) web site. The browse interface will be IVOA compliant, with a search mechanism using VOTable. The data will be available to users only on a request basis via an FTP site after the lock-in period is over. It is planned that the Level-2 pipeline software and various modules for processing the data sets will also be available on the web site. This paper describes the archival procedure of Chandrayaan-1 and the archive plan for ASTROSAT, Chandrayaan-2 and other future ISRO missions, including a discussion of data management activities.
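    Since the record above notes that ASTROSAT data sets are planned to be archived in FITS, a minimal Python sketch of reading such a product's header (using the astropy library; the file name is hypothetical) may help illustrate what an archive of this kind would serve:

    from astropy.io import fits

    # Open an archived FITS product and inspect its observation metadata.
    with fits.open("astrosat_uvit_observation.fits") as hdul:  # hypothetical file
        hdul.info()                               # list the HDUs in the product
        header = hdul[0].header                   # primary header
        print(header.get("TELESCOP"), header.get("DATE-OBS"))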

  10. Data, Data Management, and the Ethos of Science

    Science.gov (United States)

    Duerr, R.; Barry, R.; Parsons, M. A.

    2006-12-01

    Since the beginnings of the scientific era, data - the record of the observations made to elucidate the inner workings of the universe - have been a fundamental component of the scientific method, a cornerstone of the edifice that is science. Historically it has been the norm for scientists to publish these data so that others may verify the claims made or extend the field further, for example by using the data as input to models. Entire journals owe their very existence to the need for mechanisms for making data available, for recording the observations of science for posterity. As such, data and the publication of data are fundamental to the integrity of science, to a scientist's ability to trust in the work of other scientists, and to upholding the trust that the public and policy makers place in science as an enterprise worthy of support. In the past, the data-related mechanisms for maintaining this trust were well understood: a scientist needed simply to record the observations they made as part of a journal article. With the advent of the digital era and the ever-increasing volumes of data, these old methods have become insufficient to the task. The focus of this talk is on the complex and changing ways that digital data and digital data management are impacting science and the way the external world perceives science. We will discuss many aspects of the issue - from the responsibilities of scientists in regard to making data available, to the elements of sound data management, to the need to explain events visible in the data (e.g., sea ice minima) to the public.

  11. Exploring Best Practices for Research Data Management in Earth Science through Collaborating with University Libraries

    Science.gov (United States)

    Wang, T.; Branch, B. D.

    2013-12-01

    Earth Science research data, its data management, informatics processing, and its data curation are valuable in allowing earth scientists to make new discoveries. But how to actively manage these research assets to ensure that they are safe, secure, accessible, and reusable for the long term is a big challenge. Nowadays, the data deluge makes this challenge even more difficult. To address the growing demand for managing earth science data, the Council on Library and Information Resources (CLIR) partners with the Library and Technology Services (LTS) of Lehigh University and Purdue University Libraries (PUL) on hosting postdoctoral fellows in data curation activity. This inter-disciplinary fellowship program, funded by the SLOAN Foundation, innovatively connects university libraries and earth science departments and provides earth science Ph.D.s with opportunities to use their research experience in earth science and the data curation training received during their fellowship to explore best practices for research data management in earth science. In the process of exploring best practices for data curation in earth science, the CLIR Data Curation Fellows have accumulated rich experiences and insights on the data management behaviors and needs of earth scientists. Specifically, Ting Wang, the postdoctoral fellow at Lehigh University, has worked together with the LTS support team for the College of Arts and Sciences, Web Specialists, and the High Performance Computing Team to assess and meet the data management needs of researchers at the Department of Earth and Environmental Sciences (EES). By interviewing the faculty members and graduate students at EES, the fellow has identified a variety of data-related challenges in different research fields of earth science, such as climate, ecology, geochemistry, geomorphology, etc. The investigation findings of the fellow also support LTS in developing campus infrastructure for long-term data management in the sciences. Likewise

  12. Persistent Identifiers in Earth science data management environments

    Science.gov (United States)

    Weigel, Tobias; Stockhause, Martina; Lautenschlager, Michael

    2014-05-01

    Globally resolvable Persistent Identifiers (PIDs) that carry additional context information (which can be any form of metadata) are increasingly used by data management infrastructures for fundamental tasks. The notion of a Persistent Identifier is originally an abstract concept that aims to provide identifiers that are quality-controlled and maintained beyond the lifetime of the original issuer, for example through the use of redirection mechanisms. Popular implementations of the PID concept are, for example, the Handle System and the DOI System based on it. These systems also move beyond the simple identification concept by providing facilities that can hold additional context information. Data managers, not only in the Earth sciences, are increasingly attracted to PIDs because of the opportunities these facilities provide; however, long-term viable principles and mechanisms for efficient organization of PIDs and context information are not yet available or well established. In this respect, promising techniques are to type the information that is associated with PIDs and to construct actionable collections of PIDs. There are two main drivers for extended PID usage: Earth science data management middleware use cases and applications geared towards scientific end-users. Motivating scenarios from data management include hierarchical data and metadata management, consistent data tracking and improvements in the accountability of processes. If PIDs are consistently assigned to data objects, context information can be carried over to subsequent data life cycle stages much more easily. This can also ease data migration from one major curation domain to another, e.g. from early dissemination within research communities to formal publication and long-term archival stages, and it can help to document processes across technical and organizational boundaries. For scientific end users, application scenarios include for example more personalized data citation and improvements in the
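    A small sketch of what actionable, resolvable PIDs mean in practice: resolving a Handle and reading its typed values through the public REST interface of the Handle proxy (hdl.handle.net). The handle used here is a well-known test handle; the endpoint details are an assumption about the public proxy, not something stated in the abstract.

    import requests

    # Resolve a Handle and list the typed values stored in its record.
    resp = requests.get("https://hdl.handle.net/api/handles/10.1000/1", timeout=10)
    resp.raise_for_status()
    for value in resp.json().get("values", []):
        print(value.get("type"), value.get("data"))  # e.g. URL plus associated metadata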

  13. Master Data Management Model in Company: Challenges and Opportunity

    Directory of Open Access Journals (Sweden)

    Indrajani Indrajani

    2015-12-01

    The purpose of this research is to analyze, design, and implement a Master Data Management (MDM) model for a company, including the database processing that will be used to improve the quality of customer data and produce a single view of the customer. The research method used is a literature study of a variety of journals, books, e-books, and articles on the internet. Fact-finding techniques are also applied, such as analyzing, collecting, and examining documents, interviews, and observations. Other methods used to analyze and design the MDM model are cleansing and matching techniques. The result obtained from this research is an implementation of the MDM model for the company which, if implemented, will improve the quality of data significantly. The conclusion which can be obtained from this research is that MDM is one of the factors that can improve the quality of customer data.
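    The cleansing-and-matching step mentioned above can be sketched in a few lines of Python (the records, field names and similarity threshold are hypothetical): normalise the customer fields, score candidate pairs, and flag likely duplicates for merging into a single view of the customer.

    from difflib import SequenceMatcher

    def normalise(record):
        # Cleansing: lower-case and collapse whitespace in every field.
        return {k: " ".join(str(v).lower().split()) for k, v in record.items()}

    def similarity(a, b):
        # Matching: crude string similarity over name + email.
        return SequenceMatcher(None, a["name"] + a["email"], b["name"] + b["email"]).ratio()

    customers = [
        {"name": "Indra Jani", "email": "i.jani@example.com"},
        {"name": "Indrajani ", "email": "I.Jani@example.com"},
    ]
    a, b = (normalise(c) for c in customers)
    if similarity(a, b) > 0.85:   # threshold chosen for illustration only
        print("probable duplicate -> merge into the master customer record")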

  14. Scientific data management in the environmental molecular sciences laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, P.R.; Keller, T.L.

    1995-09-01

    The Environmental Molecular Sciences Laboratory (EMSL) is currently under construction at Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy (DOE). This laboratory will be used for molecular and environmental sciences research to identify comprehensive solutions to DOE's environmental problems. Major facilities within the EMSL include the Molecular Sciences Computing Facility (MSCF), a laser-surface dynamics laboratory, a high-field nuclear magnetic resonance (NMR) laboratory, and a mass spectrometry laboratory. The EMSL is scheduled to open early in 1997 and will house about 260 resident and visiting scientists. It is anticipated that at least six (6) terabytes of data will be archived in the first year of operation. An object-oriented database management system (OODBMS) and a mass storage system will be integrated to provide an intelligent, automated mechanism to manage data. The resulting system, called the DataBase Computer System (DBCS), will provide total scientific data management capabilities to EMSL users. A prototype mass storage system based on the National Storage Laboratory's (NSL) UniTree has been procured and is in limited use. This system consists of two independent hierarchies of storage devices. One hierarchy of lower capacity, slower speed devices provides support for smaller files transferred over the Fiber Distributed Data Interface (FDDI) network. Also part of the system is a second hierarchy of higher capacity, higher speed devices that will be used to support high performance clients (e.g., a large scale parallel processor). The ObjectStore OODBMS will be used to manage metadata for archived datasets, maintain relationships between archived datasets, and hold small, duplicate subsets of archived datasets (i.e., derivative data). The interim system is called DBCS, Phase 0 (DBCS-0). The production system for the EMSL, DBCS Phase 1 (DBCS-1), will be procured and installed in the summer of 1996.
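    As an editorial illustration (names and fields hypothetical) of the kind of object the OODBMS described above might manage for each archived dataset - its location in the mass-storage hierarchy plus lineage links to the datasets it was derived from - consider:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ArchivedDataset:
        dataset_id: str
        archive_path: str                      # location in the mass storage system
        instrument: str
        size_bytes: int
        derived_from: List[str] = field(default_factory=list)  # lineage links

    raw = ArchivedDataset("emsl-000123", "/archive/nmr/000123.dat", "NMR", 2_500_000)
    fit = ArchivedDataset("emsl-000124", "/archive/nmr/000124.fit", "NMR", 40_000,
                          derived_from=[raw.dataset_id])
    print(fit.derived_from)                    # ['emsl-000123']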

  15. Data Management challenges in Astronomy and Astroparticle Physics

    Science.gov (United States)

    Lamanna, Giovanni

    2015-12-01

    Astronomy and Astroparticle Physics domains are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA, KM3Net and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the concerned scientific communities in Europe to work together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken also in cooperation with e-infrastructures in Europe.

  16. Benefits, Challenges and Tools of Big Data Management

    Directory of Open Access Journals (Sweden)

    Fernando L. F. Almeida

    2017-10-01

    Big Data is one of the most predominant fields of knowledge and research, and it has had high repercussions in the process of digital transformation of organizations in recent years. Big Data's main goal is to improve work processes through the analysis and interpretation of large amounts of data. Knowing how Big Data works, together with its benefits, challenges and tools, is essential for business success. Our study performs a systematic review of the Big Data field adopting a mind map approach, which allows us to easily and visually identify its main elements and dependencies. The findings identified and mapped a total of 12 main branches of benefits, challenges and tools, and also a total of 52 sub-branches within the main areas of the model.

  17. Progress and Challenges in Assessing NOAA Data Management

    Science.gov (United States)

    de la Beaujardiere, J.

    2016-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) produces large volumes of environmental data from a great variety of observing systems including satellites, radars, aircraft, ships, buoys, and other platforms. These data are irreplaceable assets that must be properly managed to ensure they are discoverable, accessible, usable, and preserved. A policy framework has been established which informs data producers of their responsibilities and which supports White House-level mandates such as the Executive Order on Open Data and the OSTP Memorandum on Increasing Access to the Results of Federally Funded Scientific Research. However, assessing the current state and progress toward completion for the many NOAA datasets is a challenge. This presentation will discuss work toward establishing assessment methodologies and dashboard-style displays. Ideally, metrics would be gathered through software and be automatically updated whenever an individual improvement was made. In practice, however, some level of manual information collection is required. Differing approaches to dataset granularity in different branches of NOAA yield additional complexity.
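    A hypothetical sketch of the automated metric gathering the abstract argues for: summarising, across a set of dataset records, how many already meet each stewardship criterion, ready to feed a dashboard-style display. The dataset names and criteria fields below are illustrative only.

    datasets = [
        {"id": "sst-analysis-v2", "discoverable": True,  "accessible": True,  "preserved": False},
        {"id": "radar-level2",    "discoverable": True,  "accessible": False, "preserved": False},
    ]

    # Count, per criterion, how many dataset records already satisfy it.
    for criterion in ("discoverable", "accessible", "preserved"):
        met = sum(1 for d in datasets if d.get(criterion))
        print(f"{criterion}: {met}/{len(datasets)} datasets")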

  18. Funding research data management and related infrastructures : Knowledge Exchange and Science Europe briefing paper

    NARCIS (Netherlands)

    Bijsterbosch, Magchiel; Duca, Daniela; Katerbow, Matthias; Kupiainen, Irina; Dillo, Ingrid; Doorn, P.K.; Enke, Harry; de Lucas, Jesus Eugenio Marco

    2016-01-01

    Research Funding Organisations (RFO) and Research Performing Organisations (RPO) throughout Europe are well aware that science and scholarship increasingly depend on infrastructures supporting sustainable Research Data Management (RDM). In two complementary surveys, the Science Europe Working Group

  19. Dealing with Data: Science Librarians' Participation in Data Management at Association of Research Libraries Institutions

    Science.gov (United States)

    Antell, Karen; Foote, Jody Bales; Turner, Jaymie; Shults, Brian

    2014-01-01

    As long as empirical research has existed, researchers have been doing "data management" in one form or another. However, funding agency mandates for doing formal data management are relatively recent, and academic libraries' involvement has been concentrated mainly in the last few years. The National Science Foundation implemented a new…

  20. Science Data Management for the E-ELT: usecase MICADO

    NARCIS (Netherlands)

    Verdoes Kleijn, Gijs

    2015-01-01

    The E-ELT First-light instrument MICADO will explore new parameter space in terms of precision astrometry, photometry and spectroscopy. This provides challenges for the data handling and reduction to ensure MICADO takes the observational capabilities of the AO-assisted E-ELT towards its limits. Our

  1. An Overview of the Challenges with and Proposed Solutions for the Ingest and Distribution Processes For Airborne Data Management

    Science.gov (United States)

    Northup, E. A.; Beach, A. L., III; Early, A. B.; Kusterer, J.; Quam, B.; Wang, D.; Chen, G.

    2015-12-01

    The current data management practices for NASA airborne field projects have successfully served science team data needs over the past 30 years to achieve project science objectives; however, users have discovered a number of issues in terms of data reporting and format. The ICARTT format, a NASA standard since 2010, is currently the most popular among the airborne measurement community. Although easy for humans to use, the format standard is not sufficiently rigorous to be machine-readable, and there is no standard variable naming convention among the many airborne measurement variables. This makes data use and management tedious and resource intensive, and also creates problems in Distributed Active Archive Center (DAAC) data ingest procedures and distribution. Further, most DAACs use metadata models that concentrate on satellite data observations, making them less prepared to deal with airborne data. There is also a substantial amount of airborne data distributed by websites designed for science team use that are less friendly to users unfamiliar with the operations of airborne field studies. A number of efforts are underway to help overcome the issues with airborne data discovery and distribution. The ICARTT Refresh Earth Science Data Systems Working Group (ESDSWG) was established to provide a platform for atmospheric science data providers, users, and data managers to collaborate on developing new criteria for the file format in an effort to enhance airborne data usability. In addition, the NASA Langley Research Center Atmospheric Science Data Center (ASDC) has developed the Toolsets for Airborne Data (TAD) to provide web-based tools and centralized access to airborne in situ measurements of atmospheric composition. This presentation will discuss the aforementioned challenges and attempted solutions in an effort to demonstrate how airborne data management can be improved to streamline data ingest and discoverability for a broader user community.
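    To make the format issue concrete, the sketch below (file name hypothetical) reads an ICARTT-style file by using its first header line, which gives the number of header lines and the file format index; everything after that has to be handled as loosely structured text, which is part of why the format is hard to machine-read.

    def read_icartt(path):
        with open(path, encoding="ascii") as fh:
            first = fh.readline()
            n_header, ffi = (int(x) for x in first.split(",")[:2])  # e.g. "37, 1001"
            header = [fh.readline() for _ in range(n_header - 1)]   # remaining header lines
            data_rows = [line.strip().split(",") for line in fh if line.strip()]
        return header, data_rows

    header, rows = read_icartt("airborne-mrg60_20140101_R1.ict")    # hypothetical file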

  2. Meeting the Data Management Compliance Challenge: Funder Expectations and Institutional Reality

    Directory of Open Access Journals (Sweden)

    Catherine Pink

    2013-11-01

    In common with many global research funding agencies, in 2011 the UK Engineering and Physical Sciences Research Council (EPSRC) published its Policy Framework on Research Data along with a mandate that institutions be fully compliant with the policy by May 2015. The University of Bath has a strong applied science and engineering research focus and, as such, the EPSRC is a major funder of the university’s research. In this paper, the Jisc-funded Research360 project shares its experience in developing the infrastructure required to enable a research-intensive institution to achieve full compliance with a particular funder’s policy, in such a way as to support the varied data management needs of both the University of Bath and its external stakeholders. A key feature of the Research360 project was to ensure that after the project’s completion in summer 2013 the newly developed data management infrastructure would be maintained up to and beyond the EPSRC’s 2015 deadline. Central to these plans was the ‘University of Bath Roadmap for EPSRC’, which was identified as an exemplar response by the EPSRC. This paper explores how a roadmap designed to meet a single funder’s requirements can be compatible with the strategic goals of an institution. Also discussed is how the project worked with Charles Beagrie Ltd to develop a supporting business case, thus ensuring implementation of these long-term objectives. This paper describes how two new data management roles, the Institutional Data Scientist and Technical Data Coordinator, have contributed to the delivery of the Research360 project, and the importance of these new types of cross-institutional roles for embedding a new data management infrastructure within an institution. Finally, the experience of developing a new institutional data policy is shared. This policy represents a particular example of the need to reconcile a funder’s expectations with the needs of individual researchers and their

  3. Big data management challenges in health research-a literature review.

    Science.gov (United States)

    Wang, Xiaoming; Williams, Carolyn; Liu, Zhen Hua; Croghan, Joe

    2017-08-07

    Big data management for information centralization (i.e. making data of interest findable) and integration (i.e. making related data connectable) in health research is a defining challenge in biomedical informatics. While essential to create a foundation for knowledge discovery, optimized solutions to deliver high-quality and easy-to-use information resources are not thoroughly explored. In this review, we identify the gaps between current data management approaches and the need for new capacity to manage big data generated in advanced health research. Focusing on these unmet needs and well-recognized problems, we introduce state-of-the-art concepts, approaches and technologies for data management from computing academia and industry to explore improvement solutions. We explain the potential and significance of these advances for biomedical informatics. In addition, we discuss specific issues that have a great impact on technical solutions for developing the next generation of digital products (tools and data) to facilitate the raw-data-to-knowledge process in health research. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  4. Data management in astrobiology: challenges and opportunities for an interdisciplinary community.

    Science.gov (United States)

    Aydinoglu, Arsev Umur; Suomela, Todd; Malone, Jim

    2014-06-01

    Data management and sharing are growing concerns for scientists and funding organizations throughout the world. Funding organizations are implementing requirements for data management plans, while scientists are establishing new infrastructures for data sharing. One of the difficulties is sharing data among a diverse set of research disciplines. Astrobiology is a unique community of researchers, containing over 110 different disciplines. The current study reports the results of a survey of data management practices among scientists involved in the astrobiology community and the NASA Astrobiology Institute (NAI) in particular. The survey was administered over a 2-month period in the first half of 2013. Fifteen percent of the NAI community responded (n=114), and additional (n=80) responses were collected from members of an astrobiology Listserv. The results of the survey show that the astrobiology community shares many of the same concerns for data sharing as other groups. The benefits of data sharing are acknowledged by many respondents, but barriers to data sharing remain, including lack of acknowledgement, citation, time, and institutional rewards. Overcoming technical, institutional, and social barriers to data sharing will be a challenge into the future.

  5. Advanced Technologies and Data Management Practices in Environmental Science: Lessons from Academia

    Science.gov (United States)

    Hernandez, Rebecca R.; Mayernik, Matthew S.; Murphy-Mariscal, Michelle L.; Allen, Michael F.

    2012-01-01

    Environmental scientists are increasing their capitalization on advancements in technology, computation, and data management. However, the extent of that capitalization is unknown. We analyzed the survey responses of 434 graduate students to evaluate the understanding and use of such advances in the environmental sciences. Two-thirds of the…

  6. The European HST Science Data Archive. [and Data Management Facility (DMF)]

    Science.gov (United States)

    Pasian, F.; Pirenne, B.; Albrecht, R.; Russo, G.

    1993-01-01

    The paper describes the European HST Science Data Archive. Particular attention is given to the flow from the HST spacecraft to the Science Data Archive at the Space Telescope European Coordinating Facility (ST-ECF); the archiving system at the ST-ECF, including the hardware and software system structure; the operations at the ST-ECF and differences with the Data Management Facility; and the current developments. A diagram of the logical structure and data flow of the system managing the European HST Science Data Archive is included.

  7. Data Management Challenges in a National Scientific Program of 55 Diverse Research Projects

    Science.gov (United States)

    De Bruin, T.

    2016-12-01

    In 2007-2015, the Dutch funding agency NWO funded the National Ocean and Coastal Research Program (in Dutch: ZKO). This program focused on 'the scientific analysis of five societal challenges related to a sustainable use of the sea and coastal zones'. These five challenges were safety, economic yield, nature, spatial planning & development, and water quality. The ZKO program was 'set up to strengthen the cohesion and collaboration within Dutch marine research'. From the start of the program, data management was addressed, to allow data to be shared amongst the diverse research projects. The ZKO program was divided into four themes (or regions): Carrying Capacity (Wadden Sea), Oceans, North Sea, and Transnational Wadden Sea Research. The 'Carrying Capacity' theme was subdivided into three 'research lines': policy-relevant research, monitoring, and hypothesis-driven research. 56 projects were funded, ranging from studies on the governance of the Wadden Sea to expeditions studying trace elements in the Atlantic Ocean. One of the first projects to be funded was the data management project. Its objectives were to allow data exchange between projects, to archive all relevant data from all ZKO projects, and to make the data and publications publicly available, following the ZKO Data Policy. This project was carried out by the NIOZ Data Management Group. It turned out that the research projects had hardly any interest in sharing data between projects and had good (?) arguments not to share data at all until the end of the projects. A data portal was built to host and make available all ZKO data and publications. When it came to submitting the data to this portal, most projects obliged willingly, though found it occasionally difficult to find time to do so. However, some projects refused to submit data to an open data portal, despite the rules set up by the funding agency and agreed by all. The take-home message of this presentation is that data sharing is a cultural and

  8. Challenges in data science

    DEFF Research Database (Denmark)

    Carbone, Anna; Jensen, M.; Sato, Aki-Hiro

    2016-01-01

    of global properties from locally interacting data entities and clustering phenomena demand suitable approaches and methodologies recently developed in the foundational area of Data Science by taking a Complex Systems standpoint. Here, we deal with challenges that can be summarized by the question: "What can Complex Systems Science contribute to Big Data?" Such a question can be reversed and brought to a superior level of abstraction by asking "What Knowledge can be drawn from Big Data?" These aspects constitute the main motivation behind this article to introduce a volume containing a collection of papers presenting interdisciplinary advances in the Big Data area by methodologies and approaches typical of the Complex Systems Science, Nonlinear Systems Science and Statistical Physics. (C) 2016 Elsevier Ltd. All rights reserved.

  9. Clinical data management: Current status, challenges, and future directions from industry perspectives

    Directory of Open Access Journals (Sweden)

    Zhengwu Lu

    2010-06-01

    Zhengwu Lu (1), Jing Su (2); (1) Smith Hanley Consulting, Houston, Texas; (2) Department of Chemical Engineering, University of Massachusetts, Amherst, MA, USA. Abstract: To maintain a competitive position, the biopharmaceutical industry has been facing the challenge of increasing productivity both internally and externally. As the product of the clinical development process, clinical data are recognized to be the key corporate asset and provide critical evidence of a medicine’s efficacy and safety and of its potential economic value to the market. It is also well recognized that using effective technology-enabled methods to manage clinical data can enhance the speed with which the drug is developed and commercialized, hence enhancing the competitive advantage. The effective use of data-capture tools may ensure that high-quality data are available for early review and rapid decision-making. A well-designed, protocol-driven, standardized, site workflow-oriented and documented database, populated via efficient data feed mechanisms, will ensure regulatory and commercial questions receive rapid responses. When information from a sponsor’s clinical database or data warehouse develops into corporate knowledge, the value of the medicine can be realized. Moreover, regulators, payer groups, patients, activist groups, patient advocacy groups, and employers are becoming more educated consumers of medicine, requiring monetary value and quality, and seeking out up-to-date medical information supplied by biopharmaceutical companies. All these developments in the current biopharmaceutical arena demand that clinical data management (CDM) is at the forefront, leading change, influencing direction, and providing objective evidence. Sustaining an integrated database or data repository for initial product registration and subsequent postmarketing uses is a long-term process to maximize return on investment for organizations. CDM should be the owner of driving clinical data

  10. Globus Identity, Access, and Data Management: Platform Services for Collaborative Science

    Science.gov (United States)

    Ananthakrishnan, R.; Foster, I.; Wagner, R.

    2016-12-01

    Globus is software-as-a-service for research data management, developed at, and operated by, the University of Chicago. Globus, accessible at www.globus.org, provides high speed, secure file transfer; file sharing directly from existing storage systems; and data publication to institutional repositories. 40,000 registered users have used Globus to transfer tens of billions of files totaling hundreds of petabytes between more than 10,000 storage systems within campuses and national laboratories in the US and internationally. Web, command line, and REST interfaces support both interactive use and integration into applications and infrastructures. An important component of the Globus system is its foundational identity and access management (IAM) platform service, Globus Auth. Both Globus research data management and other applications use Globus Auth for brokering authentication and authorization interactions between end-users, identity providers, resource servers (services), and a range of clients, including web, mobile, and desktop applications, and other services. Compliant with important standards such as OAuth, OpenID, and SAML, Globus Auth provides mechanisms required for an extensible, integrated ecosystem of services and clients for the research and education community. It underpins projects such as the US National Science Foundation's XSEDE system, NCAR's Research Data Archive, and the DOE Systems Biology Knowledge Base. Current work is extending Globus services to be compliant with FEDRAMP standards for security assessment, authorization, and monitoring for cloud services. We will present Globus IAM solutions and give examples of Globus use in various projects for federated access to resources. We will also describe how Globus Auth and Globus research data management capabilities enable rapid development and low-cost operations of secure data sharing platforms that leverage Globus services and integrate them with local policy and security.
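    Since the abstract notes that Globus Auth complies with standards such as OAuth, a generic OAuth2 authorization-code exchange can be sketched with the requests library. The endpoint, client credentials and code below are placeholders illustrating the standard flow, not the actual Globus Auth API.

    import requests

    TOKEN_URL = "https://auth.example.org/oauth2/token"        # placeholder endpoint
    payload = {
        "grant_type": "authorization_code",
        "code": "CODE_RETURNED_TO_THE_REDIRECT_URI",           # placeholder
        "redirect_uri": "https://myapp.example.org/callback",
        "client_id": "MY_CLIENT_ID",
        "client_secret": "MY_CLIENT_SECRET",
    }
    # Exchange the authorization code for tokens, then present the access token
    # as a Bearer token to resource servers.
    tokens = requests.post(TOKEN_URL, data=payload, timeout=10).json()
    access_token = tokens.get("access_token")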

  11. Data management and its role in delivering science at DOE BES user facilities - Past, Present, and Future

    Science.gov (United States)

    Miller, Stephen D.; Herwig, Kenneth W.; Ren, Shelly; Vazhkudai, Sudharshan S.; Jemian, Pete R.; Luitz, Steffen; Salnikov, Andrei A.; Gaponenko, Igor; Proffen, Thomas; Lewis, Paul; Green, Mark L.

    2009-07-01

    The primary mission of user facilities operated by Basic Energy Sciences under the Department of Energy is to produce data for users in support of open science and basic research [1]. We trace back almost 30 years of history across selected user facilities illustrating the evolution of facility data management practices and how these practices have related to performing scientific research. The facilities cover multiple techniques such as X-ray and neutron scattering, imaging and tomography sciences. Over time, detector and data acquisition technologies have dramatically increased the ability to produce prolific volumes of data challenging the traditional paradigm of users taking data home upon completion of their experiments to process and publish their results. During this time, computing capacity has also increased dramatically, though the size of the data has grown significantly faster than the capacity of one's laptop to manage and process this new facility produced data. Trends indicate that this will continue to be the case for yet some time. Thus users face a quandary for how to manage today's data complexity and size as these may exceed the computing resources users have available to themselves. This same quandary can also stifle collaboration and sharing. Realizing this, some facilities are already providing web portal access to data and computing thereby providing users access to resources they need [2]. Portal based computing is now driving researchers to think about how to use the data collected at multiple facilities in an integrated way to perform their research, and also how to collaborate and share data. In the future, inter-facility data management systems will enable next tier cross-instrument-cross facility scientific research fuelled by smart applications residing upon user computer resources. We can learn from the medical imaging community that has been working since the early 1990's to integrate data from across multiple modalities to achieve

  12. Data management and its role in delivering science at DOE BES user facilities - Past, Present, and Future

    International Nuclear Information System (INIS)

    Miller, Stephen D; Herwig, Kenneth W; Ren, Shelly; Vazhkudai, Sudharshan S; Jemian, Pete R; Luitz, Steffen; Salnikov, Andrei A; Gaponenko, Igor; Proffen, Thomas; Lewis, Paul; Green, Mark L

    2009-01-01

    The primary mission of user facilities operated by Basic Energy Sciences under the Department of Energy is to produce data for users in support of open science and basic research. We trace back almost 30 years of history across selected user facilities illustrating the evolution of facility data management practices and how these practices have related to performing scientific research. The facilities cover multiple techniques such as X-ray and neutron scattering, imaging and tomography sciences. Over time, detector and data acquisition technologies have dramatically increased the ability to produce prolific volumes of data challenging the traditional paradigm of users taking data home upon completion of their experiments to process and publish their results. During this time, computing capacity has also increased dramatically, though the size of the data has grown significantly faster than the capacity of one's laptop to manage and process this new facility produced data. Trends indicate that this will continue to be the case for yet some time. Thus users face a quandary for how to manage today's data complexity and size as these may exceed the computing resources users have available to themselves. This same quandary can also stifle collaboration and sharing. Realizing this, some facilities are already providing web portal access to data and computing thereby providing users access to resources they need. Portal based computing is now driving researchers to think about how to use the data collected at multiple facilities in an integrated way to perform their research, and also how to collaborate and share data. In the future, inter-facility data management systems will enable next tier cross-instrument-cross facility scientific research fuelled by smart applications residing upon user computer resources. We can learn from the medical imaging community that has been working since the early 1990's to integrate data from across multiple modalities to achieve better

  13. Data Management and its Role in Delivering Science at DOE BES User Facilities - Past, Present, and Future

    International Nuclear Information System (INIS)

    Miller, Stephen D.; Herwig, Kenneth W.; Ren, Shelly; Vazhkudai, Sudharshan S.; Jemian, Pete R.; Luitz, Steffen; Salnikov, Andrei; Gaponenko, Igor; Proffen, Thomas; Lewis, Paul; Hagen, Mark E.

    2009-01-01

    The primary mission of user facilities operated by Basic Energy Sciences under the Department of Energy is to produce data for users in support of open science and basic research. We trace back almost 30 years of history across selected user facilities illustrating the evolution of facility data management practices and how these practices have related to performing scientific research. The facilities cover multiple techniques such as X-ray and neutron scattering, imaging and tomography sciences. Over time, detector and data acquisition technologies have dramatically increased the ability to produce prolific volumes of data challenging the traditional paradigm of users taking data home upon completion of their experiments to process and publish their results. During this time, computing capacity has also increased dramatically, though the size of the data has grown significantly faster than the capacity of one's laptop to manage and process this new facility produced data. Trends indicate that this will continue to be the case for yet some time. Thus users face a quandary for how to manage today's data complexity and size as these may exceed the computing resources users have available to themselves. This same quandary can also stifle collaboration and sharing. Realizing this, some facilities are already providing web portal access to data and computing thereby providing users access to resources they need. Portal based computing is now driving researchers to think about how to use the data collected at multiple facilities in an integrated way to perform their research, and also how to collaborate and share data. In the future, inter-facility data management systems will enable next tier cross-instrument-cross facility scientific research fuelled by smart applications residing upon user computer resources. We can learn from the medical imaging community that has been working since the early 1990's to integrate data from across multiple modalities to achieve better

  14. Data Management and Its Role in Delivering Science at DOE BES User Facilities Past, Present, and Future

    International Nuclear Information System (INIS)

    Miller, Stephen D.; Herwig, Kenneth W.; Ren, Shelly; Vazhkudai, Sudharshan S.

    2009-01-01

    The primary mission of user facilities operated by Basic Energy Sciences under the Department of Energy is to produce data for users in support of open science and basic research (1). We trace back almost 30 years of history across selected user facilities illustrating the evolution of facility data management practices and how these practices have related to performing scientific research. The facilities cover multiple techniques such as X-ray and neutron scattering, imaging and tomography sciences. Over time, detector and data acquisition technologies have dramatically increased the ability to produce prolific volumes of data challenging the traditional paradigm of users taking data home upon completion of their experiments to process and publish their results. During this time, computing capacity has also increased dramatically, though the size of the data has grown significantly faster than the capacity of one's laptop to manage and process this new facility produced data. Trends indicate that this will continue to be the case for yet some time. Thus users face a quandary for how to manage today's data complexity and size as these may exceed the computing resources users have available to themselves. This same quandary can also stifle collaboration and sharing. Realizing this, some facilities are already providing web portal access to data and computing thereby providing users access to resources they need (2). Portal based computing is now driving researchers to think about how to use the data collected at multiple facilities in an integrated way to perform their research, and also how to collaborate and share data. In the future, inter-facility data management systems will enable next tier cross-instrument-cross facility scientific research fuelled by smart applications residing upon user computer resources. We can learn from the medical imaging community that has been working since the early 1990's to integrate data from across multiple modalities to achieve

  15. Data Management Activities of Canada's National Science Library - 2010 Update and Prospective

    Directory of Open Access Journals (Sweden)

    Mary Zborowski

    2011-01-01

    NRC-CISTI serves Canada as its National Science Library (as mandated by Canada's Parliament in 1924) and also provides direct support to researchers of the National Research Council of Canada (NRC). By reason of its mandate, vision, and strategic positioning, NRC-CISTI has been rapidly and effectively mobilizing Canadian stakeholders and resources to become a lead player on both the Canadian national and international scenes in matters relating to the organization and management of scientific research data. In a previous communication (CODATA International Conference, 2008), the orientation of NRC-CISTI towards this objective and its short- and medium-term plans and strategies were presented. Since then, significant milestones have been achieved. This paper presents NRC-CISTI's most recent activities in these areas, which are progressing well alongside a strategic organizational redesign process that is realigning NRC-CISTI's structure, mission, and mandate to better serve its clients. Throughout this transformational phase, activities relating to data management remain vibrant.

  16. Challenging Data Management in CMS Computing with Network-aware Systems

    CERN Document Server

    Bonacorsi, Daniele

    2013-01-01

    After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering into an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the Networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfers of PetaBytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable Networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of 'Intelligent Network Services', including also bandwidth...

  17. An Overview of the Challenges With and Proposed Solutions for the Ingest and Distribution Processes for Airborne Data Management

    Science.gov (United States)

    Beach, Aubrey; Northup, Emily; Early, Amanda; Wang, Dali; Kusterer, John; Quam, Brandi; Chen, Gao

    2015-01-01

    The current data management practices for NASA airborne field projects have successfully served science team data needs over the past 30 years to achieve project science objectives; however, users have discovered a number of issues in terms of data reporting and format. The ICARTT format, a NASA standard since 2010, is currently the most popular among the airborne measurement community. Although easy for humans to use, the format standard is not sufficiently rigorous to be machine-readable. This makes data use and management tedious and resource intensive, and also creates problems in Distributed Active Archive Center (DAAC) data ingest procedures and distribution. Further, most DAACs use metadata models that concentrate on satellite data observations, making them less prepared to deal with airborne data.
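    The ICARTT files described above are plain text: the first header line declares how many header lines precede the data records, which is what a machine reader has to negotiate before it reaches the measurements. Below is a minimal, hedged sketch of such a reader in Python; it assumes the common ICARTT convention of a first line of the form '<number_of_header_lines>, <format_index>' and comma-separated data records, and the file name in the usage comment is purely hypothetical.

      import csv

      def read_icartt(path):
          """Skip the declared ICARTT header and return the remaining data rows.

          Assumes the first line is '<n_header_lines>, <format_index>', as in the
          common ICARTT convention; real files carry much richer header metadata
          that a production reader would also need to parse.
          """
          with open(path) as f:
              n_header = int(f.readline().split(",")[0])    # header length declared up front
              for _ in range(n_header - 1):                 # the first line is part of the count
                  f.readline()
              return [row for row in csv.reader(f) if row]  # remaining lines are data records

      # Hypothetical usage:
      # rows = read_icartt("DC8_example_20160801_R0.ict")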

  18. Quality-assurance and data-management plan for water-quality activities in the Kansas Water Science Center, 2014

    Science.gov (United States)

    Rasmussen, Teresa J.; Bennett, Trudy J.; Foster, Guy M.; Graham, Jennifer L.; Putnam, James E.

    2014-01-01

    As the Nation’s largest water, earth, and biological science and civilian mapping information agency, the U.S. Geological Survey is relied on to collect high-quality data, and produce factual and impartial interpretive reports. This quality-assurance and data-management plan provides guidance for water-quality activities conducted by the Kansas Water Science Center. Policies and procedures are documented for activities related to planning, collecting, storing, documenting, tracking, verifying, approving, archiving, and disseminating water-quality data. The policies and procedures described in this plan complement quality-assurance plans for continuous water-quality monitoring, surface-water, and groundwater activities in Kansas.

  19. Sound data management as a foundation for natural resources management and science

    Science.gov (United States)

    Burley, Thomas E.

    2012-01-01

    Effective decision making is closely related to the quality and completeness of available data and information. Data management helps to ensure data quality in any discipline and supports decision making. Managing data as a long-term scientific asset helps to ensure that data will be usable beyond the original intended application. Emerging issues in water-resources management and climate variability require the ability to analyze change in the conditions of natural resources over time. The availability of quality, well-managed, and documented data from the past and present helps support this requirement.

  20. Using GIS in an Earth Sciences Field Course for Quantitative Exploration, Data Management and Digital Mapping

    Science.gov (United States)

    Marra, Wouter A.; van de Grint, Liesbeth; Alberti, Koko; Karssenberg, Derek

    2017-01-01

    Field courses are essential for subjects like Earth Sciences, Geography and Ecology. In these topics, GIS is used to manage and analyse spatial data, and offers quantitative methods that are beneficial for fieldwork. This paper presents changes made to a first-year Earth Sciences field course in the French Alps, where new GIS methods were…

  1. Data Management in Metagenomics: A Risk Management Approach

    Directory of Open Access Journals (Sweden)

    Filipe Ferreira

    2014-07-01

    Full Text Available In eScience, where vast data collections are processed in scientific workflows, new risks and challenges are emerging. Those challenges are changing the eScience paradigm, mainly regarding digital preservation and scientific workflows. To address specific concerns with data management in these scenarios, the concept of the Data Management Plan was established, serving as a tool for enabling digital preservation in eScience research projects. We claim risk management can be used jointly with a Data Management Plan, so new risks and challenges can be easily tackled. Therefore, we propose an analysis process for eScience projects using a Data Management Plan and ISO 31000 in order to create a Risk Management Plan that can complement the Data Management Plan. The motivation, requirements and validation of this proposal are explored in the MetaGen-FRAME project, focused on Metagenomics.

  2. Artificial intelligence and big data management: the dynamic duo for moving forward data centric sciences

    OpenAIRE

    Vargas Solar, Genoveva

    2017-01-01

    After vivid discussions led by the emergence of the buzzword “Big Data”, it seems that industry and academia have reached an objective understanding about data properties (volume, velocity, variety, veracity and value), the resources and “know how” it requires, and the opportunities it opens. Indeed, new applications promising fundamental changes in society, industry and science, include face recognition, machine translation, digital assistants, self-driving cars, ad-serving, chat-bots, perso...

  3. Meeting the Needs of Data Management Training: The Federation of Earth Science Information Partners (ESIP) Data Management for Scientists Short Course

    Science.gov (United States)

    Hou, Chung-Yi

    2015-01-01

    With the proliferation of digital technologies, scientists are exploring various methods for the integration of data to produce scientific discoveries. To maximize the potential of data for science advancement, proper stewardship must be provided to ensure data integrity and usability both for the short- and the long-term. In order to assist…

  4. Between Scylla and Charybdis: reconciling competing data management demands in the life sciences.

    Science.gov (United States)

    Bezuidenhout, Louise M; Morrison, Michael

    2016-05-17

    The widespread sharing of biological and biomedical data is recognised as a key element in facilitating translation of scientific discoveries into novel clinical applications and services. At the same time, twenty-first century states are increasingly concerned that this data could also be used for purposes of bioterrorism. There is thus a tension between the desire to promote the sharing of data, as encapsulated by the Open Data movement, and the desire to prevent this data from 'falling into the wrong hands' as represented by 'dual use' policies. Both frameworks posit a moral duty for life sciences researchers with respect to how they should make their data available. However, Open data and dual use concerns are rarely discussed in concert and their implementation can present scientists with potentially conflicting ethical requirements. Both dual use and Open data policies frame scientific data and data dissemination in particular, though different, ways. As such they contain implicit models for how data is translated. Both approaches are limited by a focus on abstract conceptions of data and data sharing. This works to impede consensus-building between the two ethical frameworks. As an alternative, this paper proposes that an ethics of responsible management of scientific data should be based on a more nuanced understanding of the everyday data practices of life scientists. Responsibility for these 'micromovements' of data must consider the needs and duties of scientists as individuals and as collectively-organised groups. Researchers in the life sciences are faced with conflicting ethical responsibilities to share data as widely as possible, but prevent it being used for bioterrorist purposes. In order to reconcile the responsibilities posed by the Open Data and dual use frameworks, approaches should focus more on the everyday practices of laboratory scientists and less on abstract conceptions of data.

  5. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  6. Foundations of data-intensive science: Technology and practice for high throughput, widely distributed, data management and analysis systems

    Science.gov (United States)

    Johnston, William; Ernst, M.; Dart, E.; Tierney, B.

    2014-04-01

    Today's large-scale science projects involve world-wide collaborations that depend on moving massive amounts of data from an instrument to potentially thousands of computing and storage systems at hundreds of collaborating institutions to accomplish their science. This is true for ATLAS and CMS at the LHC, and it is true for the climate sciences, Belle-II at the KEK collider, genome sciences, the SKA radio telescope, and ITER, the international fusion energy experiment. DOE's Office of Science has been collecting science discipline and instrument requirements for network-based data management and analysis for more than a decade. As a result, certain key issues are seen across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects like the LHC experiments. These issues are what this talk will address; to wit: 1. Optical signal transport advances enabling 100 Gb/s circuits that span the globe on optical fiber, with each fiber carrying 100 such channels; 2. Network router and switch requirements to support high-speed international data transfer; 3. Data transport (TCP is still the norm) requirements to support high-speed international data transfer (e.g. error-free transmission); 4. Network monitoring and testing techniques and infrastructure to maintain the required error-free operation of the many R&E networks involved in international collaborations; 5. Operating system evolution to support very high-speed network I/O; 6. New network architectures and services in the LAN (campus) and WAN networks to support data-intensive science; 7. Data movement and management techniques and software that can maximize the throughput on the network connections between distributed data handling systems; and 8. New approaches to widely distributed workflow systems that can support the data movement and analysis required by the science. All of these areas must be addressed to enable large

  7. Grand challenges for crop science

    Science.gov (United States)

    Crop science is a highly integrative science using the disciplines of conventional plant breeding, transgenic crop improvement, plant physiology, and cropping system sciences to develop improved varieties of agronomic, turf, and forage crops to produce feed, food, fuel, and fiber for our world's gro...

  8. Capable and credible? Challenging nutrition science : Challenging nutrition science

    NARCIS (Netherlands)

    Penders, Bart; Wolters, Anna; Feskens, Edith F; Brouns, Fred; Huber, Machteld; Maeckelberghe, Els L M; Navis, Gerjan; Ockhuizen, Theo; Plat, Jogchum; Sikkema, Jan; Stasse-Wolthuis, Marianne; van 't Veer, Pieter; Verweij, Marcel; de Vries, Jan

    Nutrition science has enriched our understanding of how to stay healthy by producing valuable knowledge about the interaction of nutrients, food, and the human body. Nutrition science also has raised societal awareness about the links between food consumption and well-being, and provided the basis

  9. Science Diplomacy: New Global Challenges, New Trend

    OpenAIRE

    Van Langenhove, Luk

    2016-01-01

    As new challenges such as the critical need for a universal sustainable development agenda confront mankind, science and diplomacy are converging as common tools for trouble-shooting. Science Diplomacy can be seen as a new phenomenon involving the role of science in diplomacy.

  10. Multidimensional Space-Time Methodology for Development of Planetary and Space Sciences, S-T Data Management and S-T Computational Tomography

    Science.gov (United States)

    Andonov, Zdravko

    This R&D represents an innovative multidimensional 6D-N(6n)D Space-Time (S-T) Methodology, 6D-6nD Coordinate Systems, 6D Equations, and a new 6D strategy and technology for development of Planetary Space Sciences, S-T Data Management and S-T Computational Tomography... The Methodology is relevant for brand-new RS Microwaves' Satellites and Computational Tomography Systems development, aimed at defending sustainable Earth, Moon & Sun System evolution. Especially important are innovations for monitoring and protection of the strategic trilateral system H-OH-H2O (Hydrogen, Hydroxyl and Water), corresponding to RS VHRS (Very High Resolution Systems) of 1.420-1.657-22.089 GHz microwaves... One of the Greatest Paradoxes and Challenges of World Science is the "transformation" of the J. L. Lagrange 4D Space-Time (S-T) System to the H. Minkovski 4D S-T System (O-X,Y,Z,icT) for Einstein's "Theory of Relativity". As a global result, in contemporary Advanced Space Sciences there is no real adequate 4D-6D Space-Time Coordinate System and no 6D Advanced Cosmos Strategy & Methodology for Multidimensional and Multitemporal Space-Time Data Management and Tomography... That is one of the top current S-T problems. Discovery of a simple and optimal nD S-T Methodology is extremely important for all Universities' Space Sciences' education programs, for advances in space research and especially for all young Space Scientists' R&D!... The top ten 21st-Century Challenges ahead of Planetary and Space Sciences, Space Data Management and Computational Space Tomography, important for the successful development of Young Scientist Generations, are the following: 1. R&D of W. R. Hamilton's General Idea for the transformation of all Space Sciences to Time Sciences, beginning with the 6D Eikonal for 6D anisotropic mediums & velocities. Development of IERS Earth & Space Systems (VLBI, LLR, GPS, SLR, DORIS, etc.) for Planetary-Space Data Management & Computational Planetary & Space Tomography. 2. R&D of S. W. Hawking Paradigm for 2D

  11. The IRIS DMC: Perspectives on Real-Time Data Management and Open Access From a Large Seismological Archive: Challenges, Tools, and Quality Assurance

    Science.gov (United States)

    Benson, R. B.

    2007-05-01

    The IRIS Data Management Center, located in Seattle, WA, is the largest openly accessible geophysical archive in the world, and has a unique perspective on data management and operational practices that get the most out of a network. Networks span broad domains in time and space, from finite needs to monitor bridges and dams, to national and international networks like the GSN and the FDSN that establish a baseline for global monitoring and research; the requirements that go into creating a well-tuned DMC archive treat these the same, building a collaborative network of networks that generations of users rely on and that adds value to the data. Funded by the National Science Foundation through the Division of Earth Sciences, IRIS is operated through member universities and in cooperation with the USGS, and the DMS facility is a bridge between a globally distributed collaboration of seismic networks and an equally distributed network of users that demand a high standard for data quality, completeness, and ease of access. I will describe the role that a perpetual archive has in the life cycle of data, and how hosting real-time data performs a dual role: being a hub for continuous data from approximately 59 real-time networks, and distributing these (along with other data from the 40-year library of available time-series data) to researchers, while simultaneously providing shared data back to networks in real time in a way that benefits monitoring activities. I will describe aspects of our quality-assurance framework that are both passively and actively performed on 1100 seismic stations, generating over 6,000 channels of regularly sampled data arriving daily, which data providers can use as aids in operating their networks, and users can likewise use when requesting suitable data for research purposes. The goal of the DMC is to eliminate bottlenecks in data discovery and shorten the steps leading to analysis. This includes many challenges, including keeping metadata
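    For readers who want a concrete picture of "eliminating bottlenecks in data discovery", the sketch below shows one common way to request openly available time-series data from the IRIS DMC web services using the ObsPy FDSN client. It is an illustration only, not part of the abstract; the network, station and channel codes and the time window are placeholders chosen for the example.

      # Illustrative sketch: fetch a short waveform segment from the IRIS DMC
      # via its FDSN web services using ObsPy. Codes and times are placeholders.
      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      client = Client("IRIS")                               # IRIS DMC web-service endpoint
      t0 = UTCDateTime("2007-05-01T00:00:00")
      stream = client.get_waveforms(network="IU", station="ANMO",
                                    location="00", channel="BHZ",
                                    starttime=t0, endtime=t0 + 600)
      print(stream)                                         # one Trace per returned channel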

  12. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension, which addresses the aspects of big data management from both a technological and a business perspective.

  13. Development of a Pilot Data Management Infrastructure for Biomedical Researchers at University of Manchester – Approach, Findings, Challenges and Outlook of the MaDAM Project

    Directory of Open Access Journals (Sweden)

    Meik Poschen

    2012-12-01

    Full Text Available Management and curation of digital data has been becoming ever more important in a higher education and research environment characterised by large and complex data, demand for more interdisciplinary and collaborative work, extended funder requirements and use of e-infrastructures to facilitate new research methods and paradigms. This paper presents the approach, technical infrastructure, findings, challenges and outlook (including future development within the successor project, MiSS) of the ‘MaDAM: Pilot data management infrastructure for biomedical researchers at University of Manchester’ project funded under the infrastructure strand of the JISC Managing Research Data (JISCMRD) programme. MaDAM developed a pilot research data management solution at the University of Manchester based on biomedical researchers’ requirements, which includes technical and governance components with the flexibility to meet future needs across multiple research groups and disciplines.

  14. Materials science challenges in paintings.

    Science.gov (United States)

    Walter, Philippe; de Viguerie, Laurence

    2018-01-23

    Through the paintings of the old masters, we showcase how materials science today provides us with a vision of the processes involved in the creation of a work of art: the choice of materials, the painter's skill in handling these materials, and the perception of the finished work.

  15. Materials science challenges in paintings

    Science.gov (United States)

    Walter, Philippe; de Viguerie, Laurence

    2018-02-01

    Through the paintings of the old masters, we showcase how materials science today provides us with a vision of the processes involved in the creation of a work of art: the choice of materials, the painter's skill in handling these materials, and the perception of the finished work.

  16. Ocean Science Video Challenge Aims to Improve Science Communication

    Science.gov (United States)

    Showstack, Randy

    2013-10-01

    Given today's enormous management and protection challenges related to the world's oceans, a new competition calls on ocean scientists to effectively communicate their research in videos that last up to 3 minutes. The Ocean 180 Video Challenge, named for the number of seconds in 3 minutes, aims to improve ocean science communication while providing high school and middle school teachers and students with new and interesting educational materials about current science topics.

  17. Science Education: Issues, Approaches and Challenges

    Directory of Open Access Journals (Sweden)

    Shairose Irfan Jessani

    2015-06-01

    Full Text Available In today’s global education system, science education is much more than fact-based knowledge. Science education becomes meaningless and incomprehensible for learners if they are unable to relate it to their lives. It is thus recommended that Pakistan, like many other countries worldwide, should adopt the Science Technology Society (STS) approach for the delivery of science education. The purpose of the STS approach lies in developing scientifically literate citizens who can make conscious decisions about the socio-scientific issues that impact their lives. The challenges in adopting this approach for Pakistan lie in four areas that will need to be completely revamped according to the STS approach. These areas include: the examination system; science textbooks; science teacher education programs; and available resources and school facilities.

  18. Provenance Challenges for Earth Science Dataset Publication

    Science.gov (United States)

    Tilmes, Curt

    2011-01-01

    Modern science is increasingly dependent on computational analysis of very large data sets. Organizing, referencing, and publishing those data has become a complex problem. Published research that depends on such data often fails to cite the data in sufficient detail to allow an independent scientist to reproduce the original experiments and analyses. This paper explores some of the challenges related to data identification, equivalence and reproducibility in the domain of data-intensive scientific processing. It will use the example of Earth Science satellite data, but the challenges also apply to other domains.

  19. Plagiarism challenges at Ukrainian science and education

    Directory of Open Access Journals (Sweden)

    Denys Svyrydenko

    2016-12-01

    Full Text Available The article analyzes the types and severity of plagiarism violations in the modern educational and scientific spheres using philosophical methodological approaches. The author analyzes the Ukrainian context as well as the global one and tries to formulate an "order of the day" for plagiarism challenges. The plagiarism phenomenon is intuitively comprehensible to academicians, but in reality it has a very complex nature and many manifestations. Using approaches from ethics, philosophical anthropology, and the philosophy of science and education, the author formulates a series of recommendations for overcoming plagiarism challenges in Ukrainian science and education.

  20. Challenges for Data Archival Centers in Evolving Environmental Sciences

    Science.gov (United States)

    Wei, Y.; Cook, R. B.; Gu, L.; Santhana Vannan, S. K.; Beaty, T.

    2015-12-01

    Environmental science has entered a big data era as enormous data about the Earth environment are continuously collected through field and airborne missions, remote sensing observations, model simulations, sensor networks, etc. An open-access and open-management data infrastructure for data-intensive science is a major grand challenge in global environmental research (BERAC, 2010). Such an infrastructure, as exemplified in EOSDIS, GEOSS, and NSF EarthCube, will provide a complete lifecycle of environmental data and ensure that data flow smoothly among different phases of collection, preservation, integration, and analysis. Data archival centers, as the data integration units closest to data providers, serve as the driving force for compiling and integrating heterogeneous environmental data into this global infrastructure. This presentation discusses the interoperability challenges and practices of geosciences from the perspective of data archival centers, based on the operational experiences of the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and related environmental data management activities. Specifically, we will discuss the challenges to 1) encourage and help scientists to more actively share data with the broader scientific community, so that valuable environmental data, especially those dark data collected by individual scientists in small independent projects, can be shared and integrated into the infrastructure to tackle big science questions; 2) curate heterogeneous multi-disciplinary data, focusing on the key aspects of identification, format, metadata, data quality, and semantics to make them ready to be plugged into a global data infrastructure. We will highlight data curation practices at the ORNL DAAC for global campaigns such as BOREAS, LBA, SAFARI 2000; and 3) enhance the capabilities to more effectively and efficiently expose and deliver "big" environmental data to a broad range of users and systems
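    As a small illustration of the curation aspects listed above (identification, format, metadata, quality, semantics), the following sketch shows the kind of metadata-completeness check an archive might run during ingest. It is generic and hypothetical, not ORNL DAAC software, and the required field names are assumptions for the example.

      # Generic, hypothetical ingest check: which required metadata fields are
      # still missing from a submitted record? Field names are assumptions.
      REQUIRED_FIELDS = ["identifier", "title", "format", "spatial_extent",
                         "temporal_extent", "variables", "quality_flags", "license"]

      def missing_metadata(record: dict) -> list:
          """Return the required fields that are absent or empty in a metadata record."""
          return [field for field in REQUIRED_FIELDS if not record.get(field)]

      record = {"identifier": "doi:10.xxxx/example", "title": "Example flux data",
                "format": "CSV", "variables": ["NEE", "LE"]}
      print(missing_metadata(record))   # -> the fields the provider still needs to supply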

  1. Recruiting and Advising Challenges in Actuarial Science

    Science.gov (United States)

    Case, Bettye Anne; Guan, Yuanying Michelle; Paris, Stephen

    2014-01-01

    Some challenges to increasing actuarial science program size through recruiting broadly among potential students are identified. Possible solutions depend on the structures and culture of the school. Up to three student cohorts may result from partition of potential students by the levels of academic progress before program entry: students…

  2. Research data management support for large-scale, long-term, interdisciplinary collaborative research centers with a focus on environmental sciences

    Science.gov (United States)

    Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.

    2017-12-01

    Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as results of scientific output. Additionally, publications, conference contributions, reports, pictures etc. should also be managed. Both knowledge and data sharing are essential to create synergies. Within the coordinated programme `Collaborative Research Center' (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, a RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) `Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation' (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 `Earth - Evolution at the Dry Limit' (www.crc1211.de), funded since 2016. In both projects scientists from various disciplines collect heterogeneous data at field campaigns or by modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) has been designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211 needs (www.crc1211db.uni-koeln.de) in 2016. Both

  3. The AGU Data Management Maturity Model Initiative

    Science.gov (United States)

    Bates, J. J.

    2015-12-01

    In September 2014, the AGU Board of Directors approved two initiatives to help the Earth and space sciences community address the growing challenges accompanying the increasing size and complexity of data. These initiatives are: 1) Data Science Credentialing: development of a continuing education and professional certification program to help scientists in their careers and to meet growing responsibilities and requirements around data science; and 2) Data Management Maturity (DMM) Model: development and implementation of a data management maturity model to assess process maturity against best practices, and to identify opportunities in organizational data management processes. Each of these has been organized within AGU as an Editorial Board and both Boards have held kick-off meetings. The DMM model Editorial Board will recommend strategies for adapting and deploying a DMM model to the Earth and space sciences, create guidance documents to assist in its implementation, and provide input on a pilot appraisal process. This presentation will provide an overview of progress to date in the DMM model Editorial Board and plans for work to be done over the upcoming year.

  4. Symposium 1: Challenges in science education and popularization of Science

    Directory of Open Access Journals (Sweden)

    Ildeo de Castro Moreira

    2014-08-01

    Full Text Available Science education and popularization of science are important elements for social inclusion. Brazil exhibits strong inequalities regarding the distribution of wealth, access to cultural assets and appropriation of scientific and technological knowledge. Each Brazilian should have the opportunity to acquire a basic knowledge of science and its operation that allows them to understand their environment and expand their professional opportunities. However, the overall performance of Brazilian students in science and math is poor. Basic science education most often has few resources and is discouraging, with little appreciation of experimentation, interdisciplinarity and creativity. Besides the shortage of science teachers, especially well-prepared ones, poor wages and working conditions predominate, along with deficiencies in instructional materials and laboratories. Although there has been a significant expansion in access to basic education, the challenge remains to improve its quality. According to the last National Conference of STI, there is a need for profound educational reform at all levels, in particular with regard to science education. In turn, the popularization of science can be an important tool for the construction of a scientific culture and for the improvement of formal teaching. However, we still lack a comprehensive and adequate public policy for this purpose. Clearly, in recent decades, an increase in science popularization has occurred: the creation of science centers and museums; a greater media presence; the use of the internet and social networks; and outreach events, such as the National Week of Science and Technology. But the scenario remains fragile, and broad swathes of Brazilians are left without access to scientific education and to qualified information on science and technology. In this presentation, starting from a general diagnosis of the situation, some of the main challenges related to education and the popularization of science in the country will be addressed.

  5. Six Challenges for Ethical Conduct in Science.

    Science.gov (United States)

    Niemi, Petteri

    2016-08-01

    The realities of human agency and decision making pose serious challenges for research ethics. This article explores six major challenges that require more attention in the ethics education of students and scientists and in the research on ethical conduct in science. The first of them is the routinization of action, which makes the detection of ethical issues difficult. The social governance of action creates ethical problems related to power. The heuristic nature of human decision making implies the risk of ethical bias. Moral disengagement mechanisms represent a human tendency to evade personal responsibility. The greatest challenge of all might be the situational variation in people's ethical behaviour. Even minor situational factors have a surprisingly strong influence on our actions. Finally, the nature of ethics itself also causes problems: instead of clear answers, we receive a multitude of theories and intuitions that may sometimes be contradictory. All these features of action and ethics represent significant risks for ethical conduct in science. I claim that they have to be managed within the everyday practices of science and addressed explicitly in research ethics education. I analyse them and suggest some ways in which their risks can be alleviated.

  6. The Challenges of Creating a Real-Time Data Management System for TRU-Mixed Waste at the Advanced Mixed Waste Treatment Plant

    International Nuclear Information System (INIS)

    Paff, S. W; Doody, S.

    2003-01-01

    This paper discusses the challenges associated with creating a data management system for waste tracking at the Advanced Mixed Waste Treatment Plant (AMWTP) at the Idaho National Engineering Lab (INEEL). The waste tracking system combines data from plant automation systems and decision points. The primary purpose of the system is to provide information to enable the plant operators and engineers to assess the risks associated with each container and determine the best method of treating it. It is also used to track the transuranic (TRU) waste containers as they move throughout the various processes at the plant. And finally, the goal of the system is to support paperless shipments of the waste to the Waste Isolation Pilot Plant (WIPP). This paper describes the approach, methodologies, the underlying design of the database, and the challenges of creating the Data Management System (DMS) prior to completion of design and construction of a major plant. The system was built utilizing an Oracle database platform, and Oracle Forms 6i in client-server mode. The underlying data architecture is container-centric, with separate tables and objects for each type of analysis used to characterize the waste, including real-time radiography (RTR), non-destructive assay (NDA), head-space gas sampling and analysis (HSGS), visual examination (VE) and coring. The use of separate tables facilitated the construction of automatic interfaces with the analysis instruments that enabled direct data capture. Movements are tracked using a location system describing each waste container's current location and a history table tracking the container's movement history. The movement system is designed to interface both with radio-frequency bar-code devices and the plant's integrated control system (ICS). Collections of containers or information, such as batches, were created across the various types of analyses, which enabled a single, cohesive approach to be developed for verification and
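    To make the container-centric layout described above more concrete, here is a minimal sketch of that shape using SQLite as a stand-in for the plant's Oracle platform: one table per analysis type, a current-location table, and a movement-history table, with a helper that records moves. All table and column names are illustrative assumptions, not the actual AMWTP DMS schema.

      # Sketch of a container-centric tracking layout, using SQLite in place of
      # Oracle. Table and column names are assumptions, not the AMWTP schema.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE container (container_id TEXT PRIMARY KEY, waste_type TEXT);
      CREATE TABLE rtr_exam  (container_id TEXT REFERENCES container, exam_date TEXT, result TEXT);
      CREATE TABLE nda_assay (container_id TEXT REFERENCES container, assay_date TEXT, activity REAL);
      CREATE TABLE location  (container_id TEXT PRIMARY KEY REFERENCES container, current_location TEXT);
      CREATE TABLE movement_history (container_id TEXT REFERENCES container,
                                     moved_at TEXT, from_location TEXT, to_location TEXT);
      """)

      def move_container(container_id, new_location):
          """Record a move: update the current location and append to the history table."""
          row = conn.execute("SELECT current_location FROM location WHERE container_id = ?",
                             (container_id,)).fetchone()
          old_location = row[0] if row else None
          conn.execute("INSERT OR REPLACE INTO location VALUES (?, ?)", (container_id, new_location))
          conn.execute("INSERT INTO movement_history VALUES (?, datetime('now'), ?, ?)",
                       (container_id, old_location, new_location))
          conn.commit()

      conn.execute("INSERT INTO container VALUES ('D-0001', 'TRU debris')")
      move_container("D-0001", "RTR bay 2")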

  7. Developing and Teaching a Two-Credit Data Management Course for Graduate Students in Climate and Space Sciences

    Science.gov (United States)

    Thielen, Joanna; Samuel, Sara M.; Carlson, Jake; Moldwin, Mark

    2017-01-01

    Engineering researchers face increasing pressure to manage, share, and preserve their data, but these subjects are not typically a part of the curricula of engineering graduate programs. To address this situation, librarians at the University of Michigan, in partnership with the Climate and Space Sciences and Engineering Department, developed a…

  8. Historical Development and Key Issues of Data Management Plan Requirements for National Science Foundation Grants: A Review

    Science.gov (United States)

    Pasek, Judith E.

    2017-01-01

    Sharing scientific research data has become increasingly important for knowledge advancement in today's networked, digital world. This article describes the evolution of access to United States government information in relation to scientific research funded by federal grants. It analyzes the data sharing policy of the National Science Foundation,…

  9. Data management in EGEE

    Energy Technology Data Exchange (ETDEWEB)

    Frohner, Akos; Baud, Jean-Philippe; Rioja, Rosa Maria Garcia; Mollon, Remi; Smith, David; Tedesco, Paolo [CERN (Switzerland); Grosdidier, Gilbert [LAL/IN2P3/CNRS (France)

    2010-04-01

    Data management is one of the cornerstones in the distributed production computing environment that the EGEE project aims to provide for an e-Science infrastructure. We have designed and implemented a set of services and client components, addressing the diverse requirements of all user communities. LHC experiments as main users will generate and distribute approximately 15 PB of data per year worldwide using this infrastructure. Another key user community, biomedical projects, have strict security requirements with less emphasis on the volume of data. We maintain three service groups for grid data management: the Disk Pool Manager (DPM) Storage Element (with more than 100 instances deployed world-wide), the LCG File Catalogue (LFC) and the File Transfer Service (FTS), which sustains an aggregated transfer rate of 1.5 GB/sec. They are complemented by individual client components and also tools which help coordinate more complex use cases with multiple services (GFAL-client, lcg util, eds-cli). In this paper we show how these services, keeping clean and standard interfaces among each other, can work together to cover the data flow and how they can be used as individual components to cover diverse requirements. We will also describe areas that we consider for further improvements, both for performance and functionality.
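    To show how the three service groups above fit together conceptually (storage elements, a file catalogue, and a transfer service), here is a purely illustrative sketch of an upload-register-replicate flow. Every function, URL and data structure in it is hypothetical; this is not the EGEE/gLite client API.

      # Conceptual flow only: copy a file to a storage element, register the
      # replica in a catalogue, then queue a transfer to a second site.
      # All functions and endpoints are hypothetical, not the EGEE client API.
      def upload_to_storage_element(local_path: str, se_url: str) -> str:
          """Pretend to copy a local file to a storage element; return its storage URL."""
          return f"{se_url}/{local_path.rsplit('/', 1)[-1]}"

      def register_replica(catalogue: dict, lfn: str, surl: str) -> None:
          """Record that logical file name `lfn` has a physical replica at `surl`."""
          catalogue.setdefault(lfn, []).append(surl)

      def schedule_transfer(queue: list, source_surl: str, dest_se: str) -> None:
          """Queue a third-party transfer between storage elements."""
          queue.append((source_surl, dest_se))

      catalogue, transfer_queue = {}, []
      surl = upload_to_storage_element("/data/run123.raw", "srm://se1.example.org/lhcb")
      register_replica(catalogue, "/grid/lhcb/run123.raw", surl)
      schedule_transfer(transfer_queue, surl, "srm://se2.example.org/lhcb")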

  10. An operational perspective of challenging statistical dogma while establishing a modern, secure distributed data management and imaging transport system: the Pediatric Brain Tumor Consortium phase I experience.

    Science.gov (United States)

    Onar, Arzu; Ramamurthy, Uma; Wallace, Dana; Boyett, James M

    2009-04-01

    The Pediatric Brain Tumor Consortium (PBTC) is a multidisciplinary cooperative research organization devoted to the study of correlative tumor biology and new therapies for primary central nervous system (CNS) tumors of childhood. The PBTC was created in 1999 to conduct early-phase studies in a rapid fashion in order to provide a sound scientific foundation for the Children's Oncology Group to conduct definitive trials. The Operations and Biostatistics Center (OBC) of the PBTC is responsible for centrally administering study design and trial development, study conduct and monitoring, data collection and management, as well as various regulatory and compliance processes. The phase I designs utilized for the consortium trials have accommodated challenges unique to pediatric trials, such as body surface area (BSA)-based dosing in the absence of pediatric formulations of oral agents. Further, during the past decade, the OBC has developed and implemented a state-of-the-art, secure and efficient internet-based paperless distributed data management system. Additional web-based systems are also in place for tracking and distributing correlative study data as well as neuroimaging files. These systems enable effective communications among the members of the consortium and facilitate the conduct and timely reporting of multi-institutional early-phase clinical trials.
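    As a worked illustration of the BSA-based dosing mentioned above, the sketch below computes body surface area with the Mosteller formula and rounds the scaled dose to an assumed available capsule strength. The numbers are illustrative only and are not PBTC protocol values.

      import math

      def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
          """Body surface area in m^2 via the Mosteller formula: sqrt(height * weight / 3600)."""
          return math.sqrt(height_cm * weight_kg / 3600.0)

      def bsa_based_dose(dose_per_m2: float, height_cm: float, weight_kg: float,
                         capsule_mg: float) -> float:
          """Scale a per-m^2 dose by BSA, then round to the nearest whole capsule.

          Rounding to available capsule strengths illustrates the practical
          constraint of dosing oral agents that lack pediatric formulations.
          These values are examples, not protocol rules.
          """
          exact_dose = dose_per_m2 * bsa_mosteller(height_cm, weight_kg)
          return round(exact_dose / capsule_mg) * capsule_mg

      print(bsa_based_dose(dose_per_m2=100.0, height_cm=130, weight_kg=28, capsule_mg=25))  # prints 100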

  11. Data management in maintenance outsourcing

    International Nuclear Information System (INIS)

    Murthy, D.N.P.; Karim, M.R.; Ahmadi, A.

    2015-01-01

    Most businesses view maintenance as tasks carried out by technicians, and the data collected are mostly cost related. There is a growing trend towards outsourcing of maintenance, and data collection issues are not addressed properly in most maintenance service contracts. Effective maintenance management requires proper data management - data collection and analysis for decision-making. This requires a proper framework, and when maintenance is outsourced it raises several issues and challenges. The paper develops a framework for data management when maintenance is outsourced and looks at a real case study that highlights the need for proper data management. - Highlights: • Framework for data management in maintenance outsourcing. • Black-box to grey-box approaches for modelling. • Improvements to maintenance decision-making. • Case study to illustrate the approaches and the shortcomings in data collection

  12. Energy challenge and nano-sciences

    International Nuclear Information System (INIS)

    Romulus, Anne-Marie; Chamelot, Pierre; Chaudret, Bruno; Comtat, Maurice; Fajerwerg, Katia; Philippot, Karine; Geoffron, Patrice; Lacroix, Jean-Christophe; Abanades, Stephane; Flamant, Gilles; HUERTA-ORTEGA, Benjamin; Cezac, Pierre; Lincot, Daniel; Roncali, Jean; Artero, Vincent; GuiLLET, Nicolas; Fauvarque, Jean-Francois; Simon, Patrice; Taberna, Pierre-Louis

    2013-01-01

    This book first describes the role of energy in the development of nano-sciences, discusses energy needs, the perception of nano-sciences by societies as far as the energy challenge is concerned, describes the contribution of nano-catalyzers to energy and how these catalyzers are prepared. A second part addresses the new perspectives regarding carbon: production of biofuels from biomass, processes involved in CO2 geological storage, improvement of solar fuel production with the use of nano-powders. The third part describes the new orientations of solar energy: contribution of the thin-layer inorganic sector to photovoltaic conversion, perspectives for organic photovoltaic cells, operation of new dye-sensitized nanocrystalline solar cells. The fourth part addresses the hydrogen sector: credibility, contribution of biomass in hydrogen production, production of hydrogen by electrochemistry, new catalyzers for electrolyzers and fuel cells. The last part addresses improved electrochemical reactors

  13. NASA's Earth Science Enterprise: Future Science Missions, Objectives and Challenges

    Science.gov (United States)

    Habib, Shahid

    1998-01-01

    NASA has been actively involved in studying the planet Earth and its changing environment for well over thirty years. Within the last decade, NASA's Earth Science Enterprise has become a major observational and scientific element of the U.S. Global Change Research Program. NASA's Earth Science Enterprise management has developed a comprehensive observation-based research program addressing all the critical science questions that will take us into the next century. Furthermore, the entire program is being mapped to answer five Science Themes: (1) land-cover and land-use change research, (2) seasonal-to-interannual climate variability and prediction, (3) natural hazards research and applications, (4) long-term climate-natural variability and change research, and (5) atmospheric ozone research. The emergence of newer technologies on the horizon, together with a continuously declining budget environment, has led to an effort to refocus the Earth Science Enterprise activities. The intent is not to compromise the overall scientific goals, but rather to strengthen them by enabling challenging detection, computational and space flight technologies that have not been practically feasible to date. NASA is planning faster, cost-effective and relatively smaller missions to continue the science observations from space for the next decade. At the same time, there is growing interest worldwide in remote sensing, which NASA can take advantage of by building strong coalitions with a number of international partners. The focus of this presentation is to provide a comprehensive look at NASA's Earth Science Enterprise in terms of its brief history, scientific objectives, organization, activities and future direction.

  14. LIMS and Clinical Data Management.

    Science.gov (United States)

    Chen, Yalan; Lin, Yuxin; Yuan, Xuye; Shen, Bairong

    2016-01-01

    In order to achieve more accurate disease prevention, diagnosis, and treatment, clinical and genetic data need to be extensively and systematically associated and studied. As one way to achieve precision medicine, a laboratory information management system (LIMS) can effectively associate clinical data at the macrocosmic level with genomic data at the microcosmic level. This chapter summarizes the application of LIMS in clinical data management and its implementation modes. It also discusses the principles of a LIMS in clinical data management, as well as the opportunities and challenges in the context of medical informatics.
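    A toy illustration of the macro/micro association described above: joining clinical records and genomic results on a shared sample identifier, which is the kind of linkage a LIMS mediates. The field names and values are invented for the example.

      # Toy example of linking clinical (macro) and genomic (micro) records via a
      # shared sample identifier. All identifiers and values are invented.
      clinical = [{"sample_id": "S001", "patient": "P17", "diagnosis": "type 2 diabetes"},
                  {"sample_id": "S002", "patient": "P23", "diagnosis": "coronary artery disease"}]
      genomic  = [{"sample_id": "S001", "variant": "rs7903146", "genotype": "TT"},
                  {"sample_id": "S002", "variant": "rs1333049", "genotype": "CC"}]

      genomic_by_sample = {g["sample_id"]: g for g in genomic}
      linked = [{**c, **genomic_by_sample.get(c["sample_id"], {})} for c in clinical]
      for record in linked:
          print(record)   # each record now carries both the clinical and the genomic view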

  15. Architecture for Data Management

    OpenAIRE

    Vukolic, Marko

    2015-01-01

    In this document we present the preliminary architecture of the SUPERCLOUD data management and storage. We start by defining the design requirements of the architecture, motivated by use cases and then review the state-of-the-art. We survey security and dependability technologies and discuss designs for the overall unifying architecture for data management that serves as an umbrella for different security and dependability data management features. Specifically the document lays out the archi...

  16. Support for global science: Remote sensing's challenge

    Science.gov (United States)

    Estes, J. E.; Star, J. L.

    1986-01-01

    Remote sensing uses a wide variety of techniques and methods. Resulting data are analyzed by man and machine, using both analog and digital technology. The newest and most important initiatives in the U.S. civilian space program currently revolve around the space station complex, which includes the core station as well as co-orbiting and polar satellite platforms. This proposed suite of platforms and support systems offers a unique potential for facilitating long-term, multidisciplinary scientific investigations on a truly global scale. Unlike previous generations of satellites, designed for relatively limited constituencies, the space station offers the potential to provide an integrated source of information which recognizes the scientific interest in investigating the dynamic coupling between the oceans, land surface, and atmosphere. Earth scientists already face problems that are truly global in extent. Problems such as the global carbon balance, regional deforestation, and desertification require new approaches, which combine multidisciplinary, multinational research teams employing advanced technologies to produce a type, quantity, and quality of data not previously available. The challenge before the international scientific community is to continue to develop both the infrastructure and expertise to, on the one hand, advance the science and technology of remote sensing, and, on the other hand, develop an integrated understanding of global life support systems, and work toward a quantitative science of the biosphere.

  17. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened up a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, far in excess of the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support how the information is managed reliably across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can be productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, software interconnected systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, through system software stacks and libraries, to the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with the new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.
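    One widely used building block behind the reliability the abstract calls for is a fixity (checksum) manifest, used to confirm that a replica of a published product still matches the original. The sketch below is generic and not tied to any particular facility's tooling; the directory paths are placeholders.

      # Generic fixity check: build and verify a SHA-256 manifest for a directory
      # of data files, as used when products are replicated across sites.
      import hashlib
      from pathlib import Path

      def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
          digest = hashlib.sha256()
          with open(path, "rb") as f:
              for block in iter(lambda: f.read(chunk_size), b""):
                  digest.update(block)
          return digest.hexdigest()

      def build_manifest(root: str) -> dict:
          """Map each file's path (relative to root) to its SHA-256 digest."""
          root_path = Path(root)
          return {str(p.relative_to(root_path)): sha256_of(p)
                  for p in root_path.rglob("*") if p.is_file()}

      def verify(root: str, manifest: dict) -> list:
          """Return the files whose current digest no longer matches the manifest."""
          current = build_manifest(root)
          return [name for name, digest in manifest.items() if current.get(name) != digest]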

  18. COMPLEX NETWORKS IN CLIMATE SCIENCE: PROGRESS, OPPORTUNITIES AND CHALLENGES

    Data.gov (United States)

    National Aeronautics and Space Administration — COMPLEX NETWORKS IN CLIMATE SCIENCE: PROGRESS, OPPORTUNITIES AND CHALLENGES. Karsten Steinhaeuser, Nitesh V. Chawla, and Auroop R. Ganguly. Abstract: Networks have...

  19. Data Management Coordinators (DMC)

    Science.gov (United States)

    The Regional Data Management Coordinators (DMCs) were identified to serve as the primary contact for each region for all Water Quality Framework activities. They will facilitate and communicate information to the necessary individuals at the region and tra

  20. Data management of web archive research data

    DEFF Research Database (Denmark)

    Zierau, Eld; Jurik, Bolette

    This paper will provide recommendations to overcome various challenges for data management of web materials. The recommendations are based on results from two independent Danish research projects with different requirements to data management: The first project focuses on high precision on a par...

  1. Data Management Plan

    DEFF Research Database (Denmark)

    Hansen, Ernst Jan de Place; Vogelsang, Stefan; Freudenberg, Peggy

    2015-01-01

    This document describes the Data Management Plan (DMP) (first version), relating to RIBuild WP8, deliverable D8.1. The DMP includes a description of data sets, standards and metadata, data sharing, and archiving and preservation of data.

  2. ethiopian students' achievement challenges in science education

    African Journals Online (AJOL)

    IICBA01

    Oli Negassa. Adama Science and Technology University, Ethiopia ... achievement in science education across selected preparatory schools of Ethiopia. The .... To what extent do students' achievements vary across grade levels, regions,.

  3. Challenges of Women in Science: Bangladesh Perspectives

    Indian Academy of Sciences (India)

    ranjeetha

    Director, Bose Centre for Advanced Study and Research in Natural Sciences. [Slide fragments: enrolment in universities by management type and gender; encouragement in the classroom, family and environment; desired strategy.]

  4. Integrated groundwater data management

    Science.gov (United States)

    Fitch, Peter; Brodaric, Boyan; Stenson, Matt; Booth, Nathaniel; Jakeman, Anthony J.; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew

    2016-01-01

    The goal of a data manager is to ensure that data is safely stored, adequately described, discoverable and easily accessible. However, to keep pace with the evolution of groundwater studies in the last decade, the associated data and data management requirements have changed significantly. In particular, there is a growing recognition that management questions cannot be adequately answered by single-discipline studies. This has led to a push towards the paradigm of integrated modeling, where diverse parts of the hydrological cycle and its human connections are included. This chapter describes groundwater data management practices and reviews the current state of the art with enterprise groundwater database management systems. It also includes discussion of commonly used data management models, detailing typical data management lifecycles. We discuss the growing use of web services and open standards such as GWML and WaterML2.0 to exchange groundwater information and knowledge, and the need for national data networks. We also discuss cross-jurisdictional interoperability issues, based on our experience sharing groundwater data across the US/Canadian border. Lastly, we present some future trends relating to groundwater data management.
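    As a small illustration of the standards-based exchange mentioned above, the sketch below pulls time/value pairs out of a WaterML2.0 timeseries document. The namespace and element names follow the WaterML2.0 schema as understood here and should be checked against the documents actually being exchanged; the file name is a placeholder.

      # Sketch: extract (time, value) pairs from a WaterML2.0 timeseries document.
      # Namespace and element names are assumed from the WaterML2.0 schema;
      # "groundwater_levels.xml" is a placeholder file name.
      import xml.etree.ElementTree as ET

      WML2_NS = "http://www.opengis.net/waterml/2.0"
      NSMAP = {"wml2": WML2_NS}

      def read_timeseries(path: str):
          tree = ET.parse(path)
          points = []
          for tvp in tree.iter(f"{{{WML2_NS}}}MeasurementTVP"):   # time-value pairs
              t = tvp.find("wml2:time", NSMAP)
              v = tvp.find("wml2:value", NSMAP)
              if t is not None and v is not None and v.text:
                  points.append((t.text, float(v.text)))
          return points

      # points = read_timeseries("groundwater_levels.xml")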

  5. Challenges to implementing "best available science"

    Science.gov (United States)

    Vita Wright

    2010-01-01

    Interagency wildland fire policy directs managers to apply "best available science" to management plans and activities. But what does "best available science" mean? With a vague definition of this concept and few guidelines for delivering or integrating science into management, it can be difficult for scientists to effectively provide managers with...

  6. Essential Partnerships in the Data Management Life Cycle

    Science.gov (United States)

    Kinkade, D.; Allison, M. D.; Chandler, C. L.; Copley, N. J.; Gegg, S. R.; Groman, R. C.; Rauch, S.

    2015-12-01

    An obvious product of the scientific research process is data. Today's geoscience research efforts can rapidly produce an unprecedented volume of multidisciplinary data that can pose management challenges for the facility charged with curating that information. How do these facilities achieve efficient data management in a high volume, heterogeneous data world? Partnerships are critical, especially for small to mid-sized data management offices, such as those dedicated to academic research communities. The idea of partnerships can encompass a wide range of collaborative relationships aimed at helping these facilities meet the evolving needs of their communities. However, one basic and often overlooked partnership in the data management process is that of the information manager and the Principal Investigator (PI) or data originator. Such relationships are critical in discerning the best possible management strategy, and in obtaining the most robust metadata necessary for reuse of multidisciplinary datasets. Partnerships established early in the data life cycle enable efficient management and dissemination of data in high volumes and heterogeneous formats. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created to fulfill the data management needs of PIs funded by the NSF Ocean Sciences Biological and Chemical Sections, and Division of Polar Programs. Since its inception, the Office has relied upon the close relationships it cultivates between its data managers and PIs in order to provide effective data management for a wide variety of ecological and biogeochemical oceanographic data. This presentation will highlight some of the successful partnerships BCO-DMO has made with individual and collaborative investigators, as well as those with other data managers representing specific research communities.

  7. Data management for environmental research

    International Nuclear Information System (INIS)

    Strand, R.H.

    1976-01-01

    The objective of managing environmental research data is to develop a resource sufficient for the study and potential solution of environmental problems. Consequently, environmental data management must include a broad spectrum of activities ranging from statistical analysis and modeling, through data set archiving, to computer hardware procurement. This paper briefly summarizes the data management requirements for environmental research and the techniques and automated procedures which are currently used by the Environmental Sciences Division at Oak Ridge National Laboratory. Included in these requirements are readily retrievable data, data indexed by categories for retrieval and application, data documentation (including collection methods), design and error bounds, easily used analysis and display programs, and file manipulation routines. The Statistical Analysis System (SAS) and other systems provide the automated procedures and techniques for analysis and management of environmental research data

  8. DIRAC Data Management System

    CERN Document Server

    Smith, A C

    2007-01-01

    The LHCb experiment being built to utilize CERN’s flagship Large Hadron Collider will generate data to be analysed by a community of over 600 physicists worldwide. DIRAC, LHCb’s Workload and Data Management System, facilitates the use of underlying EGEE Grid resources to generate, process and analyse this data in the distributed environment. The Data Management System, presented here, provides real-time, data-driven distribution in accordance with LHCb’s Computing Model. The data volumes produced by the LHC experiments are unprecedented, rendering individual institutes, and even countries, unable to provide the computing and storage resources required to make full use of the produced data. EGEE Grid resources make the processing of LHCb data possible in a distributed fashion, and LHCb’s Computing Model is based on this approach. Data Management in this environment requires reliable and high-throughput transfer of data, homogeneous access to storage resources and the cataloguing of data replicas, all of...
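
    The replica-catalogue and data-driven placement ideas described in this abstract can be illustrated with a small sketch. This is not DIRAC's actual API; the class, the logical file name, the storage-element names and the placement policy below are invented for illustration.

```python
# Minimal illustration of a replica catalogue with data-driven placement.
# Names (LFN, storage elements, policy) are hypothetical, not DIRAC's real API.
from collections import defaultdict

class ReplicaCatalogue:
    """Maps logical file names (LFNs) to the storage elements holding a copy."""
    def __init__(self):
        self._replicas = defaultdict(set)

    def register(self, lfn: str, storage_element: str) -> None:
        self._replicas[lfn].add(storage_element)

    def replicas(self, lfn: str) -> set:
        return set(self._replicas[lfn])

def plan_transfers(catalogue, lfn, target_sites, preferred_order):
    """Return (source, destination) pairs for sites still missing the file,
    choosing the first available replica in a site-preference order."""
    existing = catalogue.replicas(lfn)
    if not existing:
        raise LookupError(f"no replica registered for {lfn}")
    source = next((se for se in preferred_order if se in existing),
                  next(iter(existing)))
    return [(source, site) for site in target_sites if site not in existing]

if __name__ == "__main__":
    rc = ReplicaCatalogue()
    rc.register("/lhcb/data/run1234/raw.dst", "CERN-RAW")
    # Computing-model style policy: every site listed here should hold a copy.
    tier1_sites = ["CNAF-DST", "GRIDKA-DST", "IN2P3-DST"]
    for src, dst in plan_transfers(rc, "/lhcb/data/run1234/raw.dst",
                                   tier1_sites, preferred_order=["CERN-RAW"]):
        print(f"schedule transfer {src} -> {dst}")
```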

  9. Capable and credible? Challenging nutrition science

    NARCIS (Netherlands)

    Penders, Bart; Wolters, Anna; Feskens, Edith F.; Brouns, Fred; Huber, Machteld; Maeckelberghe, Els L.M.; Navis, Gerjan; Ockhuizen, Theo; Plat, Jogchum; Sikkema, Jan; Stasse-Wolthuis, Marianne; Veer, van 't Pieter; Verweij, Marcel; Vries, de Jan

    2017-01-01

    Nutrition science has enriched our understanding of how to stay healthy by producing valuable knowledge about the interaction of nutrients, food, and the human body. Nutrition science also has raised societal awareness about the links between food consumption and well-being, and provided the basis

  10. Data Management Plan

    DEFF Research Database (Denmark)

    Hansen, Ernst Jan de Place; Sørensen, Nils Lykke

    2016-01-01

    This document describes the Data Management Plan (DMP) (second version), relating to RIBuild WP8, deliverable D8.1. It draws the first lines for how data can be made findable, accessible, interoperable and re-usable after the project period.

  11. The opportunities and challenges for ICT in science education

    OpenAIRE

    Ferk Savec, Vesna

    2017-01-01

    This article examines the opportunities and challenges for the use of ICT in science education in the light of science teachers’ Technological Pedagogical Content Knowledge (TPACK). Some of the variables that have been studied with regard to the TPACK framework in science classrooms (such as teachers’ self-efficacy, gender, teaching experience, teachers’ beliefs, etc.) are reviewed, and variations of the TPACK framework specific for science education ...

  12. Spatially explicit data: stewardship and ethical challenges in science.

    Science.gov (United States)

    Hartter, Joel; Ryan, Sadie J; Mackenzie, Catrina A; Parker, John N; Strasser, Carly A

    2013-09-01

    Scholarly communication is at an unprecedented turning point created in part by the increasing saliency of data stewardship and data sharing. Formal data management plans represent a new emphasis in research, enabling access to data at higher volumes and more quickly, and the potential for replication and augmentation of existing research. Data sharing has recently transformed the practice, scope, content, and applicability of research in several disciplines, in particular in relation to spatially specific data. This lends exciting potentiality, but the most effective ways in which to implement such changes, particularly for disciplines involving human subjects and other sensitive information, demand consideration. Data management plans, stewardship, and sharing impart distinctive technical, sociological, and ethical challenges that remain to be adequately identified and remedied. Here, we consider these and propose potential solutions for their amelioration.

  13. Some Challenges for eScience Liaison

    Directory of Open Access Journals (Sweden)

    Graham Pryor

    2007-12-01

    The Digital Curation Centre’s promotion of expertise and good practice in digital data curation is no mere exercise in theory. Through its new eScience Liaison initiative the DCC has kept a close eye on its founding principle, that the necessity for the physical and life sciences to share access to digital research resources is due mainly to issues characteristic of eScience. This article describes some of the principal liaison activities that have been addressed within that community since the summer of 2007.

  14. Research Data Management Education for Future Curators

    Directory of Open Access Journals (Sweden)

    Mark Scott

    2013-06-01

    Science has progressed by “standing on the shoulders of giants” and for centuries research and knowledge have been shared through the publication and dissemination of books, papers and scholarly communications. Moving forward, much of our understanding builds on (large-scale) datasets, which have been collected or generated as part of the scientific process of discovery. How will this be made available for future generations? How will we ensure that, once collected or generated, others can stand on the shoulders of the data we produce? Educating students about the challenges and opportunities of data management is a key part of the solution and helps the researchers of the future to start to think about the problems early on in their careers. We have compiled a set of case studies to show the similarities and differences in data between disciplines, and produced a booklet for students containing the case studies and an introduction to the data lifecycle and other data management practices. This has already been used at the University of Southampton within the Faculty of Engineering and is now being adopted centrally for use in other faculties. In this paper, we will provide an overview of the case studies and the guide, and reflect on the reception the guide has had to date.

  15. Editorial: Challenges of Social Science Literacy

    Directory of Open Access Journals (Sweden)

    Birgit Weber

    2010-12-01

    Since international tests compare the performance of students in different subjects, the issue of literacy in the social science subjects is becoming more pressing. Successes and failures in international tests influence national education policies considerably. First, the inclusion of subjects in international comparisons has consequences for their perceived importance. Second, the race in the Olympics of education leads to an increasing focus on the output of educational processes, also measured in central exams. The social sciences can refuse to take part in national comparison studies at the price of losing much of their importance, or they can participate at the risk of undermining their goals. This raises many questions: What competences do students need in this social world to reason about it and to act responsibly? What foundation of social science concepts do students need for guidance and for understanding their place and role as individuals in society? The social science disciplines (sociology, political science and economics in a narrow sense; history, law and geography in a broader sense; supported by philosophy, pedagogy and psychology) are able to select such concepts for educational purposes and to determine such educational aims. This journal wants to summarise and discuss competences and core concepts for political and economic teaching and learning as “Social Science Literacy”. Contributions in this issue discuss and recommend competences and core concepts not only from a domain-specific political or economic point of view, but also from interdisciplinary and psychological points of view. They analyse preconditions and interdependencies as well as obstacles and problems in developing and diagnosing core concepts and competences of Social Science Literacy.

  16. Challenges to the Indicators on Science, Technology and Innovation Development

    OpenAIRE

    Chobanova, Rossitsa

    2006-01-01

    The paper attempts to define the challenges to indicators of science, technology and innovation development that result from the contemporary dynamics of the global knowledge-based economy and from the need to identify specific national priority dimensions for publicly funded research and innovation projects, illustrated by the case of Bulgaria. It is argued that the most widespread recent methodologies for positioning science, technology and innovation indicators do not ...

  17. Toward a Big Data Science: A challenge of "Science Cloud"

    Science.gov (United States)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    During the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental/observational (second) approaches. The variety of data yielded by the second approach keeps growing, owing to progress in experimental and observational technologies. The amount of data generated by the third methodology keeps growing as well, owing to the tremendous development of supercomputers and programming techniques. Most of the data files created by experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to carry out experiments, observations or numerical simulations, but in what information (new findings) can be extracted from the data. However, data do not usually tell us anything about the science by themselves; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find new science. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments, observations and numerical simulations grow, new techniques and facilities are required to extract information from large numbers of data files. This technique is called informatics, a fourth methodology for new sciences. Any methodology must work on its own facilities: in space science, for example, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics, which has been developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science

  18. Ethical challenges for the life sciences

    NARCIS (Netherlands)

    Korthals, M.J.J.A.A.

    2004-01-01

    In this book we will first discuss broader issues of ethics of the life sciences, which enable us later on to focus on the more specific issues. Therefore, we begin with two contributions on the ethical issues of working in organizations. A fruitful side effect of this start is that it gives a good

  19. Data management redefined

    Directory of Open Access Journals (Sweden)

    Nimita Limaye

    2010-01-01

    Core perspectives on the traditional approach to CDM are rapidly changing, and EDC and new eClinical initiatives are redefining the face of data management. Associated with EDC are not only higher efficiencies, resulting in lower study costs, but also its applications in key areas such as adaptive trials and clinical event adjudication; however, the cost and effort involved in deployment and integration remain a deterrent. The role of the data manager may change to that of a data broker who manages the exchange of data from multiple sources, and semantic interoperability, data standards and data privacy will prove to be the defining factors. Simulation modeling, pharmacogenomics, personalized medicine and EHRs will no longer exist as silos, and seamless data flows will be the drivers of healthcare solutions.

  20. TFTR data management system

    International Nuclear Information System (INIS)

    Randerson, L.; Chu, J.; Ludescher, C.; Malsbury, J.; Stark, W.

    1986-01-01

    Developments in the tokamak fusion test reactor (TFTR) data management system supporting data acquisition and off-line physics data reduction are described. Data from monitor points, timing channels, transient recorder channels, and other devices are acquired and stored for use by on-line tasks. Files are transferred off-line automatically. A configuration utility determines data acquired and files transferred. An event system driven by file arrival activates off-line reduction processes. A post-run process transfers files not shipped during runs. Files are archived to tape and are retrievable by digraph and shot number. Automatic skimming based on most recent access, file type, shot numbers, and user-set protection maintains the files required for post-run data reduction
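
    The "event system driven by file arrival" described above can be sketched as a simple polling watcher that triggers an off-line reduction step for each newly arrived shot file. The directory layout, file-naming convention and reduction stub below are assumptions for illustration, not the TFTR implementation.

```python
# Illustrative sketch of file-arrival-driven processing: poll an incoming
# directory and launch a reduction step for each new shot file. The paths,
# naming convention and reduction placeholder are hypothetical.
import pathlib
import re
import time

INCOMING = pathlib.Path("incoming")      # files transferred off-line land here
SHOT_PATTERN = re.compile(r"shot_(\d+)\.dat$")

def reduce_shot(path: pathlib.Path, shot_number: int) -> None:
    # Placeholder for the physics data-reduction process.
    print(f"reducing shot {shot_number} from {path}")

def watch(poll_seconds: float = 5.0) -> None:
    seen = set()
    while True:                          # runs until interrupted
        for path in INCOMING.glob("shot_*.dat"):
            if path in seen:
                continue
            match = SHOT_PATTERN.search(path.name)
            if match:
                reduce_shot(path, int(match.group(1)))
            seen.add(path)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    INCOMING.mkdir(exist_ok=True)
    watch()
```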

  1. Data Management in Practice

    DEFF Research Database (Denmark)

    Hansen, Karsten Kryger; Hüser, Falco Jonas; Lavanchy, Paula Maria Martinez

    This report presents the results of the Data Management i Praksis (DMiP) project (in English: Data Management in Practice). The project was funded by Denmark’s Electronic Research Library (DEFF), the National Danish Archives and the participating main Danish libraries. The following partners ... covering all aspects of the lifecycle of research data: from application, through the research phase, and finally to the dissemination of results and sharing of research data. The setup was to be based on researchers’ demands, and the suggestions and results of the project were to be at an international level. The project should also demonstrate that research libraries have a role to play regarding research data. Furthermore, the project should ensure development of competences at the libraries, which can then be used in the future process of managing research data.

  2. Psychological Challenges Confronting Women in the Sciences.

    Science.gov (United States)

    Moulton, Ruth

    1979-01-01

    A woman psychiatrist points out the internal psychological challenges that threaten to undermine a woman's effectiveness to fight for her rights because of inner fears of which she may not be aware. Four intrapsychic conflicts of women scientists are discussed, described briefly, and illustrated by specific case examples. (Author/MK)

  3. The challenges of 'e-science'

    CERN Multimedia

    Dickson, D

    2003-01-01

    "Last week's World Summit on the Information Society endorsed the use of electronic media to support sceintific developments and their applications to social needs. The challenge now is how to achieve this as effectively as possible" (1 1/2 pages)

  4. Technology and Science Education: New Challenges

    Science.gov (United States)

    García, Beatriz Amante; Martínez, María Martínez

    2017-01-01

    The first editorial of the new year usually presents an analysis of the journal's evolution. This article provides a reflection on the changes the journal has undergone over the years, and the challenges it will face in 2017. The journal expresses pride in advocating for international scholars, allowing authors to speak with their own voices, and…

  5. Science Communication Through Art: Objectives, Challenges, and Outcomes.

    Science.gov (United States)

    Lesen, Amy E; Rogan, Ama; Blum, Michael J

    2016-09-01

    The arts are becoming a favored medium for conveying science to the public. Tracking trending approaches, such as community-engaged learning, alongside challenges and goals can help establish metrics to achieve more impactful outcomes, and to determine the effectiveness of arts-based science communication for raising awareness or shaping public policy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. The Challenge of Gender Gap in Science and Technology Among ...

    African Journals Online (AJOL)

    The Challenge of Gender Gap in Science and Technology Among ... of Mkar shows that the gender gap in core science and computer courses is too wide to be ... tuition scholarship and the introduction of sexuality education for the purpose of ...

  7. The Ethical Challenges of Socially Responsible Science.

    Science.gov (United States)

    Resnik, David B; Elliott, Kevin C

    2016-01-01

    Social responsibility is an essential part of the responsible conduct of research that presents difficult ethical questions for scientists. Recognizing one's social responsibilities as a scientist is an important first step toward exercising social responsibility, but it is only the beginning, since scientists may confront difficult value questions when deciding how to act responsibly. Ethical dilemmas related to socially responsible science fall into at least three basic categories: 1) dilemmas related to problem selection, 2) dilemmas related to publication and data sharing, and 3) dilemmas related to engaging society. In responding to these dilemmas, scientists must decide how to balance their social responsibilities against other professional commitments and how to avoid compromising their objectivity. In this article, we will examine the philosophical and ethical basis of social responsibility in science, discuss some of the ethical dilemmas related to exercising social responsibility, and make five recommendations to help scientists deal with these issues.

  8. Future challenges in nuclear science education

    International Nuclear Information System (INIS)

    Yates, S.W.

    1993-01-01

    The role of the Division of Nuclear Chemistry and Technology of the American Chemical Society in nuclear science education is reviewed, and suggestions for enhanced involvement in additional areas are presented. Possible new areas of emphasis, such as educational programs for pre-college students and the non-scientific public, are discussed. Suggestions for revitalizing the position of radiochemistry laboratories in academic institutions are offered. (author) 7 refs

  9. New ethical challenges in science and technology

    International Nuclear Information System (INIS)

    NONE

    2001-01-01

    The published research features some of the nation's leading scientists and engineers, as well as science policy experts, and discusses a wide range of issues and topics. These include the economic and social pressure impacting biomedical research, the impossibility of predicting all the behaviors of increasingly complex, engineered systems, a look at the new federal guidelines for misconduct and new wrinkles on faculty conflicts of interest

  10. Army Science & Technology: Problems and Challenges

    Science.gov (United States)

    2012-03-01

    Boundary Conditions: Who: Small Units in COIN/Stability Operations. What: Provide affordable real-time translations and ... Soldiers, Leaders and Units in complex tactical operations exceeds the Army’s current capability for home-station ... Challenge: Formulate an S&T program ... Formulate an S&T program to capture, process and electronically disseminate near-real-time medical information on Soldiers (advance trauma management).

  11. A partnership approach to research data management

    OpenAIRE

    Brown, Mark L.; White, Wendy

    2013-01-01

    This paper outlines developments to support and enhance research data management policy and practice at the University of Southampton. It details a research-led approach to identify institutional challenges and priorities and the use of this evidence base to inform the creation of a 10 year roadmap and policy framework. The particular issues relating to workflow, storage, security and archiving are discussed and examples are given of both pilot and embedded services including data management planning s...

  12. Spatial Data Management

    CERN Document Server

    Mamoulis, Nikos

    2011-01-01

    Spatial database management deals with the storage, indexing, and querying of data with spatial features, such as location and geometric extent. Many applications require the efficient management of spatial data, including Geographic Information Systems, Computer Aided Design, and Location Based Services. The goal of this book is to provide the reader with an overview of spatial data management technology, with an emphasis on indexing and search techniques. It first introduces spatial data models and queries and discusses the main issues of extending a database system to support spatial data.
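
    As a minimal illustration of the indexing and search techniques the book surveys, the sketch below buckets points into a uniform grid and answers rectangular range queries. Production systems typically rely on R-trees or similar structures; the grid here only conveys the idea.

```python
# A deliberately simple spatial index: a uniform grid that buckets points by
# cell and supports rectangular range queries.
from collections import defaultdict

class GridIndex:
    def __init__(self, cell_size: float):
        self.cell = cell_size
        self.buckets = defaultdict(list)   # (ix, iy) -> [(x, y, payload), ...]

    def _key(self, x: float, y: float):
        return (int(x // self.cell), int(y // self.cell))

    def insert(self, x: float, y: float, payload) -> None:
        self.buckets[self._key(x, y)].append((x, y, payload))

    def range_query(self, xmin, ymin, xmax, ymax):
        kx0, ky0 = self._key(xmin, ymin)
        kx1, ky1 = self._key(xmax, ymax)
        for ix in range(kx0, kx1 + 1):
            for iy in range(ky0, ky1 + 1):
                for x, y, payload in self.buckets.get((ix, iy), []):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        yield payload

if __name__ == "__main__":
    idx = GridIndex(cell_size=10.0)
    idx.insert(12.5, 3.2, "cafe")
    idx.insert(55.0, 40.1, "park")
    print(list(idx.range_query(0, 0, 20, 20)))   # -> ['cafe']
```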

  13. Introduction to the special section on peer-to-peer computing and web data management

    Institute of Scientific and Technical Information of China (English)

    Aoying ZHOU

    2008-01-01

    Peer-to-peer (P2P) computing has been attracting attention from quite a few researchers and practitioners from different fields of computer science, such as networking, distributed computing, and database. In a P2P environment, data management becomes a challenging issue.

  14. The Challenges Faced by New Science Teachers in Saudi Arabia

    Science.gov (United States)

    Alsharari, Salman

    Growing demand for science teachers in the Kingdom of Saudi Arabia, fed by increasing numbers of public school students, is forcing the Saudi government to attract, recruit and retain well-qualified science teachers. Beginning science teachers enter the profession with great fulfilment and satisfaction in their roles as teachers educating children in a science classroom. Nevertheless, over their early years of practice, teachers encounter numerous challenges in providing the most effective science instruction. Therefore, the current study aimed to identify academic and behavioral classroom challenges faced by science teachers in their first three years of teaching in the Kingdom of Saudi Arabia. In addition, differences in perceptions of the challenges encountered at work were analyzed by new science teachers' gender, school level and years of teaching experience. The present study also investigated various types of support that new science teachers may need to overcome academic and behavioral classroom challenges. In order to gain insights about ways to adequately support novice science teachers, it was important to examine new science teachers' beliefs, ideas and perceptions about effective science teaching. Three survey questionnaires were developed and distributed to teachers of both sexes who had been teaching science subjects for less than three years to elementary, middle and high school students in Al Jouf public schools. A total of 49 novice science teachers responded to the survey and 9 of them agreed to participate voluntarily in a face-to-face interview. Different statistical procedures and multiple qualitative methodologies were used to analyze the collected data. Findings suggested that the top three academic challenges faced by new science teachers were: poor quality of teacher preparation programs, absence of appropriate school equipment and facilities, and lack of classroom materials and instructional

  15. Computational science: Emerging opportunities and challenges

    International Nuclear Information System (INIS)

    Hendrickson, Bruce

    2009-01-01

    In the past two decades, computational methods have emerged as an essential component of the scientific and engineering enterprise. A diverse assortment of scientific applications has been simulated and explored via advanced computational techniques. Computer vendors have built enormous parallel machines to support these activities, and the research community has developed new algorithms and codes, and agreed on standards to facilitate ever more ambitious computations. However, this track record of success will be increasingly hard to sustain in coming years. Power limitations constrain processor clock speeds, so further performance improvements will need to come from ever more parallelism. This higher degree of parallelism will require new thinking about algorithms, programming models, and architectural resilience. Simultaneously, cutting edge science increasingly requires more complex simulations with unstructured and adaptive grids, and multi-scale and multi-physics phenomena. These new codes will push existing parallelization strategies to their limits and beyond. Emerging data-rich scientific applications are also in need of high performance computing, but their complex spatial and temporal data access patterns do not perform well on existing machines. These interacting forces will reshape high performance computing in the coming years.

  16. WCS Challenges for NASA's Earth Science Data

    Science.gov (United States)

    Cantrell, S.; Swentek, L.; Khan, A.

    2017-12-01

    In an effort to ensure that data in NASA's Earth Observing System Data and Information System (EOSDIS) is available to a wide variety of users through the tools of their choice, NASA continues to focus on exposing data and services using standards-based protocols. Specifically, this work has focused recently on the Web Coverage Service (WCS). Experience has been gained in data delivery via GetCoverage requests, starting out with WCS v1.1.1. The pros and cons of both the version itself and different implementation approaches will be shared during this session. Additionally, due to limitations with WCS v1.1.1's ability to work with NASA's Earth science data, this session will also discuss the benefit of migrating to WCS 2.0.1 with EO-x to enrich this capability to meet a wide range of anticipated user needs. This will enable subsetting and various types of data transformations to be performed on a variety of EOS data sets.
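
    A GetCoverage request of the kind discussed above can be expressed as key-value-pair parameters. The sketch below assembles a WCS 2.0.1 request with spatial subsetting; the endpoint URL, coverage identifier and axis labels (Lat/Long, which depend on the coverage's CRS) are placeholders rather than a real NASA service.

```python
# Sketch of a WCS 2.0.1 GetCoverage request with spatial subsetting, built as
# key-value-pair (KVP) parameters. The endpoint and coverage identifier below
# are placeholders, not a real service.
from urllib.parse import urlencode

def getcoverage_url(endpoint: str, coverage_id: str,
                    lat: tuple, lon: tuple, fmt: str = "image/tiff") -> str:
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage_id),
        ("subset", f"Lat({lat[0]},{lat[1]})"),
        ("subset", f"Long({lon[0]},{lon[1]})"),
        ("format", fmt),
    ]
    return f"{endpoint}?{urlencode(params)}"

if __name__ == "__main__":
    print(getcoverage_url("https://example.org/wcs",   # placeholder endpoint
                          "MOD11A1_LST_Day",           # placeholder coverage id
                          lat=(30.0, 40.0), lon=(-100.0, -90.0)))
```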

  17. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science. (topical review)

  18. Opportunities and Challenges for the Life Sciences Community

    Science.gov (United States)

    Stewart, Elizabeth; Ozdemir, Vural

    2012-01-01

    Abstract Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19–20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16–17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org) was formed to become a Digital Commons for the life sciences community. PMID:22401659

  19. Linked data management

    CERN Document Server

    Hose, Katja; Schenkel, Ralf

    2014-01-01

    Linked Data Management presents techniques for querying and managing Linked Data that is available on today’s Web. The book shows how the abundance of Linked Data can serve as fertile ground for research and commercial applications. The text focuses on aspects of managing large-scale collections of Linked Data. It offers a detailed introduction to Linked Data and related standards, including the main principles distinguishing Linked Data from standard database technology. Chapters also describe how to generate links between datasets and explain the overall architecture of data integration systems based on Linked Data. A large part of the text is devoted to query processing in different setups. After presenting methods to publish relational data as Linked Data and efficient centralized processing, the book explores lookup-based, distributed, and parallel solutions. It then addresses advanced topics, such as reasoning, and discusses work related to read-write Linked Data for system interoperation. Desp...
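
    A minimal sketch of querying Linked Data, assuming the open-source Python library rdflib is installed: two tiny datasets are joined through an owl:sameAs link and queried with SPARQL. The URIs and data are invented for illustration.

```python
# Two tiny "datasets" linked by owl:sameAs, loaded into one graph and queried.
from rdflib import Graph

TURTLE = """
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .

<http://example.org/datasetA/person/1> foaf:name "Ada Lovelace" .
<http://example.org/datasetA/person/1> owl:sameAs <http://example.org/datasetB/author/42> .
<http://example.org/datasetB/author/42> foaf:mbox <mailto:ada@example.org> .
"""

g = Graph()
g.parse(data=TURTLE, format="turtle")

QUERY = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
SELECT ?name ?mbox WHERE {
    ?a foaf:name ?name .
    ?a owl:sameAs ?b .
    ?b foaf:mbox ?mbox .
}
"""

for name, mbox in g.query(QUERY):
    print(name, mbox)
```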

  20. TFTR data management system

    International Nuclear Information System (INIS)

    Randerson, L.; Chu, J.; Ludescher, C.; Malsbury, J.; Stark, W.

    1986-01-01

    Developments in the tokamak fusion test reactor (TFTR) data-management system supporting data acquisition and off-line physics data reduction are described. Data from monitor points, timing channels, transient recorder channels, and other devices are acquired and stored for use by on-line tasks. Files are transferred off line automatically. A configuration utility determines data acquired and files transferred. An event system driven by file arrival activates off-line reduction processes. A post-run process transfers files not shipped during runs. Files are archived to tape and are retrievable by digraph and shot number. Automatic skimming based on most recent access, file type, shot numbers, and user-set protections maintains the files required for post-run data reduction

  1. ATF data management

    International Nuclear Information System (INIS)

    Kannan, K.L.; Baylor, L.R.

    1988-01-01

    Data management for the Advanced Toroidal Facility (ATF), a stellarator located at Oak Ridge National Laboratory, is provided by DMG, a locally developed, VAX-based software system. DMG is a data storage and retrieval software system that provides the user interface to ATF raw and analyzed data. Data are described in terms of data models and data types and are organized as signals into files, which are internally documented. The system was designed with user accessibility, software maintainability, and extensibility as primary goals. Extensibility features include compatibility with ATF as it moves from pulsed to steady-state operation and capability for use of the DMG system with experiments other than ATF. DMG is implemented as a run-time library of routines available as a shareable image. General-purpose and specialized data acquisition and analysis applications have been developed using the DMG system. This article describes the DMG system and the interfaces to it
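
    The idea of signals stored in internally documented files can be sketched as records that carry their own data-model, data-type and unit metadata alongside the samples. The file layout and field names below are invented, not the actual DMG formats.

```python
# Sketch of a self-describing signal file: metadata travels with the samples.
import json
import pathlib

def write_signal(path: pathlib.Path, name: str, samples: list,
                 data_model: str, data_type: str, units: str) -> None:
    document = {
        "signal": name,
        "data_model": data_model,     # e.g. "timeseries"
        "data_type": data_type,       # e.g. "float32"
        "units": units,
        "n_samples": len(samples),
        "samples": samples,
    }
    path.write_text(json.dumps(document, indent=2))

def read_signal(path: pathlib.Path) -> dict:
    document = json.loads(path.read_text())
    assert document["n_samples"] == len(document["samples"]), "corrupt signal file"
    return document

if __name__ == "__main__":
    f = pathlib.Path("shot_0042_density.json")
    write_signal(f, "line_density", [1.0, 1.2, 1.4],
                 "timeseries", "float32", "1e19 m^-3")
    print(read_signal(f)["units"])
```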

  2. Data Management System

    Science.gov (United States)

    1997-01-01

    CENTRA 2000 Inc., a wholly owned subsidiary of Auto-trol technology, obtained permission to use software originally developed at Johnson Space Center for the Space Shuttle and early Space Station projects. To support their enormous information-handling needs, a product data management, electronic document management and work-flow system was designed. Initially, just 33 database tables comprised the original software, which was later expanded to about 100 tables. This system, now called CENTRA 2000, is designed for quick implementation and supports the engineering process from preliminary design through release-to-production. CENTRA 2000 can also handle audit histories and provides a means to ensure new information is distributed. The product has 30 production sites worldwide.

  3. Achievements and Challenges in the Science of Space Weather

    Science.gov (United States)

    Koskinen, Hannu E. J.; Baker, Daniel N.; Balogh, André; Gombosi, Tamas; Veronig, Astrid; von Steiger, Rudolf

    2017-11-01

    In June 2016 a group of 40 space weather scientists attended the workshop on Scientific Foundations of Space Weather at the International Space Science Institute in Bern. In this lead article to the volume based on the talks and discussions during the workshop we review some of the main past achievements in the field and outline some of the challenges that the science of space weather is facing today and in the future.

  4. Science Drivers and Technical Challenges for Advanced Magnetic Resonance

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Karl T.; Pruski, Marek; Washton, Nancy M.; Lipton, Andrew S.

    2013-03-07

    This report recaps the "Science Drivers and Technical Challenges for Advanced Magnetic Resonance" workshop, held in late 2011. This exploratory workshop's goal was to discuss and address challenges for the next generation of magnetic resonance experimentation. During the workshop, participants from throughout the world outlined the science drivers and instrumentation demands for high-field dynamic nuclear polarization (DNP) and associated magnetic resonance techniques, discussed barriers to their advancement, and deliberated the path forward for significant and impactful advances in the field.

  5. Web-scale data management for the cloud

    CERN Document Server

    Lehner, Wolfgang

    2013-01-01

    The efficient management of a consistent and integrated database is a central task in modern IT and highly relevant for science and industry. Hardly any critical enterprise solution comes without any functionality for managing data in its different forms. Web-Scale Data Management for the Cloud addresses fundamental challenges posed by the need and desire to provide database functionality in the context of the Database as a Service (DBaaS) paradigm for database outsourcing. This book also discusses the motivation of the new paradigm of cloud computing, and its impact to data outsourcing and se

  6. Meeting global health challenges through operational research and management science.

    Science.gov (United States)

    Royston, Geoff

    2011-09-01

    This paper considers how operational research and management science can improve the design of health systems and the delivery of health care, particularly in low-resource settings. It identifies some gaps in the way operational research is typically used in global health and proposes steps to bridge them. It then outlines some analytical tools of operational research and management science and illustrates how their use can inform some typical design and delivery challenges in global health. The paper concludes by considering factors that will increase and improve the contribution of operational research and management science to global health.

  7. Challenging hyperprofessionalisation vs. hyperpopularisation in the history of science

    DEFF Research Database (Denmark)

    Nielsen, Kristian Hvidtfelt

    Recently, Steven Shapin has identified a pathological form of professionalism in the history of science. He calls the disease hyperprofessionalism. Its symptoms include self-referentiality, self-absorption, and a narrowing of intellectual focus. Partly as a result of hyperprofessionalism, the history of science profession now suffers from a crisis of readership. In contrast, ever since the publication of Dava Sobel's surprising bestseller, Longitude, popular history of science has dramatically increased its readership. Some historians of science lament the Sobel Effect, whereas others take up the challenge by writing books for a broader audience. In effect, historians of science seemed to be faced with the choice between hyperprofessionalisation and hyperpopularisation. This paper attempts a first deconstruction of the twin notions of hyperprofessionalisation vs. hyperpopularisation.

  8. Chemistry Students' Challenges in Using MBL's in Science Laboratories.

    Science.gov (United States)

    Atar, Hakan Yavuz

    Understanding students' challenges about using microcomputer based laboratories (MBLs) would provide important data in understanding the appropriateness of using MBLs in high school chemistry laboratories. Identifying students' concerns about this technology will in part help educators identify the obstacles to science learning when using this…

  9. Surmounting the challenge of numbers, science and technology in ...

    African Journals Online (AJOL)

    Surmounting the challenge of numbers, science and technology in educational policy development. TK Yesufu, AO Yesufu. Abstract. No Abstract. Nigerian Journal of Physics Vol. 17 (Supplement) 2005: pp. 299-310.

  10. How Augmented Reality Enables Conceptual Understanding of Challenging Science Content

    Science.gov (United States)

    Yoon, Susan; Anderson, Emma; Lin, Joyce; Elinich, Karen

    2017-01-01

    Research on learning about science has revealed that students often hold robust misconceptions about a number of scientific ideas. Digital simulation and dynamic visualization tools have helped to ameliorate these learning challenges by providing scaffolding to understand various aspects of the phenomenon. In this study we hypothesize that…

  11. Meeting national challenges with science, engineering, and technology

    International Nuclear Information System (INIS)

    1992-03-01

    This report discusses research in the following areas at Lawrence Livermore National Laboratory: national challenges; the Livermore Laboratory; national defense: preserving peace in a rapidly changing world; energy: clean and economic; environment: from the microscopic to the global; health: genetics and biomedicine; economy: bringing laboratory technology to the US market; education: sparking interest in science; and the Livermore Laboratory: a national resource

  12. The challenge of the social sciences: The impact of Sociology ...

    African Journals Online (AJOL)

    The challenge of the social sciences: The impact of Sociology among first year students. JF Graaff. Abstract. No Abstract.

  13. Challenges and Prospects of Methodological Anarchism for Science ...

    African Journals Online (AJOL)

    This paper examines Feyerabend's idea of Methodological Anarchism. Specifically, it looks at its challenges and prospects for the growth of science and epistemology in Africa. Feyerabend's point is that people develop best in pluralistic societies that contain many ideas, traditions and forms of life. It is argued that ...

  14. Educator Perspectives on Earth System Science Literacy: Challenges and Priorities

    Science.gov (United States)

    LaDue, Nicole; Clark, Scott K.

    2012-01-01

    The challenges and priorities of defining and achieving Earth System Science (ESS) literacy are examined through surveys of geoscience educators attending a professional geological meeting. Two surveys with Likert-style and free-response questions were distributed to geoscientists and K-12 teachers to elicit what instructors think are important…

  15. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  16. Scientific Data Management Center for Enabling Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Vouk, Mladen A.

    2013-01-15

    data management technologies to DOE application scientists in astrophysics, climate, fusion, and biology. Equally important, it established collaborations with these scientists to better understand their science as well as their forthcoming data management and data analytics challenges. Building on our early successes, we have greatly enhanced, robustified, and deployed our technology to these communities. In some cases, we identified new needs that have been addressed in order to simplify the use of our technology by scientists. This report summarizes our work so far in SciDAC-2. Our approach is to employ an evolutionary development and deployment process: from research through prototypes to deployment and infrastructure. Accordingly, we have organized our activities in three layers that abstract the end-to-end data flow described above. We labeled the layers (from bottom to top): a) Storage Efficient Access (SEA), b) Data Mining and Analysis (DMA), c) Scientific Process Automation (SPA). The SEA layer is immediately on top of hardware, operating systems, file systems, and mass storage systems, and provides parallel data access technology, and transparent access to archival storage. The DMA layer, which builds on the functionality of the SEA layer, consists of indexing, feature identification, and parallel statistical analysis technology. The SPA layer, which is on top of the DMA layer, provides the ability to compose scientific workflows from the components in the DMA layer as well as application specific modules. NCSU work performed under this contract was primarily at the SPA layer.
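
    The three-layer organisation (SEA, DMA, SPA) can be illustrated with a toy pipeline in which a storage-access step streams chunks, an analysis step computes statistics, and a workflow layer composes the two. The function names and composition below are invented for illustration and do not correspond to the actual SciDAC software components.

```python
# Illustrative sketch of the layering idea only (SEA -> DMA -> SPA).
from statistics import mean, pstdev
from typing import Callable, Iterable, List

# --- SEA layer: storage-efficient access (here: chunked reads from a file) ---
def sea_read_chunks(path: str, chunk_lines: int = 1000) -> Iterable[List[float]]:
    chunk: List[float] = []
    with open(path) as fh:
        for line in fh:
            chunk.append(float(line))
            if len(chunk) == chunk_lines:
                yield chunk
                chunk = []
    if chunk:
        yield chunk

# --- DMA layer: data mining / statistical analysis over the chunks ---
def dma_summary(chunks: Iterable[List[float]]) -> dict:
    values = [v for chunk in chunks for v in chunk]
    return {"n": len(values), "mean": mean(values), "stdev": pstdev(values)}

# --- SPA layer: compose the lower-layer components into a workflow ---
def spa_workflow(*steps: Callable):
    def run(initial):
        result = initial
        for step in steps:
            result = step(result)
        return result
    return run

if __name__ == "__main__":
    with open("measurements.txt", "w") as fh:        # toy input data
        fh.write("\n".join(str(x * 0.1) for x in range(5000)))
    workflow = spa_workflow(sea_read_chunks, dma_summary)
    print(workflow("measurements.txt"))
```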

  17. Data management in NOAA

    Science.gov (United States)

    Callicott, William M.

    1993-01-01

    The NOAA archives contain 150 terabytes of data in digital form, most of which are the high volume GOES satellite image data. There are 630 data bases containing 2,350 environmental variables. There are 375 million film records and 90 million paper records in addition to the digital data base. The current data accession rate is 10 percent per year and the number of users is increasing at a 10 percent annual rate. NOAA publishes 5,000 publications and distributes over one million copies to almost 41,000 paying customers. Each year, over six million records are key entered from manuscript documents and about 13,000 computer tapes and 40,000 satellite hardcopy images are entered into the archive. Early digital data were stored on punched cards and open reel computer tapes. In the late seventies, an advanced helical scan technology (AMPEX TBM) was implemented. Now, punched cards have disappeared, the TBM system was abandoned, most data stored on open reel tapes have been migrated to 3480 cartridges, many specialized data sets were distributed on CD ROM's, special archives are being copied to 12 inch optical WORM disks, 5 1/4 inch magneto-optical disks were employed for workstation applications, and 8 mm EXABYTE tapes are planned for major data collection programs. The rapid expansion of new data sets, some of which constitute large volumes of data, coupled with the need for vastly improved access mechanisms, portability, and improved longevity are factors which will influence NOAA's future systems approaches for data management.

  18. CITIESData: a smart city data management framework

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Heller, Alfred; Nielsen, Per Sieverts

    2017-01-01

    and publishing challenging. In this paper, we propose a framework to streamline smart city data management, including data collection, cleansing, anonymization, and publishing. The paper classifies smart city data into sensitive, quasi-sensitive, and open/public levels and then suggests different strategies
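
    The sensitive / quasi-sensitive / open classification mentioned above can be sketched as a per-field publication policy. The field names and the concrete strategies (hashing, coarsening) below are illustrative assumptions, not the CITIESData implementation.

```python
# Sketch of publishing a smart-city record under three sensitivity levels:
# sensitive fields are pseudonymised, quasi-sensitive fields are coarsened,
# open fields pass through unchanged.
import hashlib

SENSITIVITY = {
    "customer_id": "sensitive",
    "address": "sensitive",
    "hourly_kwh": "quasi-sensitive",
    "district": "open",
    "timestamp": "open",
}

def pseudonymise(value: str) -> str:
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def prepare_for_publication(record: dict) -> dict:
    published = {}
    for field, value in record.items():
        level = SENSITIVITY.get(field, "sensitive")    # unknown fields: be safe
        if level == "open":
            published[field] = value
        elif level == "quasi-sensitive":
            published[field] = round(float(value), 1)  # coarsen the reading
        else:
            published[field] = pseudonymise(str(value))
    return published

if __name__ == "__main__":
    raw = {"customer_id": "C-10993", "address": "Elm Street 4",
           "hourly_kwh": 3.1427, "district": "Østerbro",
           "timestamp": "2016-05-01T13:00"}
    print(prepare_for_publication(raw))
```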

  19. New Challenges for Data Management in Genebanks

    Science.gov (United States)

    The use of genetic resources for crop improvement has undergone a fundamental shift. Continued progress will be dependent upon the natural variation contained within the world’s gene banks. Gene banks must manage their collections in ways that promote their utilization by increasing access to not ...

  20. Innovation in Extraterrestrial Service Systems - A Challenge for Service Science

    Science.gov (United States)

    Bergner, David

    2010-01-01

    This presentation was prepared at the invitation of Professor Yukio Ohsawa, Department of Systems Innovation, School of Engineering, The University of Tokyo, for delivery at the International Workshop on Innovating Service Systems, sponsored by the Japanese Society of Artificial Intelligence (JSAI) as part of the JSAI International Symposium on AI, 2010. It offers several challenges for Service Science and Service Innovation. The goal of the presentation is to stimulate thinking about how service systems will evolve in the future, as human society advances from its terrestrial base toward a permanent presence in space. First we will consider the complexity of the International Space Station (ISS) as it is today, with particular emphasis on its research facilities, and focus on a current challenge - to maximize the utilization of ISS research facilities for the benefit of society. After briefly reviewing the basic principles of Service Science, we will discuss the potential application of Service Innovation methodology to this challenge. Then we will consider how game-changing technologies - in particular Synthetic Biology - could accelerate the pace of sociocultural evolution and consequently, the progression of human society into space. We will use this provocative vision to advance thinking about how the emerging field of Service Science, Management, and Engineering (SSME) might help us anticipate and better handle the challenges of this inevitable evolutionary process.

  1. Forging New Service Paths: Institutional Approaches to Providing Research Data Management Services

    Directory of Open Access Journals (Sweden)

    Regina Raboin

    2012-01-01

    Objective: This paper describes three different institutional experiences in developing research data management programs and services, challenges/opportunities and lessons learned. Overview: This paper is based on the Librarian Panel Discussion during the 4th Annual University of Massachusetts and New England Region e-Science Symposium. Librarians representing large public and private research universities presented an overview of service models developed at their respective organizations to bring support for data management and eScience to their communities. The approaches described include two library-based, integrated service models and one collaboratively-staffed, center-based service model. Results: Three institutions describe their experiences in creating the organizational capacity for research data management support services. Although each institutional approach is unique, common challenges include garnering administrative support, managing the integration of services with new or existing staff structures, and continuing to meet researchers' needs as they evolve. Conclusions: There is no one way to provide research data management services, but any staff position, committee, or formalized center reflects an overarching organizational commitment to data management support.

  2. Data management on the fusion computational pipeline

    International Nuclear Information System (INIS)

    Klasky, S; Beck, M; Bhat, V; Feibush, E; Ludaescher, B; Parashar, M; Shoshani, A; Silver, D; Vouk, M

    2005-01-01

    Fusion energy science, like other science areas in DOE, is becoming increasingly data intensive and network distributed. We discuss data management techniques that are essential for scientists making discoveries from their simulations and experiments, with special focus on the techniques and support that Fusion Simulation Project (FSP) scientists may need. However, the discussion applies to a broader audience, since most of the fusion SciDACs and FSP proposals include a strong data management component. Simulations on ultrascale computing platforms imply an ability to efficiently integrate and network heterogeneous components (computational, storage, networks, codes, etc.), and to move large amounts of data over large distances. We discuss the workflow categories needed to support such research as well as the automation and other aspects that can allow an FSP scientist to focus on the science and spend less time tending information technology

  3. Data Management as a Cluster Middleware Centerpiece

    Science.gov (United States)

    Zero, Jose; McNab, David; Sawyer, William; Cheung, Samson; Duffy, Daniel; Rood, Richard; Webster, Phil; Palm, Nancy; Salmon, Ellen; Schardt, Tom

    2004-01-01

    Through earth and space modeling and the ongoing launches of satellites to gather data, NASA has become one of the largest producers of data in the world. These large data sets necessitated the creation of a Data Management System (DMS) to assist both the users and the administrators of the data. Halcyon Systems Inc. was contracted by the NASA Center for Computational Sciences (NCCS) to produce a Data Management System. The prototype of the DMS was produced by Halcyon Systems Inc. (Halcyon) for the Global Modeling and Assimilation Office (GMAO). The system, which was implemented and deployed within a relatively short period of time, has proven to be highly reliable and deployable. Following the prototype deployment, Halcyon was contacted by the NCCS to produce a production DMS version for their user community. The system is composed of several existing open source or government-sponsored components such as the San Diego Supercomputer Center's (SDSC) Storage Resource Broker (SRB), the Distributed Oceanographic Data System (DODS), and other components. Since Data Management is one of the foremost problems in cluster computing, the final package not only provides the capabilities of a Data Management System but also extends to a cluster management system. This Cluster/Data Management System (CDMS) can be envisioned as the integration of existing packages.

  4. Challenges in data science: a complex systems perspective

    International Nuclear Information System (INIS)

    Carbone, Anna; Jensen, Meiko; Sato, Aki-Hiro

    2016-01-01

    The ability to process and manage large data volumes has proven not to be enough to tackle the current challenges presented by “Big Data”. Deep insight is required for understanding interactions among connected systems and space- and time-dependent heterogeneous data structures. The emergence of global properties from locally interacting data entities and clustering phenomena demands suitable approaches and methodologies recently developed in the foundational area of Data Science by taking a Complex Systems standpoint. Here, we deal with challenges that can be summarized by the question: “What can Complex Systems Science contribute to Big Data?”. Such a question can be reversed and brought to a higher level of abstraction by asking “What knowledge can be drawn from Big Data?” These aspects constitute the main motivation behind this article, which introduces a volume containing a collection of papers presenting interdisciplinary advances in the Big Data area through methodologies and approaches typical of Complex Systems Science, Nonlinear Systems Science and Statistical Physics.

  5. The computational challenges of Earth-system science.

    Science.gov (United States)

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  6. DIII-D DATA MANAGEMENT

    International Nuclear Information System (INIS)

    McHARG, B.B; BURUSS, J.R. Jr.; FREEMAN, J.; PARKER, C.T.; SCHACHTER, J.; SCHISSEL, D.P.

    2001-08-01

    OAK-B135 The DIII-D tokamak at the DIII-D National Fusion Facility routinely acquires ∼ 500 Megabytes of raw data per pulse of the experiment through a centralized data management system. It is expected that in FY01, nearly one Terabyte of data will be acquired. In addition there are several diagnostics, which are not part of the centralized system, which acquire hundreds of megabytes of raw data per pulse. There is also a growing suite of codes running between pulses that produce analyzed data, which add ∼ 10 Megabytes per pulse with total disk usage of about 100 Gigabytes. A relational database system has been introduced which further adds to the overall data load. In recent years there has been an order of magnitude increase in magnetic disk space devoted to raw data and a Hierarchical Storage Management system (HSM) was implemented to allow 7 x 24 unattended access to raw data. The management of all of the data is a significant and growing challenge as the quantities of both raw and analyzed data are expected to continue to increase in the future. This paper will examine the experiences of the approaches that have been taken in management of the data and plans for the continued growth of the data quantity
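
    The relational database introduced for per-shot information can be sketched with a small SQLite example that records shot-level metadata and finds shots not yet migrated to the archive tier. The schema and the shot numbers are invented for illustration.

```python
# Sketch of per-shot metadata in a relational database: record shot-level
# information and query for shots still pending archive migration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE shots (
        shot        INTEGER PRIMARY KEY,
        run_date    TEXT,
        raw_bytes   INTEGER,
        archived    INTEGER DEFAULT 0   -- 1 once migrated to the HSM/tape tier
    )
""")
conn.executemany(
    "INSERT INTO shots (shot, run_date, raw_bytes, archived) VALUES (?, ?, ?, ?)",
    [(106005, "2001-06-12", 480_000_000, 1),    # invented example shots
     (106006, "2001-06-12", 512_000_000, 0)],
)

# Which shots still live only on magnetic disk (not yet archived)?
for shot, size in conn.execute(
        "SELECT shot, raw_bytes FROM shots WHERE archived = 0 ORDER BY shot"):
    print(f"shot {shot}: {size / 1e6:.0f} MB pending archive")
```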

  7. Operational research as implementation science: definitions, challenges and research priorities.

    Science.gov (United States)

    Monks, Thomas

    2016-06-06

    Operational research (OR) is the discipline of using models, either quantitative or qualitative, to aid decision-making in complex implementation problems. The methods of OR have been used in healthcare since the 1950s in diverse areas such as emergency medicine and the interface between acute and community care; hospital performance; scheduling and management of patient home visits; scheduling of patient appointments; and many other complex implementation problems of an operational or logistical nature. To date, there has been limited debate about the role that operational research should take within implementation science. I detail three such roles for OR all grounded in upfront system thinking: structuring implementation problems, prospective evaluation of improvement interventions, and strategic reconfiguration. Case studies from mental health, emergency medicine, and stroke care are used to illustrate each role. I then describe the challenges for applied OR within implementation science at the organisational, interventional, and disciplinary levels. Two key challenges include the difficulty faced in achieving a position of mutual understanding between implementation scientists and research users and a stark lack of evaluation of OR interventions. To address these challenges, I propose a research agenda to evaluate applied OR through the lens of implementation science, the liberation of OR from the specialist research and consultancy environment, and co-design of models with service users. Operational research is a mature discipline that has developed a significant volume of methodology to improve health services. OR offers implementation scientists the opportunity to do more upfront system thinking before committing resources or taking risks. OR has three roles within implementation science: structuring an implementation problem, prospective evaluation of implementation problems, and a tool for strategic reconfiguration of health services. Challenges facing OR

  8. Life sciences payload definition and integration study. Volume 4: Appendix, costs, and data management requirements of the dedicated 30-day laboratory. [carry-on laboratory for Spacelab

    Science.gov (United States)

    1974-01-01

    The results of the updated 30-day life sciences dedicated laboratory scheduling and costing activities are documented, and the 'low cost' methodology used to establish individual equipment item costs is explained in terms of its allowances for equipment that is commercial off-the-shelf, modified commercial, or laboratory prototype; this method significantly lowers program costs. The costs generated include estimates for non-recurring development, recurring production, and recurring operations costs. A cost for a biomedical-emphasis laboratory and a delta cost to provide a bioscience and technology laboratory were also generated. All costs reported are commensurate with the design and schedule definitions available.

  9. Meeting report: Ocean ‘omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013)

    Science.gov (United States)

    Gilbert, Jack A; Dick, Gregory J.; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R. M.

    2014-01-01

    The National Science Foundation’s EarthCube End User Workshop was held at USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research with a particular emphasis on ‘omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increases. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, as well as development of shared, interoperable, “big-data capable” analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyber infrastructure constraints, (ii) the current and future ocean ‘omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyber-infrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the ‘omics tools, technologies and infrastructures that facilitate continued advance in ocean science biology, marine biogeochemistry, and biological oceanography. PMID:25197495

  10. Meeting report: Ocean 'omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013).

    Science.gov (United States)

    Gilbert, Jack A; Dick, Gregory J; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R M; DeLong, Edward F

    2014-06-15

    The National Science Foundation's EarthCube End User Workshop was held at USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research with a particular emphasis on 'omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increases. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, as well as development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyber infrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyber-infrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the 'omics tools, technologies and infrastructures that facilitate continued advance in ocean science biology, marine biogeochemistry, and biological oceanography.

  11. Data Management for Mars Exploration Rovers

    Science.gov (United States)

    Snyder, Joseph F.; Smyth, David E.

    2004-01-01

    Data Management for the Mars Exploration Rovers (MER) project is a comprehensive system addressing the needs of development, test, and operations phases of the mission. During development of flight software, including the science software, the data management system can be simulated using any POSIX file system. During testing, the on-board file system can be bit compared with files on the ground to verify proper behavior and end-to-end data flows. During mission operations, end-to-end accountability of data products is supported, from science observation concept to data products within the permanent ground repository. Automated and human-in-the-loop ground tools allow decisions regarding retransmitting, re-prioritizing, and deleting data products to be made using higher level information than is available to a protocol-stack approach such as the CCSDS File Delivery Protocol (CFDP).
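
    The bit-comparison of on-board files with ground copies mentioned above can be illustrated with a minimal sketch. The file paths and the use of SHA-256 digests are illustrative assumptions and not details of the actual MER data management system.

```python
# Minimal sketch of verifying that a downlinked file matches its on-board
# counterpart by comparing cryptographic digests. File names are hypothetical;
# the MER system performed bit comparisons, not necessarily via SHA-256.
import hashlib

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_product(onboard_copy: str, ground_copy: str) -> bool:
    """True if the two copies are bit-identical."""
    return file_digest(onboard_copy) == file_digest(ground_copy)

if __name__ == "__main__":
    ok = verify_product("testbed/seq_0042.dat", "ground/seq_0042.dat")
    print("MATCH" if ok else "MISMATCH")
```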

  12. Making Data Management Accessible in the Undergraduate Chemistry Curriculum

    Science.gov (United States)

    Reisner, Barbara A.; Vaughan, K. T. L.; Shorish, Yasmeen L.

    2014-01-01

    In the age of "big data" science, data management is becoming a key information literacy skill for chemistry professionals. To introduce this skill in the undergraduate chemistry major, an activity has been developed to familiarize undergraduates with data management. In this activity, students rename and organize cards that represent…

  13. Data science and symbolic AI: Synergies, challenges and opportunities

    KAUST Repository

    Hoehndorf, Robert

    2017-06-02

    Symbolic approaches to artificial intelligence represent things within a domain of knowledge through physical symbols, combine symbols into symbol expressions, and manipulate symbols and symbol expressions through inference processes. While a large part of Data Science relies on statistics and applies statistical approaches to artificial intelligence, there is an increasing potential for successfully applying symbolic approaches as well. Symbolic representations and symbolic inference are close to human cognitive representations and therefore comprehensible and interpretable; they are widely used to represent data and metadata, and their specific semantic content must be taken into account for analysis of such information; and human communication largely relies on symbols, making symbolic representations a crucial part in the analysis of natural language. Here we discuss the role symbolic representations and inference can play in Data Science, highlight the research challenges from the perspective of the data scientist, and argue that symbolic methods should become a crucial component of the data scientists’ toolbox.
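
    To make the contrast with purely statistical methods concrete, here is a minimal sketch of symbolic inference: forward chaining over subject-predicate-object facts with a single transitivity rule. The facts and the 'subclass_of' relation are invented for illustration and are not taken from the paper.

```python
# Toy forward-chaining inference over (subject, predicate, object) facts.
# Facts and the transitive 'subclass_of' relation are illustrative only.
facts = {
    ("enzyme", "subclass_of", "protein"),
    ("protein", "subclass_of", "macromolecule"),
    ("kinase", "subclass_of", "enzyme"),
}

def forward_chain(kb):
    """Close the knowledge base under transitivity of 'subclass_of'."""
    kb = set(kb)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(kb):
            for (c, p2, d) in list(kb):
                if p1 == p2 == "subclass_of" and b == c and (a, "subclass_of", d) not in kb:
                    kb.add((a, "subclass_of", d))
                    changed = True
    return kb

inferred = forward_chain(facts) - facts
print(sorted(inferred))  # e.g. ('kinase', 'subclass_of', 'protein'), ...
```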

  14. [The undergraduate program in forensic science: a national challenge].

    Science.gov (United States)

    García Castillo, Zoraida; Graue Wiechers, Enrique; Durante Montiel, Irene; Herrera Saint Leu, Patricia

    2014-01-01

    The challenge in achieving an ideal state of justice is ensuring that each "proof" has the highest degree of reliability. This is the main responsibility of the forensic scientist. Up to now, criminal investigations in Mexico have been supported by forensic experts from a wide variety of disciplinary backgrounds who give testimony in their particular areas, even though they may have become forensic witnesses only in a complementary and experiential manner. In January 2013, the Universidad Nacional Autónoma de México (UNAM) approved the "Forensic Science" undergraduate program that, in collaboration with various academic entities and government institutions, will develop forensic scientists trained in science, law, and criminology. The program is focused on contributing to the national demand that the justice system have more elements with which to procure and administer justice in dealing with crime.

  15. Data science and symbolic AI: Synergies, challenges and opportunities

    KAUST Repository

    Hoehndorf, Robert; Queralt-Rosinach, Núria

    2017-01-01

    Symbolic approaches to artificial intelligence represent things within a domain of knowledge through physical symbols, combine symbols into symbol expressions, and manipulate symbols and symbol expressions through inference processes. While a large part of Data Science relies on statistics and applies statistical approaches to artificial intelligence, there is an increasing potential for successfully applying symbolic approaches as well. Symbolic representations and symbolic inference are close to human cognitive representations and therefore comprehensible and interpretable; they are widely used to represent data and metadata, and their specific semantic content must be taken into account for analysis of such information; and human communication largely relies on symbols, making symbolic representations a crucial part in the analysis of natural language. Here we discuss the role symbolic representations and inference can play in Data Science, highlight the research challenges from the perspective of the data scientist, and argue that symbolic methods should become a crucial component of the data scientists’ toolbox.

  16. Engineering and physical sciences in oncology: challenges and opportunities.

    Science.gov (United States)

    Mitchell, Michael J; Jain, Rakesh K; Langer, Robert

    2017-11-01

    The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas.

  17. Challenges in Modern Anti-Doping Analytical Science.

    Science.gov (United States)

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may result in the future in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  18. 'Big data' in pharmaceutical science: challenges and opportunities.

    Science.gov (United States)

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  19. Minnesota 4-H Science of Agriculture Challenge: Infusing Agricultural Science and Engineering Concepts into 4-H Youth Development

    Science.gov (United States)

    Rice, Joshua E.; Rugg, Bradley; Davis, Sharon

    2016-01-01

    Youth involved in 4-H projects have been engaged in science-related endeavors for years. Since 2006, 4-H has invested considerable resources in the advancement of science learning. The new Minnesota 4-H Science of Agriculture Challenge program challenges 4-H youth to work together to identify agriculture-related issues in their communities and to…

  20. The Grand Challenges Discourse: Transforming Identity Work in Science and Science Policy.

    Science.gov (United States)

    Kaldewey, David

    2018-01-01

    This article analyzes the concept of "grand challenges" as part of a shift in how scientists and policymakers frame and communicate their respective agendas. The history of the grand challenges discourse helps to understand how identity work in science and science policy has been transformed in recent decades. Furthermore, the question is raised whether this discourse is only an indicator, or also a factor in this transformation. Building on conceptual history and historical semantics, the two parts of the article reconstruct two discursive shifts. First, the observation that in scientific communication references to "problems" are increasingly substituted by references to "challenges" indicates a broader cultural trend of how attitudes towards what is problematic have shifted in the last decades. Second, as the grand challenges discourse is rooted in the sphere of sports and competition, it introduces a specific new set of societal values and practices into the spheres of science and technology. The article concludes that this process can be characterized as the sportification of science, which contributes to self-mobilization and, ultimately, to self-optimization of the participating scientists, engineers, and policymakers.

  1. Health Sciences

    OpenAIRE

    McEntyre, Johanna; Swan, Alma; Meier zu Verl, Christian; Horstmann, Wolfram

    2011-01-01

    This chapter provides an overview of research data management in the health sciences, primarily focused upon the sort of data curated by the European Bioinformatics Institute and similar organisations. In this field, data management is well-advanced, with a sophisticated infrastructure created and maintained by the community for the benefit of all. These advances have been brought about because the field has been data-intense for many years and has been driven by the challenges biology fac...

  2. The LHCb Data Management System

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    We shall describe all the tools that are available for Data Management, from handling of large datasets to basic tools for users as well as for monitoring the dynamic behaviour of LHCb Storage capacity.

  3. Autonomous vertical profiler data management

    Digital Repository Service at National Institute of Oceanography (India)

    Afzulpurkar, S.; Navelkar, G.S.; Desa, E.S.; Madhan, R.; Dabholkar, N.; Prabhudesai, S.P.; Mascarenhas, A.A.M.Q.

    the data management. It is expected that there would be multiple profilers operating at various locations, such as coastal seas, dams and other water bodies. Data would be relayed for archival, processing and be made available to the communities who...

  4. Cargo Data Management Demonstration System

    Science.gov (United States)

    1974-02-01

    Delays in receipt and creation of cargo documents are a problem in international trade. The work described demonstrates some of the advantages and capabilities of a computer-based cargo data management system. A demonstration system for data manageme...

  5. Data management for interdisciplinary field experiments: OTTER project support

    Science.gov (United States)

    Angelici, Gary; Popovici, Lidia; Skiles, J. W.

    1993-01-01

    The ability of investigators of an interdisciplinary science project to properly manage the data that are collected during the experiment is critical to the effective conduct of science. When the project becomes large, possibly including several scenes of large-format remotely sensed imagery shared by many investigators requiring several services, the data management effort can involve extensive staff and computerized data inventories. The OTTER (Oregon Transect Ecosystem Research) project was supported by the PLDS (Pilot Land Data System) with several data management services, such as data inventory, certification, and publication. After a brief description of these services, experiences in providing them are compared with earlier data management efforts and some conclusions regarding data management in support of interdisciplinary science are discussed. In addition to providing these services, a major goal of this data management capability was to adopt characteristics of a pro-active attitude, such as flexibility and responsiveness, believed to be crucial for the effective conduct of active, interdisciplinary science. These are also itemized and compared with previous data management support activities. Identifying and improving these services and characteristics can lead to the design and implementation of optimal data management support capabilities, which can result in higher quality science and data products from future interdisciplinary field experiments.

  6. Towards Data Management Planning Support for Research Data

    NARCIS (Netherlands)

    Görzig, Heike; Engel, Felix; Brocks, Holger; Vogel, Tobias; Hemmje, Matthias

    2015-01-01

    Görzig, H., Engel, F., Brocks, H., Vogel, T. & Hemmje, M. (2015, August). Towards Data Management Planning Support for Research Data. Paper presented at the ASE International Conference on Data Science, Stanford, United States of America.

  7. Challenges of citizen science contributions to modelling hydrodynamics of floods

    Science.gov (United States)

    Assumpção, Thaine Herman; Popescu, Ioana; Jonoski, Andreja; Solomatine, Dimitri P.

    2017-04-01

    Citizen science is an established mechanism in many fields of science, including ecology, biology and astronomy. Citizen participation ranges from collecting and interpreting data to designing experiments with scientists and cooperating with water management authorities. In the environmental sciences, its potential has begun to be explored in the past decades and many studies on its applicability to water resources have emerged. Citizen Observatories are at the core of several EU-funded projects such as WeSenseIt, GroundTruth, GroundTruth 2.0 and SCENT (Smart Toolbox for Engaging Citizens into a People-Centric Observation Web) that have already resulted in valuable contributions to the field. Buytaert et al. (2014) have already reviewed the role of citizen science in hydrology. The work presented here aims to complement that review, reporting and discussing the use of citizen science for modelling the hydrodynamics of floods in a variety of studies. Additionally, it highlights the challenges that lie ahead in utilizing the potential contribution of citizen science more fully. In this work, focus is given to each component of hydrodynamic models: water level, velocity, flood extent, roughness and topography. We address how citizens have been contributing to each aspect, mainly considering citizens as sensors and citizens as data interpreters. We consider to which kind of model (1D or 2D) the discussed approaches contribute and what their limitations and potential uses are. We found that although certain mechanisms are well established (e.g. the use of Volunteer Geographic Information for soft validation of land-cover and land-use maps), the applications in a modelling context are rather modest. Also, most studies involving models are limited to replacing traditional data with citizen data. We recommend that citizen science continue to be explored in modelling frameworks, in different case studies, taking advantage of the discussed mechanisms and of new sensor technologies.
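
    As a concrete illustration of the 'citizens as sensors' mechanism discussed above, the sketch below scores modelled water levels against citizen-reported observations with a root-mean-square error. The station names and values are hypothetical and are not drawn from the cited studies.

```python
# Comparing modelled water levels with citizen-reported observations (metres).
# Station names and values are hypothetical illustrations.
import math

citizen_obs = {"bridge_A": 2.10, "gauge_B": 1.75, "ford_C": 0.90}
model_levels = {"bridge_A": 2.25, "gauge_B": 1.70, "ford_C": 1.05}

def rmse(obs, sim):
    """Root-mean-square error over stations present in both datasets."""
    common = obs.keys() & sim.keys()
    if not common:
        raise ValueError("no overlapping stations")
    return math.sqrt(sum((obs[k] - sim[k]) ** 2 for k in common) / len(common))

print(f"RMSE = {rmse(citizen_obs, model_levels):.2f} m")
```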

  8. New challenges for Life Sciences flight project management

    Science.gov (United States)

    Huntoon, C. L.

    1999-01-01

    Scientists have conducted studies involving human spaceflight crews for over three decades. These studies have progressed from simple observations before and after each flight to sophisticated experiments during flights of several weeks up to several months. The findings from these experiments are available in the scientific literature. Management of these flight experiments has grown into a system fashioned from the Apollo Program style, focusing on budgeting, scheduling and allocation of human and material resources. While these areas remain important to the future, the International Space Station (ISS) requires that the Life Sciences spaceflight experiments expand the existing project management methodology. The use of telescience with state-of-the-art information technology and the multi-national crews and investigators challenges the former management processes. Actually conducting experiments on board the ISS will be an enormous undertaking, and International Agreements and Working Groups will be essential in giving guidance to the flight project management. Teams forged in this matrix environment must be competent to make decisions and qualified to work with the array of engineers, scientists, and the spaceflight crews. In order to undertake this complex task, data systems not previously used for these purposes must be adapted so that the investigators and the project management personnel can all share important information as soon as it is available. The utilization of telescience and distributed experiment operations will allow the investigators to remain involved in their experiments as well as to understand the numerous issues faced by other elements of the program. The complexity of forming and managing project teams will be a new kind of challenge for international science programs. Meeting that challenge is essential to assure the success of the International Space Station as a laboratory in space.

  9. Challenges and Successes Managing Airborne Science Data for CARVE

    Science.gov (United States)

    Hardman, S. H.; Dinardo, S. J.; Lee, E. C.

    2014-12-01

    The Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission collects detailed measurements of important greenhouse gases on local to regional scales in the Alaskan Arctic and demonstrates new remote sensing and improved modeling capabilities to quantify Arctic carbon fluxes and carbon cycle-climate processes. Airborne missions offer a number of challenges when it comes to collecting and processing the science data, and CARVE is no different. The biggest challenge relates to the flexibility of the instrument payload. Within the life of the mission, instruments may be removed from or added to the payload, or even reconfigured on a yearly, monthly or daily basis. Although modification of the instrument payload provides a distinct advantage for airborne missions compared to spaceborne missions, it does tend to wreak havoc on the underlying data system when introducing changes to existing data inputs or new data inputs that require modifications to the pipeline for processing the data. In addition to payload flexibility, it is not uncommon to find unsupported files in the field data submission. In the case of CARVE, these include video files, photographs taken during the flight and screen shots from terminal displays. These need to be captured, saved, and somehow integrated into the data system. The CARVE data system was built on a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This well-tested and proven infrastructure allows the CARVE data system to be easily adapted in order to handle the challenges posed by the CARVE mission and to successfully process, manage and distribute the mission's science data. This
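
    The issue of unsupported field files (videos, photographs, screen shots) described above can be illustrated with a small triage sketch. The file extensions and directory layout are assumptions made for illustration and do not reflect the actual ACCE implementation.

```python
# Sketch of triaging a field data submission: instrument files go to the
# processing pipeline, known ancillary files (video, photos, screen shots)
# are archived, and anything else is flagged for manual review.
# Extensions and directories are illustrative assumptions.
from pathlib import Path
import shutil

PIPELINE_EXT = {".nc", ".csv", ".dat"}            # assumed instrument data formats
ANCILLARY_EXT = {".mp4", ".mov", ".jpg", ".png"}  # videos, photos, screen shots

def triage(submission_dir: str, pipeline_dir: str, archive_dir: str) -> list:
    """Copy files by category; return paths that need manual review."""
    Path(pipeline_dir).mkdir(parents=True, exist_ok=True)
    Path(archive_dir).mkdir(parents=True, exist_ok=True)
    needs_review = []
    for item in Path(submission_dir).iterdir():
        if not item.is_file():
            continue
        if item.suffix.lower() in PIPELINE_EXT:
            shutil.copy2(item, Path(pipeline_dir) / item.name)
        elif item.suffix.lower() in ANCILLARY_EXT:
            shutil.copy2(item, Path(archive_dir) / item.name)
        else:
            needs_review.append(str(item))
    return needs_review
```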

  10. Qualitative research in rehabilitation science: opportunities, challenges, and future directions.

    Science.gov (United States)

    VanderKaay, Sandra; Moll, Sandra E; Gewurtz, Rebecca E; Jindal, Pranay; Loyola-Sanchez, Adalberto; Packham, Tara L; Lim, Chun Y

    2018-03-01

    Qualitative research has had a significant impact within rehabilitation science over time. During the past 20 years the number of qualitative studies published per year in Disability and Rehabilitation has markedly increased (from 1 to 54). In addition, during this period there have been significant changes in how qualitative research is conceptualized, conducted, and utilized to advance the field of rehabilitation. The purpose of this article is to reflect upon the progress of qualitative research within rehabilitation to date, to explicate current opportunities and challenges, and to suggest future directions to continue to strengthen the contribution of qualitative research in this field. Relevant literature searches were conducted in electronic data bases and reference lists. Pertinent literature was examined to identify current opportunities and challenges for qualitative research use in rehabilitation and to identify future directions. Six key areas of opportunity and challenge were identified: (a) paradigm shifts, (b) advancements in methodology, (c) emerging technology, (d) advances in quality evaluation, (e) increasing popularity of mixed methods approaches, and (f) evolving approaches to knowledge translation. Two important future directions for rehabilitation are posited: (1) advanced training in qualitative methods and (2) engaging qualitative communities of research. Qualitative research is well established in rehabilitation and has an important place in the continued growth of this field. Ongoing development of qualitative researchers and methods are essential. Implications for Rehabilitation Qualitative research has the potential to improve rehabilitation practice by addressing some of the most pervasive concerns in the field such as practitioner-client interaction, the subjective and lived experience of disability, and clinical reasoning and decision making. This will serve to better inform those providing rehabilitation services thereby benefiting

  11. Proceedings of conferences on large data management

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2004-03-01

    This report consists of 14 contributed papers from conferences on Large Data Management, which were held at JAERI in Kyoto. The papers are proceedings of the Open Workshop on Large Data Management, held at the ITBL building on January 29-30, 2003, and the Conference on Advanced Photon-Matter Interaction Research with Large-Scale Simulation, held on January 31, 2003. The aim of the workshop and the conference was for researchers to report on the latest research and technology. The workshop comprised speeches as well as tours of the laboratory, the supercomputers, and the photon science museum. There were three private-sector speeches and ten speeches from universities and research organizations, for thirteen speeches in total. A total of 107 people participated, including 93 participants from outside JAERI. In the conference, there were three university speeches; 30 people participated, including 27 participants from outside JAERI. The conferences showed the present state and outlook of large data management technology, which is important for computer science and advanced photon research, and served as a valuable forum and indicator for future research. The 14 presented papers are indexed individually. (J.P.N.)

  12. Open science, e-science and the new technologies: Challenges and old problems in qualitative research in the social sciences

    Directory of Open Access Journals (Sweden)

    Ercilia García-Álvarez

    2012-12-01

    Full Text Available Purpose: As well as introducing the articles in the special issue titled "Qualitative Research in the Social Sciences", this article reviews the challenges, problems and main advances made by the qualitative paradigm in the context of the new European science policy based on open science and e-Science, and analyses alternative technologies freely available in the 2.0 environment and their application to fieldwork and data analysis. Design/methodology: Theoretical review. Practical implications: The article identifies open access technologies with applications in qualitative research such as applications for smartphones and tablets, web platforms and specific qualitative data analysis software, all developed in both the e-Science context and the 2.0 environment. Social implications: The article discusses the possible role to be played by qualitative research in the open science and e-Science context and considers the impact of this new context on the size and structure of research groups, the development of truly collaborative research, the emergence of new ethical problems and quality assessment in review processes in an open environment. Originality/value: The article describes the characteristics that define the new scientific environment and the challenges posed for qualitative research, reviews the latest open access technologies available to researchers in terms of their main features and proposes specific applications suitable for fieldwork and data analysis.

  13. Earth & Space Science in the Next Generation Science Standards: Promise, Challenge, and Future Actions. (Invited)

    Science.gov (United States)

    Pyle, E. J.

    2013-12-01

    The Next Generation Science Standards (NGSS) are a step forward in ensuring that future generations of students become scientifically literate. The NGSS document builds from the National Science Education Standards (1996) and the National Assessment of Educational Progress (NAEP) science framework of 2005. Design teams for the Curriculum Framework for K-12 Science Education were to outline the essential content necessary for students' science literacy, considering the foundational knowledge and the structure of each discipline in the context of learning progressions. Once draft standards were developed, two issues emerged from their review: (a) the continual need to prune 'cherished ideas' within the content, such that only essential ideas were represented, and (b) the potential for prior conceptions of Science & Engineering Practices (SEP) and cross-cutting concepts (CCC) to overly constrain performance expectations. With the release of the NGSS, several challenges are emerging for geoscience education. First, the traditional emphasis of Earth science in middle school has been augmented by new standards for high school that require major syntheses of concepts. Second, the integration of SEPs into performance expectations places an increased burden on teachers and curriculum developers to organize instruction around the nature of inquiry in the geosciences. Third, work is needed to define CCCs in Earth contexts, such that the unique structure of the geosciences is best represented. To ensure that the Earth & Space Science standards are implemented through grade 12, two supporting structures must be developed. In the past, many curricular materials claimed that they adhered to the NSES, but in some cases this match was a simple word match or checklist that bore only superficial resemblance to the standards. The structure of the performance expectations is of sufficient sophistication to ensure that adherence to the standards is more than a casual exercise. Claims

  14. AUTHENTIC SCIENCE EXPERIENCES: PRE-COLLEGIATE SCIENCE EDUCATORS’ SUCCESSES AND CHALLENGES DURING PROFESSIONAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Andrea C. Burrows

    2016-04-01

    Full Text Available Twenty-three pre-collegiate educators of elementary students (ages 5-10 years) and secondary students (ages 11-18 years) attended a two-week science, technology, engineering, and mathematics (STEM) astronomy-focused professional development in the summer of 2015, with activities focused on authentic science experiences, inquiry, and partnership building. 'Authentic' in this research refers to scientific skills, which are defined in the study. The study explores the authentic science education experience of the pre-collegiate educators, detailing the components of authentic science as seen through a social constructionism lens. Using qualitative and quantitative methods, the researchers analyzed the successes and challenges of pre-collegiate science and mathematics educators when immersed in STEM and astronomy authentic science practices, the educators' perceptions before and after the authentic science practices, and the educators' performance on pre to post content tests during the authentic science practices. Findings show that the educators were initially engaged, then disengaged, and then finally re-engaged with the authentic experience. Qualitative responses are shared, as are the significant results of the quantitative pre to post content learning scores of the educators. Conclusions include the necessity for the PD team to deliver detailed explanations to the participants before, during, and after the entire authentic science experience and partnership-building process. Furthermore, expert structure and support are vital for participant research question generation, data collection, and data analysis (successes, failures, and reattempts). Overall, in order to include authentic science in pre-collegiate classrooms, elementary and secondary educators need experience, instruction, scaffolding, and continued support with the STEM processes.

  15. The CMS Data Management System

    Science.gov (United States)

    Giffels, M.; Guo, Y.; Kuznetsov, V.; Magini, N.; Wildish, T.

    2014-06-01

    The data management elements in CMS are scalable, modular, and designed to work together. The main components are PhEDEx, the data transfer and location system; the Data Booking Service (DBS), a metadata catalog; and the Data Aggregation Service (DAS), designed to aggregate views and provide them to users and services. Tens of thousands of samples have been cataloged and petabytes of data have been moved since the run began. The modular system has allowed the optimal use of appropriate underlying technologies. In this contribution we will discuss the use of both Oracle and NoSQL databases to implement the data management elements as well as the individual architectures chosen. We will discuss how the data management system functioned during the first run, and what improvements are planned in preparation for 2015.
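
    To give a flavour of the aggregation role attributed to DAS above, the sketch below merges per-dataset records from two hypothetical catalogues (a location view and a metadata view) into a single view keyed by dataset name. It is not the CMS implementation or its API; the dataset names and fields are invented.

```python
# Merging two per-dataset views into one aggregated record per dataset name.
# The catalogue contents are hypothetical; this is not the DAS/PhEDEx/DBS API.
location_view = {
    "/PrimaryA/Run1/RECO": {"sites": ["T1_US_FNAL", "T2_CH_CERN"]},
    "/PrimaryB/Run1/AOD": {"sites": ["T2_DE_DESY"]},
}
metadata_view = {
    "/PrimaryA/Run1/RECO": {"files": 1200, "size_tb": 3.4},
    "/PrimaryB/Run1/AOD": {"files": 800, "size_tb": 1.1},
}

def aggregate(*views):
    """Combine dict-of-dict views into one record per dataset key."""
    combined = {}
    for view in views:
        for name, record in view.items():
            combined.setdefault(name, {}).update(record)
    return combined

for name, record in aggregate(location_view, metadata_view).items():
    print(name, record)
```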

  16. The CMS data management system

    International Nuclear Information System (INIS)

    Giffels, M; Magini, N; Guo, Y; Kuznetsov, V; Wildish, T

    2014-01-01

    The data management elements in CMS are scalable, modular, and designed to work together. The main components are PhEDEx, the data transfer and location system; the Data Booking Service (DBS), a metadata catalog; and the Data Aggregation Service (DAS), designed to aggregate views and provide them to users and services. Tens of thousands of samples have been cataloged and petabytes of data have been moved since the run began. The modular system has allowed the optimal use of appropriate underlying technologies. In this contribution we will discuss the use of both Oracle and NoSQL databases to implement the data management elements as well as the individual architectures chosen. We will discuss how the data management system functioned during the first run, and what improvements are planned in preparation for 2015.

  17. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  18. Doomsday 2012 and Cosmophobia: Challenges and Opportunities for Science Communication

    Science.gov (United States)

    Fraknoi, A.; Larsen, K.; Mendez, B.; Morrison, D.; Van Stone, M.

    2013-04-01

    Hollywood movies, cable-channel documentaries, and countless books and websites have convinced a significant fraction of the U.S. public that some kind of catastrophe awaits us around the winter solstice of 2012, and that the cause of this catastrophe will be an astronomical or geophysical event. “Doomsday 2012” represents both a challenge and opportunity for science communication and education. This plenary panel discussed the basic ideas of the 2012 scenario and considered what is being done and what could be done to help the public understand what is real and what isn't. These lessons can be applied to future pseudoscientific predictions about the end of the world.

  19. Characteristics, emerging needs, and challenges of transdisciplinary sustainability science

    DEFF Research Database (Denmark)

    Ruppert-Winkel, Chantal; Arlinghaus, Robert; Deppisch, Sonja

    2015-01-01

    Transdisciplinary sustainability science (TSS) is a prominent way of scientifically contributing to the solution of sustainability problems. Little is known, however, about the practice of scientists in TSS, especially those early in their career. Our objectives were to identify these practices...... and to outline the needs and challenges for early career scientists in TSS. To that end, we compiled 10 key characteristics of TSS based on a literature survey. We then analyzed research groups with 81 early career scientists against these characteristics. All of these research groups are funded by an ongoing...... achievements of societal and scientific impact, acknowledging that focusing on the time-consuming former aspect is difficult to integrate into a scientific career path; and (3) although generalist researchers are increasingly becoming involved in such TSS research projects, supporting the integration of social...

  20. Global hunger: a challenge to agricultural, food, and nutritional sciences.

    Science.gov (United States)

    Wu, Shiuan-Huei; Ho, Chi-Tang; Nah, Sui-Lin; Chau, Chi-Fai

    2014-01-01

    Hunger has been a concern for generations and has continued to plague hundreds of millions of people around the world. Although many efforts have been devoted to reduce hunger, challenges such as growing competitions for natural resources, emerging climate changes and natural disasters, poverty, illiteracy, and diseases are posing threats to food security and intensifying the hunger crisis. Concerted efforts of scientists to improve agricultural and food productivity, technology, nutrition, and education are imperative to facilitate appropriate strategies for defeating hunger and malnutrition. This paper provides some aspects of world hunger issues and summarizes the efforts and measures aimed to alleviate food problems from the food and nutritional sciences perspectives. The prospects and constraints of some implemented strategies for alleviating hunger and achieving sustainable food security are also discussed. This comprehensive information source could provide insights into the development of a complementary framework for dealing with the global hunger issue.

  1. Open science initiatives: challenges for public health promotion.

    Science.gov (United States)

    Holzmeyer, Cheryl

    2018-03-07

    While academic open access, open data and open science initiatives have proliferated in recent years, facilitating new research resources for health promotion, open initiatives are not one-size-fits-all. Health research particularly illustrates how open initiatives may serve various interests and ends. Open initiatives not only foster new pathways of research access; they also discipline research in new ways, especially when associated with new regimes of research use and peer review, while participating in innovation ecosystems that often perpetuate existing systemic biases toward commercial biomedicine. Currently, many open initiatives are more oriented toward biomedical research paradigms than paradigms associated with public health promotion, such as social determinants of health research. Moreover, open initiatives too often dovetail with, rather than challenge, neoliberal policy paradigms. Such initiatives are unlikely to transform existing health research landscapes and redress health inequities. In this context, attunement to social determinants of health research and community-based local knowledge is vital to orient open initiatives toward public health promotion and health equity. Such an approach calls for discourses, norms and innovation ecosystems that contest neoliberal policy frameworks and foster upstream interventions to promote health, beyond biomedical paradigms. This analysis highlights challenges and possibilities for leveraging open initiatives on behalf of a wider range of health research stakeholders, while emphasizing public health promotion, health equity and social justice as benchmarks of transformation.

  2. Sustainable development: challenges and opportunities for the natural sciences (Invited)

    Science.gov (United States)

    Mutter, J. C.; Fishman, R.; Anttila-Hughes, J. K.; Hsiang, S. M.

    2009-12-01

    The challenges of sustainable development -- equitably improving global human welfare while ensuring that the environment is preserved for future generations - demand research at the nexus of the social and natural sciences. Massive and inevitable changes in climate, ecosystem functions, and human interaction with the environment will perturb societies throughout the world in different ways over the coming century. The changes faced by poor societies and their ability to cope differs markedly from those that face the richest. Yet in all regions the dynamic interaction of social and natural drivers will govern the prospects for human welfare and its improvement. Developing an understanding of these phenomena will require field research together with analytical and modeling capabilities that couple physical and social phenomena, allowing feedback between the two to manifest and permit forecasting over long time scales. Heterogeneous income and population growth further complicate this need through their consequences for food security, migration, resource allocation, and conflict. In this contribution, we identify some key concepts of sustainable development, open research questions and outline how scientific research might engage this emerging discipline. Using recent examples of interaction, we discuss the opportunities and challenges facing the further development of this dialogue.

  3. Environmental data management at Fernald

    International Nuclear Information System (INIS)

    Jones, B.W.; Williams, J.

    1994-01-01

    FERMCO supports DOE's ongoing initiatives for the continuous improvement of site restoration through the development and application of innovative technologies. A major thrust of FERMCO's efforts has been the enhancement of environmental data management technology for the site. The understanding of environmental data is the fundamental basis for determining the need for environmental restoration, developing and comparing remedial alternatives, and reaching a decision on how to clean up a site. Environmental data management at Fernald is being focused on two major objectives: to improve the efficiency of the data management process, and to provide a better understanding of the meaning of the data at the earliest possible time. Environmental data at Fernald typically originate as soil or groundwater samples collected by the field geologists. These samples are then shipped to one or more laboratories for analysis. After the analyses are returned from the laboratories, the data are reviewed and qualified for usability. The data are then used by environmental professionals to determine the nature and extent of contamination. Additionally, hazardous waste materials, whether generated during production or during cleanup, may be sampled to characterize the waste before shipment or treatment. The data management process, which uses four major software systems, is presented graphically.

  4. Data management of protein interaction networks

    CERN Document Server

    Cannataro, Mario

    2012-01-01

    Interactomics: a complete survey from data generation to knowledge extraction. With the increasing use of high-throughput experimental assays, more and more protein interaction databases are becoming available. As a result, computational analysis of protein-to-protein interaction (PPI) data and networks, now known as interactomics, has become an essential tool to determine functionally associated proteins. From wet lab technologies to data management to knowledge extraction, this timely book guides readers through the new science of interactomics, giving them the tools needed to: Generate
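
    As a minimal illustration of the PPI data structures the book surveys, the sketch below builds an interaction graph from pairwise interactions and lists direct interaction partners. The protein names and edges are invented, and the adjacency-set representation is a simplifying assumption.

```python
# Tiny protein-protein interaction (PPI) graph built from interaction pairs.
# The protein names and edges are invented for illustration.
from collections import defaultdict

interactions = [("P53", "MDM2"), ("P53", "BRCA1"), ("BRCA1", "RAD51")]

graph = defaultdict(set)
for a, b in interactions:
    graph[a].add(b)
    graph[b].add(a)

def partners(protein):
    """Return the set of proteins directly interacting with `protein`."""
    return graph.get(protein, set())

print(sorted(partners("P53")))    # ['BRCA1', 'MDM2']
print(sorted(partners("BRCA1")))  # ['P53', 'RAD51']
```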

  5. The global nutrient challenge. From science to public engagement

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, M.A.; Howard, C.M. [NERC Centre for Ecology and Hydrology, Edinburgh (United Kingdom); Bleeker, A. [Energy research Centre of the Netherlands, Petten (Netherlands); Datta, A. [United Nations Environment Programme, Nairobi (Kenya)

    2013-04-15

    Among the many environment and development challenges facing humanity, it is fair to say that nutrients do not currently feature so regularly in the newspapers, radio and television. The media tends to prefer easy single issues which affect our daily lives in a clear-cut way. The role of carbon in climate change is a good example. We all depend on climate. Burning fossil fuels makes more carbon dioxide, tending to change temperature and rainfall patterns, to which we can easily relate. The science is complex, but it is a simple message for the public to understand. It does not take long to think of several other easily grasped threats, like urban air pollution, poor drinking water, or even the occurrence of horsemeat in food chains. It is perhaps for these reasons that the role of nutrients in environmental change has received much less public attention. After all, nutrients - including nitrogen, phosphorus and many micronutrients - play multiple roles in our world; they affect many biogeochemical processes and they lead to a plethora of interacting threats. If we are not careful, we can quickly get buried in the complexity of the different ways in which our lives are affected by these elements. The outcome is that it can become hard to convey the science of global nutrient cycles in a way that the public can understand. These are points about which we have given substantial thought as we contributed to a recently launched report Our Nutrient World: The challenge to produce more food and energy with less pollution (Sutton et al., 2013). The report was commissioned by the United Nations Environment Programme (UNEP) and conducted by the Global Partnership on Nutrient Management in cooperation with the International Nitrogen Initiative. The commission was not to provide a full scientific assessment, but rather to develop a global overview of the challenges associated with nutrient management. Drawing on existing knowledge, the aim was to distill the nature of the

  6. Wildlife tracking data management: a new vision.

    Science.gov (United States)

    Urbano, Ferdinando; Cagnacci, Francesca; Calenge, Clément; Dettki, Holger; Cameron, Alison; Neteler, Markus

    2010-07-27

    To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the following spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming importing/exporting phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the global positioning system (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling.
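
    The database-centred architecture argued for above can be sketched in miniature: GPS fixes stored in a relational table and retrieved with a per-animal bounding-box query. A production system would use a true spatial database (for example PostGIS); SQLite, the schema, and the sample fixes here are simplifying assumptions.

```python
# Minimal stand-in for a GPS-tracking data store: fixes in a relational table,
# queried by animal and bounding box. A real deployment would use a spatial
# database such as PostGIS; the schema and values are simplified assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE gps_fix (
    animal_id TEXT, acquired_at TEXT, lon REAL, lat REAL)""")
conn.executemany(
    "INSERT INTO gps_fix VALUES (?, ?, ?, ?)",
    [("deer_01", "2010-05-01T06:00Z", 11.04, 46.02),
     ("deer_01", "2010-05-01T07:00Z", 11.06, 46.03),
     ("deer_02", "2010-05-01T06:30Z", 11.50, 46.40)],
)

def fixes_in_bbox(animal_id, lon_min, lat_min, lon_max, lat_max):
    """Return fixes for one animal inside a longitude/latitude bounding box."""
    return conn.execute(
        """SELECT acquired_at, lon, lat FROM gps_fix
           WHERE animal_id = ? AND lon BETWEEN ? AND ? AND lat BETWEEN ? AND ?
           ORDER BY acquired_at""",
        (animal_id, lon_min, lon_max, lat_min, lat_max),
    ).fetchall()

print(fixes_in_bbox("deer_01", 11.0, 46.0, 11.1, 46.1))
```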

  7. The LHCb Data Management System

    International Nuclear Information System (INIS)

    Baud, J P; Charpentier, Ph; Ciba, K; Lanciotti, E; Màthè, Z; Graciani, R; Remenska, D; Santana, R

    2012-01-01

    The LHCb Data Management System is based on the DIRAC Grid Community Solution. LHCbDirac provides extensions to the basic DMS such as a Bookkeeping System. Datasets are defined as sets of files corresponding to a given query in the Bookkeeping system. Datasets can be manipulated by CLI tools as well as by automatic transformations (removal, replication, processing). A dynamic handling of dataset replication is performed, based on disk space usage at the sites and dataset popularity. For custodial storage, an on-demand recall of files from tape is performed, driven by the requests of the jobs, including disk cache handling. We shall describe the tools that are available for Data Management, from handling of large datasets to basic tools for users as well as for monitoring the dynamic behavior of LHCb Storage capacity.
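
    The dynamic replica handling described above, driven by disk usage and dataset popularity, can be caricatured with a small policy function. The thresholds and replica counts are invented for illustration and do not represent LHCbDirac policy.

```python
# Toy replica-count policy driven by dataset popularity and site disk usage.
# Thresholds and numbers are illustrative assumptions, not LHCbDirac policy.
def target_replicas(accesses_last_month: int, disk_free_fraction: float) -> int:
    """Return a desired number of disk replicas for a dataset."""
    if accesses_last_month == 0:
        base = 1          # cold data: one disk replica, tape as custodial copy
    elif accesses_last_month < 100:
        base = 2
    else:
        base = 3          # popular data: spread more widely
    if disk_free_fraction < 0.10:
        base = max(1, base - 1)  # under disk pressure, trim replicas
    return base

print(target_replicas(250, 0.30))  # popular dataset, plenty of disk -> 3
print(target_replicas(250, 0.05))  # popular dataset, disk nearly full -> 2
```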

  8. User-Centered Data Management

    CERN Document Server

    Catarci, Tiziana; Kimani, Stephen

    2010-01-01

    This lecture covers several core issues in user-centered data management, including how to design usable interfaces that suitably support database tasks, and relevant approaches to visual querying, information visualization, and visual data mining. Novel interaction paradigms, e.g., mobile interfaces and interfaces that go beyond the visual dimension, are also discussed. Table of Contents: Why User-Centered / The Early Days: Visual Query Systems / Beyond Querying / More Advanced Applications / Non-Visual Interfaces / Conclusions

  9. The Next Generation Science Standards: The Features and Challenges

    Science.gov (United States)

    Pruitt, Stephen L.

    2014-01-01

    Beginning in January of 2010, the Carnegie Corporation of New York funded a two-step process to develop a new set of state developed science standards intended to prepare students for college and career readiness in science. These new internationally benchmarked science standards, the Next Generation Science Standards (NGSS) were completed in…

  10. Current fundamental science challenges in low temperature plasma science that impact energy security and international competitiveness

    Science.gov (United States)

    Hebner, Greg

    2010-11-01

    Products and consumer goods that utilize low temperature plasmas at some point in their creation touch and enrich our lives on almost a continuous basis. Examples are many but include the tremendous advances in microelectronics and the pervasive nature of the internet, advanced material coatings that increase the strength and reliability of products from turbine engines to potato chip bags, and the recent national emphasis on energy efficient lighting and compact fluorescent bulbs. Each of these products owes its contributions to energy security and international competitiveness to fundamental research investments. However, it would be a mistake to believe that the great commercial success of these products implies a robust understanding of the complicated interactions inherent in plasma systems. Rather, current development of the next generation of low temperature plasma enabled products and processes is clearly exposing a new set of exciting scientific challenges that require leaps in fundamental understanding and interdisciplinary research teams. Emerging applications such as liquid-plasma systems to improve water quality and remediate hazardous chemicals, plasma-assisted combustion to increase energy efficiency and reduce emissions, and medical applications promise to improve our lives and the environment only if difficult science questions are solved. This talk will take a brief look back at the role of low temperature plasma science in enabling entirely new markets and then survey the next generation of emerging plasma applications. The emphasis will be on describing the key science questions and the opportunities for scientific cross cutting collaborations that underscore the need for increased outreach on the part of the plasma science community to improve visibility at the federal program level. This work is supported by the DOE, Office of Science for Fusion Energy Sciences, and Sandia National Laboratories, a multi-program laboratory managed and operated

  11. Medicinal mushroom science: Current perspectives, advances, evidences, and challenges

    Directory of Open Access Journals (Sweden)

    Solomon P Wasser

    2014-12-01

    Full Text Available The main target of the present review is to draw attention to the current perspectives, advances, evidences, challenges, and future development of medicinal mushroom science in the 21 st century. Medicinal mushrooms and fungi are thought to possess approximately 130 medicinal functions, including antitumor, immunomodulating, antioxidant, radical scavenging, cardiovascular, anti-hypercholesterolemic, antiviral, antibacterial, anti-parasitic, antifungal, detoxification, hepatoprotective, and antidiabetic effects. Many, if not all, higher Basidiomycetes mushrooms contain biologically active compounds in fruit bodies, cultured mycelium, and cultured broth. Special attention is paid to mushroom polysaccharides. The data on mushroom polysaccharides and different secondary metabolites are summarized for approximately 700 species of higher hetero- and homobasidiomycetes. Numerous bioactive polysaccharides or polysaccharide-protein complexes from the medicinal mushrooms described appear to enhance innate and cell-mediated immune responses, and exhibit antitumor activities in animals and humans. Whilst the mechanism of their antitumor actions is still not completely understood, stimulation and modulation of key host immune responses by these mushroom compounds appear central. Polysaccharides and low-molecular-weight secondary metabolites are particularly important due to their antitumor and immunostimulating properties. Several of the mushroom compounds have been subjected to Phase I, II, and III clinical trials, and are used extensively and successfully in Asia to treat various cancers and other diseases. Special attention is given to many important unsolved problems in the study of medicinal mushrooms.

  12. Challenges and opportunities of cloud computing for atmospheric sciences

    Science.gov (United States)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from having to fund or access a large cyberinfrastructure in order to perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' the access to high-performance computing, giving flexibility to funding bodies for allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related. Usually uncertainty can be reduced with the availability of computational resources to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of applying cloud computing resources to climate modeling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.

  13. Synthesis in land change science: methodological patterns, challenges, and guidelines.

    Science.gov (United States)

    Magliocca, Nicholas R; Rudel, Thomas K; Verburg, Peter H; McConnell, William J; Mertz, Ole; Gerstner, Katharina; Heinimann, Andreas; Ellis, Erle C

    Global and regional economic and environmental changes are increasingly influencing local land-use, livelihoods, and ecosystems. At the same time, cumulative local land changes are driving global and regional changes in biodiversity and the environment. To understand the causes and consequences of these changes, land change science (LCS) draws on a wide array of synthetic and meta-study techniques to generate global and regional knowledge from local case studies of land change. Here, we review the characteristics and applications of synthesis methods in LCS and assess the current state of synthetic research based on a meta-analysis of synthesis studies from 1995 to 2012. Publication of synthesis research is accelerating, with a clear trend toward increasingly sophisticated and quantitative methods, including meta-analysis. Detailed trends in synthesis objectives and methods, and in the land change phenomena and world regions most commonly studied, are presented. Significant challenges to successful synthesis research in LCS are also identified, including issues of interpretability and comparability across case-studies and the limits of and biases in the geographic coverage of case studies. Nevertheless, synthesis methods based on local case studies will remain essential for generating systematic global and regional understanding of local land change for the foreseeable future, and multiple opportunities exist to accelerate and enhance the reliability of synthetic LCS research in the future. Demand for global and regional knowledge generation will continue to grow to support adaptation and mitigation policies consistent with both the local realities and regional and global environmental and economic contexts of land change.

  14. Research Data Management Training for Geographers: First Impressions

    Directory of Open Access Journals (Sweden)

    Kerstin Helbig

    2016-03-01

    Full Text Available Sharing and secondary analysis of data have become increasingly important for research. Especially in geography, the collection of digital data has grown due to technological changes. Responsible handling and proper documentation of research data have therefore become essential for funders, publishers and higher education institutions. To achieve this goal, universities offer support and training in research data management. This article presents the experiences of a pilot workshop in research data management, especially for geographers. A discipline-specific approach to research data management training is recommended. The focus of this approach increases researchers’ interest and allows for more specific guidance. The instructors identified problems and challenges of research data management for geographers. With regard to training, the communication of benefits and reaching the target groups seem to be the biggest challenges. Consequently, better incentive structures as well as communication channels have to be established.

  15. Dynamic and adaptive data-management in ATLAS

    CERN Document Server

    Lassnig, M; Branco, M; Molfetas, A

    2010-01-01

    Distributed data-management on the grid is subject to huge uncertainties, yet static policies govern its usage. Due to the unpredictability of user behaviour, the high latency and the heterogeneous nature of the environment, distributed data-management on the grid is challenging. In this paper we present the first steps towards a future dynamic data-management system that adapts to the changing conditions and environment. Such a system would reduce the number of manual interventions and remove unnecessary software layers, thereby providing a higher quality of service to the collaboration.
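
    The abstract does not give the adaptation algorithm itself; purely as an illustration of the idea of demand-driven placement, a toy policy (hypothetical names, not the ATLAS DDM interfaces) might adjust replica counts from recent access counts like this:

    from collections import Counter

    def plan_replication(access_log, current_replicas,
                         min_replicas=1, max_replicas=5,
                         hot_threshold=100, cold_threshold=5):
        """Toy adaptive-replication policy: scale replica counts with recent demand.

        access_log       -- iterable of dataset names, one entry per recent access
        current_replicas -- dict mapping dataset name to its current replica count
        Returns a dict of proposed replica counts (illustrative only).
        """
        demand = Counter(access_log)
        plan = {}
        for dataset, replicas in current_replicas.items():
            hits = demand.get(dataset, 0)
            if hits >= hot_threshold:
                plan[dataset] = min(replicas + 1, max_replicas)  # popular: add a replica
            elif hits <= cold_threshold:
                plan[dataset] = max(replicas - 1, min_replicas)  # idle: trim a replica
            else:
                plan[dataset] = replicas                         # steady demand: keep as-is
        return plan

    # Example: "dsA" is hot, "dsB" is idle.
    print(plan_replication(["dsA"] * 150 + ["dsB"] * 3, {"dsA": 2, "dsB": 3}))
    # -> {'dsA': 3, 'dsB': 2}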

  16. A Case Study: Data Management in Biomedical Engineering

    Directory of Open Access Journals (Sweden)

    Glenn R. Gaudette

    2012-01-01

    Full Text Available In a biomedical engineering lab at Worcester Polytechnic Institute, co-author Dr. Glenn R. Gaudette and his research team are investigating the effects of stem cell therapy on the regeneration of function in damaged cardiac tissue in laboratory rats. Each instance of stem cell experimentation on a rat yields hundreds of data sets that must be carefully captured, documented and securely stored so that the data will be easily accessed and retrieved for papers, reports, further research, and validation of findings, while meeting NIH guidelines for data sharing. After a brief introduction to the bioengineering field and stem cell research, this paper focuses on the experimental workflow and the data generated in one instance of stem cell experimentation; the lab’s data management practices; and how Dr. Gaudette teaches data management to the lab’s incoming graduate students each semester. The co-authors discuss the haphazard manner by which engineering and science students typically learn data management practices, and advocate for the integration of formal data management instruction in higher education STEM curricula. The paper concludes with a discussion of the Frameworks for a Data Management Curriculum developed collaboratively by the co-authors’ institutions -- the University of Massachusetts Medical School and Worcester Polytechnic Institute -- to teach data management best practices to students in the sciences, health sciences, and engineering.

  17. Romanian spatial planning research facing the challenges of globalizing sciences

    Directory of Open Access Journals (Sweden)

    Alexandru-Ionuţ Petrişor

    2018-03-01

    competitiveness, measured in terms of scientific yield and citations, primarily affects fields where articles and citations are not the traditional outputs, such as the humanities and social sciences in general and planning-related disciplines in particular. When discussing planning, it has to be stressed that research has primarily a societal value and is not aimed at developing products that can foster economic growth or delivering scientific articles that profoundly change the theoretical perspectives. Simply put, research in planning aims at increasing the safety and welfare of people. As a consequence, planning research topics have shifted from providing scientific grounds for regional development policies, to addressing research quality and social responsibility or producing research guidelines. This article looks at the particular case of Romanian planning research based on SCImago data, in an attempt to assess whether this field is able to meet these global challenges, especially after the consistent, albeit uneven in goal and pace, application of new research policies designed after joining the European Union, which were aimed at increasing its article output and its international visibility. The findings indicate that the numerical growth of articles and publications is spectacular in Romania for most fields, and even more so within the humanities, the social sciences and planning. However, the question remains whether this impressive growth is supported by an increase in quality. We have therefore left aside matters such as the globalization of authors, topics or citations. These aspects require a more in-depth research effort.

  18. Data management system performance modeling

    Science.gov (United States)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
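
    The Rate Monotonic Analysis mentioned above has a well-known sufficient schedulability test, the Liu and Layland utilization bound U <= n(2^(1/n) - 1) for n periodic tasks; the sketch below applies that standard test and is not taken from the SSF DMS work itself:

    def rma_schedulable(tasks):
        """Sufficient (not necessary) Rate Monotonic Analysis schedulability test.

        tasks -- list of (worst_case_execution_time, period) pairs for periodic tasks.
        Returns (total_utilization, liu_layland_bound, passes_test).
        """
        n = len(tasks)
        utilization = sum(c / t for c, t in tasks)
        bound = n * (2 ** (1.0 / n) - 1)   # Liu & Layland utilization bound
        return utilization, bound, utilization <= bound

    # Three tasks with execution times and periods in the same time unit.
    u, b, ok = rma_schedulable([(1, 4), (2, 8), (1, 10)])
    print(f"U = {u:.3f}, bound = {b:.3f}, schedulable by the RMA bound: {ok}")
    # -> U = 0.600, bound = 0.780, schedulable by the RMA bound: True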

  19. Spatial Data Management System (SDMS)

    Science.gov (United States)

    Hutchison, Mark W.

    1994-01-01

    The Spatial Data Management System (SDMS) is a testbed for retrieval and display of spatially related material. SDMS permits the linkage of large graphical display objects with detail displays and explanations of its smaller components. SDMS combines UNIX workstations, MIT's X Window system, TCP/IP and WAIS information retrieval technology to prototype a means of associating aggregate data linked via spatial orientation. SDMS capitalizes upon and extends previous accomplishments of the Software Technology Branch in the area of Virtual Reality and Automated Library Systems.

  20. The Cheetah data management system

    International Nuclear Information System (INIS)

    Kunz, P.F.; Word, G.B.

    1992-09-01

    Cheetah is a data management system based on the C programming language, with support for other languages. Its main goal is to transfer data between memory and I/O streams in a general way. The streams are either associated with disk files or are network data streams. Cheetah provides optional convenience functions to assist in the management of C structures. Cheetah streams are self-describing so that general purpose applications can fully understand an incoming stream. This information can be used to display the data in an incoming stream to the user of an interactive general application, complete with variable names and optional comments
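
    Cheetah's C interface is not reproduced in the abstract; the following generic sketch (in Python, with made-up names) only illustrates the self-describing-stream idea it mentions: a header carrying field names and types lets a general-purpose reader decode the records that follow.

    import io
    import json
    import struct

    def write_stream(fp, fields, records):
        """Write a self-describing stream: a JSON header, then fixed-size packed records.

        fields  -- list of (name, struct_code) pairs, e.g. [("energy", "d"), ("hits", "i")]
        records -- iterable of tuples matching the field order
        """
        fmt = "<" + "".join(code for _, code in fields)
        header = json.dumps({"fields": fields, "format": fmt}).encode("utf-8")
        fp.write(struct.pack("<I", len(header)))      # length prefix for the header
        fp.write(header)
        for rec in records:
            fp.write(struct.pack(fmt, *rec))

    def read_stream(fp):
        """Read the header first, so any generic application can interpret the stream."""
        (hdr_len,) = struct.unpack("<I", fp.read(4))
        header = json.loads(fp.read(hdr_len).decode("utf-8"))
        names = [name for name, _ in header["fields"]]
        size = struct.calcsize(header["format"])
        while chunk := fp.read(size):
            yield dict(zip(names, struct.unpack(header["format"], chunk)))

    buf = io.BytesIO()
    write_stream(buf, [("energy", "d"), ("hits", "i")], [(12.5, 3), (7.25, 1)])
    buf.seek(0)
    print(list(read_stream(buf)))   # [{'energy': 12.5, 'hits': 3}, {'energy': 7.25, 'hits': 1}]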

  1. Science Education Reform in Qatar: Progress and Challenges

    Science.gov (United States)

    Said, Ziad

    2016-01-01

    Science education reform in Qatar has had limited success. In the Trends in International Mathematics and Science Study (TIMSS), Qatari 4th and 8th grade students have shown progress in science achievement, but they remain significantly below the international average. Also, in the Program for International Student Assessment (PISA), Qatari…

  2. The Challenges Faced by New Science Teachers in Saudi Arabia

    Science.gov (United States)

    Alsharari, Salman

    2016-01-01

    Growing demand for science teachers in the Kingdom of Saudi Arabia, fed by increasing numbers of public school students, is forcing the Saudi government to attract, recruit and retain well-qualified science teachers. Beginning science teachers enter the educational profession with a great sense of fulfillment and satisfaction in their roles and positions…

  3. Prospects and challenges for social media data in conservation science

    Directory of Open Access Journals (Sweden)

    Enrico Di Minin

    2015-09-01

    Full Text Available Social media data have been extensively used in numerous fields of science, but examples of their use in conservation science are still very limited. In this paper, we propose a framework for how social media data could be useful for conservation science and practice. We present the commonly used social media platforms and discuss how their content could provide new data and information for conservation science. Based on this, we discuss how future work in conservation science and practice would benefit from social media data.

  4. Metaphor and knowledge the challenges of writing science

    CERN Document Server

    Baake, Ken

    2003-01-01

    Analyzing the power of metaphor in the rhetoric of science, this book examines the use of words to express complex scientific concepts. Metaphor and Knowledge offers a sweeping history of rhetoric and metaphor in science, delving into questions about how language constitutes knowledge. Weaving together insights from a group of scientists at the Santa Fe Institute as they shape the new interdisciplinary field of complexity science, Ken Baake shows the difficulty of writing science when word meanings are unsettled, and he analyzes the power of metaphor in science.

  5. ARDA Dashboard Data Management Monitoring

    CERN Document Server

    Rocha, R; Andreeva, J; Saiz, P

    2007-01-01

    The ATLAS DDM (Distributed Data Management) system is responsible for the management and distribution of data across the different grid sites. The data is generated at CERN and has to be made available as fast as possible in a large number of centres for production purposes, and later in many other sites for end-user analysis. Monitoring data transfer activity and data availability is an essential task for both site administrators and end users doing analysis in their local centres. Data management using the grid depends on a complex set of services: file catalogues for file and file location bookkeeping, transfer services for file movement, storage managers, and others. In addition, there are several flavours of each of these components, tens of sites each managing a distinct installation - over 100 at the present time - and in some organizations data is seen and moved in larger granularity than files - usually called datasets - which makes the successful usage of the standard grid monitoring tools a non strai...
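
    The monitoring internals are not described in the abstract; as a purely illustrative sketch (hypothetical event format, not the Dashboard's schema), per-site transfer events could be reduced to the kind of success-rate summary such a dashboard displays:

    from collections import defaultdict

    def summarise_transfers(events):
        """Aggregate file-transfer events into per-site statistics.

        events -- iterable of dicts with keys "site", "ok" (bool) and "bytes".
        Returns {site: {"transfers", "failed", "bytes", "success_rate"}}.
        """
        stats = defaultdict(lambda: {"transfers": 0, "failed": 0, "bytes": 0})
        for ev in events:
            site = stats[ev["site"]]
            site["transfers"] += 1
            site["failed"] += 0 if ev["ok"] else 1
            site["bytes"] += ev["bytes"] if ev["ok"] else 0
        for site in stats.values():
            site["success_rate"] = 1 - site["failed"] / site["transfers"]
        return dict(stats)

    sample = [
        {"site": "SITE-A", "ok": True,  "bytes": 10**9},
        {"site": "SITE-A", "ok": False, "bytes": 0},
        {"site": "SITE-B", "ok": True,  "bytes": 5 * 10**8},
    ]
    print(summarise_transfers(sample))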

  6. NBII-SAIN Data Management Toolkit

    Science.gov (United States)

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    The Strategic Plan for the U.S. Geological Survey Biological Informatics Program (2005-2009) recognizes the need for effective data management: Though the Federal government invests more than $600 million per year in biological data collection, it is difficult to address these issues because of limited accessibility and lack of standards for data and information...variable quality, sources, methods, and formats (for example observations in the field, museum specimens, and satellite images) present additional challenges. This is further complicated by the fast-moving target of emerging and changing technologies such as GPS and GIS. Even though these technologies offer new solutions, they also create new informatics challenges (Ruggiero and others, 2005). The USGS National Biological Information Infrastructure program, hereafter referred to as NBII, is charged with the mission to improve the way data and information are gathered, documented, stored, and accessed. The central objective of this project is a direct reflection of the purpose of NBII as described by John Mosesso, Program Manager of the U.S. Geological Survey-Biological Informatics Program-GAP Analysis: At the outset, the reason for bringing about NBII was that there were significant amounts of data and information scattered all over the U.S., not accessible, in incompatible formats, and that NBII was tasked with addressing this problem...NBII's focus is to pull data together that truly matters to someone or communities. Essentially, the core questions are: 1) what are the issues, 2) where is the data, and 3) how can we make it usable and accessible (John Mosesso, U.S. Geological Survey, oral commun., 2006). Redundancy in data collection can be a major issue when multiple stakeholders are involved with a common effort. In 2001 the U.S. General Accounting Office (USGAO) estimated that about 50 percent of the Federal government's geospatial data at the time was redundant. In addition, approximately 80

  7. Data Management Practices and Perspectives of Atmospheric Scientists and Engineering Faculty

    Directory of Open Access Journals (Sweden)

    Christie Wiley

    2016-12-01

    Full Text Available This article analyzes 21 in-depth interviews of engineering and atmospheric science faculty at the University of Illinois Urbana-Champaign (UIUC) to determine faculty data management practices and needs within the context of their research activities. A detailed literature review of previous large-scale and institutional surveys and interviews revealed that researchers have a broad awareness of data-sharing mandates of federal agencies and journal publishers and a growing acceptance, with some concerns, of the value of data-sharing. However, the disciplinary differences in data management needs are significant and represent a set of challenges for libraries in setting up consistent and successful services. In addition, faculty have not yet significantly changed their data management practices to conform with the mandates. The interviews focused on current research projects and funding sources, data types and format, the use of disciplinary and institutional repositories, data-sharing, their awareness of university library data management and preservation services, funding agency review panel experiences, and struggles or challenges with managing research data. In general, the interviews corroborated the trends identified in the literature. One clear observation from the interviews was that scientists and engineers take a holistic view of the research lifecycle and treat data as one of many elements in the scholarly communication workflow. Data generation, usage, storage, and sharing are an integrated aspect of a larger scholarly workflow, and are not necessarily treated as a separate entity. Acknowledging this will allow libraries to develop programs that better integrate data management support into scholarly communication instruction and training.

  8. Information Sciences: training, challenges and new proposal from Venezuela

    Directory of Open Access Journals (Sweden)

    Leomar José Montilla

    2012-04-01

    Full Text Available It reflects on the training of information professionals in Venezuela, the potential contributions that these professionals can make to society, and their projection towards it. The content is divided into three parts: the first deals with issues related to professional training in Information Sciences in Venezuela, the second projects the training in Information Sciences in Venezuela into the future, and the third reflects on the prospects for professionals in Information Science.

  9. Challenges of Teaching Science to Address Global Sustainability

    OpenAIRE

    Halim, Lilia

    2015-01-01

    Liveable conditions in this post-industrial era depend on our ability to understand and use scientific and technological advancements in a responsible manner. Water pollution and global warming phenomena are outcomes of scientific and technological advancement that has been mismanaged. One way to achieve global sustainability is through science education and the development of a scientifically literate citizen. This paper, based on the literature and research work in science educatio...

  10. Global Social Challenges: insights from the physical sciences and their relevance to the evolution of social science

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The complex challenges confronting humanity today point to the need for new thinking and new theory in the social sciences which overcomes the limitations of compartmentalized, sectoral concepts, strategies and policies and mechanistic approaches to living social systems. The World Academy of Art & Science is convening a consortium of leading institutions and thinkers from different sectors to contribute ideas for formulation of a cohesive framework capable of addressing global social challenges in their totality and complex interrelationships. The objective of my presentation will be to explore the potential for collaboration between the physical and social sciences to arrive at a more cohesive and effective framework by exploring a series of questions, including: Is an integrated science of society possible that transcends disciplinary boundaries, based on common underlying principles as we find in the natural sciences? To what extent can principles of natural science serve as valid models and a...

  11. Expiration Times for Data Management

    DEFF Research Database (Denmark)

    Schmidt, Albrecht; Jensen, Christian Søndergaard; Saltenis, Simonas

    2006-01-01

    This paper describes an approach to incorporating the notion of expiration time into data management based on the relational model. Expiration times indicate when tuples cease to be current in a database. The paper presents a formal data model and a query algebra that handle expiration times transparently and declaratively. In particular, expiration times are exposed to users only on insertion and update, and when triggers fire due to the expiration of a tuple; for queries, they are handled behind the scenes and do not concern the user. Notably, tuples are removed automatically from (materialised) query results as they expire in the (base) relations. For application developers, the benefits of using expiration times are (1) leaner application code, (2) lower transaction volume, (3) smaller databases, and (4) higher consistency for replicated data with lower overhead. Expiration times turn out...
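
    The paper's formal algebra is not reproduced here; a minimal sketch of the behaviour described, with expiration set only at insertion and expired tuples silently excluded from query results (illustrative layout, not the authors' model), could look like:

    import time

    class ExpiringRelation:
        """Toy relation whose tuples carry an expiration time and vanish from queries."""

        def __init__(self):
            self._rows = []                      # list of (row, expiration_timestamp)

        def insert(self, row, ttl_seconds):
            # Expiration is only visible here, at insertion (or update) time.
            self._rows.append((row, time.time() + ttl_seconds))

        def select(self, predicate=lambda row: True):
            # Queries never see expired tuples; expiration is handled behind the scenes.
            now = time.time()
            self._rows = [(r, exp) for r, exp in self._rows if exp > now]   # lazy purge
            return [r for r, exp in self._rows if predicate(r)]

    sessions = ExpiringRelation()
    sessions.insert({"user": "alice"}, ttl_seconds=3600)
    sessions.insert({"user": "bob"}, ttl_seconds=-1)     # already expired
    print(sessions.select())                             # only alice's tuple remains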

  12. Integrated data management for RODOS

    International Nuclear Information System (INIS)

    Abramowicz, K.; Koschel, A.; Rafat, M.; Wendelgass, R.

    1995-12-01

    The report presents the results of a feasibility study on an integrated data organisation and management in RODOS, the real-time on-line decision support system for off-site nuclear emergency management. The conceptual design of the functional components of the integrated data management is described, taking account of the software components and the operation environment of the RODOS system. In particular, the schema architecture of a database integration manager for accessing and updating a multi-database system is discussed in detail under a variety of database management aspects. Furthermore, the structural design of both a simple knowledge database and a real-time database is described. Finally, some short comments on the benefits and disadvantages of the proposed concept of data integration in RODOS are given. (orig.) [de]
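
    The report's design details are not given in the abstract; as an illustrative sketch only (invented interfaces, not the RODOS software), a database integration manager of the kind described can be pictured as a thin dispatcher routing reads and writes to the backend that owns each category of data:

    class IntegrationManager:
        """Toy multi-database facade: route each request to the backend owning the data.

        Backends are any objects exposing get(key) and put(key, value); the routing
        table maps a logical data category to exactly one backend.
        """

        def __init__(self, routing_table):
            self._routes = routing_table         # e.g. {"measurements": rtdb, "rules": kb}

        def _backend(self, category):
            try:
                return self._routes[category]
            except KeyError:
                raise KeyError(f"no backend registered for category '{category}'")

        def read(self, category, key):
            return self._backend(category).get(key)

        def write(self, category, key, value):
            self._backend(category).put(key, value)

    class DictBackend(dict):
        """In-memory stand-in for a real database."""
        def get(self, key):                      # one-argument get, unlike dict.get
            return self[key]
        def put(self, key, value):
            self[key] = value

    mgr = IntegrationManager({"measurements": DictBackend(), "rules": DictBackend()})
    mgr.write("measurements", "station-7", {"dose_rate": 0.12})
    print(mgr.read("measurements", "station-7"))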

  13. Data management system advanced development

    Science.gov (United States)

    Douglas, Katherine; Humphries, Terry

    1990-01-01

    The Data Management System (DMS) Advanced Development task provides for the development of concepts, new tools, DMS services, and for the testing of the Space Station DMS hardware and software. It also provides for the development of techniques capable of determining the effects of system changes/enhancements, additions of new technology, and/or hardware and software growth on system performance. This paper will address the built-in characteristics which will support network monitoring requirements in the design of the evolving DMS network implementation, functional and performance requirements for a real-time, multiprogramming, multiprocessor operating system, and the possible use of advanced development techniques such as expert systems and artificial intelligence tools in the DMS design.

  14. Conference on Environmental Data Management

    CERN Document Server

    Oppenheimer, Dorothy; Brogden, William; Environmental Data Management

    1976-01-01

    Throughout the world, a staggering amount of resources has been used to obtain billions of environmental data points. Some, such as meteorological data, have been organized for weather map display where many thousands of data points are synthesized in one compressed map. Most environmental data, however, are still widely scattered and generally not used for a systems approach, but only for the purpose for which they were originally taken. These data are contained in relatively small computer programs, research files, government and industrial reports, etc. This Conference was called to bring together some of the world's leaders from research centers and government agencies, and others concerned with environmental data management. The purpose of the Conference was to organize discussion on the scope of world environmental data, its present form and documentation, and whether a systematic approach to a total system is feasible now or in the future. This same subject indirectly permeated the Stockholm Conference...

  15. Meeting the Capstone Challenge in Postgraduate Food Science Education

    Science.gov (United States)

    McSweeney, Peter; Calvo, Joaquin; Santhanam-Martin, Michael; Billman-Jacobe, Helen

    2017-01-01

    Project work and work placements can help prepare tertiary food science students for the workplace. Programs in the curriculum should support the development of transferable skills such as communication, problem-solving, and planning. This paper describes a case study of a new capstone project for Masters of Food Science students based on a work…

  16. Recent Challenges Facing US Government Climate Science Access and Application

    Science.gov (United States)

    Goldman, G. T.; Carter, J. M.; Licker, R.

    2017-12-01

    Climate scientists have long faced politicization of their work, especially those working within the US federal government. However, political interference in federal government climate change science has escalated in the current political era with efforts by political actors to undermine and disrupt infrastructure supporting climate science. This has included funding changes, decreased access to climate science information on federal agency websites, restrictions on media access to scientific experts within the government, and rolling back of science-based policies designed to incorporate and respond to climate science findings. What are the impacts of such changes for both the climate science community and the broader public? What can be done to ensure that access to and application of climate change-related research to policy decisions continues? We will summarize and analyze the state of climate change research and application in the US government. The impacts of political interference in climate change science as well as opportunities the scientific community has to support climate science in the US government, will be discussed.

  17. Customization of Curriculum Materials in Science: Motives, Challenges, and Opportunities

    Science.gov (United States)

    Romine, William L.; Banerjee, Tanvi

    2012-01-01

    Exemplary science instructors use inquiry to tailor content to students' learning needs; traditional textbooks treat science as a set of facts and a rigid curriculum. Publishers now allow instructors to compile pieces of published and/or self-authored text to make custom textbooks. This brings numerous advantages, including the ability to produce…

  18. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  19. Profitability and optimization of data management

    Energy Technology Data Exchange (ETDEWEB)

    Boussa, M. [Sonatrach, Alger (Algeria). Petroleum Engineering and Development]

    2008-07-01

    Information systems and technologies for the oil and gas industry were discussed with particular reference to the use of data analysis in dynamic planning processes. This paper outlined the risks and challenges associated with reorganizing data systems and the costs associated with equipment and software purchases. Issues related to Intranet encryption and electronic commerce systems were also reviewed along with the impact of the Internet on the oil and gas industry. New methods for using real time data systems for updating well data were outlined together with recent developments in Intranet and Extranet technologies and services. Other topics of discussion included new software applications for network optimization and nodal analyses; industry-specific software developed for well testing and reservoir engineering; and simulation and management production software. Data management solutions for storing, retrieving and analyzing data streams were presented. It was concluded that successful organizations must develop accurate data systems in order to ensure continuing success. 4 refs., 8 figs.

  20. Data management on the spatial web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2012-01-01

    Due in part to the increasing mobile use of the web and the proliferation of geo-positioning, the web is fast acquiring a significant spatial aspect. Content and users are being augmented with locations that are used increasingly by location-based services. Studies suggest that each week, several billion web queries are issued that have local intent and target spatial web objects. These are points of interest with a web presence, and they thus have locations as well as textual descriptions. This development has given prominence to spatial web data management, an area ripe with new and exciting opportunities and challenges. The research community has embarked on inventing and supporting new query functionality for the spatial web. Different kinds of spatial web queries return objects that are near a location argument and are relevant to a text argument. To support such queries, it is important...
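
    As a toy illustration of the query class just described (objects near a location argument and relevant to a text argument), the scoring below blends normalised proximity with keyword overlap; it is a generic sketch, not one of the methods covered in the talk:

    import math

    def spatial_keyword_rank(objects, query_location, query_terms, alpha=0.5, k=3):
        """Rank spatial web objects by a weighted blend of proximity and text relevance.

        objects        -- list of dicts with "name", "location" (x, y) and "text"
        query_location -- (x, y) tuple
        query_terms    -- set of lowercase keywords
        alpha          -- weight of the spatial component (1 - alpha weights the text)
        """
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])   # toy planar distance

        max_dist = max(dist(o["location"], query_location) for o in objects) or 1.0
        scored = []
        for o in objects:
            spatial = 1 - dist(o["location"], query_location) / max_dist
            textual = len(set(o["text"].lower().split()) & query_terms) / len(query_terms)
            scored.append((alpha * spatial + (1 - alpha) * textual, o["name"]))
        return sorted(scored, reverse=True)[:k]

    pois = [
        {"name": "Cafe Nord",     "location": (57.05, 9.92), "text": "coffee cake wifi"},
        {"name": "Harbour Grill", "location": (57.06, 9.95), "text": "steak seafood"},
        {"name": "Book Cafe",     "location": (57.10, 9.80), "text": "books coffee quiet"},
    ]
    print(spatial_keyword_rank(pois, (57.05, 9.93), {"coffee", "wifi"}))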

  1. Quantum Opportunities and Challenges for Fundamental Sciences in Space

    Science.gov (United States)

    Yu, Nan

    2012-01-01

    Space platforms offer a unique environment for, and unique measurements of, the quantum world and fundamental physics. Quantum technology and quantum measurements enhance measurement capabilities in space and result in greater science returns.

  2. Challenges of medical and biological engineering and science

    Energy Technology Data Exchange (ETDEWEB)

    Magjarevic, R [University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb (Croatia)]

    2004-07-01

    All aspects of biomedical engineering and science - research and development, education and training, implementation in health care systems, internationalisation and globalisation, and other new issues - are present in the strategy and action plans of the International Federation for Medical and Biological Engineering (IFMBE), which, with the help of a large number of highly motivated volunteers, will stay in a leading position in biomedical engineering and science.

  3. Challenges of medical and biological engineering and science

    International Nuclear Information System (INIS)

    Magjarevic, R.

    2004-01-01

    All aspects of biomedical engineering and science - research and development, education and training, implementation in health care systems, internationalisation and globalisation, and other new issues - are present in the strategy and action plans of the International Federation for Medical and Biological Engineering (IFMBE), which, with the help of a large number of highly motivated volunteers, will stay in a leading position in biomedical engineering and science.

  4. Sample and data management process description

    International Nuclear Information System (INIS)

    Kessner, J.H.

    2000-01-01

    The sample and data management process was initiated in 1994 as a result of a process improvement workshop. The purpose of the workshop was to develop a sample and data management process that would reduce cycle time and costs, simplify systems and procedures, and improve customer satisfaction for sampling, analytical services, and data management activities

  5. International safeguards data management system

    International Nuclear Information System (INIS)

    Argentesi, F.; Costantini, L.; Franklin, M.; Dondi, M.G.

    1981-01-01

    The data base management system ''ISADAM'' (i.e. International Safeguards Data Management System) described in this report is intended to facilitate the safeguards authority in making efficient and effective use of accounting reports. ISADAM has been developed using the ADABAS data base management system and is implemented on the JRC-Ispra computer. The evaluation of safeguards declarations focuses on three main objectives: - the requirement of syntactical consistency with the legal conventions of data recording for safeguards accountancy; - the requirement of accounting evidence that there is no material unaccounted for (MUF); - the requirement of semantic consistency with the technological characteristics of the plant and the processing plans of the operator. Section 2 describes in more detail the facilities which ISADAM makes available to a safeguards inspector. Section 3 describes how the MUF variance computation is derived from models of measurement error propagation. Many features of the ISADAM system are automatically provided by ADABAS. The exceptions to this are the utility software designed to: - screen plant declarations before loading into the data base, - prepare variance summary files designed to support real-time computation of MUF and variance of MUF, - provide analyses in response to user requests in interactive or batch mode. Section 4 describes the structure and functions of this software which have been developed by JRC-Ispra
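
    The abstract refers to MUF and its variance being derived from measurement-error propagation; the standard material-balance arithmetic behind such a computation (a generic sketch with example numbers, not ISADAM's implementation) is:

    import math

    def material_balance(beginning_inventory, receipts, shipments, ending_inventory):
        """MUF = (beginning inventory + receipts) - (shipments + ending inventory)."""
        return (beginning_inventory + receipts) - (shipments + ending_inventory)

    def muf_sigma(component_variances):
        """Propagate independent measurement-error variances into sigma(MUF).

        Assuming independent errors, Var(MUF) is the sum of the component variances.
        """
        return math.sqrt(sum(component_variances))

    muf = material_balance(beginning_inventory=1000.0, receipts=250.0,
                           shipments=240.0, ending_inventory=1005.0)
    sigma = muf_sigma([4.0, 1.0, 1.0, 4.0])        # example variances in kg^2
    print(f"MUF = {muf:.1f} kg, sigma(MUF) = {sigma:.2f} kg")
    # A common screening rule flags a balance period when |MUF| exceeds 3 * sigma(MUF).
    print("flagged" if abs(muf) > 3 * sigma else "not flagged")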

  6. NSF-Sponsored Biological and Chemical Oceanography Data Management Office

    Science.gov (United States)

    Allison, M. D.; Chandler, C. L.; Copley, N.; Galvarino, C.; Gegg, S. R.; Glover, D. M.; Groman, R. C.; Wiebe, P. H.; Work, T. T.; Biological and Chemical Oceanography Data Management Office

    2010-12-01

    Ocean biogeochemistry and marine ecosystem research projects are inherently interdisciplinary and benefit from improved access to well-documented data. Improved data sharing practices are important to the continued exploration of research themes that are a central focus of the ocean science community and are essential to interdisciplinary and international collaborations that address complex, global research themes. In 2006, the National Science Foundation Division of Ocean Sciences (NSF OCE) funded the Biological and Chemical Oceanography Data Management Office (BCO-DMO) to serve the data management requirements of scientific investigators funded by the National Science Foundation’s Biological and Chemical Oceanography Sections. BCO-DMO staff members work with investigators to manage marine biogeochemical, ecological, and oceanographic data and information developed in the course of scientific research. These valuable data sets are documented, stored, disseminated, and protected over short and intermediate time frames. One of the goals of the BCO-DMO is to facilitate regional, national, and international data and information exchange through improved data discovery, access, display, downloading, and interoperability. In May 2010, NSF released a statement to the effect that in October 2010, it is planning to require that all proposals include a data management plan in the form of a two-page supplementary document. The data management plan would be an element of the merit review process. NSF has long been committed to making data from NSF-funded research publicly available and the new policy will strengthen this commitment. BCO-DMO is poised to assist in creating the data management plans and in ultimately serving the data and information resulting from NSF OCE funded research. We will present an overview of the data management system capabilities including: geospatial and text-based data discovery and access systems; recent enhancements to data search tools; data

  7. [When simple meets false: challenges to science journalism].

    Science.gov (United States)

    Haaf, Günter

    2012-01-01

    Science journalists working for public media are caught between the two poles of factual correctness ("Thou shalt not harm") and entertaining presentation ("Thou shalt not bore"). Writing about (in most cases) complex topics, they need to stand their ground against the mass media, the consumption of which is--in contrast to science and technology media--inherently voluntary. Within the general framework of the mass media, science journalism has emerged from a "late department" to become an important, but by no means leading, part of the press arena. The trend is moving away from interpreting towards critically accompanying science. Due to the strong support for high-quality science journalism that major foundations provided during the past thirty years, the numbers of better trained scientific journalists operating in Germany have considerably increased, but so have the requirements: higher levels of work stress, a higher demand for real-time information (particularly from online media), and the risk of economic and other organisations taking control over information by intensifying their public relations campaigns. Copyright © 2012. Published by Elsevier GmbH.

  8. Nuclear Science Capacity Building in Kenya: Challenges and Opportunities

    International Nuclear Information System (INIS)

    Mangala, J. M.

    2017-01-01

    Kenya's significant involvement in Nuclear Science and Technology can be traced back to 1965, when the country became a member state of the International Atomic Energy Agency (IAEA). In 1978, Kenya formulated a project for the establishment of the ''Nuclear Science Laboratory'' at the University of Nairobi, which soon after received assistance from the International Atomic Energy Agency. The laboratory, expected to be a base for the promotion of nuclear science technologies in the country, was started in 1979 and has since developed into a fully-fledged institute of the University of Nairobi. In general, six main areas of nuclear science applications have continued to receive IAEA assistance during the past ten years: agriculture and soil management (30%), livestock production, introduction to nuclear power production (21%), and radiation oncology in cancer management and nuclear medicine (16%). Smaller shares went to nuclear safety (9%), nuclear engineering and technology (8%), industry and water resource management (7%), and nuclear physics and chemistry (5%). At present, the Agency is supporting several technical co-operation projects, four of which are in agriculture and two in nuclear physics and chemistry, with additional assistance in the areas of manpower development, nuclear medicine, non-destructive testing techniques and radioactive waste management. Thus, through Government initiatives, and with the assistance of the IAEA, quite a number of specialist national laboratories for nuclear science application have emerged

  9. Frameworks Coordinate Scientific Data Management

    Science.gov (United States)

    2012-01-01

    Jet Propulsion Laboratory computer scientists developed a unique software framework to help NASA manage its massive amounts of science data. Through a partnership with the Apache Software Foundation of Forest Hill, Maryland, the technology is now available as an open-source solution and is in use by cancer researchers and pediatric hospitals.

  10. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  11. Major Challenges for the Modern Chemistry in Particular and Science in General.

    Science.gov (United States)

    Uskokovíc, Vuk

    2010-11-01

    In the past few hundred years, science has exerted an enormous influence on the way the world appears to human observers. Despite phenomenal accomplishments of science, science nowadays faces numerous challenges that threaten its continued success. As scientific inventions become embedded within human societies, the challenges are further multiplied. In this critical review, some of the critical challenges for the field of modern chemistry are discussed, including: (a) interlinking theoretical knowledge and experimental approaches; (b) implementing the principles of sustainability at the roots of the chemical design; (c) defining science from a philosophical perspective that acknowledges both pragmatic and realistic aspects thereof; (d) instigating interdisciplinary research; (e) learning to recognize and appreciate the aesthetic aspects of scientific knowledge and methodology, and promote truly inspiring education in chemistry. In the conclusion, I recapitulate that the evolution of human knowledge inherently depends upon our ability to adopt creative problem-solving attitudes, and that challenges will always be present within the scope of scientific interests.

  12. Science, technology and the 'grand challenge' of aging

    DEFF Research Database (Denmark)

    Jæger, Birgit; Peine, Alexander; Moors, Ellen

    2015-01-01

    In this paper, we introduce the themes addressed and the approaches used in this special issue. We start by briefly discussing the state of the art in research and policy making related to science, technology and ageing. We argue that an important gap characterizes this state of the art: current approaches do not consider material practice and materiality to be an inherent part of later life as constituted in contemporary societies. Science and Technology Studies (STS) provide both the theories and methods to address this gap, and thus deploy a theoretical and empirical understanding of science, technology and ageing that captures how later life co-evolves with the practices of technology use and design. We briefly discuss how the articles in the collection each contribute to such an understanding across various locations. We conclude that, together, the contributions specify a perspective

  13. Lydia Becker's "School for Science": a challenge to domesticity.

    Science.gov (United States)

    Parker, J E

    2001-01-01

    Lydia Becker (1827-1890) is known as a leader of the Women's Suffrage Movement but little is known about her work to include women and girls in science. Before her energy was channelled into politics, she aimed to have a scientific career. Mid-Victorian Britain was a period in which women's intellect and potential were widely debated, and in which the dominant ideology was that their primary role in life was that of wife and mother. Science was widely regarded as a "masculine" subject which women were deliberately discouraged from studying. The author concentrates on the two main areas in which important contributions were made, the British Association for the Advancement of Science, and the Manchester School Board.

  14. Psychoanalysis as a science: a response to the new challenges.

    Science.gov (United States)

    Wallerstein, R S

    1986-07-01

    Few theoretical issues in psychoanalysis have been more constantly argued than the status of our discipline as a science. For long the attack has been from the logical positivists and the extensions of their argument by Karl Popper. Over recent decades the debate about the place of our metapsychology has intensified the concerns about our scientific status. In this paper I respond briefly to the logical positivist, the Popperian, and the information-processing systems theory arguments and then develop at greater length a response to the two current, most widespread philosophy-of-science assaults upon our credibility as science, that of the hermeneuticists (Ricoeur, Habermas, Gadamer, and others), and the newest, that of the philosopher, Adolf Grünbaum.

  15. Design and Data Management System

    Science.gov (United States)

    Messer, Elizabeth; Messer, Brad; Carter, Judy; Singletary, Todd; Albasini, Colby; Smith, Tammy

    2007-01-01

    The Design and Data Management System (DDMS) was developed to automate the NASA Engineering Order (EO) and Engineering Change Request (ECR) processes at the Propulsion Test Facilities at Stennis Space Center for efficient and effective Configuration Management (CM). Prior to the development of DDMS, the CM system was a manual, paper-based system that required an EO or ECR submitter to walk the changes through the acceptance process to obtain necessary approval signatures. This approval process could take up to two weeks, and was subject to a variety of human errors. The process also required that the CM office make copies and distribute them to the Configuration Control Board members for review prior to meetings. At any point, there was a potential for an error or loss of the change records, meaning the configuration of record could be inaccurate. The new Web-based DDMS eliminates unnecessary copies, reduces the time needed to distribute the paperwork, reduces time to gain the necessary signatures, and prevents the variety of errors inherent in the previous manual system. After implementation of the DDMS, all EOs and ECRs can be automatically checked prior to submittal to ensure that the documentation is complete and accurate. Much of the configuration information can be documented in the DDMS through pull-down forms to ensure consistent entries by the engineers and technicians in the field. The software can also electronically route the documents through the signature process to obtain the necessary approvals needed for work authorization. The workflow of the system allows for backups and timestamps that determine the correct routing and completion of all required authorizations in a more timely manner, as well as assuring the quality and accuracy of the configuration documents.

  16. Disciplinary differences in faculty research data management practices and perspectives

    Directory of Open Access Journals (Sweden)

    Katherine G. Akers

    2013-11-01

    Full Text Available Academic librarians are increasingly engaging in data curation by providing infrastructure (e.g., institutional repositories) and offering services (e.g., data management plan consultations) to support the management of research data on their campuses. Efforts to develop these resources may benefit from a greater understanding of disciplinary differences in research data management needs. After conducting a survey of data management practices and perspectives at our research university, we categorized faculty members into four research domains—arts and humanities, social sciences, medical sciences, and basic sciences—and analyzed variations in their patterns of survey responses. We found statistically significant differences among the four research domains for nearly every survey item, revealing important disciplinary distinctions in data management actions, attitudes, and interest in support services. Serious consideration of both the similarities and dissimilarities among disciplines will help guide academic librarians and other data curation professionals in developing a range of data-management services that can be tailored to the unique needs of different scholarly researchers.

  17. Curriculum challenges faced by rural-origin health science students ...

    African Journals Online (AJOL)

    This article is one of a series of investigations into various aspects of university life and career choices of health science students. Data were collected at three South African universities by the Collaboration for Health Equity through Education and Research (CHEER) collaborators. Ethical permission was sought from each ...

  18. Integrating Social Science and Ecosystem Management: A National Challenge

    Science.gov (United States)

    Cordell, H. Ken; Caldwell, Linda

    1995-01-01

    These proceedings contain the contributed papers and panel presentations, as well as a paper presented at the National Workshop, of the Conference on Integrating Social Sciences and Ecosystem Management, which was held at Unicoi Lodge and Conference Center, Helen, GA, December 12-14, 1995. The overall purpose of this Conference was to improve understanding, integration...

  19. Deliberations on the Life Science: Pitfalls, Challenges and Solutions

    NARCIS (Netherlands)

    Korthals, M.J.J.A.A.

    2011-01-01

    In this article I sketch several versions of the deliberative approach and then discuss five problems which confront a deliberative ethicist of contemporary problems of the life sciences, in particular about food, nature and agriculture. I begin by discussing problems of unequal participation in

  20. Is it possible to give scientific solutions to Grand Challenges? On the idea of grand challenges for life science research.

    Science.gov (United States)

    Efstathiou, Sophia

    2016-04-01

    This paper argues that challenges that are grand in scope such as "lifelong health and wellbeing", "climate action", or "food security" cannot be addressed through scientific research only. Indeed scientific research could inhibit addressing such challenges if scientific analysis constrains the multiple possible understandings of these challenges into already available scientific categories and concepts without translating between these and everyday concerns. This argument builds on work in philosophy of science and race to postulate a process through which non-scientific notions become part of science. My aim is to make this process available to scrutiny: what I call founding everyday ideas in science is both culturally and epistemologically conditioned. Founding transforms a common idea into one or more scientifically relevant ones, which can be articulated into descriptively thicker and evaluatively deflated terms and enable operationalisation and measurement. The risk of founding however is that it can invisibilise or exclude from realms of scientific scrutiny interpretations that are deemed irrelevant, uninteresting or nonsensical in the domain in question-but which may remain salient for addressing grand-in-scope challenges. The paper considers concepts of "wellbeing" in development economics versus in gerontology to illustrate this process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Benefits and challenges of incorporating citizen science into university education.

    Science.gov (United States)

    Mitchell, Nicola; Triska, Maggie; Liberatore, Andrea; Ashcroft, Linden; Weatherill, Richard; Longnecker, Nancy

    2017-01-01

    A common feature of many citizen science projects is the collection of data by unpaid contributors with the expectation that the data will be used in research. Here we report a teaching strategy that combined citizen science with inquiry-based learning to offer first year university students an authentic research experience. A six-year partnership with the Australian phenology citizen science program ClimateWatch has enabled biology students from the University of Western Australia to contribute phenological data on plants and animals, and to conduct the first research on unvalidated species datasets contributed by public and university participants. Students wrote scientific articles on their findings, peer-reviewed each other's work and the best articles were published online in a student journal. Surveys of more than 1500 students showed that their environmental engagement increased significantly after participating in data collection and data analysis. However, only 31% of students agreed with the statement that "data collected by citizen scientists are reliable" at the end of the project, whereas the rate of agreement was initially 79%. This change in perception was likely due to students discovering erroneous records when they mapped data points and analysed submitted photographs. A positive consequence was that students subsequently reported being more careful to avoid errors in their own data collection, and making greater efforts to contribute records that were useful for future scientific research. Evaluation of our project has shown that by embedding a research process within citizen science participation, university students are given cause to improve their contributions to environmental datasets. If true for citizen scientists in general, enabling participants as well as scientists to analyse data could enhance data quality, and so address a key constraint of broad-scale citizen science programs.

  2. Three challenges to the complementarity of the logic and the pragmatics of science.

    Science.gov (United States)

    Uebel, Thomas

    2015-10-01

    The bipartite metatheory thesis attributes to Rudolf Carnap, Philipp Frank and Otto Neurath a conception of the nature of post-metaphysical philosophy of science that sees the purely formal-logical analyses of the logic of science as complemented by empirical inquiries into the psychology, sociology and history of science. Three challenges to this thesis are considered in this paper: that Carnap did not share this conception of the nature of philosophy of science even on a programmatic level, that Carnap's detailed analysis of the language of science is incompatible with one developed by Neurath for the pursuit of empirical studies of science, and, finally, that Neurath himself was confused about the programme of which the bipartite metatheory thesis makes him a representative. I argue that all three challenges can be met and refuted. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Grand challenge commentary: Transforming biosynthesis into an information science.

    Science.gov (United States)

    Bayer, Travis S

    2010-12-01

    Engineering biosynthetic pathways to natural products is a challenging endeavor that promises to provide new therapeutics and tools to manipulate biology. Information-guided design strategies and tools could unlock the creativity of a wide spectrum of scientists and engineers by decoupling expertise from implementation.

  4. Global networks for invasion science: benefits, challenges and guidelines

    DEFF Research Database (Denmark)

    Packer, Jasmin G.; Meyerson, Laura A.; Richardson, David M.

    2017-01-01

    Much has been done to address the challenges of biological invasions, but fundamental questions (e.g., which species invade? Which habitats are invaded? How can invasions be effectively managed?) still need to be answered before the spread and impact of alien taxa can be effectively managed. Ques...

  5. Science Education and Challenges of Globalization in Igbo Nation

    Science.gov (United States)

    Ezeudu, F. O.; Nkokelonye, C. U.; Adigwe, J. C.

    2013-01-01

    This paper reviewed the scientific contents in Igbo culture. A description was made of the Igbo, who constitute an ethnic group occupying southeastern Nigeria. It x-rayed the pre-colonial, colonial, and post-colonial culture of the Igbo people and identified the scientific cultural activities, which can be harnessed to meet the challenges of modern day…

  6. Science, Sport and Technology--A Contribution to Educational Challenges

    Science.gov (United States)

    O'Hara, Kelly; Reis, Paula; Esteves, Dulce; Bras, Rui; Branco, Luisa

    2011-01-01

    Improving students' ability to link knowledge with real-life practice, by enhancing children's and teenagers' ability to think critically through making observations, posing questions, drawing up hypotheses, planning and carrying out investigations, and analysing data, and thereby improving their decision making, is an educational challenge. Learning…

  7. Challenges associated with Learning Oral Diagnostic Sciences: A ...

    African Journals Online (AJOL)

    Several barriers that may impede effective clinical teaching include inadequate institutional financial support and lack of access to appropriate educational space and resources. The aim of this study was to categorize challenges of learning ODS in Nigeria. Methods: This was a cross sectional survey of undergraduate ...

  8. Challenges and Opportunities for Education about Dual Use Issues in the Life Sciences

    Science.gov (United States)

    National Academies Press, 2011

    2011-01-01

    The Challenges and Opportunities for Education About Dual Use Issues in the Life Sciences workshop was held to engage the life sciences community on the particular security issues related to research with dual use potential. More than 60 participants from almost 30 countries took part and included practicing life scientists, bioethics and…

  9. Environmental Remediation Data Management Tools

    International Nuclear Information System (INIS)

    Wierowski, J. V.; Henry, L. G.; Dooley, D. A.

    2002-01-01

    Computer software tools for data management can improve site characterization, planning and execution of remediation projects. This paper discusses the use of two such products that have primarily been used within the nuclear power industry to enhance the capabilities of radiation protection department operations. Advances in digital imaging, web application development and programming technologies have made development of these tools possible. The Interactive Visual Tour System (IVTS) allows the user to easily create and maintain a comprehensive catalog containing digital pictures of the remediation site. Pictures can be cataloged in groups (termed "tours") that can be organized either chronologically or spatially. Spatial organization enables the user to "walk around" the site and view desired areas or components instantly. Each photo is linked to a map (floor plan, topographical map, elevation drawing, etc.) with graphics displaying the location on the map and any available tour/component links. Chronological organization enables the user to view the physical results of the remediation efforts over time. Local and remote management teams can view these pictures at any time and from any location. The Visual Survey Data System (VSDS) allows users to record survey and sample data directly on photos and/or maps of areas and/or components. As survey information is collected for each area, survey data trends can be reviewed for any repetitively measured location or component. All data is stored in a Quality Assurance (Q/A) records database with reference to its physical sampling point on the site as well as other information to support the final closeout report for the site. The ease of use of these web-based products has allowed nuclear power plant clients to plan outage work from their desktop and realize significant savings with respect to dose and cost. These same tools are invaluable for remediation and decommissioning planning of any scale and for recording
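    The record above describes a catalog-style data model: photos grouped into "tours", pinned to maps, and survey measurements tied to repeatedly sampled components. The following minimal sketch illustrates that kind of structure in Python; the class names, fields and sample values are hypothetical and are not the IVTS/VSDS schema.

```python
# Minimal sketch of a tour/photo catalog like the one the record describes.
# All class and field names are hypothetical illustrations, not the IVTS/VSDS API.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Photo:
    path: str                 # digital picture of the site
    taken_on: date
    map_id: str               # floor plan / topographic map the photo is pinned to
    map_x: float              # pin location on that map
    map_y: float
    component: str            # plant component or area shown

@dataclass
class Tour:
    name: str
    organization: str         # "spatial" or "chronological"
    photos: List[Photo] = field(default_factory=list)

@dataclass
class SurveyRecord:
    component: str
    taken_on: date
    dose_rate_uSv_h: float    # example measurement tied to a sampling point

def trend(records: List[SurveyRecord], component: str) -> List[float]:
    """Return the repeated measurements for one component in time order."""
    hits = sorted((r for r in records if r.component == component),
                  key=lambda r: r.taken_on)
    return [r.dose_rate_uSv_h for r in hits]

if __name__ == "__main__":
    tour = Tour("Reactor building walk-down", "spatial",
                [Photo("rx_101.jpg", date(2001, 5, 1), "floor-plan-1", 12.5, 40.0, "HX-101")])
    surveys = [
        SurveyRecord("HX-101", date(2001, 5, 1), 12.0),
        SurveyRecord("HX-101", date(2001, 8, 1), 9.5),
        SurveyRecord("HX-101", date(2001, 11, 1), 4.2),
    ]
    print(len(tour.photos), trend(surveys, "HX-101"))   # -> 1 [12.0, 9.5, 4.2]
```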

  10. Academic language and the challenge of reading for learning about science.

    Science.gov (United States)

    Snow, Catherine E

    2010-04-23

    A major challenge to students learning science is the academic language in which science is written. Academic language is designed to be concise, precise, and authoritative. To achieve these goals, it uses sophisticated words and complex grammatical constructions that can disrupt reading comprehension and block learning. Students need help in learning academic vocabulary and how to process academic language if they are to become independent learners of science.

  11. The challenges associated with developing science-based landscape scale management plans.

    Science.gov (United States)

    Robert C. Szaro; Douglas A. Boyce Jr.; Thomas Puchlerz

    2005-01-01

    Planning activities over large landscapes poses a complex set of challenges when trying to balance the implementation of a conservation strategy while still allowing for a variety of consumptive and nonconsumptive uses. We examine a case in southeast Alaska to illustrate the breadth of these challenges and an approach to developing a science-based resource plan. Not only...

  12. Challenges and Concerns for Library and Information Science (LIS) Education in India and South Asia

    Science.gov (United States)

    Kaur, Trishanjit

    2015-01-01

    This paper presents some of the challenges and concerns for library and information science (LIS) education in India. In order to provide context for these challenges, the paper begins with a brief overview of higher education in India in general and then discusses the beginning of LIS education. It briefly summarizes LIS education in South Asia…

  13. Integrated Data Management Processes Expedite Common Data Management Tasks in Autism Research

    OpenAIRE

    Farach, Frank; Sinanis, Naralys; Hawthorne, Julie; Agnew, Henry; Schantz, Tricia; Jensen, Bill; Rozenblit, Leon

    2013-01-01

    We compare the efficiency of (1) just-in-time data management, in which data are cleaned prior to each analysis, and (2) integrated data management, in which data are centralized, cleaned up front, and made available via a query interface. Integrated data management was associated with faster completion of data management requests.
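    The comparison above is between cleaning data just before each analysis and cleaning once on ingest behind a query interface. The sketch below illustrates the integrated pattern with an in-memory SQLite store; the table, column names and cleaning rule are hypothetical, not the system used in the study.

```python
# Sketch of the "integrated" pattern the record compares against just-in-time
# cleaning: data are cleaned once on ingest and served through a query interface.
# Table and column names are hypothetical, not the system described in the paper.
import sqlite3

RAW_ROWS = [
    ("p001", " 72 "),   # messy strings as they might arrive from data entry
    ("p002", "68"),
    ("p003", "n/a"),
]

def clean(value: str):
    """Up-front cleaning applied once, at ingest time."""
    value = value.strip()
    return int(value) if value.isdigit() else None

def build_store() -> sqlite3.Connection:
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE scores (participant_id TEXT, score INTEGER)")
    con.executemany("INSERT INTO scores VALUES (?, ?)",
                    [(pid, clean(raw)) for pid, raw in RAW_ROWS])
    return con

def query_scores(con: sqlite3.Connection, minimum: int):
    """Query interface used by every subsequent analysis; no re-cleaning needed."""
    cur = con.execute("SELECT participant_id, score FROM scores "
                      "WHERE score IS NOT NULL AND score >= ?", (minimum,))
    return cur.fetchall()

if __name__ == "__main__":
    con = build_store()
    print(query_scores(con, 70))   # -> [('p001', 72)]
```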

  14. Smartphone measurement engineering - Innovative challenges for science & education, instrumentation & training

    Science.gov (United States)

    Hofmann, D.; Dittrich, P.-G.; Duentsch, E.

    2010-07-01

    Smartphones have an enormous conceptual and structural influence on measurement science & education, instrumentation & training. Smartphones have matured: they have become convenient, reliable and affordable. In 2009, 174 million smartphones were delivered worldwide. Measurement with smartphones is ready for the future. In only 10 years the German vision industry tripled its global sales volume to one billion euros per year. Machine vision is used for mobile object identification, contactless industrial quality control, personalized health care, remote facility and transport management, safety-critical surveillance and all tasks which are too complex for the human eye or too monotonous for the human brain. The aim of the paper is to describe selected success stories in the application of smartphones to measurement engineering in science and education, instrumentation and training.

  15. Longitudinal Research in Social Science: Some Theoretical Challenges

    Directory of Open Access Journals (Sweden)

    Thomas K. Burch

    2001-12-01

    Every advance carries with it potential problems, and longitudinal analysis is no exception. This paper focuses on the problems related to the massive amounts of data generated by longitudinal surveys. It is argued that a proliferation of data may be to the good, but it will not necessarily lead to better scientific knowledge. Most demographers think, in the logical positivist way, that theory arises out of empirical generalisations, yet massive empirical investigations have led only to disappointing theoretical outcomes in demography. This paper discusses one way out of this impasse: to adopt a different view of theory, a model-based view of science. Theoretical models based on empirical generalisation should become the main representational device in science.

  16. Challenges for Transitioning Science Research to Space Weather Applications

    Science.gov (United States)

    Spann, James

    2013-01-01

    Effectively transitioning science knowledge to useful applications relevant to space weather has become important. The effort to transition scientific knowledge to a useful application is neither research nor operations, but an activity that connects the two. Successful transitioning must be an intentional effort with a clear goal and a measurable outcome. This talk will present methodologies that have been demonstrated to be effective, and how, in the current environment, they can be applied to space weather transition efforts.

  17. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects.

  18. Data management for community research projects: A JGOFS case study

    Science.gov (United States)

    Lowry, Roy K.

    1992-01-01

    Since the mid 1980s, much of the marine science research effort in the United Kingdom has been focused into large scale collaborative projects involving public sector laboratories and university departments, termed Community Research Projects. Two of these, the Biogeochemical Ocean Flux Study (BOFS) and the North Sea Project incorporated large scale data collection to underpin multidisciplinary modeling efforts. The challenge of providing project data sets to support the science was met by a small team within the British Oceanographic Data Centre (BODC) operating as a topical data center. The role of the data center was to both work up the data from the ship's sensors and to combine these data with sample measurements into online databases. The working up of the data was achieved by a unique symbiosis between data center staff and project scientists. The project management, programming and data processing skills of the data center were combined with the oceanographic experience of the project communities to develop a system which has produced quality controlled, calibrated data sets from 49 research cruises in 3.5 years of operation. The data center resources required to achieve this were modest and far outweighed by the time liberated in the scientific community by the removal of the data processing burden. Two online project databases have been assembled containing a very high proportion of the data collected. As these are under the control of BODC their long term availability as part of the UK national data archive is assured. The success of the topical data center model for UK Community Research Project data management has been founded upon the strong working relationships forged between the data center and project scientists. These can only be established by frequent personal contact and hence the relatively small size of the UK has been a critical factor. However, projects covering a larger, even international scale could be successfully supported by a
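    The "working up" described above pairs calibrated shipboard sensor readings with discrete sample measurements. The sketch below shows that pattern in miniature; the calibration coefficients, instrument and data values are invented for illustration and do not come from the BODC databases.

```python
# Sketch of the kind of "working up" described: applying a calibration to raw
# shipboard sensor readings and pairing them with discrete sample measurements.
# The calibration coefficients and data values are invented for illustration.
from datetime import datetime

# Raw underway fluorometer readings (time, raw_counts)
sensor = [
    (datetime(1990, 5, 1, 12, 0), 220),
    (datetime(1990, 5, 1, 13, 0), 310),
    (datetime(1990, 5, 1, 14, 0), 295),
]

# Discrete bottle samples analysed in the lab (time, chlorophyll mg/m3)
bottles = [(datetime(1990, 5, 1, 13, 5), 1.42)]

GAIN, OFFSET = 0.005, -0.10   # hypothetical calibration from lab comparisons

def calibrate(raw_counts: int) -> float:
    return GAIN * raw_counts + OFFSET

def nearest(t, series):
    """Find the calibrated sensor value closest in time to a bottle sample."""
    ts, raw = min(series, key=lambda rec: abs(rec[0] - t))
    return ts, calibrate(raw)

for t, chl in bottles:
    ts, est = nearest(t, sensor)
    print(f"bottle {chl:.2f} mg/m3 at {t:%H:%M} vs sensor {est:.2f} mg/m3 at {ts:%H:%M}")
```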

  19. Statistical analysis and data management

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship between variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine was found
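    As a worked illustration of the kind of correlation analysis summarized above, the snippet below computes a Pearson coefficient between toy precipitation and capture series; the numbers are invented and are not the WIPP data.

```python
# Worked illustration of testing for a relationship between precipitation and
# small mammal captures, in the spirit of the analysis the report summarizes.
# The numbers below are invented for demonstration only.
from math import sqrt

precip_mm = [5.0, 12.0, 3.0, 20.0, 8.0, 15.0]   # e.g. monthly precipitation
captures  = [12, 14, 13, 12, 15, 13]            # captures on the trapping grid

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Prints a small coefficient for these toy numbers; the actual report found no
# statistically significant relationship in the real data.
print(f"Pearson r = {pearson_r(precip_mm, captures):.2f}")
```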

  20. Data Science Meets the Clinician: Challenges and Future Directions.

    Science.gov (United States)

    Charitos, Efstratios I; Wilbring, Manuel; Treede, Hendrik

    2018-01-01

    In the last three decades a profound transformation of the medical profession has taken place. The modern clinician is required to consume vast amounts of information from clinical studies, critically reviewing evidence that may or may not lead to changes in clinical practice. The present article presents some challenges that this era of information poses to clinicians and patients. Georg Thieme Verlag KG Stuttgart · New York.

  1. Reflections on the challenges and possibilities of journal publication in science education

    Science.gov (United States)

    Milne, Catherine; Siry, Christina; Mueller, Michael

    2015-12-01

    In this editorial we reflect on the intersections between the review and publishing policies of Cultural Studies of Science Education (CSSE) and the challenges and possibilities in global science education publishing. In particular we discuss the tensions associated with open or closed review policies, the hegemony of English as a language of publication, and reflect on some of the common challenges experienced by editors and authors from different contexts. We draw on the paper set in this issue consisting of five papers focused on publishing in various contexts, and elaborate several central questions for the field of science education and the dissemination of knowledges.

  2. Teaching science with a multicultural agenda: The challenges and conflicts for preservice teachers

    Science.gov (United States)

    Yang, Kimberley

    This dissertation examines the challenges and conflicts that preservice teachers have when teaching science with a multicultural agenda. This study is based on the experience of three preservice teachers who participated in a one- or two-semester volunteer commitment teaching science to pre-kindergarten students at a homeless shelter in the South Bronx of New York City. Findings derived from in-depth interviews, observations, lesson planning and debriefing sessions, journals, questionnaires and extracurricular interaction of the researcher and participants indicate that preservice teachers were initially uncertain about the philosophy and actual practice of teaching science with a multicultural agenda. Their experience at the homeless shelter brings up issues of social class and family background as determinants of access and success in science education, multicultural science as excluded from the accepted science canon, and the value of practicing science education with a multicultural agenda. The philosophical framework for teaching science from a multicultural perspective is based on ideas that stem from feminist theories of valuing the lived social and educational experiences of children, and critical theory that examines the role of school and science as culture. The intention of multicultural science education is to create a science education that is inclusive for students regardless of cultural background. This includes students who have been traditionally marginalized from school science. In many instances, children from severely economically impoverished inner-city environments have been overlooked as science-able within school culture.

  3. Big Data: New science, new challenges, new dialogical opportunities

    OpenAIRE

    Fuller, Michael

    2015-01-01

    The advent of extremely large datasets, known as “big data”, has been heralded as the instantiation of a new science, requiring a new kind of practitioner: the “data scientist”. This paper explores the concept of big data, drawing attention to a number of new issues – not least ethical concerns, and questions surrounding interpretation – which big data sets present. It is observed that the skills required for data scientists are in some respects closer to those traditionally associated with t...

  4. The BRAzilian Seismographic Integrated Systems (BRASIS): infrastructure and data management

    Directory of Open Access Journals (Sweden)

    João Carlos Dourado

    2011-04-01

    In geophysics and seismology, raw data need to be processed to generate useful information that can be turned into knowledge by researchers. The number of sensors that are acquiring raw data is increasing rapidly. Without good data management systems, more time can be spent in querying and preparing datasets for analyses than in acquiring raw data. Also, a lot of good quality data acquired at great effort can be lost forever if they are not correctly stored. Local and international cooperation will probably be reduced, and a lot of data will never become scientific knowledge. For this reason, the Seismological Laboratory of the Institute of Astronomy, Geophysics and Atmospheric Sciences at the University of São Paulo (IAG-USP) has concentrated fully on its data management system. This report describes the efforts of the IAG-USP to set up a seismology data management system to facilitate local and international cooperation.

  5. Customization of Curriculum Materials in Science: Motives, Challenges, and Opportunities

    Science.gov (United States)

    Romine, William L.; Banerjee, Tanvi

    2012-02-01

    Exemplary science instructors use inquiry to tailor content to students' learning needs; traditional textbooks treat science as a set of facts and a rigid curriculum. Publishers now allow instructors to compile pieces of published and/or self-authored text to make custom textbooks. This brings numerous advantages, including the ability to produce smaller, cheaper texts and added flexibility in the teaching models used. Moreover, the internet allows instructors to decentralize textbooks through easy access to educational objects such as audiovisual simulations, individual textbook chapters, and scholarly research articles. However, these new opportunities bring with them new problems. With educational materials easy to access, manipulate and duplicate, it is necessary to define intellectual property boundaries, and the need to secure documents against unlawful copying and use is paramount. Engineers are developing and enhancing information embedding technologies, including steganography, cryptography, watermarking, and fingerprinting, to label and protect intellectual property. While these are showing their utility in securing information, hackers continue to find loopholes in these protection schemes, forcing engineers to constantly assess the algorithms to make them as secure as possible. As newer technologies arise, people still question whether custom publishing is desirable. Many instructors see the process as complex, costly, and substandard in comparison to using traditional text. Publishing companies are working to improve attitudes through advertising. What is lacking is peer-reviewed evidence showing that custom publishing improves learning. Studies exploring the effect of custom course materials on student attitude and learning outcomes are a necessary next step.

  6. E-Infrastructure and Data Management for Global Change Research

    Science.gov (United States)

    Allison, M. L.; Gurney, R. J.; Cesar, R.; Cossu, R.; Gemeinholzer, B.; Koike, T.; Mokrane, M.; Peters, D.; Nativi, S.; Samors, R.; Treloar, A.; Vilotte, J. P.; Visbeck, M.; Waldmann, H. C.

    2014-12-01

    The Belmont Forum, a coalition of science funding agencies from 15 countries, is supporting an 18-month effort to assess the state of international e-infrastructures and data management so that global change data and information can be more easily and efficiently exchanged internationally and across domains. Ultimately, this project aims to address the Belmont "Challenge" to deliver knowledge needed for action to avoid and adapt to detrimental environmental change, including extreme hazardous events. This effort emerged from conclusions by the Belmont Forum that transformative approaches and innovative technologies are needed for heterogeneous data/information to be integrated and made interoperable for researchers in disparate fields, and for myriad uses across international, institutional, disciplinary, spatial and temporal boundaries. The project will deliver a Community Strategy and Implementation Plan to prioritize international funding opportunities and long-term policy recommendations on how the Belmont Forum can implement a more coordinated, holistic, and sustainable approach to funding and supporting global change research. The Plan is expected to serve as the foundation of future Belmont Forum funding calls for proposals in support of research science goals as well as to establish long-term e-infrastructure. More than 120 scientists, technologists, legal experts, social scientists, and other experts are participating in six Work Packages to develop the Plan by spring 2015, under the broad rubrics of Architecture/Interoperability and Governance: Data Integration for Multidisciplinary Research; Improved Interface between Computation & Data Infrastructures; Harmonization of Global Data Infrastructure; Data Sharing; Open Data; and Capacity Building. Recommendations could lead to a more coordinated approach to policies, procedures and funding mechanisms to support e-infrastructures in a more sustainable way.

  7. Assessment of Data Management Services at New England Region Resource Libraries

    Directory of Open Access Journals (Sweden)

    Julie Goldman

    2015-07-01

    Objective: To understand how New England medical libraries are addressing scientific research data management and providing services to their communities. Setting: The National Network of Libraries of Medicine, New England Region (NN/LM NER) contains 17 Resource Libraries. The University of Massachusetts Medical School serves as the New England Regional Medical Library (RML). Sixteen of the NER Resource Libraries completed this survey. Methods: A 40-question online survey assessed libraries’ services and programs for providing research data management education and support. Libraries shared their current plans and institutional challenges associated with developing data services. Results: This study shows few NER Resource Libraries currently integrate scientific research data management into their services and programs, and highlights the region’s use of resources provided by the NN/LM NER RML at the University of Massachusetts Medical School. Conclusions: Understanding the types of data services being delivered at NER libraries helps to inform the NN/LM NER about the eScience learning needs of New England medical librarians and helps in the planning of professional development programs that foster effective biomedical research data services.

  8. 7 CFR 275.15 - Data management.

    Science.gov (United States)

    2010-01-01

    7 CFR 275.15 (Title 7, Agriculture, 2010 edition), Data management. (a) Analysis. Analysis is the process of classifying data, such as by areas of... management information sources available to: (1) Identify all deficiencies in program operations and systems...

  9. Data management in WLCG and EGEE

    CERN Document Server

    Donno, Flavia; CERN. Geneva. IT Department

    2008-01-01

    This work is a contribution to a book on Scientific Data Management by CRC Press/Taylor and Francis Books. Data Management and Storage Access experience in WLCG is described together with the major use cases. Furthermore, some considerations about the EGEE requirements are also reported.

  10. GESDATA: A failure-data management code

    International Nuclear Information System (INIS)

    Garcia Gay, J.; Francia Gonzalez, L.; Ortega Prieto, P.; Mira McWilliams, J.; Aguinaga Zapata, M.

    1987-01-01

    GESDATA is a failure data management code for both qualitative and quantitative fault-tree evaluation. Data management using the code should provide the analyst, in the quickest and easiest way, with the reliability data which constitute the input values for fault-tree evaluation programs. (orig./HSCH)
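    As a minimal illustration of what quantitative fault-tree evaluation does with such reliability data, the sketch below combines hypothetical basic-event probabilities through AND/OR gates under an independence assumption; it is not the GESDATA code itself, and the components, probabilities and tree structure are invented.

```python
# Minimal sketch of quantitative fault-tree evaluation fed by a failure-data
# table, the kind of input a code like GESDATA is described as supplying.
# Component names, probabilities and the tree structure are invented.
failure_prob = {          # basic-event failure probabilities (per demand)
    "pump_A_fails": 1e-3,
    "pump_B_fails": 1e-3,
    "valve_stuck":  5e-4,
}

def p_or(*ps):
    """P(at least one event), assuming independent basic events."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """P(all events), assuming independent basic events."""
    q = 1.0
    for p in ps:
        q *= p
    return q

# Top event: "no cooling flow" = (pump A fails AND pump B fails) OR valve stuck.
no_pumps = p_and(failure_prob["pump_A_fails"], failure_prob["pump_B_fails"])
top_event = p_or(no_pumps, failure_prob["valve_stuck"])
print(f"P(top event) = {top_event:.2e}")   # about 5.0e-04 for these numbers
```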

  11. Educational challenges of molecular life science: Characteristics and implications for education and research.

    Science.gov (United States)

    Tibell, Lena A E; Rundgren, Carl-Johan

    2010-01-01

    Molecular life science is one of the fastest-growing fields of scientific and technical innovation, and biotechnology has profound effects on many aspects of daily life-often with deep, ethical dimensions. At the same time, the content is inherently complex, highly abstract, and deeply rooted in diverse disciplines ranging from "pure sciences," such as math, chemistry, and physics, through "applied sciences," such as medicine and agriculture, to subjects that are traditionally within the remit of humanities, notably philosophy and ethics. Together, these features pose diverse, important, and exciting challenges for tomorrow's teachers and educational establishments. With backgrounds in molecular life science research and secondary life science teaching, we (Tibell and Rundgren, respectively) bring different experiences, perspectives, concerns, and awareness of these issues. Taking the nature of the discipline as a starting point, we highlight important facets of molecular life science that are both characteristic of the domain and challenging for learning and education. Of these challenges, we focus most detail on content, reasoning difficulties, and communication issues. We also discuss implications for education research and teaching in the molecular life sciences.

  12. Everware toolkit. Supporting reproducible science and challenge-driven education.

    Science.gov (United States)

    Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.

    2017-10-01

    Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis script parts. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present project Everware, which seamlessly integrates git repository management systems such as GitHub or GitLab, Docker and Jupyter, helping with (a) sharing the results of real research and (b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but the full depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch; they could start contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.
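    The reproducibility pattern described above (git for code, Docker for the environment, Jupyter for the analysis) can be sketched generically as below. This is not the Everware API; it assumes git and Docker are installed and that the cloned repository provides its own Dockerfile exposing Jupyter on port 8888, and the repository URL is a placeholder.

```python
# Generic sketch of the reproducibility pattern the record describes (git +
# Docker + Jupyter), NOT the Everware API itself. Assumes git and Docker are
# installed and that the target repository ships a Dockerfile exposing Jupyter
# on port 8888. The repository URL is a placeholder.
import subprocess
import tempfile

REPO_URL = "https://github.com/example-org/example-analysis.git"  # hypothetical

def run(cmd):
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

def reproduce(repo_url: str = REPO_URL) -> None:
    workdir = tempfile.mkdtemp(prefix="everware-style-")
    run(["git", "clone", repo_url, workdir])                    # 1. capture the code
    run(["docker", "build", "-t", "analysis-env", workdir])     # 2. rebuild the environment
    run(["docker", "run", "-p", "8888:8888", "analysis-env"])   # 3. relaunch the notebooks

if __name__ == "__main__":
    reproduce()
```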

  13. Web Coverage Service Challenges for NASA's Earth Science Data

    Science.gov (United States)

    Cantrell, Simon; Khan, Abdul; Lynnes, Christopher

    2017-01-01

    In an effort to ensure that data in NASA's Earth Observing System Data and Information System (EOSDIS) are available to a wide variety of users through the tools of their choice, NASA continues to focus on exposing data and services using standards-based protocols. Specifically, this work has recently focused on the Web Coverage Service (WCS). Experience has been gained in data delivery via GetCoverage requests, starting out with WCS v1.1.1. The pros and cons of both the version itself and different implementation approaches will be shared during this session. Additionally, due to limitations in WCS v1.1.1's ability to work with NASA's Earth science data, this session will also discuss the benefit of migrating to WCS 2.0.1 with EO-x to enrich this capability to meet a wide range of anticipated users' needs. This will enable subsetting and various types of data transformations to be performed on a variety of EOS data sets.
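    A WCS 2.0.x GetCoverage request with subsetting, of the kind discussed above, can be assembled as key-value parameters on a URL. The sketch below uses a placeholder endpoint, coverage identifier and axis labels rather than an actual EOSDIS service.

```python
# Sketch of a WCS 2.0.1 KVP GetCoverage request with spatial subsetting, the
# kind of standards-based access the record discusses. The endpoint URL,
# coverage identifier and axis labels are placeholders, not a real service.
from urllib.parse import urlencode

ENDPOINT = "https://example.gov/wcs"          # hypothetical WCS endpoint
COVERAGE = "EXAMPLE_L3_PRECIP_DAILY"          # hypothetical coverage identifier

params = [
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", COVERAGE),
    ("subset", "Lat(30,40)"),                 # trim to a latitude band
    ("subset", "Long(-100,-90)"),             # and a longitude band
    ("format", "image/tiff"),
]

url = ENDPOINT + "?" + urlencode(params)
print(url)   # inspect the request before sending it

# from urllib.request import urlopen          # uncomment against a real server
# with urlopen(url) as resp:
#     open("subset.tif", "wb").write(resp.read())
```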

  14. High school students as science researchers: Opportunities and challenges

    Science.gov (United States)

    Smith, W. R.; Grannas, A. M.

    2007-12-01

    Today's K-12 students will be the scientists and engineers who bring currently emerging technologies to fruition. Existing research endeavors will be continued and expanded upon in the future only if these students are adequately prepared. High school-university collaborations provide an effective means of recruiting and training the next generation of scientists and engineers. Here, we describe our successful high school-university collaboration in the context of other models. We have developed an authentic inquiry-oriented environmental chemistry research program involving high school students as researchers. The impetus behind the development of this project was twofold. First, participation in authentic research may give some of our students the experience and drive to enter technical studies after high school. One specific goal was to develop a program to recruit underrepresented minorities into university STEM (science, technology, engineering, and mathematics) programs. Second, inquiry-oriented lessons have been shown to be highly effective in developing scientific literacy among the general population of students. This collaboration involves the use of local resources and equipment available to most high schools and could serve as a model for developing high school- university partnerships.

  15. Transdisciplinary Higher Education—A Challenge for Public Health Science

    Directory of Open Access Journals (Sweden)

    Alexandra Krettek

    2011-01-01

    This paper highlights and discusses issues associated with transdisciplinary teaching and suggests ways to overcome the challenges posed by different epistemologies, methods, and ethical positions. Our own transdisciplinary teaching experience in public health helped us identify some important questions, including (i) what is transdisciplinary research in practice, and does methods triangulation yield more valid results?, (ii) from a teaching perspective, how do biopsychosocial and medical research differ?, (iii) what is the difference between deductive and inductive research, and does each discipline represent a different ethical position?, and (iv) does pure inductive research lack theories, and does it require a hypothesis (a "rule of thumb") on how to proceed? We also suggest ways to facilitate and enhance transdisciplinary teaching, focusing on what unites us and not on what sets us apart, openly underlining and highlighting our differences. Using diverse methodologies, a newly educated transdisciplinary workforce will likely extend current knowledge and facilitate solutions for complex public health issues.

  16. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) in partnership with the Office of Advanced Scientific Computing Research (ASCR) held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. These four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  17. Sensor Web Technology Challenges and Advancements for the Earth Science Decadal Survey Era

    Science.gov (United States)

    Norton, Charles D.; Moe, Karen

    2011-01-01

    This paper examines the Earth science decadal survey era and the role ESTO-developed sensor web technologies can play in the scientific observations. This includes hardware and software technology advances for in-situ and in-space measurements. Also discussed are emerging areas of importance such as the potential of small satellites for sensor-web-based observations, as well as advances in data fusion critical to the science and societal benefits of future missions, and the challenges ahead.

  18. Directing Matter and Energy: Five Challenges for Science and the Imagination

    Energy Technology Data Exchange (ETDEWEB)

    Hemminger, J.; Fleming, G.; Ratner, M.

    2007-12-20

    The twin aspects of energy and control (or direction) are the underlying concepts. Matter and energy are closely linked, and their understanding and control will have overwhelming importance for our civilization, our planet, our science, and our technology. This importance ranges even beyond the large portfolio of BES, both because these truly significant Grand Challenges confront many other realms of science and because even partial solutions to these challenges will enrich scientists’ collective imagination and ability to solve problems with new ideas and new methods.

  19. Marine data management: a positive evolution from JGOFS to OCEANS

    Science.gov (United States)

    Avril, B.

    2003-04-01

    The JGOFS project has been highly successful in providing new insights into global biogeochemical cycling of carbon and associated elements in the oceans through a multi-national effort at the regional scale (process studies in the North Atlantic, Arabian Sea, Equatorial Pacific, Southern Ocean and North Pacific), global scale (carbon survey) and from long-term measurements at key ocean sites (time-series). The database thus created is very large and complex in diversity and format, and it is currently managed at the international level, thanks to the efforts of the JGOFS Data Management Task Team. To be fully usable for current and future studies, the JGOFS datasets will be organised as a single database (the so-called International JGOFS Master Dataset), in a single format and in a single location (in the World Data Centre (WDC) system, thanks to an initiative of PANGAEA / WDC-MARE; and on CDs or DVDs) before the end of the project (Dec. 2003). This should be achieved by adapting previously developed tools, especially from the US-JGOFS DMO (for the user query interface) and from ODV/PANGAEA (for the datasets visualization and metadata handling). Whilst the OCEANS project science and implementation plans are being prepared, the international oceanographic community is now hoping to benefit from the JGOFS data management experience and to elaborate beforehand the best design and practices for its data management. The draft OCEANS data management plan (international data policy and recommendations for participating international agencies and national data managers) is presented. This plan should result in the rapid and full availability of data, and its long-term preservation and accessibility, thanks to a better, integrated and fully implemented data management system.

  20. The Dark Energy Survey Data Management System

    International Nuclear Information System (INIS)

    Mohr, Joseph J.; Darnell, J.Anthony; Beldica, Cristina; Barkhouse, Wayne; Bertin, Emmanuel; Dora Cai, Y.; Daues, Gregory E.; Gower, Michelle; Nicolaci da Costa, Luiz A.; Jarvis, Michael; Lin, Huan

    2008-01-01

    The Dark Energy Survey (DES) collaboration will study cosmic acceleration with a 5000 deg2 griZY survey in the southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes and a data access framework. We are developing the DESDM system for operation in the high performance computing (HPC) environments at the National Center for Supercomputing Applications (NCSA) and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive tasks such as large scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms and orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES nights (3TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data and internal crosschecks in the case of the real data indicate that astrometric and photometric data quality is excellent

  1. The Dark Energy Survey Data Management System

    Energy Technology Data Exchange (ETDEWEB)

    Mohr, Joseph J.; /Illinois U., Urbana, Astron. Dept. /Illinois U., Urbana; Barkhouse, Wayne; /North Dakota U.; Beldica, Cristina; /Illinois U., Urbana; Bertin, Emmanuel; /Paris, Inst. Astrophys.; Dora Cai, Y.; /NCSA, Urbana; Nicolaci da Costa, Luiz A.; /Rio de Janeiro Observ.; Darnell, J.Anthony; /Illinois U., Urbana, Astron. Dept.; Daues, Gregory E.; /NCSA, Urbana; Jarvis, Michael; /Pennsylvania U.; Gower, Michelle; /NCSA, Urbana; Lin, Huan; /Fermilab /Rio de Janeiro Observ.

    2008-07-01

    The Dark Energy Survey (DES) collaboration will study cosmic acceleration with a 5000 deg2 griZY survey in the southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes and a data access framework. We are developing the DESDM system for operation in the high performance computing (HPC) environments at the National Center for Supercomputing Applications (NCSA) and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive tasks such as large scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the public through a virtual-observatory compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms and orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES nights (3TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data and internal crosschecks in the case of the real data indicate that astrometric and photometric data quality is excellent.

  2. Framing the challenge of climate change in Nature and Science editorials

    Science.gov (United States)

    Hulme, Mike; Obermeister, Noam; Randalls, Samuel; Borie, Maud

    2018-06-01

    Through their editorializing practices, leading international science journals such as Nature and Science interpret the changing roles of science in society and exert considerable influence on scientific priorities and practices. Here we examine nearly 500 editorials published in these two journals between 1966 and 2016 that deal with climate change, thereby constructing a lens through which to view the changing engagement of science and scientists with the issue. A systematic longitudinal frame analysis reveals broad similarities between Nature and Science in the waxing and waning of editorializing attention given to the topic, but, although both journals have diversified how they frame the challenges of climate change, they have done so in different ways. We attribute these differences to three influences: the different political and epistemic cultures into which they publish; their different institutional histories; and their different editors and editorial authorship practices.

  3. English for Scientific Purposes (EScP): Technology, Trends, and Future Challenges for Science Education

    Science.gov (United States)

    Liu, Gi-Zen; Chiu, Wan-Yu; Lin, Chih-Chung; Barrett, Neil E.

    2014-12-01

    To date, the concept of English for Specific Purposes has had a great impact on English language learning across various disciplines, including those in science education. Hence, this review paper aimed to address current English language learning in the science disciplines through the practice of computer-assisted language learning to identify the use of learning technologies in science-based literacy. In the literature review, the researchers found that science-based literacy instruction shares many pedagogical aims with English language teaching in terms of reading, writing, listening and speaking, allowing it to be classified as English for Scientific Purposes (EScP). To answer the research questions, the researchers conducted the survey by extracting related articles and teaching examples from the Web of Science. In the search procedure, the researchers used the keywords science OR scientific AND technolog* OR comput* in ten selected journals of the Social Science Citation Index. Only articles which are specified as journal articles rather than other document types were included. After compiling the corpora, the researchers compared the trends, methodologies and results of EScP instruction in science education. The implications of this study include the opportunities, advantages and challenges for EScP instruction in science education to further develop better educational approaches, adopt new technologies, as well as offer some directions for researchers to conduct future studies.

  4. Benchmarking and improving point cloud data management in MonetDB

    NARCIS (Netherlands)

    Martinez-Rubi, O.; Van Oosterom, P.J.M.; Goncalves, R.; Tijssen, T.P.M.; Ivanova, M.; Kersten, M.L.; Alvanaki, F.

    2015-01-01

    The popularity, availability and sizes of point cloud data sets are increasing, thus raising interesting data management and processing challenges. Various software solutions are available for the management of point cloud data. A benchmark for point cloud data management systems was defined and it

  5. Benchmarking and improving point cloud data management in MonetDB

    NARCIS (Netherlands)

    O. Martinez-Rubi (Oscar); P. van Oosterom; R.A. Goncalves (Romulo); T. Tijssen; M.G. Ivanova (Milena); M.L. Kersten (Martin); F. Alvanaki (Foteini)

    2014-01-01

    The popularity, availability and sizes of point cloud data sets are increasing, thus raising interesting data management and processing challenges. Various software solutions are available for the management of point cloud data. A benchmark for point cloud data management systems was
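    A point cloud benchmark of the kind these two records describe ultimately times spatial selections over large tables of points. The generic sketch below loads random points into SQLite and times a bounding-box query; it illustrates the query pattern only and is not the MonetDB benchmark itself.

```python
# Generic illustration of the kind of query a point cloud data management
# benchmark exercises: load points into a flat table and time a bounding-box
# selection. SQLite stands in here for the systems the records compare.
import random
import sqlite3
import time

random.seed(0)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE points (x REAL, y REAL, z REAL)")
con.executemany("INSERT INTO points VALUES (?, ?, ?)",
                [(random.uniform(0, 1000), random.uniform(0, 1000),
                  random.uniform(0, 50)) for _ in range(100_000)])
con.commit()

t0 = time.perf_counter()
(count,) = con.execute("SELECT COUNT(*) FROM points "
                       "WHERE x BETWEEN 100 AND 200 "
                       "AND y BETWEEN 100 AND 200").fetchone()
elapsed = time.perf_counter() - t0
print(f"{count} points in the box, selected in {elapsed * 1000:.1f} ms")
```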

  6. Management as a science-based profession: a grand societal challenge

    NARCIS (Netherlands)

    Romme, A.G.L.

    2017-01-01

    The purpose of this paper is to explore how the quest for management as a science-based profession, conceived as a grand societal challenge, can be revitalized. A reflective approach is adopted by questioning some of the key assumptions made by management scholars, especially those that undermine

  7. Challenges of Virtual and Open Distance Science Teacher Education in Zimbabwe

    Science.gov (United States)

    Mpofu, Vongai; Samukange, Tendai; Kusure, Lovemore M.; Zinyandu, Tinoidzwa M.; Denhere, Clever; Huggins, Nyakotyo; Wiseman, Chingombe; Ndlovu, Shakespear; Chiveya, Renias; Matavire, Monica; Mukavhi, Leckson; Gwizangwe, Isaac; Magombe, Elliot; Magomelo, Munyaradzi; Sithole, Fungai; Bindura University of Science Education (BUSE),

    2012-01-01

    This paper reports on a study of the implementation of science teacher education through virtual and open distance learning in the Mashonaland Central Province, Zimbabwe. The study provides insight into challenges faced by students and lecturers on inception of the program at four centres. Data was collected from completed evaluation survey forms…

  8. Various Political and Social Challenges Including Wars and Displacement in Empowering Women and Girls in Science

    Directory of Open Access Journals (Sweden)

    Nilüfer Narli

    2016-02-01

    The poor gender ratio in science and engineering has been a global concern, despite the growing number of female scientists in the world. Women’s empowerment in science is key to achieving human progress and dignity and is directly related to accomplishing SDG 16: "Promote peaceful and inclusive societies for sustainable development, provide access to justice for all and build effective, accountable and inclusive institutions at all levels". What are the challenges that hinder women's and girls’ progress in science? In addition to several challenges discussed below, wars and population displacement create obstacles to female education and women’s advancement in science and technology. Some challenges have prevailed for the last two decades (e.g., economic insecurity), while new challenges are the result of new forms of war, civil war and extremism (e.g., large-scale armed conflicts involving state and non-state actors), which have produced large numbers of displaced women in the Middle East who have lost their jobs and become isolated, many young displaced females and refugees who have no access to formal education and who face health risks in conflict and displacement settings, and new forms of gender discrimination produced by religious extremism.

  9. Design challenges for long-term interaction with a robot in a science classroom

    NARCIS (Netherlands)

    Davison, Daniel Patrick; Charisi, Vasiliki; Wijnen, Frances Martine; Papenmeier, Andrea; van der Meij, Jan; Reidsma, Dennis; Evers, Vanessa

    This paper aims to present the main challenges that emerged during the process of the research design of a longitudinal study on child-robot interaction for science education and to discuss relevant suggestions in the context. The theoretical rationale is based on aspects of the theory of social

  10. Social media as a platform for science and health engagement: challenges and opportunities.

    Science.gov (United States)

    Dixon, Graham

    2016-01-01

    Social media has become a major platform for debates on science and health. This commentary argues that while social media can present challenges to communicating important health matters, it can also provide health experts a unique opportunity to engage with and build trust among members of the public.

  11. Learning about the Human Genome. Part 1: Challenge to Science Educators. ERIC Digest.

    Science.gov (United States)

    Haury, David L.

    This digest explains how to inform high school students and their parents about the human genome project (HGP) and how the information from this milestone finding will affect future biological and medical research and challenge science educators. The sections include: (1) "The Emerging Legacy of the HGP"; (2) "Transforming How…

  12. A data management infrastructure for bridge monitoring

    Science.gov (United States)

    Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.

    2015-04-01

    This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue not only for storing the sensor data but also for integrating with the bridge model to support other functions, such as management, maintenance and inspection. The focus of this study is on the effective data management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state-of-the-art of bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies that have been shown to be useful in handling high-volume time-series data and in dealing flexibly with unstructured data schemas. Specifically, Apache Cassandra and MongoDB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema, and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.
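    As an illustration of the document-store side of such a framework, the sketch below writes time-stamped sensor readings into MongoDB and queries a recent time window. It assumes pymongo is installed and a MongoDB server is running locally; the database, collection and field names are hypothetical and are not the paper's BrIM/SensorML schema.

```python
# Sketch of storing and querying time-series sensor readings in a document
# store, in the spirit of the MongoDB part of the proposed framework. Assumes
# pymongo and a local MongoDB server; database, collection and field names are
# hypothetical, not the paper's schema.
from datetime import datetime, timedelta
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
readings = client["bridge_monitoring"]["sensor_readings"]

now = datetime.utcnow()
readings.insert_many([
    {"sensor_id": "ACC-07", "component": "main-girder-3",
     "t": now - timedelta(seconds=10 * i), "value_mg": 2.1 + 0.1 * i}
    for i in range(6)
])

# Pull the last minute of data for one sensor, oldest first.
window = {"sensor_id": "ACC-07", "t": {"$gte": now - timedelta(minutes=1)}}
for doc in readings.find(window).sort("t", 1):
    print(doc["t"].isoformat(), doc["value_mg"])
```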

  13. Increasing Diversity in the Sciences: a Partial Solution to the Challenge and the Benefits it Produces

    Science.gov (United States)

    Givan, A. V.

    2009-12-01

    Science is supposed to be about talent, devoid of the biases and judgments generated by background, gender, ethnicity or any culturally determined discriminators. The scientific, academic, corporate and government communities have a vested interest in developing models, practices and policies that significantly increase the number of U.S. graduates in scientific disciplines. Additionally, it is crucial that these graduates possess the essential competencies and creative problem-solving skills to compete in the current global economy. The stakeholders (corporations, researchers, educational practitioners, policymakers and funders) who share the common goal of producing highly qualified scientists must commit to collaborating on innovative strategies and solutions to this complex challenge. Volumes of research data from a variety of sources, such as the social and cognitive sciences, educational psychology, the National Science Foundation and non-profit groups, have been and are available for use, enabling us to rise to the challenge with which we have been charged and for whose outcome we are responsible. This paper proposes a solution to part of the challenge and discusses the impacts of increasing diversity in science. The paper addresses one element of the issue: strategies for the recruitment and retention of under-represented groups in science, focusing on the historical and current culture, climate and barriers encountered by minorities as they progress through the educational system and career pathways. The paper will examine the benefits of diversity to the individual and society as a whole.

  14. Big Data and Data Science: Opportunities and Challenges of iSchools

    Directory of Open Access Journals (Sweden)

    Il-Yeol Song

    2017-08-01

    Due to the recent explosion of big data, our society has been rapidly going through digital transformation and entering a new world with numerous eye-opening developments. These new trends impact society and future jobs, and thus student careers. At the heart of this digital transformation is data science, the discipline that makes sense of big data. With many rapidly emerging digital challenges ahead of us, this article discusses perspectives on iSchools’ opportunities and suggestions in data science education. We argue that iSchools should empower their students with “information computing” disciplines, which we define as the ability to solve problems and create values, information, and knowledge using tools in application domains. As specific approaches to enforcing information computing disciplines in data science education, we suggest the three foci of user-based, tool-based, and application-based. These three foci will serve to differentiate the data science education of iSchools from that of computer science or business schools. We present a layered Data Science Education Framework (DSEF) with building blocks that include the three pillars of data science (people, technology, and data), computational thinking, data-driven paradigms, and data science lifecycles. Data science courses built on top of this framework should thus be executed with user-based, tool-based, and application-based approaches. This framework will help our students think about data science problems from the big picture perspective and foster appropriate problem-solving skills in conjunction with broad perspectives of data science lifecycles. We hope the DSEF discussed in this article will help fellow iSchools in their design of new data science curricula.

  15. P2P Data Management in Mobile Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Nida Sahar Sayeda

    2013-04-01

    The rapid growth in wireless technologies has made wireless communication an important means of transporting data across different domains. Similarly, many potential applications could be deployed using WSNs (Wireless Sensor Networks). However, very few applications are deployed in real life, due to the uncertainty and dynamics of the environment and scarce resources. This makes data management in WSNs a challenging area in which to find an approach that suits their characteristics. Currently, the trend is to find efficient data management schemes using evolving technologies, i.e. P2P (Peer-to-Peer) systems. Many P2P approaches have been applied in WSNs to carry out data management owing to the similarities between WSNs and P2P systems. Alongside the similarities, there are also differences that make P2P protocols inefficient in WSNs. Furthermore, to increase efficiency and to exploit the delay-tolerant nature of WSNs wherever possible, mobile WSNs are gaining importance. This creates a three-dimensional problem space to consider: mobility, WSNs and P2P. In this paper, an efficient algorithm is proposed for data management using P2P techniques for mobile WSNs. The real-world implementation and deployment of the proposed algorithm are also presented.
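    The paper's algorithm is not specified in this record, but the underlying P2P technique it draws on can be illustrated generically: hash data keys onto a ring of nodes so any node can compute where a reading belongs. The sketch below shows that DHT-style placement; it is not the proposed algorithm, and the node names and keys are invented.

```python
# Generic illustration of the P2P technique commonly borrowed for sensor-network
# data management: hashing data keys onto a ring of nodes, DHT-style, so any
# node can compute where a reading should be stored. This is NOT the algorithm
# proposed in the paper; node names and keys are invented.
import hashlib
from bisect import bisect_right

def ring_position(label: str) -> int:
    return int(hashlib.sha1(label.encode()).hexdigest(), 16)

NODES = ["node-A", "node-B", "node-C", "node-D"]
ring = sorted((ring_position(n), n) for n in NODES)

def responsible_node(data_key: str) -> str:
    """First node clockwise from the key's position on the hash ring."""
    pos = ring_position(data_key)
    idx = bisect_right([p for p, _ in ring], pos) % len(ring)
    return ring[idx][1]

for key in ("temp/sensor-17/2013-04-01", "humidity/sensor-03/2013-04-01"):
    print(key, "->", responsible_node(key))
```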

  16. Virtue and the scientist: using virtue ethics to examine science's ethical and moral challenges.

    Science.gov (United States)

    Chen, Jiin-Yu

    2015-02-01

    As science has grown in size and scope, it has also presented a number of ethical and moral challenges. Approaching these challenges from an ethical framework can provide guidance when engaging with them. In this article, I place science within a virtue ethics framework, as discussed by Aristotle. By framing science within virtue ethics, I discuss what virtue ethics entails for the practicing scientist. Virtue ethics holds that each person should work towards her conception of flourishing where the virtues enable her to realize that conception. The virtues must become part of the scientist's character, undergirding her intentions and motivations, as well as the resulting decisions and actions. The virtue of phronêsis, or practical wisdom, is critical for cultivating virtue, enabling the moral agent to discern the appropriate actions for a particular situation. In exercising phronêsis, the scientist considers the situation from multiple perspectives for an in-depth and nuanced understanding of the situation, discerns the relevant factors, and settles upon an appropriate decision. I examine goods internal to a practice, which are constitutive of science practiced well and discuss the role of phronêsis when grappling with science's ethical and moral features and how the scientist might exercise it. Although phronêsis is important for producing scientific knowledge, it is equally critical for working through the moral and ethical questions science poses.

  17. The need to respect nature and its limits challenges society and conservation science.

    Science.gov (United States)

    Martin, Jean-Louis; Maris, Virginie; Simberloff, Daniel S

    2016-05-31

    Increasing human population interacts with local and global environments to deplete biodiversity and resources humans depend on, thus challenging societal values centered on growth and relying on technology to mitigate environmental stress. Although the need to address the environmental crisis, central to conservation science, generated greener versions of the growth paradigm, we need fundamental shifts in values that ensure transition from a growth-centered society to one acknowledging biophysical limits and centered on human well-being and biodiversity conservation. We discuss the role conservation science can play in this transformation, which poses ethical challenges and obstacles. We analyze how conservation and economics can achieve better consonance, the extent to which technology should be part of the solution, and difficulties the "new conservation science" has generated. An expanded ambition for conservation science should reconcile day-to-day action within the current context with uncompromising, explicit advocacy for radical transitions in core attitudes and processes that govern our interactions with the biosphere. A widening of its focus to understand better the interconnectedness between human well-being and acknowledgment of the limits of an ecologically functional and diverse planet will need to integrate ecological and social sciences better. Although ecology can highlight limits to growth and consequences of ignoring them, social sciences are necessary to diagnose societal mechanisms at work, how to correct them, and potential drivers of social change.

  18. Teaching Outside the Box: Challenging Gifted Students with Polar Sciences Without Benefit of a Science Classroom

    Science.gov (United States)

    Dooley, J.

    2013-12-01

    In the high-stakes-testing world of one-size-fits-most educational practices, it is often the needs of the most able students that are unmet, yet these high-ability learners can benefit greatly from exploration in the area of polar science. With school schedules and budgets already stretched to the breaking point and Common Core (CCSS) subjects the focus, very few resources remain for topics considered by some as unimportant. Polar and climate science are prime examples. Here, a council member of Polar Educators International and gifted education teacher shares resources and ideas to engage this unique group of students and others. She draws from experiences and knowledge gained through ANDRILL's Arise Educator program, IPY Oslo and Montreal PolarEDUCATOR workshops, and Consortium for Ocean Leadership's Deep Earth Academy. Topics include school-wide enrichment through use of ANDRILL's Flexhibit material and participation in Antarctica Day, afterschool Deep Freeze clubs that presented in public outreach venues for polar science events at the Maryland Science Center in Baltimore and NYC's Museum of Natural History, group project work using IODP core data from Antarctica, interaction with polar scientists via Skype, and other projects.

  19. Data management and dissemination challenges for commercial remote sensing

    Science.gov (United States)

    Straeter, Terry A.

    1996-12-01

    Looking toward 2000, the ways by which commercial satellite imagery and imagery products are managed by the various remote sensing companies will be dictated by financial considerations, not technical feasibility. In the convergence of technologies that will shape the commercial companies in 2000, the most influential will likely be electronic commerce via the Internet. This paper discusses the character of these combined forces and speculates on how the industry might respond.

  20. Team science and the physician-scientist in the age of grand health challenges.

    Science.gov (United States)

    Steer, Clifford J; Jackson, Peter R; Hornbeak, Hortencia; McKay, Catherine K; Sriramarao, P; Murtaugh, Michael P

    2017-09-01

    Despite remarkable advances in medical research, clinicians face daunting challenges from new diseases, variations in patient responses to interventions, and increasing numbers of people with chronic health problems. The gap between biomedical research and unmet clinical needs can be addressed by highly talented interdisciplinary investigators focused on translational bench-to-bedside medicine. The training of talented physician-scientists comfortable with forming and participating in multidisciplinary teams that address complex health problems is a top national priority. Challenges, methods, and experiences associated with physician-scientist training and team building were explored at a workshop held at the Second International Conference on One Medicine One Science (iCOMOS 2016), April 24-27, 2016, in Minneapolis, Minnesota. A broad range of scientists, regulatory authorities, and health care experts determined that critical investments in interdisciplinary training are essential for the future of medicine and healthcare delivery. Physician-scientists trained in a broad, nonlinear, cross-disciplinary manner are and will be essential members of science teams in the new age of grand health challenges and the birth of precision medicine. Team science approaches have accomplished biomedical breakthroughs once considered impossible, and dedicated physician-scientists have been critical to these achievements. Together, they translate into the pillars of academic growth and success. © 2017 New York Academy of Sciences.

  1. Methodological Challenges in Sustainability Science: A Call for Method Plurality, Procedural Rigor and Longitudinal Research

    Directory of Open Access Journals (Sweden)

    Henrik von Wehrden

    2017-02-01

    Full Text Available Sustainability science encompasses a unique field that is defined through its purpose, the problem it addresses, and its solution-oriented agenda. However, this orientation creates significant methodological challenges. In this discussion paper, we conceptualize sustainability problems as wicked problems to tease out the key challenges that sustainability science is facing if scientists intend to deliver on its solution-oriented agenda. Building on the available literature, we discuss three aspects that demand increased attention for advancing sustainability science: (1) methods with higher diversity and complementarity are needed to increase the chance of deriving solutions to the unique aspects of wicked problems; for instance, mixed methods approaches are potentially better suited to allow for an approximation of solutions, since they cover wider arrays of knowledge; (2) methodologies capable of dealing with wicked problems demand strict procedural and ethical guidelines, in order to ensure their integration potential; for example, learning from solution implementation in different contexts requires increased comparability between research approaches while carefully addressing issues of legitimacy and credibility; and (3) approaches are needed that allow for longitudinal research, since wicked problems are continuous and solutions can only be diagnosed in retrospect; for example, complex dynamics of wicked problems play out across temporal patterns that are not necessarily aligned with the common timeframe of participatory sustainability research. Taken together, we call for plurality in methodologies, emphasizing procedural rigor and the necessity of continuous research to effectively address wicked problems as well as methodological challenges in sustainability science.

  2. FUSION ENERGY SCIENCES WORKSHOP ON PLASMA MATERIALS INTERACTIONS: Report on Science Challenges and Research Opportunities in Plasma Materials Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Maingi, Rajesh [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)]; Zinkle, Steven J. [University of Tennessee – Knoxville]; Foster, Mark S. [U.S. Department of Energy]

    2015-05-01

    The realization of controlled thermonuclear fusion as an energy source would transform society, providing a nearly limitless energy source with renewable fuel. Under the auspices of the U.S. Department of Energy, the Fusion Energy Sciences (FES) program management recently launched a series of technical workshops to “seek community engagement and input for future program planning activities” in the targeted areas of (1) Integrated Simulation for Magnetic Fusion Energy Sciences, (2) Control of Transients, (3) Plasma Science Frontiers, and (4) Plasma-Materials Interactions aka Plasma-Materials Interface (PMI). Over the past decade, a number of strategic planning activities [1-6] have highlighted PMI and plasma facing components as a major knowledge gap, which should be a priority for fusion research towards ITER and future demonstration fusion energy systems. There is a strong international consensus that new PMI solutions are required in order for fusion to advance beyond ITER. The goal of the 2015 PMI community workshop was to review recent innovations and improvements in understanding the challenging PMI issues, identify high-priority scientific challenges in PMI, and to discuss potential options to address those challenges. The community response to the PMI research assessment was enthusiastic, with over 80 participants involved in the open workshop held at Princeton Plasma Physics Laboratory on May 4-7, 2015. The workshop provided a useful forum for the scientific community to review progress in scientific understanding achieved during the past decade, and to openly discuss high-priority unresolved research questions. One of the key outcomes of the workshop was a focused set of community-initiated Priority Research Directions (PRDs) for PMI. Five PRDs were identified, labeled A-E, which represent community consensus on the most urgent near-term PMI scientific issues. For each PRD, an assessment was made of the scientific challenges, as well as a set of actions

  3. The need to respect nature and its limits challenges society and conservation science

    Science.gov (United States)

    Martin, Jean-Louis; Maris, Virginie; Simberloff, Daniel S.

    2016-01-01

    Increasing human population interacts with local and global environments to deplete biodiversity and resources humans depend on, thus challenging societal values centered on growth and relying on technology to mitigate environmental stress. Although the need to address the environmental crisis, central to conservation science, generated greener versions of the growth paradigm, we need fundamental shifts in values that ensure transition from a growth-centered society to one acknowledging biophysical limits and centered on human well-being and biodiversity conservation. We discuss the role conservation science can play in this transformation, which poses ethical challenges and obstacles. We analyze how conservation and economics can achieve better consonance, the extent to which technology should be part of the solution, and difficulties the “new conservation science” has generated. An expanded ambition for conservation science should reconcile day-to-day action within the current context with uncompromising, explicit advocacy for radical transitions in core attitudes and processes that govern our interactions with the biosphere. A widening of its focus to understand better the interconnectedness between human well-being and acknowledgment of the limits of an ecologically functional and diverse planet will need to integrate ecological and social sciences better. Although ecology can highlight limits to growth and consequences of ignoring them, social sciences are necessary to diagnose societal mechanisms at work, how to correct them, and potential drivers of social change. PMID:27185943

  4. XML Based Scientific Data Management Facility

    Science.gov (United States)

    Mehrotra, P.; Zubair, M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The World Wide Web consortium has developed an Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself and the transformation functions, and also to apply the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.
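
    The record above centers on applying XSLT transformations to XML datasets; the described prototype used Java and Apache Xalan. The following is only a minimal, hedged sketch of that general idea in Python using the lxml library, with hypothetical file names; it is not the XDMF implementation itself.

    # Minimal sketch: apply an XSLT stylesheet to an XML dataset.
    # The original XDMF prototype used Java/Xalan; lxml is used here purely
    # for illustration. File names below are hypothetical examples.
    from lxml import etree

    def transform_dataset(data_path: str, stylesheet_path: str) -> str:
        """Parse an XML dataset, apply an XSLT stylesheet, return the result as text."""
        dataset = etree.parse(data_path)            # e.g. "cfd_run_042.xml" (hypothetical)
        stylesheet = etree.parse(stylesheet_path)   # e.g. "to_plot_format.xsl" (hypothetical)
        transform = etree.XSLT(stylesheet)
        return str(transform(dataset))

    if __name__ == "__main__":
        print(transform_dataset("cfd_run_042.xml", "to_plot_format.xsl"))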

  5. LHCb Data Management: consistency, integrity and coherence of data

    CERN Document Server

    Bargiotti, Marianne

    2007-01-01

    The Large Hadron Collider (LHC) at CERN will start operating in 2007. The LHCb experiment is preparing for the real data handling and analysis via a series of data challenges and production exercises. The aim of these activities is to demonstrate the readiness of the computing infrastructure based on WLCG (Worldwide LHC Computing Grid) technologies, to validate the computing model and to provide useful samples of data for detector and physics studies. DIRAC (Distributed Infrastructure with Remote Agent Control) is the gateway to WLCG. The Dirac Data Management System (DMS) relies on both WLCG Data Management services (LCG File Catalogues, Storage Resource Managers and File Transfer Service) and LHCb specific components (Bookkeeping Metadata File Catalogue). Although the Dirac DMS has been extensively used over the past years and has proved to achieve a high grade of maturity and reliability, the complexity of both the DMS and its interactions with numerous WLCG components as well as the instability of facilit...
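
    The record above is concerned with keeping multiple catalogues and storage systems consistent. The snippet below is a generic, hedged illustration of that kind of cross-catalogue consistency check (registered replicas versus replicas actually present on storage); it is not DIRAC's actual API and all entries are made up.

    # Generic sketch of a catalogue-vs-storage consistency check (not DIRAC code).
    from typing import Dict, Set, Tuple

    def check_consistency(catalogue: Dict[str, Set[str]],
                          storage: Dict[str, Set[str]]) -> Tuple[dict, dict]:
        """Return (missing, orphaned).

        catalogue: logical file name -> replica sites registered in the file catalogue
        storage:   logical file name -> replica sites actually found on storage
        """
        missing = {}   # registered in the catalogue but not found on storage
        orphaned = {}  # found on storage but not registered in the catalogue
        for lfn, sites in catalogue.items():
            gone = sites - storage.get(lfn, set())
            if gone:
                missing[lfn] = gone
        for lfn, sites in storage.items():
            unregistered = sites - catalogue.get(lfn, set())
            if unregistered:
                orphaned[lfn] = unregistered
        return missing, orphaned

    # Example with invented entries:
    cat = {"/lhcb/data/run1/file_001": {"CERN", "CNAF"}}
    sto = {"/lhcb/data/run1/file_001": {"CERN"},
           "/lhcb/data/run1/file_002": {"CNAF"}}
    print(check_consistency(cat, sto))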

  6. Hanford Site Cleanup Challenges and Opportunities for Science and Technology--A Strategic Assessment

    International Nuclear Information System (INIS)

    Wood, Thomas W.; Johnson, Wayne L.; Kreid, Dennis K.; Walton, Terry L.

    2001-01-01

    The sheer expanse of the Hanford Site, the inherent hazards associated with the significant inventory of nuclear materials and wastes, the large number of aging contaminated facilities, the diverse nature and extent of environmental contamination, and the proximity to the Columbia River make Hanford perhaps the world's largest and most complex environmental cleanup project. It is not possible to address the more complex elements of this enormous challenge in a cost-effective manner without strategic investments in science and technology. Success requires vigorous and sustained efforts to enhance the science and technology basis, develop and deploy innovative solutions, and provide firm scientific bases to support site cleanup and closure decisions at Hanford

  7. Using Grand Challenges to Teach Science: A Biology-Geology Collaboration

    Science.gov (United States)

    Lyford, M.; Myers, J. D.

    2012-12-01

    Three science courses at the University of Wyoming explore the inextricable connections between science and society by centering on grand challenges. Two of these courses are introductory integrated science courses for non-majors while the third is an upper level course for majors and non-majors. Through collaboration, the authors have developed these courses to explore the grand challenges of energy, water and climate. Each course focuses on the fundamental STEM principles required for a citizen to understand each grand challenge. However, the courses also emphasize the non-STEM perspectives (e.g., economics, politics, human well-being, externalities) that underlie each grand challenge and argue that creating equitable, sustainable and just solutions to the grand challenges hinges on an understanding of STEM and non-STEM perspectives. Moreover, the authors also consider the multitude of personal perspectives individuals bring to the classroom (e.g., values, beliefs, empathy misconceptions) that influence any stakeholder's ability to engage in fruitful discussions about grand challenge solutions. Discovering Science (LIFE 1002) focuses on the grand challenges of energy and climate. Students attend three one-hour lectures, one two-hour lab and a one-hour discussion each week. Lectures emphasize the STEM and non-STEM principles underlying each grand challenge. Laboratory activities are designed to be interdisciplinary and engage students in inquiry-driven activities to reinforce concepts from lecture and to model how science is conducted. Labs also expose students to the difficulties often associated with scientific studies, the limits of science, and the inherent uncertainties associated with scientific findings. Discussion sessions provide an opportunity for students to explore the complexity of the grand challenges from STEM and non-STEM perspectives, and expose the multitude of personal perspectives an individual might harbor related to each grand challenge

  8. Developing Data Management Education, Support, and Training

    OpenAIRE

    Smith, Plato

    2018-01-01

    This presentation was an invited guest lecture on data management for CISE graduate students in the CAP5108: Research Methods for Human-centered Computing course at the University of Florida on April 12, 2018 from 4:05 pm - 4:55 pm, period 9. Graduate students were introduced to the DCC Checklist for a Data Management Plan, OAIS Model (cessda adaptation), ORCiD, IR, high-performance computing (HPC) storage options at UF, data lifecycle models (USGS and UNSW), data publication guides (Beckles...

  9. Data Management in Practice Supplementary Files

    DEFF Research Database (Denmark)

    Hansen, Karsten Kryger; Madsen, Christina Guldfeldt; Kjeldgaard, Anne Sofie Fink

    This report presents the results of the Data Management i Praksis (DMiP) project (in English: Data Management in Practice). The project was funded by Denmark’s Electronic Research Library (DEFF), the National Danish Archives and the participating main Danish libraries. The following partners...... level. The project should also demonstrate that research libraries have a role to play regarding research data. Furthermore, the project should ensure development of competences at the libraries, which can then be used in the future process of managing research data....

  10. Challenges and Opportunities for Integrating Social Science Perspectives into Climate and Global Change Assessments

    Science.gov (United States)

    Larson, E. K.; Li, J.; Zycherman, A.

    2017-12-01

    Integration of social science into climate and global change assessments is fundamental for improving understanding of the drivers, impacts and vulnerability of climate change, and the social, cultural and behavioral challenges related to climate change responses. This requires disciplinary and interdisciplinary knowledge as well as integrational and translational tools for linking this knowledge with the natural and physical sciences. The USGCRP's Social Science Coordinating Committee (SSCC) is tasked with this challenge and is working to integrate relevant social, economic and behavioral knowledge into processes like sustained assessments. This presentation will discuss outcomes from a recent SSCC workshop, "Social Science Perspectives on Climate Change" and their applications to sustained assessments. The workshop brought academic social scientists from four disciplines - anthropology, sociology, geography and archaeology - together with federal scientists and program managers to discuss three major research areas relevant to the USGCRP and climate assessments: (1) innovative tools, methods, and analyses to clarify the interactions of human and natural systems under climate change, (2) understanding of factors contributing to differences in social vulnerability between and within communities under climate change, and (3) social science perspectives on drivers of global climate change. These disciplines, collectively, emphasize the need to consider socio-cultural, political, economic, geographic, and historic factors, and their dynamic interactions, to understand climate change drivers, social vulnerability, and mitigation and adaptation responses. They also highlight the importance of mixed quantitative and qualitative methods to explain impacts, vulnerability, and responses at different time and spatial scales. This presentation will focus on major contributions of the social sciences to climate and global change research. We will discuss future directions for

  11. Defining a data management strategy for USGS Chesapeake Bay studies

    Science.gov (United States)

    Ladino, Cassandra

    2013-01-01

    The mission of U.S. Geological Survey’s (USGS) Chesapeake Bay studies is to provide integrated science for improved understanding and management of the Chesapeake Bay ecosystem. Collective USGS efforts in the Chesapeake Bay watershed began in the 1980s, and by the mid-1990s the USGS adopted the watershed as one of its national place-based study areas. Great focus and effort by the USGS have been directed toward Chesapeake Bay studies for almost three decades. The USGS plays a key role in using “ecosystem-based adaptive management, which will provide science to improve the efficiency and accountability of Chesapeake Bay Program activities” (Phillips, 2011). Each year USGS Chesapeake Bay studies produce published research, monitoring data, and models addressing aspects of bay restoration such as, but not limited to, fish health, water quality, land-cover change, and habitat loss. The USGS is responsible for collaborating and sharing this information with other Federal agencies and partners as described under the President’s Executive Order 13508—Strategy for Protecting and Restoring the Chesapeake Bay Watershed signed by President Obama in 2009. Historically, the USGS Chesapeake Bay studies have relied on national USGS databases to store only major nationally available sources of data such as streamflow and water-quality data collected through local monitoring programs and projects, leaving a multitude of other important project data out of the data management process. This practice has led to inefficient methods of finding Chesapeake Bay studies data and underutilization of data resources. Data management by definition is “the business functions that develop and execute plans, policies, practices and projects that acquire, control, protect, deliver and enhance the value of data and information.” (Mosley, 2008a). In other words, data management is a way to preserve, integrate, and share data to address the needs of the Chesapeake Bay studies to better

  12. The Antarctic Master Directory - a fundamental data management element for the International Polar Year 2007-2008

    Science.gov (United States)

    Scharfen, G.; Bauer, R.

    2004-12-01

    A successful International Polar Year (IPY) in 2007-2008 will extend the scientific spirit of international collaboration and exploration first undertaken in earlier IPYs and the 1957/58 International Geophysical Year (IGY) to the current era of advanced collection and analysis technology. The IGY not only led to a number of important scientific achievements; it also established an enduring data system - the World Data Centers - which continues today. Effective utilization of the vast arrays of data which will result from the coming IPY will challenge data managers and scientists alike. Coordinating the collection, assembly, archival and international exchange of disparate and voluminous data sets requires advance planning and the involvement of the relevant science agencies and data managers to utilize and extend existing capabilities. The IPY Planning Group has identified key objectives indicating that data management is an essential part of the IPY planning process including: - Ensure data collected under the IPY are made available in an open and timely manner - Intensify the recovery of relevant historical data and ensure that these also are made openly available - Develop and embrace new technological and logistical capabilities The Scientific Committee on Antarctic Research (SCAR) and Council of Managers of National Antarctic Programs (COMNAP) have established the Joint Committee on Antarctic Data Management (JCADM) to develop the Antarctic Master Directory (AMD) to enable scientists to find and access the data sets collected by more than 22 countries in the Antarctic. Incorporating concepts developed as part of the AMD and extending them to cover the scope of the IPY is an important part of a successful IPY data management program. This paper identifies major aspects of the AMD and how it can serve the IPY.

  13. Data Access, Interoperability and Sustainability: Key Challenges for the Evolution of Science Capabilities

    Science.gov (United States)

    Walton, A. L.

    2015-12-01

    In 2016, the National Science Foundation (NSF) will support a portfolio of activities and investments focused upon challenges in data access, interoperability, and sustainability. These topics are fundamental to science questions of increasing complexity that require multidisciplinary approaches and expertise. Progress has become tractable because of (and sometimes complicated by) unprecedented growth in data (both simulations and observations) and rapid advances in technology (such as instrumentation in all aspects of the discovery process, together with ubiquitous cyberinfrastructure to connect, compute, visualize, store, and discover). The goal is an evolution of capabilities for the research community based on these investments, scientific priorities, technology advances, and policies. Examples from multiple NSF directorates, including investments by the Advanced Cyberinfrastructure Division, are aimed at these challenges and can provide the geosciences research community with models and opportunities for participation. Implications for the future are highlighted, along with the importance of continued community engagement on key issues.

  14. Nanoscale control of energy and matter: challenges and opportunities for plasma science

    International Nuclear Information System (INIS)

    Ostrikov, Kostya

    2013-01-01

    Multidisciplinary challenges and opportunities in the ultimate ability to achieve nanoscale control of energy and matter are discussed using the example of Plasma Nanoscience. This is an emerging multidisciplinary research field at the cutting edge of a large number of disciplines including but not limited to physics and chemistry of plasmas and gas discharges, materials science, surface science, nanoscience and nanotechnology, solid state physics, space physics and astrophysics, photonics, optics, plasmonics, spintronics, quantum information, physical chemistry, biomedical sciences and related engineering subjects. The origin, progress and future perspectives of this research field, driven by global scientific and societal challenges, are examined. The future potential of Plasma Nanoscience to remain a highly topical area in the global research and technological agenda in the Age of Fundamental-Level Control for a Sustainable Future is assessed using a framework of the five Grand Challenges for Basic Energy Sciences recently mapped by the US Department of Energy. It is concluded that the ongoing research is very relevant and is expected to substantially expand to competitively contribute to the solution of all of these Grand Challenges. The approach to control energy and matter at nano- and subnanoscales is based on identifying the prevailing carriers and transfer mechanisms of the energy and matter at the spatial and temporal scales that are most relevant to any particular nanofabrication process. Strong emphasis is placed on the competitive edge of plasma-based nanotechnology in applications related to the major socio-economic issues (energy, food, water, health and environment) that are crucial for a sustainable development of humankind. Several important emerging topics, opportunities and multidisciplinary synergies for the Plasma Nanoscience are highlighted. The main nanosafety issues are also discussed and the environment- and human health

  15. Challenges of Virtual and Open Distance Science Teacher Education in Zimbabwe

    OpenAIRE

    Vongai Mpofu; Tendai Samukange; Lovemore M Kusure; Tinoidzwa M Zinyandu; Clever Denhere; Nyakotyo Huggins; Chingombe Wiseman; Shakespear Ndlovu; Rennias Chiveya; Monica Matavire; Leckson Mukavhi; Isaac Gwizangwe; Elliot Magombe; Munyaradzi Magomelo; Fungai Sithole

    2012-01-01

    This paper reports on a study of the implementation of science teacher education through virtual and open distance learning in the Mashonaland Central Province, Zimbabwe. The study provides insight into challenges faced by students and lecturers on inception of the program at four centres. Data was collected from completed evaluation survey forms of forty-two lecturers who were directly involved at the launch of the program and in-depth interviews. Qualitative data analysis revealed that the ...

  16. Opportunities and challenges of big data for the social sciences: The case of genomic data.

    Science.gov (United States)

    Liu, Hexuan; Guo, Guang

    2016-09-01

    In this paper, we draw attention to one unique and valuable source of big data, genomic data, by demonstrating the opportunities they provide to social scientists. We discuss different types of large-scale genomic data and recent advances in statistical methods and computational infrastructure used to address challenges in managing and analyzing such data. We highlight how these data and methods can be used to benefit social science research. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Emerging Challenges and Opportunities for Education and Research in Weed Science

    Directory of Open Access Journals (Sweden)

    Bhagirath S. Chauhan

    2017-09-01

    Full Text Available In modern agriculture, with more emphasis on high input systems, weed problems are likely to increase and become more complex. With heightened awareness of adverse effects of herbicide residues on human health and the environment and the evolution of herbicide-resistant weed biotypes, a significant focus within weed science has now shifted to the development of eco-friendly technologies with reduced reliance on herbicides. Further, with the large-scale adoption of herbicide-resistant crops, and uncertain climatic optima under climate change, the problems for weed science have become multi-faceted. To handle these complex weed problems, a holistic line of action with multi-disciplinary approaches is required, including adjustments to technology, management practices, and legislation. Improved knowledge of weed ecology, biology, genetics, and molecular biology is essential for developing sustainable weed control practices. Additionally, judicious use of advanced technologies, such as site-specific weed management systems and decision support modeling, will play a significant role in reducing costs associated with weed control. Further, effective linkages between farmers and weed researchers will be necessary to facilitate the adoption of technological developments. To meet these challenges, priorities in research need to be determined and the education system for weed science needs to be reoriented. In respect of the latter imperative, closer collaboration between weed scientists and other disciplines can help in defining and solving the complex weed management challenges of the 21st century. This consensus will provide more versatile and diverse approaches to innovative teaching and training practices, which will be needed to prepare future weed science graduates who are capable of handling the anticipated challenges facing weed science in contemporary agriculture. To build this capacity, mobilizing additional funding for both weed research and

  18. Principles of data management facilitating information sharing

    CERN Document Server

    Gordon, Keith

    2013-01-01

    Data is a valuable corporate asset and its effective management can be vital to success. This professional guide covers all the key areas of data management, including database development and corporate data modelling. The new edition covers web technology and its relation to databases and includes material on the management of master data.

  19. Oceanographic data management - A national perspective

    Digital Repository Service at National Institute of Oceanography (India)

    Pankajakshan, T.

    data is examined. The CMD acts as a 'single window' facility to inform the end-users about the national data warehouse. The issues addressed in the context of the oceanographic data management are common for other geophysical parameters as well...

  20. Principles of Data Management Facilitating Information Sharing

    CERN Document Server

    Gordon, Keith

    2007-01-01

    Organisations increasingly view data as a valuable corporate asset and its effective management can be vital to success. This professional guide covers all the key areas including database development, data quality and corporate data modelling. It provides the knowledge and techniques required to successfully implement the data management function.

  1. Sensor Data Management with Probabilistic Models

    NARCIS (Netherlands)

    Evers, S.

    2009-01-01

    The anticipated 'sensing environments' of the near future pose new requirements to the data management systems that mediate between sensor data supply and demand sides. We identify and investigate one of them: the need to deal with the inherent uncertainty in sensor data due to measurement noise,
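
    The record above argues that sensor data management must represent measurement uncertainty explicitly. As a generic, hedged illustration of one common way to do this (not the specific probabilistic models developed in the cited work), each reading below is stored as a Gaussian (mean, variance) and two independent readings of the same quantity are fused by inverse-variance weighting.

    # Minimal sketch: store sensor readings with explicit uncertainty and fuse them.
    from dataclasses import dataclass

    @dataclass
    class Reading:
        mean: float      # measured value
        variance: float  # measurement noise (sigma squared)

    def fuse(a: Reading, b: Reading) -> Reading:
        """Combine two independent noisy readings of the same quantity."""
        w_a, w_b = 1.0 / a.variance, 1.0 / b.variance
        variance = 1.0 / (w_a + w_b)
        mean = variance * (w_a * a.mean + w_b * b.mean)
        return Reading(mean, variance)

    # Two temperature sensors observing the same room (made-up numbers);
    # the fused estimate is pulled toward the more precise sensor.
    print(fuse(Reading(21.3, 0.4), Reading(20.8, 0.1)))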

  2. Adaptable data management for systems biology investigations

    Directory of Open Access Journals (Sweden)

    Burdick David

    2009-03-01

    Full Text Available Abstract Background: Within research each experiment is different, the focus changes and the data is generated from a continually evolving barrage of technologies. There is a continual introduction of new techniques whose usage ranges from in-house protocols through to high-throughput instrumentation. To support these requirements data management systems are needed that can be rapidly built and readily adapted for new usage. Results: The adaptable data management system discussed is designed to support the seamless mining and analysis of biological experiment data that is commonly used in systems biology (e.g. ChIP-chip, gene expression, proteomics, imaging, flow cytometry). We use different content graphs to represent different views upon the data. These views are designed for different roles: equipment specific views are used to gather instrumentation information; data processing oriented views are provided to enable the rapid development of analysis applications; and research project specific views are used to organize information for individual research experiments. This management system allows for both the rapid introduction of new types of information and the evolution of the knowledge it represents. Conclusion: Data management is an important aspect of any research enterprise. It is the foundation on which most applications are built, and must be easily extended to serve new functionality for new scientific areas. We have found that adopting a three-tier architecture for data management, built around distributed standardized content repositories, allows us to rapidly develop new applications to support a diverse user community.
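
    The record above describes exposing one shared store of experiment records through role-specific views (equipment, data processing, research project). The toy sketch below illustrates that idea only; the field names are invented for the example and are not those of the described system.

    # Toy illustration: one shared record store, projected into role-specific views.
    records = [
        {"id": "exp-001", "instrument": "flow-cytometer-2", "calibration": "2009-02-14",
         "raw_file": "exp-001.fcs", "pipeline": "gating-v3",
         "project": "T-cell study", "owner": "lab-A"},
    ]

    def equipment_view(recs):
        """What an instrument technician needs: which device, when calibrated."""
        return [{k: r[k] for k in ("id", "instrument", "calibration")} for r in recs]

    def processing_view(recs):
        """What an analysis application needs: where the data is, how to process it."""
        return [{k: r[k] for k in ("id", "raw_file", "pipeline")} for r in recs]

    def project_view(recs):
        """What a project lead needs: which experiments belong to which study."""
        return [{k: r[k] for k in ("id", "project", "owner")} for r in recs]

    for view in (equipment_view, processing_view, project_view):
        print(view.__name__, view(records))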

  3. Data Management in the EHDI System

    Science.gov (United States)

    Bradham, Tamala S.; Hoffman, Jeff; Houston, K. Todd

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that examined 12 areas within EHDI programs. Forty-seven coordinators listed 242 items in the area of data management, and themes were identified in each category. A threats, opportunities,…

  4. A Graduate Class in Research Data Management

    Science.gov (United States)

    Schmidt, Lawrence; Holles, Joseph

    2018-01-01

    A graduate elective course in Research Data Management (RDM) was developed and taught as a team by a research librarian and a research active faculty member. Coteaching allowed each instructor to contribute knowledge in their specialty areas. The goal of this course was to provide graduate students the RDM knowledge necessary to efficiently and…

  5. ITS data management system : year one activities

    Science.gov (United States)

    1997-08-01

    This report documents research conducted in the development of an ITS data management system, hereafter referred to as ITS DataLink. The objective of the ITS DataLink system is to retain, manage, share, and analyze ITS data for a variety of tra...

  6. Facing tomorrow's challenges: U.S. Geological Survey science in the decade 2007-2017

    Science.gov (United States)

    ,

    2007-01-01

    In order for the U.S. Geological Survey (USGS) to respond to evolving national and global priorities, it must periodically reflect on, and optimize, its strategic directions. This report is the first comprehensive science strategy since the early 1990s to examine critically major USGS science goals and priorities. The development of this science strategy comes at a time of global trends and rapidly evolving societal needs that pose important natural-science challenges. The emergence of a global economy affects the demand for all resources. The last decade has witnessed the emergence of a new model for managing Federal lands: ecosystem-based management. The U.S. Climate Change Science Program predicts that the next few decades will see rapid changes in the Nation's and the Earth's environment. Finally, the natural environment continues to pose risks to society in the form of volcanoes, earthquakes, wildland fires, floods, droughts, invasive species, variable and changing climate, and natural and anthropogenic toxins, as well as animal-borne diseases that affect humans. The use of, and competition for, natural resources on the global scale, and natural threats to those resources, have the potential to impact the Nation's ability to sustain its economy, national security, quality of life, and natural environment. Responding to these national priorities and global trends requires a science strategy that not only builds on existing USGS strengths and partnerships but also demands the innovation made possible by integrating the full breadth and depth of USGS capabilities. The USGS chooses to go forward in the science directions proposed here because the societal issues addressed by these science directions represent major challenges for the Nation's future and for the stewards of Federal lands, both onshore and offshore. The six science directions proposed in this science strategy are listed as follows. The ecosystems strategy is listed first because it has a dual nature

  7. Effect of the challenger experience on elementary children's attitudes to science

    Science.gov (United States)

    Jarvis, Tina; Pell, Anthony

    2002-12-01

    This research explored how the Challenger experience influenced over 655 elementary boys' and girls' general attitudes to science and space during the 5 months after their visit by examining their responses to four different attitude scales. These were administered to the 10- to 11-year-olds immediately before and after the Challenger experience as well as 2 and 5 months later. Knowledge tests were also administered before and after the visit. A sample of children completed an existing measure of anxiety. Although there were mainly positive outcomes immediately after the Challenger experience, there were some negative effects. There were also noticeable differences between boys and girls. Some 24% of pupils were inspired to become scientists. There was also less fear of space travel with a greater appreciation of the use of science to protect the planet after the visit. Most girls improved and maintained their attitudes toward science in society. A sizeable number of pupils were relatively unaffected by the experience and there was a significant negative effect on a small group of anxious girls. There are indications that previsit preparation and careful choice of roles during the simulation are important.

  8. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  9. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    Science.gov (United States)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  10. Fighting A Strong Headwind: Challenges in Communicating The Science of Climate Change

    Science.gov (United States)

    Mann, M. E.

    2008-12-01

    Communicating science to the public is an intrinsic challenge to begin with. An effective communicator must find ways to translate often technical and complex scientific findings for consumption by an audience unfamiliar with the basic tools and lexicon that scientists themselves take for granted. The challenge is made all the more difficult still when the science has implications for public policy, and the scientists face attack by institutions who judge themselves to be at threat by the implications of scientific findings. Such areas of science include (but certainly are not limited to) evolution, stem cell research, environmental health, and the subject of this talk--climate change. In each of these areas, a highly organized, well funded effort has been mounted to attack the science and the scientists themselves. These attacks are rarely fought in legitimate scientific circles such as the peer-reviewed scientific literature or other scholarly venues, but rather through rhetorically-aimed efforts delivered by media outlets aligned with the views of the attackers, and by politicians and groups closely aligned with special interests. I will discuss various approaches to combating such attacks, drawing upon my own experiences in the public arena with regard to the scientific discourse on climate change.

  11. NASA's Earth Science Flight Program Meets the Challenges of Today and Tomorrow

    Science.gov (United States)

    Ianson, Eric E.

    2016-01-01

    NASA's Earth science flight program is a dynamic undertaking that consists of a large fleet of operating satellites, an array of satellite and instrument projects in various stages of development, a robust airborne science program, and a massive data archiving and distribution system. Each element of the flight program is complex and presents unique challenges. NASA builds upon its successes and learns from its setbacks to manage this evolving portfolio to meet NASA's Earth science objectives. NASA's fleet of 16 operating missions provides a wide range of scientific measurements made from dedicated Earth science satellites and from instruments mounted to the International Space Station. For operational missions, the program must address issues such as aging satellites operating well beyond their prime missions, constellation flying, and collision avoidance with other spacecraft and orbital debris. Projects in development are divided into two broad categories: systematic missions and pathfinders. The Earth Systematic Missions (ESM) include a broad range of multi-disciplinary Earth-observing research satellite missions aimed at understanding the Earth system and its response to natural and human-induced forces and changes. Understanding these forces will help determine how to predict future changes, and how to mitigate or adapt to these changes. The Earth System Science Pathfinder (ESSP) program provides frequent, regular, competitively selected Earth science research opportunities that accommodate new and emerging scientific priorities and measurement capabilities. This results in a series of relatively low-cost, small-sized investigations and missions. Principal investigators whose scientific objectives support a variety of studies lead these missions, including studies of the atmosphere, oceans, land surface, polar ice regions, or solid Earth. This portfolio of missions and investigations provides opportunity for investment in innovative Earth science that enhances

  12. The questions of scientific literacy and the challenges for contemporary science teaching: An ecological perspective

    Science.gov (United States)

    Kim, Mijung

    This study began with questions about how science education can bring forth humanity and ethics to reflect increasing concerns about controversial issues of science and technology in contemporary society. Discussing and highlighting binary epistemological assumptions in science education, the study suggests embodied science learning with human subjectivity and integrity between knowledge and practice. The study questions (a) students' understandings of the relationships between STSE and their everyday lifeworld and (b) the challenges of cultivating scientific literacy through STSE teaching. In seeking to understand something about the pedagogical enactment of embodied scientific literacy that emphasizes the harmony of children's knowledges and their lifeworlds, this study employs a mindful pedagogy of hermeneutics. The intro- and intra-dialogical modes of hermeneutic understanding investigate the pedagogical relationship of parts (research texts of students, curriculum, and social milieu) and the whole (STSE teaching in contemporary time and place). The research was conducted with 86 Korean sixth graders at a public school in Seoul, Korea in 2003. Mixed methods were utilized for data collection including a survey questionnaire, a drawing activity, interviews, children's reflective writing, and classroom teaching and observation. The research findings suggest the challenges and possibilities of STSE teaching as follows: (a) children's knowledge separated from everyday practice and living, (b) children's conflicting ideas between ecological/ethical aspects and modernist values, (c) possibilities of embodied knowing in children's practice, and (d) teachers' pedagogical dilemmas in STSE teaching based on the researcher's experiences and reflection throughout teaching practice. As further discussion, this study suggests an ecological paradigm for science curriculum and teaching as a potential framework to cultivate participatory scientific literacy for citizenship in

  13. Enhancing watershed research capacity: the role of data management

    Science.gov (United States)

    Water resources are under growing pressure globally, and in the face of projected climate change, changes in precipitation frequency and intensity, evapotranspiration, runoff, and snowmelt pose severe societal challenges. Interdisciplinary environmental research across natural and social sciences to...

  14. Research data management in academic institutions: A scoping review.

    Directory of Open Access Journals (Sweden)

    Laure Perrier

    Full Text Available The purpose of this study is to describe the volume, topics, and methodological nature of the existing research literature on research data management in academic institutions. We conducted a scoping review by searching forty literature databases encompassing a broad range of disciplines from inception to April 2016. We included all study types and extracted data on study design, discipline, data collection tools, and phase of the research data lifecycle. We included 301 articles plus 10 companion reports after screening 13,002 titles and abstracts and 654 full-text articles. Most articles (85%) were published from 2010 onwards and conducted within the sciences (86%). More than three-quarters of the articles (78%) reported methods that included interviews, cross-sectional, or case studies. Most articles (68%) included the Giving Access to Data phase of the UK Data Archive Research Data Lifecycle that examines activities such as sharing data. When studies were grouped into five dominant groupings (Stakeholder, Data, Library, Tool/Device, and Publication), data quality emerged as an integral element. Most studies relied on self-reports (interviews, surveys) or accounts from an observer (case studies) and we found few studies that collected empirical evidence on activities amongst data producers, particularly those examining the impact of research data management interventions. As well, fewer studies examined research data management at the early phases of research projects. The quality of all research outputs needs attention, from the application of best practices in research data management studies, to data producers depositing data in repositories for long-term use.

  15. Delivering and Incentivizing Data Management Education to Geoscience Researchers

    Science.gov (United States)

    Knuth, S. L.; Johnson, A. M.; Hauser, T.

    2015-12-01

    Good data management practices are imperative for all researchers who want to ensure the usability of their research data. For geoscientists, this is particularly important due to the vast amount of data collected as part of field work, model studies, or other efforts. While many geoscientists want to ensure their data is appropriately maintained, they are generally not trained in good data management, which, realistically, has a much lower priority in the "publish or perish" cycle of research. Many scientists learn programming or advanced computational and data skills during the process of developing their research. With the amount of digital data being collected in the sciences increasing, and the interest federal funding agencies are taking in ensuring data collected is well maintained, there is pressure to quickly and properly educate and train geoscientists on its management. At the University of Colorado Boulder (CU-Boulder), Research Data Services (RDS) has developed several educational and outreach activities centered on training researchers and students in ways to properly manage their data, including "boot camps", workshops, individual consultations, and seminars with topics of interest to the CU-Boulder community. Part of this effort is centered on incentivizing researchers to learn these tools and practices despite their busy schedules. Much of this incentive has come through small grant competitions at the university level. The two competitions most relevant are a new "Best Digital Data Management Plan" competition, awarding unrestricted funds to the best plan submitted in each of five categories, and an added data management plan requirement to an existing faculty competition. This presentation will focus on examples of user outreach and educational opportunities given to researchers at CU-Boulder, incentives given to the researchers to participate, and assessment of the impact of these activities.

  16. Natural Hazard Resilience - A Large-scale Transdisciplinary "National Science Challenge" for New Zealand

    Science.gov (United States)

    Cronin, S. J.

    2017-12-01

    The National Science Challenges are initiatives to address the most important public science issues that face New Zealand with long-term funding and the combined strength of a coordinated science sector behind them. Eleven major topics are tackled, across our human, natural and built environments. In the "Resilience Challenge" we address New Zealand's natural hazards. Alongside severe meteorological threats, New Zealand also faces one of the highest levels of earthquake and volcanic hazard in the world. Resilience is a hotly discussed concept; here, we take the view: Resilience encapsulates the features of a system to anticipate threats, acknowledge there will be impacts (no matter how prepared we are), quickly pick up the pieces, as well as learn and adapt from the experience to better absorb and rebound from future shocks. Our research must encompass innovation in building and lifelines engineering, planning and regulation, emergency management practice, alongside understanding how our natural hazard systems work, how we monitor them and how our communities/governance/industries can be influenced and encouraged (e.g., via economic incentives) to develop and implement resilience practice. This is a complex interwoven mix of areas and is best addressed through case-study areas where researchers and the users of the research can jointly identify problems and co-develop science solutions. I will highlight some of the strengths and weaknesses of this coordinated approach to an all-hazard, all-country problem, using the example of the Resilience Challenge approach after its first two and a half years of operation. Key issues include balancing investment into high-profile (and often high consequence), but rare hazards against the frequent "monthly" hazards that collectively occupy regional and local governance. Also, it is clear that despite increasingly sophisticated hazard and hazard mitigation knowledge being generated in engineering and social areas, a range of policy

  17. How commercial and "violent" video games can promote culturally sensitive science learning: some questions and challenges

    Science.gov (United States)

    Kwah, Helen

    2012-12-01

    In their paper, Muñoz and El-Hani propose to bring video games into science classrooms to promote culturally sensitive ethics and citizenship education. Instead of bringing "educational" games, Muñoz and El-Hani take a more creative route and include games such as Fallout 3® precisely because they are popular and they reproduce ideological and violent representations of gender, race, class, nationality, science and technology. However, there are many questions that arise in bringing these commercial video games into science classrooms, including the questions of how students' capacities for critical reflection can be facilitated, whether traditional science teachers can take on the role of using such games in their classrooms, and which video games would be most appropriate to use. In this response, I raise these questions and consider some of the challenges in order to further the possibility of implementing Muñoz and El-Hani's creative proposal for generating culturally sensitive science classrooms.

  18. Ex Machina: Analytical platforms, Law and the Challenges of Computational Legal Science

    Directory of Open Access Journals (Sweden)

    Nicola Lettieri

    2018-04-01

    Full Text Available Over the years, computation has become a fundamental part of scientific practice in several research fields that go far beyond the boundaries of the natural sciences. Data mining, machine learning, simulations and other computational methods lie today at the heart of the scientific endeavour in a growing number of social research areas, from anthropology to economics. In this scenario, an increasingly important role is played by analytical platforms: integrated environments allowing researchers to experiment with cutting-edge data-driven and computation-intensive analyses. The paper discusses the appearance of such tools in the emerging field of computational legal science. After a general introduction to the impact of computational methods on both natural and social sciences, we describe the concept and the features of an analytical platform exploring innovative cross-methodological approaches to the academic and investigative study of crime. Stemming from an ongoing project involving researchers from law, computer science and bioinformatics, the initiative is presented and discussed as an opportunity to raise a debate about the future of legal scholarship and, within it, about the challenges of computational legal science.

  19. The opportunities and challenges of guided inquiry science for students with special needs

    Science.gov (United States)

    Miller, Marianne

    Research in science education has been conducted with various goals for instruction. Four outcomes identified include: immediate and delayed recall, literal comprehension, science skills and processes, and conceptual understanding. The promise of developing important thinking skills exists for all students if science instruction is designed to teach students the products of science and the principled process of inquiry. Guided inquiry science seeks to develop conceptual understanding through the pursuit of meaningful questions, using scientific problem solving to conduct investigations that are thoughtfully generated and evaluated. Using a social constructivist perspective, this study examines the learning experiences of four students, identified by their teachers as learning disabled or underachieving. Four case studies are presented of the students' participation in a guided inquiry investigation of the behavior of light. Measures of conceptual understanding included pre- and post-instruction assessments, interviews, journal writing, videotapes, and fieldnotes. All four students demonstrated improved conceptual understanding of light. Five patterns of relationships influenced the development of the students' thinking. First, differences in the culture of the two classrooms altered the learning environment. Second, the nature of teacher interaction with the target students affected conceptual understanding. Third, interactions with peers modified the learning experiences for the identified students. Fourth, the conceptual and procedural complexity of the tasks increased the tendency for the students to lose focus. Finally, the literacy requirements of the work were challenging for these students.

  20. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    Science.gov (United States)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high-resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME anticipates producing over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication, will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics and implementing server-side analysis to avoid moving large datasets, will be presented.
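
    The server-side reduction idea mentioned above can be illustrated with a short, hedged sketch (this is not the ACME or TECA tooling; the file pattern, variable name, and paths below are hypothetical, and lazy multi-file opening assumes dask is available alongside xarray): the analysis runs where the data reside, and only the small derived product crosses the network.

```python
# Hedged sketch of server-side reduction (not ACME/TECA): compute a small
# derived product next to the data, then transfer only that product.
# File pattern, variable name, and output path are hypothetical; lazy
# multi-file opening assumes dask is installed alongside xarray.
import xarray as xr


def reduce_on_server(pattern="output/atm_6hourly_*.nc",
                     var="PRECT",
                     out_path="derived/prect_daily_max.nc"):
    """Open high-frequency files lazily, compute a daily maximum of one
    field, and write a compact derived dataset for transfer."""
    ds = xr.open_mfdataset(pattern, combine="by_coords", parallel=True)
    daily_max = ds[var].resample(time="1D").max()
    daily_max.to_netcdf(out_path)  # only this small file leaves the site
    return out_path


if __name__ == "__main__":
    print(reduce_on_server())
```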

  1. DIRAC reliable data management for LHCb

    CERN Document Server

    Smith, A C

    2008-01-01

    DIRAC, LHCb's Grid Workload and Data Management System, utilizes WLCG resources and middleware components to perform distributed computing tasks satisfying LHCb's Computing Model. The Data Management System (DMS) handles data transfer and data access within LHCb. Its scope ranges from the output of the LHCb Online system to Grid-enabled storage for all data types. It supports metadata for these files in replica and bookkeeping catalogues, allowing dataset selection and localization. The DMS controls the movement of files in a redundant fashion whilst providing utilities for accessing all metadata. To do these tasks effectively the DMS requires complete self integrity between its components and external physical storage. The DMS provides highly redundant management of all LHCb data to leverage available storage resources and to manage transient errors in underlying services. It provides data driven and reliable distribution of files as well as reliable job output upload, utilizing VO Boxes at LHCb Tier1 sites ...
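
    As a rough illustration of the replica-catalogue idea described above (this is not DIRAC's actual API or schema; all names are invented for the sketch), a logical file name maps to several physical replicas, and access falls back to another replica when one storage element fails.

```python
# Illustrative only -- not DIRAC's API. One logical file name (LFN) maps to
# several physical replicas; access falls back to the next replica when a
# storage element is temporarily unavailable.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class ReplicaCatalogue:
    replicas: Dict[str, List[str]] = field(default_factory=dict)

    def register(self, lfn: str, physical_url: str) -> None:
        self.replicas.setdefault(lfn, []).append(physical_url)

    def resolve(self, lfn: str) -> List[str]:
        return list(self.replicas.get(lfn, []))


def fetch_with_fallback(catalogue: ReplicaCatalogue, lfn: str,
                        download: Callable[[str], bytes]) -> bytes:
    """Try each registered replica in turn; `download` is any callable that
    raises on failure (e.g. a wrapper around a real transfer tool)."""
    errors = []
    for url in catalogue.resolve(lfn):
        try:
            return download(url)
        except Exception as exc:  # transient storage or network errors
            errors.append((url, exc))
    raise RuntimeError(f"all replicas failed for {lfn}: {errors}")
```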

  2. SRL-NURE hydrogeochemical data management system

    International Nuclear Information System (INIS)

    Maddox, J.H.; Wren, H.F.; Honeck, H.C.; Tharin, C.R.; Howard, M.D.

    1976-07-01

    A data management system was developed to store and retrieve all physical, chemical, and geological data collected for the NURE Hydrogeochemical Reconnaissance program by the Savannah River Laboratory (SRL). In 1975, SRL accepted responsibility for hydrogeochemical reconnaissance of twenty-five states in the eastern United States as part of the National Uranium Resource Evaluation (NURE) program to identify areas favorable for uranium exploration. The SRL-NURE hydrogeochemical data management system is written in FORTRAN IV for an IBM System 360/195 computer. The system is designed to accommodate changes in the types of data collected about a sampling site and different numbers of samples taken at each site. The data are accepted as they become available and are combined with relevant data already in the system.
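
    The core design requirement described here - sites with varying numbers of samples, and samples with varying sets of measured attributes - can be sketched in a modern setting as a small relational schema (a hypothetical illustration only, not the original FORTRAN IV/IBM 360 implementation; table and column names are invented).

```python
# Hypothetical modern sketch (not the original FORTRAN IV system): sites can
# carry different numbers of samples, and samples different sets of measured
# attributes, by keeping measurements in their own narrow table.
import sqlite3

schema = """
CREATE TABLE site        (site_id INTEGER PRIMARY KEY, latitude REAL, longitude REAL);
CREATE TABLE sample      (sample_id INTEGER PRIMARY KEY,
                          site_id   INTEGER REFERENCES site(site_id),
                          medium    TEXT);              -- e.g. 'stream water'
CREATE TABLE measurement (sample_id INTEGER REFERENCES sample(sample_id),
                          analyte   TEXT,               -- e.g. 'U', 'pH'
                          value     REAL,
                          unit      TEXT);
"""

con = sqlite3.connect(":memory:")
con.executescript(schema)
con.execute("INSERT INTO site VALUES (1, 33.27, -81.74)")
con.execute("INSERT INTO sample VALUES (10, 1, 'stream water')")
con.executemany("INSERT INTO measurement VALUES (?, ?, ?, ?)",
                [(10, "U", 0.8, "ppb"), (10, "pH", 6.9, "")])
print(con.execute("SELECT analyte, value, unit FROM measurement").fetchall())
```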

  3. App-lifying USGS Earth Science Data: Engaging the public through Challenge.gov

    Science.gov (United States)

    Frame, M. T.

    2013-12-01

    With the goal of promoting innovative use and applications of USGS data, USGS Core Science Analytics and Synthesis (CSAS) launched the first USGS Challenge: App-lifying USGS Earth Science Data. While initiated before the recent Office of Science and Technology Policy memorandum 'Increasing Access to the Results of Federally Funded Scientific Research', our challenge focused on one of the core tenets of the memorandum: expanding the discoverability, accessibility and usability of CSAS data. From January 9 to April 1, 2013, we invited developers, information scientists, biologists/ecologists, and scientific data visualization specialists to create applications for selected USGS datasets. Identifying new, innovative ways to represent, apply, and make these data available is a high priority for our leadership. To help boost innovation, the only constraint we placed on challengers was that they must incorporate at least one of the identified datasets in their application. Winners were selected based on relevance to the USGS and CSAS missions, innovation in design, and overall ease of use of the application. The winner for Best Overall App was TaxaViewer by the rOpenSci group. TaxaViewer is a Web interface to a mashup of data from the USGS-sponsored interagency Integrated Taxonomic Information System (ITIS) and other data from the Phylotastic taxonomic name service, the Global Invasive Species Database, Phylomatic, and the Global Biodiversity Information Facility. The Popular Choice App award, selected through a public vote on the submissions, went to the Species Comparison Tool by Kimberly Sparks of Raleigh, N.C., which allows users to explore the USGS Gap Analysis Program habitat distribution and/or range of two species concurrently. The application also incorporates ITIS data and provides external links to NatureServe species information. Our results indicated that running a challenge was an effective method for promoting our data products and therefore improving

  4. Ensuring on-time quality data management deliverables from global clinical data management teams

    Directory of Open Access Journals (Sweden)

    Zia Haque

    2010-01-01

    Full Text Available The growing emphasis on off-site and off-shore clinical data management activities mandates a paramount need for adequate solutions geared toward on-time, quality deliverables. The author has been leading large teams that have been involved in successful global clinical data management endeavors. While each study scenario is unique and has to be approached as such, there are several elements in defining strategy and team structure in global clinical data management that can be applied universally. In this article, key roles, practices, and high-level procedures are laid out as a road map to ensure success with the model.

  5. Development of geophysical data management system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Tai-Sup; Lee, Sang-Kyu; Gu, Sung-Bon [Korea Institute of Geology Mining and Materials, Taejon (KR)] (and others)

    1999-12-01

    (1) Development of a complete geophysical database system for data management under a client/server (C/S) environment. (2) Development of a database system, accessible over the Internet, for general users who have no special knowledge of databases. (3) Operation of the Web service for general users. (4) Development of a stand-alone database system for small-scale research groups such as colleges and engineering consulting firms. (author). 15 refs.

  6. Adding intelligence to scientific data management

    Science.gov (United States)

    Campbell, William J.; Short, Nicholas M., Jr.; Treinish, Lloyd A.

    1989-01-01

    NASA's plans to solve some of the problems of handling large-scale scientific databases by turning to artificial intelligence (AI) are discussed. The growth of the information glut and the ways that AI can help alleviate the resulting problems are reviewed. The employment of the Intelligent User Interface prototype, in which the user generates a natural language query with the assistance of the system, is examined. Spatial data management, scientific data visualization, and data fusion are discussed.

  7. Assessing XML Data Management with XMark

    OpenAIRE

    Schmidt, A.R.; Waas, F.; Kersten, Martin; Carey, M.J.; Manolescu, I.; Busse, R.

    2002-01-01

    We discuss some of the experiences we gathered during the development and deployment of XMark, a tool to assess the infrastructure and performance of XML Data Management Systems. Since the appearance of the first XML database prototypes in research institutions and development labs, topics like validation, performance evaluation and optimization of XML query processors have received significant interest. The XMark benchmark follows a tradition in database research and provides a f...

  8. FOSS Tools for Research Data Management

    Science.gov (United States)

    Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim

    2017-04-01

    Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim to provide services supporting scientists to search, visualize and access data, to collaborate and exchange information, and to publish data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires support by appropriate tools integrated in these infrastructures. Different projects provide arbitrary solutions to manage research data. However, within two projects - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing and documenting research data, as well as for discovering, visualizing and downloading these data, on the basis of standardized services stimulated considerably by the enhanced data management approaches of Spatial Data Infrastructures (SDI). In order to fully exploit the potential of these developments for enhancing data management in the geosciences, the publication of software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the geosciences developed in two projects. A construction kit of FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In further projects, specific RDM platforms can then be set up much faster, customized to individual needs, and tools can be added at run-time.
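
    One of the standardized harvesting services such a FOSS stack commonly builds on is OAI-PMH. The following is a minimal, hedged sketch of harvesting record titles from an OAI-PMH endpoint (the endpoint URL is hypothetical, and this is not the SUMARIO or TERENO code).

```python
# Minimal harvesting sketch assuming an OAI-PMH endpoint; the base URL is
# hypothetical and this is not the SUMARIO/TERENO code. Only the first
# response page is read (no resumptionToken handling).
import requests
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"


def harvest_titles(base_url="https://example.org/oai"):
    resp = requests.get(base_url,
                        params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
                        timeout=30)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    for record in root.iter(f"{OAI}record"):
        for title in record.iter(f"{DC}title"):
            yield title.text


if __name__ == "__main__":
    for t in harvest_titles():
        print(t)
```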

  9. Challenges and strategies for effectively teaching the nature of science: A qualitative case study

    Science.gov (United States)

    Koehler, Catherine M.

    This year-long, qualitative case study examines two experienced high school biology teachers as they facilitated nature of science (NOS) understandings in their classrooms. This study explored three research questions: (1) In what ways do experienced teachers' conceptions of NOS evolve over one full year as a result of participating in a course that explicitly addresses NOS teaching and learning? (2) In what ways do experienced teachers' pedagogical practices evolve over one full year as a result of participating in a course that explicitly addresses NOS teaching and learning? and (3) What are the challenges facing experienced teachers in their attempts to implement NOS understandings in their high school science classrooms? This study was conducted in two parts. In Part I (fall 2004 semester), the participants were enrolled in a graduate course titled Teaching the Nature of Science, where they were (1) introduced to NOS, (2) introduced to a strategy, the Model for Teaching NOS (MTNOS), which helped them facilitate teaching NOS understandings through inquiry-based activities, and (3) engaged in "real" science activities that reinforced their conceptions of NOS. In Part II (spring 2005 semester), classroom observations were made to uncover how these teachers implemented inquiry-based activities emphasizing NOS understanding in their classrooms. Their conceptions of NOS were measured using the Views of the Nature of Science questionnaire. Results demonstrated that each teacher's conceptions of NOS shifted slightly during the course of the study, but, for one, this was not a permanent shift. Over the year, one teacher's pedagogical practices changed to include inquiry-based lessons using MTNOS; the other, although very amenable to using prepared inquiry-based lessons, did not change her pedagogical practices. Both reported similar challenges while facilitating NOS understanding. The most significant challenges included: (1) time management; (2) the perception that NOS was a

  10. [Source data management in clinical researches].

    Science.gov (United States)

    Ho, Effie; Yao, Chen; Zhang, Zi-bao; Liu, Yu-xiu

    2015-11-01

    Source data and its source documents are the foundation of clinical research. Proper source data management plays an essential role in compliance with regulatory and GCP requirements. Both paper and electronic source data co-exist in China. Due to the increasing use of electronic technology in the pharmaceutical and health care industries, electronic data sources are becoming an upcoming trend with clear advantages. To face new opportunities and to ensure data integrity, quality and traceability from source data to regulatory submission, this document demonstrates important concepts, principles and best practices for managing source data. It includes, but is not limited to: (1) important concepts of source data (e.g., source data originator, source data elements, source data identifier for audit trail, etc.); (2) various modalities of source data collection in paper and electronic methods (e.g., paper CRF, EDC, Patient Reported Outcomes/eCOA, etc.); (3) seven main principles recommended in the aspects of data collection, traceability, quality standards, access control, quality control, certified copies and security during source data management; (4) a life cycle from source data creation to obsolescence, used as an example to illustrate considerations in the implementation of source data management.
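
    The audit-trail principle behind the "source data identifier" concept can be illustrated with a toy sketch (not a validated GCP-compliant system; identifiers and roles below are invented): a source data element is never overwritten, and every correction is appended with who changed it, when, and why.

```python
# Toy sketch only (not a validated GCP system): a source data element keeps
# its full history; corrections are appended with who, when, and why, and the
# original value is never overwritten. Identifier format is invented.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class AuditEntry:
    value: str
    changed_by: str
    reason: str
    timestamp: datetime


@dataclass
class SourceDataElement:
    identifier: str                          # source data identifier
    history: List[AuditEntry] = field(default_factory=list)

    def record(self, value: str, changed_by: str, reason: str) -> None:
        self.history.append(
            AuditEntry(value, changed_by, reason, datetime.now(timezone.utc)))

    @property
    def current(self) -> str:
        return self.history[-1].value


element = SourceDataElement("SUBJ001.VS.SYSBP.VISIT2")
element.record("128", changed_by="site_nurse", reason="initial entry")
element.record("138", changed_by="site_nurse", reason="transcription error corrected")
print(element.current, len(element.history))  # latest value plus full trail
```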

  11. Challenging the Science Curriculum Paradigm: Teaching Primary Children Atomic-Molecular Theory

    Science.gov (United States)

    Haeusler, Carole; Donovan, Jennifer

    2017-11-01

    Solutions to global issues demand the involvement of scientists, yet concern exists about retention rates in science as students pass through school into university. Young children are curious about science, yet are considered incapable of grappling with abstract and microscopic concepts such as atoms, sub-atomic particles, molecules and DNA. School curricula for primary (elementary) aged children reflect this by limiting content to examining only what phenomena are, without providing any explanatory frameworks for how or why they occur. This research challenges the assumption that atomic-molecular theory is too difficult for young children, examining new ways of introducing atomic theory to 9-year-olds and seeking to verify their efficacy in producing genuine learning in the participants. Early results in three cases in different schools indicate that these novel methods fostered further interest in science, allowed diverse children to engage with and learn aspects of atomic theory, and satisfied the children's desire for intellectual challenge. Learning exceeded expectations, as demonstrated in the post-interview findings. Learning was also remarkably robust, as demonstrated in two schools 8 weeks after the intervention and, in one school, 1 year after their first exposure to ideas about atoms, elements and molecules.

  12. Broadening Participation in the Sciences within and from Africa: Purpose, Challenges, and Prospects.

    Science.gov (United States)

    Okeke, Iruka N; Babalola, Chinedum P; Byarugaba, Denis K; Djimde, Abdoulaye; Osoniyi, Omolaja R

    2017-01-01

    Many of Africa's challenges have scientific solutions, but there are fewer individuals engaged in scientific activity per capita on this continent than on any other. Only a handful of African scientists use their skills to capacity or are leaders in their disciplines. Underrepresentation of Africans in scientific practice, discourse, and decision making reduces the richness of intellectual contributions toward hard problems worldwide. This essay outlines challenges faced by teacher-scholars from sub-Saharan Africa as we build scientific expertise. Access to tertiary-level science is difficult and uneven across Africa, and the quality of training available varies from top-range to inadequate. Access to science higher education needs to increase, particularly for female students, first-generation literates, and rural populations. We make suggestions for collaborative initiatives involving stakeholders outside Africa and/or outside academia that could extend educational opportunities available to African students and increase the chance that Africa-based expertise is globally available. © 2017 I. N. Okeke et al. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  13. Challenges in materials science and possibilities in 3D and 4D characterization techniques

    International Nuclear Information System (INIS)

    Hansen, N.; Juul Jensen, D.; Nielsen, S.F.; Poulsen, H.F.; Ralph, B.

    2010-01-01

    The present day sees a global effort to develop new and advanced materials, and as an integral part of this endeavor a range of new characterization techniques is becoming available, which has led to significant breakthroughs in materials science and engineering. Within this broad scientific field, the symposium focuses on metals and on 3D and 4D characterization techniques using x-rays, neutrons and electrons. These techniques now allow characterization on a finer and finer scale and open up the analysis of dynamic behavior through real-time in-situ investigations. This means that techniques are now available by which key challenges in materials science can be addressed. The combination of techniques and challenges has been the guide for contributions to this year's symposium, and these proceedings show the successful result. The collection of papers demonstrates the many new possibilities in 3D and 4D characterization techniques, as well as the application of these techniques to important materials science and engineering themes, for example: evolution of structure and properties under thermal and mechanical loading and during annealing, phase transformations, and fracture/damage. The proceedings contain the 14 keynote and 34 contributed presentations of the symposium, covering the above key themes. (LN)

  14. Data management for early hearing detection and intervention in South Africa

    Directory of Open Access Journals (Sweden)

    Selvarani Moodley

    2017-06-01

    Full Text Available Introduction: Internationally, newborn hearing screening is becoming part of standard neonatal healthcare service guidelines for the implementation of early hearing detection and intervention (EHDI) initiatives, including screening, diagnosis, data management and intervention. Data management includes the processes of data collection and storage, as well as analysis and interpretation of data to guide the future planning, implementation and evaluation of EHDI programmes. There have been limited studies on data management in the South African EHDI context. Methods: The aim of this study was to determine the type of data management systems in use in South Africa and whether they allow for cross-disciplinary sharing and evaluation of the EHDI processes. A survey instrument on the management of EHDI data was developed and sent to HI HOPES referral agents in both public and private sectors. Results: A return rate of 80% was achieved, with 19 (59%) public sector and 13 (41%) private sector audiologists participating in the study. The data revealed that there was no uniform data management system in use nationally, and no consistent shared system within the public or private sectors. The largest proportion of respondents (44%) used a paper-based system for data recording. No institutions were using data management systems that enabled sharing of information with other medical professionals. Conclusion: Data management and tracking of the pathway from screening to diagnosis to intervention is necessary to ensure quality care and outcomes for children identified with hearing loss. International studies reveal the importance of effective implementation of data management systems; however, to date these have focussed on developed country contexts. Data management challenges identified in this study reflect international challenges as well as challenges unique to a developing country context.

  15. United States-Mexican Borderlands: Facing tomorrow's challenges through USGS science

    Science.gov (United States)

    Updike, Randall G.; Ellis, Eugene G.; Page, William R.; Parker, Melanie J.; Hestbeck, Jay B.; Horak, William F.

    2013-01-01

    Along the nearly 3,200 kilometers (almost 2,000 miles) of the United States–Mexican border, in an area known as the Borderlands, we are witnessing the expression of the challenges of the 21st century. This circular identifies several challenge themes and issues associated with life and the environment in the Borderlands, listed below. The challenges are not one-sided; they do not originate in one country only to become problems for the other. The issues and concerns of each challenge theme flow in both directions across the border, and both nations feel their effects throughout the Borderlands and beyond. The clear message is that our two nations, the United States and Mexico, face the issues in these challenge themes together, and the U.S. Geological Survey (USGS) understands it must work with its counterparts, partners, and customers in both countries.Though the mission of the USGS is not to serve as land manager, law enforcer, or code regulator, its innovation and creativity and the scientific and technical depth of its capabilities can be directly applied to monitoring the conditions of the landscape. The ability of USGS scientists to critically analyze the monitored data in search of signals and trends, whether they lead to negative or positive results, allows us to reach significant conclusions—from providing factual conclusions to decisionmakers, to estimating how much of a natural resource exists in a particular locale, to predicting how a natural hazard phenomenon will unfold, to forecasting on a scale from hours to millennia how ecosystems will behave.None of these challenge themes can be addressed strictly by one or two science disciplines; all require well-integrated, cross-discipline thinking, data collection, and analyses. The multidisciplinary science themes that have become the focus of the USGS mission parallel the major challenges in the border region between Mexico and the United States. Because of this multidisciplinary approach, the USGS

  16. Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.

    Science.gov (United States)

    Stein, Lincoln D

    2008-09-01

    Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.

  17. Challenges of Virtual and Open Distance Science Teacher Education in Zimbabwe

    Directory of Open Access Journals (Sweden)

    Vongai Mpofu

    2012-01-01

    Full Text Available This paper reports on a study of the implementation of science teacher education through virtual and open distance learning in the Mashonaland Central Province, Zimbabwe. The study provides insight into challenges faced by students and lecturers at the inception of the program at four centres. Data were collected from completed evaluation survey forms of forty-two lecturers who were directly involved at the launch of the program, and from in-depth interviews. Qualitative data analysis revealed that the programme faces potential threats from centre-, institution-, lecturer-, and student-related factors. These include limited resources, large classes, inadequate expertise in open and distance education, inappropriate science teacher education qualifications, implementer conflict of interest in program participation, students' low self-esteem, lack of awareness of quality parameters of delivery systems among staff, and lack of standard criteria to measure the quality of services. The paper recommends that the issues raised be addressed in order to produce quality teachers.

  18. Psychological science's contributions to a sustainable environment: extending our reach to a grand challenge of society.

    Science.gov (United States)

    Kazdin, Alan E

    2009-01-01

    Climate change and degradation of the environment are global problems associated with many other challenges (e.g., population increases, reduction of glaciers, and loss of critical habitats). Psychological science can play a critical role in addressing these problems by fostering a sustainable environment. Multiple strategies for fostering a sustainable environment could draw from the diversity of topics and areas of specialization within psychology. Psychological research on fostering environmentally sustainable behaviors is rather well developed, as illustrated by interventions focusing on education of the public, message framing, feedback, decision making, the media, incentives and disincentives, and social marketing. Other sciences and professions as well as religion and ethics are actively involved in fostering a sustainable environment. Psychology ought to be more involved directly, systematically, and visibly to draw on our current knowledge and to have palpable impact. We would serve the world very well and in the process our discipline and profession.

  19. Hanford Site Cleanup Challenges and Opportunities for Science and Technology--A Strategic Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Thomas W.; Johnson, Wayne L.; Kreid, Dennis K.; Walton, Terry L.

    2001-02-01

    The sheer expanse of the Hanford Site, the inherent hazards associated with the significant inventory of nuclear materials and wastes, the large number of aging contaminated facilities, the diverse nature and extent of environmental contamination, and the proximity to the Columbia River make Hanford perhaps the world's largest and most complex environmental cleanup project. It is not possible to address the more complex elements of this enormous challenge in a cost-effective manner without strategic investments in science and technology. Success requires vigorous and sustained efforts to enhance the science and technology basis, develop and deploy innovative solutions, and provide firm scientific bases to support site cleanup and closure decisions at Hanford.

  20. On the added value of forensic science and grand innovation challenges for the forensic community.

    Science.gov (United States)

    van Asten, Arian C

    2014-03-01

    In this paper the insights and results are presented of a long-term and ongoing improvement effort within the Netherlands Forensic Institute (NFI) to establish a valuable innovation programme. From the overall perspective of the role and use of forensic science in the criminal justice system, the concepts of Forensic Information Value Added (FIVA) and Forensic Information Value Efficiency (FIVE) are introduced. From these concepts the key factors determining the added value of forensic investigations are discussed: Evidential Value, Relevance, Quality, Speed and Cost. By unravelling the added value of forensic science and combining this with future needs and scientific and technological developments, six forensic grand challenges are introduced: (i) molecular photo-fitting; (ii) chemical imaging, profiling and age estimation of finger marks; (iii) advancing forensic medicine; (iv) objective forensic evaluation; (v) the digital forensic service centre; and (vi) real-time in-situ chemical identification. Finally, models for forensic innovation are presented that could lead to major international breakthroughs on all these six themes within a five-year time span. This could cause a step change in the added value of forensic science and would make forensic investigative methods even more valuable than they already are today. © 2013. Published by Elsevier Ireland Ltd on behalf of Forensic Science Society. All rights reserved.

  1. A new approach to environmental education: environment-challenge for science, technology and society

    International Nuclear Information System (INIS)

    Popovic, D.

    2002-01-01

    The paper presents a new approach to environmental education within the project Environment: Challenge for Science, Technology and Education, realized on the Alternative Academic Education Network (AAEN) in Belgrade. The project is designed for graduate or advanced undergraduate students of science, medicine, engineering, biotechnology, political and law sciences. It is a multidisciplinary and interdisciplinary project aimed at supporting students' interest in different areas of the environmental sciences through strong interconnection between modern scientific ideas, technological achievements and society. The project contains four basic courses (Living in the Environment; Physical and Chemical Processes in the Environment; Industrial Ecology and Sustainable Development; Environmental Philosophy and Ethics) and a number of elective courses dealing with environmental biology, adaptation processes, global eco-politics, environmental ethics, scientific and public policy, environmental consequences of warfare, environmental pollution control, energy management, environmental impact assessment, etc. The standard ex cathedra teaching is replaced with an active student-teacher communication method, enabling students to participate actively in the subject through seminars, workshops, short essays and individual research projects.

  2. Discovery informatics in biological and biomedical sciences: research challenges and opportunities.

    Science.gov (United States)

    Honavar, Vasant

    2015-01-01

    New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate and analyze data, and to construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery, or assisting humans in discovery, through advances in (i) understanding, formalizing, and building information-processing accounts of the entire scientific process; (ii) designing, developing, and evaluating the computational artifacts (representations, processes) that embody such understanding; and (iii) applying the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).

  3. Project management of life-science research projects: project characteristics, challenges and training needs.

    Science.gov (United States)

    Beukers, Margot W

    2011-02-01

    Thirty-four project managers of life-science research projects were interviewed to investigate the characteristics of their projects, the challenges they faced and their training requirements. A set of ten discriminating parameters was identified based on four project categories: contract research, development, discovery and call-based projects - projects set up to address research questions defined in a call for proposals. The major challenges these project managers are faced with relate to project members, leadership without authority and a lack of commitment from the respective organization. Two-thirds of the project managers indicated that they would be interested in receiving additional training, mostly on people-oriented, soft skills. The training programs that are currently on offer, however, do not meet their needs. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Science and technology in business management: Challenges for the training of professionals

    Directory of Open Access Journals (Sweden)

    Carlos Fernando Giler-Zúñiga

    2016-09-01

    Full Text Available Advances in science and technology are taking place at an accelerated pace, which prompts an analysis of how professionals must be trained to face, from within enterprise practice, the challenges that contemporary innovation imposes on knowledge; this obliges education to take on new challenges. Training well-prepared professionals, with the aim of contributing to the development of the country, links professional education with economic policy and with wider social policy, as well as with systems of production and management. The intention is a new approach: to train, enable, specialize and update students and professionals, preparing professionals and leaders with critical thought and social consciousness, producers of intellectual and social goods and services, linked to the principle of belonging understood as a responsibility assumed through preparation and training, at the service of an aesthetic politics of society.

  5. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and low absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High performance computing offers the possibility of running simulations at large scale and in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP), analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
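
    To make the scale argument concrete, a plain Monte Carlo estimate (a generic illustration, not the HEP codes discussed in the paper) has a statistical error that shrinks roughly as 1/sqrt(N), which is why high-accuracy studies need very large sample counts and hence HPC support.

```python
# Generic illustration, not the HEP codes discussed in the paper: a plain
# Monte Carlo estimate of pi whose statistical error shrinks like 1/sqrt(N).
import math
import random


def mc_pi(n_samples: int, seed: int = 0) -> float:
    """Estimate pi by sampling points uniformly in the unit square and
    counting how many fall inside the quarter circle."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_samples


for n in (10_000, 1_000_000):
    estimate = mc_pi(n)
    print(f"N={n:>9,d}  pi~{estimate:.5f}  error~{abs(estimate - math.pi):.5f}")
```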

  6. An Overview of Science Challenges Pertaining to our Understanding of Extreme Geomagnetically Induced Currents. Chapter 8

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti A.

    2018-01-01

    Vulnerability of man-made infrastructure to Earth-directed space weather events is a serious concern for today's technology-dependent society. Space weather-driven geomagnetically induced currents (GICs) can disrupt operation of extended electrically conducting technological systems. The threat of adverse impacts on critical technological infrastructure, like power grids, oil and gas pipelines, and communication networks, has sparked renewed interest in extreme space weather. Because extreme space weather events have low occurrence rate but potentially high impact, this presents a major challenge for our understanding of extreme GIC activity. In this chapter, we discuss some of the key science challenges pertaining to our understanding of extreme events. In addition, we present an overview of GICs including highlights of severe impacts over the last 80 years and recent U.S. Federal actions relevant to this community.

  7. Hanford Site Cleanup Challenges and Opportunities for Science and Technology - A Strategic Assessment

    International Nuclear Information System (INIS)

    Johnson, W.; Reichmuth, B.; Wood, T.; Glasper, M.; Hanson, J.

    2002-01-01

    In November 2000, the U.S. Department of Energy (DOE) Richland Operations Office (RL) initiated an effort to produce a single, strategic perspective of RL Site closure challenges and potential Science and Technology (S and T) opportunities. This assessment was requested by DOE Headquarters (HQ), Office of Science and Technology, EM-50, as a means to provide a site level perspective on S and T priorities in the context of the Hanford 2012 Vision. The objectives were to evaluate the entire cleanup lifecycle (estimated at over $24 billion through 2046), to identify where the greatest uncertainties exist, and where investments in S and T can provide the maximum benefit. The assessment identified and described the eleven strategic closure challenges associated with the cleanup of the Hanford Site. The assessment was completed in the spring of 2001 and provided to DOE-HQ and the Hanford Site Technology Coordination Group (STCG) for review and input. It is the first step in developing a Site-level S and T strategy for RL. To realize the full benefits of this assessment, RL and Site contractors will work with the Hanford STCG to ensure: identified challenges and opportunities are reflected in project baselines; detailed S and T program-level road maps reflecting both near- and long-term investments are prepared using this assessment as a starting point; and integrated S and T priorities are incorporated into Environmental Management (EM) Focus Areas, Environmental Management Science Program (EMSP) and other research and development (R and D) programs to meet near-term and longer-range challenges. Hanford is now poised to begin the detailed planning and road mapping necessary to ensure that the integrated Site level S and T priorities are incorporated into the national DOE S and T program and formally incorporated into the relevant project baselines. DOE-HQ's response to this effort has been very positive and similar efforts are likely to be undertaken at other sites

  8. TWRS process engineering data management plan

    Energy Technology Data Exchange (ETDEWEB)

    Adams, M.R.

    1997-05-12

    The Tank Characterization Data Management (TCDM) system provides customers and users with data and information of known and acceptable quality when they are needed, in the form they are needed, and at a reasonable cost. The TCDM mission will be accomplished by the following: (1) maintaining and managing tank characterization data and information based on business needs and objectives, including transfer of ownership to future contractors; (2) capturing data where it originates and entering it only once to control data consistency; electronic data and information management shall be emphasized to the extent practicable; (3) establishing data quality standards, and managing and certifying databases and data sources against these standards to maintain the proper level of data and information quality consistent with the importance of the data and information; data obtained at high cost with significant implications for decision making regarding tank safety and/or disposal will be maintained and managed at the highest necessary levels of quality; (4) establishing and enforcing data management standards for the Tank Characterization Database (TCD) and supporting data sources, including providing mechanisms for discovering and correcting data errors before they propagate; (5) emphasizing electronic data sharing with all authorized users, customers, contractors, and stakeholders to the extent practicable; (6) safeguarding data and information from unauthorized alteration or destruction; (7) providing standards for electronic information deliverables to subcontractors and vendors to achieve uniformity in electronic data management; and (8) investing in new technology (hardware and/or software) as prudent and necessary to accomplish the mission in an efficient and effective manner.

  9. STEM Is Elementary: Challenges Faced by Elementary Teachers in the Era of the Next Generation Science Standards

    Science.gov (United States)

    Isabelle, Aaron D.

    2017-01-01

    For students to achieve the goals of the Next Generation Science Standards (NGSS) by Grade 12, thinking and acting like scientists and engineers must begin in the elementary grades. However, elementary teachers may find this challenging because language arts and mathematics still dominate many classrooms, often at the expense of science. This…

  10. Making Sense: Talking Data Management with Researchers

    Directory of Open Access Journals (Sweden)

    Catharine Ward

    2011-10-01

    Full Text Available Incremental is one of eight projects in the JISC Managing Research Data programme funded to identify institutional requirements for digital research data management and pilot relevant infrastructure. Our findings concur with those of other Managing Research Data projects, as well as with several previous studies. We found that many researchers: (i) organise their data in an ad hoc fashion, posing difficulties with retrieval and re-use; (ii) store their data on all kinds of media without always considering security and back-up; (iii) are positive about data sharing in principle though reluctant in practice; (iv) believe back-up is equivalent to preservation. The key difference between our approach and that of other Managing Research Data projects is the type of infrastructure we are piloting. While the majority of these projects focus on developing technical solutions, we are focusing on the need for 'soft' infrastructure, such as one-to-one tailored support, training, and easy-to-find, concise guidance that breaks down some of the barriers information professionals have unintentionally built with their use of specialist terminology. We are employing a bottom-up approach as we feel that to support the step-by-step development of sound research data management practices, you must first understand researchers' needs and perspectives. Over the life of the project, Incremental staff will act as mediators, assisting researchers and local support staff to understand the data management requirements within which they are expected to work, and will determine how these can be addressed within research workflows and the existing technical infrastructure. Our primary goal is to build data management capacity within the Universities of Cambridge and Glasgow by raising awareness of basic principles so everyone can manage their data to a certain extent. We will ensure our lessons can be picked up and used by other institutions. Our affiliation with the Digital

  11. Data management facility for JT-60

    International Nuclear Information System (INIS)

    Ohasa, K.; Kurimoto, K.; Mochizuki, O.

    1983-01-01

    This study considers the Data Management Facility, which provides unified management of the various diagnostic data from JT-60 experiments. The facility is designed for the purpose of data access. About 30 kinds of diagnostic devices, classified by diagnostic object, are installed on the JT-60 facility, and they gather roughly 10 megabytes of diagnostic data per discharge. These diagnostic data vary qualitatively and quantitatively with the experimental purpose. Other fundamental information, such as discharge conditions and adjustment values for the diagnostic devices, is also required to process the gathered data.

  12. Increasing efficiency through integrated energy data management

    International Nuclear Information System (INIS)

    Brack, M.

    2002-01-01

    This article discusses how improved management of energy data can bring about the increase in efficiency that is necessary for an electricity enterprise operating in a liberalised electricity market. The relevant technical and business processes involved for a typical power distribution utility are described. The present situation is reviewed and the various physical, data-logistics and commercial 'domains' involved are examined. Possible solutions for energy data logistics and integrated data management are discussed from the points of view of the operating utility, the power supplier and those responsible for balancing out supply and demand

  13. Building the IOOS data management subsystem

    Science.gov (United States)

    de La Beaujardière, J.; Mendelssohn, R.; Ortiz, C.; Signell, R.

    2010-01-01

    We discuss progress to date and plans for the Integrated Ocean Observing System (IOOS) Data Management and Communications (DMAC) subsystem. We begin by presenting a conceptual architecture of IOOS DMAC. We describe work done as part of a 3-year pilot project known as the Data Integration Framework and the subsequent assessment of lessons learned. We present work that has been accomplished as part of the initial version of the IOOS Data Catalog. Finally, we discuss near-term plans for augmenting IOOS DMAC capabilities.

  14. SSCL-PDSF Data Management System

    International Nuclear Information System (INIS)

    Allen, J.L.

    1992-09-01

    Physics and detector simulations at the Superconducting Super Collider Laboratory (SSCL) are performed on a heterogeneous network of RISC based workstations named the Physics and Detector Simulation Facility (PDSF). These simulations can be characterized by the consumption and generation of large amounts of data. It is clear that on-line disk storage must be supplemented by off-line tape storage. For the PDSF, an 8-mm tape robot system was initially chosen in order to provide tertiary data storage based on its compactness and low cost. In order to manage this data, the Physics Computing Department designed the Data Management System (DMS)

  15. Challenges of archiving science data from long duration missions: the Rosetta case

    Science.gov (United States)

    Heather, David

    2016-07-01

    Rosetta is the first mission designed to orbit and land on a comet. It consists of an orbiter, carrying 11 science experiments, and a lander, called 'Philae', carrying 10 additional instruments. Rosetta was launched on 2 March 2004, and arrived at the comet 67P/Churyumov-Gerasimenko on 6 August 2014. During its long journey, Rosetta has completed flybys of the Earth and Mars, and made two excursions to the main asteroid belt to observe (2867) Steins and (21) Lutetia. On 12 November 2014, the Philae probe soft landed on comet 67P/Churyumov-Gerasimenko, the first time in history that such an extraordinary feat has been achieved. After the landing, the Rosetta orbiter followed the comet through its perihelion in August 2015, and will continue to accompany 67P/Churyumov-Gerasimenko as it recedes from the Sun until the end of the mission. There are significant challenges in managing the science archive of a mission such as Rosetta. The first data were returned from Rosetta more than 10 years ago, and there have been flybys of several planetary bodies, including two asteroids from which significant science data were returned by many of the instruments. The scientific applications for these flyby data can be very different to those taken during the main science phase at the comet, but there are severe limitations on the changes that can be applied to the data pipelines managed by the various science teams as resources are scarce. The priority is clearly on maximising the potential science from the comet phase, so data formats and pipelines have been designed with that in mind, and changes limited to managing issues found during official archiving authority and independent science reviews. In addition, in the time that Rosetta has been operating, the archiving standards themselves have evolved. All Rosetta data are archived following version 3 of NASA's Planetary Data System (PDS) Standards. Currently, new and upcoming planetary science missions are delivering data

  16. Configuration and Data Management Process and the System Safety Professional

    Science.gov (United States)

    Shivers, Charles Herbert; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    This article presents a discussion of the configuration management (CM) and the Data Management (DM) functions and provides a perspective of the importance of configuration and data management processes to the success of system safety activities. The article addresses the basic requirements of configuration and data management generally based on NASA configuration and data management policies and practices, although the concepts are likely to represent processes of any public or private organization's well-designed configuration and data management program.

  17. Challenges in network science: Applications to infrastructures, climate, social systems and economics

    Science.gov (United States)

    Havlin, S.; Kenett, D. Y.; Ben-Jacob, E.; Bunde, A.; Cohen, R.; Hermann, H.; Kantelhardt, J. W.; Kertész, J.; Kirkpatrick, S.; Kurths, J.; Portugali, J.; Solomon, S.

    2012-11-01

    Network theory has become one of the most visible theoretical frameworks that can be applied to the description, analysis, understanding, design and repair of multi-level complex systems. Complex networks occur everywhere, in man-made and human social systems, in organic and inorganic matter, from nano to macro scales, and in natural and anthropogenic structures. New applications are developed at an ever-increasing rate and the promise for future growth is high, since increasingly we interact with one another within these vital and complex environments. Despite all the great successes of this field, crucial aspects of multi-level complex systems have been largely ignored. Important challenges of network science are to take into account many of these missing realistic features such as strong coupling between networks (networks are not isolated), the dynamics of networks (networks are not static), interrelationships between structure, dynamics and function of networks, interdependencies in given networks (and other classes of links, including different signs of interactions), and spatial properties (including geographical aspects) of networks. The aim of this paper is to introduce and discuss the challenges that future network science needs to address, and how different disciplines will be accordingly affected.

  18. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Integrating Mercury Science and Policy in the Marine Context: Challenges and Opportunities

    Science.gov (United States)

    Lambert, Kathleen F.; Evers, David C.; Warner, Kimberly A.; King, Susannah L.; Selin, Noelle E.

    2014-01-01

    Mercury is a global pollutant and presents policy challenges at local, regional, and global scales. Mercury poses risks to the health of people, fish, and wildlife exposed to elevated levels of mercury, most commonly from the consumption of methylmercury in marine and estuarine fish. The patchwork of current mercury abatement efforts limits the effectiveness of national and multi-national policies. This paper provides an overview of the major policy challenges and opportunities related to mercury in coastal and marine environments, and highlights science and policy linkages of the past several decades. The U.S. policy examples explored here point to the need for a full life cycle approach to mercury policy with a focus on source reduction and increased attention to: (1) the transboundary movement of mercury in air, water, and biota; (2) the coordination of policy efforts across multiple environmental media; (3) the cross-cutting issues related to pollutant interactions, mitigation of legacy sources, and adaptation to elevated mercury via improved communication efforts; and (4) the integration of recent research on human and ecological health effects into benefits analyses for regulatory purposes. Stronger science and policy integration will benefit national and international efforts to prevent, control, and minimize exposure to methylmercury. PMID:22901766

  20. Semantic Enhancement for Enterprise Data Management

    Science.gov (United States)

    Ma, Li; Sun, Xingzhi; Cao, Feng; Wang, Chen; Wang, Xiaoyuan; Kanellos, Nick; Wolfson, Dan; Pan, Yue

    Taking customer data as an example, the paper presents an approach to enhance the management of enterprise data by using Semantic Web technologies. Customer data is the most important kind of core business entity a company uses repeatedly across many business processes and systems, and customer data management (CDM) is becoming critical for enterprises because it keeps a single, complete and accurate record of customers across the enterprise. Existing CDM systems focus on integrating customer data from all customer-facing channels and front and back office systems through multiple interfaces, as well as publishing customer data to different applications. To make effective use of the CDM system, this paper investigates semantic query and analysis over the integrated and centralized customer data, enabling automatic classification and relationship discovery. We have implemented these features over IBM Websphere Customer Center, and shown the prototype to our clients. We believe that our study and experiences are valuable for both the Semantic Web community and the data management community.
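
    As a rough, hypothetical illustration of the kind of semantic classification and relationship discovery described above (not the paper's IBM WebSphere Customer Center implementation), the following sketch represents customer records as subject-predicate-object triples, applies a toy classification rule, and discovers customers related through a shared property; all names and thresholds are invented for the example.

```python
# Minimal sketch: customer records as triples, a toy classification rule,
# and a query that discovers relationships between customers.
triples = [
    ("cust:Alice", "hasAnnualSpend", 120_000),
    ("cust:Alice", "worksFor", "org:Acme"),
    ("cust:Bob", "hasAnnualSpend", 8_000),
    ("cust:Bob", "worksFor", "org:Acme"),
]

def classify(customer):
    """Toy classification rule: spend above a threshold => 'KeyAccount'."""
    for s, p, o in triples:
        if s == customer and p == "hasAnnualSpend":
            return "KeyAccount" if o >= 50_000 else "StandardAccount"
    return "Unknown"

def related_via(predicate):
    """Relationship discovery: customers that share the same object value."""
    by_object = {}
    for s, p, o in triples:
        if p == predicate:
            by_object.setdefault(o, []).append(s)
    return {o: subs for o, subs in by_object.items() if len(subs) > 1}

print(classify("cust:Alice"))        # KeyAccount
print(related_via("worksFor"))       # {'org:Acme': ['cust:Alice', 'cust:Bob']}
```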

  1. Generalized data management systems and scientific information

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    This report aims to stimulate scientists of all disciplines to consider the advantages of using a generalized data management system (GDMS) for storage, manipulation and retrieval of the data they collect and often need to share. It should also be of interest to managers and programmers who need to make decisions on the management of scientific (numeric or non-numeric) data. Another goal of this report is to expose the features that a GDMS should have that are specifically necessary to support scientific data, such as data types and special manipulation functions. A GDMS is a system that provides generalized tools for the purpose of defining a database structure, for loading the data, for modification of the data, and for organizing the database for efficient retrieval and formatted output. A data management system is 'generalized' when it provides a user-oriented language for the different functions, so that it is possible to define any new database, its internal organization, and to retrieve and modify the data without the need to develop special-purpose software (a program) for each new database.
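
    The following minimal sketch, using SQLite as a stand-in for the generalized tools the report discusses, illustrates the GDMS idea: the same generic define/load/retrieve language serves a new (hypothetical) scientific dataset without writing special-purpose software for it.

```python
# Sketch of the GDMS idea with SQLite (not the system discussed in the report):
# a data-definition statement, a loading statement, and a query language serve
# any new scientific dataset without a special-purpose program per database.
import sqlite3

con = sqlite3.connect(":memory:")

# Define a database structure for a hypothetical measurement campaign.
con.execute("""CREATE TABLE measurements (
                   sample_id TEXT, quantity TEXT, value REAL, unit TEXT)""")

# Load the data.
rows = [("S-001", "temperature", 295.3, "K"),
        ("S-001", "pressure", 101.2, "kPa"),
        ("S-002", "temperature", 301.7, "K")]
con.executemany("INSERT INTO measurements VALUES (?, ?, ?, ?)", rows)

# Retrieve and format output with the same generic query language.
for row in con.execute(
        "SELECT sample_id, value, unit FROM measurements "
        "WHERE quantity = 'temperature' ORDER BY value"):
    print("%s: %.1f %s" % row)
```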

  2. Development of the environmental data management system

    International Nuclear Information System (INIS)

    Tatebe, Kazuaki; Suzuki, Yurina; Shirato, Seiichi; Sato, Yoshinori

    2012-02-01

    Today's society requires every enterprise to conduct its business activities with environmental consideration, and Japanese laws mandate such activities. For example, the 'Law Concerning the Promotion of Business Activities with Environmental Consideration by Specified Corporations, etc, by Facilitating Access to Environmental Information, and Other Measures' (Environmental Consideration Law) requires each enterprise above a designated size to publish a report on its environmentally considerate activities. The 'Act on the Rational Use of Energy' mandates reporting of energy consumption results and of a long-term plan for the rational use of energy. Moreover, the 'Act on Promotion of Global Warming Countermeasures' mandates reporting of greenhouse gas emissions. In addition, the 'Water Pollution Control Law', the 'Waste Management and Public Cleaning Law' and other environmental laws, as well as environmental ordinances, require business activities with environmental consideration from all companies. It is therefore very important for the Japan Atomic Energy Agency (JAEA) to report its business activities with environmental consideration in order to build trustful relations with the nation and local communities. The Environmental Data Management System has been developed as the database of business activities with environmental consideration in JAEA and as a means to promote these activities at every site and office of JAEA. This report summarizes the structure of the Environmental Data Management System, the kinds of environmental performance data treated by the system, and the methods used to gather the data. (author)

  3. DIRAC: reliable data management for LHCb

    International Nuclear Information System (INIS)

    Smith, A C; Tsaregorodtsev, A

    2008-01-01

    DIRAC, LHCb's Grid Workload and Data Management System, utilizes WLCG resources and middleware components to perform distributed computing tasks satisfying LHCb's Computing Model. The Data Management System (DMS) handles data transfer and data access within LHCb. Its scope ranges from the output of the LHCb Online system to Grid-enabled storage for all data types. It supports metadata for these files in replica and bookkeeping catalogues, allowing dataset selection and localization. The DMS controls the movement of files in a redundant fashion whilst providing utilities for accessing all metadata. To do these tasks effectively the DMS requires complete self integrity between its components and external physical storage. The DMS provides highly redundant management of all LHCb data to leverage available storage resources and to manage transient errors in underlying services. It provides data driven and reliable distribution of files as well as reliable job output upload, utilizing VO Boxes at LHCb Tier1 sites to prevent data loss. This paper presents several examples of mechanisms implemented in the DMS to increase reliability, availability and integrity, highlighting successful design choices and limitations discovered
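
    A toy illustration of two of the reliability mechanisms mentioned above, redundant replication and the retry of failed transfers, is sketched below; it is not DIRAC code, and the storage element names and failure model are invented.

```python
# Toy illustration (not DIRAC code): replicate a file to several storage
# elements and record failed transfers so they can be retried later.
import random

STORAGE_ELEMENTS = ["CERN-disk", "Tier1-A", "Tier1-B"]   # hypothetical names
pending_retries = []                                      # stand-in for a request database

def transfer(lfn, se):
    """Pretend transfer that fails randomly, as real services sometimes do."""
    return random.random() > 0.3

def replicate(lfn, min_copies=2):
    copies = 0
    for se in STORAGE_ELEMENTS:
        if transfer(lfn, se):
            copies += 1
            if copies >= min_copies:
                break
        else:
            pending_retries.append((lfn, se))  # transient error: retry later
    return copies

made = replicate("/lhcb/data/run1234/file.dst")
print(f"copies made: {made}, transfers queued for retry: {len(pending_retries)}")
```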

  4. Development of a SCAT data management manual

    International Nuclear Information System (INIS)

    Lamarche, A.; Owens, E.H.

    2005-01-01

    The Shoreline Cleanup Assessment Technique (SCAT) is a commonly used method in North America to document oiling conditions in the aftermath of an oil spill. The data generated by SCAT can support all aspects of the process needed to develop and apply shoreline treatment methods such as planning treatment strategies, selecting treatment methods, providing detailed instructions to response personnel and evaluating the response effort. In order to be effective, SCAT data must be validated, entered into computerized systems, and transformed into support documents such as maps, tables and reports. This paper describes the development of a guidance manual for SCAT data coordinators and spill response managers who use the results of SCAT data. Guidance is presented for emergency procedures that enable the generation of minimal, but adequate, SCAT data management services. The creation of the manual involved the development of formal descriptions of the role, responsibilities and abilities of the SCAT data management team. The manual provides solutions to several data-processing issues, including those related to the presence of multiple parallel bands of oil within a segment of shoreline. Summary tables were created to report the length of oiled shoreline by oiling category and to present surface oiling characteristics in overview maps. The manual also includes details on how SCAT data is used for response planning, decision making and to support operations. 12 refs., 7 tabs., 8 figs

  5. The soil education technical commission of the Brazilian Soil Science Society: achievements and challenges

    Science.gov (United States)

    Muggler, Cristine Carole; Aparecida de Mello, Nilvania

    2013-04-01

    last three symposia was dramatically changed compared to the former ones, considering both participants and papers: basic school teachers, science mediators instead of university docents and a prevalence of papers on soil education in basic schools and non-formal education. The main challenge for soil scientists remains how to spread knowledge about the importance of soil and its care among individuals and society in general. Diverse experiences, strategies and instruments are being deployed, yet soils remain overlooked in current environmental debates. Within the commission, the challenge remains the low standing of the subject in the academic world: it is marginal, it sits at an interface between knowledge areas, and it is commonly a researcher's secondary subject, easily abandoned when work pressure grows.

  6. Human Error and the International Space Station: Challenges and Triumphs in Science Operations

    Science.gov (United States)

    Harris, Samantha S.; Simpson, Beau C.

    2016-01-01

    Any system with a human component is inherently risky. Studies in human factors and psychology have repeatedly shown that human operators will inevitably make errors, regardless of how well they are trained. Onboard the International Space Station (ISS) where crew time is arguably the most valuable resource, errors by the crew or ground operators can be costly to critical science objectives. Operations experts at the ISS Payload Operations Integration Center (POIC), located at NASA's Marshall Space Flight Center in Huntsville, Alabama, have learned that from payload concept development through execution, there are countless opportunities to introduce errors that can potentially result in costly losses of crew time and science. To effectively address this challenge, we must approach the design, testing, and operation processes with two specific goals in mind. First, a systematic approach to error and human centered design methodology should be implemented to minimize opportunities for user error. Second, we must assume that human errors will be made and enable rapid identification and recoverability when they occur. While a systematic approach and human centered development process can go a long way toward eliminating error, the complete exclusion of operator error is not a reasonable expectation. The ISS environment in particular poses challenging conditions, especially for flight controllers and astronauts. Operating a scientific laboratory 250 miles above the Earth is a complicated and dangerous task with high stakes and a steep learning curve. While human error is a reality that may never be fully eliminated, smart implementation of carefully chosen tools and techniques can go a long way toward minimizing risk and increasing the efficiency of NASA's space science operations.

  7. Engaging High School Science Teachers in Field-Based Seismology Research: Opportunities and Challenges

    Science.gov (United States)

    Long, M. D.

    2015-12-01

    Research experiences for secondary school science teachers have been shown to improve their students' test scores, and there is a substantial body of literature about the effectiveness of RET (Research Experience for Teachers) or SWEPT (Scientific Work Experience Programs for Teachers) programs. RET programs enjoy substantial support, and several opportunities for science teachers to engage in research currently exist. However, there are barriers to teacher participation in research projects; for example, laboratory-based projects can be time consuming and require extensive training before a participant can meaningfully engage in scientific inquiry. Field-based projects can be an effective avenue for involving teachers in research; at its best, earth science field work is a fun, highly immersive experience that meaningfully contributes to scientific research projects, and can provide a payoff that is out of proportion to a relatively small time commitment. In particular, broadband seismology deployments provide an excellent opportunity to provide teachers with field-based research experience. Such deployments are labor-intensive and require large teams, with field tasks that vary from digging holes and pouring concrete to constructing and configuring electronics systems and leveling and orienting seismometers. A recently established pilot program, known as FEST (Field Experiences for Science Teachers) is experimenting with providing one week of summer field experience for high school earth science teachers in Connecticut. Here I report on results and challenges from the first year of the program, which is funded by the NSF-CAREER program and is being run in conjunction with a temporary deployment of 15 seismometers in Connecticut, known as SEISConn (Seismic Experiment for Imaging Structure beneath Connecticut). A small group of teachers participated in a week of field work in August 2015 to deploy seismometers in northern CT; this experience followed a visit of the

  8. The Nuclear Education and Staffing Challenge: Rebuilding Critical Skills in Nuclear Science and Technology

    International Nuclear Information System (INIS)

    Wogman, Ned A.; Bond, Leonard J.; Waltar, Alan E.; Leber, R E.

    2005-01-01

    The United States, the Department of Energy (DOE) and its National Laboratories, including the Pacific Northwest National Laboratory (PNNL), are facing a serious attrition of nuclear scientists and engineers and their capabilities through the effects of aging staff. Within the DOE laboratories, 75% of nuclear personnel will be eligible to retire by 2010. It is expected that there will be a significant loss of senior nuclear science and technology staff at PNNL within five years. PNNL's nuclear legacy is firmly rooted in the DOE Hanford site, the World War II Manhattan Project, and subsequent programs. Historically, PNNL was a laboratory where 70% of its activities were nuclear/radiological, and now just under 50% of its current business science and technology are nuclear and radiologically oriented. Programs in the areas of Nuclear Legacies, Global Security, Nonproliferation, Homeland Security and National Defense, Radiobiology and Nuclear Energy still involve more than 1,000 of the 3,800 current laboratory staff, and these include more than 420 staff who are certified as nuclear/radiological scientists and engineers. This paper presents the current challenges faced by PNNL that require an emerging strategy to solve the nuclear staffing issues through the maintenance and replenishment of the human nuclear capital needed to support PNNL nuclear science and technology programs

  9. Data Science Careers: A Sampling of Successful Strategies, Pitfalls, and Persistent Challenges

    Science.gov (United States)

    Stocks, K. I.; Duerr, R.; Wyborn, L. A.; Yarmey, L.

    2015-12-01

    Data Scientists do not have a single career trajectory or preparatory pathway. Successful data scientists have come from domain sciences, computer science, library science, and other diverse fields. They have worked up from entry-level staff positions, have started as academics with doctoral degrees, and have established themselves as management professionals. They have positions in government, industry, academia, and NGO's, and their responsibilities range from highly specialized, to generalists, to high-level leadership. This presents a potentially confusing landscape for students interested in the field: how to decide among the varied options to have the best chance at fulfilling employment? What are the mistakes to avoid? Many established data scientists, both old-timers and early-career professionals, expressed interest in presenting in this session but were unable to justify using their one AGU abstract for something other than their funded projects. As the session chairs, we interviewed them, plus our extended network of colleagues, to ask for their best advice on what was most critical to their success in their current position, what pitfalls to avoid, what ongoing challenges they see, and what advice they would give themselves, if they could do it all over again starting now. Here we consolidate those interviews with our own perspectives to present some of the common themes and standout advice.

  10. The Nuclear Education and Staffing Challenge: Rebuilding Critical Skills in Nuclear Science and Technology

    International Nuclear Information System (INIS)

    Wogman, Ned A.; Bond, Leonard J.; Waltar, Alan E.; Leber, R E.

    2005-01-01

    The United States, the Department of Energy (DOE) and its National Laboratories, including the Pacific Northwest National Laboratory (PNNL), are facing a serious attrition of nuclear scientists and engineers and their capabilities through the effects of aging staff. Within the DOE laboratories, 75% of nuclear personnel will be eligible to retire by 2010. It is expected that there will be a significant loss of senior nuclear science and technology staff at PNNL within five years. PNNL's nuclear legacy is firmly rooted in the DOE Hanford site, the World War II Manhattan Project, and subsequent programs. Historically, PNNL was a laboratory where 70% of its activities were nuclear/radiological, and now just under 50% of its current business science and technology are nuclear and radiologically oriented. Programs in the areas of Nuclear Legacies, Global Security, Nonproliferation, Homeland Security and National Defense, Radiobiology and Nuclear Energy still involve more than 1,000 of the 3,800 current laboratory staff, and these include more than 420 staff who are certified as nuclear/radiological scientists and engineers. This paper presents the current challenges faced by PNNL that require an emerging strategy to solve the nuclear staffing issues through the maintenance and replenishment of the human nuclear capital needed to support PNNL nuclear science and technology programs

  11. The nuclear education and staffing challenge: Rebuilding critical skills in nuclear science and technology

    International Nuclear Information System (INIS)

    Wogman, N.A.; Bond, L.J.; Waltar, A.E.; Leber, R.E.

    2005-01-01

    The United States, the Department of Energy (DOE) and its National Laboratories, including the Pacific Northwest National Laboratory (PNNL), are facing a serious attrition of nuclear scientists and engineers and their capabilities through the effects of aging staff. Within the DOE laboratories, 75% of nuclear personnel will be eligible to retire by 2010. It is expected that there will be a significant loss of senior nuclear science and technology staff at PNNL within five years. PNNL's nuclear legacy is firmly rooted in the DOE Hanford site, the World War II Manhattan Project, and subsequent programs. Historically, PNNL was a laboratory where 70% of its activities were nuclear/radiological, and now just under 50% of its current business science and technology are nuclear and radiologically oriented. Programs in the areas of nuclear legacies, global security, nonproliferation, homeland security and national defense, radiobiology and nuclear energy still involve more than 1,000 of the 3,800 current laboratory staff, and these include more than 420 staff who are certified as nuclear/radiological scientists and engineers. Current challenges faced by PNNL that require an emerging strategy to solve the nuclear staffing issues through the maintenance and replenishment of the human nuclear capital needed to support PNNL nuclear science and technology programs are presented. (author)

  12. Enabling a new Paradigm to Address Big Data and Open Science Challenges

    Science.gov (United States)

    Ramamurthy, Mohan; Fisher, Ward

    2017-04-01

    Data are not only the lifeblood of the geosciences but they have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies — along with concomitant advances in high-resolution modeling, ensemble and coupled-systems predictions of the Earth system — are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change. And NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information from them in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on the area of "Reproducibility or Replicability in Science" that has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and for accelerating scientific discoveries. However, successfully leveraging the enormous potential of cloud technologies will require data providers and the scientific communities to develop new paradigms to enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers

  13. EPA Leadership on Science, Innovation, and Decision Support Tools for Addressing Current and Future Challenges.

    Science.gov (United States)

    Hecht, Alan D; Ferster, Aaron; Summers, Kevin

    2017-10-16

    When the U.S. Environmental Protection Agency (EPA) was established nearly 50 years ago, the nation faced serious threats to its air, land, and water, which in turn impacted human health. These threats were effectively addressed by the creation of EPA (in 1970) and much subsequent landmark environmental legislation, which in turn significantly reduced threats to the Nation's environment and public health. A key element of historic legislation is research aimed at dealing with current and future problems. Today we face national and global challenges that go beyond classic media-specific (air, land, water) environmental legislation and require an integrated paradigm of action and engagement based on (1) innovation based on science and technology, (2) stakeholder engagement and collaboration, and (3) public education and support. This three-pronged approach recognizes that current environmental problems, which include social as well as physical and environmental factors, are best addressed through collaborative problem solving, the application of innovation in science and technology, and multiple stakeholder engagement. To achieve that goal, EPA's Office of Research and Development (ORD) is working directly with states and local communities to develop and apply a suite of accessible decision support tools (DST) that aim to improve environmental conditions, protect human health, enhance economic opportunity, and advance a resilient and sustainable society. This paper showcases joint EPA and state actions to develop tools and approaches that not only meet current environmental and public health challenges, but do so in a way that advances sustainable, healthy, and resilient communities well into the future. EPA's future plans should build on current work but aim to effectively respond to growing external pressures. Growing pressures from megatrends are a major challenge for the new Administration and for cities and states across the country. The recent hurricanes hitting

  14. GeoDataspaces: Simplifying Data Management Tasks with Globus

    Science.gov (United States)

    Malik, T.; Chard, K.; Tchoua, R. B.; Foster, I.

    2014-12-01

    Data and its management are central to modern scientific enterprise. Typically, geoscientists rely on observations and model output data from several disparate sources (file systems, RDBMS, spreadsheets, remote data sources). Integrated data management solutions that provide intuitive semantics and uniform interfaces, irrespective of the kind of data source, are, however, lacking. Consequently, geoscientists are left to conduct low-level and time-consuming data management tasks individually and repeatedly for each data source, often resulting in handling errors. In this talk we will describe how the EarthCube GeoDataspace project is improving this situation for seismologists, hydrologists, and space scientists by simplifying some of the existing data management tasks that arise when developing computational models. We will demonstrate a GeoDataspace, bootstrapped with "geounits", which are self-contained metadata packages that provide a complete description of all data elements associated with a model run, including input/output and parameter files, model executable and any associated libraries. Geounits link raw and derived data and associate provenance information describing how the data were derived. We will discuss challenges in establishing geounits and describe machine learning and human annotation approaches that can be used for extracting and associating ad hoc and unstructured scientific metadata hidden in binary formats with data resources and models. We will show how geounits can improve search and discoverability of data associated with model runs. To support this model, we will describe efforts toward creating a scalable metadata catalog that helps maintain, search, and discover geounits within the Globus network of accessible endpoints. This talk will focus on the issue of creating comprehensive personal inventories of data assets for computational geoscientists, and describe a publishing mechanism, which can be used to
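
    The sketch below illustrates, under assumed field names that are not the project's actual schema, what a self-contained "geounit" metadata package for a single model run might look like, including checksums and simple provenance links.

```python
# Sketch of a "geounit" metadata package; the field names are illustrative
# assumptions, not the EarthCube GeoDataspace schema.
import hashlib, json, pathlib

def checksum(path):
    """SHA-256 of a local file, used to pin each data element."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

def make_geounit(run_id, inputs, outputs, executable, params):
    """Bundle everything needed to describe one model run, plus provenance."""
    return {
        "run_id": run_id,
        "executable": executable,
        "parameter_file": params,
        "inputs": [{"path": p, "sha256": checksum(p)} for p in inputs],
        "outputs": [{"path": p, "sha256": checksum(p)} for p in outputs],
        "provenance": {"derived_from": inputs, "produced_by": executable},
    }

# Usage (with files that exist on the local machine):
# unit = make_geounit("run-042", ["input.nc"], ["output.nc"],
#                     "model.exe", "params.yaml")
# pathlib.Path("run-042.geounit.json").write_text(json.dumps(unit, indent=2))
```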

  15. Addressing big data challenges for scientific data infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Zhao, Z.; Grosso, P.; Wibisono, A.; de Laat, C.

    2012-01-01

    This paper discusses the challenges that are imposed by Big Data Science on the modern and future Scientific Data Infrastructure (SDI). The paper refers to different scientific communities to define requirements on data management, access control and security. The paper introduces the Scientific

  16. The Biological and Chemical Oceanography Data Management Office

    Science.gov (United States)

    Allison, M. D.; Chandler, C. L.; Groman, R. C.; Wiebe, P. H.; Glover, D. M.; Gegg, S. R.

    2011-12-01

    Oceanography and marine ecosystem research are inherently interdisciplinary fields of study that generate and require access to a wide variety of measurements. In late 2006 the Biological and Chemical Oceanography Sections of the National Science Foundation (NSF) Geosciences Directorate Division of Ocean Sciences (OCE) funded the Biological and Chemical Oceanography Data Management Office (BCO-DMO). In late 2010 additional funding was contributed to support management of research data from the NSF Office of Polar Programs Antarctic Organisms & Ecosystems Program. The BCO-DMO is recognized in the 2011 Division of Ocean Sciences Sample and Data Policy as one of several program specific data offices that support NSF OCE funded researchers. BCO-DMO staff members offer data management support throughout the project life cycle to investigators from large national programs and medium-sized collaborative research projects, as well as researchers from single investigator awards. The office manages and serves all types of oceanographic data and information generated during the research process and contributed by the originating investigators. BCO-DMO has built a data system that includes the legacy data from several large ocean research programs (e.g. United States Joint Global Ocean Flux Study and United States GLOBal Ocean ECosystems Dynamics), to which data have been contributed from recently granted NSF OCE and OPP awards. The BCO-DMO data system can accommodate many different types of data including: in situ and experimental biological, chemical, and physical measurements; modeling results and synthesis data products. The system enables reuse of oceanographic data for new research endeavors, supports synthesis and modeling activities, provides availability of "real data" for K-12 and college level use, and provides decision-support field data for policy-relevant investigations. We will present an overview of the data management system capabilities including: map

  17. Challenges at the Frontiers of Matter and Energy: Transformative Opportunities for Discovery Science

    Energy Technology Data Exchange (ETDEWEB)

    Hemminger, John C. [Univ. of California, Irvine, CA (United States); Sarrao, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States); University of Illinois, Chicago; Flemming, Graham [Univ. of California, Berkeley, CA (United States); Ratner, Mark [Northwestern Univ., Evanston, IL (United States)

    2015-11-01

    FIVE TRANSFORMATIVE OPPORTUNITIES FOR DISCOVERY SCIENCE: As a result of this effort, it has become clear that the progress made to date on the five Grand Challenges has created a springboard for seizing five new Transformative Opportunities that have the potential to further transform key technologies involving matter and energy. These five new Transformative Opportunities and the evidence supporting them are discussed in this new report, “Challenges at the Frontiers of Matter and Energy: Transformative Opportunities for Discovery Science.” Mastering Hierarchical Architectures and Beyond-Equilibrium Matter: Complex materials and chemical processes transmute matter and energy, for example from CO2 and water to chemical fuel in photosynthesis, from visible light to electricity in solar cells and from electricity to light in light emitting diodes (LEDs). Such functionality requires complex assemblies of heterogeneous materials in hierarchical architectures that display time-dependent away-from-equilibrium behaviors. Much of the foundation of our understanding of such transformations, however, is based on monolithic single-phase materials operating at or near thermodynamic equilibrium. The emergent functionalities enabling next-generation disruptive energy technologies require mastering the design, synthesis, and control of complex hierarchical materials employing dynamic far-from-equilibrium behavior. A key guide in this pursuit is nature, for biological systems prove the power of hierarchical assembly and far-from-equilibrium behavior. The challenges here are many: a description of the functionality of hierarchical assemblies in terms of their constituent parts, a blueprint of atomic and molecular positions for each constituent part, and a synthesis strategy for (a) placing the atoms and molecules in the proper positions for the component parts and (b) arranging the component parts into the required hierarchical structure. Targeted functionality will open the door

  18. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also hi...
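
    As a compact, hypothetical illustration of the test-driven cycle described above (not Rucio code), the sketch below shows a unit test written alongside the small component it exercises; the data-identifier format is only an assumption for the example.

```python
# Sketch of a test-driven cycle: the test is written first, and the component
# is implemented until the test passes and the source stays fully functional.
import unittest

def parse_scope_name(did):
    """Split an ATLAS-style data identifier 'scope:name' (illustrative only)."""
    scope, _, name = did.partition(":")
    if not scope or not name:
        raise ValueError(f"malformed data identifier: {did!r}")
    return scope, name

class TestParseScopeName(unittest.TestCase):
    def test_valid_identifier(self):
        self.assertEqual(parse_scope_name("mc16:EVNT.01234"), ("mc16", "EVNT.01234"))

    def test_malformed_identifier(self):
        with self.assertRaises(ValueError):
            parse_scope_name("no-scope-separator")

if __name__ == "__main__":
    unittest.main()
```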

  19. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A

    2013-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also hi...

  20. Semantic-Based RFID Data Management

    Science.gov (United States)

    de Virgilio, Roberto; di Sciascio, Eugenio; Ruta, Michele; Scioscia, Floriano; Torlone, Riccardo

    Traditional Radio-Frequency IDentification (RFID) applications have been focused on replacing bar codes in supply chain management. Leveraging a ubiquitous computing architecture, the chapter presents a framework allowing both quick decentralized on-line item discovery and centralized off-line massive business logic analysis, according to the needs and requirements of supply chain actors. A semantic-based environment, in which a tagged object becomes a resource that exposes to an RFID reader not a trivial identification code but a semantic annotation, enables tagged objects to describe themselves on the fly without depending on a centralized infrastructure. On the data management side, a proposal is formulated for effective off-line multidimensional analysis of the huge amounts of RFID data generated and stored along the supply chain.
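
    The following toy sketch (illustrative only, not the chapter's framework) shows the core idea of a tag exposing a semantic annotation rather than a bare identification code, so that a reader can match a request against the object's own description without a central lookup.

```python
# Toy sketch: a tag carries a semantic annotation, and the reader matches a
# query against that self-description on the fly (all values invented).
tag_annotation = {
    "id": "urn:epc:3F0421",
    "class": "PerishableGood",
    "properties": {"product": "milk", "storage_temp_max_c": 8, "expires": "2012-07-01"},
}

def matches(annotation, required_class, **constraints):
    """True when the annotation declares the class and all required properties."""
    if annotation["class"] != required_class:
        return False
    props = annotation["properties"]
    return all(props.get(k) == v for k, v in constraints.items())

# On-line discovery at the reader: find perishable goods that contain milk.
print(matches(tag_annotation, "PerishableGood", product="milk"))   # True
```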

  1. Large data management and systematization of simulation

    International Nuclear Information System (INIS)

    Ueshima, Yutaka; Saitho, Kanji; Koga, James; Isogai, Kentaro

    2004-01-01

    In advanced photon research, large-scale simulations are powerful tools. In numerical experiments, real-time visualization and steering systems are regarded as promising methods of data analysis. This approach is valid for routine analyses performed in a single pass or for short-cycle simulations. When investigating an unknown problem, however, it is necessary that the output data can be analyzed many times, because a productive analysis is rarely achieved on the first attempt. Consequently, output data should be filed so that they can be referenced and analyzed at any time. To support such research, the following automatic functions are needed: transporting data files from the data generator to data storage, analyzing data, tracking the history of data handling, and so on. The Large Data Management system will be a functional, distributed Problem Solving Environment system. (author)
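
    A minimal sketch of two of the automatic functions listed above, transporting output files to storage and tracking the history of that handling, is given below; the directory layout and log format are assumptions, not the system described in the paper.

```python
# Sketch (assumed layout): transport output files from a data-generator
# directory to a storage directory, and track the history of that handling.
import json, shutil, time
from pathlib import Path

GENERATOR_DIR = Path("simulation_output")   # hypothetical locations
STORAGE_DIR = Path("data_store")
HISTORY_LOG = STORAGE_DIR / "history.jsonl"

def archive_outputs():
    STORAGE_DIR.mkdir(exist_ok=True)
    with HISTORY_LOG.open("a") as log:
        for src in GENERATOR_DIR.glob("*.dat"):
            dst = STORAGE_DIR / src.name
            shutil.copy2(src, dst)                       # transport the file
            log.write(json.dumps({                       # track its history
                "file": src.name,
                "archived_to": str(dst),
                "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
            }) + "\n")

if __name__ == "__main__":
    archive_outputs()
```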

  2. ATF [Advanced Toroidal Facility] data management

    International Nuclear Information System (INIS)

    Kannan, K.L.; Baylor, L.R.

    1988-01-01

    Data management for the Advanced Toroidal Facility (ATF), a stellarator located at Oak Ridge National Laboratory (ORNL), is provided by DMG, a locally developed, VAX-based software system. DMG is a data storage and retrieval software system that provides the user interface to ATF raw and analyzed data. Data are described in terms of data models and data types and are organized as signals into files, which are internally documented. The system was designed with user accessibility, software maintainability, and extensibility as primary goals. Extensibility features include compatibility with ATF as it moves from pulsed to steady-state operation and capability for use of the DMG system with experiments other than ATF. DMG is implemented as a run-time library of routines available as a shareable image. General-purpose and specialized data acquisition and analysis applications have been developed using the DMG system. This paper describes the DMG system and the interfaces to it. 4 refs., 2 figs
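
    The sketch below illustrates, in a deliberately simplified form that is not the DMG implementation, the idea of an internally documented signal file: each stored signal carries its own description (data model, type, units, shot number) alongside the raw samples.

```python
# Sketch of an internally documented signal file: the description travels
# with the samples, so the file explains itself to any reader.
import json

def write_signal(path, name, data, **description):
    record = {"signal": name, "description": description, "samples": data}
    with open(path, "w") as f:
        json.dump(record, f, indent=2)

def read_signal(path):
    with open(path) as f:
        record = json.load(f)
    return record["description"], record["samples"]

write_signal("te_profile.json", "electron_temperature",
             [0.9, 1.4, 2.1], data_model="profile", data_type="float",
             units="keV", shot=12345)
meta, samples = read_signal("te_profile.json")
print(meta["units"], samples)
```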

  3. Data Management System of the DIRAC Project

    CERN Multimedia

    Haen, Christophe; Tsaregorodtsev, Andrei

    2015-01-01

    The DIRAC Interware provides a development framework and a complete set of components for building distributed computing systems. The DIRAC Data Management System (DMS) offers all the necessary tools to ensure data handling operations for small and large user communities. It supports transparent access to storage resources based on multiple technologies, and is easily expandable. The information on data files and replicas is kept in a File Catalog of which DIRAC offers a powerful and versatile implementation (DFC). Data movement can be performed using third party services including FTS3. Bulk data operations are resilient with respect to failures due to the use of the Request Management System (RMS) that keeps track of ongoing tasks. In this contribution we will present an overview of the DIRAC DMS capabilities and its connection with other DIRAC subsystems such as the Transformation System. The DIRAC DMS is in use by several user communities now. The contribution will present the experience of the LHCb exper...

  4. The Marshall Islands Data Management Program

    Energy Technology Data Exchange (ETDEWEB)

    Stoker, A.C.; Conrado, C.L.

    1995-09-01

    This report is a resource document of the methods and procedures used currently in the Data Management Program of the Marshall Islands Dose Assessment and Radioecology Project. Since 1973, over 60,000 environmental samples have been collected. Our program includes relational database design, programming and maintenance; sample and information management; sample tracking; quality control; and data entry, evaluation and reduction. Making scientific databases useful involves careful planning in order to fulfill the requirements of any large research program. Compilation of scientific results requires consolidation of information from several databases, and incorporation of new information as it is generated. Success in combining and organizing all radionuclide analyses, sample information and statistical results into a readily accessible form is critical to our project.

  5. The Marshall Islands Data Management Program

    International Nuclear Information System (INIS)

    Stoker, A.C.; Conrado, C.L.

    1995-09-01

    This report is a resource document of the methods and procedures used currently in the Data Management Program of the Marshall Islands Dose Assessment and Radioecology Project. Since 1973, over 60,000 environmental samples have been collected. Our program includes relational database design, programming and maintenance; sample and information management; sample tracking; quality control; and data entry, evaluation and reduction. Making scientific databases useful involves careful planning in order to fulfill the requirements of any large research program. Compilation of scientific results requires consolidation of information from several databases, and incorporation of new information as it is generated. Success in combining and organizing all radionuclide analyses, sample information and statistical results into a readily accessible form is critical to our project

  6. Chernobyl experience of emergency data management

    International Nuclear Information System (INIS)

    Bolshov, L.; Linge, I.; Arutyunyan, R.; Ilushkin, A.; Kanevsky, M.; Kiselev, V.; Melikhova, E.; Ossipiants, I.; Pavlovsky, O.

    1997-01-01

    The use of the Chernobyl experience in emergency data management is presented. Information technologies for the generalization of practical experience in the protection of the population after the Chernobyl accident are described. The two main components of this work are the development of the administrative information system (AIS) and the creation of the central data bank. The current state of the AIS, the data bank and the bank of models is described. The accumulated data and models are used to estimate the consequences of radiation accidents and to provide different types of prognosis. The experience gained in analyzing the accumulated data has allowed special software to be developed for large-scale simulation of the radiation consequences of major radiation accidents and practical exercises to be organized. Some examples of such activity are presented. (orig.)

  7. Data management in HEP: An approach

    CERN Document Server

    Furano, F

    2011-01-01

    In this work we describe an approach to data access and data management in High Energy Physics (HEP), which privileges performance, simplicity and scalability, in storage systems that co-operate. We also show why the typical HEP workload is well positioned to access geographically distributed data repositories and then weigh the advantages and disadvantages of accessing data across the Wide Area Network. We discuss some points related to the architecture that a data access/management system should have in order to exploit these possibilities. To date, these methods have been explored using the xrootd/Scalla software suite, which is a workable example of a distributed non-transactional data repository for the HEP environment. The Scalla architecture naturally allows us to build globally federated repositories congruent with diverse HEP collaborations and their data access needs. These methodologies, however, are based on the concept of caching associated to a performant messaging system and to an efficie...

  8. The DIRAC Data Management System (poster)

    CERN Document Server

    Haen, Christophe

    2015-01-01

    The DIRAC Interware provides a development framework and a complete set of components for building distributed computing systems. The DIRAC Data Management System (DMS) offers all the necessary tools to ensure data handling operations for small and large user communities. It supports transparent access to storage resources based on multiple technologies, and is easily expandable. The information on data files and replicas is kept in a File Catalog of which DIRAC offers a powerful and versatile implementation (DFC). Data movement can be performed using third party services including FTS3. Bulk data operations are resilient with respect to failures due to the use of the Request Management System (RMS) that keeps track of ongoing tasks. In this contribution we will present an overview of the DIRAC DMS capabilities and its connection with other DIRAC subsystems such as the Transformation System. The DIRAC DMS is in use by several user communities now. The contribution will present the experience of the LHCb exper...

  9. Space Station data management system architecture

    Science.gov (United States)

    Mallary, William E.; Whitelaw, Virginia A.

    1987-01-01

    Within the Space Station program, the Data Management System (DMS) functions in a dual role. First, it provides the hardware resources and software services which support the data processing, data communications, and data storage functions of the onboard subsystems and payloads. Second, it functions as an integrating entity which provides a common operating environment and human-machine interface for the operation and control of the orbiting Space Station systems and payloads by both the crew and the ground operators. This paper discusses the evolution and derivation of the requirements and issues which have had significant effect on the design of the Space Station DMS, describes the DMS components and services which support system and payload operations, and presents the current architectural view of the system as it exists in October 1986; one-and-a-half years into the Space Station Phase B Definition and Preliminary Design Study.

  10. The ATLAS Distributed Data Management System & Databases

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Barisits, M; Beermann, T; Vigne, R; Serfon, C

    2013-01-01

    The ATLAS Distributed Data Management (DDM) System is responsible for the global management of petabytes of high energy physics data. The current system, DQ2, has a critical dependency on Relational Database Management Systems (RDBMS), like Oracle. RDBMS are well-suited to enforcing data integrity in online transaction processing applications, however, concerns have been raised about the scalability of its data warehouse-like workload. In particular, analysis of archived data or aggregation of transactional data for summary purposes is problematic. Therefore, we have evaluated new approaches to handle vast amounts of data. We have investigated a class of database technologies commonly referred to as NoSQL databases. This includes distributed filesystems, like HDFS, that support parallel execution of computational tasks on distributed data, as well as schema-less approaches via key-value stores, like HBase. In this talk we will describe our use cases in ATLAS, share our experiences with various databases used ...
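
    As a small illustration of the schema-less, key-value style of storage contrasted with an RDBMS in the abstract, the sketch below uses a plain Python dictionary as a stand-in for a key-value store and performs a warehouse-style aggregation over it; the dataset names and fields are invented.

```python
# Sketch: replica records stored under a composite key with no fixed table
# schema, then aggregated for summary (warehouse-style) use.
from collections import defaultdict

# key = (dataset, site), value = free-form record
replica_store = {
    ("data15:AOD.001", "CERN"): {"bytes": 2_000_000, "state": "AVAILABLE"},
    ("data15:AOD.001", "BNL"):  {"bytes": 2_000_000, "state": "AVAILABLE"},
    ("mc16:EVNT.777",  "CERN"): {"bytes": 500_000,   "state": "COPYING"},
}

# Aggregation: total available bytes per site.
totals = defaultdict(int)
for (dataset, site), rec in replica_store.items():
    if rec["state"] == "AVAILABLE":
        totals[site] += rec["bytes"]

print(dict(totals))   # {'CERN': 2000000, 'BNL': 2000000}
```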

  11. Data management strategies for nuclear training

    International Nuclear Information System (INIS)

    Zerbo, J.N.; Gwinn, A.E.

    1993-01-01

    Use of systematic training development technologies has become a standard for the commercial nuclear power industry and for many Department of Energy facilities. Such systems involve detailed analysis of job functions, tasks and skill requirements and correlation of that information to the courses, curricula and testing instruments used in the training process. Nuclear training programs are subject to audit and evaluation by a number of government and industry organizations. The ability to establish an audit trail, from initial task analysis to final examination is crucial to demonstrating the completeness and validity of a systematic training program. This paper provides perspective on aspects of the training data management problem, status of technological solutions, and characteristics of data base management systems that are best suited for application to training programs

  12. Data management at Biosphere 2 center

    Science.gov (United States)

    McCreary, Leone F.

    1997-01-01

    Throughout the history of Biosphere 2, the collecting and recording of biological data has been sporadic. Currently no active effort to administer and record regular biological surveys is being made. Also, there is no central location, such as an on-site data library, where all records from various studies have been archived. As a research institute, good, complete data records are at the core of all Biosphere 2's scientific endeavors. It is therefore imperative that an effective data management system be implemented within the management and research departments as soon as possible. Establishing this system would require three general phases: (1) Design/implement a new archiving/management program (including storage, cataloging and retrieval systems); (2) Organize and input baseline and intermediate data from existing archives; and (3) Maintain records by inputting new data.

  13. The new Generation of Data Management

    CERN Multimedia

    CERN. Geneva; Grotz, Stephan

    2013-01-01

    Last year, Software AG acquired Terracotta, a leading player for In Memory Data Management for the Enterprise. With it, Software AG is bringing performance at any scale for the Business Application. Together with two other technologies, CEP (Complex Event Processing) and Nirvana (a Low latency Messaging System), Software AG offers a visionary platform for Big Data and Cloud Computing. The presentation will focus on Terracotta and CEP, addressing the following points: General presentation of Terracotta (Big Memory Max) & CEP Engine (Business Events) Explanation of the Terracotta architecture, scalability and possible fields of application CEP introduction, integration with Terracotta, analysis of real-time events For those interested in further details, there will be an additional session in the afternoon (14-16h) about fields of application and interesting Use Cases in different industries and CERN, namely: Examples of real use cases for Terracotta Code e...

  14. Simulation Data Management - Requirements and Design Specification

    Energy Technology Data Exchange (ETDEWEB)

    Clay, Robert L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Friedman-Hill, Ernest J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gibson, Marcus J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Edward L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olson, Kevin H. [Science Applications International Corporation (SAIC), Reston, VA (United States); Laney, Daniel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-11-01

    Simulation Data Management (SDM), the ability to securely organize, archive, and share analysis models and the artifacts used to create them, is a fundamental requirement for modern engineering analysis based on computational simulation. We have worked separately to provide secure, network SDM services to engineers and scientists at our respective laboratories for over a decade. We propose to leverage our experience and lessons learned to help develop and deploy a next-generation SDM service as part of a multi-laboratory team. This service will be portable across multiple sites and platforms, and will be accessible via a range of command-line tools and well-documented APIs. In this document, we’ll review our high-level and low-level requirements for such a system, review one existing system, and briefly discuss our proposed implementation.

  15. A Conditions Data Management System for HEP Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Laycock, P. J. [CERN; Dykstra, D. [Fermilab; Formica, A. [Saclay; Govi, G. [Fermilab; Pfeiffer, A. [CERN; Roe, S. [CERN; Sipos, R. [Eotvos U.

    2017-01-01

    The conditions data infrastructures of both ATLAS and CMS have to deal with the management of several terabytes of data. Distributed computing access to this data requires particular care and attention to manage request-rates of up to several tens of kHz. Thanks to the large overlap in use cases and requirements, ATLAS and CMS have worked towards a common solution for conditions data management with the aim of using this design for data-taking in Run 3. In the meantime other experiments, including NA62, have expressed an interest in this cross-experiment initiative. For experiments with a smaller payload volume and complexity, there is particular interest in simplifying the payload storage. The conditions data management model is implemented in a small set of relational database tables. A prototype access toolkit consisting of an intermediate web server has been implemented, using standard technologies available in the Java community. Access is provided through a set of REST services for which the API has been described in a generic way using standard OpenAPI specifications, implemented in Swagger. Such a solution allows the automatic generation of client code and server stubs and further allows the backend technology to be changed transparently. An important advantage of using a REST API for conditions access is the possibility of caching identical URLs, addressing one of the biggest challenges that large distributed computing solutions impose on conditions data access, avoiding direct DB access by means of standard web proxy solutions.
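
    The sketch below illustrates the client-side benefit of REST-based conditions access described above: identical URLs can be served from a cache instead of reaching the database. The endpoint, tag, and run number are hypothetical, and the cache is a plain dictionary standing in for a web proxy.

```python
# Sketch: conditions read through REST URLs, with identical requests served
# from a URL-keyed cache instead of hitting the server again.
import json
from urllib.request import urlopen

_cache = {}   # stands in for a web proxy / local cache keyed by URL

def get_conditions(tag, run):
    url = f"https://conditions.example.org/api/iovs?tag={tag}&run={run}"  # hypothetical endpoint
    if url not in _cache:                 # only the first request reaches the server
        with urlopen(url) as resp:
            _cache[url] = json.loads(resp.read().decode())
    return _cache[url]

# Repeated calls with the same tag/run reuse the cached payload:
# payload = get_conditions("SOME-CONDITIONS-TAG", 358031)
# payload = get_conditions("SOME-CONDITIONS-TAG", 358031)   # cache hit
```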

  16. A mass spectrometry proteomics data management platform.

    Science.gov (United States)

    Sharma, Vagisha; Eng, Jimmy K; Maccoss, Michael J; Riffle, Michael

    2012-09-01

    Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. There exist many software pipelines for analyzing these data (each typically with its own file formats), and as technology improves, these file formats change and new formats are developed. Files produced from these myriad software programs may accumulate on hard disks or tape drives over time, with older files being rendered progressively more obsolete and unusable with each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are "organically" distributed across laboratory file systems in an ad hoc manner, (3) files formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application dubbed Mass Spectrometry Data Platform that is designed to address the failings of the file-based mass spectrometry data management approach. The database is designed such that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables being extended to support data generated by specific pipelines. Because the data are unified, they may be queried, viewed, and compared across multiple experiments using a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/.
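
    A minimal sketch of the "unified core tables plus pipeline-specific extension tables" design described above is given below; the schema and column names are illustrative assumptions, not those of the actual platform.

```python
# Sketch: results from different search pipelines land in one core table,
# and pipeline-specific columns live in extension tables keyed to it.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE search_result (            -- core, pipeline-independent fields
    id INTEGER PRIMARY KEY,
    experiment TEXT, peptide TEXT, charge INTEGER, pipeline TEXT);

CREATE TABLE sequest_result (           -- extension for one specific pipeline
    result_id INTEGER REFERENCES search_result(id),
    xcorr REAL, delta_cn REAL);
""")

con.execute("INSERT INTO search_result VALUES (1, 'exp01', 'PEPTIDEK', 2, 'SEQUEST')")
con.execute("INSERT INTO sequest_result VALUES (1, 3.42, 0.31)")

# A cross-pipeline query touches only the core table; pipeline details join in.
for row in con.execute("""SELECT s.peptide, q.xcorr
                          FROM search_result s JOIN sequest_result q
                          ON q.result_id = s.id"""):
    print(row)
```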

  17. DMPTool 2: Expanding Functionality for Better Data Management Planning

    Directory of Open Access Journals (Sweden)

    Carly Strasser

    2014-07-01

    Full Text Available Scholarly researchers today are increasingly required to engage in a range of data management planning activities to comply with institutional policies, or as a precondition for publication or grant funding. The latter is especially true in the U.S. in light of the recent White House Office of Science and Technology Policy (OSTP) mandate aimed at maximizing the availability of all outputs – data as well as the publications that summarize them – resulting from federally-funded research projects. To aid researchers in creating effective data management plans (DMPs), a group of organizations – California Digital Library, DataONE, Digital Curation Centre, Smithsonian Institution, University of Illinois Urbana-Champaign, and University of Virginia Library – collaborated on the development of the DMPTool, an online application that helps researchers create data management plans. The DMPTool provides detailed guidance, links to general and institutional resources, and walks a researcher through the process of generating a comprehensive plan tailored to specific DMP requirements. The uptake of the DMPTool has been positive: to date, it has been used by over 6,000 researchers from 800 institutions, making use of more than 20 requirements templates customized for funding bodies. With support from the Alfred P. Sloan Foundation, project partners are now engaged in enhancing the features of the DMPTool. The second version of the tool has enhanced functionality for plan creators and institutional administrators, as well as a redesigned user interface and an open RESTful application programming interface (API). New administrative functions provide the means for institutions to better support local research activities. New capabilities include support for plan co-ownership; workflow provisions for internal plan review; simplified maintenance and addition of DMP requirements templates; extensive capabilities for the customization of guidance and resources

  18. Forging the Solution to the Energy Challenge: The Role of Materials Science and Materials Scientists

    Science.gov (United States)

    Wadsworth, Jeffrey

    2010-05-01

    The energy challenge is central to the most important strategic problems facing the United States and the world. It is increasingly clear that even large-scale deployments of the best technologies available today cannot meet the rising energy demands of a growing world population. Achieving a secure and sustainable energy future will require full utilization of, and substantial improvements in, a comprehensive portfolio of energy systems and technologies. This goal is complicated by several factors. First, energy strategies are inextricably linked to national security and health issues. Second, in developing and deploying energy technologies, it is vital to consider not only environmental issues, such as global climate change, but also economic considerations, which strongly influence both public and political views on energy policy. Third, a significant and sustained effort in basic and applied research and development (R&D) will be required to deliver the innovations needed to ensure a desirable energy future. Innovations in materials science and engineering are especially needed to overcome the limits of essentially all energy technologies. A wealth of historical evidence demonstrates that such innovations are also the key to economic prosperity. From the development of the earliest cities around flint-trading centers, to the Industrial Revolution, to today’s silicon-based global economy, the advantage goes to those who lead in exploiting materials. I view our challenge by considering the rate of innovation and the transition of discovery to the marketplace as the relationship among R&D investment, a skilled and talented workforce, business innovations, and the activities of competitors. Most disturbing in analyzing this relationship is the need for trained workers in science, technology, engineering, and mathematics (STEM). To develop the STEM workforce needed for innovation, we need sustainable, positive change in STEM education at all levels from preschool

  19. Arctic System Science: Meeting Earth System and Social Impact Challenges through Integrative Approaches and Synthesis

    Science.gov (United States)

    Vorosmarty, C. J.; Hinzman, L. D.; Rawlins, M. A.; Serreze, M. C.; Francis, J. A.; Liljedahl, A. K.; McDonald, K. C.; Piasecki, M.; Rich, R. H.; Holland, M. M.

    2017-12-01

    The Arctic is an integral part of the Earth system where multiple interactions unite its natural and human elements. Recent observations show the Arctic to be experiencing rapid and amplified signatures of global climate change. At the same time, the Arctic system's response to this broader forcing has itself become a central research topic, given its potential role as a critical throttle on future planetary dynamics. Changes are already impacting life systems and economic prosperity and continued change is expected to bear major implications far outside the region. We also have entered an era when environmental management, traditionally local in scope, must confront regional, whole biome, and pan-Arctic biogeophysical challenges. While challenges may appear to operate in isolation, they emerge within the context of an evolving, integrated Arctic system defined by interactions among natural and social sub-systems. Clearly, new efforts aimed at community planning, industrial development, and infrastructure construction must consider this multiplicity of interacting processes. We recently organized an "Arctic System Synthesis Workshop Series" supported by the Arctic Systems Science Program of NSF and devoted to exploring approaches capable of uncovering the systems-level behavior in both the natural and social sciences domains. The series featured two topical meetings. The first identified the sources responsible for extreme climate events in the Arctic. The second focused on multiple "currencies" within the system (i.e., water, energy, carbon, nutrients) and how they interact to produce systems-level behaviors. More than 40 experts participated, drawn from the ranks of Arctic natural and social sciences. We report here on the workshop series consensus report, which identifies a broad array of topics. Principal among these are a consideration of why study the Arctic as a system, as well as an articulation of the major systems-level approaches to support basic as well

  20. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file
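
    The flavour of "declarative queries over views of files" can be illustrated with an existing structured file format. The sketch below uses pandas with an HDF5 table (which requires the pytables package) purely as a stand-in: the selection predicate is pushed down to the file so only matching rows are read, instead of hand-coding byte-level access. It is an analogy for the idea, not the Damasc interface itself.

      import numpy as np
      import pandas as pd

      # Write a small HDF5 "simulation output" table (requires the pytables package).
      df = pd.DataFrame({
          "timestep": np.repeat(np.arange(3), 4),
          "particle": np.tile(np.arange(4), 3),
          "energy": np.random.default_rng(0).random(12),
      })
      df.to_hdf("sim.h5", key="particles", format="table", data_columns=True)

      # Declarative selection pushed down to the file: only matching rows are read.
      hot = pd.read_hdf("sim.h5", key="particles", where="energy > 0.5 & timestep == 2")
      print(hot)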

  1. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Wendelberger, J.R.; McVittie, T.I.

    1995-01-01

    Data management and statistical analysis for environmental assessment are important issues at the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system is described that provides efficient data storage as well as visualization tools which may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities.

  2. Federated and Cloud Enabled Resources for Data Management and Utilization

    Science.gov (United States)

    Rankin, R.; Gordon, M.; Potter, R. G.; Satchwill, B.

    2011-12-01

    The emergence of cloud computing over the past three years has led to a paradigm shift in how data can be managed, processed and made accessible. Building on the federated data management system offered through the Canadian Space Science Data Portal (www.cssdp.ca), we demonstrate how heterogeneous and geographically distributed data sets and modeling tools have been integrated to form a virtual data center and computational modeling platform that has services for data processing and visualization embedded within it. We also discuss positive and negative experiences in utilizing Eucalyptus and OpenStack cloud applications, and job scheduling facilitated by Condor and Star Cluster. We summarize our findings by demonstrating use of these technologies in the Cloud Enabled Space Weather Data Assimilation and Modeling Platform CESWP (www.ceswp.ca), which is funded through Canarie's (canarie.ca) Network Enabled Platforms program in Canada.

  3. Advancing data management and analysis in different scientific disciplines

    Science.gov (United States)

    Fischer, M.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.

    2017-10-01

    Over the past several years, the rapid growth of data has affected many fields of science. This has often resulted in the need for overhauling or exchanging the tools and approaches in the disciplines’ data life cycles. However, this also allows the application of new data analysis methods and facilitates improved data sharing. The project Large-Scale Data Management and Analysis (LSDMA) of the German Helmholtz Association has been addressing both specific and generic requirements in its data life cycle successfully since 2012. Its data scientists work together with researchers from fields such as climatology, energy and neuroscience to improve the community-specific data life cycles, in several cases covering all stages of the data life cycle, i.e. from data acquisition to data archival. LSDMA scientists also study methods and tools that are of importance to many communities, e.g. data repositories and authentication and authorization infrastructure.

  4. A data management system to enable urgent natural disaster computing

    Science.gov (United States)

    Leong, Siew Hoon; Kranzlmüller, Dieter; Frank, Anton

    2014-05-01

    consequences. Hard deadline: missing a hard deadline renders the computation useless and results in full catastrophic consequences. A prototype of this system has a REST-based service manager. The REST-based implementation provides a uniform interface that is easy to use, and support for new and upcoming file transfer protocols can easily be added and accessed via the service manager. The service manager interacts with the other four managers to coordinate the data activities so that the fundamental requirement of urgent natural disaster computing, i.e. the deadline, can be fulfilled in a reliable manner. A data activity can include data storing and data archiving. Reliability is ensured by the choice of a network-of-managers organisation model [1], the configuration manager and the fault tolerance manager. With this proposed design, an easy-to-use, resource-independent data management system that can support and fulfil the computation of a natural disaster prediction within stipulated deadlines can thus be realised. References: [1] H. G. Hegering, S. Abeck, and B. Neumair, Integrated Management of Networked Systems: Concepts, Architectures, and Their Operational Application, Morgan Kaufmann Publishers, San Francisco, CA, USA, 1999. [2] H. Kopetz, Real-Time Systems: Design Principles for Distributed Embedded Applications, second edition, Springer, New York, NY, USA, 2011. [3] S. H. Leong, A. Frank, and D. Kranzlmüller, Leveraging e-infrastructures for urgent computing, Procedia Computer Science 18 (2013), 2177-2186, 2013 International Conference on Computational Science. [4] N. Trebon, Enabling urgent computing within the existing distributed computing infrastructure, Ph.D. thesis, University of Chicago, August 2011, http://people.cs.uchicago.edu/~ntrebon/docs/dissertation.pdf.
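
    A deadline-aware choice among transfer options, which is the essence of the data-activity coordination described above, can be sketched in a few lines. The option names, rates and selection policy below are invented for illustration and are not the prototype's actual interface.

      from dataclasses import dataclass

      @dataclass
      class TransferOption:
          name: str            # e.g. a protocol/endpoint combination
          rate_mb_s: float     # estimated sustained transfer rate
          setup_s: float       # estimated protocol/connection overhead

      def pick_option(options, size_mb, deadline_s):
          # Return the fastest option expected to finish before the deadline, or None.
          timed = [(o.setup_s + size_mb / o.rate_mb_s, o) for o in options]
          feasible = [(t, o) for t, o in timed if t <= deadline_s]
          return min(feasible, key=lambda pair: pair[0], default=(None, None))[1]

      options = [
          TransferOption("gridftp-site-a", rate_mb_s=200.0, setup_s=5.0),   # invented numbers
          TransferOption("https-site-b", rate_mb_s=60.0, setup_s=1.0),
      ]
      choice = pick_option(options, size_mb=50_000, deadline_s=600)
      print("selected:", choice.name if choice else "no option meets the deadline")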

  5. Transboundary fisheries science: Meeting the challenges of inland fisheries management in the 21st century

    Science.gov (United States)

    Midway, Stephen R.; Wagner, Tyler; Zydlewski, Joseph D.; Irwin, Brian J.; Paukert, Craig P.

    2016-01-01

    Managing inland fisheries in the 21st century presents several obstacles, including the need to view fisheries from multiple spatial and temporal scales, which usually involves populations and resources spanning sociopolitical boundaries. Though collaboration is not new to fisheries science, inland aquatic systems have historically been managed at local scales and present different challenges than in marine or large freshwater systems like the Laurentian Great Lakes. Therefore, we outline a flexible strategy that highlights organization, cooperation, analytics, and implementation as building blocks toward effectively addressing transboundary fisheries issues. Additionally, we discuss the use of Bayesian hierarchical models (within the analytical stage), due to their flexibility in dealing with the variability present in data from multiple scales. With growing recognition of both ecological drivers that span spatial and temporal scales and the subsequent need for collaboration to effectively manage heterogeneous resources, we expect implementation of transboundary approaches to become increasingly critical for effective inland fisheries management.
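
    As a minimal sketch of the partial pooling that makes Bayesian hierarchical models attractive for multi-jurisdiction data, the example below fits region-level means that share a common prior, so sparsely sampled regions borrow strength from the rest. The synthetic data, variable names and priors are assumptions for illustration (using the pymc package); this is not the authors' model.

      import numpy as np
      import pymc as pm

      rng = np.random.default_rng(1)
      n_regions = 4
      region_idx = np.repeat(np.arange(n_regions), 25)
      # Synthetic catch-per-unit-effort-like observations, 25 per region.
      y = rng.normal(loc=[0.5, 1.0, 1.5, 2.0], scale=0.4, size=(25, n_regions)).T.ravel()

      with pm.Model():
          mu = pm.Normal("mu", 0.0, 5.0)                     # shared (pooled) mean
          sigma_region = pm.HalfNormal("sigma_region", 1.0)  # between-region spread
          region_mean = pm.Normal("region_mean", mu, sigma_region, shape=n_regions)
          sigma_obs = pm.HalfNormal("sigma_obs", 1.0)
          pm.Normal("obs", region_mean[region_idx], sigma_obs, observed=y)
          idata = pm.sample(500, tune=500, chains=2, progressbar=False)

      print(idata.posterior["region_mean"].mean(dim=("chain", "draw")).values)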

  6. Challenge

    International Nuclear Information System (INIS)

    Schwitters, R.F.

    1996-01-01

    The design of new and upgrades of existing high energy particle accelerators is reviewed in light of the current knowledge of the standard model determined from existing and past machines and funding factors. Current financing of science will delay determining unknowns, such as CP violation, proton decay, neutrino properties, and dark matter. Three options are given: (1) obtain more funding, (2) downsize scientific personnel as are private enterprises or (3) develop new technology which will reduce the high cost of building current designs of high energy accelerators. (AIP) copyright 1996 American Institute of Physics

  7. Machine learning of network metrics in ATLAS Distributed Data Management

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00218873; The ATLAS collaboration; Toler, Wesley; Vamosi, Ralf; Bogado Garcia, Joaquin Ignacio

    2017-01-01

    The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously, many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

  8. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
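
    A tiny sketch of this temporal, location-oriented modelling and rule-based event detection is given below: each observation records a tag at a location over a time interval, and a single example rule flags tags that dwell too long. The schema and the rule are illustrative assumptions, not the authors' framework.

      from dataclasses import dataclass
      from datetime import datetime, timedelta
      from typing import Optional

      @dataclass
      class Observation:            # one tag seen at one location over a time interval
          epc: str                  # tag ID
          location: str
          start: datetime
          end: Optional[datetime]   # None while the tag is still being observed

      def detect_dwell_violation(observations, location, max_dwell, now):
          # Rule: flag tags that stayed at `location` longer than `max_dwell`.
          alerts = []
          for o in observations:
              if o.location != location:
                  continue
              end = o.end or now
              if end - o.start > max_dwell:
                  alerts.append((o.epc, end - o.start))
          return alerts

      now = datetime(2024, 1, 1, 12, 0)
      obs = [
          Observation("EPC-001", "dock-door-3", now - timedelta(hours=5), None),
          Observation("EPC-002", "dock-door-3", now - timedelta(minutes=20), now - timedelta(minutes=5)),
      ]
      print(detect_dwell_violation(obs, "dock-door-3", timedelta(hours=2), now))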

  9. Intelligent data management for real-time spacecraft monitoring

    Science.gov (United States)

    Schwuttke, Ursula M.; Gasser, Les; Abramson, Bruce

    1992-01-01

    Real-time AI systems have begun to address the challenge of restructuring problem solving to meet real-time constraints by making key trade-offs that pursue less than optimal strategies with minimal impact on system goals. Several approaches for adapting to dynamic changes in system operating conditions are known. However, simultaneously adapting system decision criteria in a principled way has been difficult. Towards this end, a general technique for dynamically making such trade-offs using a combination of decision theory and domain knowledge has been developed. Multi-attribute utility theory (MAUT), a decision-theoretic approach for making one-time decisions, is discussed, and dynamic trade-off evaluation is described as a knowledge-based extension of MAUT that is suitable for highly dynamic real-time environments. An example of dynamic trade-off evaluation applied to a specific data management trade-off in a real-world spacecraft monitoring application is provided.
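
    The MAUT core, a weighted additive utility over normalized attributes, is compact enough to sketch directly. The attributes, weights and candidate data-management strategies below are invented for illustration; in the dynamic trade-off evaluation described above, the weights themselves would be adjusted at run time from domain knowledge.

      def maut_score(option, weights):
          # Weighted additive utility; per-attribute utilities are assumed to lie in [0, 1].
          return sum(weights[attr] * u for attr, u in option["utilities"].items())

      # Illustrative weights; raising the weight of timeliness when telemetry backlogs grow
      # is the kind of dynamic adjustment the technique above is about.
      weights = {"timeliness": 0.5, "completeness": 0.3, "resource_cost": 0.2}

      options = [
          {"name": "process_all",
           "utilities": {"timeliness": 0.2, "completeness": 1.0, "resource_cost": 0.3}},
          {"name": "summarize",
           "utilities": {"timeliness": 0.9, "completeness": 0.6, "resource_cost": 0.8}},
          {"name": "drop_low_priority",
           "utilities": {"timeliness": 1.0, "completeness": 0.3, "resource_cost": 0.9}},
      ]

      best = max(options, key=lambda o: maut_score(o, weights))
      print(best["name"], round(maut_score(best, weights), 2))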

  10. Research data management practical strategies for information professionals

    CERN Document Server

    2014-01-01

    It has become increasingly accepted that important digital data must be retained and shared in order to preserve and promote knowledge, advance research in and across all disciplines of scholarly endeavor, and maximize the return on investment of public funds. To meet this challenge, colleges and universities are adding data services to existing infrastructures by drawing on the expertise of information professionals who are already involved in the acquisition, management and preservation of data in their daily jobs. Data services include planning and implementing good data management practices, thereby increasing researchers’ ability to compete for grant funding and ensuring that data collections with continuing value are preserved for reuse. This volume provides a framework to guide information professionals in academic libraries, presses, and data centers through the process of managing research data from the planning stages through the life of a grant project and beyond. It illustrates principle...

  11. Agile Data Management with the Global Change Information System

    Science.gov (United States)

    Duggan, B.; Aulenbach, S.; Tilmes, C.; Goldstein, J.

    2013-12-01

    We describe experiences applying agile software development techniques to the realm of data management during the development of the Global Change Information System (GCIS), a web service and API for authoritative global change information under development by the US Global Change Research Program. Some of the challenges during system design and implementation have been: (1) balancing the need for a rigorous mechanism for ensuring information quality with the realities of large data sets whose contents are often in flux, (2) utilizing existing data to inform decisions about the scope and nature of new data, and (3) continuously incorporating new knowledge and concepts into a relational data model. The workflow for managing the content of the system has much in common with the development of the system itself. We examine various aspects of agile software development and discuss whether or how we have been able to use them for data curation as well as software development.

  12. Machine learning of network metrics in ATLAS Distributed Data Management

    Science.gov (United States)

    Lassnig, Mario; Toler, Wesley; Vamosi, Ralf; Bogado, Joaquin; ATLAS Collaboration

    2017-10-01

    The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously, many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.
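
    In the spirit of the forecasting step described above, the sketch below trains a simple regression model to predict the next value of a network metric from its recent history. The synthetic throughput series and the choice of model are assumptions for illustration; this is not the ATLAS analytics framework.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(0)
      hours = np.arange(24 * 30)                                  # 30 days, hourly samples
      throughput = 50 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

      def make_lagged(series, n_lags=24):
          # Use the previous n_lags values to predict the next one.
          X = np.stack([series[i:i + n_lags] for i in range(series.size - n_lags)])
          y = series[n_lags:]
          return X, y

      X, y = make_lagged(throughput)
      split = -24                                                 # hold out the last day
      model = GradientBoostingRegressor().fit(X[:split], y[:split])
      pred = model.predict(X[split:])
      print("mean absolute error (MB/s):", np.abs(pred - y[split:]).mean().round(2))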

  13. Science for the Poor: How One Woman Challenged Researchers, Ranchers, and Loggers in Amazonia

    Directory of Open Access Journals (Sweden)

    Patricia Shanley

    2006-12-01

    Full Text Available In the lower Tocantins region of Brazil, one Amazonian woman questioned why scientists publish principally for elite audiences. Her experience suggests that the impact may be enhanced by also sharing data with people who depend upon forest goods. Having defended her family homestead near the city of Cameta against loggers in the late 1980s, Glória Gaia became interested in strengthening the information base of other villagers so that they would not lose their forests for meager sums. She challenged scientists to defy norms such as extracting data without giving back to rural villagers and publishing primarily for the privileged. Working with researchers, she helped them to publish an illustrated manual of the ecology, economics, management, and cultural importance of key Amazonian forest species. With and without funds or a formal project, she traveled by foot and boat to remote villages to disseminate the book. Using data, stories, and song, she brought cautionary messages to villages about the impacts of logging on livelihoods. She also brought locally useful processing techniques regarding medicinal plants, fruit, and tree oils. Her holistic teachings challenged traditional forestry to include the management of fruits, fibers, and medicines. A new version of the book, requested by the government of Brazil, contains the contributions of 90 leading Brazilian and international scientists and local people. Glória Gaia's story raises the questions: Who is science for and how can science reach disenfranchised populations? Lessons for scientists and practitioners from Glória's story include: broadening the range of products from research to reach local people, complementing local ecological knowledge with scientific data, sharing precautionary data demonstrating trends, and involving women and marginalized people in the research and outreach process.

  14. Synergy with HST and JWST Data Management Systems

    Science.gov (United States)

    Greene, Gretchen; Space Telescope Data Management Team

    2014-01-01

    The data processing and archive systems for the JWST will contain a petabyte of science data and the best news is that users will have fast access to the latest calibrations through a variety of new services. With a synergistic approach currently underway with the STScI science operations between the Hubble Space Telescope and James Webb Space Telescope data management subsystems (DMS), operational verification is right around the corner. Next year the HST archive will provide scientists on-demand fully calibrated data products via the Mikulski Archive for Space Telescopes (MAST), which takes advantage of an upgraded DMS. This enhanced system, developed jointly with the JWST DMS is based on a new CONDOR distributed processing system capable of reprocessing data using a prioritization queue which runs in the background. A Calibration Reference Data System manages the latest optimal configuration for each scientific instrument pipeline. Science users will be able to search and discover the growing MAST archive calibrated datasets from these missions along with the other multiple mission holdings both local to MAST and available through the Virtual Observatory. JWST data systems will build upon the successes and lessons learned from the HST legacy and move us forward into the next generation of multi-wavelength archive research.
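
    The background prioritization idea can be illustrated with a generic heap-based queue in which on-demand requests outrank routine bulk recalibration. This is a conceptual sketch only, not the actual HST/JWST DMS or its CONDOR configuration; the dataset identifiers and priority values are invented.

      import heapq
      import itertools

      class ReprocessingQueue:
          def __init__(self):
              self._heap = []
              self._counter = itertools.count()   # tie-breaker keeps insertion order

          def submit(self, dataset_id, priority):
              # Lower number = higher priority, so on-demand requests jump ahead.
              heapq.heappush(self._heap, (priority, next(self._counter), dataset_id))

          def next_job(self):
              return heapq.heappop(self._heap)[2] if self._heap else None

      q = ReprocessingQueue()
      q.submit("hst_obs_0001", priority=50)   # routine bulk recalibration
      q.submit("hst_obs_0042", priority=1)    # a user requested this dataset on demand
      q.submit("hst_obs_0002", priority=50)
      print([q.next_job() for _ in range(3)])  # -> ['hst_obs_0042', 'hst_obs_0001', 'hst_obs_0002']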

  15. Data Publication: A Partnership between Scientists, Data Managers and Librarians

    Science.gov (United States)

    Raymond, L.; Chandler, C.; Lowry, R.; Urban, E.; Moncoiffe, G.; Pissierssens, P.; Norton, C.; Miller, H.

    2012-04-01

    Current literature on the topic of data publication suggests that success is best achieved when there is a partnership between scientists, data managers, and librarians. The Marine Biological Laboratory/Woods Hole Oceanographic Institution (MBLWHOI) Library and the Biological and Chemical Oceanography Data Management Office (BCO-DMO) have developed tools and processes to automate the ingestion of metadata from BCO-DMO for deposit with datasets into the Institutional Repository (IR) Woods Hole Open Access Server (WHOAS). The system also incorporates functionality for BCO-DMO to request a Digital Object Identifier (DOI) from the Library. This partnership allows the Library to work with a trusted data repository to ensure high quality data while the data repository utilizes library services and is assured of a permanent archive of the copy of the data extracted from the repository database. The assignment of persistent identifiers enables accurate data citation. The Library can assign a DOI to appropriate datasets deposited in WHOAS. A primary activity is working with authors to deposit datasets associated with published articles. The DOI would ideally be assigned before submission and be included in the published paper so readers can link directly to the dataset, but DOIs are also being assigned to datasets related to articles after publication. WHOAS metadata records link the article to the datasets and the datasets to the article. The assignment of DOIs has enabled another important collaboration with Elsevier, publisher of educational and professional science journals. Elsevier can now link from articles in the Science Direct database to the datasets available from WHOAS that are related to that article. The data associated with the article are freely available from WHOAS and accompanied by a Dublin Core metadata record. In addition, the Library has worked with researchers to deposit datasets in WHOAS that are not appropriate for national, international, or domain

  16. Collaborative Development of e-Infrastructures and Data Management Practices for Global Change Research

    Science.gov (United States)

    Samors, R. J.; Allison, M. L.

    2016-12-01

    An e-infrastructure that supports data-intensive, multidisciplinary research is being organized under the auspices of the Belmont Forum consortium of national science funding agencies to accelerate the pace of science to address 21st century global change research challenges. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. The five action themes adopted by the Belmont Forum: 1. Adopt and make enforceable Data Principles that establish a global, interoperable e-infrastructure. 2. Foster communication, collaboration and coordination between the wider research community and Belmont Forum and its projects through an e-Infrastructure Coordination, Communication, & Collaboration Office. 3. Promote effective data planning and stewardship in all Belmont Forum agency-funded research with a goal to make it enforceable. 4. Determine international and community best practice to inform Belmont Forum research e-infrastructure policy through identification and analysis of cross-disciplinary research case studies. 5. Support the development of a cross-disciplinary training curriculum to expand human capacity in technology and data-intensive analysis methods. The Belmont Forum is ideally poised to play a vital and transformative leadership role in establishing a sustained human and technical international data e-infrastructure to support global change research. In 2016, members of the 23-nation Belmont Forum began a collaborative implementation phase. Four multi-national teams are undertaking Action Themes based on the recommendations above. Tasks include mapping the landscape, identifying and documenting existing data management plans, and scheduling a series of workshops that analyse trans-disciplinary applications of existing Belmont Forum

  17. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    International Nuclear Information System (INIS)

    Khaleel, Mohammad A.

    2009-01-01

    This report is an account of the deliberations and conclusions of the workshop on 'Forefront Questions in Nuclear Science and the Role of High Performance Computing' held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to (1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; (2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; (3) provide nuclear physicists the opportunity to influence the development of high performance computing; and (4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  18. Challenges and opportunities: using a science-based video game in secondary school settings

    Science.gov (United States)

    Muehrer, Rachel; Jenson, Jennifer; Friedberg, Jeremy; Husain, Nicole

    2012-12-01

    Simulations and games are not new artifacts to the study of science in secondary school settings (Hug, Kriajcik and Marx 2005), however teachers remain skeptical as to their value, use and appropriateness (Rice 2006). The difficulty is not only the design and development of effective play environments that produce measurable changes in knowledge and/or understanding, but also in their on-the-ground use (Jaipal and Figg 2010). This paper reports on the use of a science-focused video game in five very different secondary school settings in Ontario, Canada. A mixed-methods approach was used in the study, and included data gathered on general gameplay habits and technology use, as well as informal interviews with teachers and students who played the game. In total, 161 participants played a series of games focused on the "life of a plant", and were given both a pre and post quiz to determine if the game helped them retain and/or change what they knew about scientific processes like plant cell anatomy and photosynthesis. Participants showed statistically significant improvement on quizzes that were taken after playing the game for approximately one-hour sessions, despite difficulties in some cases both accessing and playing the game for the full hour. Our findings also reveal the ongoing challenges in making use of technology in a variety of school sessions, even when using a browser-based game, that demanded very little other than a reliable internet connection.

  19. The Anthropocene : A Challenge for the History of Science, Technology, and the Environment.

    Science.gov (United States)

    Trischler, Helmuth

    2016-09-01

    In 2000, when atmospheric chemist Paul J. Crutzen and limnologist Eugene F. Stoermer proposed to introduce a new geological era, the Anthropocene, they could not have foreseen the remarkable career of the new term. Within a few years, the geological community began to investigate the scientific evidence for the concept and established the Anthropocene Working Group. While the Working Group has started to examine possible markers and periodizations of the new epoch, scholars from numerous other disciplines have taken up the Anthropocene as a cultural concept. In addition, the media have developed a deep interest in the Anthropocene's broader societal ramifications. The article sheds light on the controversial debate about the Anthropocene and discusses its inextricably linked dual careers, first as a geological term and second as a cultural term. Third, it argues that the debate about the "Age of Humans" is a timely opportunity both to rethink the nature-culture relation and to re-assess the narratives that historians of science, technology, and the environment have written until now. Specifically, it examines both the heuristic and analytical power of the concept. It discusses new histories, new ideas to understand historical change, and new temporalities shaped by scholars who have taken up the challenge of the Anthropocene as a cultural concept that has the ability to question established stories and narratives. Fourth, it ends by stressing the potential of the Anthropocene concept to blur established epistemological boundaries and to stimulate cross-disciplinary collaborations between the sciences and the humanities.

  20. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  1. History, achievements, and future challenges of the Japanese Society of Soil Science and Plant Nutrition

    Science.gov (United States)

    Kosaki, Takashi

    2013-04-01

    established East and Southeast Asian Federation of Soil Science Societies (ESAFS) in 1991. Since the early 1990s the research topics have become more related to the global as well as regional environmental issues. Major achievements in the history of the society may include 1) development of research particularly on paddy soils and volcanic ash soils, 2) consistent commitment to the education for constructing sustainable society, and 3) international cooperation in improving rice production in the developing countries particularly in Tropical Asia. Today 2,699 members are registered in the society, which includes 9 divisions and holds an annual meeting every year. Two journals are bimonthly published, i.e. "Japanese Journal of Soil Science and Plant Nutrition" in Japanese and "SSPN" in English and the latter was recognized as a cooperating journal of IUSS in 2010. Future challenges of the society are 1) more commitment to international organizations, e.g. EGU in addition to IUSS, ESAFS and other soil-based communities, 2) enhancement of international cooperation for developing countries not only in Asia but also Africa, and 3) acceleration of soils research and education in association with related disciplines for constructing a holistically harmonized society on the planet earth.

  2. Challenges and Changes: Developing Teachers' and Initial Teacher Education Students' Understandings of the Nature of Science

    Science.gov (United States)

    Ward, Gillian; Haigh, Mavis

    2017-12-01

    Teachers need an understanding of the nature of science (NOS) to enable them to incorporate NOS into their teaching of science. The current study examines the usefulness of a strategy for challenging or changing teachers' understandings of NOS. The teachers who participated in this study were 10 initial teacher education chemistry students and six experienced teachers from secondary and primary schools who were introduced to an explicit and reflective activity, a dramatic reading about a historical scientific development. Concept maps were used before and after the activity to assess teachers' knowledge of NOS. The participants also took part in a focus group interview to establish whether they perceived the activity as useful in developing their own understanding of NOS. Initial analysis led us to ask another group, comprising seven initial teacher education chemistry students, to take part in a modified study. These participants not only completed the same tasks as the previous participants but also completed a written reflection commenting on whether the activity and focus group discussion enhanced their understanding of NOS. Both Lederman et al.'s (Journal of Research in Science Teaching, 39(6), 497-521, 2002) concepts of NOS and notions of "naive" and "informed" understandings of NOS and Hay's (Studies in Higher Education, 32(1), 39-57, 2007) notions of "surface" and "deep" learning were used as frameworks to examine the participants' specific understandings of NOS and the depth of their learning. The ways in which participants' understandings of NOS were broadened or changed by taking part in the dramatic reading are presented. The impact of the data-gathering tools on the participants' professional learning is also discussed.

  3. Distributed Data Management Service for VPH Applications

    NARCIS (Netherlands)

    Koulouzis, S.; Belloum, A.; Bubak, M.; Lamata, P.; Nolte, D.; Vasyunin, D.; de Laat, C.

    2016-01-01

    For many medical applications, it's challenging to access large datasets, which are often hosted across different domains on heterogeneous infrastructures. Homogenizing the infrastructure to simplify data access is unrealistic; therefore, it's important to develop distributed storage that doesn't

  4. Challenges of E-learning in Medical Sciences: A Review Article

    Directory of Open Access Journals (Sweden)

    mahim naderifar

    2017-06-01

    Full Text Available Background and objective: The expansion of knowledge and information has given a new meaning to the concept of education. One of the most important reasons for the use of e-learning in medical education is that learners direct their own learning; this method facilitates individualized education programs. This study introduces the challenges of, and solutions for, achieving e-learning in medical education. Materials and Methods: This is a review article based on a comprehensive search of the World Wide Web. Databases such as Medline, Ovid, ProQuest, and PubMed were searched using the key words “e-learning, educational challenges and medical education” in Persian and English. Of the 80 articles found, 30 articles related to the research objective were chosen. Results: The research showed that e-learning, despite its advantages and wide applications, has drawbacks, including lack of implementation by lecturers due to limited knowledge of how it works, the fading role of the lecturer, lack of expertise in its application, fear of its application, special cultural beliefs and insufficient resources. Conclusion: It is necessary to establish standards and infrastructure for the implementation of e-learning in medical education. Because of the relative inexperience of the universities of medical sciences in Iran compared with other universities around the world, we suggest drawing on the experience of universities in other countries. Holding workshops on e-learning can also be effective.

  5. Mining the Quantified Self: Personal Knowledge Discovery as a Challenge for Data Science.

    Science.gov (United States)

    Fawcett, Tom

    2015-12-01

    The last several years have seen an explosion of interest in wearable computing, personal tracking devices, and the so-called quantified self (QS) movement. Quantified self involves ordinary people recording and analyzing numerous aspects of their lives to understand and improve themselves. This is now a mainstream phenomenon, attracting a great deal of attention, participation, and funding. As more people are attracted to the movement, companies are offering various new platforms (hardware and software) that allow ever more aspects of daily life to be tracked. Nearly every aspect of the QS ecosystem is advancing rapidly, except for analytic capabilities, which remain surprisingly primitive. With increasing numbers of quantified self participants collecting ever greater amounts and types of data, many people literally have more data than they know what to do with. This article reviews the opportunities and challenges posed by the QS movement. Data science provides well-tested techniques for knowledge discovery. But making these useful for the QS domain poses unique challenges that derive from the characteristics of the data collected as well as the specific types of actionable insights that people want from the data. Using a small sample of QS time series data containing information about personal health, we provide a formulation of the QS problem that connects data to the decisions of interest to the user.
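
    A flavour of the analysis gap discussed above can be given with a few lines of time-series processing on synthetic self-tracking data: smoothing a noisy daily series and flagging unusual days. The data and the simple z-score rule are illustrative assumptions, not the article's formulation.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(7)
      days = pd.date_range("2024-01-01", periods=90, freq="D")
      steps = pd.Series(8000 + 1500 * rng.standard_normal(90), index=days).clip(lower=0)

      weekly = steps.rolling(window=7, min_periods=7).mean()   # smooth weekly trend
      zscore = (steps - steps.mean()) / steps.std()
      unusual = steps[zscore.abs() > 2]                        # crude anomaly flag

      print(weekly.tail(3).round(0))
      print("unusual days:", list(unusual.index.date))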

  6. Making On-line Science Course Materials Easily Translatable and Accessible Worldwide: Challenges and Solutions

    Science.gov (United States)

    Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.

    2012-02-01

    The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET project team overcame this challenge by creating the Translation Utility. This tool allows a person fluent in both English and another language to easily translate any of the PhET simulations and requires minimal computer expertise. In this paper we discuss the technical issues involved in this software solution, as well as the issues involved in obtaining accurate translations. We share our solutions to many of the unexpected problems we encountered that would apply generally to making on-line scientific course materials available in many different languages, including working with: languages written right-to-left, different character sets, and different conventions for expressing equations, variables, units and scientific notation.
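
    The underlying principle of making materials translatable, keeping every user-facing string out of the code and in a per-locale message catalogue, can be sketched with Python's standard gettext module. This is a generic stand-in for the idea, not the PhET Translation Utility (which is Java-based); the domain and locale directory names are assumptions.

      import gettext

      # Falls back to the English source strings if no compiled catalogue exists for
      # the requested language (here Arabic, a right-to-left locale).
      translation = gettext.translation("sim", localedir="locale", languages=["ar"], fallback=True)
      _ = translation.gettext

      # Every displayed string goes through _() instead of being hard-coded, so a
      # translator only edits the message catalogue, never the simulation code.
      print(_("Energy"))
      print(_("Run simulation"))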

  7. A Longitudinal Study of Implementing Reality Pedagogy in an Urban Science Classroom: Effects, Challenges, and Recommendations for Science Teaching and Learning

    Science.gov (United States)

    Borges, Sheila Ivelisse

    Statistics indicate that students who reside in forgotten places do not engage in science-related careers. This is problematic because we are not tapping into diverse talent that could very well make scientific strides and because there is a moral obligation for equity as discussed in Science for all (AAAS, 1989). Research suggests that one of the reasons for this disparity is that students feel alienated from science early on in their K-12 education due to their inability to connect culturally with their teachers (Tobin, 2001). Urban students share an urban culture, a way of knowing and being that is separate from that of the majority of the teacher workforce, who have not experienced the nuances of urban culture. These teachers face challenges when teaching in urban classrooms and a myriad of difficulties such as classroom management, limited access to experienced science colleagues and limited resources to teach effectively. This leads them to leave the teaching profession, compounding already high teacher attrition rates in urban areas (Ingersol, 2001). In order to address these issues, a culturally relevant pedagogy, called reality pedagogy (Emdin, 2011), was implemented in an urban science classroom using a bricolage (Denzin & Lincoln, 2005) of different theories such as social capital (Bourdieu, 1986) and critical race theory (Ladson-Billings & Tate, 1995), along with reality pedagogy, to construct a qualitative sociocultural lens. Reality pedagogy has five tools, which are cogenerative dialogues, coteaching, cosmopolitanism, context, and content. In this longitudinal critical ethnography a science teacher in an alternative teaching certification program was supported for two years as she implemented the tools of reality pedagogy with her urban students. Findings revealed that the science teacher enacted four racial microaggressions against her students, which negatively affected the teacher-student relationship and science teaching and learning. As the

  8. Data Management Consulting at the Johns Hopkins University

    Science.gov (United States)

    Varvel, Virgil E., Jr.; Shen, Yi

    2013-01-01

    As research data complexity and quantity grows and funding agency requirements for data management are articulated, there is a growing need for data management services (DMS). Within these services, one important role emerging is that of data management consultant (DMC). Roles were analyzed that these professionals play through case study analysis…

  9. Teaching and Learning Science in the 21st Century: Challenging Critical Assumptions in Post-Secondary Science

    Science.gov (United States)

    Glaze, Amanda L.

    2018-01-01

    It is widely agreed upon that the goal of science education is building a scientifically literate society. Although there are a range of definitions for science literacy, most involve an ability to problem solve, make evidence-based decisions, and evaluate information in a manner that is logical. Unfortunately, science literacy appears to be an…

  10. The ATLAS data management software engineering process

    International Nuclear Information System (INIS)

    Lassnig, M; Garonne, V; Stewart, G A; Barisits, M; Serfon, C; Goossens, L; Nairz, A; Beermann, T; Vigne, R; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  11. The ATLAS data management software engineering process

    Science.gov (United States)

    Lassnig, M.; Garonne, V.; Stewart, G. A.; Barisits, M.; Beermann, T.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.
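
    The test-driven style described above can be illustrated with a minimal pytest example in which the intended behaviour is pinned down by tests that accompany the code under review. The function and the data-identifier format below are made up for the illustration and are not part of Rucio.

      import pytest

      def parse_scope_name(did):
          # Split a data identifier of the assumed form 'scope:name'.
          scope, sep, name = did.partition(":")
          if not sep or not scope or not name:
              raise ValueError(f"malformed data identifier: {did!r}")
          return scope, name

      def test_valid_did_is_split():
          assert parse_scope_name("mc23:EVNT.12345.root") == ("mc23", "EVNT.12345.root")

      def test_malformed_did_is_rejected():
          with pytest.raises(ValueError):
              parse_scope_name("no-separator")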

  12. A data management system for radiological data

    International Nuclear Information System (INIS)

    Burns, R.E.; Shonka, J.J.; DeBord, D.M.; Sukalac, T.R.

    1996-01-01

    A data management system oriented toward the visualization of large data sets has been developed. The system, which runs under the Windows trademark operating environment, provides most of the image-based algorithms developed by NASA for space-based imaging. When used in conjunction with large data sets, such as those acquired using position-sensing proportional counter based survey methods, the system can show images of survey data and subject those images to numerous mathematical transformations. The application of such mathematical treatments allows radiological survey data to be analyzed to an unprecedented extent. Visualization of survey data permits a user to see minor artifacts that would have been missed using only conventional survey techniques. The imaged survey data can be subjected to many different treatments, such as filters, smoothing, thresholding, color mapping, and statistical analyses. A variety of radioactive objects and areas have been surveyed using this system in conjunction with a novel floor monitor described elsewhere. Collection and examination of data in this fashion poses a new paradigm for assessing surface contamination. This research demonstrated that new methods for assessing survey performance are needed
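
    The image-style treatments mentioned above (filters, smoothing, thresholding) map directly onto standard array operations. The sketch below smooths a synthetic gridded survey and flags connected regions above a threshold; the grid and parameters are illustrative assumptions, not the system's algorithms.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(3)
      grid = rng.normal(loc=100.0, scale=10.0, size=(200, 200))  # counts per cell (background)
      grid[120:123, 80:83] += 60.0                                # a small elevated region

      smoothed = ndimage.gaussian_filter(grid, sigma=2.0)         # suppress counting noise
      threshold = smoothed.mean() + 3 * smoothed.std()
      labels, n_regions = ndimage.label(smoothed > threshold)     # connected elevated regions

      print(f"flagged {n_regions} region(s) above {threshold:.1f} counts")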

  13. ISDMS, Inel Scientific Data Management System

    International Nuclear Information System (INIS)

    Bruestle, H.R.; Russell, K.D.; Snider, D.M.; Stewart, H.D.

    1993-01-01

    Description of program or function: The Idaho National Engineering Laboratory (INEL) Scientific Data Management System, ISDMS, is a generalized scientific data processing system designed to meet the needs of the various organizations at the INEL. It consists of a set of general and specific processors running under the control of an executive processor which serves as the interface between the system and the user. The data requirements at the INEL are primarily for time series analyses. Data acquired at various site facilities are processed on the central CDC CYBER computers. This processing includes: data conversion, data calibration, computed parameter calculations, time series plots, and sundry other applications. The data structure used in ISDMS is CWAF, a common word addressable format. A table driven command language serves as the ISDMS control language. Execution in both batch and interactive mode is possible. All commands and their input arguments are specified in free form. ISDMS is a modular system both at the top executive or MASTER level and in the independent lower or sub-level modules. ISDMS processors were designed and isolated according to their function. This release of ISDMS, identified as 1.3A by the developers, includes processors for data conversion and reformatting for applications programs (e.g. RELAP4), interactive and batch graphics, data analysis, data storage, and archival and development aids.

  14. Challenges of the science data processing, analysis and archiving approach in BepiColombo

    Science.gov (United States)

    Martinez, Santa

    BepiColombo is a joint mission of the European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) to the planet Mercury. It comprises two separate orbiters: the Mercury Planetary Orbiter (MPO) and the Mercury Magnetospheric Orbiter (MMO). After approximately 7.5 years of cruise, BepiColombo will arrive at Mercury in 2024 and will gather data during a 1-year nominal mission, with a possible 1-year extension. The approach selected for BepiColombo for the processing, analysis and archiving of the science data represents a significant change with respect to previous ESA planetary missions. Traditionally, Instrument Teams are responsible for processing, analysing and preparing their science data for the long-term archive; in BepiColombo, however, the Science Ground Segment (SGS), located in Madrid, Spain, will play a key role in these activities. Fundamental aspects of this approach include: the involvement of the SGS in the definition, development and operation of the instrument processing pipelines; the production of ready-to-archive science products compatible with NASA’s Planetary Data System (PDS) standards in all the processing steps; the joint development of a quick-look analysis system to monitor deviations between planned and executed observations to feed back the results into the different planning cycles when possible; and a mission archive providing access to the scientific products and to the operational data throughout the different phases of the mission (from the early development phase to the legacy phase). In order to achieve these goals, the SGS will need to overcome a number of challenges. The proposed approach requires a flexible infrastructure able to cope with a distributed data processing system, residing in different locations but designed as a single entity. For this, all aspects related to the integration of software developed by different Instrument Teams and the alignment of their development schedules will need to be
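
    The planned-versus-executed quick-look comparison described above reduces, at its simplest, to a diff between two observation lists. The identifiers and fields below are invented for illustration; this is not the SGS quick-look system.

      planned = {
          "OBS-0001": {"target": "Mercury limb", "duration_s": 600},
          "OBS-0002": {"target": "Surface strip A", "duration_s": 1200},
          "OBS-0003": {"target": "Exosphere scan", "duration_s": 300},
      }
      executed = {
          "OBS-0001": {"target": "Mercury limb", "duration_s": 600},
          "OBS-0002": {"target": "Surface strip A", "duration_s": 900},   # cut short
      }

      missing = sorted(set(planned) - set(executed))
      shorter = [oid for oid in planned.keys() & executed.keys()
                 if executed[oid]["duration_s"] < planned[oid]["duration_s"]]

      print("not executed:", missing)
      print("shorter than planned:", shorter)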

  15. Next-Generation Photon Sources for Grand Challenges in Science and Energy

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-05-01

    This report identifies two aspects of energy science in which next-generation ultraviolet and X-ray light sources will have the deepest and broadest impact: (1) the temporal evolution of electrons, spins, atoms, and chemical reactions, down to the femtosecond time scale; and (2) spectroscopic and structural imaging of nano objects (or nanoscale regions of inhomogeneous materials) with nanometer spatial resolution and ultimate spectral resolution. The dual advances of temporal and spatial resolution promised by fourth-generation light sources ideally match the challenges of control science. Femtosecond time resolution has opened completely new territory where atomic motion can be followed in real time and electronic excitations and decay processes can be tracked as they unfold. Coherent imaging with short-wavelength radiation will make it possible to access the nanometer length scale, where intrinsic quantum behavior becomes dominant. Performing spectroscopy on individual nanometer-scale objects rather than on conglomerates will eliminate the blurring of energy levels induced by particle size and shape distributions and reveal the energetics of single functional units. Energy resolution limited only by the uncertainty relation is enabled by these advances. Current storage-ring-based light sources and their incremental enhancements cannot meet the need for femtosecond time resolution, nanometer spatial resolution, intrinsic energy resolution, full coherence over energy ranges up to hard X-rays, and the peak brilliance required to enable the new science outlined in this report. In fact, the new, unexplored territory is so expansive that no single currently imagined light source technology can fulfill the whole potential. Both technological and economic challenges require resolution as we move forward. For example, femtosecond time resolution and high peak brilliance are required for following chemical reactions in real time, but lower peak brilliance and high repetition rate are needed

  16. From interventions to interactions: Science Museum Arts Projects’ history and the challenges of interpreting art in the Science Museum

    Directory of Open Access Journals (Sweden)

    Hannah Redler

    2009-06-01

    Full Text Available Hannah Redler’s paper examines the 13-year history of the Science Museum, London’s contemporary art programme and explores how changing cultural conditions and the changing function of museums are making the questions raised by bringing art into the Science Museum context increasingly significant. It looks at how Science Museum Arts Projects started as a quirky, experimental sideline aimed at shaking up the Museum and its visitors’ assumptions, but has now become a fundamental means by which the Science Museum chooses to represent the impact of science, medicine, engineering and technology on people’s everyday lives.

  17. Medical Data Manager an Interface between PACS and the gLite Data Management System

    CERN Document Server

    Montagnat, Johan; Texier, Romain; Nienartowicz, Krzysztof; Baud, Jean-Philippe

    2008-01-01

    The medical imaging community uses the DICOM image format and protocol to store and exchange data. The Medical Data Manager (MDM) is an interface between DICOM-compliant systems such as PACS and the EGEE Data Management System. It opens hospital imaging networks to the world-scale Grid while protecting sensitive medical data. It can be accessed transparently from any gLite service. It is an important milestone towards adoption of Grid technologies in the medical imaging community. Hospitals continuously produce tremendous amounts of image data that are managed by local PACS (Picture Archiving and Communication Systems). These systems are often limited to local network access, although the community shows a growing interest in data sharing and remote processing. Indeed, patient data is often spread across different medical data acquisition centers. Furthermore, researchers in the area often need to analyze large populations whose data can be gathered through federations of PACS. Opening PACS to the outer I...
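
    As a minimal sketch of the DICOM side of such a bridge, assuming the third-party pydicom package (and using one of its bundled sample files so the snippet runs without real patient data), the following reads a DICOM object, prints the identifiers a grid gateway would catalogue, and blanks the patient identity before the object leaves the hospital network. This illustrates the general pattern only, not the MDM implementation.

      import pydicom  # assumption: the third-party pydicom package is installed
      from pydicom.data import get_testdata_file

      def summarize_and_deidentify(src_path, out_path):
          """Read a DICOM object, print key identifiers, and blank patient identity.

          Sketch of the general pattern only; a real MDM-style gateway would also
          register the object's unique identifiers with the grid catalogue.
          """
          ds = pydicom.dcmread(src_path)
          print("Modality:        ", ds.Modality)
          print("Study UID:       ", ds.StudyInstanceUID)
          print("SOP Instance UID:", ds.SOPInstanceUID)

          # Minimal de-identification before the object leaves the hospital network.
          ds.PatientName = "ANONYMIZED"
          ds.PatientID = "0000"
          ds.save_as(out_path)

      # Sample file bundled with pydicom, used purely for demonstration.
      summarize_and_deidentify(get_testdata_file("CT_small.dcm"), "ct_small_deid.dcm")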

  18. Data Management Practices and Perspectives of Atmospheric Scientists and Engineering Faculty

    Science.gov (United States)

    Wiley, Christie; Mischo, William H.

    2016-01-01

    This article analyzes 21 in-depth interviews of engineering and atmospheric science faculty at the University of Illinois Urbana-Champaign (UIUC) to determine faculty data management practices and needs within the context of their research activities. A detailed literature review of previous large-scale and institutional surveys and interviews…

  19. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science through Cloud-Enabled Climate Analytics-as-a-Service

    Science.gov (United States)

    Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.

    2013-12-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to
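
    The abstract does not show MERRA/AS's actual MapReduce code; the plain-Python sketch below, with made-up (year, temperature) records, only illustrates the map-then-reduce pattern of computing a per-year mean of a climate variable.

      from collections import defaultdict
      from statistics import mean

      # Hypothetical (year, surface_temperature_K) records standing in for
      # values extracted from reanalysis granules.
      records = [(1980, 287.1), (1980, 287.4), (1981, 287.0), (1981, 287.6)]

      def map_phase(record):
          # Emit a (key, value) pair per record, keyed by year.
          year, temperature = record
          return year, temperature

      def reduce_phase(pairs):
          # Group values by key and reduce each group to its mean.
          grouped = defaultdict(list)
          for year, temperature in pairs:
              grouped[year].append(temperature)
          return {year: mean(values) for year, values in grouped.items()}

      yearly_means = reduce_phase(map_phase(r) for r in records)
      print(yearly_means)  # {1980: 287.25, 1981: 287.3}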

  20. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science Through Cloud-enabled Climate Analytics-as-a-service

    Science.gov (United States)

    Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.

    2014-01-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to

  1. The science of animal behavior and welfare: challenges, opportunities and global perspective

    Science.gov (United States)

    Animal welfare science is a relatively new scientific discipline. Originally heavily focused on animal behavior, it has evolved into a truly multi- and inter-disciplinary science, encompassing such sciences as behavior, physiology, pathology, immunology, endocrinology and neuroscience, and influence...

  2. Research Data Management: A Library Practitioner's Perspective

    Science.gov (United States)

    Yu, Siu Hong

    2017-01-01

    The Future Voices in Public Services column is a forum for students in graduate library and information science programs to discuss key issues they see in academic library public services, to envision what they feel librarians in public service have to offer to academia, to relate their visions for the profession, or to describe research that is…

  3. Australia's TERN: Advancing Ecosystem Data Management in Australia

    Science.gov (United States)

    Phinn, S. R.; Christensen, R.; Guru, S.

    2013-12-01

    Globally, there is a consistent movement towards more open, collaborative and transparent science, where the publication and citation of data is considered standard practice. Australia's Terrestrial Ecosystem Research Network (TERN) is a national research infrastructure investment designed to support the ecosystem science community through all stages of the data lifecycle. TERN has developed and implemented a comprehensive network of 'hard' and 'soft' infrastructure that enables Australia's ecosystem scientists to collect, publish, store, share, discover and re-use data in ways not previously possible. The aim of this poster is to demonstrate how TERN has successfully delivered infrastructure that is enabling a significant cultural and practical shift in Australia's ecosystem science community towards consistent approaches for data collection, metadata, data licensing, and data publishing. TERN enables multiple disciplines within the ecosystem sciences to collect, store and publish their data more effectively and efficiently. A critical part of TERN's approach has been to build on existing data collection activities, networks and skilled people to enable further coordination and collaboration to build each data collection facility and coordinate data publishing. Data collection in TERN is through discipline-based facilities, covering long-term collection of: (1) systematic plot-based measurements of vegetation structure, composition and faunal biodiversity; (2) instrumented towers making systematic measurements of solar, water and gas fluxes; and (3) satellite and airborne maps of biophysical properties of vegetation, soils and the atmosphere. Several other facilities collect and integrate environmental data to produce national products for fauna and vegetation surveys, soils and coastal data, as well as integrated or synthesised products for modelling applications. Data management, publishing and sharing in TERN are implemented through a tailored data

  4. The Effect of Enrollment in Middle School Challenge Courses on Advanced Placement Exams in Social Studies and Science

    Science.gov (United States)

    Glaude-Bolte, Katherine

    Educators seek to guide students through appropriate programs and courses that prepare them for future success, in more advanced coursework and in other challenges of life. Some middle schools offer Challenge, or honors, courses for students who have demonstrated high ability. High schools often offer Advanced Placement (AP) courses, which are taught at the college level. This study examined the correlation between enrollment in middle school Challenge courses and subsequent AP exam category scores in social studies and science in a suburban school district. The independent variables were the number of years of enrollment in middle school social studies or science Challenge courses. The dependent variables were the AP exam category scores in the eight social studies AP courses or the six science AP courses. The sample sizes were limited to the number of students who took an AP social studies or science exam and also attended the middle school of study. The null hypothesis was that there was no relationship between the two variables. This study included eight social studies AP courses and six science AP courses. A significant positive correlation was indicated in only two of the courses, U.S. Government and Comparative Government, supporting the claim that enrollment in middle school Challenge social studies was correlated with success, at least on these two AP exams. In the remaining 12 courses, there was not enough evidence to reject the null hypothesis. Therefore, enrollment in middle school Challenge science and social studies courses generally did not seem to correlate with AP exam category scores. Results of this study call into question the validity of the claim by the district that enrollment in Challenge courses helps prepare students for rigorous coursework in high school. Several factors, including student readiness, teacher training, familiarity with course content, and previous AP experience may contribute more to a student's AP exam category score
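
    As a generic illustration of the kind of correlation test the study describes (not the district's data or analysis), the sketch below computes a Pearson correlation between years of Challenge enrollment and AP exam category scores for a handful of made-up students, assuming SciPy is available.

      from scipy.stats import pearsonr  # assumption: SciPy is installed

      # Made-up records: years in middle school Challenge social studies,
      # and the corresponding AP exam category score (1-5).
      years_enrolled = [0, 0, 1, 1, 2, 2, 3, 3]
      ap_scores      = [2, 3, 2, 3, 3, 4, 4, 5]

      r, p_value = pearsonr(years_enrolled, ap_scores)
      print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
      # Reject the null hypothesis of no relationship only if p falls below
      # the chosen significance level (e.g. 0.05).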

  5. From Utopia to Science: Challenges of Personalised Genomics Information for Health Management and Health Enhancement

    Science.gov (United States)

    2009-01-01

    From 1900 onwards, scientists and novelists have explored the contours of a future society based on the use of “anthropotechnologies” (techniques applicable to human beings for the purpose of performance enhancement ranging from training and education to genome-based biotechnologies). Gradually but steadily, the technologies involved migrated from (science) fiction into scholarly publications, and from “utopia” (or “dystopia”) into science. Building on seminal ideas borrowed from Nietzsche, Peter Sloterdijk has outlined the challenges inherent in this development. Since time immemorial, and at least since the days of Plato’s Academy, human beings have been interested in possibilities for (physical or mental) performance enhancement. We are constantly trying to improve ourselves, both collectively and individually, for better or for worse. At present, however, new genomics-based technologies are opening up new avenues for self-amelioration. Developments in research facilities using animal models may to a certain extent be seen as expeditions into our own future. Are we able to address the bioethical and biopolitical issues awaiting us? After analyzing and assessing Sloterdijk’s views, attention will shift to a concrete domain of application, namely sport genomics. For various reasons, top athletes are likely to play the role of genomics pioneers by using personalized genomics information to adjust diet, life-style, training schedules and doping intake to the strengths and weaknesses of their personalized genome information. Thus, sport genomics may be regarded as a test bed where the contours of genomics-based self-management are tried out. PMID:20234832

  6. How to Visualize and Communicate Challenges in Climate and Environmental Sciences?

    Science.gov (United States)

    Vicari, R.; Schertzer, D. J. M.; Deutsch, J. C.

    2014-12-01

    The challenges of climate and environmental sciences call for a renewed dialogue with a broad spectrum of stakeholders, ranging from the general public to specialists. This requires better use of sophisticated visualization techniques, both to convey information and to follow the corresponding flow of information. A particular case of interest is the question of resilience to extreme weather events, which also relies on increasing the awareness of urban communities. This research looks at the development of techniques for exploring unstructured Big Data. Indeed, access to information on environmental and climate sciences has increased hugely in variety and quantity, as a consequence of several factors, among them the development of public relations by research institutes and the pervasive role of digital media (Bucchi 2013; Trench 2008). We are left with enormous amounts of information from blogs, social network postings, public speeches, press releases, articles, etc. It is now possible to explore and visualize the patterns followed by digital information with the support of automated analysis tools. These techniques can, in turn, provide important insights into how different visual communication techniques affect urban resilience to extreme weather. The selected case studies correspond to several research projects under the umbrella of the Chair "Hydrology for resilient cities", which aims to develop and test new solutions in urban hydrology that will contribute to the resilience of our cities to extreme weather. These research projects - ranging from regional projects (e.g. RadX@IdF) and European projects (e.g. Blue Green Dream and RainGain) to worldwide collaborations (e.g. TOMACS) - include awareness-raising and capacity-building activities aimed at fostering cooperation between scientists, professionals, and beneficiaries. This presentation will explore how visualization techniques can be used in the above mentioned projects in order to support

  7. Plasma experiments elucidative for challenging problems investigated in other branches of science

    International Nuclear Information System (INIS)

    Sanduloviciu, M.; Popescu, S.

    2001-01-01

    Driving a plasma initially in an asymptotically stable state away from thermal equilibrium makes it possible to identify the succession of physical processes that form, as a whole, a new scenario of self-organization able to explain, besides challenging problems of non-equilibrium physics, some of the essential problems of the chemical and biological sciences that remain unsolved today. Thus, plasma experiments have revealed the presence of a local self-enhancement mechanism associated with long-range inhibition that explains pattern formation in general. Two successively produced instabilities originating in a positive feedback mechanism were identified as being at the origin of the spatial and spatial-temporal patterns, respectively. This feedback mechanism comprises a self-enhancing mechanism for the production of positive ions, complemented by the creation of a net negative space charge through the accumulation of electrons that have lost their kinetic energy in neutral excitations. The informational content concerning self-organization revealed by the plasma experiments suggests a new physical basis for the behavior of systems working as differential negative resistances, as well as new information on the actual cause of the anomalous transport of particles and energy. These results are of special interest in solid state physics, where the mechanism of the current instabilities observed in semiconductors remains an unresolved problem. Anomalous transport of particles and energy is also a challenging problem in high energy physics, because it is considered the principal cause impeding improvement of the economic performance of fusion devices. Since all chemical and biological phenomena involve at least physical processes, the scenario of self-organization identified in plasma could be elucidative for understanding phenomena such as pattern formation in chemical media, but also the spontaneous self-assembling of the

  8. IceBridge Data Management and Access Strategies at NSIDC

    Science.gov (United States)

    Oldenburg, J.; Tanner, S.; Collins, J. A.; Lewis, S.; FitzGerrell, A.

    2013-12-01

    NASA's Operation IceBridge (OIB) mission, initiated in 2009, collects airborne remote sensing measurements over the polar regions to bridge the gap between NASA's Ice, Cloud and Land Elevation satellite (ICESat) mission and the upcoming ICESat-2 mission in 2016. OIB combines an evolving mix of instruments to gather data on topography, ice and snow thickness, high-resolution photography, and other properties that are more difficult or impossible to measure via satellite. Once collected, these data are stored and made available at the National Snow and Ice Data Center (NSIDC) in Boulder, Colorado. To date, there are nearly 90 terabytes of data available, and there are about three more years of data collection left. The main challenges faced in data management at NSIDC are derived from the quantity and heterogeneity of the data. To deal with the quantity of data, the technical teams at NSIDC have significantly automated the data ingest, metadata generation, and other required data management steps. Heterogeneity of data and the evolution of the Operation over time make technical automation complex. To limit complexity, the IceBridge team has agreed to such practices as using specific data file formats, limiting file sizes, using specific filename templates, etc. These agreements evolve as Operation IceBridge moves forward. The metadata generated about the flights and the data collected thereon make the storage of the data more robust, and enable data discoverability. With so much metadata, users can search the vast collection with ease using specific parameters about the data they seek. An example of this in action is the IceBridge data portal developed at NSIDC, http://nsidc.org/icebridge/portal/. This portal uses the GPS data from the flights projected onto maps as well as other flight and instrument metadata to help the user find the exact data file they seek. This implementation is only possible with dependable data management beneath the surface. The data files
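
    NSIDC's ingest tooling is not described in detail here; as a toy illustration of how an agreed filename template enables automated metadata generation, the sketch below parses hypothetical filenames of the form INSTRUMENT_YYYYMMDD_FLIGHTID.ext with a regular expression and rejects non-conforming files. The template is invented for illustration and is not NSIDC's real convention.

      import re

      # Hypothetical template: INSTRUMENT_YYYYMMDD_FLIGHTID.ext
      TEMPLATE = re.compile(
          r"^(?P<instrument>[A-Z]+)_(?P<date>\d{8})_(?P<flight>[A-Za-z0-9]+)\.(?P<ext>\w+)$"
      )

      def extract_metadata(filename):
          """Return a metadata record for a conforming filename, or None."""
          match = TEMPLATE.match(filename)
          if match is None:
              return None
          record = match.groupdict()
          record["filename"] = filename
          return record

      for name in ["ATM_20130402_F01.h5", "badly_named_file.dat"]:
          meta = extract_metadata(name)
          print(name, "->", meta if meta else "rejected: does not match template")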

  9. A Semantic Cross-Species Derived Data Management Application

    Directory of Open Access Journals (Sweden)

    David B. Keator

    2017-09-01

    Full Text Available Managing dynamic information in large multi-site, multi-species, and multi-discipline consortia is a challenging task for data management applications. Often in academic research studies the goals for informatics teams are to build applications that provide extract-transform-load (ETL) functionality to archive and catalog source data that has been collected by the research teams. In consortia that cross species and methodological or scientific domains, building interfaces which supply data in a usable fashion and make intuitive sense to scientists from dramatically different backgrounds increases the complexity for developers. Further, reusing source data from outside one’s scientific domain is fraught with ambiguities in understanding the data types, analysis methodologies, and how to combine the data with those from other research teams. We report on the design, implementation, and performance of a semantic data management application to support the NIMH-funded Conte Center at the University of California, Irvine. The Center is testing a theory of the consequences of “fragmented” (unpredictable, high-entropy) early-life experiences on adolescent cognitive and emotional outcomes in both humans and rodents. It employs cross-species neuroimaging, epigenomic, molecular, and neuroanatomical approaches in humans and rodents to assess the potential consequences of fragmented, unpredictable experience on brain structure and circuitry. To address this multi-technology, multi-species approach, the system uses semantic web techniques based on the Neuroimaging Data Model (NIDM) to facilitate data ETL functionality. We find this approach enables a low-cost, easy-to-maintain, and semantically meaningful information management system, enabling the diverse research teams to access and use the data.
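
    NIDM defines its own vocabulary and serializations; the sketch below, assuming the third-party rdflib package and using a made-up example namespace rather than real NIDM terms, only illustrates the underlying semantic-web pattern of describing an acquisition as RDF triples that can later be queried with SPARQL.

      from rdflib import Graph, Literal, Namespace, RDF  # assumption: rdflib installed

      EX = Namespace("http://example.org/conte-center#")  # made-up namespace, not NIDM

      g = Graph()
      scan = EX["acquisition/0001"]
      g.add((scan, RDF.type, EX.ImagingAcquisition))
      g.add((scan, EX.species, Literal("human")))
      g.add((scan, EX.modality, Literal("MRI")))
      g.add((scan, EX.ageAtScan, Literal(14)))

      # A SPARQL query over the graph: find all human imaging acquisitions.
      results = g.query(
          """
          SELECT ?scan ?modality WHERE {
              ?scan a ex:ImagingAcquisition ;
                    ex:species "human" ;
                    ex:modality ?modality .
          }
          """,
          initNs={"ex": EX},
      )
      for row in results:
          print(row.scan, row.modality)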

  10. IMMUNOCAT—A Data Management System for Epitope Mapping Studies

    Directory of Open Access Journals (Sweden)

    Jo L. Chung

    2010-01-01

    Full Text Available To enable rational vaccine design, studies of molecular and cellular mechanisms of immune recognition need to be linked with clinical studies in humans. A major challenge in conducting such translational research studies lies in the management and integration of large amounts and various types of data collected from multiple sources. For this purpose, we have established “IMMUNOCAT”, an interactive data management system for the epitope discovery research projects conducted by our group. The system provides functions to store, query, and analyze clinical and experimental data, enabling efficient, systematic, and integrative data management. We demonstrate how IMMUNOCAT is utilized in a large-scale research contract that aims to identify epitopes in common allergens recognized by T cells from human donors, in order to facilitate the rational design of allergy vaccines. At clinical sites, demographic information and disease history of each enrolled donor are captured, followed by results of an allergen skin test and blood draw. At the laboratory site, T cells derived from blood samples are tested for reactivity against a panel of peptides derived from common human allergens. IMMUNOCAT stores results from these T cell assays along with MHC:peptide binding data, results from RAST tests for antibody titers in donor serum, and the respective donor HLA typing results. Through this system, we are able to perform queries and integrated analyses of the various types of data. This provides a case study for the use of bioinformatics and information management techniques to track and analyze data produced in a translational research study aimed at epitope identification.

  11. IMMUNOCAT-a data management system for epitope mapping studies.

    Science.gov (United States)

    Chung, Jo L; Sun, Jian; Sidney, John; Sette, Alessandro; Peters, Bjoern

    2010-01-01

    To enable rational vaccine design, studies of molecular and cellular mechanisms of immune recognition need to be linked with clinical studies in humans. A major challenge in conducting such translational research studies lies in the management and integration of large amounts and various types of data collected from multiple sources. For this purpose, we have established "IMMUNOCAT", an interactive data management system for the epitope discovery research projects conducted by our group. The system provides functions to store, query, and analyze clinical and experimental data, enabling efficient, systematic, and integrative data management. We demonstrate how IMMUNOCAT is utilized in a large-scale research contract that aims to identify epitopes in common allergens recognized by T cells from human donors, in order to facilitate the rational design of allergy vaccines. At clinical sites, demographic information and disease history of each enrolled donor are captured, followed by results of an allergen skin test and blood draw. At the laboratory site, T cells derived from blood samples are tested for reactivity against a panel of peptides derived from common human allergens. IMMUNOCAT stores results from these T cell assays along with MHC:peptide binding data, results from RAST tests for antibody titers in donor serum, and the respective donor HLA typing results. Through this system, we are able to perform queries and integrated analyses of the various types of data. This provides a case study for the use of bioinformatics and information management techniques to track and analyze data produced in a translational research study aimed at epitope identification.
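
    IMMUNOCAT's schema is not given in the abstract; the sketch below uses an invented SQLite schema (table and column names are illustrative only) to show the kind of integrated query described above, joining donor clinical records with T cell assay results.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript(
          """
          -- Invented schema, only to illustrate integrated clinical/assay queries.
          CREATE TABLE donor (id INTEGER PRIMARY KEY, hla_type TEXT, skin_test_positive INTEGER);
          CREATE TABLE tcell_assay (donor_id INTEGER, peptide TEXT, response REAL,
                                    FOREIGN KEY (donor_id) REFERENCES donor(id));
          INSERT INTO donor VALUES (1, 'DRB1*01:01', 1), (2, 'DRB1*04:01', 0);
          INSERT INTO tcell_assay VALUES (1, 'ALLERGEN_PEP_07', 312.0),
                                         (1, 'ALLERGEN_PEP_12', 18.0),
                                         (2, 'ALLERGEN_PEP_07', 25.0);
          """
      )

      # Which peptides elicit strong T cell responses in skin-test-positive donors?
      query = """
          SELECT a.peptide, d.hla_type, a.response
          FROM tcell_assay AS a JOIN donor AS d ON a.donor_id = d.id
          WHERE d.skin_test_positive = 1 AND a.response > 100
      """
      for peptide, hla, response in conn.execute(query):
          print(peptide, hla, response)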

  12. Pre-Service Science Teacher Education System in South Korea: Prospects and Challenges

    Science.gov (United States)

    Im, Sungmin; Yoon, Hye-Gyoung; Cha, Jeongho

    2016-01-01

    While much is known about the high academic but low affective achievement of Korean students on international comparative studies, little is known about science teacher education in Korea. As the quality of science teachers is an important factor determining the quality of science education, gaining an understanding of science education in Korea…

  13. Why Implementing History and Philosophy in School Science Education Is a Challenge: An Analysis of Obstacles

    Science.gov (United States)

    Hottecke, Dietmar; Silva, Cibelle Celestino

    2011-01-01

    Teaching and learning with history and philosophy of science (HPS) has been, and continues to be, supported by science educators. While science education standards documents in many countries also stress the importance of teaching and learning with HPS, the approach still suffers from ineffective implementation in school science teaching. In order…

  14. Research data management A European perspective

    CERN Document Server

    Kruse, Filip

    2017-01-01

    This new series presents and discusses new and innovative approaches used by professionals in library and information practice worldwide. The authors are chosen to provide critical analysis of issues and to present solutions to selected challenges in libraries and related fields, including information management and industry, and education of information professionals. The book series strives to present practical solutions that can be applied in institutions worldwide. It thereby contributes significantly to improvements in the field.

  15. Teaching and Learning Science in the 21st Century: Challenging Critical Assumptions in Post-Secondary Science

    Directory of Open Access Journals (Sweden)

    Amanda L. Glaze

    2018-01-01

    Full Text Available It is widely agreed upon that the goal of science education is building a scientifically literate society. Although there are a range of definitions for science literacy, most involve an ability to problem solve, make evidence-based decisions, and evaluate information in a manner that is logical. Unfortunately, science literacy appears to be an area where we struggle across levels of study, including with students who are majoring in the sciences in university settings. One reason for this problem is that we have opted to continue to approach teaching science in a way that fails to consider the critical assumptions that faculties in the sciences bring into the classroom. These assumptions include expectations of what students should know before entering given courses, whose responsibility it is to ensure that students entering courses understand basic scientific concepts, the roles of researchers and teachers, and approaches to teaching at the university level. Acknowledging these assumptions and the potential for action to shift our teaching and thinking about post-secondary education represents a transformative area in science literacy and preparation for the future of science as a field.

  16. [Productivity and academic assessment in the Brazilian public health field: challenges for Human and Social Sciences research].

    Science.gov (United States)

    Bosi, Maria Lúcia Magalhães

    2012-12-01

    This article analyzes some challenges for knowledge output in the human and social sciences in the public health field under the current academic assessment model in Brazil. The article focuses on the qualitative research approach in the human and social sciences, analyzing its status in comparison to the other traditions vying for hegemony in the public health field, and combining a dialogue with the literature, especially the propositions on social fields present in the work of Pierre Bourdieu, with elements concerning the field's dynamics, including some empirical data. Challenges identified in the article include hurdles to interdisciplinary dialogue and to equity in the production of knowledge, based on recognition of the founding place of the human and social sciences in the public health field. The article discusses strategies to reshape the current correlation of forces among centers of knowledge in public health, especially those capable of influencing the committees and agendas that define the accumulation of symbolic and economic capital in the field.

  17. Challenges Confronting Career-Changing Beginning Teachers: A Qualitative Study of Professional Scientists Becoming Science Teachers

    Science.gov (United States)

    Watters, James J.; Diezmann, Carmel M.

    2015-03-01

    Recruitment of highly qualified science and mathematics graduates has become a widespread strategy to enhance the quality of education in the field of STEM. However, attrition rates are very high suggesting preservice education programs are not preparing them well for the career change. We analyse the experiences of professionals who are scientists and have decided to change careers to become teachers. The study followed a group of professionals who undertook a 1-year preservice teacher education course and were employed by secondary schools on graduation. We examined these teachers' experiences through the lens of self-determination theory, which posits autonomy, confidence and relatedness are important in achieving job satisfaction. The findings indicated that the successful teachers were able to achieve a sense of autonomy and confidence and, in particular, had established strong relationships with colleagues. However, the unique challenges facing career-change professionals were often overlooked by administrators and colleagues. Opportunities to build a sense of relatedness in their new profession were often absent. The failure to establish supportive relationships was decisive in some teachers leaving the profession. The findings have implications for both preservice and professional in-service programs and the role that administrators play in supporting career-change teachers.

  18. Using Smartphones to Collect Behavioral Data in Psychological Science: Opportunities, Practical Considerations, and Challenges.

    Science.gov (United States)

    Harari, Gabriella M; Lane, Nicholas D; Wang, Rui; Crosier, Benjamin S; Campbell, Andrew T; Gosling, Samuel D

    2016-11-01

    Smartphones now offer the promise of collecting behavioral data unobtrusively, in situ, as it unfolds in the course of daily life. Data can be collected from the onboard sensors and other phone logs embedded in today's off-the-shelf smartphone devices. These data permit fine-grained, continuous collection of people's social interactions (e.g., speaking rates in conversation, size of social groups, calls, and text messages), daily activities (e.g., physical activity and sleep), and mobility patterns (e.g., frequency and duration of time spent at various locations). In this article, we have drawn on the lessons from the first wave of smartphone-sensing research to highlight areas of opportunity for psychological research, present practical considerations for designing smartphone studies, and discuss the ongoing methodological and ethical challenges associated with research in this domain. It is our hope that these practical guidelines will facilitate the use of smartphones as a behavioral observation tool in psychological science. © The Author(s) 2016.
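
    As a toy illustration of turning raw phone logs into one of the behavioral features mentioned above (time spent at various locations), the sketch below aggregates a made-up sequence of timestamped place labels into per-place dwell times; it does not use any real sensing framework.

      from collections import defaultdict
      from datetime import datetime

      # Made-up location log: (timestamp, place label inferred from GPS clustering).
      log = [
          ("2016-05-01T08:00", "home"),
          ("2016-05-01T09:00", "campus"),
          ("2016-05-01T13:30", "cafe"),
          ("2016-05-01T14:30", "campus"),
          ("2016-05-01T19:00", "home"),
      ]

      def dwell_times(entries):
          """Sum the hours spent at each place between consecutive log entries."""
          totals = defaultdict(float)
          for (t0, place), (t1, _) in zip(entries, entries[1:]):
              start = datetime.fromisoformat(t0)
              end = datetime.fromisoformat(t1)
              totals[place] += (end - start).total_seconds() / 3600.0
          return dict(totals)

      print(dwell_times(log))  # {'home': 1.0, 'campus': 9.0, 'cafe': 1.0}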

  19. Games As Educational Tools in eARTh Science: MAREOPOLI and THE ENERGY CHALLENGE.

    Science.gov (United States)

    Garvani, Sara; Locritani, Marina; di Laura, Francesca; Stroobant, Mascha; Merlino, Silvia

    2017-04-01

    Research and researchers have an important role to play in a sustainable green and blue economy. It is also clear that outreach activities are fundamental to improving the societal perception of science's past and present results and of its future insights or consequences, and that changing people's mentality is a primary goal. This is one of the main goals of the Scientific Dissemination Group (SDG) "La Spezia Gulf of Science", made up of Research Centres, Schools and Cultural associations located in La Spezia (Liguria, Italy). However, communicating scientific results also means improving educational methods: introducing a tight relationship with artists (especially graphic designers) can produce unusual approaches and translate concepts into images that everyone can understand, also from an emotional point of view. Images have a fundamental role in understanding and learning simple and less simple concepts; for example, the general public and high school students can be reached through interactive conferences with live speed painting (Locritani et al., 2016), and kids can be involved in interactive games. Games, especially, can reduce learning curves, since playing itself creates a natural forum for exchanging ideas and reflecting on natural phenomena and human impacts outside of class hours. Games, and the entertainment value of play, have the ability to teach and transform (Gobet et al., 2004). In this work we present two games that arose from the collaboration between researchers and artists: MAREOPOLI and THE ENERGY CHALLENGE. MAREOPOLI (The City of Tides) is a simplified adaptation of the famous board game Monopoly and consists of 36 spaces: 16 important historical and coastal cities with relevant tide phenomena, 8 Unexpected Events spaces (questions are asked on modern oceanography), 8 Curious Facts spaces (players receive information on historical records) and 4 corner squares: GO, (Blocked) in Limestone Grotto/Just Visiting, Free Beach Club, and Go to Limestone Grotto

  20. Collaborative Approaches to Undergraduate Research Training: Information Literacy and Data Management

    Directory of Open Access Journals (Sweden)

    Hailey Mooney

    2014-03-01

    Full Text Available The undergraduate research experience (URE provides an opportunity for students to engage in meaningful work with faculty mentors on research projects. An increasingly important component of scholarly research is the application of research data management best practices, yet this often falls out of the scope of URE programs. This article presents a case study of faculty and librarian collaboration in the integration of a library and research data management curriculum into a social work URE research team. Discussion includes reflections on the content and learning outcomes, benefits of a holistic approach to introducing undergraduate students to research practice, and challenges of scale.

  1. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil works projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant and spread across different institutions or private companies. Time-consuming operations like data processing and information harmonisation are the main reasons the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground parkings, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be carried out using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information over the internet has now become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages like GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared over the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania) an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) - financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis
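
    The mark-up languages listed above are XML-based; as a minimal, self-contained illustration, the sketch below parses a small groundwater-observation fragment with Python's standard library. The element names and namespace are simplified inventions and do not reproduce the actual GML or GroundWaterML schemas.

      import xml.etree.ElementTree as ET

      # Simplified, made-up fragment loosely inspired by GML-style encodings;
      # real GML/GWML documents use richer schemas and namespaces.
      DOCUMENT = """
      <gw:WellObservation xmlns:gw="http://example.org/simple-gwml">
        <gw:wellId>BUC-017</gw:wellId>
        <gw:waterLevel uom="m">12.4</gw:waterLevel>
        <gw:measured>2012-03-15</gw:measured>
      </gw:WellObservation>
      """

      NS = {"gw": "http://example.org/simple-gwml"}

      root = ET.fromstring(DOCUMENT)
      well = root.findtext("gw:wellId", namespaces=NS)
      level_element = root.find("gw:waterLevel", NS)
      level = float(level_element.text)
      unit = level_element.get("uom")
      date = root.findtext("gw:measured", namespaces=NS)
      print(f"Well {well}: water level {level} {unit} on {date}")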

  2. Total data management in the La Hague reprocessing plant

    International Nuclear Information System (INIS)

    Berthion, Y.; Perot, J.P.; Silie, P.

    1993-01-01

    Due to the complexity of a spent fuel reprocessing plant and its nuclear characteristics, the operators must have real-time access to updated information on many subjects. To meet these requirements effectively, Cogema has installed a number of diversified data processing systems linked by a communications network called Haguenet. The whole system forms the La Hague Total Data Management System (TDMS) which performs a full range of functions, namely production data management, maintenance data management, technical documentation and miscellaneous. Some examples of the main process data management applications implemented within the La Hague TDMS are briefly described (nuclear materials and waste follow-up, analytical data management, operating procedures management and site inspection management). Also presented are some examples of the maintenance-related systems implemented within the La Hague TDMS (diagnostic assistance system, software maintenance center, maintenance interventions demand and spare parts data management). (Z.S.)

  3. Bridging the Chasm: Challenges, Opportunities, and Resources for Integrating a Dissemination and Implementation Science Curriculum into Medical Education.

    Science.gov (United States)

    Ginossar, Tamar; Heckman, Carolyn J; Cragun, Deborah; Quintiliani, Lisa M; Proctor, Enola K; Chambers, David A; Skolarus, Ted; Brownson, Ross C

    2018-01-01

    Physicians are charged with implementing evidence-based medicine, yet few are trained in the science of Dissemination and Implementation (D&I). In view of the potential of evidence-based training in D&I to help close the gap between research and practice, the goal of this review is to examine the importance of D&I training in medical education, describe challenges to implementing such training, and provide strategies and resources for building D&I capacity. We conducted (1) a systematic review to identify US-based D&I training efforts and (2) a critical review of additional literature to inform our evaluation of the challenges and opportunities of integrating D&I training in medical education. Out of 269 unique articles reviewed, 11 described US-based D&I training. Although vibrant and diverse training opportunities exist, their capacity is limited, and they are not designed to meet physicians' needs. Synthesis of relevant literature using a critical review approach identified challenges inherent to changing medical education, as well as challenges related to D&I science. Finally, selected strategies and resources are available for facilitating incorporation of D&I training into medical education and overcoming existing challenges. Integrating D&I training in the medical education curriculum, and particularly in residency and fellowship training, holds promise for bridging the chasm between scientific discoveries and improved patient care and outcomes. However, unique challenges should be addressed, including the need for greater evidence.

  4. Challenges to Science and Technology Development Policy in the European Integration Policy

    Directory of Open Access Journals (Sweden)

    Valeriy Novytsky

    2004-10-01

    Full Text Available This article focuses on present-day aspects of Ukraine's science and technology development policy in light of international phenomena and integration realities observed across the European continent. The author examines unique traits and practical challenges characterizing an expansion of Ukraine-EU scientific and technological cooperation with the aim of improving the efficiency of Ukraine's national economy and optimizing its international dimension. Special attention is paid to problems of adapting Ukraine's technological policy to European standards, and relevant specific proposals are formulated. The article maintains that today's advances in information technology and the openness of national economies as a system-determinant factor of models of international cooperation broaden the scope of information technologies. Since telecommunications and other hi-tech sectors are vibrantly evolving not only in highly industrialized states but also in East European and other emerging market economies, a key challenge for Ukraine appears to be lending better efficiency and productivity to its national policy of introducing information technologies into its socioeconomic sphere. The article provides insight into the international experience of the creation of technoparks and demonstrates the necessity of applying such innovation techniques of economic development to Ukraine.

  5. Marine data management: from early explorers to e-infrastructures (Ian McHarg Medal Lecture)

    Science.gov (United States)

    Glaves, Helen

    2016-04-01

    Ocean observations have been made for as long as Man has been exploring the seas. Early Phoenician and Viking explorers developed extensive knowledge of currents, tides and weather patterns that they shared directly with their peers. Eighteenth century log books from whaling ships and the voyages of explorers, such as Captain Cook, documented sea conditions and weather patterns, and it is these records that are used today to extend oceanographic records back to a time before systematic observing of the ocean began. This historical information is now being used to address the grand challenges being faced by Society in the 21st century in ways that the 18th century seafarers could never have imagined. Systematic ocean observation and the science of modern oceanography began in the late 19th century with the voyages of HMS Challenger. Since these early scientific cruises ocean observation has become more and more sophisticated. Increasingly diverse types of equipment mounted on different types of platforms are generating huge amounts of data delivered in a variety of formats and conforming to a range of standards and best practice. It is this heterogeneity of marine data that has presented one of the greatest challenges for the modern researcher. As marine research becomes increasingly international, cross-disciplinary and multiscale, it presents new and more complex challenges for data stewardship. Increasingly large volumes of interoperable data are needed to address fundamental questions such as the assessment of Man's impact on the marine environment or the sustainable exploitation of available marine resources whilst maintaining the good environmental status of the ocean. This presentation will provide a brief overview of the acquisition and sharing of information about the marine environment from the earliest explorers to the modern day. It will look at some of the challenges faced by today's marine researcher seeking to make use of multidisciplinary and

  6. Physics Teachers' Challenges in Using History and Philosophy of Science in Teaching

    Science.gov (United States)

    Henke, Andreas; Höttecke, Dietmar

    2015-01-01

    The inclusion of the history and philosophy of science (HPS) in science teaching is widely accepted, but the actual state of implementation in schools is still poor. This article investigates possible reasons for this discrepancy. The demands science teachers associate with HPS-based teaching play an important role, since these determine teachers'…

  7. Pre-Service Science Teacher Education in Africa: Prospects and Challenges

    Science.gov (United States)

    Ogunniyi, M. B.; Rollnick, Marissa

    2015-01-01

    Since the independence era in the 1950s and 1960s, many African countries have recognised the important role that science plays in the socio-economic development of any country. As a result, various African governments have enacted policies and allocated a large proportion of their gross national product to the science and science education sector…

  8. The scientific grand challenges of the 21st century for the Crop Science Society of America

    Science.gov (United States)

    Crop science is a highly integrative science field employing expertise from multiple disciplines to broaden our understanding of agronomic, turf, and forage crops. A major goal of crop science is to ensure an adequate and sustainable production of food, feed, fuel, and fiber for our world’s growing ...

  9. Konference Beyond the Leaky Pipeline: Future Challenges for Research on Gender and Science

    Czech Academy of Sciences Publication Activity Database

    Linková, Marcela; Tenglerová, Hana

    2010-01-01

    Vol. 11, No. 2 (2010), pp. 76-83 ISSN 1213-0028 R&D Projects: GA MŠk OK08007 Institutional research plan: CEZ:AV0Z70280505 Keywords: gender and science * women in science * science-related professions Subject RIV: AO - Sociology, Demography http://www.genderonline.cz

  10. Pre-Service Science Teacher Preparation in China: Challenges and Promises

    Science.gov (United States)

    Liu, Enshan; Liu, Cheng; Wang, Jian

    2015-01-01

    The purpose of this article was to present an overview of pre-service science teacher preparation in China, which is heavily influenced by Chinese tradition, Confucianism, and rapid social and economic development. The policies, science teacher education systems and related programs jointly contribute to producing enough science teachers for…

  11. The challenge of achieving professionalism and respect of diversity in a UK Earth Sciences department

    Science.gov (United States)

    Imber, Jonathan; Taylor, Michelle; Callaghan, Mark; Castiello, Gabriella; Cooper, George; Foulger, Gillian; Gregory, Emma; Herron, Louise; Hoult, Jill; Lo, Marissa; Love, Tara; Macpherson, Colin; Oakes, Janice; Phethean, Jordan; Riches, Amy

    2017-04-01

    The Department of Earth Sciences, Durham University, has a balanced gender profile at undergraduate, postgraduate and postdoctoral levels (38%, 42% and 45% females, respectively), but one of the lowest percentages, relative to the natural applicant pool, of female academic staff amongst UK geoscience departments. There are currently 9% female academic staff at Durham, compared with a median value (in November 2015) of 20% for all Russell Group geoscience departments in the UK. Despite the fact that the female staff group is relatively senior, the Department's current academic management is essentially entirely male. The Department has an informal working culture, in which academics operate an "open door" policy, and staff and students are on first name terms. This culture, open plan office space, and our fieldwork programme, allow staff and students to socialise. A positive outcome of this culture is that > 95% of final year undergraduate students deemed the staff approachable (National Student Survey 2016). Nevertheless, a survey of staff and research student attitudes revealed significant differences in the way males and females perceive our working environment. Females are less likely than males to agree with the statements that "the Department considers inappropriate language to be unacceptable" and "inappropriate images are not considered acceptable in the Department". That anyone could find "inappropriate" language and images "acceptable" is a measure of the challenge faced by the Department. Males disagree more strongly than females that they "have felt uncomfortable because of [their] gender". The Department is proactively working to improve equality and diversity. It held a series of focus group meetings, divided according to gender and job role, to understand the differences in male and female responses. Female respondents identified examples of inappropriate language (e.g. sexual stereotyping) that were directed at female, but not male, colleagues. Males

  12. Donaldson v. Van de Kamp: cryonics, assisted suicide, and the challenges of medical science.

    Science.gov (United States)

    Pommer, R W

    1993-01-01

    In recent years, advances in medical science have left the legal community with a wide array of social, ethical, and legal problems previously unimaginable. Historically, legislative and judicial responses to these advances lagged behind the rapid pace of such developments. The gap between the scientist's question, "Can we do it?", and the lawyer's question, "Should/may we do it?", is most evident in the field of cryonics, with its technique of cryonic, or cryogenic, suspension. In cryonic suspension, a legally dead but biologically viable person is preserved at an extremely low temperature until advances in medical science make it possible to revive the person and implement an effective cure. The terminally ill patient who wishes to benefit from such treatment is faced with the dilemma that present life must be ceased with hope of future recovery. As a result, the process challenges our traditional notions of death and the prospects of immortality while raising a host of concomitant legal dilemmas. Some facets of this dilemma are exemplified by Donaldson v. Van de Kamp. In Donaldson, Thomas A. Donaldson sought the declaration of a constitutional right to premortem cryonic suspension of his body and the assistance of others in achieving that state. Donaldson, a forty-six-year-old mathematician and computer software scientist, suffers from a malignant brain tumor that was diagnosed by his physicians in 1988. This tumor is inoperable and continues to grow and invade his brain tissue. Donaldson's condition will gradually deteriorate into a persistent vegetative state and will ultimately result in death. Physicians predict his probable death by August 1993. Donaldson petitioned the California courts, seeking a declaration that he had a constitutional right to achieve cryonic suspension before his natural death. His doctors believe that if Donaldson waits until his natural death to be suspended, future reanimation will be futile because the tumor will have destroyed his

  13. Building a Snow Data Management System using Open Source Software (and IDL)

    Science.gov (United States)

    Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.

    2012-12-01

    At NASA's Jet Propulsion Laboratory, free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main points:
    - The design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline)
    - The challenges of moving from a single algorithm on a laptop to running 100s of parallel algorithms on a cluster of servers (lessons learned): code changes, software-license-related challenges, storage requirements
    - System evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps)
    - Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission)
    Software in use and their software licenses:
    - IDL - used for pre- and post-processing of data; licensed under a proprietary software license held by Excelis
    - Apache OODT - used for data management and workflow processing; licensed under the Apache License Version 2
    - GDAL - geospatial data processing library, currently used for data re-projection; licensed under the X/MIT license
    - GeoServer - WMS server; licensed under the General Public License Version 2.0
    - Leaflet.js - JavaScript web mapping library; licensed under the Berkeley Software Distribution License
    - Python - glue code and miscellaneous data processing support; licensed under the Python Software Foundation License
    - Perl - script wrapper for running the SCAG algorithm; licensed under the General Public License Version 3
    - PHP - front-end web application programming; licensed under the PHP License Version
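
    As a minimal sketch of the GDAL re-projection step mentioned in the software list, assuming the GDAL Python bindings (osgeo) are installed, the following re-projects a raster to geographic WGS84. The file names are placeholders and the example call is left commented out; this is not JPL's actual pipeline configuration.

      from osgeo import gdal  # assumption: GDAL Python bindings are installed

      gdal.UseExceptions()  # raise Python exceptions instead of silently returning None

      def reproject_to_geographic(src_path, dst_path):
          """Re-project a raster (e.g. a gridded snow product) to geographic WGS84.

          Sketch of the re-projection step only, not the Snow Data System's configuration.
          """
          dataset = gdal.Warp(dst_path, src_path, dstSRS="EPSG:4326")
          del dataset  # dereference to flush and close the output file

      # Example usage with placeholder file names:
      # reproject_to_geographic("snow_fraction_sinusoidal.tif", "snow_fraction_wgs84.tif")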

  14. Data Management System for the National Energy-Water System (NEWS) Assessment Framework

    Science.gov (United States)

    Corsi, F.; Prousevitch, A.; Glidden, S.; Piasecki, M.; Celicourt, P.; Miara, A.; Fekete, B. M.; Vorosmarty, C. J.; Macknick, J.; Cohen, S. M.

    2015-12-01

    Aiming to provide a comprehensive assessment of the water-energy nexus, the National Energy-Water System (NEWS) project requires the integration of data to support a modeling framework that links climate, hydrological, power production, transmission, and economic models. Large amounts of georeferenced data have to be streamed to the components of the inter-disciplinary model to explore future challenges and tradeoffs in US power production, based on climate scenarios, power plant locations and technologies, available water resources, ecosystem sustainability, and economic demand. We used open source and in-house-built software components to build a system that addresses two major data challenges: (1) on-the-fly re-projection, re-gridding, interpolation, extrapolation, nodata patching, merging, and temporal and spatial aggregation of static and time-series datasets, in virtually any file format, file structure, and geographic extent, for the models' I/O directly at run time; and (2) comprehensive data management based on metadata cataloguing and discovery in repositories utilizing the MAGIC Table (Manipulation and Geographic Inquiry Control database). This innovative concept allows models to access data on the fly by data ID, irrespective of file path, file structure, or file format, and regardless of its GIS specifications. In addition, a web-based information and computational system is being developed to control the I/O of spatially distributed Earth-system, climate, hydrological, power-grid, and economic data flows within the NEWS framework. The system allows scenario building, data exploration, visualization, querying, and manipulation of any loaded gridded, point, or vector polygon dataset. The system has demonstrated its potential for applications in other fields of Earth science modeling, education, and outreach. Over time, this implementation of the system will provide near-real-time assessment of various current and future scenarios of the water-energy nexus.
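
    The "access by data ID" idea can be illustrated with a deliberately simplified sketch. The MAGIC Table's actual schema and API are not described in the abstract, so the catalogue structure, dataset IDs, and fields below are hypothetical.

```python
# Hypothetical sketch of data-ID indirection in the spirit of the MAGIC Table:
# a model requests a dataset by ID; the catalogue supplies path, format, and grid
# metadata, hiding file layout from the model. All entries are invented examples.
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    path: str        # physical location (file, URL, or database table)
    fmt: str         # storage format, e.g. "NetCDF", "Shapefile"
    crs: str         # coordinate reference system of the stored data
    cell_size: float # grid cell size in CRS units (0 for vector data)

CATALOG = {
    "runoff_monthly_v1": CatalogEntry("data/runoff/monthly_v1.nc", "NetCDF", "EPSG:4326", 0.5),
    "powerplants_2015":  CatalogEntry("data/energy/plants_2015.shp", "Shapefile", "EPSG:4326", 0.0),
}

def resolve(data_id: str) -> CatalogEntry:
    """Return storage metadata for a dataset ID, independent of path or format."""
    try:
        return CATALOG[data_id]
    except KeyError:
        raise KeyError(f"Unknown dataset ID: {data_id}") from None

# A model component only needs the ID; re-gridding and re-projection would be
# driven by the metadata returned here.
entry = resolve("runoff_monthly_v1")
print(entry.path, entry.fmt, entry.crs)
```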

  15. Biological Sciences for the 21st Century: Meeting the Challenges of Sustainable Development in an Era of Global Change

    Energy Technology Data Exchange (ETDEWEB)

    Joel Cracraft; Richard O'Grady

    2007-05-12

    The symposium was held 10-12 May 2007 at the Capitol Hilton Hotel in Washington, D.C. The 30 talks explored how some of today's key biological research developments (such as biocomplexity and complex-systems analysis, bioinformatics and computational biology, the expansion of molecular and genomics research, and the emergence of other comprehensive or system-wide analyses, such as proteomics) contribute to sustainability science. The symposium therefore emphasized the challenges facing agriculture, human health, sustainable energy, and the maintenance of ecosystems and their services, so as to provide a focus and a suite of examples of the enormous potential contributions arising from these new developments in the biological sciences. This symposium was the first to provide a venue for exploring how ongoing advances in the biological sciences, together with new approaches for improving knowledge integration and institutional science capacity, address key global challenges to sustainability. The speakers presented new research findings and identified new approaches and needs in biological research that can be expected to have substantial impacts on sustainability science.

  16. The challenges of science journalism: The perspectives of scientists, science communication advisors and journalists from New Zealand.

    Science.gov (United States)

    Ashwell, Douglas James

    2016-04-01

    The news media play an important role in informing the public about scientific and technological developments. Some argue that restructuring and downsizing result in journalists coming under increased pressure to produce copy, leading them to use more public relations material to meet their deadlines. This article explores science journalism in the highly commercialised media market of New Zealand. Using semi-structured interviews with scientists, science communication advisors and journalists, the study finds communication advisors and scientists believe most media outlets, excluding public service media, report science poorly. Furthermore, restructuring and staff cuts have placed the journalists interviewed under increasing pressure. While smaller newspapers appear to be printing press releases verbatim, metropolitan newspaper journalists still exercise control over their use of such material. The results suggest these journalists will continue to resist increasing their use of public relations material for some time to come. © The Author(s) 2014.

  17. Data Management Services for VPH Applications

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The VPH-Share project [1] develops a cloud platform for the Virtual Physiological Human (VPH) research community. One of the key challenges is to share and access the large datasets that medical applications use to produce meaningful diagnostic information. VPH researchers need advanced storage capabilities that enable collaboration without introducing additional complexity to the way data are accessed and shared. In the VPH-Share cloud platform [2], data storage federation [3] is achieved by aggregating data resources in a client-centric manner and exposing them via a standardized protocol that can also be mounted and presented as local storage, providing a kind of file-system abstraction. A common management layer builds on loosely coupled, independent storage resources; through this layer, a variety of resources such as simple file servers, storage clouds, and data grids may be aggregated, exposing all available storage. As a result, distributed applications have ...
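
    The federation idea, aggregating independent storage resources behind one path-based, file-system-like interface, can be sketched as follows. This is a hypothetical illustration, not the VPH-Share API: the class names, backends, and routing scheme are invented for the example.

```python
# Hypothetical sketch of client-side storage federation: several independent
# backends are exposed through one read/write interface keyed by logical path.
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    @abstractmethod
    def read(self, path: str) -> bytes: ...
    @abstractmethod
    def write(self, path: str, data: bytes) -> None: ...

class InMemoryBackend(StorageBackend):
    """Stand-in for a file server, storage cloud, or data-grid endpoint."""
    def __init__(self):
        self._blobs = {}
    def read(self, path):
        return self._blobs[path]
    def write(self, path, data):
        self._blobs[path] = data

class FederatedStore:
    """Routes logical paths (e.g. '/cloud/...', '/grid/...') to the backend that owns them."""
    def __init__(self, mounts):
        self._mounts = mounts  # prefix -> backend
    def _route(self, path):
        for prefix, backend in self._mounts.items():
            if path.startswith(prefix):
                return backend, path[len(prefix):]
        raise FileNotFoundError(path)
    def read(self, path):
        backend, rel = self._route(path)
        return backend.read(rel)
    def write(self, path, data):
        backend, rel = self._route(path)
        backend.write(rel, data)

# Usage: applications see one namespace regardless of where the bytes live.
store = FederatedStore({"/cloud/": InMemoryBackend(), "/grid/": InMemoryBackend()})
store.write("/cloud/patient42/scan.dat", b"example bytes")
print(store.read("/cloud/patient42/scan.dat"))
```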

  18. Affordances and Challenges of Using Argument as a Connective Discourse for Scientific Practices to Teach Climate Science

    Science.gov (United States)

    Sezen-Barrie, A.; Wolfson, J.

    2015-12-01

    An important goal of science education is to support the development of citizens who can participate in public debate and make informed decisions relevant to their lives and their worlds. The NGSS (Next Generation Science Standards) suggest engaging students in science classrooms in argumentation as a practice to help enhance the quality of evidence-based decision making. In this multi-case study, we explored the use of written argumentation in eight secondary school science classrooms during a lesson on the relationship between ocean temperature and its CO2-holding capacity. All teachers of these classrooms were trained during a day-long, NSF-funded Climate Literacy Workshop on the basic concepts of climate science, scientific practices, and implementation of an activity called "It's a Gassy World". The data for the current study comprised students' written arguments, teachers' written reflections on the implementation of the activity, and field notes from the Climate Literacy Workshop. A qualitative discourse analysis of the data was used to find common themes around the affordances and challenges of argument as a connective discourse for scientific practices to teach climate change. The findings show that participating in the written argumentation process encouraged students to discuss their experimental design and use data interpretation for their evidence. However, the results also indicated the following challenges: a) teachers themselves need support in connecting their evidence to their claims, b) arguing a socioscientific issue creates a sensitive environment, c) the conceptual quality of an argument needs to be strengthened through background in courses other than science, and d) graphing skills (or lack thereof) can interfere with constructing scientifically accurate claims. This study has implications for effectively teaching climate change through argumentation, and thus for creating opportunities to practice authentic climate science research in K-12 classrooms.

  19. Collaboration challenges in systematic reviews: a survey of health sciences librarians

    Directory of Open Access Journals (Sweden)

    Joey Nicholson

    2017-10-01

    Results: Of the 17 challenges listed in the survey, 8 were reported as common by over 40% of respondents. These included methodological issues around having too broad or too narrow research questions, lacking eligibility criteria, having unclear research questions, and not following established methods. The remaining challenges were interpersonal, including issues around student-led projects and the size of the research team. Of the top 8 most frequent challenges, 5 were also ranked among the most difficult to handle. Open-ended responses underscored many of the challenges included in the survey and revealed several additional challenges. Conclusions: These results suggest that the most frequent and challenging issues relate to development of the research question and general communication with team members. Clear protocols for collaboration on systematic reviews, as well as a culture of mentorship, can help librarians prevent and address these challenges. This article has been approved for the Medical Library Association's Independent Reading Program.

  20. Additional Insights Into Problem Definition and Positioning From Social Science Comment on "Four Challenges That Global Health Networks Face".

    Science.gov (United States)

    Quissell, Kathryn

    2017-09-10

    Commenting on a recent editorial in this journal which presented four challenges global health networks will have to tackle to be effective, this essay discusses why this type of analysis is important for global health scholars and practitioners, and why it is worth understanding and critically engaging with the complexities behind these challenges. Focusing on the topics of problem definition and positioning, I outline additional insights from social science theory to demonstrate how networks and network researchers can evaluate these processes, and how these processes contribute to better organizing, advocacy, and public health outcomes. This essay also raises multiple questions regarding these processes for future research. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.