GI-cat's ideal users are data providers or service providers within the geoscience community. The former already have their data available through an access service (e.g. an OGC Web Service) and want to publish it through a standard catalog service in a seamless way. The latter want to deploy a catalog broker and let users query and access different geospatial resources through one or more standard interfaces and Application Profiles (APs) (e.g. OGC CSW ISO AP, CSW ebRIM/EO AP, etc.). GI-cat implements a broker component (i.e. a middleware service) which carries out distribution and mediation functionalities among well-adopted catalog interfaces and data access protocols. GI-cat also publishes different discovery interfaces: the OGC CSW ISO and ebRIM Application Profiles (the latter with support for the EO and CIM extension packages) and two different OpenSearch interfaces developed to explore Web 2.0 possibilities. An extended interface is also available to exploit all GI-cat features, such as interruptible incremental queries and query feedback. Interoperability tests performed in the context of different projects have also pointed out the importance of ensuring compatibility with existing and widespread open source tools (e.g. the GeoNetwork and Deegree catalogs), which was then achieved. Based on a service-oriented framework of modular components, GI-cat can be effectively customized and tailored to support different deployment scenarios. In addition to the distribution functionality, a harvesting approach has recently been tested, allowing the user to switch between a distributed and a local search and thus further supporting different deployment scenarios. A configurator tool is available to enable effective high-level configuration of the broker service. A specific geobrowser was also developed to demonstrate the advanced GI-cat functionalities. This client
Background: Many complementary solutions are available for the identifier mapping problem. This creates an opportunity for bioinformatics tool developers: tools can be made to flexibly support multiple mapping services, or mapping services can be combined to get broader coverage. This approach requires an interface layer between tools and mapping services. Results: Here we present BridgeDb, a software framework for gene, protein and metabolite identifier mapping. This framework provides a standardized interface layer through which bioinformatics tools can be connected to different identifier mapping services. This approach makes it easier for tool developers to support identifier mapping. Mapping services can be combined or merged to support multi-omics experiments or to integrate custom microarray annotations. BridgeDb provides its own ready-to-go mapping services, both in webservice and local database forms. However, the framework is intended for customization and adaptation to any identifier mapping service. BridgeDb has already been integrated into several bioinformatics applications. Conclusion: By uncoupling bioinformatics tools from mapping services, BridgeDb improves the capability and flexibility of those tools. All described software is open source and available at http://www.bridgedb.org.
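The interface-layer idea can be sketched in a few lines of Python. This is a hedged illustration, not BridgeDb's actual (Java) API: the class and method names are invented, and the stacked-service behaviour is assumed from the abstract's description of combining services for broader coverage.

```python
from abc import ABC, abstractmethod

class IDMapper(ABC):
    """Common interface that tools program against, whatever the backend."""
    @abstractmethod
    def map_id(self, identifier: str, source: str, target: str) -> set:
        ...

class DictMapper(IDMapper):
    """Toy backend: an in-memory table of known cross-references."""
    def __init__(self, table):
        # table: {(source_namespace, id): {(target_namespace, id), ...}}
        self.table = table

    def map_id(self, identifier, source, target):
        return {i for ns, i in self.table.get((source, identifier), set())
                if ns == target}

class StackedMapper(IDMapper):
    """Merges several mapping services to broaden coverage."""
    def __init__(self, mappers):
        self.mappers = list(mappers)

    def map_id(self, identifier, source, target):
        hits = set()
        for mapper in self.mappers:
            hits |= mapper.map_id(identifier, source, target)
        return hits
```

Because every backend satisfies the same interface, a tool written against `IDMapper` works unchanged whether it is talking to one local database, a webservice, or a stack of both.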
Argues that easy claims about the relationship between language mastery and academic or economic access (made by both conservative commentators on education and mainstream writing teachers) are false and obscure real social and political boundaries, such as racism, sexism, elitism, and homophobia, that really do prevent access. (SR)
The access structure is the primary guide structure in the central texts of any standard translation dictionary. The metalexicographical term "guide structures" refers to the set of structures that provides a framework within which the accessibility and availability of information types in the dictionary can be evaluated. The access ...
Many organizations issuing standards offer reduced prices for publications to their members. Paying a membership fee, even a site membership fee, can therefore be worthwhile - even if relatively few standards are needed. The Library is now exploring the possibility, in collaboration with the rest of the CERN community, of joining standards-issuing organizations. So why not share your costs with the rest of the organization wherever this can be done without violating copyright or access regulations? The Library now provides documentation and other member services from IPC, the Association Connecting Electronics Industries (http://www.ipc.org/html/fsabout.htm) at favourable prices for the entire CERN population. For more information, or if you are a member of any other organization which provides services that could be shared CERN-wide, please contact firstname.lastname@example.org. We remind all users of international standards that CERN has special agreements with ISO and IEC (see Bulletin 50/2000). You can order st...
Gang Huang; Lian-Shan Sun
Reflective middleware opens up the implementation details of the middleware platform and applications at runtime, improving the adaptability of middleware-based systems. However, such openness brings new challenges for access control in these systems. Some users can access the system via reflective entities, which sometimes cannot be protected by the access control mechanisms of traditional middleware. To deliver high adaptability securely, reflective middleware should be equipped with proper access control mechanisms for the potential access control holes induced by reflection. One reason for integrating these mechanisms in the reflective middleware itself is that a goal of reflective middleware is to provide applications with reflection capabilities as transparently as possible. This paper studies how to design a reflective J2EE middleware, PKUAS, with access control in mind. First, a computation model of the reflective system is built to identify all possible access control points induced by reflection. Then a set of access control mechanisms, including a wrapper for MBeans and a hierarchy of Java class loaders, is provided to control the identified access control points. These mechanisms, together with the J2EE access control mechanism, form the access control framework for PKUAS. The paper evaluates the security and the performance overhead of the framework qualitatively and quantitatively.
WANG Lun-wei; LIAO Xiang-ke; WANG Huai-min
A weighting factor is attached to each access control policy to express the importance of the policy and its effect on the access control decision. On top of this weighted access control framework, a trustworthiness model for access requests is given. In this model, we define a measure of the trustworthiness factor of an access request; borrowing ideas from uncertainty reasoning in expert systems, we present and prove a parallel propagation formula for the request trustworthiness factor across multiple policies, and obtain a final trustworthiness factor that decides whether to authorize the request. Authorization decisions based on this calculated trustworthiness factor are more understandable, better suited to real requirements and more powerful for security enhancement than traditional methods. Finer access control granularity is a further advantage.
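The abstract does not spell out its propagation formula, so the following is only a sketch of the flavour of such a model: it uses a MYCIN-style certainty-factor combination (from expert-system uncertainty reasoning, which the abstract cites as inspiration), and the weighting scheme and threshold are assumptions for illustration.

```python
def combine(t1: float, t2: float) -> float:
    """MYCIN-style parallel combination of two trust factors in [0, 1]."""
    return t1 + t2 - t1 * t2

def request_trustworthiness(policy_factors):
    """policy_factors: iterable of (weight, factor) pairs, both in [0, 1].
    Each policy's factor is scaled by the policy's weight, then the scaled
    factors are combined in parallel across all policies."""
    trust = 0.0
    for weight, factor in policy_factors:
        trust = combine(trust, weight * factor)
    return trust

def authorize(policy_factors, threshold=0.5):
    """Authorize only if the combined trustworthiness reaches a threshold."""
    return request_trustworthiness(policy_factors) >= threshold
```

Note that `combine` is commutative and associative, so the result does not depend on the order in which policies are evaluated — a property a parallel propagation formula would need.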
Currently, dominant web accessibility standards do not respect disability as a complex and culturally contingent interaction, failing to recognize that disability is a variable, contrary and political power relation rather than a biological limit. Against this background there is clear scope to broaden the ways in which accessibility standards are understood, developed and applied. Commentary: The values that shape and are shaped by legislation promote universal, statistical and automated approaches to web accessibility. As a result, web accessibility standards convey powerful norms fixing the relationship between technology and disability, irrespective of geographical, social, technological or cultural diversity. Web accessibility standards are designed to enact universal principles; however, they express partial and biopolitical understandings of the relation between disability and technology. These values can be limiting, and potentially counterproductive, for example for the majority of disabled people in the "Global South", where different contexts constitute different disabilities and different experiences of web access. To create more robust, accessible outcomes for disabled people, research and standards practice should diversify to embrace more interactional accounts of disability in different settings. Implications for Rehabilitation: Creating accessible experiences is an essential aspect of rehabilitation. Web standards promote universal accessibility as a property of an online resource or service. This undervalues the importance of the user's intentions, expertise, context, and the complex social and cultural nature of disability. Standardized, universal approaches to web accessibility may lead to counterproductive outcomes for disabled people whose impairments and circumstances do not meet Western disability and accessibility norms. Accessible experiences for rehabilitation can be enhanced through an additional focus on holistic approaches to
Kamateri, Eleni; Kalampokis, Evangelos; Tambouris, Efthimios; Tarabanis, Konstantinos
The integration of medical data coming from multiple sources is important in clinical research. Amongst others, it enables the discovery of appropriate subjects in patient-oriented research and the identification of innovative results in epidemiological studies. At the same time, the integration of medical data faces significant ethical and legal challenges that impose access constraints. Some of these issues can be addressed by making available aggregated instead of raw record-level data. In many cases, however, there is still a need for controlling access even to the resulting aggregated data, e.g., due to data providers' policies. In this paper we present the Linked Medical Data Access Control (LiMDAC) framework, which capitalizes on Linked Data technologies to enable controlling access to medical data across distributed sources with diverse access constraints. The LiMDAC framework consists of three Linked Data models, namely the LiMDAC metadata model, the LiMDAC user profile model, and the LiMDAC access policy model. It also includes an architecture that exploits these models. Based on the framework, a proof-of-concept platform is developed and its performance and functionality are evaluated by employing two usage scenarios. Copyright © 2014 Elsevier Inc. All rights reserved.
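The interplay of the three models can be sketched as a simple decision function. The structure below is a hypothetical illustration: the dictionary fields standing in for the metadata, user profile and access policy models are invented, not LiMDAC's actual Linked Data vocabularies.

```python
def limdac_decide(dataset, user_profile, policies):
    """Grant access when some policy covering this dataset is satisfied by
    the requesting user's profile; deny by default.
    dataset      -- stands in for the metadata model (here just an id)
    user_profile -- stands in for the user profile model (attribute dict)
    policies     -- stand in for the access policy model"""
    for policy in policies:
        if policy["dataset"] != dataset["id"]:
            continue  # policy governs a different source
        required = policy["requires"]
        if all(user_profile.get(k) == v for k, v in required.items()):
            return True
    return False  # no satisfied policy: access denied
```

Because each distributed source contributes its own policies, the same user can be granted access to one source's aggregated data and refused another's.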
Sarantis, Demetrios; Tsiakaliaris, Christos; Lampathaki, Fenareti; Charalabidis, Yannis
Although most eGovernment interoperability frameworks (eGIFs) adequately cover the technical aspects of developing and supporting the provision of electronic services to citizens and businesses, they do not adequately address several important areas regarding the organization, presentation, accessibility and security of the content and electronic services offered through government portals. This chapter extends the scope of existing eGIFs, presenting the overall architecture and the basic concepts of the Greek standardization framework for electronic government service portals which, for the first time in Europe, is part of a country's eGovernment framework. The proposed standardization framework includes standards, guidelines and recommendations regarding the design, development and operation of government portals that support the provision of administrative information and services to citizens and businesses. By applying the guidelines of the framework, the design, development and operation of portals in central, regional and municipal government can be systematically addressed, resulting in an applicable, sustainable and ever-expanding framework.
The aim was to explore the use of an activity-based approach to determine the validity of a set of housing standards addressing accessibility. This included examination of the frequency and the extent of accessibility problems among older people with physical functional limitations who used...... participant groups were examined. Performing well-known kitchen activities was associated with accessibility problems for all three participant groups, in particular those using a wheelchair. The overall validity of the housing standards examined was poor. Observing older people interacting with realistic...... environments while performing real everyday activities seems to be an appropriate method for assessing accessibility problems....
The Americans with Disabilities Act (ADA) of 1990 mandated that facilities and programs are accessible, so people with disabilities can be included in all aspects of community life including recreation (Dattilo, 2002). Understanding accessibility standards is not an easy task. Educators are faced with the challenge of teaching technical content,…
Crespi, Alexander M
... security characteristics from the properties of individual components would aid in the creation of more secure systems. In this thesis, a framework for characterizing the access control properties...
This paper proposes a coherent and unique set of 12 standards, adopting a neuroscience framework for biologically based school reform. This model of educational principles and practices aligns with the long-standing principles and practices of the Progressive Education Movement in the United States and the emerging principles of neuroscience.…
Afshar, Majid; Samet, Saeed; Hu, Ting
Nowadays, access control is an indispensable part of the Personal Health Record; it provides for confidentiality by enforcing policies and rules to ensure that only authorized users gain access to requested resources in the system. In other words, access control means protecting patient privacy in healthcare systems. Attribute-Based Access Control (ABAC) is a newer access control model that can be used instead of traditional models such as Discretionary Access Control, Mandatory Access Control and Role-Based Access Control. During the last five years, ABAC has seen applications in both academic research and industry. Using the attributes of users and resources, ABAC makes a decision for each access request. In this paper, we propose an ABAC framework for healthcare systems. We use the ABAC engine for rendering and enforcing healthcare policies. Moreover, we handle emergency situations within this framework.
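A minimal sketch of such a decision flow, including a break-glass path for emergencies, might look as follows. The attribute names, the audited break-glass rule and the default-deny behaviour are illustrative assumptions, not the paper's actual engine.

```python
audit_log = []  # break-glass accesses are recorded for later review

def abac_decide(request, policies, emergency=False):
    """request:  dict of subject/resource/action attributes.
    policies: list of {'condition': predicate, 'effect': 'Permit'|'Deny'}.
    In an emergency, clinical staff get break-glass access, logged for audit."""
    if emergency and request.get("role") in {"physician", "nurse"}:
        audit_log.append(("break-glass", dict(request)))
        return "Permit"
    for policy in policies:
        if policy["condition"](request):
            return policy["effect"]
    return "Deny"  # default-deny when no policy matches
```

Because decisions hinge on attributes rather than on a fixed role hierarchy, adding a new policy is just appending a predicate, without restructuring roles.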
Helle, Tina; Iwarsson, Susanne; Brandt, Åse
Since standards for accessible housing seldom are manifestly based on research and vary cross-nationally, it is important to examine whether any scientific evidence supports these standards. Thus, one aim of this study was to review the literature in search of such scientific evidence...... data on older citizens and their housing environment in Sweden, Germany and Latvia (n=1150), collected with the Housing Enabler instrument. Applying statistical simulation, we explored how different national standards for housing design influenced the prevalence of common environmental barriers. Kaplan...... by the database search (n=2,577), resulting in the inclusion of one publication. Contacts to leading researchers in the field identified five publications. The hand search of 22 journals led to one publication. We have exemplified how the prevalence of common environmental problems in housing environments...
T.K. Ashwin Kumar
Big data technologies have seen tremendous growth in recent years. They are widely used in both industry and academia. In spite of such exponential growth, these technologies lack adequate measures to protect data from misuse/abuse. Corporations that collect data from multiple sources are at risk of liabilities due to the exposure of sensitive information. In the current implementation of Hadoop, only file-level access control is feasible. Providing users with the ability to access data based on the attributes in a dataset or the user’s role is complicated because of the sheer volume and multiple formats (structured, unstructured and semi-structured) of data. In this paper, we propose an access control framework, which enforces access control policies dynamically based on the sensitivity of the data. This framework enforces access control policies by harnessing the data context, usage patterns and information sensitivity. Information sensitivity changes over time with the addition and removal of datasets, which can lead to modifications in access control decisions. The proposed framework accommodates these changes. The proposed framework is automated to a large extent as the data itself determines the sensitivity with minimal user intervention. Our experimental results show that the proposed framework is capable of enforcing access control policies on non-multimedia datasets with minimal overhead.
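The core idea — deriving decisions from a sensitivity score that is recomputed as the data changes — can be sketched as follows. The attribute weights, the additive scoring and the clearance threshold are assumptions for illustration, not the paper's actual scheme.

```python
def sensitivity(record, weights):
    """Score a record by the sensitive attributes it actually contains.
    Re-running this as datasets are added or removed keeps access
    decisions aligned with the data currently present."""
    return sum(w for attr, w in weights.items() if attr in record)

def allowed(clearance, record, weights):
    """Permit access only if the user's clearance covers the record's
    current sensitivity (finer-grained than file-level control)."""
    return clearance >= sensitivity(record, weights)
```

Adding a new sensitive field to the weight table immediately tightens decisions on every record that carries it, with no per-file policy edits.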
Bulletin of the American Society for Information Science, 1992
This policy framework provides guidelines for federal agencies on public access to government electronic information. Highlights include reasons for disseminating information; defining user groups; which technology to use; pricing flexibility; security and privacy issues; and the private sector and state and local government roles. (LRW)
A theoretical framework for an access programme encompassing further education training: remedy for educational wastage? ... learners who have dropped out of school without completing their secondary-school education, there are the special needs of adult learners in the workplace that must be taken into consideration.
Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A
Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually, followed by a web service restart. Requests for information obtained by dynamic access of upstream sources are sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology, where new information is being continuously generated and the latest information is important. SideCache provides several types of services, including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework has also been used to share research results through the use of a SideCache-derived web service.
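The proxy-access, caching and rate-control services can be sketched together in one small class. This is a generic illustration of the pattern, not SideCache's actual API; the TTL-based expiry and minimum-interval throttle are assumptions.

```python
import time

class CachingProxy:
    """Caches upstream answers for `ttl` seconds and enforces a minimum
    interval between real upstream calls (rate control)."""
    def __init__(self, fetch, ttl=3600.0, min_interval=0.0):
        self.fetch = fetch            # function performing the real upstream call
        self.ttl = ttl
        self.min_interval = min_interval
        self.cache = {}               # key -> (timestamp, value)
        self.last_fetch = 0.0

    def get(self, key):
        now = time.monotonic()
        if key in self.cache and now - self.cache[key][0] < self.ttl:
            return self.cache[key][1]         # serve locally, no upstream hit
        wait = self.min_interval - (now - self.last_fetch)
        if wait > 0:
            time.sleep(wait)                  # throttle upstream requests
        value = self.fetch(key)
        self.last_fetch = time.monotonic()
        self.cache[key] = (self.last_fetch, value)
        return value
```

Periodic updating then reduces to expiring cache entries: once `ttl` elapses, the next request transparently refreshes from the upstream source.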
Souliotis, Kyriakos; Hasardzhiev, Stanimir; Agapidaki, Eirini
Research evidence suggests that access to health care is the key influential factor for improved population health outcomes and health care system sustainability. Although the importance of addressing barriers in access to health care across European countries is well documented, little has been done to improve the situation. This is partly due to differing definitions, approaches and policies, and partly due to persisting disparities in access within and between European countries. To bridge this gap, the Patient Access Partnership (PACT) developed (a) the '5As' definition of access, which details the five critical elements (adequacy, accessibility, affordability, appropriateness, and availability) of access to health care, (b) a multi-stakeholder approach for mapping access, and (c) a 13-item questionnaire based on the 5As definition, in an effort to address these obstacles and to identify best practices. These tools are expected to contribute effectively to addressing access barriers in practice, by suggesting a common framework and facilitating the exchange of knowledge and expertise, in order to improve access to health care between and within European countries. © 2016 S. Karger AG, Basel.
The IAEA, uniquely among international organizations concerned with the use of radiation, radioactive materials and nuclear energy, has statutory functions to establish safety standards and to provide for their application in Member States. The IAEA also contributes towards another major element of the 'global safety culture', namely the establishment of legally binding international agreements on safety related issues. (author)
A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers
Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich
The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard, using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used to index the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Plugged-in tools therefore gain automatically in transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests to use results already produced
Duffy, Francis M.
The ten professional standards form what Francis Duffy refers to as a "National Framework of Professional Standards for Change Leadership in Education." Each standard has examples of the knowledge, skills, and dispositions that the research suggests are important for effective change leadership. Duffy's hope is that this proposed…
Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future. PMID:11720931
Jensen, Henning Tarp; Tarp, Finn
In this paper, we present a SAM-based methodology for integrating standard CGE features with a macroeconomic World Bank–International Monetary Fund (IMF) modelling framework. The resulting macro–micro framework is based on optimising agents, but it retains key features from the macroeconomic model...
Stiles, Katherine; Mundry, Susan; DiRanna, Kathy
In response to the need to develop leaders to guide the implementation of the Next Generation Science Standards (NGSS), the Carnegie Corporation of New York provided funding to WestEd to develop a framework that defines the leadership knowledge and actions needed to effectively implement the NGSS. The development of the framework entailed…
Competence frameworks and standards are increasingly used by professions in the UK, driven by pressures for professional accountability, and particularly by the trend towards assessing practice before fully-qualified status is granted. A review of 40 UK frameworks indicated that most are concerned primarily with the ability to undertake work…
Ndumele, Chima D; Cohen, Michael S; Cleary, Paul D
Medicaid recipients have consistently reported less timely access to specialists than patients with other types of coverage. By 2018, state Medicaid agencies will be required by the Centers for Medicare & Medicaid Services (CMS) to enact time and distance standards for managed care organizations to ensure an adequate supply of specialist physicians for enrollees; however, there have been no published studies of whether these policies have significant effects on access to specialty care. To compare ratings of access to specialists for adult Medicaid and commercial enrollees before and after the implementation of specialty access standards, we used Consumer Assessment of Healthcare Providers and Systems survey data to conduct a quasi-experimental difference-in-differences (DID) analysis of 20 163 nonelderly adult Medicaid managed care (MMC) enrollees and 54 465 commercially insured enrollees in 5 states adopting access standards, and 37 290 MMC enrollees in 5 matched states that had previously adopted access standards. The outcome was reported access to specialty care in the previous 6 months. Seven thousand six hundred ninety-eight (69%) Medicaid enrollees and 28 423 (75%) commercial enrollees reported that it was always or usually easy to get an appointment with a specialist before the policy implementation (or at baseline), compared with 11 889 (67%) of Medicaid enrollees in states that had previously implemented access standards. Overall, there was no significant improvement in timely access to specialty services for MMC enrollees in the period following implementation of the standard(s) (adjusted difference-in-differences, -1.2 percentage points; 95% CI, -2.7 to 0.1), nor was there any impact of access standards on insurance-based disparities in access (0.6 percentage points; 95% CI, -4.3 to 5.4). There was heterogeneity across states, with 1 state that implemented both time and distance standards demonstrating significant improvements in access and reductions in disparities
Patrick, T B; Springer, G K; Mitchell, J A; Sievert, M E
Develop a framework for collections-based access to networked information sources that addresses the problem of location-dependent access to information sources. This framework uses the metaphor of a virtual shelf. A virtual shelf is a general-purpose server that is dedicated to a particular information subject class; the identifier of one of these servers identifies its subject class. Location-independent call numbers, based on standard vocabulary codes, are assigned to information sources. The call numbers are first mapped to the location-independent identifiers of virtual shelves. When access to an information resource is required, a location directory provides a second mapping of these location-independent server identifiers to actual network locations. The framework has been implemented in two different systems, one based on the Open Software Foundation/Distributed Computing Environment and the other on the World Wide Web. This framework applies traditional methods of library classification and cataloging in new ways. It is compatible with the two traditional styles of selecting information: searching and browsing. Traditional methods may be combined with new paradigms of information searching that will be able to take advantage of the special properties of digital information. Cooperation between the library and information science community and the informatics community can provide a means for a continuing application of the knowledge and techniques of library science to the new problems of networked information sources.
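The two-level mapping can be sketched in a few lines. The call number, shelf identifier and URLs below are invented for illustration; only the two-step resolution structure comes from the abstract.

```python
# First mapping: location-independent call number -> virtual shelf identifier.
# Call numbers are based on standard vocabulary codes.
call_numbers = {"W 26.5": "shelf:medical-informatics"}

# Second mapping: a location directory resolves shelf identifiers to the
# server's actual network location, which may change over time.
location_directory = {"shelf:medical-informatics": "http://host-a.example.org/mi"}

def resolve(call_number):
    """Resolve a call number to the current network location of its shelf."""
    shelf = call_numbers[call_number]
    return location_directory[shelf]
```

The point of the indirection: relocating a server only requires updating the location directory, while the call numbers assigned to information sources stay untouched.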
Zhou, Nan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Khanna, Nina Zheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fridley, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Romankiewicz, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
As appliance energy efficiency standards and labeling (S&L) programs reach a broader geographic and product scope, a series of sophisticated and complex technical and economic analyses have been adopted by different countries in the world to support and enhance these growing S&L programs. The initial supporting techno-economic and impact analyses for S&L development make up a defined framework and process for setting and developing appropriate appliance efficiency standards and labeling programs. This report reviews in-depth the existing framework for standards setting and label development in the well-established programs of the U.S., Australia and the EU to identify and evaluate major trends in how and why key analyses are undertaken and to understand major similarities and differences between each of the frameworks.
of ICT-related emissions globally. The overarching goal is to reach consensus within the global ICT sector on a common methodological framework for the measurement of energy consumption and carbon emissions arising from the production, and operation... of the standards that are of interest in the context of this paper include: (Note all IEEE standards are defined in ): ETSI reconfigurable radio systems (RRS) : ETSI considers the feasibility of possible operations of the long- term evolution (LTE...
Neuhaus, Philipp; Doods, Justin; Dugas, Martin
Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web-API is feasible. This framework can be easily enhanced due to its modular design.
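The parameters-in, JSON-out contract can be sketched as a dispatcher over pluggable mappers. The strategy names and result fields below are illustrative assumptions, not the framework's actual Web-API.

```python
import json

def map_items(items, strategy, mappers):
    """Send each term to the chosen mapping strategy and return the results
    as JSON, so callers depend only on parameters and JSON rather than on
    any one mapper's internals.
    items    -- list of medical data items (free-text terms)
    strategy -- name of the mapping method to use
    mappers  -- dict of strategy name -> callable(term) -> code or None"""
    mapper = mappers[strategy]
    results = [{"item": item, "code": mapper(item)} for item in items]
    return json.dumps(results)
```

Because every strategy sits behind the same callable signature, comparing a similarity search against a curated-repository lookup means swapping the `strategy` parameter, nothing else.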
Helle, Tina; Brandt, Åse; Slaug, Bjørn
openings at the entrance (defined as ≥75 cm) implied that the proportion of dwellings not meeting it was 11.3%, compared to 64.4% if the standard was set to ≥83 cm. The proportion of individuals defined as having accessibility problems for profiles not using mobility devices was 4-5%, and 57% for profiles using…
Helle, Tina; Brandt, Åse; Iwarsson, Susanne
evaluations of task-surface heights in elderly people's homes. Applied Ergonomics, 31, 109-119. Kohlbacher, F. (2006). The use of qualitative content analysis in case study research. Forum: Qualitative Social Research (FQS), Open Journal Systems, vol. 7, no. 1. Kozey, J.W. & Das, B. (2004)… • addressed single accessibility aspects such as reach, seat height or space requirements • targeted primarily industrial workstation design and only wheelchair/scooter users • addressed positions (standing/seated) and sex differences with respect to reach • was generated in lab-like environments, using methods… of the validity of housing standards. Therefore, it is reasonable to question what type of knowledge provides the most valid standards addressing accessibility, and to explore the consequences of using an alternative approach. The idea was thus to examine the validity of a set of housing standards using a so…
Casabianca, G.A.; Evans, J.M. [Research Centre Habitat and Energy, Facultad de Arquitectura, Diseno y Urbanismo, Universidad de Buenos Aires, Capital Federal (Argentina)
In southern Argentina, a region between latitudes 38° S and 55° S, the heating demand in the residential sector is high while the availability of solar radiation is limited. A new proposal for solar access standards has been developed, taking into account the climatic conditions of each location, the effective availability of solar radiation and the direct sunlight requirements. This study analyses the climatic conditions of Patagonia, relating heating demand and solar radiation availability at different sites, and presents the development of new sunlight standards that respond to these regional conditions. As a result of this study, the new Argentine standard IRAM 11.603 includes new conditions to protect solar access and provides design recommendations.
DCC DIFFUSE Standards Frameworks aims to offer domain-specific advice on standards relevant to digital preservation and curation, to help curators identify which standards they should be using and where they can be appropriately implemented, to ensure authoritative digital material. The project uses the DCC Curation Lifecycle Model and Web 2.0 technology to visually present standards frameworks for a number of disciplines. The Digital Curation Centre (DCC) is actively working with different relevant organisations to present searchable frameworks of standards for a number of domains. These include digital repositories, records management, the geo-information sector, archives and the museum sector. Other domains, such as e-science, will shortly be investigated.
Americans With Disabilities Act (ADA) and International Code Council (ICC) standards for accessible buildings and facilities affect design and construction of all new and renovated buildings throughout the United States, and form the basis for compliance with the ADA. While these standards may result in acceptable accessibility for people who are fully blind, they fall far short of what they could and should accomplish for those with low vision. In this article I critique the standards, detailing their lack of evidence base and other shortcomings. I suggest that simply making existing requirements stricter (e.g., by mandating larger letter size or higher contrasts) will not ensure visual accessibility and therefore cannot act as a valid basis for compliance with the law. I propose two remedies. First, requirements for visual characteristics of signs intended to improve access for those with low vision should be expressed not in terms of physical features, such as character height and contrast, but rather in terms of the distance at which a sign can be read by someone with nominally normal (20/20) visual acuity under expected lighting conditions for the installed environment. This would give sign designers greater choice in design parameters but place on them the burden of ensuring legibility. Second, mounting of directional signs, which are critical for effective and efficient wayfinding, should be required to be in consistent and approachable locations so that those with reduced acuity may view them at close distance.
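The proposed distance-based requirement can be made concrete with a small calculation. A common rule of thumb (not taken from the article) is that an observer with nominally normal (20/20) acuity can just read letters subtending about 5 minutes of arc, which ties letter height directly to viewing distance:

```python
import math

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians
LETTER_ARCMIN = 5              # rule of thumb: a 20/20 reader resolves
                               # letters subtending about 5 arcminutes

def min_letter_height(distance_m):
    """Smallest letter height (metres) nominally legible to a 20/20
    observer at the given viewing distance, under the 5-arcmin rule."""
    return distance_m * math.tan(LETTER_ARCMIN * ARCMIN)

def legibility_distance(letter_height_m):
    """Distance at which a letter of the given height subtends 5 arcmin."""
    return letter_height_m / math.tan(LETTER_ARCMIN * ARCMIN)

# A 75 mm (about 3 inch) character under ideal viewing conditions:
print(round(legibility_distance(0.075), 1), "m")
```

Expressing a sign requirement as "readable at distance d" rather than "characters of height h" is exactly the shift the article advocates; real legibility also depends on contrast, lighting and font, which this toy formula ignores.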
Sözen, Seval; Avcioglu, Ebru; Ozabali, Asli; Görgun, Erdem; Orhon, Derin
The Water Framework Directive, aiming to maintain and improve the aquatic environment in the EU, was launched by the European Parliament in 2000. According to this directive, control of quantity is an ancillary element in securing good water quality, and therefore measures on quantity, serving the objective of ensuring good quality, should also be established. Accordingly, it is a comprehensive and coordinated package that will ensure all European waters are protected according to a common standard. It therefore refers to all other directives related to water resources management, such as the Urban Wastewater Treatment Directive, the Nitrates Directive, the Drinking Water Directive, Integrated Pollution Prevention and Control, etc. Turkey, as a candidate state targeting full membership, should complete the necessary preparations for the implementation of the Water Framework Directive as soon as possible. In this study, the necessary legislative, political, institutional, and technical efforts of the pre-accession countries are discussed and effective recommendations are offered for future activities in Turkey.
Jirka, Gerhard H.; Burrows, Richard; Larsen, Torben
The "combined approach" in the new EC Water Framework Directive (WFD), consisting of environmental quality standards in addition to emission limit values, promises improvements in the quality characteristics of surface water. However, the specification of where in the water body the environmental quality standards apply is missing in the WFD. The omission will limit its administrative implementation. A clear mixing zone regulation is needed so that the quality objectives of the WFD are not jeopardized. This need is demonstrated using the examples of point source discharges into rivers and coastal…
Landini, Fernando; Cowes, Valeria González; D'Amore, Eliana
Health services accessibility is a key health policy issue. However, few in-depth studies have addressed it theoretically. Most distinguish between availability, accessibility, and acceptability, or between geographic, financial, administrative, and cultural accessibility. We discuss and analyze the concept of accessibility as conflictive articulation between supply and demand in health. The article addresses the importance of cultural accessibility, rethinking it as a social interface, i.e., a social arena with clashing worldviews (namely, those of physicians and patients). The approach sheds light on the complex processes of grasping, translating, and reshaping knowledge and recommendations within such interaction.
Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard
Ambient Assisted Living (AAL) is an emerging multi-disciplinary field aiming at exploiting information and communication technologies in personal healthcare and telehealth systems to counter the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality-of-service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey for collecting usage data and the current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features, ignoring many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, we found that more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems are needed to achieve more interoperable and synergetic AAL solutions.
Sinha, Pradeep K; Bendale, Prashant; Mantri, Manisha; Dande, Atreya
Discover How Electronic Health Records Are Built to Drive the Next Generation of Healthcare Delivery The increased role of IT in the healthcare sector has led to the coining of a new phrase ""health informatics,"" which deals with the use of IT for better healthcare services. Health informatics applications often involve maintaining the health records of individuals, in digital form, which is referred to as an Electronic Health Record (EHR). Building and implementing an EHR infrastructure requires an understanding of healthcare standards, coding systems, and frameworks. This book provides an
The purpose of the paper is to examine the legal basis of the system for ensuring living standards for the population of Ukraine. Methodology. Normative-legal documents on the basic standard of living of different population groups are analysed. The legislative field is investigated through the official web portal of the Verkhovna Rada of Ukraine and the State Statistics Service of Ukraine, covering the period from 1991 to the present. Results. Laws still in force from the last century are outdated and not consistent with the goals of social policy and the contemporary economy. It is important to modernize the laws concerning basic living standards of the population in line with the country's foreign policy, according to the EU methodology; to apply the state social standard as a tool for poverty reduction; and, prospectively, a starter package with a standard of living guaranteed by the government to its citizens. Practical implications. At different stages of development of the economy of independent Ukraine, the foundations of the legislative framework of normative documents concerning social protection of the population were laid. The country's legal framework contains a set of laws belonging to the last century alongside policy and regulatory documents that comply with EU standards. In turn, the regulatory framework tends toward modernization of the laws that establish the guaranteed state social standards and guarantees for every citizen. Value/originality. Analysis of the legislative base revealed the ineffectiveness of the laws guaranteeing a basic social standard to citizens, and the need to modernize the relatively large share of the laws adopted in the last century.
The goals and frameworks for traffic and transport policy for the Netherlands to 2020 are described in the Mobility Document. Whereas government policy previously viewed mobility as a problem or as something permissible, the assumption is now that mobility is a must. Mobility, for people as well as goods, is a prerequisite for society and the economy to function well. The Mobility Document contains ambitious goals to deal with current and anticipated traffic and transport problems: door to door, faster, cleaner and safer. Three interrelated pillars are to help achieve these goals: Building, Pricing and Utilisation. Work is being done on the Building and Pricing pillars; Utilisation is elaborated further in this policy framework. The Policy Framework for Utilisation is an elaboration of the Mobility Document for the 2008-2020 period and aims for faster, cleaner, safer travel from door to door. The purpose of this policy framework is to describe the direction of development of utilisation, in terms of content as well as process, to indicate actions that are required and to provide perspective on the expected effects. The policy framework is in line with current developments or plans, caters to new opportunities (technological and otherwise), encourages the innovative potential of the market and provides room for joint ventures between the government and the market. It will result in actions for the short term and provide direction for activities and developments for the longer term
... Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite Wood Products... Certification Framework for the Formaldehyde Standards for Composite Wood Products AGENCY: Environmental... certification, auditing and reporting of third-party certifiers, recordkeeping, enforcement, laminated products...
Spectrum decision is the ability of a cognitive radio (CR) system to select the best available spectrum band to satisfy dynamic spectrum access network (DSAN) users' quality of service (QoS) requirements without causing harmful interference…
Dragut, Eduard Constantin
An increasing number of Web sites expose their content via query interfaces, many of them offering the same type of products/services (e.g., flight tickets, car rental/purchasing). They constitute the so-called "Deep Web". Accessing the content on the Deep Web has been a long-standing challenge for the database community. For a user interested in…
Davy, Carol; Harfield, Stephen; McArthur, Alexa; Munn, Zachary; Brown, Alex
Indigenous peoples often find it difficult to access appropriate mainstream primary health care services. Securing access to primary health care services requires more than just services that are situated within easy reach. Ensuring the accessibility of health care for Indigenous peoples, who are often faced with a vast array of additional barriers including experiences of discrimination and racism, can be complex. This framework synthesis aimed to identify issues that hindered Indigenous peoples from accessing primary health care and then explore how, if at all, these were addressed by Indigenous health care services. To be included in this framework synthesis, papers must have presented findings focused on access to (factors relating to Indigenous peoples, their families and their communities) or accessibility of Indigenous primary health care services. Findings were imported into NVivo and a framework analysis undertaken whereby findings were coded to and then thematically analysed using Levesque and colleagues' accessibility framework. Issues relating to the cultural and social determinants of health, such as unemployment and low levels of education, influenced whether Indigenous patients, their families and communities were able to access health care. Indigenous health care services addressed these issues in a number of ways, including the provision of transport to and from appointments, a reduction in health care costs for people on low incomes, and close consultation with, if not the direct involvement of, community members in identifying and then addressing health care needs. Indigenous health care services appear to be best placed to overcome both the social and cultural determinants of health which hamper Indigenous peoples from accessing health care. Findings of this synthesis also suggest that Levesque and colleagues' accessibility framework should be broadened to include factors related to the health care system, such as funding.
Sundaresan, Puma; Stockler, Martin R; Milross, Christopher G
Optimal radiation therapy (RT) utilisation rates (RURs) have been defined for various cancer indications through extensive work in Australia and overseas. These benchmarks remain unrealised. The gap between optimal RUR and actual RUR has been attributed to inadequacies in 'RT access'. We aimed to develop a conceptual framework for the consideration of 'RT access' by examining the literature for existing constructs and translating them to the context of RT services. We further aimed to use this framework to identify and examine factors influencing 'RT access'. Existing models of health care access were reviewed and used to develop a multi-dimensional conceptual framework for 'RT access'. A review of the literature was then conducted to identify factors reported to affect RT access and utilisation. The electronic databases searched, the host platform and the date ranges covered were Ovid MEDLINE, 1946 to October 2014, and PsycINFO via OvidSP, 1806 to October 2014. The framework developed demonstrates that 'RT access' encompasses opportunity for RT as well as the translation of this opportunity to RT utilisation. Opportunity for RT includes availability, affordability, adequacy (quality) and acceptability of RT services. Several factors at the consumer, referrer and RT service levels affect the translation of this opportunity for RT to actual RT utilisation. 'Access' is a term that is widely used in the context of health service related research, planning and political discussions. It is a multi-faceted concept with many descriptions. We propose a conceptual framework for the consideration of 'RT access' so that factors affecting RT access and utilisation may be identified and examined. Understanding these factors, and quantifying them where possible, will allow objective evaluation of their impact on RT utilisation and guide implementation of strategies to modify their effects.
Möller, Markus; Doms, Juliane; Gerstmann, Henning; Feike, Til
Climate change has been recognized as a main driver of the increasing occurrence of extreme weather. Weather indices (WIs) are used to assess extreme weather conditions with regard to their impact on crop yields. Designing WIs is challenging, since complex and dynamic crop-climate relationships have to be considered. As a consequence, geodata for WI calculations have to represent both the spatio-temporal dynamics of crop development and the corresponding weather conditions. In this study, we introduce a WI design framework for Germany, which is based on public and open raster data of long-term spatio-temporal availability. The operational process chain enables the dynamic and automatic definition of relevant phenological phases for the main cultivated crops in Germany. Within the temporal bounds, WIs can be calculated for any year and test site in Germany in a reproducible and transparent manner. The workflow is demonstrated on the example of a simple cumulative rainfall index for the phenological phase 'shooting' of winter wheat, using 16 test sites and the period between 1994 and 2014. Compared to station-based approaches, the major advantage of our approach is the possibility to design spatial WIs based on raster data characterized by accuracy metrics. Raster data and WIs which fulfill data quality standards can contribute to increased acceptance of, and farmers' trust in, WI products for crop yield modeling or weather index-based insurances (WIIs).
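A cumulative rainfall index over a phenological window, the example WI used in the study, reduces to a simple sum once the phase bounds are known. In the actual workflow the bounds are derived per crop, year and site from public phenological raster data; the dates and rainfall values below are invented for illustration:

```python
from datetime import date

def cumulative_rainfall_index(daily_rain, phase_start, phase_end):
    """Sum daily precipitation (mm) over a phenological phase.

    daily_rain: mapping of date -> rainfall in mm (e.g. one raster
    cell's time series); phase_start/phase_end: inclusive phase bounds.
    """
    return sum(mm for day, mm in daily_rain.items()
               if phase_start <= day <= phase_end)

# Toy series around an assumed 'shooting' phase of winter wheat:
rain = {date(2014, 4, d): mm for d, mm in [(28, 2.0), (29, 0.0), (30, 5.5)]}
rain[date(2014, 5, 1)] = 1.5
print(cumulative_rainfall_index(rain, date(2014, 4, 29), date(2014, 5, 1)))
```

The value of the framework lies not in this trivial sum but in deriving the phase bounds automatically and consistently for any site and year.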
Ezell, Matthew A [ORNL]; Rogers, Gary L [University of Tennessee, Knoxville (UTK)]; Peterson, Gregory D. [University of Tennessee, Knoxville (UTK)]
As cyber attacks become increasingly sophisticated, the security measures used to mitigate the risks must also increase in sophistication. One time password (OTP) systems provide strong authentication because security credentials are not reusable, thus thwarting credential replay attacks. The credential changes regularly, making brute-force attacks significantly more difficult. In high performance computing, end users may require access to resources housed at several different service provider locations. The ability to share a strong token between multiple computing resources reduces cost and complexity. The National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) provides access to digital resources, including supercomputers, data resources, and software tools. XSEDE will offer centralized strong authentication for services amongst service providers that leverage their own user databases and security profiles. This work implements a scalable framework built on standards to provide federated secure access to distributed cyberinfrastructure.
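The core property of OTP systems, that each credential is used once and then discarded, is illustrated by the HMAC-based OTP algorithm of RFC 4226. The sketch below is a generic textbook implementation of that algorithm, not XSEDE's actual authentication code:

```python
import hmac, hashlib, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226): each counter value yields
    a fresh, non-reusable code, which is what makes a replayed credential
    useless to an attacker."""
    msg = struct.pack(">Q", counter)                 # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: secret "12345678901234567890", counter 0 -> 755224
print(hotp(b"12345678901234567890", 0))
```

A federated setup like the one described would layer token validation and identity federation on top of such codes, so that one hardware or software token works across all participating service providers.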
Boldrini, E.; Salas, F.; Maidment, D. R.; Mazzetti, P.; Santoro, M.; Nativi, S.; Domenico, B.
services, and executes complex queries against the available metadata; and an inventory service (implemented as a THREDDS server) able to hierarchically organize and publish a local collection of multi-dimensional arrays (e.g. NetCDF, GRIB files), as well as publish auxiliary standard services to realize the actual data access and visualization (e.g. WCS, OPeNDAP, WMS). The approach followed in this research is to build on top of the existing standards and implementations by setting up a standard-aware interoperable framework able to deal with the existing heterogeneity in an organic way. As a methodology, interoperability tests against real services were performed; existing problems were thus highlighted and, where possible, solved. The use of flexible tools, able to deal in a smart way with heterogeneity, has proven successful; in particular, experiments were carried out with both the GI-cat broker and the ESRI GeoPortal frameworks. The GI-cat discovery broker proved successful at implementing the CSW interface, as well as at federating heterogeneous resources such as the THREDDS and WCS services published by Unidata and the HydroServer, WFS and SOS services published by CUAHSI. Experiments with the ESRI GeoPortal were also successful: the GeoPortal was used to deploy a web interface able to distribute searches amongst catalog implementations from both the hydrologic and the atmospheric communities, including HydroServers and GI-cat, combining results from both domains in a seamless way.
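As a rough illustration of the requests involved, here is a minimal sketch (Python standard library only) of an OGC CSW 2.0.2 GetRecords query with an AnyText filter, the sort of message a client would POST to a CSW endpoint such as the one GI-cat publishes. The keyword and record limit are arbitrary:

```python
import xml.etree.ElementTree as ET

CSW = "http://www.opengis.net/cat/csw/2.0.2"
OGC = "http://www.opengis.net/ogc"

def get_records_request(keyword, max_records=10):
    """Build a minimal CSW 2.0.2 GetRecords request filtering on AnyText.
    A broker such as GI-cat would answer it by distributing the query to
    its federated services and merging the results."""
    ET.register_namespace("csw", CSW)
    ET.register_namespace("ogc", OGC)
    root = ET.Element(f"{{{CSW}}}GetRecords", {
        "service": "CSW", "version": "2.0.2",
        "resultType": "results", "maxRecords": str(max_records),
    })
    query = ET.SubElement(root, f"{{{CSW}}}Query", typeNames="csw:Record")
    constraint = ET.SubElement(query, f"{{{CSW}}}Constraint", version="1.1.0")
    flt = ET.SubElement(constraint, f"{{{OGC}}}Filter")
    like = ET.SubElement(flt, f"{{{OGC}}}PropertyIsLike",
                         wildCard="%", singleChar="_", escapeChar="\\")
    ET.SubElement(like, f"{{{OGC}}}PropertyName").text = "AnyText"
    ET.SubElement(like, f"{{{OGC}}}Literal").text = f"%{keyword}%"
    return ET.tostring(root, encoding="unicode")

print(get_records_request("streamflow"))
```

The point of the broker architecture is that the client only ever speaks this one protocol, while the broker translates to THREDDS, WFS, SOS and the rest behind the scenes.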
Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly
... alterations begun before January 26, 1992, in a good faith effort to make a facility accessible to individuals..., including advertisement in appropriate media, such as newspapers of general and special interest circulation...
O'Brien, K.; Kern, K.; Smith, B.; Schweitzer, R.; Simons, R.; Mendelssohn, R.; Diggs, S. C.; Belbeoch, M.; Hankin, S.
The Tropical Pacific Observing System (TPOS) has been functioning and capturing measurements since the mid 1990s during the very successful Tropical Ocean Global Atmosphere (TOGA) project. Unfortunately, in the current environment, some 20 years after the end of the TOGA project, sustaining the observing system is proving difficult. With the many advances in methods of observing the ocean, a group of scientists is taking a fresh look at what the Tropical Pacific Observing System requires for sustainability. This includes utilizing a wide variety of observing system platforms, including Argo floats, unmanned drifters, moorings, ships, etc. This variety of platforms measuring ocean data also provides a significant challenge in terms of integrated data management. It is recognized that data and information management is crucial to the success and impact of any observing system. In order to be successful, it is also crucial to avoid building stovepipes for data management. To that end, NOAA's Observing System Monitoring Center (OSMC) has been tasked to create a testbed of integrated real time and delayed mode observations for the Tropical Pacific region in support of the TPOS. The observing networks included in the prototype are: Argo floats, OceanSites moorings, drifting buoys, hydrographic surveys, underway carbon observations and, of course, real time ocean measurements. In this presentation, we will discuss how the OSMC project is building the integrated data prototype using existing free and open source software. We will explore how we are leveraging successful data management frameworks pioneered by efforts such as NOAA's Unified Access Framework project. We will also show examples of how conforming to well known conventions and standards allows for discoverability, usability and interoperability of data.
Gupta, Amarnath; Bug, William; Marenco, Luis; Qian, Xufei; Condit, Christopher; Rangarajan, Arun; Müller, Hans Michael; Miller, Perry L; Sanders, Brian; Grethe, Jeffrey S; Astakhov, Vadim; Shepherd, Gordon; Sternberg, Paul W; Martone, Maryann E
The overarching goal of the NIF (Neuroscience Information Framework) project is to be a one-stop shop for neuroscience. This paper provides a technical overview of how the system is designed. The technical goal of the first version of the NIF system was to develop an information system that a neuroscientist can use to locate relevant information from a wide variety of information sources by simple keyword queries. Although the user provides only keywords to retrieve information, the NIF system is designed to treat them as concepts whose meanings are interpreted by the system. Thus, a search for a term should find records containing synonyms of the term. The system is targeted to find information from web pages, publications, databases, web sites built upon databases, XML documents and any other modality in which such information may be published. We have designed a system to achieve this functionality. A central element in the system is an ontology called NIFSTD (for NIF Standard), constructed by amalgamating a number of known and newly developed ontologies. NIFSTD is used by our ontology management module, called OntoQuest, to perform ontology-based search over data sources. The NIF architecture currently provides three different mechanisms for searching heterogeneous data sources, including relational databases, web sites, XML documents and the full text of publications. Version 1.0 of the NIF system is currently in beta test and may be accessed through http://nif.nih.gov.
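The described behaviour, keywords treated as concepts so that records using a synonym are still found, can be sketched with a toy synonym table. The concepts and labels below are invented for illustration and are far simpler than NIFSTD:

```python
# Toy ontology fragment: each concept has a preferred label plus synonyms.
ONTOLOGY = {
    "purkinje cell": {"purkinje neuron", "cerebellar purkinje cell"},
    "hippocampus": {"ammon's horn", "cornu ammonis"},
}

def expand(term):
    """Expand a keyword into the full synonym set of a matching concept,
    so that records using any synonym are still retrieved."""
    term = term.lower()
    for label, synonyms in ONTOLOGY.items():
        if term == label or term in synonyms:
            return {label} | synonyms
    return {term}

def search(records, term):
    """Return records matching the query term or any of its synonyms."""
    terms = expand(term)
    return [r for r in records if any(t in r.lower() for t in terms)]

records = ["Spike timing in the Purkinje neuron", "Cortical astrocytes"]
print(search(records, "purkinje cell"))
```

A plain keyword search would miss the first record here; the concept expansion is what finds it.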
Talanquer, Vicente; Sevian, Hannah
Science education frameworks and standards play a central role in the development of curricula and assessments, as well as in guiding teaching practices in grades K-12. Recently, the National Research Council published a new Framework for K-12 Science Education that has guided the development of the Next Generation Science Standards. In this…
... Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite Wood Products..., concerning a third-party certification framework for the formaldehyde standards for composite wood products... Environmental protection, Composite wood products, Formaldehyde, Reporting and recordkeeping, Third-party...
... Formaldehyde; Third-Party Certification Framework for the Formaldehyde Standards for Composite Wood Products..., concerning a third-party certification framework for the formaldehyde standards for composite wood products... INFORMATION CONTACT. List of Subjects in 40 CFR Part 770 Environmental protection, Composite wood products...
The harmonization of national standards in tourism with international requirements is a prerequisite for performing in accordance with the Association Agreement signed between Ukraine and the European Union. The current situation of national standardization in tourism and the directions of its development in the context of European integration are outlined in the article. The content and objectives of standardization in the field of tourism are determined. The legislation on national tourism standardization is reviewed: there are 11 standards in the field of tourism, including 6 interstate standards (GOST) adopted as national ones. The current system of standards contains numerous outdated requirements, and Ukrainian enterprises do not use the international standards on the organization of adventure tourism trips, safety management, customer service on cruise ships and ferries, requirements for tourist services, etc. In order to raise the quality of tourism services to the European level, it is recommended to adapt existing ISO standards to the national tourism legislation and to approve them in 2017.
Luzia, Karina; Harvey, Marina; Parker, Nicola; McCormack, Coralie; Brown, Natalie R.
Benchmarking as a type of knowledge-sharing around good practice within and between institutions is increasingly common in the higher education sector. More recently, benchmarking as a process that can contribute to quality enhancement has been deployed across numerous institutions with a view to systematising frameworks to assure and enhance the…
Background: Many ontologies have been developed in biology and these ontologies increasingly contain large volumes of formalized knowledge commonly expressed in the Web Ontology Language (OWL). Computational access to the knowledge contained within these ontologies relies on the use of automated reasoning. Results: We have developed the Aber-OWL infrastructure that provides reasoning services for bio-ontologies. Aber-OWL consists of an ontology repository, a set of web services and web interfaces that enable ontology-based semantic access to biological data and literature. Aber-OWL is freely available at http://aber-owl.net. Conclusions: Aber-OWL provides a framework for automatically accessing information that is annotated with ontologies or contains terms used to label classes in ontologies. When using Aber-OWL, access to ontologies and data annotated with them is not merely based on class names or identifiers but rather on the knowledge the ontologies contain and the inferences that can be drawn from it.
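Retrieval "based on the knowledge the ontologies contain" rather than on class names alone can be illustrated with a toy subsumption hierarchy standing in for what an OWL reasoner would infer; all names below are invented:

```python
# Toy class hierarchy: child -> parent (a stand-in for the subsumptions
# an OWL reasoner would compute over a real ontology).
PARENT = {"neuron": "cell", "purkinje cell": "neuron"}

def ancestors(cls):
    """All superclasses of cls, following the parent chain."""
    seen = set()
    while cls in PARENT:
        cls = PARENT[cls]
        seen.add(cls)
    return seen

def retrieve(annotations, query_class):
    """Return items annotated with the query class or any subclass of it,
    i.e. retrieval by inferred knowledge rather than by matching names."""
    return [item for item, cls in annotations.items()
            if cls == query_class or query_class in ancestors(cls)]

data = {"trace-17": "purkinje cell", "image-03": "cell", "assay-9": "neuron"}
print(retrieve(data, "neuron"))
```

A query for "neuron" returns the item annotated "purkinje cell" even though the string never matches, which is the essence of ontology-based semantic access.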
Corominas, L.; Rieger, L.; Takacs, I.
Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter, and several subscript levels that provide… The main objective of this consensus-building paper… The result is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes.
De Lusignan, Simon; Liyanage, Harshana; Di Iorio, Concetta Tania; Chan, Tom; Liaw, Siaw-Teng
The use of health data for public health, surveillance, quality improvement and research is crucial to improve health systems and health care. However, bodies responsible for privacy and ethics often limit access to routinely collected health data. Ethical approvals, issues around protecting privacy and data access are often dealt with by different layers of regulations, making approval processes appear disjointed. To create a comprehensive framework for defining the ethical and privacy status of a project and for providing guidance on data access. The framework comprises principles and related questions. The core of the framework will be built using standard terminology definitions such as ethics-related controlled vocabularies and regional directives. It is built in this way to reduce ambiguity between different definitions. The framework is extensible: principles can be retired or added to, as can their related questions. Responses to these questions should allow data processors to define ethical issues, privacy risk and other unintended consequences. The framework contains three steps: (1) identifying possible ethical and privacy principles relevant to the project; (2) providing ethics and privacy guidance questions that inform the type of approval needed; and (3) assessing case-specific ethics and privacy issues. The outputs from this process should inform whether the balance between public interests and privacy breach and any ethical considerations are tipped in favour of societal benefits. If they are then this should be the basis on which data access is permitted. Tightly linking ethical principles to governance and data access may help maintain public trust.
Soffel, Michael H.
The classical post-Newtonian (PN) framework is formulated in one single reference system. In a series of papers Damour, Soffel and Xu laid the foundations for a new improved PN framework dealing with the celestial mechanical problem of N gravitationally interacting rotating bodies of arbitrary shape and the problem of astronomical reference systems. In the DSX-framework a total of N+1 reference systems with corresponding coordinates is introduced in the N-body problem: a global one covering the entire model manifold where the translational equations of motion are formulated and one local system attached to each of the N bodies that is co-moving with the body under consideration. In each of these systems the metric tensor is assumed to be of a special form determined by two potentials: a scalar and a vector potential. Theorems are given for the transformations between local and global coordinates and metric potentials. In each of the local systems outside the local body the metric potentials are expressed in terms of Blanchet-Damour mass- and spin-multipole moments. The talk first introduces the original DSX formalism and then concentrates on IAU resolutions related with it. Finally, the formalism is extended to include also effects from the cosmic expansion. The influence of the Hubble expansion on the dynamics of the solar system is explicitly discussed in some detail.
junk food out of America’s schools. Mission: Readiness. Washington, DC. April 2010. 4. Niebuhr DW, Cavicchia MA, Bedno SA, et al. Accession… [flattened table residue omitted] …waivers for elevated blood pressure without a diagnosis of hypertension (88.4%) and toxic effect of noxious substances eaten as food (90.3%) had the…
Karatzas, K.; Moussiopoulos, N. [Aristotle University of Thessaloniki (Greece). Department of Mechanical Engineering, Laboratory of Heat Transfer and Environmental Engineering]
The European Union (EU) legislative framework related to air quality, together with national legislation and relevant declarations of the United Nations (UN), requires an integrated approach concerning air quality management (AQM), and accessibility of related information for the citizens. In the present paper, the main requirements of this legislative framework are discussed and main air quality management and information system characteristics are drawn. The use of information technologies is recommended for the construction of such systems. The World Wide Web (WWW) is considered a suitable platform for system development and integration and at the same time as a medium for communication and information dissemination. (author)
Asghari, Shabnam; Hurd, Jillian; Marshall, Zack; Maybank, Allison; Hesselbarth, Lydia; Hurley, Oliver; Farrell, Alison; Kendall, Claire E; Rourke, Sean B; Becker, Marissa; Johnston, Sharon; Lundrigan, Phil; Rosenes, Ron; Bibeau, Christine; Liddy, Clare
Accessing healthcare can be difficult but the barriers multiply for people living with HIV (PLHIV). To improve access and the health of PLHIV, we must consider their perspectives and use them to inform standard practice. A better understanding of the current literature related to healthcare access from the perspective of PLHIV, can help to identify evidence gaps and highlight research priorities and opportunities. To identify relevant peer-reviewed publications, search strategies were employed. Electronic and grey literature databases were explored. Articles were screened based on their title and abstract and those that met the screening criteria, were reviewed in full. Data analysis was conducted using a collaborative approach that included knowledge user consultation. Initial concepts were extracted, summarized and through framework synthesis, developed into emerging and final themes. From 20,678 articles, 326 articles met the initial screening criteria and 64 were reviewed in full. The final themes identified, in order of most to least frequent were: Acceptability, Availability, Accessibility, Affordability, Other Barriers, Communication, Satisfaction, Accommodation, Preferences and Equity in Access. The most frequently discussed concepts related to negative interactions with staff, followed by long wait times, limited household resources or inability to pay fees, and fear of one's serostatus being disclosed. Knowledge users were in agreement with the categorization of initial concepts and final themes; however, some gaps in the literature were identified. Specific changes are critical to improving access to healthcare for PLHIV. These include improving availability by ensuring staff and healthcare professionals have proper training, cultivating acceptability and reducing stigma through improving HIV awareness, increasing accessibility through increased HIV information for PLHIV and improved dissemination of this information to increase patient knowledge and
Librarians and libraries have long been committed to providing equitable access to information. In the past decade and a half, the growth of the Internet and the rapid increase in the number of online library resources and tools have added a new dimension to this core duty of the profession: ensuring accessibility of online resources to users with…
Bellaire, Gunter; Steines, Daniel; Graschew, Georgi; Thiel, Andreas; Bernarding, Johannes; Tolxdorff, Thomas; Schlag, Peter M.
The system presented here enhances documentation and data-secured, second-opinion facilities by integrating video sequences into DICOM 3.0. We present an implementation of a medical video server extended by a DICOM interface. Security mechanisms conforming to DICOM are integrated to enable secure internet access. Digital video documents of diagnostic and therapeutic procedures should be examined regarding the clip length and size necessary for second opinion and manageable with today's hardware. Image sources relevant to this paper include the 3D laparoscope, 3D surgical microscope, 3D open surgery camera, synthetic video, and monoscopic endoscopes. The global DICOM video concept and three special workplaces for distinct applications are described. Additionally, an approach is presented to analyze the motion of the endoscopic camera for future automatic video cutting. Digital stereoscopic video sequences (DSVS) are especially in demand for surgery. Therefore DSVS are also integrated into the DICOM video concept. Results are presented describing the suitability of stereoscopic display techniques for the operating room.
Abel, Steven; Sannino, Francesco
We present a consistent embedding of the matter and gauge content of the Standard Model into an underlying asymptotically safe theory that has a well-determined interacting UV fixed point in the large color/flavor limit. The scales of symmetry breaking are determined by two mass-squared parameters...... with the breaking of electroweak symmetry being driven radiatively. There are no other free parameters in the theory apart from gauge couplings....
Full Text Available , and then also evaluated the usability and direct accessibility support provided by the Digital Doorway, a non-standard computer system deployed amongst underprivileged communities in South Africa with the aim of promoting computer literacy. This paper discusses...
Neto, A.; Fernandes, H.; Valcarcel, D.; Varandas, C.; Vega, J.; Sanchez, E.; Pena, A.; Hron, M.
Each EURATOM association stores data using proprietary schemes, usually developed by the research unit or using third-party software. The temporary exchange of researchers between laboratories is common practice nowadays, and when a researcher returns to the home laboratory there is usually a need to continue the work started in the foreign country. The quantity of available data has also become enormous, and the principal data index is shifting from the shot number to time and events, with the shot number being just one of the most relevant. To solve these problems, a common software layer between end-users and laboratories must exist. The components needed to create this software abstraction layer between users and laboratory data have already been developed using a universal and well-known remote procedure call standard based on XML: XML-RPC. The library allows data retrieval using the same methods for all associations. Users are authenticated through the PAPI system (http://papi.rediris.es), allowing each organization to use its own authentication schema. Presently there are library and server implementations in Java and C++. These libraries have been included and tested in some of the most common data analysis programs, such as MatLab and IDL. The system is already being used in ISTTOK/PT and CASTOR/CZ. (author)
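The core design here, a uniform remote-procedure interface hiding each laboratory's storage scheme, can be sketched with standard XML-RPC machinery. The method name `getData` and its arguments below are hypothetical stand-ins, not the actual interface of the libraries described; the sketch runs a throwaway local server to keep the example self-contained.

```python
# Hedged sketch of a uniform XML-RPC data-access layer. The getData
# method name, its signature, and the returned structure are invented
# for illustration; the real Java/C++ libraries may differ.
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

# Stand-in for one association's proprietary data store.
def get_data(signal, shot):
    """Return a (made-up) sampled signal for a given shot number."""
    return {"signal": signal, "shot": shot, "samples": [0.0, 0.5, 1.0]}

# Throwaway local server so the example is runnable end-to-end.
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(get_data, "getData")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client at any laboratory would issue the same call regardless of
# the backend storage scheme.
proxy = xmlrpc.client.ServerProxy(f"http://localhost:{port}")
result = proxy.getData("density", 12345)
print(result["shot"], len(result["samples"]))  # 12345 3

server.shutdown()
```

The point of the abstraction is visible in the client half: nothing in the `proxy.getData(...)` call depends on how the association stores its data.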
Indiana Department of Education, 2015
The "Foundations" (English/language arts, mathematics, social emotional skills, approaches to play and learning, science, social studies, creative arts, and physical health and growth) are Indiana's early learning development framework and are aligned to the 2014 Indiana Academic Standards. This framework provides core elements that…
… 7 CFR (Agriculture, 2010): RUS standard for service installations at customer access locations. (a) Sections 1755.501 through 1755.510 cover service installations at permanent or mobile home customer access locations. Sections 1755.501 through 1755.510 do not cover service…
Liu, Ye; Palmer, Bart; Recker, Mimi
Professional education is increasingly facing accessibility challenges with the emergence of web-based learning. This paper summarizes related U.S. legislation, standards, guidelines, and validation tools for making web-based learning accessible to all potential learners. We also present lessons learned during the implementation of web accessibility…
Rada Cristina IRIMIE
Full Text Available The term “globalization” has been much used in recent days to explain a series of phenomena, especially in the economic field. Indeed, after thorough research, globalization could mostly be relegated to the economic field. However, the term can also designate a series of other processes from other fields of activity. The present paper shall deal with an analysis of two economic colossi – the European Union and China – by applying to them the variable of “globalization”. It is globalization that made possible the establishment of such relations, as well as their effective management and constant improvement. Even if the relations have had their ups and downs, the general framework provided by globalization helped appease conflicts when they were about to break out, and even offered an alternative in situations which seemed impossible to manage (such as the arms embargo). The paper shall be structured as follows: an analysis of the term from several perspectives, including the social and economic ones, followed by its application to the given situation: European Union-Chinese relations. In this regard, given time and space constraints, we shall limit the research to only two types of cooperation: economic cooperation and political and security-related cooperation. The final chapter of the paper shall also refer to several elements of discontent within this relation, such as the arms embargo, the disregard for human rights in China and the unstable situation of Taiwan. However, these elements of discontent shall only be referred to when necessary, leaving a deeper analysis of them to a future academic endeavor.
Designing assessments and tests is one of the more challenging aspects of creating an accessible learning environment for students who are deaf or hard of hearing (DHH), particularly for deaf students with a disability (DWD). Standardized assessments are a key mechanism by which the educational system in the United States measures student progress, teacher effectiveness, and the impact of school reform. The diversity of student characteristics within DHH and DWD populations is only now becoming visible in the research literature relating to standardized assessments and their use in large-scale accountability reforms. The purpose of this article is to explore the theoretical frameworks surrounding assessment policy and practice, current research related to standardized assessment and students who are DHH and DWD, and potential implications for practice within both the assessment and instruction contexts.
The concept of Web accessibility refers to a combined set of measures, namely, how easily and how efficiently different types of users may make use of a given service. While some recommendations for accessibility focus on people with various specific disabilities, this document seeks...... to broaden the scope to any type of user and any type of use case. The document provides an introduction to some required concepts and technical standards for designing accessible Web sites. A brief review of the legal requirements in a few countries for Web accessibility complements the recommendations......
Avdeeva, Svetlana; Zaichkina, Olga; Nikulicheva, Nataliya; Khapaeva, Svetlana
The paper deals with the problems of working out a test framework for the assessment of teachers' ICT competency in line with the requirements of the "Teacher" occupational standard. The authors have analyzed the known approaches to assessing teachers' ICT competency (the ISTE Standards and the UNESCO ICT CFT) and have suggested their own approach to…
Louden, William; Wildy, Helen
Professional standards for school principals typically describe an ideal performance in a generalized context. This article describes an alternative method of developing a standards framework, combining qualitative vignettes with probabilistic measurement techniques to provide essential or ideal performance qualities with contextually rich…
Murphy, Aileen; Garavan, Thomas N.
This article proposes a conceptual framework to explain the adoption and diffusion of a national human resource development (NHRD) standard. NHRD standards are used by governments to promote training and development in organizations and increase the professionalization of practices used by organizations. Institutional theory suggests that adoption…
Arias, M. C.; Bernaldez, A.L.; Ghiggeri, M.; Tula, C.
The right of access by citizens to information about activities related to the scientific and technological development of nuclear energy for peaceful uses has evolved over time. Governments began to perceive the necessity and benefits of informing the community, who manifested certain prejudices about nuclear activity as a consequence of the dropping of nuclear bombs on Hiroshima and Nagasaki. With the advent of environmental law and the influence of its principles, the idea of transparency of information in the nuclear field was imposed, along with the importance that both the inhabitants of countries with nuclear developments and those of neighbouring countries who may be affected by the cross-border effects of ionizing radiation could have access to information and participate actively. Access to information and citizen participation have been institutionalized and reflected in international regulations through international conventions subscribed to by our country, and nationally through the National Constitution, the Provincial Constitutions, the City of Buenos Aires Constitution, Laws No. 25.675 and 25.831 and PEN Decree No. 1172/03, among others. The present work aims to provide an overview of the legal framework related to access to information on nuclear activity. (authors) [es
to the physical media (i.e., the wireless RF network). On the transmission side, it is responsible for framing IP packets for physical transmission… resolution bandwidth of 30 kHz. It was measured during the steady power condition during a burst transmission. Telemetry Standards, RCC Standard… power levels available for modulated burst transmission. Table 27-1. Transceiver Phase Noise Mask (residue): −30 dBc/Hz at 10 Hz frequency offset; −60 dBc…
van Bussel, Erik Martijn; van der Voort, Marc Boudewijn Victor Rouppe; Wessel, Ronald N; van Merode, Godefridus G
While theoretical frameworks for optimization of outpatient processes are abundant, practical step-by-step analyses that give leads for improvement, forecast capacity, and support decision making are sparse. This article demonstrates how to evaluate and optimize the triad of demand, (future) capacity, and access time of the outpatient clinic using a structured six-step method. All individual logistical patient data of an orthopaedic outpatient clinic for one complete year were analysed using the six-step method to evaluate demand, supply, and access time. Trends in the data were retrospectively analysed and evaluated for potential improvements, and a model for decision making was tested. Both the analysis of the method and the actual results were considered main outcomes. More than 25 000 appointments were analysed. The six-step method proved sufficient to yield valuable insights and leads for improvement. While the overall match between demand and capacity was considered adequate, the variability in capacity was much higher than in demand, thereby leading to delays in access time. Holidays and the weeks following them proved highly influential on demand, capacity, and access time. Using the six-step method, several unfavourable characteristics of the outpatient clinic were revealed, and a better match between demand, supply, and access time could have been reached with only minor adjustments. Last, a clinic-specific prediction and decision model for demand and capacity was built using the six-step method. The six-step analysis can successfully be applied to redesign and improve the outpatient health care process. The results showed that national holidays and variability in demand and capacity have a large influence on the outpatient clinic. Using the six-step method, practical improvements in outpatient logistics were easily found and leads for future decision making were derived. © 2018 The Authors Journal of Evaluation in Clinical Practice
Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.
Over the last decades, the use of Geographic Information (GI) has gained importance in the public as well as the private sector. But even if many spatial data sets and related information exist, they are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain and prepare them for use in applications. Therefore Spatial Data Infrastructures (SDI) have been developed to enhance the access, use and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the 1990s many SDI initiatives have seen the light. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes require technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI-standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. Therefore the objective of the research is to develop a quantitative framework to assess the impact of GI-standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research will build upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI-performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2005). Key to reach the research objectives
Duran, Felicia A.; Camp, Allen L.; Apostolakis, George E.; Golay, Michael W.
This paper summarizes the development of a framework for risk-based regulation and design for new nuclear power plants. Probabilistic risk assessment methods and a rationalist approach to defense in depth are used to develop a framework that can be applied to identify systematically the regulations and standards required to maintain the desired level of safety and reliability. By implementing such a framework, it is expected that the resulting body of requirements will provide a regulatory environment that will ensure protection of the public, will eliminate the burden of requirements that do not contribute significantly to safety, and thereby will improve the market competitiveness of new plants. (author)
Lee, Woongryol; Park, Mikyung; Lee, Taegu; Lee, Sangil; Yun, Sangwon; Park, Jinseop; Park, Kaprai
Highlights: • We performed a standardization of the control system in KSTAR. • An EPICS-based software framework was developed for the realization of various control systems. • The applicability of the framework has been widened from a simple command dispatcher to real-time applications. • Our framework supports the implementation of an embedded IOC on an FPGA board. - Abstract: Standardization of the control system is an important issue in KSTAR, which is organized from various heterogeneous systems. Diverse control systems in KSTAR have been adopting new application software since 2010. Development of this software was launched for easy implementation of a data acquisition system, but it has been extended into a Standard Framework (SFW) for control systems in KSTAR. It is composed of a single library, a database, templates, and descriptor files. SFW-based controllers share common features: a non-blocking control command method using a thread, an internal sequence handler that allows synchronization with KSTAR experiments, and a ring buffer pool mechanism for handling streaming input data. Recently, there have been two important functional improvements in the framework. A processor-embedded FPGA was proposed as a standard hardware platform for specific applications; these are likewise driven by SFW-based embedded applications. This approach gives a single-board system the ability to perform low-level distributed control under the EPICS environment. We also developed a real-time monitoring system as a real-time network inspection tool for the 2012 campaign using the SFW.
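The ring buffer mentioned for streaming input data is a standard pattern: a fixed-size buffer where the newest sample overwrites the oldest once capacity is reached. The following is an illustrative sketch only, not the KSTAR framework's actual implementation; capacities and sample values are invented.

```python
# Illustrative ring buffer for streaming input data. This is a generic
# sketch of the pattern, not the SFW's real (EPICS/C-based) code.
class RingBuffer:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.capacity = capacity
        self.head = 0      # next write position
        self.count = 0     # number of valid items (<= capacity)

    def push(self, item):
        """Write one sample; the oldest sample is overwritten when full."""
        self.buf[self.head] = item
        self.head = (self.head + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def latest(self, n):
        """Return the n most recent samples, oldest first."""
        n = min(n, self.count)
        start = (self.head - n) % self.capacity
        return [self.buf[(start + i) % self.capacity] for i in range(n)]

rb = RingBuffer(4)
for sample in range(7):   # stream 7 samples into 4 slots
    rb.push(sample)
print(rb.latest(4))       # [3, 4, 5, 6]: the three oldest samples were dropped
```

Because writes are O(1) and never block on a full buffer, this structure suits continuous acquisition where a slow consumer must not stall the producer.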
Madan, Hennadii; Pernuš, Franjo; Likar, Boštjan; Špiclin, Žiga
Advanced image-guided medical procedures incorporate 2D intra-interventional information into the pre-interventional 3D image and procedure plan through 3D/2D image registration (32R). To enter clinical use, and even for publication purposes, novel and existing 32R methods have to be rigorously validated. The performance of a 32R method can be estimated by comparing it to an accurate reference or gold standard method (usually based on fiducial markers) on the same set of images (a gold standard dataset). Objective validation and comparison of methods are possible only if the evaluation methodology is standardized and the gold standard dataset is made publicly available. Currently, very few such datasets exist and only one contains images of multiple patients acquired during a procedure. To encourage the creation of gold standard 32R datasets, we propose an automatic framework. The framework is based on rigid registration of fiducial markers. The main novelty is spatial grouping of fiducial markers on the carrier device, which enables automatic marker localization and identification across the 3D and 2D images. The proposed framework was demonstrated on clinical angiograms of 20 patients. Rigid 32R computed by the framework was more accurate than that obtained manually, with respective target registration errors below 0.027 mm compared to 0.040 mm. The framework is applicable for gold standard setup on any rigid anatomy, provided that the acquired images contain spatially grouped fiducial markers. The gold standard datasets and software will be made publicly available.
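The core computation in fiducial-based gold-standard setup, rigid registration of paired marker positions, is classically solved in closed form by the Kabsch/Procrustes method. A hedged sketch under synthetic data follows; the marker coordinates and motion are invented, and this is the generic algorithm, not the paper's full framework (which additionally localizes and identifies markers automatically).

```python
# Kabsch/Procrustes rigid registration of paired fiducial markers.
# Generic textbook method with synthetic data; not the paper's code.
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic "marker" coordinates and a known ground-truth rigid motion.
rng = np.random.default_rng(0)
markers = rng.random((6, 3)) * 100.0
angle = np.deg2rad(10.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 1.0])
moved = markers @ R_true.T + t_true

R, t = rigid_register(markers, moved)
# Mean residual over the markers, analogous in spirit to a target
# registration error; essentially zero for noise-free points.
tre = np.linalg.norm(markers @ R.T + t - moved, axis=1).mean()
print(tre < 1e-6)  # True
```

With noisy marker localizations the residual becomes nonzero, which is why validation papers report target registration errors at sub-millimetre scales rather than exact zeros.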
Kiilerich Pratas, Nuno; Thomsen, Henning; Popovski, Petar
In this chapter, we describe and discuss the current LTE random access procedure and the Radio Access Network Load Control solution within LTE/LTE-A. We provide an overview of the several load control solutions considered and give a detailed description of the standardized Extended Access Class B...
Zhu, Hongwei; Wu, Harris
The primary purpose of data standards or metadata schemas is to improve the interoperability of data created by multiple standard users. Given the high cost of developing data standards, it is desirable to assess the quality of data standards. We develop a set of metrics and a framework for assessing data standard quality. The metrics include completeness and relevancy. Standard quality can also be indirectly measured by assessing interoperability of data instances. We evaluate the framework using data from the financial sector: the XBRL (eXtensible Business Reporting Language) GAAP (Generally Accepted Accounting Principles) taxonomy and US Securities and Exchange Commission (SEC) filings produced using the taxonomy by approximately 500 companies. The results show that the framework is useful and effective. Our analysis also reveals quality issues of the GAAP taxonomy and provides useful feedback to taxonomy users. The SEC has mandated that all publicly listed companies must submit their filings using XBRL. Our findings are timely and have practical implications that will ultimately help improve the quality of financial data.
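The two metrics named above, completeness and relevancy, can be illustrated as simple set-overlap ratios. The definitions below are assumptions for the sketch (the paper's exact formulations may differ), and the taxonomy and filing elements are invented.

```python
# Hedged sketch of two data-standard quality metrics as set overlaps.
# Definitions are illustrative assumptions, not the paper's formulas.

def completeness(standard_elements, needed_elements):
    """Share of the elements users need that the standard defines."""
    needed = set(needed_elements)
    return len(needed & set(standard_elements)) / len(needed)

def relevancy(standard_elements, used_elements):
    """Share of the standard's elements that filings actually use."""
    standard = set(standard_elements)
    return len(standard & set(used_elements)) / len(standard)

# Toy taxonomy vs. toy filings (made-up element names).
taxonomy = {"Assets", "Liabilities", "Revenue", "Goodwill"}
needed = {"Assets", "Liabilities", "Revenue", "EBITDA"}
used = {"Assets", "Revenue"}

print(completeness(taxonomy, needed))  # 0.75: EBITDA is needed but undefined
print(relevancy(taxonomy, used))       # 0.5: half the taxonomy goes unused
```

Low completeness flags gaps users must work around with extensions; low relevancy flags bloat, which is the kind of feedback the evaluation gives taxonomy maintainers.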
Gabbard, Anita; Mupinga, Davison M.
Community colleges act as the gateway for students to higher education. Many of these colleges pursue this mission through open-door policies whereby students lacking basic reading, writing, and mathematics skills can enroll. But this open-access policy often creates challenges in meeting academic standards. Based on data collected from…
Full Text Available A general risk management standard, e.g. ISO 31000:2009, approaches risk as a coin with two sides: the threat and the opportunity. However, that is hardly the case for flood events, which come mainly as threats. Nevertheless, this study explores the potential applicability of the available risk management standards specifically for flood. It then synthesizes the components into a framework for allocating resources among various strategies to achieve the optimum flood risk reduction. The framework's applicability is then reviewed using several historic flood risk reduction cases; its results are qualitatively discussed and summarized, including possible improvements of the framework for further applications.
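The allocation idea, spending a fixed budget across mitigation strategies for maximum risk reduction, can be sketched with a simple greedy heuristic. Everything below is invented for illustration: the strategy names, costs, and benefit figures are made up, and a greedy benefit-per-cost rule is one plausible heuristic, not the paper's method and not guaranteed optimal.

```python
# Hedged illustration of budget allocation across flood-mitigation
# strategies. Data and the greedy heuristic are assumptions for the
# sketch; the paper's framework may allocate differently.

def allocate(strategies, budget):
    """Pick strategies greedily by risk reduction per unit cost."""
    chosen, remaining = [], budget
    for name, cost, reduction in sorted(
            strategies, key=lambda s: s[2] / s[1], reverse=True):
        if cost <= remaining:
            chosen.append(name)
            remaining -= cost
    return chosen, budget - remaining

strategies = [
    # (name, cost, expected risk reduction) in arbitrary units
    ("levee upgrade",    60.0, 30.0),
    ("early warning",    10.0, 15.0),
    ("land-use zoning",  25.0, 20.0),
    ("retention basins", 40.0, 25.0),
]
chosen, spent = allocate(strategies, budget=100.0)
print(chosen, spent)  # ['early warning', 'land-use zoning', 'retention basins'] 75.0
```

The levee upgrade is skipped despite its large absolute benefit because its benefit-per-cost ratio is the lowest, which is exactly the trade-off a portfolio-style flood risk framework must make explicit.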
Andres, Ellie; Baird, Sarah; Bingenheimer, Jeffrey Bart; Markus, Anne Rossier
Background Maternity leave is integral to postpartum maternal and child health, providing necessary time to heal and bond following birth. However, the relationship between maternity leave and health outcomes has not been formally and comprehensively assessed to guide public health research and policy in this area. This review aims to address this gap by investigating both the correlates of maternity leave utilization in the US and the related health benefits for mother and child. Methods We searched the peer-reviewed scholarly literature using six databases for the years 1990 to early 2015 and identified 37 studies to be included in the review. We extracted key data for each of the included studies and assessed study quality using the "Weight of the Evidence" approach. Results The literature generally confirms a positive, though limited correlation between maternity leave coverage and utilization. Likewise, longer maternity leaves are associated with improved breastfeeding intentions and rates of initiation, duration and predominance as well as improved maternal mental health and early childhood outcomes. However, the literature points to important disparities in access to maternity leave that carry over into health outcomes, such as breastfeeding. Synthesis We present a conceptual framework synthesizing what is known to date related to maternity leave access and health outcomes.
...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft): notice announcing that the draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0), has been released for public review and comment.
Craig F. Berning
Full Text Available The debate as to whether to require mandatory labeling of genetically modified organism (GMO) foods was partially settled on 29 July 2016, when President Obama signed the National Bioengineered Food Disclosure Standard into public law. In contrast to precipitating legislation passed by the State of Vermont that required disclosure of GMO ingredients on food shelves or food packages, the superseding National Standard allows firms to disclose bioengineered ingredients to consumers via symbols, electronic or digital links, or phone numbers, and further requires a study assessing the ability of consumers to access disclosure information by these means. This communication analyzes survey responses from 525 adults to investigate whether U.S. consumers are able to obtain information via the disclosure methods allowed in the Federal legislation. The survey probes deeper to investigate consumer perceptions of genetically modified organisms and whether consumers would use the tools available to access disclosure about bioengineered ingredients. Findings from the survey show that 93.8% of respondents have the ability to access information via the disclosure methods permitted. Those in the lowest income group and the oldest age group are least likely to have such access. This provides the United States Department of Agriculture with information relevant to how it can implement the law and highlights particular demographic segments that may require additional attention to ensure the disclosed information is universally accessible.
The National Academy of Sciences has created a committee of 18 National Academy of Science and Engineering members, academic scientists, cognitive and learning scientists, and educators, educational policymakers and researchers to develop a framework to guide new K-12 science education standards. The committee began its work in January 2010, released a draft of the framework in July 2010, and intends to have the final framework ready in the first quarter of 2011. The committee was helped in early phases of the work by consultant design teams. The framework is designed to help realize a vision for science and engineering education in which all students actively engage in science and engineering practices in order to deepen their understanding of core ideas in science over multiple years of school. These three dimensions - core disciplinary ideas, science and engineering practices, and cross-cutting elements - must blend together to build an exciting, relevant, and forward-looking science education. The framework will be used as a base for development of next generation K-12 science education standards.
Goodman, Julie E; Prueitt, Robyn L; Sax, Sonja N; Bailey, Lisa A; Rhomberg, Lorenz R
Abstract A scientifically sound assessment of the potential hazards associated with a substance requires a systematic, objective and transparent evaluation of the weight of evidence (WoE) for causality of health effects. We critically evaluated the current WoE framework for causal determination used in the United States Environmental Protection Agency's (EPA's) assessments of the scientific data on air pollutants for the National Ambient Air Quality Standards (NAAQS) review process, including its methods for literature searches; study selection, evaluation and integration; and causal judgments. The causal framework used in recent NAAQS evaluations has many valuable features, but it could be more explicit in some cases, and some features are missing that should be included in every WoE evaluation. Because of this, it has not always been applied consistently in evaluations of causality, leading to conclusions that are not always supported by the overall WoE, as we demonstrate using EPA's ozone Integrated Science Assessment as a case study. We propose additions to the NAAQS causal framework based on best practices gleaned from a previously conducted survey of available WoE frameworks. A revision of the NAAQS causal framework so that it more closely aligns with these best practices and the full and consistent application of the framework will improve future assessments of the potential health effects of criteria air pollutants by making the assessments more thorough, transparent, and scientifically sound.
Duda, Catherine; Rajaram, Kumar; Barz, Christiane; Rosenthal, J Thomas
There has been an increasing emphasis on health care efficiency and costs and on improving quality in health care settings such as hospitals or clinics. However, there has not been sufficient work on methods of improving access and customer service times in health care settings. The study develops a framework for improving access and customer service time for health care settings. In the framework, the operational concept of the bottleneck is synthesized with queuing theory to improve access and reduce customer service times without reduction in clinical quality. The framework is applied at the Ronald Reagan UCLA Medical Center to determine the drivers for access and customer service times and then provides guidelines on how to improve these drivers. Validation using simulation techniques shows significant potential for reducing customer service times and increasing access at this institution. Finally, the study provides several practice implications that could be used to improve access and customer service times without reduction in clinical quality across a range of health care settings from large hospitals to small community clinics.
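The bottleneck-plus-queuing idea described above can be sketched in a few lines. The stages, arrival rates, and service rates below are hypothetical illustration values (not data from the UCLA study); each stage is modeled as an independent M/M/1 queue, whose mean time in system is 1/(mu - lambda).

```python
# Minimal sketch: identify the bottleneck stage in a clinic modeled as a
# series of independent M/M/1 queues. All rates below are hypothetical
# illustration values, not data from the study.

def mm1_wait(arrival_rate, service_rate):
    """Mean time in system (wait + service) for an M/M/1 queue: 1/(mu - lambda)."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# Stages of a hypothetical patient visit (patients/hour).
stages = {
    "check-in":  {"arrival": 10.0, "service": 12.0},
    "triage":    {"arrival": 10.0, "service": 11.0},
    "physician": {"arrival": 10.0, "service": 10.5},
}

waits = {name: mm1_wait(s["arrival"], s["service"]) for name, s in stages.items()}
bottleneck = max(waits, key=waits.get)

for name, w in waits.items():
    print(f"{name}: {w * 60:.1f} min in system")
print("bottleneck:", bottleneck)  # the stage with the least spare capacity
```

The stage with the smallest gap between service and arrival rate dominates total time in system, which is the operational sense of "bottleneck" used in the framework.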
Kaiser, Mary Elizabeth; Morris, Matthew; Aldoroty, Lauren; Kurucz, Robert; McCandliss, Stephan; Rauscher, Bernard; Kimble, Randy; Kruk, Jeffrey; Wright, Edward L.; Feldman, Paul; Riess, Adam; Gardner, Jonathon; Bohlin, Ralph; Deustua, Susana; Dixon, Van; Sahnow, David J.; Perlmutter, Saul
Establishing improved spectrophotometric standards is important for a broad range of missions and is relevant to many astrophysical problems. Systematic errors associated with astrophysical data used to probe fundamental astrophysical questions, such as SNeIa observations used to constrain dark energy theories, now exceed the statistical errors associated with merged databases of these measurements. ACCESS, “Absolute Color Calibration Experiment for Standard Stars”, is a series of rocket-borne sub-orbital missions and ground-based experiments designed to enable improvements in the precision of the astrophysical flux scale through the transfer of absolute laboratory detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35‑1.7μm bandpass. To achieve this goal, ACCESS (1) observes HST/Calspec stars (2) above the atmosphere to eliminate telluric spectral contaminants (e.g. OH) (3) using a single optical path and (HgCdTe) detector (4) that is calibrated to NIST laboratory standards and (5) monitored on the ground and in-flight using an on-board calibration monitor. The observations are (6) cross-checked and extended through the generation of stellar atmosphere models for the targets. The ACCESS telescope and spectrograph have been designed, fabricated, and integrated. Subsystems have been tested. Performance results for subsystems, operations testing, and the integrated spectrograph will be presented. NASA sounding rocket grant NNX17AC83G supports this work.
Raveh, Ira; Koichu, Boris; Peled, Irit; Zaslavsky, Orit
In this article we present an integrative framework of knowledge for teaching the standard algorithms of the four basic arithmetic operations. The framework is based on a mathematical analysis of the algorithms, a connectionist perspective on teaching mathematics and an analogy with previous frameworks of knowledge for teaching arithmetic…
Jacques Prefontaine; Jean Desrochers; Lise Godbout
The market turmoil that began in mid-2007 re-emphasized the importance of liquidity to the functioning of financial markets and the banking sector. In December 2009, the Basel Committee on Banking Supervision (BCBS) of the Bank for International Settlements (BIS) released a consultative document entitled: “International Framework for Liquidity Risk Measurement, Standards and Monitoring”. Interested parties were invited to provide written comments by April 16th 2010. Given our interest in prom...
Full Text Available South Africa’s performance in international benchmark tests is a major cause for concern amongst educators and policymakers, raising questions about the effectiveness of the curriculum reform efforts of the democratic era. The purpose of the study reported in this article was to investigate the degree of alignment between the TIMSS 2003 Grade 8 Mathematics assessment frameworks and the Revised National Curriculum Statements (RNCS) assessment standards for Grade 8 Mathematics, later revised to become the Curriculum and Assessment Policy Statements (CAPS). Such an investigation could help to partly shed light on why South African learners do not perform well and point out discrepancies that need to be attended to. The methodology of document analysis was adopted for the study, with the RNCS and the TIMSS 2003 Grade 8 Mathematics frameworks forming the principal documents. Porter’s moderately complex index of alignment was adopted for its simplicity. The computed index of 0.751 for the alignment between the RNCS assessment standards and the TIMSS assessment objectives was found to be statistically significantly low at the alpha level of 0.05, according to Fulmer’s critical values for 20 cells and 90 or 120 standard points. The study suggests that inadequate attention has been paid to the alignment of the South African mathematics curriculum to the successive TIMSS assessment frameworks in terms of the cognitive level descriptions. The study recommends that participation in TIMSS should rigorously and critically inform ongoing curriculum reform efforts.
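Porter's alignment index mentioned above has a simple closed form: convert each framework's (topic x cognitive level) matrix to cell proportions and compute 1 - (sum of |x_i - y_i|)/2. The matrices below are made-up illustration values, not the actual RNCS/TIMSS data.

```python
# Hedged sketch of Porter's alignment index between two assessment
# frameworks. Each matrix cell holds the number of items (or standard
# points) coded to a (topic, cognitive level) cell; these matrices are
# invented for illustration only.

def porter_alignment(matrix_a, matrix_b):
    """Porter's index: 1 - (sum of |proportion differences|) / 2.

    Ranges from 0 (no alignment) to 1 (perfect alignment)."""
    total_a = sum(sum(row) for row in matrix_a)
    total_b = sum(sum(row) for row in matrix_b)
    diff = sum(
        abs(a / total_a - b / total_b)
        for row_a, row_b in zip(matrix_a, matrix_b)
        for a, b in zip(row_a, row_b)
    )
    return 1.0 - diff / 2.0

# Rows: content topics; columns: cognitive levels (knowing/applying/reasoning).
curriculum = [[6, 3, 1], [4, 4, 2]]   # 20 standard points
assessment = [[5, 4, 1], [3, 5, 2]]   # 20 assessment items

print(round(porter_alignment(curriculum, assessment), 3))
```

Because the index works on proportions, the two documents need not contain the same number of items, only the same cell structure.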
Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.
Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. As a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), the open source libLAS and LASlib libraries provide OpenTopography data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL). OpenTopography has also developed its own algorithm for high-performance gridding of lidar point cloud data.
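As an illustration of the gridding task described above (not OpenTopography's actual high-performance algorithm), a minimal binning approach assigns each grid cell the mean elevation of the points falling inside it:

```python
# Illustrative sketch of gridding a lidar point cloud into a raster by
# local binning: each grid cell gets the mean elevation of the points
# that fall inside it. The sample points are hypothetical.

def grid_points(points, cell_size):
    """points: iterable of (x, y, z) tuples; returns {(col, row): mean z}."""
    sums, counts = {}, {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        sums[key] = sums.get(key, 0.0) + z
        counts[key] = counts.get(key, 0) + 1
    return {key: sums[key] / counts[key] for key in sums}

# Hypothetical points (metres): x, y, elevation.
cloud = [(0.2, 0.3, 100.0), (0.8, 0.1, 102.0), (1.5, 0.4, 110.0)]
raster = grid_points(cloud, cell_size=1.0)
print(raster)  # → {(0, 0): 101.0, (1, 0): 110.0}
```

Production systems add interpolation for empty cells and stream points from LAS files rather than holding them in memory, but the cell-binning core is the same.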
Mueller, Sandra R; Wäger, Patrick A; Turner, David A; Shaw, Peter J; Williams, Ian D
An increasing number of geochemically scarce metallic raw materials are entering into our lives via new technologies. A reversal of this trend is not foreseeable, leading to concerns regarding the security of their supply. However, the evaluation of raw material supply is currently hampered by inconsistent use of fundamental terminologies and incomplete assessment criteria. In this paper, we aim to establish a consistent framework for evaluating raw material supply from both anthropogenic and geological sources. A method for concept extraction was applied to evaluate systematically the use of fundamental terms in the evaluation of raw material supply. The results have shown that 'availability' is commonly used in raw material supply evaluations, whilst other researchers suggest that raw material supply should be evaluated based on 'accessibility'. It was revealed that 'accessibility' actually comprises two aspects: 'availability' and 'approachability'. Raw material 'approachability' has not previously been explicitly addressed at a system level. A novel, consistent framework for evaluating raw material supply was therefore developed. To demonstrate the application of the established framework, we evaluated the raw material supply of four rare earth element case studies. Three case studies are End-of-Life products (the anthroposphere) from Switzerland: (i) phosphors in fluorescent lamps, (ii) permanent magnets in the drive motors of electric cars and (iii) fibre optic cable. The fourth case study source is the Earth's crust (the geosphere): Mount Weld deposit in Australia. The framework comprises a comprehensive evaluation of six components relating to raw material mining and processing: their geological knowledge, eligibility, technology, economic, societal and environmental impacts. Our results show that metals are not considered to be fully accessible in any of the case studies due to a lack of necessary technologies and potential societal and environmental
Backes, Michael; Bugiel, Sven; Gerling, Sebastian; von Styp-Rekowsky, Philipp
We introduce the Android Security Framework (ASF), a generic, extensible security framework for Android that enables the development and integration of a wide spectrum of security models in form of code-based security modules. The design of ASF reflects lessons learned from the literature on established security frameworks (such as Linux Security Modules or the BSD MAC Framework) and intertwines them with the particular requirements and challenges from the design of Android's software stack. ...
O'Connell, Jane; Gardner, Glenn; Coyer, Fiona
This paper presents a discussion on the application of a capability framework for advanced practice nursing standards/competencies. There is acceptance that competencies are useful and necessary for definition and education of practice-based professions. Competencies have been described as appropriate for practice in stable environments with familiar problems. Increasingly competencies are being designed for use in the health sector for advanced practice such as the nurse practitioner role. Nurse practitioners work in environments and roles that are dynamic and unpredictable necessitating attributes and skills to practice at advanced and extended levels in both familiar and unfamiliar clinical situations. Capability has been described as the combination of skills, knowledge, values and self-esteem which enables individuals to manage change, be flexible and move beyond competency. A discussion paper exploring 'capability' as a framework for advanced nursing practice standards. Data were sourced from electronic databases as described in the background section. As advanced practice nursing becomes more established and formalized, novel ways of teaching and assessing the practice of experienced clinicians beyond competency are imperative for the changing context of health services. Leading researchers into capability in health care state that traditional education and training in health disciplines concentrates mainly on developing competence. To ensure that healthcare delivery keeps pace with increasing demand and a continuously changing context there is a need to embrace capability as a framework for advanced practice and education. © 2014 John Wiley & Sons Ltd.
Greco, Ernesto; Barriuso, Clemente; Castro, Miguel Angel; Fita, Guillermina; Pomar, José L
Port-Access surgery has been one of the most innovative and controversial methods in the spectrum of minimally invasive techniques for cardiac operations and has been widely used for the treatment of several cardiac diseases. The technique was introduced in our center to evaluate its efficacy in reproducing standardized results without an additional risk. Endovascular cardiopulmonary bypass (CPB) through femoral access and endoluminal aortic occlusion were used in 129 patients for a variety of surgical procedures, all of which were video-assisted. A minimal (4-6 cm) anterior thoracotomy through the fourth intercostal space was used in all cases as the surgical approach. More than 96% of the planned cases concluded as true Port-Access procedures. Mean CPB and cross-clamp times were 87.2 min. +/- 51.2 (range of 10-457) and 54.9 min. +/- 30.6 (range of 10-190), respectively. Hospital mortality for the overall group was 1.5%, and mitral valve surgery had a 2.2% hospital death rate. The incidence of early neurological events was 0.7%. Mean extubation time, ICU stay, and total length of hospital stay were 5 hours +/- 6 hrs. (range of 2-32), 12 hours +/- 11.8 hrs. (range of 5-78), and 7 days +/- 7.03 days (range of 1-72), respectively. Our experience indicates that the Port-Access technique is safe and permits reproduction of standardized results with the use of a very limited surgical approach. We are convinced that this is a superior procedure for certain types of surgery, including isolated primary or redo mitral surgery, repair of a variety of atrial septal defects (ASDs), and atrial tumors. It is especially useful in high-risk patients, such as elderly patients or those requiring reoperation. Simplification of the procedure is nevertheless desirable in order to further reduce the time of operation and to address other drawbacks.
Full Text Available This paper presents the SEMAINE API, an open source framework for building emotion-oriented systems. By encouraging and simplifying the use of standard representation formats, the framework aims to contribute to interoperability and reuse of system components in the research community. By providing a Java and C++ wrapper around a message-oriented middleware, the API makes it easy to integrate components running on different operating systems and written in different programming languages. The SEMAINE system 1.0 is presented as an example of a full-scale system built on top of the SEMAINE API. Three small example systems are described in detail to illustrate how integration between existing and new components is realised with minimal effort.
Pons Rotger, Gabriel Angel; Nielsen, Thomas Alexander Sick
Proximity to the metro increases the probability of long commutes (> 4 km) and decreases the probability of short commutes. Comparing men and women, it is mainly women that are affected by the accessibility gain and commute longer distances in response to proximity to the metro. Comparing older and younger commuters, it is mainly the older commuters that respond to the increased accessibility offered by metro access, by commuting longer distances. Comparing income groups, a considerably stronger response to the increased accessibility is seen in the highest-earning and presumably most skilled group. Comparing commuting responses to metro access grouped by the past commuting behaviour of the respondents indicates a positive effect of proximity.
Mense, Alexander; Urbauer, Philipp; Sauermann, Stefan
The adoption of the Internet of Things (IoT) and mobile applications in healthcare may transform the healthcare industry by offering better disease tracking and management as well as patient empowerment. Unfortunately, almost all of these new systems set up their own ecosystem; to be really valuable for the care process, they need to be integrated or federated with user-managed access control services based on international standards and profiles to enable interoperability. Thus, this work presents the results of an evaluation of available specifications for federated authorization, based on a set of basic requirements.
Full Text Available The target of the Open Geospatial Consortium (OGC) is interoperability of geographic information, which means creating opportunities to access geodata in a consistent, standardized way. In the domain of sensor data, this target is picked up within the OGC Sensor Web Enablement initiative and especially reached through the Sensor Observation Service (SOS) standard, which defines a service for standardized access to time series data and is usually used for in situ sensors (like discharge gauges and climate stations). Although the standard considers raster data, no implementation of the standard for raster data exists presently. In this paper an OGC-compliant Sensor Observation Service for standardized access to raster data is described. A data model was developed that enables effective storage of the raster data with the corresponding metadata in a database, reading this data in an efficient way, and encoding it with the result formats that the SOS standard provides.
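A minimal sketch of how a client would query such a service: the SOS standard defines a key-value-pair binding for GetObservation, so a request is just a URL with well-known parameters. The endpoint, offering, and observed property below are placeholders, not a real deployment.

```python
# Sketch of a KVP GetObservation request against a hypothetical SOS
# endpoint. The URL and offering names are placeholders.
from urllib.parse import urlencode

def get_observation_url(endpoint, offering, observed_property, start, end):
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "eventTime": f"{start}/{end}",  # ISO 8601 time interval
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return endpoint + "?" + urlencode(params)

url = get_observation_url(
    "https://example.org/sos", "RADAR_PRECIPITATION", "precipitation",
    "2010-01-01T00:00:00Z", "2010-01-02T00:00:00Z",
)
print(url)
```

The service answers with an Observations & Measurements XML document; for a raster-backed SOS, the result block would carry the gridded values for the requested time interval.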
ZHAO Haitao; ZHANG Shaojie; Emiliano Garcia-Palacios
Densely deployed WiFi networks will play a crucial role in providing the capacity for next generation mobile internet. However, due to increasing interference, overlapped channels in WiFi networks and throughput efficiency degradation, dense deployment alone does not guarantee higher throughput. An emergent challenge is how to efficiently utilize scarce spectrum resources by matching physical layer resources to traffic demand. In this respect, access control allocation strategies play a pivotal role but remain too coarse-grained. As a solution, this research proposes a flexible framework for fine-grained channel width adaptation and multi-channel access in WiFi networks. This approach, named SFCA (Subcarrier Fine-grained Channel Access), adopts DOFDM (Discontinuous Orthogonal Frequency Division Multiplexing) at the PHY layer. It allocates the frequency resource with a subcarrier granularity, which facilitates channel width adaptation for multi-channel access and thus brings more flexibility and higher frequency efficiency. The MAC layer uses a frequency-time domain backoff scheme, which combines the popular time-domain BEB scheme with a frequency-domain backoff to decrease access collisions, resulting in a higher access probability for the contending nodes. SFCA is compared with FICA (an established access scheme), showing that it significantly outperforms the latter. Finally, we present results for next generation 802.11ac WiFi networks.
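The collision-reduction intuition behind the frequency-time domain backoff can be illustrated with a toy Monte-Carlo model (a simplification for illustration, not the SFCA protocol itself): each contender draws both a time slot and a frequency slot, and a collision occurs only when the earliest (time, frequency) pair is drawn by more than one node.

```python
# Toy model of frequency-time backoff: contenders draw a (time, frequency)
# pair; the earliest pair wins, and only an exact tie on that pair is a
# collision (same-time, different-frequency nodes use distinct subcarriers).
# Slot counts and node numbers are illustrative, not from the paper.
import random

def collision_prob(n_nodes, time_slots, freq_slots, trials=20000, seed=1):
    rng = random.Random(seed)
    collisions = 0
    for _ in range(trials):
        draws = [(rng.randrange(time_slots), rng.randrange(freq_slots))
                 for _ in range(n_nodes)]
        winner = min(draws)           # earliest (time, freq) pair wins
        if draws.count(winner) > 1:   # exact tie on the winning pair
            collisions += 1
    return collisions / trials

print(collision_prob(8, 16, 1))   # time-only backoff (BEB-like)
print(collision_prob(8, 16, 4))   # adding a frequency domain lowers collisions
```

Adding even a few frequency slots sharply reduces the chance that two winners share the same resource, which is the access-probability gain the abstract describes.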
D’Aucelli, Giuseppe Maria; Giaquinto, Nicola; Mannatrizio, Sabino; Savino, Mario
In this paper, a full-featured MATLAB framework for Measurement System Analysis, fully compliant with the ISO 5725 Repeatability and Reproducibility (R and R) assessment, is presented. While preserving the operations prescribed in the ISO standard, the software presents distinct improvements. First of all, all computations are made using exact closed-form formulae (instead of statistical tables), allowing a consistent analysis without limitations on the number of participating laboratories and measurements, and using custom significance levels of statistical tests. Second, a double-threshold decision system for each test step has been implemented, helping the statistician decide on the elimination of outliers/stragglers. Third, ANOVA analysis has been included. The software therefore, besides quickly and efficiently producing all the graphical and numerical results required in an inter-laboratory experiment, provides guidelines for properly updating the ISO 5725 standard.
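For a balanced experiment, the closed-form R and R computation reduces to a one-way ANOVA. The sketch below (not the authors' MATLAB code) estimates repeatability s_r from the within-laboratory mean square and reproducibility s_R by adding the between-laboratory variance component; the data are hypothetical.

```python
# Minimal sketch of ISO 5725-style repeatability (s_r) and reproducibility
# (s_R) for a balanced inter-laboratory experiment, via one-way ANOVA.
# Not the authors' software; data below are invented for illustration.
import math

def r_and_r(labs):
    """labs: list of equal-length lists of replicate measurements per lab."""
    p, n = len(labs), len(labs[0])
    lab_means = [sum(lab) / n for lab in labs]
    grand = sum(lab_means) / p
    # Within-lab mean square = repeatability variance estimate.
    ms_within = sum(
        sum((x - m) ** 2 for x in lab) for lab, m in zip(labs, lab_means)
    ) / (p * (n - 1))
    # Between-lab mean square.
    ms_between = n * sum((m - grand) ** 2 for m in lab_means) / (p - 1)
    s_r2 = ms_within
    s_L2 = max((ms_between - ms_within) / n, 0.0)  # clamp negative estimates
    return math.sqrt(s_r2), math.sqrt(s_r2 + s_L2)

# Hypothetical data: 3 laboratories, 3 replicates each.
s_r, s_R = r_and_r([[10.1, 10.2, 10.0], [10.4, 10.5, 10.3], [9.9, 10.0, 10.1]])
print(f"repeatability s_r = {s_r:.3f}, reproducibility s_R = {s_R:.3f}")
```

The clamp on the between-laboratory component mirrors standard practice when the ANOVA estimate of that variance comes out negative.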
Wees, M.T. van; Uyterlinde, M.A.; Maly, M.
The main barrier for end-use energy efficiency and renewable energy in the Czech Republic is the lack of a stable political and regulatory framework. Market incentives can only properly work if the market conditions and restrictions are clear and stable. However, no comprehensive policies and regulation have been implemented in the Czech Republic. Although the acquis communautaire of the European Union includes regulation on energy efficiency and renewable energy, this topic remains low on the negotiation agenda for accession. This paper reports on the current situation in the Czech Republic, including the potentials for end-use energy efficiency and renewable energy, on the existing policy and regulatory framework, and on the remaining gaps with the requirements of accession to the European Union. Also, the impact of the recent increase of nuclear capacity on energy efficiency and renewable energy in the Czech Republic is discussed
Qiu, Junchao; Zhang, Lin; Li, Diyang; Liu, Xingcheng
Chaotic sequences can be applied to realize multiple user access and improve the system security for a visible light communication (VLC) system. However, since the map patterns of chaotic sequences are usually well known, eavesdroppers can possibly derive the key parameters of chaotic sequences and subsequently retrieve the information. We design an advanced encryption standard (AES) interleaving aided multiple user access scheme to enhance the security of a chaotic code division multiple access-based visible light communication (C-CDMA-VLC) system. We propose to spread the information with chaotic sequences, and then the spread information is interleaved by an AES algorithm and transmitted over VLC channels. Since the computation complexity of performing inverse operations to deinterleave the information is high, the eavesdroppers in a high speed VLC system cannot retrieve the information in real time; thus, the system security will be enhanced. Moreover, we build a mathematical model for the AES-aided VLC system and derive the theoretical information leakage to analyze the system security. The simulations are performed over VLC channels, and the results demonstrate the effectiveness and high security of our presented AES interleaving aided chaotic CDMA-VLC system.
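The chaotic spreading step can be sketched with a logistic map whose iterates are thresholded into ±1 chips; the AES interleaving stage is only indicated in a comment, since a faithful version would use a real AES implementation. Seeds and parameters below are illustrative, not from the paper.

```python
# Conceptual sketch of chaotic spreading for a CDMA-style link. The AES
# interleaving stage described in the paper is omitted here (it would
# permute the chip stream under an AES keystream before transmission).
# Logistic map: x_{k+1} = r * x_k * (1 - x_k), thresholded to ±1 chips.

def logistic_chips(seed, n, r=3.99):
    x, chips = seed, []
    for _ in range(n):
        x = r * x * (1 - x)
        chips.append(1 if x >= 0.5 else -1)
    return chips

def spread(bits, chips_per_bit, seed):
    chips = logistic_chips(seed, len(bits) * chips_per_bit)
    out = []
    for i, b in enumerate(bits):
        sign = 1 if b else -1
        out.extend(sign * c for c in chips[i * chips_per_bit:(i + 1) * chips_per_bit])
    return out

def despread(signal, chips_per_bit, seed):
    chips = logistic_chips(seed, len(signal))
    bits = []
    for i in range(len(signal) // chips_per_bit):
        corr = sum(s * c for s, c in zip(
            signal[i * chips_per_bit:(i + 1) * chips_per_bit],
            chips[i * chips_per_bit:(i + 1) * chips_per_bit]))
        bits.append(1 if corr > 0 else 0)
    return bits

tx = spread([1, 0, 1, 1], chips_per_bit=8, seed=0.3141)
print(despread(tx, chips_per_bit=8, seed=0.3141))  # recovers [1, 0, 1, 1]
```

Only a receiver holding the exact seed (and map parameter) regenerates the chip sequence; this seed sensitivity is what the AES interleaving layer then hardens against map-parameter estimation by eavesdroppers.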
YANG Chang; CHEN Xiaolin; ZHANG Huanguo
The current multicast model provides no access control mechanism. Any host can send data directly to a multicast address or join a multicast group to become a member, which poses security problems for multicast. In this paper, we present a new active multicast group access control mechanism founded on trust management. This structure solves the problems of member access control and authorization distribution that exist in traditional IP multicast.
Lucido, J. M.
Scientists in the fields of hydrology, geophysics, and climatology are increasingly using the vast quantity of publicly-available data to address broadly-scoped scientific questions. For example, researchers studying contamination of nearshore waters could use a combination of radar indicated precipitation, modeled water currents, and various sources of in-situ monitoring data to predict water quality near a beach. In discovering, gathering, visualizing and analyzing potentially useful data sets, data portals have become invaluable tools. The most effective data portals often aggregate distributed data sets seamlessly and allow multiple avenues for accessing the underlying data, facilitated by the use of open standards. Additionally, adequate metadata are necessary for attribution, documentation of provenance and relating data sets to one another. Metadata also enable thematic, geospatial and temporal indexing of data sets and entities. Furthermore, effective portals make use of common vocabularies for scientific methods, units of measure, geologic features, chemical, and biological constituents as they allow investigators to correctly interpret and utilize data from external sources. One application that employs these principles is the National Ground Water Monitoring Network (NGWMN) Data Portal (http://cida.usgs.gov/ngwmn), which makes groundwater data from distributed data providers available through a single, publicly accessible web application by mediating and aggregating native data exposed via web services on-the-fly into Open Geospatial Consortium (OGC) compliant service output. That output may be accessed either through the map-based user interface or through the aforementioned OGC web services. Furthermore, the Geo Data Portal (http://cida.usgs.gov/climate/gdp/), which is a system that provides users with data access, subsetting and geospatial processing of large and complex climate and land use data, exemplifies the application of International Standards
Huang, Wen-Yen; Hung, Weiteng; Vu, Chi Thanh; Chen, Wei-Ting; Lai, Jhih-Wei; Lin, Chitsan
Taiwan has a large number of poorly managed contaminated sites in need of remediation. This study proposes a framework, a set of standards, and a spreadsheet-based evaluation tool for implementing green and sustainable principles into remediation projects and evaluating the projects from this perspective. We performed a case study to understand how the framework would be applied. For the case study, we used a spreadsheet-based evaluation tool (SEFA) and performed field scale cultivation tests on a site contaminated with total petroleum hydrocarbons (TPHs). The site was divided into two lots: one treated by chemical oxidation and the other by bioremediation. We evaluated five core elements of green and sustainable remediation (GSR): energy, air, water resources, materials and wastes, and land and ecosystem. The proposed evaluation tool and field scale cultivation test were found to efficiently assess the effectiveness of the two remediation alternatives. The framework and related tools proposed herein can potentially be used to support decisions about the remediation of contaminated sites taking into account engineering management, cost effectiveness, and social reconciliation.
Poff, N.L.; Richter, B.D.; Arthington, A.H.; Bunn, S.E.; Naiman, R.J.; Kendy, E.; Acreman, M.; Apse, C.; Bledsoe, B.P.; Freeman, Mary C.; Henriksen, J.; Jacobson, R.B.; Kennen, J.G.; Merritt, D.M.; O'Keeffe, J. H.; Olden, J.D.; Rogers, K.; Tharme, R.E.; Warner, A.
1. The flow regime is a primary determinant of the structure and function of aquatic and riparian ecosystems for streams and rivers. Hydrologic alteration has impaired riverine ecosystems on a global scale, and the pace and intensity of human development greatly exceeds the ability of scientists to assess the effects on a river-by-river basis. Current scientific understanding of hydrologic controls on riverine ecosystems and experience gained from individual river studies support development of environmental flow standards at the regional scale. 2. This paper presents a consensus view from a group of international scientists on a new framework for assessing environmental flow needs for many streams and rivers simultaneously to foster development and implementation of environmental flow standards at the regional scale. This framework, the ecological limits of hydrologic alteration (ELOHA), is a synthesis of a number of existing hydrologic techniques and environmental flow methods that are currently being used to various degrees and that can support comprehensive regional flow management. The flexible approach allows scientists, water-resource managers and stakeholders to analyse and synthesise available scientific information into ecologically based and socially acceptable goals and standards for management of environmental flows. 3. The ELOHA framework includes the synthesis of existing hydrologic and ecological databases from many rivers within a user-defined region to develop scientifically defensible and empirically testable relationships between flow alteration and ecological responses. These relationships serve as the basis for the societally driven process of developing regional flow standards. This is to be achieved by first using hydrologic modelling to build a 'hydrologic foundation' of baseline and current hydrographs for stream and river segments throughout the region. Second, using a set of ecologically relevant flow variables, river segments within the
This article discusses the issue of social enterprises gaining access to public procurement processes and contracts at the EU and national level. It primarily examines the opportunities for social enterprises to access public procurement contracts provided for in the Public Procurement Directive
Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen
In recent years, a growing number of funding agencies and professional societies have issued policies calling for open access to research data. The Rolling Deck to Repository (R2R) program is working to ensure open access to the environmental sensor data routinely acquired by the U.S. academic research fleet. Currently 25 vessels deliver 7 terabytes of data to R2R each year, acquired from a suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. R2R is working to ensure these data are preserved in trusted repositories, discoverable via standard protocols, and adequately documented for reuse. R2R maintains a master catalog of cruises for the U.S. academic research fleet, currently holding essential documentation for over 3,800 expeditions including vessel and cruise identifiers, start/end dates and ports, project titles and funding awards, science parties, dataset inventories with instrument types and file formats, data quality assessments, and links to related content at other repositories. A Digital Object Identifier (DOI) is published for 1) each cruise, 2) each original field sensor dataset, 3) each post-field data product such as quality-controlled shiptrack navigation produced by the R2R program, and 4) each document such as a cruise report submitted by the science party. Scientists are linked to personal identifiers, such as the Open Researcher and Contributor ID (ORCID), where known. Using standard global identifiers such as DOIs and ORCIDs facilitates linking with journal publications and generation of citation metrics. Since its inception, the R2R program has worked in close collaboration with other data repositories in the development of shared semantics for oceanographic research. The R2R cruise catalog uses community-standard terms and definitions hosted by the NERC Vocabulary Server, and publishes ISO metadata records for each cruise that use community-standard profiles developed with the NOAA Data
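The identifier linking described above (cruise DOIs, dataset DOIs, ORCIDs) can be sketched as a small record builder. All field names, the DOI prefix pattern, and the example values are illustrative assumptions, not the actual R2R catalog schema.

```python
# Sketch: a cruise catalog entry that links persistent identifiers,
# loosely modeled on the R2R description above. Hypothetical schema.

def make_cruise_record(cruise_id, vessel, start, end, datasets, science_party):
    """Assemble a catalog entry; each dataset and person carries its own PID."""
    return {
        "cruise_id": cruise_id,
        "cruise_doi": f"10.7284/{cruise_id}",  # assumed DOI pattern, for illustration
        "vessel": vessel,
        "dates": {"start": start, "end": end},
        "datasets": [
            {"instrument": inst, "format": fmt, "doi": f"10.7284/{cruise_id}-{i}"}
            for i, (inst, fmt) in enumerate(datasets)
        ],
        "science_party": [{"name": n, "orcid": o} for n, o in science_party],
    }

record = make_cruise_record(
    "AB1234", "R/V Example", "2015-06-01", "2015-06-20",
    [("multibeam", "netCDF"), ("CTD", "csv")],
    [("Jane Doe", "0000-0002-1825-0097")],
)
```

Keeping every dataset and contributor behind its own global identifier is what makes the citation-metric linking mentioned in the abstract mechanical rather than manual.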
In order for a Mobile Device (MD) to support Licensed Shared Access (LSA), the MD should be reconfigurable, meaning that the configuration of the MD must be adaptively changed in accordance with the communication standard adopted in a given LSA system. Based on the standard architecture for reconfigurable MDs defined in Working Group (WG) 2 of the Technical Committee (TC) on Reconfigurable Radio Systems (RRS) of the European Telecommunications Standards Institute (ETSI), this paper presents a procedure to transfer control signals among the software entities of a reconfigurable MD required for implementing LSA. This paper also presents an implementation of a reconfigurable MD prototype that realizes the proposed procedure. The modem and Radio Frequency (RF) parts of the prototype MD are implemented with the NVIDIA GeForce GTX Titan Graphics Processing Unit (GPU) and the Universal Software Radio Peripheral (USRP) N210, respectively. With a preset scenario consisting of five time slots from different signal environments, we demonstrate the superior performance of the reconfigurable MD in comparison to a conventional nonreconfigurable MD in terms of the data receiving rate available in the LSA band at 2.3–2.4 GHz.
Brooks, Anthony Lewis
This contribution is timely as it addresses accessibility with regard to system hardware and software, aligned with the introduction of the Twenty-First Century Communications and Video Accessibility Act (CVAA) and the adjoined game industry waiver that comes into force in January 2017. This is an act created by the USA Federal Communications Commission (FCC) to increase the access of persons with disabilities to modern communications, and for other purposes. The act impacts advanced communications services and products including text messaging; e-mail; instant messaging; video communications; browsers; game platforms; and games software. However, the CVAA has no legal status in the EU. This text succinctly introduces and questions implications, impact, and wider adoption. By presenting the full CVAA and game industry waiver, the text aims to motivate discussions and further publications on the subject.
Eichler, M.; Francke, T.; Kneis, D.; Reusser, D.
Monitoring and modelling projects usually involve time series data originating from different sources. Often, file formats, temporal resolution and meta-data documentation rarely adhere to a common standard. As a result, much effort is spent on converting, harmonizing, merging, checking, resampling and reformatting these data. Moreover, in work groups or over the course of time, these tasks tend to be carried out redundantly and repeatedly, especially when new data become available. The resulting duplication of data in various formats strains additional resources. We propose a database structure and complementary scripts for facilitating these tasks. The GOLM (General Observation and Location Management) framework allows for import and storage of time series data of different types while assisting in meta-data documentation, plausibility checking and harmonization. The imported data can be visually inspected, and its coverage among locations and variables may be visualized. Supplementary scripts provide options for data export for selected stations and variables and resampling of the data to the desired temporal resolution. These tools can, for example, be used for generating model input files or reports. Since GOLM fully supports network access, the system can be used efficiently by distributed working groups accessing the same data over the internet. GOLM's database structure and the complementary scripts can easily be customized to specific needs. All involved software, such as MySQL, R, PHP and OpenOffice, as well as the scripts for building and using the database, including documentation, are free for download. GOLM was developed out of the practical requirements of the OPAQUE project. It has been tested and further refined in the ERANET-CRUE and SESAM projects, all of which used GOLM to manage meteorological, hydrological and/or water quality data.
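The resampling/harmonization step that GOLM automates can be illustrated with a pure-Python toy: aggregating irregular observations to a common hourly resolution. This is a sketch of the general idea only, not GOLM's actual database- and R-based implementation.

```python
# Illustrative harmonization step: bucket irregular (timestamp, value)
# observations into hour-long bins and take the mean of each bin.

from datetime import datetime
from collections import defaultdict

def resample_hourly(obs):
    """obs: list of (datetime, value) -> {hour-start datetime: mean value}."""
    buckets = defaultdict(list)
    for t, v in obs:
        # Truncate each timestamp to the start of its hour.
        buckets[t.replace(minute=0, second=0, microsecond=0)].append(v)
    return {h: sum(vs) / len(vs) for h, vs in buckets.items()}

obs = [
    (datetime(2010, 5, 1, 12, 10), 1.0),
    (datetime(2010, 5, 1, 12, 50), 3.0),
    (datetime(2010, 5, 1, 13, 5), 2.0),
]
hourly = resample_hourly(obs)  # two hourly bins, each with mean 2.0
```

In a shared database the same routine runs once per export instead of being re-implemented by every group member, which is exactly the redundancy the abstract describes.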
Full Text Available Many factors impact on the ability to create a digitally inclusive society in a developing world context. These include lack of access to information and communication technology (ICT), infrastructure, low literacy levels as well as low ICT related...
Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus
Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regards to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.
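BRICK's design values (small, transparent building blocks summed into a total projection) can be sketched with a deliberately simple rate law. The rate equation and all parameter values below are illustrative assumptions, not BRICK's calibrated components.

```python
# A toy sea-level "building block": Euler-integrate dS/dt = a * (T - T_eq),
# then sum independent components into a total, in the spirit of BRICK's
# modular design. Parameters are made up for illustration.

def slr_component(temps, a, t_eq=-0.5, dt=1.0, s0=0.0):
    """Cumulative sea-level contribution (mm) for a yearly temperature series."""
    s, out = s0, []
    for t in temps:
        s += a * (t - t_eq) * dt
        out.append(s)
    return out

def total_slr(temps):
    """Couple two hypothetical components by simple summation."""
    thermal = slr_component(temps, a=1.1)
    glaciers = slr_component(temps, a=0.8)
    return [x + y for x, y in zip(thermal, glaciers)]

levels = total_slr([0.0, 0.5, 1.0])  # monotonically rising under warming
```

The point of such a structure, as the abstract argues, is that each block can be swapped or recalibrated independently, which makes model intercomparison experiments straightforward.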
Mirvis, E.; Iredell, M.
The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of 20+ in-house developed shared libraries (NCEPLIBS), specific versions of third-party libraries (netcdf, HDF, ESMF, jasper, xml, etc.), and an HPC workflow tool, all within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, combined with NCEP's product-driven, large-scale real-time data operations, greatly complicates NCEP collaborative development by reducing the chances of replicating this OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improving the flexibility of the HPC environment by building the elements of, and a foundation for, an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC has developed and deployed several project subset standards that have already paved the road to NCEP OPS implementation standards. In this presentation we discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE, integrated with the
Aglina, Moses Kwame; Agbejule, Adebayo; Nyamuame, Godwin Yao
Energy has become the main driver for development as industries grow, agricultural sectors become more modernized, economies boom and countries become wealthy. Yet a vast majority of people still live below the poverty line, especially in the ECOWAS region. The purpose of this study is to explore how improvements in energy access can be a key driver of economic development and progress in the ECOWAS region. Data for the study were obtained from the database of the World Bank. A regression analysis was carried out to establish the relationships between energy access and development indicators. The paper suggests the need for policy makers in the ECOWAS region to focus on demand-side targets, such as household access, consumption of electricity, and ease of use, instead of supply targets that focus merely on physical coverage. A case study of how Ghana is improving energy access is presented. - Highlights: • Energy policies in the ECOWAS region must focus on demand side targets. • Energy policies should target rural and peri-urban areas of the ECOWAS region. • Improved energy access requires a new supply chain energy model.
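The study's regression step can be sketched with a closed-form ordinary-least-squares fit of a development indicator on an energy-access indicator. The data points below are fabricated for illustration; the paper used World Bank data.

```python
# Toy OLS regression: development indicator (y) on energy access (x).
# Closed-form least squares for y = a*x + b; data are hypothetical.

def ols(x, y):
    """Return (slope, intercept) minimizing squared error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

access = [20, 40, 60, 80]        # % households with electricity (made up)
gdp_pc = [400, 800, 1200, 1600]  # GDP per capita, USD (made up)
slope, intercept = ols(access, gdp_pc)
```

A positive, significant slope in such a fit is the kind of evidence the abstract invokes for energy access as a driver of development; in practice one would also test significance and control for confounders.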
Cognitive radio sensor networks are one kind of application in which cognitive techniques can be adopted, and they present many potential applications, challenges and future research trends. According to research surveys, dynamic spectrum access is an important and necessary technology for future cognitive sensor networks. Traditional methods of dynamic spectrum access are based on spectrum holes and have some drawbacks, such as low accessibility and high interruptibility, which negatively affect the transmission performance of the sensor networks. To address this problem, in this paper a new initialization mechanism is proposed to establish a communication link and set up a sensor network without adopting spectrum holes to convey control information. Specifically, a transmission channel model for analyzing the maximum accessible capacity of three different policies in a fading environment is first discussed. Secondly, a hybrid spectrum access algorithm based on a reinforcement learning model is proposed for the power allocation problem of both the transmission channel and the control channel. Finally, extensive simulations have been conducted, and the results show that the new algorithm provides a significant improvement in the tradeoff between control channel reliability and transmission channel efficiency.
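The reinforcement-learning-based power allocation mentioned above can be sketched as single-state Q-learning over a small set of power levels. The reward model (throughput minus an interference penalty) and every number below are assumptions for illustration, not the paper's algorithm.

```python
# Minimal epsilon-greedy Q-learning sketch: learn which transmit power level
# maximizes expected reward. Reward function is a hypothetical stand-in for
# "throughput minus interference penalty".

import random

def learn_power_level(levels, reward, episodes=2000, eps=0.1, alpha=0.2, seed=1):
    """Single-state Q-learning; returns the level with the highest learned value."""
    rng = random.Random(seed)
    q = {p: 0.0 for p in levels}
    for _ in range(episodes):
        # Explore with probability eps, otherwise exploit the current best.
        p = rng.choice(levels) if rng.random() < eps else max(q, key=q.get)
        q[p] += alpha * (reward(p) - q[p])   # incremental value update
    return max(q, key=q.get)

# Throughput grows with power, interference penalty is quadratic,
# so an intermediate level should win (here p = 2).
best = learn_power_level([1, 2, 3, 4], lambda p: 2.0 * p - 0.5 * p * p)
```

A real hybrid scheme would learn jointly over the transmission and control channels and under fading, but the exploration/exploitation structure is the same.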
The growing importance of automated systems for the interchange of commercial and technical information prompts the search for new forms of knowledge modelling. One possible solution, presented in this article, is the use of relational databases for the acquisition of information comprised in technical standards, with a rule-based reasoning system providing access to these databases. The study describes the structure of a database created under the assumption that the concepts describing knowledge are recorded in the data-collecting system and not in the relational schema. The user is free to define the structure of the knowledge without the need to introduce any changes to the relational schema. At the same time, the reasoning system enables very efficient knowledge searching. At present, tests are being carried out on software running on MS SQL Server. The project work and test performance of the system confirm that the adopted assumptions were correct. The development of a new interface using internet techniques is anticipated. As a next step, the information resources collected in the system will be successively expanded using information sources other than technical standards.
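The central idea of the article, that concepts live in the data rather than in the relational schema, is essentially an entity-attribute-value design: new attributes can be added without any `ALTER TABLE`. The sketch below uses SQLite for portability; table and column names are illustrative (the original work used MS SQL Server).

```python
# EAV-style sketch: "concept" names are rows, not columns, so users can
# define new knowledge structure without changing the relational schema.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE concept (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE fact (
        entity TEXT, concept_id INTEGER REFERENCES concept(id), value TEXT
    );
""")

def define(name):
    """Register a concept on first use and return its id."""
    con.execute("INSERT OR IGNORE INTO concept(name) VALUES (?)", (name,))
    return con.execute("SELECT id FROM concept WHERE name=?", (name,)).fetchone()[0]

def assert_fact(entity, concept, value):
    con.execute("INSERT INTO fact VALUES (?,?,?)", (entity, define(concept), value))

def query(entity, concept):
    row = con.execute(
        "SELECT value FROM fact JOIN concept ON concept_id=concept.id "
        "WHERE entity=? AND name=?", (entity, concept)).fetchone()
    return row[0] if row else None

# New attributes drawn from a technical standard, no schema change needed:
assert_fact("bolt-M8", "thread_pitch_mm", "1.25")
assert_fact("bolt-M8", "standard", "ISO 262")
```

A rule-based reasoner then operates over `query`-style lookups; the trade-off of this design is that integrity constraints and indexing need more care than in a conventional schema.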
Mendell, Mark J.; Fisk, William J.
Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each
were accessed with a history of medical disqualification that was either remediated prior to accession or waived, 6% accessed with a waiver, and 3...disability discharge in the first year of service. Among National Guard accessions between 2008 and 2013, 15% accessed with a history of previous...intellectual efficiency, non-delinquency, optimism, order, self-control, sociability, tolerance, and physical conditioning, which is a dimension created
[Table residue: diagnostic-code categories with counts and rates, e.g. abnormal cytological, histological, immunological and DNA test findings (796); other nonspecific abnormal findings (995); certain adverse effects not elsewhere classified; dissociative or factitious disorder; abnormal histological and immunological findings (795); congenital anomalies of heart (746).] Several conditions for which a medical accession waiver was sought in 2006 had approval rates in excess of 90%. The highest were for chronic gastritis (100
Xu, Fei; Xu, Hong; Chen, Xiong; Wu, Dingcai; Wu, Yang; Liu, Hao; Gu, Cheng; Fu, Ruowen; Jiang, Donglin
Ordered π-columns and open nanochannels found in covalent organic frameworks (COFs) could render them able to store electric energy. However, the synthetic difficulty in achieving redox-active skeletons has thus far restricted their potential for energy storage. A general strategy is presented for converting a conventional COF into an outstanding platform for energy storage through post-synthetic functionalization with organic radicals. The radical frameworks with openly accessible polyradicals immobilized on the pore walls undergo rapid and reversible redox reactions, leading to capacitive energy storage with high capacitance, high-rate kinetics, and robust cycle stability. The results suggest that channel-wall functional engineering with redox-active species will be a facile and versatile strategy to explore COFs for energy storage.
Meier, Benjamin Mason; Gelpi, Adriane; Kavanagh, Matthew M; Forman, Lisa; Amon, Joseph J
The scale of the HIV pandemic - and the stigma, discrimination and violence that surrounded its sudden emergence - catalyzed a public health response that expanded human rights in principle and practice. In the absence of effective treatment, human rights activists initially sought to protect individuals at high risk of HIV infection. With advances in antiretroviral therapy, activists expanded their efforts under international law, advocating under the human right to health for individual access to treatment. As a clinical cure comes within reach, human rights obligations will continue to play a key role in political and programmatic decision-making. Building upon the evolving development and implementation of the human right to health in the global response to HIV, we outline a human rights research agenda to prepare for HIV cure access, investigating the role of human rights law in framing 1) resource allocation, 2) international obligations, 3) intellectual property and 4) freedom from coercion. The right to health is widely recognized as central to governmental, intergovernmental and non-governmental responses to the pandemic and critical both to addressing vulnerability to infection and to ensuring universal access to HIV prevention, treatment, care and support. While the advent of an HIV cure will raise new obligations for policymakers in implementing the right to health, the resolution of past debates surrounding HIV prevention and treatment may inform claims for universal access.
The global migration of television (TV) from analogue to digital broadcast will see a large amount of TV spectrum become available (called TV white space - TVWS) for other services such as mobile and broadband wireless access (BWA). Leading spectrum...
Multidomain environments where multiple organizations interoperate with each other are becoming a reality as can be seen in emerging Internet-based enterprise applications. Access control to ensure secure interoperation in such an environment is a crucial challenge. A multidomain environment can be categorized as "tightly-coupled" and…
Vandegriff, J. D.; King, T. A.; Weigel, R. S.; Faden, J.; Roberts, D. A.; Harris, B. T.; Lal, N.; Boardsen, S. A.; Candey, R. M.; Lindholm, D. M.
We present the Heliophysics Application Programmers Interface (HAPI), a new interface specification that both large and small data centers can use to expose time series data holdings in a standard way. HAPI was inspired by the similarity of existing services at many Heliophysics data centers, and these data centers have collaborated to define a single interface that captures best practices and represents what everyone considers the essential, lowest common denominator for basic data access. This low level access can serve as infrastructure to support greatly enhanced interoperability among analysis tools, with the goal being simplified analysis and comparison of data from any instrument, model, mission or data center. The three main services a HAPI server must perform are 1. list a catalog of datasets (one unique ID per dataset), 2. describe the content of one dataset (JSON metadata), and 3. retrieve numerical content for one dataset (stream the actual data). HAPI defines both the format of the query to the server, and the response from the server. The metadata is lightweight, focusing on use rather than discovery, and the data format is a streaming one, with Comma Separated Values (CSV) being required and binary or JSON streaming being optional. The HAPI specification is available at GitHub, where projects are also underway to develop reference implementation servers that data providers can adapt and use at their own sites. Also in the works are data analysis clients in multiple languages (IDL, Python, Matlab, and Java). Institutions which have agreed to adopt HAPI include Goddard (CDAWeb for data and CCMC for models), LASP at the University of Colorado Boulder, the Particles and Plasma Interactions node of the Planetary Data System (PPI/PDS) at UCLA, the Plasma Wave Group at the University of Iowa, the Space Sector at the Johns Hopkins Applied Physics Lab (APL), and the tsds.org site maintained at George Mason University. Over the next year, the adoption of a
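The three HAPI services enumerated above (catalog, dataset description, data retrieval) can be sketched as a tiny client. The URL parameter names below (`id`, `time.min`, `time.max`) follow early versions of the specification; consult the spec on GitHub for the current form, and note the server name here is a placeholder.

```python
# Sketch of a HAPI client: build the 'data' request URL and parse the
# required CSV stream. Parameter names assume an early HAPI draft.

from urllib.parse import urlencode
import csv, io

def hapi_data_url(server, dataset, tmin, tmax):
    """Service 3: retrieve numerical content for one dataset."""
    q = urlencode({"id": dataset, "time.min": tmin, "time.max": tmax})
    return f"{server}/hapi/data?{q}"

def parse_hapi_csv(text, names):
    """CSV is the required stream format: one record per line, time first.
    `names` would normally come from the 'info' (JSON metadata) endpoint."""
    return [dict(zip(names, row)) for row in csv.reader(io.StringIO(text))]

url = hapi_data_url("https://example.org", "AC_H0_MFI",
                    "2016-01-01T00:00:00Z", "2016-01-02T00:00:00Z")
records = parse_hapi_csv(
    "2016-01-01T00:00:00Z,4.9\n2016-01-01T00:01:00Z,5.1\n", ["Time", "Bmag"])
```

Because every compliant server accepts the same query and streams the same format, an analysis tool written against this shape works unchanged across all the adopting data centers listed in the abstract.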
Chen, Xiaoxiao; Narkeviciute, Rasa; Haselip, James Arthur
, absolute, measures of best practice and highly contextual realities where baselines are often lacking. However, the methodology does offer a comparative means to highlight the relative strengths and weaknesses of any given project, enabling both ex-post assessments and project learning. The study features an analysis of cases selected from the Energy Access Knowledge Base, published by the Global Network on Energy for Sustainable Development (GNESD).
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
A framework for establishing a standard reference scale for texture is proposed using multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for a texture attribute (hardness) with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for the establishment of quantitative standard reference scales for food texture characteristics.
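The Stevens's-law check cited above can be illustrated as a log-log fit: the power law S = k·I^n becomes linear, log S = log k + n·log I. The data points below are fabricated to follow an exact power law; the study fitted panel scores against TPA measurements.

```python
# Fit Stevens's power law S = k * I^n by least squares in log-log space.
# Illustrative data generated from S = 2 * I^0.5.

import math

def fit_power_law(intensity, sensation):
    """Return (k, n) of the best-fit power law."""
    xs = [math.log(i) for i in intensity]
    ys = [math.log(s) for s in sensation]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    k = math.exp(my - n * mx)
    return k, n

instr = [1.0, 4.0, 9.0, 16.0]   # instrumental hardness (arbitrary units)
panel = [2.0, 4.0, 6.0, 8.0]    # sensory score, generated as 2 * sqrt(I)
k, n = fit_power_law(instr, panel)
```

A high R² for this fit, as reported in the abstract, is what justifies anchoring the sensory scale to instrumentally measured reference foods.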
NOTES: The original document contains color photos, and four supplemental Applicants and Accessions tables for the Army, Air Force, Marine Corps, and Navy. 14...accession characteristics and risk of attrition in the accessed population. Two manuscripts focus on the Assessment of Recruit Motivation and Strength...programs to improve military readiness by maximizing both the accession and retention of motivated and capable recruits. This report provides abstracts
Devriese, Joke; Pottel, Hans; Beels, Laurence; Maes, Alex; Van de Wiele, Christophe; Gheysens, Olivier
With the routine use of 2-deoxy-2-[18F]-fluoro-D-glucose (18F-FDG) positron emission tomography/computed tomography (PET/CT) scans, the metabolic activity of tumors can be quantitatively assessed through calculation of standardized uptake values (SUVs). One possible normalization parameter for the SUV is lean body mass (LBM), which is generally calculated through predictive equations based on height and body weight. (Semi-)direct measurements of LBM could provide more accurate results in cancer populations than predictive equations based on healthy populations. In this context, four methods to determine LBM are reviewed: bioelectrical impedance analysis, dual-energy X-ray absorptiometry, CT, and magnetic resonance imaging. These methods were selected based on clinical accessibility and are compared in terms of methodology, precision and accuracy. By assessing each method's specific advantages and limitations, a well-considered choice of method can hopefully lead to more accurate SUV-LBM values, and hence more accurate quantitative assessment of 18F-FDG PET images.
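The normalization at issue can be sketched numerically: SUV is tissue activity concentration divided by injected dose per unit mass, and SUV-LBM substitutes predicted lean body mass for total body weight. The James predictive equations used below are one common choice (the kind of formula the review argues may be inaccurate in cancer populations); all patient values are made up.

```python
# Sketch: SUV normalized by body weight vs. by predicted lean body mass.
# James equations and the 1 kg ~ 1000 mL tissue assumption are conventional
# approximations, used here for illustration only.

def lbm_james(weight_kg, height_cm, sex):
    """Predictive LBM (kg) per the James formulation."""
    if sex == "male":
        return 1.10 * weight_kg - 128 * (weight_kg / height_cm) ** 2
    return 1.07 * weight_kg - 148 * (weight_kg / height_cm) ** 2

def suv(conc_kbq_ml, injected_dose_mbq, mass_kg):
    """SUV = concentration / (dose / mass), with 1 g tissue ~ 1 mL."""
    return conc_kbq_ml / (injected_dose_mbq * 1000.0 / (mass_kg * 1000.0))

weight, height = 80.0, 180.0   # hypothetical patient
suv_bw = suv(5.0, 400.0, weight)
suv_lbm = suv(5.0, 400.0, lbm_james(weight, height, "male"))
```

Since LBM is always below total weight, SUV-LBM is systematically lower than SUV-BW; errors in the LBM estimate propagate directly into the SUV, which is the review's motivation for comparing measurement methods.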
Since the reauthorization of the Individuals With Disabilities Education Act (IDEA) in 2004, standards-based individualized education plans (IEPs) have been an expectation for serving students with disabilities in the K-12 public school setting. Nearly a decade after the mandates calling for standards-based IEPs, special educators still struggle…
While use of the libraries was high, most responses reflected severely limited educational, rehabilitative or cultural programming and access to the internet, and lack of space for collections and reading purposes. Conclusion – Libraries in Croatia fail to meet international standards for staffing, collections, and services. Recommendations for immediate improvement are made, including legislative advocacy and funding, improved public library involvement, and the creation of national standards aligned with international standards.
Deshmukh, Ranjit [Univ. of California, Berkeley, CA (United States); Carvallo, Juan Pablo [Univ. of California, Berkeley, CA (United States); Gambhir, Ashwin [Univ. of California, Berkeley, CA (United States)
We emphasize the importance of concurrently considering all components of a mini-grid policy, designing each component through the lenses of different stakeholders, and fostering mini-grids as an integral part of a country's electricity access efforts. Policymakers have multiple options, and it is the combination of these options, matched to the institutional and financial capacity of the government, that will decide the success of the program. There are no silver-bullet solutions, but a thorough understanding of the existing technical and institutional capacities, as well as the stakeholders' interests and sociocultural context, will enable the design of an effective policy instrument.
Kumar, Tanesh; Pandey, Bishwajeet; Das, Teerath
In this paper, we analyze how the lifetime and reliability of an integrated circuit are affected when it is operated in different regions under different temperatures. We have taken a Fibonacci generator as our target circuit and LVCMOS as the I/O standard. WPA and WPA2 (Wi-Fi Protected Access) keys can be generated with a Fibonacci generator. Here, a thermally efficient, green Fibonacci generator is used to generate keys for Wi-Fi Protected Access in order to make green communication possible at different room temperatures. By analysis it is observed that at standard normal temperature (21 °C), LVCMOS12 has...
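In key-generation hardware, "Fibonacci generator" usually denotes a Fibonacci linear feedback shift register (LFSR), whose output bits can be folded into a key. The software sketch below illustrates that structure; the tap positions, register width and seed are illustrative, since the abstract does not give the paper's hardware parameters.

```python
# Fibonacci LFSR sketch: output the LSB each step, feed the XOR of the
# tapped bits back into the MSB. Illustrative parameters only.

def fibonacci_lfsr(seed, taps, nbits, count):
    """Return `count` pseudo-random bits from an nbits-wide Fibonacci LFSR."""
    state = seed
    out = []
    for _ in range(count):
        out.append(state & 1)
        fb = 0
        for t in taps:                 # feedback = XOR of tapped bit positions
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))
    return out

bits = fibonacci_lfsr(seed=0xACE1, taps=(15, 13, 12, 10), nbits=16, count=128)
key = int("".join(map(str, bits)), 2)  # fold the keystream into an integer key
```

Note that a bare LFSR stream is predictable and would not be used alone for WPA keys in practice; in hardware it serves as a compact pseudo-random bit source whose switching activity (and hence thermal behavior) can be characterized, which is the paper's angle.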
Human interaction environments (HIEs) must be understood as any place where people carry out their daily life, including their work, family life, leisure and social life, interacting with technology to enhance or facilitate the experience. The integration of technology in these environments has been achieved in a disorderly and incompatible way, with devices operating in isolated islands whose artificial edges are delimited by the manufacturers. In this paper we present the UniDA framework, an integral solution for the development of systems that require the integration and interoperation of devices and technologies in HIEs. It provides developers and installers with a uniform conceptual framework capable of modelling an HIE, together with a set of libraries, tools and devices to build distributed instrumentation networks with support for transparent integration of other technologies. A series of use case examples and a comparison to many of the existing technologies in the field have been included in order to show the benefits of using UniDA.
Kim, Younggab; Hur, Nam Young; Jeong, Hyeon Jong [KHNP Central Research Institute, Daejeon (Korea, Republic of)
In order to eliminate the public's vague fears about nuclear power and to operate NPPs continuously, a strong safety culture at NPPs should be demonstrated. Strong safety culture awareness among workers can overcome social distrust of NPPs. KHNP has made a variety of efforts to improve and establish the safety culture of its NPPs. A safety culture framework applying global standards was set up, and safety culture assessments have been carried out periodically to enhance workers' safety culture. In addition, KHNP developed various safety culture contents, which are being used in NPPs by workers. As a result of these efforts, the safety culture awareness of workers has changed positively and the safety environment of NPPs is expected to improve. KHNP makes an effort to resolve areas for improvement derived from the safety culture assessments. However, some areas take a long time to address, so these actions need to be carried out consistently and continuously. KHNP also recently developed a web-based safety culture enhancement system. All information related to safety culture in KHNP will be shared through this system, and the system will be used for safety culture assessment. In addition, KHNP plans to develop safety culture indicators for monitoring symptoms of safety culture weakening.
Krukow, Karl Kristian; Nielsen, Mogens; Sassone, Vladimiro
Reputation-based trust-management systems provide no formal security guarantees. In this extended abstract, we describe a mathematical framework for a class of simple reputation-based systems. In these systems, decisions about interaction are taken based on policies that are exact requirements on agents' past histories. We present a basic declarative language, based on pure-past linear temporal logic, intended for writing simple policies. While the basic language is reasonably expressive (encoding, e.g., Chinese Wall policies), we show how one can extend it with quantification and parameterized events. This allows us to encode other policies known from the literature, e.g., 'one-out-of-k'. The problem of checking a history with respect to a policy is efficient for the basic language, and tractable for the quantified language when policies do not have too many variables.
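An illustrative sketch of the kind of history-vs-policy checking the abstract describes: a pure-past temporal formula evaluated over a finite interaction history. The tuple encoding, operator names, and events ("cooperate", "defect") are assumptions made for illustration, not the paper's formal syntax.

```python
# Sketch only: evaluating a pure-past LTL formula over a finite history.
# A history is a list of sets of events; formulas are nested tuples.

def holds(phi, history, i=None):
    """Does formula `phi` hold at position i of `history` (default: last)?"""
    if i is None:
        i = len(history) - 1
    op = phi[0]
    if op == "event":                      # atomic: event occurred at i
        return phi[1] in history[i]
    if op == "not":
        return not holds(phi[1], history, i)
    if op == "and":
        return holds(phi[1], history, i) and holds(phi[2], history, i)
    if op == "once":                       # held at some past point <= i
        return any(holds(phi[1], history, j) for j in range(i + 1))
    if op == "historically":               # held at every past point <= i
        return all(holds(phi[1], history, j) for j in range(i + 1))
    raise ValueError(f"unknown operator {op!r}")

# A policy: "the agent has never defected, and has cooperated at least once"
policy = ("and",
          ("historically", ("not", ("event", "defect"))),
          ("once", ("event", "cooperate")))

good_history = [{"cooperate"}, {"cooperate"}, {"pay"}]
bad_history  = [{"cooperate"}, {"defect"}]

print(holds(policy, good_history))  # True
print(holds(policy, bad_history))   # False
```

Each operator inspects only positions up to the current one, which is why checking a finite history against such a policy stays efficient, as the abstract claims for the basic language.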
More, S J; Hanlon, A; Marchewka, J; Boyle, L
In recent years, 'private standards' in animal health and welfare have become increasingly common, and are often incorporated into quality assurance (QA) programmes. Here, we present an overview of the use of private animal health and welfare standards in QA programmes, and propose a generic framework to facilitate critical programme review. Private standards are being developed in direct response to consumer demand for QA, and offer an opportunity for product differentiation and a means to drive consumer choice. Nonetheless, a range of concerns have been raised, relating to the credibility of these standards, their potential as a discriminatory barrier to trade, the multiplicity of private standards that have been developed, the lack of consumer input and compliance costs. There is a need for greater scrutiny of private standards and of associated QA programmes. We propose a framework to clarify the primary programme goal(s) and measurable outputs relevant to animal health and welfare, to identify the primary programme beneficiaries, and to determine whether the programme is effective, efficient and transparent. This paper provides a theoretical overview, noting that this framework could be used as a tool directly for programme evaluation, or as a tool to assist with programme development and review. British Veterinary Association.
Sujansky, Walter V; Faus, Sam A; Stone, Ethan; Brennan, Patricia Flatley
Online personal health records (PHRs) enable patients to access, manage, and share certain of their own health information electronically. This capability creates the need for precise access-control mechanisms that restrict the sharing of data to that intended by the patient. The authors describe the design and implementation of an access-control mechanism for PHR repositories that is modeled on the eXtensible Access Control Markup Language (XACML) standard, but intended to reduce the cognitive and computational complexity of XACML. The authors implemented the mechanism entirely in a relational database system using ANSI-standard SQL statements. Based on a set of access-control rules encoded as relational table rows, the mechanism determines via a single SQL query whether a user who accesses patient data from a specific application is authorized to perform a requested operation on a specified data object. Testing of this query on a moderately large database has demonstrated execution times consistently below 100 ms. The authors include the details of the implementation, including algorithms, examples, and a test database as Supplementary materials. Copyright © 2010 Elsevier Inc. All rights reserved.
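A hedged sketch of the design described above: access-control rules stored as relational table rows, with a single SQL query deciding whether a (user, application, operation, object) request is authorized. The schema, column names, and sample rows are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: relational access-control check decided by one SQL query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE acl (
        grantee   TEXT,   -- user granted access
        app       TEXT,   -- application used for access ('*' = any)
        operation TEXT,   -- e.g. 'read', 'write'
        object    TEXT    -- data object identifier
    )""")
conn.executemany("INSERT INTO acl VALUES (?, ?, ?, ?)", [
    ("dr_smith", "*",      "read",  "allergies"),
    ("dr_smith", "portal", "write", "allergies"),
])

def is_authorized(user, app, operation, obj):
    """A single parameterized query decides the request."""
    row = conn.execute(
        """SELECT 1 FROM acl
           WHERE grantee = ? AND (app = ? OR app = '*')
             AND operation = ? AND object = ?
           LIMIT 1""",
        (user, app, operation, obj)).fetchone()
    return row is not None

print(is_authorized("dr_smith", "mobile", "read",  "allergies"))  # True
print(is_authorized("dr_smith", "mobile", "write", "allergies"))  # False
```

With an index on the rule columns, such a lookup stays a single indexed query, which is consistent with the sub-100 ms execution times the abstract reports.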
Designing assessments and tests is one of the more challenging aspects of creating an accessible learning environment for students who are deaf or hard of hearing (DHH), particularly for deaf students with a disability (DWD). Standardized assessments are a key mechanism by which the educational system in the United States measures student…
... format so that patients, caregivers, and healthcare providers may access and utilize device labeling as... labeling, and what they would want in a standard version of device labeling. Key findings from the survey... survey with the National Family Caregivers Association (NFCA) on medical device labeling to elicit home...
Wan, Yik-Ki J; Staes, Catherine J
Healthcare organizations use care pathways to standardize care, but once developed, adoption rates often remain low. One challenge for usage concerns clinicians' difficulty in accessing guidance when it is most needed. Although the HL7 'Infobutton Standard' allows clinicians easier access to external references, access to locally-developed resources often requires clinicians to deviate from their normal electronic health record (EHR) workflow to use another application. To address this gap between internal and external resources, we reviewed the literature and existing practices at the University of Utah Health Care. We identify the requirements to meet the needs of a healthcare enterprise and clinicians, describe the design and development of a prototype to aggregate both internal and external resources from within or outside the EHR, and evaluate the strengths and limitations of the prototype. The system is functional but not implemented in a live EHR environment. We suggest next steps and enhancements.
... advanced approaches rules, several commenters, mostly representing the largest U.S. financial institutions... principles for preparing financial statements instead of the statutory accounting principles applicable to...-Based Capital Standards: Advanced Capital Adequacy Framework--Basel II; Establishment of a Risk-Based...
The aim of the article is to shed light on the historical development of language studies in military and social contexts and to compare the current status of the NATO STANAG (Standardization Agreement) 6001 language scale with the Common European Framework (CEF). Language studies in military contexts date back to World War II and the emergence of the Army Specialized…
Maloy, Robert W.; Poirier, Michelle; Smith, Hilary K.; Edwards, Sharon A.
This article explores using a wiki, one of the newest forms of interactive computer-based technology, as a resource for teaching the Massachusetts K-12 History and Social Science Curriculum Framework, a set of state-mandated learning standards. Wikis are web pages that can be easily edited by multiple authors. They invite active involvement by…
Shah, Mahsood; Whannell, Robert
Open access enabling courses have experienced growth in Australia. The growth is evidenced in student enrolments and the number of public and private institutions offering such courses. Traditionally these courses have provided a second chance to many students from various equity groups who have been unable to access tertiary education due to poor…
Full Text Available One of the primary competitive factors of a country is the state of its institutions. EU membership requires substantial changes in the content and structure of the institutional framework of the acceding countries, which could generate progress or regress in their development if the existing institutions and the transferred ones are incompatible. In this article, the author examines, theoretically and conceptually, the process of institutional change in terms of three concepts: Europeanization, institutional transfer and path dependence. The forms, tools, methods and costs of institutional transfer, and possible institutional failures, are analysed. This research concludes that, in the situation of the Republic of Moldova, the partial modification of institutions is mainly determined by the desire of the veto players to survive and not by consistent adherence to the process of accession to the EU.
In this paper, we provide an analysis of the performance of an optical time-wavelength code-division multiple-access (OTW-CDMA) network when the system is working above the nominal transmission rate limit imposed by the passive encoding-decoding operation. We address the problem of overlapping in such a system and how it can directly affect the bit error rate (BER). A unified mathematical framework is presented under the assumption of one-coincidence sequences with non-repeating wavelengths. A closed-form expression of the multiple-access-interference-limited BER is provided as a function of different system parameters. Results show that the performance of an OTW-CDMA system may be critically affected when working above the nominal limit, an event that may happen when the network operates at a high transmission rate. In addition, the impact of the derived error probability on the performance of two newly proposed MAC protocols, the S-ALOHA and the R3T, is also investigated. It is shown that for low transmission rates, the S-ALOHA is better than the R3T, while the R3T is better at very high transmission rates. However, in general it is postulated that the R3T protocol suffers a higher delay, mainly because of the presence of additional modes.
Wiles, Benedict M; Child, Nicholas; Roberts, Paul R
Bedside vascular ultrasound machines are increasingly available. They are used to facilitate safer vascular access across a number of different specialties. In the electrophysiology laboratory however, where patients are frequently anticoagulated and require the insertion of multiple venous sheaths, anatomical landmark techniques predominate. Despite the high number of vascular complications associated with electrophysiological procedures and the increasing evidence to support its use in electrophysiology, ultrasound remains underutilised. A new standard of care is required. A comprehensive technical report, providing a detailed explanation of this important technique, will provide other electrophysiology centres with the knowledge and justification for adopting ultrasound guidance as their standard practice. We review the increasing body of evidence which demonstrates that routine ultrasound usage can substantially improve the safety of femoral venous access in the electrophysiology laboratory. We offer a comprehensive technical report to guide operators through the process of ultrasound-guided venous access, with a specific focus on the electrophysiology laboratory. Additionally, we detail a novel technique which utilises real-time colour Doppler ultrasound to accurately identify needle tip location during venous puncture. The use of vascular ultrasound to guide femoral venous cannulation is rapid, inexpensive and easily learnt. Ultrasound is readily available and offers the potential to significantly reduce vascular complications in the unique setting of the electrophysiology laboratory. Ultrasound guidance to achieve femoral venous access should be the new standard of care in electrophysiology.
National Aeronautics and Space Administration — We propose to investigate the feasibility and value of the "Software as a Service" paradigm in facilitating access to Earth Science numerical models. We envision...
José Augusto Campos Garcia
Full Text Available This work presents the results of a study that aimed to establish a framework to standardize the process of a scientific journal. The journal has a team that performs operational routines regulated by standards (external) and patterns (internal and external). The high turnover rate of the support team has generated information loss and increased service variability. The research started from the assumption that process standardization (which includes formalization) could be a way to reduce this secondary effect. Standardization techniques were identified through a literature review of the main national and international databases of journals and congresses. The identified standardization techniques were analyzed considering the number of times they appeared in the papers analyzed and the performance objectives proposed by Slack et al. (2009). As a result of this research, a framework was obtained for the standardization of processes adapted to the needs of the journal studied. The model can feasibly be used more widely, given its structural similarity to the one proposed by Campos (2004), a Brazilian model that is a reference in the field.
Liyanage, H; Liaw, S-T; Di Iorio, C T; Kuziemsky, C; Schreiber, R; Terry, A L; de Lusignan, S
Privacy, ethics, and data access issues pose significant challenges to the timely delivery of health research. Whilst the fundamental drivers to ensure that data access is ethical and satisfies privacy requirements are similar, they are often dealt with in varying ways by different approval processes. To achieve a consensus across an international panel of health care and informatics professionals on an integrated set of privacy and ethics principles that could accelerate health data access in data-driven health research projects. A three-round consensus development process was used. In round one, we developed a baseline framework for privacy, ethics, and data access based on a review of existing literature in the health, informatics, and policy domains. This was further developed using a two-round Delphi consensus building process involving 20 experts who were members of the International Medical Informatics Association (IMIA) and European Federation of Medical Informatics (EFMI) Primary Health Care Informatics Working Groups. To achieve consensus we required an extended Delphi process. The first round involved feedback on and development of the baseline framework. This consisted of four components: (1) ethical principles, (2) ethical guidance questions, (3) privacy and data access principles, and (4) privacy and data access guidance questions. Round two developed consensus in key areas of the revised framework, allowing the building of a new, more detailed and descriptive framework. In the final round panel experts expressed their opinions, either as agreements or disagreements, on the ethics and privacy statements of the framework, finding some of the previous round disagreements to be surprising in view of established ethical principles. This study develops a framework for an integrated approach to ethics and privacy. Privacy breach risk should not be considered in isolation but instead balanced by potential ethical benefit.
Full Text Available The purpose of this study is to examine descriptive metadata standards in open-source archival software, to determine the most appropriate descriptive metadata standard(s), and to assess the software's support for these standards. The study combines library research, the Delphi method and a descriptive survey. Data were gathered using fiches in the library study, a questionnaire in the Delphi method and a checklist in the descriptive survey. The statistical population comprises 5 open-source archival software packages. The findings suggest that 5 metadata standards, consisting of EAD, ISAD, EAC-CPF, ISAAR and ISDF, were judged by the Delphi panel members to be the most appropriate descriptive metadata standards for use in archival software. Moreover, ICA-AtoM and Archivists' Toolkit, in terms of support for the suitable standards, were judged the most appropriate archival software.
van Wessel, R.M.
From a practical point of view, this research provides insight into how company IT standards affect business process performance. Furthermore it gives recommendations on how to govern and manage such standards successfully with regard to their selection, implementation and usage. After evaluating
disorders are listed under the same standard within the medical standards for enlistment regardless of the type of curvature: scoliosis, lordosis ...spinal curvature disorders are referred to generically as scoliosis, estimates of the prevalence of lordosis and kyphosis are difficult to obtain...the specific type of spinal curvature disorder (i.e. scoliosis, lordosis , kyphosis) associated with disqualification or waiver could be
Tackett, Sean; Grant, Janet; Mmari, Kristin
To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.
Full Text Available A new framework intended for representing and segmenting multidimensional datasets, resulting in low spatial complexity requirements and with appropriate access to their contained information, is described. Two steps are taken into account. The first step is to specify (n-1)D hypervoxelizations, n≥2, as Orthogonal Polytopes whose nth dimension corresponds to color intensity. Then, the nD representation is concisely expressed via the Extreme Vertices Model in the n-Dimensional Space (nD-EVM). Some examples are presented which, under our methodology, have storage requirements smaller than those demanded by their original hypervoxelizations. In the second step, 1-Dimensional Kohonen Networks (1D-KNs) are applied in order to segment datasets taking into account their geometrical and topological properties, providing a non-supervised way to compact even more the proposed n-Dimensional representations. The application of our framework yields compression ratios, for our set of study cases, in the range 5.6496 to 32.4311. Summarizing, the contribution combines the power of the nD-EVM and 1D-KNs by producing very concise datasets' representations. We argue that the new representations also provide appropriate segmentations by introducing some error functions such that our 1D-KN classifications are compared against classifications based only on color intensities. Along the work, main properties and algorithms behind the nD-EVM are introduced for the purpose of interrogating the final representations in such a way that useful geometrical and topological information is efficiently obtained.
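A minimal 1-D Kohonen network sketch, illustrating the kind of non-supervised intensity quantization the abstract describes. The training schedule, parameters, and data below are invented for illustration; the paper's actual 1D-KN configuration may differ.

```python
# Sketch: 1-D Kohonen network (self-organizing map) over scalar intensities.
import random

def train_1d_kohonen(samples, k=3, epochs=60, lr0=0.5, seed=0):
    """Train k ordered units on scalar samples; returns unit weights."""
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(k)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                 # decaying learning rate
        radius = max(1.0 * (1 - epoch / epochs), 0.0)   # shrinking neighborhood
        for x in samples:
            bmu = min(range(k), key=lambda i: abs(weights[i] - x))
            for i in range(k):
                if abs(i - bmu) <= radius:              # update BMU and neighbors
                    weights[i] += lr * (x - weights[i])
    return weights

def classify(x, weights):
    """Assign x to its nearest unit (the segmentation step)."""
    return min(range(len(weights)), key=lambda i: abs(weights[i] - x))

# Intensity samples forming separated clusters
intensities = [0.05, 0.08, 0.10, 0.48, 0.50, 0.52, 0.90, 0.92, 0.95]
w = train_1d_kohonen(intensities)
print(w, classify(0.07, w), classify(0.93, w))
```

After training, nearby intensities map to the same unit while distant intensities map to different units, which is the property the framework exploits to compact the nD-EVM representations further.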
National Oceanic and Atmospheric Administration, Department of Commerce — The content of the NODC Taxonomic Code, Version 8 CD-ROM (CD-ROM NODC-68) distributed by NODC is archived in this accession. Version 7 of the NODC Taxonomic Code...
Waldmann, Ashley K.; Blackwell, Terry L.
This article addresses the changes in the Commission on Rehabilitation Counselor Certification's 2010 "Code of Professional Ethics for Rehabilitation Counselors" as they relate to Section C: Advocacy and Accessibility. Ethical issues are identified and discussed in relation to advocacy skills and to advocacy with, and on behalf of, the client; to…
Viegas, Vítor; Pereira, José Dias; Girão, P. Silva
In 1999, the IEEE 1451.1 standard was published, defining a common object model and interface specification for developing open, multi-vendor distributed measurement and control systems. However, despite the well-known advantages of the model, few initiatives have implemented it. In this paper we describe the implementation of an NCAP (Network Capable Application Processor) on a well-known and well-proven infrastructure: the Microsoft .NET Framework. The choice of a commercial framework was part o...
Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T
Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged, and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of the algorithms devised to detect and quantify the coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with expert's manual annotation. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards are described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second-reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.
Kurtz, S.; Wohlgemuth, J.; Yamamichi, M.; Sample, T.; Miller, D.; Meakin, D.; Monokroussos, C.; TamizhMani, M.; Kempe, M.; Jordan, D.; Bosco, N.; Hacke, P.; Bermudez, V.; Kondo, M.
As the photovoltaic industry has grown, the interest in comparative accelerated testing has also grown. Private test labs offer testing services that apply greater stress than the standard qualification tests as tools for differentiating products and for gaining increased confidence in long-term PV investments. While the value of a single international standard for comparative accelerated testing is widely acknowledged, the development of a consensus is difficult. This paper strives to identify a technical basis for a comparative standard.
Le Manh-Béna, Anne; Ramond, Olivier,
Following the debate on the Conceptual Framework revision undertaken by the IASB and the FASB, this paper discusses three major concerns about the way financial reporting standards should be determined: (1) What is the role of a Conceptual Framework? (2) For whom and for which needs are accounting and financial reporting standards made? (3) What information set should financial reporting provide? We show that the perceived need for a Framework has in practice resulted in weak usefulness. We ...
Velliaris, Donna M.; Breen, Paul
In this paper, the authors explore a holistic three-stage framework currently used by the Eynesbury Institute of Business and Technology (EIBT), focused on academic staff identification and remediation processes for the prevention of (un)intentional student plagiarism. As a pre-university pathway provider--whose student body is 98%…
National Oceanic and Atmospheric Administration, Department of Commerce — Observed and standard level profile data (along with quality control flags) used in the production of these atlases were made available in a World Ocean Atlas 1994...
National Oceanic and Atmospheric Administration, Department of Commerce — This archival information package contains a listing of codes and chemical names that were used in NODC Standard Format Marine Toxic Substances and Pollutants (F144)...
Khan, Aziz; Fornes, Oriol; Stigliani, Arnaud; Gheorghe, Marius; Castro-Mondragon, Jaime A; van der Lee, Robin; Bessy, Adrien; Chèneby, Jeanne; Kulkarni, Shubhada R; Tan, Ge; Baranasic, Damir; Arenillas, David J; Sandelin, Albin; Vandepoele, Klaas; Lenhard, Boris; Ballester, Benoît; Wasserman, Wyeth W; Parcy, François; Mathelier, Anthony
JASPAR (http://jaspar.genereg.net) is an open-access database of curated, non-redundant transcription factor (TF)-binding profiles stored as position frequency matrices (PFMs) and TF flexible models (TFFMs) for TFs across multiple species in six taxonomic groups. In the 2018 release of JASPAR, the CORE collection has been expanded with 322 new PFMs (60 for vertebrates and 262 for plants) and 33 PFMs were updated (24 for vertebrates, 8 for plants and 1 for insects). These new profiles represent a 30% expansion compared to the 2016 release. In addition, we have introduced 316 TFFMs (95 for vertebrates, 218 for plants and 3 for insects). This release incorporates clusters of similar PFMs in each taxon and each TF class per taxon. The JASPAR 2018 CORE vertebrate collection of PFMs was used to predict TF-binding sites in the human genome. The predictions are made available to the scientific community through a UCSC Genome Browser track data hub. Finally, this update comes with a new web framework with an interactive and responsive user-interface, along with new features. All the underlying data can be retrieved programmatically using a RESTful API and through the JASPAR 2018 R/Bioconductor package. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
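The position frequency matrices (PFMs) described above are typically used by converting them to log-odds position weight matrices (PWMs) and scanning sequences for high-scoring windows. The sketch below shows that standard conversion and scan; the 4-column PFM is invented for illustration and is not a real JASPAR profile.

```python
# Sketch: score a DNA sequence against a PFM via a log-odds PWM.
import math

# Hypothetical 4-position PFM (rows: A, C, G, T); consensus is ACGA
pfm = {"A": [20, 1, 1, 18], "C": [1, 25, 2, 1],
       "G": [2, 1, 24, 1],  "T": [2, 2, 2, 5]}

def pfm_to_pwm(pfm, pseudocount=1.0, background=0.25):
    """Convert counts to log2-odds scores against a uniform background."""
    length = len(next(iter(pfm.values())))
    totals = [sum(pfm[b][i] for b in "ACGT") + 4 * pseudocount
              for i in range(length)]
    return {b: [math.log2((pfm[b][i] + pseudocount) / totals[i] / background)
                for i in range(length)] for b in "ACGT"}

def best_hit(seq, pwm):
    """Slide the PWM over seq; return (position, score) of the best window."""
    length = len(pwm["A"])
    best = None
    for pos in range(len(seq) - length + 1):
        s = sum(pwm[seq[pos + i]][i] for i in range(length))
        if best is None or s > best[1]:
            best = (pos, s)
    return best

pwm = pfm_to_pwm(pfm)
pos, score = best_hit("TTACGATTT", pwm)
print(pos, score)  # best hit is the consensus ACGA at position 2
```

The same idea scales to genome-wide scans like the UCSC track mentioned in the abstract, with thresholds applied to the log-odds scores.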
Eberle, J.; Schmullius, C.
Increasing archives of global satellite data present a new challenge to handle multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition the handling of time-series data is complex as an automated processing and execution of data processing steps is needed to supply the user with the desired product for a specific area of interest. In order to simplify the access to data archives of various satellite missions and to facilitate the subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized and web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis uniform data formats and data access services are provided. Interfaces to data archives of the sensor MODIS (NASA) as well as the satellites Landsat (USGS) and Sentinel (ESA) have been integrated in the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints of time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, an operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands focusing on automated discovery and access of Landsat and Sentinel data for local areas.
da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset
Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed under an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and implementation: The software is freely available at github.com/BioContainers/. Contact: email@example.com. PMID: 28379341
Full Text Available One of the principal features of accounting in the 21st century is harmonisation and standardisation. Regulation of the European Parliament and European Council No. 1606/2002 harmonizes financial reporting for certain companies in the EU. However, national accounting principles are of great importance for financial reporting. The main purpose of this research was to investigate the application of generally accepted accounting principles, the regulatory accounting framework and the standard-setting bodies of EU member states. The analysis of these accounting issues was conducted with respect to all 28 EU member states. The results indicate that EU member states regulate their principal accounting issues through separate accounting acts or implement those issues in companies acts. Some EU member states do not have national accounting standards, the national accounting principles being incorporated in companies acts and accounting acts. Nevertheless, national accounting standard-setting bodies are governmental organisations in almost half the member states.
Park, Jihoon; Pawełczak, Przemysław; Grønsund, Pål; Čabrić, Danijela
We present an analytical model that enables throughput evaluation of Opportunistic Spectrum Orthogonal Frequency Division Multiple Access (OS-OFDMA) networks. The core feature of the model, based on a discrete time Markov chain, is the consideration of different channel and subchannel allocation strategies under different Primary and Secondary user types, traffic and priority levels. The analytical model also assesses the impact of different spectrum sensing strategies on the throughput of OS...
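Throughput evaluation with a discrete-time Markov chain typically reduces to computing the chain's stationary distribution and weighting each state's throughput by the long-run fraction of time spent there. A minimal sketch of that step (the two-state transition matrix and rates below are illustrative toy values, not the OS-OFDMA model itself):

```python
def stationary_distribution(P, iters=100_000, tol=1e-12):
    """Stationary distribution of a row-stochastic matrix P via power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        nxt = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(nxt, pi)) < tol:
            return nxt
        pi = nxt
    return pi

def expected_throughput(P, rate):
    """Long-run throughput: per-state rate weighted by stationary occupancy."""
    pi = stationary_distribution(P)
    return sum(p * r for p, r in zip(pi, rate))

# Toy example: state 0 = channel occupied by a Primary user (rate 0),
# state 1 = channel free for Secondary use (rate 10 Mbit/s).
P = [[0.9, 0.1],
     [0.5, 0.5]]
throughput = expected_throughput(P, [0.0, 10.0])
```

For this chain the stationary distribution is (5/6, 1/6), so the Secondary user sees one sixth of the free-channel rate.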
Analysing the spatiotemporal distribution patterns of different industries and their dynamics can help us learn the macro-level development trends of those industries and, in turn, provide references for industrial spatial planning. However, the analysis is a challenging task that requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such visual analytics. The framework uses the standard deviational ellipse (SDE) and the shifting route of gravity centers to show the spatial distribution and yearly development trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelised using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset of Mainland China from 1960 to 2015, which contains fine-grained location information (i.e., the coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experimental results show that the developed visual analytics method is helpful for understanding the multi-level patterns and development trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime or disease.
Song, Y.; Gui, Z.; Wu, H.; Wei, Y.
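The two summary geometries named above, the gravity center and the standard deviational ellipse, have closed forms over a point set. A minimal single-machine sketch (the Spark parallelisation is omitted, and the principal-axis form of the rotation angle is used, which is one of several equivalent SDE conventions):

```python
import math

def mean_center(points):
    """Gravity center of a set of (x, y) points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def standard_deviational_ellipse(points):
    """Return ((cx, cy), sigma_x, sigma_y, theta) for the SDE of the points.

    theta is the ellipse rotation from the principal-axis formulation,
    which stays well defined when the cross term is zero.
    """
    n = len(points)
    cx, cy = mean_center(points)
    dx = [p[0] - cx for p in points]
    dy = [p[1] - cy for p in points]
    sxx = sum(d * d for d in dx)
    syy = sum(d * d for d in dy)
    sxy = sum(a * b for a, b in zip(dx, dy))
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    ct, st = math.cos(theta), math.sin(theta)
    # Standard deviations along the rotated axes.
    sx = math.sqrt(sum((a * ct + b * st) ** 2 for a, b in zip(dx, dy)) / n)
    sy = math.sqrt(sum((-a * st + b * ct) ** 2 for a, b in zip(dx, dy)) / n)
    return (cx, cy), sx, sy, theta
```

Computing one ellipse per industry category and per year then yields the shifting-route visualisation; on a cluster, the per-category sums parallelise trivially.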
Reale, S.; Corvi, A.
One of the aims of the various Engineering Standards related to Non-destructive Examination (NDE) is to identify and limit some characteristics of defects in a structure, since the degree of damage of a structure can be associated with these defect characteristics. One way that the damage level can be evaluated is by means of Fracture Mechanics. The objective of the present paper is to compare and identify the differences in the flaw acceptance criteria of national NDE Standards so as to suggest some guidelines for a future common European Standard. This paper examines the Standards adopted in France (RCC-MR), Germany (DIN), Italy (ASME) and the UK (BSI). It concentrates on both ultrasonic and radiographic inspection methods. The flaw acceptance criteria in these standards relating to non-destructive tests performed on a component during manufacturing are compared and evaluated by the Fracture Mechanics CEGB R6 procedure. General guidelines and results supporting the significance of the Fracture Mechanics approach are given. (Author)
Jha, Abhinav K; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from ¹⁸F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis.
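The patient-sampling bootstrap described above amounts to resampling lesions with replacement and recomputing the figure of merit on each replicate, then reading off an interval. A minimal sketch under stated assumptions (the `fom` callable and flat lesion list are illustrative placeholders; the actual NGS estimator is more involved):

```python
import random

def bootstrap_fom(lesions, fom, n_boot=1000, seed=0):
    """Bootstrap a figure of merit over lesion samples.

    lesions: list of per-lesion measurements.
    fom: callable mapping a resampled list to a scalar figure of merit.
    Returns (bootstrap mean, 2.5th percentile, 97.5th percentile).
    """
    rng = random.Random(seed)
    n = len(lesions)
    stats = sorted(
        fom([lesions[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    lo = stats[int(0.025 * n_boot)]
    hi = stats[int(0.975 * n_boot)]
    return sum(stats) / n_boot, lo, hi
```

The width of the (lo, hi) interval is what shrinks as more patient studies become available, matching the trend reported in the abstract.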
Baribaud, G.; Barnett, I.; Benincasa, G.
A control protocol provides a normalised procedure for accessing equipment of the same kind from a control system. Modelling, and the subsequent identification of functionalities with their parameters, variables and attributes, has now been carried out at CERN for representative families of devices. ISO specifications, such as the ASN.1 metalanguage for data structure representation and the MMS definitions and services, have to some extent been introduced in the design for generality and compatibility with the external world. The final product of this design is totally independent of the control systems and permits object-oriented implementations in any control framework. The present paper describes the different phases of the project, with a short overview of the various implementations under development at CERN. (author)
Motivation. The solvent accessibility of protein residues is one of the driving forces of protein folding, while the contact number of protein residues limits the possibilities of protein conformations. The de novo prediction of these properties from protein sequence is important for the study of protein structure and function. Although these two properties are certainly related to each other, it is challenging to exploit this dependency for prediction. Method. We present AcconPred, a method for predicting solvent accessibility and contact number simultaneously, based on a shared-weight multitask learning framework under the CNF (conditional neural fields) model. The multitask learning framework, trained on a collection of related tasks, provides more accurate prediction than a framework trained on a single task only. The CNF method not only models the complex relationship between the input features and the predicted labels, but also exploits the interdependency among adjacent labels. Results. Trained on 5729 monomeric soluble globular proteins, AcconPred reached 0.68 three-state accuracy for solvent accessibility and 0.75 correlation for contact number. Tested on the 105 CASP11 domains for solvent accessibility, AcconPred reached 0.64 accuracy, which outperforms existing methods.
Ma, Jianzhu; Wang, Sheng
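The shared-weight idea can be illustrated with a toy forward pass in which one hidden representation feeds both task heads, a classifier for the three accessibility states and a regressor for contact number. All weights and dimensions below are illustrative placeholders and have nothing to do with AcconPred's trained model:

```python
import math

def shared_features(x, w_shared):
    """One shared hidden layer (tanh) whose weights serve both tasks."""
    return [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w_shared]

def predict(x, w_shared, w_acc, w_cn):
    """Return (3-state accessibility scores, contact-number estimate)."""
    h = shared_features(x, w_shared)
    acc = [sum(w * hi for w, hi in zip(row, h)) for row in w_acc]  # classification head
    cn = sum(w * hi for w, hi in zip(w_cn, h))                     # regression head
    return acc, cn
```

During training, gradients from both task losses flow into `w_shared`, which is how the related tasks regularise each other.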
We present a consistent embedding of the matter and gauge content of the Standard Model into an underlying asymptotically safe theory that has a well-determined interacting UV fixed point in the large color/flavor limit. The scales of symmetry breaking are determined by two mass-squared parameters with the breaking of electroweak symmetry being driven radiatively. There are no other free parameters in the theory apart from gauge couplings.
In this paper it will be shown that the standard model in 3+1 dimensions is a gauge fixed version of a 2T physics field theory in 4+2 dimensions, thus establishing that 2T physics provides a correct description of nature from the point of view of 4+2 dimensions. The 2T formulation leads to phenomenological consequences of considerable significance. In particular, the higher structure in 4+2 dimensions prevents the problematic F*F term in QCD. This resolves the strong CP problem without a need for the Peccei-Quinn symmetry or the corresponding elusive axion. Mass generation with the Higgs mechanism is less straightforward in the new formulation of the standard model, but its resolution leads to an appealing deeper physical basis for mass, coupled with phenomena that could be measurable. In addition, there are some brand new mechanisms of mass generation related to the higher dimensions that deserve further study. The technical progress is based on the construction of a new field theoretic version of 2T physics including interactions in an action formalism in d+2 dimensions. The action is invariant under a new type of gauge symmetry which we call 2T-gauge symmetry in field theory. This opens the way for investigations of the standard model directly in 4+2 dimensions, or from the point of view of various embeddings of 3+1 dimensions, by using the duality, holography, symmetry, and unifying features of 2T physics
Clark, Nathan E.
This paper explores from the view of the data recipient and user the complexities of creating a common licensing scheme for the access and use of satellite earth observation (EO) data in international disaster management (DM) activities. EO data contributions in major disaster events often involve numerous data providers with separate licensing mechanisms for controlling the access, uses, and distribution of data by the end users. A lack of standardization among the terminology, wording, and conditions within these licenses creates a complex legal environment for users, and often prevents them from using, sharing and combining datasets in an effective and timely manner. It also creates uncertainty among data providers as to the types of licensing controls that should be applied in disaster scenarios. This paper builds from an ongoing comparative analysis of the common and conflicting conditions among data licenses that must be addressed in order to facilitate easier access and use of EO data within the DM sector and offers recommendations towards the alignment of the structural and technical aspects of licenses among data providers.
Kosyakov, S.; Kowalkowski, J.; Litvintsev, D.; Lueking, L.; Paterno, M.; White, S.P.; Autio, Lauri; Blumenfeld, B.; Maksimovic, P.; Mathis, M.
A high-performance system has been assembled using standard web components to deliver database information to a large number of broadly distributed clients. The CDF Experiment at Fermilab is establishing processing centers around the world, imposing a high demand on their database repository. For delivering read-only data, such as calibrations, trigger information, and run conditions data, we have abstracted the interface that clients use to retrieve data objects. A middle tier is deployed that translates client requests into database-specific queries and returns the data to the client as XML datagrams. The database connection management, request translation, and data encoding are accomplished in servlets running under Tomcat. Squid proxy caching layers are deployed near the Tomcat servers, as well as close to the clients, to significantly reduce the load on the database and provide a scalable deployment model. Details of the system's construction and use are presented, including its architecture, design, interfaces, administration, performance measurements, and deployment plan.
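The middle tier's encoding step can be sketched as serialising query-result rows into an XML datagram. The element names and attribute layout below are illustrative only; the abstract does not specify the actual CDF schema:

```python
import xml.etree.ElementTree as ET

def rows_to_datagram(table, rows):
    """Encode query-result rows (list of dicts) as an XML datagram string.

    The <datagram>/<record> shape is a hypothetical example of the kind of
    payload a servlet could return to a caching proxy and client.
    """
    root = ET.Element("datagram", table=table)
    for row in rows:
        rec = ET.SubElement(root, "record")
        for col, val in row.items():
            ET.SubElement(rec, col).text = str(val)
    return ET.tostring(root, encoding="unicode")
```

Because the payload is plain XML over HTTP, intermediate Squid proxies can cache it without understanding the database behind it, which is the design choice the abstract highlights.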
Becker, G.; Hussels, U.; Epstein, S.; Rauzy, A.; Schubert, B.
In its present state, the Open PSA standard is helpful for determining the capabilities of the PSA approaches that were taken into account by those who formulated it. As soon as tools emerge that can automatically convert a given PSA into the standard form, the data will become accessible to other software tools, which may either supplement the original one or serve quality control. Considering that a PSA model represents a value of some two to ten person-years (depending on its level of completeness and level of detail), it is important to hold the data in a transparent form that does not depend on proprietary formats and can thus be used for more purposes than those implemented in a given PSA code. (orig.)
Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola
The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. The quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such low concentration levels that chemical analysis is still difficult, and further research is needed to improve the sensitivity, accuracy and precision of existing methodologies. Within the framework of the joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrology and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L⁻¹ as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L⁻¹ as cation). The LOQ of the methodology was 0.06 ng L⁻¹, and the average measurement uncertainty at the LOQ was 36%, which agrees with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainty, as well as the interlaboratory comparison results, are discussed in detail. Copyright © 2016 Elsevier B.V. All rights reserved.
Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.
The production of clinical information about each patient is constantly increasing, and it is noteworthy that this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on the efficiency, cost-effectiveness, quality and safety of medical care delivery. In developing countries, however, the utilisation of health information technology is insufficient and lacks standards, among other problems. In the present work we evaluate the framework EHRGen, based on the openEHR standard, as a means of generating and making available patient-centered information. The framework has been evaluated through the tools provided for end users, that is, without the intervention of computer experts. It makes the openEHR ideas easier to adopt and provides an open-source basis with a set of services, although some limitations in its current state work against interoperability and usability. Nevertheless, despite the described limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and as such it should be supported by academic and administrative institutions.
Willson, Gloria; Angell, Katelyn
The authors developed a rubric for assessing undergraduate nursing research papers for information literacy skills critical to their development as researchers and health professionals. We developed a rubric mapping six American Nurses Association professional standards onto six related concepts of the Association of College & Research Libraries (ACRL) Framework for Information Literacy for Higher Education. We used this rubric to evaluate fifty student research papers and assess inter-rater reliability. Students tended to score highest on the "Information Has Value" dimension and lowest on the "Scholarship as Conversation" dimension. However, we found a discrepancy between the grading patterns of the two investigators, with inter-rater reliability being "fair" or "poor" for all six rubric dimensions. The development of a rubric that dually assesses information literacy skills and maps relevant disciplinary competencies holds potential. This study offers a template for a rubric inspired by the ACRL Framework and outside professional standards. However, the overall low inter-rater reliability demands further calibration of the rubric. Following additional norming, this rubric can be used to help students identify the key information literacy competencies that they need in order to succeed as college students and future nurses. These skills include developing an authoritative voice, determining the scope of their information needs, and understanding the ramifications of their information choices.
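Inter-rater agreement of the kind reported above is commonly quantified with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch for two raters scoring the same set of papers (the category labels and scores below are illustrative, not the study's data):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical scores on the same items.

    r1, r2: equal-length lists of category labels.
    Returns 1.0 when chance agreement is already perfect (both raters
    constant and identical), to avoid a zero denominator.
    """
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return 1.0 if pe == 1 else (po - pe) / (1 - pe)
```

Values near 0 or below correspond to the "fair" or "poor" reliability bands mentioned in the abstract, signalling that the rubric needs further norming.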
active component accessions, between 2008 and 2013, 13% were accessed with a history of medical disqualification that was either remediated prior to...between 2008 and 2013, 15% accessed with a history of previous medical disqualification and 1% were hospitalized in the first year of service. About...include achievement, adjustment, attention-seeking, cooperation, dominance, even-temperedness, generosity, intellectual efficiency, non-delinquency
Neto, A. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisboa (Portugal); Fernandes, H. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisboa (Portugal)], E-mail: email@example.com; Alves, D.; Valcarcel, D.F.; Carvalho, B.B.; Ferreira, J. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisboa (Portugal); Vega, J.; Sanchez, E.; Pena, A. [Asociacion Euratom/CIEMAT para Fusion, Madrid (Spain); Hron, M. [Asociace EURATOM IPP.CR, Prague (Czech Republic); Varandas, C.A.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisboa (Portugal)
Each EURATOM Association stores data using proprietary schemes, usually developed by the research unit or based on third-party software. The temporary exchange of researchers between laboratories is common practice nowadays, and when a researcher returns to the home laboratory there is usually a need to follow up on the work. The amount of available data is becoming enormous, and the main data index is changing from shot number to time and events, where the pulse number is just one of the most relevant events against which data are catalogued. These difficulties can be overcome by using a common software layer between end users and laboratories. The components needed to create this software abstraction layer between users and laboratory data have already been developed using a universal and well-known remote procedure call (RPC) standard based on the eXtensible Markup Language (XML): XML-RPC. The library allows data retrieval using the same methods for all associations. Users are authenticated through the PAPI system (http://papi.rediris.es), allowing each organisation to use its own authentication scheme. Presently there are library and server implementations in Java and C++. These libraries have been included and tested in some of the most common data analysis programs, such as MATLAB and IDL. The system is already being used in ISTTOK/PT and CASTOR/CZ.
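The "same methods for all associations" idea maps directly onto XML-RPC: every laboratory exposes an identical method signature, so one client works against any endpoint. A minimal Python sketch with a throwaway in-process server standing in for a laboratory (the method name `getData` and its arguments are illustrative, not the real shared API):

```python
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

# Stand-in for one laboratory's server: register the shared method.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(
    lambda shot, signal: {"shot": shot, "signal": signal, "samples": [0.1, 0.2]},
    "getData",
)
port = server.server_address[1]
threading.Thread(target=server.handle_request, daemon=True).start()

# Client side: identical code would talk to any association's endpoint,
# only the URL changes.
proxy = xmlrpc.client.ServerProxy(f"http://127.0.0.1:{port}")
data = proxy.getData(42, "mirnov_01")
```

Because XML-RPC marshals plain XML over HTTP, the same client works from Java, C++, MATLAB or IDL wrappers, which is the portability the abstract emphasises.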
Addressing legal and political barriers to global pharmaceutical access: options for remedying the impact of the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) and the imposition of TRIPS-plus standards.
Cohen-Kohler, Jillian Clare; Forman, Lisa; Lipkus, Nathaniel
Despite myriad programs aimed at increasing access to essential medicines in the developing world, the global drug gap persists. This paper focuses on the major legal and political constraints preventing implementation of coordinated global policy solutions - particularly, the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) and bilateral and regional free trade agreements. We argue that several policy and research routes should be taken to mitigate the restrictive impact of TRIPS and TRIPS-plus rules, including greater use of TRIPS flexibilities, advancement of human rights, an ethical framework for essential medicines distribution, and a broader campaign that debates the legitimacy of TRIPS and TRIPS-plus standards themselves.
RDA (Resource Description and Access) is going to bring about a great change. Its guidelines - rather than rules - are addressed to anyone who wishes to describe and make accessible a cultural heritage collection, or indeed any collection: librarians, archivists, curators and professionals in any other branch of knowledge. The work is organised in two parts: the former contains the theoretical foundations of cataloguing (FRBR, ICP, semantic web and linked data), the latter a critical presentation of the RDA guidelines. RDA aims to make possible the creation of well-structured metadata for any kind of resource, reusable in any context and technological environment. RDA offers a "set of guidelines and instructions to create data for discovery of resources". The guidelines stress four actions - to identify, to relate (from the FRBR/FRAD user tasks and ICP), to represent and to discover - and a noun: resource. To identify the entities of Group 1 and Group 2 of FRBR (Work, Expression, Manifestation, Item, Person, Family, Corporate Body); to relate the entities of Group 1 and Group 2 of FRBR by means of relationships; to enable users to represent and discover the entities of Group 1 and Group 2 by means of their attributes and relationships. These last two actions are the reason for users' searches, and users are the focus of the process. RDA enables the discovery of recorded knowledge, that is, any resource conveying information, any resource transmitting intellectual or artistic content by means of any kind of carrier and media. RDA is a content standard, not a display standard nor an encoding standard: it gives instructions to identify data and is not concerned with how the data produced by the guidelines are displayed or encoded. RDA requires an original approach, a metanoia, a deep change in the way we think about cataloguing. The innovations in RDA are many: it promotes interoperability between catalogs and other search tools, it adopts terminology and concepts of the Semantic Web, it
Tobias, Karen Marie
An analysis of curriculum frameworks from the fifty states to ascertain compliance with the National Science Education Standards for integrating Science-Technology-Society (STS) themes is reported in this dissertation. Science standards for all fifty states were analyzed to determine whether the STS criteria were integrated at the elementary, middle, and high school levels of education. The analysis determined the compliance level for each state, then compared the educational levels to see whether compliance was similar across them. Compliance is important because research shows that using STS themes in the science classroom increases students' understanding of the concepts, increases students' problem-solving skills, increases students' self-efficacy with respect to science, and students instructed using STS themes score well on science high-stakes tests. The two hypotheses for this study are: (1) There is no significant difference in the degree of compliance with Science-Technology-Society themes (derived from the National Science Education Standards) between the elementary, middle, and high school levels. (2) There is no significant difference in the degree of compliance with Science-Technology-Society themes (derived from the National Science Education Standards) between the elementary, middle, and high school levels when examined individually. The Analysis of Variance F ratio was used to determine the variance between and within the three educational levels; this analysis addressed hypothesis one. The Analysis of Variance failed to reject the null hypothesis, meaning there is no significant difference in compliance with STS themes between the elementary, middle and high school educational levels. The Chi-Square test was the statistical analysis used to compare the educational levels for each individual criterion; this analysis addressed hypothesis two. The Chi-Square results showed that none of the states were equally compliant with each
Robert P. Evans
Cyber security standards, guidelines, and best practices for control systems are critical requirements that have been delineated and formally recognized by industry and government entities. Cyber security standards provide a common language within the industrial control system community, both national and international, to facilitate understanding of security awareness issues but, ultimately, they are intended to strengthen cyber security for control systems. This study and the preliminary findings outlined in this report are an initial attempt by the Control Systems Security Center (CSSC) Standard Awareness Team to better understand how existing and emerging industry standards, guidelines, and best practices address cyber security for industrial control systems. The Standard Awareness Team comprised subject matter experts in control systems and cyber security technologies and standards from several Department of Energy (DOE) National Laboratories, including Argonne National Laboratory, Idaho National Laboratory, Pacific Northwest National Laboratory, and Sandia National Laboratories. This study was conducted in two parts: a standard identification effort and a comparison analysis effort. During the standard identification effort, the Standard Awareness Team conducted a comprehensive open-source survey of existing control systems security standards, regulations, and guidelines in several of the critical infrastructure (CI) sectors, including the telecommunication, water, chemical, energy (electric power, petroleum and oil, natural gas), and transportation--rail sectors and sub-sectors. During the comparison analysis effort, the team compared the requirements contained in selected, identified industry standards with the cyber security requirements in the "Cyber Security Protection Framework", Version 0.9 (hereafter referred to as the "Framework"). For each of the seven sector/sub-sectors listed above, one standard was
An international frequency comparison was carried out at the Bundesamt für Eich- und Vermessungswesen (BEV), Vienna, within the framework of EUROMET Project #498 from August 29 to September 5, 1999. The frequency differences obtained when the RO.1 laser from the National Institute for Laser, Plasma and Radiation Physics (NILPRP), Romania, was compared with five lasers from Austria (BEV1), the Czech Republic (PLD1), France (BIPM3), Poland (GUM1) and Hungary (OMH1) are reported. Frequency differences were computed using the matrix determinations for the group d, e, f, g. Considering the frequency differences measured for a group of three lasers compared with each other, we call the closing frequency the difference between the measured and expected frequency difference (resulting from the previous two measurements). For the RO.1 laser, when the BIPM3 laser was the reference, the closing frequencies range from +8.1 kHz to -3.8 kHz. The relative Allan standard deviation was used to express the frequency stability; it was 3.8 parts in 10¹² for a 100 s sampling time and a 14000 s measurement duration. The averaged offset frequency relative to the BIPM4 stationary laser was 5.6 kHz and the standard deviation was 9.9 kHz.
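The Allan standard deviation quoted above is, for fractional-frequency samples averaged over intervals of length tau, the square root of half the mean squared difference between successive interval averages. A minimal non-overlapping estimator (the input samples are illustrative; real data would be the beat-frequency record):

```python
import math

def allan_deviation(y, m=1):
    """Non-overlapping Allan deviation of fractional-frequency samples y,
    for an averaging interval of m basic sample periods.

    sigma_y(tau) = sqrt( mean((ybar[k+1] - ybar[k])**2) / 2 )
    """
    nbins = len(y) // m
    ybar = [sum(y[i * m:(i + 1) * m]) / m for i in range(nbins)]
    diffs = [(ybar[k + 1] - ybar[k]) ** 2 for k in range(nbins - 1)]
    return math.sqrt(sum(diffs) / (2 * (nbins - 1)))
```

Evaluating the estimator over a range of m values gives the familiar stability-versus-averaging-time plot from which a single figure such as 3.8 parts in 10¹² at 100 s is read off.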
Palalloi, Irfan Andi; Anwar, Azwar; Syarifuddin
Most of the people of Majene regency work predominantly as fishermen, and by and large they work from the hereditary experience of their ancestors. This is shown by fishery industry statistics, which are the highest of any industry, accounting for 18.30% of the distribution of the gross regional domestic product. In each specific case, the utilisation of technology becomes a necessity that plays a key role. Adoption of technology by fishermen's groups in the use of GPS equipment has frequently been promoted by the government and by non-profit organisations through training and mentoring. Nowadays several modern mobile applications have been developed by government agencies to assist groups of fishermen in managing their fishing activity, such as ZPPI data rows from the Lapan satellite, Nelpin (also known as smart fisheries), and infrastructure development for space oceanography (INDESO). However, all of them carry risks and problems on the user side, one of which relates to accuracy and reliability. In this research, we elaborate the technical and governance factors through the COBIT framework and analyse the best-practice standard for implementing the technology. All in all, the result presents the governance standard for controlling and implementing technology under the customer dimension of information technology governance, in the standard process to ensure benefit delivery for implementing a mobile fishery application in DKP Majene regency.
T. E. Wong
Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regard to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.
Hutchings, Jeremy; Corr, Susan
The paper describes how specific descriptors for the Conservation-Restoration profession have been developed by the European Confederation of Conservator-Restorers' Organizations. The result is in accordance with the threefold rubric of Knowledge, Skills and Competence as defined by the European Qualifications Framework. Instead of giving…
Vries, H. de; Riet, J.P. van 't; Panday, S.; Reubsaet, A.
This study analyzed possibilities to access European adolescents for tobacco control activities in out-of-school settings as part of comprehensive tobacco control programs. Data on leisure time behaviors of secondary school students were gathered during three waves from six European Union countries
Ong, Jane Jun-Xin; Steele, Catriona M; Duizer, Lisa M
Sensory characteristics are important for the acceptance of thickened liquids, but those of liquids thickened to the new standards put forth by the International Dysphagia Diet Standardization Initiative (IDDSI) are unknown. This research sought to identify and rate the perception of important sensory properties of liquids thickened to levels specified in the IDDSI framework. Samples were made with water, with and without added barium sulfate, and were thickened with a cornstarch or xanthan gum based thickener. Samples were characterized using projective mapping/ultra-flash profiling to identify important sample attributes, and then with trained descriptive analysis panels to characterize those attributes in non-barium and barium thickened liquids. Three main groups of attributes were observed. Taste and flavor attributes decreased in intensity with increasing thickener. Thickener specific attributes included graininess and chalkiness for the cornstarch thickened samples, and slipperiness for the xanthan gum samples. Within the same type of thickener, ratings of thickness-related attributes (perceived viscosity, adhesiveness, manipulation, and swallowing) at different IDDSI levels were significantly different from each other. However, in non-barium samples, cornstarch samples were perceived as thicker than xanthan gum samples even though they had similar apparent viscosities at 50 s⁻¹. On the other hand, the two thickeners had similar perceived thickness in the barium samples even though the apparent viscosities of cornstarch samples were higher than those of the xanthan gum samples. In conclusion, IDDSI levels can be distinguished based on sensory properties, but these properties may be affected by the type of thickener and medium being thickened.
Borrás-Pascual, Maria Josep; Busquets-Font, Josep Maria; García-Martínez, Anna; Manent-González, Martí
The Constitution and especially the Constitutional Court's jurisprudence have recognized the so-called right of habeas data, providing legal protection of personal data at the highest level. Health information falls within the scope of this protection, but there are peculiarities in health legislation and its development that compel us to treat such information as having special characteristics. This article will review citizens' rights of access to health information, taking into account both the protection of personal data and the regulation of access to specific health information, as well as the tools that have been developed for the exercise of these rights under the "Shared Medical Record" project developed by the Department of Health of the Generalitat of Catalonia. In particular, the rights discussed are: the right of access to information, the right of rectification, and the right of cancellation. The right of access to information enables anyone to know whether their personal data are processed, the purpose of the processing and the available information on the origin of the personal data. In addition, the law also allows one to know whether the data have been disclosed to a third party. The right of rectification gives the data subject (in this case the patient) the right to correct any data that contain errors. The right of cancellation is restricted to situations where what is really being exercised is a right of rectification of the information. Finally, the right to object allows patients to oppose their health data being consulted by health care facilities other than those that generated them. 2010 Elsevier España S.L. All rights reserved.
The purpose of this report is to gain an understanding of the governance arrangements, procedures, and capacity for setting auditing standards in a jurisdiction, covering: (a) the adoption of International Standards on Auditing (ISA) where applicable, and (b) national auditing standards. The questions are based on examples of good practice followed by international standard-setting bodies....
Access to health care is a major requirement in improving health and fostering socioeconomic development. In the People's Republic of China (P.R. China), considerable changes have occurred in the social, economic, and health systems with a shift from a centrally planned to a socialist market economy. This brought about great benefits and new challenges, particularly for vertical disease control programs, including schistosomiasis. We explored systemic barriers in access to equitable and effective control of schistosomiasis. Between August 2002 and February 2003, 66 interviews with staff from anti-schistosomiasis control stations and six focus group discussions with health personnel were conducted in the Dongting Lake area, Hunan Province. Additionally, 79 patients with advanced schistosomiasis japonica were interviewed. The health access livelihood framework was utilized to examine availability, accessibility, affordability, adequacy, and acceptability of schistosomiasis-related health care. We found sufficient availability of infrastructure and human resources at most control stations. Many patients with advanced schistosomiasis resided in non-endemic or moderately endemic areas, however, with poor accessibility to disease-specific knowledge and specialized health services. Moreover, none of the patients interviewed had any form of health insurance, resulting in high out-of-pocket expenditure or unaffordable care. Reports on the adequacy and acceptability of care were mixed. There is a need to strengthen health awareness and schistosomiasis surveillance in post-transmission control settings, as well as to reduce diagnostic and treatment costs. Further studies are needed to gain a multi-layered, in-depth understanding of remaining barriers, so that the ultimate goal of schistosomiasis elimination in P.R. China can be reached.
The significance of private standards and associated local-level initiatives in agri-food value chains is increasingly recognised. However, whilst issues related to compliance and impact at the smallholder or worker level have frequently been analysed, the governance implications, in terms of how private standards affect national-level institutions (public, private and non-governmental), have had less attention. This article applies an extended value chain framework for crit...
There is a clear need to measure the correct implementation of the European Framework through the employability of the alumni. The evaluation of the deployment of the Qualifications Frameworks in the European Higher Education Area (QF-EHEA/QF) should shed significant light on the action that must be taken by legislators and higher education managers to foster employability and guarantee the sustainability of the EHEA. We propose a methodology based on a Survey on Access to the Labour Market (SALM) to assess the correlation between the education provided to the students and the practical utility of the knowledge acquired in the workplace. A questionnaire has been produced to measure the competencies and descriptors that had been theoretically defined within the QF-EHEA. Fifteen questions were disguised so that the six QF-EHEA descriptors were quantified through the difference between education and utility. The quantification methodology for the framework has been tested successfully on the former students of a higher education center in Spain. In this center, the alumni perceived that the utility of their acquired competencies and their employability level was greater than their education content, while both levels were reasonably high. The results hold for both Bachelor's and Master's degrees.
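The quantification above scores each QF-EHEA descriptor as the difference between the utility perceived in the workplace and the education received. A minimal sketch of that computation (descriptor names, scale and function name are illustrative, not the survey's actual items):

```python
def descriptor_gaps(education, utility):
    """Per-descriptor gap between perceived workplace utility and
    education received (both rated on the same scale). A positive
    gap means utility exceeded the education content."""
    return {name: utility[name] - education[name] for name in education}

# Example: two hypothetical descriptors rated on a 1-5 scale.
gaps = descriptor_gaps(
    education={"knowledge": 4.0, "applying_knowledge": 3.0},
    utility={"knowledge": 4.5, "applying_knowledge": 2.5},
)
```

On these toy ratings, "knowledge" shows a positive gap (utility above education) and "applying_knowledge" a negative one.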
In light of the continuing spread of HIV infection and the devastating impact of the disease on lives, communities, and economies, particularly in the developing world, the investment in new treatments, vaccines, and microbicides has clearly been inadequate. Efforts must be intensified to develop effective HIV vaccines and to ensure that they are accessible to people in all parts of the world. This article is a summary of a paper by Sam Avrett presented at "Putting Third First: Vaccines, Access to Treatments and the Law," a satellite meeting held at Barcelona on 5 July 2002 and organized by the Canadian HIV/AIDS Legal Network, the AIDS Law Project, South Africa, and the Lawyers Collective HIV/AIDS Unit, India. In the article, Avrett calls for immediate action to increase commitment and funding for HIV vaccines, enhance public support and involvement, accelerate vaccine development, and plan for the eventual delivery of the vaccines. The article briefly outlines steps that governments need to take to implement each of these objectives. The article also provides a menu of potential actions for vaccine advocates to consider as they lobby governments.
Zhao, Changming; Dai, Xinyao; Yao, Tao; Chen, Wenxing; Wang, Xiaoqian; Wang, Jing; Yang, Jian; Wei, Shiqiang; Wu, Yuen; Li, Yadong
Single-atom catalysts often exhibit unexpected catalytic activity for many important chemical reactions because of their unique electronic and geometric structures with respect to their bulk counterparts. Herein we adopt metal-organic frameworks (MOFs) to assist the preparation of a catalyst containing single Ni sites for efficient electroreduction of CO2. The synthesis is based on ionic exchange between Zn nodes and adsorbed Ni ions within the cavities of the MOF. This single-atom catalyst exhibited an excellent turnover frequency for electroreduction of CO2 (5273 h⁻¹), with a Faradaic efficiency for CO production of over 71.9% and a current density of 10.48 mA cm⁻² at an overpotential of 0.89 V. Our findings present some guidelines for the rational design and accurate modulation of nanostructured catalysts at the atomic scale.
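For context, the Faradaic efficiency reported above is the fraction of the total charge passed that went into the two-electron CO2-to-CO reduction. A minimal sketch of the calculation (function name and example values are illustrative, not the paper's data):

```python
F = 96485.0  # Faraday constant, C/mol of electrons

def faradaic_efficiency(mol_co, total_charge_c, n_electrons=2):
    """Fraction of the total charge consumed by CO production.
    CO2 -> CO is a 2-electron reduction, so the charge attributable
    to CO is n_electrons * F * mol_co."""
    return n_electrons * F * mol_co / total_charge_c

# Example: if exactly 2 F of charge produced 0.72 mol of CO,
# the Faradaic efficiency for CO is 72%.
fe = faradaic_efficiency(0.72, 2 * F)
```

Turnover frequency is computed analogously, as moles of product per mole of active Ni sites per hour.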
Klein, Gary M.
Online public access catalogs from 67 libraries using NOTIS software were searched using Internet connections to determine the positional operators selected as the default keyword operator on each catalog. Results indicate the lack of a processing standard for keyword searches. Five tables provide information. (Author/AEF)
and used by people with varying abilities. Although accessibility concerns are aimed at making systems usable for people with disabilities, support for direct accessibility, the built-in redundancies in an application that enable as many people as possible...
Efstathiou, Nectarios; Skitsas, Michael; Psaroudakis, Chrysostomos; Koutras, Nikolaos
Nowadays, video surveillance cameras are used for the protection and monitoring of a huge number of facilities worldwide. An important element in such surveillance systems is the use of aerial video streams originating from onboard sensors located on Unmanned Aerial Vehicles (UAVs). Video surveillance using UAVs represents a vast amount of video to be transmitted, stored, analyzed and visualized in real time. As a result, the introduction and development of systems able to handle huge amounts of data becomes a necessity. In this paper, a new approach for the collection, transmission and storage of aerial videos and metadata is introduced. The objective of this work is twofold. First, the integration of the appropriate equipment in order to capture and transmit real-time video including metadata (i.e. position coordinates, target) from the UAV to the ground and, second, the utilization of the ADITESS Versatile Media Content Management System (VMCMS-GE) for storing the video stream and the appropriate metadata. Beyond storage, VMCMS-GE provides other efficient management capabilities such as searching and processing of videos, along with video transcoding. For the evaluation and demonstration of the proposed framework we execute a use case where the surveillance of critical infrastructure and the detection of suspicious activities are performed. Collected video transcoding is a subject of this evaluation as well.
Klug, Hermann; Kmoch, Alexander
Transboundary and cross-catchment access to hydrological data is the key to designing successful environmental policies and activities. Electronic maps based on distributed databases are fundamental for planning and decision making in all regions and for all spatial and temporal scales. Freshwater is an essential asset in New Zealand (and globally) and the availability as well as accessibility of hydrological information held by or held for public authorities and businesses are becoming a crucial management factor. Access to and visual representation of environmental information for the public is essential for attracting greater awareness of water quality and quantity matters. Detailed interdisciplinary knowledge about the environment is required to ensure that the environmental policy-making community of New Zealand considers regional and local differences of hydrological statuses, while assessing the overall national situation. However, cross-regional and inter-agency sharing of environmental spatial data is complex and challenging. In this article, we firstly provide an overview of the state of the art standard compliant techniques and methodologies for the practical implementation of simple, measurable, achievable, repeatable, and time-based (SMART) hydrological data management principles. Secondly, we contrast international state of the art data management developments with the present status for groundwater information in New Zealand. Finally, for the topics (i) data access and harmonisation, (ii) sensor web enablement and (iii) metadata, we summarise our findings, provide recommendations on future developments and highlight the specific advantages resulting from a seamless view, discovery, access, and analysis of interoperable hydrological information and metadata for decision making.
Johan, Kartina; Mohd Turan, Faiz
Malaysian Engineering Accreditation (Engineering Programme Accreditation Manual, 2007) requires all bachelor's degree engineering programmes to incorporate a minimum of two months of industrial training in order for the programme to be accredited by the council. The industrial training has the objective of providing students with insights into being an engineer at the workplace, hence increasing their knowledge of employability skills prior to graduation. However, the current structure of industrial training is not able to inculcate good leadership ability or prepare students with the sustainability competencies needed in the era of Sustainable Development (SD). This paper aims to study project management methodology as a framework to create a training pathway in industrial training for students in engineering programmes, using the Green Project Management (GPM) P5 standard for sustainability in project management. The framework involves students as interns, supervisors from both university and industry, and also participation from a Non-Profit Organisation (NPO). The framework focuses on the development of the students' competency in employability skills, lean leadership and sustainability competencies using an experiential learning approach. Deliverables of the framework include an internship report, a professional sustainability report using the GPM P5 standard and a competency assessment. The post-industrial phase of the framework is constructed for students to be assessed collaboratively by the university, industry and sustainability practitioners in the country. The ability of the interns to act as change agents in sustainability practices is measured by the competency assessment and the quality of the sustainability report. The framework supports the call for developing holistic students based on the Malaysian Education Blueprint (Higher Education) 2015-2025 and addresses the gap between the status of engineering qualifications and the sustainability competencies in the 21st century in
Mussini, P.R.; Mussini, T.; Rondinini, S.
Recommended Reference Value Standards based on the potassium hydrogenphthalate buffer at various temperatures are reported for pH measurements in various binary solvent mixtures of water with eight organic solvents: methanol, ethanol, 2-propanol, 1,2-ethanediol, 2-methoxyethanol (''methylcellosolve''), acetonitrile, 1,4-dioxane, and dimethyl sulfoxide, together with a Reference Value Standard based on the potassium deuterium phthalate buffer for pD measurements in D₂O. In addition, Primary Standards for pH based on numerous buffers are reported in various binary solvent mixtures of water with methanol, ethanol, and dimethyl sulfoxide, together with Primary Standards for pD in D₂O based on the citrate, phosphate and carbonate buffers. (author)
Alfatih Alamin Elfaki
This study aimed to clarify the importance of having national standards and their role in achieving quality, as well as to establish a framework for the actual application of national standards in quality assurance so as to achieve quality in higher education institutions. The researchers followed a descriptive analytical method to achieve the objectives of the study and developed a questionnaire covering primary and secondary variables that have a role in the design of specific models to help in applying the national standards by the Sudanese universities. The questionnaire included one dependent variable, the effective application of national standards of quality assurance in higher education institutions, and four main independent variables: the national standards of quality assurance in higher education in Sudan, the standard of quality assurance, the standard of teaching and learning, and the standard of scientific research and publication. The study revealed a number of conclusions: there were statistically significant differences in the extent of familiarity with the national quality assurance standards in Sudan according to the academic rank of the faculty members; there were also significant differences in the extent of compliance with the national quality assurance standards in Sudan according to the academic rank of the faculty members; there was full agreement between the national standards for quality assurance in Sudan and the international standards for quality assurance; and there were statistically significant differences, according to the academic rank of the faculty members, in the view that the absence of specific models would have a negative impact on the effective application of national standards of quality assurance in higher education in Sudan. Keywords: Quality, The program, Standards, University, Total quality management.
against an enemy of the United States or against an opposing military force. It also includes any military operations that result in the retention of...active duty for that day, precluding them from doing anything else, including working their civilian jobs . While access problems were identified in...certified psychiatric nurse specialists, clinical psychologists, certified marriage and family therapists, pastoral counselors, and mental health
Gilmore, William; Mitsilegas, Valsamis
This article examines the evolution of the EU anti-money laundering legislative framework (which in recent years has also included measures to counter terrorist finance), by focusing in particular on recent legislation such as the third money laundering Directive and the Regulation on controls of cash entering the EU, both adopted in 2005. The analysis highlights the relationship between these instruments and international initiatives in the field (in particular FATF standards), and addresses...
The objective of the SAFESTAR project is the formulation of design standards or recommendations exclusively based on safety arguments. Workpackage 3 (WP3) of SAFESTAR, of which this report is the concluding report, should result in design recommendations for single and dual-carriageway express roads
The 2001 UNESCO Convention on the Protection of the Underwater Cultural Heritage is slowly but peremptorily becoming a standard reference tool for underwater archaeology and underwater cultural heritage management. The many provisions included within the Convention touch on many aspects that are key to an effective protection and promotion of the underwater cultural heritage. Within the web of these provisions many aspects are gaining consideration and driving research in underwater archaeology worldwide. These provisions, when seen within a wider frame of social, economic and technological dynamics, pinpoint many aspects requiring further scrutiny from the disciplinary circle. In the framework of the 2001 UNESCO Convention, this article will analyze the path traveled in technological acquisition in the practice of Italian underwater archaeology and how this has affected the approach to underwater cultural heritage management, particularly highlighting how this process has been further influenced by the adoption in 2001 of the Convention and Italy's ratification of it in 2010.
National Oceanic and Atmospheric Administration, Department of Commerce — This Atlas contains monthly climatic charts of temperature, salinity, and oxygen at the sea surface and at standard depth levels for the Bering Sea, Sea of Okhotsk,...
National Oceanic and Atmospheric Administration, Department of Commerce — NODC maintains data in three NODC Standard Format Marine Mammal Data Sets: Marine Mammal Sighting and Census (F127); Marine Mammal Specimens (F025); Marine Mammal...
The Virginia Department of Transportation (VDOT) Road Design Manual requires that new commercial entrances meet certain minimum spacing standards depending on a facility's speed limit and functional classification. Landowners, however, may request ...
..., documents, charts, posters, presentations (such as Microsoft PowerPoint), or video material that is specifically intended for publication on, or delivery via, an HHS-owned or -funded Web site, the Project... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide...
Zhang, Shaoru; Li, Xiaohong; Zhang, Tianhua; Wang, Xiangni; Liu, Weiping; Ma, Xuexue; Li, Yuelu; Fan, Yahui
The college student community is one at high risk of tuberculosis (TB). A systemic and standardized administration model for the prevention and control of TB is significant for controlling TB spread in universities. Currently, universities in China have not established a comprehensive and standardized administration system for TB prevention and control in the college student community. Firstly, literature research and the brainstorming method (n=13) were used to construct the clause and sub-clause pool for the administration of TB prevention and control within the college student community in 2014. Secondly, a total of twenty experts in the field of TB prevention and control, representing the east, west, south and north of China, were selected and invited to participate in the Delphi letter-inquiry. After two rounds of letter-inquiry, the opinions of the experts reached a consensus and the framework for the administration system was constructed. The framework included 8 first-class indexes, 26 second-class indexes and 104 third-class indexes. The results are highly scientific and reliable, and can be helpful for improving the systemic and standardized levels of administration of TB prevention and control in universities in China, and perhaps in other developing countries with a high TB burden as well.
Corvo, Arthur Francis
Given the reality that active and competitive participation in the 21st century requires American students to deepen their scientific and mathematical knowledge base, the National Research Council (NRC) proposed a new conceptual framework for K-12 science education. The framework consists of an integration of what the NRC report refers to as the three dimensions: scientific and engineering practices, crosscutting concepts, and core ideas in four disciplinary areas (physical, life and earth/space sciences, and engineering/technology). The Next Generation Science Standards (NGSS), which are derived from this new framework, were released in April 2013 and have implications for teacher learning and development in Science, Technology, Engineering, and Mathematics (STEM). Given the NGSS's recent introduction, there is little research on how teachers can prepare for its release. To meet this research need, I implemented a self-study aimed at examining my teaching practices and classroom outcomes through the lens of the NRC's conceptual framework and the NGSS. The self-study employed design-based research (DBR) methods to investigate what happened in my secondary classroom when I designed, enacted, and reflected on units of study for my science, engineering, and mathematics classes. I utilized various best practices including Learning for Use (LfU) and Understanding by Design (UbD) models for instructional design, talk moves as a tool for promoting discourse, and modeling instruction for these designed units of study. The DBR strategy was chosen to promote reflective cycles, which are consistent with and in support of the self-study framework. A multiple case, mixed-methods approach was used for data collection and analysis. The findings in the study are reported by study phase in terms of unit planning, unit enactment, and unit reflection. The findings have implications for science teaching, teacher professional development, and teacher education.
Jacques, David A; Guss, Jules Mitchell; Trewhella, Jill
Small-angle scattering is becoming an increasingly popular tool for the study of bio-molecular structures in solution. The large number of publications with 3D-structural models generated from small-angle solution scattering data has led to a growing consensus for the need to establish a standard reporting framework for their publication. The International Union of Crystallography recently established a set of guidelines for the necessary information required for the publication of such structural models. Here we describe the rationale for these guidelines and the importance of standardising the way in which small-angle scattering data from bio-molecules and associated structural interpretations are reported.
Ashish Ukidve; Ds S SMantha; Milind Tadvalkar
The Payment Card Industry Data Security Standard (PCI DSS) aims to enhance the security of cardholder data and is required when cardholder data or authentication data are stored, processed or transmitted. The implementation of enabling processes from COBIT 5 can complement compliance to PCI DSS. COBIT 5 assists enterprises in governance and management of enterprise IT and, at the same time, supports the need to meet security requirements with supporting processes and management activities. Th...
Helaine M. Alessio
Lifelong physical inactivity is associated with morbidity in adulthood, possibly influenced by changes in gene and protein expressions occurring earlier in life. mRNA (Affymetrix gene array) and proteomic (2D-DIGE MALDI-TOF/MS) analyses were determined in cardiac tissue of young (3 months) and old (16 months) Sprague-Dawley rats housed with no access to physical activity (SED) versus an exercise wheel (EX). Unfavorable phenotypes for body weight, dyslipidemia, and tumorigenesis appeared more often in adult SED versus EX. No differentially expressed genes (DEGs) occurred between groups at 3 or 16 months. Within groups, SED and EX shared 215 age-associated DEGs. In SED, ten unique DEGs occurred with age; three had cell adhesion functions (fn1, lgals3, ncam2). In EX, five unique DEGs occurred with age; two involved the hypothalamic, pituitary, and gonadal hormone axis (nrob2, xpnpep2). Protein expression involved in binding, sugar metabolic processes, and vascular regulation declined with age in SED (KNT1, ALBU, GPX1, PYGB, LDHB, G3P, PYGM, PGM1, ENOB). Protein expression increased with age in EX for ATP metabolic processes (MYH6, MYH7, ATP5J, ATPA) and vascular function (KNT1, ALBU, GPX1). Differences in select gene and protein expressions within sedentary and active animals occurred with age and contributed to distinct health-related phenotypes in adulthood.
Researchers and practitioners often use standardized vocabulary tests such as the Peabody Picture Vocabulary Test-4 (PPVT-4; Dunn and Dunn, 2007) and its companion, the Expressive Vocabulary Test-2 (EVT-2; Williams, 2007), to assess English vocabulary skills as an indicator of children's school readiness. Despite their psychometric excellence in the norm sample, issues arise when standardized vocabulary tests are used to assess children from culturally, linguistically and ethnically diverse backgrounds (e.g., Spanish-speaking English language learners) or delayed in some manner. One of the biggest challenges is establishing the appropriateness of these measures with non-English or non-standard English speaking children, as they often score one to two standard deviations below expected levels (e.g., Lonigan et al., 2013). This study re-examines the issues in analyzing the PPVT-4 and EVT-2 scores in a sample of 4-to-5-year-old low SES Hispanic preschool children who were part of a larger randomized clinical trial on the effects of a supplemental English shared-reading vocabulary curriculum (Pollard-Durodola et al., 2016). It was found that the data exhibited strong floor effects, and the presence of floor effects made it difficult to differentiate the intervention group from the control group on their vocabulary growth in the intervention. A simulation study is then presented under the multilevel structural equation modeling (MSEM) framework, and results revealed that in regular multilevel data analysis, ignoring floor effects in the outcome variables led to biased results in parameter estimates, standard error estimates, and significance tests. Our findings suggest caution in analyzing and interpreting scores of ethnically and culturally diverse children on standardized vocabulary tests (e.g., floor effects). It is recommended that appropriate analytical methods that take floor effects in outcome variables into account be considered.
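The bias from a floor effect can be illustrated with a toy simulation (all parameters are ours, not the study's): when a test cannot register scores below its floor, low scores pile up at that floor in both groups, and a real group difference is attenuated in the observed means.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_effect = 100_000, 5.0

# Latent vocabulary ability; the intervention shifts the mean by true_effect.
control = rng.normal(100.0, 15.0, n)
treated = rng.normal(100.0 + true_effect, 15.0, n)

# The test floor censors everything below it at the floor value.
floor = 100.0
obs_control = np.maximum(control, floor)
obs_treated = np.maximum(treated, floor)

# The naive difference in observed means underestimates the true effect.
naive_effect = obs_treated.mean() - obs_control.mean()
```

This is the mechanism behind the biased parameter estimates the simulation study reports; censored-data methods (e.g., Tobit-type models) are the usual remedy.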
Full Text Available As is the case in many countries, in Ethiopia human trafficking causes multi-dimensional harmful consequences for individuals. With a view to addressing the problem, in 2012 Ethiopia acceded to the Protocol to Prevent, Suppress and Punish Trafficking in Persons, Especially Women and Children, supplementing the United Nations Convention against Transnational Organized Crime. For the purpose of translating the requirements of the UN Trafficking Protocol into reality, the government has taken various steps, including legislative measures. Proclamation No. 909/2015 (Prevention and Suppression of Trafficking in Persons and Smuggling of Migrants Proclamation) is the most recent law adopted to deal with smuggling of migrants and human trafficking. The Proclamation comprises four key aspects: criminalization and prosecution; prevention; protection, rehabilitation and compensation; and cooperation. This article critically examines whether the criminalization and prosecution aspect of the Proclamation complies with international standards.
Alcantara, Luiz Carlos Junior; Cassol, Sharon; Libin, Pieter; Deforche, Koen; Pybus, Oliver G; Van Ranst, Marc; Galvão-Castro, Bernardo; Vandamme, Anne-Mieke; de Oliveira, Tulio
Human immunodeficiency virus type-1 (HIV-1), hepatitis B and C and other rapidly evolving viruses are characterized by extremely high levels of genetic diversity. To facilitate diagnosis and the development of prevention and treatment strategies that efficiently target the diversity of these viruses, and of other pathogens such as human T-lymphotropic virus type-1 (HTLV-1), human herpes virus type-8 (HHV8) and human papillomavirus (HPV), we developed a rapid, high-throughput genotyping system. The method involves the alignment of a query sequence with a carefully selected set of pre-defined reference strains, followed by phylogenetic analysis of multiple overlapping segments of the alignment using a sliding window. Each segment of the query sequence is assigned the genotype and sub-genotype of the reference strain with the highest bootstrap (>70%) and bootscanning (>90%) scores. Results from all windows are combined and displayed graphically using color-coded genotypes. The new Virus-Genotyping Tools provide accurate classification of recombinant and non-recombinant viruses and are currently being assessed for their diagnostic utility. They have been incorporated into several HIV drug resistance algorithms, including the Stanford (http://hivdb.stanford.edu) and two European databases (http://www.umcutrecht.nl/subsite/spread-programme/ and http://www.hivrdb.org.uk/), and have been successfully used to genotype a large number of sequences in these and other databases. The tools are a PHP/Java web application and are freely accessible on a number of servers, including: http://bioafrica.mrc.ac.za/rega-genotype/html/, http://lasp.cpqgm.fiocruz.br/virus-genotype/html/, http://jose.med.kuleuven.be/genotypetool/html/.
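The sliding-window assignment described above can be sketched as follows. This is an illustrative simplification: percent identity stands in for the phylogenetic bootstrap/bootscanning support the real tool computes, and the sequences, genotype labels, and thresholds are hypothetical.

```python
# Sketch of sliding-window genotype assignment. Percent identity is a
# stand-in for the phylogenetic bootstrap/bootscanning scores used by the
# real Virus-Genotyping Tools; sequences and labels are hypothetical.

def window_identity(a: str, b: str) -> float:
    """Fraction of matching, non-gap positions between equal-length segments."""
    matches = sum(1 for x, y in zip(a, b) if x == y and x != '-')
    return matches / len(a)

def genotype_windows(query, references, window=6, step=3, threshold=0.7):
    """Assign each window of an aligned query to the best-matching reference.

    `references` maps genotype label -> aligned reference sequence.
    Windows whose best score does not exceed `threshold` stay 'unassigned'.
    """
    calls = []
    for start in range(0, len(query) - window + 1, step):
        seg = query[start:start + window]
        best_label, best_score = 'unassigned', threshold
        for label, ref in references.items():
            score = window_identity(seg, ref[start:start + window])
            if score > best_score:
                best_label, best_score = label, score
        calls.append((start, best_label))
    return calls

refs = {
    'A': 'ACGTACGTACGT',
    'B': 'ACGTTTTTTTTT',
}
# A "recombinant" query: matches genotype A early, genotype B late
query = 'ACGTACTTTTTT'
print(genotype_windows(query, refs))  # -> [(0, 'A'), (3, 'unassigned'), (6, 'B')]
```

Combining the per-window calls into a color-coded track is then a display concern; the low-support middle window illustrates why the tool reports only calls above its support thresholds.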
DiFranza, J.; Savageau, J.; Bouchard, J.
OBJECTIVE—To determine if the standard compliance check protocol is a valid measure of the experience of underage smokers when purchasing tobacco in unfamiliar communities. SETTING—160 tobacco outlets in eight Massachusetts communities where underage tobacco sales laws are vigorously enforced. PROCEDURE—Completed purchase rates were compared between underage smokers who behaved normally and inexperienced non-smoking youths who were not allowed to lie or present proof of age (ID). RESULTS—The "smoker protocol" increased the likelihood of a sale nearly sixfold over that for the non-smokers (odds ratio (OR) 5.7, 95% confidence interval (CI) 1.5 to 22). When the youths presented an ID with an underage birth date, the odds of a completed sale increased dramatically (OR 27, 95% CI 3.4 to 212). Clerks judged to be under 21 years of age were seven times more likely to make an illegal sale (OR 7.6, 95% CI 2.4 to 24.0). CONCLUSIONS—Commonly used compliance check protocols are too artificial to reflect accurately the experience of underage smokers. The validity of compliance checks might be improved by having youths present ID, and by employing either tobacco users, or non-tobacco users who are sufficiently experienced to mimic the self-confidence exhibited by tobacco users in this situation. Consideration should be given to prohibiting the sale of tobacco by individuals under 21 years of age. Keywords: compliance check protocol; underage smokers PMID:11544386
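The odds ratios and 95% confidence intervals reported above follow from standard 2×2-table arithmetic (a Wald interval on the log odds ratio). The counts below are hypothetical, not the study's data:

```python
# How an odds ratio and Wald 95% CI are computed from a 2x2 table of
# exposure (rows) by outcome (columns). Counts are hypothetical.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for table [[a, b], [c, d]] = [[exposed sold, exposed refused],
    [unexposed sold, unexposed refused]]."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# e.g. 30 of 80 purchase attempts completed for smokers vs 8 of 80 for non-smokers
or_, lo, hi = odds_ratio_ci(30, 50, 8, 72)
print(f"OR = {or_:.1f}, 95% CI {lo:.1f} to {hi:.1f}")  # OR = 5.4
```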
Park, Sohyun; Onufrak, Stephen; Wilking, Cara; Cradock, Angie
We examined community-level characteristics associated with free drinking water access policies in U.S. municipalities using data from a nationally representative survey of city managers/officials from 2,029 local governments in 2014. Outcomes were 4 free drinking water access policies. Explanatory measures were population size, rural/urban status, census region, poverty prevalence, education, and racial/ethnic composition. We used multivariable logistic regression to test differences and present only significant findings. Many (56.3%) local governments had at least one community plan with a written objective to provide free drinking water in outdoor areas; municipalities in the Northeast and South regions and municipalities with ≤ 50% non-Hispanic whites were less likely, and municipalities with larger populations more likely, to have such a plan. About 59% had policies/budget provisions for free drinking water in parks/outdoor recreation areas; municipalities in the Northeast and South regions were less likely, and municipalities with larger populations more likely, to have them. Only 9.3% provided development incentives for placing drinking fountains in outdoor, publicly accessible areas; municipalities with larger populations were more likely to provide them. Only 7.7% had a municipal plumbing code with a drinking fountain standard that differed from the statewide plumbing code; municipalities with a lower proportion of non-Hispanic whites were more likely to have one. In conclusion, over half of municipalities had written plans or a provision for providing free drinking water in parks, but development incentives or a local plumbing code provision were rare.
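A minimal sketch of the multivariable logistic regression used in the analysis above, fit by plain gradient descent on hypothetical data; a real analysis of survey data like this would use established statistical software and survey weights:

```python
# Toy multivariable logistic regression fit by gradient descent.
# Hypothetical data: does large population (x1) or a region indicator (x2,
# coded 0/1) predict having a water-access policy (y)?
import math

def fit_logit(X, y, lr=0.1, iters=5000):
    """Fit weights (intercept first) for P(y=1) = sigmoid(w . [1, x...])."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + math.exp(-z))           # predicted probability
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(y) for wj, g in zip(w, grad)]
    return w

# Large municipalities (x1 = 1) more often have the policy in this toy data
X = [[1, 0], [1, 1], [1, 0], [0, 0], [0, 1], [0, 0]]
y = [1, 1, 1, 0, 0, 1]
w = fit_logit(X, y)
print(w)  # positive w[1] -> larger population raises the odds of the policy
```

The sign and magnitude of each coefficient are what the abstract summarizes as "more likely"/"less likely"; exponentiating a coefficient gives the corresponding adjusted odds ratio.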
Olafsson, G; Sigurdsson, J A
To examine the access, workload, duties, commitments and quality standards of primary care physicians (GPs) resulting from out-of-hours service. All GPs (n = 96) in rural Iceland. Answers to a postal survey. The participation rate was 80%. The GPs estimated that in 97% of the cases they could be contacted within 5 minutes in an emergency. Under usual circumstances (weather conditions) and within a distance of 10 km, 70% of them could reach the patient within 30 minutes of receiving the call. In severe weather conditions, 50% of the GPs in smaller districts (650-6000 inhabitants) estimated that it could take up to 5 hours or more to reach the patient (which could happen once a year). In the least populated districts, 84% of the GPs had to be on call 14 days or more per month. Serious emergencies (involving special training such as cardiac resuscitation or tracheal intubation) were relatively rare, and GPs expressed the necessity for regular refresher courses in such fields. Modern telecommunication networks guarantee good access to out-of-hours service. The workload and on-call duties are great and do not comply with European Union (EU) recommendations regarding minimal rest time. If GPs in rural areas are to be expected to provide frontline health care, including in severe emergency situations, regular training courses are needed.
Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram
Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
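One step described above, condensing a 2D noise-power spectrum (NPS) to a radially averaged 1D profile, can be sketched as below. The noise image is synthetic, and the normalization and detrending are simplified relative to the full ICRU Report 87 procedure:

```python
# Sketch: 2D NPS of a noise ROI, condensed to a radially averaged 1D profile.
# Synthetic white noise; normalization/detrending simplified vs. ICRU 87.
import numpy as np

def nps_2d(roi, pixel_spacing=1.0):
    """2D NPS of an ROI: |DFT|^2 scaled by pixel area / number of pixels."""
    roi = roi - roi.mean()                       # simple detrending
    dft = np.fft.fftshift(np.fft.fft2(roi))      # DC component to center
    return (np.abs(dft) ** 2) * pixel_spacing**2 / roi.size

def radial_average(nps, nbins=16):
    """Average a centered 2D spectrum over annuli of equal radial width."""
    ny, nx = nps.shape
    yy, xx = np.indices((ny, nx))
    r = np.hypot(yy - ny // 2, xx - nx // 2)     # radial frequency index
    bins = np.linspace(0, r.max(), nbins + 1)
    idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, nbins - 1)
    sums = np.bincount(idx, weights=nps.ravel(), minlength=nbins)
    counts = np.bincount(idx, minlength=nbins)
    return sums / counts

rng = np.random.default_rng(0)
roi = rng.normal(0.0, 5.0, size=(64, 64))        # white noise, sigma = 5
nps1d = radial_average(nps_2d(roi))
# For white noise the 1D profile is roughly flat near sigma^2 * pixel area
print(nps1d.round(1))
```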
Jacques David A
Full Text Available Abstract Small-angle scattering is becoming an increasingly popular tool for the study of bio-molecular structures in solution. The large number of publications with 3D-structural models generated from small-angle solution scattering data has led to a growing consensus for the need to establish a standard reporting framework for their publication. The International Union of Crystallography recently established a set of guidelines for the necessary information required for the publication of such structural models. Here we describe the rationale for these guidelines and the importance of standardising the way in which small-angle scattering data from bio-molecules and associated structural interpretations are reported.
Bellgard, Matthew I; Render, Lee; Radochonski, Maciej; Hunter, Adam
Information management systems are essential to capture data, be it for public health and human disease, sustainable agriculture, or plant and animal biosecurity. In public health, the term patient registry is often used to describe information management systems that record and track phenotypic data of patients. Appropriate design, implementation and deployment of patient registries enables rapid decision making and ongoing data mining, ultimately leading to improved patient outcomes. A major bottleneck encountered is the static nature of these registries: software developers are required to work with stakeholders to determine requirements, design the system, and implement the required data fields and functionality for each patient registry. Additional software developer time is required for ongoing maintenance and customisation. It is desirable to deploy a sophisticated registry framework that allows scientists and registry curators possessing standard computing skills to dynamically construct a complete patient registry from scratch and customise it for their specific needs, with little or no need to engage a software developer at any stage. This paper introduces our second generation open source registry framework, which builds on our previous rare disease registry framework (RDRF). This second generation RDRF is a new approach, as it empowers registry administrators to construct one or more patient registries without software developer effort. New data elements for a diverse range of phenotypic and genotypic measurements can be defined at any time. Defined data elements can then be utilised in any of the created registries. Fine grained, multi-level user and workgroup access can be applied to each data element to ensure appropriate access and data privacy. We introduce the concept of derived data elements to assist the data element standards communities in how such elements might best be categorised. We introduce the second generation RDRF that
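The core idea, curators defining data elements at runtime and assembling registries from them with element-level access control, can be sketched as follows. This is an illustrative toy, not the RDRF codebase; class and field names are hypothetical:

```python
# Illustrative sketch (not the RDRF codebase): data elements defined at
# runtime, reused across registries, with per-element access control.
class DataElement:
    def __init__(self, name, datatype, allowed_groups=("curator",)):
        self.name = name
        self.datatype = datatype                    # declared value type
        self.allowed_groups = set(allowed_groups)   # element-level access

class Registry:
    def __init__(self, name):
        self.name = name
        self.elements = {}
        self.records = []

    def add_element(self, element):
        """Attach an already-defined data element to this registry."""
        self.elements[element.name] = element

    def add_record(self, user_groups, **values):
        """Store a patient record, enforcing types and element-level access."""
        record = {}
        for key, value in values.items():
            elem = self.elements.get(key)
            if elem is None:
                raise KeyError(f"undefined data element: {key}")
            if not elem.allowed_groups & set(user_groups):
                raise PermissionError(f"no access to element: {key}")
            record[key] = elem.datatype(value)      # coerce to declared type
        self.records.append(record)
        return record

# A curator defines elements once; any registry can then reuse them.
age = DataElement("age", int)
genotype = DataElement("genotype", str, allowed_groups=("curator", "clinician"))
reg = Registry("myopathy")
reg.add_element(age)
reg.add_element(genotype)
print(reg.add_record(["curator"], age="42", genotype="hypothetical-variant"))
```

No developer effort is needed to add a new field: a curator constructs another `DataElement` and attaches it, which is the dynamic behaviour the abstract contrasts with static registries.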
Full Text Available This data article provides a data description for the article entitled “A framework for improving web accessibility and usability of Open Course Ware sites”. This Data in Brief presents the data obtained from the accessibility and usability evaluation of the UTPL OCW. The data obtained from the framework evaluation consist of the manual evaluation of the standards criteria and the automatic evaluation with the tools Google PageSpeed and Google Analytics. In addition, this article presents the synthesized tables from the standards that are used by the framework to evaluate the accessibility and usability of OCW, and the questionnaires required to extract the data. As a result, the article also provides the data required to reproduce the evaluation of other OCW.
Jackson, Mary Lou; Schoessow, Kimberly A.; Selivanova, Alexandra; Wallis, Jennifer
Purpose Both optical and electronic magnification are available to patients with low vision. Electronic video magnifiers are more expensive than optical magnifiers, but they offer additional benefits, including variable magnification and contrast. This study aimed to evaluate the effect of access to a video magnifier (VM) added to standard comprehensive vision rehabilitation (VR). Methods In this prospective study, 37 subjects with central field loss were randomized to receive standard VR (VR group, 18 subjects) or standard VR plus VM (VM group, 19 subjects). Subjects read the International Reading Speed Texts (IReST), a bank check, and a phone number at enrollment, at 1 month, and after occupational therapy (OT) as indicated to address patient goals. The Impact of Vision Impairment (IVI) questionnaire, a version of the Activity Inventory (AI), and the Depression Anxiety and Stress Scale (DASS) were administered at enrollment, 1 month, after OT, 1 month later, and 1 year after enrollment. Assessments at enrollment and 1 month later were evaluated. Results At 1 month, the VM group displayed significant improvement in reading continuous print as measured by the IReST (P = 0.01) but did not differ on IVI, AI, or DASS. From enrollment to 1 month, all subjects improved in their ability to spot read (phone number and check). The VM group improved more than the VR group in reading a number in a phone book at 1 month after initial consultation (P = 0.02). All reported better well-being (P = 0.02). Conclusions All subjects reported better well-being on the IVI. The VM group read faster and was better at two spot reading tasks but did not differ from the VR group in other outcome measures. PMID:28924412
The work of this working group yields some strong conclusions. Citizens' confidence in the availability of access to information must be reinforced. The existence of secrets protecting the interests of industrialists and of the Nation seems all the more legitimate when those secrets are clearly delimited. Respect for industrial and commercial secrecy does not stand in opposition to better access to nuclear safety documents. The defense secret is an indispensable element of nuclear safety, but its role and limits must be debated. (A.L.B.)
Full Text Available We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publically accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web based data access to perform searches for parts that are not currently possible.
Galdzicki, Michal; Rodriguez, Cesar; Chandran, Deepak; Sauro, Herbert M.; Gennari, John H.
We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publically accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate “promoter” parts that are known to be both negatively and positively regulated. This method provides new web based data access to perform searches for parts that are not currently possible. PMID:21390321
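The sample query described above, retrieving promoter parts that are both positively and negatively regulated, can be sketched as below. A plain list of triples stands in for the RDF store, and the part names, predicates, and type terms are hypothetical stand-ins for SBOL-semantic vocabulary; the real SBPkb is queried with SPARQL:

```python
# Sketch of the dual-regulation promoter query. Triples, part names, and
# predicate/type terms are hypothetical stand-ins for SBOL-semantic; a list
# of tuples stands in for an RDF store queried with SPARQL in the real SBPkb.
TRIPLES = [
    ("part:promA", "rdf:type", "sbol:Promoter"),
    ("part:promA", "sbol:regulation", "sbol:negative"),
    ("part:promA", "sbol:regulation", "sbol:positive"),
    ("part:promB", "rdf:type", "sbol:Promoter"),
    ("part:promB", "sbol:regulation", "sbol:positive"),
]

def objects(subject, predicate):
    """All objects of triples matching (subject, predicate, ?o)."""
    return {o for s, p, o in TRIPLES if s == subject and p == predicate}

def dual_regulated_promoters():
    """Promoter-typed parts carrying both positive and negative regulation."""
    promoters = {s for s, p, o in TRIPLES
                 if p == "rdf:type" and o == "sbol:Promoter"}
    wanted = {"sbol:positive", "sbol:negative"}
    return sorted(s for s in promoters
                  if wanted <= objects(s, "sbol:regulation"))

print(dual_regulated_promoters())  # -> ['part:promA']
```

In SPARQL the same pattern is a basic graph pattern with two `sbol:regulation` triples on the same subject plus the `rdf:type` constraint; the set-inclusion test above plays that role here.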
Social movements and access to assets and services. Proposal of an analytical framework based on a comparative analysis of cases: political confrontation of unemployed persons (Argentina) and occupancy of housing (Spain)
Full Text Available This work is the result of collaboration between researchers from two Academic Units of Social Work located in Spain and Argentina. The overall objective is to explore the analytical frameworks and empirical objects of research projects linked to social movements developed in both countries. Specifically, two phenomena that lead to social mobilization are compared: unemployment, through the territorial expressions of the unemployed (Córdoba, Argentina), and housing, through the occupation of buildings by people who are evicted (Seville, Spain). The work includes a description of the cases in each country and of the core concepts of analysis: community social innovation in the Spanish case of occupation of housing, and political confrontation in the Argentine case. Finally, we discuss some conclusions that present the proposed dimensions for an analytical framework that responds to the relationship between social movements and access to assets and services.
Full Text Available Development of image analysis and machine learning methods for segmentation of clinically significant pathology in retinal spectral-domain optical coherence tomography (SD-OCT), used in disease detection and prediction, is limited by the availability of expertly annotated reference data. Retinal segmentation methods use datasets that either are not publicly available, come from only one device, or use different evaluation methodologies, making them difficult to compare. We therefore present and evaluate a reference dataset, annotated by multiple experts, for the problem of intraretinal cystoid fluid (IRF) segmentation, a key indicator in exudative macular disease. In addition, a standardized framework for segmentation accuracy evaluation, applicable to other pathological structures, is presented. Integral to this work is the dataset used, which must be fit for purpose for IRF segmentation algorithm training and testing. We describe here a multivendor dataset comprised of 30 scans. Each OCT scan for system training has been annotated by multiple graders using a proprietary system. Evaluation of the intergrader annotations shows a good correlation, making the reproducibly annotated scans suitable for the training and validation of image processing and machine learning based segmentation methods. The dataset will be made publicly available in the form of a segmentation Grand Challenge.
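A segmentation accuracy evaluation of the kind standardized above typically reduces to overlap metrics between an algorithm's mask and a grader's annotation. The masks below are synthetic, and Dice/IoU are common defaults rather than necessarily the challenge's specific metrics:

```python
# Overlap metrics commonly used to score a predicted segmentation mask
# against a grader's annotation. Masks are synthetic toy examples.
import numpy as np

def dice(pred, truth):
    """Dice coefficient of two boolean masks (1.0 = perfect overlap)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 2.0 * inter / total if total else 1.0

def iou(pred, truth):
    """Intersection over union (Jaccard index) of two boolean masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    return np.logical_and(pred, truth).sum() / union if union else 1.0

truth = np.zeros((8, 8), dtype=bool)
truth[2:6, 2:6] = True           # 16-pixel "fluid region" from a grader
pred = np.zeros((8, 8), dtype=bool)
pred[3:7, 3:7] = True            # algorithm output, shifted by one pixel
print(dice(pred, truth), iou(pred, truth))  # -> 0.5625 0.391...
```

With multiple graders, the same metrics applied between grader pairs give the intergrader agreement the abstract reports, which bounds the accuracy one can meaningfully demand of an algorithm.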
Full Text Available and other development agencies focusing on the use of appropriate technologies for civil construction be it in the road, transport, water, agriculture or environmental sectors (World Bank 2001; ILO, 2002; Clegg, 2003). The seminal study conducted... and contribution of local resource based approach to rural and community access roads requires a review of the labour based initiatives in the road, transport and infrastructure sub-sector. An apt statement that captures the essence of building local community...
This report seeks opportunities for standardization of these data and explains findings on three principal tasks. First, it assesses the current state of standardized transportation data. By studying documentation of other programs of standardized da...
Full Text Available The United Nations Millennium Development Goals galvanized global efforts to alleviate the suffering of the world's poorest people through unprecedented public-private partnerships. Donor aid agencies have demonstrably saved millions of lives that might otherwise have been lost to disease through increased access to quality-assured vaccines and medicines. Yet the introduction of these health interventions in low- and middle-income countries (LMICs) continues to face a time lag due to factors which remain poorly understood. A recurring theme from our partnership engagements was that an optimized regulatory process would contribute to improved access to quality health products. Therefore, we investigated the current system for medicine and vaccine registration in LMICs as part of our comprehensive regulatory strategy. Here, we report a fact base of the registration timelines for vaccines and drugs used to treat certain communicable diseases in LMICs. We worked with a broad set of stakeholders, including the World Health Organization's prequalification team, national regulatory authorities, manufacturers, procurers, and other experts, and collected data on the timelines between first submission and last approval of applications for product registration in sub-Saharan Africa. We focused on countries with the highest burden of communicable disease and the greatest need for the products studied. The data showed a typical lag of 4 to 7 years between the first regulatory submission, which was usually to a regulatory agency in a high-income country, and the final approval in sub-Saharan Africa. Two of the three typical registration steps which products undergo before delivery in the countries involve lengthy timelines. Failure to leverage or rely on the findings from reviews already performed by competent regulatory authorities, disparate requirements for product approval by the countries, and lengthy timelines by manufacturers to respond to regulatory queries
Hyunsun Catherine Yoon
Full Text Available In recent years, arts organizations in the UK have faced challenging times due to severe funding cuts from government and depressed box office sales during the recession. In the UK’s current cultural policy, ‘social impacts’ of the arts are highly emphasized, and state interventions are intensified both in terms of finance and legitimacy. What is necessary for arts organizations to produce social impacts is their active provision of ‘deliberate extra activities’, which are generally conducted in the form of education, community, participation or outreach programs. The Royal Opera House (ROH) case study provides an apt example of how to exercise these activities effectively to deliver social impacts. Based on Rothschild’s theoretical Motivation, Opportunity, and Ability (MOA) framework, this study aims to find out how the Royal Opera House stimulated motivation among arts audiences and facilitated opportunities for them, thereby allowing them to translate motivation into action. The results show that the ROH implemented several specific strategies: ‘interest triggering’, ‘value creation and transmission’, ‘relationship building and management’, and education. These strategies can motivate potential arts consumers to become familiar with the classic arts, which lowers psychological barriers and stimulates intrinsic motivation to satisfy long-lasting and self-sustaining cultural needs.
Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian
GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes and/or analyzing data that other users have acquired with GLORIA, or from other free access databases, like the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA was based on the Marlin package developed for International Linear Collider (ILC) data analysis. HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited for GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and any additional output created is added to that collection. The advantage of such a modular approach is to keep things as simple as possible. Every single step of the full analysis chain, going e.g. from raw images to light curves, can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
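The Marlin-style processor chain described above can be sketched as follows: each processor reads from and appends to a shared data collection, so analysis steps stay independent and reorderable. The class names and toy pipeline are illustrative, not the actual Luiza/Marlin API:

```python
# Illustrative sketch of a Marlin-style processor chain (not the actual
# Luiza/Marlin API): each processor adds its output to a shared collection.
class Processor:
    def process(self, data: dict) -> None:
        raise NotImplementedError

class LoadImage(Processor):
    def process(self, data):
        data["raw"] = [1, 2, 3, 9, 2, 1]         # stand-in for pixel values

class SubtractBackground(Processor):
    def process(self, data):
        bg = min(data["raw"])                    # crude background estimate
        data["calibrated"] = [v - bg for v in data["raw"]]

class FindPeak(Processor):
    def process(self, data):
        data["peak"] = max(data["calibrated"])   # e.g. a light-curve point

def run_chain(processors):
    """Run processors in order over one shared data collection."""
    data = {}
    for proc in processors:
        proc.process(data)                       # each step appends output
    return data

result = run_chain([LoadImage(), SubtractBackground(), FindPeak()])
print(result["peak"])  # -> 8
```

Because every step's output lands in the same collection in a self-consistent form, a step can be swapped or rerun in isolation, which is the modularity the text credits to Marlin.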
Lykken, Joseph D.
- new directions for BSM model building. Contrary to popular shorthand jargon, supersymmetry (SUSY) is not a BSM model: it is a symmetry principle characterizing a BSM framework with an infinite number of models. Indeed we do not even know the full dimensionality of the SUSY parameter space, since this presumably includes as-yet-unexplored SUSY-breaking mechanisms and combinations of SUSY with other BSM principles. The SUSY framework plays an important role in BSM physics partly because it includes examples of models that are 'complete' in the same sense as the Standard Model, i.e. in principle the model predicts consequences for any observable, from cosmology to b physics to precision electroweak data to LHC collisions. Complete models, in addition to being more explanatory and making connections between diverse phenomena, are also much more experimentally constrained than strawman scenarios that focus more narrowly. One sometimes hears: 'Anything that is discovered at the LHC will be called supersymmetry.' There is truth behind this joke in the sense that the SUSY framework incorporates a vast number of possible signatures accessible to TeV colliders. This is not to say that the SUSY framework is not testable, but we are warned that one should pay attention to other promising frameworks, and should be prepared to make experimental distinctions between them. Since there is no formal classification of BSM frameworks I have invented my own. At the highest level there are six parent frameworks: (1) Terascale supersymmetry; (2) PNGB Higgs; (3) New strong dynamics; (4) Warped extra dimensions; (5) Flat extra dimensions; and (6) Hidden valleys. Here is the briefest possible survey of each framework, with the basic idea, the generic new phenomena, and the energy regime over which the framework purports to make comprehensive predictions.
There is a need for improved diagnosis and consistent, effectively communicated information, especially regarding medication. Parents made several suggestions for improving services: presentations about asthma at easily accessible community venues; an advice centre or telephone helpline to answer queries; opportunities for sharing experiences with other families; having information provided in South Asian languages; longer GP appointments; extended use of asthma nurses; and better education for healthcare professionals to ensure consistency of care and advice.
Hudson, Nicky; Culley, Lorraine; Johnson, Mark; McFeeters, Melanie; Robertson, Noelle; Angell, Emma; Lakhanpaul, Monica
For a theory as genuinely elegant as the Standard Model--the current framework describing elementary particles and their forces--it can sometimes appear to students to be little more than a complicated collection of particles and ranked list of interactions. The Standard Model in a Nutshell provides a comprehensive and uncommonly accessible introduction to one of the most important subjects in modern physics, revealing why, despite initial appearances, the entire framework really is as elegant as physicists say. Dave Goldberg uses a "just-in-time" approach to instruction that enables students to gradually develop a deep understanding of the Standard Model even if this is their first exposure to it. He covers everything from relativity, group theory, and relativistic quantum mechanics to the Higgs boson, unification schemes, and physics beyond the Standard Model. The book also looks at new avenues of research that could answer still-unresolved questions and features numerous worked examples, helpful illustrat...
National Oceanic and Atmospheric Administration, Department of Commerce — This Atlas, Climatic Atlas of the Sea of Azov 2008 on CD-ROM, is an update to Volume 10, Climatic Atlas of the Sea of Azov 2006 on CD-ROM (NODC Accession 0098572),...
National Oceanic and Atmospheric Administration, Department of Commerce — NODC Standard Marine Bird Sighting, Land Census (F034) is one of a group of seven datasets related to Marine Birds from Coastal Alaska and Puget Sound Data (1975...
National Oceanic and Atmospheric Administration, Department of Commerce — NODC maintains data in three NODC Standard Format Marine Mammal Data Sets: Marine Mammal Sighting and Census (F127); Marine Mammal Specimens (F025); Marine Mammal...
Many place-based accessibility studies ignore the time component. Relying on theoretical frameworks that treat the distance between two fixed points as constant, these methods ignore diurnal and seasonal changes in accessibility. Network distances between two nodes depend on the network structure and the weight distribution on the edges. These weights can change quite frequently, and the network structure itself is subject to modification because of the availability and unavailability of links ...
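The abstract's central point, that network distance varies when edge weights depend on time, can be sketched with an earliest-arrival variant of Dijkstra's algorithm. This is an illustrative reconstruction, not code from the study; the `cost_fn(t)` interface and the example network are invented for the sketch.

```python
import heapq

def time_dependent_dijkstra(graph, source, target, depart_time):
    """Earliest-arrival search on a network whose edge weights vary with time.

    graph maps node -> list of (neighbor, cost_fn) pairs, where cost_fn(t)
    returns the traversal cost when the edge is entered at time t.
    """
    best = {source: depart_time}
    heap = [(depart_time, source)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == target:
            return t
        if t > best.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, cost_fn in graph.get(node, []):
            arrival = t + cost_fn(t)
            if arrival < best.get(neighbor, float("inf")):
                best[neighbor] = arrival
                heapq.heappush(heap, (arrival, neighbor))
    return float("inf")

# Two routes A->B: a direct link that is slow at peak hours, and a detour via C.
peak = lambda t: 30 if 8 <= t % 24 < 10 else 10   # congested 08:00-10:00
flat = lambda t: 8                                 # constant-cost links
graph = {"A": [("B", peak), ("C", flat)], "C": [("B", flat)]}
```

Departing off-peak at 12:00 the direct link wins (arrival 22); departing at 08:00 the same query prefers the detour (arrival 24), which is exactly the diurnal effect a static shortest-path model misses.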
Brooks, Christopher; Lee, Edward A
.... The Pedigree Management and Assessment Framework (PMAF) enables the publisher of information to record standard pedigree, such as information about the source, manner of collection, and the chain of modification of that information...
Mireia Ribera Turró
Full Text Available Adobe PDF is one of the most widely used formats in scientific communications and in administrative documents. In its latest versions it has incorporated structural tags and improvements that increase its level of accessibility. This article reviews the concept of accessibility in the reading of digital documents and evaluates the accessibility of PDF according to the most widely established standards.
Broadband Access. Worldwide market for broadband access $30 Billion! Over 200 million broadband subscribers worldwide! Various Competing Broadband access. Digital Subscriber line; Wireless; Optical Fiber.
The purpose of this report is to gain an understanding of the financial reporting requirements for the banks in a jurisdiction in addition to or instead of the requirements for commercial enterprises in general. The term bank in this assessment is used to refer to institutions authorized to receive deposits and to lend money as defined by the legal framework in the jurisdiction. There are also ...
This is a presentation of the The Gender Dimensions Framework (GDF). The GDF was developed to provide guidance to USAID staff and partner organizations for working with USAID projects looking at promoting equitable opportunities in agricultural value chains. The GDF contemplates four dimensions: access to and control over key productive assets (tangible and intangible); beliefs and perceptions; practices and participation, and legal frameworks. CCRA-7 (Gendered Knowledge)
Kubalek, J.; Hajek, B.
This standard establishes the requirements for supplementary Control Points provided to enable the operating staff to shut down the reactor and maintain the plant in a safe shut-down condition when the main control room is no longer available. This standard covers the functional selection, design and organization of the man/machine interface. It also establishes requirements for procedures which systematically verify and validate the functional design of supplementary control points. The requirements reflect the application of human engineering principles as they apply to the man/machine interface. This standard does not cover special emergency response centres (e.g. a Technical Support Centre). It also does not include detailed equipment design. Unavailability of the main control room controls due to intentional man-induced events is not considered.
Modern authorization systems span domains of administration, rely on many different authentication sources, and manage complex attributes as part of the authorization process. This paper presents Cardea, a distributed system that facilitates dynamic access control, as a valuable piece of an interoperable authorization framework. First, the authorization model employed in Cardea and its functionality goals are examined. Next, critical features of the system architecture and its handling of the authorization process are examined. Then the SAML and XACML standards, as incorporated into the system, are analyzed. Finally, the future directions of this project are outlined and connection points with general components of an authorization system are highlighted.
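The attribute-based decision flow that Cardea delegates to SAML and XACML can be illustrated with a toy rule evaluator: a request carries attributes, and the first rule whose target matches decides the effect. This is a schematic sketch of the XACML "first-applicable" combining idea, not Cardea's actual implementation; the rule contents and attribute names are invented.

```python
def evaluate(policy, request):
    """Toy XACML-style evaluation: return the effect of the first rule whose
    target attributes all match the request (first-applicable combining).
    Real XACML engines evaluate XML policies with richer match functions."""
    for rule in policy:
        if all(request.get(attr) == value for attr, value in rule["target"].items()):
            return rule["effect"]
    return "NotApplicable"

# Hypothetical policy: researchers may use the cluster, everyone else may not.
policy = [
    {"target": {"role": "researcher", "resource": "compute-cluster"}, "effect": "Permit"},
    {"target": {"resource": "compute-cluster"}, "effect": "Deny"},
]

# Attributes such as "role" would arrive as SAML attribute assertions.
request = {"role": "researcher", "resource": "compute-cluster", "issuer": "campus-idp"}
```

Here `evaluate(policy, request)` yields `"Permit"`, while a `{"role": "guest", "resource": "compute-cluster"}` request falls through to the catch-all rule and yields `"Deny"`.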
The Codes and Standards Working Group (CSWG) is one of the issue-specific working groups that the MDEP members are undertaking; its long-term goal is harmonisation of regulatory and code requirements for design and construction of pressure-retaining components, in order to improve the effectiveness and efficiency of regulatory design reviews, increase the quality of safety assessments, and enable each regulator to become stronger in its ability to make safety decisions. The CSWG has interacted closely with the Standards Development Organisations (SDOs) and CORDEL in code comparison and code convergence. The Code Comparison Report STP-NU-051 has been issued by SDO members to identify the extent of similarities and differences amongst the pressure-boundary codes and standards used in various countries. Besides the differences in codes and standards, the way the codes and standards are applied to systems, structures and components also affects the design and construction of nuclear power plants. Therefore, to accomplish the goal of potential harmonisation, it is also vital that the regulators learn about each other's procedures, processes, and regulations. To facilitate the learning process, the CSWG meets regularly to discuss issues relevant to licensing new reactors and using codes and standards in licensing safety reviews. The CSWG communicates very frequently with the SDOs to discuss similarities and differences among the various codes and how to proceed with potential harmonisation. It should be noted that the IAEA is invited to all of the issue-specific working groups within MDEP to ensure consistency with IAEA standards. The primary focus of this technical report is to consolidate information shared and accomplishments achieved by the member countries. This report seeks to document how each MDEP regulator utilises national or regional mechanical codes and standards in its safety reviews and licensing of new reactors. The preparation of this report
Adamson, Vidyah; Barrass, Emma; McConville, Stephen; Irikok, Chantelle; Taylor, Kim; Pitt, Steve; Van Duyn, Rob; Bennett, Susan; Jackson, Lisa; Carroll, Jon; Andrews, Mark; Parker, Ann; Wright, Caroline; Greathead, Katie; Price, David
Improving timely access to evidence-based treatment for people aged 14-65 years experiencing a first episode psychosis (FEP) or an at-risk mental state (ARMS) for psychosis is a national priority within the United Kingdom. An early intervention in psychosis (EIP) access and waiting time standard has been set, which has extended the age range and acceptance criteria for services. This descriptive evaluation reports on the referrals and access-to-treatment times within an EIP service over its first year of operating in line with the access and waiting time standard. Patient pathways and post-assessment status are also described. The service received 406 referrals, of which 88% (n = 357) were assessed. The mean length of time to treatment was 1.5 weeks, with 88% being seen within 2 weeks. Of those who engaged in an assessment, 34% (n = 138) were identified as ARMS cases and 30% (n = 123) were identified as FEP. Those aged over 35 accounted for 22% (n = 80) of the total accepted cases. The findings indicate clinical and operational issues which will need careful consideration in the future planning of services. The high number of ARMS cases highlights the importance of clear treatment pathways and targeted interventions and may suggest a need to commission distinct ARMS services. The number of people who met the extended age and service acceptance criteria may suggest a need to adapt or redesign clinical services to meet the age-specific needs of those over 35 and those with an ARMS. It is unclear how changes to the remit of EIP services will impact upon future clinical outcomes. © 2018 John Wiley & Sons Australia, Ltd.
Fahma, Fakhrina; Zakaria, Roni; Fajar Gumilang, Royan
Since the ASEAN Economic Community (AEC) was launched, the opportunity to expand market share has become very open, but the level of competition is also very high. Standardization is believed to be an important factor in seizing opportunities in the AEC era and under other free trade agreements in the future. Standardization activities in industry can be demonstrated by obtaining certification of SNI (Indonesian National Standard). This is a challenge for SMEs, considering that currently only 20% of SMEs hold SNI certification for either products or processes. This research designs a readiness assessment model for SMEs seeking SNI certification. The stages of model development follow the innovation-adoption approach of Rogers (2003). The variables that affect the readiness of SMEs are obtained from the product certification requirements established by BSN (National Standardization Agency) and LSPro (certification bodies). The model is used to map the readiness of SMEs' products for SNI certification. The level of readiness of an SME is determined by the percentage of compliance with those requirements. Based on the results of this study, five variables were determined to be the main aspects for assessing SME readiness. For model validation, trials were conducted on batik SMEs in Laweyan, Surakarta.
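The readiness score described in the abstract, the percentage of certification requirements an SME meets, reduces to simple arithmetic. The checklist below is invented for illustration; the study derives its five assessment variables from BSN and LSPro certification requirements, not from these hypothetical item names.

```python
# Hypothetical pass/fail checklist for one SME, grouped under five
# illustrative assessment variables (names are placeholders, not the
# study's actual variables).
checklist = {
    "product requirements": [True, True, False],
    "process control":      [True, False],
    "quality management":   [True, True],
    "testing capability":   [False, False],
    "labelling and marking": [True],
}

met = sum(sum(items) for items in checklist.values())     # requirements met
total = sum(len(items) for items in checklist.values())   # requirements assessed
readiness_pct = 100 * met / total                         # readiness level, %
```

With 6 of 10 items met, this hypothetical SME scores 60% readiness, the kind of figure the model would map against a certification-readiness threshold.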
McGee, Steven; Nutakki, Nivedita
Urban school districts face a dilemma in providing professional development support for teachers in transition to the Next Generation Science Standards (NGSS). Districts need to maximize the quality and amount of professional development within practical funding constraints. In this paper, we discuss preliminary results from a…
Cole, Stephen R; Lau, Bryan; Eron, Joseph J; Brookhart, M Alan; Kitahata, Mari M; Martin, Jeffrey N; Mathews, William C; Mugavero, Michael J
There are few published examples of absolute risk estimated from epidemiologic data subject to censoring and competing risks with adjustment for multiple confounders. We present an example estimating the effect of injection drug use on 6-year risk of acquired immunodeficiency syndrome (AIDS) after initiation of combination antiretroviral therapy between 1998 and 2012 in an 8-site US cohort study, with death before AIDS as a competing risk. We estimate the risk standardized to the total study sample by combining inverse probability weights with the cumulative incidence function; estimates of precision are obtained by bootstrap. In 7,182 patients (83% male, 33% African American, median age of 38 years), we observed 6-year standardized AIDS risks of 16.75% among 1,143 injection drug users and 12.08% among 6,039 nonusers, yielding a standardized risk difference of 4.68 (95% confidence interval: 1.27, 8.08) and a standardized risk ratio of 1.39 (95% confidence interval: 1.12, 1.72). Results may be sensitive to the assumptions of exposure-version irrelevance, no measurement bias, and no unmeasured confounding. These limitations suggest that the results be replicated with refined measurements of injection drug use. Nevertheless, estimating the standardized risk difference and ratio is straightforward, and injection drug use appears to increase the risk of AIDS. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
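The point estimates in the abstract can be checked directly from the reported standardized risks; the confidence intervals cannot, since they require the weighted cumulative-incidence estimator and bootstrap applied to the individual-level data.

```python
# Standardized 6-year AIDS risks as reported in the abstract, in percent.
risk_idu, risk_non_idu = 16.75, 12.08

rd = risk_idu - risk_non_idu   # standardized risk difference (percentage points)
rr = risk_idu / risk_non_idu   # standardized risk ratio
```

From the two-decimal risks, `rd` is 4.67 and `rr` rounds to 1.39; the abstract's 4.68 difference comes from subtracting the unrounded risk estimates before rounding.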
Pandya-Wood, Raksha; Barron, Duncan S; Elliott, Jim
Researchers who conduct studies in health and social care are encouraged to involve the public as early as possible in the process of designing their studies. Before their studies are allowed to start researchers must seek approval from a Research Ethics Committee, which will assess whether the study is going to be safe and ethical for patients or healthy volunteers to take part in. The process of ethical review does not consider how researchers work with patients and the public early on to design their studies. Furthermore, there is no requirement for researchers to seek ethical approval for public involvement. However, in our work advising researchers about public involvement we have found that the ways in which researchers involve the public in the design of their studies are sometimes unintentionally unethical, and this is the focus of our paper. We have observed ten areas where ethical issues may arise because of the actions researchers may or may not take and which might consequently have a negative impact. Therefore, we have used these observations to develop a "framework" to help researchers and the public work together at the early design stage in ways that are ethical. Our intention for the framework is to help researchers be mindful of these ten areas and how easily ethical issues can arise. The framework suggests some ways to overcome the potential issues in each of the ten areas. The ten areas are: 1) Allocating sufficient time for public involvement; 2) Avoiding tokenism; 3) Registering research design stage public involvement work with NHS Research & Development Trust Office at earliest opportunity; 4) Communicating clearly from the outset; 5) Entitling public contributors to stop their involvement for any unstated reasons; 6) Operating fairness of opportunity; 7) Differentiating qualitative research methods and public involvement activities; 8) Working sensitively; 9) Being conscious of confidentiality and 10) Valuing, acknowledging and rewarding
Poirier, S.; Buteau, A.; Ounsy, M.; Rodriguez, C.; Hauser, N.; Lam, T.; Xiong, N.
For almost 20 years, the scientific community of neutron and synchrotron institutes has been dreaming of a common data format for exchanging experimental results and applications for reducing and analyzing the data. Using HDF5 as a data container has become the standard in many facilities. The big issue is the standardization of the data organization (schema) within the HDF5 container. By introducing a new level of indirection for data access, the Common-Data-Model-Access (CDMA) framework proposes a solution and allows separation of responsibilities between data reduction developers and the institute. Data reduction developers are responsible for data reduction code; the institute provides a plug-in to access the data. The CDMA is a core API that accesses data through a data format plug-in mechanism and scientific application definitions (sets of keywords) coming from a consensus between scientists and institutes. Using an innovative 'mapping' system between application definitions and physical data organizations, the CDMA allows data reduction application development independent of the data file container AND schema. Each institute develops a data access plug-in for its own data file formats along with the mapping between application definitions and its data files. Thus data reduction applications can be developed from a strictly scientific point of view and are immediately able to process data acquired from several institutes. (authors)
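The mapping indirection at the heart of the CDMA idea can be sketched in a few lines: applications ask for data by application-definition keyword, and each institute's plug-in translates that keyword to a physical path in its own files. This is a minimal illustration of the concept, not the real CDMA API, and the class, paths, and values are invented.

```python
class DataPlugin:
    """Institute-supplied reader: maps application keywords to physical paths."""
    def __init__(self, name, mapping, reader):
        self.name = name
        self.mapping = mapping   # application keyword -> path inside the file
        self.reader = reader     # callable(path) -> data

def fetch(plugin, keyword):
    """Core-API lookup: applications never see physical paths or schemas."""
    path = plugin.mapping[keyword]
    return plugin.reader(path)

# Two institutes store the same logical quantity under different HDF5 paths
# (dicts stand in for HDF5 files in this sketch).
file_a = {"/entry/sample/wavelength": 1.54}
file_b = {"/scan_0/beam/lambda": 1.54}

plug_a = DataPlugin("institute-A", {"wavelength": "/entry/sample/wavelength"}, file_a.get)
plug_b = DataPlugin("institute-B", {"wavelength": "/scan_0/beam/lambda"}, file_b.get)
```

A reduction application written against the keyword `"wavelength"` retrieves the same value through either plug-in, which is the container-and-schema independence the abstract describes.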
Singh, Rahul Rajat
This book is for .NET developers who are developing data-driven applications using ADO.NET or other data access technologies. This book is going to give you everything you need to effectively develop and manage data-driven applications using Entity Framework.
Rojo, Marcial García; Daniel, Christel; Schrader, Thomas
EURO-TELEPATH is a European COST Action IC0604. It started in 2007 and will end in November 2011. Its main objectives are evaluating and validating the common technological framework and communication standards required to access, transmit, and manage digital medical records by pathologists and other medical specialties in a networked environment. Working Group 1, "Business Modelling in Pathology," has designed main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modelling Notation (BPMN). Working Group 2 has been dedicated to promoting the application of informatics standards in pathology, collaborating with Integrating Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Health terminology standardization research has become a topic of great interest. Future research work should focus on standardizing automatic image analysis and tissue microarrays imaging.
A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.
Wireless Access. Wireless connect to the Base station. Easy and Convenient access. Costlier as compared to the wired technology. Reliability challenges. We see it as a complementary technology to the DSL.
Gambito, Ephraim D V; Gonzalez-Suarez, Consuelo B; Grimmer, Karen A; Valdecañas, Carolina M; Dizon, Janine Margarita R; Beredo, Ma Eulalia J; Zamora, Marcelle Theresa G
Clinical practice guidelines need to be regularly updated with current literature in order to remain relevant. This paper reports on the approach taken by the Philippine Academy of Rehabilitation Medicine (PARM). This dovetails with its writing guide, which underpinned its foundational work in contextualizing guidelines for stroke and low back pain (LBP) in 2011. Working groups of Filipino rehabilitation physicians and allied health practitioners met to reconsider and modify, where indicated, the 'typical' Filipino patient care pathways established in the foundation guidelines. New clinical guidelines on stroke and low back pain which had been published internationally in the last 3 years were identified using a search of electronic databases. The methodological quality of each guideline was assessed using the iCAHE Guideline Quality Checklist, and only those guidelines which provided full text references, evidence hierarchy and quality appraisal of the included literature, were included in the PARM update. Each of the PARM-endorsed recommendations was then reviewed, in light of new literature presented in the included clinical guidelines. A novel standard updating approach was developed based on the criteria reported by Johnston et al. (Int J Technol Assess Health Care 19(4):646-655, 2003) and then modified to incorporate wording from the foundational PARM writing guide. The new updating tool was debated, pilot-tested and agreed upon by the PARM working groups, before being applied to the guideline updating process. Ten new guidelines on stroke and eleven for low back pain were identified. Guideline quality scores were moderate to good, however not all guidelines comprehensively linked the evidence body underpinning recommendations with the literature. Consequently only five stroke and four low back pain guidelines were included. The modified PARM updating guide was applied by all working groups to ensure standardization of the wording of updated recommendations
Full Text Available Over the last decades, accounting systems have not been able to keep pace with the dynamics of the economic systems generated by the globalization process. In order to reduce the lag between the demand for financial information and the supply of financial information, the IASB has started numerous initiatives aimed at increasing the quality of financial information. Among the IASB's current major projects is the revision of the existing conceptual framework for financial reporting. This study is designed to give some directions that may be considered in the exposure draft of this project, analyzing the comment letters submitted by the members of ASAF and the Big4 as well. The study reveals the increasing importance that preparers and users give to the disclosures included in the notes to the primary financial statements. Moreover, in this study we emphasize several challenges that the IASB has to face in issuing the exposure draft for this important project. Some of the main challenges refer to the narrow scope of the financial statements; the criteria used for classification, aggregation and offsetting; and the use of the materiality concept.
Full Text Available United States Supreme Court doctrine has, for a quarter century, permitted regulations designed—through facts or nudges, but not force—to persuade pregnant women to choose childbirth over abortion. States have increasingly exceeded the bounds of this persuasive power by subjecting women to emotive and potentially distressing ‘information’ like real-time fetal images, heart beat recordings, or state-mandated directives by their doctors that abortion would “terminate the life of a whole, separate, unique, living human being.” This article advances a novel approach to informed consent in abortion that draws on established principles in the U.S. Federal Rules of Evidence (FRE). Evidentiary rules requiring “completeness”, exempting “common knowledge”, and prohibiting evidence that is “more prejudicial than probative” provide a sounder way for courts to determine which informed consent regulations on abortion mislead and demean a woman in ways that violate her constitutional right to make the ultimate decision about whether to continue a pregnancy. This evidence law framework would resolve conflicts between a woman’s right and the state’s interest by forbidding mandatory disclosures of incomplete, unnecessary, and emotionally charged information designed to promote childbirth over abortion.
Open Access Week. From 19 to 25 October 2015, Open Access Week took place worldwide. During this week, events in which open access plays a role were organized all over the world. In the Netherlands, too, various symposia, workshops and debates were organized, such as the debate in
Manson, T.; Blok, M.
A general review of the measures involved in restoring abandoned access road sites in British Columbia was presented. Permits and licences are needed for the use of crown land for roads used by the petroleum and natural gas industry for exploration activities. However, the regulatory framework for road site reclamation is not well developed. The nature of access road reclamation is very site-specific. Some of the issues that are considered for all reclamation projects include slope stability, water control, revegetation, soil rehabilitation, access management and monitoring. The primary objective of reclaiming access road sites is to return the site to conditions that are equal or better than pre-disturbance conditions. Restoration measures must be approved by BC Environment and by the Department of Fisheries and Oceans where federal fisheries responsibilities are involved. 54 refs., 5 tabs., 3 figs
The thesis presents four of the most popular PHP web frameworks: Laravel, Symfony, CodeIgniter and CakePHP. These frameworks are compared with each other according to four criteria, which can help with the selection of a framework. These criteria are the size of the community, the quality of official support, the comprehensibility of the framework's documentation, and the implementation of functionalities in the individual frameworks, namely automatic code generation, routing, object-relational mapping and...
Maxwell, T. P.; Duffy, D.
Gallo, J.; Stryker, T. S.; Sherman, R.
Each year, the Federal government records petabytes of data about our home planet. That massive amount of data in turn provides enormous benefits to society through weather reports, agricultural forecasts, air and water quality warnings, and countless other applications. To maximize the ease of transforming the data into useful information for research and for public services, the U.S. Group on Earth Observations released the first Common Framework for Earth Observation Data in March 2016. The Common Framework recommends practices for Federal agencies to adopt in order to improve the ability of all users to discover, access, and use Federal Earth observations data. The U.S. Government is committed to making data from civil Earth observation assets freely available to all users. Building on the Administration's commitment to promoting open data, open science, and open government, the Common Framework goes beyond removing financial barriers to data access, and attempts to minimize the technical impediments that limit data utility. While Earth observation systems typically collect data for a specific purpose, these data are often also useful in applications unforeseen during development of the systems. Managing and preserving these data with a common approach makes it easier for a wide range of users to find, evaluate, understand, and utilize the data, which in turn leads to the development of a wide range of innovative applications. The Common Framework provides Federal agencies with a recommended set of standards and practices to follow in order to achieve this goal. Federal agencies can follow these best practices as they develop new observing systems or modernize their existing collections of data. This presentation will give a brief on the context and content of the Common Framework, along with future directions for implementation and keeping its recommendations up-to-date with developing technology.
Zheng, Jun; Ansari, Nirwan
Call for Papers: Optical Access Networks. With the wide deployment of fiber-optic technology over the past two decades, we have witnessed a tremendous growth of bandwidth capacity in the backbone networks of today's telecommunications infrastructure. However, access networks, which cover the "last-mile" areas and serve numerous residential and small business users, have not been scaled up commensurately. The local subscriber lines for telephone and cable television are still using twisted pairs and coaxial cables. Most residential connections to the Internet are still through dial-up modems operating at a low speed on twisted pairs. As the demand for access bandwidth increases with emerging high-bandwidth applications, such as distance learning, high-definition television (HDTV), and video on demand (VoD), the last-mile access networks have become a bandwidth bottleneck in today's telecommunications infrastructure. To ease this bottleneck, it is imperative to provide sufficient bandwidth capacity in the access networks to open the bottleneck and thus present more opportunities for the provisioning of multiservices. Optical access solutions promise huge bandwidth to service providers and low-cost high-bandwidth services to end users and are therefore widely considered the technology of choice for next-generation access networks. To realize the vision of optical access networks, however, many key issues still need to be addressed, such as network architectures, signaling protocols, and implementation standards. The major challenges lie in the fact that an optical solution must be not only robust, scalable, and flexible, but also implemented at a low cost comparable to that of existing access solutions in order to increase the economic viability of many potential high-bandwidth applications. In recent years, optical access networks have been receiving tremendous attention from both academia and industry. A large number of research activities have been carried out or
Hydrogeologic framework, hydrology, and refined conceptual model of groundwater flow for Coastal Plain aquifers at the Standard Chlorine of Delaware, Inc. Superfund Site, New Castle County, Delaware, 2005-12
Brayton, Michael J.; Cruz, Roberto M.; Myers, Luke; Degnan, James R.; Raffensperger, Jeff P.
From 1966 to 2002, activities at the Standard Chlorine of Delaware chemical facility in New Castle County, Delaware, resulted in the contamination of groundwater, soils, and wetland sediment. In 2005, the U.S. Geological Survey (USGS), in partnership with the U.S. Environmental Protection Agency, Region 3, and the Delaware Department of Natural Resources and Environmental Control, began a multi-year investigation of the hydrogeologic framework and hydrology of the confined aquifer system. The goals of the ongoing study at the site (the Potomac Aquifer Study) are to determine the hydraulic connection between the Columbia and Potomac aquifers, determine the direction of groundwater flow in the Potomac aquifer, and identify factors affecting the fate of contaminated groundwater. This report describes progress made towards these goals based on available data collected through September 2012.
Peisner, Elizabeth Suzanne
Utilizing a qualitative case study, this dissertation analyzed how one university provided accessibility to international experiential learning opportunities for a primarily disabled student population. The Council for the Advancement of Standards (CAS, 2006) in Higher Education consists of a self-assessment guide adapted as a framework to analyze…
Prof. Dennis Ocholla
The argument that access to information is an instrumental and individual as well as ... and Dean School of Information Studies, University of Wisconsin, Milwaukee, USA. ... to scholarly publications and can be in any digital format, including text, movies and ... language barriers, censorship, lack of access to the Internet and ...
Chen, Li-Chin; Chen, Chi-Wen; Weng, Yung-Ching; Shang, Rung-Ji; Yu, Hui-Chu; Chung, Yufang; Lai, Feipei
Telehealthcare has been used to provide healthcare services, and an information technology infrastructure appears to be essential while providing telehealthcare service. Insufficiencies have been identified, such as lack of integration, the need to accommodate diverse biometric sensors, and access to diverse networks as different houses have varying facilities, which challenge the promotion of telehealthcare. This study designs an information technology framework to strengthen telehealthcare delivery. The proposed framework consists of a system architecture design and a network transmission design. The aim of the framework is to integrate data from existing information systems, to adopt medical informatics standards, to integrate diverse biometric sensors, and to provide different data transmission networks to support a patient's house network regardless of the facilities available. The proposed framework has been evaluated with a case study of two telehealthcare programs, with and without the adoption of the framework. The proposed framework facilitates the functionality of the program and enables steady patient enrollments. Overall patient participation increased, and patient outcomes appear positive. Attitudes toward the service and self-improvement are also positive. The findings of this study contribute to the construction of a telehealthcare system. Implementing the proposed framework further assists the functionality of the service and enhances its availability and patient acceptance.
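One architectural point in the abstract above, integrating diverse biometric sensors behind one system, is commonly handled with an adapter layer. The sketch below is an illustration of that pattern only; the class names, field names, and raw device formats are invented and do not come from the paper.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Hypothetical normalized reading; the field names are illustrative,
# not taken from the paper's actual data model.
@dataclass
class Reading:
    patient_id: str
    metric: str      # e.g. "blood_pressure_systolic"
    value: float
    unit: str

class SensorAdapter(ABC):
    """Each biometric sensor gets an adapter that emits normalized
    Reading objects, decoupling device formats from the system."""
    @abstractmethod
    def read(self, raw: dict) -> Reading: ...

class BloodPressureAdapter(SensorAdapter):
    # Assumes the device reports {"pid": str, "sys": int}; purely illustrative.
    def read(self, raw: dict) -> Reading:
        return Reading(raw["pid"], "blood_pressure_systolic", float(raw["sys"]), "mmHg")

class ThermometerAdapter(SensorAdapter):
    # Assumes the device reports temperature in tenths of a degree Celsius.
    def read(self, raw: dict) -> Reading:
        return Reading(raw["pid"], "body_temperature", raw["t_tenths"] / 10.0, "degC")

def ingest(adapter: SensorAdapter, raw: dict) -> Reading:
    """Normalize one raw device report through its adapter."""
    return adapter.read(raw)

print(ingest(ThermometerAdapter(), {"pid": "p1", "t_tenths": 368}))
```

Adding support for a new sensor then means adding one adapter class, leaving the rest of the telehealthcare pipeline untouched.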
Full Text Available In this paper we present the Zend architecture, an open source technology for developing web applications and services, based on object-oriented components and the Model-View-Controller architectural pattern, also known as MVC, which is the foundation of this architecture. The MVC presentation emphasises its main characteristics, such as facilitating component reuse by dividing the application into distinct interconnected modules, task distribution in the process of developing an application, the MVC life cycle, and also the essential features of the components into which it separates the application: model, view, controller. The controller coordinates the models and views and is responsible for handling user events through the corresponding actions. The model contains application rules, respectively the scripts that implement the database manipulation. The third component, the view, represents the controller's interface with the user, or the way it displays the response to the event triggered by the user. Another aspect treated in this paper is highlighting the Zend architecture's advantages and disadvantages. Among the framework's advantages, we can enumerate good code organization, due to its delimitation into three sections, presentation, logic and data access, and dividing the code into components, which facilitates code reuse and testing. Other advantages are the open-source license and the support for multiple database systems. The main disadvantages are its size and complexity, which make it hard to understand for a beginner programmer, and the resources it requires. The last section of the paper presents a comparison between Zend and other PHP architectures, like Symfony, CakePHP and CodeIgniter, which includes their essential features and points out their similarities and differences, based on the unique functions that set them apart from others. The main thing that distinguishes ZF from the…
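The model/view/controller responsibilities described above can be sketched in a few lines. Zend Framework itself is PHP; this minimal Python sketch only illustrates the division of responsibilities the abstract describes, with invented class and method names, not the framework's API.

```python
class Model:
    """Holds application rules and data access (here, an in-memory store)."""
    def __init__(self):
        self._items = {}
    def save(self, key, value):
        self._items[key] = value
    def find(self, key):
        return self._items.get(key)

class View:
    """Renders the controller's response to the user."""
    def render(self, item):
        return f"<p>{item}</p>" if item is not None else "<p>not found</p>"

class Controller:
    """Coordinates model and view, handling a user event as an action."""
    def __init__(self, model, view):
        self.model, self.view = model, view
    def show_action(self, key):
        return self.view.render(self.model.find(key))

controller = Controller(Model(), View())
controller.model.save("greeting", "hello")
print(controller.show_action("greeting"))   # <p>hello</p>
```

The point of the split is the one the abstract makes: each part can be replaced or tested in isolation because the controller is the only component that knows about both the model and the view.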
[Table fragment: counts and percentages by medical condition, including contusion of bone or joint, conditions requiring frequent treatment, surgical correction for GERD, hypothyroidism, recurrent loss of consciousness for any reason, and seizure; the column structure is not recoverable from the extracted text.]
Kong, Lei; Wang, Jun; Zhao, Shuqi; Gu, Xiaocheng; Luo, Jingchu; Gao, Ge
With the rapid growth of genome sequencing projects, genome browser is becoming indispensable, not only as a visualization system but also as an interactive platform to support open data access and collaborative work. Thus a customizable genome browser framework with rich functions and flexible configuration is needed to facilitate various genome research projects. Based on next-generation web technologies, we have developed a general-purpose genome browser framework ABrowse which provides interactive browsing experience, open data access and collaborative work support. By supporting Google-map-like smooth navigation, ABrowse offers end users highly interactive browsing experience. To facilitate further data analysis, multiple data access approaches are supported for external platforms to retrieve data from ABrowse. To promote collaborative work, an online user-space is provided for end users to create, store and share comments, annotations and landmarks. For data providers, ABrowse is highly customizable and configurable. The framework provides a set of utilities to import annotation data conveniently. To build ABrowse on existing annotation databases, data providers could specify SQL statements according to database schema. And customized pages for detailed information display of annotation entries could be easily plugged in. For developers, new drawing strategies could be integrated into ABrowse for new types of annotation data. In addition, standard web service is provided for data retrieval remotely, providing underlying machine-oriented programming interface for open data access. ABrowse framework is valuable for end users, data providers and developers by providing rich user functions and flexible customization approaches. The source code is published under GNU Lesser General Public License v3.0 and is accessible at http://www.abrowse.org/. 
To demonstrate all the features of ABrowse, a live demo for Arabidopsis thaliana genome has been built at http://arabidopsis.cbi.edu.cn/.
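The ABrowse abstract above says data providers map existing annotation databases by supplying SQL statements matched to their own schema. The sketch below illustrates that idea only; the track name, table schema, and SQL are invented, and ABrowse's actual configuration format is not shown in the abstract.

```python
import sqlite3

# Hypothetical provider-supplied mapping: track name -> SQL against the
# provider's own schema, parameterized by chromosome and interval.
track_sql = {
    "genes": "SELECT name, start, end FROM gene "
             "WHERE chrom = ? AND end >= ? AND start <= ?",
}

def fetch_track(conn, track, chrom, start, end):
    """Return annotation rows for a track within a genomic interval."""
    return conn.execute(track_sql[track], (chrom, start, end)).fetchall()

# Stand-in for an existing annotation database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gene (name TEXT, chrom TEXT, start INT, end INT)")
conn.execute("INSERT INTO gene VALUES ('AT1G01010', 'Chr1', 3631, 5899)")

print(fetch_track(conn, "genes", "Chr1", 3000, 6000))  # [('AT1G01010', 3631, 5899)]
```

Because the browser only sees the parameterized query and its result rows, the same front end can sit on top of any schema the provider describes this way.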
Dolog, Peter; Simon, Bernd; Nejdl, Wolfgang
In this article, we describe a Smart Space for Learning™ (SS4L) framework and infrastructure that enables personalized access to distributed heterogeneous knowledge repositories. Helping a learner to choose an appropriate learning resource or activity is a key problem which we address in this framework, enabling personalized access to federated learning repositories with a vast number of learning offers. Our infrastructure includes personalization strategies both at the query and the query results level. Query rewriting is based on learning and language preferences; rule-based and ranking…
Transforming Global Information and Communication Markets: The Political Economy of … 8 Control and Resistance: Attacks on Burmese Opposition Media 153 … "Reluctant Gatekeepers: Corporate Ethics on a Filtered Internet," in Access …
Yoon, Doe Hyun; Muralimanohar, Naveen; Chang, Jichuan; Ranganthan, Parthasarathy
A disclosed example method involves performing simultaneous data accesses on at least first and second independently selectable logical sub-ranks to access first data via a wide internal data bus in a memory device. The memory device includes a translation buffer chip, memory chips in independently selectable logical sub-ranks, a narrow external data bus to connect the translation buffer chip to a memory controller, and the wide internal data bus between the translation buffer chip and the memory chips. A data access is performed on only the first independently selectable logical sub-rank to access second data via the wide internal data bus. The example method also involves locating a first portion of the first data, a second portion of the first data, and the second data on the narrow external data bus during separate data transfers.
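The patent abstract above turns on a width mismatch: a wide internal bus moves data between the memory chips and the translation buffer chip in one access, while the narrow external bus delivers it to the memory controller in several separate transfers. The toy model below only illustrates that splitting; the bus widths are invented, and nothing here models the sub-rank selection logic of the actual method.

```python
# Invented widths for illustration; the patent does not specify them.
WIDE_INTERNAL_BYTES = 32   # translation buffer chip <-> memory chips
NARROW_EXTERNAL_BYTES = 8  # translation buffer chip <-> memory controller

def external_transfers(data: bytes):
    """Split one wide internal access into narrow external bus transfers."""
    return [data[i:i + NARROW_EXTERNAL_BYTES]
            for i in range(0, len(data), NARROW_EXTERNAL_BYTES)]

line = bytes(range(WIDE_INTERNAL_BYTES))   # one wide internal access
bursts = external_transfers(line)
print(len(bursts))  # the 32-byte internal access needs 4 narrow transfers
```

This is why the abstract speaks of locating portions of the first data and the second data on the external bus "during separate data transfers": the narrow bus serializes what the wide bus fetched at once.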
C. Colloca TS/FM
TS/FM group informs you that, for the replacement of the door of the main entrance at bldg. 500, the access will be closed to the public between 19 and 30 July 2004. Access to the Main Building complex will be assured at any time through both of the side doors and from bldg. 64. For more information, please contact 73273. C. Colloca TS/FM
Journals assessed against the JPPS criteria are given one of six levels: inactive title; new title; no stars; one star; two stars; and three stars. The assigned JPPS levels serve a dual purpose. For readers, they provide assurance that the journals meet an internationally recognized set of criteria at a particular level. For journal ...
Balfour, Margaret E; Tanner, Kathleen; Jurica, Paul J; Rhoads, Richard; Carson, Chris A
Crisis and emergency psychiatric services are an integral part of the healthcare system, yet there are no standardized measures for programs providing these services. We developed the Crisis Reliability Indicators Supporting Emergency Services (CRISES) framework to create measures that inform internal performance improvement initiatives and allow comparison across programs. The framework consists of two components-the CRISES domains (timely, safe, accessible, least-restrictive, effective, consumer/family centered, and partnership) and the measures supporting each domain. The CRISES framework provides a foundation for development of standardized measures for the crisis field. This will become increasingly important as pay-for-performance initiatives expand with healthcare reform.
Kirkeby, Inge Mette
Although serious efforts are made internationally and nationally, it is a slow process to make our physical environment accessible. In the actual design process, architects play a major role. But what kinds of knowledge, including research-based knowledge, do practicing architects make use of when designing accessible environments? The answer to the question is crucially important since it affects how knowledge is distributed and how accessibility can be ensured. In order to get first-hand knowledge about the design process and the sources from which they gain knowledge, 11 qualitative interviews were conducted with architects with experience of designing for accessibility. The analysis draws on two theoretical distinctions. The first is research-based knowledge versus knowledge used by architects. The second is context-independent knowledge versus context-dependent knowledge. The practitioners…
The Internet lets us share perfect copies of our work with a worldwide audience at virtually no cost. We take advantage of this revolutionary opportunity when we make our work "open access": digital, online, free of charge, and free of most copyright and licensing restrictions. Open access is made possible by the Internet and copyright-holder consent, and many authors, musicians, filmmakers, and other creators who depend on royalties are understandably unwilling to give their consent. But for 350 years, scholars have written peer-reviewed journal articles for impact, not for money, and are free to consent to open access without losing revenue. In this concise introduction, Peter Suber tells us what open access is and isn't, how it benefits authors and readers of research, how we pay for it, how it avoids copyright problems, how it has moved from the periphery to the mainstream, and what its future may hold. Distilling a decade of Suber's influential writing and thinking about open access, this is the indispe...
Zheng, Jun; Ansari, Nirwan
Call for Papers: Optical Access Networks Guest Editors Jun Zheng, University of Ottawa Nirwan Ansari, New Jersey Institute of Technology Submission Deadline: 1 June 2005 Background With the wide deployment of fiber-optic technology over the past two decades, we have witnessed a tremendous growth of bandwidth capacity in the backbone networks of today's telecommunications infrastructure. However, access networks, which cover the "last-mile" areas and serve numerous residential and small business users, have not been scaled up commensurately. The local subscriber lines for telephone and cable television are still using twisted pairs and coaxial cables. Most residential connections to the Internet are still through dial-up modems operating at a low speed on twisted pairs. As the demand for access bandwidth increases with emerging high-bandwidth applications, such as distance learning, high-definition television (HDTV), and video on demand (VoD), the last-mile access networks have become a bandwidth bottleneck in today's telecommunications infrastructure. To ease this bottleneck, it is imperative to provide sufficient bandwidth capacity in the access networks to open the bottleneck and thus present more opportunities for the provisioning of multiservices. Optical access solutions promise huge bandwidth to service providers and low-cost high-bandwidth services to end users and are therefore widely considered the technology of choice for next-generation access networks. To realize the vision of optical access networks, however, many key issues still need to be addressed, such as network architectures, signaling protocols, and implementation standards. The major challenges lie in the fact that an optical solution must be not only robust, scalable, and flexible, but also implemented at a low cost comparable to that of existing access solutions in order to increase the economic viability of many potential high-bandwidth applications. In recent years, optical access networks
Access is the major new language series designed with the needs of today's generation of students firmly in mind. Whether learning for leisure or business purposes or working towards a curriculum qualification, Access French is specially designed for adults of all ages and gives students a thorough grounding in all the skills required to understand, speak, read and write contemporary French from scratch. The coursebook consists of 10 units covering different topic areas, each of which includes Language Focus panels explaining the structures covered and a comprehensive glossary. Learning tips
Wand, Sean; Thermos, Adam C.
Explains the issues to consider before a college decides to purchase a card-access system. The benefits of automation, questions involving implementation, the criteria for technology selection, what typical card technology involves, privacy concerns, and the placement of card readers are discussed. (GR)
Cooper, Antony K
Full Text Available Data content standards tend to be more accessible: easier to understand, used directly by many end users, and immediately applicable to Africa. They are more susceptible to culture and language – hence, it is more important to have local standards…
Legal constraints on EU Member States as primary law makers : a case study of the proposed permanent safeguard clause on free movement of persons in the EU negotiating framework for Turkey's accession
Do Member States of the EU have a free hand in drafting Accession Treaties, or are there legal constraints on their primary law making function in this context? That is the central question this thesis addresses. It argues that such constraints do exist, and tries to identify them, thereby hoping to
Lorch, Marcus; Proctor, Seth; Lepro, Rebekah; Kafura, Dennis; Shah, Sumit
Authorization systems today are increasingly complex. They span domains of administration, rely on many different authentication sources, and manage permissions that can be as complex as the system itself. Worse still, while there are many standards that define authentication mechanisms, the standards that address authorization are less well defined and tend to work only within homogeneous systems. This paper presents XACML, a standard access control language, as one component of a distributed and inter-operable authorization framework. Several emerging systems which incorporate XACML are discussed. These discussions illustrate how authorization can be deployed in distributed, decentralized systems. Finally, some new and future topics are presented to show where this work is heading and how it will help connect the general components of an authorization system.
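Real XACML policies are XML documents evaluated by a policy decision point; the sketch below is a deliberately simplified Python model of one XACML idea, attribute-based rules combined under deny-overrides, with invented attribute names and rule conditions. It is not the XACML language itself.

```python
PERMIT, DENY, NOT_APPLICABLE = "Permit", "Deny", "NotApplicable"

# Hypothetical rules: (effect, condition over the request's attributes).
rules = [
    (DENY,   lambda req: req.get("resource") == "payroll" and req.get("role") != "hr"),
    (PERMIT, lambda req: req.get("action") == "read"),
]

def evaluate(request: dict) -> str:
    """Deny-overrides combining: any matching Deny rule wins over Permit."""
    decisions = [effect for effect, cond in rules if cond(request)]
    if DENY in decisions:
        return DENY
    if PERMIT in decisions:
        return PERMIT
    return NOT_APPLICABLE

print(evaluate({"role": "dev", "action": "read", "resource": "payroll"}))  # Deny
print(evaluate({"role": "hr", "action": "read", "resource": "payroll"}))   # Permit
```

Because decisions are computed from request attributes rather than from identities baked into each system, the same policy can be evaluated consistently across the distributed, decentralized deployments the paper discusses.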
Preuveneers, D.; Joosen, W.; Ilie-Zudor, E.
In dynamic cross-enterprise collaborations, different enterprises form a - possibly temporary - business relationship. To integrate their business processes, enterprises may need to grant each other limited access to their information systems. Authentication and authorization are key to secure information handling. However, access control policies often rely on non-standardized attributes to describe the roles and permissions of their employees, which complicates cross-organizational authorization when business relationships evolve quickly. Our framework addresses the managerial overhead of continuous updates to access control policies for enterprise information systems to accommodate disparate attribute usage. By inferring attribute relationships, our framework facilitates attribute and policy reconciliation, and automatically aligns dynamic entitlements during the evaluation of authorization decisions. We validate our framework with an Industry 4.0 motivating scenario on networked production where such dynamic cross-enterprise collaborations are quintessential. The evaluation reveals the capabilities and performance of our framework, and illustrates the feasibility of liberating the security administrator from manually provisioning and aligning attributes, and verifying the consistency of access control policies for cross-enterprise collaborations.
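The reconciliation idea in the abstract above can be reduced to a small sketch: two enterprises use different attribute names for the same role, and an inferred mapping lets one shared policy serve both. All attribute names, permissions, and the mapping itself are invented for illustration; the paper's actual inference mechanism is not shown here.

```python
# Hypothetical inferred equivalences between partner attribute vocabularies.
attribute_map = {
    "partnerA:line-operator": "shared:operator",
    "partnerB:machinist":     "shared:operator",
}

# One policy written against the shared vocabulary.
policy = {"shared:operator": {"read:machine-status"}}

def authorize(subject_attr: str, permission: str) -> bool:
    """Reconcile the subject's local attribute, then check the shared policy."""
    canonical = attribute_map.get(subject_attr, subject_attr)
    return permission in policy.get(canonical, set())

print(authorize("partnerB:machinist", "read:machine-status"))  # True
print(authorize("partnerB:machinist", "write:schedule"))       # False
```

When a new partner joins with its own role names, only the mapping grows; the policy itself stays untouched, which is the administrative saving the paper targets.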
Vogt, Lars; Nickel, Michael; Jenner, Ronald A; Deans, Andrew R
eScience is a new approach to research that focuses on data mining and exploration rather than data generation or simulation. This new approach is arguably a driving force for scientific progress and requires data to be openly available, easily accessible via the Internet, and compatible with each other. eScience relies on modern standards for the reporting and documentation of data and metadata. Here, we suggest necessary components (i.e., content, concept, nomenclature, format) of such standards in the context of zoomorphology. We document the need for using data repositories to prevent data loss and how publication practice is currently changing, with the emergence of dynamic publications and the publication of digital datasets. Subsequently, we demonstrate that in zoomorphology the scientific record is still limited to published literature and that zoomorphological data are usually not accessible through data repositories. The underlying problem is that zoomorphology lacks the standards for data and metadata. As a consequence, zoomorphology cannot participate in eScience. We argue that the standardization of morphological data requires i) a standardized framework for terminologies for anatomy and ii) a formalized method of description that allows computer-parsable morphological data to be communicable, compatible, and comparable. The role of controlled vocabularies (e.g., ontologies) for developing respective terminologies and methods of description is discussed, especially in the context of data annotation and semantic enhancement of publications. Finally, we introduce the International Consortium for Zoomorphology Standards, a working group that is open to everyone and whose aim is to stimulate and synthesize dialog about standards. It is the Consortium's ultimate goal to assist the zoomorphology community in developing modern data and metadata standards, including anatomy ontologies, thereby facilitating the participation of zoomorphology in eScience.
Katajainen, Jyrki; Simonsen, Bo
The CPH STL is a special edition of the STL, the containers and algorithms part of the C++ standard library. The specification of the generic components of the STL is given in the C++ standard. Any implementation of the STL, e.g. the one that ships with your standard-compliant C++ compiler, should … for vector, which is undoubtedly the most used container of the C++ standard library. In particular, we specify the details of a vector implementation that is safe with respect to referential integrity and strong exception safety. Additionally, we report the experiences and lessons learnt from the development of component frameworks, which we hope to be of benefit to persons engaged in the design and implementation of generic software libraries.
Shankar, D.; Kotamraju, V.; Shetye, S.R
of information on the variables associated with hydrology, and second, the absence of an easily accessible quantitative framework to put these variables in perspective. In this paper, we discuss a framework that has been assembled to address both these issues...
An evaluation of the Essential Medicines List, Standard Treatment Guidelines and prescribing restrictions, as an integrated strategy to enhance quality, efficacy and safety of and improve access to essential medicines in Papua New Guinea.
Joshua, Isaac B; Passmore, Phillip R; Sunderland, Bruce V
The World Health Organization (WHO) has advocated the development and use of country-specific Standard Treatment Guidelines (STGs) and Essential Medicines Lists (EML) as strategies to promote the rational use of medicines. When implemented effectively, STGs offer many health advantages. Papua New Guinea (PNG) has official STGs and a Medical and Dental Catalogue (MDC) which serves as a national EML for use at different levels of health facilities. This study evaluated consistency between the PNG Adult STGs (2003 and 2012) and those for children (2005 and 2011) with respect to the MDCs (2002, 2012) for six chronic and/or acute diseases: asthma, arthritis, diabetes, hypertension, pneumonia and psychosis. Additionally, the potential impact of prescriber level restrictions on rational medicines use for patients living in rural areas, where no medical officer is present, was evaluated. Almost all drugs included in the STGs for each disease state evaluated were listed in the MDCs. However, significant discrepancies occurred between the recommended treatments in the STGs and the range of related medicines listed in the MDCs. Many medicines recommended in the STGs for chronic diseases had prescriber level restrictions, hindering access for most of the PNG population who live in rural and remote areas. In addition, many more medicines were listed in the MDCs which are commonly used to treat arthritis, high blood pressure and psychosis than were recommended in the STGs, contributing to inappropriate prescribing. We recommend that the public health and rational use of medicines deficiencies associated with these findings be addressed by: reviewing prescriber level restrictions; updating the STGs; aligning the MDC to reflect recommendations in the STGs; establishing a process whereby the MDC is automatically updated based on any changes made to the STGs; and developing STGs for higher levels of care.
The Foundations of Operational Resilience Assessing the Ability to Operate in an Anti-Access/Area Denial (A2/AD) Environment: The Analytical Framework, Lexicon, and Characteristics of the Operational Resilience Analysis Model (ORAM)
University, Maxwell Air Force Base, Alabama). He shaped the definition of operational resilience and, therefore, the resulting analytic framework with … of airbases: • Type A: Main Operating Base (MOB). A facility outside the United States and U.S. territories with permanently stationed operating … MOB. [JP 1-02, 2014] • Type C: Forward Operating Location (FOL). A forward operating base that is served by a less extensive support structure than
Social movements and access to assets and services. Proposal of an analytical framework based on a comparative analysis of cases: political confrontation of unemployed persons (Argentina) and occupancy of housing (Spain)
Maria-Rosa Herrera-Gutierrez; Maria-Inés Peralta; Silvina Cuella; Rosa-María Díaz-Jiménez
This work is the result of collaboration between researchers from two Academic Units of Social Work located in Spain and Argentina. The overall objective is to explore the analytical frameworks and empirical objects of research projects linked to social movements developed in both countries. Specifically, two phenomena that lead to social mobilization, unemployment through the territorial expressions of the unemployed (Córdoba, Argentina) and housing through the occupation of buildings by peo...
Jayabalan, Manoj; O'Daniel, Thomas
This study presents a systematic literature review of access control for electronic health record systems to protect patients' privacy. Articles from 2006 to 2016 were extracted from the ACM Digital Library, IEEE Xplore Digital Library, Science Direct, MEDLINE, and MetaPress using broad eligibility criteria, and chosen for inclusion based on analysis of ISO 22600. Cryptographic standards and methods were left outside the scope of this review. Three broad classes of models are being actively investigated and developed: access control for electronic health records, access control for interoperability, and access control for risk analysis. Traditional role-based access control models are extended with spatial, temporal, probabilistic, dynamic, and semantic aspects to capture contextual information and provide granular access control. Maintenance of audit trails and facilities for overriding normal roles to allow full access in emergency cases are common features. Access privilege frameworks utilizing ontology-based knowledge representation for defining the rules have attracted considerable interest, due to the higher level of abstraction that makes it possible to model domain knowledge and validate access requests efficiently.
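Two of the common features the review names, temporal constraints on roles and an audited emergency override ("break-the-glass"), can be sketched together. The roles, permissions, and shift hours below are invented for illustration; no specific model from the reviewed literature is reproduced here.

```python
audit_log = []  # break-the-glass accesses are always recorded

# Hypothetical roles and permissions.
role_permissions = {
    "nurse":     {"read:vitals"},
    "physician": {"read:vitals", "read:full-record"},
}
role_hours = {"nurse": range(7, 19)}   # temporal constraint: on-shift hours only

def can_access(role, permission, hour, emergency=False):
    """RBAC check extended with a temporal aspect and an audited override."""
    if emergency:                      # break-the-glass: allow, but audit
        audit_log.append((role, permission, hour))
        return True
    if role in role_hours and hour not in role_hours[role]:
        return False                   # outside permitted hours
    return permission in role_permissions.get(role, set())

print(can_access("nurse", "read:vitals", hour=9))    # True  (on shift)
print(can_access("nurse", "read:vitals", hour=22))   # False (off shift)
```

The override illustrates the trade-off the review highlights: emergencies demand full access, so safety comes from the mandatory audit trail rather than from denial.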
Krämer, Sebastian; Plankensteiner, David; Ostermann, Laurin; Ritsch, Helmut
We present an open-source computational framework, written in the Julia programming language, geared towards the efficient numerical investigation of open quantum systems. Built exclusively in Julia and based on standard quantum optics notation, the toolbox offers speed comparable to low-level statically typed languages, without compromising on the accessibility and code readability found in dynamic languages. After introducing the framework, we highlight its features and showcase implementations of generic quantum models. Finally, we compare its usability and performance to two well-established and widely used numerical quantum libraries.
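For readers unfamiliar with the kind of computation such a toolbox performs, here is a minimal, self-contained analogue in Python/NumPy rather than Julia: Lindblad master-equation evolution of a decaying two-level atom, with the generator built as a vectorized superoperator. This is an illustrative sketch, not code from the framework described above.

```python
import numpy as np
from scipy.linalg import expm

gamma = 1.0                                  # spontaneous-emission rate
sm = np.array([[0, 1], [0, 0]], complex)     # lowering operator |1> -> |0>
H = np.zeros((2, 2), complex)                # free atom, H = 0 for simplicity
I2 = np.eye(2)

# Vectorized Lindblad generator, column-stacking convention:
# vec(A @ X @ B) = kron(B.T, A) @ vec(X)
n_op = sm.conj().T @ sm
liouvillian = (
    -1j * (np.kron(I2, H) - np.kron(H.T, I2))
    + gamma * (np.kron(sm.conj(), sm)
               - 0.5 * np.kron(I2, n_op)
               - 0.5 * np.kron(n_op.T, I2))
)

def evolve(rho0, t):
    """Propagate a density matrix for time t under the master equation."""
    vec = rho0.reshape(-1, order="F")        # column-stacked vec(rho)
    vec_t = expm(liouvillian * t) @ vec
    return vec_t.reshape(2, 2, order="F")

rho0 = np.array([[0, 0], [0, 1]], complex)   # atom initially excited
rho_t = evolve(rho0, 1.0)
# The excited-state population rho_t[1, 1] decays as exp(-gamma * t).
```

Frameworks like the one described here hide this boilerplate behind operator objects and specialized solvers, which is where the usability and performance comparisons come in.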
Elsea, Jennifer K
Recent incidents involving leaks of classified information have heightened interest in the legal framework that governs security classification, access to classified information, and penalties for improper disclosure...
Al-Turany, M; Klein, D; Manafov, A; Rybalchenko, A; Uhlig, F
The FairRoot framework is the standard framework for simulation, reconstruction and data analysis for the FAIR experiments. The framework is designed to optimise the accessibility for beginners and developers, to be flexible and to cope with future developments. FairRoot enhances the synergy between the different physics experiments. As a first step toward simulation of free-streaming data, time-based simulation was introduced to the framework. The next step is event-source simulation. This is achieved via a client-server system. After digitization, the so-called 'samplers' can be started; each sampler reads the data of its corresponding detector from the simulation files and makes it available to the reconstruction clients. The system makes it possible to develop and validate the online reconstruction algorithms. In this work, the design and implementation of the new architecture and the communication layer are described.
Aalst, van der W.M.P.; Beisiegel, M.; Hee, van K.M.; König, D.; Stahl, C.
We present a Service-Oriented Architecture (SOA)-based architecture framework. The architecture framework is designed to be close to industry standards, especially the Service Component Architecture (SCA). The framework is language independent and the building blocks of each system, activities
Furano, Fabrizio; Devresse, Adrien; Keeble, Oliver; Hellmich, Martin; Ayllón, Alejandro Álvarez
In this contribution we present a vision for the use of the HTTP protocol for data access and data management in the context of HEP. The evolution of the DPM/LFC software stacks towards a modern framework that can be plugged into Apache servers triggered various initiatives that successfully demonstrated the use of HTTP-based protocols for data access, federation and transfer. This includes the evolution of the FTS3 system towards being able to manage third-party transfers using HTTP. Given the flexibility of the methods, the feature set may also include a subset of the SRM functionality that is relevant to disk systems. The application domain for such an ecosystem of services goes from large-scale, Grid-like computing to data access from laptops, profiting from tools that are shared with the Web community, like browsers, client libraries and others. Particular focus was put on the flexibility of the frameworks, which can interface with a very broad range of components, data stores, catalogues and metadata stores, including the possibility of building high-performance dynamic federations of endpoints that create, on the fly, the impression of a single, seamless, very efficient system. The overall goal is to leverage standards and standard practices, and use them to provide the higher-level functionalities that are needed to fulfil the complex problem of data access in HEP. Other points of interest concern harmonizing the possibilities given by the HTTP/WebDAV protocols with existing frameworks like ROOT and already existing storage federations based on the XROOTD framework. We also provide quantitative evaluations of the performance that is achievable using HTTP for remote transfer and remote I/O in the context of HEP data. The idea is to contribute the parts that can make possible an ecosystem of services and applications, where the HEP-related features are covered, and the door is open to standard solutions and tools provided by third parties, in the
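The HTTP mechanism that makes remote I/O of this kind possible is the byte-range GET request, which lets a client read an arbitrary slice of a remote file without downloading it whole. The sketch below, using only the Python standard library, pairs a toy range-aware server (a stand-in for a real WebDAV-capable storage endpoint, not DPM or FTS3 code) with a client that fetches four bytes from the middle of a resource.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

DATA = b"0123456789abcdef"  # pretend this is a large remote file

class RangeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        spec = self.headers.get("Range", "")          # e.g. "bytes=4-7"
        start, end = (int(x) for x in spec.split("=")[1].split("-"))
        chunk = DATA[start:end + 1]
        self.send_response(206)                        # 206 Partial Content
        self.send_header("Content-Range", f"bytes {start}-{end}/{len(DATA)}")
        self.send_header("Content-Length", str(len(chunk)))
        self.end_headers()
        self.wfile.write(chunk)

    def log_message(self, *args):                      # silence request logs
        pass

server = HTTPServer(("127.0.0.1", 0), RangeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: read only bytes 4..7 of the remote resource.
req = Request(f"http://127.0.0.1:{server.server_port}/file",
              headers={"Range": "bytes=4-7"})
with urlopen(req) as resp:
    chunk = resp.read()
server.shutdown()
```

Because Range is plain standard HTTP, the same request works against any compliant server or browser-side tool, which is exactly the third-party-ecosystem argument made above.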
Adame, Silvia I.; Llorens, Luis
This paper presents a description of the metadata harvester software development. This system provides access to reliable, high-quality educational resources, shared by Mexican universities through their repositories, to anyone with Internet access. We present the conceptual and contextual framework, followed by the technical basis, the results and…
Social Education, 2013
"The C3 Framework for Social Studies State Standards will soon be released under the title "The College, Career, and Civic Life (C3) Framework for Social Studies State Standards: State Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History." The C3 Project Director and Lead Writer was NCSS member Kathy…
The importance of accessibility to digital e-learning resources is widely acknowledged. The World Wide Web Consortium Web Accessibility Initiative has played a leading role in promoting the importance of accessibility and developing guidelines that can help when developing accessible web resources. The accessibility of e-learning resources provides additional challenges. While it is important to consider the technical and resource related aspects of e-learning when designing and developing resources for students with disabilities, there is a need to consider pedagogic and contextual issues as well. A holistic framework is therefore proposed and described, which in addition to accessibility issues takes into account learner needs, learning outcomes, local factors, infrastructure, usability and quality assurance. The practical application and implementation of this framework is discussed and illustrated through the use of examples and case studies.
It is common cause that equitable access to water and sanitation must accordingly be regarded primarily as a criterion for the realization of several other human rights, such as the right to life, dignity, health, food, an adequate standard of living and education. Access to safe drinking water and sanitation is essential to the enjoyment of safety and an environment that is not hazardous to human health. The lack of water and sanitation does not only hinder access to other available rights, but also magnifies the vulnerability of women, girls and people with disabilities. Water and sanitation services are of utmost importance to the health and wellbeing of all people. South Africa operates under one of the most outstanding legislative and policy frameworks for basic services in the world, including the constitutional right of access to sufficient water and the right to basic sanitation.
Internet knowledge is increasing steadily among instructors in the academic world. As courses incorporate more instructional technology, traditional undergraduate research assignments are adapting to reflect the changing world of information and information access. New library assignments reflect this shift as well, with term papers and research projects asking students to use Web sites as an information resource, in addition to the standard literature of periodicals and monographs. But the many pitfalls the library profession has learned about in its own metamorphosis during the past decade are often repeated in these newer course assignments. The authors present a framework for librarians to interact with instructors to incorporate Internet resources into traditional term paper and research assignments. They suggest a framework for creating sample assignments librarians can take to campus instructional units, to show the teaching community at large what the library profession has learned from first-hand experience.
Ensminger, David C.; Fry, Michelle L.
This article introduces a descriptive conceptual framework to provide teachers with a means of recognizing and describing instructional activities that use primary sources. The framework provides structure for professional development programs that have been established to train teachers to access and integrate primary sources into lessons. The…
accessibility problems before the planning of housing intervention strategies. It is also critical that housing standards addressing accessibility intended to accommodate people with functional limitations are valid in the sense that their definitions truly support accessibility. However, there is a paucity of valid and reliable assessment instruments targeting housing accessibility, and in-depth analysis of factors potentially impacting on reliability in complex assessment situations is remarkably absent. Moreover, the knowledge base informing the housing standards appears to be vague. We may therefore reasonably question the validity of the housing standards addressing accessibility. This thesis addresses housing accessibility methodology in general and the reliability of assessment and the validity of standards targeting older people with functional limitations and a dependence on mobility devices...
The concept of inter-operability among a consortium of oil companies and E&P applications vendors was discussed. This paper describes the business problem, including cost, cycle time and efficiency, and presents a solution. Benefits for application developers will include a single development environment for cross-platform deployment; focus on value-added functionality instead of infrastructure; transparent access to widely used data stores; higher productivity and faster turnaround time to market. Application end users will also benefit through elimination of the need to reformat due to cross-platform and multi-vendor data access; utilization of Web browsers; integration with MS Office applications leveraging desktop tools; and lower purchase and integration cost of new applications. Strategic issues for application developers and for oil companies, and the competitive advantages of the E&P industry adopting OpenSpirit as the standard business object framework, were also reviewed
Legislation and regulations, a regulatory authority to authorise and inspect the regulated activities and to enforce the legislation and regulations, and sufficient financial and manpower resources are the essential parts of a national infrastructure to implement the Basic Safety Standards. The legal framework consists of legislation (an Act passed by Parliament) and regulations (framed by the government and endorsed by Parliament). This paper primarily deals with the legal framework set up in India for atomic energy activities
FINE, V.; FISYAK, Y.; PEREVOZTCHIKOV, V.; WENAUS, T.
The Solenoidal Tracker At RHIC (STAR) is a large-acceptance collider detector, commissioned at Brookhaven National Laboratory in 1999. STAR has developed a software framework supporting simulation, reconstruction and analysis in offline production, interactive physics analysis and online monitoring environments that is well matched both to STAR's present status of transition between Fortran and C++ based software and to STAR's evolution to a fully OO software base. This paper presents the results of two years' effort developing a modular C++ framework based on the ROOT package that encompasses both wrapped Fortran components (legacy simulation and reconstruction code) served by IDL-defined data structures, and fully OO components (all physics analysis code) served by a recently developed object model for event data. The framework supports chained components, which can themselves be composite subchains, with components ("makers") managing "data sets" they have created and are responsible for. An St-DataSet class from which data sets and makers inherit allows the construction of hierarchical organizations of components and data, and centralizes almost all system tasks such as data set navigation, I/O, database access, and inter-component communication. This paper will present an overview of this system, now deployed and well exercised in production environments with real and simulated data, and in an active physics analysis development program
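The chained-component pattern described here — makers that own the data sets they create, arranged in a hierarchy where a chain can itself be a maker — can be sketched compactly. This is an illustrative Python analogue; the class and method names are invented, not STAR's actual C++ API.

```python
class DataSet:
    """Node in a hierarchical organization of components and data."""
    def __init__(self, name, payload=None):
        self.name, self.payload, self.children = name, payload, []

    def add(self, child):
        self.children.append(child)

    def find(self, name):
        """Centralized navigation: depth-first search by name."""
        if self.name == name:
            return self
        for c in self.children:
            hit = c.find(name)
            if hit:
                return hit
        return None

class Maker(DataSet):
    """A component that creates and owns the data sets it is responsible for."""
    def make(self):
        raise NotImplementedError

class Chain(Maker):
    """Composite maker: running it runs every sub-maker in order."""
    def make(self):
        for child in self.children:
            if isinstance(child, Maker):
                child.make()

class HitMaker(Maker):
    def make(self):
        self.add(DataSet("hits", payload=[(1.0, 2.0), (3.0, 4.0)]))

class TrackMaker(Maker):
    def __init__(self, name, chain):
        super().__init__(name)
        self.chain = chain    # inter-component communication goes through
                              # the shared dataset hierarchy, not direct calls
    def make(self):
        hits = self.chain.find("hits").payload
        self.add(DataSet("tracks", payload=len(hits)))

chain = Chain("bfc")          # a chain can itself hold subchains
chain.add(HitMaker("hit_maker"))
chain.add(TrackMaker("track_maker", chain))
chain.make()
```

Because makers and data sets share one base class, a single `find` implementation gives every component uniform navigation over the whole tree, which is the centralization the abstract credits to the common base class.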
CERN. Geneva HR-RFA
Suggested Readings: Aspects of Quantum Chromodynamics/A Pich, arXiv:hep-ph/0001118. - The Standard Model of Electroweak Interactions/A Pich, arXiv:hep-ph/0502010. - The Standard Model of Particle Physics/A Pich The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.
...) The investigation standard for "Q" access authorizations and for access to top secret (including top secret Special Access Programs) and Sensitive Compartmented Information; (c) The reinvestigation standard... authorizations and for access to confidential and secret (including all secret-level Special Access Programs not...
Stewart, J.; Lynge, J.; Hackathorn, E.; MacDermaid, C.; Pierce, R.; Smith, J.
Interoperability is a complex subject and often leads to different definitions in different environments. An interoperable framework of web services can improve the user experience by providing an interface for interaction with data regardless of its format or physical location. This in itself improves accessibility to data, fosters data exploration and use, and provides a framework for new tools and applications. With an interoperable system you have: -- Data ready for action. A services model facilitates agile response to events. Services can be combined or reused quickly, upgraded or modified independently. -- Any data available through an interoperable framework can be operated on or combined with other data, integrating standardized formats and access. -- New and existing systems have access to a wide variety of data. Any new data added is easily incorporated with minimal changes required. The possibilities are limitless. The NOAA Earth Information System (NEIS) at the Earth System Research Laboratory (ESRL) is continuing research into an interoperable framework of layered services designed to facilitate the discovery, access, integration, visualization, and understanding of all NOAA (past, present, and future) data. An underlying philosophy of NEIS is to take advantage of existing off-the-shelf technologies and standards to minimize development of custom code, allowing everyone to take advantage of the framework to meet the goals above. This framework, while built by NOAA, is not limited to NOAA data or applications. Any other data available through similar services or applications that understand these standards can work interchangeably. Two major challenges under active research at ESRL are data discoverability and fast access to big data. This presentation will provide an update on development of NEIS, including these challenges, the findings, and recommendations on what is needed for an interoperable system, as well as ongoing research activities
van Dam, Joris; Tadmor, Brigitta; Spector, Jonathan; Musuku, John; Zühlke, Liesl J; Engel, Mark E; Mayosi, Bongani M; Nestle, Nick
Summary Background Rheumatic heart disease (RHD) remains a major disease burden in low-resource settings globally. Patient registers have long been recognised to be an essential instrument in RHD control and elimination programmes, yet to date rely heavily on paper-based data collection and non-networked data-management systems, which limit their functionality. Objectives To assess the feasibility and potential benefits of producing an electronic RHD patient register. Methods We developed an eRegister based on the World Heart Federation’s framework for RHD patient registers using CommCare, an open-source, cloud-based software for health programmes that supports the development of customised data capture using mobile devices. Results The resulting eRegistry application allows for simultaneous data collection and entry by field workers using mobile devices, and by providers using computer terminals in clinics and hospitals. Data are extracted from CommCare and are securely uploaded into a cloud-based database that matches the criteria established by the WHF framework. The application can easily be tailored to local needs by modifying existing variables or adding new ones. Compared with traditional paper-based data-collection systems, the eRegister reduces the risk of data error, synchronises in real-time, improves clinical operations and supports management of field team operations. Conclusions The user-friendly eRegister is a low-cost, mobile, compatible platform for RHD treatment and prevention programmes based on materials sanctioned by the World Heart Federation. Readily adaptable to local needs, this paperless RHD patient register program presents many practical benefits. PMID:26444995
Probst, Christian W.; Hansen, René Rydhof
When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker's actions usually will be logged as permissible, standard actions—if they are logged at all. ... Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...
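The core idea — analysing the access control configuration itself, rather than logs, to determine who could have reached a location — amounts to reachability analysis over a credential-guarded graph. The sketch below illustrates this with an invented layout of rooms and credentials; it is a toy model of the approach, not the authors' actual tool.

```python
from collections import deque

# Edges of the access control configuration: (from, to, credential
# required to pass). All names are hypothetical.
EDGES = [
    ("outside", "lobby", None),
    ("lobby", "office", "staff_badge"),
    ("office", "server_room", "admin_key"),
    ("lobby", "server_room", "admin_key"),
]

def reachable(start, credentials):
    """All locations an actor holding `credentials` can reach from `start`,
    computed by breadth-first search over the guarded edges."""
    seen, queue = {start}, deque([start])
    while queue:
        here = queue.popleft()
        for src, dst, cred in EDGES:
            if src == here and dst not in seen and (cred is None or cred in credentials):
                seen.add(dst)
                queue.append(dst)
    return seen
```

An insider holding only a staff badge cannot reach the server room in this configuration, so the analysis excludes them from the set of possible actors for an incident there — without consulting a single log line.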
The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid as an effective distributed computing paradigm is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with the capability of high-performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic community and the industry community. Therefore, we propose an integrated framework by incorporating OWS standards into Grids. Upon this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.
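What makes OWS interfaces interoperable is that a request is just a standardized set of key-value parameters over HTTP; any compliant server, whether or not it sits behind Grid middleware, understands the same query. As a minimal sketch, here is a WMS 1.1.1 GetMap request composed with the Python standard library (the endpoint and layer name are placeholders):

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512):
    """Compose an OGC WMS 1.1.1 GetMap URL from standard parameters."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "elevation", (-180, -90, 180, 90))
```

The Grid layer in the proposed framework can then handle authentication and scheduling around such requests without changing their standardized form.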
Kelley, Jay; Wessels, Denzil
Network access control (NAC) is how you manage network security when your employees, partners, and guests need to access your network using laptops and mobile devices. Network Access Control For Dummies is where you learn how NAC works, how to implement a program, and how to take real-world challenges in stride. You'll learn how to deploy and maintain NAC in your environment, identify and apply NAC standards, and extend NAC for greater network security. Along the way you'll become familiar with what NAC is (and what it isn't) as well as the key business drivers for deploying NAC.Learn the step
Walter, J.; de la Beaujardiere, J.; Bristol, S.
The United States Group on Earth Observations (USGEO) Data Management Working Group (DMWG) is an interagency body established by the White House Office of Science and Technology Policy (OSTP). The primary purpose of this group is to foster interagency cooperation and collaboration for improving the life cycle data management practices and interoperability of federally held earth observation data consistent with White House documents including the National Strategy for Civil Earth Observations, the National Plan for Civil Earth Observations, and the May 2013 Executive Order on Open Data (M-13-13). The members of the USGEO DMWG are working on developing a Common Framework for Earth Observation Data that consists of recommended standards and approaches for realizing these goals as well as improving the discoverability, accessibility, and usability of federally held earth observation data. These recommendations will also guide work being performed under the Big Earth Data Initiative (BEDI). This talk will summarize the Common Framework, the philosophy behind it, and next steps forward.
Control of grid user payment. Antitrust legal standards of control for the examination of grid user payments of the german operators of electricity distribution networks in the system of the negotiated grid access; Netznutzungsentgeltkontrolle. Kartellrechtliche Kontrollmassstaebe fuer die Ueberpruefung von Netznutzungsentgelten der deutschen Elektrizitaetsverteilungsnetzbetreiber im System des verhandelten Netzzungangs
For years there has been controversy concerning the permissible level of payments for the use of distribution networks in the electricity supply under the system of negotiated grid access. Against this background, the author of the contribution under consideration reports on antitrust legal standards of control for the examination of grid user payments of the German operators of electricity distribution networks. The main aspects are: test standard; relation to energy law; market demarcation; position of the norm addressee; control methods; spatial comparison of internal prices; control of costs.
Wearable health tech provides doctors with the ability to remotely supervise their patients' wellness. It also makes it much easier than ever before to authorize someone else to take appropriate actions to ensure the person's wellness. Information technology may soon change the way medicine is practiced, improving performance while reducing the price of healthcare. We analyzed the secrecy demands of wearable devices, including smartphones and smart watches, and their computing techniques, which may soon change the way healthcare is provided. However, before this is adopted in practice, all devices must be equipped with sufficient privacy capabilities related to healthcare service. In this paper, we formulated a new, improved conceptual framework for wearable healthcare systems. This framework consists of ten principles and nine checklists, capable of providing a complete privacy protection package to wearable device owners. We constructed this framework based on the analysis of existing mobile technology, the results of which are combined with existing security standards. The approach also incorporates the market share percentage level of every app and its respective OS. This framework is evaluated based on the stringent CIA and HIPAA principles for information security. This evaluation is followed by testing the capability to revoke rights of subjects to access objects and the ability to determine the set of available permissions for a particular subject, for all models. Finally, as the last step, we examine the complexity of the required initial setup.
Kamaraj, Deepan C; Bray, Nathan; Rispin, Karen; Kankipati, Padmaja; Pearlman, Jonathan; Borg, Johan
Currently, inadequate wheelchair provision has forced many people with disabilities to be trapped in a cycle of poverty and deprivation, limiting their ability to access education, work and social facilities. This issue is in part because of the lack of collaboration among the various stakeholders who need to work together to design, manufacture and deliver such assistive mobility devices. This in turn has led to inadequate evidence about intervention effectiveness, disability prevalence and subsequent cost-effectiveness that would help facilitate appropriate provision and support for people with disabilities. In this paper, we describe a novel conceptual framework that can be tested across the globe to study and evaluate the effectiveness of wheelchair provision. The Comparative Effectiveness Research Subcommittee (CER-SC), consisting of the authors of this article, housed within the Evidence-Based Practice Working Group (EBP-WG) of the International Society of Wheelchair Professionals (ISWP), conducted a scoping review of scientific literature and standard practices used during wheelchair service provision. The literature review was followed by a series of discussion groups. The three iterations of the conceptual framework are described in this manuscript. We believe that adoption of this conceptual framework could have broad applications in wheelchair provision globally to develop evidence-based practices. Such a perspective will help in the comparison of different strategies employed in wheelchair provision and further improve clinical guidelines. Further work is being conducted to test the efficacy of this conceptual framework to evaluate effectiveness of wheelchair service provision in various settings across the globe.
Molfetas, Angelos; Dimitrov, Gancho; Lassnig, Mario; Garonne, Vincent; Stewart, Graeme; Barisits, Martin; Beermann, Thomas
This paper describes a monitoring framework for large-scale data management systems with frequent data access. This framework allows large data management systems to generate meaningful information from collected tracing data and to be queried on demand for specific user usage patterns with respect to source and destination locations, period intervals, and other searchable parameters. The feasibility of such a system at the petabyte scale is demonstrated by describing the implementation and operational experience of a real-world management information system for the ATLAS experiment employing the proposed framework. Our observations suggest that the proposed user monitoring framework is capable of scaling to meet the needs of very large data management systems.
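The query model described — traces collected once, then filtered on demand by source, destination, and period — can be sketched as a tiny in-memory store. Field names and sites are illustrative; a petabyte-scale deployment would of course sit on a real database rather than a Python list.

```python
TRACES = []

def record(src, dst, nbytes, timestamp):
    """Collect one data-access trace."""
    TRACES.append({"src": src, "dst": dst, "bytes": nbytes, "ts": timestamp})

def query(src=None, dst=None, since=None, until=None):
    """Return traces matching every constraint that was supplied;
    omitted parameters act as wildcards."""
    def match(t):
        return ((src is None or t["src"] == src)
                and (dst is None or t["dst"] == dst)
                and (since is None or t["ts"] >= since)
                and (until is None or t["ts"] <= until))
    return [t for t in TRACES if match(t)]

record("CERN", "BNL", 10**9, 100)
record("CERN", "TRIUMF", 5 * 10**8, 200)
record("BNL", "TRIUMF", 10**8, 300)

transfers_from_cern = query(src="CERN")   # filter by source site
early = query(until=150)                  # filter by period interval
```

Separating collection from on-demand querying is what lets the same trace store answer many different usage-pattern questions without re-instrumenting the system.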
Improving web site accessibility has proven to be a challenging task, with a myriad of standards, accessibility testing tools and few technical guides for implementation. This paper presents the South African National Accessibility Portal (NAP), which is used...
If It Ain't Broke, Why Fix It? Framework and Processes for Engaging in Constructive Institutional Development and Renewal in the Context of Increasing Standards, Assessments, and Accountability for University-Based Teacher Preparation
Lit, Ira; Nager, Nancy; Snyder, Jon David
In this article, the authors offer a descriptive essay outlining the framework and processes of a five-year institutional renewal effort at Bank Street College of Education. Extended the opportunity to participate in the "Teachers for a New Era" (TNE) initiative, a multi-year, multi-million dollar effort to enhance and "radically…
Bastian, Nathaniel D; Munoz, David; Ventura, Marta
The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.
Ghassan F. Issa
This paper presents a comprehensive framework for building collaborative learning networks within higher educational institutions. This framework focuses on systems design and implementation issues, in addition to a complete set of evaluation and analysis tools. The objective of this project is to improve the standards of higher education in Jordan through the implementation of transparent, collaborative, innovative, and modern quality educational programs. The framework highlights the major steps required to plan, design, and implement collaborative learning systems. Several issues are discussed, such as unification of courses and programs of study, use of an appropriate learning management system, software design and development using the Agile methodology, infrastructure design, access issues, proprietary data storage, and social network analysis (SNA) techniques.
We may be at the cusp of a next generation framework for science which can be facilitated by understanding current limitations in the context of a divergence of 'scientific' tradition from the Axial Age (800-200 BCE) to the present. A powerful advance may come from fusing certain elements from Western and Eastern traditions, synthesizing the framework with an apt understanding of the divergence. Key traits will include the ethopoetic nature of the scientist with attention to his/her experience of self. The framework will also 'access' knowledge through a state of mind less encumbered with paradoxes, duality, incompatibility and other aporias. Case studies in biology and physics illustrate possibilities. Copyright © 2017 Elsevier Ltd. All rights reserved.
Temperature profile data from XBT casts from the WASHINGTON STANDARD and other platforms as part of the International Decade of Ocean Exploration (IDOE) from 1968-10-18 to 1972-10-18 (NODC Accession 7300888)
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected from XBT casts from the WASHINGTON STANDARD and other platforms from 18 October 1968 to 18 October 1972. Data were collected...
McGlynn, Thomas; Fabbiano, Giuseppina; Accomazzi, Alberto; Smale, Alan; White, Richard L.; Donaldson, Thomas; Aloisi, Alessandra; Dower, Theresa; Mazzerella, Joseph M.; Ebert, Rick; Pevunova, Olga; Imel, David; Berriman, Graham B.; Teplitz, Harry I.; Groom, Steve L.; Desai, Vandana R.; Landry, Walter
Since the turn of the millennium, astronomical archives have been providing data to the public through standardized protocols, unifying data from disparate physical sources and wavebands across the electromagnetic spectrum into an astronomical virtual observatory (VO). In October 2014, NASA began support for the NASA Astronomical Virtual Observatories (NAVO) program to coordinate the efforts of NASA astronomy archives in providing data to users through implementation of protocols agreed within the International Virtual Observatory Alliance (IVOA). A major goal of the NAVO collaboration has been to step back from a piecemeal implementation of IVOA standards and define what the appropriate presence for the US and NASA astronomy archives in the VO should be. This includes evaluating which optional capabilities in the standards need to be supported and which specific versions of standards should be used, and returning feedback to the IVOA to support modifications as needed. We discuss a standard archive model developed by the NAVO for data archive presence in the virtual observatory, built upon a consistent framework of standards defined by the IVOA. Our standard model provides for discovery of resources through the VO registries, access to observation and object data, downloads of image and spectral data, and general access to archival datasets. It defines specific protocol versions, minimum capabilities, and all dependencies. The model will evolve as the capabilities of the virtual observatory and the needs of the community change.
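The discovery protocols mentioned above are simple HTTP GET interfaces. For example, an IVOA Simple Cone Search request is just a base URL plus RA, DEC and SR (search radius) parameters in decimal degrees. A minimal sketch follows; the endpoint below is hypothetical, not a real NAVO service:

```python
from urllib.parse import urlencode

def cone_search_url(base_url, ra_deg, dec_deg, radius_deg):
    """Build an IVOA Simple Cone Search query: RA, DEC, SR in decimal degrees."""
    params = {"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg}
    return base_url + "?" + urlencode(params)

# Hypothetical service endpoint; real endpoints are discovered via VO registries.
url = cone_search_url("https://archive.example.gov/scs", 83.633, 22.014, 0.1)
```

In practice a client would discover real services and their capabilities through the VO registries rather than hard-coding an endpoint.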
Full Text Available The Conceptual Framework for Financial Reporting (CFfFR) was developed to provide guidance to users and preparers of financial reports and standards. However, general consensus within the accounting community is that the Conceptual Framework fails...
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Lucido, Jessica M.
The U.S. Geological Survey has developed an open-standard data integration framework for working efficiently and effectively with large collections of climate and other geoscience data. A web interface accesses catalog datasets to find data services. Data resources can then be rendered for mapping and dataset metadata are derived directly from these web services. Algorithm configuration and information needed to retrieve data for processing are passed to a server where all large-volume data access and manipulation takes place. The data integration strategy described here was implemented by leveraging existing free and open source software. Details of the software used are omitted; rather, emphasis is placed on how open-standard web services and data encodings can be used in an architecture that integrates common geographic and atmospheric data.
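As a concrete illustration of the kind of open-standard web service such a framework builds on, an OGC WMS GetMap request for rendering a dataset is just a parameterized URL. This is a hedged sketch; the endpoint and layer name are invented:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=512, height=512):
    """Assemble a WMS 1.3.0 GetMap request; bbox is (minx, miny, maxx, maxy) in CRS:84 order."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": "CRS:84",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical server and layer, purely for illustration.
url = wms_getmap_url("https://data.example.gov/wms", "air_temperature",
                     (-100.0, 40.0, -90.0, 50.0))
```

Because the request is a plain URL, the same dataset can be rendered by any standards-compliant mapping client without custom integration code, which is the point of the open-standard approach described above.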
Diaz-Merced, Wanda Liz; Casado, Johanna; Garcia, Beatriz; Aarnio, Alicia; Knierman, Karen; Monkiewicz, Jacqueline; Alicia Aarnio.
"Big Data" is a subject that has taken on special relevance today, particularly in Astrophysics, where continuous advances in technology are leading to ever larger data sets. A multimodal approach to the perception of astronomical data (achieved through sonification used for the processing of data) increases the detection of signals at very low signal-to-noise ratios and is of special importance for achieving greater inclusion in the field of Astronomy. In the last ten years, different software tools have been developed that perform sonification of astronomical data from tables or databases; among them, the best known and in multiplatform development are Sonification Sandbox, MathTrack, and xSonify. In order to determine the accessibility of software we propose to start by carrying out a conformity analysis against ISO (International Organization for Standardization) 9241-171:2008. This standard establishes the general guidelines that must be taken into account for accessibility in software design, and it applies to software used at work, in public places, and at home. To analyze the accessibility of web databases, we take into account the "Web Content Accessibility Guidelines (WCAG) 2.0", accepted and published by ISO in the ISO/IEC 40500:2012 standard. In this poster, we present a User Centered Design (UCD), Human Computer Interaction (HCI), and User Experience (UX) framework to address a non-segregational provision of access to bibliographic and telemetry databases in Astronomy. Our framework is based on an ISO evaluation of a selection of databases such as ADS, Simbad and SDSS. WCAG 2.0 and ISO 9241-171:2008 should not be taken as absolute accessibility standards: these guidelines are very general and do not address particularities. They are not a substitute for UCD, HCI and UX design and evaluation. Based on our results, this research presents the framework for a focus group and qualitative data analysis aimed to
Baring-Gould, Ian [National Renewable Energy Lab. (NREL), Golden, CO (United States); Burman, Kari [National Renewable Energy Lab. (NREL), Golden, CO (United States); Singh, Mohit [National Renewable Energy Lab. (NREL), Golden, CO (United States); Esterly, Sean [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mutiso, Rose [US Department of Energy, Washington, DC (United States); McGregor, Caroline [US Department of Energy, Washington, DC (United States)
Providing clean and affordable energy services to the more than 1 billion people globally who lack access to electricity is a critical driver for poverty reduction, economic development, improved health, and social outcomes. More than 84% of populations without electricity are located in rural areas where traditional grid extension may not be cost-effective; therefore, distributed energy solutions such as mini-grids are critical. To address some of the root challenges of providing safe, quality, and financially viable mini-grid power systems to remote customers, the U.S. Department of Energy (DOE) teamed with the National Renewable Energy Laboratory (NREL) to develop a Quality Assurance Framework (QAF) for isolated mini-grids. The QAF for mini-grids aims to address some root challenges of providing safe, quality, and affordable power to remote customers via financially viable mini-grids through two key components: (1) Levels of service: Defines a standard set of tiers of end-user service and links them to technical parameters of power quality, power availability, and power reliability. These levels of service span the entire energy ladder, from basic energy service to high-quality, high-reliability, and high-availability service (often considered 'grid parity'); (2) Accountability and performance reporting framework: Provides a clear process of validating power delivery by providing trusted information to customers, funders, and/or regulators. The performance reporting protocol can also serve as a robust monitoring and evaluation tool for mini-grid operators and funding organizations. The QAF will provide a flexible alternative to rigid top-down standards for mini-grids in energy access contexts, outlining tiers of end-user service and linking them to relevant technical parameters. In addition, data generated through implementation of the QAF will provide the foundation for comparisons across projects, assessment of impacts, and greater confidence that
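The tier structure in component (1) of the QAF is straightforward to encode as data. The sketch below uses invented availability thresholds purely for illustration; the actual tier definitions live in the NREL/DOE framework documents:

```python
# Illustrative only: these tier names and hour thresholds are invented for
# the sketch, not taken from the published QAF.
TIERS = [
    ("Tier 1 (basic)", 4),      # at least 4 hours of service per day
    ("Tier 2", 8),
    ("Tier 3", 16),
    ("Tier 4 (grid parity)", 23),
]

def classify(hours_per_day):
    """Return the highest tier whose availability threshold is met."""
    level = "Below Tier 1"
    for name, threshold in TIERS:
        if hours_per_day >= threshold:
            level = name
    return level
```

A real implementation would combine several parameters (availability, reliability, power quality) rather than a single availability figure, as the QAF links each tier to all three.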
The National Rural Electric Cooperative Association (NRECA) believes that while access to the interconnected transmission system (grid) is necessary to provide the most efficient and economical development and use of the bulk power supply system, the grid cannot be unconditionally opened. Additionally, access should be provided only under reasonable terms, conditions, and cost-based compensation, within a framework of joint planning and coordinated operations. NRECA describes here its transmission policy, a coordinated planning and utilization model (CPU)
David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.
The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0 framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively such as implicit multithreading, and auto-documenting capabilities while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to framework by the use of specific APIs and/or data types they can more easily be reused both within the framework as well as outside of it. To study the effectiveness of an annotation based framework approach with other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the
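OMS 3.0 itself is Java and uses Java annotations; the following Python sketch only illustrates the underlying idea of declarative component metadata replacing framework API calls, using decorators as the closest Python analogue. All names here are invented:

```python
# Decorators attach metadata to the component class; the (hypothetical)
# framework would read _inputs/_outputs instead of requiring API calls.
def inport(name):
    def wrap(cls):
        cls._inputs = getattr(cls, "_inputs", []) + [name]
        return cls
    return wrap

def outport(name):
    def wrap(cls):
        cls._outputs = getattr(cls, "_outputs", []) + [name]
        return cls
    return wrap

@inport("precipitation")
@inport("temperature")
@outport("runoff")
class WaterBalance:
    """Toy monthly water-balance component: runoff is what precipitation
    exceeds a temperature-driven demand. Purely illustrative physics."""
    def execute(self, precipitation, temperature):
        demand = max(0.0, temperature) * 0.5
        return max(0.0, precipitation - demand)
```

Because the model class carries no framework imports beyond the metadata markers, it can be run, tested, or reused outside the framework, which is the non-invasiveness property the study above measures.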
Andersen, Ove Kjeld; Hjulmand, Christian
Approximately 30% of the Danish population has severe problems reading everyday text. In the light of the increasing amount of text available on the Internet, this poses a democratic challenge to ensure "equal access" to information. The Talking Internet service - Access For All (AFA) - offers ... a free Internet-based tool for reading aloud any marked text with a synthetic voice. The only requirements are a standard equipped PC running a recent Windows OS and an Internet connection. Experiences gathered from running the service for more than 28 months underline the viability of the concept. ... There is a clear need for a free Internet-based Danish text-to-speech synthesizer. Furthermore, the current state of technology, i.e. Internet bandwidth, response time and server technology, is sufficient for setting up an online automatic reading service that is used by a steadily growing number of individuals ...
U.S. Environmental Protection Agency — The National Water Quality Standards Database (WQSDB) provides access to EPA and state water quality standards (WQS) information in text, tables, and maps. This data...
Vanessa V Sochat
Full Text Available The administration of behavioral and experimental paradigms for psychology research is hindered by the lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (de Leeuw, 2015; McDonnell et al., 2012; Mason and Suri, 2011; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments and accelerate scientific progress by providing a shared community resource of psychological paradigms.
Mazzetti, Paolo; Latre, Miguel Á.; Ernst, Julia; Brumana, Raffaella; Brauman, Stefan; Nativi, Stefano
In October 2014 the ENERGIC-OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), has started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". In ENERGIC-OD, Virtual Hubs are conceived as information systems supporting the full life cycle of Open Data: publishing, discovery and access. They facilitate the use of Open Data by lowering and possibly removing the main barriers which hampers geo-information (GI) usage by end-users and application developers. Data and data services heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It imposes end-users and developers to spend a lot of effort in accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer to specific challenges and to address specific use-cases. Thus, beyond a certain extent, heterogeneity is irreducible especially in interdisciplinary contexts. ENERGIC-OD Virtual Hubs address heterogeneity adopting a mediation and brokering approach: specific components (brokers) are dedicated to harmonize service interfaces, metadata and data models, enabling seamless discovery and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC-OD will integrate several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services. ENERGIC OD will deploy a
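The mediation and brokering approach can be pictured as a set of per-source adapters that normalize heterogeneous records into one common model, so that clients see a single discovery interface. The field names in this sketch are invented and are not the ENERGIC-OD schema:

```python
# One adapter per source type maps native records into a common model.
COMMON_FIELDS = ("title", "bbox", "access_url")

def adapt_csw(record):
    """Adapter for a (hypothetical) CSW-style record."""
    return {"title": record["dc:title"], "bbox": record["ows:BoundingBox"],
            "access_url": record["dc:references"]}

def adapt_custom(record):
    """Adapter for a (hypothetical) non-standard source."""
    return {"title": record["name"], "bbox": record["extent"],
            "access_url": record["url"]}

ADAPTERS = {"csw": adapt_csw, "custom": adapt_custom}

def broker_search(sources):
    """Query every source and return records harmonized to the common model."""
    results = []
    for kind, records in sources:
        results.extend(ADAPTERS[kind](r) for r in records)
    return results
```

The design point is that heterogeneity is absorbed in the adapters: adding a new infrastructure means writing one more adapter, while end-users and applications keep a single, stable view of the data.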
In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same
Vermont Center for Geographic Information — The Vermont Fish & Wildlife Department maintains developed fishing access areas. These sites provide public access to waters in Vermont for shore fishing...
and accessibility. Sensory accessibility accommodates aspects of a sensory disability and describes architectural design requirements needed to ensure access to architectural experiences. In the context of architecture, accessibility has become a design concept of its own. It is generally described as ensuring ... physical access to the built environment by accommodating physical disabilities. While the existing concept of accessibility ensures the physical access of everyone to a given space, sensory accessibility ensures the choice of everyone to stay and be able to participate and experience.
EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs
Regli, Laura; Zecchina, Adriano; Vitillo, Jenny G; Cocina, Donato; Spoto, Giuseppe; Lamberti, Carlo; Lillerud, Karl P; Olsbye, Unni; Bordiga, Silvia
We have recently highlighted that H-SSZ-13, a highly siliceous zeolite (Si/Al = 11.6) with a chabazitic framework, is the most efficient zeolitic material for hydrogen storage [A. Zecchina, S. Bordiga, J. G. Vitillo, G. Ricchiardi, C. Lamberti, G. Spoto, M. Bjørgen and K. P. Lillerud, J. Am. Chem. Soc., 2005, 127, 6361]. The aim of this new study is thus to clarify both the role played by the acidic strength and by the density of the polarizing centers hosted in the same framework topology in the increase of the adsorptive capabilities of the chabazitic materials towards H2. To achieve this goal, the volumetric experiments of H2 uptake (performed at 77 K) and the transmission IR experiment of H2 adsorption at 15 K have been performed on H-SSZ-13, H-SAPO-34 (the isostructural silico-aluminophosphate material with the same Brønsted site density) and H-CHA (the standard chabazite zeolite: Si/Al = 2.1) materials. We have found that a H2 uptake improvement has been obtained by increasing the acidic strength of the Brønsted sites (moving from H-SAPO-34 to H-SSZ-13). Conversely, the important increase of the Brønsted sites density (moving from H-SSZ-13 to H-CHA) has played a negative role. This unexpected behavior has been explained as follows. The additional Brønsted sites are in mutual interaction via H-bonds inside the small cages of the chabazitic framework and for most of them the energetic cost needed to displace the adjacent OH ligand is higher than the adsorption enthalpy of the OH...H2 adduct. From our work it can be concluded that proton exchanged chabazitic frameworks represent, among zeolites, the most efficient materials for hydrogen storage. We have shown that a proper balance between available space (volume accessible to hydrogen), high contact surface, and specific interaction with strong and isolated polarizing centers are the necessary characteristics requested to design better materials for molecular H2 storage.
Mislevy, J M; Schiller, M R; Wolf, K N; Finn, S C
To ascertain perceived access of dietitians to power in the workplace. The conceptual framework was Kanter's theory of organizational power. The Conditions for Work Effectiveness Questionnaire was used to measure perceived access to sources of power: information, support, resources, and opportunities. Demographic data were collected to identify factors that may enhance empowerment. The questionnaire was sent to a random sample of 348 dietitians chosen from members of the Clinical Nutrition Management dietetic practice group of the American Dietetic Association. Blank questionnaires were returned by 99 (28.4%) people not working as clinical nutrition managers, which left 249 in the sample. Descriptive statistics were used to organize and summarize data. One-way analysis of variance and t tests were performed to identify differences in responses based on levels of education, work setting, and information technology skills. Usable questionnaires were received from 178 people (71.5%). On a 5-point scale, scores for access to information (mean +/- standard deviation [SD] = 3.8 +/- 0.7), opportunity (mean +/- SD = 3.6 +/- 0.7), support (mean +/- SD = 3.2 +/- 0.9), and resources (mean +/- SD = 3.1 +/- 0.8) demonstrated that clinical nutrition managers perceived themselves as having substantial access to sources of empowerment. Those having higher levels of education, working in larger hospitals, having better-developed information technology skills, and using information technology more frequently had statistically significantly higher empowerment scores (P = ...) ... leadership roles in today's health care settings. Their power may be enhanced by asserting more pressure to gain greater access to sources of power: support, information, resources, and opportunities.
Megan E. Dempsey
Full Text Available Given the prevalence of the Information Literacy Competency Standards in the library profession for the past 15 years, and the heated debate that took place regarding whether or not the Framework for Information Literacy and the Standards could harmoniously co-exist, the article raises questions about the future of information literacy in higher education. We do not necessarily have answers to these questions, but offer our own perspectives, some insight into how the Standards have served New Jersey academic librarians in the past, and how we envision using the Framework and the Standards together to further information literacy instruction at our institutions. Discussions of these questions have led us to the conclusion that the Framework and the Standards serve different purposes and have different intended audiences and are thus both valuable to the library profession.
Specific safety measures for emergency lanes and shoulders of motorways : a proposal for motorways' authorities in the framework of the European research project Safety Standards for Road Design and Redesign SAFESTAR, Workpackage 1.1.
This workpackage is one of seven workpackages of the European SAFESTAR project, launched by DG VII. Focusing on safety standards and recommendations for the Trans-European Roadway Network (TERN), the workpackage considered safety measures on emergency lanes (stopping strips), which are inherent
González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E
The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed, which can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
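PROOF itself runs C++ TSelector code over ROOT trees; as a language-neutral illustration of the same pattern (split the events, apply the identical selection in parallel workers, merge the partial results), here is a hedged Python sketch with an invented selection cut:

```python
# Stand-alone analogy to the PROOF pattern, not PROOF/ROOT code.
from concurrent.futures import ThreadPoolExecutor

def select(chunk):
    """Per-worker selection: keep events above a made-up pT threshold."""
    return [e for e in chunk if e["pt"] > 20.0]

def parallel_select(events, n_workers=4):
    # Split events round-robin into one chunk per worker.
    chunks = [events[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(select, chunks))
    # Merge step: concatenate the per-worker selections.
    merged = []
    for part in partials:
        merged.extend(part)
    return merged
```

The value of a framework like PAF is precisely that it hides the splitting, scheduling and merging shown here behind a uniform configuration, regardless of whether the workers run via PROOF Lite on local cores or on a dynamic cluster.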
Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.
We provide a unified framework of nonlinear vector techniques outputting the lowest-ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is based on robust order-statistic theory and the minimization of the weighted distance function to the other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed filtering techniques such as the vector median, basic vector directional filter, directional distance filter, weighted vector median filters and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure with two independent weight vectors for the angular and distance domains of the vector space. In order to adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms taking advantage of weighted median filters and the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It will be shown that the proposed method has the required properties, such as the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of a training procedure and, finally, simplicity of filter representation, analysis, design and implementation. Simulation studies also indicate that the new filters are computationally attractive and have excellent performance in environments corrupted by bit errors and impulsive noise.
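With unit weights and the L2 norm, the selection rule described above (output the input sample minimizing the aggregated distance to the other samples) reduces to the classic vector median filter. A minimal sketch over a window of RGB-like vectors:

```python
import math

def dist(a, b):
    """Euclidean (L2) distance between two vectors."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def vector_median(window):
    """Return the input vector whose summed distance to all samples is smallest."""
    costs = [sum(dist(x, y) for y in window) for x in window]
    return window[costs.index(min(costs))]

# An impulsive outlier (250, 0, 0) in a window of similar pixels is rejected,
# because its summed distance to the cluster is large.
window = [(10, 10, 10), (12, 11, 10), (11, 10, 12), (250, 0, 0)]
result = vector_median(window)
```

Because the output is always one of the input samples, the filter never introduces new (possibly unnatural) color vectors, which is why selection filters of this kind perform well under impulsive noise.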
Daniel, Christel; García Rojo, Marcial; Bourquard, Karima; Henin, Dominique; Schrader, Thomas; Della Mea, Vincenzo; Gilbertson, John; Beckwith, Bruce A
Integrating anatomic pathology information (text and images) into electronic health care records is a key challenge for enhancing clinical information exchange between anatomic pathologists and clinicians. The aim of the Integrating the Healthcare Enterprise (IHE) international initiative is precisely to ensure interoperability of clinical information systems by using existing widespread industry standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level Seven (HL7). To define standard-based informatics transactions to integrate anatomic pathology information into the Healthcare Enterprise. We used the methodology of the IHE initiative. Working groups from IHE, HL7, and DICOM, with special interest in anatomic pathology, defined consensual technical solutions to provide end-users with improved access to consistent information across multiple information systems. The IHE anatomic pathology technical framework describes a first integration profile, "Anatomic Pathology Workflow," dedicated to the diagnostic process including basic image acquisition and reporting solutions. This integration profile relies on 10 transactions based on HL7 or DICOM standards. A common specimen model was defined to consistently identify and describe specimens in both HL7 and DICOM transactions. The IHE anatomic pathology working group has defined standard-based informatics transactions to support the basic diagnostic workflow in anatomic pathology laboratories. In further stages, the technical framework will be completed to manage whole-slide images and semantically rich structured reports in the diagnostic workflow and to integrate systems used for patient care and those used for research activities (such as tissue bank databases or tissue microarrayers).
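The "common specimen model" can be imagined as a small shared data structure that both HL7 and DICOM transactions reference by identifier. The fields below are illustrative only, not the normative IHE/DICOM specimen attributes:

```python
# Hypothetical sketch of a shared specimen model; real IHE transactions
# carry many more attributes (collection procedure, fixative, stain, etc.).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Specimen:
    specimen_id: str
    container_id: str
    parent_id: Optional[str] = None  # e.g. a block cut from a gross specimen

def lineage(specimens, specimen_id):
    """Follow parent links (slide -> block -> gross specimen)."""
    by_id = {s.specimen_id: s for s in specimens}
    chain, cur = [], by_id[specimen_id]
    while cur is not None:
        chain.append(cur.specimen_id)
        cur = by_id.get(cur.parent_id) if cur.parent_id else None
    return chain
```

Keeping one model with stable identifiers is what lets an HL7 order message and a DICOM image object refer unambiguously to the same physical specimen.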
Reddy, Sandeep; Wakerman, John; Westhorp, Gill; Herring, Sally
The Remote Primary Health Care Manuals (RPHCM) project team manages the development and publication of clinical protocols and procedures for primary care clinicians practicing in remote Australia. The Central Australian Rural Practitioners Association Standard Treatment Manual, the flagship manual of the RPHCM suite, has been evaluated for accessibility and acceptability in remote clinics three times in its 20-year history. These evaluations did not consider a theory-based framework or a programme theory, resulting in some limitations with the evaluation findings. With the RPHCM having an aim of enabling evidence-based practice in remote clinics and anecdotally reported to do so, testing this empirically for the full suite is vital for both stakeholders and future editions of the RPHCM. The project team utilized a realist evaluation framework to assess how, why and for what the RPHCM were being used by remote practitioners. A theory regarding the circumstances in which the manuals have and have not enabled evidence-based practice in the remote clinical context was tested. The project assessed this theory for all the manuals in the RPHCM suite, across government and aboriginal community-controlled clinics, in three regions of Australia. Implementing a realist evaluation framework to generate robust findings in this context has required innovation in the evaluation design and adaptation by researchers. This article captures the RPHCM team's experience in designing this evaluation. © 2015 John Wiley & Sons, Ltd.
... Laboratories (PNL) to establish the standards for the NIMS initiative. This entire situation creates another set of inconsistencies in criteria for standard-setting agencies: NFPA, PNL, the American National Standards Institute (ANSI), and others (www.hsaj.org/?fullarticle=1.2.2, accessed June 12, 2007). ... with Pacific Northwest Laboratory (PNL) to develop the NIC standards. The Incident Management Standards Working Group (IMSWG) was established
Burgess, Cliff; Moore, Guy
List of illustrations; List of tables; Preface; Acknowledgments; Part I. Theoretical Framework: 1. Field theory review; 2. The standard model: general features; 3. Cross sections and lifetimes; Part II. Applications: Leptons: 4. Elementary boson decays; 5. Leptonic weak interactions: decays; 6. Leptonic weak interactions: collisions; 7. Effective Lagrangians; Part III. Applications: Hadrons: 8. Hadrons and QCD; 9. Hadronic interactions; Part IV. Beyond the Standard Model: 10. Neutrino masses; 11. Open questions, proposed solutions; Appendix A. Experimental values for the parameters; Appendix B. Symmetries and group theory review; Appendix C. Lorentz group and the Dirac algebra; Appendix D. ξ-gauge Feynman rules; Appendix E. Metric convention conversion table; Select bibliography; Index.
Full Text Available This article presents the project that I led for HEFCE on the implications of OA (open access) for monographs and other long-form research publications. The likely requirement that books should be OA if submitted to the REF (Research Excellence Framework) after next means that OA development must be based on an understanding of the importance of the monograph in the AHSS (arts, humanities and social sciences), as well as the challenges involved in making the transition to online OA. The project focused on three issues and each is summarized in turn in the article: What is the place of the monograph and other long-form publications in AHSS disciplines that makes it so important? What is happening to the monograph and is there a crisis, as some suggest? What are the issues involved in moving monographs into a digital and OA environment – not just the challenge of effective business models but also many other aspects of sustaining and enhancing the qualities of the monograph? These include third-party rights, technical challenges, licences and the need for international collaboration.
The attempts to develop models beyond the Standard Model are briefly reviewed paying particular regard to the mechanisms responsible for symmetry breaking and mass generation. A comparison is made of the theoretical expectations with recent precision measurements for theories with composite Higgs and for supersymmetric theories with elementary Higgs boson(s). The implications of a heavy top quark and the origin of the light quark and lepton masses and mixing angles are considered within these frameworks.
Stellinga, B.; Mügge, D.
The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed
Specifications and Standards; Guide Specifications; CIDs; and NGSs. Learn. Perform. Succeed. STANDARDIZATION DOCUMENTS: Federal Specifications, Commercial ... national or international standardization document developed by a private sector association, organization, or technical society that plans ... Maintain lessons learned. Examples: guidance for application of a technology; lists of options. DEFENSE HANDBOOK
Full Text Available Television White Space (TVWS... have been promoted by leading ICT regulators in the United States and Europe over the past decade. For instance, the US Federal Communications Commission (FCC) and the Office of Communications (Ofcom) in the United Kingdom (UK) went through...
The contemporary challenge facing education in South Africa is finding ways to assist the vast majority of school-leavers who do not qualify for direct entry into higher ... These programmes were institution-based and had very few uniform characteristics in terms of duration and curriculum; moreover, they failed to provide any ...
Spaccapietra, S.; Rinderle, S.B.; Reichert, M.U.
For several reasons enterprises are frequently subject to organizational change. Respective adaptations may concern business processes, but also other components of an enterprise architecture. In particular, changes of organizational structures often become necessary. The information about
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies. Qualitative spatial reasoning deals with knowledge about an infinite spatial domain using a finite set of qualitative relations without using numerical computation. Qualitative knowledge is relative knowledge, where we obtain the knowledge on the basis of comparison of features within the object domain rather than using some external scales. Reasoning is an inte...
Several useful hardware functionalities located at the low level of the operating system on mobile phones could be utilized in a better way if they were available to application developers. With their help, developers would be able to bring the overall user experience to a new level by developing novel applications. For instance, one of those hardware functionalities, SIM-card authentication, is able to offer a stronger and more convenient way of authentication when compared to the traditional approach...
Olesen, Henning; Khajuria, Samant
Today, data is money. Whether it is private users' personal data or confidential data and assets belonging to service providers, all parties have a strong need to protect their resources when interacting with each other, i.e. for access control and authorization measures to be deployed. Enabling advanced user controlled privacy is essential to realize the visions of 5G applications and services. For service providers and enterprises resources are usually well safeguarded, while private users are often missing the tools and the know-how to protect their own data and preserve their privacy. The user ... the framework of User Managed Access (UMA) can enable users to understand the value of their protected resources and possibly give them control of how their data will be used by service providers.
Plante, R.; Fitzpatrick, M.; Graham, M.; Tody, D.; Young, W.
We introduce two products for accessing the VO from Python: PyVO and VOClient. PyVO is built on the widely-used Astropy package and is well suited for integrating automated access to astronomical data into highly customizable scripts and applications for data analysis in Python. VOClient is built on a collection of C libraries and is well suited for integration with multi-language analysis packages. It also provides a framework for integrating legacy software into the Python environment. In this demo, we will run through several examples demonstrating basic data discovery and retrieval: finding archives containing data of interest (VO registry), retrieving datasets (SIA, SSA), and exploring (Cone Search, SLAP). VOClient features some extended capabilities, including the ability to communicate with other desktop applications from a script using the SAMP protocol.
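A Cone Search service returns its matches as a VOTable document, which is what a client library like PyVO ultimately consumes. As a rough, self-contained illustration, the sketch below parses a hand-made (hypothetical) VOTable with Python's standard library; a real query would go through PyVO against a live service.

```python
import xml.etree.ElementTree as ET

# Hand-made (hypothetical) VOTable, shaped like a Cone Search response.
SAMPLE_VOTABLE = """<?xml version="1.0"?>
<VOTABLE xmlns="http://www.ivoa.net/xml/VOTable/v1.3">
  <RESOURCE type="results">
    <TABLE>
      <FIELD name="ra" datatype="double"/>
      <FIELD name="dec" datatype="double"/>
      <FIELD name="name" datatype="char" arraysize="*"/>
      <DATA><TABLEDATA>
        <TR><TD>180.468</TD><TD>-18.866</TD><TD>NGC 4038</TD></TR>
        <TR><TD>180.472</TD><TD>-18.885</TD><TD>NGC 4039</TD></TR>
      </TABLEDATA></DATA>
    </TABLE>
  </RESOURCE>
</VOTABLE>"""

VOT = "{http://www.ivoa.net/xml/VOTable/v1.3}"

def parse_votable(xml_text):
    """Return the table rows as dicts keyed by FIELD name."""
    root = ET.fromstring(xml_text)
    fields = [f.get("name") for f in root.iter(VOT + "FIELD")]
    return [dict(zip(fields, (td.text for td in tr)))
            for tr in root.iter(VOT + "TR")]
```

The same row-oriented view is what PyVO exposes as result records, so a script can iterate over matches without touching the XML directly.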
Finnell, Joshua Eugene [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
US President Barack Obama issued Executive Order 13642, "Making Open and Machine Readable the New Default for Government Information," on May 9, 2013, mandating, wherever legally permissible and possible, that US Government information be made open to the public. This edict accelerated the construction of frameworks for data repositories, such as data.gov, and of data citation principles and practices. As a corollary, researchers across the country's national laboratories found themselves creating data management plans, applying dataset metadata standards, and ensuring the long-term access of data for federally funded scientific research.
What Open Access is. What Open Access is not. How is Open Access provided? Open Access archives or repositories. Open Access journals. Why should authors provide Open Access to their work? Further information and resources
The ''NPOC Strategic Plan for Building New Nuclear Plants'' creates a framework within which new standardized nuclear plants may be built. The Strategic Plan is an expression of the nuclear energy industry's serious intent to create the necessary conditions for new plant construction and operation. One of the key elements of the Strategic Plan is a comprehensive industry commitment to standardization: through design certification, combined license, first-of-a-kind engineering, construction, operation and maintenance of nuclear power plants. The NPOC plan proposes four stages of standardization in advanced light water reactors (ALWRs). The first stage is established by the ALWR Utility Requirements Document which specifies owner/operator requirements at a functional level covering all elements of plant design and construction, and many aspects of operations and maintenance. The second stage of standardization is that achieved in the NRC design certification. This certification level includes requirements, design criteria and bases, functional descriptions and performance requirements for systems to assure plant safety. The third stage of standardization, commercial standardization, carries the design to a level of completion beyond that required for design certification to enable the industry to achieve potential increases in efficiency and economy. The final stage of standardization is enhanced standardization beyond design. A standardized approach is being developed in construction practices, operating, maintenance training, and procurement practices. This comprehensive standardization program enables the NRC to proceed with design certification with the confidence that standardization beyond the regulations will be achieved. This confidence should answer the question of design detail required for design certification, and demonstrate that the NRC should require no further regulatory review beyond that required by 10 CFR Part 52
This document provides the technical framework for groundwater restoration under Phase II of the Uranium Mill Tailings Remedial Action (UMTRA) Project. A preliminary management plan for Phase II has been set forth in a companion document titled ''Preplanning Guidance Document for Groundwater Restoration''. General principles of site characterization for groundwater restoration, restoration methods, and treatment are discussed in this document to provide an overview of standard technical approaches to groundwater restoration
L. Vinhas de Souza
Here the author empirically estimates whether the different monetary and exchange rate frameworks observed in the Accession Countries of Central and Eastern Europe and the Baltics yield different outcomes in terms of the level and variance of a set of nominal and real variables. The author
The current status of diabetes professional educational standards and competencies in the UK--a position statement from the Diabetes UK Healthcare Professional Education Competency Framework Task and Finish Group.
Walsh, N; George, S; Priest, L; Deakin, T; Vanterpool, G; Karet, B; Simmons, D
Diabetes is a significant health concern, both in the UK and globally. Management can be complex, often requiring high levels of knowledge and skills in order to provide high-quality and safe care. The provision of good, safe, quality care lies within the foundations of healthcare education, continuing professional development and evidence-based practice, which are inseparable and part of a continuum during the career of any health professional. Sound education provides the launch pad for effective clinical management and positive patient experiences. This position paper reviews and discusses work undertaken by a Working Group under the auspices of Diabetes UK with the remit of considering all health professional educational issues for people delivering care to people with diabetes. This work has scoped the availability of education for those within the healthcare system who may directly or indirectly encounter people with diabetes and reviews alignment to existing competency frameworks within the UK's National Health Service. © 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
Stokes, A V
Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three sections
Providers and Patients Caught Between Standardization and Individualization: Individualized Standardization as a Solution Comment on "(Re) Making the Procrustean Bed? Standardization and Customization as Competing Logics in Healthcare".
Ansmann, Lena; Pfaff, Holger
In their 2017 article, Mannion and Exworthy provide a thoughtful and theory-based analysis of two parallel trends in modern healthcare systems and their competing and conflicting logics: standardization and customization. This commentary further discusses the challenge of treatment decision-making in times of evidence-based medicine (EBM), shared decision-making and personalized medicine. From the perspective of systems theory, we propose the concept of individualized standardization as a solution to the problem. According to this concept, standardization is conceptualized as a guiding framework leaving room for individualization in the patient physician interaction. The theoretical background is the concept of context management according to systems theory. Moreover, the comment suggests multidisciplinary teams as a possible solution for the integration of standardization and individualization, using the example of multidisciplinary tumor conferences and highlighting its limitations. The comment also supports the authors' statement of the patient as co-producer and introduces the idea that the competing logics of standardization and individualization are a matter of perspective on macro, meso and micro levels. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Entity Framework 4.0 Recipes provides an exhaustive collection of ready-to-use code solutions for Microsoft's Entity Framework, Microsoft's vision for the future of data access. Entity Framework is a model-centric data access platform with an ocean of new concepts and patterns for developers to learn. With this book, you will learn the core concepts of Entity Framework through a broad range of clear and concise solutions to everyday data access tasks. Armed with this experience, you will be ready to dive deep into Entity Framework, experiment with new approaches, and develop ways to solve even
Hennig, Teresa; Hepworth, George; Yudovich, Dagi (Doug)
Authoritative and comprehensive coverage for building Access 2013 Solutions Access, the most popular database system in the world, just opened a new frontier in the Cloud. Access 2013 provides significant new features for building robust line-of-business solutions for web, client and integrated environments. This book was written by a team of Microsoft Access MVPs, with consulting and editing by Access experts, MVPs and members of the Microsoft Access team. It gives you the information and examples to expand your areas of expertise and immediately start to develop and upgrade projects. Exp
Hennig, Teresa; Griffith, Geoffrey L
A comprehensive guide to programming for Access 2010 and 2007. Millions of people use the Access database applications, and hundreds of thousands of developers work with Access daily. Access 2010 brings better integration with SQL Server and enhanced XML support; this Wrox guide shows developers how to take advantage of these and other improvements. With in-depth coverage of VBA, macros, and other programming methods for building Access applications, this book also provides real-world code examples to demonstrate each topic. Access is the leading database used worldwide, while VBA rem
Duarte, A.S.; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F.
Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent of operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains the necessary data specifications and the code or references to scripts. This is used by the user to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML-based remote procedure call protocol, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which decouples them from specific solutions. Currently there is an implementation of the FireCalc framework in Java, which uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis
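The client/server split over XML-RPC described above can be sketched in a few lines. FireCalc itself is implemented in Java; the Python sketch below only illustrates the language-independent XML-RPC mechanism, with an invented `run_analysis` action standing in for a real analysis script.

```python
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

# Hypothetical analysis action: the server receives an action name plus its
# input data and returns the result. (FireCalc itself is Java; this sketch
# only mirrors the XML-RPC client/server split it describes.)
def run_analysis(action, samples):
    if action == "mean":
        return sum(samples) / len(samples)
    raise ValueError("unknown action: " + action)

# Bind to an ephemeral port and serve requests in a background thread.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(run_analysis, "run_analysis")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client could equally be written in any language that speaks XML-RPC.
proxy = xmlrpc.client.ServerProxy("http://127.0.0.1:%d/" % port)
result = proxy.run_analysis("mean", [1.0, 2.0, 3.0])

server.shutdown()
server.server_close()
```

Because the wire format is plain XML over HTTP, the same server could serve a Java or Scilab-driven client unchanged, which is the portability argument the abstract makes.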
Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup
Context aware network services are a new and interesting way to enhance network users' experience. A context aware application/service enhances network performance in relation to dynamic context information, e.g. mobility, location and device information, as it senses and reacts to environment changes. The reliability of the information accessed is a key factor in achieving reliable context aware applications. This paper will review the service degradation in Context Management Frameworks (CMF) and the effect of high network utilization, with particular focus on the reliability of the accessed information. The paper considers a developed framework from the ICT project OPEN, and investigates the impact of applying Differentiated Services (DiffServ) Quality of Service (QoS). The paper finally discusses how the insight gained can be utilized to ensure reliable remote accessed context information.
Trueness verification of actual creatinine assays in the European market demonstrates a disappointing variability that needs substantial improvement. An international study in the framework of the EC4 creatinine standardization working group.
Delanghe, Joris R; Cobbaert, Christa; Galteau, Marie-Madeleine; Harmoinen, Aimo; Jansen, Rob; Kruse, Rolf; Laitinen, Päivi; Thienpont, Linda M; Wuyts, Birgitte; Weykamp, Cas; Panteghini, Mauro
The European In Vitro Diagnostics (IVD) directive requires traceability to reference methods and materials of analytes. It is a task of the profession to verify the trueness of results and IVD compatibility. The results of a trueness verification study by the European Communities Confederation of Clinical Chemistry (EC4) working group on creatinine standardization are described, in which 189 European laboratories analyzed serum creatinine in a commutable serum-based material, using analytical systems from seven companies. Values were targeted using isotope dilution gas chromatography/mass spectrometry. Results were tested on their compliance to a set of three criteria: trueness, i.e., no significant bias relative to the target value, between-laboratory variation and within-laboratory variation relative to the maximum allowable error. For the lower and intermediate level, values differed significantly from the target value in the Jaffe and the dry chemistry methods. At the high level, dry chemistry yielded higher results. Between-laboratory coefficients of variation ranged from 4.37% to 8.74%. Total error budget was mainly consumed by the bias. Non-compensated Jaffe methods largely exceeded the total error budget. Best results were obtained for the enzymatic method. The dry chemistry method consumed a large part of its error budget due to calibration bias. Despite the European IVD directive and the growing needs for creatinine standardization, an unacceptable inter-laboratory variation was observed, which was mainly due to calibration differences. The calibration variation has major clinical consequences, in particular in pediatrics, where reference ranges for serum and plasma creatinine are low, and in the estimation of glomerular filtration rate.
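The study's compliance criteria (bias against the isotope-dilution GC/MS target value, plus within-laboratory variation) can be illustrated with a small calculation. The 4% limits below are placeholders chosen for illustration, not the EC4 working group's actual criteria.

```python
import statistics

def trueness_check(lab_results, target, max_bias_pct=4.0, max_cv_pct=4.0):
    """Hypothetical compliance check of one laboratory's creatinine results
    against a reference target value. The 4% bias and CV limits are
    illustrative placeholders, not the EC4 criteria."""
    mean = statistics.mean(lab_results)
    bias_pct = 100.0 * (mean - target) / target          # calibration bias
    cv_pct = 100.0 * statistics.stdev(lab_results) / mean  # within-lab variation
    return {"bias_pct": bias_pct,
            "cv_pct": cv_pct,
            "compliant": abs(bias_pct) <= max_bias_pct and cv_pct <= max_cv_pct}
```

Run over all participating laboratories, the bias term dominates the error budget whenever calibration differs from the reference, which is the pattern the study reports for the non-compensated Jaffe and dry chemistry methods.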
Get a thorough introduction to ADO.NET Entity Framework 4 -- Microsoft's core framework for modeling and interacting with data in .NET applications. The second edition of this acclaimed guide provides a hands-on tour of the framework's latest version in Visual Studio 2010 and .NET Framework 4. Not only will you learn how to use EF4 in a variety of applications, you'll also gain a deep understanding of its architecture and APIs. Written by Julia Lerman, the leading independent authority on the framework, Programming Entity Framework covers it all -- from the Entity Data Model and Object Service
Thelis R. S.
Full Text Available The main focus of the proposed research is maintaining the security of a network. An extranet is a popular network among most organizations, where network access is provided to a selected group of outsiders. Limiting access to an extranet can be carried out using the Access Control List (ACL) method. However, handling the workload of ACLs is an onerous task for the router. The purpose of the proposed research is to improve the performance and to solidify the security of the ACLs used in a small organization. Using a high-performance computer as a dedicated device to share and handle the router workload is suggested in order to increase the performance of the router when handling ACLs. Methods of detecting and directing sensitive data are also discussed in this paper. A framework is provided to help increase the efficiency of the ACLs in an organization's network using the above-mentioned procedures, thus making the organization's ACLs more secure and the system faster. Built-in methods of the Windows platform, or software for open source platforms, can be used to make a computer function as a router. Extended ACL features allow determining the type of packets flowing through the router. Combining these mechanisms allows the ACLs to be improved and to perform in a more efficient manner.
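The first-match-wins evaluation with an implicit deny that characterizes extended ACLs can be sketched as follows; the rule format, field names and example addresses are invented for illustration, not a vendor syntax.

```python
from ipaddress import ip_address, ip_network

# Hypothetical extended-ACL rules: matched on protocol, source network and
# destination port, evaluated top-down, first match wins.
RULES = [
    {"action": "permit", "proto": "tcp", "src": "10.0.0.0/24", "dport": 443},
    {"action": "deny",   "proto": "tcp", "src": "0.0.0.0/0",   "dport": 443},
    {"action": "permit", "proto": "udp", "src": "0.0.0.0/0",   "dport": 53},
]

def acl_decision(rules, proto, src_ip, dport):
    """Return 'permit' or 'deny' for a packet, with an implicit deny
    when no rule matches (as in router ACL semantics)."""
    for rule in rules:
        if (rule["proto"] == proto
                and ip_address(src_ip) in ip_network(rule["src"])
                and rule["dport"] == dport):
            return rule["action"]
    return "deny"  # implicit deny at the end of every ACL
```

The linear top-down scan is exactly why long ACLs burden a router, and why offloading the lookup to a dedicated machine, as the paper proposes, can help.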
Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak
Interoperability assets is the term applied to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were asked to evaluate the overall quality in use framework, 70% would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.
The article describes the benefits of and required process and recommendations for implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the amount of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation
Full Text Available Open access is a mode of academic communication that has been on the rise in recent years, but open access academic resources are widely dispersed across the internet, making them occasionally inconvenient to use. This research is focused on library and information science, using the OAIS reference model as the system framework and two open access platforms, DOAJ and E-LIS, as the data sources, and through system implementation develops a “library and information science open access journal union catalogue” system. Using the OAI-PMH protocol as the data interoperability standard and LAMP as the development environment, four major functionalities (ingest, archiving, management and access of information) were designed, developed, and integrated into the system build. Actual testing and verification showed this system is able to successfully collect data from the DOAJ and E-LIS open journal resources related to library and information science. The system is now active and functional, and can be used by researchers in the library and information science field.
Background Currently, inadequate wheelchair provision has forced many people with disabilities to be trapped in a cycle of poverty and deprivation, limiting their ability to access education, work and social facilities. This issue is due in part to the lack of collaboration among various stakeholders who need to work together to design, manufacture and deliver such assistive mobility devices. This in turn has led to inadequate evidence about intervention effectiveness, disability prevalence and subsequent cost-effectiveness that would help facilitate appropriate provision and support for people with disabilities. Objectives In this paper, we describe a novel conceptual framework that can be tested across the globe to study and evaluate the effectiveness of wheelchair provision. Method The Comparative Effectiveness Research Subcommittee (CER-SC), consisting of the authors of this article, housed within the Evidence-Based Practice Working Group (EBP-WG) of the International Society of Wheelchair Professionals (ISWP), conducted a scoping review of scientific literature and standard practices used during wheelchair service provision. The literature review was followed by a series of discussion groups. Results The three iterations of the conceptual framework are described in this manuscript. Conclusion We believe that adoption of this conceptual framework could have broad applications in wheelchair provision globally to develop evidence-based practices. Such a perspective will help in the comparison of different strategies employed in wheelchair provision and further improve clinical guidelines. Further work is being conducted to test the efficacy of this conceptual framework to evaluate effectiveness of wheelchair service provision in various settings across the globe. PMID:28936421
Erdal, Selnur; Catalyurek, Umit V.; Saltz, Joel; Kamal, Jyoti; Gurcan, Metin N.
Medical centers collect and store a significant amount of valuable data pertaining to patients' visits in the form of medical free text. In addition, standardized diagnosis codes (International Classification of Diseases, Ninth Revision, Clinical Modification: ICD9-CM) related to those dictated reports are usually available. In this work, we have created a framework where image searches can be initiated through a combination of free-text reports and ICD9 codes. This framework enables more comprehensive searches over existing large sets of patient data in a systematic way. The free-text search is enriched by computer-aided inclusion of additional search terms drawn from a thesaurus. This combination of enriched search allows users to access a larger set of relevant results from a patient-centric PACS in a simpler way. Therefore, such a framework is of particular use in tasks such as gathering images for desired patient populations, building disease models, and so on. As the motivating application of our framework, we implemented a search engine. This search engine processed two years of patient data from the OSU Medical Center's Information Warehouse and identified lung nodule location information using a combination of UMLS Metathesaurus-enhanced text report searches along with ICD9 code searches on patients who had been discharged. Five different queries with various ICD9 codes involving lung cancer were carried out on 172,552 cases. Each search completed in under a minute on average per ICD9 code, and the inclusion of the UMLS thesaurus increased the number of relevant cases by 45% on average.
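The combined search can be sketched as a filter over records that carry both a list of ICD9 codes and free text, with thesaurus-style term expansion. The records, codes and synonym table below are invented for illustration; the real framework queries an Information Warehouse and uses the UMLS Metathesaurus, neither of which is reproduced here.

```python
# Hypothetical dictated-report records with their ICD9-CM codes.
REPORTS = [
    {"id": 1, "icd9": ["162.9"], "text": "solitary pulmonary nodule, right upper lobe"},
    {"id": 2, "icd9": ["486"],   "text": "patchy airspace opacity consistent with pneumonia"},
    {"id": 3, "icd9": ["162.9"], "text": "lung mass with spiculated margins"},
]

# Thesaurus-style expansion: a query term maps to itself plus related terms.
SYNONYMS = {"nodule": ["nodule", "mass"]}

def search(records, icd9_code, term):
    """Return ids of records matching the ICD9 code AND any expanded term."""
    terms = SYNONYMS.get(term, [term])
    return [r["id"] for r in records
            if icd9_code in r["icd9"]
            and any(t in r["text"] for t in terms)]
```

Note how the expansion recovers record 3 ("mass"), which a literal search for "nodule" would miss; this is the mechanism behind the 45% increase in relevant cases reported above.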
Hooten, W Michael; Brummett, Chad M; Sullivan, Mark D; Goesling, Jenna; Tilburt, Jon C; Merlin, Jessica S; St Sauver, Jennifer L; Wasan, Ajay D; Clauw, Daniel J; Warner, David O
An urgent need exists to better understand the transition from short-term opioid use to unintended prolonged opioid use (UPOU). The purpose of this work is to propose a conceptual framework for understanding UPOU that posits the influence of 3 principal domains that include the characteristics of (1) individual patients, (2) the practice environment, and (3) opioid prescribers. Although no standardized method exists for developing a conceptual framework, the process often involves identifying corroborative evidence, leveraging expert opinion to identify factors for inclusion in the framework, and developing a graphic depiction of the relationships between the various factors and the clinical problem of interest. Key patient characteristics potentially associated with UPOU include (1) medical and mental health conditions; (2) pain etiology; (3) individual affective, behavioral, and neurophysiologic reactions to pain and opioids; and (4) sociodemographic factors. Also, UPOU could be influenced by structural and health care policy factors: (1) the practice environment, including the roles of prescribing clinicians, adoption of relevant practice guidelines, and clinician incentives or disincentives, and (2) the regulatory environment. Finally, characteristics inherent to clinicians that could influence prescribing practices include (1) training in pain management and opioid use; (2) personal attitudes, knowledge, and beliefs regarding the risks and benefits of opioids; and (3) professionalism. As the gatekeeper to opioid access, the behavior of prescribing clinicians directly mediates UPOU, with the 3 domains interacting to determine this behavior. This proposed conceptual framework could guide future research on the topic and allow plausible hypothesis-based interventions to reduce UPOU. Copyright © 2017 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
Tullney, Marco; van Wezenbeek, Wilma
Slides of an overview presentation given at a CESAER workshop on Open Access, February 2nd, 2017, in Brussels. Covers the major routes to more open access as discussed in the Task Force Open Science of CESAER: (national) open access strategies; open access mandates; open access incentives; open access awareness; open access publishing; open access infrastructure.
with the transformation agenda in South Africa. It is proposed that a comprehensive quality assurance framework with embedded commitment to access is likely to respond appropriately to national development prerogatives of higher education access. South African Journal of Higher Education Vol. 21 (3) 2007: pp. 385-399.
Wang, Wayne; Zou, Chen; Luo, Wenyi
This paper's goal is to provide a framework for the remote configuration and management of services for PON (Passive Optical Network) access and fiber access. It also defines how Auto-Configuration Servers (ACS) in the network can remotely configure, troubleshoot and manage a PON optical network termination (ONT) with layer 3 capabilities using the CPE WAN management protocol, TR-069.
Rogers-Shaw, Carol; Carr-Chellman, Davin J.; Choi, Jinhee
Universal Design for Learning (UDL) is a framework for the teaching-learning transaction that conceptualizes knowledge through learner-centered foci emphasizing accessibility, collaboration, and community. Given the importance of access to achieving social justice, UDL is a promising approach to meeting all learners' needs more effectively. In…
Zhang, Jiang; Guo, Hanqi; Yuan, Xiaoru
We present a novel high-order access dependencies based model for efficient pathline computation in unsteady flow visualization. By taking longer access sequences into account to model more sophisticated data access patterns in particle tracing, our method greatly improves the accuracy and reliability in data access prediction. In our work, high-order access dependencies are calculated by tracing uniformly-seeded pathlines in both forward and backward directions in a preprocessing stage. The effectiveness of our proposed approach is demonstrated through a parallel particle tracing framework with high-order data prefetching. Results show that our method achieves higher data locality and hence improves the efficiency of pathline computation.
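The high-order access-dependency idea described above can be sketched as a toy order-2 predictor: length-2 histories of block accesses, gathered from traced pathlines in a preprocessing pass, map to counts of the block accessed next, and the most frequent successor is prefetched. All names and traces here are illustrative, not from the paper:

```python
from collections import Counter, defaultdict

def build_dependencies(access_sequences, order=2):
    """Count high-order access dependencies: for each length-`order`
    history of data blocks, tally which block was accessed next."""
    deps = defaultdict(Counter)
    for seq in access_sequences:
        for i in range(len(seq) - order):
            history = tuple(seq[i:i + order])
            deps[history][seq[i + order]] += 1
    return deps

def predict_next(deps, history):
    """Return the most frequently observed successor for `history`,
    or None if the pattern was never seen during preprocessing."""
    counts = deps.get(tuple(history))
    if not counts:
        return None
    return counts.most_common(1)[0][0]

# Preprocessing stage: block-access traces from uniformly seeded pathlines.
traces = [["A", "B", "C", "D"], ["A", "B", "C", "E"], ["X", "B", "C", "D"]]
deps = build_dependencies(traces, order=2)
```

At tracing time, the current two-block history ("B", "C") would trigger a prefetch of block "D", the successor seen most often during preprocessing.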
Department of Transportation — This data set contains the personnel access card data (photo, name, activation/expiration dates, card number, and access level) as well as data about turnstiles and...
A report on how nine rail builders, operators and transport designers deal with design for accessibility.
Open Access (OA), defined most simply, means free full text online. There are over 130 Open Access journals hosted on the AJOL website. You can find a full list of these journals here: OA journals on AJOL ...
Cosma Emil; Jeflea Victor
By using Word, Excel or PowerPoint one can automate routine operations using the VBA language (Visual Basic for Applications). This language is also used in Access, allowing access to data stored in tables or queries; thus, Access and VBA resources can be used together. Access is designed for programming forms and reports (among other things), so none of the VBA editor's specific forms are found there.
Geisler, G C [Pennsylvania State University (United States)
At the conference there was a considerable interest in research reactor standards and effluent standards in particular. On the program, this is demonstrated by the panel discussion on effluents, the paper on argon 41 measured by Sims, and the summary paper by Ringle, et al. on the activities of ANS research reactor standards committee (ANS-15). As a result, a meeting was organized to discuss the proposed ANS standard on research reactor effluents (15.9). This was held on Tuesday evening, was attended by members of the ANS-15 committee who were present at the conference, participants in the panel discussion on the subject, and others interested. Out of this meeting came a number of excellent suggestions for changes which will increase the utility of the standard, and a strong recommendation that the effluent standard (15.9) be combined with the effluent monitoring standard. It is expected that these suggestions and recommendations will be incorporated and a revised draft issued for comment early this summer. (author)
Fichtner, N.; Becker, K.; Bashir, M.
This compilation of all nuclear standards available to the authors by mid 1980 represents the third, carefully revised edition of a catalogue which was first published in 1975 as EUR 5362. In this third edition several changes have been made. The title has been condensed. The information has again been carefully updated, covering all changes regarding status, withdrawal of old standards, new projects, amendments, revisions, splitting of standards into several parts, combination of several standards into one, etc., as available to the authors by mid 1980. The speed with which information travels varies and requires in many cases rather tedious and cumbersome inquiries. Also, the classification scheme has been revised with the goal of better adjustment to changing situations and priorities. Whenever it turned out to be difficult to attribute a standard to a single subject category, multiple listings in all relevant categories have been made. As in previous editions, within the subcategories the standards are arranged by organization (in Category 2.1 by country) alphabetically and in ascending numerical order. It covers all relevant areas of power reactors, the fuel cycle, radiation protection, etc., from the basic laws and governmental regulations, regulatory guides, etc., all the way to voluntary industrial standards and codes of practice. (orig./HP)
Thompson, Terrill; Primlani, Saroj; Fiedor, Lisa
The main goal of accessibility standards and guidelines is to design websites everyone can use. The "IT Accessibility Constituent Group" developed this set of draft guidelines to help EQ authors, reviewers, and staff and the larger EDUCAUSE community ensure that web content is accessible to all users, including those with disabilities. This…
Ostadzadeh, S. Shervin; Rahmani, Amir Masoud
Nowadays, the Operating System (OS) isn't only the software that runs your computer. In the typical information-driven organization, the operating system is part of a much larger platform for applications and data that extends across the LAN, WAN and Internet. An OS cannot be an island unto itself; it must work with the rest of the enterprise. Enterprise-wide applications require an Enterprise Operating System (EOS). The use of enterprise operating systems has brought an inevitable tendency to organize an enterprise's information activities in a comprehensive way. In this respect, Enterprise Architecture (EA) has proven to be the leading option for development and maintenance of enterprise operating systems. EA clearly provides a thorough outline of the whole information system comprising an enterprise. To establish such an outline, a logical framework needs to be laid upon the entire information system. The Zachman Framework (ZF) has been widely accepted as a standard scheme for identifying and organizing descriptive representations that have prominent roles in enterprise-wide system development. In this paper, we propose a framework based on ZF for enterprise operating systems. The presented framework helps developers to design and justify completely integrated business, IT systems, and operating systems, which results in an improved project success rate.
Bridges, Edwin M.; Groves, Barry R.
Explicates a conceptual framework for analyzing the politics of personnel evaluation in an educational context. Using several elements of the framework, discusses the politics of teacher evaluation in California in relation to the types of personnel evaluation decisions, the actors, their access to these decisions, sources and levels of power, and…
McKay, Sharon Cline
Discusses issues librarians need to consider when providing access to electronic journals. Topics include gateways; index and abstract services; validation and pay-per-view; title selection; integration with OPACs (online public access catalogs) or Web sites; paper availability; ownership versus access; usage restrictions; and services offered…
Full Text Available Abstract Uncensored exchange of scientific results hastens progress. Open Access does not stop at the removal of price and permission barriers; still, censorship and reading disabilities, to name a few, hamper access to information. Here, we invite the scientific community and the public to discuss new methods to distribute, store and manage literature in order to achieve unfettered access to literature.
Olesen, Henning; Khajuria, Samant
TODAY, DATA IS MONEY. Whether it is private users' personal data or confidential data and assets belonging to service providers, all parties have a strong need to protect their resources when interacting with each other, i.e. for access control and authorization. For service providers and enterprises, resources are usually well safeguarded, while private users are often missing the tools and the know-how to protect their own data and preserve their privacy. The user's personal data have become an economic asset, not necessarily to the owners of these data, but to the service providers, whose business models often include the use of these data. In this paper we focus on the user – service provider interaction and discuss how recent technological progress, in particular the framework of User Managed Access (UMA), can enable users to understand the value of their protected resources…
Stephen C. Dorner, MSc
Full Text Available Introduction: Under regulations established by the Affordable Care Act, insurance plans must meet minimum standards in order to be sold through the federal Marketplace. These standards to become a qualified health plan (QHP) include maintaining a provider network sufficient to assure access to services. However, the complexity of emergency physician (EP) employment practices – in which the EPs frequently serve as independent contractors of emergency departments, independently establish insurance contracts, etc. – and regulations governing insurance repayment may hinder the application of network adequacy standards to emergency medicine. As such, we hypothesized the existence of QHPs without in-network access to EPs. The objective is to identify whether there are QHPs without in-network access to EPs using information available through the federal Marketplace and publicly available provider directories. Results: In a national sample of Marketplace plans, we found that one in five provider networks lacks identifiable in-network EPs. QHPs lacking EPs spanned nearly half (44%) of the 34 states using the federal Marketplace. Conclusion: Our data suggest that the present regulatory framework governing network adequacy is not generalizable to emergency care, representing a missed opportunity to protect patient access to in-network physicians. These findings and the current regulations governing insurance payment to EPs disincentivize the creation of adequate physician networks, incentivize the practice of balance billing, and shift the cost burden to patients.
Juan D. Deaton; Luiz A. DaSilva; Christian Wernz
A current trend in spectrum regulation is to incorporate spectrum sharing through the design of spectrum access rules that support Dynamic Spectrum Access (DSA). This paper develops a decision-theoretic framework for regulators to assess the impacts of different decision rules on both primary and secondary operators. We analyze access rules based on sensing and exclusion areas, which in practice can be enforced through geolocation databases. Our results show that receiver-only sensing provides insufficient protection for primary and co-existing secondary users and overall low social welfare. On the other hand, using sensing information from both the transmitter and receiver of a communication link provides dramatic increases in system performance. The performance of using these link end points is relatively close to that of using many cooperative sensing nodes associated with the same access point and large link exclusion areas. These results are useful to regulators and network developers in understanding and developing rules for future DSA regulation.
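The decision-theoretic comparison of access rules can be illustrated with a minimal expected-welfare model: a secondary transmission always yields some gain, but a missed detection of an active primary imposes an interference cost. The miss probabilities, gains and costs below are assumed toy values, not numbers from the paper; sensing at both link end points is modeled as two independent detectors:

```python
def expected_welfare(p_miss, secondary_gain=1.0, interference_cost=5.0,
                     p_primary_active=0.3):
    """Toy expected social welfare of one secondary transmission:
    throughput gain minus the expected interference cost incurred
    when the primary is active but sensing misses it."""
    return secondary_gain - interference_cost * p_primary_active * p_miss

# Illustrative miss probabilities (assumed):
p_miss_rx_only = 0.4        # receiver-only sensing
p_miss_link = 0.4 * 0.4     # independent sensing at both link ends

w_rx = expected_welfare(p_miss_rx_only)
w_link = expected_welfare(p_miss_link)
```

Even in this crude sketch, combining transmitter- and receiver-side sensing sharply reduces the expected interference term, which is the qualitative effect the paper reports.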
Matsuda, Fumio; Nakabayashi, Ryo; Sawada, Yuji; Suzuki, Makoto; Hirai, Masami Y.; Kanaya, Shigehiko; Saito, Kazuki
A novel framework for automated elucidation of metabolite structures in liquid chromatography–mass spectrometer metabolome data was constructed by integrating databases. High-resolution tandem mass spectra data automatically acquired from each metabolite signal were used for database searches. Three distinct databases, KNApSAcK, ReSpect, and the PRIMe standard compound database, were employed for the structural elucidation. The outputs were retrieved using the CAS metabolite identifier for identification and putative annotation. A simple metabolite ontology system was also introduced to attain putative characterization of the metabolite signals. The automated method was applied for the metabolome data sets obtained from the rosette leaves of 20 Arabidopsis accessions. Phenotypic variations in novel Arabidopsis metabolites among these accessions could be investigated using this method. PMID:22645535
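The database-search step described above can be sketched as an accurate-mass lookup: each metabolite signal's m/z is matched against several compound databases within a ppm tolerance. The compound names and masses below are hypothetical stand-ins for KNApSAcK / ReSpect / PRIMe records:

```python
def search_databases(signal_mz, databases, tol_ppm=10.0):
    """Match an observed accurate mass against several compound
    databases within a ppm tolerance; return (db, name, mz) hits."""
    hits = []
    for db_name, entries in databases.items():
        for name, mz in entries:
            if abs(signal_mz - mz) / mz * 1e6 <= tol_ppm:
                hits.append((db_name, name, mz))
    return hits

# Hypothetical entries (illustrative masses, not authoritative values).
databases = {
    "KNApSAcK": [("sinapoylmalate", 339.0716)],
    "ReSpect": [("kaempferol", 285.0405)],
    "PRIMe": [("glucoraphanin", 436.0406)],
}
hits = search_databases(339.0714, databases)
```

In the full framework, hits from the three databases would then be reconciled via a common identifier and annotated through the ontology layer.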
Farmer, R. E.
The MATE (Modular Automatic Test Equipment) program was developed to combat the proliferation of unique, expensive ATE within the Air Force. MATE incorporates a standard management approach and a standard architecture designed to implement a cradle-to-grave approach to the acquisition of ATE and to significantly reduce the life cycle cost of weapons systems support. These standards are detailed in the MATE Guides. The MATE Guides assist both the Air Force and Industry in implementing the MATE concept, and provide the necessary tools and guidance required for successful acquisition of ATE. The guides also provide the necessary specifications for industry to build MATE-qualifiable equipment. The MATE architecture provides standards for all key interfaces of an ATE system. The MATE approach to the acquisition and management of ATE has been jointly endorsed by the commanders of Air Force Systems Command and Air Force Logistics Command as the way of doing business in the future.
The number of children and adolescents accessing the Internet as well as the amount of time online are steadily increasing. The most common online activities include playing video games, accessing web sites, and communicating via chat rooms, email, and instant messaging. A theoretical framework for understanding the effects of Internet use on…
Wireless communication is witnessing tremendous growth with the proliferation of different standards covering wide, local and personal area networks (WAN, LAN and PAN). The trends call for designs that allow 1) smooth migration to future generations of wireless standards with higher data rates for multimedia applications, 2) convergence of wireless services allowing access to different standards from the same wireless device, 3) inter-continental roaming. This requires designs that work across multiple wireless standards, can easily be reused, and achieve maximum hardware sharing at minimum power consumption levels, particularly for mobile battery-operated devices.
Escher, Beate I; Aït-Aïssa, Selim; Behnisch, Peter A; Brack, Werner; Brion, François; Brouwer, Abraham; Buchinger, Sebastian; Crawford, Sarah E; Du Pasquier, David; Hamers, Timo; Hettwer, Karina; Hilscherová, Klára; Hollert, Henner; Kase, Robert; Kienle, Cornelia; Tindall, Andrew J; Tuerk, Jochen; van der Oost, Ron; Vermeirssen, Etienne; Neale, Peta A
Effect-based methods including cell-based bioassays, reporter gene assays and whole-organism assays have been applied for decades in water quality monitoring and testing of enriched solid-phase extracts. There is no common EU-wide agreement on what level of bioassay response in water extracts is acceptable. At present, bioassay results are only benchmarked against each other but not against a consented measure of chemical water quality. The EU environmental quality standards (EQS) differentiate between acceptable and unacceptable surface water concentrations for individual chemicals but cannot capture the thousands of chemicals in water and their biological action as mixtures. We developed a method that reads across from existing EQS and includes additional mixture considerations with the goal that the derived effect-based trigger values (EBT) indicate acceptable risk for complex mixtures as they occur in surface water. Advantages and limitations of various approaches to read across from EQS are discussed and distilled to an algorithm that translates EQS into their corresponding bioanalytical equivalent concentrations (BEQ). The proposed EBT derivation method was applied to 48 in vitro bioassays with 32 of them having sufficient information to yield preliminary EBTs. To assess the practicability and robustness of the proposed approach, we compared the tentative EBTs with observed environmental effects. The proposed method only gives guidance on how to derive EBTs but does not propose final EBTs for implementation. The EBTs for some bioassays such as those for estrogenicity are already mature and could be implemented into regulation in the near future, while for others it will still take a few iterations until we can be confident of the power of the proposed EBTs to differentiate good from poor water quality with respect to chemical contamination. Copyright © 2018 Elsevier B.V. All rights reserved.
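The read-across from chemical standards to bioassay responses rests on bioanalytical equivalent concentrations: a mixture's BEQ is the potency-weighted sum of its component concentrations, which is then compared to an effect-based trigger value. The REPs, concentrations and trigger value below are purely illustrative assumptions, not values derived in the paper:

```python
def bioanalytical_equivalents(concentrations, reps):
    """BEQ of a mixture: sum of concentration x relative effect
    potency (REP, expressed relative to a reference compound)."""
    return sum(concentrations[c] * reps[c] for c in concentrations)

# Hypothetical REPs and measured concentrations (ng/L), illustrative only.
reps = {"chem_a": 1.0, "chem_b": 0.1, "chem_c": 0.01}
conc = {"chem_a": 0.2, "chem_b": 5.0, "chem_c": 40.0}

beq = bioanalytical_equivalents(conc, reps)
ebt = 1.0  # assumed effect-based trigger value for this bioassay (ng/L BEQ)
acceptable = beq <= ebt
```

This is the comparison the EBT approach enables: a single bioassay response, expressed as a BEQ, judged against one trigger value instead of thousands of per-chemical EQS.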
The Kepler space telescope is designed to detect Earth-like planets around Sun-like stars using transit photometry by simultaneously observing 100,000 stellar targets nearly continuously over a three and a half year period. The 96-megapixel focal plane consists of 42 charge-coupled devices (CCD) each containing two 1024 x 1100 pixel arrays. Cross-correlations between calibrated pixels are introduced by common calibrations performed on each CCD requiring downstream data products access to the calibrated pixel covariance matrix in order to properly estimate uncertainties. The prohibitively large covariance matrices corresponding to the 75,000 calibrated pixels per CCD preclude calculating and storing the covariance in standard lock-step fashion. We present a novel framework used to implement standard propagation of uncertainties (POU) in the Kepler Science Operations Center (SOC) data processing pipeline. The POU framework captures the variance of the raw pixel data and the kernel of each subsequent calibration transformation allowing the full covariance matrix of any subset of calibrated pixels to be recalled on-the-fly at any step in the calibration process. Singular value decomposition (SVD) is used to compress and low-pass filter the raw uncertainty data as well as any data dependent kernels. The combination of POU framework and SVD compression provide downstream consumers of the calibrated pixel data access to the full covariance matrix of any subset of the calibrated pixels traceable to pixel level measurement uncertainties without having to store, retrieve and operate on prohibitively large covariance matrices. We describe the POU Framework and SVD compression scheme and its implementation in the Kepler SOC pipeline.
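The core of the POU framework is that a linear calibration with kernel K transforms a pixel covariance C into K C Kᵀ, and that storing an SVD-compressed kernel lets the covariance of any pixel subset be rebuilt on the fly. A small numpy sketch with assumed dimensions (the real per-CCD matrices are ~75,000×75,000):

```python
import numpy as np

rng = np.random.default_rng(0)

# Raw-pixel variances (diagonal covariance) and a linear calibration
# kernel K; after calibration, Cov' = K Cov K^T.
n = 6
raw_var = rng.uniform(1.0, 2.0, n)
cov_raw = np.diag(raw_var)
K = rng.normal(size=(n, n))

cov_cal = K @ cov_raw @ K.T

# SVD compression of the kernel: keep only the r largest singular
# values, so a low-rank factor stands in for the full kernel and the
# covariance of any pixel subset can be recalled without storing it.
U, s, Vt = np.linalg.svd(K)
r = 4
K_r = (U[:, :r] * s[:r]) @ Vt[:r]
cov_approx = K_r @ cov_raw @ K_r.T
```

Chaining one such kernel per calibration step propagates raw-pixel variances through the whole pipeline while only low-rank factors are stored.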
Ulrich Fuller, Laurie
The easy guide to Microsoft Access returns with updates on the latest version! Microsoft Access allows you to store, organize, view, analyze, and share data; the new Access 2013 release enables you to build even more powerful, custom database solutions that integrate with the web and enterprise data sources. Access 2013 For Dummies covers all the new features of the latest version of Access and serves as an ideal reference, combining the latest Access features with the basics of building usable databases. You'll learn how to create an app from the Welcome screen, get support
Pro Access 2010 Development is a fundamental resource for developing business applications that take advantage of the features of Access 2010 and the many sources of data available to your business. In this book, you'll learn how to build database applications, create Web-based databases, develop macros and Visual Basic for Applications (VBA) tools for Access applications, integrate Access with SharePoint and other business systems, and much more. Using a practical, hands-on approach, this book will take you through all the facets of developing Access-based solutions, such as data modeling, co
Full Text Available The main objective of this work is to analyze and extend the security model of mobile devices running on Android OS. The provided security extension is a Linux kernel security module that allows the system administrator to restrict a program's capabilities with per-program profiles. Profiles can allow capabilities like network access, raw socket access, and the permission to read, write, or execute files on matching paths. The module supplements the traditional Android capability access control model by providing mandatory access control (MAC) based on path. This extension increases the security of access to system objects in a device and allows creating security sandboxes per application.
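The per-program profile concept can be sketched in user-space pseudo-form: a profile pairs an allowed capability set with path patterns mapped to permitted access modes, and every request is checked against it (mandatory, i.e. not overridable by the program). Profile contents here are hypothetical examples, not the module's actual syntax:

```python
from fnmatch import fnmatch

class Profile:
    """Per-program profile: an allowed capability set plus path
    patterns mapped to permitted modes (subset of 'rwx')."""
    def __init__(self, capabilities, path_rules):
        self.capabilities = set(capabilities)
        self.path_rules = path_rules  # list of (glob_pattern, modes)

    def allows_capability(self, cap):
        return cap in self.capabilities

    def allows_path(self, path, mode):
        # Mandatory check: access is denied unless some rule grants it.
        return any(fnmatch(path, pat) and mode in modes
                   for pat, modes in self.path_rules)

# Hypothetical sandbox profile: network allowed, raw sockets denied;
# read-only access confined to the app's own data directory.
profile = Profile({"NET"}, [("/data/app/*", "r")])
```

A kernel module enforcing this at the LSM hooks would consult the matching profile on every open/exec/socket call, defaulting to deny.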
Kaiser, Mary Elizabeth; Morris, Matthew J.; Aldoroty, Lauren N.; Pelton, Russell; Kurucz, Robert; Peacock, Grant O.; Hansen, Jason; McCandliss, Stephan R.; Rauscher, Bernard J.; Kimble, Randy A.; Kruk, Jeffrey W.; Wright, Edward L.; Orndorff, Joseph D.; Feldman, Paul D.; Moos, H. Warren; Riess, Adam G.; Gardner, Jonathan P.; Bohlin, Ralph; Deustua, Susana E.; Dixon, W. V.; Sahnow, David J.; Perlmutter, Saul
Establishing improved spectrophotometric standards is important for a broad range of missions and is relevant to many astrophysical problems. ACCESS, "Absolute Color Calibration Experiment for Standard Stars", is a series of rocket-borne sub-orbital missions and ground-based experiments designed to enable improvements in the precision of the astrophysical flux scale through the transfer of absolute laboratory detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35 - 1.7μm bandpass. This paper describes the sub-system testing, payload integration, avionics operations, and data transfer for the ACCESS instrument.
Permann, Cody; Alger, Brian; Peterson, John; Slaughter, Andrew; Andrš, David; Martineau, Richard
The MOOSE Framework is a modular pluggable framework for building complex simulations. The ability to add new objects with custom syntax is a core capability that makes MOOSE a powerful platform for coupling multiple applications together within a single environment. The creation of a new, more standardized JSON syntax output improves the external interfaces for generating graphical components or for validating input file syntax. The design of this interface and the requirements it satisfies are covered in this short report.
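A standardized JSON syntax dump enables exactly the two uses named above: driving graphical input builders and validating input files against the declared syntax. The fragment below is a hypothetical, much-simplified stand-in for the real MOOSE dump format:

```python
import json

# Hypothetical fragment of a JSON syntax description: each block
# lists its parameters and expected types (the real dump is richer).
syntax_json = """
{"Mesh": {"parameters": {"dim": "int", "nx": "int", "type": "string"}}}
"""
syntax = json.loads(syntax_json)

def validate_block(block_name, params, syntax):
    """Return the parameters of an input block that are not declared
    in the syntax description (candidates for typos)."""
    declared = syntax.get(block_name, {}).get("parameters", {})
    return [p for p in params if p not in declared]

unknown = validate_block("Mesh", {"dim": 2, "nx": 10, "typo_param": 1}, syntax)
```

A GUI could walk the same JSON tree to offer valid blocks and parameters, which is why a machine-readable dump improves on free-form help text.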
Barcenas, M.; Mejia, M.
The purpose of this paper is to present an overview of the current regulatory framework concerning the radioactive waste management in Mexico. It is intended to show regulatory historical antecedents, the legal responsibilities assigned to institutions involved in the radioactive waste management, the sources of radioactive waste, and the development and preparation of national standards for fulfilling the legal framework for low level radioactive waste. It is at present the most important matter to be resolved. (author)
Open Access is high on the agenda in Denmark and internationally. Denmark has announced a national strategy for Open Access that aims to achieve Open Access to 80% of peer-reviewed research articles in 2017 and 100% in 2022. All public Danish funders, as well as H2020, require that all peer-reviewed articles that are an outcome of their funding be Open Access. Uploading your full texts (your final author manuscript after review) to DTU Orbit is a fundamental part of providing Open Access to your research. We are here to answer all your questions with regard to Open Access and related topics such as copyright, DTU Orbit, Open Access journals, APCs, vouchers etc.
Sydee, Ahmed Nasim
In the first essay, a theoretical model is developed to determine the time path of optimal access price in the telecommunications industry. Determining the optimal access price is an important issue in the economics of telecommunications. Setting a high access price discourages potential entrants; a low access price, on the other hand, amounts to confiscation of private property because the infrastructure already built by the incumbent is sunk. Furthermore, a low access price does not give the incumbent incentives to maintain the current network and to invest in new infrastructures. Much of the existing literature on access pricing suffers either from the limitations of a static framework or from the assumption that all costs are avoidable. The telecommunications industry is subject to high stranded costs and, therefore, to address this issue a dynamic model is imperative. This essay presents a dynamic model of one-way access pricing in which the compensation involved in deregulatory taking is formalized and then analyzed. The short run adjustment after deregulatory taking has occurred is carried out and discussed. The long run equilibrium is also analyzed. A time path for the Ramsey price is shown as the correct dynamic price of access. In the second essay, a theoretical model is developed to determine the time path of optimal access price for an infrastructure that is characterized by congestion and lumpy investment. Much of the theoretical literature on access pricing of infrastructure prescribes that the access price be set at the marginal cost of the infrastructure. In proposing this rule of access pricing, the conventional analysis assumes that infrastructure investments are infinitely divisible so that it makes sense to talk about the marginal cost of investment. Often it is the case that investments in infrastructure are lumpy and can only be made in large chunks, and this renders the marginal cost concept meaningless. In this essay, we formalize a model of
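The essay's contribution is the dynamic time path of the access price; the static building block it extends is the Ramsey (inverse-elasticity) rule, which the toy below illustrates with assumed parameter values. The markup on marginal cost falls with demand elasticity and rises with the shadow cost λ of the incumbent's budget constraint:

```python
def ramsey_price(marginal_cost, elasticity, lam):
    """Static Ramsey price from the inverse-elasticity rule:
    (p - mc)/p = (lam / (1 + lam)) / elasticity, solved for p.
    Assumes the implied markup is below 1."""
    markup = (lam / (1.0 + lam)) / elasticity
    if not 0.0 <= markup < 1.0:
        raise ValueError("parameters imply an infeasible markup")
    return marginal_cost / (1.0 - markup)

# Illustrative values: mc = 10, demand elasticity 2, lambda = 1.
p = ramsey_price(marginal_cost=10.0, elasticity=2.0, lam=1.0)
```

The dynamic model in the essay traces how this price should evolve as stranded costs are recovered, which the static formula alone does not capture.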
Programming Entity Framework is a thorough introduction to Microsoft's new core framework for modeling and interacting with data in .NET applications. This highly-acclaimed book not only gives experienced developers a hands-on tour of the Entity Framework and explains its use in a variety of applications, it also provides a deep understanding of its architecture and APIs -- knowledge that will be extremely valuable as you shift to the Entity Framework version in .NET Framework 4.0 and Visual Studio 2010. From the Entity Data Model (EDM) and Object Services to EntityClient and the Metadata Work
Davidson, Patricia; Rushton, Cynda Hylton; Kurtz, Melissa; Wise, Brian; Jackson, Debra; Beaman, Adam; Broome, Marion
To develop a framework to enable discussion, debate and the formulation of interventions to address ethical issues in nursing practice. Social, cultural, political and economic drivers are rapidly changing the landscape of health care in our local environments but also in a global context. Increasingly, nurses are faced with a range of ethical dilemmas in their work. This requires investigation into the culture of healthcare systems and organisations to identify the root causes and address the barriers and enablers of ethical practice. The increased medicalisation of health care; pressures for systemisation; efficiency and cost reduction; and an ageing population contribute to this complexity. Often, ethical issues in nursing are considered within the abstract and philosophical realm until a dilemma is encountered. Such an approach limits the capacity to tangibly embrace ethical values and frameworks as pathways to equitable, accessible, safe and quality health care and as a foundation for strengthening a supportive and enabling workplace for nurses and other healthcare workers. Conceptual framework development. A comprehensive literature review was undertaken using the social-ecological framework as an organising construct. This framework views ethical practice as the outcome of interaction among a range of factors at eight levels: individual factors (patients and families); individual factors (nurses); relationships between healthcare professionals; relationships between patients and nurses; organisational healthcare context; professional and education regulation and standards; community; and social, political and economic. Considering these elements as discrete, yet interactive and intertwined forces can be useful in developing interventions to promote ethical practice. We consider this framework to have utility in policy, practice, education and research. Nurses face ethical challenges on a daily basis, considering these within a social-ecological framework can
Kreickemeier, Udo; Raimondos-Møller, Pascalis
Reducing tariffs and increasing consumption taxes is standard IMF advice to countries that want to open up their economy without hurting government finances. Indeed, theoretical analysis of such a tariff-tax reform shows an unambiguous increase in welfare and government revenues. The present paper … efficient proposal to follow both as far as market access and welfare are concerned. JEL codes: F13, H20. Keywords: market access; tariff reform; consumption tax reform.
Ionela Cristina Breahna Pravat
Following the creation of a set of concepts, principles and generally accepted international accounting conventions, to which any elaboration, interpretation or enforcement of accounting and financial information would refer, the IASC (later the IASB) developed, in 1989, the Framework for the Preparation and Presentation of Financial Statements which, although inspired by the American one, did not address predominantly only a single category of users (investors), but several categories of representatives of accounting information demand. Nowadays, it is known that the international body of accounting normalization, the IASB (International Accounting Standards Board), cooperates with the American body, the FASB (Financial Accounting Standards Board), for the purpose of developing a single Conceptual Framework, an important phase in strengthening the current and future international accounting standardization process. The Conceptual Framework for Financial Reporting, published in September 2010 by the IASB, replaced the Framework for the Preparation and Presentation of Financial Statements issued in 1989; it is the result of the ongoing process of updating the General Framework of the IASB and represents the completion of an important stage in the process of developing a single conceptual framework.
Baina, Amine; El Kalam, Anas Abou; Deswarte, Yves; Kaaniche, Mohamed
A critical infrastructure (CI) can fail with various degrees of severity due to physical and logical vulnerabilities. Since many interdependencies exist between CIs, failures can have dramatic consequences on the entire infrastructure. This paper focuses on threats that affect information and communication systems that constitute the critical information infrastructure (CII). A new collaborative access control framework called PolyOrBAC is proposed to address security problems that are specific to CIIs. The framework offers each organization participating in a CII the ability to collaborate with other organizations while maintaining control of its resources and internal security policy. The approach is demonstrated on a practical scenario involving the electrical power grid.
Blumenfeld, Barry; Johns Hopkins U.; Dykstra, David; Lueking, Lee; Wicklund, Eric; Fermilab
The CMS experiment at the LHC has established an infrastructure using the FroNTier framework to deliver conditions (i.e. calibration, alignment, etc.) data to processing clients worldwide. FroNTier is a simple web service approach providing client HTTP access to a central database service. The system for CMS has been developed to work with POOL which provides object relational mapping between the C++ clients and various database technologies. Because of the read only nature of the data, Squid proxy caching servers are maintained near clients and these caches provide high performance data access. Several features have been developed to make the system meet the needs of CMS including careful attention to cache coherency with the central database, and low latency loading required for the operation of the online High Level Trigger. The ease of deployment, stability of operation, and high performance make the FroNTier approach well suited to the GRID environment being used for CMS offline, as well as for the online environment used by the CMS High Level Trigger (HLT). The use of standard software, such as Squid and various monitoring tools, make the system reliable, highly configurable and easily maintained. We describe the architecture, software, deployment, performance, monitoring and overall operational experience for the system
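The read-only nature of conditions data is what makes proxy caching effective in the FroNTier setup described above. The sketch below is a hypothetical illustration, not code from the CMS software: it mimics a Squid-like read-through cache in which responses are served from cache until a time-to-live expires, after which the origin database is consulted again, one simple way to bound staleness while preserving cache coherency.

```python
import time

class ReadThroughCache:
    """Squid-like read-through cache: serve cached payloads until a TTL
    expires, then re-fetch from the origin (hypothetical sketch)."""

    def __init__(self, fetch_from_origin, ttl_seconds=300.0, clock=time.monotonic):
        self.fetch = fetch_from_origin      # callable: key -> payload
        self.ttl = ttl_seconds
        self.clock = clock
        self.store = {}                     # key -> (payload, expiry time)
        self.hits = self.misses = 0

    def get(self, key):
        entry = self.store.get(key)
        if entry is not None and self.clock() < entry[1]:
            self.hits += 1                  # fresh cached copy: no origin load
            return entry[0]
        self.misses += 1                    # miss or stale: go to origin
        payload = self.fetch(key)
        self.store[key] = (payload, self.clock() + self.ttl)
        return payload

# Simulated origin database; each call represents load on the central service.
origin_calls = []
def origin(key):
    origin_calls.append(key)
    return f"conditions-for-{key}"

cache = ReadThroughCache(origin, ttl_seconds=300.0)
for _ in range(1000):                       # 1000 identical client requests
    cache.get("run-2024/alignment")

print(len(origin_calls))                    # → 1 (origin consulted once)
print(cache.hits)                           # → 999
```

A thousand identical client requests translate into a single load on the central database, which is the property that lets many worldwide clients share one origin service.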
Ishan Sudeera Abeywardena
The open educational resources (OER) movement has gained momentum in the past few years. With this new drive towards making knowledge open and accessible, a large number of OER repositories have been established and made available online throughout the world. However, the inability of existing search engines such as Google, Yahoo!, and Bing to effectively search for useful OER of an acceptable academic standard for teaching purposes is a major factor contributing to the slow uptake of the entire movement. As a major step towards solving this issue, this paper proposes OERScout, a technology framework based on text mining solutions. The objectives of our work are to (i) develop a technology framework which will parametrically measure the usefulness of an OER for a particular academic purpose based on its openness, accessibility, and relevance attributes; and (ii) provide academics with a mechanism to locate OER which are of an acceptable academic standard. From our user tests, we have identified that OERScout is a sound solution for effectively zeroing in on OER which can be readily used for teaching and learning purposes.
Gujba, Haruna; Thorne, Steve; Mulugetta, Yacob; Rai, Kavita; Sokona, Youba
Modern energy access in Africa is critical to meeting a wide range of developmental challenges including poverty reduction and the Millennium Development Goals (MDGs). Despite the continent's huge amount and variety of energy resources, modern energy access is abysmal, especially in Sub-Saharan Africa. Only about 31% of the Sub-Saharan African population have access to electricity, while traditional biomass energy accounts for over 80% of energy consumption in many Sub-Saharan African countries. With energy use per capita among the lowest in the world, there is no doubt that Africa will need to increase its energy consumption to drive economic growth and human development. Africa also faces a severe threat from global climate change, with vulnerabilities in several key sectors including agriculture, water supply and energy. Low carbon development provides opportunities for African countries to improve and expand access to modern energy services while also building low-emission and climate-resilient economies. However, access to finance from different sources will be critical in achieving these objectives. This paper sets out to explore the financial instruments available for low carbon energy access in Africa, including the opportunities, markets and risks in low carbon energy investments in the continent. - Highlights: ► Access to finance will be critical to achieving low carbon energy access in Africa. ► Domestic finance will be important in leveraging private finance. ► Private sector participation in modern and clean energy in Africa is still low. ► Many financing mechanisms exist for low carbon energy access in Africa. ► The right institutional frameworks are critical to achieving low carbon energy access in Africa.
...) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS..., Processing, and Packaging of Livestock and Poultry Products § 54.1014 Accessibility of equipment and utensils...
Kaiser, Mary Elizabeth; Morris, Matthew J.; Aldoroty, Lauren Nicole; Godon, David; Pelton, Russell; McCandliss, Stephan R.; Kurucz, Robert L.; Kruk, Jeffrey W.; Rauscher, Bernard J.; Kimble, Randy A.; Wright, Edward L.; Benford, Dominic J.; Gardner, Jonathan P.; Feldman, Paul D.; Moos, H. Warren; Riess, Adam G.; Bohlin, Ralph; Deustua, Susana E.; Dixon, William Van Dyke; Sahnow, David J.; Lampton, Michael; Perlmutter, Saul
ACCESS (Absolute Color Calibration Experiment for Standard Stars) is a series of rocket-borne sub-orbital missions and ground-based experiments designed to leverage significant technological advances in detectors, instruments, and the precision of the fundamental laboratory standards used to calibrate these instruments. The goal is to improve the precision of the astrophysical flux scale through the transfer of laboratory absolute detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards, with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35 to 1.7 micron bandpass. A cross-wavelength calibration of the astrophysical flux scale to this level of precision over this broad a bandpass is relevant for the data used to probe fundamental astrophysical problems, such as the SNeIa photometry-based measurements used to constrain dark energy theories. We will describe the strategy for achieving this level of precision, the payload and calibration configuration, present sub-system test data, and the status and preliminary performance of the integration and test of the spectrograph and telescope. NASA APRA sounding rocket grant NNX14AH48G supports this work.
Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards
Fuller, Richard A; Lee, Jasmine R; Watson, James E M
Conservation science is a crisis discipline in which the results of scientific enquiry must be made available quickly to those implementing management. We assessed the extent to which scientific research published since the year 2000 in 20 conservation science journals is publicly available. Of the 19,207 papers published, 1,667 (8.68%) are freely downloadable from an official repository. Moreover, only 938 papers (4.88%) meet the standard definition of open access in which material can be freely reused providing attribution to the authors is given. This compares poorly with a comparable set of 20 evolutionary biology journals, where 31.93% of papers are freely downloadable and 7.49% are open access. Seventeen of the 20 conservation journals offer an open access option, but fewer than 5% of the papers are available through open access. The cost of accessing the full body of conservation science runs into tens of thousands of dollars per year for institutional subscribers, and many conservation practitioners cannot access pay-per-view science through their workplace. However, important initiatives such as Research4Life are making science available to organizations in developing countries. We urge authors of conservation science to pay for open access on a per-article basis or to choose publication in open access journals, taking care to ensure the license allows reuse for any purpose providing attribution is given. Currently, it would cost $51 million to make all conservation science published since 2000 freely available by paying the open access fees currently levied to authors. Publishers of conservation journals might consider more cost effective models for open access and conservation-oriented organizations running journals could consider a broader range of options for open access to nonmembers such as sponsorship of open access via membership fees. © 2014 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for
Juan D. Deaton; Ryan E. Irwin; Luiz A. DaSilva
As early as 2014, wireless network operators' spectral capacity will be overwhelmed by a data tsunami brought on by new devices and applications. To augment spectral capacity, operators could deploy a Dynamic Spectrum Access (DSA) overlay. In light of the many planned Long Term Evolution (LTE) network deployments, the effects of a DSA overlay have not been fully considered in the existing LTE standards. Coalescing many different aspects of DSA, this paper develops the Spectrum Accountability (SA) framework. The SA framework defines specific network element functionality, protocol interfaces, and signaling flow diagrams for LTE to support service requests and enforce the rights and responsibilities of primary and secondary users, respectively. We also include a network simulation to quantify the benefits of using DSA channels to augment capacity. Based on our simulation, we show that network operators can gain up to a 40% increase in operating capacity when sharing DSA bands to augment spectral capacity. With our framework, this paper could serve as a guide in developing future LTE network standards that include DSA.
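A capacity gain of this kind can be illustrated with a back-of-the-envelope calculation. The figures below are hypothetical placeholders, not values from the paper; they only show how opportunistically available DSA spectrum augments an operator's licensed capacity.

```python
def capacity_with_dsa(licensed_mhz, dsa_mhz, dsa_availability, efficiency_bps_per_hz):
    """Aggregate capacity (bit/s) of licensed spectrum plus opportunistically
    available DSA spectrum, assuming the same spectral efficiency on both."""
    usable_hz = (licensed_mhz + dsa_mhz * dsa_availability) * 1e6
    return usable_hz * efficiency_bps_per_hz

base = capacity_with_dsa(20, 0, 0.0, 1.5)      # licensed 20 MHz only
aug  = capacity_with_dsa(20, 10, 0.8, 1.5)     # plus 10 MHz of DSA, 80% available
gain = aug / base - 1
print(f"{gain:.0%}")                           # → 40%
```

With these illustrative numbers, 10 MHz of shared spectrum available 80% of the time adds 8 MHz of effective bandwidth to a 20 MHz licensed carrier, i.e. a 40% capacity increase.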
There are strong pragmatic and moral reasons for receiving societies to address access to healthcare for migrants. Receiving societies have a pragmatic interest in sustaining migrants' health to facilitate integration; they also have a moral obligation to ensure migrants' access to healthcare...... according to international human rights principles. The intention of this thesis is to increase the understanding of migrants' access to healthcare by exploring two study aims: 1) Are there differences in migrants' access to healthcare compared to that of non-migrants? (substudy I and II); and 2) Why...... are there possible differences in migrants' access to healthcare compared to that of non-migrants? (substudy III and IV). The thesis builds on different methodological approaches, combining a register-based retrospective cohort design, a cross-sectional design and survey methods. Two different measures of access were...
Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)
Query processing is the most common operation in a DBMS. Sophisticated query processing has mainly targeted single-enterprise environments providing centralized control over data and metadata. Query submission by anonymous users on the web is different in that load balancing and DBMS access control become the key issues. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and an ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).
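As a toy illustration of the load-balancing concern raised above (not the GDAAC implementation, whose details are not given here), anonymous web queries can be spread across backend database servers with a least-loaded dispatch policy; the backend names are made up.

```python
import heapq

class QueryDispatcher:
    """Least-loaded dispatch of anonymous web queries across DBMS backends
    (hypothetical sketch)."""

    def __init__(self, backends):
        # Min-heap of (outstanding_queries, backend_name): the backend with
        # the fewest assigned queries is always at the top.
        self.heap = [(0, name) for name in backends]
        heapq.heapify(self.heap)
        self.assigned = {name: 0 for name in backends}

    def dispatch(self, query):
        load, name = heapq.heappop(self.heap)   # least-loaded backend
        self.assigned[name] += 1
        heapq.heappush(self.heap, (load + 1, name))
        return name

d = QueryDispatcher(["db-a", "db-b", "db-c"])
for i in range(9):
    d.dispatch(f"SELECT granule WHERE id={i}")
print(d.assigned)   # → {'db-a': 3, 'db-b': 3, 'db-c': 3}
```

Nine anonymous queries end up evenly spread across the three backends; a real dispatcher would also decrement a backend's load when its query completes.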
Esterly, Sean; Baring-Gould, Ian; Booth, Samuel
To address the root challenges of providing quality power to remote consumers through financially viable mini-grids, the Global Lighting and Energy Access Partnership (Global LEAP) initiative of the Clean Energy Ministerial and the U.S. Department of Energy teamed with the National Renewable Energy Laboratory (NREL) and Power Africa to develop a Quality Assurance Framework (QAF) for isolated mini-grids. The framework addresses both alternating current (AC) and direct current (DC) mini-grids, and is applicable to renewable, fossil-fuel, and hybrid systems.
This book offers practical recipes to solve a variety of common problems that users have with extracting Access data and performing calculations on it. Whether you use Access 2007 or an earlier version, this book will teach you new methods to query data, different ways to move data in and out of Access, how to calculate answers to financial and investment issues, how to jump beyond SQL by manipulating data with VBA, and more.
Bacchiega, Emanuele; Randon, Emanuela; Zirulia, Lorenzo
We analyze the effect of competition in market-accessibility enhancement among quality-differentiated firms. Firms are located in regions with different ex-ante transport costs to reach the final market. We characterize the equilibrium of the two-stage game in which firms first invest to improve market accessibility and then compete in prices. Efforts in accessibility improvement crucially depend on the interplay between the willingness to pay for the quality premium of the median consumer an...
Svendsen, Michael; Hansen, Lars Asger Juel; Andersen, Dorte
Open Access Monitor - DK (OAM-DK) is a 2-year DEFF-funded [DEFF.2016-0018] national project running in 2017-2018 with the aim of collecting, documenting and administrating Open Access publishing costs. OAM-DK is led by Copenhagen University Library under the Royal Danish Library with participation...... of all Danish University Libraries. This poster presents the first results of Open Access costs related to 2015 publications at the University of Copenhagen.
.86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...
International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay...
Because of its vast software investment in Fortran programs, the nuclear community has an inherent interest in the evolution of Fortran. This paper reviews the impact of the new Fortran 77 standard and discusses the projected changes which can be expected in the future
The tenets of Open Access are to grant anyone, anywhere and anytime free access to the results of scientific research. HEP spearheaded the Open Access dissemination of scientific results with the mass mailing of preprints in the pre-WWW era and with the launch of the arXiv preprint system at the dawn of the '90s. The HEP community is now ready for a further push to Open Access while retaining all the advantages of the peer-review system and, at the same time, bringing the spiralling cost of journal subscriptions under control. I will present a possible plan for the conversion to Open Access of HEP peer-reviewed journals, through a consortium of HEP funding agencies, laboratories and libraries: SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). SCOAP3 will engage with scientific publishers towards building a sustainable model for Open Access publishing, which is as transparent as possible for HEP authors. The current system, in which journal income comes from subscription fees, is replaced with a scheme where SCOAP3 compensates publishers for the costs incurred to organise the peer-review service and give Open Access to the final version of articles. SCOAP3 will be funded by all countries active in HEP under a 'fair share' scenario, according to their production of HEP articles. In this talk I will present a short overview of the history of Open Access in HEP, the details of the SCOAP3 model and the outlook for its implementation.
National Archives and Records Administration — The OGIS Access System (OAS) provides case management, stakeholder collaboration, and public communications activities including a web presence via a web portal.
A comprehensive reference to the updated and new features of Access 2013 As the world's most popular database management tool, Access enables you to organize, present, analyze, and share data as well as build powerful database solutions. However, databases can be complex. That's why you need the expert guidance in this comprehensive reference. Access 2013 Bible helps you gain a solid understanding of database purpose, construction, and application so that whether you're new to Access or looking to upgrade to the 2013 version, this well-rounded resource provides you with a th
Harper, Simon; Yesilada, Yeliz
Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.
The emergence of the IEEE 802.16 wireless standard (WiMAX) has significantly increased the choices available to operators for provisioning wireless broadband access networks. WiMAX is being deployed to complement xDSL in areas underserved by or lacking broadband networks, in both developed and developing countries. Many incumbent operators in developing countries are considering the deployment of WiMAX as part of their broadband access strategy. This paper presents an efficient and simple method for planning broadband fixed wireless access (BFWA) with the IEEE 802.16 standard to support home connections to the Internet. The study formulates the framework for planning both coverage and capacity designs. The relationship between coverage area and subscriber access rate in each environment is presented. The study also presents the throughput and channel capacity of IEEE 802.16 at different access rates. An extensive analysis is performed and the results are applied to a real case study to demonstrate the practicality of using IEEE 802.16 for connecting homes to the Internet. Using empirical data and original subscriber traffic from measurements, it is shown that BFWA with the IEEE 802.16 standard is a capacity-limited system. The capacity of IEEE 802.16 is related to different factors including frequency bandwidth, spectrum allocation, estimation of traffic per subscriber, and the choice of adaptive modulation at the subscriber terminal. The wireless access methods and procedures evolved in this research work and set out in this paper are shown to be well suited for planning a BFWA system based on IEEE 802.16 which supports broadband home-to-Internet connections.
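The capacity-limited dimensioning described in that abstract can be sketched as a simple calculation; all figures below are hypothetical placeholders chosen for illustration, not values measured in the paper.

```python
def homes_per_sector(channel_capacity_mbps, rate_per_home_mbps, oversubscription):
    """Homes supportable by one sector: the shared channel capacity divided by
    the per-home committed rate, scaled by an oversubscription (contention)
    ratio, since residential subscribers are rarely active simultaneously."""
    return int(channel_capacity_mbps * oversubscription / rate_per_home_mbps)

def sectors_needed(total_homes, homes_one_sector):
    # Round up: a partial sector cannot be deployed.
    return -(-total_homes // homes_one_sector)

# e.g. a 10 Mbit/s sector channel, 0.5 Mbit/s per home, 10:1 contention
per_sector = homes_per_sector(10.0, 0.5, 10)
print(per_sector)                        # → 200
print(sectors_needed(5000, per_sector))  # → 25
```

In a capacity-limited system like this, lowering the modulation order (and hence the channel capacity) or raising the per-home committed rate directly increases the number of sectors required, which is the trade-off the planning framework explores.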
Moritz, Tom; Krishnan, S; Roberts, Dave; Ingwersen, Peter; Agosti, Donat; Penev, Lyubomir; Cockerill, Matthew; Chavan, Vishwas
Data are the evidentiary basis for scientific hypotheses, analyses and publication, for policy formation and for decision-making. They are essential to the evaluation and testing of results by peer scientists both present and future. There is broad consensus in the scientific and conservation communities that data should be freely, openly available in a sustained, persistent and secure way, and thus standards for 'free' and 'open' access to data have become well developed in recent years. The question of effective access to data remains highly problematic. Specifically with respect to scientific publishing, the ability to critically evaluate a published scientific hypothesis or scientific report is contingent on the examination, analysis, evaluation - and if feasible - on the re-generation of data on which conclusions are based. It is not coincidental that in the recent 'climategate' controversies, the quality and integrity of data and their analytical treatment were central to the debate. There is recent evidence that even when scientific data are requested for evaluation they may not be available. The history of dissemination of scientific results has been marked by paradigm shifts driven by the emergence of new technologies. In recent decades, the advance of computer-based technology linked to global communications networks has created the potential for broader and more consistent dissemination of scientific information and data. Yet, in this digital era, scientists and conservationists, organizations and institutions have often been slow to make data available. Community studies suggest that the withholding of data can be attributed to a lack of awareness, to a lack of technical capacity, to concerns that data should be withheld for reasons of perceived personal or organizational self interest, or to lack of adequate mechanisms for attribution. There is a clear need for institutionalization of a 'data publishing framework' that can address sociocultural, technical-infrastructural, policy, political and legal constraints, as well as addressing issues of sustainability and financial support. To address these aspects of a data publishing framework - a systematic, standard approach to the formal definition and public disclosure of data - in the context of biodiversity data, the Global Biodiversity Information Facility (GBIF), the single inter-governmental body most clearly mandated to undertake such an effort, convened a Data Publishing Framework Task Group. We conceive this data publishing framework as an environment conducive to ensuring free and open access to the world's biodiversity data. Here, we present the recommendations of that Task Group, which are intended to encourage free and open access to the world's biodiversity data.
Christensen, Henrik Bærbak; Caspersen, Michael Edelgaard
In this paper we argue that introducing object-oriented frameworks as subject already in the CS1 curriculum is important if we are to train the programmers of tomorrow to become just as much software reusers as software producers. We present a simple, graphical framework that we have successfully used to introduce the principles of object-oriented frameworks to students at the introductory programming level. Our framework, while simple, introduces central abstractions such as inversion of control, event-driven programming, and variability points/hot-spots. This has provided a good starting point for introducing graphical user interface frameworks such as Java Swing and AWT, as the students are not overwhelmed by all the details of such frameworks right away but given a conceptual road-map and practical experience that allow them to cope with the complexity.
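The abstractions named in that abstract, inversion of control and variability points (hot-spots), can be shown in miniature. The sketch below is an illustrative analogue in Python rather than the Java used in the course, and it is not the authors' framework: the framework class owns the control loop and calls back into hooks that user code overrides.

```python
class AnimationFramework:
    """Tiny framework skeleton: the framework owns the control loop and
    calls back into user-supplied hot-spot methods (inversion of control)."""

    def run(self, steps):
        self.on_start()                 # variability point / hot-spot
        for tick in range(steps):
            self.on_tick(tick)          # event-driven callback per tick
        self.on_stop()

    # Default (overridable) behaviour for each hot-spot.
    def on_start(self): pass
    def on_tick(self, tick): pass
    def on_stop(self): pass

class CountingApp(AnimationFramework):
    """User code only fills in the hot-spots; it never drives the loop."""
    def __init__(self):
        self.events = []
    def on_start(self):
        self.events.append("start")
    def on_tick(self, tick):
        self.events.append(f"tick {tick}")
    def on_stop(self):
        self.events.append("stop")

app = CountingApp()
app.run(3)                  # the framework, not the app, sequences the calls
print(app.events)           # → ['start', 'tick 0', 'tick 1', 'tick 2', 'stop']
```

The inversion is that `run` belongs to the framework: student code supplies behaviour at the hot-spots but never writes the loop itself, which is exactly the structure later met again in Swing's event dispatch.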
Chiam, Yin Kia; Zhu, Liming; Staples, Mark
The quality of software is achieved during its development. Development teams use various techniques to investigate, evaluate and control potential quality problems in their systems. These “Quality Attribute Techniques” target specific product qualities such as safety or security. This paper proposes a framework to capture important characteristics of these techniques. The framework is intended to support process tailoring, by facilitating the selection of techniques for inclusion into process models that target specific product qualities. We use risk management as a theory to accommodate techniques for many product qualities and lifecycle phases. Safety techniques have motivated the framework, and safety and performance techniques have been used to evaluate the framework. The evaluation demonstrates the ability of quality risk management to cover the development lifecycle and to accommodate two different product qualities. We identify advantages and limitations of the framework, and discuss future research on the framework.
Berrah, Karrim; Gay, David; Genilloud, Guy
OSI network management provides a general framework for the management of OSI systems, and by extension of any distributed system. However, it is not yet possible to tell to what extent the tools developed for network management will be applicable to distributed systems management. This paper assumes that network managers will want to have some control of the distributed infrastructure and applications. It examines how access to some of the ANSA management interfaces can be given to OSI netwo...
Electricity access is already well established within the framework of human rights, either as an implicit attribute of a pre-existing right (such as non-discrimination or sustainable development) or explicitly in the context of eliminating discrimination against women. There is also broad acknowledgement by states of the desirability of eliminating energy poverty - for all, but particularly for the rural poor, and women. (author)