WorldWideScience

Sample records for large-scale low-risk affordable

  1. Technology scale and supply chains in a secure, affordable and low carbon energy transition

    International Nuclear Information System (INIS)

    Hoggett, Richard

    2014-01-01

    Highlights: • Energy systems need to decarbonise, provide security and remain affordable. • There is uncertainty over which technologies will best enable this to happen. • A strategy for dealing with uncertainty is to assess a technology's ability to show resilience, flexibility and adaptability. • Scale is important, and smaller scale technologies are likely to display the above characteristics. • Smaller scale technologies are therefore more likely to enable a sustainable, secure, and affordable energy transition. - Abstract: This research explores the relationship between technology scale, energy security and decarbonisation within the UK energy system. There is considerable uncertainty about how best to deliver on these goals for energy policy, but a focus on supply chains and their resilience can provide useful insights into the problems uncertainty causes. Technology scale is central to this, and through an analysis of the supply chains of nuclear power and solar photovoltaics, it is suggested that smaller scale technologies are more likely to support and enable a secure, low carbon energy transition. This is because their supply chains are less complex, show more flexibility and adaptability, and can quickly respond to changes within an energy system; as such, they are more resilient than large scale technologies. These characteristics are likely to become increasingly important in a rapidly changing energy system, and prioritising those technologies that demonstrate resilience, flexibility and adaptability will better enable a transition that is rapid, sustainable, secure and affordable.

  2. Making Safe Surgery Affordable: Design of a Surgical Drill Cover System for Scale.

    Science.gov (United States)

    Buchan, Lawrence L; Black, Marianne S; Cancilla, Michael A; Huisman, Elise S; Kooyman, Jeremy J R; Nelson, Scott C; OʼHara, Nathan N; OʼBrien, Peter J; Blachut, Piotr A

    2015-10-01

    Many surgeons in low-resource settings do not have access to safe, affordable, or reliable surgical drilling tools. Surgeons often resort to nonsterile hardware drills because they are affordable, robust, and efficient, but they are impossible to sterilize using steam. A promising alternative is to use a Drill Cover system (a sterilizable fabric bag plus surgical chuck adapter) so that a nonsterile hardware drill can be used safely for surgical bone drilling. Our objective was to design a safe, effective, affordable Drill Cover system for scale in low-resource settings. We designed our device based on feedback from users at Mulago Hospital (Kampala, Uganda) and focused on 3 main aspects. First, the design included a sealed barrier between the surgical field and hardware drill that withstands pressurized fluid. Second, the selected hardware drill had a maximum speed of 1050 rpm to match common surgical drills and reduce risk of necrosis. Third, the fabric cover was optimized for ease of assembly while maintaining a sterile technique. Furthermore, with the Drill Cover approach, multiple Drill Covers can be provided with a single battery-powered drill in a "kit," so that the drill can be used in back-to-back surgeries without requiring immediate sterilization. The Drill Cover design presented here provides a proof-of-concept for a product that can be commercialized, produced at scale, and used in low-resource settings globally to improve access to safe surgery.

  3. Development of Affordable, Low-Carbon Hydrogen Supplies at an Industrial Scale

    Science.gov (United States)

    Roddy, Dermot J.

    2008-01-01

    An existing industrial hydrogen generation and distribution infrastructure is described, and a number of large-scale investment projects are outlined. All of these projects have the potential to generate significant volumes of low-cost, low-carbon hydrogen. The technologies concerned range from gasification of coal with carbon capture and storage…

  4. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of the flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insura...

  5. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    Science.gov (United States)

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

    The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces one such road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This also draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for the evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches for a real-world, large-scale road safety evaluation and to generate new knowledge for the field of road safety.

  6. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  7. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing adequate representations. We focus on a large-scale energy company in Denmark as one case of current product/service-systems risk management best practices. We analyze their risk management process and investigate the tools they use in order to support decision making processes within the company. First, we identify the following challenges in the current risk management practices that are in line with literature: (1) current methods are not appropriate for situations dominated by weak knowledge and information; (2) quality of traditional models in such situations is open to debate; (3) quality of input...

  8. Social Policy Trends- Housing Affordability for Families with Low Incomes Across Canada

    Directory of Open Access Journals (Sweden)

    Margarita (Gres) Wilkins

    2017-06-01

    Chart: percentage of income devoted to paying the lowest-priced rent in a city, by low-income family type, select years, 1990-2015. Much public attention has been directed towards the issue of a Canada-wide housing crisis. The focus has typically been on the cost of housing for a Canadian family with an average income. Less attention has been paid to families with incomes much lower than those of the average Canadian household, for whom the housing crisis is far more severe. Households and individuals with particularly low incomes are at the highest risk of experiencing the worst effects of a lack of housing affordability, including homelessness.

  9. Managing the risks of a large-scale infrastructure project : The case of Spoorzone Delft

    NARCIS (Netherlands)

    Priemus, H.

    2012-01-01

    Risk management in large-scale infrastructure projects is attracting the attention of academics and practitioners alike. After a brief summary of the theoretical background, this paper describes how the risk analysis and risk management shaped up in a current large-scale infrastructure project in

  10. Incentivising flood risk adaptation through risk based insurance premiums : Trade-offs between affordability and risk reduction

    NARCIS (Netherlands)

    Hudson, Paul F.; Botzen, W.J.W.; Feyen, L.; Aerts, Jeroen C.J.H.

    2016-01-01

    The financial incentives offered by the risk-based pricing of insurance can stimulate policyholder adaptation to flood risk while potentially conflicting with affordability. We examine the trade-off between risk reduction and affordability in a model of public-private flood insurance in France and
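
    The trade-off the abstract describes can be made concrete with a toy model: a risk-based premium rewards adaptation via a premium discount, while affordability is flagged when the premium exceeds a share of income. Every parameter below is a hypothetical illustration, not a figure from the paper's public-private insurance model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                                                  # hypothetical policyholders
income = rng.lognormal(mean=10.1, sigma=0.5, size=n)        # annual income (EUR)
expected_loss = rng.gamma(shape=2.0, scale=150.0, size=n)   # annual expected flood loss

LOADING = 1.3            # insurer loading factor (assumption)
MITIGATION_COST = 80.0   # annualised cost of a protective measure (assumption)
MITIGATION_EFFECT = 0.4  # fraction of expected loss removed (assumption)

def premium(el):
    """Risk-based premium: loaded expected loss."""
    return LOADING * el

# A household adapts if the premium discount exceeds the measure's annual cost.
discount = LOADING * MITIGATION_EFFECT * expected_loss
adapts = discount > MITIGATION_COST
el_after = np.where(adapts, (1 - MITIGATION_EFFECT) * expected_loss, expected_loss)

# Affordability stress: premium above 5% of income (threshold is illustrative).
stress = premium(el_after) / income > 0.05
print(f"share adapting: {adapts.mean():.1%}, share stressed: {stress.mean():.1%}")
```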

  11. Sustainability Risk Evaluation for Large-Scale Hydropower Projects with Hybrid Uncertainty

    Directory of Open Access Journals (Sweden)

    Weiyao Tang

    2018-01-01

    As large-scale hydropower projects are influenced by many factors, risk evaluations are complex. This paper considers a hydropower project as a complex system from the perspective of sustainability risk, and divides it into three subsystems: the natural environment subsystem, the eco-environment subsystem and the socioeconomic subsystem. Risk-related factors and quantitative dimensions of each subsystem are comprehensively analyzed, with the uncertainty of some quantitative dimensions handled by hybrid uncertainty methods, including fuzzy (e.g., the national health degree, the national happiness degree, the protection of cultural heritage), random (e.g., underground water levels, river width) and fuzzy random (e.g., runoff volumes, precipitation) uncertainty. By calculating the sustainability risk-related degree for each of the risk-related factors, a sustainability risk-evaluation model is built. Based on the calculation results, the critical sustainability risk-related factors are identified and targeted to reduce the losses caused by sustainability risk factors of the hydropower project. A case study at the under-construction Baihetan hydropower station is presented to demonstrate the viability of the risk-evaluation model and to provide a reference for the sustainability risk evaluation of other large-scale hydropower projects.

  12. Challenges of Modeling Flood Risk at Large Scales

    Science.gov (United States)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public or the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale, such that all inter-dependencies in a river network are well understood. From an insurance perspective, the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at the scale suitable for a viable insurance market for the flood peril. In order to serve the insurance market, a model has to be event-simulation based and has to provide financial risk estimates that form the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing
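
    The critical output named here, an annual probability distribution of portfolio losses built from simulated events, can be sketched with a minimal Monte Carlo loop. Event frequency and severity distributions below are illustrative assumptions, not parameters of the Great Britain model.

```python
import numpy as np

rng = np.random.default_rng(42)
N_YEARS = 10_000      # simulated years (illustrative)
EVENT_RATE = 1.8      # mean flood events per year (assumption)

annual_loss = np.zeros(N_YEARS)
for y in range(N_YEARS):
    k = rng.poisson(EVENT_RATE)                            # events this year
    losses = rng.lognormal(mean=16.0, sigma=1.2, size=k)   # per-event portfolio loss
    annual_loss[y] = losses.sum()                          # annual aggregate loss

# Exceedance-probability curve: P(annual loss > x), largest loss first.
sorted_desc = np.sort(annual_loss)[::-1]
ep = np.arange(1, N_YEARS + 1) / (N_YEARS + 1)   # exceedance prob of i-th largest
loss_200yr = np.interp(1 / 200, ep, sorted_desc)
print(f"average annual loss = {annual_loss.mean():.3e}, "
      f"200-year aggregate loss = {loss_200yr:.3e}")
```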

  13. Citizen Science into Action - Robust Data with Affordable Technologies for Flood Risks Management in the Himalayas

    Science.gov (United States)

    Pandeya, B.; Uprety, M.; Paul, J. D.; Dugar, S.; Buytaert, W.

    2017-12-01

    With a robust and affordable monitoring system, a wealth of hydrological data can be generated, which is fundamental to predicting flood risks more accurately. Since the Himalayan region is characterized by data deficiency and unpredictable hydrological behaviour, a locally based participatory monitoring system is a necessity for dealing with frequently occurring flooding incidents. A gap in hydrological data is the main bottleneck for establishing any effective flood early warning system, and only an alternative, affordable technical solution can overcome this situation and support flood risk management activities in the region. In coordination with local people, government authorities and NGOs, we have established a citizen science monitoring system, in which we tested two types of low-cost sensors, ultrasound and LiDAR, in the Karnali river basin of Nepal. The results confirm the robustness of the sensor data when compared to conventional radar-based monitoring data. Additionally, our findings confirmed that the ultrasound sensors are only useful for small rivers, whereas the LiDAR sensors are suitable for large river basins with highly variable local climatic conditions. Since the collected sensor data can be directly used in the operational flood early warning system in the basin, an opportunity has been created for integrating both affordable technology and citizen science into existing hydrological monitoring practice. Finally, a successful integration could become a testament for upscaling the practice and building flood-risk-resilient communities in the region.

  14. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, which caused 1126 deaths and displaced around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy for coping with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. Yet adaptive behaviour towards flood risk reduction and the interaction between governments, insurers, and individuals has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed, including agent representatives for the administrative stakeholders of European Member States, insurance and reinsurance markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach, this study is a first contribution to overcoming the limitations of traditional large-scale flood risk models, in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.
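
    The agent-based idea can be sketched at its simplest: household agents update a risk perception after floods and adopt protective measures when perception crosses a threshold. This toy rule stands in for the complex behaviour models the abstract mentions; all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
N_HH, YEARS = 1000, 50
P_FLOOD = 0.02                       # annual flood probability (assumption)
protected = np.zeros(N_HH, bool)     # has the household installed a measure?
risk_perception = np.full(N_HH, 0.01)

for year in range(YEARS):
    flood = rng.random() < P_FLOOD
    if flood:
        risk_perception = np.minimum(1.0, risk_perception * 5)  # perception spikes
    else:
        risk_perception *= 0.9                                  # and slowly decays
    # Heterogeneous agents adopt a measure when perceived risk exceeds
    # an individual random threshold (toy decision rule).
    protected |= risk_perception > rng.random(N_HH) * 0.5

print(f"share protected after {YEARS} years: {protected.mean():.1%}")
```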

  15. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...
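
    The iterative interaction between an equilibrium model and a separate risk-adjustment module can be sketched as a fixed-point loop. This is a toy mean-variance stand-in, not the authors' framework; the two technologies and all numbers are assumptions.

```python
import numpy as np

# Two technologies with (mean cost, cost std) per MWh -- illustrative numbers.
mean_cost = np.array([40.0, 55.0])
cost_std  = np.array([15.0, 3.0])
RISK_AVERSION = 0.5   # lambda in a mean-variance risk adjustment (assumption)

risk_premium = np.zeros(2)
for it in range(20):                       # iterate equilibrium <-> risk module
    adjusted = mean_cost + risk_premium    # risk-adjusted cost seen by investors
    share = np.exp(-adjusted / 10)         # softmax-like equilibrium response
    share /= share.sum()
    new_premium = RISK_AVERSION * cost_std * share   # exposure-weighted premium
    if np.allclose(new_premium, risk_premium, atol=1e-6):
        break                              # fixed point reached
    risk_premium = new_premium

print(dict(shares=share.round(3), premiums=risk_premium.round(2)))
```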

  16. Incentivising flood risk adaptation through risk-based insurance premiums: trade-offs between affordability and risk reduction

    NARCIS (Netherlands)

    Hudson, P.G.M.B.; Botzen, W.J.W.; Feyen, L.; Aerts, J.C.J.H.

    2016-01-01

    The financial incentives offered by the risk-based pricing of insurance can stimulate policyholder adaptation to flood risk while potentially conflicting with affordability. We examine the trade-off between risk reduction and affordability in a model of public-private flood insurance in France and

  17. Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset

    Science.gov (United States)

    Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.

    2017-12-01

    Here we present the first national-scale flood risk analyses using high resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodeled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that, for robust flood risk analysis to be undertaken, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks are enabling flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison of flood risk analyses undertaken using pre-existing population datasets.
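
    The core integration step, overlaying a fine population raster with a coarser flood hazard raster to count exposed people, can be sketched as follows. Grid sizes, the flooded fraction, and the nearest-neighbour upsampling are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical fine population grid (people per cell) and a coarser hazard
# grid upsampled onto it with nearest-neighbour replication (np.kron).
pop = rng.poisson(0.02, size=(1800, 1800)).astype(float)
hazard_coarse = rng.random((100, 100)) < 0.03            # flooded coarse cells
flooded = np.kron(hazard_coarse.astype(int),
                  np.ones((18, 18), int)).astype(bool)   # 18x18 fine cells each

exposed = pop[flooded].sum()
print(f"exposed population: {exposed:.0f} of {pop.sum():.0f}")
```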

  18. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of the world's hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so installation of large-scale hydrogen production plants will be needed. In this context, development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, a state of the art of the electrolysis modules currently available was made, and a review of the large-scale electrolysis plants installed around the world was also carried out. The main projects related to large-scale electrolysis were also listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the hydrogen production cost by large-scale electrolysis is evaluated. (authors)
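
    The influence of energy prices on production cost, the final point above, amounts to a levelised-cost calculation: an amortised capital term plus an electricity term. A back-of-envelope sketch, with every parameter an illustrative assumption rather than a figure from the study:

```python
# Levelised cost of hydrogen (LCOH) from a large-scale electrolyser.
CAPEX = 900.0                             # EUR per kW installed (assumption)
CRF = 0.08 * 1.08**20 / (1.08**20 - 1)    # capital recovery factor: 8%, 20 years
FLH = 6000.0                              # full-load hours per year (assumption)
ETA_KWH_PER_KG = 52.0                     # specific consumption, kWh per kg H2

def lcoh(elec_price_eur_per_kwh):
    kg_per_kw_year = FLH / ETA_KWH_PER_KG
    capex_term = CAPEX * CRF / kg_per_kw_year              # EUR/kg from capital
    energy_term = elec_price_eur_per_kwh * ETA_KWH_PER_KG  # EUR/kg from power
    return capex_term + energy_term

for p in (0.02, 0.05, 0.08):
    print(f"electricity {p:.2f} EUR/kWh -> H2 {lcoh(p):.2f} EUR/kg")
```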

  19. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computational complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming, without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computational complexity of the conventional exhaustive search method increases significantly when large-scale antenna arrays are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
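
    The complexity argument can be made concrete. A greedy capacity-based selection (a standard heuristic, not the paper's interactive multiple-parameter method) needs on the order of K·N capacity evaluations instead of the C(N, K) combinations exhaustive search must score; channel and SNR below are synthetic.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
N_TX, N_RX, K = 16, 4, 4            # select K of N_TX transmit antennas
H = (rng.normal(size=(N_RX, N_TX)) +
     1j * rng.normal(size=(N_RX, N_TX))) / np.sqrt(2)   # Rayleigh channel
SNR = 10.0

def capacity(cols):
    """Ergodic MIMO capacity (bit/s/Hz) for a column subset of H."""
    Hs = H[:, list(cols)]
    G = np.eye(N_RX) + (SNR / len(cols)) * Hs @ Hs.conj().T
    return float(np.log2(np.linalg.det(G).real))

# Greedy: add the antenna with the largest marginal capacity gain.
selected = []
for _ in range(K):
    best = max((c for c in range(N_TX) if c not in selected),
               key=lambda c: capacity(selected + [c]))
    selected.append(best)

exhaustive = max(combinations(range(N_TX), K), key=capacity)
print(f"greedy {capacity(selected):.2f} vs exhaustive {capacity(exhaustive):.2f}")
```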

  20. Correlates of housing affordability stress among older Australians.

    Science.gov (United States)

    Temple, Jeromey B

    2008-03-01

    The purpose of this study was to examine the prevalence and correlates of housing affordability stress among community-dwelling older Australians. The 2002 ABS General Social Survey was used to measure the prevalence of housing affordability stress, and rare-event logistic regression was used to identify its potential correlates. Almost 5% of Australians aged 55 years and older, and 20% of those younger than 55 years, are estimated to experience housing affordability stress. Men and women living alone are more likely to experience affordability stress than couples. Low-income earners, those with consumer debt, and those who do not hold assets are at heightened risk of such stress. Home ownership, regardless of income, is the strongest buffer against housing affordability problems in old age. Although the prevalence of housing affordability stress is low among older Australians compared to the younger population, a definite social gradient exists in those at risk.
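
    A common way to operationalise such a measure is the "30/40 rule": housing costs above 30% of income for households in the bottom 40% of the income distribution. The ABS survey applies its own definitions, so the following is only a generic sketch on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(11)
income = rng.lognormal(10.3, 0.6, size=5000)    # annual household income
housing = rng.lognormal(8.9, 0.4, size=5000)    # annual housing costs

low_income = income <= np.quantile(income, 0.40)       # bottom 40% of incomes
stress = low_income & (housing / income > 0.30)        # 30/40 affordability rule
print(f"housing affordability stress: {stress.mean():.1%}")
```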

  1. Low rank approximation methods for MR fingerprinting with large scale dictionaries.

    Science.gov (United States)

    Yang, Mingrui; Ma, Dan; Jiang, Yun; Hamilton, Jesse; Seiberlich, Nicole; Griswold, Mark A; McGivney, Debra

    2018-04-01

    This work proposes new low-rank approximation approaches with significant memory savings for large-scale MR fingerprinting (MRF) problems. We introduce a compressed MRF with randomized singular value decomposition method to significantly reduce the memory requirement for calculating a low-rank approximation of large MRF dictionaries. We further relax this requirement by exploiting the structure of MRF dictionaries in the randomized singular value decomposition space and fitting them to low-degree polynomials to generate high resolution MRF parameter maps. In vivo 1.5T and 3T brain scan data are used to validate the approaches. T1, T2, and off-resonance maps are in good agreement with those of the standard MRF approach. Moreover, the memory savings are up to 1000-fold for the MRF fast imaging with steady-state precession sequence and more than 15-fold for the MRF balanced steady-state free precession sequence. The proposed compressed MRF with randomized singular value decomposition and dictionary fitting methods are memory-efficient low-rank approximation methods, which can benefit the use of MRF in clinical settings. They also have great potential in large-scale MRF problems, such as problems considering multi-component MRF parameters or high resolution in the parameter space. Magn Reson Med 79:2392-2400, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
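
    The core compression step, a randomized SVD of the dictionary, can be sketched in a few lines. This is a generic Halko-style implementation applied to a synthetic low-rank stand-in for an MRF dictionary, not the authors' code; sizes and rank are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
# Stand-in "dictionary": 10,000 fingerprints x 500 time points, built to be
# approximately low rank, as real MRF dictionaries are.
D = rng.standard_normal((10_000, 20)) @ rng.standard_normal((20, 500))
D += 0.01 * rng.standard_normal(D.shape)

def randomized_svd(A, rank, oversample=10, n_power=2):
    """Randomized SVD: sketch the range of A, then decompose a small matrix."""
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Y = A @ Omega
    for _ in range(n_power):          # power iterations sharpen the subspace
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)            # orthonormal basis for the sketched range
    B = Q.T @ A                       # small (rank+oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

U, s, Vt = randomized_svd(D, rank=20)
err = np.linalg.norm(D - (U * s) @ Vt) / np.linalg.norm(D)
print(f"relative reconstruction error: {err:.4f}")
```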

  2. Large-scale assessment of flood risk and the effects of mitigation measures along the Elbe River

    NARCIS (Netherlands)

    de Kok, Jean-Luc; Grossmann, M.

    2010-01-01

    The downstream effects of flood risk mitigation measures and the necessity to develop flood risk management strategies that are effective on a basin scale call for a flood risk assessment methodology that can be applied at the scale of a large river. We present an example of a rapid flood risk

  3. Large-scale model-based assessment of deer-vehicle collision risk.

    Directory of Open Access Journals (Sweden)

    Torsten Hothorn

    Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and, as browsers of palatable trees, have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high effort and costs associated with attempts to estimate population sizes of free-living ungulates in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types at the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated with browsing intensity and with harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision at the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management at the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining
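
    The modelling task, relating collision counts to environmental covariates, can be illustrated with a plain Poisson regression on synthetic data; the study itself uses a more flexible structured additive model with nonlinear and spatial effects. Covariates and coefficients below are invented for the demonstration.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(9)
n = 2000                                  # hypothetical municipalities
browsing = rng.uniform(0, 1, n)           # browsing intensity index
road_km = rng.uniform(5, 80, n)           # road length as an exposure proxy
X = np.column_stack([browsing, np.log(road_km)])

# Synthetic ground truth: collisions rise with browsing and road exposure.
true_rate = np.exp(0.8 * browsing + 1.0 * np.log(road_km) - 2.0)
y = rng.poisson(true_rate)                # deer-vehicle collision counts

model = PoissonRegressor(alpha=1e-6).fit(X, y)
print("recovered coefficients:", model.coef_.round(2))   # ~[0.8, 1.0] expected
```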

  4. Comparison of the large-scale radon risk map for southern Belgium with results of high resolution surveys

    International Nuclear Information System (INIS)

    Zhu, H.-C.; Charlet, J.M.; Poffijn, A.

    2000-01-01

    A large-scale radon survey consisting of long-term measurements in about 5200 single-family houses in the southern part of Belgium was carried out from 1995 to 1999, and a radon risk map for the region was produced using geostatistical and GIS approaches. Some communes and villages situated within high-risk areas were then chosen for detailed surveys. A high resolution radon survey with about 330 measurements was performed in half of the commune of Burg-Reuland. Comparison of radon maps on quite different scales shows that the general radon risk map has a similar pattern to the radon map for the detailed study area. Another detailed radon survey, in the village of Hatrival, situated in a high-radon area, found a very high proportion of houses with elevated radon concentrations. The results of this detailed survey are comparable to the expectation for high-risk areas on the large-scale radon risk map. The good correspondence between the findings of the general risk map and the analysis of the limited detailed surveys suggests that the large-scale radon risk map is likely reliable. (author)

  5. RISK MANAGEMENT IN A LARGE-SCALE NEW RAILWAY TRANSPORT SYSTEM PROJECT

    Directory of Open Access Journals (Sweden)

    Sunduck D. SUH, Ph.D., P.E.

    2000-01-01

    Risk management experiences of the Korean Seoul-Pusan high-speed railway (KTX) project since the planning stage are evaluated. One can clearly see the interplay of engineering and construction risks, financial risks and political risks in the development of the KTX project, a peculiarity of large-scale new railway system projects. A brief description of the evaluation methodology and an overview of the project are followed by detailed evaluations of key differences in risks between a conventional railway system and a high-speed railway system, social and political risks, engineering and construction risks, and financial risks. Risks involved in the system procurement process, such as proposal solicitation, evaluation, selection, and scope of solicitation, are separated out and evaluated in depth. Detailed events resulting from these issues are discussed along with their possible impact on system risk. Lessons learned and possible further refinements are also discussed.

  6. Large-scale correlations in gas traced by Mg II absorbers around low-mass galaxies

    Science.gov (United States)

    Kauffmann, Guinevere

    2018-03-01

    The physical origin of the large-scale conformity in the colours and specific star formation rates of isolated low-mass central galaxies and their neighbours on scales in excess of 1 Mpc is still under debate. One possible scenario is that gas is heated over large scales by feedback from active galactic nuclei (AGNs), leading to coherent modulation of cooling and star formation between well-separated galaxies. In this Letter, the metal line absorption catalogue of Zhu & Ménard is used to probe gas out to large projected radii around a sample of a million galaxies with stellar masses ~10^10 M⊙ and photometric redshifts in the range 0.4 < z < 0.8, selected from Sloan Digital Sky Survey imaging data. This galaxy sample covers an effective volume of 2.2 Gpc^3. A statistically significant excess of Mg II absorbers is present around the red low-mass galaxies compared to their blue counterparts out to projected radii of 10 Mpc. In addition, the equivalent width distribution function of Mg II absorbers around low-mass galaxies is shown to be strongly affected by the presence of a nearby (Rp < 2 Mpc) radio-loud AGN out to projected radii of 5 Mpc.

  7. A low-cost iron-cadmium redox flow battery for large-scale energy storage

    Science.gov (United States)

    Zeng, Y. K.; Zhao, T. S.; Zhou, X. L.; Wei, L.; Jiang, H. R.

    2016-10-01

    The redox flow battery (RFB) is one of the most promising large-scale energy storage technologies that offer a potential solution to the intermittency of renewable sources such as wind and solar. The prerequisite for widespread utilization of RFBs is low capital cost. In this work, an iron-cadmium redox flow battery (Fe/Cd RFB) with a premixed iron and cadmium solution is developed and tested. It is demonstrated that the coulombic efficiency and energy efficiency of the Fe/Cd RFB reach 98.7% and 80.2% at 120 mA cm-2, respectively. The Fe/Cd RFB exhibits stable efficiencies, with capacity retention of 99.87% per cycle during the cycle test. Moreover, the Fe/Cd RFB is estimated to have a low capital cost of $108 kWh-1 for 8-h energy storage. Intrinsically low-cost active materials, high cell performance and excellent capacity retention make the Fe/Cd RFB a promising solution for large-scale energy storage systems.
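
    The two headline efficiencies are simple ratios of discharge to charge quantities. A sketch on synthetic constant-current cycling data, with placeholder voltage curves and the 98.7% coulombic efficiency imposed by assumption rather than measured:

```python
import numpy as np

# Synthetic constant-current cycle (placeholders, not measured values).
t = np.linspace(0.0, 3600.0, 361)          # time (s)
I = 2.4                                    # cycling current (A)
v_charge = 1.25 + 0.05 * t / t[-1]         # linear charge voltage curve (V)
v_discharge = 1.10 - 0.05 * t / t[-1]      # linear discharge voltage curve (V)
t_discharge_end = 0.987 * t[-1]            # assume 98.7% of charge recovered

q_c = I * t[-1]                            # charge capacity (C)
q_d = I * t_discharge_end                  # discharge capacity (C)
e_c = I * v_charge.mean() * t[-1]          # energy in (J), constant current
e_d = I * v_discharge.mean() * t_discharge_end   # energy out (J)

print(f"coulombic efficiency = {q_d/q_c:.1%}, energy efficiency = {e_d/e_c:.1%}")
```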

  8. Low-cost satellite mechanical design and construction

    Science.gov (United States)

    Boisjolie-Gair, Nathaniel; Straub, Jeremy

    2017-05-01

    This paper presents a discussion of techniques for low-cost design and construction of a CubeSat mechanical structure that can serve as a basis for academic programs and a starting point for government, military and commercial large-scale sensing networks, where the cost of each node must be minimized to facilitate system affordability and lower the cost and associated risk of losing any node. Spacecraft design plays a large role in manufacturability. An intentionally simplified mechanical design is presented which reduces machining costs compared to the more intricate designs that were considered. Several fabrication approaches are evaluated relative to the low-cost goal.

  9. Determining 30-day readmission risk for heart failure patients: the Readmission After Heart Failure scale.

    Science.gov (United States)

    Chamberlain, Ronald S; Sond, Jaswinder; Mahendraraj, Krishnaraj; Lau, Christine Sm; Siracuse, Brianna L

    2018-01-01

    Chronic heart failure (CHF), which affects >5 million Americans, accounts for >1 million hospitalizations annually. As part of the Hospital Readmission Reduction Program, the Affordable Care Act requires that the Centers for Medicare and Medicaid Services reduce payments to hospitals with excess readmissions. This study sought to develop a scale that reliably predicts readmission rates among patients with CHF. The State Inpatient Database (2006-2011) was utilized, and discharge data including demographic and clinical characteristics on 642,448 patients with CHF from California and New York (derivation cohort) and 365,359 patients with CHF from Florida and Washington (validation cohort) were extracted. The Readmission After Heart Failure (RAHF) scale was developed to predict readmission risk. The 30-day readmission rates were 9.42 and 9.17% (derivation and validation cohorts, respectively). Several demographic and clinical factors present on admission were associated with readmission risk after hospitalization for CHF. The RAHF scale was created and explained 95% of readmission variability within the validation cohort. The RAHF scale was then used to define the following three levels of risk for readmission: low (RAHF score <12), moderate (RAHF score 12-15; 9.78% readmission rate), and high (RAHF score >15; 12.04% readmission rate). The relative risk of readmission was 1.67 for the high-risk group compared with the low-risk group. The RAHF scale reliably predicts a patient's 30-day CHF readmission risk based on demographic and clinical factors present upon initial admission. By risk-stratifying patients, using models such as the RAHF scale, strategies tailored to each patient can be implemented to improve patient outcomes and reduce health care costs.
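
    The resulting three-tier stratification is straightforward to encode. Note that the low-tier cut-off (<12) is inferred here from the adjacent 12-15 moderate band; the tier boundaries are otherwise taken directly from the abstract.

```python
def rahf_risk_tier(score: int) -> str:
    """Map a RAHF score to the three readmission-risk tiers in the study."""
    if score < 12:        # low-risk cut-off inferred from the moderate band
        return "low"
    if score <= 15:
        return "moderate"     # 9.78% observed readmission rate
    return "high"             # 12.04% observed readmission rate

for s in (8, 13, 18):
    print(s, rahf_risk_tier(s))
```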

  10. Enabling Dedicated, Affordable Space Access Through Aggressive Technology Maturation

    Science.gov (United States)

    Jones, Jonathan E.; Kibbey, Timothy P.; Cobb, C. Brent; Harris, Lawanna L.

    2014-01-01

    A launch vehicle at the scale and price point which allows developers to take reasonable risks with high payoff propulsion and avionics hardware solutions does not exist today. Establishing this service provides a ride through the proverbial technology "valley of death" that lies between demonstration in laboratory and flight environments. NASA's NanoLaunch effort will provide the framework to mature both earth-to-orbit and on-orbit propulsion and avionics technologies while also providing affordable, dedicated access to low earth orbit for cubesat class payloads.

  11. A long-term, continuous simulation approach for large-scale flood risk assessments

    Science.gov (United States)

    Falter, Daniela; Schröter, Kai; Viet Dung, Nguyen; Vorogushyn, Sergiy; Hundecha, Yeshewatesfa; Kreibich, Heidi; Apel, Heiko; Merz, Bruno

    2014-05-01

    The Regional Flood Model (RFM) is a process-based model cascade developed for flood risk assessments of large-scale basins. RFM consists of four model parts: the rainfall-runoff model SWIM, a 1D channel routing model, a 2D hinterland inundation model and the flood loss estimation model for residential buildings FLEMOps+r. The model cascade recently underwent a proof-of-concept study for the Elbe catchment (Germany) to demonstrate that flood risk assessments based on a continuous simulation approach, including rainfall-runoff, hydrodynamic and damage estimation models, are feasible for large catchments. The results of this study indicated that uncertainties are significant, especially for hydrodynamic simulations, basically a consequence of low data quality and the disregard of dike breaches. Therefore, RFM was applied with a refined hydraulic model setup for the Elbe tributary Mulde. The study area, the Mulde catchment, comprises about 6,000 km2 and 380 river-km. The inclusion of more reliable information on overbank cross-sections and dikes considerably improved the results. For the application of RFM to flood risk assessments, long-term climate input data are needed to drive the model chain. This model input was provided by a multi-site, multi-variate weather generator that produces sets of synthetic meteorological data reproducing the current climate statistics. The data set comprises 100 realizations of 100 years of meteorological data. With the proposed continuous simulation approach of RFM, we simulated a virtual period of 10,000 years covering the entire flood risk chain, including hydrological, 1D/2D hydrodynamic and flood damage estimation models. This provided a record of around 2,000 inundation events affecting the study area, with spatially detailed information on inundation depths and damage to residential buildings at a resolution of 100 m. This serves as the basis for a spatially consistent flood risk assessment for the Mulde catchment presented in
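
    One payoff of a 10,000-year simulated record is that return levels can be read from empirical exceedance probabilities instead of fitted extreme-value models. A sketch on synthetic annual maxima (the Gumbel parameters are arbitrary placeholders):

```python
import numpy as np

rng = np.random.default_rng(13)
annual_max = rng.gumbel(loc=800, scale=250, size=10_000)  # synthetic annual maxima

# Empirical return periods via Weibull plotting positions: T = (n + 1) / rank.
order = np.sort(annual_max)[::-1]                 # largest event first
T = (len(order) + 1) / np.arange(1, len(order) + 1)
for target in (100, 1000):
    level = np.interp(target, T[::-1], order[::-1])   # interpolate on ascending T
    print(f"{target}-year event level ~ {level:.0f}")
```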

  12. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    Science.gov (United States)

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits characterizing these scales, i.e., limited granularity and inter-/intra-examiner reliability, to obtain objective scores and more detailed information allowing prediction of fall risk. We used Microsoft Kinect to record subjects' movements while they performed challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
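
    The pipeline described, movement parameters fed to a supervised classifier trained against clinical scores, can be sketched with generic features and labels. Both the features and the labelling rule below are synthetic stand-ins, not the study's Kinect-derived parameters.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(21)
n = 200                                    # hypothetical subjects
sway = rng.normal(2.0, 0.8, n)             # trunk sway amplitude (cm)
duration = rng.normal(12.0, 4.0, n)        # time to complete exercise (s)
X = np.column_stack([sway, duration])

# Label: dichotomised clinical balance-scale score (synthetic rule + noise).
y = ((sway + 0.1 * duration + rng.normal(0, 0.5, n)) > 3.2).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))
```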

  13. Dynamic properties of energy affordability measures

    International Nuclear Information System (INIS)

    Heindl, Peter; Schuessler, Rudolf

    2015-01-01

    Measures of affordability and of fuel poverty are applied in practice to assess the affordability of, for example, energy services, water, or housing. The extensive body of literature on affordability measures has little overlap with the existing literature on poverty measurement, and a comprehensive assessment of the response of affordability measures to changes in the distribution of income or expenditure (their dynamic properties) is missing. This paper aims to fill this gap by providing a conceptual discussion of the 'dynamics' of both energy affordability measures and fuel poverty measures. Several types of measures are examined in a microsimulation framework. Our results indicate that some measures exhibit odd dynamic behavior. This includes measures used in practice, such as the low income/high cost measure and the double median of expenditure share indicator. Odd dynamic behavior creates the risk of drawing false policy recommendations from the measures. Thus, an appropriate response of affordability measures to changes in relevant variables is a prerequisite for defining meaningful measures that inform about affordability or deprivation in certain domains of consumption. - Highlights: • We investigate changes in fuel poverty measures resulting from changes in income and expenditure. • More generally, we investigate the dynamic behavior of affordability measures using microsimulation. • We propose axioms regarding the dynamic behavior of affordability measures. • Some measures used in practice show unintuitive dynamic behavior. • Inappropriate dynamic behavior creates a risk of false policy implications.
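
    The measures named above can be compared on simulated microdata to expose their dynamic behaviour under, say, a 20% energy price rise. The low income/high cost variant below is a simplified stand-in for the official indicator, and all distribution parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(31)
n = 50_000
income = rng.lognormal(9.9, 0.55, n)      # disposable income (synthetic)
fuel = rng.lognormal(6.6, 0.35, n)        # fuel expenditure (synthetic)

def ten_percent_rule(inc, exp):
    """Fuel poor if fuel expenditure exceeds 10% of income."""
    return (exp / inc > 0.10).mean()

def double_median_share(inc, exp):
    """Fuel poor if the expenditure share exceeds twice the median share."""
    share = exp / inc
    return (share > 2 * np.median(share)).mean()

def low_income_high_cost(inc, exp):
    """Simplified LIHC: above-median costs and low residual income."""
    high_cost = exp > np.median(exp)
    poor_after_fuel = (inc - exp) < 0.6 * np.median(inc - exp)
    return (high_cost & poor_after_fuel).mean()

for f in (ten_percent_rule, double_median_share, low_income_high_cost):
    print(f"{f.__name__}: {f(income, fuel):.1%} -> {f(income, fuel * 1.2):.1%}")
```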

  14. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components, in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Extension of the underlying technology to low-cost metals was then proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors, including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session in which everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into different categories including (i) CAD to PART software, (ii) selection of energy source, (iii

  15. Towards large scale stochastic rainfall models for flood risk assessment in trans-national basins

    Science.gov (United States)

    Serinaldi, F.; Kilsby, C. G.

    2012-04-01

    While extensive research has been devoted to rainfall-runoff modelling for risk assessment in small and medium size watersheds, less attention has been paid, so far, to large-scale trans-national basins, where flood events have severe societal and economic impacts with magnitudes quantified in billions of euros. As an example, in the April 2006 flood events along the Danube basin at least 10 people lost their lives and up to 30 000 people were displaced, with overall damages estimated at more than half a billion euros. In this context, refined analytical methods are fundamental to improve the risk assessment and, in turn, the design of structural and non-structural measures of protection, such as hydraulic works and insurance/reinsurance policies. Since flood events are mainly driven by exceptional rainfall events, suitable characterization and modelling of the space-time properties of rainfall fields is a key issue in performing a reliable flood risk analysis based on alternative precipitation scenarios to be fed into a new generation of large-scale rainfall-runoff models. Ultimately, this approach should be extended to a global flood risk model. However, as the need for rainfall models able to account for and simulate the spatio-temporal properties of rainfall fields over large areas is rather new, the development of new rainfall simulation frameworks is a challenging task that faces the problem of overcoming the drawbacks of the existing modelling schemes (devised for smaller spatial scales) while keeping their desirable properties. In this study, we critically summarize the most widely used approaches for rainfall simulation. Focusing on stochastic approaches, we stress the importance of introducing suitable climate forcings in these simulation schemes in order to account for the physical coherence of rainfall fields over wide areas. Based on preliminary considerations, we suggest a modelling framework relying on the Generalized Additive Models for Location, Scale

  16. Low frequency steady-state brain responses modulate large scale functional networks in a frequency-specific means.

    Science.gov (United States)

    Wang, Yi-Feng; Long, Zhiliang; Cui, Qian; Liu, Feng; Jing, Xiu-Juan; Chen, Heng; Guo, Xiao-Nan; Yan, Jin H; Chen, Hua-Fu

    2016-01-01

    Neural oscillations are essential for brain functions. Research has suggested that the frequency of neural oscillations is lower for more integrative and remote communications. In this vein, some resting-state studies have suggested that large scale networks function in the very low frequency range (<0.1 Hz). However, it is difficult to examine the frequency characteristics of brain networks because both resting-state studies and conventional frequency tagging approaches cannot simultaneously capture multiple large scale networks in controllable cognitive activities. In this preliminary study, we aimed to examine whether large scale networks can be modulated by task-induced low frequency steady-state brain responses (lfSSBRs) in a frequency-specific pattern. In a revised attention network test, the lfSSBRs were evoked in the triple network system and the sensory-motor system, indicating that large scale networks can be modulated in a frequency tagging way. Furthermore, the inter- and intranetwork synchronizations as well as coherence were increased at the fundamental frequency and the first harmonic rather than at other frequency bands, indicating a frequency-specific modulation of information communication. However, there was no difference among attention conditions, indicating that lfSSBRs modulate the general attention state much more strongly than they distinguish attention conditions. This study provides insights into the advantages and mechanism of lfSSBRs. More importantly, it paves a new way to investigate frequency-specific large scale brain activities. © 2015 Wiley Periodicals, Inc.

  17. Scale up risk of developing oil shale processing units

    International Nuclear Information System (INIS)

    Oepik, I.

    1991-01-01

    The experiences in oil shale processing in three large countries, China, the U.S.A. and the U.S.S.R., have demonstrated that the relative scale-up risk of developing oil shale processing units is related to the scale-up factor. Against the background of large programmes for developing the oil shale industry branch, i.e. the $30 billion investments in Colorado and Utah or the 50 million t/year oil shale processing in Estonia and the Leningrad Region planned in the late seventies, the absolute scope of the scale-up risk of developing single retorting plants seems justified. But under conditions of low crude oil prices, when the large-scale development of the oil shale processing industry is stopped, the absolute scope of the scale-up risk is to be divided between a small number of units. Therefore, it is reasonable to build new commercial oil shale processing plants with a minimum scale-up risk. For example, in Estonia a new oil shale processing plant with gas combustion retorts, projected to start in the early nineties, will be equipped with four units of 1500 t/day enriched oil shale throughput each, designed with scale-up factor M=1.5 and with a minimum scale-up risk of only r=2.5-4.5%. The oil shale retorting unit for the PAMA plant in Israel [1] is planned to be developed in three steps, also with minimum scale-up risk: feasibility studies in Colorado with Israel's shale at the Paraho 250 t/day retort and other tests, a demonstration retort of 700 t/day and M=2.8 in Israel, and commercial retorts in the early nineties with a capacity of about 1000 t/day and M=1.4. The scale-up risk of the PAMA project, r=2-4%, is approximately the same as that in Estonia. Knowledge of the scope of the scale-up risk of developing oil shale processing retorts assists in the calculation of production costs when erecting new units. (author). 9 refs., 2 tabs

  18. Insertion Sequence-Caused Large Scale-Rearrangements in the Genome of Escherichia coli

    Science.gov (United States)

    2016-07-18

    affordable approach to genome-wide characterization of genetic variation in bacterial and eukaryotic genomes (1-3). In addition to small-scale… Paired-End Reads), which uses a graph-based algorithm (27) capable of detecting most large-scale variation involving repetitive regions, including novel…

  19. Development of Large-Scale Spacecraft Fire Safety Experiments

    DEFF Research Database (Denmark)

    Ruff, Gary A.; Urban, David L.; Fernandez-Pello, A. Carlos

    2013-01-01

    exploration missions outside of low-earth orbit and, accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low-gravity… of the spacecraft fire safety risk. The activity of this project is supported by an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The large-scale space flight experiment will be conducted in an Orbital Sciences…

  20. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  1. Experimental Investigation of a Large-Scale Low-Boom Inlet Concept

    Science.gov (United States)

    Hirt, Stefanie M.; Chima, Rodrick V.; Vyas, Manan A.; Wayman, Thomas R.; Conners, Timothy R.; Reger, Robert W.

    2011-01-01

    A large-scale low-boom inlet concept was tested in the NASA Glenn Research Center 8- by 6-Foot Supersonic Wind Tunnel. The purpose of this test was to assess inlet performance, stability and operability at various Mach numbers and angles of attack. During this effort, two models were tested: a dual stream inlet designed to mimic potential aircraft flight hardware integrating a high-flow bypass stream; and a single stream inlet designed to study a configuration with a zero-degree external cowl angle and to permit surface visualization of the vortex generator flow on the internal centerbody surface. During the course of the test, the low-boom inlet concept was demonstrated to have high recovery, excellent buzz margin, and high operability. This paper will provide an overview of the setup, show a brief comparison of the dual stream and single stream inlet results, and examine the dual stream inlet characteristics.

  2. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale, via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.

  3. Affordable housing as a niche product: The case of the Danish “SocialHousing Plus”

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Stensgaard, Anne Gro

    Establishing affordable housing is a growing demand in many larger cities; there are, however, a number of challenges related to establishing affordable housing, as well as many different approaches. This paper presents a case study of an affordable housing concept in the Danish social housing sector, the "SocialHousing Plus" ("AlmenBolig+"), which is based on lowering production costs as well as operation costs, including residential self-management, large-scale production of pre-fab housing units, low-energy solutions, and other innovative approaches. The concept was developed in 2007 and has so far resulted in the production of more than 1,500 dwellings. The paper discusses the results of the concept and the various challenges related to it. Based on the theory of Technological Transition (Geels, 2002), it discusses the options and limitations of providing affordable housing through developing...

  4. Evaluating high risks in large-scale projects using an extended VIKOR method under a fuzzy environment

    Directory of Open Access Journals (Sweden)

    S. Ebrahimnejad

    2012-04-01

    The complexity of large-scale projects has led to numerous risks in their life cycle. This paper presents a new risk evaluation approach for ranking the high risks in large-scale projects and improving the performance of these projects. It is based on fuzzy set theory, an effective tool for handling uncertainty, and on an extended VIKOR method, one of the well-known multiple-criteria decision-making (MCDM) methods. The proposed decision-making approach integrates knowledge and experience acquired from professional experts, who perform the risk identification as well as the subjective judgments of the performance ratings for high risks in terms of conflicting criteria, including probability, impact, quickness of reaction toward risk, event measure quantity, and event capability. The most notable difference between the proposed VIKOR method and its traditional version is the use of fuzzy decision-matrix data to calculate the ranking index without the need to ask the experts. Finally, the proposed approach is illustrated with a real case study of an Iranian power plant project, and the results are compared with those of two well-known decision-making methods under a fuzzy environment.
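
    For orientation, the sketch below implements the classical crisp VIKOR ranking on a hypothetical risk decision matrix; the paper's actual method extends this with fuzzy decision-matrix data, which is not reproduced here. The weights, scores, and criteria orientation (higher score = more critical) are all assumptions.

```python
import numpy as np

def vikor(X, w, v=0.5):
    """Classical crisp VIKOR. Rows of X are alternatives, columns are
    criteria, oriented so that higher values are 'better' (here: more
    critical). Lower Q means closer to the ideal, i.e. ranked first."""
    f_best, f_worst = X.max(axis=0), X.min(axis=0)
    D = (f_best - X) / (f_best - f_worst)      # normalized distance from ideal
    S = (w * D).sum(axis=1)                    # group utility
    R = (w * D).max(axis=1)                    # individual regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return np.argsort(Q), S, R, Q

# Hypothetical scores of three risks on five criteria (probability, impact,
# quickness of reaction, event measure quantity, event capability)
X = np.array([[0.7, 0.9, 0.4, 0.6, 0.5],
              [0.3, 0.5, 0.8, 0.4, 0.6],
              [0.9, 0.6, 0.5, 0.7, 0.8]])
w = np.array([0.3, 0.3, 0.15, 0.15, 0.1])      # hypothetical criterion weights

order, S, R, Q = vikor(X, w)
print("risk ranking (most critical first):", order)
```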

  5. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    Science.gov (United States)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for large-scale composite manufacturing is an important goal in producing lightweight components for heavy-lift launch vehicles. NASA and an industry team successfully employed a building-block approach using low-cost Automated Tape Layup (ATL) of autoclave and out-of-autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc from a 10-meter cylindrical barrel. Lessons learned highlight the manufacturing challenges involved in producing lightweight composite structures such as fairings for heavy-lift launch vehicles.

  6. Risk management in a large-scale CO2 geosequestration pilot project, Illinois, USA

    Science.gov (United States)

    Hnottavange-Telleen, K.; Chabora, E.; Finley, R.J.; Greenberg, S.E.; Marsteller, S.

    2011-01-01

    Like most large-scale infrastructure projects, carbon dioxide (CO2) geological sequestration (GS) projects have multiple success criteria and multiple stakeholders. In this context "risk evaluation" encompasses multiple scales, yet a risk management program aims to maximize the chance of project success by assessing, monitoring, and minimizing all risks in a consistent framework. The 150,000-km2 Illinois Basin underlies much of the state of Illinois, USA, and parts of adjacent Kentucky and Indiana. Its potential for CO2 storage is first-rate among basins in North America, an impression that has been strengthened by early testing of the injection well of the Midwest Geological Sequestration Consortium's (MGSC's) Phase III large-scale demonstration project, the Illinois Basin - Decatur Project (IBDP). The IBDP, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), represents a key trial of GS technologies and project-management techniques. Though risks are specific to each site and project, IBDP risk management methodologies provide valuable experience for future GS projects. IBDP views risk as the potential for negative impact to any of five values: health and safety, environment, financial, advancing the viability and public acceptability of a GS industry, and research. Research goals include monitoring one million metric tonnes of injected CO2 in the subsurface. Risk management responds to the ways in which any of these values are at risk: for example, monitoring is designed to reduce uncertainties in parameter values that are important for research and system control, and is also designed to provide public assurance. Identified risks are the primary basis for risk-reduction measures: risks linked to uncertainty in geologic parameters guide further characterization work and guide simulations applied to performance evaluation. Formally, industry defines risk (more precisely, risk criticality) as the product L*S, the Likelihood multiplied by the Severity of the impact.
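
    A minimal sketch of the criticality screening defined at the end of the abstract, with invented risk names and 1-5 likelihood/severity scores (not IBDP's actual risk register):

```python
# Risk criticality = Likelihood * Severity, both on hypothetical 1-5 scales.
risks = {
    "induced seismicity": (2, 4),
    "CO2 leakage to shallow aquifer": (1, 5),
    "injectivity shortfall": (3, 3),
    "public opposition": (3, 2),
}

# Rank risks by criticality, highest first
ranked = sorted(risks.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (L, S) in ranked:
    print(f"{name:32s} L={L} S={S} criticality={L * S}")
```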

  7. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location. Large-scale positive fluctuations correlate with stronger small-scale activity on the low-speed side of the mixing layer and with reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  8. Affordability for sustainable energy development products

    International Nuclear Information System (INIS)

    Riley, Paul H.

    2014-01-01

    Highlights: • Clean cookstoves that also generate electricity improve affordability. • Excel spreadsheet model to assist stakeholders in choosing the optimum technology. • Presents views for each stakeholder: villager, village, and country. • By adding certain capital costs, affordability and sustainability are improved. • Affordability is highly dependent on carbon credits and social understandings. - Abstract: Clean-burning products, for example cooking stoves, can reduce household air pollution (HAP), which prematurely kills 3.5 million people each year. By careful selection of components into a product package, with micro-finance used for the capital payment, barriers to large-scale uptake of products that remove HAP are reduced. Such products reduce smoke from cooking, and the lighting from the electricity produced eliminates smoke from kerosene lamps. A bottom-up financial model, cognisant of end-user social needs, has been developed to compare different products for use in rural areas of developing countries. The model is freely available for use by researchers and can assist in the analysis of changing assumptions. Business views of an individual villager, the village itself, and a country are presented. The model shows that affordability (defined as the effect on household expenses as a result of a product purchase) and recognition of end-user social needs are as important as product cost. The effects of large-scale deployment (greater than 10 million per year) are described, together with the level of subsidy required by the poorest people. With the assumptions given, the model shows that pico-hydro is the most cost-effective option but is not generally available; one thermo-acoustic technology option does not require subsidy, but it is only at technology readiness level 2 (NASA definition), so costs are predicted and very large investment in manufacturing capability is needed to meet the cost target. Thermo-electric is currently the only
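
    The spreadsheet model itself is far richer, but a stripped-down sketch of the affordability metric as defined in the abstract (net effect of a purchase on household expenses, with micro-finance for the capital payment) might look like this; every number below is a hypothetical placeholder:

```python
# Hypothetical inputs; the actual model is an Excel spreadsheet with far
# richer inputs (micro-finance terms, carbon credits, social factors).
capital_cost = 120.0        # product package cost, USD
loan_years, rate = 3, 0.20  # micro-finance term and annual interest rate
fuel_saving = 60.0          # avoided kerosene/fuel spend, USD/yr
carbon_credit = 15.0        # possible carbon-credit revenue, USD/yr
household_budget = 900.0    # annual household expenses, USD

# standard annuity payment for the capital loan
annuity = capital_cost * rate / (1 - (1 + rate) ** -loan_years)
net_annual_cost = annuity - fuel_saving - carbon_credit
affordability = net_annual_cost / household_budget

print(f"annual loan payment: {annuity:.2f} USD")
print(f"net effect on household expenses: {100 * affordability:+.1f}%")
```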

  9. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload, and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow site': housing rent, electricity, equipment, and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black-market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skill and knowledge than small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence, and ecological and community values; starting up large-scale production implies having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically, and personally. The many obstacles that small-scale growers face, and their lack of interest and motivation for going large-scale, suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  10. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    Large areas are mapped with information from an optimum number of new investigation boreholes, existing boreholes, logs, and water samples to obtain an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time-efficient and airborne geophysical data-acquisition platforms (e.g. SkyTEM) has made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored...

  11. Large-scale exact diagonalizations reveal low-momentum scales of nuclei

    Science.gov (United States)

    Forssén, C.; Carlsson, B. D.; Johansson, H. T.; Sääf, D.; Bansal, A.; Hagen, G.; Papenbrock, T.

    2018-03-01

    Ab initio methods aim to solve the nuclear many-body problem with controlled approximations. Virtually exact numerical solutions for realistic interactions can only be obtained for certain special cases such as few-nucleon systems. Here we extend the reach of exact diagonalization methods to handle model spaces with dimension exceeding 10^10 on a single compute node. This allows us to perform no-core shell model (NCSM) calculations for 6Li in model spaces up to Nmax = 22 and to reveal the 4He + d halo structure of this nucleus. Still, the use of a finite harmonic-oscillator basis implies truncations in both infrared (IR) and ultraviolet (UV) length scales. These truncations impose finite-size corrections on observables computed in this basis. We perform IR extrapolations of energies and radii computed in the NCSM and with the coupled-cluster method at several fixed UV cutoffs. It is shown that this strategy enables information gain also from data that is not fully UV converged. IR extrapolations improve the accuracy of relevant bound-state observables for a range of UV cutoffs, thus making them profitable tools. We relate the momentum scale that governs the exponential IR convergence to the threshold energy for the first open decay channel. Using large-scale NCSM calculations we numerically verify this small-momentum scale of finite nuclei.
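
    The IR extrapolation strategy can be illustrated with the commonly used exponential correction E(L) = E_inf + a0 * exp(-2 * k_inf * L), fitted to energies at several effective box sizes L. The sketch below (which requires SciPy) fits synthetic data points, not the paper's results:

```python
import numpy as np
from scipy.optimize import curve_fit

def ir_model(L, E_inf, a0, k_inf):
    """Commonly used IR correction: E(L) = E_inf + a0 * exp(-2 * k_inf * L)."""
    return E_inf + a0 * np.exp(-2.0 * k_inf * L)

# Synthetic ground-state energies (MeV) at increasing effective box sizes L (fm)
L = np.array([8.0, 10.0, 12.0, 14.0, 16.0])
E = np.array([-30.2, -31.1, -31.5, -31.67, -31.74])

popt, pcov = curve_fit(ir_model, L, E, p0=(-31.8, 40.0, 0.2))
E_inf, a0, k_inf = popt
print(f"extrapolated E_inf = {E_inf:.2f} MeV, k_inf = {k_inf:.3f} fm^-1")
```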

  12. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing ... data and representation of the results to the decision makers play an important role. Second, we introduce a selection of alternative, so-called "post-probabilistic", risk management methods developed across different scientific fields to cope with uncertainty due to lack of knowledge. Possibilities for overcoming industrial PSS risk management challenges are suggested through application of post-probabilistic methods. We conclude with a discussion of the importance for the field of considering their application.

  13. An easy, low-cost method to transfer large-scale graphene onto polyethylene terephthalate as a transparent conductive flexible substrate

    International Nuclear Information System (INIS)

    Chen, Chih-Sheng; Hsieh, Chien-Kuo

    2014-01-01

    In this study, we develop a low-cost method for transferring a large-scale graphene film onto a flexible transparent substrate. An easily accessible method of home-made chemical vapor deposition (CVD) and a commercial photograph laminator were utilized to fabricate the low-cost graphene-based transparent conductive flexible substrate. The graphene was grown by CVD on nickel foil using a carbon gas source, and the graphene thin film was easily transferred onto the laminating film via a heated photograph laminator. Field emission scanning electron microscopy and atomic force microscopy were utilized to examine the morphological characteristics of the graphene surface. Raman spectroscopy and transmission electron microscopy were utilized to examine the microstructure of the graphene. The optical–electronic properties of the transferred graphene flexible thin film were measured by ultraviolet–visible spectrometry and a four-point probe. The advantage of this method is that large-scale graphene-based thin films can be easily obtained. We provide an economical method for fabricating a graphene-based transparent conductive flexible substrate. - Highlights: • We synthesized large-scale graphene by a thermal CVD method. • A low-cost commercial photograph laminator was used to transfer graphene. • A large-scale transparent and flexible graphene substrate was obtained easily.

  14. An easy, low-cost method to transfer large-scale graphene onto polyethylene terephthalate as a transparent conductive flexible substrate

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Chih-Sheng; Hsieh, Chien-Kuo, E-mail: jack_hsieh@mail.mcut.edu.tw

    2014-11-03

    In this study, we develop a low-cost method for transferring a large-scale graphene film onto a flexible transparent substrate. An easily accessible method of home-made chemical vapor deposition (CVD) and a commercial photograph laminator were utilized to fabricate the low-cost graphene-based transparent conductive flexible substrate. The graphene was grown by CVD on nickel foil using a carbon gas source, and the graphene thin film was easily transferred onto the laminating film via a heated photograph laminator. Field emission scanning electron microscopy and atomic force microscopy were utilized to examine the morphological characteristics of the graphene surface. Raman spectroscopy and transmission electron microscopy were utilized to examine the microstructure of the graphene. The optical–electronic properties of the transferred graphene flexible thin film were measured by ultraviolet–visible spectrometry and a four-point probe. The advantage of this method is that large-scale graphene-based thin films can be easily obtained. We provide an economical method for fabricating a graphene-based transparent conductive flexible substrate. - Highlights: • We synthesized large-scale graphene by a thermal CVD method. • A low-cost commercial photograph laminator was used to transfer graphene. • A large-scale transparent and flexible graphene substrate was obtained easily.

  15. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale, 120 h^-1 Mpc, in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120 h^-1 Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  16. Extreme weather events in southern Germany - Climatological risk and development of a large-scale identification procedure

    Science.gov (United States)

    Matthies, A.; Leckebusch, G. C.; Rohlfing, G.; Ulbrich, U.

    2009-04-01

    Extreme weather events such as thunderstorms, hail, and heavy rain or snowfall can pose a threat to human life and to considerable tangible assets. Yet there is a lack of knowledge about the present-day climatological risk, its economic effects, and its changes due to rising greenhouse gas concentrations. Therefore, parts of the economy that are particularly sensitive to extreme weather events, such as insurance companies and airports, require regional risk analyses, early warning, and prediction systems to cope with such events. Such an attempt is made for southern Germany, in close cooperation with stakeholders. Comparing ERA40 and station data with impact records of Munich Re and Munich Airport, the 90th percentile was found to be a suitable threshold for extreme impact-relevant precipitation events. Different methods for the classification of the causing synoptic situations have been tested on ERA40 reanalyses. An objective scheme for the classification of Lamb's circulation weather types (CWTs) proved most suitable for correct classification of the large-scale flow conditions. Certain CWTs turned out to be prone to heavy precipitation or, conversely, to carry a very low risk of such events. Other large-scale parameters are tested in connection with CWTs to find the combination with the highest skill for identifying extreme precipitation events in climate model data (ECHAM5 and CLM). For example, vorticity advection at 700 hPa shows good results but assumes knowledge of regional orographic particularities. Therefore, ongoing work focuses on additional testing of parameters that indicate deviations from a basic state of the atmosphere, such as the Eady growth rate or the newly developed Dynamic State Index. Evaluation results will be used to estimate the skill of the regional climate model CLM concerning the simulation of frequency and intensity of the extreme weather events. Data of the A1B scenario (2000-2050) will be examined for a possible climate change
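
    Computing such a percentile threshold is straightforward; below is a small sketch with synthetic daily precipitation (whether the study restricts the percentile to wet days, and the 1 mm wet-day cutoff, are assumptions here):

```python
import numpy as np

rng = np.random.default_rng(1)
precip = rng.gamma(shape=0.6, scale=6.0, size=3650)  # synthetic daily precip, mm

wet = precip[precip >= 1.0]                 # wet days only (assumed >= 1 mm)
threshold = np.percentile(wet, 90)          # 90th percentile of wet days
extreme_days = np.flatnonzero(precip > threshold)

print(f"90th-percentile threshold: {threshold:.1f} mm/day")
print(f"{extreme_days.size} extreme days flagged out of {precip.size}")
```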

  17. Associations of biological factors and affordances in the home with infant motor development.

    Science.gov (United States)

    Saccani, Raquel; Valentini, Nadia C; Pereira, Keila Rg; Müller, Alessandra B; Gabbard, Carl

    2013-04-01

    Whereas considerable work has been published regarding biological factors associated with infant health, much less is known about the associations of environmental context with infant development - the focus of the present cross-sectional study. Data were collected on 561 infants, aged newborn to 18 months. Measures included the Affordances in the Home Environment for Motor Development-Infant Scale, Alberta Infant Motor Scale, and selected bio/medical factors. Correlation and regression were used to analyze the data. Home environmental factors were associated with children's motor development as much as some typically high-risk biologic factors. The home environment partially explained infant development outcomes and infants at risk could possibly be helped with a home assessment for affordances. © 2012 The Authors. Pediatrics International © 2012 Japan Pediatric Society.

  18. Identification of low order models for large scale processes

    NARCIS (Netherlands)

    Wattamwar, S.K.

    2010-01-01

    Many industrial chemical processes are complex, multi-phase and large scale in nature. These processes are characterized by various nonlinear physiochemical effects and fluid flows. Such processes often show coexistence of fast and slow dynamics during their time evolutions. The increasing demand

  19. The large-scale solar feed-in tariff reverse auction in the Australian Capital Territory, Australia

    International Nuclear Information System (INIS)

    Buckman, Greg; Sibley, Jon; Bourne, Richard

    2014-01-01

    Feed-in tariffs (FiTs) offer renewable energy developers significant investor certainty, but sometimes at the cost of being misaligned with generation costs. Reverse FiT auctions, where the FiT rights for a predetermined capacity are auctioned, can overcome this problem but can be plagued by non-delivery risks, particularly for competitively priced proposals. In 2012 and 2013 the Australian Capital Territory (ACT) Government in Australia conducted a FiT reverse auction for 40 MW of large-scale solar generating capacity, the first such auction undertaken in the country. The auction was highly competitive in relation to both price and delivery risk. Proposal capital costs, particularly engineering, procurement, and construction costs, as well as internal rates of return, were lower than expected. The auction process revealed limited land availability for large-scale solar developments in the ACT, as well as a significant perceived sovereign-risk issue. The auction process was designed to mitigate non-delivery risk by requiring proposals to be pre-qualified on the basis of delivery risk before considering FiT pricing. The scheme is likely to be used by the ACT Government to support further large-scale renewable energy development as part of its greenhouse gas reduction strategy, which is underpinned by a 90-per-cent-by-2020 renewable energy target. - Highlights: • Evolution of the reverse auction process in the Australian Capital Territory. • Analysis of the outcomes of the first Australian feed-in tariff reverse auction. • Identification of the major drivers of the low FiT prices achieved in the auction. • Identification of major issues that emerged in the auction

  20. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation, and covariance between the variables. They are used as input into catchment models. A long-term simulation of this combined system enables the derivation of very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data at 528 stations covering not only all of Germany but also parts of France, Switzerland, the Czech Republic, and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau, and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes the several
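
    Once a long synthetic discharge series exists, derived flood frequencies can be read off as empirical quantiles of the annual maxima. The sketch below uses stand-in Gumbel noise instead of actual weather-generator/SWIM output; the distribution parameters are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-in for 10,000 years of simulated daily discharge at one gauge (m^3/s)
daily_q = rng.gumbel(loc=800, scale=250, size=(10_000, 365))
annual_max = daily_q.max(axis=1)            # annual maximum series

for T in (10, 100, 1000):                   # return periods in years
    q_T = np.quantile(annual_max, 1 - 1 / T)  # empirical T-year flood
    print(f"{T:>5}-year flood: {q_T:8.0f} m^3/s")
```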

  1. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960s and early 1970s, interest in testing and operating RF cavities at 1.8 K motivated the development and construction of four large (300 W) 1.8 K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved niobium-titanium superconductors have once again created interest in large-scale 1.8 K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 W 1.8 K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low-temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0 K. Magnetic refrigerators of 10 W capacity or higher at 1.8 K are now being developed. The state of the art of large-scale refrigeration in the range under 4 K is reviewed. 28 refs., 4 figs., 7 tabs

  2. Cross-cultural adaptation and validation of the Chinese Comfort, Afford, Respect, and Expect scale of caring nurse-patient interaction competence.

    Science.gov (United States)

    Chung, Hui-Chun; Hsieh, Tsung-Cheng; Chen, Yueh-Chih; Chang, Shu-Chuan; Hsu, Wen-Lin

    2017-11-29

    To investigate the construct validity and reliability of the Chinese Comfort, Afford, Respect, and Expect scale, which can be used to determine clinical nurses' competence. The results can also serve to promote nursing competence and improve patient satisfaction. Nurse-patient interaction is critical for improving nursing care quality. However, to date, no relevant validated instrument has been proposed for assessing caring nurse-patient interaction competence in clinical practice. This study adapted and validated the Chinese version of the caring nurse-patient interaction scale. A cross-cultural adaptation and validation study. A psychometric analysis of the four major constructs of the Chinese Comfort, Afford, Respect, and Expect scale was conducted on a sample of 356 nurses from a medical centre in China. Item analysis and exploratory factor analysis were adopted to extract the main components, both the internal consistency and correlation coefficients were used to examine reliability and a confirmatory factor analysis was adopted to verify the construct validity. The goodness-of-fit results of the model were strong. The standardised factor loadings of the Chinese Comfort, Afford, Respect, and Expect scale ranged from 0.73-0.95, indicating that the validity and reliability of this instrument were favourable. Moreover, the 12 extracted items explained 95.9% of the measured content of the Chinese Comfort, Afford, Respect, and Expect scale. The results serve as empirical evidence regarding the validity and reliability of the Chinese Comfort, Afford, Respect, and Expect scale. Hospital nurses increasingly demand help from patients and their family members in identifying health problems and assisting with medical decision-making. Therefore, enhancing nurses' competence in nurse-patient interactions is crucial for nursing and hospital managers to improve nursing care quality. The Chinese caring nurse-patient interaction scale can serve as an effective tool for nursing
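
    Internal consistency of a multi-item scale like this is typically summarized with Cronbach's alpha; a small self-contained sketch on simulated 12-item Likert responses (not the study's data) is shown below:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(356, 1))                 # shared trait per respondent
# 12 simulated five-point items driven by the latent trait plus noise
scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(356, 12))),
                 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```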

  3. Large-scale evaluation of candidate genes identifies associations between VEGF polymorphisms and bladder cancer risk.

    Directory of Open Access Journals (Sweden)

    Montserrat García-Closas

    2007-02-01

    Common genetic variation could alter the risk of developing bladder cancer. We conducted a large-scale evaluation of single nucleotide polymorphisms (SNPs) in candidate genes for cancer to identify common variants that influence bladder cancer risk. An Illumina GoldenGate assay was used to genotype 1,433 SNPs within or near 386 genes in 1,086 cases and 1,033 controls in Spain. The most significant finding was in the 5' UTR of VEGF (rs25648; p for likelihood ratio test, 2 degrees of freedom, = 1 x 10^-5). To further investigate the region, we analyzed 29 additional SNPs in VEGF, selected to saturate the promoter and 5' UTR and to tag common genetic variation in this gene. Three additional SNPs in the promoter region (rs833052, rs1109324, and rs1547651) were associated with increased risk for bladder cancer, odds ratio (95% confidence interval): 2.52 (1.06-5.97), 2.74 (1.26-5.98), and 3.02 (1.36-6.63), respectively; and a polymorphism in intron 2 (rs3024994) was associated with reduced risk: 0.65 (0.46-0.91). Two of the promoter SNPs and the intron 2 SNP showed linkage disequilibrium with rs25648. Haplotype analyses revealed three blocks of linkage disequilibrium with significant associations for two blocks including the promoter and 5' UTR (global p = 0.02 and 0.009, respectively). These findings are biologically plausible since VEGF is critical in angiogenesis, which is important for tumor growth; its elevated expression in bladder tumors correlates with tumor progression, and specific 5' UTR haplotypes have been shown to influence promoter activity. Associations between bladder cancer risk and other genes in this report were not robust based on false discovery rate calculations. In conclusion, this large-scale evaluation of candidate cancer genes has identified common genetic variants in the regulatory regions of VEGF that could be associated with bladder cancer risk.
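
    Such odds ratios and confidence intervals follow from 2x2 counts via the standard Woolf (log-OR) method; the counts below are hypothetical, not the study's genotype data:

```python
import math

# Hypothetical 2x2 counts: carriers vs non-carriers of a risk allele,
# split into cases and controls.
a, b = 180, 120   # cases / controls carrying the risk allele
c, d = 906, 913   # cases / controls without it

or_hat = (a * d) / (b * c)                          # odds ratio
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf's SE of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se_log)
hi = math.exp(math.log(or_hat) + 1.96 * se_log)
print(f"OR = {or_hat:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```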

  4. Large-Scale Spacecraft Fire Safety Experiments in ISS Resupply Vehicles

    Science.gov (United States)

    Ruff, Gary A.; Urban, David

    2013-01-01

    Our understanding of the fire safety risk in manned spacecraft has been limited by the small scale of the testing we have been able to conduct in low gravity. Fire growth and spread cannot be expected to scale linearly with sample size, so we cannot make accurate predictions of the behavior of realistic-scale fires in spacecraft based on the limited low-g testing to date. As a result, spacecraft fire safety protocols are necessarily very conservative and costly. Future crewed missions are expected to be longer in duration than previous exploration missions outside of low Earth orbit and, accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Given our fundamental uncertainty about the behavior of fires in low gravity, the need for realistic-scale testing at reduced gravity has been demonstrated. To address this concern, a spacecraft fire safety research project is underway to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low gravity. This project is supported by the NASA Advanced Exploration Systems Program Office in the Human Exploration and Operations Mission Directorate. The activity of this project is supported by an international topical team of fire experts from other space agencies to maximize the utility of the data and to ensure the widest possible scrutiny of the concept. The large-scale space flight experiment will be conducted on three missions, each in an Orbital Sciences Corporation Cygnus vehicle after it has deberthed from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew allows the fire products to be released into the cabin. The tests will be fully automated, with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere.

  5. Global analysis of seagrass restoration: the importance of large-scale planting

    KAUST Repository

    van Katwijk, Marieke M.; Thorhaug, Anitra; Marbà , Nú ria; Orth, Robert J.; Duarte, Carlos M.; Kendrick, Gary A.; Althuizen, Inge H. J.; Balestri, Elena; Bernard, Guillaume; Cambridge, Marion L.; Cunha, Alexandra; Durance, Cynthia; Giesen, Wim; Han, Qiuying; Hosokawa, Shinya; Kiswara, Wawan; Komatsu, Teruhisa; Lardicci, Claudio; Lee, Kun-Seop; Meinesz, Alexandre; Nakaoka, Masahiro; O'Brien, Katherine R.; Paling, Erik I.; Pickerell, Chris; Ransijn, Aryan M. A.; Verduin, Jennifer J.

    2015-01-01

    In coastal and estuarine systems, foundation species like seagrasses, mangroves, saltmarshes or corals provide important ecosystem services. Seagrasses are globally declining and their reintroduction has been shown to restore ecosystem functions. However, seagrass restoration is often challenging, given the dynamic and stressful environment that seagrasses often grow in. From our world-wide meta-analysis of seagrass restoration trials (1786 trials), we describe general features and best practice for seagrass restoration. We confirm that removal of threats is important prior to replanting. Reduced water quality (mainly eutrophication), and construction activities led to poorer restoration success than, for instance, dredging, local direct impact and natural causes. Proximity to and recovery of donor beds were positively correlated with trial performance. Planting techniques can influence restoration success. The meta-analysis shows that both trial survival and seagrass population growth rate in trials that survived are positively affected by the number of plants or seeds initially transplanted. This relationship between restoration scale and restoration success was not related to trial characteristics of the initial restoration. The majority of the seagrass restoration trials have been very small, which may explain the low overall trial survival rate (i.e. estimated 37%). Successful regrowth of the foundation seagrass species appears to require crossing a minimum threshold of reintroduced individuals. Our study provides the first global field evidence for the requirement of a critical mass for recovery, which may also hold for other foundation species showing strong positive feedback to a dynamic environment. Synthesis and applications. For effective restoration of seagrass foundation species in its typically dynamic, stressful environment, introduction of large numbers is seen to be beneficial and probably serves two purposes. First, a large-scale planting

  6. Global analysis of seagrass restoration: the importance of large-scale planting

    KAUST Repository

    van Katwijk, Marieke M.

    2015-10-28

    In coastal and estuarine systems, foundation species like seagrasses, mangroves, saltmarshes or corals provide important ecosystem services. Seagrasses are globally declining and their reintroduction has been shown to restore ecosystem functions. However, seagrass restoration is often challenging, given the dynamic and stressful environment that seagrasses often grow in. From our world-wide meta-analysis of seagrass restoration trials (1786 trials), we describe general features and best practice for seagrass restoration. We confirm that removal of threats is important prior to replanting. Reduced water quality (mainly eutrophication), and construction activities led to poorer restoration success than, for instance, dredging, local direct impact and natural causes. Proximity to and recovery of donor beds were positively correlated with trial performance. Planting techniques can influence restoration success. The meta-analysis shows that both trial survival and seagrass population growth rate in trials that survived are positively affected by the number of plants or seeds initially transplanted. This relationship between restoration scale and restoration success was not related to trial characteristics of the initial restoration. The majority of the seagrass restoration trials have been very small, which may explain the low overall trial survival rate (i.e. estimated 37%). Successful regrowth of the foundation seagrass species appears to require crossing a minimum threshold of reintroduced individuals. Our study provides the first global field evidence for the requirement of a critical mass for recovery, which may also hold for other foundation species showing strong positive feedback to a dynamic environment. Synthesis and applications. For effective restoration of seagrass foundation species in its typically dynamic, stressful environment, introduction of large numbers is seen to be beneficial and probably serves two purposes. First, a large-scale planting

  7. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four data sets of large-scale map data. Errors in the map data were used for a risk assessment of decisions about the localization of objects, e.g. for land-use planning in the realization of investments. An analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of the map data).
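
    The quantities involved, shift vectors of control points and the positional error statistics derived from them, can be computed in a few lines; the vectors below are invented placeholders, not the study's four data sets:

```python
import numpy as np

# Shift vectors of control points: reference minus map coordinates (metres).
# Hypothetical values for illustration only.
dx = np.array([0.12, -0.35, 0.08, 0.51, -0.22, 0.30])
dy = np.array([-0.18, 0.27, 0.05, -0.44, 0.33, -0.09])

shift = np.hypot(dx, dy)                    # planimetric displacement per point
rmse = np.sqrt(np.mean(dx**2 + dy**2))      # positional RMSE

print(f"mean shift = {shift.mean():.2f} m, max = {shift.max():.2f} m")
print(f"positional RMSE = {rmse:.2f} m")
```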

  8. Low-cost and large-scale flexible SERS-cotton fabric as a wipe substrate for surface trace analysis

    Science.gov (United States)

    Chen, Yanmin; Ge, Fengyan; Guang, Shanyi; Cai, Zaisheng

    2018-04-01

    Large-scale surface-enhanced Raman scattering (SERS) cotton fabrics were fabricated from traditional woven fabrics using a dyeing-like method analogous to vat dyeing, in which silver nanoparticles (Ag NPs) were synthesized in situ by a 'dipping-reducing-drying' process. By controlling the concentration of the AgNO3 solution, an optimal SERS cotton fabric with a homogeneous, close packing of Ag NPs was obtained. The SERS cotton fabric was employed to detect p-aminothiophenol (PATP). The new fabric possessed excellent reproducibility (about 20%), long-term stability (about 57 days), and high SERS sensitivity, with a detected concentration as low as 10^-12 M. Furthermore, owing to its excellent mechanical flexibility and good absorption ability, the SERS cotton fabric was employed to detect carbaryl on the surface of an apple by simple swabbing, showing great potential for fast trace analysis. More importantly, this approach may enable large-scale, low-cost production based on traditional cotton fabric.

  9. Low Risk Anomalies?

    DEFF Research Database (Denmark)

    Schneider, Paul; Wagner, Christian; Zechner, Josef

    This paper shows theoretically and empirically that beta- and volatility-based low-risk anomalies are driven by return skewness. The empirical patterns concisely match the predictions of our model, which endogenizes the role of skewness for stock returns through default risk. With increasing downside risk, the standard capital asset pricing model (CAPM) increasingly overestimates expected equity returns relative to firms' true (skew-adjusted) market risk. Empirically, the profitability of betting against beta/volatility increases with firms' downside risk, and the risk-adjusted return differential of betting against beta/volatility among low-skew firms compared to high-skew firms is economically large. Our results suggest that the returns to betting against beta or volatility do not necessarily pose asset pricing puzzles, but rather that such strategies collect premia that compensate for skew risk.

  10. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark, and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small- and large-scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model, no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping, scale effects...

  11. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  12. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    Brendan Dolan-Gavitt, Patrick Hulin, Tim Leek, Fredrich Ulrich, Ryan Whelan ... released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag ... (low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  13. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  14. Results of Large-Scale Spacecraft Flammability Tests

    Science.gov (United States)

    Ferkul, Paul; Olson, Sandra; Urban, David L.; Ruff, Gary A.; Easton, John; T'ien, James S.; Liao, Ta-Ting T.; Fernandez-Pello, A. Carlos; Torero, Jose L.; Eigenbrand, Christian; hide

    2017-01-01

    For the first time, a large-scale fire was intentionally set inside a spacecraft while in orbit. Testing in low gravity aboard spacecraft had been limited to samples of modest size: for thin fuels the longest samples burned were around 15 cm in length, and thick fuel samples have been even smaller. This is despite the fact that fire is a catastrophic hazard for spaceflight, and the spread and growth of a fire, combined with its interactions with the vehicle, cannot be expected to scale linearly. While every type of occupied structure on earth has been the subject of full-scale fire testing, this had never been attempted in space owing to the complexity, cost, risk, and absence of a safe location. Thus, there is a gap in knowledge of fire behavior in spacecraft. The recent utilization of large, unmanned resupply craft has provided the needed capability: a habitable but unoccupied spacecraft in low earth orbit. One such vehicle was used to study the flame spread over a 94 x 40.6 cm thin charring solid (fiberglass-cotton fabric). The sample was an order of magnitude larger than anything studied to date in microgravity and was of sufficient scale that it consumed 1.5% of the available oxygen. The experiment, called Saffire, consisted of two tests: forward or concurrent flame spread (with the direction of flow) and opposed flame spread (against the direction of flow). The average forced air speed was 20 cm/s. For the concurrent flame spread test, the flame size remained constrained after the ignition transient, which is not the case in 1-g. These results were qualitatively different from those on earth, where an upward-spreading flame on a sample of this size accelerates and grows. In addition, a curious effect of the chamber size is noted. Compared to previous microgravity work in smaller tunnels, the flame in the larger tunnel spread more slowly, even for a wider sample. This is attributed to the effect of flow acceleration in the smaller tunnels as a result of hot

  15. Refusal to enrol in Ghana's National Health Insurance Scheme: is affordability the problem?

    Science.gov (United States)

    Kusi, Anthony; Enemark, Ulrika; Hansen, Kristian S; Asante, Felix A

    2015-01-17

    Access to health insurance is expected to have a positive effect in improving access to healthcare and to offer financial risk protection to households. Ghana began the implementation of a National Health Insurance Scheme (NHIS) in 2004 as a way to ensure equitable access to basic healthcare for all residents. After a decade of implementation, national coverage is just about 34% of the national population. Affordability of the NHIS contribution is often cited by households as a major barrier to enrolment in the NHIS, without any rigorous analysis of this claim. In light of the global interest in achieving universal health insurance coverage, this study examines the extent to which affordability of the NHIS contribution is a barrier to full insurance for households and a burden on their resources. The study uses data from a cross-sectional household survey involving 2,430 households from three districts in Ghana, conducted between January and April 2011. Affordability of the NHIS contribution is analysed using the household budget-based approach, based on the normative definition of affordability. The burden of the NHIS contribution on households is assessed by relating the expected annual NHIS contribution to household non-food expenditure and total consumption expenditure. Households which could not afford full insurance were identified. Results show that 66% of uninsured households and 70% of partially insured households could afford full insurance for their members. Enrolling all household members in the NHIS would account for 5.9% of household non-food expenditure or 2.0% of total expenditure, but higher for households in the first (11.4%) and second (7.0%) socio-economic quintiles. All the households (29%) identified as unable to afford full insurance were in the two lower socio-economic quintiles and had large household sizes. Non-financial factors relating to attributes of the insurer and health system problems also affect enrolment in the NHIS. Affordability
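
    The budget-based affordability test described here reduces to comparing the expected annual contribution against household expenditure aggregates. In the sketch below, the household figures and the 10% affordability norm are assumptions for illustration, not the study's parameters:

```python
# Hypothetical household; all figures are placeholders.
members = 6
premium_per_member = 18.0          # GHS/yr, varies in practice
nonfood_expenditure = 1500.0       # GHS/yr
total_expenditure = 4200.0         # GHS/yr
AFFORDABILITY_NORM = 0.10          # assumed norm: 10% of non-food spending

contribution = members * premium_per_member
print(f"share of non-food expenditure: {contribution / nonfood_expenditure:.1%}")
print(f"share of total expenditure:    {contribution / total_expenditure:.1%}")
print("affordable" if contribution <= AFFORDABILITY_NORM * nonfood_expenditure
      else "not affordable under the assumed norm")
```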

  16. Large-scale risk assessment of polycyclic aromatic hydrocarbons in shoreline sediments from Saudi Arabia: Environmental legacy after twelve years of the Gulf war oil spill

    Energy Technology Data Exchange (ETDEWEB)

    Bejarano, Adriana C., E-mail: ABejarano@researchplanning.co [Research Planning Inc., 1121 Park St., Columbia, SC 29201 (United States); Michel, Jacqueline [Research Planning Inc., 1121 Park St., Columbia, SC 29201 (United States)

    2010-05-15

    A large-scale assessment of polycyclic aromatic hydrocarbons (PAHs) from the 1991 Gulf War oil spill was performed for 2002-2003 sediment samples (n = 1679) collected from habitats along the shoreline of Saudi Arabia. Benthic sediment toxicity was characterized using the Equilibrium Partitioning Sediment Benchmark Toxic Unit approach for 43 PAHs (ESBTU_FCV,43). Samples were assigned to risk categories according to ESBTU_FCV,43 values: no-risk (≤1), low (>1-≤2), low-medium (>2-≤3), medium (>3-≤5), and high-risk (>5). Sixty-seven percent of samples had ESBTU_FCV,43 > 1, indicating potential adverse ecological effects. Sediments from the 0-30 cm layer from tidal flats, and the >30-<60 cm layer from heavily oiled halophytes and mangroves, had a high frequency of high-risk samples. No-risk samples were characterized by chrysene enrichment and depletion of lighter-molecular-weight PAHs, while high-risk samples showed little oil weathering and PAH patterns similar to 1993 samples. Sediments north of Safaniya were not likely to pose adverse ecological effects, contrary to sediments south of Tanaqib. Landscape and geomorphology have played a role in the distribution and persistence in sediments of oil from the Gulf War. - Risk assessment of PAHs in shoreline sediments 12 years after the Gulf War oil spill.

  17. Large-scale risk assessment of polycyclic aromatic hydrocarbons in shoreline sediments from Saudi Arabia: Environmental legacy after twelve years of the Gulf war oil spill

    International Nuclear Information System (INIS)

    Bejarano, Adriana C.; Michel, Jacqueline

    2010-01-01

    A large-scale assessment of polycyclic aromatic hydrocarbons (PAHs) from the 1991 Gulf War oil spill was performed for 2002-2003 sediment samples (n = 1679) collected from habitats along the shoreline of Saudi Arabia. Benthic sediment toxicity was characterized using the Equilibrium Partitioning Sediment Benchmark Toxic Unit approach for 43 PAHs (ESBTU_FCV,43). Samples were assigned to risk categories according to ESBTU_FCV,43 values: no-risk (≤1), low (>1-≤2), low-medium (>2-≤3), medium (>3-≤5), and high-risk (>5). Sixty-seven percent of samples had ESBTU_FCV,43 > 1, indicating potential adverse ecological effects. Sediments from the 0-30 cm layer from tidal flats, and the >30-<60 cm layer from heavily oiled halophytes and mangroves, had a high frequency of high-risk samples. No-risk samples were characterized by chrysene enrichment and depletion of lighter-molecular-weight PAHs, while high-risk samples showed little oil weathering and PAH patterns similar to 1993 samples. Sediments north of Safaniya were not likely to pose adverse ecological effects, contrary to sediments south of Tanaqib. Landscape and geomorphology have played a role in the distribution and persistence in sediments of oil from the Gulf War. - Risk assessment of PAHs in shoreline sediments 12 years after the Gulf War oil spill.
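
    The risk categorization used in both records above maps directly to a small lookup; the thresholds below are taken verbatim from the abstract:

```python
def esbtu_risk_category(esbtu: float) -> str:
    """Map an ESBTU_FCV,43 value to the study's risk categories."""
    if esbtu <= 1:
        return "no-risk"
    if esbtu <= 2:
        return "low"
    if esbtu <= 3:
        return "low-medium"
    if esbtu <= 5:
        return "medium"
    return "high-risk"

for value in (0.4, 1.7, 2.9, 4.2, 8.5):
    print(f"ESBTU = {value:4.1f} -> {esbtu_risk_category(value)}")
```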

  18. Unitarity bounds on low scale quantum gravity

    International Nuclear Information System (INIS)

    Atkins, Michael; Calmet, Xavier

    2010-01-01

    We study the unitarity of models with low scale quantum gravity both in four dimensions and in models with a large extra-dimensional volume. We find that models with low scale quantum gravity have problems with unitarity below the scale at which gravity becomes strong. An important consequence of our work is that their first signal at the Large Hadron Collider would not be of a gravitational nature such as graviton emission or small black holes, but rather would be linked to the mechanism which fixes the unitarity problem. We also study models with scalar fields with non-minimal couplings to the Ricci scalar. We consider the strength of gravity in these models and study the consequences for inflation models with non-minimally coupled scalar fields. We show that a single scalar field with a large non-minimal coupling can lower the Planck mass in the TeV region. In that model, it is possible to lower the scale at which gravity becomes strong down to 14 TeV without violating unitarity below that scale. (orig.)

  19. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; hide

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on a 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS resupply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), nine smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  20. The role of large scale storage in a GB low carbon energy future: Issues and policy challenges

    International Nuclear Information System (INIS)

    Gruenewald, Philipp; Cockerill, Tim; Contestabile, Marcello; Pearson, Peter

    2011-01-01

    Large scale storage offers the prospect of capturing and using excess electricity within a low carbon energy system, which otherwise might have to be wasted. Incorporating the role of storage into current scenario tools is challenging, because it requires high temporal resolution to reflect the effects of intermittent sources on system balancing. This study draws on results from a model with such resolution. It concludes that large scale storage could become economically viable for scenarios with high penetration of renewables. As the proportion of intermittent sources increases, the optimal type of storage shifts towards solutions with low energy related costs, even at the expense of efficiency. However, a range of uncertainties have been identified, concerning storage technology development, the regulatory environment, alternatives to storage and the stochastic uncertainty of year-on-year revenues. All of these negatively affect the cost of finance and the chances of successful market uptake. We argue, therefore, that, if the possible wider system and social benefits from the presence of storage are to be achieved, stronger and more strategic policy support may be necessary. More work on the social and system benefits of storage is needed to gauge the appropriate extent of support measures. - Highlights: → Time resolved modelling shows future potential for large scale power storage in GB. → The value of storage is highly sensitive to a range of parameters. → Uncertainty over the revenue from storage could pose a barrier to investment. → To realise wider system benefits stronger and more strategic policy support may be necessary.

  1. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  2. Solar wind fluctuations at large scale: A comparison between low and high solar activity conditions

    International Nuclear Information System (INIS)

    Bavassano, B.; Bruno, R.

    1991-01-01

    The influence of the Sun's activity cycle on the solar wind fluctuations at time scales from 1 hour to 3 days in the inner heliosphere (0.3 to 1 AU) is investigated. Hourly averages of plasma and magnetic field data by the Helios spacecraft are used. Since fluctuations behave quite differently with changing scale, the analysis is performed separately for two different ranges in time scale. Between 1 and 6 hours, Alfvenic fluctuations and pressure-balanced structures are extensively observed. At low solar activity and close to 0.3 AU, Alfvenic fluctuations are more frequent than pressure-balanced structures. This predominance, however, weakens for rising solar activity and radial distance, to the point that a role exchange, in terms of occurrence rate, is found at the maximum of the cycle close to 1 AU. On the other hand, in all cases Alfvenic fluctuations have a larger amplitude than pressure-balanced structures. On the whole, the Alfvenic contribution to the solar wind energy spectrum turns out to be dominant at all solar activity conditions. At scales from 0.5 to 3 days, the most important feature is the growth, as the solar wind expansion develops, of strong positive correlations between magnetic and thermal pressures. These structures are progressively built up by the interaction between different wind flows. This effect is more pronounced at low than at high activity. Our findings support the conclusion that the solar cycle evolution of the large-scale velocity pattern is the factor governing the observed variations.

  3. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  4. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting". We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
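
    The record does not spell out the hierarchical model, so the following is only a minimal sketch of the general idea it names: shrinking a noisy sample covariance toward a structured target with a data-chosen weight (scikit-learn's Ledoit-Wolf estimator stands in for the paper's Bayesian hierarchy).

        # Hedged sketch: shrinkage covariance estimation as a stand-in for the
        # paper's Bayesian hierarchical model (not reproduced here).
        import numpy as np
        from sklearn.covariance import LedoitWolf

        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 200))   # few samples, many variables ("large-scale")

        sample_cov = np.cov(X, rowvar=False)          # rank-deficient, high variance
        shrunk_cov = LedoitWolf().fit(X).covariance_  # shrunk toward a scaled identity

        # The shrunk estimate is far better conditioned when p >> n.
        print(np.linalg.cond(shrunk_cov), "<<", np.linalg.cond(sample_cov))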

  5. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  6. Upscaling of Large-Scale Transport in Spatially Heterogeneous Porous Media Using Wavelet Transformation

    Science.gov (United States)

    Moslehi, M.; de Barros, F.; Ebrahimi, F.; Sahimi, M.

    2015-12-01

    Modeling flow and solute transport in large-scale heterogeneous porous media involves substantial computational burdens. A common approach to alleviate this complexity is to utilize upscaling methods. These generate upscaled models with less complexity while attempting to preserve hydrogeological properties comparable to those of the original fine-scale model. We use Wavelet Transformations (WT) of the spatial distribution of the aquifer's properties to upscale the hydrogeological models and, consequently, the transport processes. In particular, we apply the technique to a porous formation with broadly distributed and correlated transmissivity to verify the performance of the WT. First, transmissivity fields are coarsened using WT in such a way that the high-transmissivity zones, in which more important information is embedded, mostly remain the same, while the low-transmissivity zones are averaged out since they contain less information about the hydrogeological formation. Next, flow and non-reactive transport are simulated in both fine-scale and upscaled models to predict both the concentration breakthrough curves at a control location and the large-scale spreading of the plume around its centroid. The results reveal that the WT of the fields generates non-uniform grids with, on average, 2.1% of the number of grid blocks in the original fine-scale models, which eventually leads to a significant reduction in the computational costs. We show that the upscaled model obtained through the WT reconstructs the concentration breakthrough curves and the spreading of the plume at different times accurately. Furthermore, the impacts of the Hurst coefficient, the size of the flow domain and orders-of-magnitude differences in transmissivity values on the results have been investigated. It is observed that as the heterogeneity and the size of the domain increase, better agreement between the results of fine-scale and upscaled models can be achieved. Having this framework at hand aids
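
    As a hedged sketch of the coarsening idea (a generic wavelet-compression stand-in, not the study's exact scheme; the field, wavelet family, and 2% retention rate are illustrative), small detail coefficients in smooth low-transmissivity zones are discarded while large coefficients marking high-transmissivity contrasts are kept:

        # Hedged sketch: wavelet-based coarsening of a heterogeneous
        # log-transmissivity field with PyWavelets.
        import numpy as np
        import pywt

        rng = np.random.default_rng(1)
        log_T = rng.normal(size=(256, 256))        # synthetic log-transmissivity field

        coeffs = pywt.wavedec2(log_T, "haar", level=4)
        arr, slices = pywt.coeffs_to_array(coeffs)

        # Keep the largest ~2% of coefficients (cf. the ~2.1% of grid blocks retained).
        mask = np.abs(arr) >= np.quantile(np.abs(arr), 0.98)
        mask[slices[0]] = True                     # always keep the coarse approximation
        arr *= mask

        upscaled = pywt.waverec2(
            pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), "haar")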

  7. Low Risk Anomalies?

    DEFF Research Database (Denmark)

    Schneider, Paul; Wagner, Christian; Zechner, Josef

    This paper shows that stocks' CAPM alphas are negatively related to CAPM betas if investors demand compensation for negative skewness. Thus, high (low) beta stocks appear to underperform (outperform). This apparent anomaly merely reflects compensation for residual coskewness ignored by the CAPM. Empirically, we find that option-implied ex-ante skewness is strongly related to ex-post residual coskewness and alphas. Beta- and volatility-based low risk anomalies are largely driven by a single principal component, which is in turn largely explained by skewness. Controlling for skewness renders the alphas of betting-against-beta and -volatility insignificant.

  8. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural, catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff, therefore, provides a foundation to approach European hydrology with respect to observed patterns on large scales, and with regard to the ability of models to capture these. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also make it possible to detect shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large-scale

  9. Is quality affordable?

    Directory of Open Access Journals (Sweden)

    Robert Lindfield

    2008-12-01

    The question "Is quality affordable?" is loaded with dynamite! Can a person who lives on less than US $1 per day afford a high-quality cataract operation? If the answer is 'No', then do we offer that person poor or low-quality services? Do people living in poverty have a 'right' to high-quality eye or health care? If the answer is 'Yes', then at what price and who should pay? Should we ignore quality and focus on affordability? Or should we provide high-quality services in the hope that someone else will pay? These are difficult questions, which policy makers, managers, and clinicians must face and try to answer.

  10. Tracking of large-scale structures in turbulent channel with direct numerical simulation of low Prandtl number passive scalar

    Science.gov (United States)

    Tiselj, Iztok

    2014-12-01

    Channel flow DNS (Direct Numerical Simulation) at friction Reynolds number 180 and with passive scalars of Prandtl numbers 1 and 0.01 was performed in various computational domains. The "normal" size domain was ~2300 wall units long and ~750 wall units wide; this size was taken from the similar DNS of Moser et al. The "large" computational domain, which is supposed to be sufficient to describe the largest structures of the turbulent flows, was 3 times longer and 3 times wider than the "normal" domain. The "very large" domain was 6 times longer and 6 times wider than the "normal" domain. All simulations were performed with the same spatial and temporal resolution. Comparison of the standard and large computational domains shows velocity field statistics (mean velocity, root-mean-square (RMS) fluctuations, and turbulent Reynolds stresses) that agree within 1%-2%. Similar agreement is observed for Pr = 1 temperature fields and can be observed also for the mean temperature profiles at Pr = 0.01. These differences can be attributed to the statistical uncertainties of the DNS. However, second-order moments, i.e., RMS temperature fluctuations, of the standard and large computational domains at Pr = 0.01 show significant differences of up to 20%. Stronger temperature fluctuations in the "large" and "very large" domains confirm the existence of the large-scale structures. Their influence is more or less invisible in the main velocity field statistics or in the statistics of the temperature fields at Prandtl numbers around 1. However, these structures play a visible role in the temperature fluctuations at low Prandtl number, where high temperature diffusivity effectively smears the small-scale structures in the thermal field and enhances the relative contribution of large scales. These large thermal structures represent a kind of echo of the large-scale velocity structures: the highest temperature-velocity correlations are not observed between the instantaneous temperatures and

  11. Kinetically controlled synthesis of large-scale morphology-tailored silver nanostructures at low temperature

    Science.gov (United States)

    Zhang, Ling; Zhao, Yuda; Lin, Ziyuan; Gu, Fangyuan; Lau, Shu Ping; Li, Li; Chai, Yang

    2015-08-01

    Ag nanostructures are widely used in catalysis, energy conversion and chemical sensing. Morphology-tailored synthesis of Ag nanostructures is critical to tune physical and chemical properties. In this study, we develop a method for synthesizing the morphology-tailored Ag nanostructures in aqueous solution at a low temperature (45 °C). With the use of AgCl nanoparticles as the precursor, the growth kinetics of Ag nanostructures can be tuned with the pH value of solution and the concentration of Pd cubes which catalyze the reaction. Ascorbic acid and cetylpyridinium chloride are used as the mild reducing agent and capping agent in aqueous solution, respectively. High-yield Ag nanocubes, nanowires, right triangular bipyramids/cubes with twinned boundaries, and decahedra are successfully produced. Our method opens up a new environmentally-friendly and economical route to synthesize large-scale and morphology-tailored Ag nanostructures, which is significant to the controllable fabrication of Ag nanostructures and fundamental understanding of the growth kinetics.

  12. Solar wind fluctuations at large scale - A comparison between low and high solar activity conditions

    Science.gov (United States)

    Bavassano, B.; Bruno, R.

    1991-02-01

    The influence of the sun's activity cycle on the solar wind fluctuations at time scales from 1 hour to 3 days in the inner heliosphere (0.3 to 1 AU) is investigated. Hourly averages of plasma and magnetic field data by Helios spacecraft are used. Since fluctuations behave quite differently with changing scale, the analysis is performed separately for two different ranges in time scale. Between 1 and 6 hours Alfvenic fluctuations and pressure-balanced structures are extensively observed. At low solar activity and close to 0.3 AU Alfvenic fluctuations are more frequent than pressure-balanced structures. This predominance, however, weakens for rising solar activity and radial distance, to the point that a role-exchange, in terms of occurrence rate, is found at the maximum of the cycle close to 1 AU. On the other hand, in all cases Alfvenic fluctuations have a larger amplitude than pressure-balanced structures. The Alfvenic contribution to the solar wind energy spectrum comes out to be dominant at all solar activity conditions. These findings support the conclusion that the solar cycle evolution of the large-scale velocity pattern is the factor governing the observed variations.

  13. Food stress in Adelaide: the relationship between low income and the affordability of healthy food.

    Science.gov (United States)

    Ward, Paul R; Verity, Fiona; Carter, Patricia; Tsourtos, George; Coveney, John; Wong, Kwan Chui

    2013-01-01

    Healthy food is becoming increasingly expensive, and families on low incomes face a difficult financial struggle to afford healthy food. When food costs are considered, families on low incomes often face circumstances of poverty. Housing, utilities, health care, and transport are somewhat fixed in cost; however food is more flexible in cost and therefore is often compromised with less healthy, cheaper food, presenting an opportunity for families on low incomes to cut costs. Using a "Healthy Food Basket" methodology, this study costed a week's supply of healthy food for a range of family types. It found that low-income families would have to spend approximately 30% of household income on eating healthily, whereas high-income households needed to spend about 10%. The differential is explained by the cost of the food basket relative to household income (i.e., affordability). It is argued that families that spend more than 30% of household income on food could be experiencing "food stress." Moreover the high cost of healthy foods leaves low-income households vulnerable to diet-related health problems because they often have to rely on cheaper foods which are high in fat, sugar, and salt.

  14. Low-Temperature Soft-Cover Deposition of Uniform Large-Scale Perovskite Films for High-Performance Solar Cells.

    Science.gov (United States)

    Ye, Fei; Tang, Wentao; Xie, Fengxian; Yin, Maoshu; He, Jinjin; Wang, Yanbo; Chen, Han; Qiang, Yinghuai; Yang, Xudong; Han, Liyuan

    2017-09-01

    Large-scale high-quality perovskite thin films are crucial to produce high-performance perovskite solar cells. However, for perovskite films fabricated by solvent-rich processes, film uniformity can be prevented by convection during thermal evaporation of the solvent. Here, a scalable low-temperature soft-cover deposition (LT-SCD) method is presented, where the thermal-convection-induced defects in perovskite films are eliminated through a strategy of surface tension relaxation. Compact, homogeneous perovskite films free of convection-induced defects are obtained on an area of 12 cm², which enables a power conversion efficiency (PCE) of 15.5% on a solar cell with an area of 5 cm². This is the highest efficiency at this large cell area. A PCE of 15.3% is also obtained on a flexible perovskite solar cell deposited on a polyethylene terephthalate substrate, owing to the advantage of the presented low-temperature processing. Hence, the present LT-SCD technology provides a new non-spin-coating route to the deposition of large-area uniform perovskite films for both rigid and flexible perovskite devices.

  15. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also the communication between the different processes, which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.

  16. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  17. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such a reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).

  18. The affordances of broken affordances

    DEFF Research Database (Denmark)

    Grünbaum, Martin Gielsgaard; Simonsen, Jakob Grue

    2015-01-01

    We consider the use of physical and virtual objects having one or more affordances associated with simple interactions with them. Based on Kaptelinin and Nardi's notion of instrumental affordance, we investigate what it means to break an affordance, and the two ensuing questions we deem most important…

  19. Food Stress in Adelaide: The Relationship between Low Income and the Affordability of Healthy Food

    Directory of Open Access Journals (Sweden)

    Paul R. Ward

    2013-01-01

    Healthy food is becoming increasingly expensive, and families on low incomes face a difficult financial struggle to afford healthy food. When food costs are considered, families on low incomes often face circumstances of poverty. Housing, utilities, health care, and transport are somewhat fixed in cost; however food is more flexible in cost and therefore is often compromised with less healthy, cheaper food, presenting an opportunity for families on low incomes to cut costs. Using a “Healthy Food Basket” methodology, this study costed a week’s supply of healthy food for a range of family types. It found that low-income families would have to spend approximately 30% of household income on eating healthily, whereas high-income households needed to spend about 10%. The differential is explained by the cost of the food basket relative to household income (i.e., affordability). It is argued that families that spend more than 30% of household income on food could be experiencing “food stress.” Moreover the high cost of healthy foods leaves low-income households vulnerable to diet-related health problems because they often have to rely on cheaper foods which are high in fat, sugar, and salt.

  20. Food Stress in Adelaide: The Relationship between Low Income and the Affordability of Healthy Food

    OpenAIRE

    Paul R. Ward; Fiona Verity; Patricia Carter; George Tsourtos; John Coveney; Kwan Chui Wong

    2013-01-01

    Healthy food is becoming increasingly expensive, and families on low incomes face a difficult financial struggle to afford healthy food. When food costs are considered, families on low incomes often face circumstances of poverty. Housing, utilities, health care, and transport are somewhat fixed in cost; however food is more flexible in cost and therefore is often compromised with less healthy, cheaper food, presenting an opportunity for families on low incomes to cut costs. Using a “Healthy ...

  1. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    Science.gov (United States)

    Crocker, Andrew M.; Doering, Kimberly B; Meadows, Robert G.; Lariviere, Brian W.; Graham, Jerry B.

    2015-01-01

    The stated goals of NASA's Research Announcement for the Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) are to reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS; and enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. Dynetics, Inc. and Aerojet Rocketdyne (AR) formed a team to offer a wide-ranging set of risk reduction activities and full-scale, system-level demonstrations that support NASA's ABEDRR goals. For NASA's SLS ABEDRR procurement, Dynetics and AR formed a team to offer a series of full-scale risk mitigation hardware demonstrations for an affordable booster approach that meets the evolved capabilities of the SLS. To establish a basis for the risk reduction activities, the Dynetics Team developed a booster design that takes advantage of the flight-proven Apollo-Saturn F-1. Using NASA's vehicle assumptions for the SLS Block 2, a two-engine, F-1-based booster design delivers 150 mT (331 klbm) payload to LEO, 20 mT (44 klbm) above NASA's requirements. This enables a low-cost, robust approach to structural design. During the ABEDRR effort, the Dynetics Team has modified proven Apollo-Saturn components and subsystems to improve affordability and reliability (e.g., reduce parts counts, touch labor, or use lower cost manufacturing processes and materials). The team has built hardware to validate production costs and completed tests to demonstrate it can meet performance requirements. State-of-the-art manufacturing and processing techniques have been applied to the heritage F-1, resulting in a low recurring cost engine while retaining the benefits of Apollo-era experience. NASA test facilities have been used to perform low-cost risk-reduction engine testing. In early 2014, NASA and the Dynetics Team agreed to move additional large liquid oxygen/kerosene engine work under Dynetics' ABEDRR contract. Also led by AR, the

  2. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  3. AFFORDABILITY OF LOW INCOME HOUSING IN PUMWANI, NAIROBI, KENYA

    Directory of Open Access Journals (Sweden)

    Crispino C. Ochieng

    2007-07-01

    Since 1987, in Kenya, the local government, working through the National Housing Corporation (NHC), an arm of the central government that delivers affordable houses, has been engaged in the redevelopment of Pumwani, the oldest surviving affordable low-income housing in Nairobi. Pumwani was started in 1923 and targeted early African immigrants to Nairobi. Currently, old Pumwani is home to some of the city’s poorest dwellers, the majority of whom depend on the informal sector for an income. Redevelopment was targeted at housing all the genuine dwellers. Instead, delivery ended up with house types that were at first rejected by the beneficiaries. Although the new housing was of slightly improved physical and spatial quality, it was unaffordable. Beneficiaries were required to pay an average monthly rent of US$157 for up to eighteen years towards purchase of the new house. In the beginning, some of them declined to take possession of the newly built houses. To raise the basic rent, the majority of those who have since moved in have opted to rent out some of the space. To date there is still a standoff, with some of the houses still unoccupied. Except during the period of the social survey, when the beneficiaries were brought in to supply the necessary information, the entire construction process was undertaken by NHC as a turnkey project. Among other factors, the construction process was at fault, for it raised the costs. Also, some of the basic housing needs were not effectively addressed. There was a housing mismatch.

  4. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
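
    The prototype-based low-rank kernel approximation described here is in the spirit of the Nyström method; as a hedged illustration (not the authors' exact construction; the kernel, data, and m = 50 prototypes are arbitrary choices), an n × n RBF kernel matrix can be approximated from m prototype columns:

        # Hedged sketch: Nystrom-style rank-m kernel approximation from prototypes.
        import numpy as np

        def rbf_kernel(A, B, gamma=0.5):
            """Gaussian RBF kernel matrix between the rows of A and B."""
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 10))                       # full data set
        prototypes = X[rng.choice(1000, 50, replace=False)]   # m = 50 prototypes

        K_nm = rbf_kernel(X, prototypes)                      # n x m cross-kernel
        K_mm = rbf_kernel(prototypes, prototypes)             # m x m prototype kernel
        K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T       # rank-m approximation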

  5. Understanding high traffic injury risks for children in low socioeconomic areas: a qualitative study of parents' views.

    Science.gov (United States)

    Christie, N; Ward, H; Kimberlee, R; Towner, E; Sleney, J

    2007-12-01

    Objective: To gain an in-depth qualitative understanding of parents' views about their children's exposure to road traffic injury risk in low socioeconomic areas. Design: Focus groups facilitated by a moderator, with content analysis of data. Setting: Focus groups were conducted in 10 low socioeconomic English districts that also have high rates of child pedestrian injury; research was conducted in community venues within each area. Participants: Parents of children aged 9-14 years living in low socioeconomic areas. Results: Parents believe that children play in their local streets for the following reasons: they like playing out with friends near home; there are few safe, secure, and well-maintained public spaces for children; children are excluded from affordable leisure venues because of their costs; and there is insufficient parental responsibility. For children who play in the street, the key sources of risk identified by parents were: illegal riding and driving around estates and on the pavements; the speed and volume of traffic; illegal parking; drivers being poorly informed about where children play; and children's risk-taking behavior. Conclusions: Intervention programs need to take into account the multiple reasons why children in low socioeconomic areas become exposed to hazardous environments, thereby increasing their risk of injury. Multi-agency partnerships involving the community are increasingly needed to implement traditional road safety approaches, such as education, engineering, and enforcement, and to provide safe and accessible public space, affordable activities for children, and greater support for parents.

  6. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    Science.gov (United States)

    Crocker, Andrew M.; Greene, William D.

    2017-01-01

    The stated goals of NASA's Research Announcement for the Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) are to reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS and enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. Dynetics, Inc. and Aerojet Rocketdyne (AR) formed a team to offer a wide-ranging set of risk reduction activities and full-scale, system-level demonstrations that support NASA's ABEDRR goals. During the ABEDRR effort, the Dynetics Team has modified flight-proven Apollo-Saturn F-1 engine components and subsystems to improve affordability and reliability (e.g., reduce parts counts, touch labor, or use lower cost manufacturing processes and materials). The team has built hardware to validate production costs and completed tests to demonstrate it can meet performance requirements. State-of-the-art manufacturing and processing techniques have been applied to the heritage F-1, resulting in a low recurring cost engine while retaining the benefits of Apollo-era experience. NASA test facilities have been used to perform low-cost risk-reduction engine testing. In early 2014, NASA and the Dynetics Team agreed to move additional large liquid oxygen/kerosene engine work under Dynetics' ABEDRR contract. Also led by AR, the objectives of this work are to demonstrate combustion stability and measure performance of a 500,000 lbf class Oxidizer-Rich Staged Combustion (ORSC) cycle main injector. A trade study was completed to investigate the feasibility, cost effectiveness, and technical maturity of a domestically-produced engine that could potentially both replace the RD-180 on Atlas V and satisfy NASA SLS payload-to-orbit requirements via an advanced booster application. Engine physical dimensions and performance parameters resulting from this study provide the system level requirements for the ORSC risk reduction test article

  7. Effects of climate variability on global scale flood risk

    Science.gov (United States)

    Ward, P.; Dettinger, M. D.; Kummu, M.; Jongman, B.; Sperna Weiland, F.; Winsemius, H.

    2013-12-01

    In this contribution we demonstrate the influence of climate variability on flood risk. Globally, flooding is one of the worst natural hazards in terms of economic damages; Munich Re estimates global losses in the last decade to be in excess of $240 billion. As a result, scientifically sound estimates of flood risk at the largest scales are increasingly needed by industry (including multinational companies and the insurance industry) and policy communities. Several assessments of global scale flood risk under current conditions have recently become available, and this year has seen the first studies assessing how flood risk may change in the future due to global change. However, the influence of climate variability on flood risk has as yet hardly been studied, despite the fact that: (a) in other fields (drought, hurricane damage, food production) this variability is as important for policy and practice as long-term change; and (b) climate variability has a strong influence on peak river flows around the world. To address this issue, this contribution illustrates the influence of ENSO-driven climate variability on flood risk, at both the globally aggregated scale and the scale of countries and large river basins. Although it exerts significant and widespread influences on flood peak discharges in many parts of the world, we show that ENSO does not have a statistically significant influence on flood risk once aggregated to global totals. At the scale of individual countries, though, strong relationships exist over large parts of the Earth's surface. For example, we find particularly strong anomalies of flood risk in El Niño or La Niña years (compared to all years) in southern Africa, parts of western Africa, Australia, parts of Central Eurasia (especially for El Niño), the western USA (especially for La Niña), and parts of South America. These findings have large implications for both decadal climate-risk projections and long-term future climate change.

  8. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  9. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  10. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  11. The (in)effectiveness of Global Land Policies on Large-Scale Land Acquisition

    NARCIS (Netherlands)

    Verhoog, S.M.

    2014-01-01

    Due to current crises, large-scale land acquisition (LSLA) is becoming a topic of growing concern. Public data from the ‘Land Matrix Global Observatory’ project (Land Matrix 2014a) demonstrates that since 2000, 1,664 large-scale land transactions in low- and middle-income countries were reported,

  12. Detection of large-scale concentric gravity waves from a Chinese airglow imager network

    Science.gov (United States)

    Lai, Chang; Yue, Jia; Xu, Jiyao; Yuan, Wei; Li, Qinzeng; Liu, Xiao

    2018-06-01

    Concentric gravity waves (CGWs) contain a broad spectrum of horizontal wavelengths and periods due to their instantaneous localized sources (e.g., deep convection, volcanic eruptions, or earthquakes). However, it is difficult to observe large-scale gravity waves of >100 km wavelength from the ground, owing to the limited field of view of a single camera and local bad weather. Previously, complete large-scale CGW imagery could only be captured by satellite observations. In the present study, we developed a novel method that assembles separate images and applies low-pass filtering to obtain temporal and spatial information about complete large-scale CGWs from a network of all-sky airglow imagers. Coordinated observations from five all-sky airglow imagers in Northern China were assembled and processed to study large-scale CGWs over a wide area (1800 km × 1400 km), focusing on the same two CGW events as Xu et al. (2015). Our algorithms yielded images of large-scale CGWs by filtering out the small-scale CGWs. The wavelengths, wave speeds, and periods of CGWs were measured from a sequence of consecutive assembled images. Overall, the assembling and low-pass filtering algorithms can expand the airglow imager network to its full capacity regarding the detection of large-scale gravity waves.
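
    As a hedged sketch of the low-pass step (the study's pipeline, projection, and filter parameters are not reproduced; the pixel scale and sigma below are illustrative assumptions), small-scale wave structure in an assembled airglow mosaic can be suppressed with a Gaussian filter whose width corresponds to the >100 km wavelengths of interest:

        # Hedged sketch: isolate large-scale wave structure in an assembled
        # airglow image by Gaussian low-pass filtering.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(2)
        mosaic = rng.normal(size=(1400, 1800))   # assembled image, ~1 km per pixel

        # sigma ~ 40 px strongly attenuates wavelengths well below ~100 km.
        large_scale = gaussian_filter(mosaic, sigma=40)
        small_scale = mosaic - large_scale       # residual small-scale waves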

  13. Large Scale Experiments on Spacecraft Fire Safety

    DEFF Research Database (Denmark)

    Urban, David L.; Ruff, Gary A.; Minster, Olivier

    2012-01-01

    Full scale fire testing complemented by computer modelling has provided significant knowhow about the risk, prevention and suppression of fire in terrestrial systems (cars, ships, planes, buildings, mines, and tunnels). In comparison, no such testing has been carried out for manned spacecraft due to the complexity, cost and risk associated with operating a long duration fire safety experiment of a relevant size in microgravity. Therefore, there is currently a gap in knowledge of fire behaviour in spacecraft. The entire body of low-gravity fire research has either been conducted in short duration ground-based microgravity facilities or has been limited to very small fuel samples. Still, the work conducted to date has shown that fire behaviour in low-gravity is very different from that in normal-gravity, with differences observed for flammability limits, ignition delay, flame spread behaviour, flame colour and flame…

  14. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed of which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  15. A Domain-Specific Risk-Taking (DOSPERT) scale for adult populations

    Directory of Open Access Journals (Sweden)

    Ann-Renée Blais

    2006-07-01

    This paper proposes a revised version of the original Domain-Specific Risk-Taking (DOSPERT) scale developed by Weber, Blais, and Betz (2002) that is shorter and applicable to a broader range of ages, cultures, and educational levels. It also provides a French translation of the revised scale. Using multilevel modeling, we investigated the risk-return relationship between apparent risk taking and risk perception in 5 risk domains. The results replicate previously noted differences in reported degree of risk taking and risk perception at the mean level of analysis. The multilevel modeling shows, more interestingly, that within-participants variation in risk taking across the 5 content domains of the scale was about 7 times as large as between-participants variation. We discuss the implications of our findings in terms of the person-situation debate related to risk attitude.

  16. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    basically consisted in (1) decomposing both signals (the SLP field and precipitation or streamflow) using discrete wavelet multiresolution analysis and synthesis, (2) generating one statistical downscaling model per time-scale, and (3) summing up all scale-dependent models in order to obtain a final reconstruction of the predictand. The results obtained revealed a significant improvement of the reconstructions for both precipitation and streamflow when using the multiresolution ESD model instead of basic ESD; in addition, the scale-dependent spatial patterns associated with the model matched quite well those obtained from scale-dependent composite analysis. In particular, the multiresolution ESD model handled very well the significant changes in variance through time observed in either precipitation or streamflow. For instance, the post-1980 period, which had been characterized by particularly high amplitudes in interannual-to-interdecadal variability associated with flood and extremely low-flow/drought periods (e.g., winter 2001, summer 2003), could not be reconstructed without integrating wavelet multiresolution analysis into the model. Further investigations would be required to address the issue of the stationarity of the large-scale/local-scale relationships and to test the capability of the multiresolution ESD model for interannual-to-interdecadal forecasting. In terms of methodological approach, further investigations may concern a fully comprehensive sensitivity analysis of the modeling with respect to the parameters of the multiresolution approach (different families of scaling and wavelet functions used, number of coefficients/degree of smoothness, etc.).
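
    The three-step recipe can be sketched as follows (a generic stand-in under stated assumptions: 1-D synthetic signals, Haar wavelets, and one-parameter least squares per scale, none of which are taken from the study):

        # Hedged sketch of scale-wise statistical downscaling: decompose predictor
        # and predictand, fit one linear model per scale, sum per-scale predictions.
        import numpy as np
        import pywt

        rng = np.random.default_rng(3)
        n = 1024
        predictor = rng.normal(size=n)                          # e.g., an SLP index
        predictand = (np.convolve(predictor, np.ones(8) / 8, mode="same")
                      + 0.1 * rng.normal(size=n))               # e.g., streamflow

        cx = pywt.wavedec(predictor, "haar", level=5)           # step 1: decompose
        cy = pywt.wavedec(predictand, "haar", level=5)

        fitted = []                                             # step 2: one model/scale
        for x_band, y_band in zip(cx, cy):
            beta = np.dot(x_band, y_band) / np.dot(x_band, x_band)
            fitted.append(beta * x_band)

        reconstruction = pywt.waverec(fitted, "haar")           # step 3: sum scales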

  17. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical applications, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the sufficient training samples of large-scale datasets and propose all features boosting RVM (AFB-RVM), which modifies the way of obtaining weak classifiers. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result, two approaches on Spark for different types of large-scale datasets are available.
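
    As a hedged sketch of the Discrete AdaBoost wrapper (an RVM base learner is not available off the shelf, so a logistic regression stands in; the Spark distribution layer is omitted), with labels in {-1, +1}:

        # Hedged sketch: Discrete AdaBoost around an arbitrary base learner,
        # the role played by RVM in DAB-RVM.
        import numpy as np
        from sklearn.linear_model import LogisticRegression  # stand-in for an RVM

        def discrete_adaboost(X, y, rounds=10):
            n = len(y)
            w = np.full(n, 1.0 / n)                 # sample weights
            learners, alphas = [], []
            for _ in range(rounds):
                clf = LogisticRegression().fit(X, y, sample_weight=w)
                pred = clf.predict(X)
                err = w[pred != y].sum()
                if err >= 0.5:                      # no better than chance: stop
                    break
                alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
                w *= np.exp(-alpha * y * pred)      # up-weight misclassified samples
                w /= w.sum()
                learners.append(clf)
                alphas.append(alpha)
            return learners, alphas

        def adaboost_predict(learners, alphas, X):
            score = sum(a * clf.predict(X) for clf, a in zip(learners, alphas))
            return np.sign(score)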

  18. Underground large scale test facility for rocks

    International Nuclear Information System (INIS)

    Sundaram, P.N.

    1981-01-01

    This brief note discusses two advantages of locating the facility for testing rock specimens of large dimensions in an underground space. Such an environment can be made to contribute part of the enormous axial load and stiffness requirements needed to get complete stress-strain behavior. The high pressure vessel may also be located below the floor level since the lateral confinement afforded by the rock mass may help to reduce the thickness of the vessel

  19. AUSERA: Large-Scale Automated Security Risk Assessment of Global Mobile Banking Apps

    OpenAIRE

    Chen, Sen; Meng, Guozhu; Su, Ting; Fan, Lingling; Xue, Yinxing; Liu, Yang; Xu, Lihua; Xue, Minhui; Li, Bo; Hao, Shuang

    2018-01-01

    Contemporary financial technology (FinTech) that enables cashless mobile payment has been widely adopted by financial institutions, such as banks, due to its convenience and efficiency. However, FinTech has also made massive and dynamic transactions susceptible to security risks. Given large financial losses caused by such vulnerabilities, regulatory technology (RegTech) has been developed, but more comprehensive security risk assessment is specifically desired to develop robust, scalable, an...

  20. Small-scale dynamo at low magnetic Prandtl numbers

    Science.gov (United States)

    Schober, Jennifer; Schleicher, Dominik; Bovino, Stefano; Klessen, Ralf S.

    2012-12-01

    The present-day Universe is highly magnetized, even though the first magnetic seed fields were most probably extremely weak. To explain the growth of the magnetic field strength over many orders of magnitude, fast amplification processes need to operate. The most efficient mechanism known today is the small-scale dynamo, which converts turbulent kinetic energy into magnetic energy, leading to an exponential growth of the magnetic field. The efficiency of the dynamo depends on the type of turbulence indicated by the slope of the turbulence spectrum v(ℓ) ∝ ℓ^{ϑ}, where v(ℓ) is the eddy velocity at a scale ℓ. We explore turbulent spectra ranging from incompressible Kolmogorov turbulence with ϑ=1/3 to highly compressible Burgers turbulence with ϑ=1/2. In this work, we analyze the properties of the small-scale dynamo for low magnetic Prandtl numbers Pm, which denotes the ratio of the magnetic Reynolds number, Rm, to the hydrodynamical one, Re. We solve the Kazantsev equation, which describes the evolution of the small-scale magnetic field, using the WKB approximation. In the limit of low magnetic Prandtl numbers, the growth rate is proportional to Rm^{(1-ϑ)/(1+ϑ)}. We furthermore discuss the critical magnetic Reynolds number Rm_{crit}, which is required for small-scale dynamo action. The value of Rm_{crit} is roughly 100 for Kolmogorov turbulence and 2700 for Burgers. Furthermore, we discuss that Rm_{crit} provides a stronger constraint in the limit of low Pm than it does for large Pm. We conclude that the small-scale dynamo can operate in the regime of low magnetic Prandtl numbers if the magnetic Reynolds number is large enough. Thus, the magnetic field amplification on small scales can take place in a broad range of physical environments and amplify weak magnetic seed fields on short time scales.
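
    Evaluating the quoted growth-rate exponent for the two limiting spectra is direct arithmetic from the abstract's formula:

        \[
        \frac{1-\vartheta}{1+\vartheta} =
        \begin{cases}
        \dfrac{1 - 1/3}{1 + 1/3} = \dfrac{1}{2}, & \text{Kolmogorov } (\vartheta = 1/3),\\[1ex]
        \dfrac{1 - 1/2}{1 + 1/2} = \dfrac{1}{3}, & \text{Burgers } (\vartheta = 1/2),
        \end{cases}
        \]

    so at low Pm the growth rate scales as Rm^{1/2} for Kolmogorov turbulence and Rm^{1/3} for Burgers turbulence.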

  1. Small-scale dynamo at low magnetic Prandtl numbers.

    Science.gov (United States)

    Schober, Jennifer; Schleicher, Dominik; Bovino, Stefano; Klessen, Ralf S

    2012-12-01

    The present-day Universe is highly magnetized, even though the first magnetic seed fields were most probably extremely weak. To explain the growth of the magnetic field strength over many orders of magnitude, fast amplification processes need to operate. The most efficient mechanism known today is the small-scale dynamo, which converts turbulent kinetic energy into magnetic energy leading to an exponential growth of the magnetic field. The efficiency of the dynamo depends on the type of turbulence indicated by the slope of the turbulence spectrum v(ℓ)∝ℓ^{ϑ}, where v(ℓ) is the eddy velocity at a scale ℓ. We explore turbulent spectra ranging from incompressible Kolmogorov turbulence with ϑ=1/3 to highly compressible Burgers turbulence with ϑ=1/2. In this work, we analyze the properties of the small-scale dynamo for low magnetic Prandtl numbers Pm, which denotes the ratio of the magnetic Reynolds number, Rm, to the hydrodynamical one, Re. We solve the Kazantsev equation, which describes the evolution of the small-scale magnetic field, using the WKB approximation. In the limit of low magnetic Prandtl numbers, the growth rate is proportional to Rm^{(1-ϑ)/(1+ϑ)}. We furthermore discuss the critical magnetic Reynolds number Rm_{crit}, which is required for small-scale dynamo action. The value of Rm_{crit} is roughly 100 for Kolmogorov turbulence and 2700 for Burgers. Furthermore, we discuss that Rm_{crit} provides a stronger constraint in the limit of low Pm than it does for large Pm. We conclude that the small-scale dynamo can operate in the regime of low magnetic Prandtl numbers if the magnetic Reynolds number is large enough. Thus, the magnetic field amplification on small scales can take place in a broad range of physical environments and amplify weak magnetic seed fields on short time scales.

  2. Organizational Media Affordances : Operationalization and Associations with Media Use

    OpenAIRE

    Rice, Ronald E.; Evans, Sandra K.; Pearce, Katy E.; Sivunen, Anu; Vitak, Jessica; Treem, Jeffrey W.

    2017-01-01

    The concept of affordances has been increasingly applied to the study of information and communication technologies (ICTs) in organizational contexts. However, almost no research operationalizes affordances, limiting comparisons and programmatic research. This article briefly reviews conceptualizations and possibilities of affordances in general and for media, then introduces the concept of organizational media affordances as organizational resources. Analysis of survey data from a large Nord...

  3. A large-scale RF-based Indoor Localization System Using Low-complexity Gaussian filter and improved Bayesian inference

    Directory of Open Access Journals (Sweden)

    L. Xiao

    2013-04-01

    Full Text Available The growing convergence among mobile computing devices and smart sensors boosts the development of ubiquitous computing and smart spaces, where localization is an essential part of realizing the big vision. The general localization methods based on GPS and cellular techniques are not suitable for tracking numerous small-size, power-limited objects indoors. In this paper, we propose and demonstrate a new localization method: an easy-setup and cost-effective indoor localization system based on off-the-shelf active RFID technology. Our system is not only compatible with future smart spaces and ubiquitous computing systems, but also suitable for large-scale indoor localization. The use of a low-complexity Gaussian Filter (GF), a Wheel Graph Model (WGM) and a Probabilistic Localization Algorithm (PLA) makes the proposed algorithm robust to uncertainty, self-adjusting to varying indoor environments, and suitable for large-scale indoor positioning. Using MATLAB simulation, we study the system performance, especially its dependence on a number of system and environment parameters, and their statistical properties. The simulation results show that our proposed system is an accurate and cost-effective candidate for indoor localization.
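
    The abstract does not reproduce the filter equations; one common low-complexity "Gaussian filter" for RFID received-signal-strength (RSSI) streams keeps only samples within one standard deviation of the mean before averaging. A minimal sketch under that assumption (the sample values are invented):

        import statistics

        def gaussian_filter_rssi(samples, k=1.0):
            """Keep RSSI samples within k standard deviations of the mean,
            then average the survivors (one common low-complexity variant)."""
            mu = statistics.mean(samples)
            sigma = statistics.pstdev(samples)
            kept = [s for s in samples if abs(s - mu) <= k * sigma] or samples
            return statistics.mean(kept)

        raw = [-71, -70, -92, -69, -70, -55, -71]   # dBm readings, two outliers
        print(gaussian_filter_rssi(raw))            # -70.2, outliers rejected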

  4. High-Performance Carbon Dioxide Electrocatalytic Reduction by Easily Fabricated Large-Scale Silver Nanowire Arrays.

    Science.gov (United States)

    Luan, Chuhao; Shao, Yang; Lu, Qi; Gao, Shenghan; Huang, Kai; Wu, Hui; Yao, Kefu

    2018-05-17

    An efficient and selective catalyst is urgently needed for carbon dioxide electroreduction, and silver is one of the promising candidates with affordable costs. Here we fabricated large-scale, vertically standing Ag nanowire arrays with high crystallinity and electrical conductivity as carbon dioxide electroreduction catalysts by a simple nanomolding method that was usually considered not feasible for metallic crystalline materials. A great enhancement of current densities and selectivity for CO at moderate potentials was achieved. The current density for CO (j_CO) of the Ag nanowire array 200 nm in diameter was more than 2500 times larger than that of Ag foil at an overpotential of 0.49 V, with an efficiency over 90%. The enhanced performance is attributed to a greatly increased electrochemically active surface area (ECSA) and higher intrinsic activity compared to those of polycrystalline Ag foil. More low-coordinated sites on the nanowires, which can better stabilize the CO2 intermediate, are responsible for the high intrinsic activity. In addition, the impact of surface morphology, which induces limited mass transport, on the reaction selectivity and efficiency of nanowire arrays with different diameters is also discussed.

  5. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy, and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it be judged successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, on three points, the thesis that this is a new form of institutionalization of science. These are: 1) external control, 2) the organizational form, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  6. Demonstration of Mobile Auto-GPS for Large Scale Human Mobility Analysis

    Science.gov (United States)

    Horanont, Teerayut; Witayangkurn, Apichon; Shibasaki, Ryosuke

    2013-04-01

    The greater affordability of digital devices and advances in positioning and tracking capabilities have ushered in today's age of geospatial Big Data. Moreover, the emergence of massive mobile location data and the rapid increase in computational capabilities open up new opportunities for modeling large-scale urban dynamics. In this research, we demonstrate a new type of mobile location data called "Auto-GPS" and its potential use cases for urban applications. More than one million Auto-GPS mobile phone users in Japan were observed nationwide, in a completely anonymous form, for an entire year from August 2010 to July 2011 for this analysis. A spate of natural disasters and other emergencies during the past few years has prompted new interest in how mobile location data can help enhance our security, especially in urban areas, which are highly vulnerable to these impacts. New insights gleaned from mining the Auto-GPS data suggest a number of promising directions for modeling human movement during a large-scale crisis. We ask how people react in critical situations and how their movement changes during severe disasters. Our results examine the case of a major earthquake and show how people living in the Tokyo metropolitan area and its vicinity behaved and returned home after the Great East Japan Earthquake on March 11, 2011.

  7. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  8. Global Trends in the Affordability of Sugar-Sweetened Beverages, 1990-2016.

    Science.gov (United States)

    Blecher, Evan; Liber, Alex C; Drope, Jeffrey M; Nguyen, Binh; Stoklosa, Michal

    2017-05-04

    The objective of this study was to quantify changes in the affordability of sugar-sweetened beverages, a product implicated as a contributor to rising rates of obesity worldwide, as a function of product price and personal income. We used international survey data in a retrospective analysis of 40 high-income and 42 low-income and middle-income countries from 1990 to 2016. Prices of sugar-sweetened beverages were from the Economist Intelligence Unit's World Cost of Living Survey. Income and inflation data were from the International Monetary Fund's World Economic Outlook Database. The measure of affordability was the average annual percentage change in the relative-income price of sugar-sweetened beverages, which is the annual rate of change in the proportion of per capita gross domestic product needed to purchase 100 L of Coca-Cola in each country in each year of the study. In 79 of 82 countries, the proportion of income needed to purchase sugar-sweetened beverages declined on average (using annual measures) during the study period. This pattern, described as an increase in the affordability of sugar-sweetened beverages, indicated that sugar-sweetened beverages became more affordable more rapidly in low-income and middle-income countries than in high-income countries, a fact attributable more to the higher rate of income growth in those countries than to a decline in the real price of sugar-sweetened beverages. Without deliberate policy action to raise prices, sugar-sweetened beverages are likely to become more affordable and more widely consumed around the world.
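
    The affordability measure defined above is straightforward to compute. A minimal sketch for a single country, with invented prices and incomes standing in for the survey data:

        # Relative-income price (RIP): share of per-capita GDP needed to buy
        # 100 L of Coca-Cola, as defined in the abstract. All numbers below
        # are hypothetical placeholders, not data from the study.

        def rip(price_per_100l, gdp_per_capita):
            return price_per_100l / gdp_per_capita

        rip_1990 = rip(price_per_100l=80.0, gdp_per_capita=2000.0)   # 4.0%
        rip_2016 = rip(price_per_100l=120.0, gdp_per_capita=9000.0)  # 1.3%

        years = 2016 - 1990
        # Average annual percentage change in RIP (geometric mean rate).
        annual_change = (rip_2016 / rip_1990) ** (1 / years) - 1
        print(f"RIP fell from {rip_1990:.1%} to {rip_2016:.1%}; "
              f"average annual change {annual_change:+.2%}")   # about -4.1%/yr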

  9. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  10. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  11. Low-cost Photoacoustic-based Measurement System for Carbon Dioxide Fluxes with the Potential for large-scale Monitoring

    Science.gov (United States)

    Scholz, L. T.; Bierer, B.; Ortiz Perez, A.; Woellenstein, J.; Sachs, T.; Palzer, S.

    2016-12-01

    The determination of carbon dioxide (CO2) fluxes between ecosystems and the atmosphere is crucial for understanding ecological processes on regional and global scales. High quality data sets with full uncertainty estimates are needed to evaluate model simulations. However, current flux monitoring techniques are unsuitable for providing reliable data over a large area at both a detailed level and an appropriate resolution, at best in combination with a high sampling rate. Currently used sensing technologies, such as non-dispersive infrared (NDIR) gas analyzers, cannot be deployed in large numbers to provide high spatial resolution due to their costs and complex maintenance requirements. Here, we propose a novel CO2 measurement system whose gas sensing unit is made up of low-cost, low-power components only, such as an IR-LED and a photoacoustic detector. The sensor responds within just a few seconds. Since the sensor can be applied in situ without special precautions, it allows for environmental monitoring in a non-invasive way. Its low energy consumption enables long-term measurements. The low overall costs favor manufacturing in large quantities. This allows the operation of multiple sensors at a reasonable price and thus provides concentration measurements at any desired spatial coverage and at high temporal resolution. With an appropriate 3D configuration of the units, vertical and horizontal fluxes can be determined. By applying a closely meshed wireless sensor network, inhomogeneities as well as CO2 sources and sinks in the lower atmosphere can be monitored. In combination with sensors for temperature, pressure and humidity, our sensor paves the way towards reliable and extensive monitoring of ecosystem-atmosphere exchange rates. The technique can also be easily adapted to other relevant greenhouse gases.

  12. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, time pressure, compounded by adverse weather conditions or darkness, is enormous. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For the evacuation of casualties, a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to the optimization of rescue measures, regular training and exercises are sensible, as is the analysis of previous large-scale Alpine accidents.

  13. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  14. Bootstrapping Relational Affordances of Object Pairs using Transfer

    DEFF Research Database (Denmark)

    Fichtl, Severin; Kraft, Dirk; Krüger, Norbert

    2018-01-01

    leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn Random Forest based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach (direct bootstrapping), the state-space for a new...... affordance predictor is augmented with the output of previously learnt affordances. In the second approach (category based bootstrapping), we form categories that capture underlying commonalities of a pair of existing affordances and augment the state-space with this category classifier’s output. In addition......, we introduce a novel heuristic, which suggests how a large set of potential affordance categories can be pruned to leave only those categories which are most promising for bootstrapping future affordances. Our results show that both bootstrapping approaches outperform learning without bootstrapping...
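
    A minimal sketch of the first approach (direct bootstrapping) using scikit-learn: the output of a previously learnt affordance predictor is appended to the feature vector of a new predictor. The features and affordance labels below are synthetic stand-ins, not the paper's visual inputs:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for visual features and affordance labels.
        X = rng.normal(size=(500, 10))
        y_old = (X[:, 0] + X[:, 1] > 0).astype(int)   # previously learnt affordance
        y_new = ((X[:, 0] + X[:, 1] > 0) & (X[:, 2] > 0)).astype(int)  # related one

        old_predictor = RandomForestClassifier(random_state=0).fit(X, y_old)

        # Direct bootstrapping: augment the state-space for the new affordance
        # with the old predictor's output probability.
        X_aug = np.hstack([X, old_predictor.predict_proba(X)[:, [1]]])

        plain = RandomForestClassifier(random_state=0).fit(X[:400], y_new[:400])
        boot = RandomForestClassifier(random_state=0).fit(X_aug[:400], y_new[:400])

        print("plain:        ", plain.score(X[400:], y_new[400:]))
        print("bootstrapped: ", boot.score(X_aug[400:], y_new[400:]))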

  15. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  16. The Total Risk Analysis of Large Dams under Flood Hazards

    Directory of Open Access Journals (Sweden)

    Yu Chen

    2018-02-01

    Full Text Available Dams and reservoirs are useful systems in water conservancy projects; however, they also pose a high-risk potential for large downstream areas. Flood, as the driving force of dam overtopping, is the main cause of dam failure. Dam floods and their risks are of interest to researchers and managers. In hydraulic engineering, there is a growing tendency to evaluate dam flood risk based on statistical and probabilistic methods that are unsuitable for situations with scarce historical data or low flood probability, so a more reasonable dam flood risk analysis method with fewer application restrictions is needed. Therefore, different from previous studies, this study develops a flood risk analysis method for large dams based on the concept of the total risk factor (TRF), used initially in dam seismic risk analysis. The proposed method is not affected by the adequacy of historical data or the low probability of flood, and it is capable of analyzing the influence of the dam structure, the flood vulnerability of the dam site, and the downstream risk, as well as estimating the TRF of each dam and assigning corresponding risk classes to each dam. Application to large dams in the Dadu River Basin, Southwestern China, demonstrates that the proposed method provides quick risk estimation and comparison, which can help local management officials perform more detailed dam safety evaluations for useful risk management information.
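
    The abstract does not give the TRF's functional form; in the seismic schemes it borrows from, component risk factors are summed and the total is binned into risk classes. A hypothetical sketch in that spirit — every component value and class boundary below is invented for illustration only:

        # Hypothetical total-risk-factor (TRF) style rating. The paper's actual
        # factor definitions, weights and thresholds are NOT reproduced here.

        def total_risk_factor(structure_rf, flood_vulnerability_rf, downstream_rf):
            """Sum component risk factors into a TRF (seismic-TRF style)."""
            return structure_rf + flood_vulnerability_rf + downstream_rf

        def risk_class(trf):
            # Invented class boundaries, for illustration only.
            if trf < 50:
                return "I (low)"
            elif trf < 100:
                return "II (moderate)"
            elif trf < 150:
                return "III (high)"
            return "IV (extreme)"

        trf = total_risk_factor(structure_rf=42, flood_vulnerability_rf=55,
                                downstream_rf=38)
        print(trf, risk_class(trf))   # 135 -> III (high)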

  17. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

    Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated in the power systems, it becomes important to study the effects of EV integration on the power systems......, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of the electric vehicles are introduced. The possible impacts of large scale integration of electric vehicles on the power systems, especially the advantage to the integration of the renewable energies...... are discussed. Finally, the research projects related to the large scale integration of electric vehicles into the power systems are introduced; these will provide a reference for the large scale integration of Electric Vehicles into power grids....

  18. Recent Regional Climate State and Change - Derived through Downscaling Homogeneous Large-scale Components of Re-analyses

    Science.gov (United States)

    Von Storch, H.; Klehmet, K.; Geyer, B.; Li, D.; Schubert-Frisius, M.; Tim, N.; Zorita, E.

    2015-12-01

    Global re-analyses suffer from inhomogeneities, as they process data from networks under development. However, the large-scale component of such re-analyses is mostly homogeneous; additional observational data add in most cases to a better description of regional details, and less so of large-scale states. Therefore, the concept of downscaling may be applied to homogeneously complement the large-scale state of the re-analyses with regional detail - wherever the condition of homogeneity of the large scales is fulfilled. Technically this can be done by using a regional climate model, or a global climate model, which is constrained on the large scale by spectral nudging. This approach has been developed and tested for the region of Europe, and a skillful representation of regional risks - in particular marine risks - was identified. While the data density in Europe is considerably better than in most other regions of the world, even here insufficient spatial and temporal coverage limits risk assessments. Therefore, downscaled data sets are frequently used by off-shore industries. We have run this system also in regions with reduced or absent data coverage, such as the Lena catchment in Siberia, the Yellow Sea/Bo Hai region in East Asia, and Namibia and the adjacent Atlantic Ocean. A global (large-scale constrained) simulation has also been performed. It turns out that a spatially detailed reconstruction of the state and change of climate over the past three to six decades is doable for any region of the world. The different data sets are archived and may freely be used for scientific purposes. Of course, before application, a careful analysis of the quality for the intended application is needed, as sometimes unexpected changes in the quality of the description of large-scale driving states prevail.

  19. Paying for Cures: How Can We Afford It? Managed Care Pharmacy Stakeholder Perceptions of Policy Options to Address Affordability of Prescription Drugs.

    Science.gov (United States)

    Yeung, Kai; Suh, Kangho; Basu, Anirban; Garrison, Louis P; Bansal, Aasthaa; Carlson, Josh J

    2017-10-01

    High-priced medications with curative potential, such as the newer hepatitis C therapies, have contributed to the recent growth in pharmaceutical expenditure. Despite the obvious benefits, health care decision makers are just beginning to grapple with questions of how to value and pay for curative therapies that may feature large upfront cost, followed by health benefits that are reaped over a patient's lifespan. Alternative policy options have been proposed to promote high value and financially sustainable use of these therapies. It is unclear which policy options would be most acceptable to health care payer and biomedical manufacturer stakeholders. To (a) briefly review pharmaceutical policy options to address health system affordability and (b) assess the acceptability of alternative policy options to health care payers and biomedical manufacturers before and after an Academy of Managed Care Pharmacy (AMCP) continuing pharmacy education (CPE) session. We searched MEDLINE and Cochran databases for pharmaceutical policy options addressing affordability. With input from a focus group of managed care professionals, we developed CPE session content and an 8-question survey focusing on the most promising policy options. We fielded the survey before and after the CPE session, which occurred as part of the 2016 AMCP Annual Meeting. We first conducted a chi-squared goodness-of-fit test to assess response distributions. Next, we tested how responses differed before and after by using an ordered logit and a multinomial logit to model Likert scale and unordered responses, respectively. Although risk-sharing payments over time remained the most favorable choice before (37%) and after (35%) the CPE session, this choice was closely followed by HealthCoin after the session, which increased in favorability from 4% to 33% of responses (P = 0.001). About half of the respondents (54%) indicated that legislative change is the most significant barrier to the implementation of any
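
    The pre/post comparison of Likert responses can be modeled as described with an ordered logit. A minimal sketch using statsmodels on synthetic responses — the data and the effect size are invented, not the survey's:

        import numpy as np
        import pandas as pd
        from statsmodels.miscmodels.ordinal_model import OrderedModel

        rng = np.random.default_rng(1)

        n = 300
        post = rng.integers(0, 2, size=n)            # 0 = before, 1 = after session
        latent = 0.8 * post + rng.logistic(size=n)   # latent favorability
        likert = np.digitize(latent, bins=[-1.5, -0.5, 0.5, 1.5]) + 1  # 1..5

        endog = pd.Series(pd.Categorical(likert, categories=[1, 2, 3, 4, 5],
                                         ordered=True))
        # Ordered logit of the Likert response on the before/after indicator.
        res = OrderedModel(endog, post.reshape(-1, 1), distr="logit").fit(
            method="bfgs", disp=False)
        print(res.params)   # coefficient on 'post' plus 4 threshold parameters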

  20. Alternatives to electricity for transmission and annual-scale firming - Storage for diverse, stranded, renewable energy resources: hydrogen and ammonia

    Energy Technology Data Exchange (ETDEWEB)

    Leighty, William

    2010-09-15

    The world's richest renewable energy resources - of large geographic extent and high intensity - are stranded: far from end-users, with inadequate or nonexistent gathering and transmission systems to deliver the energy. The output of most renewables varies greatly, at time scales of seconds to seasons: energy capture assets operate at low capacity factor (CF), and energy delivery is not 'firm'. New electric transmission systems, or fractions thereof, dedicated to renewables suffer the same low CF: substantial stranded capital assets, increasing the cost of delivered renewable-source energy. Electricity storage cannot affordably firm large renewables at annual scale. Gaseous hydrogen and anhydrous ammonia fuels can, making them attractive alternatives.

  1. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the fault-free operation of large-scale integration (LSI) circuits and very-large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors which determine the reliability of integrated circuits, an analysis of existing methods, and a model for evaluating the fault-free operation of LSI and VLSI circuits. The main part describes a proposed algorithm and program for the analysis of the fault rate in LSI and VLSI circuits.
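
    The article's own reliability model is not reproduced here; as a point of reference, a standard first-order estimate treats a circuit as a series system of blocks with constant failure rates, so that R(t) = exp(-λt) with the block rates summed. A minimal sketch under that textbook assumption (the rates are invented):

        import math

        def reliability(failure_rate_per_hour, hours):
            """R(t) = exp(-lambda * t), constant failure rate (exponential model)."""
            return math.exp(-failure_rate_per_hour * hours)

        def series_system(rates, hours):
            """A chip as a series system of blocks: failure rates add."""
            return reliability(sum(rates), hours)

        # Illustrative per-block failure rates (failures per hour).
        rates = [2e-9, 5e-9, 1e-9]
        print(series_system(rates, hours=10 * 8760))   # ~0.9993 over ten years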

  2. Competitive Dynamics of Market Entry: Scale and Survival

    Directory of Open Access Journals (Sweden)

    John W. UPSON

    2017-06-01

    Full Text Available Market entry is the essence of strategy and is largely viewed as a dichotomous event: entry or no entry. What has not been acknowledged is the uniqueness of each market entry. Our study highlights the scale of market entry in the context of multipoint competition. We assert that entry scale varies with the risk of retaliation by market incumbents. Theory suggests that when the risks associated with retaliation are low, firms enter at large scale, and when those risks are high, firms enter at small scale. Further, survival is viewed as dependent on following this theory. We argue, and find supporting evidence, that firms behave in the opposite manner, and do so to their own benefit, thereby revealing a unique discrepancy between theory and practice among 75 product market entries by 27 firms.

  3. Low-frequency scaling applied to stochastic finite-fault modeling

    Science.gov (United States)

    Crane, Stephen; Motazedian, Dariush

    2014-01-01

    Stochastic finite-fault modeling is an important tool for simulating moderate to large earthquakes. It has proven to be useful in applications that require a reliable estimation of ground motions, mostly in the spectral frequency range of 1 to 10 Hz, which is the range of most interest to engineers. However, since there can be little resemblance between the low-frequency spectra of large and small earthquakes, this portion can be difficult to simulate using stochastic finite-fault techniques. This paper introduces two different methods to scale low-frequency spectra for stochastic finite-fault modeling. One method multiplies the subfault source spectrum by an empirical function. This function has three parameters to scale the low-frequency spectra: the level of scaling and the start and end frequencies of the taper. This empirical function adjusts the earthquake spectrum only between the desired frequencies, conserving seismic moment in the simulated spectra. The other method is an empirical low-frequency coefficient that is added to the subfault corner frequency. This new parameter changes the ratio between high and low frequencies. For each simulation, the entire earthquake spectrum is adjusted, which may result in the seismic moment not being conserved for a simulated earthquake. These low-frequency scaling methods were used to reproduce the spectra of several earthquakes in the Pacific Earthquake Engineering Research Center (PEER) Next Generation Attenuation Models (NGA) database. There were two methods of determining the stochastic parameters of best fit for each earthquake: a general residual analysis and an earthquake-specific residual analysis. Both methods resulted in comparable values for stress drop and the low-frequency scaling parameters; however, the earthquake-specific residual analysis obtained a more accurate distribution of the averaged residuals.
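
    The exact functional form of the three-parameter empirical function is not given in the abstract, so the sketch below assumes a log-linear taper between the start and end frequencies: the spectrum is multiplied by the scaling level below the taper, by one above it, and by an interpolated value in between. All numbers are illustrative:

        import numpy as np

        def low_freq_scaling(freq, level, f_start, f_end):
            """Three-parameter scaling function: 'level' below f_start, unity
            above f_end, and a log-linear taper in between (assumed shape)."""
            scale = np.ones_like(freq)
            lo = freq <= f_start
            mid = (freq > f_start) & (freq < f_end)
            scale[lo] = level
            # Interpolate the multiplier in log-frequency across the taper band.
            t = (np.log(freq[mid]) - np.log(f_start)) / (np.log(f_end) - np.log(f_start))
            scale[mid] = level + (1.0 - level) * t
            return scale

        f = np.logspace(-1, 1, 200)              # 0.1-10 Hz
        spectrum = f / (1.0 + (f / 2.0) ** 2)    # toy source-spectrum shape
        adjusted = spectrum * low_freq_scaling(f, level=1.8, f_start=0.2, f_end=1.0)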

  4. Socio-economic status, racial composition and the affordability of fresh fruits and vegetables in neighborhoods of a large rural region in Texas

    Directory of Open Access Journals (Sweden)

    Bouhlal Yasser

    2011-01-01

    Full Text Available Background: Little is known about how the affordability of healthy food varies with community characteristics in rural settings. We examined how the cost of fresh fruit and vegetables varies with the economic and demographic characteristics of six rural counties of Texas. Methods: Ground-truthed data from the Brazos Valley Food Environment Project were used to identify all food stores in the rural region and the availability and lowest price of fresh whole fruit and vegetables in the food stores. Socioeconomic characteristics were extracted from the 2000 U.S. Census Summary Files 3 at the level of the census block group. We used an imputation strategy to calculate two types of price indices for both fresh fruit and fresh vegetables, a high-variety and a basic index, and evaluated the relationship between neighborhood economic and demographic characteristics and the affordability of fresh produce, using linear regression models. Results: The mean cost of meeting the USDA recommendation of fruit consumption from a high-variety basket of fruit types in our sample of stores was just over $27.50 per week. Relying on the three most common fruits lowered the weekly expense to under $17.25 per week, a reduction of 37.6%. The effect of moving from a high-variety to a low-variety basket was much less when considering vegetable consumption: a 4.3% decline from $29.23 to $27.97 per week. Univariate regression analysis revealed that the cost of fresh produce is not associated with the racial/ethnic composition of the local community. However, multivariate regression showed that, holding median income constant, stores in neighborhoods with higher percentages of Black residents charged more for fresh fruits and vegetables. The proportion of Hispanic residents was not associated with cost in either the univariate or multivariate analysis. Conclusion: This study extends prior work by examining the affordability of fresh fruit and vegetables from food stores in a large
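
    The percentage reductions quoted above follow directly from the weekly basket costs (the fruit figure comes out slightly below the reported 37.6% because the abstract rounds the underlying costs):

        # Check the quoted reductions from the reported weekly basket costs.
        fruit_high, fruit_basic = 27.50, 17.25   # $/week, high-variety vs basic
        veg_high, veg_basic = 29.23, 27.97

        print(f"fruit: {(fruit_high - fruit_basic) / fruit_high:.1%} cheaper")  # ~37.3%
        print(f"veg:   {(veg_high - veg_basic) / veg_high:.1%} cheaper")        # ~4.3%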

  5. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo=Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio=Ti⊥/Ti||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for a large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each lateral length is 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island to enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and enables even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  6. The importance of creating a social business to produce low-cost hearing aids.

    Science.gov (United States)

    Caccamo, Samantha; Voloshchenko, Anastasia; Dankyi, Nana Yaa

    2014-09-01

    The World Health Organization (WHO) estimates that about 280 million people worldwide have a bilateral hearing loss, most of them living in poor countries. Hearing loss places heavy social burdens on individuals, families, communities and countries. However, due to the lack of accessibility and affordability, the vast majority of people in the world who need hearing aids do not have access to them. Low-income countries are thus pulled into a disability/poverty spiral. From this standpoint, the production of available, accessible and affordable hearing aids for the poorest populations of our planet should be one of the main issues in global hearing healthcare. Designing and producing a brand new low-cost hearing aid is the most effective option. Involving a large producer of hearing aids in the creation of a social business to solve the problem of access to affordable hearing aids is an essential step to reduce hearing disability on a large scale globally. Today's technology allows for the creation of a "minimal design" product that does not exceed $100-$150, a cost that can be lowered further when the devices are purchased in large quantities and dispensed through alternative models. It is conceivable that, as a sustainable social business, the low-cost product could be sold with a cross-subsidy model in order to recover the overhead costs. Social business is an economic model that has the potential to produce and distribute affordable hearing aids in low- and middle-income countries. Rehabilitation of hearing-impaired children will be carried out in partnership with Sahic (Society of Assistance to Hearing Impaired Children) in Dhaka, Bangladesh, and the ENT Department of Ospedale Burlo di Trieste (Dr. Eva Orzan).

  7. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
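
    As a pointer to the method, here is a minimal detrended fluctuation analysis (DFA) in NumPy, applied to a synthetic GC/AT walk; a random sequence gives a scaling exponent near 0.5, whereas isochore-like patchiness pushes it higher. This is a bare-bones illustration, not the paper's pipeline:

        import numpy as np

        def dfa_fluctuation(x, window):
            """RMS fluctuation of the integrated, per-window linearly
            detrended profile -- the core quantity of DFA."""
            y = np.cumsum(x - np.mean(x))          # integrated profile
            n = len(y) // window
            f2 = []
            for i in range(n):
                seg = y[i * window:(i + 1) * window]
                t = np.arange(window)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                f2.append(np.mean((seg - trend) ** 2))
            return np.sqrt(np.mean(f2))

        # Map a synthetic DNA sequence to a +/-1 GC walk and estimate alpha.
        rng = np.random.default_rng(2)
        seq = rng.choice(list("ACGT"), size=20000)
        walk = np.where(np.isin(seq, ["G", "C"]), 1.0, -1.0)

        windows = np.array([16, 32, 64, 128, 256, 512])
        F = np.array([dfa_fluctuation(walk, w) for w in windows])
        alpha = np.polyfit(np.log(windows), np.log(F), 1)[0]
        print(f"DFA scaling exponent alpha ~ {alpha:.2f}")  # ~0.5 for random DNA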

  8. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing such a system shows that the only reasonable way to gain strong management control is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases.

  9. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  10. Global Trends in the Affordability of Sugar-Sweetened Beverages, 1990–2016

    Science.gov (United States)

    Blecher, Evan; Liber, Alex C.; Nguyen, Binh; Stoklosa, Michal

    2017-01-01

    Introduction: The objective of this study was to quantify changes in the affordability of sugar-sweetened beverages, a product implicated as a contributor to rising rates of obesity worldwide, as a function of product price and personal income. Methods: We used international survey data in a retrospective analysis of 40 high-income and 42 low-income and middle-income countries from 1990 to 2016. Prices of sugar-sweetened beverages were from the Economist Intelligence Unit's World Cost of Living Survey. Income and inflation data were from the International Monetary Fund's World Economic Outlook Database. The measure of affordability was the average annual percentage change in the relative-income price of sugar-sweetened beverages, which is the annual rate of change in the proportion of per capita gross domestic product needed to purchase 100 L of Coca-Cola in each country in each year of the study. Results: In 79 of 82 countries, the proportion of income needed to purchase sugar-sweetened beverages declined on average (using annual measures) during the study period. This pattern, described as an increase in the affordability of sugar-sweetened beverages, indicated that sugar-sweetened beverages became more affordable more rapidly in low-income and middle-income countries than in high-income countries, a fact largely attributable to the higher rate of income growth in those countries than to a decline in the real price of sugar-sweetened beverages. Conclusion: Without deliberate policy action to raise prices, sugar-sweetened beverages are likely to become more affordable and more widely consumed around the world. PMID:28472607

  11. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need for taking the hydrogen explosion phenomena into account in risk management. Thus combustion modelling in large-scale geometries is one of the remaining severe accident safety issues. At present, no combustion model exists which can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Model development must therefore focus on adapting existing approaches or creating new ones capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reactor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of numerical simulation are presented together with comparisons, critical discussions and conclusions. (authors)

  12. Large-scale coherent structures of suspended dust concentration in the neutral atmospheric surface layer: A large-eddy simulation study

    Science.gov (United States)

    Zhang, Yangyue; Hu, Ruifeng; Zheng, Xiaojing

    2018-04-01

    Dust particles can remain suspended in the atmospheric boundary layer, where their motion is primarily determined by turbulent diffusion and gravitational settling. Little is known about the spatial organization of suspended dust concentration and how turbulent coherent motions contribute to the vertical transport of dust particles. Numerous studies in recent years have revealed that the large- and very-large-scale motions found in the logarithmic region of laboratory-scale turbulent boundary layers also exist in the high Reynolds number atmospheric boundary layer, but their influence on dust transport is still unclear. In this study, numerical simulations of dust transport in a neutral atmospheric boundary layer based on an Eulerian modeling approach and the large-eddy simulation technique are performed to investigate the coherent structures of dust concentration. The instantaneous fields confirm the existence of very long meandering streaks of dust concentration, with alternating high- and low-concentration regions. A strong negative correlation between the streamwise velocity and concentration and a mild positive correlation between the vertical velocity and concentration are observed. The spatial length scales and inclination angles of concentration structures are determined and compared with their flow counterparts. The conditionally averaged fields vividly depict that high- and low-concentration events are accompanied by a pair of counter-rotating quasi-streamwise vortices, with a downwash inside the low-concentration region and an upwash inside the high-concentration region. Through the quadrant analysis, it is indicated that the vertical dust transport is closely related to the large-scale roll modes, and ejections in high-concentration regions are the major mechanism for the upward motion of dust particles.
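
    For readers unfamiliar with quadrant analysis, the sketch below splits an instantaneous vertical dust flux w'c' into the four quadrants of the (w', c') plane and reports each quadrant's share of the total. The fluctuation series are synthetic stand-ins with the mild positive w'-c' correlation reported above, not LES output:

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic stand-ins for vertical-velocity and concentration
        # fluctuations with a mild positive correlation.
        n = 100_000
        w = rng.normal(size=n)
        c = 0.3 * w + rng.normal(size=n)

        flux = w * c
        quads = {
            "Q1 (w'>0, c'>0) dust-laden updraft": (w > 0) & (c > 0),
            "Q2 (w'>0, c'<0)": (w > 0) & (c < 0),
            "Q3 (w'<0, c'<0)": (w < 0) & (c < 0),
            "Q4 (w'<0, c'>0)": (w < 0) & (c > 0),
        }
        for name, mask in quads.items():
            share = np.sum(flux[mask]) / np.sum(flux)
            print(f"{name}: {share:+.2f} of total flux")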

  13. Strategic options towards an affordable high-performance infrared camera

    Science.gov (United States)

    Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.

    2016-05-01

    The promise of infrared (IR) imaging attaining the low cost associated with the success of CMOS sensors has been hampered by the inability to achieve the cost advantages necessary for a crossover from military and industrial applications into the consumer and mass-scale commercial realm, despite well documented advantages. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: an uncooled 640x512 pixel InGaAs system offering high sensitivity, low noise, high frame rates (500 frames per second (FPS)) at full resolution, and low power consumption. This camera supports market adoption by not only demonstrating the high-performance IR imaging capability and value-add demanded by military and industrial applications, but also by illuminating a path towards the justifiable price points essential for adoption in consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably towards automation, multi-focal-plane-array compatible readout electronics, and dense or ultra-small pixel pitch devices.

  14. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the manager carries a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes placed in big rooms, managers can easily confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? The article introduces ELFms for the large-scale cluster system. Furthermore, it proposes how to realize automatic management of the large-scale cluster system. (authors)

  15. Low-Cost and Scaled-Up Production of Fluorine-Free, Substrate-Independent, Large-Area Superhydrophobic Coatings Based on Hydroxyapatite Nanowire Bundles.

    Science.gov (United States)

    Chen, Fei-Fei; Yang, Zi-Yue; Zhu, Ying-Jie; Xiong, Zhi-Chao; Dong, Li-Ying; Lu, Bing-Qiang; Wu, Jin; Yang, Ri-Long

    2018-01-09

    To date, the scaled-up production and large-area application of superhydrophobic coatings are limited because of the complicated procedures, environmentally harmful fluorinated compounds, restrictive substrates, expensive equipment, and raw materials usually involved in the fabrication process. Herein, the facile, low-cost, and green production of superhydrophobic coatings based on hydroxyapatite nanowire bundles (HNBs) is reported. Hydrophobic HNBs are synthesised by using a one-step solvothermal method with oleic acid as the structure-directing and hydrophobic agent. During the reaction process, highly hydrophobic C-H groups of oleic acid molecules can be attached in situ to the surface of HNBs through the chelate interaction between Ca2+ ions and carboxylic groups. This facile synthetic method allows the scaled-up production of HNBs up to about 8 L, which is the largest production scale of superhydrophobic paint based on HNBs ever reported. In addition, the design of a 100 L reaction system is also shown. The HNBs can be coated on any substrate with an arbitrary shape by the spray-coating technique. The self-cleaning ability in air and oil, high-temperature stability, and excellent mechanical durability of the as-prepared superhydrophobic coatings are demonstrated. More importantly, the HNBs are coated on large-sized practical objects to form large-area superhydrophobic coatings. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Efficient Topology Estimation for Large Scale Optical Mapping

    CERN Document Server

    Elibol, Armagan; Garcia, Rafael

    2013-01-01

    Large scale optical mapping methods are in great demand among scientists who study different aspects of the seabed, and have been fostered by impressive advances in the capabilities of underwater robots in gathering optical data from the seafloor. Cost and weight constraints mean that low-cost ROVs usually have a very limited number of sensors. When a low-cost robot carries out a seafloor survey using a down-looking camera, it usually follows a predefined trajectory that provides several non-time-consecutive overlapping image pairs. Finding these pairs (a process known as topology estimation) is indispensable to obtaining globally consistent mosaics and accurate trajectory estimates, which are necessary for a global view of the surveyed area, especially when optical sensors are the only data source. This book contributes to the state of the art in large-area image mosaicing methods for underwater surveys using low-cost vehicles equipped with a very limited sensor suite. The main focus has been on global alignment...

  17. Biomass Gasification - A synthesis of technical barriers and current research issues for deployment at large scale

    Energy Technology Data Exchange (ETDEWEB)

    Heyne, Stefan [Chalmers Univ. of Technology, Gothenburg (Sweden); Liliedahl, Truls [KTH, Royal Inst. of Technology, Stockholm (Sweden); Marklund, Magnus [Energy Technology Centre, Piteaa (Sweden)

    2013-09-01

    Thermal gasification at large scale for cogeneration of power and heat and/or production of fuels and materials is a main pathway for a sustainable deployment of biomass resources. However, so far no such full-scale production exists, and biomass gasification projects remain at the pilot or demonstration scale. This report focuses on the key critical technology challenges for the large-scale deployment of the following biomass-based gasification concepts: Direct Fluidized Bed Gasification (FBG), Entrained Flow Gasification (EFG) and indirect Dual Fluidized Bed Gasification (DFBG). The main content in this report is based on responses from a number of experts in biomass gasification obtained from a questionnaire. The survey was composed of a number of more or less specific questions on technical barriers for the three gasification concepts considered. For formalising the questionnaire, the concept of Technology Readiness Level (TRL 1-9) was used for grading the level of technical maturity of the different sub-processes within the three generic biomass gasification technologies. Direct fluidized bed gasification (FBG) is already available at commercial scale as an air-blown technology, and air-blown FBG gasification may thus be reckoned mature. The remaining technical challenge is the conversion to operation on oxygen, with the final goal of producing chemicals or transport fuels. Tar reduction, in particular, and gas cleaning and upgrading in general are by far the most frequently named technical issues considered problematic. Other important aspects are problems that may occur when operating on low-grade fuels - i.e. low-cost fuels. These problems include bed agglomeration/ash sintering as well as alkali fouling. Even the preparation and feeding of these low-grade fuels tend to be problematic and require further development before use on a commercial scale. Furthermore, efficient char conversion is mentioned by

  18. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    Science.gov (United States)

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
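
    A minimal sketch of the modeling strategy described above — supervised stochastic gradient boosting for a quantitative variant-effect score — using scikit-learn on synthetic data. The features and labels are invented stand-ins; Envision's actual features, training data and hyperparameters are not reproduced here:

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(4)

        # Synthetic stand-ins for per-variant features and effect scores.
        X = rng.normal(size=(2000, 12))
        y = X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.normal(size=2000)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # subsample < 1 makes the boosting stochastic, as in Envision.
        model = GradientBoostingRegressor(n_estimators=300, subsample=0.5,
                                          learning_rate=0.05, random_state=0)
        model.fit(X_tr, y_tr)
        print("held-out R^2:", round(model.score(X_te, y_te), 3))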

  19. Assessing flood risk at the global scale: model setup, results, and sensitivity

    International Nuclear Information System (INIS)

    Ward, Philip J; Jongman, Brenden; Weiland, Frederiek Sperna; Winsemius, Hessel C; Bouwman, Arno; Ligtvoet, Willem; Van Beek, Rens; Bierkens, Marc F P

    2013-01-01

    Globally, economic losses from flooding exceeded $19 billion in 2012, and are rising rapidly. Hence, there is an increasing need for global-scale flood risk assessments, also within the context of integrated global assessments. We have developed and validated a model cascade for producing global flood risk maps, based on numerous flood return-periods. Validation results indicate that the model simulates interannual fluctuations in flood impacts well. The cascade involves: hydrological and hydraulic modelling; extreme value statistics; inundation modelling; flood impact modelling; and estimating annual expected impacts. The initial results estimate global impacts for several indicators, for example annual expected exposed population (169 million); and annual expected exposed GDP ($1383 billion). These results are relatively insensitive to the extreme value distribution employed to estimate low frequency flood volumes. However, they are extremely sensitive to the assumed flood protection standard; developing a database of such standards should be a research priority. Also, results are sensitive to the use of two different climate forcing datasets. The impact model can easily accommodate new, user-defined, impact indicators. We envisage several applications, for example: identifying risk hotspots; calculating macro-scale risk for the insurance industry and large companies; and assessing potential benefits (and costs) of adaptation measures. (letter)
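
    Annual expected impacts of the kind quoted above are obtained by integrating impact over annual exceedance probability across the modelled return periods. A minimal sketch with invented impact numbers, including the effect of an assumed protection standard (the results' main sensitivity):

        import numpy as np

        # Invented impact estimates (e.g. exposed GDP, $bn) per return period.
        return_periods = np.array([2, 5, 10, 25, 50, 100, 250, 1000])
        impacts = np.array([0, 40, 120, 380, 700, 1100, 1800, 3000])

        p_exceed = 1.0 / return_periods        # annual exceedance probability
        # Integrate impact over exceedance probability (reverse so that the
        # probabilities passed to the trapezoidal rule are increasing).
        ead = np.trapz(impacts[::-1], p_exceed[::-1])
        print(f"annual expected impact ~ {ead:.0f}")

        # An assumed 1-in-25-year protection standard zeroes out impacts of
        # more frequent floods -- hence the strong sensitivity noted above.
        protected = np.where(return_periods < 25, 0.0, impacts)
        print(f"with protection: {np.trapz(protected[::-1], p_exceed[::-1]):.0f}")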

  20. Large scale hydrogeological modelling of a low-lying complex coastal aquifer system

    DEFF Research Database (Denmark)

    Meyer, Rena

    2018-01-01

    … intrusion. In this thesis a new methodological approach was developed to combine 3D numerical groundwater modelling with a detailed geological description and hydrological, geochemical and geophysical data. It was applied to a regional-scale saltwater intrusion in order to analyse and quantify the groundwater flow dynamics, identify the driving mechanisms that formed the saltwater intrusion to its present extent, and to predict its progression in the future. The study area is located in the transboundary region between Southern Denmark and Northern Germany, adjacent to the Wadden Sea. Here, a large-scale … parametrization schemes that accommodate hydrogeological heterogeneities. Subsequently, density-dependent flow and transport modelling of multiple salt sources was successfully applied to simulate the formation of the saltwater intrusion during the last 4200 years, accounting for historic changes in the hydraulic …

  1. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    Science.gov (United States)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

    Destiny is a simple, direct, low-cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65 m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize ~3000 SN Ia events over the redshift interval 0.4 < z < 1.7. Destiny will be used in its third year as a high-resolution, wide-field imager to conduct a weak lensing survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  2. Risk transfer formula for individual and small group markets under the Affordable Care Act.

    Science.gov (United States)

    Pope, Gregory C; Bachofer, Henry; Pearlman, Andrew; Kautter, John; Hunter, Elizabeth; Miller, Daniel; Keenan, Patricia

    2014-01-01

    The Affordable Care Act provides for a program of risk adjustment in the individual and small group health insurance markets in 2014 as Marketplaces are implemented and new market reforms take effect. The purpose of risk adjustment is to lessen or eliminate the influence of risk selection on the premiums that plans charge. The risk adjustment methodology includes the risk adjustment model and the risk transfer formula. This article is the third of three in this issue of the Medicare & Medicaid Research Review that describe the ACA risk adjustment methodology and focuses on the risk transfer formula. In our first companion article, we discussed the key issues and choices in developing the methodology. In our second companion paper, we described the risk adjustment model that is used to calculate risk scores. In this article we present the risk transfer formula. We first describe how the plan risk score is combined with factors for the plan allowable premium rating, actuarial value, induced demand, geographic cost, and the statewide average premium in a formula that calculates transfers among plans. We then show how each plan factor is determined, as well as how the factors relate to each other in the risk transfer formula. The goal of risk transfers is to offset the effects of risk selection on plan costs while preserving premium differences due to factors such as actuarial value differences. Illustrative numerical simulations show the risk transfer formula operating as anticipated in hypothetical scenarios.
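    The article itself gives the exact formula; as a hedged sketch of its overall structure — each plan's share of risk-adjusted cost minus its share of premium-related factors, scaled by the statewide average premium, with transfers summing to zero — the following toy calculation may help. All plan factors and values are illustrative, and the normalization is simplified (equal enrollment weights) relative to the published methodology:

```python
def risk_transfers(plans, statewide_avg_premium):
    """Simplified sketch of the ACA risk transfer formula's structure.

    Each plan dict carries the factors named in the article: plan risk score
    (PLRS), allowable rating factor (ARF), actuarial value (AV), induced
    demand factor (IDF) and geographic cost factor (GCF).
    """
    risk_term = {p['id']: p['PLRS'] * p['IDF'] * p['GCF'] for p in plans}
    prem_term = {p['id']: p['AV'] * p['ARF'] * p['IDF'] * p['GCF'] for p in plans}
    # Normalize each term by its market average so transfers net to zero
    r_avg = sum(risk_term.values()) / len(plans)
    q_avg = sum(prem_term.values()) / len(plans)
    return {pid: (risk_term[pid] / r_avg - prem_term[pid] / q_avg)
                 * statewide_avg_premium
            for pid in risk_term}

plans = [{'id': 'A', 'PLRS': 1.2, 'ARF': 1.0, 'AV': 0.7, 'IDF': 1.02, 'GCF': 1.0},
         {'id': 'B', 'PLRS': 0.8, 'ARF': 1.0, 'AV': 0.7, 'IDF': 1.02, 'GCF': 1.0}]
print(risk_transfers(plans, statewide_avg_premium=4000.0))  # {'A': 800, 'B': -800}
```

In this toy market the higher-risk plan A receives a transfer funded by plan B, while premium-side differences (actuarial value, rating, geography) are netted out rather than compensated — the behaviour the article describes.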

  3. Segmentation and fragmentation of melt jets due to generation of large-scale structures. Observation in low subcooling conditions

    International Nuclear Information System (INIS)

    Sugiyama, Ken-ichiro; Yamada, Tsuyoshi

    1999-01-01

    In order to clarify a mechanism of melt-jet breakup and fragmentation entirely different from the mechanism of stripping, a series of experiments was carried out using molten tin jets of 100 grams with initial temperatures from 250 °C to 900 °C. Molten tin jets with a small kinematic viscosity and a large thermal diffusivity were used to observe breakup and fragmentation of melt jets enhanced thermally and hydrodynamically. We observed jet columns with second-stage large-scale structures generated by the coalescence of large-scale structures recognized in the field of fluid mechanics. At greater depth, the segmentation of jet columns between second-stage large-scale structures and the fragmentation of the segmented jet columns were observed. It is reasonable to consider that the segmentation and fragmentation of jet columns are caused by the boiling of water hydrodynamically entrained within second-stage large-scale structures. (author)

  4. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large-scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large-scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues …

  5. ENSO impacts on flood risk at the global scale

    Science.gov (United States)

    Ward, Philip; Dettinger, Michael; Jongman, Brenden; Kummu, Matti; Winsemius, Hessel

    2014-05-01

    We present the impacts of the El Niño Southern Oscillation (ENSO) on society and the economy, via relationships between ENSO and the hydrological cycle. We also discuss ways in which this knowledge can be used in disaster risk management and risk reduction. This contribution provides the most recent results of an ongoing 4-year collaborative research initiative to assess and map the impacts of large-scale interannual climate variability on flood hazard and risk at the global scale. We have examined anomalies in flood risk between ENSO phases, whereby flood risk is expressed in terms of indicators such as: annual expected damage; annual expected affected population; annual expected affected Gross Domestic Product (GDP). We show that large anomalies in flood risk occur during El Niño or La Niña years in basins covering large parts of the Earth's surface. These anomalies reach statistical significance in river basins covering almost two-thirds of the Earth's surface. Particularly strong anomalies exist in southern Africa, parts of western Africa, Australia, parts of Central Eurasia (especially for El Niño), the western USA (especially La Niña anomalies), and parts of South America. We relate these anomalies to possible causal relationships between ENSO and flood hazard, using both modelled and observed data on flood occurrence and extremity. The implications for flood risk management are many-fold. In those regions where disaster risk is strongly influenced by ENSO, the potential predictability of ENSO could be used to develop probabilistic flood risk projections with lead times up to several seasons. Such data could be used by the insurance industry in managing risk portfolios and by multinational companies for assessing the robustness of their supply chains to potential flood-related interruptions. Seasonal forecasts of the ENSO influence on peak flows could also allow for improved flood early warning and regulation by dam operators, which could also reduce overall risks.

  6. College Affordability for Low-Income Adults: Improving Returns on Investment for Families and Society. Report #C412

    Science.gov (United States)

    Gault, Barbara; Reichlin, Lindsey; Román, Stephanie

    2014-01-01

    This report examines how efforts to understand and improve college affordability can be informed by the experiences and circumstances of low-income adults, students of color, and students with dependent children. The report discusses how the time and financial demands associated with financial independence, parenthood, and work affect a student's…

  7. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    \\catcode`\\@=11 \\ialign{m @th#1hfil ##hfil \\crcr#2\\crcr\\sim\\crcr}}} \\catcode`\\@=12 Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~{{1} /{4}} of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  8. Sensitivity of LDEF foil analyses using ultra-low background germanium vs. large NaI(Tl) multidimensional spectrometers

    International Nuclear Information System (INIS)

    Reeves, J.H.; Arthur, R.J.; Brodzinski, R.L.

    1992-06-01

    Cobalt foils and stainless steel samples were analyzed for induced 60Co activity with both an ultra-low background germanium gamma-ray spectrometer and a large NaI(Tl) multidimensional spectrometer, both of which use electronic anticoincidence shielding to reduce background counts resulting from cosmic rays. Aluminum samples were analyzed for 22Na. The results are presented, together with the relative sensitivities and precisions afforded by the two methods.

  9. Large-Scale Molded Silicon Oxycarbide Composite Components for Ultra-Low-Cost Lightweight Mirrors, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Next-generation telescopes need mirrors that are extremely stable, lightweight, and affordable. Semplastics has developed a novel, innovative ceramic material which...

  10. Effect of low dose gamma irradiation on onion yield: Large scale application

    International Nuclear Information System (INIS)

    Al-Oudat, M.

    1993-01-01

    Large-scale application of presowing gamma-irradiation of seeds, bulblets and bulbs of onion was performed in 1989, using doses of 10 Gy for seeds and 1 Gy for bulblets and bulbs. The doses were chosen on the basis of previous experiments. Reliable increases in yield of seeds (19.3%), bulblets (18.9%) and bulbs (31.4%) were obtained for the red variety, and of 22.3% and 23.4% for seeds and bulbs of the white variety. (author). 2 tabs

  11. Cosmological streaming velocities and large-scale density maxima

    International Nuclear Information System (INIS)

    Peacock, J.A.; Lumsden, S.L.; Heavens, A.F.

    1987-01-01

    The statistical testing of models for galaxy formation against the observed peculiar velocities on 10-100 Mpc scales is considered. If it is assumed that observers are likely to be sited near maxima in the primordial field of density perturbations, then the observed filtered velocity field will be biased to low values by comparison with a point selected at random. This helps to explain how the peculiar velocities (relative to the microwave background) of the local supercluster and the Rubin-Ford shell can be so similar in magnitude. Using this assumption to predict peculiar velocities on two scales, we test models with large-scale damping (i.e. adiabatic perturbations). Allowed models have a damping length close to the Rubin-Ford scale and are mildly non-linear. Both purely baryonic universes and universes dominated by massive neutrinos can account for the observed velocities, provided 0.1 ≤ Ω ≤ 1. (author)

  12. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  13. A convex optimization approach for solving large scale linear systems

    Directory of Open Access Journals (Sweden)

    Debora Cores

    2017-01-01

    The well-known Conjugate Gradient (CG) method minimizes a strictly convex quadratic function for solving large-scale linear systems of equations when the coefficient matrix is symmetric and positive definite. In this work we present and analyze a non-quadratic convex function for solving any large-scale linear system of equations, regardless of the characteristics of the coefficient matrix. For finding the global minimizers of this new convex function, any low-cost iterative optimization technique could be applied. In particular, we propose to use the low-cost, globally convergent Spectral Projected Gradient (SPG) method, which allows us to extend this optimization approach to solving consistent square and rectangular linear systems, as well as linear feasibility problems, with and without convex constraints and with and without preconditioning strategies. Our numerical results indicate that the new scheme outperforms state-of-the-art iterative techniques for solving linear systems when the symmetric part of the coefficient matrix is indefinite, and also for solving linear feasibility problems.
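    The paper's particular non-quadratic convex merit function is not reproduced here; the sketch below only illustrates the SPG machinery it relies on — a feasible-set projection plus the spectral (Barzilai-Borwein) step length — applied, for concreteness, to a small nonnegative least-squares problem:

```python
import numpy as np

def spg_least_squares(A, b, project=lambda x: x, iters=200, tol=1e-8):
    """Spectral Projected Gradient sketch on f(x) = 0.5*||Ax - b||^2."""
    x = project(np.zeros(A.shape[1]))
    g = A.T @ (A @ x - b)          # gradient of f at x
    alpha = 1.0                    # initial step length
    for _ in range(iters):
        x_new = project(x - alpha * g)
        s = x_new - x
        if np.linalg.norm(s) < tol:
            break
        g_new = A.T @ (A @ x_new - b)
        y = g_new - g
        # Spectral (Barzilai-Borwein) step length, safeguarded against y ~ 0
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0
        x, g = x_new, g_new
    return x

A = np.array([[3., 1.], [1., 2.], [0., 1.]])
b = np.array([9., 8., 3.])
print(spg_least_squares(A, b, project=lambda x: np.maximum(x, 0)))  # -> [2., 3.]
```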

  14. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind, and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations, and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale linearly with Mach number, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of the Mach number. Inhomogeneous nearly …

  15. Comparison of fall prediction by the Hessisch Oldendorf Fall Risk Scale and the Fall Risk Scale by Huhn in neurological rehabilitation: an observational study.

    Science.gov (United States)

    Hermann, Olena; Schmidt, Simone B; Boltzmann, Melanie; Rollnik, Jens D

    2018-05-01

    To calculate scale performance of the newly developed Hessisch Oldendorf Fall Risk Scale (HOSS) for classifying fallers and non-fallers in comparison with the Risk of Falling Scale by Huhn (FSH), a frequently used assessment tool. A prospective observational trial was conducted. The study was performed in a large specialized neurological rehabilitation facility. The study population (n = 690) included neurological and neurosurgery patients undergoing neurological rehabilitation with varying levels of disability; roughly half of the patients were independent in the activities of daily living (ADL) and half dependent. Fall risk of each patient was assessed by HOSS and FSH within the first seven days after admission. Falls during rehabilitation were compared with HOSS and FSH scores as well as the corresponding fall risk. Scale performance, including sensitivity and specificity, was calculated for both scales. A total of 107 (15.5%) patients experienced at least one fall. In general, fallers were characterized by an older age, a prolonged length of stay, and a lower Barthel Index (higher dependence in the ADL) on admission than non-fallers. The verification of fall prediction showed a sensitivity of 83% and a specificity of 64% for the HOSS scale, and a sensitivity of 98% with a specificity of 12% for the FSH scale, respectively. The HOSS shows adequate sensitivity, a higher specificity and therefore a better scale performance than the FSH. Thus, the HOSS might be superior to existing assessments.
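    Sensitivity and specificity as reported here follow directly from the fall/no-fall confusion counts. A small helper, purely illustrative of the calculation (not the study's analysis code):

```python
def scale_performance(predicted_high_risk, fell):
    """Sensitivity and specificity of a fall-risk classification.

    Inputs are parallel sequences of booleans: whether the scale flagged the
    patient as at risk, and whether the patient actually fell.
    """
    tp = sum(p and f for p, f in zip(predicted_high_risk, fell))
    fn = sum((not p) and f for p, f in zip(predicted_high_risk, fell))
    tn = sum((not p) and (not f) for p, f in zip(predicted_high_risk, fell))
    fp = sum(p and (not f) for p, f in zip(predicted_high_risk, fell))
    return tp / (tp + fn), tn / (tn + fp)  # (sensitivity, specificity)

# Toy example: 4 fallers, 6 non-fallers
sens, spec = scale_performance([1, 1, 1, 0, 1, 0, 0, 0, 1, 0],
                               [1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.75, 0.67
```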

  16. ``Large''- vs Small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the ``large-scale'' control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998 and Phys. Rev. Fluids 2, 062601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular those related to the discovery of (very) large-scale motions. The goals of the paper are as follows: first, to better characterise the physics of the control and assess which external contributions (vortices, forcing, wall motion) are actually needed; then, to investigate the optimal parameters; and finally, to determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution also addresses the potential effect of the naturally occurring large-scale motions on frictional drag, and gives indications on the physical processes for potential drag reduction possible at all Reynolds numbers.

  17. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  18. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, together with the prospect of low-cost, large-scale production, has made these materials attractive in solar cell research. The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed. Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically …

  19. Dynamical links between small- and large-scale mantle heterogeneity: Seismological evidence

    Science.gov (United States)

    Frost, Daniel A.; Garnero, Edward J.; Rost, Sebastian

    2018-01-01

    We identify PKP•PKP scattered waves (also known as P′•P′) from earthquakes recorded at small-aperture seismic arrays at distances less than 65°. P′•P′ energy travels as a PKP wave through the core, up into the mantle, then scatters back down through the core to the receiver as a second PKP. P′•P′ waves are unique in that they allow scattering heterogeneities throughout the mantle to be imaged. We use array-processing methods to amplify low-amplitude, coherent scattered energy signals and resolve their incoming direction. We deterministically map scattering heterogeneity locations from the core-mantle boundary to the surface. We use an extensive dataset with sensitivity to a large volume of the mantle and a location method allowing us to resolve and map more heterogeneities than has previously been possible, representing a significant increase in our understanding of small-scale structure within the mantle. Our results demonstrate that the distribution of scattering heterogeneities varies both radially and laterally. Scattering is most abundant in the uppermost and lowermost mantle, and at a minimum in the mid-mantle, resembling the radial distribution of tomographically derived whole-mantle velocity heterogeneity. We investigate the spatial correlation of scattering heterogeneities with large-scale tomographic velocities, lateral velocity gradients, and the locations of deep-seated hotspots and subducted slabs. In the lowermost 1500 km of the mantle, small-scale heterogeneities correlate with regions of low seismic velocity, high lateral seismic gradient, and proximity to hotspots. In the upper 1000 km of the mantle there is no significant correlation between scattering heterogeneity location and subducted slabs. Between 600 and 900 km depth, scattering heterogeneities are more common in the regions most remote from slabs, and close to hotspots. Scattering heterogeneities show an affinity for regions close to slabs within the upper 200 km of the mantle.

  20. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that continuum …

  1. Parallelizing Gene Expression Programming Algorithm in Enabling Large-Scale Classification

    Directory of Open Access Journals (Sweden)

    Lixiong Xu

    2017-01-01

    As one of the most effective function-mining algorithms, the Gene Expression Programming (GEP) algorithm has been widely used in classification, pattern recognition, prediction, and other research fields. Through self-evolution, GEP is able to mine an optimal function for dealing with further complicated tasks. However, in big data research, GEP encounters low-efficiency issues due to its long mining process. To improve the efficiency of GEP in big data research, especially for processing large-scale classification tasks, this paper presents a parallelized GEP algorithm using the MapReduce computing model. The experimental results show that the presented algorithm is scalable and efficient for processing large-scale classification tasks.
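    The data-parallel part of GEP is fitness evaluation: every candidate function must be scored against all training records, which is exactly what MapReduce partitions. A minimal sketch of that idea, using a local process pool as a stand-in for the paper's Hadoop cluster (the candidate function and data are illustrative):

```python
from multiprocessing import Pool

def candidate(x):
    """One evolved individual, decoded from its genome (illustrative)."""
    return 2 * x + 0.9

def partial_fitness(shard):
    """Map step: squared-error of the candidate on one data shard."""
    return sum((candidate(x) - y) ** 2 for x, y in shard)

def mapreduce_fitness(shards, workers=4):
    """Reduce step: sum the per-shard errors computed in parallel."""
    with Pool(workers) as pool:
        return sum(pool.map(partial_fitness, shards))

if __name__ == "__main__":
    data = [(x, 2 * x + 1) for x in range(1000)]
    shards = [data[i::4] for i in range(4)]  # partition the training records
    print(mapreduce_fitness(shards))
```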

  2. Large scale mapping: an empirical comparison of pixel-based and ...

    African Journals Online (AJOL)

    In the past, large scale mapping was carried out using precise ground survey methods. Later, the paradigm shift in data collection from medium- and low-resolution to, more recently, high-resolution images raised the problems of accurate data analysis and fitness-for-purpose. Using high resolution satellite images ...

  3. Large-scale nuclear energy from the thorium cycle

    International Nuclear Information System (INIS)

    Lewis, W.B.; Duret, M.F.; Craig, D.S.; Veeder, J.I.; Bain, A.S.

    1973-02-01

    The thorium fuel cycle in CANDU (Canada Deuterium Uranium) reactors challenges breeders and fusion as the simplest means of meeting the world's large-scale demands for energy for centuries. Thorium oxide fuel allows high power density with excellent neutron economy. The combination of thorium fuel with an organic coolant promises easy maintenance and high availability of the whole plant. The total fuelling cost, including charges on the inventory, is estimated to be attractively low. (author) [fr]

  4. HFSB-seeding for large-scale tomographic PIV in wind tunnels

    Science.gov (United States)

    Caridi, Giuseppe Carlo Alp; Ragni, Daniele; Sciacchitano, Andrea; Scarano, Fulvio

    2016-12-01

    A new system for large-scale tomographic particle image velocimetry in low-speed wind tunnels is presented. The system relies upon the use of sub-millimetre helium-filled soap bubbles as flow tracers, which scatter light with intensity several orders of magnitude higher than micron-sized droplets. With respect to a single bubble generator, the system increases the rate of bubbles emission by means of transient accumulation and rapid release. The governing parameters of the system are identified and discussed, namely the bubbles production rate, the accumulation and release times, the size of the bubble injector and its location with respect to the wind tunnel contraction. The relations between the above parameters, the resulting spatial concentration of tracers and measurement of dynamic spatial range are obtained and discussed. Large-scale experiments are carried out in a large low-speed wind tunnel with 2.85 × 2.85 m2 test section, where a vertical axis wind turbine of 1 m diameter is operated. Time-resolved tomographic PIV measurements are taken over a measurement volume of 40 × 20 × 15 cm3, allowing the quantitative analysis of the tip-vortex structure and dynamical evolution.

  5. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe strike on southern Taiwan awakened public awareness of large-scale landslide disasters. Such disasters produce large quantities of sediment, which impair the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video not only help people make appropriate decisions, but are also a major challenge to process and add value to. The study defined basic data formats/standards for the various types of data collected about these reservoirs and then provided a management platform based on these formats/standards. Meanwhile, for practicality and convenience, the large-scale landslide disaster database system was built with both publishing and receiving capabilities, so that users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may soon be out of date. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and user-defined system structure. The system established by this study is based on the HTML5 standard and uses responsive web design, so that users can easily operate and extend this large-scale landslide disaster database system.
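    As a hypothetical example of the kind of standardized record format such a platform might define — every field name and value below is invented for illustration, not the project's actual schema:

```python
import json

# Illustrative landslide-event record; field names are hypothetical.
landslide_event = {
    "event_id": "2009-morakot-0001",
    "reservoir": "example-reservoir",
    "location": {"lat": 23.25, "lon": 120.55},
    "estimated_volume_m3": 1.2e7,
    "triggered_by": "Typhoon Morakot",
    "attachments": [{"type": "photo", "uri": "photos/0001.jpg"}],
}
print(json.dumps(landslide_event, indent=2))
```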

  6. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  7. Large-area perovskite nanowire arrays fabricated by large-scale roll-to-roll micro-gravure printing and doctor blading

    Science.gov (United States)

    Hu, Qiao; Wu, Han; Sun, Jia; Yan, Donghang; Gao, Yongli; Yang, Junliang

    2016-02-01

    Organic-inorganic hybrid halide perovskite nanowires (PNWs) show great potential for applications in electronic and optoelectronic devices such as solar cells, field-effect transistors and photodetectors. It is therefore very meaningful to fabricate ordered, large-area PNW arrays and thereby accelerate their application and commercialization in such devices. Herein, highly oriented and ultra-long methylammonium lead iodide (CH3NH3PbI3) PNW array thin films were fabricated by large-scale roll-to-roll (R2R) micro-gravure printing and doctor blading in ambient environments (humidity ~45%, temperature ~28 °C), which produced PNW lengths as long as 15 mm. Furthermore, photodetectors based on these PNWs were successfully fabricated on both silicon oxide (SiO2) and flexible polyethylene terephthalate (PET) substrates and showed moderate performance. This study provides low-cost, large-scale techniques to fabricate large-area PNW arrays with great potential applications in flexible electronic and optoelectronic devices.

  8. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo

    2014-04-01

    The flux performance of different hydrophobic microporous flat-sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench-scale (high ΔT) and large-scale module (low ΔT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module, aiming to maximize the flux at different levels of the operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (ΔT). A water vapor flux of 88.8 kg/m2h was obtained using a PTFE membrane at high ΔT (60 °C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (at the ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large-scale module operating conditions. The average flux of the latter study (low ΔT) was found to be eight times lower than that of the bench-scale (high ΔT) operating conditions.

  9. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo; Ghaffour, NorEddine; Alsaadi, Ahmad Salem; Nunes, Suzana Pereira; Amy, Gary L.

    2014-01-01

    The flux performance of different hydrophobic microporous flat-sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench-scale (high ΔT) and large-scale module (low ΔT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module, aiming to maximize the flux at different levels of the operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (ΔT). A water vapor flux of 88.8 kg/m2h was obtained using a PTFE membrane at high ΔT (60 °C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (at the ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large-scale module operating conditions. The average flux of the latter study (low ΔT) was found to be eight times lower than that of the bench-scale (high ΔT) operating conditions.

  10. Sociodemographic factors and attitudes toward food affordability and health are associated with fruit and vegetable consumption in a low-income French population.

    Science.gov (United States)

    Bihan, Hélène; Castetbon, Katia; Mejean, Caroline; Peneau, Sandrine; Pelabon, Laetitia; Jellouli, Fatima; Le Clesiau, Hervé; Hercberg, Serge

    2010-04-01

    Determinants of fruit and vegetable consumption, including affordability and attitudes, have been poorly investigated, especially in deprived European populations. Our objective was to analyze various determinants of low consumption of fruits and vegetables in disadvantaged participants. Participants were randomized into 2 groups: one received nutritional advice alone, and one also received vouchers exchangeable for fruits and vegetables during a 12-mo period. Socioeconomic characteristics, food insufficiency, affordability, and motivation for eating fruits and vegetables were assessed. A short FFQ was administered. Determinants of low fruit and vegetable consumption in this low-income French population are numerous. The impact of financial difficulties is crucial, as is the perception of the affordability of fruits and vegetables.

  11. Traditional Cantonese diet and nasopharyngeal carcinoma risk: a large-scale case-control study in Guangdong, China

    Directory of Open Access Journals (Sweden)

    Jia Wei-Hua

    2010-08-01

    Background: Nasopharyngeal carcinoma (NPC) is rare in most parts of the world but is a common malignancy in southern China, especially in Guangdong. Dietary habit is regarded as an important modifier of NPC risk in several endemic areas and may partially explain the geographic distribution of NPC incidence. In China, rapid economic development during the past few decades has changed the predominant lifestyle and dietary habits of the Chinese considerably, requiring a reassessment of diet and its potential influence on NPC risk in this NPC-endemic area. Methods: To evaluate the association between dietary factors and NPC risk in Guangdong, China, a large-scale, hospital-based case-control study was conducted. 1387 eligible cases and 1459 frequency-matched controls were recruited. Odds ratios (ORs) and the corresponding 95% confidence intervals (CIs) were estimated using a logistic regression model, adjusting for age, sex, education, dialect, and habitation household type. Results: (1) consumption of Canton-style salted fish, preserved vegetables and preserved/cured meat was significantly associated with increased risk of NPC, with ORs of 2.45 (95% CI: 2.03-2.94), 3.17 (95% CI: 2.68-3.77) and 2.09 (95% CI: 1.22-3.60), respectively, in the highest intake frequency stratum during childhood; (2) consumption of fresh fruit was associated with reduced risk, with a dose-dependent relationship (p = 0.001); and (3) consumption of Canton-style herbal tea and herbal slow-cooked soup was associated with decreased risk, with ORs of 0.84 (95% CI: 0.68-1.03) and 0.58 (95% CI: 0.47-0.72), respectively, in the highest intake frequency stratum. In multivariate analyses, these associations remained significant. Conclusions: Previously established dietary risk factors in the Cantonese population remain stable and continue to contribute to the incidence of NPC.

  12. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  13. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Aamir, Ali; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four-telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  14. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  15. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  16. Engineering an Affordable Self-Driving Car

    KAUST Repository

    Budisteanu, Alexandru Ionut

    2018-01-01

    … for affordable self-driving cars, and he designed a low-cost self-driving car. The car's roof has cameras and low-resolution 3D LiDAR equipment to detect traffic lanes, other cars, curbs and obstacles, such as people crossing by. To process this dizzying amount …

  17. Automatic Measurement in Large-Scale Space with the Laser Theodolite and Vision Guiding Technology

    Directory of Open Access Journals (Sweden)

    Bin Wu

    2013-01-01

    The multitheodolite intersection measurement is a traditional approach to coordinate measurement in large-scale space. However, the procedure of manual labeling and aiming results in a low automation level and low measuring efficiency, and the measurement accuracy is easily affected by manual aiming error. Based on traditional theodolite measuring methods, this paper introduces the mechanism of the vision measurement principle and presents a novel automatic measurement method for large-scale space and large workpieces (equipment), combining laser theodolite measuring and vision guiding technologies. The measuring mark is established on the surface of the measured workpiece by the collimating laser, which is coaxial with the sight axis of the theodolite, so cooperation targets or manual marks are no longer needed. With the theoretical model data and multiresolution visual imaging and tracking technology, the method realizes automatic, quick, and accurate measurement of large workpieces in large-scale space. Meanwhile, the impact of artificial error is reduced and the measuring efficiency is improved. Therefore, this method has significant ramifications for the measurement of large workpieces, such as measuring the geometric appearance characteristics of ships, large aircraft, and spacecraft, and deformation monitoring for large buildings and dams.
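    The geometric core of theodolite intersection is triangulating a target from two angular sightings. A minimal sketch under the assumption of two known station positions and unit sight directions built from each instrument's horizontal and vertical angles (all names and values are illustrative):

```python
import numpy as np

def direction(azimuth, elevation):
    """Unit sight vector from a theodolite's horizontal/vertical angles (rad)."""
    return np.array([np.cos(elevation) * np.cos(azimuth),
                     np.cos(elevation) * np.sin(azimuth),
                     np.sin(elevation)])

def intersect_rays(p1, d1, p2, d2):
    """Closest point between two sight rays (midpoint of common perpendicular)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimizing |p1 + t1*d1 - (p2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)], [d1 @ d2, -(d2 @ d2)]])
    rhs = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, rhs)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Two stations sighting a target near (5, 3, 1); angles precomputed for the demo
p1, p2 = np.array([0., 0., 0.]), np.array([10., 0., 0.])
print(intersect_rays(p1, direction(0.5404, 0.1699),
                     p2, direction(2.6012, 0.1699)))  # ~ [5, 3, 1]
```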

  18. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows, so far, that approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  19. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Computing speed is a significant issue for large-scale flood simulations that must respond in real time for disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme is proposed for flood simulation. To realize fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based high-performance computing method using OpenACC was adopted to parallelize the shallow water model. An unstructured data management method is presented to control data transport between the GPU and CPU (Central Processing Unit) with minimum overhead; both computation and data are then offloaded from the CPU to the GPU, exploiting the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and thus has bright application prospects for dynamic inundation risk identification and disaster assessment.
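    For a flavor of the per-cell arithmetic that such a model offloads to the GPU, here is a serial NumPy sketch of one first-order Godunov-type finite-volume step, simplified to 1D with a Rusanov flux. The paper's scheme is 2D, unstructured, and OpenACC-parallelized, so everything below is an illustrative reduction, not the authors' code:

```python
import numpy as np

def shallow_water_step(h, hu, dx, dt, g=9.81):
    """One first-order finite-volume step for the 1D shallow water equations.

    Uses the Rusanov (local Lax-Friedrichs) flux; dt must satisfy the usual
    CFL condition. Boundary cells are left unchanged for brevity.
    """
    u = hu / np.maximum(h, 1e-12)
    # Physical fluxes F(U) for U = (h, hu)
    f1 = hu
    f2 = hu * u + 0.5 * g * h**2
    # Local wave speed bound at each interface
    c = np.abs(u) + np.sqrt(g * h)
    a = np.maximum(c[:-1], c[1:])
    flux1 = 0.5 * (f1[:-1] + f1[1:]) - 0.5 * a * (h[1:] - h[:-1])
    flux2 = 0.5 * (f2[:-1] + f2[1:]) - 0.5 * a * (hu[1:] - hu[:-1])
    # Conservative update of interior cells
    h[1:-1] -= dt / dx * (flux1[1:] - flux1[:-1])
    hu[1:-1] -= dt / dx * (flux2[1:] - flux2[:-1])
    return h, hu

h = np.where(np.arange(200) < 100, 2.0, 1.0)   # dam-break initial condition
hu = np.zeros_like(h)
for _ in range(100):
    h, hu = shallow_water_step(h, hu, dx=1.0, dt=0.1)
```

In the GPU version described by the abstract, loops like the interface-flux and cell-update sweeps above are exactly the kernels annotated for OpenACC, with the arrays kept resident on the device between steps.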

  20. Utility payments in Ukraine: Affordability, subsidies and arrears

    International Nuclear Information System (INIS)

    Fankhauser, Samuel; Rodionova, Yulia; Falcetti, Elisabetta

    2008-01-01

    The transition from a planned economy to a market economy has caused considerable hardship for the people of Eastern Europe. One important aspect of the social costs of transition is access to, and the affordability of, basic services like electricity, heat and water, which under communism had been supplied fairly cheaply and abundantly. This paper provides evidence on this issue from the Ukraine Longitudinal Monitoring Survey (ULMS). The paper identifies considerable differences in both access and affordability between different localities in Ukraine. Social protection measures can help to alleviate affordability constraints, but the analysis finds that social support is not well targeted. The currently low tariffs prevent an escalation of affordability problems but constraints nevertheless exist. Many households have accumulated substantial arrears as a consequence, although non-payment is a complex issue and not solely a function of affordability

  1. Utility payments in Ukraine: Affordability, subsidies and arrears

    Energy Technology Data Exchange (ETDEWEB)

    Fankhauser, Samuel [London School of Economics, London (United Kingdom); Rodionova, Yulia [School of Slavonic and East European Studies, University College London, 16 Taviton street, London WC1H 0BW (United Kingdom)], E-mail: y.rodionova@ssees.ucl.ac.uk; Falcetti, Elisabetta [European Bank for Reconstruction and Development (EBRD), London (United Kingdom)

    2008-11-15

    The transition from a planned economy to a market economy has caused considerable hardship for the people of Eastern Europe. One important aspect of the social costs of transition is access to, and the affordability of, basic services like electricity, heat and water, which under communism had been supplied fairly cheaply and abundantly. This paper provides evidence on this issue from the Ukraine Longitudinal Monitoring Survey (ULMS). The paper identifies considerable differences in both access and affordability between different localities in Ukraine. Social protection measures can help to alleviate affordability constraints, but the analysis finds that social support is not well targeted. The currently low tariffs prevent an escalation of affordability problems but constraints nevertheless exist. Many households have accumulated substantial arrears as a consequence, although non-payment is a complex issue and not solely a function of affordability.

  2. Utility payments in Ukraine. Affordability, subsidies and arrears

    Energy Technology Data Exchange (ETDEWEB)

    Fankhauser, Samuel [London School of Economics, London (United Kingdom); Rodionova, Yulia [School of Slavonic and East European Studies, University College London, 16 Taviton street, London WC1H 0BW (United Kingdom); Falcetti, Elisabetta [European Bank for Reconstruction and Development (EBRD), London (United Kingdom)

    2008-11-15

    The transition from a planned economy to a market economy has caused considerable hardship for the people of Eastern Europe. One important aspect of the social costs of transition is access to, and the affordability of, basic services like electricity, heat and water, which under communism had been supplied fairly cheaply and abundantly. This paper provides evidence on this issue from the Ukraine Longitudinal Monitoring Survey (ULMS). The paper identifies considerable differences in both access and affordability between different localities in Ukraine. Social protection measures can help to alleviate affordability constraints, but the analysis finds that social support is not well targeted. The currently low tariffs prevent an escalation of affordability problems but constraints nevertheless exist. Many households have accumulated substantial arrears as a consequence, although non-payment is a complex issue and not solely a function of affordability. (author)

  3. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  4. Just enough inflation. Power spectrum modifications at large scales

    International Nuclear Information System (INIS)

    Cicoli, Michele; Downes, Sean

    2014-07-01

    We show that models of 'just enough' inflation, where the slow-roll evolution lasted only 50-60 e-foldings, feature modifications of the CMB power spectrum at large angular scales. We perform a systematic and model-independent analysis of any possible non-slow-roll background evolution prior to the final stage of slow-roll inflation. We find a high degree of universality since most common backgrounds like fast-roll evolution, matter- or radiation-dominance give rise to a power loss at large angular scales and a peak together with an oscillatory behaviour at scales around the value of the Hubble parameter at the beginning of slow-roll inflation. Depending on the value of the equation of state parameter, different pre-inflationary epochs lead instead to an enhancement of power at low l, and so seem disfavoured by recent observational hints for a lack of CMB power at l ≲ 40. We also comment on the importance of initial conditions and the possibility to have multiple pre-inflationary stages.

  5. Hypofractionated stereotactic body radiotherapy in low- and intermediate-risk prostate carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hun Jung; Phak, Jeong Hoon; Kim, Woo Chul [Dept. of Radiation Oncology, Inha University Hospital, Inha University School of Medicine, Incheon (Korea, Republic of)

    2016-12-15

    Stereotactic body radiotherapy (SBRT) takes advantage of the low α/β ratio of prostate cancer to deliver a large dose in few fractions. We examined clinical outcomes of SBRT using CyberKnife for the treatment of low- and intermediate-risk prostate cancer. This study was based on a retrospective analysis of 33 patients treated with SBRT using CyberKnife for localized prostate cancer (27.3% low-risk and 72.7% intermediate-risk). A total dose of 36.25 Gy in 5 fractions of 7.25 Gy was administered. Acute and late toxicities were recorded using the Radiation Therapy Oncology Group scale. Prostate-specific antigen (PSA) response was monitored. Thirty-three patients with a median 51 months (range, 6 to 71 months) of follow-up were analyzed. There was no biochemical failure. Median PSA nadir was 0.27 ng/mL at a median of 33 months, and PSA bounce occurred in 30.3% (n = 10) of patients at a median of 10.5 months after SBRT. No grade 3 acute toxicity was noted. 18.2% of the patients had acute grade 2 genitourinary (GU) toxicities and 21.2% had acute grade 2 gastrointestinal (GI) toxicities. After follow-up of 2 months, most complications had returned to baseline. There was no grade 3 late GU or GI toxicity. Our experience with SBRT using CyberKnife in low- and intermediate-risk prostate cancer demonstrates favorable efficacy and toxicity. Further studies with more patients and longer follow-up duration are required.
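    The rationale in the first sentence is the standard biologically effective dose argument: for a low α/β, large doses per fraction disproportionately increase tumour BED. A quick worked check of the study's schedule (the α/β values are commonly quoted figures, not taken from the paper):

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose: BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# The study's schedule: 5 x 7.25 Gy = 36.25 Gy
print(bed(5, 7.25, 1.5))  # prostate tumour (alpha/beta ~1.5 Gy): BED ~211 Gy
print(bed(5, 7.25, 3.0))  # late-responding tissue (alpha/beta ~3 Gy): BED ~124 Gy
```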

  6. Large-Scale Fabrication of Silicon Nanowires for Solar Energy Applications.

    Science.gov (United States)

    Zhang, Bingchang; Jie, Jiansheng; Zhang, Xiujuan; Ou, Xuemei; Zhang, Xiaohong

    2017-10-11

    The development of silicon (Si) materials during past decades has boosted the prosperity of the modern semiconductor industry. Compared with bulk Si materials, Si nanowires (SiNWs) possess superior structural, optical, and electrical properties and have attracted increasing attention in solar energy applications. To achieve practical applications of SiNWs, both large-scale synthesis of SiNWs at low cost and rational design of energy conversion devices with high efficiency are prerequisites. This review focuses on recent progress in the large-scale production of SiNWs, as well as the construction of high-efficiency SiNW-based solar energy conversion devices, including photovoltaic devices and photo-electrochemical cells. Finally, the outlook and challenges in this emerging field are presented.

  7. Persistent problems of access to appropriate, affordable TB services in rural China: experiences of different socio-economic groups.

    Science.gov (United States)

    Zhang, Tuohong; Tang, Shenglan; Jun, Gao; Whitehead, Margaret

    2007-02-08

    Large-scale Tuberculosis (TB) control programmes in China have been hailed a success. Concerns remain, however, about whether the programme is reaching all sections of the population, particularly poorer groups within rural communities, and whether there are hidden costs. This study takes a household perspective to investigate receipt of appropriate care and affordability of services for different socio-economic groups with TB symptoms in rural China. It is a secondary analysis of the Chinese National Household Health Survey for 2003: 40,000 rural households containing 143,991 individuals, 2,308 of whom were identified as TB suspects. Outcome measures were use of services and expenditure of TB suspects, by gender and socio-economic position, indicated by household income, education, material assets, and insurance status. 37% of TB suspects did not seek any professional care, with low-income groups less likely to seek care than their more affluent counterparts. Of those seeking care, only 35% received any of the recommended diagnostic tests. Of the 182 patients with a confirmed TB diagnosis, 104 (57%) received treatment at the recommended level, which was less likely if they lacked health insurance or material assets. The burden of payment for services amounted to 45% of annual household income for the low-income group and 16% for the high-income group. Access to appropriate, affordable TB services is still problematic in some rural areas of China, and receipt of care and affordability decline with declining socio-economic position. These findings highlight the current shortcomings of the national TB control programme in China and the formidable challenge it faces if it is to reach all sections of the population, including the poor, who carry the highest burden of disease.

  8. Imprint of non-linear effects on HI intensity mapping on large scales

    Energy Technology Data Exchange (ETDEWEB)

    Umeh, Obinna, E-mail: umeobinna@gmail.com [Department of Physics and Astronomy, University of the Western Cape, Cape Town 7535 (South Africa)

    2017-06-01

    Intensity mapping of the HI brightness temperature provides a unique way of tracing large-scale structures of the Universe up to the largest possible scales. This is achieved by using low angular resolution radio telescopes to detect the emission line from cosmic neutral hydrogen in the post-reionization Universe. We use general relativistic perturbation theory techniques to derive, for the first time, the full expression for the HI brightness temperature up to third order in perturbation theory without making any plane-parallel approximation. We use this result and the renormalization prescription for biased tracers to study the impact of nonlinear effects on the power spectrum of the HI brightness temperature both in real and redshift space. We show how mode coupling at nonlinear order, due to nonlinear bias parameters and redshift space distortion terms, modulates the power spectrum on large scales. The large-scale modulation may be understood as arising from the effective bias parameter and effective shot noise.
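
    As a schematic of the kind of expansion involved (a generic local-bias-plus-Kaiser form shown for orientation, not the paper's full relativistic third-order result):

        \delta T_{\rm HI}(\mathbf{k}) \simeq \bar{T}_{\rm HI}\left[(b_1 + f\mu^2)\,\delta^{(1)}(\mathbf{k}) + \mathcal{O}\!\left(\delta^2;\, b_2, f\right)\right]

    where f is the linear growth rate and μ the cosine of the angle to the line of sight. The quadratic and cubic terms couple long- and short-wavelength modes, which is what produces the large-scale modulation and the effective bias and shot-noise contributions mentioned above.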

  9. Energy efficiency and economic value in affordable housing

    International Nuclear Information System (INIS)

    Chegut, Andrea; Eichholtz, Piet; Holtermans, Rogier

    2016-01-01

    Strong rental protection in the affordable housing market often prohibits landlords from charging rental premiums for energy-efficient dwellings. This may impede (re)development of energy-efficient affordable housing. In the Netherlands, affordable housing institutions regularly sell dwellings from their housing stock to individual households. If they can sell energy-efficient dwellings at a premium, this may stimulate investments in the environmental performance of homes. We analyze the value effects of energy efficiency in the affordable housing market, using a sample of 17,835 homes sold by Dutch affordable housing institutions between 2008 and 2013. We use Energy Performance Certificates to determine the value of energy efficiency in these transactions. We document that dwellings with high energy efficiency sell for 2.0-6.3% more than otherwise similar dwellings with low energy efficiency. This implies a premium of some EUR 3,000 to EUR 9,700 for highly energy-efficient affordable housing. - Highlights: • Dutch affordable housing suppliers recoup sustainability investments by selling dwellings. • Energy-efficient affordable dwellings sell at a premium. • A-labeled dwellings are 6.3% (some EUR 9,300) more valuable than C-labeled ones. • A combined value effect of some 20% from refurbishing an affordable dwelling, including improving its energy efficiency, would more than pay for the retrofit.

  10. An improved method to characterise the modulation of small-scale turbulence by large-scale structures

    Science.gov (United States)

    Agostini, Lionel; Leschziner, Michael; Gaitonde, Datta

    2015-11-01

    A key aspect of turbulent boundary layer dynamics is ``modulation,'' which refers to the degree to which the intensity of coherent large-scale structures (LS) amplifies or attenuates the intensity of the small-scale structures (SS) through scale linkage. In order to identify the variation of the amplitude of the SS motion, the envelope of the fluctuations needs to be determined. Mathis et al. (2009) proposed defining this envelope by low-pass filtering the modulus of the analytic signal built from the Hilbert transform of the SS. The validity of this definition, as a basis for quantifying the modulated SS signal, is re-examined on the basis of DNS data for a channel flow. The analysis shows that the modulus of the analytic signal is very sensitive to the skewness of its PDF, which depends, in turn, on the sign of the LS fluctuation and thus on whether these fluctuations are associated with sweeps or ejections. The conclusion is that generating an envelope by use of a low-pass filtering step leads to an important loss of information associated with the effects of the local skewness of the PDF of the SS on the modulation process. An improved Hilbert-transform-based method is proposed to characterize the modulation of SS turbulence by LS structures.
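
    A minimal sketch of the envelope construction being re-examined, using standard SciPy tools (the sampling rate, cutoff, and toy signal are placeholders, not values from the study):

        import numpy as np
        from scipy.signal import hilbert, butter, filtfilt

        fs = 1000.0                      # sampling frequency of the SS signal (assumed)
        t = np.arange(0.0, 1.0, 1.0 / fs)
        # toy small-scale signal: a high-frequency carrier with slow amplitude modulation
        ss = (1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)) * np.sin(2 * np.pi * 120 * t)

        analytic = hilbert(ss)           # analytic signal via the Hilbert transform
        envelope = np.abs(analytic)      # modulus = instantaneous amplitude of the SS

        # low-pass filter the modulus so only large-scale variations remain,
        # as in the Mathis et al. (2009) definition discussed above
        b, a = butter(4, 10.0, btype="low", fs=fs)   # 10 Hz cutoff (placeholder)
        ls_envelope = filtfilt(b, a, envelope)

    The paper's point is that this final low-pass step discards the skewness information that differs between sweeps and ejections, which motivates the improved variant it proposes.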

  11. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed on the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are described from tests of material resistance to non-ductile fracture, covering both base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  12. Infant Feeding Attitudes and Practices of Spanish Low-Risk Expectant Women Using the IIFAS (Iowa Infant Feeding Attitude Scale)

    Directory of Open Access Journals (Sweden)

    María del Carmen Suárez Cotelo

    2018-04-01

    Full Text Available The Iowa Infant Feeding Attitude Scale (IIFAS) has been shown to have good psychometric properties for English-speaking populations, but it has not been validated among low-risk pregnant women in Spain. The aim of this study was to assess the reliability and validity of the translated version of the IIFAS in order to examine infant feeding attitudes in Spanish women with an uncomplicated pregnancy. Low-risk expectant women (n = 297) were recruited from eight primary public health care centres in Galicia (Spain). Questionnaires including both socio-demographic and breastfeeding characteristics and items about infant feeding were administered during the third trimester. Participants were contacted by telephone during the postpartum period to obtain information regarding their infant feeding status. Prediction validity and internal consistency were assessed. The translated IIFAS (69.76 ± 7.75), which had good psychometric properties (Cronbach’s alpha = 0.785; area under the curve (AUC) of the receiver operating characteristic (ROC) curve = 0.841, CI95% = 0.735–0.948), showed more positive attitudes towards breastfeeding than towards formula feeding, especially among mothers who intended to exclusively breastfeed. This scale was also useful for inferring the intent to breastfeed and duration of breastfeeding. This study provides evidence that the IIFAS is a reliable and valid tool for assessing infant feeding attitudes in Spanish women with an uncomplicated pregnancy.

  13. Infant Feeding Attitudes and Practices of Spanish Low-Risk Expectant Women Using the IIFAS (Iowa Infant Feeding Attitude Scale).

    Science.gov (United States)

    Cotelo, María Del Carmen Suárez; Movilla-Fernández, María Jesús; Pita-García, Paula; Novío, Silvia

    2018-04-22

    The Iowa Infant Feeding Attitude Scale (IIFAS) has been shown to have good psychometric properties for English-speaking populations, but it has not been validated among low-risk pregnant women in Spain. The aim of this study was to assess the reliability and validity of the translated version of the IIFAS in order to examine infant feeding attitudes in Spanish women with an uncomplicated pregnancy. Low-risk expectant women (n = 297) were recruited from eight primary public health care centres in Galicia (Spain). Questionnaires including both socio-demographic and breastfeeding characteristics and items about infant feeding were administered during the third trimester. Participants were contacted by telephone during the postpartum period to obtain information regarding their infant feeding status. Prediction validity and internal consistency were assessed. The translated IIFAS (69.76 ± 7.75), which had good psychometric properties (Cronbach's alpha = 0.785; area under the curve (AUC) of the receiver operating characteristic (ROC) curve = 0.841, CI 95% = 0.735–0.948), showed more positive attitudes towards breastfeeding than towards formula feeding, especially among mothers who intended to exclusively breastfeed. This scale was also useful for inferring the intent to breastfeed and duration of breastfeeding. This study provides evidence that the IIFAS is a reliable and valid tool for assessing infant feeding attitudes in Spanish women with an uncomplicated pregnancy.
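
    For reference, the internal-consistency statistic reported above can be computed from raw item responses as follows (a generic implementation of Cronbach's alpha, not code from the study):

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1).sum()
            total_variance = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_variances / total_variance)

        # toy example: 5 respondents answering 4 Likert items (hypothetical data)
        scores = np.array([[4, 5, 4, 5],
                           [2, 3, 2, 2],
                           [5, 5, 4, 4],
                           [3, 3, 3, 4],
                           [1, 2, 1, 2]])
        print(round(cronbach_alpha(scores), 3))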

  14. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  15. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Full Text Available Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES]) …

  16. Large scale spatial risk and comparative prevalence of Borrelia miyamotoi and Borrelia burgdorferi sensu lato in Ixodes pacificus.

    Directory of Open Access Journals (Sweden)

    Kerry Padgett

    Full Text Available Borrelia miyamotoi is a newly described emerging pathogen transmitted to people by Ixodes species ticks and found in temperate regions of North America, Europe, and Asia. There is limited understanding of large-scale entomological risk patterns of B. miyamotoi and of Borrelia burgdorferi sensu stricto (ss), the agent of Lyme disease, in western North America. In this study, B. miyamotoi, a relapsing fever spirochete, was detected in adult (n = 70) and nymphal (n = 36) Ixodes pacificus ticks collected from 24 of 48 California counties that were surveyed over a 13-year period. Statewide prevalence of B. burgdorferi sensu lato (sl), which includes B. burgdorferi ss, and of B. miyamotoi were similar in adult I. pacificus (0.6% and 0.8%, respectively). In contrast, the prevalence of B. burgdorferi sl was almost 2.5 times higher than that of B. miyamotoi in nymphal I. pacificus (3.2% versus 1.4%). These results suggest similar risk of exposure to B. burgdorferi sl and B. miyamotoi from adult I. pacificus tick bites in California, but a higher risk of contracting B. burgdorferi sl than B. miyamotoi from nymphal tick bites. While regional risk of exposure to these two spirochetes varies, the highest risk for both species is found in north and central coastal California and the Sierra Nevada foothill region, and the lowest risk is in southern California; nevertheless, tick-bite avoidance measures should be implemented in all regions of California. This is the first study to comprehensively evaluate entomologic risk for B. miyamotoi and B. burgdorferi for both adult and nymphal I. pacificus, an important human-biting tick in western North America.

  17. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

    The major concern in electric grid operation is operating in the most economical and reliable fashion to ensure affordability and continuity of electricity supply. This dissertation investigates challenges that affect electric grid reliability and economic operations: 1. congestion of transmission lines, 2. transmission line expansion, 3. large-scale wind energy integration, and 4. optimal placement of Phasor Measurement Units (PMUs) for highest electric grid observability. Congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, expansion of transmission line capacity must be evaluated with methods that ensure optimal electric grid operation, enabling grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject, and next-generation electric grids require novel methodologies for studying and managing congestion. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission line systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission line capacity to add. Traditional questions requiring answers are where to add, how much transmission line capacity to add, and at which voltage level. Because of electric grid deregulation, transmission line expansion is more complicated, as building new transmission lines is now open to investors whose main interest is to generate revenue. Adding new transmission capacity will help the system to relieve the transmission system congestion, create …
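
    The congestion screening described above can be illustrated with a toy DC power flow, the standard linearized model used for such studies (the three-bus network, injections, and limits below are invented for illustration; they are not data from the dissertation):

        import numpy as np

        # lines: (from_bus, to_bus, reactance x in p.u., thermal limit in MW)
        lines = [(0, 1, 0.1, 100.0), (1, 2, 0.2, 80.0), (0, 2, 0.2, 50.0)]
        P = np.array([120.0, -40.0, -80.0])  # net MW injections; bus 0 is the slack

        n = 3
        B = np.zeros((n, n))                 # DC susceptance matrix
        for f, t, x, _ in lines:
            B[f, f] += 1.0 / x
            B[t, t] += 1.0 / x
            B[f, t] -= 1.0 / x
            B[t, f] -= 1.0 / x

        theta = np.zeros(n)                  # bus angles, slack fixed at zero
        theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

        for f, t, x, limit in lines:
            flow = (theta[f] - theta[t]) / x
            flag = "  <-- congested" if abs(flow) > limit else ""
            print(f"line {f}-{t}: {flow:+7.1f} MW (limit {limit:.0f} MW){flag}")

    In this toy case the 0-2 line carries 56 MW against a 50 MW limit, the kind of violation that would motivate a capacity addition at that corridor.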

  18. Punishment sustains large-scale cooperation in prestate warfare

    Science.gov (United States)

    Mathew, Sarah; Boyd, Robert

    2011-01-01

    Understanding cooperation and punishment in small-scale societies is crucial for explaining the origins of human cooperation. We studied warfare among the Turkana, a politically uncentralized, egalitarian, nomadic pastoral society in East Africa. Based on a representative sample of 88 recent raids, we show that the Turkana sustain costly cooperation in combat at a remarkably large scale, at least in part, through punishment of free-riders. Raiding parties comprised several hundred warriors, and participants are not kin or day-to-day interactants. Warriors incur substantial risk of death and produce collective benefits. Cowardice and desertions occur, and are punished by community-imposed sanctions, including collective corporal punishment and fines. Furthermore, Turkana norms governing warfare benefit the ethnolinguistic group, a population of a half-million people, at the expense of smaller social groupings. These results challenge current views that punishment is unimportant in small-scale societies and that human cooperation evolved in small groups of kin and familiar individuals. Instead, these results suggest that cooperation at the larger scale of ethnolinguistic units enforced by third-party sanctions could have a deep evolutionary history in the human species. PMID:21670285

  19. Accident of Large-scale Wind Turbines Disconnecting from Power Grid and Its Protection

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    There were many accidents of large-scale wind turbines disconnecting from the power grid in 2011. Because single-phase-to-ground faults could not be correctly detected, they evolved into phase-to-phase faults. Phase-to-phase faults were isolated slowly, leading to low voltage, and wind turbines without sufficient low voltage ride-through capability had to be disconnected from the grid. After some wind turbines disconnected, overvoltage caused by surplus reactive power forced more turbines off the grid. Based on the accident analysis, this paper presents solutions to the above problems, including travelling-wave-based single-phase-to-ground protection, adaptive low voltage protection, integrated protection and control, and high impedance fault detection. The solutions lay theoretical and technological foundations for preventing large-scale wind turbines from disconnecting from the operating power grid.

  20. Large Scale Screening of Low Cost Ferritic Steel Designs For Advanced Ultra Supercritical Boiler Using First Principles Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, Lizhi [Tennessee State Univ. Nashville, TN (United States)

    2016-11-29

    The Advanced Ultra Supercritical Boiler (AUSC) requires materials that can operate in a corrosive environment at temperature and pressure as high as 760°C (1400°F) and 5,000 psi, respectively, while maintaining good ductility at low temperature. We developed automated simulation software tools to enable fast, large-scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility is currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stacking fault energy can be used to assess relative ductility and creep strength. We implemented software to automate the complex calculations and minimize human input in the tedious screening studies, which involve model structure generation, setup of first principles calculations, results analysis, and reporting. The software developed in the project, together with the library of computed mechanical properties of phases found in ferritic steels (many of them complex solid solutions estimated for the first time), will certainly help the development of low-cost ferritic steel for AUSC.

  1. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    … from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  2. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
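
    The tile low-rank idea can be sketched with a truncated SVD of a single off-diagonal tile (a generic NumPy illustration of the compression step, not the HiCMA API):

        import numpy as np

        def compress_tile(tile, tol=1e-8):
            """Approximate a data-sparse tile A ~= U @ V, keeping singular values above tol."""
            U, s, Vt = np.linalg.svd(tile, full_matrices=False)
            rank = int(np.sum(s > tol * s[0]))        # numerical rank at accuracy tol
            return U[:, :rank] * s[:rank], Vt[:rank]  # factors of shape (m, k) and (k, n)

        # a smooth kernel evaluated between well-separated point clusters yields a
        # numerically low-rank tile, as for the off-diagonal blocks described above
        x = np.linspace(0.0, 1.0, 256)
        y = np.linspace(10.0, 11.0, 256)
        tile = 1.0 / np.abs(x[:, None] - y[None, :])
        U, V = compress_tile(tile)
        print(U.shape, V.shape)   # rank k << 256: storage drops from m*n to k*(m+n)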

  3. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.

  4. Measuring Risk Perception in Later Life: The Perceived Risk Scale.

    Science.gov (United States)

    Lifshitz, Rinat; Nimrod, Galit; Bachner, Yaacov G

    2016-11-01

    Risk perception is a subjective assessment of the actual or potential threat to one's life or, more broadly, to one's psychological well-being. Given the various risks associated with later life, a valid and reliable integrative screening tool for assessing risk perception among the elderly is warranted. The study examined the psychometric properties and factor structure of a new integrative risk perception instrument, the Perceived Risk Scale. This eight-item measure refers to various risks simultaneously, including terror, health issues, traffic accidents, violence, and financial loss, and was developed specifically for older adults. An online survey was conducted with 306 participants aged 50 years and older. The scale was examined using exploratory factor analysis and concurrent validity testing. Factor analysis revealed a two-factor structure: later-life risks and terror risks. A high percentage of explained variance, as well as internal consistency, was found for the entire scale and for both factors. Concurrent validity was supported by significant positive associations with participants' depression and negative correlations with their life satisfaction. These findings suggest that the Perceived Risk Scale is internally reliable, valid, and appropriate for evaluating risk perception in later life. The scale's potential applications are discussed. © The Author(s) 2016.

  5. Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers

    Science.gov (United States)

    Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi

    2018-03-01

    Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding, or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets, without additional approximations, for up to a thousand atoms. With this method, hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution of the same order of magnitude as that of traditional semilocal GGA functionals. The method is implemented in a portable open-source library.
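
    The computational bottleneck referred to is the exact (Fock) exchange energy, which in its standard textbook form (spin factors omitted) reads:

        E_x = -\frac{1}{2} \sum_{i,j}^{\rm occ} \iint
        \frac{\psi_i^*(\mathbf{r})\,\psi_j(\mathbf{r})\,\psi_j^*(\mathbf{r}')\,\psi_i(\mathbf{r}')}
             {|\mathbf{r} - \mathbf{r}'|}\, d\mathbf{r}\, d\mathbf{r}'

    Its cost grows roughly as the square of the number of occupied orbitals times the cost of a Poisson solve, which is why a multi-GPU implementation is what makes thousand-atom hybrid calculations tractable.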

  6. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (e.g., in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  7. Historicizing affordance theory

    DEFF Research Database (Denmark)

    Pedersen, Sofie; Bang, Jytte Susanne

    2017-01-01

    The aim of this article is to discuss how mutually enriching points from both affordance theory and cultural-historical activity theory can promote theoretical ideas which may prove useful as analytical tools for the study of human life and human development. There are two issues that need to be overcome in order to explore the potentials of James Gibson’s affordance theory: it does not sufficiently theorize (a) development and (b) society. We claim that Gibson’s affordance theory still needs to be brought beyond “the axiom of immediacy.” Ambivalences in Gibson’s affordance theory … societal character of affordance theory.

  8. Preventive dentistry: practitioners' recommendations for low-risk patients compared with scientific evidence and practice guidelines.

    Science.gov (United States)

    Frame, P S; Sawai, R; Bowen, W H; Meyerowitz, C

    2000-02-01

    The purpose of this article is to compare published evidence supporting procedures to prevent dental caries and periodontal disease, in low-risk patients, with the actual preventive recommendations of practicing dentists. Methods included (1) a survey questionnaire of general dentists practicing in western New York State concerning the preventive procedures they would recommend and at what intervals for low-risk children, young adults, and older adults; and (2) review of the published, English-language literature for evidence supporting preventive dental interventions. The majority of dentists surveyed recommended semiannual visits for visual examination and probing to detect caries (73% to 79%), and scaling and polishing to prevent periodontal disease (83% to 86%) for low-risk patients of all ages. Bite-wing radiographs were recommended for all age groups at annual or semiannual intervals. In-office fluoride applications were recommended for low-risk children at intervals of 6 to 12 months by 73% of dentists but were recommended for low-risk older persons by only 22% of dentists. Application of sealants to prevent pit and fissure caries was recommended for low-risk children by 22% of dentists. Literature review found no studies comparing different frequencies of dental examinations and bite-wing radiographs to determine the optimal screening interval in low-risk patients. Two studies of the effect of scaling and polishing on the prevention of periodontal disease found no benefit from more frequent than annual treatments. Although fluoride is clearly a major reason for the decline in the prevalence of dental caries, there are no studies of the incremental benefit of in-office fluoride treatments for low-risk patients exposed to fluoridated water and using fluoridated toothpaste. Comparative studies using outcome end points are needed to determine the optimal frequency of dental examinations and bite-wing radiographs for the early detection of caries, and of scaling …

  9. Mesoderm Lineage 3D Tissue Constructs Are Produced at Large-Scale in a 3D Stem Cell Bioprocess.

    Science.gov (United States)

    Cha, Jae Min; Mantalaris, Athanasios; Jung, Sunyoung; Ji, Yurim; Bang, Oh Young; Bae, Hojae

    2017-09-01

    Various studies have presented different approaches to directing pluripotent stem cell differentiation, such as applying defined sets of exogenous biochemical signals and genetic/epigenetic modifications. Although differentiation to target lineages can be successfully regulated, such conventional methods are often complicated, laborious, and not cost-effective enough to be employed for the large-scale production of 3D stem cell-based tissue constructs. A 3D-culture platform that could realize the large-scale production of mesoderm lineage tissue constructs from embryonic stem cells (ESCs) is developed. ESCs are cultured using our previously established 3D-bioprocess platform, which is amenable to mass production of 3D ESC-based tissue constructs. Hepatocarcinoma cell line conditioned medium is introduced to the large-scale 3D culture to provide a specific biomolecular microenvironment mimicking the in vivo mesoderm formation process. After a 5-day spontaneous differentiation period, the resulting 3D tissue constructs are composed of multipotent mesodermal progenitor cells, as verified by gene and molecular expression profiles. Subsequently, the optimal time points to trigger terminal differentiation towards cardiomyogenesis or osteogenesis from the mesodermal tissue constructs are found. A simple and affordable 3D ESC bioprocess that achieves scalable production of mesoderm-origin tissues with significantly improved corresponding tissue properties is demonstrated. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. The three-point function as a probe of models for large-scale structure

    International Nuclear Information System (INIS)

    Frieman, J.A.; Gaztanaga, E.

    1993-01-01

    The authors analyze the consequences of models of structure formation for higher-order (n-point) galaxy correlation functions in the mildly non-linear regime. Several variations of the standard Ω = 1 cold dark matter model with scale-invariant primordial perturbations have recently been introduced to obtain more power on large scales, R_p ∼ 20 h⁻¹ Mpc, e.g., low-matter-density (non-zero cosmological constant) models, 'tilted' primordial spectra, and scenarios with a mixture of cold and hot dark matter. They also include models with an effective scale-dependent bias, such as the cooperative galaxy formation scenario of Bower, et al. The authors show that higher-order (n-point) galaxy correlation functions can provide a useful test of such models and can discriminate between models with true large-scale power in the density field and those where the galaxy power arises from scale-dependent bias: a bias with rapid scale-dependence leads to a dramatic decrease of the hierarchical amplitudes Q_J at large scales, r ≳ R_p. Current observational constraints on the three-point amplitudes Q_3 and S_3 can place limits on the bias parameter(s) and appear to disfavor, but not yet rule out, the hypothesis that scale-dependent bias is responsible for the extra power observed on large scales.
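
    For reference, the hierarchical amplitudes referred to are defined from the two- and three-point correlation functions ξ and ζ in the standard way:

        S_3 = \frac{\langle \delta^3 \rangle}{\langle \delta^2 \rangle^2},
        \qquad
        Q_3 = \frac{\zeta_{123}}{\xi_{12}\xi_{23} + \xi_{23}\xi_{31} + \xi_{31}\xi_{12}}

    For a purely linear bias b, the galaxy Q_3 scales as Q_3/b to leading order, so a bias that varies rapidly with scale imprints a corresponding scale dependence on the Q_J, which is the signature exploited above.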

  11. To be an affordable healthy house, case study Medan

    Science.gov (United States)

    Silitonga, Shanty

    2018-03-01

    A house has paramount meaning in human life, and the provision of adequate housing can improve quality of life. Providing affordable houses is a major step towards fulfilling housing needs in a big city. Medan has built many affordable houses, mostly in the suburbs. Although affordable houses are intended for low-income people, they must be physically adequate, affordable within budget, and healthy for their users. Housing is often seen in physical terms alone, with provision aiming at quantity regardless of quality. This study examines the condition of affordable houses in the suburbs of Medan. The research method is qualitative descriptive, using indicators from the affordable healthy house standard set out in Indonesian regulation and other related theories. The study took place in three areas in the suburbs of Medan. The results show that most affordable houses in the suburbs of Medan are unhealthy. Several design recommendations would bring the houses into the affordable healthy house category; the most important is the addition of ventilation and window openings.

  12. Implementing large-scale programmes to optimise the health workforce in low- and middle-income settings: a multicountry case study synthesis.

    Science.gov (United States)

    Gopinathan, Unni; Lewin, Simon; Glenton, Claire

    2014-12-01

    To identify factors affecting the implementation of large-scale programmes to optimise the health workforce in low- and middle-income countries. We conducted a multicountry case study synthesis. Eligible programmes were identified through consultation with experts and using Internet searches. Programmes were selected purposively to match the inclusion criteria. Programme documents were gathered via Google Scholar and PubMed and from key informants. The SURE Framework - a comprehensive list of factors that may influence the implementation of health system interventions - was used to organise the data. Thematic analysis was used to identify the key issues that emerged from the case studies. Programmes from Brazil, Ethiopia, India, Iran, Malawi, Venezuela and Zimbabwe were selected. Key system-level factors affecting the implementation of the programmes were related to health worker training and continuing education, management and programme support structures, the organisation and delivery of services, community participation, and the sociopolitical environment. Existing weaknesses in health systems may undermine the implementation of large-scale programmes to optimise the health workforce. Changes in the roles and responsibilities of cadres may also, in turn, have impacts throughout the health system. © 2014 John Wiley & Sons Ltd.

  13. Learning Grasp Affordance Densities

    DEFF Research Database (Denmark)

    Detry, Renaud; Kraft, Dirk; Kroemer, Oliver

    2011-01-01

    We address the issue of learning and representing object grasp affordance models. We model grasp affordances with continuous probability density functions (grasp densities) which link object-relative grasp poses to their success probability. The underlying function representation is nonparametric and relies on kernel density estimation to provide a continuous model. Grasp densities are learned and refined from exploration, by letting a robot “play” with an object in a sequence of grasp-and-drop actions: the robot uses visual cues to generate a set of grasp hypotheses; it then executes these and records their outcomes. When a satisfactory number of grasp data is available, an importance-sampling algorithm turns these into a grasp density. We evaluate our method in a largely autonomous learning experiment run on three objects of distinct shapes. The experiment shows how learning increases success …
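
    A minimal sketch of the kernel-density step (illustrative only: it uses 3-D grasp positions and SciPy's Euclidean Gaussian KDE, whereas the paper's densities live on the full 6-DOF pose space and need orientation-aware kernels):

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)

        # hypothetical outcomes of grasp-and-drop exploration: each row is an
        # object-relative grasp position, with an associated success weight
        grasps = rng.normal(loc=[0.0, 0.05, 0.12], scale=0.02, size=(200, 3))
        success = rng.uniform(0.0, 1.0, size=200)   # placeholder success scores

        # grasp density: success-weighted KDE over the grasp parameters
        density = gaussian_kde(grasps.T, weights=success)

        candidates = density.resample(10).T         # draw new grasp hypotheses
        scores = density(candidates.T)              # density value of each draw
        best = candidates[np.argmax(scores)]        # most promising grasp to try next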

  14. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  15. Active self-testing noise measurement sensors for large-scale environmental sensor networks.

    Science.gov (United States)

    Domínguez, Federico; Cuong, Nguyen The; Reinoso, Felipe; Touhafi, Abdellah; Steenhaut, Kris

    2013-12-13

    Large-scale noise pollution sensor networks consist of hundreds of spatially distributed microphones that measure environmental noise. These networks provide historical and real-time environmental data to citizens and decision makers and are therefore a key technology to steer environmental policy. However, the high cost of certified environmental microphone sensors render large-scale environmental networks prohibitively expensive. Several environmental network projects have started using off-the-shelf low-cost microphone sensors to reduce their costs, but these sensors have higher failure rates and produce lower quality data. To offset this disadvantage, we developed a low-cost noise sensor that actively checks its condition and indirectly the integrity of the data it produces. The main design concept is to embed a 13 mm speaker in the noise sensor casing and, by regularly scheduling a frequency sweep, estimate the evolution of the microphone's frequency response over time. This paper presents our noise sensor's hardware and software design together with the results of a test deployment in a large-scale environmental network in Belgium. Our middle-range-value sensor (around €50) effectively detected all experienced malfunctions, in laboratory tests and outdoor deployments, with a few false positives. Future improvements could further lower the cost of our sensor below €10.
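
    The self-test idea, estimating the microphone's frequency response from a scheduled sweep, can be sketched with generic SciPy signal processing (all parameters are placeholders, not the deployed sensor's firmware):

        import numpy as np
        from scipy.signal import chirp, csd, welch

        fs = 44100
        t = np.arange(0.0, 2.0, 1.0 / fs)
        sweep = chirp(t, f0=100, t1=2.0, f1=10000, method="logarithmic")

        # in the real sensor the sweep is played through the embedded speaker and
        # re-recorded by the microphone under test; a fake flat response stands in here
        recorded = 0.8 * sweep + 0.01 * np.random.randn(t.size)

        # H1 transfer-function estimate: H(f) = P_xy(f) / P_xx(f)
        f, Pxx = welch(sweep, fs=fs, nperseg=4096)
        _, Pxy = csd(sweep, recorded, fs=fs, nperseg=4096)
        H = np.abs(Pxy / Pxx)

        # compare with the response stored at installation time; drift beyond a
        # +/- 3 dB band inside the sweep range flags a degrading microphone
        reference = np.full_like(H, 0.8)
        band = (f > 200) & (f < 8000)
        degraded = np.any(np.abs(20 * np.log10(H[band] / reference[band])) > 3.0)
        print("self-test failed" if degraded else "microphone response nominal")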

  16. The new affordances in the home environment for motor development - infant scale (AHEMD-IS): Versions in English and Portuguese languages

    Science.gov (United States)

    Caçola, Priscila M.; Gabbard, Carl; Montebelo, Maria I. L.; Santos, Denise C. C.

    2015-01-01

    The home environment has been established as a crucial factor for motor development, especially in infants. Exploring the home environment can have significant implications for intervention, as it is common practice in physical therapy to have professionals advise patients on home activities. Since 2010, our group has been working on the development of the Affordances in the Home Environment for Motor Development - Infant Scale (AHEMD-IS), a parental self-reporting instrument designed to assess the quality and quantity of factors (affordances) in the home environment. In Brazil, the instrument has been translated as "Affordances no Ambiente Domiciliar para o Desenvolvimento Motor - Escala Bebê", and it has been extensively used in several studies that address infant development. These studies in Brazil and other parts of the world highlighted the need for a normative sample and a standardized scoring system. A description of the study that addressed that need, along with the English version of the questionnaire and score sheets, was recently published in the well-known and respected journal Physical Therapy. Our intent with the present short communication is to notify Brazilian investigators and clinicians of this latest update so they can download the new instrument, as well as to present the Brazilian (Portuguese) version of the AHEMD-IS along with its scoring system. PMID:26647753

  17. The trade-off between expected risk and the potential for large accidents

    International Nuclear Information System (INIS)

    Niehaus, F.; de Leon, G.; Cullingford, M.

    1984-01-01

    This paper, by Niehaus, de Leon, and Cullingford, examines the relationship between expected risk and the potential for large accidents. Using historical data for airplane accidents from 1947-1980, the authors show that a reduction of the expected value of risk often results in an increase in the potential for catastrophe. Phrased differently, the authors suggest that there may be a necessary trade-off between high-probability/low-consequence risk and low-probability/high-consequence risk. The implications of this study for energy systems, for oil and gas supply, and for nuclear systems are also given.

  18. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields, including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  19. Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water.

    Directory of Open Access Journals (Sweden)

    Nicklas Blomquist

    Full Text Available The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite with a production rate exceeding 500 g h⁻¹ at an energy consumption of about 10 Wh g⁻¹. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39 × 10⁻⁴ Ω·m and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 liters was processed in less than 4 hours. The design of the process allows scaling to even larger volumes, and the low energy consumption indicates a low-cost process.

  20. Affordances of Ditches for Children in Preschool

    DEFF Research Database (Denmark)

    Lerstrup, Inger Elisabeth; Møller, Maja Steen

    2016-01-01

    This study aims to expand understanding of the affordances provided by ditches in a Danish preschool context. Affordances are defined as the meaningful action possibilities of the environment. At a forest preschool, a group of 21 children aged approximately 3 to 6.5 years, accompanied by two to three adults, was observed; the ditches offered varied and changing action possibilities for the preschool children. The paper discusses the possible incorporation of this largely unrecognized design element by planners and managers of green spaces and playgrounds for children in preschool.

  1. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems … the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological … limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its …

  2. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  3. Medicine prices, availability, and affordability in the Shaanxi Province in China: implications for the future.

    Science.gov (United States)

    Jiang, Minghuan; Zhou, Zhongliang; Wu, Lina; Shen, Qian; Lv, Bing; Wang, Xiao; Yang, Shimin; Fang, Yu

    2015-02-01

    In 2009, China implemented the National Essential Medicines System (NEMS) to improve access to high-quality low-cost essential medicines. To measure the prices, availability and affordability of medicines in China following the implementation of the NEMS. 120 public hospitals and 120 private pharmacies in ten cities in Shaanxi Province, Western China. The standardized methodology developed by the World Health Organization and Health Action International was used to collect data on prices and availability of 49 medicines. Median price ratio; availability as a percentage; cost of course of treatment in days' wages of the lowest-paid government workers. In the public hospitals, originator brands (OBs) were procured at 8.89 times the international reference price, more than seven times higher than the lowest-priced generics (LPGs). Patients paid 11.83 and 1.69 times the international reference prices for OBs and generics respectively. A similar result was observed in the private pharmacies. The mean availabilities of OBs and LPGs were 7.1% and 20.0% in the public hospitals, and 12.6% and 29.2% in the private pharmacies. Treatment with OBs is therefore largely unaffordable, but the affordability of the LPGs is generally good. High prices and low availability of survey medicines were observed. The affordability of generics, but not OBs, is reasonable. Effective measures should be taken to reduce medicine prices and improve availability and affordability in Shaanxi Province.
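
    The two headline metrics follow directly from the WHO/HAI methodology named above; the sketch below uses invented figures for a single medicine to show the arithmetic, not survey data:

        # median price ratio (MPR): local median unit price over the
        # international reference price (hypothetical figures)
        local_unit_price = 0.85      # CNY per tablet
        reference_price = 0.50       # CNY-converted international reference price
        mpr = local_unit_price / reference_price       # 1.7 times the reference

        # affordability: cost of a standard treatment course expressed in
        # days' wages of the lowest-paid government worker
        daily_wage = 50.0            # CNY per day (hypothetical)
        course_cost = 30 * local_unit_price            # 30-day course, 1 tablet/day
        days_of_wages = course_cost / daily_wage       # 0.51 days' wages
        print(f"MPR = {mpr:.2f}, affordability = {days_of_wages:.2f} days' wages")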

  4. 18/20 T high magnetic field scanning tunneling microscope with fully low voltage operability, high current resolution, and large scale searching ability.

    Science.gov (United States)

    Li, Quanfeng; Wang, Qi; Hou, Yubin; Lu, Qingyou

    2012-04-01

    We present a home-built 18/20 T high magnetic field scanning tunneling microscope (STM) featuring fully low voltage (lower than ±15 V) operability at low temperatures, large-scale searching ability, and 20 fA high current resolution (measured by replacing the tip-sample junction with a 100 GΩ dummy resistor) with a bandwidth of 3.03 kHz. To accomplish low voltage operation, which is important for achieving high precision, low noise, and low interference with the strong magnetic field, the coarse approach is implemented with an inertial slider driven by the lateral bending of a piezoelectric scanner tube (PST) whose inner electrode is axially split into two for enhanced bending per volt. The PST can also drive the same sliding piece to inertially slide in the other bending direction (along the sample surface) of the PST, which realizes the large-area searching ability. The STM head is housed in a three-segment tubular chamber, which is detachable near the STM head for the convenience of sample and tip changes. Atomic resolution images of a graphite sample taken at 17.6 T and 18.0001 T are presented to show its performance. © 2012 American Institute of Physics

  5. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, aimed at ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, entrusted by the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents through joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  6. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  7. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10⁵ and 10⁸ and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.
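
    For reference, the control parameter is the standard Rayleigh number (textbook definition):

        \mathrm{Ra} = \frac{g\,\alpha\,\Delta T\,H^{3}}{\nu\,\kappa}

    with g gravity, α the thermal expansion coefficient, ΔT the imposed temperature difference, H the layer depth, ν the kinematic viscosity, and κ the thermal diffusivity; the simulations above span Ra = 10⁵ to 10⁸.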

  8. Network Affordances

    DEFF Research Database (Denmark)

    Samson, Audrey; Soon, Winnie

    2015-01-01

    This paper examines the notion of network affordance within the context of network art. Building on Gibson's theory (Gibson, 1979) we understand affordance as the perceived and actual parameters of a thing. We expand on Gaver's affordance of predictability (Gaver, 1996) to include ecological...... and computational parameters of unpredictability. We illustrate the notion of unpredictability by considering four specific works that were included in a network art exhibition, SPEED SHOW [2.0] Hong Kong. The paper discusses how the artworks are contingent upon the parametric relations (Parisi, 2013......), of the network. We introduce network affordance as a dynamic framework that could articulate the experienced tension arising from the (visible) symbolic representation of computational processes and its hidden occurrences. We base our proposal on the experience of both organising the SPEED SHOW and participating...

  9. Large scale organized motion in isothermal swirling flow through an axisymmetric dump combustor

    International Nuclear Information System (INIS)

    Daddis, E.D.; Lieber, B.B.; Nejad, A.S.; Ahmed, S.A.

    1990-01-01

    This paper reports on velocity measurements obtained, by means of a two-component laser Doppler velocimeter (LDV), in a model axisymmetric dump combustor that included a coaxial swirler, at a Reynolds number of 125,000. The frequency spectrum of the velocity fluctuations is obtained via the Fast Fourier Transform (FFT). The velocity field downstream of the dump plane is characterized, in addition to background turbulence, by large scale organized structures which are manifested as sharp spikes of the spectrum at relatively low frequencies. The decomposition of velocity disturbances into background turbulence and large scale structures can then be achieved through spectral methods which include matched filters and spectral factorization. These methods are demonstrated here for the axial velocity obtained one step height downstream of the dump plane. Subsequent analysis of the various velocity disturbances shows that large scale structures account for about 25% of the apparent normal stresses at this particular location. Naturally, large scale structures evolve spatially and their contribution to the apparent stress tensor may vary depending on the location in the flow field
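
    The spectral bookkeeping behind a figure like the 25% above can be illustrated with a short sketch: estimate the one-sided power spectrum of a velocity signal via the FFT and take the share of the variance in a low-frequency band. The signal, sampling rate, and 50 Hz band edge below are hypothetical stand-ins, not values from the record.

        import numpy as np

        fs = 1000.0                                    # sampling rate, Hz (assumed)
        t = np.arange(0, 20.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        # synthetic axial velocity: low-frequency organized motion + broadband turbulence
        u = 10.0 + 0.8 * np.sin(2.0 * np.pi * 25.0 * t) + 0.5 * rng.standard_normal(t.size)

        up = u - u.mean()                              # velocity fluctuation
        U = np.fft.rfft(up)
        freq = np.fft.rfftfreq(up.size, d=1.0 / fs)
        psd = 2.0 * np.abs(U) ** 2 / (fs * up.size)    # one-sided periodogram

        band = freq < 50.0                             # hypothetical "large-scale" band
        frac = psd[band].sum() / psd.sum()             # its share of the apparent normal stress
        print(f"large-scale share of u'u': {frac:.2f}")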

  10. Innovation-driven efficient development of the Longwangmiao Fm large-scale sulfur gas reservoir in Moxi block, Sichuan Basin

    Directory of Open Access Journals (Sweden)

    Xinhua Ma

    2016-03-01

    The Lower Cambrian Longwangmiao Fm gas reservoir in the Moxi block of the Anyue Gas field, Sichuan Basin, is the largest single-sandbody integrated carbonate gas reservoir proved so far in China. Notwithstanding this reservoir's advantages, such as large-scale reserves and high single-well productivity, multiple complicated factors restrict its efficient development: a medium content of hydrogen sulfide, low porosity and strong heterogeneity of the fracture–cave formation, various modes of gas–water occurrence, and a close relation between overpressure and stress sensitivity. Since only a few large-scale Cambrian carbonate gas reservoirs have ever been developed in the world, blind spots remain, especially regarding exploration and production rules. Besides, for large-scale sulfur gas reservoirs, exploration and construction are costly, and production testing in the early evaluation stage is severely limited, all of which brings great challenges to productivity construction and high potential risks. In this regard, in line with China's strategic demand of strengthening clean energy supply security, the PetroChina Southwest Oil & Gas Field Company has carried out research and field tests aimed at providing high-production wells, optimizing development design, rapidly constructing high-quality productivity and upgrading HSE security in the Longwangmiao Fm gas reservoir in the Moxi block. Through innovations in technology and management mode within 3 years, this gas reservoir has been built into a modern large-scale gas field with high quality, high efficiency and high benefit, and its annual capacity is now over 100 × 10^8 m^3, with the desirable production capacity and development indexes originally anticipated. It has become a new model of efficient development of large-scale gas reservoirs, providing a reference for other types of gas reservoirs in China.

  11. Affordances in activity theory and cognitive systems engineering

    DEFF Research Database (Denmark)

    Albrechtsen, H.; Andersen, H.H.K.; Bødker, S.

    2001-01-01

    on design for low level interaction modalities. To incorporate the concept of affordances in the design of human computer interaction it is necessary to systematically unravel affordances that support human action possibilities. Furthermore, it is a necessity that Gibson's theory of affordances...... is supplemented by careful analyses of other human modalities and activities than visual perception. Within HMI two well established perspectives on HMI, Activity Theory (AT) and Cognitive Systems Engineering (CSE), have discussed such analyses and design of action possibilities focusing on providing computer...... to cover deeper semantic and pragmatic aspects of the ecology of work, as compared with the previous applications of Gibson's theory in HMI....

  12. Readmission After COPD Exacerbation Scale: determining 30-day readmission risk for COPD patients.

    Science.gov (United States)

    Lau, Christine Sm; Siracuse, Brianna L; Chamberlain, Ronald S

    2017-01-01

    COPD affects over 13 million Americans, and accounts for over half a million hospitalizations annually. The Hospital Readmission Reduction Program, established by the Affordable Care Act, requires the Centers for Medicare and Medicaid Services to reduce payments to hospitals with excess readmissions for COPD as of 2015. This study sought to develop a predictive readmission scale to identify COPD patients at higher readmission risk. Demographic and clinical data on 339,389 patients from New York and California (derivation cohort) and 258,113 patients from Washington and Florida (validation cohort) were abstracted from the State Inpatient Database (2006-2011), and the Readmission After COPD Exacerbation (RACE) Scale was developed to predict 30-day readmission risk. Thirty-day COPD readmission rates were 7.54% for the derivation cohort and 6.70% for the validation cohort. Factors including age 40-65 years (odds ratio [OR] 1.17; 95% CI, 1.12-1.21), male gender (OR 1.16; 95% CI, 1.13-1.19), African American (OR 1.11; 95% CI, 1.06-1.16), 1st income quartile (OR 1.10; 95% CI, 1.06-1.15), 2nd income quartile (OR 1.06; 95% CI, 1.02-1.10), Medicaid insured (OR 1.83; 95% CI, 1.73-1.93), Medicare insured (OR 1.45; 95% CI, 1.38-1.52), anemia (OR 1.05; 95% CI, 1.02-1.09), congestive heart failure (OR 1.06; 95% CI, 1.02-1.09), depression (OR 1.18; 95% CI, 1.14-1.23), drug abuse (OR 1.17; 95% CI, 1.09-1.25), and psychoses (OR 1.19; 95% CI, 1.13-1.25) were independently associated with increased readmission rates. The RACE Scale reliably predicts an individual patient's 30-day COPD readmission risk based on specific factors present at initial admission. By identifying these patients at high risk of readmission with the RACE Scale, patient-specific readmission-reduction strategies can be implemented to improve patient care as well as reduce readmissions and health care expenditures.
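
    As a hedged illustration of how odds ratios like those above can be turned into an additive risk score, the sketch below assigns each factor points proportional to log(OR). The point scheme and the example patient are hypothetical; the actual RACE Scale weighting is defined in the paper itself.

        import math

        # points proportional to log(odds ratio), rounded -- an illustrative scheme only
        odds_ratios = {
            "age_40_65": 1.17, "male": 1.16, "african_american": 1.11,
            "income_q1": 1.10, "income_q2": 1.06, "medicaid": 1.83,
            "medicare": 1.45, "anemia": 1.05, "chf": 1.06,
            "depression": 1.18, "drug_abuse": 1.17, "psychoses": 1.19,
        }
        points = {k: round(10.0 * math.log(r)) for k, r in odds_ratios.items()}

        def risk_points(factors):
            """Sum the points for the risk factors present at initial admission."""
            return sum(points[f] for f in factors)

        print(points["medicaid"], points["anemia"])             # heaviest vs lightest factor
        print(risk_points({"male", "medicaid", "depression"}))  # hypothetical example patient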

  13. A generic library for large scale solution of PDEs on modern heterogeneous architectures

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

    2012-01-01

    Adapting to new programming models for modern multi- and many-core architectures requires code-rewriting and changing algorithms and data structures, in order to achieve good efficiency and scalability. We present a generic library for solving large scale partial differential equations (PDEs......), capable of utilizing heterogeneous CPU/GPU environments. The library can be used for fast prototyping of PDE solvers, based on finite difference approximations of spatial derivatives in one, two, or three dimensions. In order to efficiently solve large scale problems, we keep memory consumption...... and memory access low, using a low-storage implementation of flexible-order finite difference operators. We will illustrate the use of library components by assembling such matrix-free operators to be used with one of the supported iterative solvers, such as GMRES, CG, Multigrid or Defect Correction...
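
    The matrix-free pattern described here can be sketched with SciPy (the record's library targets heterogeneous CPU/GPU environments; this CPU-only stand-in only illustrates the idea): a finite-difference operator is exposed as a matvec routine, never assembled as a matrix, and handed to an iterative Krylov solver.

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, cg

        n = 255                      # interior grid points on (0, 1), Dirichlet boundaries
        h = 1.0 / (n + 1)

        def neg_laplacian(u):
            """Matrix-free second-order finite-difference stencil for -u''."""
            r = 2.0 * u
            r[:-1] -= u[1:]
            r[1:] -= u[:-1]
            return r / h**2

        A = LinearOperator((n, n), matvec=neg_laplacian, dtype=float)
        x = np.linspace(h, 1.0 - h, n)
        f = np.pi**2 * np.sin(np.pi * x)          # manufactured: -u'' = f with u = sin(pi x)
        u, info = cg(A, f)                        # CG is among the solvers the record lists
        print(info, float(np.max(np.abs(u - np.sin(np.pi * x)))))  # 0 on success, O(h^2) error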

  14. Housing ownership and affordability among low-income society in the poorest sub-district of Semarang, Central Java, Indonesia

    Science.gov (United States)

    Indrianingrum, Lulut

    2017-03-01

    The Government has intervened with various affordable public housing programs, as well as financing programs, for low-income society in Indonesia. The characteristics of this group are so diverse across regions that housing programs for this social segment have difficulty reaching the right target. The Regulation of Housing and Settlement No. 2/2001 mandates that the State is obliged to provide habitable public housing, especially for low-income society. The purpose of this study is to explore low-income residents' preferences and their affordability of home ownership in the poorest sub-district of Semarang. Aspects studied include family conditions, financing, location, housing type and price. The research used a descriptive method to analyze a set of questionnaire data distributed to low-income residents in the Tanjungmas sub-district, the poorest in Semarang. The results showed that the respondents developed a vision of home ownership by saving money toward an allocated housing budget and taking out a bank installment plan. They tended to plan to get a house in their current neighborhood or nearby, or anywhere else within the same price range. They understood that, in order to get a better home and neighborhood, they would have to pay higher prices. Therefore, their housing criteria and standards were set based on the quality of life in their current residential area, and their preferred house should be located in a township (kampung).

  15. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    The mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube was made with a large-scale hollow capsule, long claddings were manufactured from it, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacture of the large-scale mother tube, with dimensions of 32 mm OD, 21 mm ID, and 2 m length, was successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long claddings were successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, a manufacturing process for mother tubes using large-scale hollow capsules is promising. (author)

  16. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamos. In this paper, we demonstrate that helicity is not essential for the amplification of the large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale one-tenth of the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.
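
    A minimal sketch of the shell decomposition underlying such diagnostics: Fourier-transform a periodic 3-D field, bin the modal energies into integer-wavenumber shells, and read off E(k). The random field below is only a stand-in for simulation data, and full shell-to-shell transfer rates would additionally require the nonlinear triad interaction terms.

        import numpy as np

        n = 64
        rng = np.random.default_rng(1)
        b = rng.standard_normal((3, n, n, n))             # stand-in for the magnetic field

        bk = np.fft.fftn(b, axes=(1, 2, 3)) / n**3        # normalized Fourier coefficients
        e_mode = 0.5 * np.sum(np.abs(bk) ** 2, axis=0)    # energy per Fourier mode

        k = np.fft.fftfreq(n) * n                         # integer wavenumber components
        kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
        shell = np.rint(np.sqrt(kx**2 + ky**2 + kz**2)).astype(int)

        E = np.bincount(shell.ravel(), weights=e_mode.ravel())  # shell spectrum E(k)
        print(E[:6] / E.sum())                                  # energy fraction in k = 0..5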

  17. LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS

    International Nuclear Information System (INIS)

    O'Brien, James E.

    2010-01-01

    Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam, while consuming electricity and high-temperature process heat. When coupled to an advanced high temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands, and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a 'hydrogen economy.' The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.
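
    A back-of-envelope check of the quoted laboratory figures (ideal-gas molar volume at 0 °C and 1 atm is assumed, since the reporting conditions are not stated in the record):

        # 5500 L/hr of H2 expressed as stored chemical power
        litres_per_hr = 5500.0
        mol_per_hr = litres_per_hr / 22.414      # ideal-gas molar volume, L/mol (assumed STP)
        hhv = 285.8e3                            # higher heating value of H2, J/mol
        chem_kw = mol_per_hr * hhv / 3600.0 / 1e3
        print(f"{mol_per_hr:.0f} mol/hr  ->  {chem_kw:.1f} kW chemical")

    Under these assumptions the stored chemical energy (~20 kW) exceeds the ~15 kW electrical scale, which is consistent with high-temperature electrolysis: part of the dissociation enthalpy is supplied as process heat rather than electricity.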

  18. Coupled Large Scale Hydro-mechanical Modelling for cap-rock Failure Risk Assessment of CO2 Storage in Deep Saline Aquifers

    International Nuclear Information System (INIS)

    Rohmer, J.; Seyedi, D.M.

    2010-01-01

    This work presents a numerical strategy of large scale hydro-mechanical simulations to assess the risk of damage in cap-rock formations during a CO2 injection process. The proposed methodology is based on the development of a sequential coupling between a multiphase fluid flow code (TOUGH2) and a hydro-mechanical calculation code (Code-Aster) that enables us to perform coupled hydro-mechanical simulation at a regional scale. The likelihood of different cap-rock damage mechanisms can then be evaluated based on the results of the coupled simulations. A scenario based approach is proposed to take into account the effect of the uncertainty of model parameters on damage likelihood. The developed methodology is applied for the cap-rock failure analysis of the deep aquifer of the Dogger formation in the context of the Paris basin multilayered geological system as a demonstration example. The simulation is carried out at a regional scale (100 km) considering an industrial mass injection rate of CO2 of 10 Mt/y. The assessment of the stress state after 10 years of injection is conducted through the developed sequential coupling. Two failure mechanisms have been taken into account, namely the tensile fracturing and the shear slip reactivation of pre-existing fractures. To deal with the large uncertainties due to sparse data on the layer formations, a scenario based strategy is undertaken. It consists in defining a first reference modelling scenario considering the mean values of the hydro-mechanical properties for each layer. A sensitivity analysis is then carried out and shows the importance of both the initial stress state and the reservoir hydraulic properties on the cap-rock failure tendency. On this basis, a second scenario denoted 'critical' is defined so that the most influential model parameters are taken in their worst configuration. None of these failure criteria is activated for the considered conditions. At a phenomenological level, this study points out three key
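
    The two failure mechanisms can be written down compactly. A minimal sketch follows, with compression taken positive and effective stresses in MPa; the friction coefficient, cohesion, tensile strength, and stress values are hypothetical, not the Dogger parameters.

        def tensile_fracturing(sigma3_eff, tensile_strength):
            """Tensile failure: least effective principal stress reaches -T0."""
            return sigma3_eff <= -tensile_strength

        def shear_slip(sigma_n_eff, tau, mu=0.6, cohesion=0.0):
            """Coulomb reactivation of a pre-existing fracture: tau >= c + mu * sigma_n'."""
            return tau >= cohesion + mu * sigma_n_eff

        # illustrative stress state on a cap-rock fracture after injection (MPa)
        print(tensile_fracturing(sigma3_eff=2.0, tensile_strength=1.5))  # False: still compressive
        print(shear_slip(sigma_n_eff=12.0, tau=6.5))                     # False: 6.5 < 0.6 * 12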

  19. 16 CFR 1061.8 - Information on the heightened degree of protection afforded.

    Science.gov (United States)

    2010-01-01

    ... protection afforded. 1061.8 Section 1061.8 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION GENERAL APPLICATIONS FOR EXEMPTION FROM PREEMPTION § 1061.8 Information on the heightened degree of protection afforded... State or local requirement provides a significantly higher degree of protection from the risk of injury...

  20. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  1. Non-gut baryogenesis and large scale structure of the universe

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    1995-07-01

    We discuss a mechanism for generating baryon density perturbations and study the evolution of the baryon charge density distribution in the framework of the low temperature baryogenesis scenario. This mechanism may be important for the large scale structure formation of the Universe and, particularly, may be essential for understanding the existence of a characteristic scale of 130 h^-1 Mpc in the distribution of the visible matter. The detailed analysis showed that both the observed very large scale of the visible matter distribution in the Universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, according to our model, at present the visible part of the Universe may consist of baryonic and antibaryonic shells, sufficiently separated so that annihilation radiation is not observed. This is an interesting possibility, since the observational data on antiparticles in cosmic rays do not rule out the possibility of antimatter superclusters in the Universe. (author). 16 refs, 3 figs

  2. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
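
    The modulation diagnostic can be sketched as follows: split the signal at a cutoff, take the Hilbert envelope of the small-scale part, low-pass it, and correlate it with the large-scale part. The cutoff frequencies and the synthetic signal below are illustrative assumptions; actual studies tie the filter to the boundary-layer scales.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt, hilbert

        fs = 10000.0
        t = np.arange(0, 5.0, 1.0 / fs)
        rng = np.random.default_rng(2)

        sos_lo = butter(4, 20.0, btype="low", fs=fs, output="sos")    # large-scale band (assumed)
        sos_hi = butter(4, 500.0, btype="high", fs=fs, output="sos")  # small-scale band (assumed)

        # synthetic signal whose small scales are amplitude-modulated by the large scales
        uL = sosfiltfilt(sos_lo, rng.standard_normal(t.size))
        uL /= uL.std()
        uS = sosfiltfilt(sos_hi, rng.standard_normal(t.size))
        u = uL + (1.0 + 0.5 * uL) * uS

        uL_hat = sosfiltfilt(sos_lo, u)           # recovered large-scale component
        env = np.abs(hilbert(u - uL_hat))         # envelope of the small-scale residue
        env_L = sosfiltfilt(sos_lo, env)
        R = np.corrcoef(uL_hat, env_L)[0, 1]      # amplitude-modulation coefficient
        print(f"R = {R:.2f}")                     # clearly positive for this synthetic signal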

  3. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  4. Combining high-scale inflation with low-energy SUSY

    Energy Technology Data Exchange (ETDEWEB)

    Antusch, Stefan [Basel Univ. (Switzerland). Dept. of Physics; Max-Planck-Institut fuer Physik, Muenchen (Germany). Werner-Heisenberg-Institut; Dutta, Koushik [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Halter, Sebastian [Max-Planck-Institut fuer Physik, Muenchen (Germany). Werner-Heisenberg-Institut

    2011-12-15

    We propose a general scenario for moduli stabilization where low-energy supersymmetry can be accommodated with a high scale of inflation. The key ingredient is that the stabilization of the modulus field during and after inflation is not associated with a single, common scale, but relies on two different mechanisms. We illustrate this general scenario in a simple example, where during inflation the modulus is stabilized with a large mass by a Kaehler potential coupling to the field which provides the inflationary vacuum energy via its F-term. After inflation, the modulus is stabilized, for instance, by a KKLT superpotential. (orig.)

  5. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large scale epidemiological study has been launched by Fukushima Medical University. It concerns the two million inhabitants of the Fukushima Prefecture. On the national level, and with the support of public funds, medical care and follow-up as well as systematic controls are foreseen, notably to check the thyroids of 360,000 young people under 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low measured values, and because they know that some parts of the area are at least as contaminated as areas around Chernobyl, some people are reluctant to go back home

  6. Investigation of the modes of origin of attitudes concerning the risks of large-scale technical plants

    International Nuclear Information System (INIS)

    Gutmann, G.; Huschke, P.

    1980-12-01

    A hypothetical framework for the modes of origin of attitudes concerning the risks of large-scale industrial plants is developed. The hypotheses regarding the assessment of nuclear energy are empirically investigated in several schools, with pupils between 14 and 16 years old. Results: 1. The ''polarity'' of opinion patterns on nuclear energy that pupils ''bring along'', i.e. already formed within their families, could not be reversed by school instruction. 2. Instruction can, however, differentiate attitudes (assessments and cognitions). 3. The teacher's style is evidently more decisive for a differentiated attitude towards nuclear energy than the instruction material used. 4. A teaching style that supports and activates differentiation can be called partner-oriented and ''provocative'', and responds flexibly to the communicative situations in school classes. 5. By contrast, a frontal, monologizing teaching style which does not respond to the changing communicative structures in school classes can block all interest and thus the possibility of influencing attitudes. (orig./HP) [de]

  7. Screening for prenatal substance use: development of the Substance Use Risk Profile-Pregnancy scale.

    Science.gov (United States)

    Yonkers, Kimberly A; Gotman, Nathan; Kershaw, Trace; Forray, Ariadna; Howell, Heather B; Rounsaville, Bruce J

    2010-10-01

    To report on the development of a questionnaire to screen for hazardous substance use in pregnant women and to compare its performance with other drug and alcohol measures. Pregnant women were administered a modified TWEAK (Tolerance, Worried, Eye-openers, Amnesia, K[C] Cut Down) questionnaire, the 4Ps Plus questionnaire, items from the Addiction Severity Index, and two questions about domestic violence (N=2,684). The sample was divided into "training" (n=1,610) and "validation" (n=1,074) subsamples. We applied recursive partitioning class analysis to the responses from individuals in the training subsample, which resulted in the three-item Substance Use Risk Profile-Pregnancy scale. We examined sensitivity, specificity, and the fit of logistic regression models in the validation subsample to compare the performance of the Substance Use Risk Profile-Pregnancy scale with the modified TWEAK and various scoring algorithms of the 4Ps. The Substance Use Risk Profile-Pregnancy scale comprises three informative questions that can be scored for high- or low-risk populations. The Substance Use Risk Profile-Pregnancy scale algorithm for low-risk populations was most highly predictive of substance use in the validation subsample (Akaike's Information Criterion=579.75, Nagelkerke R=0.27), with high sensitivity (91%) and adequate specificity (67%). The high-risk algorithm had lower sensitivity (57%) but higher specificity (88%). The Substance Use Risk Profile-Pregnancy scale is simple and flexible, with good sensitivity and specificity, and can potentially detect a range of substances that may be abused. Clinicians need to further assess women with a positive screen to identify those who require treatment for alcohol or illicit substance use in pregnancy. Level of evidence: III.
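
    The sensitivity/specificity trade-off between the two scoring algorithms can be illustrated on toy data. Everything below (prevalence, score distribution, cutoffs) is made up; only the direction of the trade-off mirrors the record: a lower cutoff is more sensitive, a higher one more specific.

        import numpy as np

        def sens_spec(y_true, y_flag):
            y_true, y_flag = np.asarray(y_true, bool), np.asarray(y_flag, bool)
            sens = (y_true & y_flag).sum() / y_true.sum()
            spec = (~y_true & ~y_flag).sum() / (~y_true).sum()
            return sens, spec

        rng = np.random.default_rng(3)
        n = 10000
        use = rng.random(n) < 0.15                   # assumed prevalence of substance use
        score = rng.integers(0, 3, n) + use * rng.integers(0, 2, n)  # toy 3-item screen

        for cutoff, label in ((1, "low-risk algorithm"), (2, "high-risk algorithm")):
            se, sp = sens_spec(use, score >= cutoff)
            print(f"{label}: sensitivity {se:.2f}, specificity {sp:.2f}")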

  8. Racial Differences in Awareness of the Affordable Care Act and Application Assistance Among Low-Income Adults in Three Southern States

    Directory of Open Access Journals (Sweden)

    Adrian Garcia Mosqueira MA

    2015-10-01

    Full Text Available The Affordable Care Act (ACA expanded Medicaid eligibility to adults with incomes under 138% of the federal poverty level, leading to substantial reductions in uninsured rates among low-income adults. Despite large gains in coverage, studies suggest that Latinos may be less likely than other racial/ethnic groups to apply and enroll in health insurance, and they remain the group with the highest uninsured rate in the United States. We explore two potential factors related to racial/ethnic differences in ACA enrollment—awareness of the law and receipt of application assistance such as navigator services. Using a survey of nearly 3000 low-income U.S. citizens (aged 19-64 in 3 states in late 2014, we find that Latinos had significantly lower levels of awareness of the ACA relative to other groups, even after adjusting for demographic covariates. Higher education was the strongest positive predictor of ACA awareness. In contrast, Latinos were much more likely to receive assistance from navigators or social workers when applying, relative to other racial/ethnic groups. Taken together, these results highlight the importance of ACA outreach efforts to increase awareness among low-income and less educated populations, two groups that are overrepresented in the Latino population, to close existing disparities in coverage.

  9. Increasing stress on disaster risk finance due to large floods

    Science.gov (United States)

    Jongman, Brenden; Hochrainer-Stigler, Stefan; Feyen, Luc; Aerts, Jeroen; Mechler, Reinhard; Botzen, Wouter; Bouwer, Laurens; Pflug, Georg; Rojas, Rodrigo; Ward, Philip

    2014-05-01

    Recent major flood disasters have shown that single extreme events can affect multiple countries simultaneously, which puts high pressure on trans-national risk reduction and risk transfer mechanisms. To date, little is known about such flood hazard interdependencies across regions, and the corresponding joint risks at regional to continental scales. Reliable information on correlated loss probabilities is crucial for developing robust insurance schemes and public adaptation funds, and for enhancing our understanding of climate change impacts. Here we show that extreme discharges are strongly correlated across European river basins and that these correlations can, or should, be used in national to continental scale risk assessment. We present probabilistic trends in continental flood risk, and demonstrate that currently observed extreme flood losses could more than double in frequency by 2050 under future climate change and socioeconomic development. The results demonstrate that accounting for tail dependencies leads to higher estimates of extreme losses than estimates based on the traditional assumption of independence between basins. We suggest that risk management for these increasing losses is largely feasible, and we demonstrate that risk can be shared by expanding risk transfer financing, reduced by investing in flood protection, or absorbed by enhanced solidarity between countries. We conclude that these measures have vastly different efficiency, equity and acceptability implications, which need to be taken into account in broader consultation, for which our analysis provides a basis.
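
    The effect of ignoring inter-basin dependence can be reproduced in a few lines of Monte Carlo. Everything below (loss distribution, correlation strength, quantile) is an illustrative assumption; the point is only that correlated basins fatten the tail of the aggregated annual loss, as the record argues.

        import numpy as np

        rng = np.random.default_rng(4)
        n_years, n_basins, rho = 200000, 10, 0.6     # assumed inter-basin correlation

        z_ind = rng.standard_normal((n_years, n_basins))
        shared = rng.standard_normal((n_years, 1))
        z_cor = np.sqrt(rho) * shared + np.sqrt(1.0 - rho) * rng.standard_normal((n_years, n_basins))

        def annual_loss(z):
            return np.exp(2.0 + z).sum(axis=1)       # heavy-tailed lognormal basin losses

        q = 1.0 - 1.0 / 200.0
        for name, z in (("independent", z_ind), ("correlated", z_cor)):
            print(name, "1-in-200-year loss:", round(float(np.quantile(annual_loss(z), q)), 1))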

  10. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years many public key infrastructure (PKI) schemes have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, there is a plethora of challenges in these healthcare PKIs, especially for those deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.

  11. Accelerated fetal growth in early pregnancy and risk of severe large-for-gestational-age and macrosomic infant: a cohort study in a low-risk population.

    Science.gov (United States)

    Simic, Marija; Wikström, Anna-Karin; Stephansson, Olof

    2017-10-01

    Our objective was to examine the association between fetal growth in early pregnancy and risk of severe large-for-gestational-age (LGA) and macrosomia at birth in a low-risk population. Cohort study that included 68 771 women with non-anomalous singleton pregnancies, without history of diabetes or hypertension, based on an electronic database on pregnancies and deliveries in Stockholm-Gotland Region, Sweden, 2008-2014. We performed multivariable logistic regression to estimate the association between accelerated fetal growth occurring in the first through early second trimester as measured by ultrasound and LGA and macrosomia at birth. Restricted analyses were performed in the groups without gestational diabetes and with normal body mass index (18.5-24.9 kg/m^2). When adjusting for confounders, the odds of having a severely LGA or macrosomic infant were elevated in mothers with fetuses that were at least 7 days larger than expected as compared with mothers without age discrepancy at the second-trimester scan (adjusted odds ratio 1.80; 95% CI 1.23-2.64 and adjusted odds ratio 2.15; 95% CI 1.55-2.98, respectively). Additionally, mothers without gestational diabetes and mothers with normal weight had an elevated risk of having a severely LGA or macrosomic infant when the age discrepancy by second-trimester ultrasound was at least 7 days. In a low-risk population, ultrasound-estimated accelerated fetal growth in early pregnancy was associated with an increased risk of having a severely LGA or macrosomic infant. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.

  12. Development of large scale production of Nd-doped phosphate glasses for megajoule-scale laser systems

    International Nuclear Information System (INIS)

    Ficini, G.; Campbell, J.H.

    1996-01-01

    Nd-doped phosphate glasses are the preferred gain medium for high-peak-power lasers used for Inertial Confinement Fusion research because they have excellent energy storage and extraction characteristics. In addition, these glasses can be manufactured defect-free in large sizes and at relatively low cost. To meet the requirements of future megajoule-size lasers, advanced laser glass manufacturing methods are being developed that would enable laser glass to be continuously produced at the rate of several thousand large (790 x 440 x 44 mm^3) plates of glass per year. This represents more than a 10- to 100-fold improvement in the scale of the present manufacturing technology

  13. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  14. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  15. The Affordance Template ROS Package for Robot Task Programming

    Science.gov (United States)

    Hart, Stephen; Dinh, Paul; Hambuchen, Kimberly

    2015-01-01

    This paper introduces the Affordance Template ROS package for quickly programming, adjusting, and executing robot applications in the ROS RViz environment. This package extends the capabilities of RViz interactive markers by allowing an operator to specify multiple end-effector waypoint locations and grasp poses in object-centric coordinate frames and to adjust these waypoints in order to meet the run-time demands of the task (specifically, object scale and location). The Affordance Template package stores task specifications in a robot-agnostic XML description format such that it is trivial to apply a template to a new robot. As such, the Affordance Template package provides a robot-generic ROS tool appropriate for building semi-autonomous, manipulation-based applications. Affordance Templates were developed by the NASA-JSC DARPA Robotics Challenge (DRC) team and have since successfully been deployed on multiple platforms including the NASA Valkyrie and Robonaut 2 humanoids, the University of Texas Dreamer robot and the Willow Garage PR2. In this paper, the specification and implementation of the affordance template package is introduced and demonstrated through examples for wheel (valve) turning, pick-and-place, and drill grasping, evincing its utility and flexibility for a wide variety of robot applications.

  16. The new affordances in the home environment for motor development - infant scale (AHEMD-IS: Versions in English and Portuguese languages

    Directory of Open Access Journals (Sweden)

    Priscila M. Caçola

    2015-12-01

    Full Text Available The home environment has been established as a crucial factor for motor development, especially in infants. Exploring the home environment can have significant implications for intervention, as it is common practice in physical therapy to have professionals advise patients on home activities. Since 2010, our group has been working on the development of the Affordances in the Home Environment for Motor Development - Infant Scale (AHEMD-IS, a parental self-reporting instrument designed to assess the quality and quantity of factors (affordances in the home environment. In Brazil, the instrument has been translated as "Affordances no Ambiente Domiciliar para o Desenvolvimento Motor - Escala Bebê", and it has been extensively used in several studies that address infant development. These studies in Brazil and other parts of the world highly recommended the need for a normative sample and standardized scoring system. A description of the study that addressed that need, along with the English version of the questionnaire and score sheets, was recently published in the well-known and respected journal Physical Therapy. Our intent with the present short communication is to notify Brazilian investigators and clinicians of this latest update so they can download the new instrument, as well as present the Brazilian (Portuguese version of the AHEMD-IS along with its scoring system.

  17. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    Science.gov (United States)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique, called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method, employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions, without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution

  18. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    International Nuclear Information System (INIS)

    Dednam, W; Botha, A E

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique, called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method, employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions, without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution
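
    The fluctuation route to a Kirkwood-Buff integral can be sketched directly. For an ideal gas the estimator should return G ≈ 0, and the small negative offset produced by a closed box of fixed particle number (of order -v/N) is precisely the kind of finite-size effect the scaling analysis extrapolates away. All system parameters below are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)
        L, N, ell, frames = 20.0, 8000, 2.0, 2000  # box edge, particles, probe edge, snapshots
        v = ell**3

        counts = np.empty(frames)
        for i in range(frames):
            pos = rng.random((N, 3)) * L           # uncorrelated positions = ideal-gas snapshot
            origin = rng.random(3) * (L - ell)     # random cubic sub-volume inside the box
            inside = np.all((pos >= origin) & (pos < origin + ell), axis=1)
            counts[i] = inside.sum()

        mean, var = counts.mean(), counts.var()
        G = v * (var - mean) / mean**2             # KB integral from number fluctuations
        print(f"<N> = {mean:.2f},  G = {G:.3f}  (ideal gas: ~0, closed-box offset ~ {-v/N:.4f})")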

  19. Encapsulating the delivery of affordable housing: An overview of Malaysian practice

    Directory of Open Access Journals (Sweden)

    Jamaluddin Nor Baizura

    2016-01-01

    Full Text Available Urban population increases much faster than other geographical areas and this brings huge challenges to Malaysian government especially those responsible for the provision of housing. Moreover, the rise of living cost and pressure towards the current economic situation had inevitably led to huge demands for affordable housing. Thus, affordability and inaccessibility of people in owning a house has become one of the major issues in Malaysia. These issues are not only faced by the low-income group but mostly experienced by urban dwellers including the middle-income group whom are not eligible to apply for low-cost housing delivered by the government, yet cannot afford to buy a house. Hence, the Malaysian government had therefore established the Perumahan Rakyat 1 Malaysia (PR1MA as the catalyst in providing adequate, quality and affordable houses. Furthermore, the Syarikat Perumahan Negara Berhad (SPNB and the state governments also play their big role in providing affordable house at the state level. In trying to ensure smooth delivery of affordable housing, a systematic approach is required. This paper intends to examine the delivery of affordable housing in Malaysia. The objective was to assess the parties involved, and the various stages of the process. The methodology adopted for this study includes in-depth interviews with affordable housing agencies. Investigation showed that the major problem relates to a mismatch in the delivery of affordable housing. Moreover, the political and economic aspects, as well as the organization situation had also influenced the inefficiency of affordable housing delivery system. Additionally, the significant of this paper is expected to formulate an improved strategy and guidance in delivery system, thus ensuring that the method of practice is applicable throughout the country.

  20. Large-Scale Control of the Arabian Sea Summer Monsoon Inversion and Low Clouds: A New Perspective

    Science.gov (United States)

    Wu, C. H.; Wang, S. Y.; Hsu, H. H.; Hsu, P. C.

    2016-12-01

    The Arabian Sea undergoes a so-called summer monsoon inversion that reaches its maximum intensity in August, associated with a large amount of low-level cloud. The formation of the inversion and low clouds was generally thought to be a local system influenced by the advance of the India-Pakistan monsoon. New empirical and numerical evidence suggests that, rather than being a mere byproduct of the nearby monsoon, the Arabian Sea monsoon inversion is coupled with a broad-scale monsoon evolution connected across the African Sahel, South Asia, and the East Asia-western North Pacific (WNP). Several subseasonal variations occur in tandem: the eastward expansion of the Asian-Pacific monsoonal heating likely suppresses the India-Pakistan monsoon while enhancing the low-level thermal inversion over the Arabian Sea; the upper-tropospheric anticyclone in South Asia weakens in August, smoothing the zonal contrast in geopotential heights (10°N-30°N); the subtropical WNP monsoon trough in the lower troposphere, which signals the revival of the East Asian summer monsoon, matures in August; and the Sahel rainfall peaks in August, accompanied by an intensified tropical easterly jet. The occurrence of the latter two processes enhances upper-level anticyclones over Africa and the WNP, which, in turn, induces subsidence in between, over the Arabian Sea. Numerical experiments demonstrate the combined effect of the African and WNP monsoonal heating on the enhancement of the Arabian Sea monsoon inversion. A connection is further found in the interannual and decadal variations between the East Asian-WNP monsoon and the Arabian Sea monsoon inversion. In years with reduced low cloud over the Arabian Sea, the East Asian midlatitude jet stream remains strong in August while the WNP monsoon trough appears weakened. The Arabian Sea inversion (ridge) and WNP trough pattern, which forms a dipole structure, is also found to have intensified since the 21st century.

  1. An Affordable, Low-Risk Approach to Launching Research Spacecraft as Tertiary Payloads

    DEFF Research Database (Denmark)

    Pranajaya, F.M.; Zee, R.E.; Thomsen, Per Lundahl

    2003-01-01

    among numerous parties and the handling of complex export control issues. In turn, this complicates mission scheduling and increases the risk of missing launch deadlines. The University of Toronto Institute for Aerospace Studies, Space Flight Laboratory (UTIAS/SFL) has taken a leading role in addressing...

  2. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Directory of Open Access Journals (Sweden)

    Xiangyun Xiao

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from the Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), the experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.

  3. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Science.gov (United States)

    Xiao, Xiangyun; Zhang, Wei; Zou, Xiufen

    2015-01-01

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.
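
    The split-and-fit skeleton of such an approach can be sketched with Python's multiprocessing. A stand-in linear least-squares fit replaces the paper's ODE-based optimization, and the coupling between subnetworks via asynchronous message exchange is omitted; only the decomposition and the asynchronous collection of per-module results are shown.

        import numpy as np
        from multiprocessing import Pool

        def fit_subnetwork(data):
            """Fit dX/dt ~ X @ W for one module (stand-in for the ODE-based optimization)."""
            X, dX = data
            W, *_ = np.linalg.lstsq(X, dX, rcond=None)
            return W

        def modules(n_genes, size):
            return [slice(i, min(i + size, n_genes)) for i in range(0, n_genes, size)]

        if __name__ == "__main__":                     # guard needed for spawn-based platforms
            rng = np.random.default_rng(6)
            n_t, n_genes = 50, 200                     # toy sizes: time points, genes
            X, dX = rng.standard_normal((2, n_t, n_genes))
            tasks = [(X[:, m], dX[:, m]) for m in modules(n_genes, 50)]
            with Pool(4) as pool:
                handles = [pool.apply_async(fit_subnetwork, (t,)) for t in tasks]
                Ws = [h.get() for h in handles]        # collected as workers finish
            print([W.shape for W in Ws])               # one weight block per subnetwork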

  4. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise for example in LNG marine transportation accidents, or in liquid cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large scale situations. More recently, there has been a trend towards larger experiments and some of these have produced explosions of moderately high efficiency. Although occurrence of such large scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single drop experiments. In the last two years several theoretical models for large scale explosions have appeared which attempt a self contained explanation of at least some stages of such high yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full

  5. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  6. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  7. Young children's tool innovation across culture: Affordance visibility matters.

    Science.gov (United States)

    Neldner, Karri; Mushin, Ilana; Nielsen, Mark

    2017-11-01

    Young children typically demonstrate low rates of tool innovation. However, previous studies have limited children's performance by presenting tools with opaque affordances. In an attempt to scaffold children's understanding of what constitutes an appropriate tool within an innovation task, we compared tools in which the focal affordance was visible to those in which it was opaque. To evaluate possible cultural specificity, data collection was undertaken in a Western urban population and a remote Indigenous community. As expected, affordance visibility altered innovation rates: young children were more likely to innovate on a tool that had visible affordances than one with concealed affordances. Furthermore, innovation rates were higher than those reported in previous innovation studies. Cultural background did not affect children's rates of tool innovation. It is suggested that new methods for testing tool innovation in children must be developed in order to broaden our knowledge of young children's tool innovation capabilities. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200.000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo....

  9. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32, T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. The Expanded Large Scale Gap Test was devised in part to reduce the spread in the LSGT 50% gap value; the worst charges, such as those with the highest or lowest densities and the largest re-pressed…

  10. Long-term low-calorie low-protein vegan diet and endurance exercise are associated with low cardiometabolic risk.

    Science.gov (United States)

    Fontana, Luigi; Meyer, Timothy E; Klein, Samuel; Holloszy, John O

    2007-06-01

    Western diets, which typically contain large amounts of energy-dense processed foods, together with a sedentary lifestyle are associated with increased cardiometabolic risk. We evaluated the long-term effects of consuming a low-calorie low-protein vegan diet or performing regular endurance exercise on cardiometabolic risk factors. In this cross-sectional study, cardiometabolic risk factors were evaluated in 21 sedentary subjects, who had been on a low-calorie low-protein raw vegan diet for 4.4 +/- 2.8 years (mean age, 53.1 +/- 11 yrs), 21 body mass index (BMI)-matched endurance runners consuming Western diets, and 21 age- and gender-matched sedentary subjects consuming Western diets. BMI was lower in the low-calorie low-protein vegan diet (21.3 +/- 3.1 kg/m(2)) and endurance runner (21.1 +/- 1.6 kg/m(2)) groups than in the sedentary Western diet group (26.5 +/- 2.7 kg/m(2)). Other cardiometabolic risk factors were also lower in the vegan diet and runner groups than in the Western diet group. Systolic and diastolic blood pressure were lower in the vegan diet group (104 +/- 15 and 62 +/- 11 mm Hg) than in the BMI-matched endurance runners (122 +/- 13 and 72 +/- 9 mm Hg) and the Western diet group (132 +/- 14 and 79 +/- 8 mm Hg). In conclusion, long-term consumption of a low-calorie low-protein vegan diet or regular endurance exercise training is associated with low cardiometabolic risk. Moreover, our data suggest that specific components of a low-calorie low-protein vegan diet provide additional beneficial effects on blood pressure.

  11. Algorithm 873: LSTRS: MATLAB Software for Large-Scale Trust-Region Subproblems and Regularization

    DEFF Research Database (Denmark)

    Rojas Larrazabal, Marielba de la Caridad; Santos, Sandra A.; Sorensen, Danny C.

    2008-01-01

    A MATLAB 6.0 implementation of the LSTRS method is presented. LSTRS was described in Rojas, M., Santos, S.A., and Sorensen, D.C., A new matrix-free method for the large-scale trust-region subproblem, SIAM J. Optim., 11(3):611-646, 2000. LSTRS is designed for large-scale quadratic problems with one norm constraint, and requires the solution of a large-scale eigenvalue problem at each step. LSTRS relies on matrix-vector products only and has low and fixed storage requirements, features that make it suitable for large-scale computations. In the MATLAB implementation, the Hessian matrix of the quadratic objective function can be specified either explicitly, or in the form of a matrix-vector multiplication routine. Therefore, the implementation preserves the matrix-free nature of the method. A description of the LSTRS method and of the MATLAB software, version 1.2, is presented. Comparisons with other techniques and applications of the method are also included. A guide…
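
    The trust-region subproblem LSTRS targets is: minimize 0.5*x'Hx + g'x subject to ||x|| <= Delta. The sketch below illustrates the underlying eigenvalue parameterization on small dense matrices only; it is not the LSTRS code. The bracket for the parameter alpha, the bisection update, and the (skipped) treatment of the so-called hard case are crude assumptions, whereas real LSTRS is matrix-free and adjusts the parameter by rational interpolation.

```python
import numpy as np

def trs_eig(H, g, delta, tol=1e-8):
    """Solve min 0.5*x'Hx + g'x s.t. ||x|| <= delta (small dense sketch)."""
    n = len(g)
    x = np.zeros(n)
    lo, hi = -1e6, 1e6                     # crude bracket for alpha (assumed)
    for _ in range(200):
        alpha = 0.5 * (lo + hi)
        B = np.block([[np.array([[alpha]]), g[None, :]],
                      [g[:, None], H]])    # bordered matrix B(alpha)
        _, V = np.linalg.eigh(B)
        nu, u = V[0, 0], V[1:, 0]          # smallest eigenpair of B(alpha)
        if abs(nu) < 1e-12:                # "hard case" -- not handled here
            lo = alpha
            continue
        x = u / nu                         # satisfies (H - lambda*I)x = -g
        if abs(np.linalg.norm(x) - delta) < tol:
            break
        if np.linalg.norm(x) > delta:      # ||x(alpha)|| grows with alpha,
            hi = alpha                     # so shrink alpha
        else:
            lo = alpha
    return x
```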

  12. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-based…

  13. Tablets for Learning in Higher Education:The Top 10 Affordances

    OpenAIRE

    Godsk, Mikkel

    2013-01-01

    Based on a small-scale literature review, this paper identifies the top 10 affordances of post-PC tablets (sometimes referred to as 'tablet computers') for higher education in settings where the technology is used for learning. The review shows that the predominant affordances of the technology are related to its ability to support engaging, inclusive, and/or collaborative learning, to provide flexibility in place, and to include multimedia and interactive content in teaching practice. However...

  14. SULTAN test facility for large-scale vessel coolability in natural convection at low pressure

    International Nuclear Information System (INIS)

    Rouge, S.

    1997-01-01

    The SULTAN facility (France/CEA/CENG) was designed to study large-scale structure coolability by water in boiling natural convection. The objectives are to measure the main characteristics of two-dimensional, two-phase flow, in order to evaluate the recirculation mass flow in large systems, and the limits of the critical heat flux (CHF) for a wide range of thermo-hydraulic (pressure, 0.1-0.5 MPa; inlet temperature, 50-150 °C; mass flow velocity, 5-4400 kg s⁻¹ m⁻²; flux, 100-1000 kW m⁻²) and geometric (gap, 3-15 cm; inclination, 0-90°) parameters. This paper makes available the experimental data obtained during the first two campaigns (90°, 3 cm; 10°, 15 cm): pressure drop DP = f(G), CHF limits, local profiles of temperature and void fraction in the gap, and visualizations. Other campaigns should confirm these first results, indicating a favourable possibility of the coolability of large surfaces under natural convection. (orig.)

  15. [Scale effect of Li-Xiang Railway construction impact on landscape pattern and its ecological risk].

    Science.gov (United States)

    Wang, De-zhi; Qiu, Peng-hua; Fang, Yuan-min

    2015-08-01

    As a large corridor project, a plateau railway extends over many points and passes through various sensitive environments along its route. The determination of the scope of the impact of railway construction on the ecological environment is often controversial in ecological impact assessment work. Taking the Tangbu-Jiantang section of the Li-Xiang Railway as the study object, and using the present land use map (1:10000) from 2012 and a DEM as data sources, a corridor cutting degree index (CCI) and a cumulative effect index of corridor (CCEI) were established by topology, buffer zone and landscape metrics methods. In addition, the ecological risk index used for railway construction was improved. By quantitative analysis of the characteristics of the spatio-temporal change of landscape pattern and its evolution style at different spatial scales before and after railway construction, the most appropriate evaluation scale for the railway was obtained. Then the characteristics of the spatio-temporal variation of ecological risk within this scale before and after railway construction were analyzed. The results indicated that the cutting mode and degree of the railway corridor with respect to the various landscape types could be effectively reflected by the CCI, and that the exposure and harm relations between risk sources and risk receptors of the railway could be measured by the CCEI. After the railway construction, the railway corridor would cause a great deal of middle cutting effect on the landscape along the railroad, influencing woodland and grassland landscapes most greatly, while causing less edge cutting and internal cutting. Landscape indices within the 600 m buffer zone demonstrated the most obvious scale effect; therefore, the 600 m zone of the railway was set as the most suitable range for ecological impact assessment. Before railway construction, the low ecological risk level covered the biggest part of the 600 m assessment zone. However, after the railway construction, the ecological risk increased significantly, and

  16. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of 1000s of processors to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  17. Access to diagnostic tests and essential medicines for cardiovascular diseases and diabetes care: cost, availability and affordability in the West Region of Cameroon.

    Directory of Open Access Journals (Sweden)

    Ahmadou M Jingi

    Full Text Available To assess the availability and affordability of medicines and routine tests for cardiovascular disease (CVD) and diabetes in the West region of Cameroon, a low-income setting, a survey was conducted on the availability and cost of twelve routine tests and twenty medicines for CVD and diabetes in eight health districts (four urban and four rural) covering over 60% of the population of the region (1.8 million). We analyzed the percentage of tests and medicines available, the median price against the international reference price (median price ratio) for the medicines, and affordability in terms of the number of days' wages it would cost the lowest-paid unskilled government worker to pay for the initial investigation tests and for one month of treatment. The availability of tests varied between 10% for the ECG and 100% for fasting blood sugar. The initial investigation using the minimum set of tests cost, on average, 29.76 days' wages. The availability of medicines varied from 36.4% to 59.1% in urban and from 9.1% to 50% in rural settings. Only metformin and benzathine benzylpenicillin had a median price ratio of ≤ 1.5, with statins being largely unaffordable (at least 30.51 days' wages). One month of combination treatment for coronary heart disease costs at least 40.87 days' wages. The investigation and management of patients with medium-to-high cardiovascular risk remains largely unavailable and unaffordable in this setting. An effective non-communicable disease program should lay emphasis on primary prevention, and improve affordable access to essential medicines in public outlets.
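
    The two affordability metrics used in this survey are simple to compute. Below is a hedged Python sketch; the function names are ours, and the example numbers are placeholders, not values from the study.

```python
def median_price_ratio(local_median_price, intl_reference_price):
    """MPR: local median patient price / international reference price."""
    return local_median_price / intl_reference_price

def days_wages(total_cost, daily_wage_lowest_paid_worker):
    """Affordability: days the lowest-paid unskilled government worker
    must work to pay for the tests or the month of treatment."""
    return total_cost / daily_wage_lowest_paid_worker

# e.g. a month of treatment costing 3000 local units against a daily wage
# of 1000 units: days_wages(3000, 1000) -> 3.0 days' wages. An MPR <= 1.5,
# as used above, is a commonly cited benchmark for efficient procurement.
```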

  18. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    The impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land 'grabbing' and historical large-scale agriculture literature. … the impact of a sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in 'formal' large-scale and 'informal' small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land … 'Informal' small-scale irrigated agriculture commands a higher wage than 'formal' large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role…

  19. Large Scale Experiments on Spacecraft Fire Safety

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Minster, Olivier; Fernandez-Pello, A. Carlos; Tien, James S.; Torero, Jose L.; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; hide

    2012-01-01

    Full-scale fire testing complemented by computer modelling has provided significant know-how about the risk, prevention and suppression of fire in terrestrial systems (cars, ships, planes, buildings, mines, and tunnels). In comparison, no such testing has been carried out for manned spacecraft due to the complexity, cost and risk associated with operating a long-duration fire safety experiment of a relevant size in microgravity. Therefore, there is currently a gap in knowledge of fire behaviour in spacecraft. The entire body of low-gravity fire research has either been conducted in short-duration ground-based microgravity facilities or has been limited to very small fuel samples. Still, the work conducted to date has shown that fire behaviour in low gravity is very different from that in normal gravity, with differences observed for flammability limits, ignition delay, flame spread behaviour, flame colour and flame structure. As a result, the prediction of the behaviour of fires in reduced gravity is at present not validated. To address this gap in knowledge, a collaborative international project, Spacecraft Fire Safety, has been established, with its cornerstone being the development of an experiment (Fire Safety 1) to be conducted on an ISS resupply vehicle, such as the Automated Transfer Vehicle (ATV) or Orbital Cygnus, after it leaves the ISS and before it enters the atmosphere. A computer modelling effort will complement the experimental effort. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the possibility of examining fire behaviour on a scale that is relevant to spacecraft fire safety and will provide unique data for fire model validation. This unprecedented opportunity will expand the understanding of the fundamentals of fire behaviour in spacecraft. The experiment is being

  20. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  1. Max-Min SINR in Large-Scale Single-Cell MU-MIMO: Asymptotic Analysis and Low Complexity Transceivers

    KAUST Repository

    Sifaou, Houssem; Kammoun, Abla; Sanguinetti, Luca; Debbah, Merouane; Alouini, Mohamed-Slim

    2016-01-01

    This work focuses on the downlink and uplink of large-scale single-cell MU-MIMO systems in which the base station (BS) endowed with M antennas communicates with K single-antenna user equipments (UEs). Particularly, we aim at reducing the complexity

  2. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  3. Affordability Engineering: Bridging the Gap Between Design and Cost

    Science.gov (United States)

    Reeves, J. D.; DePasquale, Dominic; Lim, Evan

    2010-01-01

    Affordability is a commonly used term that takes on numerous meanings depending on the context. Within the conceptual design of complex systems, the term generally implies comparisons between expected costs and expected resources. This characterization is largely correct, but does not convey the many nuances and considerations that are frequently misunderstood and underappreciated. In the most fundamental sense, affordability and cost directly relate to engineering and programmatic decisions made throughout development programs. Systems engineering texts point out that there is a temporal aspect to this relationship, for decisions made earlier in a program dictate design implications much more than those made during later phases. This paper explores affordability engineering and its many sub-disciplines by discussing how it can be considered an additional engineering discipline to be balanced throughout the systems engineering and systems analysis processes. Example methods of multidisciplinary design analysis with affordability as a key driver are discussed, as are example methods of data visualization, probabilistic analysis, and other ways of relating design decisions to affordability results.
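
    As a flavor of the probabilistic analysis mentioned above, the following Monte Carlo sketch propagates uncertain cost elements and compares the resulting distribution against a budget. It is purely illustrative and not the paper's tool; the cost categories, distribution shapes, and all numbers are invented.

```python
import random

def prob_cost_exceeds_budget(budget=950.0, n_trials=100_000):
    """Estimate P(total cost > budget) under assumed cost distributions ($M)."""
    exceed = 0
    for _ in range(n_trials):
        development = random.triangular(400, 700, 500)   # low, high, mode
        production  = random.triangular(200, 400, 250)
        operations  = random.gauss(150, 30)
        if development + production + operations > budget:
            exceed += 1
    return exceed / n_trials

print(f"P(cost > budget) = {prob_cost_exceeds_budget():.2%}")
```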

  4. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  5. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large-scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61 x 0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large-scale ventilation system consisting of a 0.61 x 0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large-scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large-scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  6. Stereotype Threat, Inquiring about Test Takers' Race and Gender, and Performance on Low-Stakes Tests in a Large-Scale Assessment. Research Report. ETS RR-15-02

    Science.gov (United States)

    Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent

    2015-01-01

    This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…

  7. Large-scale compositional heterogeneity in the Earth's mantle

    Science.gov (United States)

    Ballmer, M.

    2017-12-01

    Seismic imaging of subducted Farallon and Tethys lithosphere in the lower mantle has been taken as evidence for whole-mantle convection, and efficient mantle mixing. However, cosmochemical constraints point to a lower-mantle composition that has a lower Mg/Si compared to upper-mantle pyrolite. Moreover, geochemical signatures of magmatic rocks indicate the long-term persistence of primordial reservoirs somewhere in the mantle. In this presentation, I establish geodynamic mechanisms for sustaining large-scale (primordial) heterogeneity in the Earth's mantle using numerical models. Mantle flow is controlled by rock density and viscosity. Variations in intrinsic rock density, such as due to heterogeneity in basalt or iron content, can induce layering or partial layering in the mantle. Layering can be sustained in the presence of persistent whole mantle convection due to active "unmixing" of heterogeneity in low-viscosity domains, e.g. in the transition zone or near the core-mantle boundary [1]. On the other hand, lateral variations in intrinsic rock viscosity, such as due to heterogeneity in Mg/Si, can strongly affect the mixing timescales of the mantle. In the extreme case, intrinsically strong rocks may remain unmixed through the age of the Earth, and persist as large-scale domains in the mid-mantle due to focusing of deformation along weak conveyor belts [2]. That large-scale lateral heterogeneity and/or layering can persist in the presence of whole-mantle convection can explain the stagnation of some slabs, as well as the deflection of some plumes, in the mid-mantle. These findings indeed motivate new seismic studies for rigorous testing of model predictions. [1] Ballmer, M. D., N. C. Schmerr, T. Nakagawa, and J. Ritsema (2015), Science Advances, doi:10.1126/sciadv.1500815. [2] Ballmer, M. D., C. Houser, J. W. Hernlund, R. Wentzcovitch, and K. Hirose (2017), Nature Geoscience, doi:10.1038/ngeo2898.

  8. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  9. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  10. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  11. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  12. The potential for agricultural land use change to reduce flood risk in a large watershed

    Science.gov (United States)

    Effects of agricultural land management practices on surface runoff are evident at local scales, but evidence for watershed-scale impacts is limited. In this study, we used the Soil and Water Assessment Tool model to assess changes in downstream flood risks under different land uses for the large, ...

  13. A Single-use Strategy to Enable Manufacturing of Affordable Biologics

    Directory of Open Access Journals (Sweden)

    Renaud Jacquemart

    2016-01-01

    Full Text Available The current processing paradigm of large manufacturing facilities dedicated to single product production is no longer an effective approach for best manufacturing practices. Increasing competition for new indications and the launch of biosimilars for the monoclonal antibody market have put pressure on manufacturers to produce at lower cost. Single-use technologies and continuous upstream processes have proven to be cost-efficient options to increase biomass production but as of today the adoption has been only minimal for the purification operations, partly due to concerns related to cost and scale-up. This review summarizes how a single-use holistic process and facility strategy can overcome scale limitations and enable cost-efficient manufacturing to support the growing demand for affordable biologics. Technologies enabling high productivity, right-sized, small footprint, continuous, and automated upstream and downstream operations are evaluated in order to propose a concept for the flexible facility of the future.

  14. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng; Yuan, Ganzhao; Ghanem, Bernard

    2013-01-01

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

  15. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
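
    The core BILGO iteration described in the two records above can be illustrated on a toy problem. The sketch below applies a bilateral (two-coefficient) update to the simple PSD-constrained least-squares problem min over X >= 0 of 0.5*||X - C||_F^2; this choice of objective, the starting point, and the stopping rule are our assumptions, and the paper itself targets general large-scale semidefinite programs.

```python
import numpy as np

def bilgo_sketch(C, iters=100):
    """Minimize 0.5*||X - C||_F^2 over PSD X (C symmetric), BILGO-style."""
    n = C.shape[0]
    X = np.eye(n)                          # PSD starting point
    for _ in range(iters):
        D = C - X                          # descent direction, -grad f(X)
        w, V = np.linalg.eigh(D)
        if w[-1] <= 1e-10:                 # no rank-1 ascent direction left
            break
        u = V[:, -1]                       # leading eigenvector of D
        R1 = np.outer(u, u)                # rank-1 update candidate
        # best bilateral combination X <- a*X + b*R1 (least squares in a, b,
        # clipped to a, b >= 0 so the iterate stays in the PSD cone)
        G = np.array([[np.sum(X * X),  np.sum(X * R1)],
                      [np.sum(X * R1), 1.0]]) + 1e-12 * np.eye(2)
        rhs = np.array([np.sum(X * C), np.sum(R1 * C)])
        a, b = np.maximum(np.linalg.solve(G, rhs), 0.0)
        X = a * X + b * R1
    return X
```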

  16. Low cultural identification, low parental involvement and adverse peer influences as risk factors for delinquent behaviour among Filipino youth in Hawai'i.

    Science.gov (United States)

    Guerrero, Anthony P S; Nishimura, Stephanie T; Chang, Janice Y; Ona, Celia; Cunanan, Vanessa L; Hishinuma, Earl S

    2010-07-01

    Among Filipino youth in Hawai'i, low Filipino cultural identification and low family support may be important risk factors for delinquency. To examine, in a sample of Filipino youth in Hawai'i, correlations between delinquent behaviour and the aforementioned - as well as other, potentially mediating - variables. A youth risk survey and Filipino Culture Scale were administered to Filipino students (N = 150) in Hawai'i. A parent risk survey was administered to available and consenting parents. Delinquent behaviour correlated positively with acculturative stress, low cultural identification and adverse peer influences; and negatively with total Filipino Culture Scale score. Structural equation modelling suggested that absent/ineffective adults and adverse peer influences might be more important variables compared to low self-esteem and less religiosity, linking low cultural identification to delinquent behaviour. Although further studies are warranted, to be effective, efforts to prevent delinquency by enhancing Filipino youths' cultural connectedness may also need to enhance family connectedness and address adverse peer influences.

  17. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.

    2012-07-01

    Long-term stabilization of injected carbon dioxide (CO2) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act over multiple spatial and temporal scales. One example involves centimeter-scale density instabilities in the dissolved CO2 region leading to large-scale convective mixing that can be a significant driver for CO2 dissolution. Another example is the potentially important effect of capillary forces, in addition to buoyancy and viscous forces, on the evolution of mobile CO2. Local capillary effects lead to a capillary transition zone, or capillary fringe, where both fluids are present in the mobile state. This small-scale effect may have a significant impact on large-scale plume migration as well as long-term residual and dissolution trapping. Computational models that can capture both large- and small-scale effects are essential to predict the role of these processes in the long-term storage security of CO2 sequestration operations. Conventional modeling tools are unable to resolve sufficiently all of these relevant processes when modeling CO2 migration in large-scale geological systems. Herein, we present a vertically-integrated approach to CO2 modeling that employs upscaled representations of these subgrid processes. We apply the model to the Johansen formation, a prospective site for sequestration of Norwegian CO2 emissions, and explore the sensitivity of CO2 migration and trapping to subscale physics. Model results show the relative importance of different physical processes in large-scale simulations. The ability of models such as this to capture the relevant physical processes at large spatial and temporal scales is important for prediction and analysis of CO2 storage sites. © 2012 Elsevier Ltd.

  18. Investigation of the large scale regional hydrogeological situation at Ceberg

    International Nuclear Information System (INIS)

    Boghammar, A.; Grundfelt, B.; Hartley, L.

    1997-11-01

    The present study forms part of the large-scale groundwater flow studies within the SR 97 project. The site of interest is Ceberg. Within the present study two different regional-scale groundwater models have been constructed, one large regional model with an areal extent of about 300 km² and one semi-regional model with an areal extent of about 50 km². Different types of boundary conditions have been applied to the models: topography-driven pressures, constant infiltration rates, non-linear infiltration combined with specified pressure boundary conditions, and transfer of groundwater pressures from the larger model to the semi-regional model. The present model has shown that: - Groundwater flow paths are mainly local. Large-scale groundwater flow paths are only seen below the depth of the hypothetical repository (below 500 meters) and are very slow. - Locations of recharge and discharge, to and from the site area, are in the close vicinity of the site. - The low contrast between major structures and the rock mass means that the factor having the major effect on the flow paths is the topography. - A model sufficiently large to incorporate the recharge and discharge areas of the local site is on the order of kilometres. - A uniform infiltration rate boundary condition does not give a good representation of the groundwater movements in the model. - A local site model may be located to cover the site area and a few kilometres of the surrounding region. In order to incorporate all recharge and discharge areas within the site model, the model will be somewhat larger than site-scale models at other sites. This is caused by the fact that the discharge areas are divided into three distinct areas to the east, south and west of the site. - Boundary conditions may be supplied to the site model by means of transferring groundwater pressures obtained with the semi-regional model.

  19. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  20. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.
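
    The three measures named above can each be approximated with standard image processing. The Python sketch below is an assumed pipeline, not the authors'; in particular the 8-level color quantization and the structure-function (Hurst-like) roughness estimator are our illustrative choices.

```python
import numpy as np

def image_measures(rgb):
    """rgb: (H, W, 3) uint8 array. Returns (color histogram, variety, roughness)."""
    # 1) usage of individual colors: normalized histogram of quantized colors
    q = (rgb // 32).reshape(-1, 3)             # 8 levels per channel
    codes = q[:, 0] * 64 + q[:, 1] * 8 + q[:, 2]
    hist = np.bincount(codes, minlength=512) / codes.size
    # 2) variety of colors: number of distinct quantized colors present
    variety = int(np.count_nonzero(hist))
    # 3) roughness of brightness: slope of log std-of-increments vs. log lag
    b = rgb.mean(axis=2).mean(axis=0)           # brightness profile along width
    lags = np.array([1, 2, 4, 8, 16])
    s = [np.std(b[l:] - b[:-l]) for l in lags]
    rough, _ = np.polyfit(np.log(lags), np.log(s), 1)   # Hurst-like exponent
    return hist, variety, rough
```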

  1. Distributed system for large-scale remote research

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2002-01-01

    In advanced photon research, large-scale simulations and high-resolution observations are powerful tools. In numerical and real experiments, a real-time visualization and steering system is considered a promising method of data analysis. This approach is suitable for one-off analyses or low-cost experiments and simulations. In research on an unknown problem, however, the output data must be analyzed many times, because conclusive analysis is difficult in a single pass. Consequently, output data should be filed so that they can be referred to and analyzed at any time. To support research, we need automatic functions for transporting data files from the data generator to data storage, analyzing data, tracking the history of data handling, and so on. The supporting system will be a functionally distributed system. (author)

  2. Low-scale gaugino mediation, lots of leptons at the LHC

    International Nuclear Information System (INIS)

    De Simone, Andrea; Fan Jiji; Skiba, Witold; Schmaltz, Martin

    2008-01-01

    Low-scale gaugino mediation predicts that gauginos are significantly heavier than scalar superpartners. In order of increasing mass the lightest superpartners are the gravitino, right-handed sleptons, and left-handed sleptons (no light neutralino). This implies that squark decay chains pass through one or more sleptons and typical final states from squark and gluino production at the LHC include multiple leptons. In addition, left-handed staus have large branching fractions into right-handed staus and the Higgs. As an example, we compute the spectrum of low-scale deconstructed gaugino mediation. In this model gauginos acquire masses at tree level at 5 TeV while scalar masses are generated radiatively from the gaugino masses.

  3. Risk-based optimization of pipe inspections in large underground networks with imprecise information

    International Nuclear Information System (INIS)

    Mancuso, A.; Compare, M.; Salo, A.; Zio, E.; Laakso, T.

    2016-01-01

    In this paper, we present a novel risk-based methodology for optimizing the inspections of large underground infrastructure networks in the presence of incomplete information about the network features and parameters. The methodology employs Multi Attribute Value Theory to assess the risk of each pipe in the network, whereafter the optimal inspection campaign is built with Portfolio Decision Analysis (PDA). Specifically, Robust Portfolio Modeling (RPM) is employed to identify Pareto-optimal portfolios of pipe inspections. The proposed methodology is illustrated by reporting a real case study on the large-scale maintenance optimization of the sewerage network in Espoo, Finland. - Highlights: • Risk-based approach to optimize pipe inspections on large underground networks. • Reasonable computational effort to select efficient inspection portfolios. • Possibility to accommodate imprecise expert information. • Feasibility of the approach shown by Espoo water system case study.
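
    A toy version of the portfolio step conveys the idea: enumerate inspection portfolios under a budget and keep the Pareto front over (cost, risk covered). The data below is invented, and real Robust Portfolio Modeling additionally handles interval-valued risk scores and networks far too large for brute-force enumeration.

```python
from itertools import combinations

pipes = {"A": (3, 9.1), "B": (2, 5.4), "C": (4, 7.8), "D": (1, 2.2)}  # cost, risk
budget = 6

feasible = []
for r in range(len(pipes) + 1):
    for combo in combinations(pipes, r):
        cost = sum(pipes[p][0] for p in combo)
        risk = sum(pipes[p][1] for p in combo)   # total risk inspected away
        if cost <= budget:
            feasible.append((cost, risk, combo))

# keep non-dominated portfolios: none other is at most as costly and
# covers at least as much risk
front = [p for p in feasible
         if not any(q[0] <= p[0] and q[1] >= p[1] and q != p for q in feasible)]
```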

  4. Reliability in Warehouse-Scale Computing: Why Low Latency Matters

    DEFF Research Database (Denmark)

    Nannarelli, Alberto

    2015-01-01

    Warehouse-sized buildings are nowadays hosting several types of large computing systems: from supercomputers to large clusters of servers providing the infrastructure to the cloud. Although the main target, especially for high-performance computing, is still to achieve high throughput, the limiting factor of these warehouse-scale data centers is the power dissipation. Power is dissipated not only in the computation itself, but also in heat removal (fans, air conditioning, etc.) to keep the temperature of the devices within the operating ranges. The need to keep the temperature low within…

  5. Bursting and large-scale intermittency in turbulent convection with differential rotation

    International Nuclear Information System (INIS)

    Garcia, O.E.; Bian, N.H.

    2003-01-01

    The tilting mechanism, which generates differential rotation in two-dimensional turbulent convection, is shown to produce relaxation oscillations in the mean flow energy integral and bursts in the global fluctuation level, akin to Lotka-Volterra oscillations. The basic reason for such behavior is the unidirectional and conservative transfer of kinetic energy from the fluctuating motions to the mean component of the flows, and its dissipation at large scales. Results from numerical simulations further demonstrate the intimate relation between these low-frequency modulations and the large-scale intermittency of convective turbulence, as manifested by exponential tails in single-point probability distribution functions. Moreover, the spatio-temporal evolution of convective structures illustrates the mechanism triggering avalanche events in the transport process. The latter involves the overlap of delocalized mixing regions when the barrier to transport, produced by the mean component of the flow, transiently disappears

  6. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Cleary, Joseph

    2018-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an array of four telescopes designed to measure the polarization of the Cosmic Microwave Background. CLASS aims to detect the B-mode polarization from primordial gravitational waves predicted by cosmic inflation theory, as well as the imprint left by reionization upon the CMB E-mode polarization. This will be achieved through a combination of observing strategy and state-of-the-art instrumentation. CLASS is observing 70% of the sky to characterize the CMB at large angular scales, which will measure the entire CMB power spectrum from the reionization peak to the recombination peak. The four telescopes operate at frequencies of 38, 93, 145, and 217 GHz, in order to estimate Galactic synchrotron and dust foregrounds while avoiding atmospheric absorption. CLASS employs rapid polarization modulation to overcome atmospheric and instrumental noise. Polarization-sensitive cryogenic detectors with low noise levels provide CLASS the sensitivity required to constrain the tensor-to-scalar ratio down to levels of r ~ 0.01 while also measuring the optical depth to reionization to sample-variance levels. These improved constraints on the optical depth to reionization are required to pin down the mass of neutrinos from complementary cosmological data. CLASS has completed a year of observations at 38 GHz and is in the process of deploying the rest of the telescope array. This poster provides an overview and update on the CLASS science, hardware and survey operations.

  7. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts that play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before a stable state is reached in which Heaps' law still holds while strict Zipf's law has disappeared. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of disease. Employing the United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease.
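
    For concreteness, the two scalings are: Zipf's law, frequency ~ rank^(-a), and Heaps' law, distinct-type count ~ N^b. A small illustrative Python sketch for estimating both exponents from a sequence of events (here imagined as infection events labelled by location) follows; the log-log least-squares fit is a common but assumed estimation choice, not the paper's method.

```python
import numpy as np
from collections import Counter

def zipf_heaps_exponents(events):
    """events: sequence of hashable labels, in the order they occurred."""
    # Zipf exponent a: log-log fit of sorted frequencies against rank
    freqs = np.array(sorted(Counter(events).values(), reverse=True), float)
    ranks = np.arange(1, len(freqs) + 1)
    a = -np.polyfit(np.log(ranks), np.log(freqs), 1)[0]
    # Heaps exponent b: growth of distinct labels vs. events observed so far
    seen, growth = set(), []
    for e in events:
        seen.add(e)
        growth.append(len(seen))
    n = np.arange(1, len(events) + 1)
    b = np.polyfit(np.log(n), np.log(growth), 1)[0]
    return a, b
```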

  8. Biomass Energy for Transport and Electricity: Large scale utilization under low CO2 concentration scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Luckow, Patrick; Wise, Marshall A.; Dooley, James J.; Kim, Son H.

    2010-01-25

    This paper examines the potential role of large scale, dedicated commercial biomass energy systems under global climate policies designed to stabilize atmospheric concentrations of CO2 at 400ppm and 450ppm. We use an integrated assessment model of energy and agriculture systems to show that, given a climate policy in which terrestrial carbon is appropriately valued equally with carbon emitted from the energy system, biomass energy has the potential to be a major component of achieving these low concentration targets. The costs of processing and transporting biomass energy at much larger scales than current experience are also incorporated into the modeling. From the scenario results, 120-160 EJ/year of biomass energy is produced by midcentury and 200-250 EJ/year by the end of this century. In the first half of the century, much of this biomass is from agricultural and forest residues, but after 2050 dedicated cellulosic biomass crops become the dominant source. A key finding of this paper is the role that carbon dioxide capture and storage (CCS) technologies coupled with commercial biomass energy can play in meeting stringent emissions targets. Despite the higher technology costs of CCS, the resulting negative emissions used in combination with biomass are a very important tool in controlling the cost of meeting a target, offsetting the venting of CO2 from sectors of the energy system that may be more expensive to mitigate, such as oil use in transportation. The paper also discusses the role of cellulosic ethanol and Fischer-Tropsch biomass derived transportation fuels and shows that both technologies are important contributors to liquid fuels production, with unique costs and emissions characteristics. Through application of the GCAM integrated assessment model, it becomes clear that, given CCS availability, bioenergy will be used both in electricity and transportation.

  9. Multidimensional scaling for large genomic data sets

    Directory of Open Access Journals (Sweden)

    Lu Henry

    2008-04-01

    Full Text Available Abstract. Background: Multi-dimensional scaling (MDS) is aimed at representing high-dimensional data in a low-dimensional space with preservation of the similarities between data points. This reduction in dimensionality is crucial for analyzing and revealing the genuine structure hidden in the data. For noisy data, dimension reduction can effectively reduce the effect of noise on the embedded structure. For large data sets, dimension reduction can effectively reduce information retrieval complexity. Thus, MDS techniques are used in many applications of data mining and gene network research. However, although there have been a number of studies that applied MDS techniques to genomics research, the number of analyzed data points was restricted by the high computational complexity of MDS. In general, a non-metric MDS method is faster than a metric MDS, but it does not preserve the true relationships. The computational complexity of most metric MDS methods is over O(N²), so that it is difficult to process a data set with a large number of genes N, such as in the case of whole genome microarray data. Results: We developed a new rapid metric MDS method with a low computational complexity, making metric MDS applicable for large data sets. Computer simulation showed that the new method of split-and-combine MDS (SC-MDS) is fast, accurate and efficient. Our empirical studies using microarray data on the yeast cell cycle showed that the performance of K-means in the reduced dimensional space is similar to or slightly better than that of K-means in the original space, but about three times faster to obtain the clustering results. Our clustering results using SC-MDS are more stable than those in the original space. Hence, the proposed SC-MDS is useful for analyzing whole genome data. Conclusion: Our new method reduces the computational complexity from O(N³) to O(N) when the dimension of the feature space is far less than the number of genes N, and it successfully
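
    The O(N³) cost quoted above is that of classical metric MDS, whose core is an eigendecomposition of the double-centered distance matrix. A compact Python sketch of that core is below; the split-and-combine idea of SC-MDS, as we understand it, runs such an embedding on overlapping subsets and stitches the pieces together, a step omitted here.

```python
import numpy as np

def classical_mds(D, dim=2):
    """D: (N, N) matrix of pairwise distances; returns (N, dim) coordinates."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]            # largest eigenvalues first
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))
```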

  10. Large scale features and energetics of the hybrid subtropical low `Duck' over the Tasman Sea

    Science.gov (United States)

    Pezza, Alexandre Bernardes; Garde, Luke Andrew; Veiga, José Augusto Paixão; Simmonds, Ian

    2014-01-01

    New aspects of the genesis and partial tropical transition of a rare hybrid subtropical cyclone on the eastern Australian coast are presented. The `Duck' (March 2001) attracted more recent attention due to its underlying genesis mechanisms being remarkably similar to the first South Atlantic hurricane (March 2004). Here we put this cyclone in climate perspective, showing that it belongs to a class within the 1 % lowest frequency percentile in the Southern Hemisphere as a function of its thermal evolution. A large scale analysis reveals a combined influence from an existing tropical cyclone and a persistent mid-latitude block. A Lagrangian tracer showed that the upper level air parcels arriving at the cyclone's center had been modified by the blocking. Lorenz energetics is used to identify connections with both tropical and extratropical processes, and reveal how these create the large scale environment conducive to the development of the vortex. The results reveal that the blocking exerted the most important influence, with a strong peak in barotropic generation of kinetic energy over a large area traversed by the air parcels just before genesis. A secondary peak also coincided with the first time the cyclone developed an upper level warm core, but with insufficient amplitude to allow for a full tropical transition. The applications of this technique are numerous and promising, particularly on the use of global climate models to infer changes in environmental parameters associated with severe storms.

  11. Protein homology model refinement by large-scale energy optimization.

    Science.gov (United States)

    Park, Hahnbeom; Ovchinnikov, Sergey; Kim, David E; DiMaio, Frank; Baker, David

    2018-03-20

    Proteins fold to their lowest free-energy structures, and hence the most straightforward way to increase the accuracy of a partially incorrect protein structure model is to search for the lowest-energy nearby structure. This direct approach has met with little success for two reasons: first, energy function inaccuracies can lead to false energy minima, resulting in model degradation rather than improvement; and second, even with an accurate energy function, the search problem is formidable because the energy only drops considerably in the immediate vicinity of the global minimum, and there are a very large number of degrees of freedom. Here we describe a large-scale energy optimization-based refinement method that incorporates advances in both search and energy function accuracy that can substantially improve the accuracy of low-resolution homology models. The method refined low-resolution homology models into correct folds for 50 of 84 diverse protein families and generated improved models in recent blind structure prediction experiments. Analyses of the basis for these improvements reveal contributions from both the improvements in conformational sampling techniques and the energy function.

  12. Prospects for mass unification at low energy scales

    International Nuclear Information System (INIS)

    Volkas, R.R.

    1995-01-01

    A simple Pati-Salam SU(4) model with a low symmetry breaking scale of about 1000 TeV is presented. The analysis concentrates on calculating radiative corrections to tree-level mass relations for third generation fermions. The tree-level relation m_b/m_τ = 1 predicted by such models can receive large radiative corrections of up to about 50% due to threshold effects at the mass unification scale. These corrections are thus of about the same importance as those that give rise to renormalisation group running. The high figure of 50% can be achieved because 1-loop graphs involving the physical charged Higgs boson give corrections to m_τ - m_b that are proportional to the large top quark mass. These corrections can either increase or decrease m_b/m_τ depending on the value of an unknown parameter. They can also be made to vanish through a fine-tuning. A related model of tree-level t-b-τ unification which uses the identification of SU(2)_R with custodial SU(2) is then discussed. A curious relation m_b ∼ √2 m_τ is found to be satisfied at tree level in this model. The overall conclusion of this work is that the tree-level relation m_b = m_τ at low scales such as 1000 TeV or somewhat higher can produce a successful value for m_b/m_τ after corrections, but one must be mindful that radiative corrections beyond those incorporated through the renormalisation group can be very important. 14 refs., 7 figs.

  13. Causal inference between bioavailability of heavy metals and environmental factors in a large-scale region.

    Science.gov (United States)

    Liu, Yuqiong; Du, Qingyun; Wang, Qi; Yu, Huanyun; Liu, Jianfeng; Tian, Yu; Chang, Chunying; Lei, Jing

    2017-07-01

    The causation between the bioavailability of heavy metals and environmental factors is generally inferred from field experiments at local scales at present, and lacks sufficient evidence from large scales. Inferring causation between the bioavailability of heavy metals and environmental factors across large-scale regions is challenging, because the conventional correlation-based approaches used for causation assessment across large-scale regions can, at the expense of actual causation, yield spurious insights. In this study, a general approach framework, Intervention calculus when the Directed Acyclic Graph (DAG) is absent (IDA) combined with the backdoor criterion (BC), was introduced to identify causation between the bioavailability of heavy metals and potential environmental factors across large-scale regions. We take the Pearl River Delta (PRD) in China as a case study. The causal structures and effects were identified based on the concentrations of heavy metals (Zn, As, Cu, Hg, Pb, Cr, Ni and Cd) in soil (0-20 cm depth) and vegetable (lettuce) and 40 environmental factors (soil properties, extractable heavy metals and weathering indices) in 94 samples across the PRD. Results show that the bioavailability of heavy metals (Cd, Zn, Cr, Ni and As) was causally influenced by soil properties and soil weathering factors, whereas no causal factor impacted the bioavailability of Cu, Hg and Pb. No latent factor was found between the bioavailability of heavy metals and environmental factors. The causation between the bioavailability of heavy metals and environmental factors in field experiments is consistent with that on a large scale. IDA combined with the BC provides a powerful tool to identify causation between the bioavailability of heavy metals and environmental factors across large-scale regions. Causal inference in a large system with dynamic changes has great implications for system-based risk management. Copyright © 2017 Elsevier Ltd. All rights reserved.
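
    The backdoor-criterion step has a simple concrete reading in the linear case: once a graph says a covariate set Z blocks all backdoor paths from exposure X to outcome Y, the causal effect of X on Y is the X-coefficient in a regression of Y on X and Z. The sketch below assumes linear relationships and a known adjustment set; IDA generalizes this by repeating the adjustment over every admissible parent set when the DAG itself is uncertain.

```python
import numpy as np

def backdoor_effect(x, y, Z):
    """x, y: (n,) arrays; Z: (n, k) array of backdoor-blocking covariates.
    Returns the adjusted linear effect of a unit change in x on y."""
    design = np.column_stack([np.ones_like(x), x, Z])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coef[1]
```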

  14. Housing Affordability And Children's Cognitive Achievement.

    Science.gov (United States)

    Newman, Sandra; Holupka, C Scott

    2016-11-01

    Housing cost burden-the fraction of income spent on housing-is the most prevalent housing problem affecting the healthy development of millions of low- and moderate-income children. By affecting disposable income, a high burden affects parents' expenditures on both necessities for and enrichment of their children, as well as investments in their children. Reducing those expenditures and investments, in turn, can affect children's development, including their cognitive skills and physical, social, and emotional health. This article summarizes the first empirical evidence of the effects of housing affordability on children's cognitive achievement and on one factor that appears to contribute to these effects: the larger expenditures on child enrichment by families in affordable housing. We found that housing cost burden has the same relationship to both children's cognitive achievement and enrichment spending on children, exhibiting an inverted U shape in both cases. The maximum benefit occurs when housing cost burden is near 30 percent of income-the long-standing rule-of-thumb definition of affordable housing. The effect of the burden is stronger on children's math ability than on their reading comprehension and is more pronounced with burdens above the 30 percent standard. For enrichment spending, the curve is "shallower" (meaning the effect of optimal affordability is less pronounced) but still significant. Project HOPE—The People-to-People Health Foundation, Inc.

  15. Decomposition of residual oil by large scale HSC plant

    Energy Technology Data Exchange (ETDEWEB)

    Washimi, Koichi; Ogata, Yoshitaka; Limmer, H.; Schuetter, H. (Toyo Engineering Corp., Funabashi (Japan); VEB Petrolchemisches Kombinat Schwedt, Schwedt (East Germany))

    1989-07-01

    The characteristic features and operating conditions of a new large-scale, high-decomposition-ratio HSC visbreaker plant in East Germany are introduced. As characteristics of the process, the following were indicated: a high decomposition ratio with stable decomposed oil; the ability to accept high-sulfur feeds, or even the decomposed residuum of a conventional visbreaker; stable light products with a low content of unsaturated components; and low investment with low running cost. To realize the high decomposition ratio, the design suppresses decomposition in the heating furnace and accelerates it in the soaking drum, and a high gas-phase space velocity is used for better agitation. The design of the soaking drum, together with its main dimensions, is identified as the main subject of technical development. Operating conditions of the plant in East Germany, which processes residual oil supplied by an existing visbreaker running on USSR crude, are presented. 6 refs., 4 figs., 2 tabs.

  16. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. At the same time, an analysis of the possibilities of small-scale model experiments is also presented, mostly in connection with the transfer of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig.

  17. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    …which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach…

  18. Experimental facilities for large-scale and full-scale study of hydrogen accidents

    Energy Technology Data Exchange (ETDEWEB)

    Merilo, E.; Groethe, M.; Colton, J. [SRI International, Poulter Laboratory, Menlo Park, CA (United States)]; Chiba, S. [SRI Japan, Tokyo (Japan)]

    2007-07-01

    This paper summarized some of the work performed at SRI International over the past 5 years that addresses safety issues for the hydrogen-based economy. Researchers at SRI International have conducted experiments at the Corral Hollow Experiment Site (CHES) near Livermore, California to obtain fundamental data on hydrogen explosions for risk assessment. In particular, large-scale hydrogen tests were conducted using homogeneous mixtures of hydrogen in volumes from 5.3 m{sup 3} to 300 m{sup 3} to represent scenarios involving fuel cell vehicles as well as transport and storage facilities. Experiments have focused on unconfined deflagrations of hydrogen and air, and detonations of hydrogen in a semi-open space to measure free-field blast effects; the use of blast walls as a mitigation technique; turbulent enhancement of hydrogen combustion due to obstacles within the mixture, and determination of when deflagration-to-detonation transition occurs; the effect of confined hydrogen releases and explosions that could originate from an interconnecting hydrogen pipeline; and large and small accidental releases of hydrogen. The experiments were conducted to improve the prediction of hydrogen explosions and the capabilities for performing risk assessments, and to develop mitigation techniques. Measurements included hydrogen concentration; flame speed; blast overpressure; heat flux; and high-speed, standard, and infrared video. The data collected in these experiments are used to correlate computer models and to facilitate the development of codes and standards. This work contributes to better safety technology by evaluating the effectiveness of different blast mitigation techniques. 13 refs., 13 figs.

  19. Sodium-cutting: a new top-down approach to cut open nanostructures on nonplanar surfaces on a large scale.

    Science.gov (United States)

    Chen, Wei; Deng, Da

    2014-11-11

    We report a new, low-cost and simple top-down approach, "sodium-cutting", to cut and open nanostructures deposited on a nonplanar surface on a large scale. The feasibility of sodium-cutting was demonstrated by successfully cutting open ∼100% of carbon nanospheres into nanobowls on a large scale from Sn@C nanospheres for the first time.

  20. Causal inference between bioavailability of heavy metals and environmental factors in a large-scale region

    International Nuclear Information System (INIS)

    Liu, Yuqiong; Du, Qingyun; Wang, Qi; Yu, Huanyun; Liu, Jianfeng; Tian, Yu; Chang, Chunying; Lei, Jing

    2017-01-01

    The causation between bioavailability of heavy metals and environmental factors is generally established through field experiments at local scales, and lacks sufficient evidence from large scales. Inferring causation between the bioavailability of heavy metals and environmental factors across large-scale regions is challenging, because the conventional correlation-based approaches used for causation assessment across large-scale regions can, at the expense of actual causation, produce spurious insights. In this study, a general approach framework, Intervention calculus when the directed acyclic graph (DAG) is absent (IDA) combined with the backdoor criterion (BC), was introduced to identify causation between the bioavailability of heavy metals and potential environmental factors across large-scale regions. We take the Pearl River Delta (PRD) in China as a case study. The causal structures and effects were identified based on the concentrations of heavy metals (Zn, As, Cu, Hg, Pb, Cr, Ni and Cd) in soil (0–20 cm depth) and vegetable (lettuce) and 40 environmental factors (soil properties, extractable heavy metals and weathering indices) in 94 samples across the PRD. Results show that the bioavailability of heavy metals (Cd, Zn, Cr, Ni and As) was causally influenced by soil properties and soil weathering factors, whereas no causal factor impacted the bioavailability of Cu, Hg and Pb. No latent factor was found between the bioavailability of heavy metals and environmental factors. The causation between the bioavailability of heavy metals and environmental factors found in field experiments is consistent with that on a large scale. The IDA combined with the BC provides a powerful tool to identify causation between the bioavailability of heavy metals and environmental factors across large-scale regions. Causal inference in a large system with dynamic changes has great implications for system-based risk management.

  1. Probabilistic discrimination between large-scale environments of intensifying and decaying African Easterly Waves

    Energy Technology Data Exchange (ETDEWEB)

    Agudelo, Paula A. [Area Hidrometria e Instrumentacion Carrera, Empresas Publicas de Medellin, Medellin (Colombia); Hoyos, Carlos D.; Curry, Judith A.; Webster, Peter J. [School of Earth and Atmospheric Sciences, Georgia Institute of Technology, Atlanta, GA (United States)

    2011-04-15

    About 50-60% of Atlantic tropical cyclones (TCs), including nearly 85% of intense hurricanes, have their origins as African Easterly Waves (AEWs). However, predicting the likelihood of AEW intensification remains a difficult task. We have developed a Bayesian diagnostic methodology to understand the genesis of North Atlantic TCs spawned by AEWs through the examination of the characteristics of the AEW itself together with the large-scale environment, resulting in a probabilistic discrimination between large-scale environments associated with intensifying and decaying AEWs. The methodology is based on a new objective and automatic AEW tracking scheme applied to the period 1980 to 2001, based on spatio-temporally Fourier-filtered relative vorticity and meridional winds at different levels and outgoing longwave radiation. Using the AEW and Hurricane Best Track Files (HURDAT) data sets, probability density functions of environmental variables that discriminate between AEWs that decay, become TCs or become major hurricanes are determined. Results indicate that the initial amplitude of the AEWs is a major determinant of TC genesis, and that TC genesis risk increases when the wave enters an environment characterized by pre-existing large-scale convergence and moist convection. For the prediction of genesis, the most useful variables are column-integrated heating, vertical velocity and specific humidity, a combined form of divergence and vertical velocity, and SST. It is also found that the state of the large-scale environment modulates the annual cycle and interannual variability of the AEW intensification efficiency. (orig.)

  2. VLSI scaling methods and low power CMOS buffer circuit

    International Nuclear Information System (INIS)

    Sharma Vijay Kumar; Pattanaik Manisha

    2013-01-01

    Device scaling is an important part of very large scale integration (VLSI) design, underpinning the success of the VLSI industry by enabling denser and faster integration of devices. As the technology node moves into the very deep submicron region, leakage current and circuit reliability become the key issues; both increase with each new technology generation and affect the performance of the overall logic circuit. VLSI designers must balance power dissipation against circuit performance as devices are scaled. In this paper, different scaling methods are studied first. These scaling methods are used to identify their effects on the power dissipation and propagation delay of a CMOS buffer circuit. To mitigate power dissipation in scaled devices, we propose a reliable, leakage-reducing low power transmission gate (LPTG) approach and test it on a complementary metal oxide semiconductor (CMOS) buffer circuit. All simulation results are obtained with the HSPICE tool using Berkeley predictive technology model (BPTM) BSIM4 bulk CMOS files. The LPTG CMOS buffer reduces power dissipation by 95.16% with an 84.20% improvement in figure of merit at the 32 nm technology node. Various process, voltage and temperature variations are analyzed to demonstrate the robustness of the proposed approach. Leakage current uncertainty decreases from 0.91 to 0.43 in the CMOS buffer circuit, which improves circuit reliability. (semiconductor integrated circuits)

  3. Initial condition effects on large scale structure in numerical simulations of plane mixing layers

    Science.gov (United States)

    McMullan, W. A.; Garrett, S. J.

    2016-01-01

    In this paper, Large Eddy Simulations are performed on the spatially developing plane turbulent mixing layer. The simulated mixing layers originate from initially laminar conditions. The focus of this research is on the effect of the nature of the imposed fluctuations on the large-scale spanwise and streamwise structures in the flow. Two simulations are performed; one with low-level three-dimensional inflow fluctuations obtained from pseudo-random numbers, the other with physically correlated fluctuations of the same magnitude obtained from an inflow generation technique. Where white-noise fluctuations provide the inflow disturbances, no spatially stationary streamwise vortex structure is observed, and the large-scale spanwise turbulent vortical structures grow continuously and linearly. These structures are observed to have a three-dimensional internal geometry with branches and dislocations. Where physically correlated fluctuations provide the inflow disturbances, a "streaky" streamwise structure that is spatially stationary is observed, with the large-scale turbulent vortical structures growing with the square-root of time. These large-scale structures are quasi-two-dimensional, on top of which the secondary structure rides. The simulation results are discussed in the context of the varying interpretations of mixing layer growth that have been postulated. Recommendations are made concerning the data required from experiments in order to produce accurate numerical simulation recreations of real flows.

  4. Environmental degradation, global food production, and risk for large-scale migrations

    International Nuclear Information System (INIS)

    Doeoes, B.R.

    1994-01-01

    This paper attempts to estimate to what extent global food production is affected by the ongoing environmental degradation through processes, such as soil erosion, salinization, chemical contamination, ultraviolet radiation, and biotic stress. Estimates have also been made of available opportunities to improve food production efficiency by, e.g., increased use of fertilizers, irrigation, and biotechnology, as well as improved management. Expected losses and gains of agricultural land in competition with urbanization, industrial development, and forests have been taken into account. Although estimated gains in food production deliberately have been overestimated and losses underestimated, calculations indicate that during the next 30-35 years the annual net gain in food production will be significantly lower than the rate of world population growth. An attempt has also been made to identify possible scenarios for large-scale migrations, caused mainly by rapid population growth in combination with insufficient local food production and poverty. 18 refs, 7 figs, 6 tabs

  5. Actant affordances: a brief history of affordance theory and a ...

    African Journals Online (AJOL)

    Affordance theory provides a useful lens to explore the action opportunities that arise between users and technology, especially in education. However, developments in the theory have resulted in both confusion and misapplication, due partly to issues related to affordance theory's ontology. This paper outlines two ...

  6. Actant affordances: a brief history of affordance theory and a ...

    African Journals Online (AJOL)

    He combined affordances and perceptual information in a simple matrix, as shown in figure … Secondly it shows up cases of False Affordances that appear to be useful … Distributed Cognitions: Psychological and Educational Considerations.

  7. The carcinogenic risks of low-LET and high-LET ionizing radiations

    International Nuclear Information System (INIS)

    Fabrikant, J.I.

    1991-08-01

    This report presents a discussion of the risk from ionizing radiations to human populations. Important new information on human beings has come mainly from further follow-up of existing epidemiological studies, notably the Japanese atomic bomb survivors and the ankylosing spondylitis patients; from new epidemiological surveys, such as the patients treated for cancer of the uterine cervix; and from combined surveys, including workers exposed in underground mines. Since the numerous and complex differences among the different study populations introduce factors that influence the derived risk estimates in ways that are not completely understood, it is not clear how to combine the different risk estimates obtained. These factors involve complex biological and physical variables distributed over time. Because such carcinogenic effects occur too infrequently to be demonstrated at low doses, the risks of low-dose radiation can be estimated only by interpolation from observations at high doses on the basis of theoretical concepts, mathematical models and available empirical evidence, primarily the epidemiological surveys of large populations exposed to ionizing radiation. In spite of a considerable amount of research, only recently have there been efforts to apply the extensive laboratory data from animals to define the dose-incidence relationship in the low-dose region. There simply are insufficient data in the epidemiological studies of large human populations to estimate risk coefficients directly from exposure to low doses. The risk estimates for the carcinogenic effects of radiation have, in the past, been somewhat low, and a reassessment of the numerical values is now necessary.

  8. Research on precision grinding technology of large scale and ultra thin optics

    Science.gov (United States)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism errors of large scale and ultra thin optics have an important influence on the subsequent polishing efficiency and accuracy. In order to realize high precision grinding of these ductile elements, a low deformation vacuum chuck was designed first, which clamps the optics with high supporting rigidity over the full aperture. The optics was then plane-ground under vacuum adsorption. After machining, the vacuum system was turned off and the form error of the optics was measured on-machine with a displacement sensor after elastic restitution. The flatness was converged to high accuracy by compensation machining, with trajectories generated from the measurement result. To obtain high parallelism, the optics was turned over and compensation-ground using the form error of the vacuum chuck. Finally, a grinding experiment on large scale and ultra thin fused silica optics with an aperture of 430 mm × 430 mm × 10 mm was performed. The best P-V flatness of the optics was below 3 μm, and the parallelism was below 3″. This machining technique has been applied in batch grinding of large scale and ultra thin optics.

  9. Can poor consumers pay for energy and water? An affordability analysis for transition countries

    International Nuclear Information System (INIS)

    Fankhauser, S.; Tepic, S.

    2007-01-01

    Low-income households spend a substantial share of their income on utility services such as electricity, heating and water. The difficulty of these socially vulnerable consumers to absorb further price increases is often used as an argument against tariff reform. However, detailed quantitative information on the affordability of tariff adjustments for low-income consumers is actually quite scarce. Much of the available information is based on households. This paper takes a more detailed look at the affordability of electricity, district heating and water for low-income consumers in transition countries. While the available data are incomplete, the paper finds that affordability is a problem for low-income consumers in most countries, in particular in the water sector and in the Commonwealth of Independent States (CIS). The affordability consequences of tariff reform ultimately depend on the speed of tariff adjustments relative to the growth in household income, the level of tariffs needed for cost recovery, the level of effective tariffs at the outset (tariffs adjusted for non-payment) and the demand response to the tariff increase. The paper finds that delaying tariff reform by a few years makes little difference to affordability constraints, and may therefore not be an effective way to mitigate the social impact of utility reform. (author)

  10. Can poor consumers pay for energy and water? An affordability analysis for transition countries

    International Nuclear Information System (INIS)

    Fankhauser, Samuel; Tepic, Sladjana

    2007-01-01

    Low-income households spend a substantial share of their income on utility services such as electricity, heating and water. The difficulty of these socially vulnerable consumers to absorb further price increases is often used as an argument against tariff reform. However, detailed quantitative information on the affordability of tariff adjustments for low-income consumers is actually quite scarce. Much of the available information is based on households. This paper takes a more detailed look at the affordability of electricity, district heating and water for low-income consumers in transition countries. While the available data are incomplete, the paper finds that affordability is a problem for low-income consumers in most countries, in particular in the water sector and in the Commonwealth of Independent States (CIS). The affordability consequences of tariff reform ultimately depend on the speed of tariff adjustments relative to the growth in household income, the level of tariffs needed for cost recovery, the level of effective tariffs at the outset (tariffs adjusted for non-payment) and the demand response to the tariff increase. The paper finds that delaying tariff reform by a few years makes little difference to affordability constraints, and may therefore not be an effective way to mitigate the social impact of utility reform

  11. Availability and Affordability of Insurance Under Climate Change. A Growing Challenge for the U.S

    International Nuclear Information System (INIS)

    Mills, E.; Roth, R.J. Jr; Lecomte, E.

    2005-01-01

    The paper explores the insurability of risks from climate change, and ways in which insurance affordability and availability could be adversely impacted in the U.S. in the coming years. It includes examples where affordability and availability of insurance are already at risk from rising weather-related losses and how future financial exposure for insurers, governments, businesses and consumers could worsen if current climate and business trends continue.

  12. A family of conjugate gradient methods for large-scale nonlinear equations.

    Science.gov (United States)

    Feng, Dexiang; Sun, Min; Wang, Xueyong

    2017-01-01

    In this paper, we present a family of conjugate gradient projection methods for solving large-scale nonlinear equations. At each iteration, it needs low storage and the subproblem can be easily solved. Compared with the existing solution methods for solving the problem, its global convergence is established without the restriction of the Lipschitz continuity on the underlying mapping. Preliminary numerical results are reported to show the efficiency of the proposed method.
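
    For illustration, a minimal derivative-free method of this general type can be sketched as follows. This is a generic hyperplane-projection scheme in the spirit of Solodov and Svaiter with a CG-style direction, not the specific family proposed in the paper; the beta formula and all parameter values are illustrative assumptions.

```python
# Sketch: CG-type projection method for monotone equations F(x) = 0.
import numpy as np

def cg_projection(F, x, rho=0.5, sigma=1e-4, tol=1e-8, max_iter=1000):
    Fx = F(x)
    d = -Fx                          # first direction: steepest-descent-like
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        # Backtracking line search: find z = x + t*d with
        # -F(z)^T d >= sigma * t * ||d||^2 (sufficient decrease).
        t = 1.0
        while True:
            z = x + t * d
            Fz = F(z)
            if -Fz @ d >= sigma * t * (d @ d) or t < 1e-12:
                break
            t *= rho
        # Project x onto the hyperplane separating it from the solution set.
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
        Fx_new = F(x)
        beta = Fx_new @ (Fx_new - Fx) / (Fx @ Fx)  # PRP-like choice
        d = -Fx_new + beta * d       # low storage: only a few vectors kept
        Fx = Fx_new
    return x

# Example: the monotone system F(x) = x + sin(x) in 1000 variables.
x_star = cg_projection(lambda v: v + np.sin(v), np.full(1000, 2.0))
print(np.linalg.norm(x_star))        # ~0: converges to the zero solution
```

    Only a handful of vectors are stored and no Jacobian is ever formed, which is what makes methods of this type attractive for large-scale problems.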

  13. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  14. Online Censoring for Large-Scale Regressions with Application to Streaming Big Data.

    Science.gov (United States)

    Berberidis, Dimitris; Kekatos, Vassilis; Giannakis, Georgios B

    2016-08-01

    On par with data-intensive applications, the sheer size of modern linear regression problems creates an ever-growing demand for efficient solvers. Fortunately, a significant percentage of the data accrued can be omitted while maintaining a certain quality of statistical inference with an affordable computational budget. This work introduces means of identifying and omitting less informative observations in an online and data-adaptive fashion. Given streaming data, the related maximum-likelihood estimator is sequentially found using first- and second-order stochastic approximation algorithms. These schemes are well suited when data are inherently censored or when the aim is to save communication overhead in decentralized learning setups. In a different operational scenario, the task of joint censoring and estimation is put forth to solve large-scale linear regressions in a centralized setup. Novel online algorithms are developed enjoying simple closed-form updates and provable (non)asymptotic convergence guarantees. To attain desired censoring patterns and levels of dimensionality reduction, thresholding rules are investigated too. Numerical tests on real and synthetic datasets corroborate the efficacy of the proposed data-adaptive methods compared to data-agnostic random projection-based alternatives.
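
    A minimal sketch of the censoring idea follows: a recursive least-squares estimator that skips observations whose normalized innovation is small. This is not the paper's estimator; the threshold rule and all parameter values are illustrative assumptions.

```python
# Censored online least squares: update only on informative samples.
import numpy as np

def censored_rls(X, y, tau=0.5, lam=1e-2):
    n, p = X.shape
    theta = np.zeros(p)
    P = np.eye(p) / lam               # regularized inverse sample covariance
    used = 0
    for i in range(n):
        x, yi = X[i], y[i]
        r = yi - x @ theta            # innovation (prediction residual)
        s = np.sqrt(1.0 + x @ P @ x)  # rough scale of the innovation
        if abs(r) / s < tau:          # censor: observation adds little info
            continue
        k = P @ x / (1.0 + x @ P @ x)      # Kalman-style gain
        theta = theta + k * r              # simple closed-form update
        P = P - np.outer(k, x @ P)
        used += 1
    return theta, used

rng = np.random.default_rng(1)
X = rng.normal(size=(100_000, 10))
theta_true = rng.normal(size=10)
y = X @ theta_true + 0.1 * rng.normal(size=100_000)

theta_hat, used = censored_rls(X, y)
print(f"used {used} of {len(y)} samples; "
      f"estimation error {np.linalg.norm(theta_hat - theta_true):.3f}")
```

    Most samples end up censored once the estimate has converged, which is the kind of saving the abstract describes for streaming and decentralized setups.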

  15. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the "Large-scale Structure of the Universe" symposium are presented at a popular level. Described are the cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of galaxy distributions. Images of cell structures obtained after computer processing are given. Three hypotheses - vortical, entropic and adiabatic - suggesting different processes for the origin of galaxies and galaxy clusters are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. Relict radiation is considered as a method of directly studying the processes taking place in the Universe: the large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the properties of disturbances at the pre-galaxy stage. The discussion of problems pertaining to the hot gas contained in galaxy clusters, and the interactions within galaxy clusters and with the intergalactic medium, is recognized as a notable contribution to the development of theoretical and observational cosmology.

  16. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-digital converter recording to a laptop. The results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out against the permissible value of vibration velocity. For cases exceeding the permissible values, recommendations were developed to reduce the level of seismic impact.

  17. A global classification of coastal flood hazard climates associated with large-scale oceanographic forcing.

    Science.gov (United States)

    Rueda, Ana; Vitousek, Sean; Camus, Paula; Tomás, Antonio; Espejo, Antonio; Losada, Inigo J; Barnard, Patrick L; Erikson, Li H; Ruggiero, Peter; Reguero, Borja G; Mendez, Fernando J

    2017-07-11

    Coastal communities throughout the world are exposed to numerous and increasing threats, such as coastal flooding and erosion, saltwater intrusion and wetland degradation. Here, we present the first global-scale analysis of the main drivers of coastal flooding due to large-scale oceanographic factors. Given the large dimensionality of the problem (e.g. spatiotemporal variability in flood magnitude and the relative influence of waves, tides and surge levels), we have performed a computer-based classification to identify geographical areas with homogeneous climates. Results show that 75% of coastal regions around the globe have the potential for very large flooding events with low probabilities (unbounded tails), 82% are tide-dominated, and almost 49% are highly susceptible to increases in flooding frequency due to sea-level rise.
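
    As a hedged illustration of the classification step, homogeneous hazard climates can be obtained by clustering coastal sites on their large-scale flood drivers. The three features, their synthetic values and the cluster count below are assumptions for illustration, not the paper's actual predictor set.

```python
# Toy "flood hazard climate" classification via k-means clustering.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_sites = 5000
features = np.column_stack([
    rng.gamma(2.0, 1.0, n_sites),  # wave contribution to flood level (m)
    rng.gamma(3.0, 0.5, n_sites),  # tidal range (m)
    rng.gamma(1.5, 0.3, n_sites),  # storm-surge contribution (m)
])

# Standardize so no single driver dominates the distance metric, then
# group sites into a handful of homogeneous hazard climates.
X = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

# Crude analogue of the paper's "tide-dominated" share: the fraction of
# sites whose largest standardized driver is the tide.
tide_dominated = np.mean(np.argmax(X, axis=1) == 1)
print(f"{tide_dominated:.0%} of synthetic sites are tide-dominated")
```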

  18. A comparison of working in small-scale and large-scale nursing homes: A systematic review of quantitative and qualitative evidence.

    Science.gov (United States)

    Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos

    2017-02-01

    Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. The studies reviewed showed that these inconclusive findings can be attributed to care workers in some

  19. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI to visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
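
    A toy version of the underlying data structure might look like the sketch below, in which a hash map stands in for the GPU per-pixel linked list; the class name, fragment layout and filtering rule are illustrative assumptions, not the thesis implementation.

```python
# Per-pixel fragment lists for explorable images of pathlines.
from collections import defaultdict

class ExplorableImage:
    def __init__(self, width, height):
        self.size = (width, height)
        # per-pixel "linked list": pixel -> list of fragment records
        self.fragments = defaultdict(list)

    def splat(self, x, y, depth, seed_id, velocity):
        """Insert one projected pathline segment at pixel (x, y)."""
        self.fragments[(x, y)].append((depth, seed_id, velocity))

    def shade(self, x, y, vmin=0.0):
        """Front-most fragment surviving a velocity filter (or None)."""
        frags = [f for f in self.fragments[(x, y)] if f[2] >= vmin]
        return min(frags, default=None)   # smallest depth = front-most

img = ExplorableImage(1920, 1080)
img.splat(10, 20, depth=0.7, seed_id=3, velocity=1.2)
img.splat(10, 20, depth=0.4, seed_id=8, velocity=0.3)
print(img.shade(10, 20, vmin=0.5))  # (0.7, 3, 1.2): slow fragment filtered
```

    Because filtering and color-coding only touch the per-pixel lists, a view can be re-explored without re-reading the original flow field, which is the point of the explorable-images approach.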

  20. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    Science.gov (United States)

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach with a speedup that scales quadratically with the number of partitions. The D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain.
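
    A much-simplified sketch of the decomposition idea follows: split the sensor grid into partitions and fit an independent low-order surface to the measured slopes in each one. This is a hypothetical illustration only; it replaces the multivariate-spline machinery with a single quadratic per partition and omits D-SABRE's piston-alignment stitching stage entirely.

```python
# Partitioned wavefront reconstruction from slope measurements.
import numpy as np

def local_fit(sx, sy):
    """Least-squares quadratic wavefront from slope samples on a grid."""
    ny, nx = sx.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    x, y = xx.ravel().astype(float), yy.ravel().astype(float)
    # w(x,y) = a*x + b*y + c*x^2 + d*y^2 + e*x*y (piston is unobservable)
    # gradients: wx = a + 2c*x + e*y ; wy = b + 2d*y + e*x
    A = np.zeros((2 * x.size, 5))
    A[: x.size] = np.column_stack([np.ones_like(x), 0 * x, 2 * x, 0 * y, y])
    A[x.size:] = np.column_stack([0 * x, np.ones_like(y), 0 * x, 2 * y, x])
    coef, *_ = np.linalg.lstsq(A, np.r_[sx.ravel(), sy.ravel()], rcond=None)
    a, b, c, d, e = coef
    return a * xx + b * yy + c * xx**2 + d * yy**2 + e * xx * yy

# Decompose a 64x64 slope field into 16 partitions, solved independently
# (and hence trivially in parallel, which is where the speedup comes from).
rng = np.random.default_rng(3)
SX, SY = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))
tiles = [local_fit(SX[i:i+16, j:j+16], SY[i:i+16, j:j+16])
         for i in range(0, 64, 16) for j in range(0, 64, 16)]
print(len(tiles), tiles[0].shape)    # 16 independent 16x16 reconstructions
```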

  1. 24 CFR 1000.104 - What families are eligible for affordable housing activities?

    Science.gov (United States)

    2010-04-01

    ... affordable housing activities? 1000.104 Section 1000.104 Housing and Urban Development Regulations Relating... Activities § 1000.104 What families are eligible for affordable housing activities? The following families... Indian area. (b) A non-low income Indian family may receive housing assistance in accordance with § 1000...

  2. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made

  3. Iodine oxides in large-scale THAI tests

    International Nuclear Information System (INIS)

    Funke, F.; Langrock, G.; Kanzleiter, T.; Poss, G.; Fischer, K.; Kühnel, A.; Weber, G.; Allelein, H.-J.

    2012-01-01

    Highlights: ► Iodine oxide particles were produced from gaseous iodine and ozone. ► Ozone replaced the effect of ionizing radiation in the large-scale THAI facility. ► The mean diameter of the iodine oxide particles was about 0.35 μm. ► Particle formation was faster than the chemical reaction between iodine and ozone. ► Deposition of iodine oxide particles was slow in the absence of other aerosols. - Abstract: The conversion of gaseous molecular iodine into iodine oxide aerosols has significant relevance in the understanding of the fission product iodine volatility in a LWR containment during severe accidents. In containment, the high radiation field caused by fission products released from the reactor core induces radiolytic oxidation into iodine oxides. To study the characteristics and the behaviour of iodine oxides in large scale, two THAI tests Iod-13 and Iod-14 were performed, simulating radiolytic oxidation of molecular iodine by reaction of iodine with ozone, with ozone injected from an ozone generator. The observed iodine oxides form submicron particles with mean volume-related diameters of about 0.35 μm and show low deposition rates in the THAI tests performed in the absence of other nuclear aerosols. Formation of iodine aerosols from gaseous precursors iodine and ozone is fast as compared to their chemical interaction. The current approach in empirical iodine containment behaviour models in severe accidents, including the radiolytic production of I2-oxidizing agents followed by the I2 oxidation itself, is confirmed by these THAI tests.

  4. Challenges of Integrating Affordable and Sustainable Housing in Malaysia

    Science.gov (United States)

    Syed Jamaludin, S. Z. H.; Mahayuddin, S. A.; Hamid, S. H. A.

    2018-04-01

    Developing countries, including Malaysia, have begun to comprehend the need for affordable and sustainable housing development. The majority of the population still aspires to a comfortable, safe and reasonably priced house. Households in the low-to-middle income range face difficulties finding housing that satisfies both their needs and their budget. Unfortunately, most housing development programs consider affordability rather than sustainability, and developers are more interested in profit and neglect sustainability issues. Thus, the aim of this paper is to review the challenges in integrating affordable housing and sustainable practices in Malaysia. The paper is based on an extensive literature review and serves as a basis for developing strategies for integrated affordable and sustainable housing in Malaysia. The challenges are divided into four sections, namely market challenges, professional challenges, societal challenges and technological challenges. The outcomes of this paper will assist decision making in housing development and help enhance quality of life for sustainable communities.

  5. Engineering an Affordable Self-Driving Car

    KAUST Repository

    Budisteanu, Alexandru Ionut

    2018-01-17

    "More than a million people die in car accidents each year, and most of those accidents are the result of human errorヤ Alexandru Budisteanu is 23 years old and owns a group of startups including Autonomix, an Artificial Intelligence software for affordable self-driving cars and he designed a low-cost self-driving car. The car\\'s roof has cameras and low-resolution 3D LiDAR equipment to detect traffic lanes, other cars, curbs and obstacles, such as people crossing by. To process this dizzying amount of data, Alexandru employed Artificial Intelligence algorithms to extract information from the visual data and plot a safe route for the car. Then, he built a manufacturing facility in his garage from Romania to assembly affordable VisionBot Pick and Place robots that are used to produce electronics. During this lecture, Alexandru will talk about this autonomous self-driving car prototype, for which he received the grand prize of the Intel International Science and Engineering Fair, and was nominated by TIME magazine as one of the worldメs most influential teens of 2013.

  6. Secure File Allocation and Caching in Large-scale Distributed Systems

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Mei, Alessandro; Jajodia, Sushil

    2012-01-01

    In this paper, we present a file allocation and caching scheme that guarantees high assurance, availability, and load balancing in a large-scale distributed file system that can support dynamic updates of authorization policies. The scheme uses fragmentation and replication to store files with high security requirements in a system composed of a majority of low-security servers. We develop mechanisms to fragment files, to allocate them into multiple servers, and to cache them as close as possible to their readers while preserving the security requirements of the files, providing load-balancing, and reducing the delay of read operations. The system offers a trade-off between performance and security that is dynamically tunable according to the current level of threat. We validate our mechanisms with extensive simulations in an Internet-like network.

  7. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
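
    The flavor of the result can be shown numerically. For one-dimensional ecological diffusion of this form, the small-scale motility enters the large-scale equation through (what works out to be) its harmonic mean, so slow habitat patches dominate effective spread; the habitat values and proportions below are invented for illustration.

```python
# Effective large-scale coefficient for ecological diffusion (1-D sketch).
import numpy as np

mu_forest, mu_meadow = 10.0, 1000.0   # motility (m^2/day) by habitat type
frac_forest = 0.3                     # fraction of terrain that is forest

# Fine-scale (10-100 m) habitat map drawn at random.
rng = np.random.default_rng(4)
mu = np.where(rng.random(10_000) < frac_forest, mu_forest, mu_meadow)

arithmetic = mu.mean()
harmonic = 1.0 / np.mean(1.0 / mu)    # homogenized (effective) coefficient

print(f"arithmetic mean: {arithmetic:7.1f}")  # ~703: would overstate spread
print(f"harmonic mean:   {harmonic:7.1f}")    # ~ 33: slow patches dominate
```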

  8. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  9. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  10. Risk Factors for Depression : Differential Across Age?

    NARCIS (Netherlands)

    Schaakxs, Roxanne; Comijs, Hannie C; van der Mast, Roos C; Schoevers, Robert A; Beekman, Aartjan T F; Penninx, Brenda W J H

    INTRODUCTION: The occurrence of well-established risk factors for depression differs across the lifespan. Risk factors may be more strongly associated with depression at ages when occurrence, and therefore expectance, is relatively low ("on-time off-time" hypothesis). This large-scale study examined

  11. Spatial and temporal distribution of pore gas concentrations during mainstream large-scale trough composting in China.

    Science.gov (United States)

    Zeng, Jianfei; Shen, Xiuli; Sun, Xiaoxi; Liu, Ning; Han, Lujia; Huang, Guangqun

    2018-05-01

    With the advantages of high treatment capacity and low operational cost, large-scale trough composting has become one of the mainstream composting patterns in composting plants in China. This study measured concentrations of O2, CO2, CH4 and NH3 on-site to investigate the spatial and temporal distribution of pore gas concentrations during mainstream large-scale trough composting in China. The results showed that the temperature in the center of the pile was obviously higher than that at the side of the pile. Pore O2 concentration decreased rapidly while still sustaining the composting process when the pile was naturally aerated, which will contribute to improving the currently undesirable atmospheric environment in China. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Prospects for mass unification at low energy scales

    Energy Technology Data Exchange (ETDEWEB)

    Volkas, R.R.

    1995-12-31

    A simple Pati-Salam SU(4) model with a low symmetry breaking scale of about 1000 TeV is presented. The analysis concentrates on calculating radiative corrections to tree level mass relations for third generation fermions. The tree-level relation m{sub b}/m{sub {tau}} = 1 predicted by such models can receive large radiative corrections up to about 50% due to threshold effects at the mass unification scale. These corrections are thus of about the same importance as those that give rise to renormalisation group running. The high figure of 50% can be achieved because 1-loop graphs involving the physical charged Higgs boson give corrections to m{sub {tau}} - m{sub b} that are proportional to the large top quark mass. These corrections can either increase or decrease m{sub b}/m{sub {tau}} depending on the value of an unknown parameter. They can also be made to vanish through a fine-tuning. A related model of tree-level t-b-{tau} unification which uses the identification of SU(2){sub R} with custodial SU(2) is then discussed. A curious relation m{sub b}{approx} {radical}2m{sub {tau}} is found to be satisfied at tree-level in this model. The overall conclusion of this work is that the tree-level relation m{sub b}=m{sub {tau}} at low scales such as 1000 TeV or somewhat higher can produce a successful value for m{sub b}/m{sub {tau}} after corrections, but one must be mindful that radiative corrections beyond those incorporated through the renormalisation group can be very important. 14 refs., 7 figs.

  13. Remote Sensing Contributions to Prediction and Risk Assessment of Natural Disasters Caused by Large Scale Rift Valley Fever Outbreaks

    Science.gov (United States)

    Anyamba, Assaf; Linthicum, Kenneth J.; Small, Jennifer; Britch, S. C.; Tucker, C. J.

    2012-01-01

    Remotely sensed vegetation measurements for the last 30 years combined with other climate data sets such as rainfall and sea surface temperatures have come to play an important role in the study of the ecology of arthropod-borne diseases. We show that epidemics and epizootics of previously unpredictable Rift Valley fever are directly influenced by large scale flooding associated with the El Niño/Southern Oscillation. This flooding affects the ecology of disease transmitting arthropod vectors through vegetation development and other bioclimatic factors. This information is now utilized to monitor, model, and map areas of potential Rift Valley fever outbreaks and is used as an early warning system for risk reduction of outbreaks to human and animal health, trade, and associated economic impacts. The continuation of such satellite measurements is critical to anticipating, preventing, and managing disease epidemics and epizootics and other climate-related disasters.

  14. Large-scale self-assembled zirconium phosphate smectic layers via a simple spray-coating process

    Science.gov (United States)

    Wong, Minhao; Ishige, Ryohei; White, Kevin L.; Li, Peng; Kim, Daehak; Krishnamoorti, Ramanan; Gunther, Robert; Higuchi, Takeshi; Jinnai, Hiroshi; Takahara, Atsushi; Nishimura, Riichi; Sue, Hung-Jue

    2014-04-01

    The large-scale assembly of asymmetric colloidal particles is used in creating high-performance fibres. A similar concept is extended to the manufacturing of thin films of self-assembled two-dimensional crystal-type materials with enhanced and tunable properties. Here we present a spray-coating method to manufacture thin, flexible and transparent epoxy films containing zirconium phosphate nanoplatelets self-assembled into a lamellar arrangement aligned parallel to the substrate. The self-assembled mesophase of zirconium phosphate nanoplatelets is stabilized by epoxy pre-polymer and exhibits rheology favourable towards large-scale manufacturing. The thermally cured film forms a mechanically robust coating and shows excellent gas barrier properties at both low- and high humidity levels as a result of the highly aligned and overlapping arrangement of nanoplatelets. This work shows that the large-scale ordering of high aspect ratio nanoplatelets is easier to achieve than previously thought and may have implications in the technological applications for similar materials.

  15. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
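
    The pipeline can be caricatured in a few lines. In the sketch below, text tokens stand in for visual words, TruncatedSVD plays the role of latent semantic analysis, and the discrimination score is a simple class-mean gap rather than the paper's criterion; everything about the data is invented.

```python
# Latent "categories" via LSA on bag-of-(visual-)words histograms.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Each "document" lists the words found in one image; the first three
# images are aeroplane-class, the rest are cow-class.
images = [
    "wing fuselage sky cloud", "sky cloud horizon wing", "wing engine sky",
    "grass cow field", "cow field fence", "grass field cow",
]
X = TfidfVectorizer().fit_transform(images)

# LSA: project images onto a few latent categories, which may capture
# objects, object parts or recurring context such as "sky".
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

# Category selection: score each latent category by how well it
# separates the target class from the rest.
labels = np.array([1, 1, 1, 0, 0, 0])
for k in range(Z.shape[1]):
    gap = abs(Z[labels == 1, k].mean() - Z[labels == 0, k].mean())
    print(f"latent category {k}: discrimination score {gap:.2f}")
```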

  16. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  17. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing of, and interaction between, engineering and safety organizations can either benefit or hinder the overall end product. The traditional approach to formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to the timing and content of formal phase safety reviews for IHA. The description of the tailoring process for IHA shows how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story, particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  18. Streamflow Observations From Cameras: Large-Scale Particle Image Velocimetry or Particle Tracking Velocimetry?

    Science.gov (United States)

    Tauro, F.; Piscopia, R.; Grimaldi, S.

    2017-12-01

    Image-based methodologies, such as large scale particle image velocimetry (LSPIV) and particle tracking velocimetry (PTV), have increased our ability to noninvasively conduct streamflow measurements by affording spatially distributed observations at high temporal resolution. However, progress in optical methodologies has not been paralleled by the implementation of image-based approaches in environmental monitoring practice. We attribute this fact to the sensitivity of LSPIV, by far the most frequently adopted algorithm, to visibility conditions and to the occurrence of visible surface features. In this work, we test both LSPIV and PTV on a data set of 12 videos captured in a natural stream wherein artificial floaters are homogeneously and continuously deployed. Further, we apply both algorithms to a video of a high flow event on the Tiber River, Rome, Italy. In our application, we propose a modified PTV approach that only takes into account realistic trajectories. Based on our findings, LSPIV largely underestimates surface velocities with respect to PTV in both favorable (12 videos in a natural stream) and adverse (high flow event in the Tiber River) conditions. On the other hand, PTV is in closer agreement than LSPIV with benchmark velocities in both experimental settings. In addition, the accuracy of PTV estimations can be directly related to the transit of physical objects in the field of view, thus providing tangible data for uncertainty evaluation.
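
    For readers unfamiliar with the PTV side of the comparison, the following is a minimal illustrative sketch, not the authors' modified algorithm: tracer detections are linked frame to frame by nearest neighbour, and links implying implausibly high surface velocities are discarded, in the spirit of keeping only realistic trajectories. All parameter values are assumptions:

      # Minimal PTV-style sketch (assumed, not the paper's algorithm): link
      # tracer detections in consecutive frames and keep only displacements
      # below a maximum plausible surface velocity.
      import numpy as np

      def link_frames(p0, p1, dt, max_speed):
          """Return per-tracer velocities for links whose speed is physically plausible."""
          velocities = []
          for pt in p0:
              d = np.linalg.norm(p1 - pt, axis=1)
              j = int(np.argmin(d))                 # nearest detection in the next frame
              speed = d[j] / dt
              if speed <= max_speed:                # discard unrealistic trajectories
                  velocities.append((p1[j] - pt) / dt)
          return np.array(velocities)

      p0 = np.array([[0.0, 0.0], [1.0, 2.0]])      # tracer positions at frame t (metres)
      p1 = np.array([[0.1, 0.0], [1.1, 2.0]])      # positions at frame t+1
      v = link_frames(p0, p1, dt=0.04, max_speed=3.0)   # 25 fps video, 3 m/s cap
      print("mean surface velocity:", v.mean(axis=0))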

  19. A family of conjugate gradient methods for large-scale nonlinear equations

    Directory of Open Access Journals (Sweden)

    Dexiang Feng

    2017-09-01

    Full Text Available In this paper, we present a family of conjugate gradient projection methods for solving large-scale nonlinear equations. Each iteration needs only low storage, and its subproblem can be easily solved. Compared with existing solution methods for the problem, global convergence is established without requiring Lipschitz continuity of the underlying mapping. Preliminary numerical results are reported to show the efficiency of the proposed method.
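
    As a concrete illustration of this class of methods, here is a generic derivative-free conjugate gradient projection iteration for a monotone mapping F, in the Solodov-Svaiter style with a Fletcher-Reeves-type parameter; this is a sketch of the general technique, not the paper's exact family, and all parameter values are assumptions:

      # Sketch: search along a conjugate-gradient direction d_k, accept a trial
      # point z_k when F(z_k) makes a sufficient angle with d_k, then project
      # x_k onto the hyperplane separating it from the solution set.
      import numpy as np

      def cg_projection(F, x, iters=200, sigma=1e-4, rho=0.5, tol=1e-8):
          d = -F(x)
          for _ in range(iters):
              Fx = F(x)
              if np.linalg.norm(Fx) < tol:
                  break
              t = 1.0
              while True:                      # backtracking line search
                  z = x + t * d
                  if -F(z) @ d >= sigma * t * np.linalg.norm(d) ** 2 or t < 1e-12:
                      break
                  t *= rho
              Fz = F(z)
              # project x onto the hyperplane {y : F(z)^T (y - z) = 0}
              x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
              beta = (F(x) @ F(x)) / (Fx @ Fx)   # Fletcher-Reeves-type parameter
              d = -F(x) + beta * d
          return x

      F = lambda x: x + np.sin(x)              # toy monotone mapping with root at 0
      print(cg_projection(F, np.full(5, 2.0)))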

  20. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described in detail the large-scale communication architecture of the IOT. In fact, the non-uniformity of technologies between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  1. Assuring Access to Affordable Coverage

    Data.gov (United States)

    U.S. Department of Health & Human Services — Under the Affordable Care Act, millions of uninsured Americans will gain access to affordable coverage through Affordable Insurance Exchanges and improvements in...

  2. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  3. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutronic sensor in three different running modes: pulse, fluctuation and current. The study described in this note comprises three parts: - A theoretical study of the large scale channel and a brief description of it are given, together with the results obtained so far in that domain. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a linear fluctuation channel with automatic commutation of scales is described and the test results are given. In this large scale channel, the data processing method is analog. - To become independent of the problems generated by analog processing of the fluctuation signal, a digital data processing method is tested and its validity is demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr

  4. High Working Memory Load Increases Intracortical Inhibition in Primary Motor Cortex and Diminishes the Motor Affordance Effect.

    Science.gov (United States)

    Freeman, Scott M; Itthipuripat, Sirawaj; Aron, Adam R

    2016-05-18

    Motor affordances occur when the visual properties of an object elicit behaviorally relevant motor representations. Typically, motor affordances only produce subtle effects on response time or on motor activity indexed by neuroimaging/neuroelectrophysiology, but sometimes they can trigger action itself. This is apparent in "utilization behavior," where individuals with frontal cortex damage inappropriately grasp affording objects. This raises the possibility that, in healthy-functioning individuals, frontal cortex helps ensure that irrelevant affordance provocations remain below the threshold for actual movement. In Experiment 1, we tested this "frontal control" hypothesis by "loading" the frontal cortex with an effortful working memory (WM) task (which ostensibly consumes frontal resources) and examined whether this increased EEG measures of motor affordances to irrelevant affording objects. Under low WM load, there were typical motor affordance signatures: an event-related desynchronization in the mu frequency and an increased P300 amplitude for affording (vs nonaffording) objects over centroparietal electrodes. Contrary to our prediction, however, these affordance measures were diminished under high WM load. In Experiment 2, we tested competing mechanisms responsible for the diminished affordance in Experiment 1. We used paired-pulse transcranial magnetic stimulation over primary motor cortex to measure long-interval cortical inhibition. We found greater long-interval cortical inhibition for high versus low load both before and after the affording object, suggesting that a tonic inhibition state in primary motor cortex could prevent the affordance from provoking the motor system. Overall, our results suggest that a high WM load "sets" the motor system into a suppressed state that mitigates motor affordances. Is an irrelevant motor affordance more likely to be triggered when you are under low or high cognitive load? We examined this using physiological measures

  5. Price, availability and affordability of medicines

    Directory of Open Access Journals (Sweden)

    Brenda S. Mhlanga

    2014-01-01

    Full Text Available Background: Medicines play an important role in healthcare, but prices can be a barrier to patient care. Few studies have looked at the prices of essential medicines in low- and middle-income countries in terms of patient affordability. Aim: To determine the prices, availability and affordability of medicines along the supply chain in Swaziland. Setting: Private- and public-sector facilities in Manzini, Swaziland. Methods: The standardised methodology designed by the World Health Organization and Health Action International was used to survey 16 chronic disease medicines. Data were collected in one administrative area in 10 private retail pharmacies and 10 public health facilities. Originator brand (OB) and lowest-priced generic equivalent (LPG) medicines were monitored and their prices were compared with international reference prices (IRPs). Affordability was calculated in terms of the daily wage of the lowest-paid unskilled government worker. Results: Mean availability was 68% in the public sector. Private-sector OB medicines were priced 32.4 times higher than IRPs, whilst LPGs were 7.32 times higher. OBs cost 473% more than LPGs. The total cumulative mark-ups for individual medicines ranged from 190.99% to 440.27%. The largest contributor to add-on cost was the retail mark-up (31%–53%). Standard treatment with originator brands cost more than a day's wage. Conclusion: Various policy measures, such as introducing price capping at all levels of the medicine supply chain, may increase the availability of essential medicines whilst at the same time reducing their prices for the low-income population.
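
    The two metrics used above, the price ratio against the international reference price and affordability in days' wages, reduce to simple arithmetic. A worked example with invented figures (illustrative only, not the survey data):

      # Worked example of the two WHO/HAI metrics: the ratio of the local unit
      # price to the international reference price (IRP), and affordability as
      # days' wages of the lowest-paid unskilled government worker.
      unit_price = 2.50                    # local price per tablet, local currency
      irp = 0.075                          # IRP per tablet, USD
      irp_local = irp * 7.0                # IRP converted at an assumed exchange rate

      price_ratio = unit_price / irp_local
      print(f"price is {price_ratio:.1f} times the IRP")

      daily_dose = 2                       # tablets per day for standard treatment
      treatment_days = 30
      daily_wage = 50.0                    # lowest government daily wage, local currency

      days_wages = (unit_price * daily_dose * treatment_days) / daily_wage
      print(f"one month of treatment costs {days_wages:.1f} days' wages")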

  7. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to its measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase measurement capability. A detailed discussion of the original LSTF features and capabilities is also provided.

  8. Can only poorer European countries afford large carnivores?

    Science.gov (United States)

    Kojola, Ilpo; Hallikainen, Ville; Helle, Timo; Swenson, Jon E

    2018-01-01

    One of the classic approaches in environmental economics is the environmental Kuznets curve, which predicts that when a national economy grows from low to medium levels, threats to biodiversity conservation increase, but they decrease when the economy moves from medium to high. We evaluated this approach by examining how population densities of the brown bear (Ursus arctos), gray wolf (Canis lupus), and Eurasian lynx (Lynx lynx) were related to the national economy in 24 European countries. We used forest proportions, the existence of a compensation system, and country group (former socialist countries, Nordic countries, other countries) as covariates in a linear model with the first- and the second-order polynomial terms of per capita gross domestic product (GDP). Country group was treated as a random factor, but remained insignificant and was ignored. All models concerning the brown bear and wolf provided evidence that population densities decreased with increasing GDP, but densities of lynx were virtually independent of GDP. Models for the wolf explained >80% of the variation in densities, with no difference between the model with all independent variables and the model with only GDP. For the bear, the model with GDP alone accounted for 10%, and all three variables for 33%, of the variation in densities. Wolves exhibit a higher capacity for dispersal and reproduction than bears or lynx, yet they exist at the lowest densities in wealthy European countries. We are aware that several other factors, not available for our models, influenced large carnivore densities. Based on the pronounced differences among large carnivore species in their countrywide relationships between densities and GDP, and a strikingly high relationship for the gray wolf, we suggest that our results reflect differences in political history and public acceptance of these species among countries. The compensation paid for the damages caused by the carnivores is not a key to higher carnivore densities.
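
    The statistical model is the standard Kuznets-curve specification: density regressed on first- and second-order terms of GDP. A minimal sketch with synthetic data (all numbers invented, not the study's):

      # Sketch of the modelling idea: ordinary least squares on GDP and GDP^2.
      import numpy as np

      rng = np.random.default_rng(1)
      gdp = rng.uniform(5, 80, 24)                     # per-capita GDP, thousands USD
      density = 30 - 0.3 * gdp + rng.normal(0, 2, 24)  # toy wolf densities

      # design matrix with intercept, GDP, GDP^2, fitted by least squares
      X = np.column_stack([np.ones_like(gdp), gdp, gdp ** 2])
      coef, *_ = np.linalg.lstsq(X, density, rcond=None)
      print("intercept, GDP, GDP^2 terms:", coef)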

  9. Refining Grasp Affordance Models by Experience

    DEFF Research Database (Denmark)

    Detry, Renaud; Kraft, Dirk; Buch, Anders Glent

    2010-01-01

    We present a method for learning object grasp affordance models in 3D from experience, and demonstrate its applicability through extensive testing and evaluation on a realistic and largely autonomous platform. Grasp affordance refers here to relative object-gripper configurations that yield stable...... with a visual model of the object they characterize. We explore a batch-oriented, experience-based learning paradigm where grasps sampled randomly from a density are performed, and an importance-sampling algorithm learns a refined density from the outcomes of these experiences. The first such learning cycle...... is bootstrapped with a grasp density formed from visual cues. We show that the robot effectively applies its experience by downweighting poor grasp solutions, which results in increased success rates at subsequent learning cycles. We also present success rates in a practical scenario where a robot needs...

  10. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on individual scale have been extensively studied due to the application potential on human behavior prediction and recommendation, and control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both websites browse and mobile towers visit. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flight. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and a high possibility for prediction. Furthermore, a scale-free featured mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
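
    A toy implementation of the two-ingredient model described above (exploration plus preferential return). The functional form p_explore = rho * n^(-gamma) and the parameter values are common choices in the mobility-model literature, assumed here rather than taken from the paper:

      # Toy simulation: with probability p_explore the walker visits a new
      # location; otherwise it returns to a known one with probability
      # proportional to past visits (preferential return).
      import numpy as np

      rng = np.random.default_rng(42)

      def simulate(steps=10_000, rho=0.6, gamma=0.21):
          visits = {0: 1}                          # location id -> visit count
          next_id = 1
          for _ in range(steps):
              p_explore = rho * len(visits) ** (-gamma)   # decays with experience
              if rng.random() < p_explore:
                  visits[next_id] = 1              # explore a brand-new location
                  next_id += 1
              else:
                  locs = list(visits)              # preferential return
                  counts = np.array([visits[l] for l in locs], dtype=float)
                  choice = rng.choice(locs, p=counts / counts.sum())
                  visits[choice] += 1
          return visits

      freqs = sorted(simulate().values(), reverse=True)
      print("top-5 visit counts:", freqs[:5])     # heavy-tailed, Zipf-like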

  11. The integration of novel diagnostics techniques for multi-scale monitoring of large civil infrastructures

    Directory of Open Access Journals (Sweden)

    F. Soldovieri

    2008-11-01

    Full Text Available In recent years, structural monitoring of large infrastructures (buildings, dams, bridges or, more generally, man-made structures) has attracted increased attention due to the growing interest in safety and security issues and in risk assessment through early detection. In this framework, the aim of the paper is to introduce a new integrated approach which combines two sensing techniques acting on different spatial and temporal scales. The first one is a distributed optic fiber sensor based on the Brillouin scattering phenomenon, which allows spatially and temporally continuous monitoring of the structure with a "low" spatial resolution (of the order of a metre). The second technique is based on the use of Ground Penetrating Radar (GPR), which can provide detailed images of the inner status of the structure (with a spatial resolution of less than tens of centimetres), but does not allow temporally continuous monitoring. The paper describes the features of these two techniques and provides experimental results concerning preliminary test cases.
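
    As a worked example of how distributed Brillouin measurements are turned into engineering quantities, the sketch below converts a Brillouin frequency shift into strain using typical literature coefficients for silica fibre at 1550 nm; the coefficients and readings are assumptions, not the paper's calibration:

      # Convert a measured Brillouin frequency shift into strain, after removing
      # the temperature contribution measured by a strain-free reference fibre.
      C_EPS = 0.05e6    # strain coefficient, Hz per microstrain (~0.05 MHz/ue, assumed)
      C_T = 1.0e6       # temperature coefficient, Hz per deg C (~1 MHz/C, assumed)

      def strain_microstrain(f_measured, f_reference, delta_T):
          """Strain from the Brillouin shift relative to the unstrained fibre."""
          thermal = C_T * delta_T            # part of the shift caused by temperature
          return (f_measured - f_reference - thermal) / C_EPS

      # 10.865 GHz measured vs 10.860 GHz unstrained baseline, 2 C above baseline
      print(strain_microstrain(10.865e9, 10.860e9, 2.0), "microstrain")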

  12. The genetic etiology of Tourette Syndrome: Large-scale collaborative efforts on the precipice of discovery

    Directory of Open Access Journals (Sweden)

    Marianthi Georgitsi

    2016-08-01

    Full Text Available Gilles de la Tourette Syndrome (TS) is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive; however, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us on the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG) has focused on large families, parent-proband trios and cases for large case-control designs such as genomewide association studies (GWAS), copy number variation (CNV) scans and exome/genome sequencing. TIC Genetics targets rare, large effect size mutations in simplex trios and multigenerational families. The European Multicentre Tics in Children Study (EMTICS) seeks to elucidate gene-environment interactions including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light into the complex pathophysiology of this disorder.

  13. The Genetic Etiology of Tourette Syndrome: Large-Scale Collaborative Efforts on the Precipice of Discovery

    Science.gov (United States)

    Georgitsi, Marianthi; Willsey, A. Jeremy; Mathews, Carol A.; State, Matthew; Scharf, Jeremiah M.; Paschou, Peristera

    2016-01-01

    Gilles de la Tourette Syndrome (TS) is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive. However, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us on the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG) has focused on large families, parent-proband trios and cases for large case-control designs such as genomewide association studies (GWAS), copy number variation (CNV) scans, and exome/genome sequencing. TIC Genetics targets rare, large effect size mutations in simplex trios, and multigenerational families. The European Multicentre Tics in Children Study (EMTICS) seeks to elucidate gene-environment interactions including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light into the complex pathophysiology of this disorder. PMID:27536211

  14. Low-scale gaugino mass unification

    International Nuclear Information System (INIS)

    Endo, M.; Yoshioka, K.

    2008-04-01

    We present a new class of scenarios with the gaugino mass unification at the weak scale. The unification conditions are generally classified and then, the mirage gauge mediation is explored where gaugino masses are naturally unified and scalar partners of quarks and leptons have no mass hierarchy. The low-energy mass spectrum is governed by the mirage of unified gauge coupling which is seen by low-energy observers. We also study several explicit models for dynamically realizing the TeV-scale unification. (orig.)

  15. Low-scale gaugino mass unification

    Energy Technology Data Exchange (ETDEWEB)

    Endo, M [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Yoshioka, K [Kyoto Univ. (Japan). Dept. of Physics

    2008-04-15

    We present a new class of scenarios with the gaugino mass unification at the weak scale. The unification conditions are generally classified and then, the mirage gauge mediation is explored where gaugino masses are naturally unified and scalar partners of quarks and leptons have no mass hierarchy. The low-energy mass spectrum is governed by the mirage of unified gauge coupling which is seen by low-energy observers. We also study several explicit models for dynamically realizing the TeV-scale unification. (orig.)

  16. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  17. Max-Min SINR in Large-Scale Single-Cell MU-MIMO: Asymptotic Analysis and Low Complexity Transceivers

    KAUST Repository

    Sifaou, Houssem

    2016-12-28

    This work focuses on the downlink and uplink of large-scale single-cell MU-MIMO systems in which the base station (BS) endowed with M antennas communicates with K single-antenna user equipments (UEs). Particularly, we aim at reducing the complexity of the linear precoder and receiver that maximize the minimum signal-to-interference-plus-noise ratio subject to a given power constraint. To this end, we consider the asymptotic regime in which M and K grow large with a given ratio. Tools from random matrix theory (RMT) are then used to compute, in closed form, accurate approximations for the parameters of the optimal precoder and receiver, when imperfect channel state information (modeled by the generic Gauss-Markov formulation) is available at the BS. The asymptotic analysis allows us to derive the asymptotically optimal linear precoder and receiver that are characterized by a lower complexity (due to the dependence on the large scale components of the channel) and, possibly, by a better resilience to imperfect channel state information. However, the implementation of both is still challenging as it requires fast inversions of large matrices in every coherence period. To overcome this issue, we apply the truncated polynomial expansion (TPE) technique to the precoding and receiving vector of each UE and make use of RMT to determine the optimal weighting coefficients on a per-UE basis that asymptotically solve the max-min SINR problem. Numerical results are used to validate the asymptotic analysis in the finite system regime and to show that the proposed TPE transceivers efficiently mimic the optimal ones, while requiring much lower computational complexity.
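
    To make the TPE idea concrete, here is an illustrative numpy sketch in which the matrix inverse of a regularized zero-forcing precoder is replaced by a low-order Neumann-series matrix polynomial; the paper instead derives optimal per-user weights via random matrix theory, which we do not reproduce, and all dimensions and scalars below are our assumptions:

      # TPE-style precoding sketch: approximate B^-1 by a J-term matrix
      # polynomial in B, so no inversion is needed per coherence block.
      import numpy as np

      rng = np.random.default_rng(0)
      M, K, J, lam = 64, 8, 8, 1.0            # BS antennas, users, TPE order, regularizer
      H = (rng.normal(size=(K, M)) + 1j * rng.normal(size=(K, M))) / np.sqrt(2)

      B = H.conj().T @ H + lam * np.eye(M)    # regularized Gram matrix (Hermitian)
      eigs = np.linalg.eigvalsh(B)
      alpha = 2.0 / (eigs[-1] + eigs[-K])     # in practice such scalars come from RMT
                                              # deterministic equivalents, not an eigensolve

      # J-term Neumann expansion: B^-1 ~ alpha * sum_{l<J} (I - alpha*B)^l
      A = np.eye(M) - alpha * B
      term = np.eye(M, dtype=complex)
      inv_approx = np.zeros((M, M), dtype=complex)
      for _ in range(J):
          inv_approx += alpha * term
          term = term @ A

      W_tpe = inv_approx @ H.conj().T         # TPE precoder, one matrix polynomial
      W_rzf = np.linalg.solve(B, H.conj().T)  # exact RZF precoder for comparison
      err = np.linalg.norm(W_tpe - W_rzf) / np.linalg.norm(W_rzf)
      print(f"relative deviation from RZF: {err:.3f}")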

  18. Hierarchical Learning of Tree Classifiers for Large-Scale Plant Species Identification.

    Science.gov (United States)

    Fan, Jianping; Zhou, Ning; Peng, Jinye; Gao, Ling

    2015-11-01

    In this paper, a hierarchical multi-task structural learning algorithm is developed to support large-scale plant species identification, where a visual tree is constructed for organizing large numbers of plant species in a coarse-to-fine fashion and determining the inter-related learning tasks automatically. For a given parent node on the visual tree, it contains a set of sibling coarse-grained categories of plant species or sibling fine-grained plant species, and a multi-task structural learning algorithm is developed to train their inter-related classifiers jointly for enhancing their discrimination power. The inter-level relationship constraint, e.g., a plant image must first be assigned to a parent node (high-level non-leaf node) correctly if it can further be assigned to the most relevant child node (low-level non-leaf node or leaf node) on the visual tree, is formally defined and leveraged to learn more discriminative tree classifiers over the visual tree. Our experimental results have demonstrated the effectiveness of our hierarchical multi-task structural learning algorithm on training more discriminative tree classifiers for large-scale plant species identification.
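
    A minimal coarse-to-fine sketch of the tree-classifier idea (our illustration, not the paper's multi-task learner): a root classifier routes a sample to a coarse group, and a per-group classifier then picks the species, so fine predictions always respect the chosen parent node. Data and group structure below are invented:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 20))
      species = rng.integers(0, 6, 300)            # 6 toy species
      group_of = species // 3                      # 2 coarse groups of 3 species each

      # one classifier per tree node: the root picks the group, leaves pick species
      root = LogisticRegression(max_iter=1000).fit(X, group_of)
      leaves = {g: LogisticRegression(max_iter=1000)
                   .fit(X[group_of == g], species[group_of == g])
                for g in np.unique(group_of)}

      def predict(x):
          g = int(root.predict(x.reshape(1, -1))[0])          # coarse decision first
          return int(leaves[g].predict(x.reshape(1, -1))[0])  # fine decision within group

      print("predicted species:", predict(X[0]))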

  19. Increasing condom use and declining STI prevalence in high-risk MSM and TGs: evaluation of a large-scale prevention program in Tamil Nadu, India.

    Science.gov (United States)

    Subramanian, Thilakavathi; Ramakrishnan, Lakshmi; Aridoss, Santhakumar; Goswami, Prabuddhagopal; Kanguswami, Boopathi; Shajan, Mathew; Adhikary, Rajat; Purushothaman, Girish Kumar Chethrapilly; Ramamoorthy, Senthil Kumar; Chinnaswamy, Eswaramurthy; Veeramani, Ilaya Bharathy; Paranjape, Ramesh Shivram

    2013-09-17

    This paper presents an evaluation of Avahan, a large-scale HIV prevention program that was implemented using peer-mediated strategies, condom distribution and sexually transmitted infection (STI) clinical services among high-risk men who have sex with men (HR-MSM) and male-to-female transgender persons (TGs) in Tamil Nadu, one of six high-prevalence states in southern India. Two rounds of large-scale cross-sectional bio-behavioural surveys among HR-MSM and TGs and routine program monitoring data were used to assess changes in program coverage, condom use and prevalence of STIs (including HIV) and their association with program exposure. The Avahan program for HR-MSM and TGs in Tamil Nadu was significantly scaled up, and contacts by peer educators reached 77 percent of the estimated denominator by the end of the program's fourth year. Exposure to the program increased between the two rounds of surveys for both HR-MSM (from 66 percent to 90 percent; AOR = 4.6) and TGs. The program in Tamil Nadu achieved a high coverage, resulting in improved condom use by HR-MSM with their regular and commercial male partners. Declining STI prevalence and stable HIV prevalence reflect the positive effects of the prevention strategy. Outcomes from the program logic model indicate the effectiveness of the program for HR-MSM and TGs in Tamil Nadu.

  20. Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors

    Science.gov (United States)

    Knapp, Roland A.; Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A. W.; Vrendenburg, Vance T.; Rosenblum, Erica Bree; Briggs, Cheryl J.

    2016-01-01

    Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth’s amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species’ adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale.

  1. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  2. Virtual neutron scattering experiments - Training and preparing students for large-scale facility experiments

    Directory of Open Access Journals (Sweden)

    Julie Hougaard Overgaard

    2016-11-01

    Full Text Available We describe how virtual experiments can be utilized in a learning design that prepares students for hands-on experiments at large-scale facilities. We illustrate the design by showing how virtual experiments are used at the Niels Bohr Institute in a master level course on neutron scattering. In the last week of the course, students travel to a large-scale neutron scattering facility to perform real neutron scattering experiments. Through student interviews and survey answers, we argue that the virtual training prepares the students to engage more fruitfully with experiments by letting them focus on physics and data rather than the overwhelming instrumentation. We argue that this is because they can transfer their virtual experimental experience to the real-life situation. However, we also find that learning is still situated in the sense that only knowledge of particular experiments is transferred. We conclude by discussing the opportunities that virtual experiments offer.

  3. Regional scale ecological risk assessment: using the relative risk model

    National Research Council Canada - National Science Library

    Landis, Wayne G

    2005-01-01

    ...) in the performance of regional-scale ecological risk assessments. The initial chapters present the methodology and the critical nature of the interaction between risk assessors and decision makers...

  4. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province, with an average size of 95 ha. In 1989 there were 98 systems, covering more than 10 130 ha. The study was conducted on 7 large systems with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 90s and the ownership changes in agriculture, the large-scale sprinkler systems suffered significant or total devastation. Land of the State Farms was leased or sold by the State Agricultural Property Agency, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs; there was also a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers, limitations of system solutions, supply difficulties, and high levels of equipment failure, none of which was conducive to rational use of the available sprinklers. A survey of the local area showed the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of assets previously invested in the sprinkler systems.

  5. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    …structure and chemical purity of 99.1%, as determined by inductively coupled plasma optical emission spectroscopy, on a large scale. Keywords: sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method.

  6. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential safety of the public and property from accidental, and even more importantly, intentional spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards from a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.

  7. Low-Frequency Synonymous Coding Variation in CYP2R1 Has Large Effects on Vitamin D Levels and Risk of Multiple Sclerosis.

    Science.gov (United States)

    Manousaki, Despoina; Dudding, Tom; Haworth, Simon; Hsu, Yi-Hsiang; Liu, Ching-Ti; Medina-Gómez, Carolina; Voortman, Trudy; van der Velde, Nathalie; Melhus, Håkan; Robinson-Cohen, Cassianne; Cousminer, Diana L; Nethander, Maria; Vandenput, Liesbeth; Noordam, Raymond; Forgetta, Vincenzo; Greenwood, Celia M T; Biggs, Mary L; Psaty, Bruce M; Rotter, Jerome I; Zemel, Babette S; Mitchell, Jonathan A; Taylor, Bruce; Lorentzon, Mattias; Karlsson, Magnus; Jaddoe, Vincent V W; Tiemeier, Henning; Campos-Obando, Natalia; Franco, Oscar H; Utterlinden, Andre G; Broer, Linda; van Schoor, Natasja M; Ham, Annelies C; Ikram, M Arfan; Karasik, David; de Mutsert, Renée; Rosendaal, Frits R; den Heijer, Martin; Wang, Thomas J; Lind, Lars; Orwoll, Eric S; Mook-Kanamori, Dennis O; Michaëlsson, Karl; Kestenbaum, Bryan; Ohlsson, Claes; Mellström, Dan; de Groot, Lisette C P G M; Grant, Struan F A; Kiel, Douglas P; Zillikens, M Carola; Rivadeneira, Fernando; Sawcer, Stephen; Timpson, Nicholas J; Richards, J Brent

    2017-08-03

    Vitamin D insufficiency is common, correctable, and influenced by genetic factors, and it has been associated with risk of several diseases. We sought to identify low-frequency genetic variants that strongly increase the risk of vitamin D insufficiency and tested their effect on risk of multiple sclerosis, a disease influenced by low vitamin D concentrations. We used whole-genome sequencing data from 2,619 individuals through the UK10K program and deep-imputation data from 39,655 individuals genotyped genome-wide. Meta-analysis of the summary statistics from 19 cohorts identified in CYP2R1 the low-frequency (minor allele frequency = 2.5%) synonymous coding variant g.14900931G>A (p.Asp120Asp) (rs117913124[A]), which conferred a large effect on 25-hydroxyvitamin D (25OHD) levels (-0.43 SD of standardized natural log-transformed 25OHD per A allele; p value = 1.5 × 10⁻⁸⁸). The effect on 25OHD was four times larger and independent of the effect of a previously described common variant near CYP2R1. By analyzing 8,711 individuals, we showed that heterozygote carriers of this low-frequency variant have an increased risk of vitamin D insufficiency (odds ratio [OR] = 2.2, 95% confidence interval [CI] = 1.78-2.78, p = 1.26 × 10⁻¹²). Individuals carrying one copy of this variant also had increased odds of multiple sclerosis (OR = 1.4, 95% CI = 1.19-1.64, p = 2.63 × 10⁻⁵) in a sample of 5,927 case and 5,599 control subjects. In conclusion, we describe a low-frequency CYP2R1 coding variant that exerts the largest effect upon 25OHD levels identified to date in the general European population and implicates vitamin D in the etiology of multiple sclerosis. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  8. Large-Scale Flows and Magnetic Fields Produced by Rotating Convection in a Quasi-Geostrophic Model of Planetary Cores

    Science.gov (United States)

    Guervilly, C.; Cardin, P.

    2017-12-01

    Convection is the main heat transport process in the liquid cores of planets. The convective flows are thought to be turbulent and constrained by rotation (corresponding to high Reynolds numbers Re and low Rossby numbers Ro). Under these conditions, and in the absence of magnetic fields, the convective flows can produce coherent Reynolds stresses that drive persistent large-scale zonal flows. The formation of large-scale flows has crucial implications for the thermal evolution of planets and the generation of large-scale magnetic fields. In this work, we explore this problem with numerical simulations using a quasi-geostrophic approximation to model convective and zonal flows at Re ∼ 10⁴ and Ro ∼ 10⁻⁴ for Prandtl numbers relevant for liquid metals (Pr ∼ 0.1). The formation of intense multiple zonal jets strongly affects the convective heat transport, leading to the formation of a mean temperature staircase. We also study the generation of magnetic fields by the quasi-geostrophic flows at low magnetic Prandtl numbers.

  9. Risk Syndromes and Scales Determining Risk in Schizophrenia and Other Psychoses

    Directory of Open Access Journals (Sweden)

    Soner Cakmak

    2015-12-01

    Full Text Available Schizophrenia is a chronic disorder leading to lifelong deterioration of social and vocational functioning. The prodromal period, the time interval that starts with emerging nonspecific signs and deficits and extends up to the presentation of distinct and ongoing schizophrenic symptoms, is observed in most schizophrenia patients. In schizophrenia, poor premorbid adjustment leads to a worse prognosis, and thus early detection and intervention are required in the prodromal period. To this end, classifications of risk factors for schizophrenia and psychosis, together with scales to determine the risk, are being utilized. The most frequently used scales are the Bonn Scale for the Assessment of Basic Symptoms (BSABS), the Comprehensive Assessment of At-Risk Mental States (CAARMS) and the Structured Interview for Psychosis-Risk Syndromes (SIPS). In the light of these developments, the recent edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) added psychosis risk syndrome, or attenuated psychosis syndrome, to indicate risk of transition to psychosis. These approaches revealed that the risk of progression to psychosis was not reliably correlated with fulfilled criteria, but the absence of criteria credibly predicted that psychosis was unlikely to emerge. Evidently, concomitant premorbid features and prodromal symptoms significantly increase the risk of progression to psychosis and schizophrenia in comparison to the normal population. Nevertheless, specification and elaboration of risk criteria will enhance the reliability of risk determination. [Archives Medical Review Journal 2015; 24(4): 494-508]

  10. ANALYSIS OF CLIMATIC AND SOCIAL PERFORMANCES OF LOW COST TERRACE HOUSING (LCTH: INTRODUCING THE AFFORDABLE QUALITY HOUSING (AQH CONCEPT IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    Noor Hanita Abdul Majid

    2009-03-01

    Full Text Available Low cost terrace housing (LCTH) is the most common form and popular typology of public housing in Malaysia. The provision of these houses is deemed the most suitable way to fulfil the need to house low income families, and also an alternative to high rise low cost housing. Since the implementation of these housing types, development of the layout and sizes of the houses has taken place to provide better living conditions. A literature review on current LCTH suggested that there are deficiencies in fulfilling the requirements for quality and affordable housing for low income families. This paper presents the scenario of LCTH design based on secondary findings by researchers on housing in Malaysia. The secondary data provided the grounds for the proposal of affordable quality housing (AQH) to handle the problems that occur in LCTH. Both social and climatic considerations are included in the AQH, addressing issues of privacy, segregation of genders and community interaction, along with thermal comfort and natural ventilation. Decisions on the AQH are carefully extracted from a comparative analysis with a view to improving current conditions. In order to verify some of the decisions on climatic design strategies, simulation results are presented. The results indicated that the design decisions have managed to improve the natural ventilation conditions in the low cost houses. With reservations on the social conditions that are yet to be tested in actual houses, the AQH has proven to be a step towards the provision of a better living environment.

  11. Increasing stress on disaster-risk finance due to large floods

    Science.gov (United States)

    Jongman, Brenden; Hochrainer-Stigler, Stefan; Feyen, Luc; Aerts, Jeroen C. J. H.; Mechler, Reinhard; Botzen, W. J. Wouter; Bouwer, Laurens M.; Pflug, Georg; Rojas, Rodrigo; Ward, Philip J.

    2014-04-01

    Recent major flood disasters have shown that single extreme events can affect multiple countries simultaneously, which puts high pressure on trans-national risk reduction and risk transfer mechanisms. So far, little is known about such flood hazard interdependencies across regions and the corresponding joint risks at regional to continental scales. Reliable information on correlated loss probabilities is crucial for developing robust insurance schemes and public adaptation funds, and for enhancing our understanding of climate change impacts. Here we show that extreme discharges are strongly correlated across European river basins. We present probabilistic trends in continental flood risk, and demonstrate that observed extreme flood losses could more than double in frequency by 2050 under future climate change and socio-economic development. We suggest that risk management for these increasing losses is largely feasible, and we demonstrate that risk can be shared by expanding risk transfer financing, reduced by investing in flood protection, or absorbed by enhanced solidarity between countries. We conclude that these measures have vastly different efficiency, equity and acceptability implications, which need to be taken into account in broader consultation, for which our analysis provides a basis.

  12. Does the effective Lagrangian for low-energy QCD scale?

    International Nuclear Information System (INIS)

    Birse, M.C.

    1994-01-01

    Quantum chromodynamics is not an approximately scale-invariant theory. Hence a dilaton field is not expected to provide a good description of the low-energy dynamics associated with the gluon condensate. Even if such a field is introduced, it remains almost unchanged in hadronic matter at normal densities. This is because the large glueball mass together with the size of the phenomenological gluon condensate ensure that changes to that condensate are very small at such densities. Any changes in hadronic masses and decay constants in matter generated by that condensate will be much smaller than those produced directly by changes in the quark condensate. Hence, masses and decay constants are not expected to display a universal scaling. (author)

  13. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
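
    The core of such a siting tool is a weighted multi-criteria overlay. A toy sketch under assumed criteria, weights and grids (our illustration of the general approach, not NREL's actual algorithm or data):

      # Weighted-overlay sketch: normalize each criterion raster to [0, 1],
      # combine with user-defined weights, mask excluded cells, rank the rest.
      import numpy as np

      rng = np.random.default_rng(0)
      solar = rng.uniform(4, 8, (50, 50))        # resource, kWh/m^2/day
      slope = rng.uniform(0, 20, (50, 50))       # terrain slope, percent
      dist_tx = rng.uniform(0, 30, (50, 50))     # distance to transmission, km
      protected = rng.random((50, 50)) < 0.1     # exclusion mask (protected areas)

      def norm(a, invert=False):
          s = (a - a.min()) / (a.max() - a.min())
          return 1 - s if invert else s          # invert criteria where lower is better

      weights = {"solar": 0.5, "slope": 0.2, "dist_tx": 0.3}   # stakeholder priorities
      score = (weights["solar"] * norm(solar)
               + weights["slope"] * norm(slope, invert=True)
               + weights["dist_tx"] * norm(dist_tx, invert=True))
      score[protected] = np.nan                  # excluded cells never rank

      best = np.unravel_index(np.nanargmax(score), score.shape)
      print("best candidate cell:", best, "score:", round(float(score[best]), 3))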

  14. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  15. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  16. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  17. Large-Scale Ocean Circulation-Cloud Interactions Reduce the Pace of Transient Climate Change

    Science.gov (United States)

    Trossman, D. S.; Palter, J. B.; Merlis, T. M.; Huang, Y.; Xia, Y.

    2016-01-01

    Changes to the large scale oceanic circulation are thought to slow the pace of transient climate change due, in part, to their influence on radiative feedbacks. Here we evaluate the interactions between CO2-forced perturbations to the large-scale ocean circulation and the radiative cloud feedback in a climate model. Both the change of the ocean circulation and the radiative cloud feedback strongly influence the magnitude and spatial pattern of surface and ocean warming. Changes in the ocean circulation reduce the amount of transient global warming caused by the radiative cloud feedback by helping to maintain low cloud coverage in the face of global warming. The radiative cloud feedback is key in affecting atmospheric meridional heat transport changes and is the dominant radiative feedback mechanism that responds to ocean circulation change. Uncertainty in the simulated ocean circulation changes due to CO2 forcing may contribute a large share of the spread in the radiative cloud feedback among climate models.

  18. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  19. Affordable Care Act (ACA)

    Data.gov (United States)

    Social Security Administration — The Affordable Care Act (ACA) is a federal statute enacted with a goal of increasing the quality and affordability of health insurance. Through a web service, CMS...

  20. Joint Attention Development in Low-risk Very Low Birth Weight Infants at Around 18 Months of Age.

    Science.gov (United States)

    Yamaoka, Noriko; Takada, Satoshi

    2016-10-18

    The purpose of this study was to clarify the developmental characteristics of joint attention in very low birth weight (VLBW) infants with a low risk of complications. Section B of the Checklist for Autism in Toddlers (CHAT) was administered to 31 VLBW and 45 normal birth weight (NBW) infants aged 18-22 months, while the sessions were recorded with a video camera. A semi-structured observation scale was developed to assess infants' joint attention from the video footage, and was shown to be reliable. Compared to NBW infants, VLBW infants showed significantly poorer skills in 2 of 4 items on responding to joint attention and in 6 of 10 items on initiating joint attention. VLBW infants need more clues in order to produce joint attention. The difficulty was attributed to insufficient verbal and fine motor function skills. Continuous follow-up evaluation is essential for both high-risk and low-risk VLBW infants and their parents.

  1. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P [PA Energy, Malling (Denmark); Vedde, J [SiCon. Silicon and PV consulting, Birkeroed (Denmark)

    2011-04-15

    Large scale PV (LPV) plants, plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers for LPV projects, but as no real LPV projects have yet been processed, these findings have to be regarded as preliminary. The fast growing number of very large scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar height, PV installations on flat surfaces will have to balance the requirements of physical space and cost against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on the flat roofs of large commercial buildings, PV installations on other large scale infrastructure such as noise barriers, and ground mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50-250 km². In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10% in converting the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark of 33-35 TWh is about 300 km². The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
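
    The ~300 km² figure above can be checked with simple arithmetic. A minimal sketch, assuming a Danish horizontal insolation of roughly 1000 kWh/m² per year (an assumed value, not a number from the report); the 10% overall efficiency and the 33-35 TWh demand are taken from the abstract:

        # Back-of-envelope check of the ~300 km^2 ground-area figure.
        annual_demand_twh = 34.0        # mid-range of the 33-35 TWh quoted above
        insolation_kwh_m2 = 1000.0      # assumed annual horizontal irradiation
        system_efficiency = 0.10        # overall conversion efficiency (abstract)

        yield_kwh_m2 = insolation_kwh_m2 * system_efficiency      # ~100 kWh/m2/yr
        area_m2 = annual_demand_twh * 1e9 / yield_kwh_m2          # TWh -> kWh
        print(f"required ground area: {area_m2 / 1e6:.0f} km^2")  # ~340 km^2

    The result, ~340 km², is consistent with the order of magnitude quoted in the report.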

  2. Environmental aspects of large-scale wind-power systems in the UK

    Science.gov (United States)

    Robson, A.

    1984-11-01

    Environmental issues relating to the introduction of large, MW-scale wind turbines at land-based sites in the UK are discussed. Noise, television interference, hazards to bird life, and visual effects are considered. Areas of uncertainty are identified, but enough is known from experience elsewhere in the world to enable the first UK machines to be introduced in a safe and environmentally acceptable manner. Research to establish siting criteria more clearly and to significantly increase the potential wind-energy resource is mentioned. Studies of the comparative risk of energy systems are shown to be over-pessimistic for UK wind turbines.

  3. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual mapping and photogrammetry. Manual mapping is inefficient and limited by the environment, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide areas but do not work well for small areas due to the high cost in manpower and resources. In recent years, vehicle-borne LIDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LIDAR technology for fast mapping of large scale topographic maps, based on a new Chinese vehicle-borne LIDAR system. It studied how to use this system's measurement technology to map large scale topographic maps. After the field data capture, maps can be produced in the office from the LIDAR data (point cloud) using software we developed ourselves. In addition, the detailed process and an accuracy analysis are presented for an actual case. The results show that this new technology provides a fast method to generate large scale topographic maps that is highly efficient and accurate compared to traditional methods.

  4. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)
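
    For concreteness, one common parameterization of the linearly perturbed flat Friedmann–Robertson–Walker metric with scalar, vector, and tensor perturbations reads (the conventions assumed here may differ from the paper's):

        $$ ds^2 = a^2(\tau)\left[ -(1+2A)\,d\tau^2 - 2B_i\,d\tau\,dx^i + \left(\delta_{ij} + h_{ij}\right)dx^i\,dx^j \right] $$

    where A and the scalar parts of B_i and h_ij carry the scalar perturbations, the divergence-free parts of B_i and h_ij the vector perturbations, and the transverse-traceless part of h_ij the tensor (gravitational-wave) perturbations. The observables reviewed above are built from combinations of these quantities that are invariant under linear coordinate (gauge) transformations.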

  5. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. It establishes a finite element model of the top flange connection system with the finite element analysis software MSC Marc/Mentat, analyzes its fatigue strain, simulates the loads of the flange fatigue working condition with the Bladed software, and acquires the flange fatigue load spectrum with the rain-flow counting method; finally, it performs the fatigue analysis of the top flange with the fatigue analysis software MSC Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis for large-scale wind turbine generators and possess practical engineering value.
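
    For reference, the Palmgren-Miner linear cumulative damage rule combines the rain-flow-counted load spectrum with the component's S-N curve (a standard statement of the rule, not a formula quoted from the paper):

        $$ D = \sum_i \frac{n_i}{N_i}, \qquad \text{failure predicted when } D \ge 1 $$

    where n_i is the number of counted cycles in stress-range bin i and N_i is the number of cycles to failure at that stress range taken from the S-N curve.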

  6. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance the numerical stability. An adaptive method is proposed to improve the running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
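
    For readers unfamiliar with the model class, the two-dimensional shallow water equations solved by such Godunov-type finite volume schemes can be written in conservation form as (a standard form; the paper's exact source terms may differ):

        $$ \frac{\partial U}{\partial t} + \frac{\partial F(U)}{\partial x} + \frac{\partial G(U)}{\partial y} = S(U), $$
        $$ U = \begin{pmatrix} h \\ hu \\ hv \end{pmatrix}, \quad F = \begin{pmatrix} hu \\ hu^2 + \tfrac{1}{2}gh^2 \\ huv \end{pmatrix}, \quad G = \begin{pmatrix} hv \\ huv \\ hv^2 + \tfrac{1}{2}gh^2 \end{pmatrix}, $$

    where h is the water depth, (u, v) the depth-averaged velocity, g gravity, and S collects bed-slope and friction source terms; the fluxes F and G are evaluated at cell interfaces with an approximate Riemann solver.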

  7. Ship detection using STFT sea background statistical modeling for large-scale oceansat remote sensing image

    Science.gov (United States)

    Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan

    2018-03-01

    Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuation can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the large-scale oceansat remote sensing image is divided into small sub-blocks, and the 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and clearly different characteristics between sea background and non-sea background are found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on this 2-D STFT spectrum modeling is proposed. The experimental results show that the proposed algorithm can detect ship targets with a high recall rate and a low missing rate.
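
    A minimal sketch of the block-wise spectral screening described above; the window, the background statistic, and the threshold are illustrative assumptions, not the authors' exact implementation:

        import numpy as np

        def block_stft_ship_candidates(image, block=64, z_thresh=4.0):
            """Divide the image into sub-blocks, take a windowed 2-D FFT of
            each (a 2-D STFT), fit simple background statistics over all
            blocks, and flag blocks that deviate strongly."""
            h, w = image.shape
            win = np.outer(np.hanning(block), np.hanning(block))
            feats, coords = [], []
            for i in range(0, h - block + 1, block):
                for j in range(0, w - block + 1, block):
                    sub = image[i:i + block, j:j + block] * win
                    spec = np.abs(np.fft.fftshift(np.fft.fft2(sub)))
                    spec[block // 2, block // 2] = 0.0   # drop the DC term
                    feats.append(np.log1p(spec).mean())  # one scalar per block
                    coords.append((i, j))
            feats = np.asarray(feats)
            mu, sigma = feats.mean(), feats.std() + 1e-12
            # blocks far from the fitted sea-background statistics are candidates
            return [c for c, f in zip(coords, feats) if (f - mu) / sigma > z_thresh]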

  8. Small-scale medical waste incinerators - experiences and trials in South Africa

    International Nuclear Information System (INIS)

    Rogers, David E.C.; Brent, Alan C.

    2006-01-01

    Formal waste management services are not accessible for the majority of primary healthcare clinics on the African continent, and affordable and practicable technology solutions are required in the developing country context. In response, a protocol was established for the first quantitative and qualitative evaluation of relatively low cost small-scale incinerators for use at rural primary healthcare clinics. The protocol comprised the first phase of four, which defined the comprehensive trials of three incineration units. The trials showed that all of the units could be used to render medical waste non-infectious, and to destroy syringes or render needles unsuitable for reuse. Emission loads from the incinerators are higher than those of large-scale commercial incinerators, but a panel of experts considered the incinerators to be more acceptable compared to the other waste treatment and disposal options available in under-serviced rural areas. However, the incinerators must be used within a safe waste management programme that provides the necessary resources in the form of collection containers, maintenance support, acceptable energy sources, and understandable operational instructions for the incinerators, whilst minimising the exposure risks to emissions through the correct placement of the units in relation to the clinic and the surrounding communities. Ongoing training and awareness building are essential in order to ensure that the incinerators are correctly used as a sustainable waste treatment option.

  9. Backup flexibility classes in emerging large-scale renewable electricity systems

    International Nuclear Information System (INIS)

    Schlachtberger, D.P.; Becker, S.; Schramm, S.; Greiner, M.

    2016-01-01

    Highlights: • Flexible backup demand in a European wind and solar based power system is modelled. • Three flexibility classes are defined based on production and consumption timescales. • Seasonal backup capacities are shown to be only used below 50% renewable penetration. • Large-scale transmission between countries can reduce fast flexible capacities. - Abstract: High shares of intermittent renewable power generation in a European electricity system will require flexible backup power generation on the dominant diurnal, synoptic, and seasonal weather timescales. The same three timescales are already covered by today’s dispatchable electricity generation facilities, which are able to follow the typical load variations on the intra-day, intra-week, and seasonal timescales. This work aims to quantify the changing demand for those three backup flexibility classes in emerging large-scale electricity systems, as they transform from low to high shares of variable renewable power generation. A weather-driven modelling is used, which aggregates eight years of wind and solar power generation data as well as load data over Germany and Europe, and splits the backup system required to cover the residual load into three flexibility classes distinguished by their respective maximum rates of change of power output. This modelling shows that the slowly flexible backup system is dominant at low renewable shares, but its optimized capacity decreases and drops close to zero once the average renewable power generation exceeds 50% of the mean load. The medium flexible backup capacities increase for modest renewable shares, peak at around a 40% renewable share, and then continuously decrease to almost zero once the average renewable power generation becomes larger than 100% of the mean load. The dispatch capacity of the highly flexible backup system becomes dominant for renewable shares beyond 50%, and reaches its maximum around a 70% renewable share. For renewable shares
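
    The splitting of the residual load into flexibility classes can be illustrated with a simple timescale decomposition; the moving-average filters and window lengths below are assumptions for illustration, not the paper's method:

        import numpy as np

        def flexibility_split(load, wind, solar, dt_hours=1.0):
            """Split the residual load (backup demand) into seasonal,
            synoptic, and diurnal/fast components by timescale."""
            residual = load - wind - solar
            def running_mean(x, hours):
                k = max(int(hours / dt_hours), 1)
                return np.convolve(x, np.ones(k) / k, mode="same")
            slow = running_mean(residual, 24 * 30)           # seasonal
            medium = running_mean(residual, 24 * 5) - slow   # synoptic
            fast = residual - slow - medium                  # diurnal and faster
            return slow, medium, fast

    Each component can then be characterized by its maximum rate of change of power output, which is how the three backup flexibility classes are distinguished above.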

  10. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend of large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  11. Observation of multi-scale oscillation of laminar lifted flames with low-frequency AC electric fields

    KAUST Repository

    Ryu, Seol

    2010-01-01

    The oscillation behavior of laminar lifted flames under the influence of low-frequency AC has been investigated experimentally in coflow jets. Various oscillation modes existed depending on the jet velocity and the voltage and frequency of the AC, especially when the AC frequency was smaller than about 30 Hz. Three different oscillation modes were observed: (1) large-scale oscillation with an oscillation frequency of about 0.1 Hz, which was independent of the applied AC frequency, (2) small-scale oscillation synchronized to the applied AC frequency, and (3) doubly-periodic oscillation with small-scale oscillation embedded in large-scale oscillation. As the AC frequency decreased from 30 Hz, the oscillation modes occurred in the order of large-scale oscillation, doubly-periodic oscillation, and small-scale oscillation. The onset of the oscillation for AC frequencies smaller than 30 Hz was in close agreement with the delay time scale for the ionic wind effect to occur, that is, the collision response time. Frequency-doubling behavior for the small-scale oscillation has also been observed. Possible mechanisms for the large-scale oscillation and the frequency-doubling behavior are discussed, although a detailed understanding of the underlying mechanisms remains for future study. © 2009 The Combustion Institute.

  12. Availability and affordability of blood pressure-lowering medicines and the effect on blood pressure control in high-income, middle-income, and low-income countries: an analysis of the PURE study data.

    OpenAIRE

    Attaei, MW; Khatib, R; McKee, M; Lear, S; Dagenais, G; Igumbor, EU; AlHabib, KF; Kaur, M; Kruger, L; Teo, K; Lanas, F; Yusoff, K; Oguz, A; Gupta, R; Yusufali, AM

    2017-01-01

    Hypertension is considered the most important risk factor for cardiovascular diseases, but its control is poor worldwide. We aimed to assess the availability and affordability of blood pressure-lowering medicines, and the association with use of these medicines and blood pressure control in countries at varying levels of economic development. We analysed the availability, costs, and affordability of blood pressure-lowering medicines with data recorded from 626 communities in 20 countries part...

  13. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  14. Large-scale risk assessment of polycyclic aromatic hydrocarbons in shoreline sediments from Saudi Arabia: environmental legacy after twelve years of the Gulf war oil spill.

    Science.gov (United States)

    Bejarano, Adriana C; Michel, Jacqueline

    2010-05-01

    A large-scale assessment of polycyclic aromatic hydrocarbons (PAHs) from the 1991 Gulf War oil spill was performed for 2002-2003 sediment samples (n = 1679) collected from habitats along the shoreline of Saudi Arabia. Benthic sediment toxicity was characterized using the Equilibrium Partitioning Sediment Benchmark Toxic Unit approach for 43 PAHs (ESBTU(FCV,43)). Samples were assigned to risk categories, ranging from no-risk to high-risk, according to their ESBTU(FCV,43) values. Sixty-seven percent of samples had ESBTU(FCV,43) > 1, indicating potential adverse ecological effects. Sediments from the 0-30 cm layer of tidal flats and the >30 cm layer of oiled halophytes and mangroves had a high frequency of high-risk samples. No-risk samples were characterized by chrysene enrichment and depletion of lighter molecular weight PAHs, while high-risk samples showed little oil weathering and PAH patterns similar to 1993 samples. Sediments north of Safaniya were not likely to pose adverse ecological effects, in contrast to sediments south of Tanaqib. Landscape and geomorphology have played a role in the distribution and persistence in sediments of oil from the Gulf War. Copyright 2009 Elsevier Ltd. All rights reserved.

  15. Large-scale atomistic simulations of nanostructured materials based on divide-and-conquer density functional theory

    Directory of Open Access Journals (Sweden)

    Vashishta P.

    2011-05-01

    A linear-scaling algorithm based on a divide-and-conquer (DC) scheme is designed to perform large-scale molecular-dynamics simulations, in which interatomic forces are computed quantum mechanically in the framework of density functional theory (DFT). This scheme is applied to the thermite reaction at an Al/Fe2O3 interface. It is found that mass diffusion and the reaction rate at the interface are enhanced by a concerted metal-oxygen flip mechanism. Preliminary simulations are carried out for an aluminum particle in water based on conventional DFT, as a target system for large-scale DC-DFT simulations. A pair of Lewis acid and base sites on the aluminum surface preferentially catalyzes hydrogen production in a low activation-barrier mechanism found in the simulations.

  16. Large-Scale Nanophotonic Solar Selective Absorbers for High-Efficiency Solar Thermal Energy Conversion.

    Science.gov (United States)

    Li, Pengfei; Liu, Baoan; Ni, Yizhou; Liew, Kaiyang Kevin; Sze, Jeff; Chen, Shuo; Shen, Sheng

    2015-08-19

    An omnidirectional nanophotonic solar selective absorber is fabricated on a large scale using a template-stripping method. The nanopyramid nickel structure achieves an average absorptance of 95% over the wavelength range below 1.3 μm and a low emittance of less than 10% at wavelengths >2.5 μm. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. How are the catastrophical risks quantifiable

    International Nuclear Information System (INIS)

    Chakraborty, S.

    1985-01-01

    For the assessment and evaluation of industrial risks, the question must be asked: how are catastrophic risks quantifiable? Typical real catastrophic risks and risk assessments based on modelling assumptions have been placed against each other in order to put the risks into proper perspective. However, society is risk-averse when there is a catastrophic potential of severe accidents in a large-scale industrial facility, even though there is an extremely low probability of occurrence. (orig.)

  18. Learning at Work: Organisational Affordances and Individual Engagement

    Science.gov (United States)

    Bryson, Jane; Pajo, Karl; Ward, Robyn; Mallon, Mary

    2006-01-01

    Purpose: The purpose of this research is to explore the interaction between organisational affordances for the development of individuals' capability, and the engagement of workers at various levels with those opportunities. Design/methodology/approach: A case study of a large New Zealand wine company, using in-depth interviews. Interviews were…

  19. CLASS: The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Essinger-Hileman, Thomas; Ali, Aamir; Amiri, Mandana; Appel, John W.; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; et al.

    2014-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an experiment to measure the signature of a gravitational wave background from inflation in the polarization of the cosmic microwave background (CMB). CLASS is a multi-frequency array of four telescopes operating from a high-altitude site in the Atacama Desert in Chile. CLASS will survey 70% of the sky in four frequency bands centered at 38, 93, 148, and 217 GHz, which are chosen to straddle the Galactic-foreground minimum while avoiding strong atmospheric emission lines. This broad frequency coverage ensures that CLASS can distinguish Galactic emission from the CMB. The sky fraction of the CLASS survey will allow the full shape of the primordial B-mode power spectrum to be characterized, including the signal from reionization at low multipole ℓ. Its unique combination of large sky coverage, control of systematic errors, and high sensitivity will allow CLASS to measure or place upper limits on the tensor-to-scalar ratio at a level of r = 0.01 and make a cosmic-variance-limited measurement of the optical depth to the surface of last scattering, τ. © 2014 Society of Photo-Optical Instrumentation Engineers (SPIE).

  20. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  1. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  2. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  3. Do pressure ulcer risk assessment scales improve clinical practice?

    Directory of Open Access Journals (Sweden)

    Jan Kottner

    2010-07-01

    Jan Kottner (Department of Nursing Science, Charité-Universitätsmedizin Berlin, Germany) and Katrin Balzer (Nursing Research Group, Institute for Social Medicine, Universitätsklinikum Schleswig-Holstein, Lübeck, Germany). Abstract: Standardized assessment instruments are deemed important for estimating pressure ulcer risk. Today, more than 40 so-called pressure ulcer risk assessment scales are available, but there is still an ongoing debate about their usefulness. From a measurement point of view, pressure ulcer (PU) risk assessment scales have serious limitations. Empirical evidence supporting the validity of PU risk assessment scale scores is weak, and obtained scores contain varying amounts of measurement error. The concept of pressure ulcer risk is strongly related to the general health status and severity of illness. A clinical impact due to the application of these scales could also not be demonstrated. It is questionable whether completion of standardized pressure ulcer risk scales in clinical practice is really needed. Keywords: Braden scale, pressure ulcer, prevention, risk assessment, nursing assessment, predictive value, clinical effectiveness, review

  4. Zebrafish whole-adult-organism chemogenomics for large-scale predictive and discovery chemical biology.

    Directory of Open Access Journals (Sweden)

    Siew Hong Lam

    2008-07-01

    The ability to perform large-scale, expression-based chemogenomics on whole adult organisms, as in invertebrate models (worm and fly), is highly desirable for a vertebrate model, but its feasibility and potential have not been demonstrated. We performed expression-based chemogenomics on the whole adult organism of a vertebrate model, the zebrafish, and demonstrated its potential for large-scale predictive and discovery chemical biology. Focusing on two classes of compounds with wide implications for human health, polycyclic (halogenated) aromatic hydrocarbons [P(H)AHs] and estrogenic compounds (ECs), we generated robust prediction models that can discriminate compounds of the same class from those of different classes in two large independent experiments. The robust expression signatures led to the identification of biomarkers for potent aryl hydrocarbon receptor (AHR) and estrogen receptor (ER) agonists, respectively, and were validated in multiple targeted tissues. Knowledge-based data mining of human homologs of zebrafish genes revealed highly conserved chemical-induced biological responses/effects, health risks, and novel biological insights associated with AHR and ER that could be inferred to humans. Thus, our study presents an effective, high-throughput strategy of capturing molecular snapshots of chemical-induced biological states of a whole adult vertebrate that provides information on biomarkers of effects, deregulated signaling pathways, and possibly affected biological functions, perturbed physiological systems, and increased health risks. These findings place zebrafish in a strategic position to bridge the wide gap between cell-based and rodent models in chemogenomics research and applications, especially in preclinical drug discovery and toxicology.

  5. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4·10⁴, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = −0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, “over-damped” LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential of using over-damped LES for fast explorations of the parameter space where large-scale structures are found.

  6. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  7. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using algorithms specific to the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were each solved with the GPU-based algorithm to test the performance of the package. Comparing the calculation results between the solver executed on a single CPU and the one on the GPU, the GPU solver was found to be about 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
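
    A minimal CPU sketch of one semi-implicit Fourier step for the Allen-Cahn equation, the simplest of the models mentioned; on a GPU the FFT calls would be executed by a CUDA FFT (e.g. through CuPy). Parameter values and the free-energy derivative are illustrative assumptions:

        import numpy as np

        def allen_cahn_step(phi, dt=0.1, M=1.0, kappa=1.0):
            """One step of phi_t = -M * (f'(phi) - kappa * laplacian(phi))
            with f'(phi) = phi**3 - phi. The stiff linear term is treated
            implicitly in Fourier space, the nonlinear term explicitly."""
            n = phi.shape[0]
            k = 2.0 * np.pi * np.fft.fftfreq(n)
            k2 = k[:, None] ** 2 + k[None, :] ** 2
            fprime = phi ** 3 - phi
            phi_hat = (np.fft.fft2(phi) - dt * M * np.fft.fft2(fprime)) \
                      / (1.0 + dt * M * kappa * k2)
            return np.real(np.fft.ifft2(phi_hat))

    The implicit treatment of the Laplacian removes the severe time-step restriction of a fully explicit scheme, which is what makes the method attractive for large grids.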

  8. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.

  9. Advancing effects analysis for integrated, large-scale wildfire risk assessment

    Science.gov (United States)

    Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager

    2011-01-01

    In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...

  10. Large scale scenario analysis of future low carbon energy options

    International Nuclear Information System (INIS)

    Olaleye, Olaitan; Baker, Erin

    2015-01-01

    In this study, we use a multi-model framework to examine a set of possible future energy scenarios resulting from R&D investments in Solar, Nuclear, Carbon Capture and Storage (CCS), Bio-fuels, Bio-electricity, and Batteries for Electric Transportation. Based on a global scenario analysis, we examine the impact on the economy of advancement in energy technologies, considering both individual technologies and the interactions between pairs of technologies, with a focus on the role of uncertainty. Nuclear and CCS have the most impact on abatement costs, with CCS mostly important at high levels of abatement. We show that CCS and Bio-electricity are complements, while most of the other energy technology pairs are substitutes. We also test for stochastic dominance between R&D portfolios: given the uncertainty in R&D outcomes, we examine which portfolios would be preferred by all decision-makers, regardless of their attitude toward risk. We observe that portfolios with CCS tend to stochastically dominate those without CCS, and portfolios lacking CCS and Nuclear tend to be stochastically dominated by others. We find that the dominance of CCS becomes even stronger as uncertainty in climate damages increases. Finally, we show that there is significant value in carefully choosing a portfolio, as relatively small portfolios can dominate large portfolios. - Highlights: • We examine future energy scenarios in the face of R&D and climate uncertainty. • We examine the impact of advancement in energy technologies and pairs of technologies. • CCS complements Bio-electricity while most technology pairs are substitutes. • R&D portfolios without CCS are stochastically dominated by portfolios with CCS. • Higher damage uncertainty favors R&D development of CCS and Bio-electricity
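
    The dominance criterion referred to above can be stated precisely. For portfolios with random payoffs X_A and X_B and cumulative distribution functions F_A and F_B, first-order stochastic dominance (the notion matching "preferred by all decision-makers, regardless of their attitude toward risk", assuming monotone preferences) is:

        $$ A \succeq_{\mathrm{FSD}} B \iff F_A(x) \le F_B(x) \quad \forall x, $$

    which is equivalent to E[u(X_A)] ≥ E[u(X_B)] for every non-decreasing utility function u.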

  11. Large-scale experience with biological treatment of contaminated soil

    International Nuclear Information System (INIS)

    Schulz-Berendt, V.; Poetzsch, E.

    1995-01-01

    The efficiency of biological methods for the cleanup of soil contaminated with total petroleum hydrocarbons (TPH) and polycyclic aromatic hydrocarbons (PAH) was demonstrated by a large-scale example in which 38,000 tons of TPH- and PAH-polluted soil was treated onsite with the TERRAFERM® degradation system to reach the target values of 300 mg/kg TPH and 5 mg/kg PAH. Detection of the ecotoxicological potential (Microtox® assay) showed a significant decrease during the remediation. Low concentrations of PAH in the ground were treated by an in situ technology. The in situ treatment was combined with mechanical measures (slurry wall) to prevent the contamination from dispersing from the site.

  12. Building America's Low-e Storm Window Adoption Program Plan (FY2014)

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-12-23

    Low emissivity (low-e) storm windows/panels appear to hold promise for effectively reducing existing home heating, ventilation, and air-conditioning (HVAC) consumption. Due to the affordability of low-e storm windows and the large numbers of existing homes that have low-performing single-pane or double-pane clear windows, a tremendous opportunity exists to provide energy savings by transforming the low-e storm window market and increasing market adoption. This report outlines U.S. Department of Energy (DOE) Building America’s planned market transformation activities in support of low-e storm window adoption during fiscal year (FY) 2014.

  13. Thermal power generation projects "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for a Europe-wide development of the technology. The demonstration programme developed from it was judged well by the experts but was not immediately (1996) accepted for financial subsidies. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid 1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves the transfer of technology. (orig.)

  14. Evaluation of a constipation risk assessment scale.

    Science.gov (United States)

    Zernike, W; Henderson, A

    1999-06-01

    This project was undertaken in order to evaluate the utility of a constipation risk assessment scale and the accompanying bowel management protocol. The risk assessment scale was primarily introduced to teach and guide staff in managing constipation when caring for patients. The intention of the project was to reduce the incidence of constipation in patients during their admission to hospital.

  15. Operational tools to build a multicriteria territorial risk scale with multiple stakeholders

    International Nuclear Information System (INIS)

    Cailloux, Olivier; Mayag, Brice; Meyer, Patrick; Mousseau, Vincent

    2013-01-01

    Evaluating and comparing the threats and vulnerabilities associated with territorial zones according to multiple criteria (industrial activity, population, etc.) can be a time-consuming task and often requires the participation of several stakeholders. Rather than a direct evaluation of these zones, building a risk assessment scale and using it in a formal procedure makes it possible to automate the assessment and therefore to apply it in a repeated way and in large-scale contexts and, provided the chosen procedure and scale are accepted, to make it objective. One of the main difficulties of building such a formal evaluation procedure is to account for the preferences of multiple decision makers. The procedure used in this article, ELECTRE TRI, uses the performances of each territorial zone on multiple criteria, together with preferential parameters from multiple decision makers, to qualitatively assess the associated risk level. We also present operational tools to implement such a procedure in practice, and show their use on a detailed example.
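
    A deliberately simplified sketch of the pessimistic ELECTRE TRI assignment rule: a zone is placed in the highest risk category whose lower boundary profile it outranks, with outranking reduced here to a weighted concordance test (no discordance or veto thresholds; the profiles, weights, and cutting level are illustrative assumptions):

        def electre_tri_pessimistic(perf, profiles, weights, lam=0.7):
            """perf: a zone's performances per criterion (higher = riskier).
            profiles: boundary profiles ordered from the lowest to the
            highest risk boundary, each a list of per-criterion values."""
            total = float(sum(weights))
            def concordance(a, b):
                # weight share of criteria supporting "a is at least as risky as b"
                return sum(w for x, y, w in zip(a, b, weights) if x >= y) / total
            category = 0
            for k, profile in enumerate(profiles):
                if concordance(perf, profile) >= lam:
                    category = k + 1        # zone outranks this boundary profile
            return category                 # 0 = lowest risk category

    With ordered (mutually dominating) profiles, scanning upward and keeping the last outranked boundary reproduces the pessimistic assignment.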

  16. Energy partitioning constraints at kinetic scales in low-β turbulence

    Science.gov (United States)

    Gershman, Daniel J.; F.-Viñas, Adolfo; Dorelli, John C.; Goldstein, Melvyn L.; Shuster, Jason; Avanov, Levon A.; Boardsen, Scott A.; Stawarz, Julia E.; Schwartz, Steven J.; Schiff, Conrad; Lavraud, Benoit; Saito, Yoshifumi; Paterson, William R.; Giles, Barbara L.; Pollock, Craig J.; Strangeway, Robert J.; Russell, Christopher T.; Torbert, Roy B.; Moore, Thomas E.; Burch, James L.

    2018-02-01

    Turbulence is a fundamental physical process through which energy injected into a system at large scales cascades to smaller scales. In collisionless plasmas, turbulence provides a critical mechanism for dissipating electromagnetic energy. Here, we present observations of plasma fluctuations in low-β turbulence using data from NASA's Magnetospheric Multiscale mission in Earth's magnetosheath. We provide constraints on the partitioning of turbulent energy density in the fluid, ion-kinetic, and electron-kinetic ranges. Magnetic field fluctuations dominated the energy density spectrum throughout the fluid and ion-kinetic ranges, consistent with previous observations of turbulence in similar plasma regimes. However, at scales shorter than the electron inertial length, fluctuation power in electron kinetic energy significantly exceeded that of the magnetic field, resulting in an electron-motion-regulated cascade at small scales. This dominance is highly relevant for the study of turbulence in highly magnetized laboratory and astrophysical plasmas.
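
    The electron inertial length mentioned above is the standard kinetic scale (definition given for reference, not quoted from the paper):

        $$ d_e = \frac{c}{\omega_{pe}}, \qquad \omega_{pe} = \sqrt{\frac{n_e e^2}{\varepsilon_0 m_e}}, $$

    where n_e is the electron number density. Roughly speaking, below d_e electron inertia allows the electron motion to decouple from the magnetic field, consistent with the electron-motion-regulated cascade reported here.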

  17. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
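
    A minimal sketch of the retrieval stage of this pipeline, with feature extraction left abstract and search shown as brute-force cosine similarity; production systems would substitute the approximate indexing methods (hashing, trees, product quantization) surveyed in such reviews:

        import numpy as np

        def build_index(features):
            # L2-normalize once so that dot products equal cosine similarity
            norms = np.linalg.norm(features, axis=1, keepdims=True)
            return features / np.maximum(norms, 1e-12)

        def search(index, query, top_k=5):
            q = query / max(np.linalg.norm(query), 1e-12)
            scores = index @ q                  # cosine similarity to all images
            best = np.argsort(-scores)[:top_k]  # indices of the best matches
            return best, scores[best]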

  18. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular. (author)

  19. Possible implications of large scale radiation processing of food

    Science.gov (United States)

    Zagórski, Z. P.

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular.

  20. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments remains time-consuming, manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which, unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  1. Large-scale spatial distribution patterns of gastropod assemblages in rocky shores.

    Directory of Open Access Journals (Sweden)

    Patricia Miloslavich

    Full Text Available Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1 describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2 identify latitudinal pattern of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3 identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME following the NaGISA (Natural Geography in Shore Areas standard protocol (www.nagisa.coml.org. A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2% appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs followed by the Trochidae and the Columbellidae (6 LMEs. In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska. No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05. Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results will generate more work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages.

  2. Design Of A Small-Scale Hulling Machine For Improved Wet-Processed Coffee.

    Directory of Open Access Journals (Sweden)

    Adeleke

    2017-08-01

    The method of primary processing of coffee is a vital determinant of quality and price. The wet processing method produces higher quality beans but is very labourious. This work outlines the design of a small-scale, cost-effective, ergonomic, easily maintained and operated coffee hulling machine that can improve the quality and productivity of green coffee beans. The machine can be constructed from locally available materials at a relatively low cost of about NGN 140,000.00, with a cheap running cost. The beaters are made from rubber strips which can deflect when in contact with any obstruction, causing little or no stress on drum members and reducing the risk of damage to both the beans and the machine. The machine is portable and detachable, which makes it fit to be owned by a group of farmers who can move it from one farm to another, making affordability and the running cost easier to manage. The running cost is further reduced by the fact that the machine is powered by a 3.0 hp petrol engine, which is suitable for other purposes among the rural dwellers. The eventual construction of the machine will encourage more farmers to go into wet processing of coffee and reduce the foreign exchange hitherto lost to this purpose.

  3. Affordances theory in multilingualism studies

    Directory of Open Access Journals (Sweden)

    Larissa Aronin

    2012-10-01

    The concept of affordances originating in Gibson's work (Gibson, 1977) is gaining ground in multilingualism studies (cf. Aronin and Singleton, 2010; Singleton and Aronin, 2007; Dewaele, 2010). Nevertheless, studies investigating affordances in respect of teaching, learning or using languages are still somewhat rare and tend to treat isolated aspects of multilingualism. This is despite the fact that the theory of affordances can provide a valuable, supplementary, up-to-date framework within which a clearer, sharper description and explication of the intriguing range of attributes of multilingual communities, educational institutions and individuals, as well as teaching practices, becomes feasible. It is important that not only researchers and practitioners (teachers, educators, parents, community and political actors) but also language users and learners themselves should be aware of how to identify or, if necessary, design new affordances for language acquisition and learning. The aim of this article is to adapt the concept of affordances to multilingualism studies and additional language teaching, and in so doing advance theoretical understanding in this context. To this end the article contains a brief summary of the findings available so far. The article also goes further into defining how affordances work in relation to multilingualism and second language teaching, and puts forward an integrated model of affordances.

  4. Cost considerations in determining the affordability of adjuvant ...

    African Journals Online (AJOL)

    growth factor receptor 2 (HER2)-positive breast cancer, particularly in low- and middle-income countries. Affordability and value differ in patient groups with different baseline prognoses. This is illustrated below using the hazard ratio (HR) of survival rates obtained from a Cochrane review [2] and personal communication ...

  5. Risk Factors Associated with Very Low Birth Weight in a Large Urban Area, Stratified by Adequacy of Prenatal Care.

    Science.gov (United States)

    Xaverius, Pamela; Alman, Cameron; Holtz, Lori; Yarber, Laura

    2016-03-01

    This study examined risk and protective factors associated with very low birth weight (VLBW) for babies born to women receiving adequate or inadequate prenatal care. Birth records from St. Louis City and County from 2000 to 2009 were used (n = 152,590). Data were categorized across risk factors and stratified by adequacy of prenatal care (PNC). Multivariate logistic regression and population attributable risk (PAR) were used to explore risk factors for VLBW infants. Women receiving inadequate PNC had a higher prevalence of delivering a VLBW infant than those receiving adequate PNC (4.11 vs. 1.44 %, p < .0001). The distribution of risk factors differed between adequate and inadequate PNC regarding Black race (36.4 vs. 79.0 %, p < .0001), age under 20 (13.0 vs. 33.6 %, p < .0001), <13 years of education (35.9 vs. 77.9 %, p < .0001), Medicaid status (35.7 vs. 74.9 %, p < .0001), primiparity (41.6 vs. 31.4 %, p < .0001), smoking (9.7 vs. 24.5 %, p < .0001), and diabetes (4.0 vs. 2.4 %, p < .0001), respectively. Black race, advanced maternal age, primiparity and gestational hypertension were significant predictors of VLBW, regardless of adequacy of PNC. Among women with inadequate PNC, Medicaid was protective against VLBW (aOR 0.671, 95 % CI 0.563-0.803; PAR -32.6 %) and smoking was a risk factor for it (aOR 1.23, 95 % CI 1.01, 1.49; PAR 40.1 %). When prematurity was added to the adjusted models, the largest PAR shifted to education (44.3 %) among women with inadequate PNC. Community actions around broader issues of racism and social determinants of health are needed to prevent VLBW in this large urban area.
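
    The PAR figures above can be read against the standard (Levin) formula. The following is a sketch of the usual definition, not the paper's exact computation; p denotes exposure prevalence, and the adjusted OR stands in for the relative risk under a rare-outcome assumption:

        % Population attributable risk (Levin's formula); notation assumed,
        % not taken from the paper.
        \[
          \mathrm{PAR} \;=\; \frac{p\,(\mathrm{RR}-1)}{1 + p\,(\mathrm{RR}-1)}
        \]
        % A protective exposure (RR < 1) yields a negative PAR, which is how
        % the -32.6 % reported for Medicaid should be read.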

  6. A 3D Sphere Culture System Containing Functional Polymers for Large-Scale Human Pluripotent Stem Cell Production

    Directory of Open Access Journals (Sweden)

    Tomomi G. Otsuji

    2014-05-01

    Utilizing human pluripotent stem cells (hPSCs) in cell-based therapy and drug discovery requires large-scale cell production. However, scaling up conventional adherent cultures presents challenges of maintaining a uniform high quality at low cost. In this regard, suspension cultures are a viable alternative, because they are scalable and do not require adhesion surfaces. 3D culture systems such as bioreactors can be exploited for large-scale production. However, the limitations of current suspension culture methods include spontaneous fusion between cell aggregates and suboptimal passaging methods by dissociation and reaggregation. 3D culture systems that dynamically stir carrier beads or cell aggregates should be refined to reduce shearing forces that damage hPSCs. Here, we report a simple 3D sphere culture system that incorporates mechanical passaging and functional polymers. This setup resolves major problems associated with suspension culture methods and dynamic stirring systems and may be optimal for applications involving large-scale hPSC production.

  7. Environmental aspects of large-scale wind-power systems in the UK

    Energy Technology Data Exchange (ETDEWEB)

    Robson, A

    1983-12-01

    Environmental issues relating to the introduction of large, MW-scale wind turbines at land-based sites in the U.K. are discussed. Areas of interest include noise, television interference, hazards to bird life and visual effects. A number of areas of uncertainty are identified, but enough is known from experience elsewhere in the world to enable the first U.K. machines to be introduced in a safe and environmentally acceptable manner. Research currently under way will serve to establish siting criteria more clearly, and could significantly increase the potential wind-energy resource. Certain studies of the comparative risk of energy systems are shown to be overly pessimistic for U.K. wind turbines.

  8. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    Science.gov (United States)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    The accumulated rainfall brought by Typhoon Morakot in August 2009 exceeded 2,900 mm within 3 consecutive days. This heavy rainfall event induced very serious landslides and sediment-related disasters. A satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. All severely affected areas were also characterized by disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events in southern Taiwan. Characteristics and mechanisms of large-scale landslides were compiled on the basis of field investigation integrated with GPS/GIS/RS techniques. To decrease the risk of large-scale landslides on slope land, a slope-land conservation strategy and a critical rainfall database should be established and applied as soon as possible. Meanwhile, establishing the critical rainfall value for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and the people of Taiwan. The mechanism of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under extreme climate change during the past 10 years are addressed by this study.
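
    The abstract does not give the functional form of the critical rainfall value. A common choice in the landslide literature is a power-law intensity-duration threshold; the sketch below is illustrative only, with placeholder coefficients of the order of published global curves rather than values fitted by this study:

        # Illustrative sketch: a power-law rainfall intensity-duration
        # threshold, I_crit = alpha * D**(-beta). The coefficients below are
        # placeholders, not values from this study.

        def exceeds_rainfall_threshold(intensity_mm_h: float,
                                       duration_h: float,
                                       alpha: float = 14.8,
                                       beta: float = 0.39) -> bool:
            """True if an event lies above the critical intensity-duration
            curve, i.e. in the landslide-triggering regime."""
            critical_intensity = alpha * duration_h ** (-beta)
            return intensity_mm_h > critical_intensity

        # Example: a Morakot-like event, ~2900 mm over 72 h (~40 mm/h).
        print(exceeds_rainfall_threshold(2900 / 72, 72))  # True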

  9. Role of large-scale permeability measurements in fractured rock and their application at Stripa

    International Nuclear Information System (INIS)

    Witherspoon, P.A.; Wilson, C.R.; Long, J.C.S.; DuBois, A.O.; Gale, J.E.; McPherson, M.

    1979-10-01

    Completion of the macropermeability experiment will provide: (i) a direct, in situ measurement of the permeability of 10⁵ to 10⁶ m³ of rock; (ii) a potential method for confirming the analysis of a series of small-scale permeability tests performed in surface and underground boreholes; (iii) a better understanding of the effect of open borehole zone length on pressure measurement; (iv) measurement over an increased volume of fractured rock; and (v) a basis for evaluating the ventilation technique for flow measurement in large-scale testing of low-permeability rocks.

  10. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, address critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such, sometimes inherently unstable, molecules it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared with the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential, in particular, for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  11. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  12. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  13. SCALES: SEVIRI and GERB CaL/VaL area for large-scale field experiments

    Science.gov (United States)

    Lopez-Baeza, Ernesto; Belda, Fernando; Bodas, Alejandro; Crommelynck, Dominique; Dewitte, Steven; Domenech, Carlos; Gimeno, Jaume F.; Harries, John E.; Jorge Sanchez, Joan; Pineda, Nicolau; Pino, David; Rius, Antonio; Saleh, Kauzar; Tarruella, Ramon; Velazquez, Almudena

    2004-02-01

    The main objective of the SCALES Project is to exploit the unique opportunity offered by the recent launch of the first European METEOSAT Second Generation geostationary satellite (MSG-1) to generate and validate new radiation budget and cloud products provided by the GERB (Geostationary Earth Radiation Budget) instrument. SCALES' specific objectives are: (i) definition and characterization of a large, reasonably homogeneous area compatible with the GERB pixel size (around 50 × 50 km²), (ii) validation of GERB TOA radiances and fluxes derived by means of angular distribution models, (iii) development of algorithms to estimate surface net radiation from GERB TOA measurements, and (iv) development of accurate methodologies to measure radiation flux divergence and analyze its influence on the thermal regime and dynamics of the atmosphere, also using GERB data. SCALES is highly innovative: it focuses on a new and unique space instrument and develops a new validation methodology specific to low resolution sensors, based on the use of a robust reference meteorological station (the Valencia Anchor Station) around which 3D high resolution meteorological fields are obtained from the MM5 meteorological model. During the 1st GERB Ground Validation Campaign (18th-24th June, 2003), CERES instruments on Aqua and Terra provided additional radiance measurements to support validation efforts, operating in the PAPS (Programmable Azimuth Plane Scanning) mode focused on the station. Ground measurements comprised lidar, sun photometer, GPS precipitable water content, radiosounding ascents, the Anchor Station's operational meteorological measurements at 2 m and 15 m, the 4 radiation components at 2 m, and mobile stations to characterize the large area. In addition, measurements during LANDSAT overpasses on June 14th and 30th were also performed. These activities were carried out within the GIST (GERB International Science Team) framework, during the GERB commissioning period.

  14. A Challenge Facing the Malaysian Pharmaceutical Sector: Quality and Affordability of the National Medications

    OpenAIRE

    Abubaker Abdellah; Noordin MI; Shade AM Khalifa; Rafdzah Zaki; Ahmad S Sulaiman; Ali Abdellah; Hesham R El-Seedi

    2016-01-01

    Background: Enhancing public satisfaction with the quality and affordability of medicines is an important task in health services. Objective: This study was intended to assess the public's trust and acceptance concerning the quality and affordability of locally manufactured medicines in Malaysia. Methodology: A cross-sectional study was performed, and a validated Likert-scale questionnaire was used. The results were analyzed using the Statistical Package for the Social Sciences (SPSS) ...

  15. Rocky intertidal macrobenthic communities across a large-scale estuarine gradient

    Directory of Open Access Journals (Sweden)

    Luis Giménez

    2010-03-01

    We evaluated relationships between (1) salinity and species richness and (2) frontal zones and community structure for the rocky intertidal macrobenthic community of the Uruguayan coast. A large-scale sampling design (extent ~500 km) covering 9 rocky shores across 3 intertidal levels was performed between September and November 2002. The linear relationship between salinity and species richness (minimum at the freshwater extreme) and the lack of correlation between variation in salinity and richness rejected two previous empirical models explaining variations in species richness along the salinity gradient. Other factors (e.g. turbidity) may explain this discrepancy. The estuarine front defined two communities, freshwater and estuarine-marine, differing in species composition and richness. The freshwater community was characterised by low richness and few individuals confined to crevices or tide pools, and must be structured by physical processes (e.g. desiccation); the estuarine-marine community, with individuals occupying almost all available substrata, must be structured by both physical and biological processes. A marine front, separating estuarine and marine habitats, had a weak effect on community structure, although estuarine and marine assemblages differed according to species characterising different functional groups. We conclude that the position of the estuarine frontal zones is important for explaining large-scale patterns of community structure in the study area.

  16. The U.S. health insurance marketplace: are premiums truly affordable?

    Science.gov (United States)

    Graetz, Ilana; Kaplan, Cameron M; Kaplan, Erin K; Bailey, James E; Waters, Teresa M

    2014-10-21

    The Patient Protection and Affordable Care Act requires that individuals have health insurance or pay a penalty. Individuals are exempt from paying this penalty if the after-subsidy cost of the least-expensive plan available to them is greater than 8% of their income. For this study, premium data for all health plans offered on the state and federal health insurance marketplaces were collected; the after-subsidy cost of premiums for the least-expensive bronze plan was calculated for every county in the United States; and variations in premium affordability by age, income, and geographic area were assessed. Results indicated that, although marketplace subsidies ensure affordable health insurance for most persons in the United States, many individuals with incomes just above the subsidy threshold will lack affordable coverage and will be exempt from the mandate. Furthermore, young individuals with low incomes often pay as much as or more than older individuals for bronze plans. If substantial numbers of younger, healthier adults choose to remain uninsured because of cost, health insurance premiums across all ages may increase over time.
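
    The exemption rule described above is mechanical enough to state as code. A minimal sketch under the stated 8% threshold, taking the after-subsidy premium as given (the subsidy computation itself is out of scope, and the names are ours, not the paper's):

        # Minimal sketch of the ACA affordability exemption described above:
        # exempt if the after-subsidy cost of the least-expensive plan
        # exceeds 8% of annual income.

        EXEMPTION_SHARE = 0.08  # statutory affordability threshold

        def is_exempt(cheapest_net_premium: float, annual_income: float) -> bool:
            """True if the least-expensive plan is deemed unaffordable."""
            return cheapest_net_premium > EXEMPTION_SHARE * annual_income

        # Example: a $2,600/year net bronze premium on a $30,000 income
        # exceeds the $2,400 threshold, so the mandate penalty is waived.
        print(is_exempt(2600.0, 30000.0))  # True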

  17. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  18. The Feasibility of Using Large-Scale Text Mining to Detect Adverse Childhood Experiences in a VA-Treated Population.

    Science.gov (United States)

    Hammond, Kenric W; Ben-Ari, Alon Y; Laundry, Ryan J; Boyko, Edward J; Samore, Matthew H

    2015-12-01

    Free text in electronic health records resists large-scale analysis. Text records facts of interest not found in encoded data, and text mining enables their retrieval and quantification. The U.S. Department of Veterans Affairs (VA) clinical data repository affords an opportunity to apply text-mining methodology to study clinical questions in large populations. To assess the feasibility of text mining, an investigation of the relationship between exposure to adverse childhood experiences (ACEs) and recorded diagnoses was conducted among all VA-treated Gulf War veterans, utilizing all progress notes recorded from 2000-2011. Text processing extracted ACE exposures recorded among 44.7 million clinical notes belonging to 243,973 veterans. The relationship of ACE exposure to adult illnesses was analyzed using logistic regression. Bias considerations were assessed. Per unit increase, ACE score was strongly associated with suicide attempts and serious mental disorders (ORs = 1.84 to 1.97), and less so with behaviorally mediated and somatic conditions (ORs = 1.02 to 1.36). Bias adjustments did not remove persistent associations between ACE score and most illnesses. Text mining to detect ACE exposure in a large population was feasible. Analysis of the relationship between ACE score and adult health conditions yielded patterns of association consistent with prior research. Copyright © 2015 International Society for Traumatic Stress Studies.

  19. A new large-scale plasma source with plasma cathode

    International Nuclear Information System (INIS)

    Yamauchi, K.; Hirokawa, K.; Suzuki, H.; Satake, T.

    1996-01-01

    A new large-scale plasma source (200 mm diameter) with a plasma cathode has been investigated. The plasma has good spatial uniformity, operates at low electron temperature, and is highly ionized under a relatively low gas pressure of about 10⁻⁴ Torr. The plasma source consists of a plasma chamber and a plasma cathode generator. The plasma chamber has an anode which is 200 mm in diameter and 150 mm in length, is made of 304 stainless steel, and acts as a plasma expansion cup. A filament-cathode-like plasma, the ''plasma cathode'', is placed on the central axis of this source. To improve the plasma spatial uniformity in the plasma chamber, a disk-shaped floating electrode is placed between the plasma chamber and the plasma cathode. The 200 mm diameter plasma is measured using Langmuir probes. The discharge voltage is relatively low (30-120 V), the plasma space potential is almost equal to the discharge voltage and can be easily controlled, the electron temperature is several electron volts, the plasma density is about 10¹⁰ cm⁻³, and the density varies by about 10% over a 100 mm diameter. (Author)

  20. Low-density lipoprotein electronegativity is a novel cardiometabolic risk factor.

    Directory of Open Access Journals (Sweden)

    Jing-Fang Hsu

    BACKGROUND: Low-density lipoprotein (LDL) plays a central role in cardiovascular disease (CVD) development. In LDL chromatographically resolved according to charge, the most electronegative subfraction, L5, is the only subfraction that induces atherogenic responses in cultured vascular cells. Furthermore, increasing evidence has shown that plasma L5 levels are elevated in individuals with high cardiovascular risk. We hypothesized that LDL electronegativity is a novel index for predicting CVD. METHODS: In 30 asymptomatic individuals with metabolic syndrome (MetS) and 27 healthy control subjects, we examined correlations between plasma L5 levels and the number of MetS criteria fulfilled, CVD risk factors, and CVD risk according to the Framingham risk score. RESULTS: L5 levels were significantly higher in MetS subjects than in control subjects (21.9±18.7 mg/dL vs. 11.2±10.7 mg/dL, P = 0.01). The Jonckheere trend test revealed that the percent L5 of total LDL (L5%) and the L5 concentration increased with the number of MetS criteria (P < 0.001). L5% correlated with classic CVD risk factors, including waist circumference, body mass index, waist-to-height ratio, smoking status, blood pressure, and levels of fasting plasma glucose, triglyceride, and high-density lipoprotein. Stepwise regression analysis revealed that fasting plasma glucose level and body mass index contributed to 28% of L5% variance. The L5 concentration was associated with CVD risk and contributed to 11% of 30-year general CVD risk variance when controlling for the variance of waist circumference. CONCLUSION: Our findings show that LDL electronegativity was associated with multiple CVD risk factors and CVD risk, suggesting that the LDL electronegativity index may have the potential to be a novel index for predicting CVD. Large-scale clinical trials are warranted to test the reliability of this hypothesis and the clinical importance of the LDL electronegativity index.

  1. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  2. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Scaled-down models are widely used for experimental investigations of large structures due to the limited capacities of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e. geometry, loading and properties) between the model and a large structural element such as those present in existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, representing the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is made, and the results are compared with those obtained from numerical computations on the full-scale structure. The effect of the scaled-down model's size and material on the accuracy of the modeling technique is thoroughly examined.
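
    The scaling relations implied by the three fundamental scale factors can be written out directly from dimensional analysis. This is a sketch; the notation, the choice of derived quantities, and any factor values are ours, since the abstract gives none:

        % With model-to-prototype scale factors \lambda_M, \lambda_L, \lambda_T
        % for mass, length and time, any derived quantity scales by its
        % dimensional formula, e.g. force (M L T^{-2}) and stress or elastic
        % modulus (M L^{-1} T^{-2}):
        \[
          \lambda_F = \lambda_M \lambda_L \lambda_T^{-2}, \qquad
          \lambda_\sigma = \lambda_E = \lambda_M \lambda_L^{-1} \lambda_T^{-2}
        \]
        % Choosing the model material fixes \lambda_E, so once \lambda_L is
        % selected the admissible load scale follows rather than being free.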

  3. Deep-UV patterning of commercial grade PMMA for low-cost, large-scale microfluidics

    International Nuclear Information System (INIS)

    Haiducu, M; Rahbar, M; Foulds, I G; Johnstone, R W; Sameoto, D; Parameswaran, M

    2008-01-01

    Although PMMA can be exposed using a variety of exposure sources, deep-UV at 254 nm is of interest because it is relatively inexpensive. Additionally, deep-UV sources can be readily scaled to large-area exposures. Moreover, this paper shows that depths of over 100 µm can be created in commercial grade PMMA using an uncollimated source; such depths are sufficient for creating microfluidic channels. This paper provides measurements of the dissolution depth of commercial grade PMMA as a function of exposure dose and etch time, using an IPA:H₂O developer. Additionally, experiments were run to characterize the dependence of the dissolution rate on temperature and agitation. The patterned substrates were thermally bonded to blank PMMA pieces to enclose the channels, and ports were drilled into the reservoirs. The resulting fluidic systems were then tested for leakage. The work herein presents the patterning, development and system behaviour of a complete microfluidic system based on commercial grade PMMA.

  4. A Large-scale Plume in an X-class Solar Flare

    Energy Technology Data Exchange (ETDEWEB)

    Fleishman, Gregory D.; Nita, Gelu M.; Gary, Dale E. [Physics Department, Center for Solar-Terrestrial Research, New Jersey Institute of Technology Newark, NJ, 07102-1982 (United States)

    2017-08-20

    Ever-increasing multi-frequency imaging of solar flares suggests that they often involve more than one magnetic fluxtube. Some of the fluxtubes are closed, while others can contain open fields. The relative proportion of nonthermal electrons among those distinct loops is highly important for understanding energy release, particle acceleration, and transport. The access of nonthermal electrons to the open field is also important because the open field facilitates solar energetic particle (SEP) escape from the flaring site, and thus controls the SEP fluxes in the solar system, both directly and as seed particles for further acceleration. The large-scale fluxtubes are often filled with a tenuous plasma, which is difficult to detect in either EUV or X-ray wavelengths; however, they can dominate at low radio frequencies, where a modest component of nonthermal electrons can render the source optically thick and, thus, bright enough to be observed. Here we report the detection of a large-scale “plume” at the impulsive phase of an X-class solar flare, SOL2001-08-25T16:23, using multi-frequency radio data from the Owens Valley Solar Array. To quantify the flare's spatial structure, we employ 3D modeling utilizing force-free-field extrapolations from line-of-sight SOHO/MDI magnetograms with our modeling tool GX-Simulator. We found that a significant fraction of the nonthermal electrons accelerated at the flare site low in the corona escapes to the plume, which contains both closed and open fields. We propose that the proportion between the closed and open fields at the plume is what determines the SEP population escaping into interplanetary space.

  5. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit electrochemical performance superior to graphite. Highlights: HGCNSs were prepared on a large scale at 550 °C. The preparation is simple, effective and eco-friendly. The in situ yielded MgO nanocrystals promote graphitization. The HGCNSs exhibit electrochemical performance superior to graphite.

  6. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crucial in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  7. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  8. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large-scale structures is considered within a model with a string on a toroidal space-time. First, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large-scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large-scale distribution and with the theory of a Cantorian space-time.

  9. Relative Affordability of Health Insurance Premiums under CHIP Expansion Programs and the ACA.

    Science.gov (United States)

    Gresenz, Carole Roan; Laugesen, Miriam J; Yesus, Ambeshie; Escarce, José J

    2011-10-01

    Affordability is integral to the success of health care reforms aimed at ensuring universal access to health insurance coverage, and affordability determinations have major policy and practical consequences. This article describes factors that influenced the determination of affordability benchmarks and premium-contribution requirements for Children's Health Insurance Program (CHIP) expansions in three states that sought to universalize access to coverage for youth. It also compares subsidy levels developed in these states to the premium subsidy schedule under the Affordable Care Act (ACA) for health insurance plans purchased through an exchange. We find sizeable variability in premium-contribution requirements for children's coverage as a percentage of family income across the three states and in the progressivity and regressivity of the premium-contribution schedules developed. These findings underscore the ambiguity and subjectivity of affordability standards. Further, our analyses suggest that while the ACA increases the affordability of family coverage for families with incomes below 400 percent of the federal poverty level, the evolution of CHIP over the next five to ten years will continue to have significant implications for low-income families.

  10. Passive technologies for future large-scale photonic integrated circuits on silicon: polarization handling, light non-reciprocity and loss reduction

    Directory of Open Access Journals (Sweden)

    Daoxin Dai

    2012-03-01

    Silicon-based large-scale photonic integrated circuits are becoming important, due to the need for higher complexity and lower cost in optical transmitters, receivers and optical buffers. In this paper, passive technologies for large-scale photonic integrated circuits are described, including polarization handling, light non-reciprocity and loss reduction. The design rule for polarization beam splitters based on asymmetrical directional couplers is summarized, and several novel designs for ultra-short polarization beam splitters are reviewed. A novel concept for realizing a polarization splitter–rotator is presented with a very simple fabrication process. Realization of silicon-based light non-reciprocity devices (e.g., an optical isolator), which are very important for transmitters to avoid sensitivity to reflections, is also demonstrated with the help of magneto-optical material applied by a bonding technology. Low-loss waveguides are another important technology for large-scale photonic integrated circuits. Ultra-low-loss optical waveguides are achieved by designing a Si3N4 core with a very high aspect ratio. The loss is reduced further to <0.1 dB m⁻¹ with an improved fabrication process incorporating a high-quality thermal oxide upper cladding by means of wafer bonding. With the developed ultra-low-loss Si3N4 optical waveguides, several devices are also demonstrated, including ultra-high-Q ring resonators, low-loss arrayed-waveguide grating (de)multiplexers, and high-extinction-ratio polarizers.

  11. Managing the risks of risk management on large fires

    Science.gov (United States)

    Donald G. MacGregor; Armando González-Cabán

    2013-01-01

    Large fires pose risks to a number of important values, including the ecology, property and the lives of incident responders. A relatively unstudied aspect of fire management is the risks to which incident managers are exposed due to organizational and sociopolitical factors that put them in a position of, for example, potential liability or degradation of their image...

  12. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165 Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria. e-mail: schroll@solobskh.ac.at.

  13. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  14. The AFFORD clinical decision aid to identify emergency department patients with atrial fibrillation at low risk for 30-day adverse events.

    Science.gov (United States)

    Barrett, Tyler W; Storrow, Alan B; Jenkins, Cathy A; Abraham, Robert L; Liu, Dandan; Miller, Karen F; Moser, Kelly M; Russ, Stephan; Roden, Dan M; Harrell, Frank E; Darbar, Dawood

    2015-03-15

    There is wide variation in the management of patients with atrial fibrillation (AF) in the emergency department (ED). We aimed to derive and internally validate the first prospective, ED-based clinical decision aid to identify patients with AF at low risk for 30-day adverse events. We performed a prospective cohort study at a university-affiliated tertiary-care ED. Patients were enrolled from June 9, 2010, to February 28, 2013, and followed for 30 days. We enrolled a convenience sample of patients presenting to the ED with symptomatic AF. Candidate predictors were based on ED data available in the first 2 hours. The decision aid was derived using model approximation (preconditioning) followed by strong bootstrap internal validation. We used an ordinal outcome hierarchy defined as the incidence of the most severe adverse event within 30 days of the ED evaluation. Of 497 patients enrolled, stroke and AF-related death occurred in 13 (3%) and 4 patients, respectively. The decision aid included the following: age, triage vitals (systolic blood pressure, temperature, respiratory rate, oxygen saturation, supplemental oxygen requirement), medical history (heart failure, home sotalol use, previous percutaneous coronary intervention, electrical cardioversion, cardiac ablation, frequency of AF symptoms), and ED data (2-hour heart rate, chest radiograph results, hemoglobin, creatinine, and brain natriuretic peptide). The decision aid's c-statistic in predicting any 30-day adverse event was 0.7 (95% confidence interval 0.65, 0.76). In conclusion, in patients with AF in the ED, Atrial Fibrillation and Flutter Outcome Risk Determination provides the first evidence-based decision aid for identifying patients who are at low risk for 30-day adverse events and who are candidates for safe discharge. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large-scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1,125,000 elements in 2D and 128,000 elements in 3D on a shared-memory computer consisting of Sun UltraSparc IV CPUs.

  16. On the measurements of large scale solar velocity fields

    International Nuclear Information System (INIS)

    Andersen, B.N.

    1985-01-01

    A general mathematical formulation for the correction of the scattered-light influence on solar Doppler shift measurements has been developed. This method has been applied to the straylight correction of measurements of solar rotation, limb effect, large-scale flows and oscillations. It is shown that neglecting the straylight errors may produce spurious large-scale velocity fields and oscillations and erroneous values for the solar rotation and limb effect. The influence of active regions on full-disc velocity measurements has been studied. It is shown that a 13-day periodicity in the global velocity signal is introduced by the passage of sunspots over the solar disc; with different types of low resolution apertures, other periodicities may be introduced. Accurate measurements of the center-to-limb velocity shift are presented for a set of magnetically insensitive lines well suited for solar velocity measurements. The absolute wavelength shifts are briefly discussed. The stronger lines have a ''supergravitational'' shift of 300-400 m/s at the solar limb. The results may be explained by the presence of a 20-25 m/s poleward meridional flow and a latitudinal dependence of the granular parameters. Using a simple model it is shown that the main properties of the observations are explained by a 5% increase in granular size with latitude. Data presented indicate that the resonance line K I 769.9 nm has a small but significant limb effect of 125 m/s from center to limb.
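
    The abstract does not reproduce the author's formulation; as an illustration of why straylight can mimic velocity signals, consider the simplest mixing model, with s the scattered-light fraction in the aperture (notation ours):

        % Illustrative mixing model only (not the paper's formulation):
        \[
          v_{\mathrm{obs}} = (1 - s)\, v_{\mathrm{true}} + s\, v_{\mathrm{scat}}
          \quad\Longrightarrow\quad
          v_{\mathrm{true}} = \frac{v_{\mathrm{obs}} - s\, v_{\mathrm{scat}}}{1 - s}
        \]
        % Any unmodelled spatial variation of s or v_scat across the disc
        % then appears as a spurious large-scale velocity field.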

  17. Low self-esteem is a risk factor for depressive symptoms from young adulthood to old age.

    Science.gov (United States)

    Orth, Ulrich; Robins, Richard W; Trzesniewski, Kali H; Maes, Jürgen; Schmitt, Manfred

    2009-08-01

    Data from two large longitudinal studies were used to analyze reciprocal relations between self-esteem and depressive symptoms across the adult life span. Study 1 included 1,685 participants aged 18 to 96 years assessed 4 times over a 9-year period. Study 2 included 2,479 participants aged 18 to 88 years assessed 3 times over a 4-year period. In both studies, cross-lagged regression analyses indicated that low self-esteem predicted subsequent depressive symptoms, but depressive symptoms did not predict subsequent levels of self-esteem. This pattern of results replicated across all age groups, for both affective-cognitive and somatic symptoms of depression, and after controlling for content overlap between the self-esteem and depression scales. The results suggest that low self-esteem operates as a risk factor for depressive symptoms at all phases of the adult life span.
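
    A minimal form of the cross-lagged regressions referred to above, with notation assumed rather than taken from the paper (S for self-esteem and D for depressive symptoms at wave t):

        % Cross-lagged panel regressions (sketch; notation ours):
        \[
          D_t = \beta_0 + \beta_1 D_{t-1} + \beta_2 S_{t-1} + \varepsilon_t,
          \qquad
          S_t = \gamma_0 + \gamma_1 S_{t-1} + \gamma_2 D_{t-1} + \delta_t
        \]
        % The reported pattern corresponds to \beta_2 < 0 (low self-esteem
        % predicts later depressive symptoms) with \gamma_2 \approx 0.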

  18. Influence of weathering and pre-existing large scale fractures on gravitational slope failure: insights from 3-D physical modelling

    Directory of Open Access Journals (Sweden)

    D. Bachmann

    2004-01-01

    Using a new 3-D physical modelling technique we investigated the initiation and evolution of large-scale landslides in the presence of pre-existing large-scale fractures, taking into account the weakening of the slope material due to alteration/weathering. The modelling technique is based on specially developed, properly scaled analogue materials, as well as on an original vertical accelerator device enabling increases in the 'gravity acceleration' up to a factor of 50. Weathering primarily affects the uppermost layers through water circulation. We simulated the effect of this process by making models in two parts. The shallower part represents the zone subject to homogeneous weathering and is made of a low strength material of compressive strength σl. The deeper (core) part of the model is stronger and simulates intact rocks. Deformation of such a model subjected to the gravity force occurred only in its upper (low strength) layer. In another set of experiments, narrow planar zones of low strength σw (σw < σl), sub-parallel to the slope surface, were introduced into the model's superficial low strength layer to simulate localized highly weathered zones. In this configuration landslides were initiated much more easily (at lower 'gravity force'), were shallower, and had a smaller horizontal size largely defined by the size of the weak zone. Pre-existing fractures were introduced into the model by cutting it along a given plane. They proved to have little influence on slope stability, except when they were associated with highly weathered zones; in this latter case the fractures laterally limited the slides. The initiation of deep-seated rockslides is thus directly controlled by the mechanical structure of the hillslope's uppermost levels, and especially by the presence of weak zones due to weathering. The large-scale fractures play a more passive role and can only influence the shape and the volume of the sliding units.

  19. Controllable and affordable utility-scale electricity from intermittent wind resources and compressed air energy storage (CAES)

    International Nuclear Information System (INIS)

    Cavallo, Alfred

    2007-01-01

    World wind energy resources are substantial, and in many areas, such as the US and northern Europe, could in theory supply all of the electricity demand. However, the remote or challenging location (i.e. offshore) and especially the intermittent character of the wind resources present formidable barriers to utilization on the scale required by a modern industrial economy. All of these technical challenges can be overcome. Long distance transmission is well understood, while offshore wind technology is being developed rapidly. Intermittent wind power can be transformed to a controllable power source with hybrid wind/compressed air energy storage (CAES) systems. The cost of electricity from such hybrid systems (including transmission) is affordable, and comparable to what users in some modern industrial economies already pay for electricity. This approach to intermittent energy integration has many advantages compared to the current strategy of forcing utilities to cope with supply uncertainty and transmission costs. Above all, it places intermittent wind on an equal technical footing with every other generation technology, including nuclear power, its most important long-term competitor

  20. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale with other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. The National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale comprises 3 NIHSS items: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. Two thirds of the test cohort was used for derivation and showed an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography and PASS ≥2 had a median NIHSS score of 17 (interquartile range=6), as opposed to a median NIHSS score of 6 (interquartile range=5) for PASS <2. The PASS scale performed on par with other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
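
    Because the scale is three dichotomized items with a ≥2 cut point, it reduces to a few lines of code. A sketch with simplified item wording (the dichotomization details follow the NIHSS items named above; function and argument names are ours):

        # Sketch of the PASS scale as described above: three NIHSS-derived
        # items, one point each; >= 2 points is a positive ELVO screen.

        def pass_score(impaired_consciousness: bool,
                       gaze_palsy_or_deviation: bool,
                       arm_weakness: bool) -> int:
            """Prehospital Acute Stroke Severity (PASS) score, 0-3."""
            return sum([impaired_consciousness,
                        gaze_palsy_or_deviation,
                        arm_weakness])

        def suggests_elvo(score: int, cut_point: int = 2) -> bool:
            """Positive screen at the reported optimal cut point (>= 2)."""
            return score >= cut_point

        # Example: gaze deviation plus arm weakness -> score 2 -> positive.
        print(suggests_elvo(pass_score(False, True, True)))  # True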