WorldWideScience

Sample records for project leveraging massive

  1. Analysis of debt leveraging in private power projects

    International Nuclear Information System (INIS)

    Kahn, E.P.; Meal, M.; Doerrer, S.; Morse, S.

    1992-08-01

    As private power (non-utility generation) has grown to become a significant part of the electricity system, increasing concern about its financial implications has arisen. In many cases, the source of this concern has been the substantial reliance of these projects on debt financing. This study examines debt leveraging in private power projects. The policy debate on these issues has typically been conducted at a high level of generality. Critics of the private power industry assert that high debt leveraging confers an unfair competitive advantage by lowering the cost of capital, and that this leveraging is only possible because risks are shifted to the utility. Further, debt leveraging is claimed to be a threat to reliability. On the opposite side, it is argued that debt leveraging imposes costs and obligations not borne by utilities, and so there is no financial advantage. The private producers also argue that on balance more risk is shifted away from utilities than to them, and that incentives for reliability are strong. In this study we examine the project finance mechanisms used in private power lending in detail, relying on a sample of actual loan documents. This review and its findings should be relevant to the further evolution of this debate. State regulatory commissions are likely to be interested in it, and Federal legislation to amend the Public Utility Holding Company Act (PUHCA) could require states to consider the implications of debt leveraging in relation to their oversight of utility power purchase programs.
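
    The cost-of-capital claim above can be made concrete with the textbook weighted-average cost of capital (WACC); the numbers below are purely illustrative, not figures from the study:

```python
def wacc(equity_share, cost_of_equity, cost_of_debt, tax_rate):
    """Weighted average cost of capital for a given capital structure."""
    debt_share = 1.0 - equity_share
    return equity_share * cost_of_equity + debt_share * cost_of_debt * (1.0 - tax_rate)

# Utility-like structure (50% equity) vs. a highly leveraged project (20% equity),
# holding the cost of equity (12%), cost of debt (8%), and tax rate (30%) fixed.
utility = wacc(0.50, 0.12, 0.08, 0.30)   # 0.088
project = wacc(0.20, 0.12, 0.08, 0.30)   # 0.0688
print(f"utility: {utility:.2%}  leveraged project: {project:.2%}")
```

    In practice, lenders and equity investors demand higher returns as leverage rises, so the advantage is smaller than this static comparison suggests; whether any advantage survives once project-finance covenants and risk shifting are accounted for is exactly what the study's debate is about.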

  2. Analysis of debt leveraging in private power projects

    Energy Technology Data Exchange (ETDEWEB)

    Kahn, E.P. (Lawrence Berkeley Lab., CA (United States)); Meal, M.; Doerrer, S.; Morse, S. (Morse, Richard, Weisenmiller Associates, Inc., Oakland, CA (United States))

    1992-08-01

    As private power has grown to become a significant part of the electricity system, increasing concern about its financial implications has arisen. In many cases, the source of this concern has been the substantial reliance of these projects on debt financing. This study examines debt leveraging in private power projects. The policy debate on these issues has typically been conducted at a high level of generality. Critics of the private power industry assert that high debt leveraging confers an unfair competitive advantage by lowering the cost of capital. This leveraging is only possible because risks are shifted to the utility. Further, debt leveraging is claimed to be a threat to reliability. On the opposite side, it is argued that debt leveraging imposes costs and obligations not borne by utilities, and so there is no financial advantage. The private producers also argue that on balance more risk is shifted away from utilities than to them, and that incentives for reliability are strong. In this study we examine the project finance mechanisms used in private power lending in detail, relying on a sample of actual loan documents. This review and its findings should be relevant to the further evolution of this debate. State regulatory commissions are likely to be interested in it, and Federal legislation to amend the Public Utility Holding Company Act (PUHCA) could require states to consider the implications of debt leveraging in relation to their oversight of utility power purchase programs.

  3. A Model Suggestion to Predict Leverage Ratio for Construction Projects

    Directory of Open Access Journals (Sweden)

    Özlem Tüz

    2013-12-01

    Full Text Available By its nature, construction is an industry with high uncertainty and risk, and construction firms carry high leverage ratios. Firms with little equity take on large projects through the progress-payment system, but in that case even a small shortfall in the planned cash inflows poses a major risk for the company. The use of leverage allows large-scale, high-profit targets to be pursued with a small investment, but it also brings high risk: investors may lose all or a portion of their money. This study targets the monitoring and measurement of the leverage ratio under displacement (delay) of cash inflows in construction projects, which do business in the sector with high leverage and little cash. The model shows the cash need that arises as cash inflows drift: work can proceed in the early stages of the project with little capital, but in the later stages a rapidly growing capital need arises. The values obtained from the model may be used to anticipate the risks caused by delayed cash flow in highly leveraged construction projects and to supply the required capital at the right time.
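
    The capital-need dynamic the abstract describes can be sketched with a toy cash-flow model (all figures hypothetical; this is not the paper's model):

```python
def peak_capital_need(outflows, inflows, delay):
    """Peak cumulative financing need when progress payments arrive
    `delay` periods late. outflows/inflows are per-period amounts;
    returns the worst cumulative shortfall over the project's life."""
    n = len(outflows) + delay
    balance, worst = 0.0, 0.0
    for t in range(n):
        out = outflows[t] if t < len(outflows) else 0.0
        inn = inflows[t - delay] if delay <= t < delay + len(inflows) else 0.0
        balance += inn - out
        worst = min(worst, balance)
    return -worst

# A four-period job: spend 100 per period, expect matching progress payments.
spend = [100, 100, 100, 100]
pay = [100, 100, 100, 100]
print(peak_capital_need(spend, pay, delay=1))  # → 100.0 (one-period float)
print(peak_capital_need(spend, pay, delay=3))  # → 300.0 (slippage triples the need)
```

    The jump from 100 to 300 with a two-period slip is the point: with thin equity, a modest payment delay forces a rapidly growing external capital need late in the project.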

  4. Massive hydraulic fracturing gas stimulation project

    International Nuclear Information System (INIS)

    Appledorn, C.R.; Mann, R.L.

    1977-01-01

    The Rio Blanco Massive Hydraulic Fracturing Project was fielded in 1974 as a joint Industry/ERDA demonstration to test the relative effectiveness of massive hydraulic fracturing in the same formations that were stimulated by the Rio Blanco nuclear fracturing experiment. The project is a companion effort to, and a continuation of, the preceding nuclear stimulation project, which took place in May 1973. 8 figures

  5. A Model Suggestion to Predict Leverage Ratio for Construction Projects

    OpenAIRE

    Özlem Tüz; Şafak Ebesek

    2013-01-01

    Due to its nature, construction is an industry with high uncertainty and risk, and it carries high leverage ratios. Firms with little equity take on large projects through the progress-payment system, but in that case even a small shortfall in the planned cash flows constitutes a major risk for the company. The use of leverage allows large-scale, high-profit targets to be pursued with a small investment, but it also brings high risk with it. Investors may lose all or a portion of th...

  6. eButterfly: Leveraging Massive Online Citizen Science for Butterfly Conservation

    Science.gov (United States)

    Prudic, Kathleen L.; McFarland, Kent P.; Oliver, Jeffrey C.; Hutchinson, Rebecca A.; Long, Elizabeth C.; Kerr, Jeremy T.; Larrivée, Maxim

    2017-01-01

    Data collection, storage, analysis, visualization, and dissemination are changing rapidly due to advances in new technologies driven by computer science and universal access to the internet. These technologies and web connections place human observers front and center in citizen science-driven research and are critical in generating new discoveries and innovation in such fields as astronomy, biodiversity, and meteorology. Research projects utilizing a citizen science approach address scientific problems at regional, continental, and even global scales otherwise impossible for a single lab or even a small collection of academic researchers. Here we describe eButterfly, an integrative checklist-based butterfly monitoring and database web platform that leverages the skills and knowledge of recreational butterfly enthusiasts to create a globally accessible unified database of butterfly observations across North America. Citizen scientists, conservationists, policy makers, and scientists are using eButterfly data to better understand the biological patterns of butterfly species diversity and how environmental conditions shape these patterns in space and time. In collaboration with thousands of butterfly enthusiasts, eButterfly has created a near real-time butterfly data resource, open to all to share and explore, that produces tens of thousands of observations per year. PMID:28524117

  7. Rio Blanco massive hydraulic fracture: project definition

    International Nuclear Information System (INIS)

    1976-01-01

    A recent Federal Power Commission feasibility study assessed the possibility of economically producing gas from three Rocky Mountain basins. These basins have potentially productive horizons 2,000 to 4,000 feet thick containing an estimated total of 600 trillion cubic feet of gas in place. However, the producing sands are of such low permeability and heterogeneity that conventional methods have failed to develop these basins economically. The Natural Gas Technology Task Force, responsible for preparing the referenced feasibility study, determined that, if effective well stimulation methods for these basins can be developed, it might be possible to recover 40 to 50 percent of the gas in place. The Task Force pointed out two possible underground fracturing methods: Nuclear explosive fracturing, and massive hydraulic fracturing. They argued that once technical viability has been demonstrated, and with adequate economic incentives, there should be no reason why one or even both of these approaches could not be employed, thus making a major contribution toward correcting the energy deficiency of the Nation. A joint Government-industry demonstration program has been proposed to test the relative effectiveness of massive hydraulic fracturing of the same formation and producing horizons that were stimulated by the Rio Blanco nuclear project

  8. Human genome project: revolutionizing biology through leveraging technology

    Science.gov (United States)

    Dahl, Carol A.; Strausberg, Robert L.

    1996-04-01

    The Human Genome Project (HGP) is an international project to develop genetic, physical, and sequence-based maps of the human genome. Since the inception of the HGP it has been clear that substantially improved technology would be required to meet the scientific goals, particularly in order to acquire the complete sequence of the human genome, and that these technologies coupled with the information forthcoming from the project would have a dramatic effect on the way biomedical research is performed in the future. In this paper, we discuss the state-of-the-art for genomic DNA sequencing, technological challenges that remain, and the potential technological paths that could yield substantially improved genomic sequencing technology. The impact of the technology developed from the HGP is broad-reaching and a discussion of other research and medical applications that are leveraging HGP-derived DNA analysis technologies is included. The multidisciplinary approach to the development of new technologies that has been successful for the HGP provides a paradigm for facilitating new genomic approaches toward understanding the biological role of functional elements and systems within the cell, including those encoded within genomic DNA and their molecular products.

  9. Leveraging Text Content for Management of Construction Project Documents

    Science.gov (United States)

    Alqady, Mohammed

    2012-01-01

    The construction industry is a knowledge intensive industry. Thousands of documents are generated by construction projects. Documents, as information carriers, must be managed effectively to ensure successful project management. The fact that a single project can produce thousands of documents and that a lot of the documents are generated in a…

  10. Open Crowdsourcing: Leveraging Community Software Developers for IT Projects

    Science.gov (United States)

    Phair, Derek

    2012-01-01

    This qualitative exploratory single-case study was designed to examine and understand the use of volunteer community participants as software developers and other project related roles, such as testers, in completing a web-based application project by a non-profit organization. This study analyzed the strategic decision to engage crowd…

  11. Leveraging Entrepreneurship through the design of Artificial Intelligence Projects

    OpenAIRE

    Osegi , E.N; Wokoma , B.A; Bruce-Allison , S.A

    2017-01-01

    Conference Proceeding: Port-Harcourt School of Engineering Science and Technology, Port-Harcourt, Rivers State, Nigeria, 2017; International audience; Artificial Intelligence projects (AIPs) are currently attracting popular attention as a viable business area for young and mature entrepreneurs. Most industries, particularly in research and development, now use AIPs for the discovery and synthesis of countless novel products and services of immense commercial and functional value. H...

  12. Analyzing Naval Strategy for Counterpiracy Operations, Using the Massive Multiplayer Online War Game Leveraging the Internet (MMOWGLI) and Discrete Event Simulation (DES)

    Science.gov (United States)

    2013-03-01

    …XML program leveraging the Netbeans platform. X3D-Edit can launch X3D scenes for rendering in any X3D-compliant 3D browser, including Xj3D, a Java… …adding to the Netbeans or Eclipse library for the project. Once this is done, creating a Sandbox Frame and a Sandbox is a straightforward process. The…

  13. Data Analysis Project: Leveraging Massive Textual Corpora Using n-Gram Statistics

    Science.gov (United States)

    2008-05-01

    (Snippet of the report's class: instance extraction tables.) oysters: …; cereals: oatmeal, sorghum, Frosted Flakes, wheat, Cheerios, oats, maize, rye, millet, bran; cities: Chicago, Beijing, San Francisco, Los Angeles, …; Coolmax …; flatworms: no extractions; fruits: apples, oranges, bananas, grapes, strawberries, peaches, mangoes, mango, peach, pineapple; fungi: …; …: incidents, leap years, renewals, monsoon, collaborators, HBO, Showtime, carrots, VAT, proposals; organisms: fungi, humans, algae, Coliform bacteria, E. coli
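
    The record above survives only as snippet residue from the report's class-instance extraction tables. The underlying primitive, counting n-grams over a large corpus, can be sketched as follows (toy corpus, not the project's data):

```python
from collections import Counter

def ngram_counts(tokens, n):
    """Count every contiguous n-gram in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

# "X such as Y" patterns like this are what class-instance extractors mine at scale.
corpus = "fruits such as apples and oranges and fruits such as mangoes".split()
bigrams = ngram_counts(corpus, 2)
print(bigrams[("fruits", "such")])  # → 2
```

    At massive-corpus scale the same counts are computed in a distributed fashion, but the statistic itself is no more than this.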

  14. Leverage bubble

    Science.gov (United States)

    Yan, Wanfeng; Woodard, Ryan; Sornette, Didier

    2012-01-01

    Leverage is strongly related to liquidity in a market and lack of liquidity is considered a cause and/or consequence of the recent financial crisis. A repurchase agreement is a financial instrument where a security is sold simultaneously with an agreement to buy it back at a later date. Repurchase agreement (repo) market size is a very important element in calculating the overall leverage in a financial market. Therefore, studying the behavior of repo market size can help to understand a process that can contribute to the birth of a financial crisis. We hypothesize that herding behavior among large investors led to massive over-leveraging through the use of repos, resulting in a bubble (built up over the previous years) and subsequent crash in this market in early 2008. We use the Johansen-Ledoit-Sornette (JLS) model of rational expectation bubbles and behavioral finance to study the dynamics of the repo market that led to the crash. The JLS model qualifies a bubble by the presence of characteristic patterns in the price dynamics, called log-periodic power law (LPPL) behavior. We show that there was significant LPPL behavior in the market before that crash and that the range of times predicted by the model for the end of the bubble is consistent with the observations.
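
    The JLS model's log-periodic power law can be written down directly; below is a minimal sketch of evaluating it, with illustrative parameters rather than values fitted to the 2008 repo data:

```python
import numpy as np

def lppl(t, tc, A, B, C, m, omega, phi):
    """Johansen-Ledoit-Sornette log-periodic power law for the expected
    log-price (or log market size) at time t < tc, the critical time."""
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) - phi)

# Illustrative parameters only -- not a fit to any data set.
t = np.linspace(0.0, 9.5, 200)
log_size = lppl(t, tc=10.0, A=5.0, B=-0.5, C=0.1, m=0.5, omega=8.0, phi=0.0)
print(log_size.shape)  # → (200,)
```

    Fitting tc, m, and omega to observed prices, and checking that the oscillations accelerate toward tc, is what qualifies a bubble under the model.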

  15. Embedded Leverage

    DEFF Research Database (Denmark)

    Frazzini, Andrea; Heje Pedersen, Lasse

    find that asset classes with embedded leverage offer low risk-adjusted returns and, in the cross-section, higher embedded leverage is associated with lower returns. A portfolio which is long low-embedded-leverage securities and short high-embedded-leverage securities earns large abnormal returns...

  16. Monitoring Leverage

    DEFF Research Database (Denmark)

    Geanakoplos, John; Heje Pedersen, Lasse

    2014-01-01

    measure of systemic risk. Indeed, systemic crises tend to erupt when highly leveraged economic agents are forced to deleverage, sending the economy into recession. We emphasize the importance of measuring both the average leverage on old loans (which captures the economy's vulnerability) and the leverage...... offered on new loans (which captures current credit conditions) since the economy enters a crisis when leverage on new loans is low and leverage on old loans is high. While leverage plays an important role in several economic models, the data on leverage is model-free and simply needs to be collected...

  17. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    Energy Technology Data Exchange (ETDEWEB)

    Kendrew, S.; Robitaille, T. P. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); Simpson, R.; Lintott, C. J. [Department of Astrophysics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Bressert, E. [School of Physics, University of Exeter, Stocker Road, Exeter EX4 4QL (United Kingdom); Povich, M. S. [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States); Sherman, R. [Department of Astronomy and Astrophysics, University of Chicago, 5640 S. Ellis Avenue, Chicago, IL 60637 (United States); Schawinski, K. [Yale Center for Astronomy and Astrophysics, Yale University, P.O. Box 208121, New Haven, CT 06520 (United States); Wolf-Chase, G., E-mail: kendrew@mpia.de [Astronomy Department, Adler Planetarium, 1300 S. Lake Shore Drive, Chicago, IL 60605 (United States)

    2012-08-10

    The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.

  18. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    International Nuclear Information System (INIS)

    Kendrew, S.; Robitaille, T. P.; Simpson, R.; Lintott, C. J.; Bressert, E.; Povich, M. S.; Sherman, R.; Schawinski, K.; Wolf-Chase, G.

    2012-01-01

    The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.
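
    The association statistic reported above (the fraction of sources within two bubble radii of some bubble) can be sketched in a few lines. Coordinates and radii here are invented, and the real analysis works with Galactic coordinates and angular separations:

```python
import math

def associated_fraction(sources, bubbles, k=2.0):
    """Fraction of sources lying within k bubble radii of at least one bubble.
    sources: list of (x, y); bubbles: list of (x, y, radius)."""
    def near(src):
        return any(math.hypot(src[0] - bx, src[1] - by) < k * r
                   for bx, by, r in bubbles)
    return sum(near(s) for s in sources) / len(sources)

bubbles = [(0.0, 0.0, 1.0), (10.0, 0.0, 2.0)]       # toy bubble catalog
sources = [(0.5, 0.5), (11.0, 1.0), (5.0, 5.0), (3.0, 0.0)]  # toy MYSOs
print(associated_fraction(sources, bubbles))  # → 0.5
```

    The statistical significance then comes from comparing this fraction against the same statistic for randomized source positions.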

  19. Leveraging Economy of Scale across Construction Projects by Implementing Coordinated Purchasing

    DEFF Research Database (Denmark)

    Thuesen, Christian Langhoff

    2010-01-01

    coordinated purchasing is an important step in the attempt to rethink the existing business model in construction. Going from competing on overhead (in a red ocean) to start to compete on company specific core competencies. The paper concludes highlighting the next milestones at the journey leveraging economy...

  20. On ISSM and leveraging the Cloud towards faster quantification of the uncertainty in ice-sheet mass balance projections

    Science.gov (United States)

    Larour, E.; Schlegel, N.

    2016-11-01

    With the Amazon EC2 Cloud becoming available as a viable platform for parallel computing, the Earth system modeling community is increasingly interested in leveraging its capabilities to improve climate projections. In particular, faced with long wait periods on high-end clusters, the elasticity of the Cloud presents a unique opportunity: potentially "infinite" availability of small-sized clusters running on high-performance instances. Among specific applications of this new paradigm, we show here how uncertainty quantification in climate projections of polar ice sheets (Antarctica and Greenland) can be significantly accelerated using the Cloud. Indeed, small-sized clusters are very efficient at delivering sensitivity and sampling analyses, core tools of uncertainty quantification. We demonstrate how this approach was used to carry out an extensive analysis of ice-flow projections on one of the largest basins in Greenland, the North-East Greenland Glacier, using the Ice Sheet System Model, the public-domain NASA-funded ice-flow modeling software. We show how errors in the projections were accurately quantified using Monte Carlo sampling analysis on the EC2 Cloud, and how a judicious mix of high-end parallel computing and Cloud use can best leverage existing infrastructures, significantly accelerate delivery of potentially ground-breaking climate projections, and in particular enable uncertainty quantification that was previously impossible to achieve.
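
    The sampling analysis is embarrassingly parallel, which is what makes small cloud clusters effective for it. A minimal sketch, with a toy linear function standing in for the ice-sheet model (ISSM itself is a large finite-element code):

```python
import random
import statistics

def toy_mass_balance(basal_friction, melt_rate):
    """Toy stand-in for one forward run of an ice-sheet model (Gt/yr)."""
    return -100.0 * melt_rate + 20.0 * basal_friction

# Draw uncertain inputs and run the "model" once per draw; every run is
# independent, so the loop maps directly onto many small cloud instances.
random.seed(42)
runs = [toy_mass_balance(random.gauss(1.0, 0.1), random.gauss(0.5, 0.05))
        for _ in range(10_000)]
print(f"mass balance: {statistics.mean(runs):.1f} "
      f"+/- {statistics.stdev(runs):.1f} Gt/yr")
```

    The spread of the output distribution is the projection uncertainty; sensitivity analysis follows the same pattern, perturbing one input at a time.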

  1. Leveraging Smart Open Innovation for Achieving Cultural Sustainability: Learning from a New City Museum Project

    Directory of Open Access Journals (Sweden)

    Luisa Errichiello

    2018-06-01

    Full Text Available In recent years, cultural sustainability has attracted increasing attention within the discourse of sustainable development and sustainable cities. Notwithstanding some effort put on conceptualizing the relationship between culture and sustainability, research on the issue is still in a pre-paradigmatic stage and related empirical studies are scant. In particular, further knowledge is required to understand not only how cultural sustainability has been addressed strategically but also how it has been implemented in practice. In this direction, research has pointed out the role of social structures (e.g., partnerships, collaborations, etc.) for achieving cultural sustainability goals. However, focusing on smart cities, attention is limited to how collaborative arrangements can be leveraged within the development of new city services (e.g., smart open innovation) to sustain goals of environmental, economic and social sustainability, with cultural sustainability still playing a marginal role. This paper develops a new framework linking together the strategic level and the practice level in addressing cultural sustainability and conceptualizing the role of collaborative structures in the development of smart innovation. The framework is then used as a frame of reference for analyzing the case of MuseoTorino, a new city museum realized within the smart city strategy of Turin (Italy). The analysis provides evidence of some practices adopted to leverage collaboration and stakeholders’ engagement to strategically address cultural sustainability and to realize it in practice throughout the new service development process.

  2. Developing a Massively Parallel Forward Projection Radiography Model for Large-Scale Industrial Applications

    Energy Technology Data Exchange (ETDEWEB)

    Bauerle, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

    This project utilizes Graphics Processing Units (GPUs) to compute radiograph simulations for arbitrary objects. The generation of radiographs, also known as the forward projection imaging model, is computationally intensive and not widely utilized. The goal of this research is to develop a massively parallel algorithm that can compute forward projections for objects with a trillion voxels (3D pixels). To achieve this end, the data are divided into blocks that can each fit into GPU memory. The forward projected image is also divided into segments to allow for future parallelization and to avoid needless computations.
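
    The blocking strategy can be illustrated with a CPU sketch (the actual implementation targets GPUs, and this parallel-beam ray sum is a simplified stand-in for the full projection geometry):

```python
import numpy as np

def forward_project(volume, block=64):
    """Parallel-beam forward projection (ray sums along z), accumulated one
    z-block at a time so each block could fit in limited device memory."""
    nx, ny, nz = volume.shape
    image = np.zeros((nx, ny))
    for z0 in range(0, nz, block):
        image += volume[:, :, z0:z0 + block].sum(axis=2)  # partial ray sums
    return image

vol = np.ones((8, 8, 200))            # uniform toy object, 200 voxels deep
proj = forward_project(vol, block=64)
print(proj[0, 0])  # → 200.0: blocking does not change the ray sum
```

    Because the per-block partial sums are independent until the final accumulation, the same decomposition lets a trillion-voxel object be streamed through GPU memory block by block.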

  3. Two-spinor description of massive particles and relativistic spin projection operators

    Directory of Open Access Journals (Sweden)

    A.P. Isaev

    2018-04-01

    Full Text Available On the basis of the Wigner unitary representations of the covering group ISL(2,C of the Poincaré group, we obtain spin-tensor wave functions of free massive particles with arbitrary spin. The wave functions automatically satisfy the Dirac–Pauli–Fierz equations. In the framework of the two-spinor formalism we construct spin-vectors of polarizations and obtain conditions that fix the corresponding relativistic spin projection operators (Behrends–Fronsdal projection operators. With the help of these conditions we find explicit expressions for relativistic spin projection operators for integer spins (Behrends–Fronsdal projection operators and then find relativistic spin projection operators for half integer spins. These projection operators determine the numerators in the propagators of fields of relativistic particles. We deduce generalizations of the Behrends–Fronsdal projection operators for arbitrary space–time dimensions D>2.

  4. Two-spinor description of massive particles and relativistic spin projection operators

    Science.gov (United States)

    Isaev, A. P.; Podoinitsyn, M. A.

    2018-04-01

    On the basis of the Wigner unitary representations of the covering group ISL (2 , C) of the Poincaré group, we obtain spin-tensor wave functions of free massive particles with arbitrary spin. The wave functions automatically satisfy the Dirac-Pauli-Fierz equations. In the framework of the two-spinor formalism we construct spin-vectors of polarizations and obtain conditions that fix the corresponding relativistic spin projection operators (Behrends-Fronsdal projection operators). With the help of these conditions we find explicit expressions for relativistic spin projection operators for integer spins (Behrends-Fronsdal projection operators) and then find relativistic spin projection operators for half integer spins. These projection operators determine the numerators in the propagators of fields of relativistic particles. We deduce generalizations of the Behrends-Fronsdal projection operators for arbitrary space-time dimensions D > 2.
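
    For reference, the lowest-spin Behrends-Fronsdal projectors take a standard closed form (written here in a common convention; the paper's normalizations may differ). With θ the transverse projector, the spin-1 and spin-2 projectors in D space-time dimensions are

```latex
\theta_{\mu\nu} \;=\; \eta_{\mu\nu} - \frac{p_\mu p_\nu}{p^2},
\qquad
\Theta^{(2)}_{\mu\nu;\rho\sigma} \;=\;
\tfrac12\bigl(\theta_{\mu\rho}\theta_{\nu\sigma}
            + \theta_{\mu\sigma}\theta_{\nu\rho}\bigr)
\;-\; \frac{1}{D-1}\,\theta_{\mu\nu}\theta_{\rho\sigma}.
```

    The trace-subtraction coefficient 1/(D-1) carries the D-dependence (reducing to the familiar 1/3 in D = 4), and the higher integer-spin projectors are built from symmetrized products of θ with further trace subtractions.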

  5. Industrial Sponsor Perspective on Leveraging Capstone Design Projects to Enhance Their Business

    Science.gov (United States)

    Weissbach, Robert S.; Snyder, Joseph W.; Evans, Edward R., Jr.; Carucci, James R., Jr.

    2017-01-01

    Capstone design projects have become commonplace among engineering and engineering technology programs. These projects are valuable tools when assessing students, as they require students to work in teams, communicate effectively, and demonstrate technical competency. The use of industrial sponsors enhances these projects by giving these projects…

  6. Draft law on combating the proliferation of weapons of mass destruction and their delivery systems

    International Nuclear Information System (INIS)

    2011-01-01

    This draft law addresses several issues: the fight against the proliferation of weapons of mass destruction (nuclear weapons, nuclear materials, biological weapons, and chemical weapons); the fight against the proliferation of their delivery systems; dual-use goods; and the use of these weapons and delivery systems in acts of terrorism.

  7. THE MILKY WAY PROJECT: LEVERAGING CITIZEN SCIENCE AND MACHINE LEARNING TO DETECT INTERSTELLAR BUBBLES

    International Nuclear Information System (INIS)

    Beaumont, Christopher N.; Williams, Jonathan P.; Goodman, Alyssa A.; Kendrew, Sarah; Simpson, Robert

    2014-01-01

    We present Brut, an algorithm to identify bubbles in infrared images of the Galactic midplane. Brut is based on the Random Forest algorithm, and uses bubbles identified by >35,000 citizen scientists from the Milky Way Project to discover the identifying characteristics of bubbles in images from the Spitzer Space Telescope. We demonstrate that Brut's ability to identify bubbles is comparable to expert astronomers. We use Brut to re-assess the bubbles in the Milky Way Project catalog, and find that 10%-30% of the objects in this catalog are non-bubble interlopers. Relative to these interlopers, high-reliability bubbles are more confined to the mid-plane, and display a stronger excess of young stellar objects along and within bubble rims. Furthermore, Brut is able to discover bubbles missed by previous searches—particularly bubbles near bright sources which have low contrast relative to their surroundings. Brut demonstrates the synergies that exist between citizen scientists, professional scientists, and machine learning techniques. In cases where 'untrained' citizens can identify patterns that machines cannot detect without training, machine learning algorithms like Brut can use the output of citizen science projects as input training sets, offering tremendous opportunities to speed the pace of scientific discovery. A hybrid model of machine learning combined with crowdsourced training data from citizen scientists can not only classify large quantities of data, but also address the weaknesses of each approach if deployed alone.

  8. THE MILKY WAY PROJECT: LEVERAGING CITIZEN SCIENCE AND MACHINE LEARNING TO DETECT INTERSTELLAR BUBBLES

    Energy Technology Data Exchange (ETDEWEB)

    Beaumont, Christopher N.; Williams, Jonathan P. [Institute for Astronomy, University of Hawai' i, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Goodman, Alyssa A. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Kendrew, Sarah; Simpson, Robert, E-mail: beaumont@ifa.hawaii.edu [Department of Astrophysics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom)

    2014-09-01

    We present Brut, an algorithm to identify bubbles in infrared images of the Galactic midplane. Brut is based on the Random Forest algorithm, and uses bubbles identified by >35,000 citizen scientists from the Milky Way Project to discover the identifying characteristics of bubbles in images from the Spitzer Space Telescope. We demonstrate that Brut's ability to identify bubbles is comparable to expert astronomers. We use Brut to re-assess the bubbles in the Milky Way Project catalog, and find that 10%-30% of the objects in this catalog are non-bubble interlopers. Relative to these interlopers, high-reliability bubbles are more confined to the mid-plane, and display a stronger excess of young stellar objects along and within bubble rims. Furthermore, Brut is able to discover bubbles missed by previous searches—particularly bubbles near bright sources which have low contrast relative to their surroundings. Brut demonstrates the synergies that exist between citizen scientists, professional scientists, and machine learning techniques. In cases where 'untrained' citizens can identify patterns that machines cannot detect without training, machine learning algorithms like Brut can use the output of citizen science projects as input training sets, offering tremendous opportunities to speed the pace of scientific discovery. A hybrid model of machine learning combined with crowdsourced training data from citizen scientists can not only classify large quantities of data, but also address the weaknesses of each approach if deployed alone.
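
    Brut's core classifier can be sketched with an off-the-shelf Random Forest; the features and labeling rule below are synthetic stand-ins for the real image features and citizen-science classifications:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000
# Synthetic stand-ins for image features (e.g., radial brightness statistics).
features = rng.normal(size=(n, 5))
# Toy labeling rule standing in for the >35,000 volunteer classifications.
labels = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features[:800], labels[:800])           # crowdsourced "training set"
accuracy = clf.score(features[800:], labels[800:])
print(f"held-out accuracy: {accuracy:.2f}")
```

    In the hybrid workflow the abstract describes, the trained forest is then run over the full image catalog, and its confidence scores are used to re-rank and flag likely interlopers.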

  9. Research capacity building integrated into PHIT projects: leveraging research and research funding to build national capacity.

    Science.gov (United States)

    Hedt-Gauthier, Bethany L; Chilengi, Roma; Jackson, Elizabeth; Michel, Cathy; Napua, Manuel; Odhiambo, Jackline; Bawah, Ayaga

    2017-12-21

    Inadequate research capacity impedes the development of evidence-based health programming in sub-Saharan Africa. However, funding for research capacity building (RCB) is often insufficient and restricted, limiting institutions' ability to address current RCB needs. The Doris Duke Charitable Foundation's African Health Initiative (AHI) funded Population Health Implementation and Training (PHIT) partnership projects in five African countries (Ghana, Mozambique, Rwanda, Tanzania and Zambia) to implement health systems strengthening initiatives inclusive of RCB. Using Cooke's framework for RCB, RCB activity leaders from each country reported on RCB priorities, activities, program metrics, ongoing challenges and solutions. These were synthesized by the authorship team, identifying common challenges and lessons learned. For most countries, each of the RCB domains from Cooke's framework was a high priority. In about half of the countries, domain specific activities happened prior to PHIT. During PHIT, specific RCB activities varied across countries. However, all five countries used AHI funding to improve research administrative support and infrastructure, implement research trainings and support mentorship activities and research dissemination. While outcomes data were not systematically collected, countries reported holding 54 research trainings, forming 56 mentor-mentee relationships, training 201 individuals and awarding 22 PhD and Masters-level scholarships. Over the 5 years, 116 manuscripts were developed. Of the 59 manuscripts published in peer-reviewed journals, 29 had national first authors and 18 had national senior authors. Trainees participated in 99 conferences and projects held 37 forums with policy makers to facilitate research translation into policy. 
All five PHIT projects strongly reported an increase in RCB activities and commended the Doris Duke Charitable Foundation for prioritizing RCB, funding RCB at adequate levels and time frames and for allowing

  10. Implications of commodity price risk and operating leverage on petroleum project economic evaluations

    International Nuclear Information System (INIS)

    Salahor, G.; Laughton, D.G.

    1999-01-01

    The modern asset pricing (MAP) method can provide businesses with improved tools for economic analysis, allowing greater precision in analyzing the effects of project structure, time, and uncertainty. This greater precision extends to the analysis of the possibility of active managerial control over decision alternatives in the petroleum business. A methodology is developed as a model that quantifies revenue risk based on the nature of commodity price volatility and the accepted price of risk in the commodity market. A mathematical description is included of a log-normal natural gas price distribution incorporating the annual volatility of the forecast and a measure of the rate at which that volatility decreases in the long run. Given this volatility model, a risk discount factor can be determined and applied to the current expectation of commodity prices at a given time, along with a time discount factor for all parts of the cash flow stream. Cases are used to evaluate a natural gas development project, yielding scenarios for capital vs. operating cost trade-offs, price risk management, production profile, and the effect of a reverting vs. non-reverting price model. In the first application, discounted cash flow (DCF) and MAP evaluations are compared, giving perspective on the various development choices a producer has through third-party service providers. A further example uses the two methods to compare development alternatives that speed up or slow down the production rate and decline profile of a gas field. As in the first example, the DCF discounting is higher than the net discounting in the MAP evaluation, but in this example both methods produce the same project structure decision. The small amount of incremental capital and operating costs needed for the higher production case are
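The contrast the abstract draws between DCF and MAP discounting can be sketched numerically. All figures below are illustrative assumptions, not values from the paper: MAP applies a price-risk discount factor to expected revenues and then discounts at a risk-free rate, whereas conventional DCF applies one flat risk-adjusted rate to net cash flows.

```python
# Sketch: MAP-style valuation (risk-discount prices, then time-discount)
# vs. conventional single-rate DCF. All parameters are illustrative.
import numpy as np

years = np.arange(1, 6)
exp_price = 3.0 * 1.02 ** years        # expected gas price forecast ($/GJ)
volume = 100.0                          # annual production (GJ)
opex = 120.0                            # annual operating cost ($)

rf = 0.05                               # risk-free rate
price_risk_discount = 0.97 ** years     # cumulative price-risk discount

# MAP: risk-adjust revenues only; costs assumed low-risk here.
cf_map = exp_price * volume * price_risk_discount - opex
pv_map = np.sum(cf_map / (1 + rf) ** years)

# DCF: one flat risk-adjusted rate on the whole net cash flow.
dcf_rate = 0.10
cf = exp_price * volume - opex
pv_dcf = np.sum(cf / (1 + dcf_rate) ** years)

print(f"MAP value: {pv_map:.1f}   DCF value: {pv_dcf:.1f}")
```

Because MAP discounts revenues and costs asymmetrically, the two methods can rank high-opex vs. high-capex alternatives differently even when headline values are close, which is the trade-off analysis the abstract refers to.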

  11. Five secrets to leveraging maximum buying power with your media project.

    Science.gov (United States)

    Hirsch, Lonnie

    2010-11-01

    Planning and executing a successful media campaign or project requires knowledge and expert execution of specific techniques and skills, including understanding of the requirements for proper media research and competitive intelligence, effective planning of media schedules, negotiation of best rates with media companies, monitoring the campaign, accurately tracking and evaluating results, and making smart adjustments based on tracking data to maximize the profitability and success of the enterprise. Some of the most important knowledge and techniques are not generally known by most advertisers, particularly small businesses like health care practices. This article reveals these tips that are the most effective and includes information on the use of experts and other professional resources that help increase the likelihood of a successful outcome for a well-planned and executed media campaign. Copyright © 2010 Elsevier Inc. All rights reserved.

  12. MedHySol: Future federator project of massive production of solar hydrogen

    Energy Technology Data Exchange (ETDEWEB)

    Mahmah, Bouziane; Harouadi, Farid; Chader, Samira; Belhamel, Maiouf; M'Raoui, Abdelhamid; Abdeladim, Kamel [CDER, BP 62, Route de l'Observatoire, Bouzareah, Alger (Algeria); Benmoussa, H. [LESEI, Universite de Batna, Batna (Algeria); Cherigui, Adel Nasser [Universite Joseph Fourier Grenoble I, BP 87, Saint-Martin-d'Heres 38400 (France); Etievant, Claude [CETH, Innov'valley Entreprises, 91460 Marcoussis (France)

    2009-06-15

    Mediterranean Hydrogen Solar (MedHySol) is a federator project for the development of massive hydrogen production from solar energy and its exportation, within the framework of a Euro-Maghrebian cooperation project serving industrial and energy needs in the Mediterranean basin. The proposal for this project is included in the Algiers Declaration on Hydrogen from Renewable Origin, which followed the first international workshop on hydrogen held in 2005. Algeria is the privileged site to host the MedHySol platform. The objective of the first step of the project is to realize a technological platform allowing the evaluation of emergent technologies for hydrogen production from solar energy at a significant size (10-100 kW) and to sustain the development of breakthrough energy technologies. The second step of the project is to implement the most effective and least expensive technologies in large pilot projects (1-1000 MW). In this article we present the potential and feasibility of MedHySol, as well as the fundamental elements for scientific and technical supervision of this great project. (author)

  13. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    Science.gov (United States)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. We are supported by NSF DUE 1323419 and a Google Geo

  14. Massive graviton propagation of the deformed Horava-Lifshitz gravity without projectability condition

    International Nuclear Information System (INIS)

    Myung, Yun Soo

    2010-01-01

    We study graviton propagation of scalar, vector, and tensor modes in the deformed Horava-Lifshitz gravity (λR-model) without the projectability condition. The quadratic Lagrangian is invariant under diffeomorphism only for the λ=1 case, which contradicts the fact that λ is irrelevant to a consistent Hamiltonian approach to the λR-model. In this case, as far as scalar propagations are concerned, there is no essential difference between deformed Horava-Lifshitz gravity (λR-model) and general relativity. This implies that there are two degrees of freedom for a massless graviton without the Horava scalar, and five degrees of freedom appear for a massive graviton when introducing Lorentz-violating and Fierz-Pauli mass terms. Finally, it is shown that for λ=1, the vDVZ discontinuity is absent in the massless limit of the Lorentz-violating mass terms by considering external source terms.
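For reference, the Fierz-Pauli mass term mentioned in the abstract has the standard quadratic form (one common normalization is sketched below; coefficient conventions vary between authors):

```latex
% Fierz-Pauli mass term for a graviton perturbation h_{\mu\nu}
% around flat space (one common normalization):
\mathcal{L}_{\mathrm{FP}}
  = -\frac{m^{2}}{4}\left( h_{\mu\nu} h^{\mu\nu} - h^{2} \right),
\qquad h \equiv \eta^{\mu\nu} h_{\mu\nu}
```

It is the relative coefficient of -1 between the $h_{\mu\nu}h^{\mu\nu}$ and $h^{2}$ terms that leaves exactly five propagating polarizations for the massive graviton, matching the degree-of-freedom count quoted in the abstract.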

  15. The Riverscape Analysis Project: Using Remote Sensing to Leverage Salmon Science and Management Applications Around the Pacific Rim

    Science.gov (United States)

    Chilcote, S.; Maumenee, N.; Lucotch, J.; Whited, D.; Bansack, T.; Kimball, J. S.; Stanford, J.

    2009-12-01

    The Salmonid Rivers Observatory Network (SaRON) is an intensive field research project that aims to describe salmon production and diversity in relation to environmental drivers and the physical complexity of riverine shifting habitat mosaics. The Riverscape Analysis Project (RAP) is a spatially explicit remote sensing database that quantifies and ranks different combinations of physical landscape metrics around the Pacific Rim, displaying results through a publicly accessible web-based decision support framework designed to empower regional management and conservation efforts for wild salmon. The objective of our research is to explicitly describe and relate different habitat types and their potential fish production at a variety of scales throughout the range of Pacific salmon, leveraging our field research through available satellite remote sensing and geospatial analysis. We find that rivers exhibit a range of physical, chemical, and biotic conditions consistent with the shifting habitat mosaic (SHM) concept. Landscape physical variables derived from global Landsat imagery and SRTM-DEM information explain 93.2% of observed variability in over 1500 watersheds across the Pacific Rim. We expect that these coarse-scale differences in river typologies are responsible for the fine-scale differences in habitat conditions and juvenile salmon production. Therefore, we ranked rivers using landscape-scale physical variables to prioritize them for management actions based on potential productivity. For example, the Kvichak River of Bristol Bay is highly ranked (8th) based on its physical landscape structure as well as current human impacts, and the Bristol Bay fishery is currently extremely productive. Habitat structure can be used not only to define reference conditions and management targets for how many fish we would expect a river to produce based on its potential habitat capacity, but it also provides new analytical tools to

  16. Leveraging CRT Awareness in Creating Web-Based Projects through Use of Online Collaborative Learning for Pre-Service Teachers

    Science.gov (United States)

    Chuang, Hsueh-Hua

    2016-01-01

    This paper explores the roles played by cloud computing technologies and social media in facilitating a learning community for online group collaborative learning, and particularly explores opportunities and challenges in leveraging culturally responsive teaching (CRT) awareness in educational technology. It describes implementation of a…

  17. Bill project related to the struggle against the proliferation of arms of massive destruction and their vectors; Projet de Loi relatif a la lutte contre la proliferation des armes de destruction massive et de leurs vecteurs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This bill addresses several issues: the fight against the proliferation of weapons of mass destruction (nuclear weapons, nuclear materials, biological weapons, and chemical weapons); the fight against the proliferation of delivery vectors for such weapons; dual-use goods; and the use of these weapons and vectors in acts of terrorism.

  18. The TESIS Project: Revealing Massive Early-Type Galaxies at z > 1

    Science.gov (United States)

    Saracco, P.; Longhetti, M.; Severgnini, P.; Della Ceca, R.; Braito, V.; Bender, R.; Drory, N.; Feulner, G.; Hopp, U.; Mannucci, F.; Maraston, C.

    How and when present-day massive early-type galaxies were built up, and what type of evolution (star formation and/or merging) characterized their growth, still remain open issues. The competing scenarios of galaxy formation predict very different properties for early-type galaxies at z > 1. The "monolithic" collapse scenario predicts that massive spheroids formed at high redshift (z > 2.5-3) and that their comoving density remains constant thereafter; in hierarchical merging scenarios, their comoving density decreases from z = 0 to z ~ 1.5 and they should experience their last burst of star formation at z < 1. These diverging predictions at z > 1 can be probed observationally once a well defined sample of massive early-types at z > 1 is available. We are constructing such a sample through a dedicated near-IR very low resolution (λ/Δλ ≃ 50) spectroscopic survey (TNG EROs Spectroscopic Identification Survey, TESIS, [6]) of a complete sample of 30 bright (K < 18.5) Extremely Red Objects (EROs).

  19. AN APPROACH TO THE QUALITY IMPROVEMENT OF A MASSIVE INVESTMENT PROJECT BY INTEGRATING ICT AND QMS

    Directory of Open Access Journals (Sweden)

    Tamara Gvozdenovic

    2007-12-01

    This work presents an approach to improving the quality of an investment project through a change in the concept of project management. The construction time of an investment project is a complex factor that needs special attention. The PERT method is commonly applied to long-duration investment projects, where the long time horizon introduces significant uncertainty about future conditions. Microsoft Project 2002 and MATLAB's Neural Network Toolbox are the software tools used for solving the problem of investment project management.
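The PERT estimate the abstract relies on reduces to two classical formulas, shown here as a minimal sketch (the example durations are illustrative, not from the paper):

```python
# Classical PERT three-point estimate for an uncertain activity duration.
# a = optimistic, m = most likely, b = pessimistic duration.
def pert_estimate(a, m, b):
    """Return (expected duration, variance) under the PERT beta approximation."""
    te = (a + 4 * m + b) / 6.0        # weighted mean, mode weighted 4x
    var = ((b - a) / 6.0) ** 2        # variance from the optimistic-pessimistic spread
    return te, var

# Hypothetical activity on a long construction schedule (months):
te, var = pert_estimate(a=4, m=6, b=14)
print(te, var)   # te = 7.0 months
```

Summing `te` along the critical path and `var` along the same path gives the schedule mean and variance that make long-horizon uncertainty tractable, which is the role PERT plays in the project-management approach described above.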

  20. Time-dependent density-functional theory in massively parallel computer architectures: the OCTOPUS project.

    Science.gov (United States)

    Andrade, Xavier; Alberdi-Rodriguez, Joseba; Strubbe, David A; Oliveira, Micael J T; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Louie, Steven G; Aspuru-Guzik, Alán; Rubio, Angel; Marques, Miguel A L

    2012-06-13

    Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures.
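The real-space domain-partitioning strategy described above can be illustrated with a toy splitter: a grid of N points divided into contiguous, nearly equal chunks, one per process. This is a hypothetical one-dimensional sketch, not octopus's actual partitioner, which operates on 3-D real-space grids.

```python
# Toy 1-D domain partitioning: split n_points among n_procs processes,
# distributing any remainder one point at a time to the first processes.
def partition(n_points, n_procs):
    """Return a list of (start, stop) half-open index ranges, one per process."""
    base, rem = divmod(n_points, n_procs)
    ranges, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < rem else 0)
        ranges.append((start, start + size))
        start += size
    return ranges

print(partition(10, 3))  # [(0, 4), (4, 7), (7, 10)]
```

In a real grid code each process would also exchange boundary ("ghost") points with its neighbours after every propagation step; balancing chunk sizes as above keeps that communication from being dominated by one overloaded process.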

  1. Time-dependent density-functional theory in massively parallel computer architectures: the octopus project

    Science.gov (United States)

    Andrade, Xavier; Alberdi-Rodriguez, Joseba; Strubbe, David A.; Oliveira, Micael J. T.; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Louie, Steven G.; Aspuru-Guzik, Alán; Rubio, Angel; Marques, Miguel A. L.

    2012-06-01

    Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures.

  2. Time-dependent density-functional theory in massively parallel computer architectures: the octopus project

    International Nuclear Information System (INIS)

    Andrade, Xavier; Aspuru-Guzik, Alán; Alberdi-Rodriguez, Joseba; Rubio, Angel; Strubbe, David A; Louie, Steven G; Oliveira, Micael J T; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Marques, Miguel A L

    2012-01-01

    Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures. (topical review)

  3. The Leverage Ratchet Effect

    OpenAIRE

    Anat R. Admati; Peter M. DeMarzo; Martin F. Hellwig; Paul Pfleiderer

    2013-01-01

    Shareholder-creditor conflicts can create leverage ratchet effects, resulting in inefficient capital structures. Once debt is in place, shareholders may inefficiently increase leverage but avoid reducing it no matter how beneficial leverage reduction might be to total firm value. We present conditions for an irrelevance result under which shareholders view asset sales, pure recapitalization and asset expansion with new equity as equally undesirable. We then analyze how seniority, asset hetero...

  4. Leverage effect in energy futures

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2014-01-01

    Roč. 45, č. 1 (2014), s. 1-9 ISSN 0140-9883 R&D Projects: GA ČR(CZ) GP14-11402P Grant - others:GA ČR(CZ) GAP402/11/0948 Program:GA Institutional support: RVO:67985556 Keywords : energy commodities * leverage effect * volatility * long-term memory Subject RIV: AH - Economics Impact factor: 2.708, year: 2014 http://library.utia.cas.cz/separaty/2014/E/kristoufek-0433531.pdf
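The leverage effect this record studies is, in essence, a correlation between today's return and subsequent volatility. A minimal sketch of how one might measure it, on synthetic data with a built-in negative return-volatility link (real studies such as this one use energy futures prices):

```python
# Sketch: measuring a leverage effect as the correlation between returns
# and next-period volatility. Data are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
vol = np.empty(n)
ret = np.empty(n)
vol[0] = 0.01
ret[0] = vol[0] * rng.standard_normal()
for t in range(1, n):
    # volatility mean-reverts and rises after negative returns
    vol[t] = 0.01 + 0.9 * (vol[t - 1] - 0.01) - 0.05 * min(ret[t - 1], 0.0)
    ret[t] = vol[t] * rng.standard_normal()

# A negative correlation here is the signature of a leverage effect.
lev = np.corrcoef(ret[:-1], vol[1:])[0, 1]
print(f"return -> next-day volatility correlation: {lev:.3f}")
```

For real series, realized volatility would replace the latent `vol`, and long-memory estimators of the kind listed in the record's keywords would be applied to the same return-volatility pairing.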

  5. Application of finance project for leverage of small size hydroelectric enterprising; Aplicacao do project finance para alavancagem de empreendimentos hidreletricos de pequeno porte

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Silvana dos

    2003-07-01

    As in most countries, infrastructure project financing in Brazil under the project finance modality depends on a skillful structure of guarantees and contracts to become viable. In the case of power generation projects, this financial engineering becomes still more complicated: owing to particularities of the Brazilian electricity sector, the guarantee arrangements requested by creditors reach high levels of complexity and stringency, and the contractual instruments that support project finance, originally designed for developed countries, require substantial adaptation to these particularities. Brazil's development is directly related to its capacity to expand the supply of electric energy in step with national needs. In this context, small hydroelectric plants (PCHs) represent an efficient and fast way to complement the energy supply and meet the growing demand of the national market. Given their characteristics, such undertakings can be developed by small entrepreneurs, among them the owners of the sites holding the hydraulic potential, who nevertheless lack the capital to fund them in full. These projects normally have a low global cost, on the order of US$ 1,000.00/kW, and a smaller environmental impact relative to the return they provide to the developer and to the Brazilian electric system as a whole; they therefore deserve special attention in sector planning and a series of incentives to make the business still more attractive. Considering the difficulty small enterprises face in financing small-scale power generation undertakings through conventional mechanisms, this work proposes a methodology grounded in the concepts of the project finance modality. (author)

  7. Risks of Leveraged Products

    NARCIS (Netherlands)

    A. Di Cesare (Antonio)

    2012-01-01

    Leveraged investments have become a fundamental feature of modern economies. The new financial products allow people to take greater-than-usual exposures to risk factors. This thesis analyzes several different aspects of the risks involved by some frequently used leveraged products:

  8. Massive Niagara Falls power generation project uses unique concrete locking system

    Energy Technology Data Exchange (ETDEWEB)

    Polski, A. [Con Cast Pipe, Niagara Falls, ON (Canada)

    2006-09-15

    A 512 metre long accelerating wall and a 360 metre-long approach wall in the Niagara River are being built using a novel locking system to withstand the forces of nature. The walls have been designed to direct continuous flow to a new diversion tunnel below the City of Niagara Falls, Ontario. The walls are made of a single row of pre-cast concrete boxes that lock together in a special configuration to prevent movement from extreme load combinations in the Niagara River. The system was designed as part of a larger project to increase the power generating capabilities of the Sir Adam Beck 2 power generation station. Water channelled into the new tunnel will provide an estimated additional 1.6 terawatt-hours of renewable electricity annually and expand capacity at the station by about 15 per cent. The pre-cast reinforced concrete box design was chosen for the walls as it allowed fast and simple assembly of the structures. The basic structural system for each box is 4 vertical panels that form an open rectangular wall. The boxes are filled with clean rock fragments that are uniformly graded. Once the boxes are installed, cast-in-place concrete slabs will be poured to a depth of approximately 600 mm on top of the wall to cap the entire structure. The value of the design-build contract for the Niagara project is nearly $600 million out of an estimated $985 million budget. Commonly used for the design of culverts, the concrete box technology holds promise for applications including the stabilization of shorelines and the construction of small dams. 3 figs.

  9. Massive Gravity

    OpenAIRE

    de Rham, Claudia

    2014-01-01

    We review recent progress in massive gravity. We start by showing how different theories of massive gravity emerge from a higher-dimensional theory of general relativity, leading to the Dvali–Gabadadze–Porrati model (DGP), cascading gravity, and ghost-free massive gravity. We then explore their theoretical and phenomenological consistency, proving the absence of Boulware–Deser ghosts and reviewing the Vainshtein mechanism and the cosmological solutions in these models. Finally, we present alt...

  10. Leveraging the geospatial advantage

    Science.gov (United States)

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  11. Massive branes

    International Nuclear Information System (INIS)

    Bergshoeff, E.; Ortin, T.

    1998-01-01

    We investigate the effective world-volume theories of branes in a background given by (the bosonic sector of) 10-dimensional massive IIA supergravity ("massive branes") and their M-theoretic origin. In the case of the solitonic 5-brane of type IIA superstring theory the construction of the Wess-Zumino term in the world-volume action requires a dualization of the massive Neveu-Schwarz/Neveu-Schwarz target space 2-form field. We find that, in general, the effective world-volume theory of massive branes contains new world-volume fields that are absent in the massless case, i.e. when the mass parameter m of massive IIA supergravity is set to zero. We show how these new world-volume fields can be introduced in a systematic way. (orig.)

  12. Three-dimensional gyrokinetic particle-in-cell simulation of plasmas on a massively parallel computer: Final report on LDRD Core Competency Project, FY 1991--FY 1993

    International Nuclear Information System (INIS)

    Byers, J.A.; Williams, T.J.; Cohen, B.I.; Dimits, A.M.

    1994-01-01

    One of the programs of the Magnetic Fusion Energy (MFE) Theory and Computations Program is studying the anomalous transport of thermal energy across the field lines in the core of a tokamak. We use the method of gyrokinetic particle-in-cell simulation in this study. For this LDRD project we employed massively parallel processing, new algorithms, and new formal techniques to improve this research. Specifically, we sought to take steps toward: researching experimentally relevant parameters in our simulations, learning parallel computing to have as a resource for our group, and achieving a 100x speedup over the performance of our starting-point Cray-2 simulation code.

  13. Stochastic volatility and stochastic leverage

    DEFF Research Database (Denmark)

    Veraart, Almut; Veraart, Luitgard A. M.

This paper proposes the new concept of stochastic leverage in stochastic volatility models. Stochastic leverage refers to a stochastic process which replaces the classical constant correlation parameter between the asset return and the stochastic volatility process. We provide a systematic treatment of stochastic leverage and propose to model the stochastic leverage effect explicitly, e.g. by means of a linear transformation of a Jacobi process. Such models are both analytically tractable and allow for a direct economic interpretation. In particular, we propose two new stochastic volatility models which allow for a stochastic leverage effect: the generalised Heston model and the generalised Barndorff-Nielsen & Shephard model. We investigate the impact of a stochastic leverage effect in the risk neutral world by focusing on implied volatilities generated by option prices derived from our new...
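    As a rough illustration of the idea in this abstract, the sketch below simulates a Heston-type model in which the constant return/volatility correlation is replaced by a stochastic process rho_t kept inside (-1, 1) by Jacobi-style dynamics. All parameter values, and the function name itself, are illustrative assumptions, not taken from the paper.

```python
import math
import random

def simulate_stochastic_leverage(n_steps=10_000, dt=1.0 / 2520.0,
                                 kappa_v=5.0, theta_v=0.04, xi=0.5,
                                 kappa_r=2.0, rho_bar=-0.5, sigma_r=0.3,
                                 seed=42):
    """Euler scheme for a Heston-type model whose leverage parameter
    rho_t is itself a Jacobi-style diffusion on (-1, 1)."""
    rng = random.Random(seed)
    v, rho = theta_v, rho_bar
    returns, rhos = [], []
    sqdt = math.sqrt(dt)
    for _ in range(n_steps):
        z1, z2, z3 = (rng.gauss(0.0, 1.0) for _ in range(3))
        # Brownian increments: dW_s drives returns; dW_v is correlated
        # with dW_s through the *current* value of rho_t.
        dw_s = z1 * sqdt
        dw_v = (rho * z1 + math.sqrt(max(1.0 - rho * rho, 0.0)) * z2) * sqdt
        # Jacobi-style dynamics: the sqrt(1 - rho^2) diffusion term
        # vanishes at the boundaries, keeping rho_t inside (-1, 1).
        rho += kappa_r * (rho_bar - rho) * dt \
               + sigma_r * math.sqrt(max(1.0 - rho * rho, 0.0)) * z3 * sqdt
        rho = max(min(rho, 0.999), -0.999)        # guard the discretization
        v += kappa_v * (theta_v - v) * dt + xi * math.sqrt(max(v, 0.0)) * dw_v
        v = max(v, 1e-8)                          # keep variance positive
        returns.append(math.sqrt(v) * dw_s)
        rhos.append(rho)
    return returns, rhos

returns, rhos = simulate_stochastic_leverage()
print(min(rhos), max(rhos))   # the leverage process stays inside (-1, 1)
```

    Euler discretization is the simplest possible choice here; the clamps on rho and v are a crude way to keep the discretized paths inside their admissible domains.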

  14. The BAHAMAS project: the CMB-large-scale structure tension and the roles of massive neutrinos and galaxy formation

    Science.gov (United States)

    McCarthy, Ian G.; Bird, Simeon; Schaye, Joop; Harnois-Deraps, Joachim; Font, Andreea S.; van Waerbeke, Ludovic

    2018-05-01

Recent studies have presented evidence for tension between the constraints on Ωm and σ8 from the cosmic microwave background (CMB) and measurements of large-scale structure (LSS). This tension can potentially be resolved by appealing to extensions of the standard model of cosmology and/or untreated systematic errors in the modelling of LSS, of which baryonic physics has been frequently suggested. We revisit this tension using, for the first time, carefully calibrated cosmological hydrodynamical simulations, which thus capture the backreaction of the baryons on the total matter distribution. We have extended the BAryons and HAloes of MAssive Systems simulations to include a treatment of massive neutrinos, which currently represents the best-motivated extension to the standard model. We make synthetic thermal Sunyaev-Zel'dovich effect, weak galaxy lensing, and CMB lensing maps and compare to observed auto- and cross-power spectra from a wide range of recent observational surveys. We conclude that: (i) in general, there is tension between the primary CMB and LSS when adopting the standard model with minimal neutrino mass; (ii) after calibrating feedback processes to match the gas fractions of clusters, the remaining uncertainties in the baryonic physics modelling are insufficient to reconcile this tension; and (iii) if one accounts for internal tensions in the Planck CMB data set (by allowing the lensing amplitude, ALens, to vary), invoking a non-minimal neutrino mass, typically of 0.2-0.4 eV, can resolve the tension. This solution is fully consistent with separate constraints from the primary CMB and baryon acoustic oscillations.

  15. Leverage Aversion and Risk Parity

    DEFF Research Database (Denmark)

    Asness, Clifford; Frazzini, Andrea; Heje Pedersen, Lasse

    2012-01-01

The authors show that leverage aversion changes the predictions of modern portfolio theory: Safer assets must offer higher risk-adjusted returns than riskier assets. Consuming the high risk-adjusted returns of safer assets requires leverage, creating an opportunity for investors with the ability to apply leverage. Risk parity portfolios exploit this opportunity by equalizing the risk allocation across asset classes, thus overweighting safer assets relative to their weight in the market portfolio.
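    A minimal numerical sketch of the risk-parity construction described above: weight each asset class by the inverse of its volatility so that, ignoring correlations, each contributes equal risk. The volatility figures are invented for illustration; the paper's construction is more general.

```python
def inverse_volatility_weights(vols):
    """Naive risk-parity weights: proportional to 1/volatility, so each
    asset contributes equal risk when correlations are ignored."""
    inv = [1.0 / v for v in vols]
    total = sum(inv)
    return [x / total for x in inv]

# Illustrative annualized volatilities: equities, bonds, commodities.
vols = [0.15, 0.05, 0.20]
w = inverse_volatility_weights(vols)
print([round(x, 3) for x in w])   # → [0.211, 0.632, 0.158]
```

    The resulting portfolio overweights the safer asset (bonds) relative to a capitalization-weighted mix; levering the whole portfolio up to a target volatility is then the step that consumes the leverage capacity discussed in the abstract.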

  16. Report on behalf of the Foreign Affairs, Defense and Armed Forces Commission on the bill, adopted by the National Assembly, on the fight against the proliferation of weapons of mass destruction and their means of delivery

    International Nuclear Information System (INIS)

    2011-01-01

This report recalls the origins of the bill, which implements UN Security Council Resolution 1540, whose aim was to promote the setting up of efficient tools to fight proliferation. The bill aims at updating and reinforcing the existing legal arsenal. The report also contains remarks made by the Commission. The bill addresses several issues: the fight against the proliferation of weapons of mass destruction (nuclear weapons, nuclear materials, biological weapons, and chemical weapons), the fight against the proliferation of their means of delivery, dual-use goods, and the use of these weapons and delivery systems in acts of terrorism

  17. Leveraging organisational cultural capital

    Directory of Open Access Journals (Sweden)

    R Scheel

    2007-10-01

Full Text Available Organisational culture discourse mandates a linear approach of diagnosis, measurement and gap analysis as standard practice in relation to most culture change initiatives. Therefore, a problem solving framework geared toward “fixing” and/or realigning an organisation’s culture is usually prescribed. The traditional problem solving model seeks to identify gaps between current and desired organisational cultural states, inhibiting the discovery of an organisation’s unique values and strengths, namely its cultural capital. In pursuit of discovering and leveraging organisational cultural capital, a descriptive case study is used to show how an Appreciative Inquiry process can rejuvenate the spirit of an organisation as a system-wide inquiry mobilises a workforce toward a shared vision.

  18. A novel approach to imaging extinct seafloor massive sulphides (eSMS) by using ocean bottom seismometer data from the Blue Mining project

    Science.gov (United States)

    Gil, A.; Chidlow, K. L.; Vardy, M. E.; Bialas, J.; Schroeder, H.; Stobbs, I. J.; Gehrmann, R. A. S.; North, L. J.; Minshull, T. A.; Petersen, S.; Murton, B. J.

    2017-12-01

Seafloor massive sulphide (SMS) deposits have generated great interest regarding their formation and composition since their discovery in 1977. SMS deposits form through hydrothermal circulation and are therefore commonly found near hydrothermal vent sites. The high base (Cu, Zn) and precious metal (Au, Ag) content has interested mining companies, due to their potentially high economic value. Currently, the possibility of mining extinct seafloor massive sulphide (eSMS) deposits has opened a debate about their environmentally and economically sustainable exploitation. A major goal is the rapid exploration and assessment of deposit structure and volume. This is challenging due to their small dimensions (100s m diameter) and typically great water depths (> 3000 mbsl). Here we present a novel approach that combines seismic reflection/refraction forward modelling with data acquired from the TAG hydrothermal field (26ºN, Mid-Atlantic Ridge, 3500 mbsl) to image deep-water eSMS deposits. In May 2016, the RV METEOR shot 30 short seismic profiles ... as part of the 'Blue Mining' project, n° 604500.

  19. International Severe Weather and Flash Flood Hazard Early Warning Systems—Leveraging Coordination, Cooperation, and Partnerships through a Hydrometeorological Project in Southern Africa

    Directory of Open Access Journals (Sweden)

    Robert Jubach

    2016-06-01

    Full Text Available Climate, weather and water hazards do not recognize national boundaries. Transboundary/regional programs and cooperation are essential to reduce the loss of lives and damage to livelihoods when facing these hazards. The development and implementation of systems to provide early warnings for severe weather events such as cyclones and flash floods requires data and information sharing in real time, and coordination among the government agencies at all levels. Within a country, this includes local, municipal, provincial-to-national levels as well as regional and international entities involved in hydrometeorological services and Disaster Risk Reduction (DRR. Of key importance are the National Meteorological and Hydrologic Services (NMHSs. The NMHS is generally the authority solely responsible for issuing warnings for these hazards. However, in many regions of the world, the linkages and interfaces between the NMHS and other agencies are weak or non-existent. Therefore, there is a critical need to assess, strengthen, and formalize collaborations when addressing the concept of reducing risk and impacts from severe weather and floods. The U.S. Agency for International Development/Office of U.S. Foreign Disaster Assistance; the United Nations World Meteorological Organization (WMO; the WMO Southern Africa Regional Specialized Meteorological Center, hosted by the South African Weather Service; the U.S. National Oceanic and Atmospheric Administration/National Weather Service and the Hydrologic Research Center (a non-profit corporation are currently implementing a project working with Southern Africa NMHSs on addressing this gap. The project aims to strengthen coordination and collaboration mechanisms from national to local levels. The project partners are working with the NMHSs to apply and implement appropriate tools and infrastructure to enhance currently operational severe weather and flash flood early warning systems in each country in support of

  20. Magnetohydrodynamics: Parallel computation of the dynamics of thermonuclear and astrophysical plasmas. 1. Annual report of massively parallel computing pilot project 93MPR05

    International Nuclear Information System (INIS)

    1994-08-01

This is the first annual report of the MPP pilot project 93MPR05. In this pilot project four research groups with different, complementary backgrounds collaborate with the aim to develop new algorithms and codes to simulate the magnetohydrodynamics of thermonuclear and astrophysical plasmas on massively parallel machines. The expected speed-up is required to simulate the dynamics of the hot plasmas of interest, which are characterized by very large magnetic Reynolds numbers and, hence, require high spatial and temporal resolutions (for details see section 1). The four research groups that collaborated to produce the results reported here are: the MHD group of Prof. Dr. J.P. Goedbloed at the FOM-Institute for Plasma Physics 'Rijnhuizen' in Nieuwegein, the group of Prof. Dr. H. van der Vorst at the Mathematics Institute of Utrecht University, the group of Prof. Dr. A.G. Hearn at the Astronomical Institute of Utrecht University, and the group of Dr. Ir. H.J.J. te Riele at the CWI in Amsterdam. The full project team met frequently during this first project year to discuss progress reports, current problems, etc. (see section 2). The main results of the first project year are: proof of the scalability of typical linear and nonlinear MHD codes; development and testing of a parallel version of the Arnoldi algorithm; development and testing of alternative methods for solving large non-Hermitian eigenvalue problems; and porting of the 3D nonlinear semi-implicit time evolution code HERA to an MPP system. The steps that were scheduled to reach these intended results are given in section 3. (orig./WL)

  2. Integrating Distributed Interactive Simulations With the Project Darkstar Open-Source Massively Multiplayer Online Game (MMOG) Middleware

    Science.gov (United States)

    2009-09-01

... complete MMOG solutions such as Multiverse are not within the scope of this thesis, though it is recommended that readers compare this type of software to the middleware described here (Multiverse, 2009). 1. University of Munster: Real-Time Framework. The Real-Time Framework (RTF) project is ...

  3. Leverage, Investment, and Firm Growth

    OpenAIRE

    Larry Lang; Eli Ofek; Rene M. Stulz

    1995-01-01

    We show that there is a negative relation between leverage and future growth at the firm level and, for diversified firms, at the segment level. Further, this negative relation between leverage and growth holds for firms with low Tobin's q, but not for high-q firms or firms in high-q industries. Therefore, leverage does not reduce growth for firms known to have good investment opportunities, but is negatively related to growth for firms whose growth opportunities are either not recognized by ...

  4. Opportunities and challenges for the integration of massively parallel genomic sequencing into clinical practice: lessons from the ClinSeq project.

    Science.gov (United States)

    Biesecker, Leslie G

    2012-04-01

The debate surrounding the return of results from high-throughput genomic interrogation encompasses many important issues including ethics, law, economics, and social policy. The debate is also informed by the molecular, genetic, and clinical foundations of the emerging field of clinical genomics, which is based on this new technology. This article outlines the main biomedical considerations of sequencing technologies and describes some of the early clinical experiences with the technology, to keep the debate focused on real-world practicalities. These experiences are based on early data from the ClinSeq project, a pilot of massively parallel sequencing in a clinical research context with a major aim of developing modes of returning results to individual subjects. The study has enrolled >900 subjects and generated exome sequence data on 572 subjects. These data are beginning to be interpreted and returned to the subjects, which provides examples of the potential usefulness and pitfalls of clinical genomics. There are numerous genetic results that can be readily derived from a genome, including rare, high-penetrance traits and carrier states. However, much work needs to be done to develop the tools and resources for genomic interpretation. The main lesson learned is that a genome sequence may be better considered as a health-care resource, rather than a test: one that can be interpreted and used over the lifetime of the patient.

  5. Stochastic volatility and leverage effect

    OpenAIRE

    Josep Perello; Jaume Masoliver

    2002-01-01

We prove that Brownian market models with random diffusion coefficients provide an exact measure of the leverage effect [J-P. Bouchaud et al., Phys. Rev. Lett. 87, 228701 (2001)]. This empirical fact asserts that past returns are anticorrelated with the future diffusion coefficient. Several models with random diffusion have been suggested but without a quantitative study of the leverage effect. Our analysis allows us to fully estimate all parameters involved and permits a deeper study of correlated ...
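    The leverage effect cited here (Bouchaud et al., 2001) is usually measured by correlating today's return with future squared returns. A toy estimator, run on a synthetic series with a built-in leverage effect, might look as follows; the dynamics and all parameters are invented for the demonstration, not the models studied in the paper.

```python
import random

def leverage_correlation(returns, tau):
    """Empirical leverage function in the spirit of Bouchaud et al. (2001):
    mean product of today's return with the squared return tau steps ahead,
    normalized by the squared variance."""
    n = len(returns) - tau
    num = sum(returns[t] * returns[t + tau] ** 2 for t in range(n)) / n
    var = sum(r * r for r in returns) / len(returns)
    return num / (var * var)

# Synthetic series with a built-in leverage effect: a loss today raises
# tomorrow's volatility (a crude stand-in for a random diffusion model).
rng = random.Random(0)
returns, vol = [], 0.01
for _ in range(50_000):
    r = vol * rng.gauss(0.0, 1.0)
    returns.append(r)
    vol = 0.01 - 0.5 * min(r, 0.0)   # only past losses feed volatility

print(leverage_correlation(returns, 1))   # negative: the leverage effect
```

    A negative value at positive lag is exactly the anticorrelation between past returns and future diffusion that the abstract describes.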

  6. From up to date climate and ocean evidence with updated UN emissions projections, the time is now to recommend an immediate massive effort on CO2.

    Science.gov (United States)

    Carter, Peter

    2017-04-01

This paper provides further compelling evidence for 'an immediate, massive effort to control CO2 emissions, stopped by mid-century' (Cai, Lenton & Lontzek, 2016). Atmospheric CO2, above 405 ppm (actual and trend) and still accelerating despite flat emissions since 2014, with an unprecedented >3 ppm spike in 2015 (A. Glikson), is on the worst-case IPCC scenario. Atmospheric methane is increasing faster than its past 20-year rate, almost on the worst-case IPCC AR5 scenario (Global Carbon Project, 2016). Observed effects of atmospheric greenhouse gas (GHG) pollution are increasing faster. This includes long-lived atmospheric GHG concentrations, radiative forcing, surface average warming, Greenland ice sheet melting, the Arctic daily sea ice anomaly, ocean heat (and the rate at which it penetrates deeper), ocean acidification, and ocean de-oxygenation. The atmospheric GHG concentration of 485 ppm CO2 eq (WMO, 2015) commits us to 'about 2°C' equilibrium (AR5). 2°C by 2100 would require 'substantial emissions reductions over the next few decades' (AR5). Instead, the May 2016 UN update on 'intended' national emissions targets under the Paris Agreement projects that global emissions will be 16% higher by 2030, and the November 2016 International Energy Agency update projects that energy-related CO2 eq emissions will be 30% higher by 2030, leading to 'around 2.7°C by 2100 and above 3°C thereafter'. Climate change feedback will be positive this century and multiple large vulnerable sources of amplifying feedback exist (AR5). 'Extensive tree mortality and widespread forest die-back linked to drought and temperature stress have been documented on all vegetated continents' (AR5). 'Recent studies suggest a weakening of the land sink, further amplifying atmospheric growth of CO2' (WMO, 2016). Under all but the best-case IPCC AR5 scenario, surface temperature is projected to increase above 2°C by 2100, which is above 3°C (equilibrium) after 2100, with ocean acidification still increasing at

  7. NMSBA Leveraged Project Interim Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Sergei A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-05

We have investigated the quality of copper particles available on the market today and determined that they are completely unsuitable for copper inks, since all of them were significantly oxidized: each batch contained only 50% metallic copper, the rest being Cu2O and CuO. To date, we have fully identified the challenges and developed the synthesis for large amounts of the copper ink precursor, namely copper(I) mesityl. Tens of grams of the precursor have been obtained so far. From this precursor, four small batches of copper nanoparticles (50 mg each) have been synthesized to investigate the possibility of decreasing particle sensitivity to oxygen. These particles have been treated with different surface-stabilizing agents (namely octylamine, oleylamine, pyridine, benzotriazole, and dodecanethiol) in order to investigate their influence on particle oxygen sensitivity. A batch of copper nanoparticles in the amount of 2 grams has also been synthesized in order to start preparation of test ink batches with different solvents, surfactants, and stabilizers.

  8. Leverage points for sustainability transformation.

    Science.gov (United States)

    Abson, David J; Fischer, Joern; Leventon, Julia; Newig, Jens; Schomerus, Thomas; Vilsmaier, Ulli; von Wehrden, Henrik; Abernethy, Paivi; Ives, Christopher D; Jager, Nicolas W; Lang, Daniel J

    2017-02-01

    Despite substantial focus on sustainability issues in both science and politics, humanity remains on largely unsustainable development trajectories. Partly, this is due to the failure of sustainability science to engage with the root causes of unsustainability. Drawing on ideas by Donella Meadows, we argue that many sustainability interventions target highly tangible, but essentially weak, leverage points (i.e. using interventions that are easy, but have limited potential for transformational change). Thus, there is an urgent need to focus on less obvious but potentially far more powerful areas of intervention. We propose a research agenda inspired by systems thinking that focuses on transformational 'sustainability interventions', centred on three realms of leverage: reconnecting people to nature, restructuring institutions and rethinking how knowledge is created and used in pursuit of sustainability. The notion of leverage points has the potential to act as a boundary object for genuinely transformational sustainability science.

  10. Leveraging FPGAs for Accelerating Short Read Alignment.

    Science.gov (United States)

    Arram, James; Kaplan, Thomas; Luk, Wayne; Jiang, Peiyong

    2017-01-01

    One of the key challenges facing genomics today is how to efficiently analyze the massive amounts of data produced by next-generation sequencing platforms. With general-purpose computing systems struggling to address this challenge, specialized processors such as the Field-Programmable Gate Array (FPGA) are receiving growing interest. The means by which to leverage this technology for accelerating genomic data analysis is however largely unexplored. In this paper, we present a runtime reconfigurable architecture for accelerating short read alignment using FPGAs. This architecture exploits the reconfigurability of FPGAs to allow the development of fast yet flexible alignment designs. We apply this architecture to develop an alignment design which supports exact and approximate alignment with up to two mismatches. Our design is based on the FM-index, with optimizations to improve the alignment performance. In particular, the n-step FM-index, index oversampling, a seed-and-compare stage, and bi-directional backtracking are included. Our design is implemented and evaluated on a 1U Maxeler MPC-X2000 dataflow node with eight Altera Stratix-V FPGAs. Measurements show that our design is 28 times faster than Bowtie2 running with 16 threads on dual Intel Xeon E5-2640 CPUs, and nine times faster than Soap3-dp running on an NVIDIA Tesla C2070 GPU.
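    The FM-index backward search at the heart of this design can be sketched in a few lines of software. The toy implementation below builds the BWT and occurrence tables naively and performs exact matching only; the paper's hardware design adds the n-step index, index oversampling, seeding, and backtracking described above.

```python
def bwt_index(text):
    """Build a toy FM-index (suffix array, first-column counts C, and
    occurrence table) by brute force. Real aligners sample these
    structures instead of storing them fully."""
    text += "$"
    sa = sorted(range(len(text)), key=lambda i: text[i:])
    bwt = "".join(text[i - 1] for i in sa)
    chars = sorted(set(text))
    C, total = {}, 0                 # C[c]: #characters in text < c
    for c in chars:
        C[c] = total
        total += text.count(c)
    occ = {c: [0] * (len(bwt) + 1) for c in chars}
    for i, b in enumerate(bwt):      # occ[c][i]: #occurrences of c in bwt[:i]
        for c in chars:
            occ[c][i + 1] = occ[c][i] + (1 if b == c else 0)
    return sa, C, occ

def backward_search(pattern, sa, C, occ):
    """Exact matching: shrink the suffix-array interval one pattern
    character at a time, right to left (LF-mapping)."""
    lo, hi = 0, len(sa)
    for c in reversed(pattern):
        if c not in C:
            return []
        lo = C[c] + occ[c][lo]
        hi = C[c] + occ[c][hi]
        if lo >= hi:
            return []
    return sorted(sa[i] for i in range(lo, hi))

sa, C, occ = bwt_index("GATTACAGATTACA")
print(backward_search("GATTA", sa, C, occ))   # → [0, 7]
```

    Each pattern character costs one interval update, which is why the search maps so naturally onto a hardware pipeline.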

  11. Leverage, monetary policy, and firm investment

    OpenAIRE

    Charles X. Hu

    1999-01-01

In this paper, I investigate whether the effects of monetary policy on firm investment can be transmitted through leverage. I find that monetary contractions reduce the growth of investment more for highly leveraged firms than for less leveraged firms. The results suggest that the broad credit channel for monetary policy exists, and that it can operate through leverage, as adverse monetary shocks aggravate real debt burdens and raise the effective costs of investment.

  12. Leveraging Facebook to Brand Radiology.

    Science.gov (United States)

    Tso, Hilda H; Parikh, Jay R

    2018-03-30

    In the current health care climate, radiologists should consider developing their brand. Facebook is the market leader for social media networking in the United States. The authors describe how radiologists can leverage Facebook to develop and market organizational, group, and individual brands. The authors then address concerns related to the use of social media by radiologists. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  13. New massive gravity

    NARCIS (Netherlands)

    Bergshoeff, Eric A.; Hohm, Olaf; Townsend, Paul K.

    2012-01-01

    We present a brief review of New Massive Gravity, which is a unitary theory of massive gravitons in three dimensions obtained by considering a particular combination of the Einstein-Hilbert and curvature squared terms.

  14. On the irrelevance of the leverage effect

    OpenAIRE

    Nippel, Peter

    2001-01-01

Financial leverage increases the expected return on equity. We show that this leverage effect is irrelevant not only for shareholders' present wealth but also for the return on their investments. This result is straightforward if we look not only at the return on equity but at the return on shareholders' total wealth. Any relevance leverage may have is definitely due to market imperfections. These may simply cause differences in market access for firms and individuals or lead to agency pro...

  15. Implementation of Strategies to Leverage Public and Private Resources for National Security Workforce Development

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-04-01

    This report documents implementation strategies to leverage public and private resources for the development of an adequate national security workforce as part of the National Security Preparedness Project (NSPP), being performed under a U.S. Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. There are numerous efforts across the United States to develop a properly skilled and trained national security workforce. Some of these efforts are the result of the leveraging of public and private dollars. As budget dollars decrease and the demand for a properly skilled and trained national security workforce increases, it will become even more important to leverage every education and training dollar. This report details some of the efforts that have been implemented to leverage public and private resources, as well as implementation strategies to further leverage public and private resources.

  16. Leveraging Digital Innovation in Healthcare

    DEFF Research Database (Denmark)

    Brown, Carol V.; Jensen, Tina Blegind; Aanestad, Margun

    2014-01-01

Harnessing digital innovations for healthcare delivery has raised high expectations as well as major concerns. Several countries across the globe have made progress in achieving three common goals of lower costs, higher quality, and increased patient access to healthcare services through investments in digital infrastructures. New technologies are leveraged to achieve widespread 24x7 disease management, patients' wellbeing, home-based healthcare and other patient-centric service innovations. Yet, digital innovations in healthcare face barriers in terms of standardization, data privacy ... landscapes in selected countries. Then panelists with expertise in digital data streams, cloud, and mobile computing will present concrete examples of healthcare service innovations that have the potential to address one or more of the global goals. ECIS attendees are invited to join a debate about...

  17. Leveraging Twitter to gauge evacuation compliance: Spatiotemporal analysis of Hurricane Matthew.

    Science.gov (United States)

    Martín, Yago; Li, Zhenlong; Cutter, Susan L

    2017-01-01

Hurricane Matthew was the deadliest Atlantic storm since Katrina in 2005 and prompted one of the largest recent hurricane evacuations along the Southeastern coast of the United States. The storm and its projected landfall triggered a massive social media reaction. Using Twitter data, this paper examines the spatiotemporal variability in social media response and develops a novel approach to leverage geotagged tweets to assess the evacuation responses of residents. The approach involves the retrieval of tweets from the Twitter Stream, the creation and filtering of different datasets, and the statistical and spatial processing and treatment to extract, plot and map the results. As expected, peak Twitter response was reached during the pre-impact and preparedness phase, and decreased abruptly after the passage of the storm. A comparison between two time periods, pre-evacuation (October 2nd-4th) and post-evacuation (October 7th-9th), indicates that 54% of Twitter users moved away from the coast to a safer location, with observed differences by state on the timing of the evacuation. A specific sub-state analysis of South Carolina illustrated overall compliance with evacuation orders and detailed information on the timing of departure from the coast as well as the destination location. These findings advance the use of big data and citizen-as-sensor approaches for public safety issues, providing an effective and near real-time alternative for measuring compliance with evacuation orders.
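    The user-level classification described above reduces, in its simplest form, to comparing a user's distances from the coast before and after the evacuation orders. The sketch below is a simplified stand-in for the paper's spatiotemporal treatment; the coastal reference point, threshold, and coordinates are illustrative assumptions.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2.0 * 6371.0 * math.asin(math.sqrt(a))

def evacuated(pre_points, post_points, coast, threshold_km=10.0):
    """Flag a user as evacuated if the mean distance of their geotagged
    tweets from a coastal reference point grew by more than threshold_km
    between the pre- and post-evacuation windows."""
    def mean_dist(points):
        return sum(haversine_km(lat, lon, *coast) for lat, lon in points) / len(points)
    return mean_dist(post_points) - mean_dist(pre_points) > threshold_km

coast = (32.78, -79.93)                      # Charleston, SC (illustrative)
pre = [(32.80, -79.95), (32.79, -79.94)]     # near-coast tweets, Oct 2-4
post = [(34.00, -81.03), (33.99, -81.05)]    # inland tweets (Columbia), Oct 7-9
print(evacuated(pre, post, coast))           # → True
```

    Aggregating this per-user flag over all geotagged users in the Stream is what yields compliance figures like the 54% reported above.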

  19. Determinants of Leverage and Agency problems

    NARCIS (Netherlands)

    de Jong, A.; Dijk, R.

    1998-01-01

    In this paper we empirically investigate the determinants of leverage and agency problems and we examine the relationships between leverage and agency problems. As in Titman and Wessels (1988) we use structural equations modeling with latent variables. In contrast to Titman and Wessels (1988), who

  20. 17 CFR 31.6 - Registration of leverage commodities.

    Science.gov (United States)

    2010-04-01

... taking delivery to buy or sell the leverage commodity; (2) Explain the effect of such changes upon the... 17 CFR Commodity and Securities Exchanges, LEVERAGE TRANSACTIONS § 31.6 Registration of leverage commodities. (a) Registration of leverage commodities...

  1. Strategy as stretch and leverage.

    Science.gov (United States)

    Hamel, G; Prahalad, C K

    1993-01-01

Global competition is not just product versus product or company versus company. It is mind-set versus mind-set. Driven to understand the dynamics of competition, we have learned a lot about what makes one company more successful than another. But to find the root of competitiveness--to understand why some companies create new forms of competitive advantage while others watch and follow--we must look at strategic mind-sets. For many managers, "being strategic" means pursuing opportunities that fit the company's resources. This approach is not wrong, Gary Hamel and C.K. Prahalad contend, but it obscures an approach in which "stretch" supplements fit and being strategic means creating a chasm between ambition and resources. Toyota, CNN, British Airways, Sony, and others all displaced competitors with stronger reputations and deeper pockets. Their secret? In each case, the winner had greater ambition than its well-endowed rivals. Winners also find less resource-intensive ways of achieving their ambitious goals. This is where leverage complements the strategic allocation of resources. Managers at competitive companies can get a bigger bang for their buck in five basic ways: by concentrating resources around strategic goals; by accumulating resources more efficiently; by complementing one kind of resource with another; by conserving resources whenever they can; and by recovering resources from the marketplace as quickly as possible. As recent competitive battles have demonstrated, abundant resources can't guarantee continued industry leadership. (ABSTRACT TRUNCATED AT 250 WORDS)

  2. AMIDST: Analysis of MassIve Data STreams

    DEFF Research Database (Denmark)

    Masegosa, Andres; Martinez, Ana Maria; Borchani, Hanen

    2015-01-01

The Analysis of MassIve Data STreams (AMIDST) Java toolbox provides a collection of scalable and parallel algorithms for inference and learning of hybrid Bayesian networks from data streams. The toolbox, available at http://amidst.github.io/toolbox/ under the Apache Software License version 2.0, also efficiently leverages existing functionalities and algorithms by interfacing to software tools such as HUGIN and MOA.

  3. Massive Conformal Gravity

    International Nuclear Information System (INIS)

    Faria, F. F.

    2014-01-01

    We construct a massive theory of gravity that is invariant under conformal transformations. The massive action of the theory depends on the metric tensor and a scalar field, which are considered the only field variables. We find the vacuum field equations of the theory and analyze its weak-field approximation and Newtonian limit.

  4. Leveraging Chaos in Continuous Thrust Trajectory Design

    Data.gov (United States)

    National Aeronautics and Space Administration — A trajectory design tool is sought to leverage chaos and nonlinear dynamics present in multi-body gravitational fields to design ultra-low energy transfer...

  5. Leveraging the Development of Inclusive and Sustainable ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Nevertheless, the most vulnerable actors in the value chain still lack the capacity to ... They will identify and prioritize leverage points for ICT interventions that are ... IDRC is pleased to announce a new funding opportunity aimed at fostering ...

  6. Financing drug discovery via dynamic leverage.

    Science.gov (United States)

    Montazerhodjat, Vahid; Frishkopf, John J; Lo, Andrew W

    2016-03-01

We extend the megafund concept for funding drug discovery to enable dynamic leverage in which the portfolio of candidate therapeutic assets is predominantly financed initially by equity, and debt is introduced gradually as assets mature and begin generating cash flows. Leverage is adjusted so as to maintain an approximately constant level of default risk throughout the life of the fund. Numerical simulations show that applying dynamic leverage to a small portfolio of orphan drug candidates can boost the return on equity almost twofold compared with securitization with a static capital structure. Dynamic leverage can also add significant value to comparable all-equity-financed portfolios, enhancing the return on equity without jeopardizing debt performance or increasing risk to equity investors. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. The Profits–Leverage Puzzle Revisited

    OpenAIRE

    Murray Z. Frank; Vidhan K. Goyal

    2015-01-01

    The inverse relation between leverage and profitability is widely regarded as a serious defect of the trade-off theory. We show that the defect is not with the theory but with the use of a leverage ratio in which profitability affects both the numerator and the denominator. Profitability directly increases the value of equity. Firms do take the predicted offsetting actions. They issue debt and repurchase equity when profitability rises, and retire debt and issue equity when profitability fall...
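The mechanical effect this abstract describes can be written out explicitly. With book debt D and the market value of equity E (my notation, not necessarily the paper's), the leverage ratio is

```latex
L \;=\; \frac{D}{D+E},
\qquad
\frac{\partial L}{\partial E} \;=\; -\,\frac{D}{(D+E)^{2}} \;<\; 0 .
```

A profitability shock raises E, so L falls even if the firm changes nothing about its financing; the offsetting debt issues and equity repurchases that the authors document then work against this mechanical drift.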

  8. Corporate Leverage and Product Differentiation Strategy

    OpenAIRE

    Arping, Stefan; Lóránth, Gyöngyi

    2002-01-01

    This article develops a model of the interplay between corporate leverage and product differentiation strategy. Leverage improves managerial discipline, but it can also raise customer concerns about a vendor's long-term viability. We argue that customer concerns about firm viability will be particularly pronounced when products are highly differentiated from competitors' products. In this context, optimal product differentiation strategies solve a trade-off between softening price competition...

  9. Leverage and growth: effect of stock options

    OpenAIRE

    Francis, Bill; Hasan , Iftekhar; Sharma, Zenu

    2011-01-01

    This paper investigates the potential effects of stock options on managers’ investment decisions and therefore on a firm’s growth or, alternatively, on its leverage-growth relationship. To structure the analysis addressing this issue, the paper utilizes a framework establishing a negative relationship between leverage and the firm’s growth. However, in contrast to some of the existing results, the empirical analysis of manufacturing firms in this paper shows that the negative relationship bet...

  10. Hedge Ratios for short and leveraged ETFs

    Directory of Open Access Journals (Sweden)

    Leo Schubert

    2011-06-01

Full Text Available Exchange Traded Funds (ETFs) exist for stock, bond and commodity markets. In most cases the underlying of an ETF is an index. Fund management today uses both active and passive approaches to construct a portfolio. ETFs can be used for passive portfolio management, where ETFs with positive leverage factors are preferred. In active portfolio management, ETFs with negative leverage factors can also be applied to hedge or cross-hedge a portfolio. These hedging possibilities are analyzed in this paper. Short ETFs exist with different leverage factors. In Europe, the leverage factors 1 (e.g. ShortDAX ETF) and 2 (e.g. DJ STOXX 600 Double Short) are offered, while in the financial markets of the United States factors from 1 to 4 can be found. To investigate the effect of the different leverage factors and other parameters, Monte Carlo simulation was used. The results show, e.g., that higher leverage factors achieve higher profits as well as higher losses. When a bearish market is assumed, minimizing the variance of the hedge does not necessarily produce better hedging results, due to a very skewed return distribution of the hedge. The risk measure target-shortfall-probability confirms the use of the standard hedge weightings, which depend only on the leverage factor. This characteristic remains when a portfolio has to be hedged instead of the underlying index of the short ETF. For portfolios that have a low correlation with the index return, high leverage factors should not be used for hedging, due to the higher volatility and target-shortfall-probability.
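A minimal Monte Carlo sketch of the setup described (my own simplified version, not the paper's model): i.i.d. Gaussian daily index returns, a daily-reset short ETF delivering -k times the index return, and the standard hedge weight h = 1/k that depends only on the leverage factor:

```python
import random

random.seed(42)

def simulate_hedge(leverage_k, days=250, n_paths=2000, mu=0.0, sigma=0.012):
    """Average terminal P&L of hedging 1 unit of an index with a short ETF.

    Sketch assumptions: i.i.d. Gaussian daily index returns; the short ETF
    resets its -k exposure daily; hedge weight h = 1/k (the 'standard'
    weighting, which depends only on the leverage factor).
    """
    h = 1.0 / leverage_k
    terminal = []
    for _ in range(n_paths):
        index, etf = 1.0, 1.0
        for _ in range(days):
            r = random.gauss(mu, sigma)    # daily index return
            index *= 1.0 + r
            etf *= 1.0 - leverage_k * r    # daily-reset short ETF
        terminal.append((index - 1.0) + h * (etf - 1.0))  # hedged P&L
    return sum(terminal) / n_paths

# The 1/k weighting keeps the hedged P&L centered near zero for any factor,
# though higher factors leave a wider and more skewed residual distribution.
pnl_k1 = simulate_hedge(1)
pnl_k4 = simulate_hedge(4)
```

Replaying this with different leverage factors, drifts, and correlations is essentially the experiment the abstract summarizes.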

  11. Identification of Strategies to Leverage Public and Private Resources for National Security Workforce Development

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-02-01

    This report documents the identification of strategies to leverage public and private resources for the development of an adequate national security workforce as part of the National Security Preparedness Project (NSPP).There are numerous efforts across the United States to develop a properly skilled and trained national security workforce. Some of these efforts are the result of the leveraging of public and private dollars. As budget dollars decrease and the demand for a properly skilled and trained national security workforce increases, it will become even more important to leverage every education and training dollar. The leveraging of dollars serves many purposes. These include increasing the amount of training that can be delivered and therefore increasing the number of people reached, increasing the number and quality of public/private partnerships, and increasing the number of businesses that are involved in the training of their future workforce.

  12. Topological massive sigma models

    International Nuclear Information System (INIS)

    Lambert, N.D.

    1995-01-01

    In this paper we construct topological sigma models which include a potential and are related to twisted massive supersymmetric sigma models. Contrary to a previous construction these models have no central charge and do not require the manifold to admit a Killing vector. We use the topological massive sigma model constructed here to simplify the calculation of the observables. Lastly it is noted that this model can be viewed as interpolating between topological massless sigma models and topological Landau-Ginzburg models. ((orig.))

  13. Massive neutrinos in astrophysics

    International Nuclear Information System (INIS)

    Qadir, A.

    1982-08-01

    Massive neutrinos are among the big hopes of cosmologists. If they happen to have the right mass they can close the Universe, explain the motion of galaxies in clusters, provide galactic halos and even, possibly, explain galaxy formation. Tremaine and Gunn have argued that massive neutrinos cannot do all these things. I will explain, here, what some of us believe is wrong with their arguments. (author)

  14. Massive graviton geons

    Science.gov (United States)

    Aoki, Katsuki; Maeda, Kei-ichi; Misonoh, Yosuke; Okawa, Hirotada

    2018-02-01

    We find vacuum solutions such that massive gravitons are confined in a local spacetime region by their gravitational energy in asymptotically flat spacetimes in the context of the bigravity theory. We call such self-gravitating objects massive graviton geons. The basic equations can be reduced to the Schrödinger-Poisson equations with the tensor "wave function" in the Newtonian limit. We obtain a nonspherically symmetric solution with j =2 , ℓ=0 as well as a spherically symmetric solution with j =0 , ℓ=2 in this system where j is the total angular momentum quantum number and ℓ is the orbital angular momentum quantum number, respectively. The energy eigenvalue of the Schrödinger equation in the nonspherical solution is smaller than that in the spherical solution. We then study the perturbative stability of the spherical solution and find that there is an unstable mode in the quadrupole mode perturbations which may be interpreted as the transition mode to the nonspherical solution. The results suggest that the nonspherically symmetric solution is the ground state of the massive graviton geon. The massive graviton geons may decay in time due to emissions of gravitational waves but this timescale can be quite long when the massive gravitons are nonrelativistic and then the geons can be long-lived. We also argue possible prospects of the massive graviton geons: applications to the ultralight dark matter scenario, nonlinear (in)stability of the Minkowski spacetime, and a quantum transition of the spacetime.
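For reference, the Newtonian-limit reduction mentioned in the abstract is, schematically, the standard Schrödinger-Poisson system for a self-gravitating field of mass m (written here for a scalar wave function ψ; in the paper the "wave function" is tensor-valued):

```latex
i\hbar\,\partial_t \psi
  \;=\; -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi \;+\; m\,\Phi\,\psi,
\qquad
\nabla^{2}\Phi \;=\; 4\pi G\, m\,|\psi|^{2} .
```

The quantum numbers j and ℓ quoted in the abstract label the bound states of this system, in analogy with a hydrogen-like spectrum.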

  15. 17 CFR 31.15 - Reporting to leverage customers.

    Science.gov (United States)

    2010-04-01

    ... customers. 31.15 Section 31.15 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION LEVERAGE TRANSACTIONS § 31.15 Reporting to leverage customers. Each leverage transaction merchant shall furnish in writing directly to each leverage customer: (a) Promptly upon the repurchase, resale...

  16. Combating Terrorism with Socioeconomics: Leveraging the Private Sector

    Science.gov (United States)

    2007-01-01

ndupress.ndu.edu, issue 46, 3d quarter 2007 / JFQ, p. 127. Leveraging the Private Sector, by Major Miemie Winn Byrd. ... nongovernmental organizations, private sector, academia, and the U.S. Government (including the military). To attract all the necessary

  17. Epidemiology of Massive Transfusion

    DEFF Research Database (Denmark)

    Halmin, Märit; Chiesa, Flaminia; Vasan, Senthil K

    2016-01-01

in Sweden from 1987 and in Denmark from 1996. A total of 92,057 patients were included. Patients were followed until the end of 2012. MEASUREMENTS AND MAIN RESULTS: Descriptive statistics were used to characterize the patients and indications. Post-transfusion mortality was expressed as crude 30-day mortality and as long-term mortality using the Kaplan-Meier method and using standardized mortality ratios. The incidence of massive transfusion was higher in Denmark (4.5 per 10,000) than in Sweden (2.5 per 10,000). The most common indication for massive transfusion was major surgery (61.2%) followed...

  18. Topologically massive supergravity

    Directory of Open Access Journals (Sweden)

    S. Deser

    1983-01-01

Full Text Available The locally supersymmetric extension of three-dimensional topologically massive gravity is constructed. Its fermionic part is the sum of the (dynamically trivial) Rarita-Schwinger action and a gauge-invariant topological term, of second derivative order, analogous to the gravitational one. It is ghost free and represents a single massive spin 3/2 excitation. The fermion-gravity coupling is minimal and the invariance is under the usual supergravity transformations. The system's energy, as well as that of the original topological gravity, is therefore positive.

  19. Epidemiology of massive transfusion

    DEFF Research Database (Denmark)

    Halmin, M A; Chiesa, F; Vasan, S K

    2015-01-01

    and to describe characteristics and mortality of massively transfused patients. Methods: We performed a retrospective cohort study based on the Scandinavian Donations and Transfusions (SCANDAT2) database, linking data on blood donation, blood components and transfused patients with inpatient- and population.......4% among women transfused for obstetrical bleeding. Mortality increased gradually with age and among all patients massively transfused at age 80 years, only 26% were alive [TABLE PRESENTED] after 5 years. The relative mortality, early after transfusion, was high and decreased with time since transfusion...

  20. Firm Leverage and the Financial Crisis

    OpenAIRE

    Fatih Altunok; Arif Oduncu

    2014-01-01

Firm growth dynamics is an important topic, since the growth performance of firms is the main source of economic growth in countries. Generally, crises produce a sharp decline in firms' growth, and this leads to a decline in both the level of employment and the income of households. This paper focuses on the role of firm leverage in the growth performance of the firm during the global financial crisis. We investigate whether the firms that experienced a large leverage increase before th...

  1. Leveraging investments for energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Tonnos, A. [Bestech, Sudbury, ON (Canada)

    2008-07-01

    This paper described the application of a computerized energy management program at a 40-year old mine in Ontario. The purpose of the project was to implement standards that promote process owner accountability, employee engagement, and the development of sustainable systems within the mine's administrative core. The program was developed to consider key parameters in the mine's energy consumption, including ventilation, compressed air, and process water. An account center monthly overview function was used. A breakdown of the account center by process was also provided. The system also used historical plotting capabilities. Use of the system meant that leaks could be detected by monitoring irregularities in underground readings. An alarm search page was also provided for control room operators. Parameters were presented for each alarm in order to identify contacts and procedures. System upgrades to the program were performed remotely. The program is available for installation at other mines. tabs., figs.

  2. Radiology in massive hemoptysis

    International Nuclear Information System (INIS)

    Marini, M.; Castro, J.M.; Gayol, A.; Aguilera, C.; Blanco, M.; Beraza, A.; Torres, J.

    1995-01-01

    We have reviewed our experience in diseases involving massive hemoptysis, systematizing the most common causes which include tuberculosis, bronchiectasis and cancer of the lung. Other less frequent causes, such as arteriovenous fistula, Aspergilloma, aneurysm, etc.; are also evaluated, and the most demonstrative images of each produced by the most precise imaging methods for their assessment are presented

  3. Massive Supergravity and Deconstruction

    CERN Document Server

    Gregoire, T; Shadmi, Y; Gregoire, Thomas; Schwartz, Matthew D; Shadmi, Yael

    2004-01-01

    We present a simple superfield Lagrangian for massive supergravity. It comprises the minimal supergravity Lagrangian with interactions as well as mass terms for the metric superfield and the chiral compensator. This is the natural generalization of the Fierz-Pauli Lagrangian for massive gravity which comprises mass terms for the metric and its trace. We show that the on-shell bosonic and fermionic fields are degenerate and have the appropriate spins: 2, 3/2, 3/2 and 1. We then study this interacting Lagrangian using goldstone superfields. We find that a chiral multiplet of goldstones gets a kinetic term through mixing, just as the scalar goldstone does in the non-supersymmetric case. This produces Planck scale (Mpl) interactions with matter and all the discontinuities and unitarity bounds associated with massive gravity. In particular, the scale of strong coupling is (Mpl m^4)^1/5, where m is the multiplet's mass. Next, we consider applications of massive supergravity to deconstruction. We estimate various qu...

  4. Update on massive transfusion.

    Science.gov (United States)

    Pham, H P; Shaz, B H

    2013-12-01

    Massive haemorrhage requires massive transfusion (MT) to maintain adequate circulation and haemostasis. For optimal management of massively bleeding patients, regardless of aetiology (trauma, obstetrical, surgical), effective preparation and communication between transfusion and other laboratory services and clinical teams are essential. A well-defined MT protocol is a valuable tool to delineate how blood products are ordered, prepared, and delivered; determine laboratory algorithms to use as transfusion guidelines; and outline duties and facilitate communication between involved personnel. In MT patients, it is crucial to practice damage control resuscitation and to administer blood products early in the resuscitation. Trauma patients are often admitted with early trauma-induced coagulopathy (ETIC), which is associated with mortality; the aetiology of ETIC is likely multifactorial. Current data support that trauma patients treated with higher ratios of plasma and platelet to red blood cell transfusions have improved outcomes, but further clinical investigation is needed. Additionally, tranexamic acid has been shown to decrease the mortality in trauma patients requiring MT. Greater use of cryoprecipitate or fibrinogen concentrate might be beneficial in MT patients from obstetrical causes. The risks and benefits for other therapies (prothrombin complex concentrate, recombinant activated factor VII, or whole blood) are not clearly defined in MT patients. Throughout the resuscitation, the patient should be closely monitored and both metabolic and coagulation abnormalities corrected. Further studies are needed to clarify the optimal ratios of blood products, treatment based on underlying clinical disorder, use of alternative therapies, and integration of laboratory testing results in the management of massively bleeding patients.

  5. Massive antenatal fetomaternal hemorrhage

    DEFF Research Database (Denmark)

    Dziegiel, Morten Hanefeld; Koldkjaer, Ole; Berkowicz, Adela

    2005-01-01

    Massive fetomaternal hemorrhage (FMH) can lead to life-threatening anemia. Quantification based on flow cytometry with anti-hemoglobin F (HbF) is applicable in all cases but underestimation of large fetal bleeds has been reported. A large FMH from an ABO-compatible fetus allows an estimation...

  6. COLA with massive neutrinos

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Bill S.; Winther, Hans A.; Koyama, Kazuya, E-mail: bill.wright@port.ac.uk, E-mail: hans.winther@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth, Hampshire, PO1 3FX (United Kingdom)

    2017-10-01

The effect of massive neutrinos on the growth of cold dark matter perturbations acts as a scale-dependent Newton's constant and leads to scale-dependent growth factors, just as we often find in models of gravity beyond General Relativity. We show how to compute growth factors for ΛCDM and general modified gravity cosmologies combined with massive neutrinos in Lagrangian perturbation theory for use in COLA and extensions thereof. We implement this together with the grid-based massive neutrino method of Brandbyge and Hannestad in MG-PICOLA and compare COLA simulations to full N-body simulations of ΛCDM and f(R) gravity with massive neutrinos. Our implementation is computationally cheap if the underlying cosmology already has scale-dependent growth factors, and it is shown to be able to produce results that match N-body to percent-level accuracy for both the total and CDM matter power spectra up to k ≲ 1 h/Mpc.

  7. Leveraging Relational Technology through Industry Partnerships.

    Science.gov (United States)

    Brush, Leonard M.; Schaller, Anthony J.

    1988-01-01

    Carnegie Mellon University has leveraged its technological expertise with database management systems (DBMS) into joint technological and developmental partnerships with DBMS and application software vendors. Carnegie's relational database strategy, the strategy of partnerships and how they were formed, and how the partnerships are doing are…

  8. Topics in Finance Part III--Leverage

    Science.gov (United States)

    Laux, Judy

    2010-01-01

    This article investigates operating and financial leverage from the perspective of the financial manager, accenting the relationships to stockholder wealth maximization (SWM), risk and return, and potential agency problems. It also covers some of the pertinent literature related specifically to the implications of operating and financial risk and…

  9. Equity Mispricing and Leverage Adjustment Costs

    NARCIS (Netherlands)

    Warr, R.S.; Elliott, W.B.; Koeter-Kant, J.; Oztekin, O.

    2012-01-01

    We find that equity mispricing impacts the speed at which firms adjust to their target leverage (TL) and does so in predictable ways depending on whether the firm is over- or underlevered. For example, firms that are above their TL and should therefore issue equity (or retire debt) adjust more

  10. The Complexity of Leveraging University Program Change

    Science.gov (United States)

    Crow, Gary M.; Arnold, Noelle Witherspoon; Reed, Cynthia J.; Shoho, Alan R.

    2012-01-01

    This article identifies four elements of complexity that influence how university educational leadership programs can leverage program change: faculty reward systems, faculty governance, institutional resources, and state-level influence on leadership preparation. Following the discussion of the elements of complexity, the article provides a…

  11. Massive propagators in instanton fields

    International Nuclear Information System (INIS)

    Brown, L.S.; Lee, C.

    1978-01-01

    Green's functions for massive spinor and vector particles propagating in a self-dual but otherwise arbitrary non-Abelian gauge field are shown to be completely determined by the corresponding Green's functions of massive scalar particles

  12. Permutations of massive vacua

    Energy Technology Data Exchange (ETDEWEB)

    Bourget, Antoine [Department of Physics, Universidad de Oviedo, Avenida Calvo Sotelo 18, 33007 Oviedo (Spain); Troost, Jan [Laboratoire de Physique Théorique de l’É cole Normale Supérieure, CNRS,PSL Research University, Sorbonne Universités, 75005 Paris (France)

    2017-05-09

    We discuss the permutation group G of massive vacua of four-dimensional gauge theories with N=1 supersymmetry that arises upon tracing loops in the space of couplings. We concentrate on superconformal N=4 and N=2 theories with N=1 supersymmetry preserving mass deformations. The permutation group G of massive vacua is the Galois group of characteristic polynomials for the vacuum expectation values of chiral observables. We provide various techniques to effectively compute characteristic polynomials in given theories, and we deduce the existence of varying symmetry breaking patterns of the duality group depending on the gauge algebra and matter content of the theory. Our examples give rise to interesting field extensions of spaces of modular forms.

  13. Massive stars in galaxies

    International Nuclear Information System (INIS)

    Humphreys, R.M.

    1987-01-01

    The relationship between the morphologic type of a galaxy and the evolution of its massive stars is explored, reviewing observational results for nearby galaxies. The data are presented in diagrams, and it is found that the massive-star populations of most Sc spiral galaxies and irregular galaxies are similar, while those of Sb spirals such as M 31 and M 81 may be affected by morphology (via differences in the initial mass function or star-formation rate). Consideration is also given to the stability-related upper luminosity limit in the H-R diagram of hypergiant stars (attributed to radiation pressure in hot stars and turbulence in cool stars) and the goals of future observation campaigns. 88 references

  14. Forecasting volatility in the presence of Leverage Effect

    OpenAIRE

    Jean-Christophe Domenge; Rémi Rhodes; Vincent Vargas

    2010-01-01

We define a simple and tractable method for adding the leverage effect to general volatility predictions. As an application, we compare volatility predictions with and without leverage on the S&P 500 Index during the period 2002-2010.

  15. Massive Open Online Courses

    Directory of Open Access Journals (Sweden)

    Tharindu Rekha Liyanagunawardena

    2015-01-01

Full Text Available Massive Open Online Courses (MOOCs) are a new addition to the open educational provision. They are offered mainly by prestigious universities on various commercial and non-commercial MOOC platforms, allowing anyone who is interested to experience the world-class teaching practiced in these universities. MOOCs have attracted wide interest from around the world. However, learner demographics in MOOCs suggest that some demographic groups are underrepresented. At present, MOOCs seem to be better serving the continuous professional development sector.

  16. Evolution of massive stars

    International Nuclear Information System (INIS)

    Loore, C. de

    1984-01-01

The evolution of stars with masses larger than 15 solar masses is reviewed. These stars have large convective cores and lose a substantial fraction of their matter through stellar winds. The treatment of convection and the parameterisation of stellar wind mass loss are analysed within the context of existing disagreements between theory and observation. The evolution of massive close binaries and the origin of Wolf-Rayet stars and X-ray binaries is also sketched. (author)

  17. Macroeconomic Dynamics of Assets, Leverage and Trust

    Science.gov (United States)

    Rozendaal, Jeroen C.; Malevergne, Yannick; Sornette, Didier

    A macroeconomic model based on the economic variables (i) assets, (ii) leverage (defined as debt over asset) and (iii) trust (defined as the maximum sustainable leverage) is proposed to investigate the role of credit in the dynamics of economic growth, and how credit may be associated with both economic performance and confidence. Our first notable finding is the mechanism of reward/penalty associated with patience, as quantified by the return on assets. In regular economies where the EBITA/Assets ratio is larger than the cost of debt, starting with a trust higher than leverage results in the highest long-term return on assets (which can be seen as a proxy for economic growth). Therefore, patient economies that first build trust and then increase leverage are positively rewarded. Our second main finding concerns a recommendation for the reaction of a central bank to an external shock that affects negatively the economic growth. We find that late policy intervention in the model economy results in the highest long-term return on assets. However, this comes at the cost of suffering longer from the crisis until the intervention occurs. The phenomenon that late intervention is most effective to attain a high long-term return on assets can be ascribed to the fact that postponing intervention allows trust to increase first, and it is most effective to intervene when trust is high. These results are derived from two fundamental assumptions underlying our model: (a) trust tends to increase when it is above leverage; (b) economic agents learn optimally to adjust debt for a given level of trust and amount of assets. Using a Markov Switching Model for the EBITA/Assets ratio, we have successfully calibrated our model to the empirical data of the return on equity of the EURO STOXX 50 for the time period 2000-2013. We find that dynamics of leverage and trust can be highly nonmonotonous with curved trajectories, as a result of the nonlinear coupling between the variables. This
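The two fundamental assumptions stated in the abstract lend themselves to a toy numerical sketch (my own construction, not the paper's calibrated model; all rates and coefficients below are invented): trust drifts up while it exceeds leverage, and agents adjust debt toward the level trust can sustain:

```python
def step(assets, debt, trust, roa=0.08, cost_of_debt=0.05,
         k_trust=0.1, k_debt=0.5, dt=0.1):
    """One Euler step of a toy assets/leverage/trust system.

    Illustrative only: it encodes the abstract's assumption (a) that trust
    tends to increase when it is above leverage, and assumption (b) that
    agents adjust debt toward the trust-sustainable level trust * assets.
    """
    leverage = debt / assets
    # (a) trust rises while above leverage, falls if leverage overshoots it
    trust += k_trust * (trust - leverage) * dt
    # (b) debt relaxes toward the maximum that trust sustains
    debt += k_debt * (trust * assets - debt) * dt
    # earnings on assets net of interest on debt are retained as new assets
    assets += (roa * assets - cost_of_debt * debt) * dt
    return assets, debt, trust

# A 'patient' economy: trust (0.5) starts well above leverage (20/100 = 0.2),
# so leverage is gradually built up under a rising trust ceiling.
a, d, t = 100.0, 20.0, 0.5
for _ in range(200):
    a, d, t = step(a, d, t)
```

In this toy run leverage climbs toward trust while assets compound, echoing the paper's reward-for-patience mechanism; shocking `roa` downward mid-run would be the analogue of the crisis experiments described above.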

  18. Project financing

    International Nuclear Information System (INIS)

    Alvarez, M.U.

    1990-01-01

    This paper presents the basic concepts and components of the project financing of large industrial facilities. Diagrams of a simple partnership structure and a simple leveraged lease structure are included. Finally, a Hypothetical Project is described with basic issues identified for discussion purposes. The topics of the paper include non-recourse financing, principal advantages and objectives, disadvantages, project financing participants and agreements, feasibility studies, organization of the project company, principal agreements in a project financing, insurance, and an examination of a hypothetical project

  19. The Disciplining Role of Leverage in Dutch Firms

    NARCIS (Netherlands)

    de Jong, A.

    2001-01-01

    In this study we investigate the role of leverage in disciplining overinvestment problems. We measure the relationships between leverage, Tobin's q and corporate governance characteristics for Dutch listed firms. Besides, our empirical analysis tests for determinants of leverage from tax and

  20. ALFIL: A Crowd Simulation Serious Game for Massive Evacuation Training and Awareness

    Science.gov (United States)

    García-García, César; Fernández-Robles, José Luis; Larios-Rosillo, Victor; Luga, Hervé

    2012-01-01

    This article presents the current development of a serious game for the simulation of massive evacuations. The purpose of this project is to promote self-protection through awareness of the procedures and different possible scenarios during the evacuation of a massive event. Sophisticated behaviors require massive computational power and it has…

  1. Leveraged Leasing in the Federal Sector.

    Science.gov (United States)

    1983-12-01

    entity obtains cash while the private investors can depreciate the property and obtain investment tax credits and other tax benefits. The costs are borne...lease from a common base point to familiarize the reader with the applicable portions of generally accepted accounting principles, GAAP, and how they...chapter identifies the differences between a basic lease and a leveraged lease in terms of structure and accounting requirements relative to GAAP

  2. Leverage Website Favicon to Detect Phishing Websites

    OpenAIRE

    Kang Leng Chiew; Jeffrey Soon-Fatt Choo; San Nah Sze; Kelvin S. C. Yong

    2018-01-01

    Phishing attack is a cybercrime that can lead to severe financial losses for Internet users and entrepreneurs. Typically, phishers are fond of using fuzzy techniques during the creation of a website. They confuse the victim by imitating the appearance and content of a legitimate website. In addition, many websites are vulnerable to phishing attacks, including financial institutions, social networks, e-commerce, and airline websites. This paper is an extension of our previous work that leverag...

  3. North Korea: Economic Leverage and Policy Analysis

    Science.gov (United States)

    2010-01-22

    although non-governmental groups do run operations in the DPRK in activities such as goat dairy farming and transportation. North-South Korean...Finance Minister Says “At Least” 34m US Dollars Sent to North Korea. Financial Times Information, Global News Wire—Asia Africa Intelligence Wire. June 6...CRS Report for Congress Prepared for Members and Committees of Congress North Korea: Economic Leverage and Policy Analysis Dick K

  4. Short-Selling, Leverage and Systemic Risk

    OpenAIRE

    Pais, Amelia; Stork, Philip A.

    2013-01-01

    During the Global Financial Crisis, regulators imposed short-selling bans to protect financial institutions. The rationale behind the bans was that “bear raids”, driven by short-sellers, would increase the individual and systemic risk of financial institutions, especially for institutions with high leverage. This study uses Extreme Value Theory to estimate the effect of short-selling on financial institutions’ individual and systemic risks in France, Italy and Spain; it also analyses the rela...

  5. Introduction to massive neutrinos

    International Nuclear Information System (INIS)

    Kayser, B.

    1984-01-01

    We discuss the theoretical ideas which make it natural to expect that neutrinos do indeed have mass. Then we focus on the physical consequences of neutrino mass, including neutrino oscillation and other phenomena whose observation would be very interesting, and would serve to demonstrate that neutrinos are indeed massive. We comment on the legitimacy of comparing results from different types of experiments. Finally, we consider the question of whether neutrinos are their own antiparticles. We explain what this question means, discuss the nature of a neutrino which is its own antiparticles, and consider how one might determine experimentally whether neutrinos are their own antiparticles or not
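    The oscillation phenomenon mentioned above is governed, in the two-flavor vacuum case, by a standard textbook formula; the sketch below simply evaluates it. The parameter values are illustrative and not tied to any particular experiment.

    ```python
    import math

    # Two-flavor vacuum oscillation probability (standard textbook form).
    def oscillation_probability(theta, dm2_ev2, L_km, E_GeV):
        """P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.267 * dm2 * L / E),
        with dm2 in eV^2, L in km and E in GeV; the factor 1.267 collects
        hbar, c and the unit conversions."""
        return (math.sin(2.0 * theta) ** 2
                * math.sin(1.267 * dm2_ev2 * L_km / E_GeV) ** 2)

    # Maximal mixing with an atmospheric-scale mass splitting (illustrative):
    p = oscillation_probability(theta=math.pi / 4, dm2_ev2=2.5e-3,
                                L_km=295.0, E_GeV=0.6)
    ```

    A nonzero probability requires a nonzero mass splitting, which is why observing oscillation demonstrates that neutrinos are massive.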

  6. Leveraging public private partnerships to innovate under challenging budget times.

    Science.gov (United States)

    Portilla, Lili M; Rohrbaugh, Mark L

    2014-01-01

    The National Institutes of Health (NIH), academic medical centers and industry have a long and productive history in collaborating together. Decreasing R&D budgets in both the private and public sector have made the need for such collaborations paramount to reduce the risk of further declines in the number of innovative drugs reaching the market to address pressing public health needs. Doing more with less has forced both industry and public sector research institutions (PSRIs) to leverage resources and expertise in order to de-risk projects. In addition, it provides an opportunity to envision and implement new approaches to accomplish these goals. We discuss several of these innovative collaborations and partnerships at the NIH that demonstrate how the NIH and industry are working together to strengthen the drug development pipeline.

  7. Massively Parallel QCD

    International Nuclear Information System (INIS)

    Soltz, R; Vranas, P; Blumrich, M; Chen, D; Gara, A; Giampapa, M; Heidelberger, P; Salapura, V; Sexton, J; Bhanot, G

    2007-01-01

    The theory of the strong nuclear force, Quantum Chromodynamics (QCD), can be numerically simulated from first principles on massively-parallel supercomputers using the method of Lattice Gauge Theory. We describe the special programming requirements of lattice QCD (LQCD) as well as the optimal supercomputer hardware architectures that it suggests. We demonstrate these methods on the BlueGene massively-parallel supercomputer and argue that LQCD and the BlueGene architecture are a natural match. This can be traced to the simple fact that LQCD is a regular lattice discretization of space into lattice sites while the BlueGene supercomputer is a discretization of space into compute nodes, and that both are constrained by requirements of locality. This simple relation is both technologically important and theoretically intriguing. The main result of this paper is the speedup of LQCD using up to 131,072 CPUs on the largest BlueGene/L supercomputer. The speedup is perfect with sustained performance of about 20% of peak. This corresponds to a maximum of 70.5 sustained TFlop/s. At these speeds LQCD and BlueGene are poised to produce the next generation of strong interaction physics theoretical results

  8. Phases of massive gravity

    CERN Document Server

    Dubovsky, S L

    2004-01-01

    We systematically study the most general Lorentz-violating graviton mass invariant under the three-dimensional Euclidean group, using explicitly covariant language. We find that at general values of the mass parameters the massive graviton has six propagating degrees of freedom, some of which are ghosts or lead to rapid classical instabilities. However, there are a number of different regions in the mass parameter space where massive gravity can be described by a consistent low-energy effective theory with cutoff $\sim\sqrt{mM_{Pl}}$, free of rapid instabilities and of the vDVZ discontinuity. Each of these regions is characterized by certain fine-tuning relations between the mass parameters, generalizing the Fierz--Pauli condition. In some cases the required fine-tunings are consequences of the existence of subgroups of the diffeomorphism group that are left unbroken by the graviton mass. We find two new cases in which the resulting theories have the property of UV insensitivity, i.e. they remain well behaved after inclusion of ...

  9. Leverage and Deepening Business Cycle Skewness

    DEFF Research Database (Denmark)

    Jensen, Henrik; Petrella, Ivan; Ravn, Søren Hove

    2017-01-01

    We document that the U.S. economy has been characterized by an increasingly negative business cycle asymmetry over the last three decades. This finding can be explained by the concurrent increase in the financial leverage of households and firms. To support this view, we devise and estimate......, booms become progressively smoother and more prolonged than busts. We are therefore able to reconcile a more negatively skewed business cycle with the Great Moderation in cyclical volatility. Finally, in line with recent empirical evidence, financially-driven expansions lead to deeper contractions...

  10. Minimal massive 3D gravity

    International Nuclear Information System (INIS)

    Bergshoeff, Eric; Merbis, Wout; Hohm, Olaf; Routh, Alasdair J; Townsend, Paul K

    2014-01-01

    We present an alternative to topologically massive gravity (TMG) with the same ‘minimal’ bulk properties; i.e. a single local degree of freedom that is realized as a massive graviton in linearization about an anti-de Sitter (AdS) vacuum. However, in contrast to TMG, the new ‘minimal massive gravity’ has both a positive energy graviton and positive central charges for the asymptotic AdS-boundary conformal algebra. (paper)

  11. Massively parallel multicanonical simulations

    Science.gov (United States)

    Gross, Jonathan; Zierenberg, Johannes; Weigel, Martin; Janke, Wolfhard

    2018-03-01

    Generalized-ensemble Monte Carlo simulations such as the multicanonical method and similar techniques are among the most efficient approaches for simulating systems undergoing discontinuous phase transitions or with rugged free-energy landscapes. As Markov chain methods, they are inherently serial. It was demonstrated recently, however, that a combination of independent simulations that communicate weight updates at variable intervals allows for the efficient utilization of parallel computational resources for multicanonical simulations. Implementing this approach for the many-thread architecture provided by current generations of graphics processing units (GPUs), we show how it can be efficiently employed with of the order of 10^4 parallel walkers and beyond, thus constituting a versatile tool for Monte Carlo simulations in the era of massively parallel computing. We provide the fully documented source code for the approach applied to the paradigmatic example of the two-dimensional Ising model as a starting point and reference for practitioners in the field.
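    The parallel scheme described here, independent walkers sharing multicanonical weights and pooling histograms at intervals to refine them, can be caricatured on a toy discrete energy ladder. The sketch below is a loose illustration under strong simplifying assumptions (a crude Wang-Landau-style weight update, a sequential loop standing in for parallel walkers); it is not the authors' GPU implementation.

    ```python
    import math
    import random

    # Toy parallel multicanonical sampling: walkers share weights over a small
    # discrete energy ladder; histograms are pooled to refine the weights.
    random.seed(1)
    ENERGIES = list(range(11))          # toy states: energy equals the index
    log_w = [0.0] * len(ENERGIES)       # shared multicanonical log-weights

    def walker_sweep(state, steps=500):
        """One walker: Metropolis steps with weight exp(-E + log_w[E])."""
        hist = [0] * len(ENERGIES)
        for _ in range(steps):
            prop = max(0, min(len(ENERGIES) - 1, state + random.choice((-1, 1))))
            d = (ENERGIES[state] - ENERGIES[prop]) + (log_w[prop] - log_w[state])
            if random.random() < math.exp(min(0.0, d)):
                state = prop
            hist[state] += 1
        return hist, state

    n_walkers = 4
    states = [0] * n_walkers
    total = [0] * len(ENERGIES)
    for _ in range(20):                 # weight-update intervals
        pooled = [0] * len(ENERGIES)
        for i in range(n_walkers):      # in a real code, these run in parallel
            hist, states[i] = walker_sweep(states[i])
            pooled = [a + b for a, b in zip(pooled, hist)]
        total = [a + b for a, b in zip(total, pooled)]
        # Crude multicanonical update: penalize over-visited energies.
        log_w = [lw - math.log(c + 1) for lw, c in zip(log_w, pooled)]
        m = max(log_w)
        log_w = [lw - m for lw in log_w]  # renormalize for numerical stability
    ```

    The refined weights flatten the visit histogram, letting walkers reach high-energy states that a plain canonical simulation would almost never visit.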

  12. Massive Galileon positivity bounds

    Science.gov (United States)

    de Rham, Claudia; Melville, Scott; Tolley, Andrew J.; Zhou, Shuang-Yong

    2017-09-01

    The EFT coefficients in any gapped, scalar, Lorentz invariant field theory must satisfy positivity requirements if there is to exist a local, analytic Wilsonian UV completion. We apply these bounds to the tree level scattering amplitudes for a massive Galileon. The addition of a mass term, which does not spoil the non-renormalization theorem of the Galileon and preserves the Galileon symmetry at loop level, is necessary to satisfy the lowest order positivity bound. We further show that a careful choice of successively higher derivative corrections are necessary to satisfy the higher order positivity bounds. There is then no obstruction to a local UV completion from considerations of tree level 2-to-2 scattering alone. To demonstrate this we give an explicit example of such a UV completion.

  13. Massively parallel mathematical sieves

    Energy Technology Data Exchange (ETDEWEB)

    Montry, G.R.

    1989-01-01

    The Sieve of Eratosthenes is a well-known algorithm for finding all prime numbers in a given subset of integers. A parallel version of the Sieve is described that produces computational speedups over 800 on a hypercube with 1,024 processing elements for problems of fixed size. Computational speedups as high as 980 are achieved when the problem size per processor is fixed. The method of parallelization generalizes to other sieves and will be efficient on any ensemble architecture. We investigate two highly parallel sieves using scattered decomposition and compare their performance on a hypercube multiprocessor. A comparison of different parallelization techniques for the sieve illustrates the trade-offs necessary in the design and implementation of massively parallel algorithms for large ensemble computers.
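    The decomposition behind such parallel sieves can be illustrated with a segmented Sieve of Eratosthenes in which independent workers sift disjoint ranges against shared base primes. This sketch uses Python threads as a small-scale stand-in for hypercube processing elements; the details are illustrative assumptions, not the paper's implementation.

    ```python
    import math
    from concurrent.futures import ThreadPoolExecutor

    def base_primes(limit):
        """Plain sieve for the base primes up to sqrt(n)."""
        is_p = [True] * (limit + 1)
        is_p[0:2] = [False, False]
        for i in range(2, math.isqrt(limit) + 1):
            if is_p[i]:
                is_p[i * i::i] = [False] * len(is_p[i * i::i])
        return [i for i, p in enumerate(is_p) if p]

    def sieve_segment(lo, hi, primes):
        """Mark composites in [lo, hi) using the shared base primes."""
        flags = [True] * (hi - lo)
        for p in primes:
            start = max(p * p, (lo + p - 1) // p * p)
            for m in range(start, hi, p):
                flags[m - lo] = False
        return [lo + i for i, keep in enumerate(flags) if keep]

    def parallel_sieve(n, workers=4):
        """Sift [2, n] in `workers` independently processed segments."""
        primes = base_primes(math.isqrt(n))
        step = (n + workers) // workers
        spans = [(lo, min(lo + step, n + 1)) for lo in range(2, n + 1, step)]
        with ThreadPoolExecutor(workers) as ex:
            parts = ex.map(lambda span: sieve_segment(*span, primes), spans)
        return [p for part in parts for p in part]
    ```

    Because each segment depends only on the shared base primes, the segments can be sifted with no communication at all, which is exactly the property that makes the scattered decomposition efficient on ensemble machines.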

  14. Project financing

    International Nuclear Information System (INIS)

    Cowan, A.

    1998-01-01

    Project financing was defined ('where a lender to a specific project has recourse only to the cash flow and assets of that project for repayment and security respectively') and its attributes were described. Project financing was said to be particularly well suited to power, pipeline, mining, telecommunications, petro-chemicals, road construction, and oil and gas projects, i.e. large infrastructure projects that are difficult to fund on-balance sheet, where the risk profile of a project does not fit the corporation's risk appetite, or where higher leverage is required. Sources of project financing were identified. The need to analyze and mitigate risks, and being aware that lenders always take a conservative view and gravitate towards the lowest common denominator, were considered the key to success in obtaining project financing funds. TransAlta Corporation's project financing experiences were used to illustrate the potential of this source of financing

  15. On risk, leverage and banks: do highly leveraged banks take on excessive risk?

    NARCIS (Netherlands)

    Koudstaal, M.; van Wijnbergen, S.

    2012-01-01

    This paper deals with the relation between excessive risk taking and capital structure in banks. Examining a quarterly dataset of U.S. banks between 1993 and 2010, we find that equity is valued higher when more risky portfolios are chosen when leverage is high, and that more risk taking has a

  16. Leveraging liquid dielectrophoresis for microfluidic applications

    International Nuclear Information System (INIS)

    Chugh, Dipankar; Kaler, Karan V I S

    2008-01-01

    Miniaturized fluidic systems have been developed in recent years and offer new and novel means of leveraging the domain of microfluidics for the development of micro-total analysis systems (μTAS). Initially, such systems employed closed microchannels in order to facilitate chip-based biochemical assays, requiring very small quantities of sample and/or reagents and furthermore providing rapid and low-cost analysis on a compact footprint. More recently, advancements in the domain of surface microfluidics have suggested that similar low volume sample handling and manipulation capabilities for bioassays can be attained by leveraging the phenomena of liquid dielectrophoresis and droplet dielectrophoresis (DEP), without the need for separate pumps or valves. Some of the key aspects of this surface microfluidic technology and its capabilities are discussed and highlighted in this paper. We, furthermore, examine the integration and utility of liquid DEP and droplet DEP in providing rapid and automated sample handling and manipulation capabilities on a compact chip-based platform

  17. Leveraging electronic health records for clinical research.

    Science.gov (United States)

    Raman, Sudha R; Curtis, Lesley H; Temple, Robert; Andersson, Tomas; Ezekowitz, Justin; Ford, Ian; James, Stefan; Marsolo, Keith; Mirhaji, Parsa; Rocca, Mitra; Rothman, Russell L; Sethuraman, Barathi; Stockbridge, Norman; Terry, Sharon; Wasserman, Scott M; Peterson, Eric D; Hernandez, Adrian F

    2018-04-30

    Electronic health records (EHRs) can be a major tool in the quest to decrease costs and timelines of clinical trial research, generate better evidence for clinical decision making, and advance health care. Over the past decade, EHRs have increasingly offered opportunities to speed up, streamline, and enhance clinical research. EHRs offer a wide range of possible uses in clinical trials, including assisting with prestudy feasibility assessment, patient recruitment, and data capture in care delivery. To fully appreciate these opportunities, health care stakeholders must come together to face critical challenges in leveraging EHR data, including data quality/completeness, information security, stakeholder engagement, and increasing the scale of research infrastructure and related governance. Leaders from academia, government, industry, and professional societies representing patient, provider, researcher, industry, and regulator perspectives convened the Leveraging EHR for Clinical Research Now! Think Tank in Washington, DC (February 18-19, 2016), to identify barriers to using EHRs in clinical research and to generate potential solutions. Think tank members identified a broad range of issues surrounding the use of EHRs in research and proposed a variety of solutions. Recognizing the challenges, the participants identified the urgent need to look more deeply at previous efforts to use these data, share lessons learned, and develop a multidisciplinary agenda for best practices for using EHRs in clinical research. We report the proceedings from this think tank meeting in the following paper. Copyright © 2018 Elsevier, Inc. All rights reserved.

  18. Adapting algorithms to massively parallel hardware

    CERN Document Server

    Sioulas, Panagiotis

    2016-01-01

    In recent years, the trend in computing has shifted from delivering processors with faster clock speeds to increasing the number of cores per processor. This marks a paradigm shift towards parallel programming, in which applications are programmed to exploit the power provided by multi-cores, usually with gains in time-to-solution and memory footprint. Specifically, this trend has sparked interest in massively parallel systems that can provide a large number of processors, and possibly computing nodes, as in GPUs and MPPAs (Massively Parallel Processor Arrays). In this project, the focus was on two distinct computing problems: k-d tree searches and track seeding cellular automata. The goal was to adapt the algorithms to parallel systems and evaluate their performance in different cases.
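    As a rough illustration of the first of these problems, the sketch below builds a minimal k-d tree and runs a batch of independent nearest-neighbour queries through a thread pool, the batch standing in for the query-level parallelism a GPU or MPPA would exploit. This is an illustrative toy, not the project's code.

    ```python
    import math
    import random
    from concurrent.futures import ThreadPoolExecutor

    def build(points, depth=0):
        """Recursively build a k-d tree node: (point, left, right, axis)."""
        if not points:
            return None
        axis = depth % len(points[0])
        points = sorted(points, key=lambda p: p[axis])
        mid = len(points) // 2
        return (points[mid],
                build(points[:mid], depth + 1),
                build(points[mid + 1:], depth + 1),
                axis)

    def nearest(node, query, best=None):
        """Return (distance, point) of the nearest neighbour to `query`."""
        if node is None:
            return best
        point, left, right, axis = node
        d = math.dist(point, query)
        if best is None or d < best[0]:
            best = (d, point)
        near, far = (left, right) if query[axis] < point[axis] else (right, left)
        best = nearest(near, query, best)
        if abs(query[axis] - point[axis]) < best[0]:
            best = nearest(far, query, best)   # far side may hold a closer point
        return best

    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(1000)]
    tree = build(pts)
    queries = [(random.random(), random.random()) for _ in range(100)]
    with ThreadPoolExecutor(max_workers=4) as ex:
        hits = list(ex.map(lambda q: nearest(tree, q), queries))
    ```

    Each query is read-only against the shared tree, so the batch parallelizes trivially; the recursive traversal is the part that real GPU ports typically rewrite iteratively.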

  19. The effect of leverage increases on real earnings management

    OpenAIRE

    Zagers-Mamedova, Irina

    2009-01-01

    The main subject of this paper is to understand whether there could be an incentive for managers to manipulate cash flow from operating activities (CFO) through the use of real earnings management (REM) in situations of increasing leverage. Based upon a study by Jelinek (2007), who researched the correlation between increasing levels of leverage and accrual earnings management, I developed my main hypothesis with respect to the effect of leverage increases on REM to influence CFO. R...

  20. Building Project Competence

    DEFF Research Database (Denmark)

    Pemsel, Sofia; Wiewiora, Anna

    This research investigates the development of project competence, and particularly, three related dynamic capabilities (shifting, adapting, leveraging) that contribute to project competence development. In doing so, we make use of the emerging literature on knowledge governance and theorize how...... of dynamic capability building promoting project competence development....

  1. Leveraging the national cyberinfrastructure for biomedical research.

    Science.gov (United States)

    LeDuc, Richard; Vaughn, Matthew; Fonner, John M; Sullivan, Michael; Williams, James G; Blood, Philip D; Taylor, James; Barnett, William

    2014-01-01

    In the USA, the national cyberinfrastructure refers to a system of research supercomputer and other IT facilities and the high speed networks that connect them. These resources have been heavily leveraged by scientists in disciplines such as high energy physics, astronomy, and climatology, but until recently they have been little used by biomedical researchers. We suggest that many of the 'Big Data' challenges facing the medical informatics community can be efficiently handled using national-scale cyberinfrastructure. Resources such as the Extreme Science and Discovery Environment, the Open Science Grid, and Internet2 provide economical and proven infrastructures for Big Data challenges, but these resources can be difficult to approach. Specialized web portals, support centers, and virtual organizations can be constructed on these resources to meet defined computational challenges, specifically for genomics. We provide examples of how this has been done in basic biology as an illustration for the biomedical informatics community.

  2. Leveraging Gaming Technology to Deliver Effective Training

    Science.gov (United States)

    Cimino, James D.

    2011-01-01

    The best way to engage a soldier is to present them with training content consistent with their learning preference. Blended Interactive Multimedia Instruction (IMI) can be used to teach soldiers what they need to do and how to do each step, and a COTS game engine can be utilized to actually practice the skills learned. Blended IMI provides an enjoyable experience for the soldier, thereby increasing retention rates and motivation while decreasing the time to subject mastery. Mobile devices have now emerged as an exciting new platform, literally placing the training into the soldier's hands. In this paper, we discuss how we leveraged commercial game engine technology, tightly integrated with the Blended IMI, to train soldiers on both laptops and mobile devices. We provide a recent case study of how this training is being utilized, its benefits, and student/instructor feedback.

  3. Financial Leverage Behaviour and Firm Performance: Evidence from Publicly Quoted Companies in Nigeria

    Directory of Open Access Journals (Sweden)

    Godsday Okoro Edesiri

    2014-08-01

    This paper scrutinizes the financial leverage behaviour and firm performance of publicly quoted companies in Nigeria. Data on leverage, profitability and firm size were sourced from the Nigerian Stock Exchange Fact-book and the annual reports and accounts of 120 publicly quoted companies in Nigeria during the period 1990 through 2013. Findings suggest that profitability and firm size had a negative effect on the financial leverage behaviour of publicly quoted companies in Nigeria. Thus, it was recommended that firms should carry out projects that would help enhance size and profitability in all aspects of the firm. Size in terms of assets would help increase internal funding. This in turn will have a positive impact on the financial structure of the firm, as more internally generated funds will be used instead of external borrowings. Firms should not assume that making a profit shows good application of leverage, as this was not found to be true from the analysis. This implies that the result can be relied upon for policy direction.
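    The kind of relationship reported here can be illustrated with a pooled OLS of leverage on profitability and firm size. The sketch below solves the normal equations on synthetic data constructed to mimic the reported negative effects; all coefficients and data are invented for illustration and are not the paper's dataset or estimates.

    ```python
    import random

    # Synthetic panel mimicking negative profitability/size effects on leverage.
    random.seed(42)
    n = 500
    profitability = [random.uniform(0.0, 0.3) for _ in range(n)]
    firm_size = [random.uniform(4.0, 9.0) for _ in range(n)]       # log assets
    leverage = [0.9 - 1.2 * p - 0.05 * s + random.gauss(0.0, 0.05)
                for p, s in zip(profitability, firm_size)]

    # Normal equations: (X'X) b = X'y for X = [1, profitability, size].
    X = [[1.0, p, s] for p, s in zip(profitability, firm_size)]
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)]
           for i in range(3)]
    Xty = [sum(row[i] * y for row, y in zip(X, leverage)) for i in range(3)]

    def solve3(A, b):
        """Solve a 3x3 linear system by Gauss-Jordan with partial pivoting."""
        M = [row[:] + [bi] for row, bi in zip(A, b)]
        for i in range(3):
            piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
            M[i], M[piv] = M[piv], M[i]
            for r in range(3):
                if r != i:
                    f = M[r][i] / M[i][i]
                    M[r] = [a - f * ai for a, ai in zip(M[r], M[i])]
        return [M[i][3] / M[i][i] for i in range(3)]

    intercept, beta_profit, beta_size = solve3(XtX, Xty)
    ```

    On data generated this way, both slope estimates come out negative, the qualitative pattern the abstract reports for the Nigerian sample.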

  4. GeoDash: Assisting Visual Image Interpretation in Collect Earth Online by Leveraging Big Data on Google Earth Engine

    Science.gov (United States)

    Markert, K. N.; Ashmall, W.; Johnson, G.; Saah, D. S.; Anderson, E.; Flores Cordova, A. I.; Díaz, A. S. P.; Mollicone, D.; Griffin, R.

    2017-12-01

    Collect Earth Online (CEO) is a free and open online implementation of the FAO Collect Earth system for collaboratively collecting environmental data through the visual interpretation of Earth observation imagery. The primary collection mechanism in CEO is human interpretation of land surface characteristics in imagery served via Web Map Services (WMS). However, interpreters may not have enough contextual information to classify samples by only viewing the imagery served via WMS, be they high resolution or otherwise. To assist in the interpretation and collection processes in CEO, SERVIR, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries, developed the GeoDash system, an embedded and critical component of CEO. GeoDash leverages Google Earth Engine (GEE) by allowing users to set up custom browser-based widgets that pull from GEE's massive public data catalog. These widgets can be quick looks of other satellite imagery, time series graphs of environmental variables, and statistics panels of the same. Users can customize widgets with any of GEE's image collections, such as the historical Landsat collection with data available since the 1970s, select date ranges, image stretch parameters, graph characteristics, and create custom layouts, all on-the-fly to support plot interpretation in CEO. This presentation focuses on the implementation and potential applications, including the back-end links to GEE and the user interface with custom widget building. GeoDash takes large data volumes and condenses them into meaningful, relevant information for interpreters. While designed initially with national and global forest resource assessments in mind, the system will complement disaster assessments, agriculture management, project monitoring and evaluation, and more.

  6. KBS4FIA: Leveraging advanced knowledge-based systems for financial information analysis

    OpenAIRE

    García-Sánchez, Francisco; Paredes-Valverde, Mario Andrés; Valencia García, Rafael; Alcaraz Mármol, Gema; Almela Sánchez-Lafuente, Ángela

    2017-01-01

    Decision making takes place in an environment of uncertainty. Therefore, it is necessary to have information which is as accurate and complete as possible in order to minimize the risk that is inherent to the decision-making process. In the financial domain, the situation becomes even more critical due to the intrinsic complexity of the analytical tasks within this field. The main aim of the KBS4FIA project is to automate the processes associated with financial analysis by leveraging the tech...

  7. The Destructive Birth of Massive Stars and Massive Star Clusters

    Science.gov (United States)

    Rosen, Anna; Krumholz, Mark; McKee, Christopher F.; Klein, Richard I.; Ramirez-Ruiz, Enrico

    2017-01-01

    Massive stars play an essential role in the Universe. They are rare, yet the energy and momentum they inject into the interstellar medium with their intense radiation fields dwarfs the contribution by their vastly more numerous low-mass cousins. Previous theoretical and observational studies have concluded that the feedback associated with massive stars' radiation fields is the dominant mechanism regulating massive star and massive star cluster (MSC) formation. Therefore detailed simulation of the formation of massive stars and MSCs, which host hundreds to thousands of massive stars, requires an accurate treatment of radiation. For this purpose, we have developed a new, highly accurate hybrid radiation algorithm that properly treats the absorption of the direct radiation field from stars and the re-emission and processing by interstellar dust. We use our new tool to perform a suite of three-dimensional radiation-hydrodynamic simulations of the formation of massive stars and MSCs. For individual massive stellar systems, we simulate the collapse of massive pre-stellar cores with laminar and turbulent initial conditions and properly resolve regions where we expect instabilities to grow. We find that mass is channeled to the massive stellar system via gravitational and Rayleigh-Taylor (RT) instabilities. For laminar initial conditions, proper treatment of the direct radiation field produces later onset of RT instability, but does not suppress it entirely provided the edges of the radiation-dominated bubbles are adequately resolved. RT instabilities arise immediately for turbulent pre-stellar cores because the initial turbulence seeds the instabilities. To model MSC formation, we simulate the collapse of a dense, turbulent, magnetized M_cl = 10^6 M⊙ molecular cloud. We find that the influence of the magnetic pressure and radiative feedback slows down star formation. Furthermore, we find that star formation is suppressed along dense filaments where the magnetic field is

  8. The effect of financial leverage on profitability of manufacturing ...

    African Journals Online (AJOL)

    The effect of financial leverage on profitability of manufacturing companies listed on the Ghana stock exchange. ... Journal of Business Research ... For many years many studies have focused on the effect of financial leverage on firm performance and yet there has been no specific result that can be generalized regarding ...

  9. 17 CFR 31.8 - Cover of leverage contracts.

    Science.gov (United States)

    2010-04-01

    ... receipt for two business days: Provided, however, That the amount of physical commodities subject to such... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Cover of leverage contracts. 31.8 Section 31.8 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION LEVERAGE...

  10. Color Magnitude Diagrams of Old, Massive GCs in M31

    Science.gov (United States)

    Caldwell, Nelson; Williams, B.; Dolphin, A. E.; Johnson, L. C.; Weisz, D. R.

    2013-01-01

    Multicolor stellar photometry of HST data of M31, collected as part of the PHAT project, has been performed using the DOLPHOT suite of programs. We present color-magnitude diagrams created in F475W and F814W (BI) for more than 50 massive, old clusters. These are clusters in or projected on the disk. We compare the metallicities derived from the color of the giant-branch stars with those derived from integrated-light spectroscopy. We also compare the ages of massive, young clusters with those found from spectra.

  11. Personalized Infrastructure: Leveraging Behavioral Strategies for Future Mobility

    Energy Technology Data Exchange (ETDEWEB)

    Duvall, Andrew L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-10-23

    toward an ever more complex mobility landscape, and with a quickly growing population, we look for answers to these questions as the core of developing strategies for the future of transportation. Using available data from emergent modes, and experiments conducted as part of an Advanced Research Projects Agency - Energy (ARPA-E) Traveler Response Architecture using Novel Signaling for Network Efficiency in Transportation (TRANSNET) project, we look at how the sharing economy and transportation mobility services have begun to radically alter transportation behavior, while operating in parallel with traditional transportation infrastructure. Emerging modes and practices are affecting car dependence and enabling multimodality. We weigh influences on travel behavior, identify decision breakpoints where inelastic behavior becomes elastic, and examine incentives and societal leverage points.

  12. Distress risk and leverage puzzles: Evidence from Taiwan

    Directory of Open Access Journals (Sweden)

    Kung-Cheng Ho

    2016-05-01

    Financial distress has been invoked in the asset pricing literature to explain the anomalous patterns in the cross-section of stock returns. The risk of financial distress can be measured using indexes. George and Hwang (2010) suggest that leverage can explain the distress risk puzzle and that firms with high costs choose low leverage to reduce distress intensities and earn high returns. This study investigates whether this relationship exists in the Taiwan market. When examined separately, distress intensity is found to be negatively related to stock returns, but leverage is found to not be significantly related to stock returns. The results are the same when distress intensity and leverage are examined simultaneously. After assessing the robustness by using O-scores, the distress risk puzzle is found to exist in the Taiwan market, but the leverage puzzle is not.

  13. Projectables

    DEFF Research Database (Denmark)

    Rasmussen, Troels A.; Merritt, Timothy R.

    2017-01-01

    CNC cutting machines have become essential tools for designers and architects enabling rapid prototyping, model-building and production of high quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would previously have been thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts, reducing material waste. Contextual interviews and ideation sessions led to a deeper ...

  14. Massive Black Hole Binary Evolution

    Directory of Open Access Journals (Sweden)

    Merritt David

    2005-11-01

    Coalescence of binary supermassive black holes (SBHs) would constitute the strongest sources of gravitational waves to be observed by LISA. While the formation of binary SBHs during galaxy mergers is almost inevitable, coalescence requires that the separation between binary components first drop by a few orders of magnitude, due presumably to interaction of the binary with stars and gas in a galactic nucleus. This article reviews the observational evidence for binary SBHs and discusses how they would evolve. No completely convincing case of a bound, binary SBH has yet been found, although a handful of systems (e.g. interacting galaxies; remnants of galaxy mergers) are now believed to contain two SBHs at projected separations of <~ 1 kpc. N-body studies of binary evolution in gas-free galaxies have reached large enough particle numbers to reproduce the slow, “diffusive” refilling of the binary’s loss cone that is believed to characterize binary evolution in real galactic nuclei. While some of the results of these simulations - e.g. the binary hardening rate and eccentricity evolution - are strongly N-dependent, others - e.g. the “damage” inflicted by the binary on the nucleus - are not. Luminous early-type galaxies often exhibit depleted cores with masses of ~ 1-2 times the mass of their nuclear SBHs, consistent with the predictions of the binary model. Studies of the interaction of massive binaries with gas are still in their infancy, although much progress is expected in the near future. Binary coalescence has a large influence on the spins of SBHs, even for mass ratios as extreme as 10:1, and evidence of spin-flips may have been observed.

  15. Massive gravity from bimetric gravity

    International Nuclear Information System (INIS)

    Baccetti, Valentina; Martín-Moruno, Prado; Visser, Matt

    2013-01-01

    We discuss the subtle relationship between massive gravity and bimetric gravity, focusing particularly on the manner in which massive gravity may be viewed as a suitable limit of bimetric gravity. The limiting procedure is more delicate than currently appreciated. Specifically, this limiting procedure should not unnecessarily constrain the background metric, which must be externally specified by the theory of massive gravity itself. The fact that in bimetric theories one always has two sets of metric equations of motion continues to have an effect even in the massive gravity limit, leading to additional constraints besides the one set of equations of motion naively expected. Thus, since solutions of bimetric gravity in the limit of vanishing kinetic term are also solutions of massive gravity, but the contrary statement is not necessarily true, there is no complete continuity in the parameter space of the theory. In particular, we study the massive cosmological solutions which are continuous in the parameter space, showing that many interesting cosmologies belong to this class. (paper)

  16. Leveraging GIS in a real-time data environment

    Energy Technology Data Exchange (ETDEWEB)

    Nemeth, D.B. [Panhandle Energy, Houston, TX (United States)]; Spangler, J. [Global Information Systems, Kearney, MO (United States)]

    2010-07-01

    This presentation discussed a project to integrate Gas Control (GC) with a Geographic Information System (GIS) for meeting asset, schematic, and mapping needs. The new system allows maps to be updated accurately and in real time, thereby avoiding first-flow delays. This is a substantial improvement over the previous system, in which maps were updated annually. GC users required a greater depth of data, the authority of update data and send commands, and viewing capability for real-time values for flow and pressure, with multiple concurrent views of the system and near constant availability of views and data. GC users needed to be able to see asset attributes with real-time values; send commands to facilities/equipment to control product flow; coordinate with asset management teams to control product flow; and have strict data/quality control processes. The project team defined and refined the system requirements, reviewed technologies that could be leveraged into a solution, provided data clean-up/migration services to supplement the GIS database with additional data needed for Supervisory Control and Data Acquisition (SCADA), and created overlays of pipe information for map viewing annotated with real-time data readings/asset information. Detailed schematics were presented for the data flow migration. The project resulted in the completed data capture process to supplement GIS asset data for the 5,000-mile Florida Gas Transmission (FGT) system, the completed clean-up of network and schematic diagrams, and the linking of real-time operations data for FGT. The presentation concluded with a discussion regarding opportunities for improvement to the user interface. 24 figs.

  17. Leverage Website Favicon to Detect Phishing Websites

    Directory of Open Access Journals (Sweden)

    Kang Leng Chiew

    2018-01-01

    Phishing attack is a cybercrime that can lead to severe financial losses for Internet users and entrepreneurs. Typically, phishers are fond of using fuzzy techniques during the creation of a website. They confuse the victim by imitating the appearance and content of a legitimate website. In addition, many websites are vulnerable to phishing attacks, including financial institutions, social networks, e-commerce, and airline websites. This paper is an extension of our previous work that leverages the favicon with Google image search to reveal the identity of a website. Our identity retrieval technique involves an effective mathematical model that can be used to assist in retrieving the right identity from the many entries of the search results. In this paper, we introduced an enhanced version of the favicon-based phishing attack detection with the introduction of the Domain Name Amplification feature and the incorporation of additional features. Additional features are very useful when the website being examined does not have a favicon. We have collected a total of 5,000 phishing websites from PhishTank and 5,000 legitimate websites from Alexa to verify the effectiveness of the proposed method. From the experimental results, we achieved a 96.93% true positive rate with only a 4.13% false positive rate.
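    The reported rates follow the standard confusion-matrix definitions. A minimal sketch (the counts below are hypothetical, chosen only to match the scale of the 5,000/5,000 evaluation set; `rates` is an illustrative helper, not code from the paper):

```python
# Confusion-matrix rates for a binary phishing classifier.
# Counts below are hypothetical, not the paper's actual tallies.

def rates(tp, fn, fp, tn):
    """Return (true positive rate, false positive rate)."""
    tpr = tp / (tp + fn)   # phishing sites correctly flagged
    fpr = fp / (fp + tn)   # legitimate sites wrongly flagged
    return tpr, fpr

# 5,000 phishing and 5,000 legitimate sites, as in the evaluation set.
tpr, fpr = rates(tp=4846, fn=154, fp=206, tn=4794)
print(f"TPR = {tpr:.2%}, FPR = {fpr:.2%}")  # → TPR = 96.92%, FPR = 4.12%
```

    A high TPR alone is not enough; the FPR matters because flagging legitimate sites erodes user trust in the detector.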

  18. Leveraging Distributions in Physical Unclonable Functions

    Directory of Open Access Journals (Sweden)

    Wenjie Che

    2017-10-01

    A special class of Physical Unclonable Functions (PUFs) referred to as strong PUFs can be used in novel hardware-based authentication protocols. Strong PUFs are required for authentication because the bit strings and helper data are transmitted openly by the token to the verifier, and therefore are revealed to the adversary. This enables the adversary to carry out attacks against the token by systematically applying challenges and obtaining responses in an attempt to machine learn, and later predict, the token’s response to an arbitrary challenge. Therefore, strong PUFs must both provide an exponentially large challenge space and be resistant to machine-learning attacks in order to be considered secure. We investigate a transformation called temperature–voltage compensation (TVCOMP), which is used within the Hardware-Embedded Delay PUF (HELP) bit string generation algorithm. TVCOMP increases the diversity and unpredictability of the challenge–response space, and therefore increases resistance to model-building attacks. HELP leverages within-die variations in path delays as a source of random information. TVCOMP is a linear transformation designed specifically for dealing with changes in delay introduced by adverse temperature–voltage (environmental) variations. In this paper, we show that TVCOMP also increases entropy and expands the challenge–response space dramatically.

  19. Holographically viable extensions of topologically massive and minimal massive gravity?

    Science.gov (United States)

    Altas, Emel; Tekin, Bayram

    2016-01-01

    Recently [E. Bergshoeff et al., Classical Quantum Gravity 31, 145008 (2014)], an extension of the topologically massive gravity (TMG) in 2+1 dimensions, dubbed as minimal massive gravity (MMG), which is free of the bulk-boundary unitarity clash that afflicts the former theory and all the other known three-dimensional theories, was found. Field equations of MMG differ from those of TMG at quadratic terms in the curvature that do not come from the variation of an action depending on the metric alone. Here we show that MMG is a unique theory and there does not exist a deformation of TMG or MMG at the cubic and quartic order (and beyond) in the curvature that is consistent at the level of the field equations. The only extension of TMG with the desired bulk and boundary properties having a single massive degree of freedom is MMG.

  20. Cash Holdings and Leverage of German Listed Firms

    DEFF Research Database (Denmark)

    Rapp, Marc Steffen; Killi, Andreas Maximilian

    2016-01-01

    We examine cash holdings and leverage levels of German listed (non-financial and non-utility) firms. We document a secular increase in cash ratios over the last twenty years (1992–2011), reducing the net debt book leverage ratio for the average sample firm close to zero. Using prediction models ... firms are associated with measures of uncertainty faced by firms. Our results suggest that German firms have increased (reduced) their cash (net debt leverage) levels over time in order to adopt more precautionary financial policies.

  1. Massive Submucosal Ganglia in Colonic Inertia.

    Science.gov (United States)

    Naemi, Kaveh; Stamos, Michael J; Wu, Mark Li-Cheng

    2018-02-01

    Context: Colonic inertia is a debilitating form of primary chronic constipation with unknown etiology and diagnostic criteria, often requiring pancolectomy. We have occasionally observed massively enlarged submucosal ganglia containing at least 20 perikarya, in addition to previously described giant ganglia with greater than 8 perikarya, in cases of colonic inertia. These massively enlarged ganglia have yet to be formally recognized. Objective: To determine whether such "massive submucosal ganglia," defined as ganglia harboring at least 20 perikarya, characterize colonic inertia. Design: We retrospectively reviewed specimens from colectomies of patients with colonic inertia and compared the prevalence of massive submucosal ganglia occurring in this setting to the prevalence of massive submucosal ganglia occurring in a set of control specimens from patients lacking chronic constipation. Results: Seven of 8 specimens affected by colonic inertia harbored 1 to 4 massive ganglia, for a total of 11 massive ganglia. One specimen lacked massive ganglia but had limited sampling and nearly massive ganglia. Massive ganglia occupied both superficial and deep submucosal plexus. The patient with 4 massive ganglia also had 1 mitotically active giant ganglion. Only 1 massive ganglion occupied the entire set of 10 specimens from patients lacking chronic constipation. Conclusions: We performed the first, albeit distinctly small, study of massive submucosal ganglia and showed that massive ganglia may be linked to colonic inertia. Further, larger studies are necessary to determine whether massive ganglia are pathogenetic or secondary phenomena, and whether massive ganglia or mitotically active ganglia distinguish colonic inertia from other types of chronic constipation.

  2. Leveraging R&D Resources via the Joint LLC Model

    Science.gov (United States)

    Ganz, Matthew W.

    2008-03-01

    Industrial scientific research labs have become increasingly stressed in recent years by a variety of external forces. Both corporations and government funding agencies have shifted their priorities from long-term fundamental research toward projects that have a high probability of shorter-term payoff. Industrial funding has been further stressed by an increasing demand for quarterly results and fierce global competition. Industry leaders are now asking their R&D labs for “home runs” and not just a solid base in the physical sciences. The end of the Cold War has also left the US without a declared enemy whose overt intention was to defeat us through a mastery of large-scale weaponry based upon exploitation of fundamental physics. This, when combined with a bona-fide need for technology gap fillers to respond to on-the-ground threats in the current Middle East conflicts, has led to diminished government emphasis on long-term research in the physical sciences. Simultaneously, the global sources of R&D spending are expanding. The dramatic growth of private equity in the technology development arena has both drawn talent from industry and changed the expectations on researchers. R&D spending in China, India and many other countries is growing significantly. Thus, in order to become relevant, industry must now keep its finger on the pulse of the hundreds of billions of dollars being invested privately and publicly around the world. HRL Laboratories, LLC in Malibu, California represents a unique and successful new business model for industrial R&D. HRL was founded by Howard Hughes in 1948 as the Hughes Research Laboratory and for more than four decades was the internal R&D lab for the Hughes Aircraft Company. After a series of mergers, acquisitions and divestitures over the past 15 years, HRL is now a stand-alone LLC that is owned jointly by General Motors and the Boeing Company. HRL, with a staff of about 300, performs R&D services for GM and Boeing as well as for

  3. Leveraging New and Social Media to Educate the Masses

    Science.gov (United States)

    Gay, Pamela; CosmoQuest Team

    2018-01-01

    In today's connected world, scientists as individuals and as projects and institutions are turning to blogs, videos, and social media outlets like Twitter to share achievements, request aid, and discuss the issues of our science. Beyond sharing the thing-of-the-moment, these platforms also provide an environment where education is possible, and where creativity allows educators to engage broad audiences in active learning. In this presentation, we discuss how polling, ask-me-anything sessions, emoji, and animated gifs can be leveraged to test knowledge and facilitate engagement.Beyond looking at these techniques, we also examine audience engagement. Previously, it has been unclear if our day-to-day social media efforts have been merely preaching to one homogeneous choir from which we have all drawn our audiences, or if our individual efforts have been able to reach into different communities to multiply our impact. In this preliminary study, we examine the social media audiences of several space science Twitter feeds that relate to: podcasting; professional societies; individual programs; and individuals. This study directly measures the overlap in audiences and the diversity of interests held by these audiences. Through statistical analysis, we can discern if these audiences are all drawn from one single population, or if we are sampling different base populations with different feeds. The data generated in this project allow us to look beyond how our audiences interact with space science, with the added benefit of revealing their other interests. These interests are reflected by the non-space science accounts they follow on Twitter. This information will allow us to effectively recruit new people from space science adjacent interests.

  4. Leveraging Interactive Patient Care Technology to Improve Pain Management Engagement.

    Science.gov (United States)

    Rao-Gupta, Suma; Kruger, David; Leak, Lonna D; Tieman, Lisa A; Manworren, Renee C B

    2017-12-15

    Most children experience pain in hospitals; and their parents report dissatisfaction with how well pain was managed. Engaging patients and families in the development and evaluation of pain treatment plans may improve perceptions of pain management and hospital experiences. The aim of this performance improvement project was to engage patients and families to address hospitalized pediatric patients' pain using interactive patient care technology. The goal was to stimulate conversations about pain management expectations and perceptions of treatment plan effectiveness among patients, parents, and health care teams. Plan-Do-Study-Act was used to design, develop, test, and pilot new workflows to integrate the interactive patient care technology system with the automated medication dispensing system and document actions from both systems into the electronic health record. The pediatric surgical unit and hematology/oncology unit of a free-standing, university-affiliated, urban children's hospital were selected to pilot this performance improvement project because of the high prevalence of pain from surgeries and hematologic and oncologic diseases, treatments, and invasive procedures. Documentation of pain assessments, nonpharmacologic interventions, and evaluation of treatment effectiveness increased. The proportion of positive family satisfaction responses for pain management significantly increased from fiscal year 2014 to fiscal year 2016 (p = .006). By leveraging interactive patient care technologies, patients and families were engaged to take an active role in pain treatment plans and evaluation of treatment outcomes. Improved active communication and partnership with patients and families can effectively change organizational culture to be more sensitive to patients' pain and patients' and families' hospital experiences. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  5. TIA: algorithms for development of identity-linked SNP islands for analysis by massively parallel DNA sequencing.

    Science.gov (United States)

    Farris, M Heath; Scott, Andrew R; Texter, Pamela A; Bartlett, Marta; Coleman, Patricia; Masters, David

    2018-04-11

    Single nucleotide polymorphisms (SNPs) located within the human genome have been shown to have utility as markers of identity in the differentiation of DNA from individual contributors. Massively parallel DNA sequencing (MPS) technologies and human genome SNP databases allow for the design of suites of identity-linked target regions, amenable to sequencing in a multiplexed and massively parallel manner. Therefore, tools are needed for leveraging the genotypic information found within SNP databases for the discovery of genomic targets that can be evaluated on MPS platforms. The SNP island target identification algorithm (TIA) was developed as a user-tunable system to leverage SNP information within databases. Using data within the 1000 Genomes Project SNP database, human genome regions were identified that contain globally ubiquitous identity-linked SNPs and that were responsive to targeted resequencing on MPS platforms. Algorithmic filters were used to exclude target regions that did not conform to user-tunable SNP island target characteristics. To validate the accuracy of TIA for discovering these identity-linked SNP islands within the human genome, SNP island target regions were amplified from 70 contributor genomic DNA samples using the polymerase chain reaction. Multiplexed amplicons were sequenced using the Illumina MiSeq platform, and the resulting sequences were analyzed for SNP variations. 166 putative identity-linked SNPs were targeted in the identified genomic regions. Of the 309 SNPs that provided discerning power across individual SNP profiles, 74 previously undefined SNPs were identified during evaluation of targets from individual genomes. Overall, DNA samples of 70 individuals were uniquely identified using a subset of the suite of identity-linked SNP islands. 
TIA offers a tunable genome search tool for the discovery of targeted genomic regions that are scalable in the population frequency and numbers of SNPs contained within the SNP island regions.
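    The island-discovery step can be sketched as a simple windowed scan. This is a toy reconstruction of the general idea only: the fixed window length, minimum SNP count, and positions below are invented stand-ins for TIA's user-tunable island criteria.

```python
# Toy SNP-island scan: report merged regions of a chromosome where at
# least `min_snps` SNP positions fall within a `window`-base span.
# Window size and threshold are illustrative, not TIA's tuned parameters.

def snp_islands(snp_positions, window=200, min_snps=3):
    """Return merged (start, end) regions holding >= min_snps SNPs."""
    positions = sorted(snp_positions)
    islands = []
    i = 0  # left edge of the sliding window
    for j in range(len(positions)):
        while positions[j] - positions[i] >= window:
            i += 1
        if j - i + 1 >= min_snps:
            start, end = positions[i], positions[j]
            if islands and start <= islands[-1][1]:
                islands[-1] = (islands[-1][0], end)  # extend current island
            else:
                islands.append((start, end))
    return islands

print(snp_islands([100, 150, 180, 900, 950, 955, 960]))
```

    A real implementation would additionally filter on population allele frequencies so that only globally ubiquitous, identity-informative SNPs anchor an island.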

  6. Enterprise Cloud Adoption: Leveraging on the Business and ...

    African Journals Online (AJOL)

    Enterprise Cloud Adoption: Leveraging on the Business and Security Benefits. ... on security, privacy and forensic issues associated with this new computing platform for ... Keywords: Cloud Computing, Cloud Security, Cloud Forensic, Security ...

  7. 78 FR 17766 - Interagency Guidance on Leveraged Lending

    Science.gov (United States)

    2013-03-22

    ... high-level principles related to safe-and-sound leveraged lending activities, including underwriting considerations, assessing and documenting enterprise value, risk management expectations for credits awaiting distribution, stress-testing expectations, pipeline portfolio management, and risk management expectations for...

  8. Estimation of the Continuous and Discontinuous Leverage Effects.

    Science.gov (United States)

    Aït-Sahalia, Yacine; Fan, Jianqing; Laeven, Roger J A; Wang, Christina Dan; Yang, Xiye

    2017-01-01

    This paper examines the leverage effect, or the generally negative covariation between asset returns and their changes in volatility, under a general setup that allows the log-price and volatility processes to be Itô semimartingales. We decompose the leverage effect into continuous and discontinuous parts and develop statistical methods to estimate them. We establish the asymptotic properties of these estimators. We also extend our methods and results (for the continuous leverage) to the situation where there is market microstructure noise in the observed returns. We show in Monte Carlo simulations that our estimators have good finite sample performance. When applying our methods to real data, our empirical results provide convincing evidence of the presence of the two leverage effects, especially the discontinuous one.

  9. Leveraged exchange-traded funds price dynamics and options valuation

    CERN Document Server

    Leung, Tim

    2016-01-01

    This book provides an analysis, under both discrete-time and continuous-time frameworks, on the price dynamics of leveraged exchange-traded funds (LETFs), with emphasis on the roles of leverage ratio, realized volatility, investment horizon, and tracking errors. This study provides new insights on the risks associated with LETFs. It also leads to the discussion of new risk management concepts, such as admissible leverage ratios and admissible risk horizon, as well as the mathematical and empirical analyses of several trading strategies, including static portfolios, pairs trading, and stop-loss strategies involving ETFs and LETFs. The final part of the book addresses the pricing of options written on LETFs. Since different LETFs are designed to track the same reference index, these funds and their associated options share very similar sources of randomness. The authors provide a no-arbitrage pricing approach that consistently value options on LETFs with different leverage ratios with stochastic volatility and ...
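    The central feature of LETF price dynamics, that a daily-rebalanced fund compounds L times each daily return rather than L times the horizon return, can be illustrated in a few lines. The index path below is hypothetical and not taken from the book.

```python
# Volatility decay in a daily-rebalanced leveraged ETF: over a volatile,
# roughly flat index path, the 2x fund loses noticeably more than 2x the
# index's cumulative loss. The index path is hypothetical.

def letf_value(daily_returns, leverage=2.0, start=100.0):
    """Compound L times each daily return (daily rebalancing)."""
    value = start
    for r in daily_returns:
        value *= 1.0 + leverage * r
    return value

path = [0.05, -0.05] * 50           # index oscillates +5%, -5% for 100 days
index = letf_value(path, leverage=1.0)
letf2x = letf_value(path, leverage=2.0)
print(f"index: {index:.2f}, 2x LETF: {letf2x:.2f}")
```

    This gap between leveraged compounding and the leveraged horizon return is exactly the tracking error that the book quantifies via realized volatility and investment horizon.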

  10. Key Technologies in Massive MIMO

    Directory of Open Access Journals (Sweden)

    Hu Qiang

    2018-01-01

    The explosive growth of wireless data traffic in the future fifth generation mobile communication system (5G) has led researchers to develop new disruptive technologies. As an extension of traditional MIMO technology, massive MIMO can greatly improve throughput and energy efficiency, and can effectively improve link reliability and data transmission rates, making it an important research direction of 5G wireless communication. Massive MIMO has developed rapidly as a new technology over roughly the last three years; by greatly increasing the number of antennas and operating in a duplex communication mode, it raises system spectral efficiency to an unprecedented height.
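    The claimed spectral-efficiency gains can be illustrated with an idealized array-gain model (not from the article; the user count and per-user SNR below are invented): with simple linear processing, effective per-user SNR grows roughly linearly in the base-station antenna count M.

```python
# Back-of-the-envelope massive MIMO sum rate under an idealized
# array-gain model: per-user SNR scales as M/K for M antennas serving
# K users. Parameters are illustrative, not from the article.
import math

def sum_rate(m_antennas, users=10, snr_per_user=1.0):
    """Idealized uplink sum rate in bits/s/Hz."""
    return users * math.log2(1 + m_antennas * snr_per_user / users)

for m in (10, 100, 1000):
    print(f"M={m:4d}: {sum_rate(m):5.1f} bits/s/Hz")
```

    The logarithm explains why the gain from more antennas is large but sub-linear: going from 100 to 1,000 antennas roughly doubles, rather than multiplies by ten, the sum rate in this model.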

  11. Hunting for a massive neutrino

    CERN Document Server

    AUTHOR|(CDS)2108802

    1997-01-01

    A great effort is devoted by many groups of physicists all over the world to give an answer to the following question: Is the neutrino massive? This question has profound implications for particle physics, astrophysics and cosmology, in relation to the so-called Dark Matter puzzle. The neutrino oscillation process, in particular, can only occur if the neutrino is massive. An overview of the neutrino mass measurements, of the oscillation formalism and experiments will be given, also in connection with the present experimental programme at CERN with the two experiments CHORUS and NOMAD.

  12. The Effects of Logistics Leverage in Marketing Systems

    OpenAIRE

    G.N. Okeudo

    2012-01-01

    An effective logistics system, when incorporated into marketing, can strengthen its operations and further give the firm a competitive edge. To design a marketing system which must maintain its market share, a firm must consider the effects of logistics and how its integration into marketing can produce several points of leverage. It is the purpose of this paper to highlight the leverage points available in any logistics unit and to further analyze how marketing managers can work in sync with ...

  13. Leverage, debt maturity and firm investment: An empirical analysis

    OpenAIRE

    Dang, Viet A.

    2011-01-01

    In this paper, we examine the potential interactions of corporate financing and investment decisions in the presence of incentive problems. We develop a system-based approach to investigate the effects of growth opportunities on leverage and debt maturity as well as the effects of these financing decisions on firm investment. Using a panel of UK firms between 1996 and 2003, we find that high-growth firms control underinvestment incentives by reducing leverage but not by shortening debt maturity ...

  14. The Role of Target Leverage in Security Issues and Repurchases

    OpenAIRE

    Armen Hovakimian

    2004-01-01

    The paper examines whether security issues and repurchases adjust the capital structure toward the target. The time-series patterns of debt ratios imply that only debt reductions are initiated to offset the accumulated deviation from target leverage. The importance of target leverage in earlier debt-equity choice studies is driven by the subsample of equity issues accompanied by debt reductions. Unlike debt issues and reductions, equity issues and repurchases have no significant lasting effect ...

  15. The VLT-FLAMES Tarantula Survey. III. A very massive star in apparent isolation from the massive cluster R136

    NARCIS (Netherlands)

    Bestenlehner, J.M.; Vink, J.S.; Gräfener, G.; Najarro, F.; Evans, C.J.; Bastian, N.; Bonanos, A.Z.; Bressert, E.; Crowther, P.A.; Doran, E.; Friedrich, K.; Hénault-Brunet, V.; Herrero, A.; de Koter, A.; Langer, N.; Lennon, D.J.; Maíz Apellániz, J.; Sana, H.; Soszynski, I.; Taylor, W.D.

    2011-01-01

    VFTS 682 is located in an active star-forming region, at a projected distance of 29 pc from the young massive cluster R136 in the Tarantula Nebula of the Large Magellanic Cloud. It was previously reported as a candidate young stellar object, and more recently spectroscopically revealed as a

  16. THE IMPACT OF FINANCIAL LEVERAGE ON RETURN AND RISK

    Directory of Open Access Journals (Sweden)

    HAKAN SARITAŞ

    2013-05-01

    Financing with debt and preferred stock to increase the potential return to the residual common shareholders’ equity is referred to as financial leverage. A firm’s return on equity (ROE) is a key determinant of the growth rate of its earnings. Return on equity is affected profoundly by the firm’s degree of financial leverage. Increased debt will make a positive contribution to a firm’s ROE only if the firm’s return on assets (ROA) exceeds the interest rate on the debt. Although financial leverage increases the rate of return on common stock equity, the greater the proportion of debt in the capital structure, the greater the risk the common shareholders bear. Introduction of financial leverage increases the average profitability of the firm as well as its risk. In good economic years, the impact of financial leverage will most likely be positive; however, the leverage effect may be negative in relatively bad years. Traditionally, studies treated short-term debt and long-term debt as perfect substitutes for each other. There is, however, risk-sharing by long-term debtholders, which makes short-term debt financing riskier to shareholders than long-term debt financing. The significant effect associated with total debt usage is largely attributable to short-term debt financing, since the impact of short-term debt financing on the expected returns is shown to be greater than that of long-term debt financing.
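    The abstract's break-even condition can be stated exactly. Ignoring taxes, with debt D, equity E, return on assets ROA, and interest rate i, ROE = ROA + (D/E)(ROA - i), so added debt raises ROE precisely when ROA > i. A worked sketch with illustrative numbers:

```python
# Standard no-tax ROE decomposition: leverage amplifies the spread
# between return on assets and the cost of debt. Numbers are illustrative.

def roe(roa, i, debt, equity):
    """Return on equity under leverage, ignoring taxes:
    ROE = ROA + (D/E) * (ROA - i)."""
    return roa + (debt / equity) * (roa - i)

unlevered = roe(0.08, 0.06, debt=0, equity=100)   # no debt: ROE equals ROA
good_year = roe(0.10, 0.06, debt=50, equity=50)   # ROA > i: leverage helps
bad_year  = roe(0.04, 0.06, debt=50, equity=50)   # ROA < i: leverage hurts
print(unlevered, good_year, bad_year)
```

    The same D/E ratio that lifts ROE from 10% to 14% in the good year drags it from 4% to 2% in the bad year, which is the amplification of risk the abstract describes.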

  17. Leveraging e-learning in medical education.

    Science.gov (United States)

    Lewis, Kadriye O; Cidon, Michal J; Seto, Teresa L; Chen, Haiqin; Mahan, John D

    2014-07-01

    e-Learning has become a popular medium for delivering instruction in medical education. This innovative method of teaching offers unique learning opportunities for medical trainees. The purpose of this article is to define the present state of e-learning in pediatrics and how best to leverage e-learning for educational effectiveness and change in medical education. By addressing under-examined and neglected areas in implementation strategies for e-learning, its usefulness in medical education can be expanded. This study used a systematic database review of published studies in the field of e-learning in pediatric training between 2003 and 2013. The search was conducted using educational and health databases (Scopus, ERIC, PubMed) and the search engines Google and Hakia. A total of 72 reference articles were suitable for analysis. This review is supplemented by the use of "e-Learning Design Screening Questions" to define e-learning design and development in 10 randomly selected articles. Data analysis used template-based coding themes and counting of the categories using descriptive statistics. Our search for pediatric e-learning (using Google and Hakia) resulted in six well-defined resources designed to support the professional development of doctors, residents, and medical students. The majority of studies focused on instructional effectiveness and satisfaction. There were few studies about e-learning development, implementation, and the needs assessments used to identify institutional and learners' needs. The reviewed studies used various study designs, measurement tools, instructional time, and materials for e-learning interventions. e-Learning is a viable solution for medical educators faced with many challenges, including (1) promoting self-directed learning, (2) providing flexible learning opportunities that offer continuous (24 hours/day, 7 days/week) availability for learners, and (3) engaging learners through collaborative learning communities to gain

  18. Massive Neurofibroma of the Breast

    African Journals Online (AJOL)

    Valued eMachines Customer

    Neurofibromas are benign nerve sheath tumors that are extremely rare in the breast. We report a massive ... plexiform breast neurofibromas may transform into a malignant peripheral nerve sheath tumor1. We present a case .... Breast neurofibroma. http://www.breast-cancer.ca/type/breast-neurofibroma.htm. August 2011. 2.

  19. Cleaning Massive Sonar Point Clouds

    DEFF Research Database (Denmark)

    Arge, Lars Allan; Larsen, Kasper Green; Mølhave, Thomas

    2010-01-01

    We consider the problem of automatically cleaning massive sonar data point clouds, that is, the problem of automatically removing noisy points that for example appear as a result of scans of (shoals of) fish, multiple reflections, scanner self-reflections, refraction in gas bubbles, and so on. We...

  20. Topologically Massive Higher Spin Gravity

    NARCIS (Netherlands)

    Bagchi, A.; Lal, S.; Saha, A.; Sahoo, B.

    2011-01-01

    We look at the generalisation of topologically massive gravity (TMG) to higher spins, specifically spin-3. We find a special "chiral" point for the spin-three, analogous to the spin-two example, which actually coincides with the usual spin-two chiral point. But in contrast to usual TMG, there is the

  1. Supernovae from massive AGB stars

    NARCIS (Netherlands)

    Poelarends, A.J.T.; Izzard, R.G.; Herwig, F.; Langer, N.; Heger, A.

    2006-01-01

    We present new computations of the final fate of massive AGB stars. These stars form ONeMg cores after a phase of carbon burning and are called super-AGB stars (SAGB). Detailed stellar evolutionary models until the thermally pulsing AGB were computed using three different stellar evolution codes. The

  2. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray

  3. PENGGUNAAN LEVERAGE PADA PERUSAHAAN: PERBEDAAN ANTARA CEO PRIA DAN WANITA

    Directory of Open Access Journals (Sweden)

    Farida Titik Kritanti

    2014-07-01

    Various studies show that women are more risk averse in making decisions and more long-term oriented. Women are more risk averse than men, making them less likely to use debt in their capital structure, since increasing debt increases the company's financial risk. This study tests whether gender is a significant factor in financial leverage, by examining whether leverage policy differs between companies led by men and companies led by women. Financial leverage is used as the measure of corporate risk because this variable can be changed by the CEO. The sample comprises companies listed on the Jakarta Stock Exchange. The results show that leverage differs between firms with a male CEO and firms with a female CEO: male CEOs use more debt than female CEOs. For performance measured by ROI, however, the results differ by industry. In the consumer goods industry there is a performance difference between male and female CEOs, but in the internet service and enamel kitchenware industries there is no performance difference between female and male CEOs. Key words: leverage, woman CEO, man CEO, performance

  4. 13 CFR 107.1130 - Leverage fees and additional charges payable by Licensee.

    Science.gov (United States)

    2010-01-01

    ... you issue a Debenture or Participating Security to repay or redeem existing Leverage, you must pay the leverage fee before SBA will guarantee or purchase the new Leverage security. (2) If you issue a Debenture...

  5. Massive lepton pair production in massive quantum electrodynamics

    International Nuclear Information System (INIS)

    Raychaudhuri, P.

    1976-01-01

    The pp → l⁺ + l⁻ + x inclusive interaction has been studied at high energies in terms of massive quantum electrodynamics. The differential cross-section dσ/dQ² is derived and proves to be proportional to Q⁻⁴, where Q is the mass of the lepton pair. Basic features of the cross-section are demonstrated to be consistent with the Drell-Yan model

  6. MassiveNuS: cosmological massive neutrino simulations

    Science.gov (United States)

    Liu, Jia; Bird, Simeon; Zorrilla Matilla, José Manuel; Hill, J. Colin; Haiman, Zoltán; Madhavacheril, Mathew S.; Petri, Andrea; Spergel, David N.

    2018-03-01

    The non-zero mass of neutrinos suppresses the growth of cosmic structure on small scales. Since the level of suppression depends on the sum of the masses of the three active neutrino species, the evolution of large-scale structure is a promising tool to constrain the total mass of neutrinos and possibly shed light on the mass hierarchy. In this work, we investigate these effects via a large suite of N-body simulations that include massive neutrinos using an analytic linear-response approximation: the Cosmological Massive Neutrino Simulations (MassiveNuS). The simulations include the effects of radiation on the background expansion, as well as the clustering of neutrinos in response to the nonlinear dark matter evolution. We allow three cosmological parameters to vary: the neutrino mass sum Mν in the range of 0–0.6 eV, the total matter density Ωm, and the primordial power spectrum amplitude As. The rms density fluctuation in spheres of 8 comoving Mpc/h (σ8) is a derived parameter as a result. Our data products include N-body snapshots, halo catalogues, merger trees, ray-traced galaxy lensing convergence maps for four source redshift planes between zs=1–2.5, and ray-traced cosmic microwave background lensing convergence maps. We describe the simulation procedures and code validation in this paper. The data are publicly available at http://columbialensing.org.

  7. Spacetime structure of massive Majorana particles and massive gravitino

    Energy Technology Data Exchange (ETDEWEB)

    Ahluwalia, D.V.; Kirchbach, M. [Theoretical Physics Group, Facultad de Fisica, Universidad Autonoma de Zacatecas, A.P. 600, 98062 Zacatecas (Mexico)

    2003-07-01

    The profound difference between Dirac and Majorana particles is traced back to the possibility of having physically different constructs in the (1/2,0) ⊕ (0,1/2) representation space. Contrary to Dirac particles, Majorana-particle propagators are shown to differ from the simple linear γ^μ p_μ structure. Furthermore, neither Majorana particles nor their antiparticles can be associated with a well-defined arrow of time. The inevitable consequence of this peculiarity is the particle-antiparticle metamorphosis giving rise to neutrinoless double beta decay, on the one side, and enabling spin-1/2 fields to act as gauge fields, gauginos, on the other side. The second part of the lecture notes is devoted to the massive gravitino. We argue that a spin measurement in the rest frame for an unpolarized ensemble of massive gravitinos, associated with the spinor-vector [(1/2,0) ⊕ (0,1/2)] ⊗ (1/2,1/2) representation space, would yield the result 3/2 with probability one half, and 1/2 with probability one half. The latter is distributed uniformly, i.e. as 1/4, among the two spin-1/2⁺ and spin-1/2⁻ states of opposite parities. From that we draw the conclusion that the massive gravitino should be interpreted as a particle of multiple spin. (Author)

  8. What is project finance?

    OpenAIRE

    João M. Pinto

    2017-01-01

    Project finance is the process of financing a specific economic unit that the sponsors create, in which creditors share much of the venture’s business risk and funding is obtained strictly for the project itself. Project finance creates value by reducing the costs of funding, maintaining the sponsors financial flexibility, increasing the leverage ratios, avoiding contamination risk, reducing corporate taxes, improving risk management, and reducing the costs associated with market ...

  9. Minimal theory of massive gravity

    International Nuclear Information System (INIS)

    De Felice, Antonio; Mukohyama, Shinji

    2016-01-01

    We propose a new theory of massive gravity with only two propagating degrees of freedom. While the homogeneous and isotropic background cosmology and the tensor linear perturbations around it are described by exactly the same equations as those in the de Rham–Gabadadze–Tolley (dRGT) massive gravity, the scalar and vector gravitational degrees of freedom are absent in the new theory at the fully nonlinear level. Hence the new theory provides a stable nonlinear completion of the self-accelerating cosmological solution that was originally found in the dRGT theory. The cosmological solution in the other branch, often called the normal branch, is also rendered stable in the new theory and, for the first time, makes it possible to realize an effective equation-of-state parameter different from (either larger or smaller than) −1 without introducing any extra degrees of freedom.

  10. Spin-3 topologically massive gravity

    Energy Technology Data Exchange (ETDEWEB)

    Chen Bin, E-mail: bchen01@pku.edu.cn [Department of Physics, and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China); Center for High Energy Physics, Peking University, Beijing 100871 (China); Long Jiang, E-mail: longjiang0301@gmail.com [Department of Physics, and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China); Wu Junbao, E-mail: wujb@ihep.ac.cn [Institute of High Energy Physics, and Theoretical Physics Center for Science Facilities, Chinese Academy of Sciences, Beijing 100049 (China)

    2011-11-24

    In this Letter, we study the spin-3 topologically massive gravity (TMG), paying special attention to its properties at the chiral point. We propose an action describing the higher-spin fields coupled to TMG. We discuss the traceless spin-3 fluctuations around the AdS₃ vacuum and find that there is an extra local massive mode, besides the left-moving and right-moving boundary massless modes. At the chiral point, this extra mode becomes massless and degenerates with the left-moving mode. We show that at the chiral point the only degrees of freedom in the theory are the boundary right-moving graviton and spin-3 field. We conjecture that spin-3 chiral gravity with generalized Brown-Henneaux boundary conditions is holographically dual to a 2D chiral CFT with classical W₃ algebra and central charge c_R = 3l/G.

  11. Minimal theory of massive gravity

    Directory of Open Access Journals (Sweden)

    Antonio De Felice

    2016-01-01

    We propose a new theory of massive gravity with only two propagating degrees of freedom. While the homogeneous and isotropic background cosmology and the tensor linear perturbations around it are described by exactly the same equations as those in the de Rham–Gabadadze–Tolley (dRGT) massive gravity, the scalar and vector gravitational degrees of freedom are absent in the new theory at the fully nonlinear level. Hence the new theory provides a stable nonlinear completion of the self-accelerating cosmological solution that was originally found in the dRGT theory. The cosmological solution in the other branch, often called the normal branch, is also rendered stable in the new theory and, for the first time, makes it possible to realize an effective equation-of-state parameter different from (either larger or smaller than) −1 without introducing any extra degrees of freedom.

  12. Search of massive star formation with COMICS

    Science.gov (United States)

    Okamoto, Yoshiko K.

    2004-04-01

    Mid-infrared observations are useful for studies of massive star formation. In particular, COMICS offers powerful tools: imaging surveys of the circumstellar structures of forming massive stars, such as massive disks and cavity structures; mass estimates from spectroscopy of fine-structure lines; and high-dispersion spectroscopy to survey gas motion around formed stars. COMICS will open the next generation of infrared studies of massive star formation.

  13. The physics of massive neutrinos

    CERN Document Server

    Kayser, Boris; Perrier, Frederic

    1989-01-01

    This book explains the physics and phenomenology of massive neutrinos. The authors argue that neutrino mass is not unlikely and consider briefly the search for evidence of this mass in decay processes before they examine the physics and phenomenology of neutrino oscillation. The physics of Majorana neutrinos (neutrinos which are their own antiparticles) is then discussed. This volume requires of the reader only a knowledge of quantum mechanics and of very elementary quantum field theory.

  14. Leveraging the Cloud for Robust and Efficient Lunar Image Processing

    Science.gov (United States)

    Chang, George; Malhotra, Shan; Wolgast, Paul

    2011-01-01

    The Lunar Mapping and Modeling Project (LMMP) is tasked to aggregate lunar data, from the Apollo era to the latest instruments on the LRO spacecraft, into a central repository accessible by scientists and the general public. A critical function of this task is to provide users with the best solution for browsing the vast amounts of imagery available. The image files LMMP manages range from a few gigabytes to hundreds of gigabytes in size, with new data arriving every day. Despite this ever-increasing amount of data, LMMP must make the data readily available in a timely manner for users to view and analyze. This is accomplished by tiling large images into smaller images using Hadoop, a distributed-computing software platform implementing the MapReduce framework, running on a small cluster of machines locally. Additionally, the software is implemented to use Amazon's Elastic Compute Cloud (EC2) facility. We also developed a hybrid solution to serve images to users by leveraging cloud storage, using Amazon's Simple Storage Service (S3) for public data while keeping private information on our own data servers. By using cloud computing, we improve upon our local solution by reducing the need to manage our own hardware and computing infrastructure, thereby reducing costs. Further, by using a hybrid of local and cloud storage, we are able to provide data to our users more efficiently and securely. This paper examines the use of a distributed approach with Hadoop to tile images, an approach that provides significant improvements in image processing time, from hours to minutes. This paper describes the constraints imposed on the solution and the resulting techniques developed for the hybrid solution of a customized Hadoop infrastructure over local and cloud resources in managing this ever-growing data set. It examines the performance trade-offs of using the more plentiful resources of the cloud, such as those provided by S3, against the bandwidth limitations such use
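
    The tiling step the abstract describes can be sketched outside Hadoop. The following pure-Python stand-in for the map phase is hypothetical (LMMP's actual implementation is not shown in the abstract); it simply cuts a 2-D pixel grid into fixed-size tiles keyed by grid position:

```python
# Hypothetical stand-in for the Hadoop map step described above:
# split a large image (here a plain 2-D list of pixel values) into
# fixed-size tiles keyed by their (row, col) position in the tile grid.
def tile_image(image, tile_size):
    """Yield ((tile_row, tile_col), tile) pairs covering the image."""
    height, width = len(image), len(image[0])
    for r in range(0, height, tile_size):
        for c in range(0, width, tile_size):
            tile = [row[c:c + tile_size] for row in image[r:r + tile_size]]
            yield (r // tile_size, c // tile_size), tile

image = [[0] * 768 for _ in range(512)]   # stand-in for a large image
tiles = dict(tile_image(image, 256))      # a 2 x 3 grid of 256-px tiles
```

    In a real MapReduce job, each (key, tile) pair would be emitted by a mapper and written out by reducers across the cluster; the sketch shows only the partitioning logic.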

  15. PENGARUH PROFITABILITAS, LEVERAGE DAN LIKUIDITAS TERHADAP KINERJA LINGKUNGAN

    Directory of Open Access Journals (Sweden)

    Agus Widarsono

    2015-12-01

    This study aims to test and obtain empirical evidence of the factors that affect environmental performance, partially and simultaneously. The factors studied are profitability, leverage and liquidity. The research method used is the descriptive-verificative method, with verificative testing using multiple regression, a partial test (t-test) and a simultaneous test (F-test). The data used are secondary data, namely companies' annual reports and the PROPER report of the Ministry of Environment. The research sample is 11 state-owned enterprises (BUMN) for 2009-2013, taken using the purposive sampling method. The results of this study indicate that profitability, leverage and liquidity have no significant effect on environmental performance, either partially or simultaneously.

  16. Asimetri Informasi, Leverage, dan Ukuran Perusahaan pada Manajemen Laba

    Directory of Open Access Journals (Sweden)

    Tiya Mahawyahrti

    2017-03-01

    This study aims to find empirical evidence of the effect of information asymmetry, leverage, and firm size on earnings management. The research uses agency theory and positive accounting theory to explain these effects. The study was conducted on companies listed on the Indonesia Stock Exchange during the period 2009-2013. The samples were selected by the purposive sampling method; 39 companies were selected. Multiple linear regression analysis was used to analyze the data. Based on the data analysis, the study finds that information asymmetry has a positive effect on earnings management, leverage has a positive effect on earnings management, and firm size has a negative effect on earnings management.

  17. DOWNWARD SLOPING DEMAND CURVES FOR STOCK AND LEVERAGE

    Directory of Open Access Journals (Sweden)

    Liem Pei Fun

    2006-01-01

    This research investigates the effect of downward-sloping demand curves for stock on firms' financing decisions. For the same size of equity issuance, firms with a steeper slope of the demand curve for their stock experience a larger drop in their share price compared with their counterparts. As a consequence, firms with steeper demand curves are less likely to issue equity, and hence they have higher leverage ratios. This research finds that the steeper the slope of the demand curve for a firm's stock, the higher the actual leverage of the firm. Furthermore, firms with steeper demand curves have higher target leverage ratios, signifying that these firms prefer debt to equity financing in order to avoid the adverse price impact of equity issuance on their share price.

  18. Systemic risk and heterogeneous leverage in banking networks

    Science.gov (United States)

    Kuzubaş, Tolga Umut; Saltoğlu, Burak; Sever, Can

    2016-11-01

    This study probes systemic risk implications of leverage heterogeneity in banking networks. We show that the presence of heterogeneous leverages drastically changes the systemic effects of defaults and the nature of the contagion in interbank markets. Using financial leverage data from the US banking system, through simulations, we analyze the systemic significance of different types of borrowers, the evolution of the network, the consequences of interbank market size and the impact of market segmentation. Our study is related to the recent Basel III regulations on systemic risk and the treatment of the Global Systemically Important Banks (GSIBs). We also assess the extent to which the recent capital surcharges on GSIBs may curb financial fragility. We show the effectiveness of surcharge policy for the most-levered banks vis-a-vis uniform capital injection.

  19. Random diffusion and leverage effect in financial markets.

    Science.gov (United States)

    Perelló, Josep; Masoliver, Jaume

    2003-03-01

    We prove that Brownian market models with random diffusion coefficients provide an exact measure of the leverage effect [J.-P. Bouchaud et al., Phys. Rev. Lett. 87, 228701 (2001)]. This empirical fact asserts that past returns are anticorrelated with the future diffusion coefficient. Several models with random diffusion have been suggested, but without a quantitative study of the leverage effect. Our analysis lets us fully estimate all parameters involved and allows a deeper study of correlated random diffusion models that may have practical implications for many aspects of financial markets.
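
    The leverage correlation referred to in the abstract is commonly estimated empirically as L(τ) ∝ ⟨r(t) r(t+τ)²⟩, which the leverage effect predicts to be negative for τ > 0 (past losses precede higher future volatility). The toy estimator below is an illustrative sketch under that common definition, not the authors' code; the synthetic series is hypothetical:

```python
# Toy estimator of the leverage correlation
#   L(tau) = <r(t) * r(t+tau)^2> / <r^2>^2
# which the leverage effect predicts to be negative for tau > 0.
def leverage_correlation(returns, tau):
    """Empirical return-volatility correlation at lag tau."""
    pairs = [(returns[t], returns[t + tau] ** 2)
             for t in range(len(returns) - tau)]
    mean_sq = sum(r * r for r in returns) / len(returns)
    return sum(r * s for r, s in pairs) / (len(pairs) * mean_sq ** 2)

# A synthetic series in which every loss is followed by a large move,
# mimicking the leverage effect: L(1) comes out negative.
series = [-0.1, 0.3, 0.1, 0.05] * 50
L1 = leverage_correlation(series, 1)  # negative for this series
```

    On real data one would average L(τ) over many stocks and lags; here the single constructed series is only meant to show the sign convention.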

  20. Leveraging the wisdom of the crowd in software testing

    CERN Document Server

    Sharma, Mukesh

    2015-01-01

    Its scale, flexibility, cost effectiveness, and fast turnaround are just a few reasons why crowdsourced testing has received so much attention lately. While there are a few online resources that explain what crowdsourced testing is all about, there's been a need for a book that covers best practices, case studies, and the future of this technique.Filling this need, Leveraging the Wisdom of the Crowd in Software Testing shows you how to leverage the wisdom of the crowd in your software testing process. Its comprehensive coverage includes the history of crowdsourcing and crowdsourced testing, im

  1. Vaidya spacetime in massive gravity's rainbow

    Directory of Open Access Journals (Sweden)

    Yaghoub Heydarzade

    2017-11-01

    In this paper, we analyze the energy-dependent deformation of massive gravity using the formalism of massive gravity's rainbow. We use the Vainshtein mechanism and the dRGT mechanism for the energy-dependent massive gravity, and thus analyze a ghost-free theory of massive gravity's rainbow. We study the energy dependence of a time-dependent geometry by analyzing the radiating Vaidya solution in this theory of massive gravity's rainbow. The energy-dependent deformation of this Vaidya metric is performed using suitable rainbow functions.

  2. STABLE ISOTOPE GEOCHEMISTRY OF MASSIVE ICE

    Directory of Open Access Journals (Sweden)

    Yurij K. Vasil’chuk

    2016-01-01

    The paper summarises stable-isotope research on massive ice in the Russian and North American Arctic, and includes the latest understanding of massive-ice formation. A new classification of massive-ice complexes is proposed, encompassing the range and variability of massive ice. It distinguishes two new categories of massive-ice complexes: homogeneous massive-ice complexes have a similar structure, properties and genesis throughout, whereas heterogeneous massive-ice complexes vary spatially (in their structure and properties) and genetically within a locality and consist of two or more homogeneous massive-ice bodies. Analysis of pollen and spores in massive ice from Subarctic regions and from the ice and snow cover of Arctic ice caps assists with interpretation of the origin of massive ice. Radiocarbon ages of massive ice and host sediments are considered together with isotope values of heavy oxygen and deuterium from massive ice, plotted at a uniform scale in order to assist interpretation and correlation of the ice.

  3. Legal contamination of food products in case of nuclear accident. The CRIIRAD criticizes the outrageous work performed by Euratom experts, and calls for a massive mobilisation against the project of the European Commission

    International Nuclear Information System (INIS)

    Castanier, Corinne

    2015-01-01

    After having recalled the content of the project of the European Commission on the definition of maximum permissible levels of radioactive contamination of food products to be applied in case of nuclear accident, this report first outlines that the associated risk levels are unacceptable (the maximum dose limit would not be respected by far). The authors outline numerous extremely severe anomalies and errors which occurred in the process of elaboration of the project. They try to identify responsibilities for these errors, and wonder whether they are due to incompetence or were made on purpose, as they always go in the same direction. The CRIIRAD therefore calls for a European mobilisation to sign a petition for a complete review of the applicable regulation. Letters written to or by members of European institutions are provided

  4. Spacetime structure of massive Majorana particles and massive gravitino

    CERN Document Server

    Ahluwalia, D V

    2003-01-01

    The profound difference between Dirac and Majorana particles is traced back to the possibility of having physically different constructs in the (1/2,0) ⊕ (0,1/2) representation space. Contrary to Dirac particles, Majorana-particle propagators are shown to differ from the simple linear γ^μ p_μ structure. Furthermore, neither Majorana particles nor their antiparticles can be associated with a well-defined arrow of time. The inevitable consequence of this peculiarity is the particle-antiparticle metamorphosis giving rise to neutrinoless double beta decay, on the one side, and enabling spin-1/2 fields to act as gauge fields, gauginos, on the other side. The second part of the lecture notes is devoted to the massive gravitino. We argue that a spin measurement in the rest frame for an unpolarized ensemble of massive gravitinos, associated with the spinor-vector [(1/2,0) ⊕ (0,1/2)] ⊗ (1/2,1/2) representation space, would yield the results 3/2 with probability one half, and 1/2 with probability one half. The ...

  5. The evolution of massive stars

    International Nuclear Information System (INIS)

    Loore, C. de

    1980-01-01

    The evolution of stars with masses between 15 M☉ and 100 M☉ is considered. Stars in this mass range lose a considerable fraction of their matter during their evolution. The treatment of convection, semi-convection and the influence of mass loss by stellar winds at different evolutionary phases are analysed, as well as the adopted opacities. Evolutionary sequences computed by various groups are examined and compared with observations, and the advanced evolution of a 15 M☉ and a 25 M☉ star from zero-age main sequence (ZAMS) through iron collapse is discussed. The effect of centrifugal forces on stellar-wind mass loss and the influence of rotation on evolutionary models are examined. As a consequence of the outflow of matter, deeper layers show up, and when the mass-loss rates are large enough, layers with changed composition, due to interior nuclear reactions, appear on the surface. The evolution of massive close binaries, both during the phase of mass loss by stellar wind and during the mass-exchange and mass-loss phase due to Roche-lobe overflow, is treated in detail, and the values of the parameters governing mass and angular-momentum losses are discussed. The problem of the Wolf-Rayet stars, their origin and the possibilities of their production, either as single stars or as massive binaries, is examined. Finally, the origin of X-ray binaries is discussed, and the scenario for the formation of these objects (starting from massive ZAMS close binaries, through Wolf-Rayet binaries, leading to OB stars with a compact companion after a supernova explosion) is reviewed and completed, including stellar-wind mass loss. (orig.)

  6. Leveraging Metadata to Create Interactive Images... Today!

    Science.gov (United States)

    Hurt, Robert L.; Squires, G. K.; Llamas, J.; Rosenthal, C.; Brinkworth, C.; Fay, J.

    2011-01-01

    The image gallery for NASA's Spitzer Space Telescope has been newly rebuilt to fully support the Astronomy Visualization Metadata (AVM) standard to create a new user experience both on the website and in other applications. We encapsulate all the key descriptive information for a public image, including color representations and astronomical and sky coordinates and make it accessible in a user-friendly form on the website, but also embed the same metadata within the image files themselves. Thus, images downloaded from the site will carry with them all their descriptive information. Real-world benefits include display of general metadata when such images are imported into image editing software (e.g. Photoshop) or image catalog software (e.g. iPhoto). More advanced support in Microsoft's WorldWide Telescope can open a tagged image after it has been downloaded and display it in its correct sky position, allowing comparison with observations from other observatories. An increasing number of software developers are implementing AVM support in applications and an online image archive for tagged images is under development at the Spitzer Science Center. Tagging images following the AVM offers ever-increasing benefits to public-friendly imagery in all its standard forms (JPEG, TIFF, PNG). The AVM standard is one part of the Virtual Astronomy Multimedia Project (VAMP); http://www.communicatingastronomy.org

  7. Massive stars, successes and challenges

    OpenAIRE

    Meynet, Georges; Maeder, André; Georgy, Cyril; Ekström, Sylvia; Eggenberger, Patrick; Barblan, Fabio; Song, Han Feng

    2017-01-01

    We give a brief overview of where we stand with respect to some old and new questions bearing on how massive stars evolve and end their lifetime. We focus on the following key points that are further discussed by other contributions during this conference: convection, mass losses, rotation, magnetic field and multiplicity. For purpose of clarity, each of these processes are discussed on its own but we have to keep in mind that they are all interacting between them offering a large variety of ...

  8. Massive stars, successes and challenges

    Science.gov (United States)

    Meynet, Georges; Maeder, André; Georgy, Cyril; Ekström, Sylvia; Eggenberger, Patrick; Barblan, Fabio; Song, Han Feng

    2017-11-01

    We give a brief overview of where we stand with respect to some old and new questions bearing on how massive stars evolve and end their lifetime. We focus on the following key points that are further discussed by other contributions during this conference: convection, mass losses, rotation, magnetic field and multiplicity. For clarity, each of these processes is discussed on its own, but we have to keep in mind that they all interact with one another, offering a large variety of outputs, some of them still to be discovered.

  9. Heterogeneity in the Speed of Adjustment toward Target Leverage

    DEFF Research Database (Denmark)

    Elsas, Ralf; Florysiak, David

    2011-01-01

    Estimating the speed of adjustment toward target leverage using the standard partial adjustment model assumes that all firms within the sample adjust at the same (average) pace. Dynamic capital structure theory predicts heterogeneity in adjustment speed due to firm-specific adjustment costs. Appl...
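As a hedged sketch of the standard partial adjustment model this record critiques (the constant target and all numbers are illustrative assumptions, not the authors'), simulated leverage data with a known adjustment speed shows how OLS recovers the single average pace that the model imposes on every firm:

```python
import numpy as np

rng = np.random.default_rng(0)
lam_true = 0.3    # true speed of adjustment toward target leverage (assumed)
target = 0.5      # constant target leverage, a simplifying assumption
T = 5000

lev = np.empty(T)
lev[0] = 0.1
for t in range(1, T):
    # partial adjustment: close fraction lam_true of the gap each period
    lev[t] = lev[t - 1] + lam_true * (target - lev[t - 1]) + rng.normal(0, 0.01)

# OLS of the leverage change on the lagged gap recovers the average speed.
# By construction every observation here adjusts at the same pace -- exactly
# the homogeneity assumption the paper argues against.
gap = target - lev[:-1]
change = np.diff(lev)
lam_hat = (gap @ change) / (gap @ gap)
print(lam_hat)    # close to the true 0.3
```

With firm-specific adjustment costs, lam_true would vary across firms and a pooled lam_hat would be a blend of fast and slow adjusters, which is the heterogeneity the paper models.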

  10. Mining E-mail to Leverage Knowledge Networks in Organizations

    NARCIS (Netherlands)

    van Reijsen, J.; Helms, R.W.; Jackson, T.W.

    2009-01-01

    There is nothing new about the notion that in today's knowledge-driven economy, knowledge is the key strategic asset for competitive advantage in an organization. We have also learned that knowledge resides in the organization's informal network. Hence, to leverage business performance from a

  11. Leveraging Mobile Games for Place-Based Language Learning

    Science.gov (United States)

    Holden, Christopher L.; Sykes, Julie M.

    2011-01-01

    This paper builds on the emerging body of research aimed at exploring the educational potential of mobile technologies, specifically, how to leverage place-based, augmented reality mobile games for language learning. Mentira is the first place-based, augmented reality mobile game for learning Spanish in a local neighborhood in the Southwestern…

  12. Banking Competition and Stability : The Role of Leverage

    NARCIS (Netherlands)

    Freixas, X.; Ma, K.

    2014-01-01

    This paper reexamines the classical issue of the possible trade-offs between banking competition and financial stability by highlighting different types of risk and the role of leverage. By means of a simple model we show that competition can affect portfolio risk, insolvency risk, liquidity risk,

  13. The impact of Taxation on Bank Leverage and Asset Risk

    NARCIS (Netherlands)

    Horvath, B.L.

    2013-01-01

    Abstract: The tax-benefit of interest deductibility encourages debt financing, but regulatory and market constraints create dependency between bank leverage and risk. Using a large international sample of banks, this paper estimates the short- and long-run effects of corporate income taxes (CIT) on bank

  14. Leveraging Proximity Sensing to Mine the Behavior of Museum Visitors

    NARCIS (Netherlands)

    Martella, Claudio; Miraglia, Armando; Cattani, Marco; van Steen, Martinus Richardus

    Face-to-face proximity has been successfully leveraged to study the relationships between individuals in various contexts, from a working place, to a conference, a museum, a fair, and a date. We spend time facing the individuals with whom we chat, discuss, work, and play. However, face-to-face

  15. Real interest rates, leverage, and bank risk-taking

    NARCIS (Netherlands)

    Dell’Ariccia, G.; Laeven, L.; Marquez, R.

    2014-01-01

    Do low interest rate environments lead to greater bank risk-taking? We show that, when banks can adjust their capital structures, reductions in real interest rates lead to greater leverage and higher risk for any downward sloping loan demand function. However, if the capital structure is fixed, the

  16. The effect of leverage increases on real earnings management

    NARCIS (Netherlands)

    I. Zagers-Mamedova (Irina)

    2009-01-01

    textabstractMain subject of this paper is to understand whether there could be an incentive for managers to manipulate cash flow from operating activities (CFO) through the use of real earnings management (REM), in situations with increasing leverage. Based upon a study of Jelinek (2007) who

  17. Factors affecting Leverage: An empirical analysis of Mauritius ...

    African Journals Online (AJOL)

    Nafiisah

    presumably have an impact on the WACC and the firm's investment decision and ... debt is advocated as it consists of fixed interest payment which does not lead to ... explain the level of leverage – company size, profitability, asset tangibility and ... The firm will thus pursue an optimal capital structure or target debt ratio by.

  18. Leveraging Innovation Capabilities of Asian Micro, Small and ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Leveraging Innovation Capabilities of Asian Micro, Small and Medium Enterprises through Intermediary Organizations. Micro, small and medium enterprises (MSMEs) are a source of livelihood for billions of poor people worldwide. The current global economic downturn has hit these enterprises particularly hard, putting ...

  19. A feasible central limit theory for realised volatility under leverage

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, Neil

    In this note we show that the feasible central limit theory for realised volatility and realised covariation recently developed by Barndor-Nielsen and Shephard applies under arbitrary diusion based leverage eects. Results from a simulation experiment suggest that the feasible version of the limit...
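The "feasible" part of the record above refers to a standard error computable from the same high-frequency returns. A minimal sketch (constant volatility, no leverage effect, hypothetical sampling frequency) of realised volatility and its feasible standard error via realised quarticity:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4680              # number of intraday returns (hypothetical frequency)
sigma = 0.2           # constant spot volatility for this sketch
dt = 1.0 / n

# simulate log-price increments of a driftless diffusion over one day
r = sigma * np.sqrt(dt) * rng.standard_normal(n)

# realised volatility: sum of squared high-frequency returns,
# a consistent estimator of the integrated variance sigma^2 = 0.04
rv = np.sum(r ** 2)

# feasible asymptotic standard error based on realised quarticity
se = np.sqrt((2.0 / 3.0) * np.sum(r ** 4))

print(rv)   # near 0.04
print(se)   # small relative to rv, so the CLT-based interval is informative
```

The point of the note is that this interval remains valid under arbitrary diffusion-based leverage effects, where spot volatility is correlated with the price innovations rather than constant as assumed here.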

  20. Metallicity dependence of envelope inflation in massive stars

    Czech Academy of Sciences Publication Activity Database

    Sanyal, D.; Langer, N.; Szécsi, Dorottya; Yoon, S.-C.; Grassitelli, L.

    2017-01-01

    Roč. 597, January (2017), A71/1-A71/16 E-ISSN 1432-0746 R&D Projects: GA ČR(CZ) GA14-02385S Institutional support: RVO:67985815 Keywords : stars evolution * stars massive * stars interiors Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics OBOR OECD: Astronomy (including astrophysics,space science) Impact factor: 5.014, year: 2016

  1. Payeeship, financial leverage, and the client-provider relationship.

    Science.gov (United States)

    Angell, Beth; Martinez, Noriko I; Mahoney, Colleen A; Corrigan, Patrick W

    2007-03-01

    Although representative payeeship provided within clinical settings is believed to have therapeutic benefits, its potential negative impact on the therapeutic alliance or client-provider relationship is of concern. This study examined the effects of payeeship and perceived financial leverage on positive and negative dimensions of the client-provider relationship. The sample consisted of 205 adults ages 18 to 65 with axis I disorders who were receiving mental health services from a large urban community mental health clinic. Information about money management characteristics and ratings of the client-provider relationship were collected via face-to-face interview. Fifty-three percent of the sample had a payee or money manager, and 79% of this group had a clinician payee. Respondents with co-occurring psychotic and substance use disorders, lower functioning, and lower insight about their illness were more likely to have a clinician payee. Forty percent of those with a clinician payee reported perceived financial leverage. Having a clinician payee was also associated with perceived financial leverage and with higher levels of conflict in the case management relationship. When examined in combination, financial leverage was found to mediate the effects of payeeship on conflict in the case management relationship (mean ± SE = 2.37 ± 1.33, 95% confidence interval = .16 to 5.52); payeeship thus appears to contribute to conflict in the therapeutic alliance when used as a source of treatment leverage. Although payeeship provides important support and may enhance functional outcomes for the patient, decisions about using the mechanism for promoting treatment adherence should take into account the potential disruption to the client-provider relationship.

  2. Solid holography and massive gravity

    International Nuclear Information System (INIS)

    Alberte, Lasma; Baggioli, Matteo; Khmelnitsky, Andrei; Pujolàs, Oriol

    2016-01-01

    Momentum dissipation is an important ingredient in condensed matter physics that requires a translation breaking sector. In the bottom-up gauge/gravity duality, this implies that the gravity dual is massive. We start here a systematic analysis of holographic massive gravity (HMG) theories, which admit field theory dual interpretations and which, therefore, might store interesting condensed matter applications. We show that there are many phases of HMG that are fully consistent effective field theories and which have been left overlooked in the literature. The most important distinction between the different HMG phases is that they can be clearly separated into solids and fluids. This can be done both at the level of the unbroken spacetime symmetries as well as concerning the elastic properties of the dual materials. We extract the modulus of rigidity of the solid HMG black brane solutions and show how it relates to the graviton mass term. We also consider the implications of the different HMGs on the electric response. We show that the types of response that can be consistently described within this framework is much wider than what is captured by the narrow class of models mostly considered so far.

  3. Solid holography and massive gravity

    Energy Technology Data Exchange (ETDEWEB)

    Alberte, Lasma [Abdus Salam International Centre for Theoretical Physics,Strada Costiera 11, 34151, Trieste (Italy); Baggioli, Matteo [Institut de Física d’Altes Energies (IFAE),The Barcelona Institute of Science and Technology (BIST), Campus UAB, 08193 Bellaterra, Barcelona (Spain); Department of Physics, Institute for Condensed Matter Theory, University of Illinois,1110 W. Green Street, Urbana, IL 61801 (United States); Khmelnitsky, Andrei [Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, 34151, Trieste (Italy); Pujolàs, Oriol [Institut de Física d’Altes Energies (IFAE),The Barcelona Institute of Science and Technology (BIST), Campus UAB, 08193 Bellaterra, Barcelona (Spain)

    2016-02-17

    Momentum dissipation is an important ingredient in condensed matter physics that requires a translation breaking sector. In the bottom-up gauge/gravity duality, this implies that the gravity dual is massive. We start here a systematic analysis of holographic massive gravity (HMG) theories, which admit field theory dual interpretations and which, therefore, might store interesting condensed matter applications. We show that there are many phases of HMG that are fully consistent effective field theories and which have been left overlooked in the literature. The most important distinction between the different HMG phases is that they can be clearly separated into solids and fluids. This can be done both at the level of the unbroken spacetime symmetries as well as concerning the elastic properties of the dual materials. We extract the modulus of rigidity of the solid HMG black brane solutions and show how it relates to the graviton mass term. We also consider the implications of the different HMGs on the electric response. We show that the types of response that can be consistently described within this framework is much wider than what is captured by the narrow class of models mostly considered so far.

  4. Discussion on massive gravitons and propagating torsion in arbitrary dimensions

    International Nuclear Information System (INIS)

    Hernaski, C.A.; Vargas-Paredes, A.A.; Helayel-Neto, J.A.

    2009-01-01

    Massive gravity has been an issue of particular interest since the early days of Quantum Gravity. More recently, in connection with models based on brane-world scenarios, the discussion of massive gravitons is drawing a great deal of attention, in view of the possibility of their production at the LHC and the feasibility of detecting quantum gravity effects at the TeV scale. In this paper, we reassess a particular R²-type gravity action in D dimensions, recently studied by Nakasone and Oda, now taking torsion effects into account. Considering that the vielbein and the spin connection carry independent propagating degrees of freedom, we conclude that ghosts and tachyons are absent only if torsion is non-propagating, and that there is no room for massive gravitons. To include these excitations, we show how to enlarge Nakasone-Oda's model by means of explicit torsion terms in the action, and we discuss the unitarity of the enlarged model in arbitrary dimensions. To do this, we construct a complete basis of operators that projects the degrees of freedom of the model's dynamical fields into their irreducible spin decomposition. The outcome is a set of Lagrangians with a massive graviton that, in D=4, reproduce those already studied in the literature. (author)

  5. Massive-Star Magnetospheres: Now in 3-D!

    Science.gov (United States)

    Townsend, Richard

    prototyped. Simulation data from these codes will be used to synthesize observables, suitable for comparison with datasets from ground- and space-based facilities. Project results will be disseminated in the form of journal papers, presentations, data and visualizations, to facilitate the broad communication of our results. In addition, we will release the project codes under an open- source license, to encourage other groups' involvement in modeling massive-star magnetospheres. Through furthering our insights into these magnetospheres, the project is congruous with NASA's Strategic Goal 2, 'Expand scientific understanding of the Earth and the universe in which we live'. By making testable predictions of X-ray emission and UV line profiles, it is naturally synergistic with observational studies of magnetic massive stars using NASA's ROSAT, Chandra, IUE and FUSE missions. By exploring magnetic braking, it will have a direct impact on theoretical predictions of collapsar yields, and thereby help drive forward the analysis and interpretation of gamma-ray burst observations by NASA's Swift and Fermi missions. And, through its general contribution toward understanding the lifecycle of massive stars, the project will complement the past, present and future investments in studying these stars using NASA's other space-based observatories.

  6. On maximal massive 3D supergravity

    OpenAIRE

    Bergshoeff , Eric A; Hohm , Olaf; Rosseel , Jan; Townsend , Paul K

    2010-01-01

    We construct, at the linearized level, the three-dimensional (3D) N = 4 supersymmetric "general massive supergravity" and the maximally supersymmetric N = 8 "new massive supergravity". We also construct the maximally supersymmetric linearized N = 7 topologically massive supergravity, although we expect N = 6 to be maximal at the non-linear level.

  7. On the singularities of massive superstring amplitudes

    International Nuclear Information System (INIS)

    Foda, O.

    1987-01-01

    Superstring one-loop amplitudes with massive external states are shown to be in general ill-defined due to internal on-shell propagators. However, we argue that since any massive string state (in the uncompactified theory) has a finite lifetime to decay into massless particles, such amplitudes are not terms in the perturbative expansion of physical S-matrix elements: These can be defined only with massless external states. Consistent massive amplitudes require an off-shell formalism. (orig.)

  8. On the singularities of massive superstring amplitudes

    Energy Technology Data Exchange (ETDEWEB)

    Foda, O.

    1987-06-04

    Superstring one-loop amplitudes with massive external states are shown to be in general ill-defined due to internal on-shell propagators. However, we argue that since any massive string state (in the uncompactified theory) has a finite lifetime to decay into massless particles, such amplitudes are not terms in the perturbative expansion of physical S-matrix elements: These can be defined only with massless external states. Consistent massive amplitudes require an off-shell formalism.

  9. From Stove-pipe to Network Centric Leveraging Technology to Present a Unified View

    National Research Council Canada - National Science Library

    Abuhantash, Medhat A; Shoultz, Matthew V

    2004-01-01

    .... The paper will also demonstrate how the application of current technology can be leveraged to present a unified view of data from disparate data sources, and how our organization is leveraging...

  10. The Media, Intelligence, and Information Proliferation: Managing and Leveraging the Chaos

    National Research Council Canada - National Science Library

    Steetin, Robert

    1999-01-01

    ... the chaos and leverage that coverage and flow of information. Leveraging the coverage only refers to improving and maintaining the leadership's situational awareness in a volatile, uncertain, complex, and ambiguous (VUCA) world...

  11. Massively Clustered CubeSats NCPS Demo Mission

    Science.gov (United States)

    Robertson, Glen A.; Young, David; Kim, Tony; Houts, Mike

    2013-01-01

    Technologies under development for the proposed Nuclear Cryogenic Propulsion Stage (NCPS) will require an un-crewed demonstration mission before they can be flight qualified over distances and time frames representative of a crewed Mars mission. In this paper, we describe a Massively Clustered CubeSats platform, possibly comprising hundreds of CubeSats, as the main payload of the NCPS demo mission. This platform would enable a mechanism for cost savings for the demo mission through shared support between NASA and other government agencies as well as leveraged commercial aerospace and academic community involvement. We believe a Massively Clustered CubeSats platform should be an obvious first choice for the NCPS demo mission when one considers that cost and risk of the payload can be spread across many CubeSat customers and that the NCPS demo mission can capitalize on using CubeSats developed by others for its own instrumentation needs. Moreover, a demo mission of the NCPS offers an unprecedented opportunity to invigorate the public on a global scale through direct individual participation coordinated through a web-based collaboration engine. The platform we describe would be capable of delivering CubeSats at various locations along a trajectory toward the primary mission destination, in this case Mars, permitting a variety of potential CubeSat-specific missions. Cameras on various CubeSats can also be used to provide multiple views of the space environment and the NCPS vehicle for video monitoring as well as allow the public to "ride along" as virtual passengers on the mission. This collaborative approach could even initiate a brand new Science, Technology, Engineering and Math (STEM) program for launching student developed CubeSat payloads beyond Low Earth Orbit (LEO) on future deep space technology qualification missions. Keywords: Nuclear Propulsion, NCPS, SLS, Mars, CubeSat.

  12. Light weakly interacting massive particles

    Science.gov (United States)

    Gelmini, Graciela B.

    2017-08-01

    Light weakly interacting massive particles (WIMPs) are dark matter particle candidates with weak scale interaction with the known particles, and mass in the GeV to tens of GeV range. Hints of light WIMPs have appeared in several dark matter searches in the last decade. The unprecedented possible coincidence into tantalizingly close regions of mass and cross section of four separate direct detection experimental hints and a potential indirect detection signal in gamma rays from the galactic center, aroused considerable interest in our field. Even if these hints did not so far result in a discovery, they have had a significant impact in our field. Here we review the evidence for and against light WIMPs as dark matter candidates and discuss future relevant experiments and observations.

  13. Massive postpartum right renal hemorrhage.

    Science.gov (United States)

    Kiracofe, H L; Peterson, N

    1975-06-01

    All reported cases of massive postpartum right renal hemorrhage have involved healthy young primigravidas and blacks have predominated (4 of 7 women). Coagulopathies and underlying renal disease have been absent. Hematuria was painless in 5 of 8 cases. Hemorrhage began within 24 hours in 1 case, within 48 hours in 4 cases and 4 days post partum in 3 cases. Our first case is the only report in which hemorrhage has occurred in a primipara. Failure of closure or reopening of pyelovenous channels is suggested as the pathogenesis. The hemorrhage has been self-limiting, requiring no more than 1,500 cc whole blood replacement. Bleeding should stop spontaneously, and rapid renal pelvic clot lysis should follow with maintenance of adequate urine output and Foley catheter bladder decompression. To date surgical intervention has not been necessary.

  14. Cosmological attractors in massive gravity

    CERN Document Server

    Dubovsky, S; Tkachev, I I

    2005-01-01

    We study Lorentz-violating models of massive gravity which preserve rotations and are invariant under time-dependent shifts of the spatial coordinates. In the linear approximation the Newtonian potential in these models has an extra "confining" term proportional to the distance from the source. We argue that during cosmological expansion the Universe may be driven to an attractor point with larger symmetry which includes particular simultaneous dilatations of time and space coordinates. The confining term in the potential vanishes as one approaches the attractor. In the vicinity of the attractor the extra contribution is present in the Friedmann equation which, in a certain range of parameters, gives rise to the cosmic acceleration.

  15. Massive Black Holes and Galaxies

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Evidence has been accumulating for several decades that many galaxies harbor central mass concentrations that may be in the form of black holes with masses between a few million to a few billion time the mass of the Sun. I will discuss measurements over the last two decades, employing adaptive optics imaging and spectroscopy on large ground-based telescopes that prove the existence of such a massive black hole in the Center of our Milky Way, beyond any reasonable doubt. These data also provide key insights into its properties and environment. Most recently, a tidally disrupting cloud of gas has been discovered on an almost radial orbit that reached its peri-distance of ~2000 Schwarzschild radii in 2014, promising to be a valuable tool for exploring the innermost accretion zone. Future interferometric studies of the Galactic Center Black hole promise to be able to test gravity in its strong field limit.

  16. Stable massive particles at colliders

    Energy Technology Data Exchange (ETDEWEB)

    Fairbairn, M.; /Stockholm U.; Kraan, A.C.; /Pennsylvania U.; Milstead, D.A.; /Stockholm U.; Sjostrand, T.; /Lund U.; Skands, P.; /Fermilab; Sloan, T.; /Lancaster U.

    2006-11-01

    We review the theoretical motivations and experimental status of searches for stable massive particles (SMPs) which could be sufficiently long-lived as to be directly detected at collider experiments. The discovery of such particles would address a number of important questions in modern physics including the origin and composition of dark matter in the universe and the unification of the fundamental forces. This review describes the techniques used in SMP-searches at collider experiments and the limits so far obtained on the production of SMPs which possess various colour, electric and magnetic charge quantum numbers. We also describe theoretical scenarios which predict SMPs, the phenomenology needed to model their production at colliders and interactions with matter. In addition, the interplay between collider searches and open questions in cosmology such as dark matter composition are addressed.

  17. The contribution of bank regulation and fair value accounting to procyclical leverage

    OpenAIRE

    Amel-Zadeh; Barth, ME; Landsman, WR

    2017-01-01

    Our analytical description of how banks’ responses to asset price changes can result in procyclical leverage reveals that for banks with a binding regulatory leverage constraint, absent differences in regulatory risk weights across assets, procyclical leverage does not occur. For banks without a binding constraint, fair value and bank regulation both can contribute to procyclical leverage. Empirical findings based on a large sample of US commercial banks reveal that bank regulation explains p...
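The mechanism behind procyclical leverage can be illustrated with the classic leverage-targeting arithmetic (in the spirit of Adrian and Shin's well-known analysis, not taken from this paper's model): a bank that marks assets to fair value and targets a fixed leverage ratio must buy assets with new debt after prices rise. All figures below are hypothetical.

```python
# Balance sheet of a hypothetical bank targeting constant leverage L* = A / E.
A, D = 100.0, 90.0           # assets at fair value, debt
L_target = A / (A - D)       # 10x leverage

A *= 1.05                    # +5% asset price shock (fair value gain)
E = A - D                    # equity absorbs the whole gain
print(A / E)                 # measured leverage falls below the 10x target

# Restoring the target with equity fixed requires expanding the balance sheet:
A_new = L_target * E         # assets consistent with L* at the new equity
D_new = A_new - E
print(D_new - D)             # positive: debt and asset demand rise with prices
```

The paper's point is that this channel is shut off for banks pinned at a binding regulatory leverage constraint with uniform risk weights, and operates otherwise.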

  18. Size, Leverage, Concentration, and R&D Investment in Generating Growth Opportunities

    OpenAIRE

    Yew Kee Ho; Mira Tjahjapranata; Chee Meng Yap

    2006-01-01

    We show that a firm's ability to reap growth opportunities from R&D investments depends on its size, leverage, and the industry concentration. While the direct effects of these factors are significant, the size-leverage interaction reveals further important insights. Large firms' advantages over small firms disappear as their leverage increases. Specifically, small firms with high leverage reap the greatest growth opportunities. Our results provide explanations for inconsistent findings obser...

  19. 17 CFR 31.23 - Limited right to rescind first leverage contract.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Limited right to rescind first... COMMISSION LEVERAGE TRANSACTIONS § 31.23 Limited right to rescind first leverage contract. (a) A leverage... pursuant to the following provisions: (1) Such customer may be assessed actual price losses accruing to the...

  20. 17 CFR 31.13 - Financial reports of leverage transaction merchants.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Financial reports of leverage... COMMISSION LEVERAGE TRANSACTIONS § 31.13 Financial reports of leverage transaction merchants. (a) Each... person must include with such financial report a statement describing the source of his current assets...

  1. Leveraging Collaborative Filtering to Accelerate Rare Disease Diagnosis.

    Science.gov (United States)

    Shen, Feichen; Liu, Sijia; Wang, Yanshan; Wang, Liwei; Afzal, Naveed; Liu, Hongfang

    2017-01-01

    In the USA, rare diseases are defined as those affecting fewer than 200,000 patients at any given time. Patients with rare diseases are frequently misdiagnosed or undiagnosed, which may be due to care providers' lack of knowledge and experience. We hypothesize that patients' phenotypic information available in electronic medical records (EMR) can be leveraged to accelerate disease diagnosis, based on the intuition that providers need to document associated phenotypic information to support the diagnosis decision, especially for rare diseases. In this study, we proposed a collaborative filtering system enriched with natural language processing and semantic techniques to assist rare disease diagnosis based on phenotypic characterization. Specifically, we leveraged four similarity measurements with two neighborhood algorithms on a large unstructured Mayo Clinic patient cohort from 2010-2015 and evaluated the different approaches. Preliminary results demonstrated that collaborative filtering with phenotypic information is able to stratify patients with relatively similar rare diseases.
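A toy version of user-based collaborative filtering over phenotypes may clarify the approach (the disease labels, phenotype terms, and the cosine-similarity choice here are all illustrative; the paper evaluates four similarity measures and two neighborhood algorithms on real EMR data):

```python
import numpy as np

# Rows: previously diagnosed patients; columns: documented phenotype terms.
# All names and values are hypothetical.
phenotypes = ["ataxia", "cataract", "seizure", "hepatomegaly"]
X = np.array([
    [1, 1, 0, 1],   # patient 0, diagnosis "A"
    [1, 1, 0, 0],   # patient 1, diagnosis "A"
    [0, 0, 1, 1],   # patient 2, diagnosis "B"
], dtype=float)
diagnoses = ["A", "A", "B"]

query = np.array([1, 1, 1, 1], dtype=float)   # undiagnosed patient's phenotypes

# cosine similarity between the query and each diagnosed patient
sims = (X @ query) / (np.linalg.norm(X, axis=1) * np.linalg.norm(query))

# user-based neighborhood: rank candidate diagnoses by the k nearest patients
k = 2
neighbors = np.argsort(sims)[::-1][:k]
ranked = [diagnoses[i] for i in neighbors]
print(ranked[0])   # "A": the most similar documented patient has diagnosis A
```

In the paper's setting the phenotype columns come from natural language processing of clinical notes rather than a hand-built binary matrix.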

  2. The Effect of Accrual Quality and Leverage on Corporate Cash Holding

    Directory of Open Access Journals (Sweden)

    Anggita Langgeng Wijaya

    2010-12-01

    This research tests the effect of accrual quality and leverage on corporate cash holding for a sample of manufacturing companies listed on the Indonesian Stock Exchange over the period 2006-2007. It also tests the role of asymmetric information as a mediating variable in the relation between accrual quality and cash holding. The population of this research is 197 manufacturing companies on the Indonesian Stock Exchange, sampled by the purposive sampling method. Hypothesis tests employ multiple regression analysis and path analysis. The results show that: accrual quality does not affect asymmetric information; asymmetric information positively affects corporate cash holdings; asymmetric information is not a mediating variable in the relation between accrual quality and cash holding; and leverage negatively affects corporate cash holding.

  3. Defining and Leveraging Game Qualities for Serious Games

    Science.gov (United States)

    Martin, Michael W.; Shen, Yuzhong

    2011-01-01

    Serious games can and should leverage the unique qualities of video games to deliver effective educational experiences for learners. Leveraging these qualities, however, depends on understanding what these unique 'game' qualities are and how they can facilitate the learning process. This paper examines the meaning of the term 'game' as it applies to both serious games and digital entertainment games. Through the examination of counterexamples, we derive three game characteristics: games are self-contained, provide a variety of meaningful choices, and are intrinsically compelling. We also discuss the theoretical educational foundations that support the application of these 'game qualities' to educational endeavors. The paper concludes with results achieved by applying these qualities and the applicable educational theories to teach learners about the periodic table of elements via a serious game developed by the authors.

  4. Leveraging mobile computing and communication technologies in education

    DEFF Research Database (Denmark)

    Annan, Nana Kofi

    education and technology have evolved in tandem over the past years, this dissertation recognises the lapse that there is, in not being able to effectively leverage technology to improve education delivery by most educators. The study appreciates the enormousness of mobile computing and communication...... technologies in contributing to the development of tertiary education delivery, and has taken keen interest to investigate how the capacities of these technologies can be leveraged and incorporated effectively into the pedagogic framework of tertiary education. The purpose is to research into how...... of the results conducted after rigorous theoretical and empirical research unveiled the following: Mobile technologies can be incorporated into tertiary education if it has a strong theoretical underpinning, which links technology and pedagogy; the technology would not work if the user’s concerns in relation...

  5. LEVERAGE IMPACTS ON AGRO-INDUSTRIAL COMPANY INVESTMENTS

    Directory of Open Access Journals (Sweden)

    Nugroho A.C.

    2018-03-01

    Agro-industry has an important role in Indonesian economic growth. One of the crucial constraints on agro-industrial investment in developing countries is limited access to investment funds. This research analyzed the impact of leverage on agro-industrial company investment, using financial report data of agro-industry-based manufacturing companies registered on the Indonesian Stock Exchange from 2007 to 2016. The data were analyzed using panel data regression. The results show that leverage negatively affected investment by agro-industrial companies, and that cash flow had a negative impact on company investment, indicating the existence of financial constraints when companies decide to invest.

  6. Influence analysis of Arctic tide gauges using leverages

    DEFF Research Database (Denmark)

    Svendsen, Peter Limkilde; Andersen, Ole Baltazar; Nielsen, Allan Aasbjerg

    2014-01-01

    a calibration period, in this preliminary case Drakkar ocean model data, which are forced using historical tide gauge data from the PSMSL database. The resulting leverage for each tide gauge may indicate that it represents a distinct mode of variability, or that its time series is perturbed in a way......Reconstructions of historical sea level in the Arctic Ocean are fraught with difficulties related to lack of data, uneven distribution of tide gauges and seasonal ice cover. Considering the period from 1950 to the present, we attempt to identify conspicuous tide gauges in an automated way, using...... the statistical leverage of each individual gauge. This may be of help in determining appropriate procedures for data preprocessing, of particular importance for the Arctic area as the GIA is hard to constrain and many gauges are located on rivers. We use a model based on empirical orthogonal functions from...
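The screening idea in this record rests on statistical leverage, the diagonal of the hat matrix H = X(XᵀX)⁻¹Xᵀ in a least-squares fit. As a minimal illustration (not the EOF-based reconstruction the paper uses), the one-predictor case has a closed form; the sample values below are invented:

```python
def leverages(x):
    """Hat-matrix diagonal for a simple linear regression on predictor x:
    h_i = 1/n + (x_i - xbar)^2 / sum_j (x_j - xbar)^2."""
    n = len(x)
    xbar = sum(x) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return [1 / n + (xi - xbar) ** 2 / sxx for xi in x]

obs = [0.0, 1.0, 2.0, 3.0, 10.0]  # one conspicuous "gauge"
h = leverages(obs)
# The outlying observation at 10.0 has by far the largest leverage,
# flagging it as disproportionately influential on the fit -- the same
# screening idea applied to tide gauges above.
```

A standard sanity check is that the leverages sum to the number of fitted parameters (here 2: intercept and slope).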

  7. Leveraging best practices to promote health, safety, sustainability, and stewardship.

    Science.gov (United States)

    Weiss, Marjorie D

    2013-08-01

    Strategically leveraging health and safety initiatives with sustainability and stewardship helps organizations improve profitability and positively impact team member and customer attachment to the organization. Collective efficacy enhances the triple bottom line: healthy people, healthy planet, and healthy profits. The HS(3)™ Best Practice Exchanges group demonstrated that collective efficacy can leverage the social cohesion, communication channels, and activities within workplaces to promote a healthy, sustainable work culture. This in turn (1) protects the health and safety of workers, (2) preserves the natural environment, and (3) increases attachment to the organization. Community-based participatory research using the Attach21 survey assessed the progress of these companies in their efforts to integrate health, safety, sustainability, and stewardship. Monthly Best Practice Exchanges promoted collective efficacy by providing support, encouragement, and motivation to share and adopt new ideas. Copyright 2013, SLACK Incorporated.

  8. Leveraging the Customer Base: Creating Competitive Advantage Through Knowledge Management

    OpenAIRE

    Elie Ofek; Miklos Sarvary

    2001-01-01

    Professional services firms (e.g., consultants, accounting firms, or advertising agencies) generate and sell business solutions to their customers. In doing so, they can leverage the cumulative experience gained from serving their customer base to either reduce their variable costs or increase the quality of their products/services. In other words, their "production technology" exhibits some form of increasing returns to scale. Growth and globalization, coupled with recent advances in informa...

  9. The Economics of Hedge Funds: Alpha, Fees, Leverage, and Valuation

    OpenAIRE

    Yingcong Lan; Neng Wang; Jinqiang Yang

    2011-01-01

    Hedge fund managers are compensated via management fees on the assets under management (AUM) and incentive fees indexed to the high-water mark (HWM). We study the effects of managerial skills (alpha) and compensation on dynamic leverage choices and the valuation of fees and investors' payoffs. Increasing the investment allocation to the alpha-generating strategy typically lowers the fund's risk-adjusted excess return due to frictions such as price pressure. When the manager is only paid via m...

  10. Bank stock returns, leverage and the business cycle

    OpenAIRE

    Jing Yang; Kostas Tsatsaronis

    2012-01-01

    The returns on bank stocks rise and fall with the business cycle, making bank equity financing cheaper in the boom and dearer during a recession. This provides support for prudential tools that give incentives for banks to build capital buffers at times when the cost of equity is lower. In addition, banks with higher leverage face a higher cost of equity, which suggests that higher capital ratios are associated with lower funding costs.

  11. Karakteristik Eksekutif Terhadap Tax Avoidance Dengan Leverage Sebagai Variabel Intervening

    OpenAIRE

    Carolina, Verani; Natalia, Maria; Debbianita, Debbianita

    2014-01-01

    This research aimed to examine the influence of executive characteristics on corporate tax avoidance. Risk-taker executives tended to be more courageous and aggressive in taking decisions related to tax. On the contrary, risk-averse executives tended to be careful (Low, 2006). This research used leverage as an intervening variable. Therefore, there was an assumption that the executive characteristic determined the corporate leverage, which then influenced their tax avoidance in the compan...

  12. Leveraging Technological Capabilities across Polarized Cultures: Shanghai Delco Electronics Limited

    OpenAIRE

    Lucy A. Ojode

    2006-01-01

    Rallying its units for an impending spin-off from General Motors, the Delphi Automotive Systems division cleared the Delphi Delco Electronics (Delphi-D) unit to begin planning for entry into China in 1994. Delphi saw China as ideal for leveraging its technological and innovation capabilities as well as the enormous General Motors heritage and reputation from years of experience delivering quality products to the automotive industry. Delphi-D found a perfect partner in Shanghai Changjiang YiBia...

  13. Leveraging the NPS Femto Satellite for Alternative Satellite Communication Networks

    Science.gov (United States)

    2017-09-01

    programmed for eventual integration with the Iridium Network, which is then tested. The thesis addresses these questions... [Master's thesis: "Leveraging the NPS Femto Satellite for Alternative Satellite Communication Networks," by Faisal S. Alshaya, September 2017; co-advisors: Steven J. Iatrou...]

  14. Good news is bad news: Leverage cycles and sudden stops

    OpenAIRE

    Akinci, Ozge; Chahrour, Ryan

    2015-01-01

    We show that a model with imperfectly forecastable changes in future productivity and an occasionally binding collateral constraint can match a set of stylized facts about “sudden stop” events. “Good” news about future productivity raises leverage during times of expansion, increasing the probability that the constraint binds, and a sudden stop occurs, in future periods. The economy exhibits a boom period in the run-up to the sudden stop, with output, consumption, and investment all above tre...

  15. Information management for decommissioning projects

    International Nuclear Information System (INIS)

    LeClair, A.N.; Lemire, D.S.

    2011-01-01

    This paper explores the importance of records and information management for decommissioning projects. Key decommissioning information and elements of a sound information management strategy are identified. Various knowledge management strategies and tools are discussed as opportunities for leveraging decommissioning information. The paper also examines the implementation of Atomic Energy of Canada Limited's (AECL) strategy for the long-term preservation of decommissioning information, and its initiatives in leveraging information through the application of several knowledge management strategies and tools. The implementation of AECL's strategy illustrates common as well as unique information and knowledge management challenges and opportunities for decommissioning projects. (author)

  16. Leverage, Asymmetric Information, Firm Value, and Cash Holdings in Indonesia

    Directory of Open Access Journals (Sweden)

    Aldea Mita Cheryta

    2018-02-01

    Full Text Available This research aimed to analyze the effect of leverage and information asymmetry on firm value, with cash holding as a mediating variable. The population comprised all firms listed on the Indonesia Stock Exchange from 2012 to 2015. Sampling was saturated (a census), consisting of 56 firms meeting the population criteria. The research used secondary data from firm financial reports, analysed with the path analysis method. This research showed that leverage had a negative effect on cash holdings, information asymmetry had a negative effect on firm value through cash holding, and cash holding had a negative effect on firm value. The effect of leverage on cash holding cannot affect firm value, because risk-averse, risk-seeking, and risk-neutral investors each have their own point of view in assessing the company. Cash holdings can lead to information asymmetry, which can create agency conflict affecting company performance, so information asymmetry indirectly lowers firm value.

  17. User requirements Massive Point Clouds for eSciences (WP1)

    NARCIS (Netherlands)

    Suijker, P.M.; Alkemade, I.; Kodde, M.P.; Nonhebel, A.E.

    2014-01-01

    This report is a milestone in work package 1 (WP1) of the project Massive point clouds for eSciences. In WP1 the basic functionalities needed for a new Point Cloud Spatial Database Management System are identified. This is achieved by (1) literature research, (2) discussions with the project

  18. Massive Star Burps, Then Explodes

    Science.gov (United States)

    2007-04-01

    Berkeley -- In a galaxy far, far away, a massive star suffered a nasty double whammy. On Oct. 20, 2004, Japanese amateur astronomer Koichi Itagaki saw the star let loose an outburst so bright that it was initially mistaken for a supernova. The star survived, but for only two years. On Oct. 11, 2006, professional and amateur astronomers witnessed the star actually blowing itself to smithereens as Supernova 2006jc. [Figure: Swift UVOT image. Credit: NASA / Swift / S. Immler] "We have never observed a stellar outburst and then later seen the star explode," says University of California, Berkeley, astronomer Ryan Foley. His group studied the event with ground-based telescopes, including the 10-meter (32.8-foot) W. M. Keck telescopes in Hawaii. Narrow helium spectral lines showed that the supernova's blast wave ran into a slow-moving shell of material, presumably the progenitor's outer layers ejected just two years earlier. If the spectral lines had been caused by the supernova's fast-moving blast wave, the lines would have been much broader. [Figure: artistic rendering of two years in the life of a massive blue supergiant star, which burped and spewed a shell of gas, then, two years later, exploded; when the supernova slammed into the shell of gas, X-rays were produced. Credit: NASA/Sonoma State Univ./A. Simonnet] Another group, led by Stefan Immler of NASA's Goddard Space Flight Center, Greenbelt, Md., monitored SN 2006jc with NASA's Swift satellite and Chandra X-ray Observatory. By observing how the supernova brightened in X-rays, a result of the blast wave slamming into the outburst ejecta, they could measure the amount of gas blown off in the 2004 outburst: about 0.01 solar mass, the equivalent of about 10 Jupiters. "The beautiful aspect of our SN 2006jc observations is that although they were obtained in different parts of the electromagnetic spectrum, in the optical and in X-rays, they lead to the same conclusions," says Immler. "This...
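The abstract quotes an ejected mass of about 0.01 solar mass, "the equivalent of about 10 Jupiters." A quick arithmetic check of that conversion, using nominal solar and Jovian masses:

```python
# Nominal IAU-style masses in kilograms
M_SUN_KG = 1.988e30
M_JUP_KG = 1.898e27

# 0.01 solar masses expressed in Jupiter masses: ~10.5, consistent
# with the "about 10 Jupiters" figure quoted in the abstract.
ejecta_jupiters = 0.01 * M_SUN_KG / M_JUP_KG
```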

  19. An effective theory of massive gauge bosons

    International Nuclear Information System (INIS)

    Doria, R.M.; Helayel Neto, J.A.

    1986-01-01

    The coupling of a group-valued massive scalar field to a gauge field through a symmetric rank-2 field strength is studied. By considering energies very small compared with the mass of the scalar and invoking the decoupling theorem, one is left with a low-energy effective theory describing the dynamics of massive vector fields. (Author) [pt

  20. On the singularities of massive superstring amplitudes

    NARCIS (Netherlands)

    Foda, O.

    1987-01-01

    Superstring one-loop amplitudes with massive external states are shown to be in general ill-defined due to internal on-shell propagators. However, we argue that since any massive string state (in the uncompactified theory) has a finite lifetime to decay into massless particles, such amplitudes are

  1. Massive vector fields and black holes

    International Nuclear Information System (INIS)

    Frolov, V.P.

    1977-04-01

    A massive vector field inside the event horizon created by the static sources located outside the black hole is investigated. It is shown that the back reaction of such a field on the metric near r = 0 cannot be neglected. The possibility of the space-time structure changing near r = 0 due to the external massive field is discussed

  2. Management of massive haemoptysis | Adegboye | Nigerian Journal ...

    African Journals Online (AJOL)

    Background: This study compares two management techniques in the treatment of massive haemoptysis. Method: All patients with massive haemoptysis treated between January 1969 and December 1980 (group 1) were retrospectively reviewed and those prospectively treated between January 1981 and August 1999 ...

  3. Nitrogen chronology of massive main sequence stars

    NARCIS (Netherlands)

    Köhler, K.; Borzyszkowski, M.; Brott, I.; Langer, N.; de Koter, A.

    2012-01-01

    Context. Rotational mixing in massive main sequence stars is predicted to monotonically increase their surface nitrogen abundance with time. Aims. We use this effect to design a method for constraining the age and the inclination angle of massive main sequence stars, given their observed luminosity,

  4. ANALISIS FINANCIAL LEVERAGE PADA PT. RAJAWALI JAYA SAKTI CONTRINDO DI MAKASSAR.

    OpenAIRE

    ANWAR, H. MUH.

    2013-01-01

    2013 H. Muh. ANWAR, A financial leverage analysis at PT. Rajawali Jaya Sakti Contrindo of Makassar (supervised by Hj. Siti Haerani and Kasman Damang). The problem statement of this research is whether financial leverage can increase the company's profit. The objectives of this research are to find out the calculation of financial leverage applied by the company and to analyse the impact of financial leverage on the profit gained by the company. The result of the research on the leverage ratio of PT. Raja...

  5. Leveraging Radioactive Waste Disposal at WIPP for Science

    Science.gov (United States)

    Rempe, N. T.

    2008-12-01

    space (100m x 10m x 6m) is the North Experimental Area (NExA). There, Enriched Xenon Observatory (EXO) collaborators have since mid-2007 been assembling and outfitting six modules and associated structures that were pre-assembled at Stanford University, then dismantled, and shipped to WIPP. Transporting the modules underground presented several interesting challenges, all of which were overcome. Access through increasingly cleaner joined modules leads to the class-100 clean room detector module. Inside, a time projection chamber (TPC) contains 200kg liquid Xe- 136 (the largest non-defense related stockpile of an enriched isotope ever assembled for research). After the experiment starts in early 2009, it is expected to run for 3-5 years. University of Pennsylvania researchers recently sampled WIPP salt to attempt measuring stable Ne-22, resulting from the interaction of cosmogenic muons with Na-23 and preserved in the halite lattice, to determine variations in the cosmic-radiation flux. They in turn could reveal the history of nearby supernovae. University of Chicago/Fermilab researchers evaluate whether to install a superheated-fluid bubble-chamber to search for weakly interacting massive particles (WIMPs). A helium-filled solar neutrino TPC, dark matter and neutron detectors, and proton-decay and supernova-neutrino detectors are other projects that were and are under discussion. Rounding out the spectrum of possibilities are experiments to investigate the effects of long-term ultra-low-dose radiation on cell cultures and laboratory animals to verify or falsify the linear, no- threshold hypothesis. WIPP welcomes additional proposals and projects.

  6. Leveraging Transcultural Enrollments to Enhance Application of the Scientific Method

    Science.gov (United States)

    Loudin, M.

    2013-12-01

    Continued growth of transcultural academic programs presents an opportunity for all of the students involved to improve utilization of the scientific method. Our own business success depends on how effectively we apply the scientific method, and so it is unsurprising that our hiring programs focus on three broad areas of capability among applicants which are strongly related to the scientific method. These are 1) ability to continually learn up-to-date earth science concepts, 2) ability to effectively and succinctly communicate in the English language, both oral and written, and 3) ability to employ behaviors that are advantageous with respect to the various phases of the scientific method. This third area is often the most difficult to develop, because neither so-called Western nor Eastern cultures encourage a suite of behaviors that are ideally suited. Generally, the acceptance of candidates into academic programs, together with subsequent high performance evidenced by grades, is a highly valid measure of continuous learning capability. Certainly, students for whom English is not a native language face additional challenges, but succinct and effective communication is an art which requires practice and development, regardless of native language. The ability to communicate in English is crucial, since it is today's lingua franca for both science and commerce globally. Therefore, we strongly support the use of frequent English written assignments and oral presentations as an integral part of all scientific academic programs. There is no question but that this poses additional work for faculty; nevertheless it is a key ingredient to the optimal development of students. No one culture has a monopoly with respect to behaviors that promote effective leveraging of the scientific method. For instance, the growing complexity of experimental protocols argues for a high degree of interdependent effort, which is more often associated with so-called Eastern than Western

  7. Accelerating Precision Drug Development and Drug Repurposing by Leveraging Human Genetics.

    Science.gov (United States)

    Pulley, Jill M; Shirey-Rice, Jana K; Lavieri, Robert R; Jerome, Rebecca N; Zaleski, Nicole M; Aronoff, David M; Bastarache, Lisa; Niu, Xinnan; Holroyd, Kenneth J; Roden, Dan M; Skaar, Eric P; Niswender, Colleen M; Marnett, Lawrence J; Lindsley, Craig W; Ekstrom, Leeland B; Bentley, Alan R; Bernard, Gordon R; Hong, Charles C; Denny, Joshua C

    2017-04-01

    The potential impact of using human genetic data linked to longitudinal electronic medical records on drug development is extraordinary; however, the practical application of these data necessitates some organizational innovations. Vanderbilt has created resources such as an easily queried database of >2.6 million de-identified electronic health records linked to BioVU, which is a DNA biobank with more than 230,000 unique samples. To ensure these data are used to maximally benefit and accelerate both de novo drug discovery and drug repurposing efforts, we created the Accelerating Drug Development and Repurposing Incubator, a multidisciplinary think tank of experts in various therapeutic areas within both basic and clinical science as well as experts in legal, business, and other operational domains. The Incubator supports a diverse pipeline of drug indication finding projects, leveraging the natural experiment of human genetics.

  8. Bridging the Water Policy and Management Silos: An Opportunity for Leveraged Capacity Building

    Science.gov (United States)

    Wegner, D. L.

    2017-12-01

    The global community is challenged by increasing demand and decreasing water supplies. Historically nations have focused on local or regional water development projects that meet specific needs, often without consideration of the impact on downstream transboundary water users or the watershed itself. Often these decisions have been based on small sets of project specific data with little assessment on river basin impacts. In the United States this disjointed approach to water has resulted in 26 federal agencies having roles in water management or regulation, 50 states addressing water rights and compliance, and a multitude of tribal and local entities intersecting the water process. This approach often manifests itself in a convoluted, disjointed and time-consuming approach. The last systematic and comprehensive review of nationwide water policy was the 1973 National Water Commission Report. A need exists for capacity building collaborative and integrative leadership and dialogue. NASA's Western Water Applications Office (WWAO) provides a unique opportunity to leverage water and terrain data with water agencies and policy makers. A supported WWAO can provide bridges between federal and state water agencies; provide consistent integrated hydrologic and terrain based data set acquired from multiple earth orbiting satellites and airborne platforms; provide data sets leveraged with academic and research based entities to develop specific integrative predictive tools; and evaluate hydrology information across multiple boundaries. It is the author's conclusion that the Western Water Applications Office can provide a value-added approach that will help translate transboundary water and earth terrain information to national policy decisions through education, increased efficiency, increased connectivity, improved coordination, and increased communication. To be effective the WWAO should embrace five objectives: (1) be technically and scientifically valid; (2

  9. The VIDA Framework as an Education Tool: Leveraging Volcanology Data for Educational Purposes

    Science.gov (United States)

    Faied, D.; Sanchez, A.

    2009-04-01

    (On behalf of the SSP08 VAPOR Project Team.) While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. In the effort to address the gaps, a framework evolved at system level. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. While the basic intention of VIDA is to support disaster risk reduction efforts, there are several methods of leveraging raw science data to support education across a wide demographic. Basic geophysical data could be used to educate school children about the characteristics of volcanoes, satellite mappings could support informed growth and development of societies in at-risk areas, and raw sensor data could contribute to a wide range of university-level research projects. Satellite maps, basic geophysical data, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource

  10. Nonsingular universe in massive gravity's rainbow

    Science.gov (United States)

    Hendi, S. H.; Momennia, M.; Eslam Panah, B.; Panahiyan, S.

    2017-06-01

    One of the fundamental open questions in cosmology is whether the universe can evolve without a singularity such as a Big Bang or a Big Rip. This challenging subject stimulates one to regard a nonsingular universe in the far past with an arbitrarily large vacuum energy. Considering the high energy regime in the cosmic history, it is believed that Einstein gravity should be corrected to an effective energy-dependent theory, which could be acquired by gravity's rainbow. On the other hand, employing massive gravity has provided solutions to some of the long-standing fundamental problems of cosmology, such as the cosmological constant problem and the self-acceleration of the universe. Considering these aspects of gravity's rainbow and massive gravity, in this paper, we initiate the study of FRW cosmology in the massive gravity's rainbow formalism. At first, we show that although massive gravity modifies the FRW cosmology, it does not itself remove the Big Bang singularity. Then, we generalize massive gravity to the case of energy-dependent spacetime and find that massive gravity's rainbow can remove the early-universe singularity. We bring together all the essential conditions for having a nonsingular universe, and the effects of both the gravity's rainbow and massive gravity generalizations on such criteria are determined.

  11. Massive open star clusters using the VVV survey IV. WR 62-2, a new very massive star in the core of the VVV CL041 cluster

    Czech Academy of Sciences Publication Activity Database

    Chene, A.-N.; Alegria, S.R.; Borissova, J.; O'Leary, E.; Martins, F.; Hervé, Anthony; Kuhn, M.; Kurtev, R.; Consuelo Amigo Fuentes, P.; Bonatto, C.; Minniti, D.

    2015-01-01

    Roč. 584, December (2015), A31/1-A31/8 ISSN 0004-6361 R&D Projects: GA ČR(CZ) GA14-02385S Institutional support: RVO:67985815 Keywords: open clusters and associations * VVV CL041 * massive stars Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 4.378, year: 2014

  12. Using massive digital libraries a LITA guide

    CERN Document Server

    Weiss, Andrew

    2014-01-01

    Some have viewed the ascendance of the digital library as some kind of existential apocalypse, nothing less than the beginning of the end for the traditional library. But Weiss, recognizing the concept of the library as a "big idea" that has been implemented in many ways over thousands of years, is not so gloomy. In this thought-provoking and unabashedly optimistic book, he explores how massive digital libraries are already adapting to society's needs, and looks ahead to the massive digital libraries of tomorrow, covering: the author's criteria for defining massive digital libraries; a history o...

  13. Leverage effect in financial markets: the retarded volatility model.

    Science.gov (United States)

    Bouchaud, J P; Matacz, A; Potters, M

    2001-11-26

    We investigate quantitatively the so-called "leverage effect," which corresponds to a negative correlation between past returns and future volatility. For individual stocks this correlation is moderate and decays over 50 days, while for stock indices it is much stronger but decays faster. For individual stocks the magnitude of this correlation has a universal value that can be rationalized in terms of a new "retarded" model which interpolates between a purely additive and a purely multiplicative stochastic process. For stock indices a specific amplification phenomenon seems to be necessary to account for the observed amplitude of the effect.
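The "leverage effect" the abstract quantifies is a negative correlation between a past return and future squared volatility. A minimal sketch of one common estimator of that correlation, applied here to synthetic i.i.d. returns (the precise normalisation used in the paper may differ; treat it as an assumption of this sketch):

```python
import random

def leverage_corr(returns, tau):
    """Empirical leverage correlation L(tau) ~ <r(t) * r(t+tau)^2> / <r^2>^2.
    Negative values at tau > 0 are the leverage effect described above."""
    pairs = [(returns[t], returns[t + tau]) for t in range(len(returns) - tau)]
    num = sum(r * s * s for r, s in pairs) / len(pairs)
    var = sum(r * r for r in returns) / len(returns)
    return num / (var * var)

random.seed(0)
rets = [random.gauss(0.0, 0.01) for _ in range(50_000)]
L5 = leverage_corr(rets, 5)  # no systematic leverage effect in i.i.d. returns
```

On real stock or index returns, L(tau) computed this way is negative and decays with tau, which is the empirical signature studied in the record.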

  14. HTSL massive motor. Project: Motor field calculation. Final report

    International Nuclear Information System (INIS)

    Gutt, H.J.; Gruener, A.

    2003-01-01

    HTS motors up to 300 kW were to be developed and optimized. For this, specific calculation methods were enhanced to include superconducting rotor types (hysteresis, reluctance and permanent magnet HTS rotors). The experiments were carried out on a SHM70-45 hysteresis motor. It was shown how static and dynamic trapped-field magnetisation of the rotor with YBCO rings increases flux in the motor air gap, raising the motor capacity to twice its original level. (orig.) [de

  15. Age of blood and survival after massive transfusion.

    Science.gov (United States)

    Sanz, C C; Pereira, A

    2017-11-01

    Massive transfusion is the clinical scenario where the presumed adverse effects of stored blood are expected to be most evident, because the patient's whole blood volume is replaced by stored blood. The aim was to analyse the association between the age of transfused red blood cells (RBC) and survival in massively transfused patients. In this retrospective study, clinical and transfusion data of all consecutive patients massively transfused between 2008 and 2014 in a large, tertiary-care hospital were electronically extracted from the Transfusion Service database and the patients' electronic medical records. Prognostic factors for in-hospital mortality were investigated by multivariate logistic regression. A total of 689 consecutive patients were analysed (median age: 61 years; 65% males) and 272 died in-hospital. Projected mortality at 2, 30, and 90 days was 21%, 35% and 45%, respectively. The odds ratio (OR) for in-hospital mortality among patients who survived beyond the 2nd day increased with patient age (OR: 1.037, 95% CI: 1.021-1.054 per year; P...) and with the volume of RBCs transfused in the first 48 hours (OR: 1.060; 95% CI: 1.038-1.020 per unit; P...). ...transfusion was associated with a higher proportion of old RBCs transfused in the first 48 hours. Other factors associated with poor prognosis were older patient age and larger volumes of transfused RBCs. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
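Per-unit odds ratios like those reported above compound multiplicatively. A small sketch of how such coefficients are read, using the abstract's point estimates purely as illustrative inputs (this is arithmetic on reported ORs, not a clinical model):

```python
def compound_or(or_per_unit, units):
    """Odds multiplier implied by a per-unit odds ratio over `units` increments."""
    return or_per_unit ** units

# OR 1.060 per RBC unit: ten additional units scale the odds of
# in-hospital death by 1.060**10, roughly 1.8x.
ten_unit_or = compound_or(1.060, 10)

# OR 1.037 per year of age: a 20-year age difference implies
# roughly a doubling of the odds.
age_or = compound_or(1.037, 20)
```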

  16. MASSIVE PROTOPLANETARY DISKS IN ORION BEYOND THE TRAPEZIUM CLUSTER

    International Nuclear Information System (INIS)

    Mann, Rita K.; Williams, Jonathan P.

    2009-01-01

    We present Submillimeter Array 1 The Submillimeter Array is a joint project between the Submillimeter Astrophysical Observatory and the Academica Sinica Institute of Astronomy and Astrophysics and is funded by the Smithsonian Institution and the Academica Sinica. observations of the 880 μm continuum emission from three circumstellar disks around young stars in Orion that lie several arcminutes (∼> 1 pc) north of the Trapezium cluster. Two of the three disks are in the binary system 253-1536. Silhouette disks 216-0939 and 253-1536a are found to be more massive than any previously observed Orion disks, with dust masses derived from their submillimeter emission of 0.045 M sun and 0.066 M sun , respectively. The existence of these massive disks reveals that the disk mass distribution in Orion does extend to high masses, and that the truncation observed in the central Trapezium cluster is a result of photoevaporation due to the proximity of O-stars. 253-1536b has a disk mass of 0.018 M sun , making the 253-1536 system the first optical binary in which each protoplanetary disk is massive enough to potentially form solar systems.

  17. A simple analytical model for dynamics of time-varying target leverage ratios

    Science.gov (United States)

    Lo, C. F.; Hui, C. H.

    2012-03-01

    In this paper we have formulated a simple theoretical model for the dynamics of the time-varying target leverage ratio of a firm under some assumptions based upon empirical observations. In our theoretical model the time evolution of the target leverage ratio of a firm can be derived self-consistently from a set of coupled Ito's stochastic differential equations governing the leverage ratios of an ensemble of firms by the nonlinear Fokker-Planck equation approach. The theoretically derived time paths of the target leverage ratio bear great resemblance to those used in the time-dependent stationary-leverage (TDSL) model [Hui et al., Int. Rev. Financ. Analy. 15, 220 (2006)]. Thus, our simple model is able to provide a theoretical foundation for the selected time paths of the target leverage ratio in the TDSL model. We also examine how the pace of the adjustment of a firm's target ratio, the volatility of the leverage ratio and the current leverage ratio affect the dynamics of the time-varying target leverage ratio. Hence, with the proposed dynamics of the time-dependent target leverage ratio, the TDSL model can be readily applied to generate the default probabilities of individual firms and to assess the default risk of the firms.
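The dynamics described above can be illustrated with a discretised mean-reverting SDE for a single firm's leverage ratio. The functional form, parameters, and target path below are illustrative assumptions for an Euler-Maruyama sketch, not the paper's coupled-ensemble model or the TDSL calibration:

```python
import random

def simulate_leverage(l0, target, kappa, sigma, dt, n_steps, rng):
    """Euler-Maruyama path for dL = kappa*(target - L)*dt + sigma*L*dW:
    leverage reverting to a (here constant) target with multiplicative noise."""
    path = [l0]
    for _ in range(n_steps):
        l = path[-1]
        dW = rng.gauss(0.0, dt ** 0.5)
        path.append(l + kappa * (target - l) * dt + sigma * l * dW)
    return path

rng = random.Random(42)
# An over-levered firm (0.6) adjusting toward a 0.3 target over ~10 years
path = simulate_leverage(l0=0.6, target=0.3, kappa=2.0, sigma=0.1,
                         dt=1 / 252, n_steps=2520, rng=rng)
# By the end of the horizon the simulated ratio hovers near the 0.3 target.
```

The adjustment pace (kappa) and leverage volatility (sigma) play the roles the paragraph describes: larger kappa pulls the ratio to target faster, larger sigma widens the fluctuation band around it.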

  18. Massive congenital tricuspid insufficiency in the newborn

    International Nuclear Information System (INIS)

    Bogren, H.G.; Ikeda, R.; Riemenschneider, T.A.; Merten, D.F.; Janos, G.G.

    1979-01-01

    Three cases of massive congenital tricuspid incompetence in the newborn are reported and discussed from diagnostic, pathologic and etiologic points of view. The diagnosis is important as cases have been reported with spontaneous resolution. (Auth.)

  19. Current management of massive hemorrhage in trauma

    DEFF Research Database (Denmark)

    Johansson, Pär I; Stensballe, Jakob; Ostrowski, Sisse R

    2012-01-01

    ABSTRACT: Hemorrhage remains a major cause of potentially preventable deaths. Trauma and massive transfusion are associated with coagulopathy secondary to tissue injury, hypoperfusion, dilution, and consumption of clotting factors and platelets. Concepts of damage control surgery have evolved...

  20. How I treat patients with massive hemorrhage

    DEFF Research Database (Denmark)

    Johansson, Pär I; Stensballe, Jakob; Oliveri, Roberto

    2014-01-01

    Massive hemorrhage is associated with coagulopathy and high mortality. The transfusion guidelines up to 2006 recommended that resuscitation of massive hemorrhage should occur in successive steps using crystalloids, colloids and red blood cells (RBC) in the early phase, and plasma and platelets...... in the late phase. With the introduction of the cell-based model of hemostasis in the mid-1990s, our understanding of the hemostatic process and of coagulopathy has improved. This has contributed to a change in resuscitation strategy and transfusion therapy of massive hemorrhage along with an acceptance...... outcome, although final evidence on outcome from randomized controlled trials is lacking. We present here how we, in Copenhagen and Houston, today manage patients with massive hemorrhage....

  1. Massive cerebellar infarction: a neurosurgical approach

    Directory of Open Access Journals (Sweden)

    Salazar Luis Rafael Moscote

    2015-12-01

    Full Text Available Cerebellar infarction is a challenge for the neurosurgeon. Rapid recognition is crucial to avoid devastating consequences. A massive cerebellar infarction shows pseudotumoral behavior and, by definition, affects at least one third of the volume of the cerebellum. The irrigation of the cerebellum presents anatomical diversity, favoring the appearance of atypical infarcts. Neurosurgical management is critical for massive cerebellar infarction. We present a review of the literature.

  2. Mini researchers for massive experiments

    CERN Multimedia

    Mélissa Lanaro

    2011-01-01

    On Friday 15 April, CERN welcomed the first classes participating in the “Dans la peau d’un chercheur” project. Over the last two months, students from 30 primary school classes have been gaining new insight into life as a researcher and learning the principles of the experimental method (see Bulletin No. 05-06/2011). The school visits to CERN or the University of Geneva are an important part of the project. For a few hours, students are given the chance to meet physicists to get a behind-the-scenes look at experimental physics in “real” laboratories. Laetitia Dufay-Chanat and Johan Bremer, from the cryogenics laboratory, delighted students from the Ornex School (see photo) by conducting experiments demonstrating different states of matter.      

  3. PENGARUH KEPEMILIKAN MANAJERIAL, LEVERAGE DAN PROFITABILITAS TERHADAP KEBIJAKAN INVESTASI PERUSAHAAN

    Directory of Open Access Journals (Sweden)

    Anggita Langgeng Wijaya

    2012-03-01

    Full Text Available This research tests the effect of managerial ownership, leverage, and profitability on corporate investment policy for a sample of manufacturing companies listed in the Indonesia Stock Exchange over the period 2006-2008. The population of this research is all manufacturing companies on the Indonesia Stock Exchange. A purposive sampling method was employed, and the data analysis technique included the classic assumption tests: multicollinearity test, autocorrelation test, heteroscedasticity test, and normality test. The hypotheses were tested with multiple regression analysis. The results show that managerial ownership has no significant effect on corporate investment policy, leverage positively affects corporate investment policy, and profitability positively affects corporate investment policy.

  4. KARAKTERISTIK EKSEKUTIF TERHADAP TAX AVOIDANCE DENGAN LEVERAGE SEBAGAI VARIABEL INTERVENING

    Directory of Open Access Journals (Sweden)

    Verani Carolina

    2017-03-01

    Full Text Available This research aimed to examine the influence of executive characteristics on corporate tax avoidance. Risk-taking executives tended to be more courageous and aggressive in taking decisions related to tax; on the contrary, risk-averse executives tended to be more careful (Low, 2006). This research used leverage as an intervening variable. Therefore, there was an assumption that the executive characteristic determined the corporate leverage, which then influenced tax avoidance in the company. Manufacturing companies listed in the Indonesia Stock Exchange during the period 2010-2012 were used as samples. This research used a purposive sampling method to select the sample with the following criteria: the companies were listed in the Indonesia Stock Exchange during the period 2010-2012, they made a profit during the period 2010-2012, and they used the rupiah as reporting currency. Data was processed using path analysis, and the result showed that the executive characteristic had an impact on corporate tax avoidance with leverage as the intervening variable. The result of this research could be used by investors to assess corporate tax avoidance before making a decision, and also by policy makers to detect corporate tax avoidance.

  5. Pengaruh Rasio Aktivitas Dan Rasio Leverage Terhadap Tingkat Profitabilitas

    Directory of Open Access Journals (Sweden)

    Tri Noormuliyaningsih

    2016-06-01

    Full Text Available The objective of this research is to analyze the influence of activity ratios (inventory turnover, fixed assets turnover, and total assets turnover) and leverage ratios (debt ratio and debt to equity ratio) on profitability (return on assets and return on equity) for food and beverage companies listed in the Indonesia Stock Exchange (IDX). The sample consists of 14 (fourteen) food and beverage companies listed in the IDX. The observation period is 3 (three) years, from 2012 until 2014. Multiple linear regression is the method used to analyze the data, and the raised hypotheses are tested with the t test. The research concludes that the debt ratio significantly affects the company's profitability (return on assets and return on equity) with a negative coefficient. The other variables, inventory turnover, fixed assets turnover, and total assets turnover, do not affect profitability (return on assets and return on equity). The influence of the debt to equity ratio on profitability cannot be concluded in this research. Keywords: Activity Ratio, Inventory Turnover, Fixed Assets Turnover, Total Assets Turnover, Leverage Ratio, Debt Ratio, Debt to Equity Ratio, Profitability Level, Return On Assets, Return On Equity.
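For readers unfamiliar with the ratios named in this record, the sketch below shows their textbook definitions with hypothetical year-end figures (the numbers are invented for illustration, not taken from the study's sample):

```python
# Hypothetical year-end figures for one company (arbitrary currency units).
sales, cogs = 120.0, 80.0
inventory, fixed_assets, total_assets = 16.0, 40.0, 100.0
total_debt, equity = 45.0, 55.0
net_income = 9.0

# Activity ratios: how intensively assets generate sales.
inventory_turnover = cogs / inventory
fixed_assets_turnover = sales / fixed_assets
total_assets_turnover = sales / total_assets

# Leverage ratios: how much of the firm is financed with debt.
debt_ratio = total_debt / total_assets
debt_to_equity = total_debt / equity

# Profitability: return on assets and on equity.
roa = net_income / total_assets
roe = net_income / equity
```

The study regresses the profitability measures (ROA, ROE) on the activity and leverage ratios above.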

  6. LEVERAGING EXISTING HERITAGE DOCUMENTATION FOR ANIMATIONS: SENATE VIRTUAL TOUR

    Directory of Open Access Journals (Sweden)

    A. Dhanda

    2017-08-01

    Full Text Available The use of digital documentation techniques has led to an increase in opportunities for using documentation data for valorization purposes, in addition to technical purposes. Likewise, building information models (BIMs) made from these data sets hold valuable information that can be as effective for public education as it is for rehabilitation. A BIM can reveal the elements of a building, as well as the different stages of a building over time. Valorizing this information increases the possibility for public engagement and interest in a heritage place. Digital data sets were leveraged by the Carleton Immersive Media Studio (CIMS) for parts of a virtual tour of the Senate of Canada. For the tour, workflows involving four different programs were explored to determine an efficient and effective way to leverage the existing documentation data to create informative and visually enticing animations for public dissemination: Autodesk Revit, Enscape, Autodesk 3ds Max, and Bentley Pointools. The explored workflows involve animations of point clouds, BIMs, and a combination of the two.

  7. Leveraging Existing Heritage Documentation for Animations: Senate Virtual Tour

    Science.gov (United States)

    Dhanda, A.; Fai, S.; Graham, K.; Walczak, G.

    2017-08-01

    The use of digital documentation techniques has led to an increase in opportunities for using documentation data for valorization purposes, in addition to technical purposes. Likewise, building information models (BIMs) made from these data sets hold valuable information that can be as effective for public education as it is for rehabilitation. A BIM can reveal the elements of a building, as well as the different stages of a building over time. Valorizing this information increases the possibility for public engagement and interest in a heritage place. Digital data sets were leveraged by the Carleton Immersive Media Studio (CIMS) for parts of a virtual tour of the Senate of Canada. For the tour, workflows involving four different programs were explored to determine an efficient and effective way to leverage the existing documentation data to create informative and visually enticing animations for public dissemination: Autodesk Revit, Enscape, Autodesk 3ds Max, and Bentley Pointools. The explored workflows involve animations of point clouds, BIMs, and a combination of the two.

  8. Supplier relationship management leverages intellectual capital for increased competitive advantage

    Directory of Open Access Journals (Sweden)

    C. R. Van Zyl

    2005-12-01

    Full Text Available The main purpose of this article is to demonstrate how supplier relationship management (SRM) enables the capture and creation of intellectual capital, thereby attaining and sustaining a strategic competitive advantage and increasing supply chain profitability. In order to achieve this purpose, a large part of the article is devoted to exploring the relatively new and unknown field of SRM. It is shown that an organisation must possess a thorough understanding of good supplier characteristics and of the drivers, benefits and requirements for the successful implementation of SRM, in order to enable that organisation to leverage their supplier relationships to ensure the capture of supplier expertise, patents, experience, etc. (i.e., their intellectual capital). The article then explores how the integration of technology in SRM applications can improve the efficiency of supplier collaboration and intellectual capital capture and creation. It is then demonstrated how efficient and collaborative supplier relationships improve supply chain profitability and competitiveness. Lastly, the article explores the implementation pitfalls and trends of SRM that must be constantly considered and monitored by an organisation in order to continually capture and create intellectual capital and reap the full benefits of SRM. This exploration involved an examination of contemporary literature, theories and business cases and subsequently revealed that SRM is a vital discipline/philosophy that must be implemented by any organisation wishing to achieve greater supply chain efficiency and competitiveness. This competitiveness can only be achieved through the mutual unlocking, sharing and leveraging of intellectual capital.

  9. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.; Huelskamp, Robert M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  10. On the Performance of the Measure for Diagnosing Multiple High Leverage Collinearity-Reducing Observations

    Directory of Open Access Journals (Sweden)

    Arezoo Bagheri

    2012-01-01

    Full Text Available There is strong evidence indicating that the existing measures designed to detect a single high leverage collinearity-reducing observation are not effective in the presence of multiple high leverage collinearity-reducing observations. In this paper, we propose a cutoff point for a newly developed high leverage collinearity-influential measure and two existing measures to identify high leverage collinearity-reducing observations, the high leverage points which hide multicollinearity in a data set. It is important to detect these observations as they are responsible for misleading inferences about the fitting of the regression model. The merit of our proposed measure and cutoff point in detecting high leverage collinearity-reducing observations is investigated by using engineering data and Monte Carlo simulations.
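High leverage points are conventionally diagnosed from the diagonal of the hat matrix, the quantity the measures in this record build upon. The sketch below shows the standard diagnostic (not the paper's proposed collinearity-influential measure) with one planted high leverage observation and the common 2p/n rule-of-thumb cutoff:

```python
import numpy as np

def leverages(X):
    """Diagonal of the hat matrix H = X (X'X)^{-1} X'.
    h_ii measures how far observation i lies in predictor space."""
    Q, _ = np.linalg.qr(X)          # QR avoids explicitly inverting X'X
    return np.sum(Q ** 2, axis=1)   # h_ii = row norms squared of Q

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
X[0, 1:] = [8.0, -8.0]              # plant one high leverage observation
h = leverages(X)
cutoff = 2 * X.shape[1] / X.shape[0]   # rule of thumb: flag h_ii > 2p/n
```

The planted point dominates the leverage values, and the diagonal sums to p (here 3), the number of regression parameters.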

  11. Leveraging Citizen Science and Information Technology for Population Physical Activity Promotion

    Science.gov (United States)

    King, Abby C.; Winter, Sandra J.; Sheats, Jylana L.; Rosas, Lisa G.; Buman, Matthew P.; Salvo, Deborah; Rodriguez, Nicole M.; Seguin, Rebecca A.; Moran, Mika; Garber, Randi; Broderick, Bonnie; Zieff, Susan G.; Sarmiento, Olga Lucia; Gonzalez, Silvia A.; Banchoff, Ann; Dommarco, Juan Rivera

    2016-01-01

    PURPOSE While technology is a major driver of many of society’s comforts, conveniences, and advances, it has been responsible, in a significant way, for engineering regular physical activity and a number of other positive health behaviors out of people’s daily lives. A key question concerns how to harness information and communication technologies (ICT) to bring about positive changes in the health promotion field. One such approach involves community-engaged “citizen science,” in which local residents leverage the potential of ICT to foster data-driven consensus-building and mobilization efforts that advance physical activity at the individual, social, built environment, and policy levels. METHOD The history of citizen science in the research arena is briefly described and an evidence-based method that embeds citizen science in a multi-level, multi-sectoral community-based participatory research framework for physical activity promotion is presented. RESULTS Several examples of this citizen science-driven community engagement framework for promoting active lifestyles, called “Our Voice”, are discussed, including pilot projects from diverse communities in the U.S. as well as internationally. CONCLUSIONS The opportunities and challenges involved in leveraging citizen science activities as part of a broader population approach to promoting regular physical activity are explored. The strategic engagement of citizen scientists from socio-demographically diverse communities across the globe as both assessment as well as change agents provides a promising, potentially low-cost and scalable strategy for creating more active, healthful, and equitable neighborhoods and communities worldwide. PMID:27525309

  12. The Golden Target: Analyzing the Tracking Performance of Leveraged Gold ETFs

    OpenAIRE

    Tim Leung; Brian Ward

    2015-01-01

    This paper studies the empirical tracking performance of leveraged ETFs on gold, and their price relationships with gold spot and futures. For tracking the gold spot, we find that our optimized portfolios with short-term gold futures are highly effective in replicating prices. The market-traded gold ETF (GLD) also exhibits a similar tracking performance. However, we show that leveraged gold ETFs tend to underperform their corresponding leveraged benchmark. Moreover, the underperformance worse...
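The compounding drag behind this kind of underperformance can be illustrated with a toy example: a daily-rebalanced 2x fund versus twice the benchmark's total return over a volatile, roughly flat year. The return path below is invented for illustration and is not actual GLD or gold futures data:

```python
import numpy as np

# A daily-rebalanced 2x fund compounds 2x the daily return, so over a
# volatile, sideways year it decays faster than twice the benchmark.
daily = np.array([0.05, -0.05] * 126)         # 252 alternating +/-5% days
benchmark_total = np.prod(1 + daily) - 1      # each +5%/-5% pair: x0.9975
etf_2x_total = np.prod(1 + 2 * daily) - 1     # each +10%/-10% pair: x0.99
```

Each up/down pair costs the benchmark 0.25% but the 2x fund 1%, so the leveraged fund ends far below twice the benchmark's cumulative return.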

  13. REVERSE LEVERAGED BUYOUT RETURN BEHAVIOR: SOME EUROPEAN EVIDENCE

    Directory of Open Access Journals (Sweden)

    Trevor W. Chamberlain

    2017-12-01

    Full Text Available This study investigates the stock performance of reverse leveraged buyouts (RLBOs before, during, and after the global financial crisis. An RLBO consists of the return to public investors (i.e. the offering of stocks to the public of a company that had gone private after a leveraged buyout (LBO led by a private equity fund. The value created by an RLBO resides in the changes brought by the LBO fund while it owns the company. After a “repackaging” of the bought company, the private equity fund sells the company’s shares to the public. Most of the research on this topic, based on RLBOs that occurred between 1980 and 2005 in the US, has shown that RLBOs outperform their peers (i.e. other IPOs and outperform the market after going public again. Focusing on RLBO companies in Europe in the financial crisis era, this study investigates whether they also outperform other IPOs and the market. The study is based on a sample of 421 IPOs occurring between 2001 and 2011 in France, Germany and the UK, of which 52 are RLBOs. We examine RLBO performance one day, one month, one year and three years after the offering. We also use event study methods to investigate the impact of the global financial crisis on RLBO performance. We find that European RLBOs outperform both their peers (i.e. “classic” IPOs and the market during the period studied. This outperformance does not diminish in the long-term. The global financial crisis appears to have affected RLBO performance, which weakened between 2007 and 2009, though RLBOs still outperformed the market. In addition, multivariable regressions were used to examine various extant explanations for RLBO outperformance. This analysis did not support any of the prevailing theories. In particular, the value created by RLBOs does not appear to be linked to LBO duration, sponsor reputation, or to the level of leverage employed. There is no evidence of time or industry effects. Moreover, RLBO performance shows no

  14. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D), all using the same conceptual model
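The multigrid idea the abstract refers to can be sketched on a toy 1-D Poisson problem. The example below is a generic two-grid cycle (weighted Jacobi smoothing, injection restriction, direct coarse solve, linear prolongation), not the authors' production flow solver:

```python
import numpy as np

def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi smoother for the 1-D Poisson problem -u'' = f."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def two_grid_cycle(u, f, h):
    """Pre-smooth, restrict the residual, solve the coarse problem
    directly, prolong the correction, post-smooth."""
    n = u.size
    u = jacobi(u, f, h, sweeps=3)
    r = np.zeros(n)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    rc = r[::2].copy()                       # restriction by injection
    nc, hc = rc.size, 2 * h
    A = (np.diag(2.0 * np.ones(nc - 2))      # coarse 1-D Laplacian
         - np.diag(np.ones(nc - 3), 1)
         - np.diag(np.ones(nc - 3), -1)) / (hc * hc)
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])  # coarse-grid error equation
    u += np.interp(np.arange(n), np.arange(0, n, 2), ec)  # linear prolongation
    return jacobi(u, f, h, sweeps=3)

n = 65                                       # grid points incl. boundaries (odd)
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.ones(n)
u = np.zeros(n)
for _ in range(5):
    u = two_grid_cycle(u, f, h)
err = np.max(np.abs(u - x * (1 - x) / 2))    # exact solution of -u'' = 1
```

Recursing on the coarse solve instead of solving it directly turns this two-grid cycle into a full multigrid V-cycle, whose work scales linearly with the number of unknowns, which is what makes the approach attractive on massively parallel machines.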

  15. Distributed Fast Self-Organized Maps for Massive Spectrophotometric Data Analysis †

    Directory of Open Access Journals (Sweden)

    Carlos Dafonte

    2018-05-01

    Full Text Available Analyzing huge amounts of data becomes essential in the era of Big Data, where databases are populated with hundreds of Gigabytes that must be processed to extract knowledge. Hence, classical algorithms must be adapted towards distributed computing methodologies that leverage the underlying computational power of these platforms. Here, a parallel, scalable, and optimized design for self-organized maps (SOM) is proposed in order to analyze massive data gathered by the spectrophotometric sensor of the European Space Agency (ESA) Gaia spacecraft, although it could be extrapolated to other domains. The performance comparison between the sequential implementation and the distributed ones based on Apache Hadoop and Apache Spark is an important part of the work, as well as the detailed analysis of the proposed optimizations. Finally, a domain-specific visualization tool to explore astronomical SOMs is presented.

  16. Distributed Fast Self-Organized Maps for Massive Spectrophotometric Data Analysis †.

    Science.gov (United States)

    Dafonte, Carlos; Garabato, Daniel; Álvarez, Marco A; Manteiga, Minia

    2018-05-03

    Analyzing huge amounts of data becomes essential in the era of Big Data, where databases are populated with hundreds of Gigabytes that must be processed to extract knowledge. Hence, classical algorithms must be adapted towards distributed computing methodologies that leverage the underlying computational power of these platforms. Here, a parallel, scalable, and optimized design for self-organized maps (SOM) is proposed in order to analyze massive data gathered by the spectrophotometric sensor of the European Space Agency (ESA) Gaia spacecraft, although it could be extrapolated to other domains. The performance comparison between the sequential implementation and the distributed ones based on Apache Hadoop and Apache Spark is an important part of the work, as well as the detailed analysis of the proposed optimizations. Finally, a domain-specific visualization tool to explore astronomical SOMs is presented.
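A SOM of the kind described can be sketched in a few lines of NumPy. The example below is a generic online SOM trained on toy Gaussian clusters standing in for real spectrophotometric samples; it is not the distributed Hadoop/Spark implementation or the Gaia data this record analyzes:

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Online SOM training: for each sample, find the best-matching unit
    (BMU) and pull it and its grid neighbours toward the sample."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    w = rng.normal(size=(n_units, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)
    t, n_iter = 0, epochs * len(data)
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * np.exp(-t / n_iter)          # decaying learning rate
            sigma = sigma0 * np.exp(-t / n_iter)    # shrinking neighbourhood
            bmu = np.argmin(np.sum((w - x) ** 2, axis=1))
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            neigh = np.exp(-d2 / (2.0 * sigma ** 2))
            w += lr * neigh[:, None] * (x - w)
            t += 1
    return w

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(-2.0, 0.3, (50, 3)),   # two toy clusters
                  rng.normal(2.0, 0.3, (50, 3))])
w = train_som(data)
qe = np.mean([np.min(np.linalg.norm(w - s, axis=1)) for s in data])
```

The quantization error `qe` (mean distance from each sample to its BMU) drops well below the cluster separation once the map has organized. Distributing this loop is what the record's Hadoop/Spark designs address.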

  17. Science from a glimpse: Hubble SNAPshot observations of massive galaxy clusters

    Science.gov (United States)

    Repp, A.; Ebeling, H.

    2018-06-01

    Hubble Space Telescope SNAPshot surveys of 86 X-ray selected galaxy clusters at 0.3 0.3. Examining the evolution of the slope of the cluster red sequence, we observe at best a slight decrease with redshift, indicating minimal age contribution since z ˜ 1. Congruent to previous studies' findings, we note that the two BCGs which are significantly bluer (≥5σ) than their clusters' red sequences reside in relaxed clusters and exhibit pronounced internal structure. Thanks to our targets' high X-ray luminosity, the subset of our sample observed with Chandra adds valuable leverage to the X-ray luminosity-optical richness relation, which, albeit with substantial scatter, is now clearly established from groups to extremely massive clusters of galaxies. We conclude that SNAPshot observations of MACS clusters stand to continue to play a vital pathfinder role for astrophysical investigations across the entire electromagnetic spectrum.

  18. Leveraging business intelligence to make better decisions: Part III.

    Science.gov (United States)

    Reimers, Mona

    2014-01-01

    Accounts receivable and scheduling datasets have been available to medical practices since the 1990s, and discrete medical records data have become available over the past few years. But the frustrations arising from the difficulties of reporting data grew with each keyboard stroke and mouse click. With reporting mandated to meet changing payment models and to measure quality of care and medical outcomes, practice managers must find more efficient and effective methods of extracting and compiling the data they have in their systems. Taming the reporting beast and learning to effectively apply business intelligence (BI) tools will become an expected managerial proficiency in the next few years. Practice managers' roles are changing quickly, and they will be required to understand the meaning of their practice's data and craft ways to leverage that data toward a strategic advantage.

  19. Leveraging LSTM for rapid intensifications prediction of tropical cyclones

    Science.gov (United States)

    Li, Y.; Yang, R.; Yang, C.; Yu, M.; Hu, F.; Jiang, Y.

    2017-10-01

    Tropical cyclones (TCs) usually cause severe damage and destruction. TC intensity forecasting helps people prepare for extreme weather and could save lives and property. Rapid intensifications (RI) of TCs are the major error sources of TC intensity forecasting. A large number of factors, such as sea surface temperature and wind shear, affect the RI processes of TCs. A great deal of work has been done to identify the combination of conditions most favorable to RI. In this study, a deep learning method is utilized to combine conditions for RI prediction of TCs. Experiments show that the long short-term memory (LSTM) network provides the ability to leverage past conditions to predict TC rapid intensifications.

  20. Leveraging LSTM for rapid intensifications prediction of tropical cyclones

    Directory of Open Access Journals (Sweden)

    Y. Li

    2017-10-01

    Full Text Available Tropical cyclones (TCs) usually cause severe damage and destruction. TC intensity forecasting helps people prepare for extreme weather and could save lives and property. Rapid intensifications (RI) of TCs are the major error sources of TC intensity forecasting. A large number of factors, such as sea surface temperature and wind shear, affect the RI processes of TCs. A great deal of work has been done to identify the combination of conditions most favorable to RI. In this study, a deep learning method is utilized to combine conditions for RI prediction of TCs. Experiments show that the long short-term memory (LSTM) network provides the ability to leverage past conditions to predict TC rapid intensifications.
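The LSTM mechanism that lets a network carry past conditions forward can be sketched as a single cell step. The example below is a generic NumPy forward pass with random toy weights and inputs; the feature count and sequence length are invented, not the authors' trained model or predictor set:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_step(x, h, c, W, U, b):
    """One step of a standard LSTM cell: input/forget/output gates plus
    a tanh candidate, computed from input x and previous state (h, c)."""
    n = h.size
    z = W @ x + U @ h + b                 # all four gate pre-activations, (4n,)
    i = sigmoid(z[0:n])                   # input gate
    f = sigmoid(z[n:2 * n])               # forget gate: keeps past conditions
    o = sigmoid(z[2 * n:3 * n])           # output gate
    g = np.tanh(z[3 * n:4 * n])           # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
n_feat, n_hid = 8, 16                     # e.g. 8 environmental predictors
W = rng.normal(0.0, 0.1, (4 * n_hid, n_feat))
U = rng.normal(0.0, 0.1, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(24, n_feat)):   # a toy sequence of condition vectors
    h, c = lstm_step(x, h, c, W, U, b)
```

The forget gate `f` is what allows the cell state `c` to accumulate earlier conditions across the sequence; a classifier head on the final `h` would then predict RI versus non-RI.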

  1. Leveraging ecological theory to guide natural product discovery.

    Science.gov (United States)

    Smanski, Michael J; Schlatter, Daniel C; Kinkel, Linda L

    2016-03-01

    Technological improvements have accelerated natural product (NP) discovery and engineering to the point that systematic genome mining for new molecules is on the horizon. NP biosynthetic potential is not equally distributed across organisms, environments, or microbial life histories, but instead is enriched in a number of prolific clades. Also, NPs are not equally abundant in nature; some are quite common and others markedly rare. Armed with this knowledge, random 'fishing expeditions' for new NPs are increasingly hard to justify. Understanding the ecological and evolutionary pressures that drive the non-uniform distribution of NP biosynthesis provides a rational framework for the targeted isolation of strains enriched in new NP potential. Additionally, ecological theory leads to testable hypotheses regarding the roles of NPs in shaping ecosystems. Here we review several recent strain prioritization practices and discuss the ecological and evolutionary underpinnings for each. Finally, we offer perspectives on leveraging microbial ecology and evolutionary biology for future NP discovery.

  2. Route Instruction Mechanism for Mobile Users Leveraging Distributed Wireless Resources

    Science.gov (United States)

    Kakehi, Takeshi; Shinkuma, Ryoichi; Murase, Tutomu; Motoyoshi, Gen; Yamori, Kyoko; Takahashi, Tatsuro

    The growing markets for smart-phones and thin clients have significantly increased communication traffic in mobile networks. To handle the increased traffic, network operators should consider how to leverage distributed wireless resources such as distributed spots of wireless local access networks. In this paper, we consider a system in which multiple moving users share distributed wireless access points on their traveling routes between their start and goal points, and we formulate it as an optimization problem. We then propose three algorithms to solve the problem. The key idea here is 'longcut route instruction', in which users are instructed to choose a traveling route where less congested access points are available; even if the moving distance increases, the throughput for users in the system improves. In this paper, we define the gain function. Moreover, we analyze the basic characteristics of the system using as simple a model as possible.

  3. Quality and efficiency successes leveraging IT and new processes.

    Science.gov (United States)

    Chaiken, Barry P; Christian, Charles E; Johnson, Liz

    2007-01-01

    Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.

  4. Leverage hadoop framework for large scale clinical informatics applications.

    Science.gov (United States)

    Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise

    2013-01-01

    In this manuscript, we present our experiences using the Apache Hadoop framework for high data volume and computationally intensive applications, and discuss some best practice guidelines in a clinical informatics setting. There are three main aspects in our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) after fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column oriented features in HBase for patient centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing, imperative "Big Data" needs of clinical and translational research. The intrinsic advantage of fault tolerance, high availability and scalability of Hadoop platform makes these applications readily deployable at the enterprise level cluster environment.
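The map-shuffle-reduce pattern behind point (a) can be sketched in plain Python. The example below mimics the three phases Hadoop runs across a cluster on a toy clinical record set; the data and field names are hypothetical, and a real job would express `mapper` and `reducer` through the Hadoop or MapReduce APIs rather than in-process functions:

```python
from collections import defaultdict
from itertools import chain

# Toy job: count lab-test codes per patient across heterogeneous records.
records = [
    {"patient": "p1", "tests": ["hba1c", "ldl", "hba1c"]},
    {"patient": "p2", "tests": ["ldl"]},
    {"patient": "p1", "tests": ["ldl"]},
]

def mapper(record):
    """Map phase: emit ((patient, test), 1) for every observed test."""
    for test in record["tests"]:
        yield (record["patient"], test), 1

def shuffle(pairs):
    """Shuffle phase: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    """Reduce phase: aggregate each key's values."""
    return key, sum(values)

mapped = chain.from_iterable(mapper(r) for r in records)
counts = dict(reducer(k, v) for k, v in shuffle(mapped).items())
```

Because each mapper call touches one record and each reducer call one key, both phases parallelize across cluster nodes, which is the fault-tolerant scalability the manuscript relies on.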

  5. Leveraging Social Links for Trust and Privacy in Networks

    Science.gov (United States)

    Cutillo, Leucio Antonio; Molva, Refik; Strufe, Thorsten

    Existing on-line social networks (OSN) such as Facebook suffer from several weaknesses regarding privacy and security due to their inherent handling of personal data. As pointed out in [4], a preliminary analysis of existing OSNs shows that they are subject to a number of vulnerabilities, ranging from cloning legitimate users to sybil attacks and privacy violations. Starting from these OSN vulnerabilities as the first step of a broader research activity, we came up with a new approach that is very promising for revisiting security and privacy problems in distributed systems and networks. We suggest a solution that both avoids any centralized control and leverages the real-life trust between users that is part of the social network application itself. An anonymization technique based on multi-hop routing among trusted nodes guarantees privacy in data access and, generally speaking, in all OSN operations.

  6. PENGARUH PROFIT MARGIN, ASSETS TURNOVER DAN LEVERAGE TERHADAP SUSTAINABLE GROWTH RATE PADA PERUSAHAAN SEKTOR JASA YANG TERDAFTAR DI BURSA EFEK INDONESIA PERIODE 2010-2012

    Directory of Open Access Journals (Sweden)

    Arim Nasim

    2015-04-01

    Full Text Available This study aims to determine the effect of profit margin, assets turnover and leverage on the sustainable growth rate. Profit margin, asset turnover and leverage serve as the independent variables and the sustainable growth rate as the dependent variable. The study also describes the state of profit margin, proxied by Net Profit Margin (NPM), asset turnover, proxied by Total Assets Turnover (TATO), leverage, proxied by the Debt to Equity Ratio (DER), and the Sustainable Growth Rate (SGR) in the service sector. The research covers service-sector companies listed on the Indonesia Stock Exchange in 2010-2012, with data obtained from the Bursa Efek Indonesia website. The analysis uses multiple linear regression, with t-statistics to test the partial effect of each independent variable on the dependent variable. Classical assumption tests (data normality, multicollinearity, heteroscedasticity and autocorrelation) found no variables deviating from the classical assumptions. The results show that profit margin, asset turnover and leverage each have a positive effect on the sustainable growth rate.
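The multiple linear regression the study applies — SGR regressed on NPM, TATO and DER — can be illustrated with ordinary least squares on synthetic data (the sample and coefficients below are invented for illustration and are not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60

# Hypothetical firm-year observations of the three proxies
npm = rng.uniform(0.02, 0.20, n)    # Net Profit Margin
tato = rng.uniform(0.5, 2.5, n)     # Total Assets Turnover
der = rng.uniform(0.2, 2.0, n)      # Debt to Equity Ratio

# Synthetic SGR generated with known positive coefficients plus noise
sgr = 0.05 + 1.2 * npm + 0.03 * tato + 0.02 * der + rng.normal(0.0, 0.005, n)

X = np.column_stack([np.ones(n), npm, tato, der])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, sgr, rcond=None)      # OLS coefficient estimates
# beta[1:] should approximately recover the positive coefficients used above
```

A study like this one would additionally report t-statistics per coefficient and the classical assumption tests; the sketch shows only the core estimation step.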

  7. Leveraging Environmental Correlations: The Thermodynamics of Requisite Variety

    Science.gov (United States)

    Boyd, Alexander B.; Mandal, Dibyendu; Crutchfield, James P.

    2017-06-01

    Key to biological success, the requisite variety that confronts an adaptive organism is the set of detectable, accessible, and controllable states in its environment. We analyze its role in the thermodynamic functioning of information ratchets—a form of autonomous Maxwellian Demon capable of exploiting fluctuations in an external information reservoir to harvest useful work from a thermal bath. This establishes a quantitative paradigm for understanding how adaptive agents leverage structured thermal environments for their own thermodynamic benefit. General ratchets behave as memoryful communication channels, interacting with their environment sequentially and storing results to an output. The bulk of thermal ratchets analyzed to date, however, assume memoryless environments that generate input signals without temporal correlations. Employing computational mechanics and a new information-processing Second Law of Thermodynamics (IPSL) we remove these restrictions, analyzing general finite-state ratchets interacting with structured environments that generate correlated input signals. On the one hand, we demonstrate that a ratchet need not have memory to exploit an uncorrelated environment. On the other, and more appropriate to biological adaptation, we show that a ratchet must have memory to most effectively leverage structure and correlation in its environment. The lesson is that to optimally harvest work a ratchet's memory must reflect the input generator's memory. Finally, we investigate achieving the IPSL bounds on the amount of work a ratchet can extract from its environment, discovering that finite-state, optimal ratchets are unable to reach these bounds. In contrast, we show that infinite-state ratchets can go well beyond these bounds by utilizing their own infinite "negentropy". We conclude with an outline of the collective thermodynamics of information-ratchet swarms.

  8. Critical N = (1, 1) general massive supergravity

    Science.gov (United States)

    Deger, Nihat Sadik; Moutsopoulos, George; Rosseel, Jan

    2018-04-01

    In this paper we study the supermultiplet structure of N = (1, 1) General Massive Supergravity at non-critical and critical points of its parameter space. To do this, we first linearize the theory around its maximally supersymmetric AdS3 vacuum and obtain the full linearized Lagrangian including fermionic terms. At generic values, the linearized modes can be organized as two massless and two massive multiplets, where supersymmetry relates them in the standard way. At critical points logarithmic modes appear, and we find that at three of these points some of the supersymmetry transformations are non-invertible in logarithmic multiplets. However, at the fourth critical point, there is a massive logarithmic multiplet with invertible supersymmetry transformations.

  9. HOW TO FIND YOUNG MASSIVE CLUSTER PROGENITORS

    Energy Technology Data Exchange (ETDEWEB)

    Bressert, E.; Longmore, S.; Testi, L. [European Southern Observatory, Karl Schwarzschild Str. 2, D-85748 Garching bei Muenchen (Germany); Ginsburg, A.; Bally, J.; Battersby, C. [Center for Astrophysics and Space Astronomy, University of Colorado, Boulder, CO 80309 (United States)

    2012-10-20

    We propose that bound, young massive stellar clusters form from dense clouds that have escape speeds greater than the sound speed in photo-ionized gas. In these clumps, radiative feedback in the form of gas ionization is bottled up, enabling star formation to proceed to sufficiently high efficiency so that the resulting star cluster remains bound even after gas removal. We estimate the observable properties of the massive proto-clusters (MPCs) for existing Galactic plane surveys and suggest how they may be sought in recent and upcoming extragalactic observations. These surveys will potentially provide a significant sample of MPC candidates that will allow us to better understand extreme star-formation and massive cluster formation in the Local Universe.

  10. Primordial inhomogeneities from massive defects during inflation

    Energy Technology Data Exchange (ETDEWEB)

    Firouzjahi, Hassan; Karami, Asieh; Rostami, Tahereh, E-mail: firouz@ipm.ir, E-mail: karami@ipm.ir, E-mail: t.rostami@ipm.ir [School of Astronomy, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)

    2016-10-01

    We consider the imprints of local massive defects, such as a black hole or a massive monopole, during inflation. The massive defect breaks the background homogeneity. We consider the limit that the physical Schwarzschild radius of the defect is much smaller than the inflationary Hubble radius so a perturbative analysis is allowed. The inhomogeneities induced in scalar and gravitational wave power spectrum are calculated. We obtain the amplitudes of dipole, quadrupole and octupole anisotropies in curvature perturbation power spectrum and identify the relative configuration of the defect to CMB sphere in which large observable dipole asymmetry can be generated. We observe a curious reflection symmetry in which the configuration where the defect is inside the CMB comoving sphere has the same inhomogeneous variance as its mirror configuration where the defect is outside the CMB sphere.

  11. Massive type IIA supergravity and E10

    International Nuclear Information System (INIS)

    Henneaux, M.; Kleinschmidt, A.; Persson, D.; Jamsin, E.

    2009-01-01

    In this talk we investigate the symmetry under E10 of Romans' massive type IIA supergravity. We show that the dynamics of a spinning particle in a non-linear sigma model on the coset space E10/K(E10) reproduces the bosonic and fermionic dynamics of massive IIA supergravity, in the standard truncation. In particular, we identify Romans' mass with a generator of E10 that is beyond the realm of the generators of E10 considered in the eleven-dimensional analysis, but using the same, undeformed sigma model. As a consequence, this work provides a dynamical unification of the massless and massive versions of type IIA supergravity inside E10. (Abstract Copyright [2009], Wiley Periodicals, Inc.)

  12. Massive stars and X-ray pulsars

    International Nuclear Information System (INIS)

    Henrichs, H.

    1982-01-01

    This thesis is a collection of seven separate articles entitled: long term changes in ultraviolet lines in γ CAS; UV observations of γ CAS: intermittent mass-loss enhancement; episodic mass loss in γ CAS and in other early-type stars; spin-up and spin-down of accreting neutron stars; an eccentric close binary model for the X Persei system; has a 97 minute periodicity in 4U 1700-37/HD 153919 really been discovered?; and mass loss and stellar wind in massive X-ray binaries. (Articles 1, 2, 5, 6 and 7 have been previously published). The first three articles are concerned with the irregular mass loss in massive stars. The fourth critically reviews thoughts since 1972 on the origin of the changes in periodicity shown by X-ray pulsars. The last articles indicate the relation between massive stars and X-ray pulsars. (C.F.)

  13. Leveraging the Unified Access Framework: A Tale of an Integrated Ocean Data Prototype

    Science.gov (United States)

    O'Brien, K.; Kern, K.; Smith, B.; Schweitzer, R.; Simons, R.; Mendelssohn, R.; Diggs, S. C.; Belbeoch, M.; Hankin, S.

    2014-12-01

    The Tropical Pacific Observing System (TPOS) has been functioning and capturing measurements since the mid 1990s during the very successful Tropical Ocean Global Atmosphere (TOGA) project. Unfortunately, in the current environment, some 20 years after the end of the TOGA project, sustaining the observing system is proving difficult. With the many advances in methods of observing the ocean, a group of scientists is taking a fresh look at what the Tropical Pacific Observing System requires for sustainability. This includes utilizing a wide variety of observing system platforms, including Argo floats, unmanned drifters, moorings, ships, etc. This variety of platforms measuring ocean data also provides a significant challenge in terms of integrated data management. It is recognized that data and information management is crucial to the success and impact of any observing system. In order to be successful, it is also crucial to avoid building stovepipes for data management. To that end, NOAA's Observing System Monitoring Center (OSMC) has been tasked to create a testbed of integrated real time and delayed mode observations for the Tropical Pacific region in support of the TPOS. The observing networks included in the prototype are: Argo floats, OceanSites moorings, drifting buoys, hydrographic surveys, underway carbon observations and, of course, real time ocean measurements. In this presentation, we will discuss how the OSMC project is building the integrated data prototype using existing free and open source software. We will explore how we are leveraging successful data management frameworks pioneered by efforts such as NOAA's Unified Access Framework project. We will also show examples of how conforming to well known conventions and standards allows for discoverability, usability and interoperability of data.

  14. A Massively Parallel Face Recognition System

    Directory of Open Access Journals (Sweden)

    Lahdenoja Olli

    2007-01-01

    Full Text Available We present methods for processing LBPs (local binary patterns) with massively parallel hardware, especially with the CNN-UM (cellular nonlinear network-universal machine). In particular, we present a framework for implementing a massively parallel face recognition system, including a dedicated highly accurate algorithm suitable for various types of platforms (e.g., CNN-UM and digital FPGA). We study in detail a dedicated mixed-mode implementation of the algorithm and estimate its implementation cost in view of its performance and accuracy restrictions.
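The LBP feature underlying the system can be illustrated for a single pixel: each of the eight neighbours is thresholded against the centre and contributes one bit to the code. A minimal sketch (the sample image is hypothetical; the paper's massively parallel CNN-UM implementation computes such codes for all pixels at once):

```python
import numpy as np

def lbp_code(img, r, c):
    """8-neighbour local binary pattern code of the pixel at (r, c)."""
    center = img[r, c]
    # Clockwise neighbour offsets, starting at the top-left neighbour
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr, c + dc] >= center:   # threshold against the centre pixel
            code |= 1 << bit
    return code

img = np.array([[6, 5, 2],
                [7, 6, 1],
                [9, 8, 7]])
code = lbp_code(img, 1, 1)   # LBP code of the centre pixel
```

A face recognition pipeline would histogram such codes over image regions and compare histograms between faces; the bit ordering chosen here is one common convention.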

  15. Massive gravity and Fierz-Pauli theory

    International Nuclear Information System (INIS)

    Blasi, Alberto; Maggiore, Nicola

    2017-01-01

    Linearized gravity is considered as an ordinary gauge field theory. This implies the need for gauge fixing in order to have well-defined propagators. Only after having achieved this, the most general mass term is added. The aim of this paper is to study of the degrees of freedom of the gauge fixed theory of linearized gravity with mass term. The main result is that, even outside the usual Fierz-Pauli constraint on the mass term, it is possible to choose a gauge fixing belonging to the Landau class, which leads to a massive theory of gravity with the five degrees of freedom of a spin-2 massive particle. (orig.)

  16. Massive gravity and Fierz-Pauli theory

    Energy Technology Data Exchange (ETDEWEB)

    Blasi, Alberto [Universita di Genova, Dipartimento di Fisica, Genova (Italy); Maggiore, Nicola [I.N.F.N.-Sezione di Genova, Genoa (Italy)

    2017-09-15

    Linearized gravity is considered as an ordinary gauge field theory. This implies the need for gauge fixing in order to have well-defined propagators. Only after having achieved this, the most general mass term is added. The aim of this paper is to study of the degrees of freedom of the gauge fixed theory of linearized gravity with mass term. The main result is that, even outside the usual Fierz-Pauli constraint on the mass term, it is possible to choose a gauge fixing belonging to the Landau class, which leads to a massive theory of gravity with the five degrees of freedom of a spin-2 massive particle. (orig.)

  17. SALT Spectroscopy of Evolved Massive Stars

    Science.gov (United States)

    Kniazev, A. Y.; Gvaramadze, V. V.; Berdnikov, L. N.

    2017-06-01

    Long-slit spectroscopy with the Southern African Large Telescope (SALT) of the central stars of mid-infrared nebulae detected with the Spitzer Space Telescope and the Wide-Field Infrared Survey Explorer (WISE) led to the discovery of numerous candidate luminous blue variables (cLBVs) and other rare evolved massive stars. With the recent advent of the SALT fiber-fed high-resolution echelle spectrograph (HRS), a new perspective for the study of these interesting objects has appeared. Using the HRS we obtained spectra of a dozen newly identified massive stars. Some results on the recently identified cLBV Hen 3-729 are presented.

  18. A Massively Parallel Face Recognition System

    Directory of Open Access Journals (Sweden)

    Ari Paasio

    2006-12-01

    Full Text Available We present methods for processing LBPs (local binary patterns) with massively parallel hardware, especially with the CNN-UM (cellular nonlinear network-universal machine). In particular, we present a framework for implementing a massively parallel face recognition system, including a dedicated highly accurate algorithm suitable for various types of platforms (e.g., CNN-UM and digital FPGA). We study in detail a dedicated mixed-mode implementation of the algorithm and estimate its implementation cost in view of its performance and accuracy restrictions.

  19. Crowdsourcing biomedical research: leveraging communities as innovation engines.

    Science.gov (United States)

    Saez-Rodriguez, Julio; Costello, James C; Friend, Stephen H; Kellen, Michael R; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo

    2016-07-15

    The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories.

  20. 13 CFR 107.1000 - Licensees without Leverage-exceptions to the regulations.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Licensees without Leverage-exceptions to the regulations. 107.1000 Section 107.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS INVESTMENT COMPANIES Non-leveraged Licensees-Exceptions to Regulations § 107.1000...

  1. 13 CFR 107.1160 - Maximum amount of Leverage for a Section 301(d) Licensee.

    Science.gov (United States)

    2010-01-01

    ... Leverage, you must maintain Venture Capital Financings (at cost) that equal at least 30 percent of your... maintain at least the same dollar amount of Venture Capital Financings (at cost). (e) Definition of “Total... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Maximum amount of Leverage for a...

  2. Makification: Towards a Framework for Leveraging the Maker Movement in Formal Education

    Science.gov (United States)

    Cohen, Jonathan; Jones, W. Monty; Smith, Shaunna; Calandra, Brendan

    2017-01-01

    Maker culture is part of a burgeoning movement in which individuals leverage modern digital technologies to produce and share physical artifacts with a broader community. Certain components of the maker movement, if properly leveraged, hold promise for transforming formal education in a variety of contexts. The authors here work towards a…

  3.  Basic assumptions and definitions in the analysis of financial leverage

    Directory of Open Access Journals (Sweden)

    Tomasz Berent

    2015-12-01

    Full Text Available The financial leverage literature has been in a state of terminological chaos for decades, as evidenced, for example, by the Nobel Prize Lecture mistake on the one hand, and the global financial crisis on the other. A meaningful analysis of the leverage phenomenon calls for the formulation of a coherent set of assumptions and basic definitions. The objective of the paper is to answer this call. The paper defines leverage as a value-neutral concept useful in explaining the magnification effect exerted by financial activity upon the whole spectrum of financial results. By adopting constructivism as a methodological approach, we are able to introduce various types of leverage, such as capital and income, base and non-base, accounting and market value, for levels and for distances (absolute and relative), costs and simple, etc. The new definitions formulated here are subsequently adopted in the analysis of the content of leverage statements used by leading finance textbooks.
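The magnification effect the paper attributes to leverage can be shown with a standard worked example: the degree of financial leverage, DFL = EBIT / (EBIT − interest), scales a relative change in operating income into a larger relative change in pre-tax earnings (the firm's numbers below are hypothetical, and this is one textbook leverage measure, not the paper's own definitions):

```python
def dfl(ebit, interest):
    """Degree of financial leverage: the factor by which a relative change
    in EBIT is magnified into a relative change in pre-tax earnings."""
    return ebit / (ebit - interest)

# Hypothetical firm: EBIT of 100 and fixed interest of 40
magnifier = dfl(100, 40)                       # 100/60, about 1.67

# Check the magnification directly: a 10% rise in EBIT lifts pre-tax
# profit (EBIT - interest) from 60 to 70, i.e. by 10% times the DFL
before, after = 100 - 40, 110 - 40
assert abs((after - before) / before - 0.10 * magnifier) < 1e-12
```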

  4. The leverage effect on wealth distribution in a controllable laboratory stock market.

    Science.gov (United States)

    Zhu, Chenge; Yang, Guang; An, Kenan; Huang, Jiping

    2014-01-01

    Wealth distribution has always been an important issue in our economic and social life, since it affects the harmony and stabilization of the society. Under the background of widely used financial tools to raise leverage these years, we studied the leverage effect on wealth distribution of a population in a controllable laboratory market in which we have conducted several human experiments, and drawn the conclusion that higher leverage leads to a higher Gini coefficient in the market. A higher Gini coefficient means the wealth distribution among a population becomes more unequal. This is a result of the ascending risk with growing leverage level in the market plus the diversified trading abilities and risk preference of the participants. This work sheds light on the effects of leverage and its related regulations, especially its impact on wealth distribution. It also shows the capability of the method of controllable laboratory markets which could be helpful in several fields of study such as economics, econophysics and sociology.
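The Gini coefficient used to compare wealth distributions can be computed directly from a wealth vector; a minimal sketch (the wealth vectors are illustrative, not the experiment's data):

```python
def gini(wealth):
    """Gini coefficient of a wealth vector: 0 for perfect equality,
    approaching 1 as wealth concentrates in one participant."""
    w = sorted(wealth)
    n = len(w)
    cum = sum((i + 1) * x for i, x in enumerate(w))
    return 2 * cum / (n * sum(w)) - (n + 1) / n

# Illustrative wealth vectors: a wider spread of outcomes raises the Gini
equal = [10, 10, 10, 10]
skewed = [1, 2, 3, 94]
assert gini(equal) == 0.0
assert gini(skewed) > gini([20, 25, 25, 30])
```

In the experiment's terms, higher leverage widens the spread of trading outcomes across participants, which is exactly what pushes this coefficient up.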

  5. The human genome project

    International Nuclear Information System (INIS)

    Worton, R.

    1996-01-01

    The Human Genome Project is a massive international research project, costing 3 to 5 billion dollars and expected to take 15 years, which will identify all the genes in the human genome, i.e. the complete sequence of bases in human DNA. The prize will be the ability to identify genes causing or predisposing to disease, and in some cases the development of gene therapy, but this new knowledge will raise important ethical issues

  6. Massive rectal bleeding from colonic diverticulosis

    African Journals Online (AJOL)

    ABEOLUGBENGAS

    Case Report: We report the case of a 79-year-old man presenting with massive rectal bleeding ... cause of overt lower gastrointestinal (GI) ... vessels into the intestinal lumen results in ... placed on a high fibre diet, and intravenous.

  7. Improved visibility computation on massive grid terrains

    NARCIS (Netherlands)

    Fishman, J.; Haverkort, H.J.; Toma, L.; Wolfson, O.; Agrawal, D.; Lu, C.-T.

    2009-01-01

    This paper describes the design and engineering of algorithms for computing visibility maps on massive grid terrains. Given a terrain T, specified by the elevations of points in a regular grid, and given a viewpoint v, the visibility map or viewshed of v is the set of grid points of T that are visible from v.
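The viewshed definition can be illustrated on a 1-D elevation profile: a cell is visible when the slope of its line of sight from the viewpoint is at least the maximum slope over all intervening cells. This is a simplification of the 2-D grid algorithms the paper engineers, and the profile is hypothetical:

```python
def viewshed_1d(elev, v):
    """Visibility of each cell of a 1-D elevation profile from viewpoint v.
    A cell is visible when the slope of its line of sight from the
    viewpoint is at least the running maximum slope seen so far."""
    visible = [False] * len(elev)
    visible[v] = True
    for step in (1, -1):                    # sweep right, then left
        max_slope = float("-inf")
        i = v + step
        while 0 <= i < len(elev):
            slope = (elev[i] - elev[v]) / abs(i - v)
            if slope >= max_slope:          # line of sight clears all cells so far
                visible[i] = True
                max_slope = slope
            i += step
    return visible

elev = [2, 1, 3, 1, 2, 5]
vis = viewshed_1d(elev, 0)
# Cells 3 and 4 are hidden behind the rise at cell 2
```

On a 2-D grid the same sweep runs along rays from the viewpoint to every boundary cell, which is what makes I/O-efficient orderings important at massive terrain sizes.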

  8. Facial transplantation for massive traumatic injuries.

    Science.gov (United States)

    Alam, Daniel S; Chi, John J

    2013-10-01

    This article describes the challenges of facial reconstruction and the role of facial transplantation in certain facial defects and injuries. This information is of value to surgeons assessing facial injuries with massive soft tissue loss or injury. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Difference equations in massive higher order calculations

    International Nuclear Information System (INIS)

    Bierenbaum, I.; Bluemlein, J.; Klein, S.; Schneider, C.

    2007-07-01

    The calculation of massive 2-loop operator matrix elements, required for the higher order Wilson coefficients for heavy flavor production in deeply inelastic scattering, leads to new types of multiple infinite sums over harmonic sums and related functions, which depend on the Mellin parameter N. We report on the solution of these sums through higher order difference equations using the summation package Sigma. (orig.)

  10. FRW Cosmological Perturbations in Massive Bigravity

    CERN Document Server

    Comelli, D; Pilo, L

    2014-01-01

    Cosmological perturbations of FRW solutions in ghost-free massive bigravity, including also a second matter sector, are studied in detail. At early times, we find that sub-horizon exponential instabilities are unavoidable and they lead to a premature departure from the perturbative regime of cosmological perturbations.

  11. Circular symmetry in topologically massive gravity

    International Nuclear Information System (INIS)

    Deser, S; Franklin, J

    2010-01-01

    We re-derive, compactly, a topologically massive gravity (TMG) decoupling theorem: source-free TMG separates into its Einstein and Cotton sectors for spaces with a hypersurface-orthogonal Killing vector, here concretely for circular symmetry. We then generalize the theorem to include matter; surprisingly, the single Killing symmetry also forces conformal invariance, requiring the sources to be null. (note)

  12. NOTE: Circular symmetry in topologically massive gravity

    Science.gov (United States)

    Deser, S.; Franklin, J.

    2010-05-01

    We re-derive, compactly, a topologically massive gravity (TMG) decoupling theorem: source-free TMG separates into its Einstein and Cotton sectors for spaces with a hypersurface-orthogonal Killing vector, here concretely for circular symmetry. We then generalize the theorem to include matter; surprisingly, the single Killing symmetry also forces conformal invariance, requiring the sources to be null.

  13. Circular symmetry in topologically massive gravity

    Energy Technology Data Exchange (ETDEWEB)

    Deser, S [Physics Department, Brandeis University, Waltham, MA 02454 (United States); Franklin, J, E-mail: deser@brandeis.ed, E-mail: jfrankli@reed.ed [Reed College, Portland, OR 97202 (United States)

    2010-05-21

    We re-derive, compactly, a topologically massive gravity (TMG) decoupling theorem: source-free TMG separates into its Einstein and Cotton sectors for spaces with a hypersurface-orthogonal Killing vector, here concretely for circular symmetry. We then generalize the theorem to include matter; surprisingly, the single Killing symmetry also forces conformal invariance, requiring the sources to be null. (note)

  14. Massively parallel sequencing of forensic STRs

    DEFF Research Database (Denmark)

    Parson, Walther; Ballard, David; Budowle, Bruce

    2016-01-01

    The DNA Commission of the International Society for Forensic Genetics (ISFG) is reviewing factors that need to be considered ahead of the adoption by the forensic community of short tandem repeat (STR) genotyping by massively parallel sequencing (MPS) technologies. MPS produces sequence data that...

  15. Semantic similarity measure in biomedical domain leverage web search engine.

    Science.gov (United States)

    Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei

    2010-01-01

    Semantic similarity measure plays an essential role in Information Retrieval and Natural Language Processing. In this paper we propose a page-count-based semantic similarity measure and apply it in biomedical domains. Previous research in semantic web related applications has deployed various semantic similarity measures. Despite the usefulness of the measurements in those applications, measuring semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by the Web Search Engine. We define various similarity scores for two given terms P and Q, using the page counts for querying P, Q and P AND Q. Moreover, we propose a novel approach to compute semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated by adapting support vector machines, to leverage the robustness of semantic similarity measures. Experimental results on two datasets achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen with physician scores and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
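Page-count-based scores of the kind the paper defines can be sketched as follows; `web_jaccard` and `web_pmi` are illustrative names, and the hit counts are invented (real scores would come from querying a search engine for P, Q and "P AND Q"):

```python
import math

def web_jaccard(count_p, count_q, count_pq):
    """Jaccard-style similarity from page counts for P, Q and "P AND Q"."""
    if count_pq == 0:
        return 0.0
    return count_pq / (count_p + count_q - count_pq)

def web_pmi(count_p, count_q, count_pq, n_pages):
    """Pointwise-mutual-information-style score from the same counts;
    n_pages approximates the number of pages indexed by the engine."""
    if count_pq == 0:
        return 0.0
    return math.log2((count_pq / n_pages) /
                     ((count_p / n_pages) * (count_q / n_pages)))

# Invented hit counts for two biomedical terms
sim = web_jaccard(1000, 800, 600)   # 600 / (1000 + 800 - 600) = 0.5
```

The paper combines several such scores (plus pattern-based features) as inputs to a support vector machine rather than using any single one directly.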

  16. Leveraging information technology to drive improvement in patient satisfaction.

    Science.gov (United States)

    Nash, Mary; Pestrue, Justin; Geier, Peter; Sharp, Karen; Helder, Amy; McAlearney, Ann Scheck

    2010-01-01

    A healthcare organization's commitment to quality and the patient experience requires senior leader involvement in improvement strategies, and accountability for goals. Further, improvement strategies are most effective when driven by data, and in the world of patient satisfaction, evidence is growing that nurse leader rounding and discharge calls are strategic tactics that can improve patient satisfaction. This article describes how The Ohio State University Medical Center (OSUMC) leveraged health information technology (IT) to apply a data-driven strategy execution to improve the patient experience. Specifically, two IT-driven approaches were used: (1) business intelligence reporting tools were used to create a meaningful reporting system including dashboards, scorecards, and tracking reports and (2) an improvement plan was implemented that focused on two high-impact tactics and data to hardwire accountability. Targeted information from the IT systems enabled clinicians and administrators to execute these strategic tactics, and senior leaders to monitor achievement of strategic goals. As a result, OSUMC's inpatient satisfaction scores on the Hospital Consumer Assessment of Healthcare Providers and Systems survey improved from 56% nines and tens in 2006 to 71% in 2009. © 2010 National Association for Healthcare Quality.

  17. Leveraging the Domain of Work to Improve Migrant Health.

    Science.gov (United States)

    Flynn, Michael A; Wickramage, Kolitha

    2017-10-19

    Work is a principal driver of current international migration, a primary social determinant of health, and a fundamental point of articulation between migrants and their host society. Efforts by international organizations to promote migrant health have traditionally focused on infectious diseases and access to healthcare, while international labor organizations have largely focused on issues of occupational health. The underutilization of the domain of work in addressing the health of migrants is truly a missed opportunity for influencing worker well-being and reducing societal economic burden. Understanding of the relationships among migration, work, and health would facilitate further integration of migrant health concerns into the policy agenda of governments and international agencies that work at the nexus of labor, health and development. The domain of work offers an opportunity to capitalize on the existing health and development infrastructure and leverage technical resources, programs and research to promote migrant health. It also provides the opportunity to advance migrant health through new and innovative approaches and partnerships.

  18. Towards Personalized Medicine: Leveraging Patient Similarity and Drug Similarity Analytics

    Science.gov (United States)

    Zhang, Ping; Wang, Fei; Hu, Jianying; Sorrentino, Robert

    2014-01-01

    The rapid adoption of electronic health records (EHR) provides a comprehensive source for exploratory and predictive analytics to support clinical decision-making. In this paper, we investigate how to utilize EHR to tailor treatments to individual patients based on their likelihood to respond to a therapy. We construct a heterogeneous graph which includes two domains (patients and drugs) and encodes three relationships (patient similarity, drug similarity, and patient-drug prior associations). We describe a novel approach for performing a label propagation procedure to spread the label information representing the effectiveness of different drugs for different patients over this heterogeneous graph. The proposed method has been applied on a real-world EHR dataset to help identify personalized treatments for hypercholesterolemia. The experimental results demonstrate the effectiveness of the approach and suggest that the combination of appropriate patient similarity and drug similarity analytics could lead to actionable insights for personalized medicine. Particularly, by leveraging drug similarity in combination with patient similarity, our method could perform well even on new or rarely used drugs for which there are few records of known past performance. PMID:25717413
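The label propagation step can be sketched on a toy heterogeneous graph: a weight matrix encodes patient similarity, drug similarity and known patient-drug associations, and a seed effectiveness label is spread over the graph. The graph, weights and damping factor below are invented for illustration, not the paper's method details:

```python
import numpy as np

# Toy heterogeneous graph over 3 patients (p0-p2) and 2 drugs (d0-d1).
# Off-diagonal weights encode patient similarity, drug similarity and
# known patient-drug associations; all numbers are invented.
W = np.array([
    #  p0   p1   p2   d0   d1
    [0.0, 0.9, 0.1, 1.0, 0.0],   # p0: very similar to p1, took d0
    [0.9, 0.0, 0.1, 0.0, 0.0],   # p1: response not yet known
    [0.1, 0.1, 0.0, 0.0, 1.0],   # p2: took d1
    [1.0, 0.0, 0.0, 0.0, 0.2],   # d0: mildly similar to d1
    [0.0, 0.0, 1.0, 0.2, 0.0],   # d1
])
y = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # seed: d0 was effective for p0

S = W / W.sum(axis=1, keepdims=True)       # row-normalised propagation matrix
alpha = 0.8                                # weight on the propagated signal
f = y.copy()
for _ in range(50):                        # iterate to near-convergence
    f = alpha * S @ f + (1 - alpha) * y

# p1, being similar to p0, inherits a higher effectiveness score than p2
assert f[1] > f[2]
```

Because the graph mixes both domains, a drug with no history of its own can still receive signal through its similarity edges to well-characterised drugs, which is the behaviour the abstract highlights.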

  19. Digital marketing as a tool for innovation and business leverage

    Directory of Open Access Journals (Sweden)

    Andrea Cristina Marin

    2018-01-01

    Full Text Available This article presents research that traces the historical evolution of marketing science and the sales processes attributed to it, showing how successive marketing concepts and paradigms have been applied by companies up to the present day. The speed of structural change in the new economy and the opportunities generated by current information and communication technologies require companies to continually rethink their business, analyzing new marketing concepts that outperform traditional marketing and can be used in the digital environment as a way of attracting and building relationships with customers. This is a descriptive bibliographical study with a qualitative focus, comprising a brief historical survey of marketing science and an investigation of Digital Marketing as a tool for leveraging business in the contemporary corporate model, based on the work of authors who discuss the theme, such as Las Casas (2009). The research concludes that the impact of new information and communication technologies has transformed the way companies relate to their customers, using new forms of advertising to keep pace with the competitive parameters that the current market sets against rival companies.

  20. The potential / opportunities for leveraging competences: the intangible assets dimension

    Directory of Open Access Journals (Sweden)

    Eglė Kazlauskienė

    2017-03-01

    Full Text Available There is much discussion about the variety and identification of individual abilities and/or general competences, and no unanimous approach among scholars. The relevance of the research is demonstrated by numerous publications on human, intellectual, and knowledge capital and on the impact of intangible assets on a country's economic growth and competitiveness. Some intangible assets are easy to identify and value because they are manifested in material forms, e.g. software and patents; however, there is an increasing demand to identify and evaluate those intangible assets whose value is difficult to determine, e.g. knowledge, experience, abilities, and competences. The aim of this paper is to determine the potential of leveraging abilities to increase the income of the Lithuanian population by distinguishing abilities in the context of the definition and evaluation of intangible assets. The research methods include analysis of scientific literature, comparative analysis, a questionnaire survey, summarizing, and statistical data analysis. The empirical research identified statistically significant relations between general abilities, population income and expenditure, and education. The majority of surveyed Lithuanian inhabitants think that their income will not change if they improve their abilities in any of the identified domains.

  1. Leveraging multi-generational workforce values in interactive information societies

    Directory of Open Access Journals (Sweden)

    Sophie van der Walt

    2010-11-01

    Objectives: This article advocates the need for generational awareness and addresses how this awareness benefits companies through increased productivity, improved succession planning policies, and strategies to recruit and retain a diverse workforce. The research problem is directed at how diversity management influences Traditionalists, Baby Boomers, Generation X and Generation Y in terms of their work performance and co-worker relationships. Method: The research design combines Critical Theory and Generational Theory within the mixed-method paradigm. The sequential exploratory design was chosen because it studies the unknown relationships between different generations of employees. The literature review was followed by a quantitative empirical research component, and data was collected by means of a questionnaire. Results: The findings highlight specific differences between generations regarding their perspectives on work values and co-worker relationships, rewards, work-life balance and retirement. Conclusion: The article concludes with recommendations on the role diversity management plays in work performance and co-worker relationships. By leveraging generational awareness in the interactive information society, organizations with a multi-generational workforce will succeed in the competitive business environment.

  2. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Ball, B.; Goldwasser, D.; Parker, A.; Elling, J.; Davis, O.; Kruchten, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, describing how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  3. Leveraging Old Intellectual Property to Accelerate Technology Entrepreneurship

    Directory of Open Access Journals (Sweden)

    Derek Smith

    2013-06-01

    Full Text Available Acquiring or licensing assets to older technologies, including surviving intellectual property rights, is an often-overlooked viable strategy for accelerating technology entrepreneurship. This strategy can help entrepreneurs short-cut the growth of a customer base, reduce development effort, and shorten the time to market with a minimum viable product. However, this strategy is not without risk; entrepreneurs need to be careful that the acquired intellectual property rights are not fraught with issues that could severely outweigh any perceived value. Proper investigation is required to ensure success because the current literature fails to provide tools that an entrepreneur can apply when considering the acquisition of intellectual property. This article includes a case study of a technology company – Piranha Games – that indirectly acquired sole and exclusive access to a substantial historical customer base by acquiring and licensing older technology and surviving intellectual property assets. The founders then leveraged the existing product brand and its historical customers to acquire significant funding and went global with a minimum viable product in three years. The copyright and trademark assets provided value on day one to Piranha Games by making it difficult and risky for others to exploit the technology. Based on this case study, this article offers recommendations to entrepreneurs who may benefit from acquiring old intellectual property to accelerate the growth of their startups.

  4. Learning to leverage existing information systems: Part 1. Principles.

    Science.gov (United States)

    Neil, Nancy; Nerenz, David

    2003-10-01

    The success of performance improvement efforts depends on effective measurement and feedback regarding clinical processes and outcomes. Yet most health care organizations have fragmented rather than integrated data systems. Methods and practical guidance are provided for leveraging available information sources to obtain and create valid performance improvement-related information for use by clinicians and administrators. At Virginia Mason Health System (VMHS; Seattle), a vertically integrated hospital and multispecialty group practice, patient records are paper based and are supplemented with electronic reporting for laboratory and radiology services. Despite growth in the resources and interest devoted to organization-wide performance measurement, quality improvement, and evidence-based tools, VMHS's information systems consist largely of stand-alone, legacy systems organized around the ability to retrieve information on patients, one at a time. By 2002, without any investment in technology, VMHS had developed standardized, clinic-wide key indicators of performance, updated and reported regularly at the patient, provider, site, and organizational levels. On the basis of VMHS's experience, principles can be suggested to guide other organizations in exploring solutions using their own information systems: for example, start simply, but start; identify information needs; tap multiple data streams; and improve incrementally.

  5. Endomyocardial fibrosis associated with massive calcification of the left ventricle

    Directory of Open Access Journals (Sweden)

    Canesin Manoel Fernandes

    1999-01-01

    Full Text Available This is the report of a rare case of endomyocardial fibrosis associated with massive calcification of the left ventricle in a male patient with dyspnea on great exertion, which began 5 years earlier and rapidly evolved. Due to lack of information and the absence of clinical signs that could characterize impairment of other organs, the case was initially managed as a disease with a pulmonary origin. With the evolution of the disease and in the presence of radiological images of heterogeneous opacification in the projection of the left ventricle, the diagnostic hypothesis of endomyocardial disease was established. This hypothesis was later confirmed on chest computed tomography. The patient died on the 16th day of the hospital stay, probably because of lack of myocardial reserve, with clinical findings of refractory heart failure, possibly aggravated by pulmonary infection. This shows that a rare disease such as endomyocardial fibrosis associated with massive calcification of the left ventricle may be suspected on a simple chest X-ray and confirmed by computed tomography.

  6. 13 CFR 108.1630 - SBA regulation of Brokers and Dealers and disclosure to purchasers of Leverage or Trust...

    Science.gov (United States)

    2010-01-01

    ... Financial Assistance for NMVC Companies (Leverage) Funding Leverage by Use of Sba Guaranteed Trust... standing in respect to compliance with the financial, ethical, and reporting requirements of such body...

  7. Leveraging long read sequencing from a single individual to provide a comprehensive resource for benchmarking variant calling methods.

    Science.gov (United States)

    Mu, John C; Tootoonchi Afshar, Pegah; Mohiyuddin, Marghoob; Chen, Xi; Li, Jian; Bani Asadi, Narges; Gerstein, Mark B; Wong, Wing H; Lam, Hugo Y K

    2015-09-28

    A high-confidence, comprehensive human variant set is critical in assessing accuracy of sequencing algorithms, which are crucial in precision medicine based on high-throughput sequencing. Although recent works have attempted to provide such a resource, they still do not encompass all major types of variants including structural variants (SVs). Thus, we leveraged the massive high-quality Sanger sequences from the HuRef genome to construct by far the most comprehensive gold set of a single individual, which was cross validated with deep Illumina sequencing, population datasets, and well-established algorithms. It was a necessary effort to completely reanalyze the HuRef genome as its previously published variants were mostly reported five years ago, suffering from compatibility, organization, and accuracy issues that prevent their direct use in benchmarking. Our extensive analysis and validation resulted in a gold set with high specificity and sensitivity. In contrast to the current gold sets of the NA12878 or HS1011 genomes, our gold set is the first that includes small variants, deletion SVs and insertion SVs up to a hundred thousand base-pairs. We demonstrate the utility of our HuRef gold set to benchmark several published SV detection tools.

  8. Managerial Ownership and Leverage as Predictors of Profitability and Corporate Social Responsibility Disclosure

    Directory of Open Access Journals (Sweden)

    Putu Agus Dwipayadnya

    2016-02-01

    Full Text Available Managerial Ownership and Leverage as Predictors of Profitability and Corporate Social Responsibility Disclosure. This study aimed to determine the effect of managerial ownership composition and leverage on profitability and the disclosure of CSR. The population in this study comprises manufacturing companies listed on the Indonesia Stock Exchange. Sampling was conducted with the purposive sampling method, yielding a sample of 24 companies. The research data are secondary data obtained from the website of the Indonesia Stock Exchange and the Indonesian Capital Market Directory from 2009 to 2013. Research hypotheses were tested using the path analysis technique. The results showed that: (1) managerial ownership and leverage affect profitability positively; (2) managerial ownership affects leverage negatively; (3) managerial ownership and leverage do not affect the disclosure of CSR; (4) profitability affects the disclosure of CSR positively; (5) profitability mediates the relationship between managerial ownership, leverage, and CSR disclosure. Keywords: Managerial Ownership, Leverage, Profitability, Corporate Social Responsibility Disclosure

  9. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    Directory of Open Access Journals (Sweden)

    Yanhui Xi

    2016-01-01

    Full Text Available The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumes a negative correlation between the errors of the price/return innovation and the volatility innovation. With the new representation, a theoretical explanation of the leverage effect is provided. Simulated data and daily stock market indices (the Shanghai composite index, the Shenzhen component index, and the Standard and Poor's 500 Composite index) are used to estimate the leverage market microstructure model via the Bayesian Markov Chain Monte Carlo (MCMC) method. The results verify the effectiveness of the model and its estimation approach and also indicate that the stock markets have strong leverage effects. Compared with the classical leverage stochastic volatility (SV) model in terms of DIC (Deviance Information Criterion), the leverage market microstructure model fits the data better.
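    The leverage effect described here, a negative correlation between return innovations and volatility innovations, can be reproduced with a toy stochastic-volatility simulation; all parameter values below are illustrative choices, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a leveraged stochastic-volatility sketch.
T, mu, phi, sigma_eta, rho = 20_000, -1.0, 0.95, 0.2, -0.7

# Draw correlated innovations: eps drives returns, eta drives log-volatility.
cov = [[1.0, rho], [rho, 1.0]]
eps, eta = rng.multivariate_normal([0.0, 0.0], cov, size=T).T

h = np.empty(T)          # log-volatility, AR(1) around mu
h[0] = mu
for t in range(T - 1):
    h[t + 1] = mu + phi * (h[t] - mu) + sigma_eta * eta[t]

r = np.exp(h / 2) * eps  # returns

# Leverage effect: a negative return today tends to raise tomorrow's
# volatility, so returns and next-period log-volatility changes
# correlate negatively.
lev = np.corrcoef(r[:-1], h[1:] - h[:-1])[0, 1]
print(round(lev, 3))
```

    With rho = 0 the estimated correlation `lev` would hover near zero; the negative rho is exactly what lets the model reproduce the asymmetry the paper finds in real index data.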

  10. Leveraging energy efficiency to finance public-private social housing projects

    International Nuclear Information System (INIS)

    Copiello, Sergio

    2016-01-01

    The Italian housing model relies on a high rate of privately owned houses. In comparison, few dwellings are built and managed by the public sector. The social housing stock has been built mainly during some post-second world war decades; instead, since the early nineties, it underwent a privatization process. Such a model is inefficient and iniquitous in the long run. Therefore, after being disregarded for several years, social housing has gone back to be among the main agenda items. Nonetheless, due to the lack of public grants, new funding sources are required. The government now fosters an increasing involvement of private finance through Public-Private Partnership schemes. A first outcome can be found in some pioneering experiences. Their comparative analysis allows bringing out worthwhile findings, which are useful to steer housing policies. Moderate to low yields entail the need to involve new kinds of private entities, particularly those adopting a venture philanthropy approach. Meanwhile, building energy performance measures are a crucial driver of feasibility. They allow the tenants to be willing to pay agreed rents somehow higher than both social rents of protected tenancies and fair rents of regulated tenancies. - Highlights: •In Italy, the provision of affordable dwellings was disregarded for years. •Recently, instead, social housing has come back to be among the main agenda items. •Latest regulations try to tie together social housing and Public-Private Partnership. •Social tenants may be asked to pay more than in protected and regulated tenancies. •Energy-efficient measures allow keeping the tenants neutral about the rent increase.

  11. SuperWiseNet - a unique network platform to leverage student entrepreneurship projects

    DEFF Research Database (Denmark)

    Gertsen, Frank; Høgsaa, Asger; Tollestrup, Christian H. T.

    2016-01-01

    The area of interests is the development of a potentially new complementary industry-university component, which has been labelled ‘SuperWiseNet’ for the context of academic entrepreneurial programs. The SuperWiseNet is a network-based platform for interaction between students of entrepreneurship...

  12. Leveraging Research Partnerships to Co-Produce Actionable Science and Build Institutional Capacity

    Science.gov (United States)

    Fleming, P.; Chinn, A.; Rufo Hill, J.; Edgerly, J.; Garcia, E.

    2017-12-01

    Seattle Public Utilities (SPU) provides high quality drinking water to 1.4 million people in the greater Seattle area and storm, wastewater and solid waste services to the City of Seattle. SPU's engagement on climate change has evolved significantly over the past 20 years. What began in 1997 as an inquiry into how El Nino may affect water supply has evolved into a broad-based ongoing exploration that includes extensive in-house knowledge, capacity and expertise. This presentation will describe SPU's evolution from a funder and consumer of climate research to an active contributor to the development of applied research products, highlighting SPU's changing role in three climate impacts assessment studies. It will describe how SPU has leveraged these studies and partnerships to enhance its knowledge base, build its internal institutional capacity and produce actionable science that is helping to foster the incorporation of climate change into various aspects of utility planning and decision making. It will describe the PUMA Project and how the results from that research effort are being factored into SPU's state-mandated Water System Plan.

  13. CAUGHT IN THE ACT: THE ASSEMBLY OF MASSIVE CLUSTER GALAXIES AT z = 1.62

    International Nuclear Information System (INIS)

    Lotz, Jennifer M.; Ferguson, Henry C.; Grogin, Norman; Koekemoer, Anton M.; Papovich, Casey; Tran, Kim-Vy; Faber, S. M.; Guo Yicheng; Kocevski, Dale; Lee, Kyoung-Soo; McIntosh, Daniel; Momcheva, Ivelina; Rudnick, Gregory; Saintonge, Amelie; Van der Wel, Arjen; Willmer, Christopher

    2013-01-01

    We present the recent merger history of massive galaxies in a spectroscopically confirmed proto-cluster at z = 1.62. Using Hubble Space Telescope WFC3 near-infrared imaging from the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey, we select cluster and z ∼ 1.6 field galaxies with M_star ≥ 3 × 10^10 M_☉, to determine the frequency of double nuclei or close companions within projected separations less than 20 kpc co-moving. We find that four out of five spectroscopically confirmed massive proto-cluster galaxies have double nuclei, and 57 (+13/−14)% of all M_star ≥ 3 × 10^10 M_☉ cluster candidates are observed either in close pair systems or with double nuclei. In contrast, only 11% ± 3% of the field galaxies are observed in close pair/double nuclei systems. After correcting for the contribution from random projections, the implied merger rate per massive galaxy in the proto-cluster is ∼3-10 times higher than the merger rate of massive field galaxies at z ∼ 1.6. Close pairs in the cluster have minor-merger stellar mass ratios (M_primary : M_satellite ≥ 4), while the field pairs consist of both major and minor mergers. At least half of the cluster mergers are gas-poor, as indicated by their red colors and low 24 μm fluxes. Two of the double-nucleated cluster members have X-ray detected active galactic nuclei with L_x > 10^43 erg s^−1, and are strong candidates for dual or offset super-massive black holes. We conclude that the massive z = 1.62 proto-cluster galaxies are undergoing accelerated assembly via minor mergers, and discuss the implications for galaxy evolution in proto-cluster environments.

  14. HiView: an integrative genome browser to leverage Hi-C results for the interpretation of GWAS variants.

    Science.gov (United States)

    Xu, Zheng; Zhang, Guosheng; Duan, Qing; Chai, Shengjie; Zhang, Baqun; Wu, Cong; Jin, Fulai; Yue, Feng; Li, Yun; Hu, Ming

    2016-03-11

    Genome-wide association studies (GWAS) have identified thousands of genetic variants associated with complex traits and diseases. However, most of them are located in the non-protein coding regions, and therefore it is challenging to hypothesize the functions of these non-coding GWAS variants. Recent large efforts such as the ENCODE and Roadmap Epigenomics projects have predicted a large number of regulatory elements. However, the target genes of these regulatory elements remain largely unknown. Chromatin conformation capture based technologies such as Hi-C can directly measure the chromatin interactions and have generated an increasingly comprehensive catalog of the interactome between the distal regulatory elements and their potential target genes. Leveraging such information revealed by Hi-C holds the promise of elucidating the functions of genetic variants in human diseases. In this work, we present HiView, the first integrative genome browser to leverage Hi-C results for the interpretation of GWAS variants. HiView is able to display Hi-C data and statistical evidence for chromatin interactions in genomic regions surrounding any given GWAS variant, enabling straightforward visualization and interpretation. We believe that as the first GWAS variants-centered Hi-C genome browser, HiView is a useful tool guiding post-GWAS functional genomics studies. HiView is freely accessible at: http://www.unc.edu/~yunmli/HiView

  15. Pushing the limits : from better bits to faster coil, companies leverage technology to ramp up onshore drilling performance

    Energy Technology Data Exchange (ETDEWEB)

    Smith, M.

    2009-06-15

    Horizontal drilling and drilling with coiled tubing are two well drilling techniques that have steadily gained ground in the drilling industry. Most of the techniques evolved in western Canada and Alaska, but are now being successfully used south of the border. This article discussed the leveraging of technology by drilling companies in order to ramp up onshore drilling performance. Calgary-based Xtreme Coil Drilling Corp. leveraged its unique coil over top drive rigs in order to score more speed records and set new marks in both the United States Rockies and Mexico. This article also referred to other companies and their wells that have set records, including CNX Gas Corporation and the Marcellus Shale prospect; Smith International and its horizontal turbodrilling of a Pennsylvanian reservoir; and Baker Oil Tools' new rotating, self-aligning multilateral (RAM) system. For each of these examples, the article described the technology and the challenges encountered by the companies as well as the objectives of the project, and results of the drilling efforts. 2 figs.

  16. 7 CFR 4290.1610 - Effect of prepayment or early redemption of Leverage on a Trust Certificate.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Effect of prepayment or early redemption of Leverage... AGRICULTURE RURAL BUSINESS INVESTMENT COMPANY ("RBIC") PROGRAM Financial Assistance for RBICs (Leverage) Funding Leverage by Use of Guaranteed Trust Certificates ("TCs") § 4290.1610 Effect of prepayment or early...

  17. 13 CFR 107.1610 - Effect of prepayment or early redemption of Leverage on a Trust Certificate.

    Science.gov (United States)

    2010-01-01

    ... (Leverage) Funding Leverage by Use of Sba-Guaranteed Trust Certificates ("TCs") § 107.1610 Effect of... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Effect of prepayment or early redemption of Leverage on a Trust Certificate. 107.1610 Section 107.1610 Business Credit and Assistance SMALL...

  18. 13 CFR 108.1610 - Effect of prepayment or early redemption of Leverage on a Trust Certificate.

    Science.gov (United States)

    2010-01-01

    ... Companies (Leverage) Funding Leverage by Use of Sba Guaranteed Trust Certificates ("TCs") § 108.1610 Effect... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Effect of prepayment or early redemption of Leverage on a Trust Certificate. 108.1610 Section 108.1610 Business Credit and Assistance SMALL...

  19. Data Flow for the TERRA-REF project

    Science.gov (United States)

    Kooper, R.; Burnette, M.; Maloney, J.; LeBauer, D.

    2017-12-01

    The Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform (TERRA-REF) program aims to identify crop traits that are best suited to producing high-energy sustainable biofuels and match those plant characteristics to their genes to speed the plant breeding process. One tool used to achieve this goal is a high-throughput phenotyping robot outfitted with sensors and cameras to monitor the growth of 1.25 acres of sorghum. Data types range from hyperspectral imaging to 3D reconstructions and thermal profiles, all at 1mm resolution. This system produces thousands of daily measurements with high spatiotemporal resolution. The team at NCSA processes, annotates, organizes and stores the massive amounts of data produced by this system - up to 5 TB per day. Data from the sensors is streamed to a local gantry-cache server. The standardized sensor raw data stream is automatically and securely delivered to NCSA using Globus Connect service. Once files have been successfully received by the Globus endpoint, the files are removed from the gantry-cache server. As each dataset arrives or is created the Clowder system automatically triggers different software tools to analyze each file, extract information, and convert files to a common format. Other tools can be triggered to run after all required data is uploaded. For example, a stitched image of the entire field is created after all images of the field become available. Some of these tools were developed by external collaborators based on predictive models and algorithms, others were developed as part of other projects and could be leveraged by the TERRA project. Data will be stored for the lifetime of the project and is estimated to reach 10 PB over 3 years. The Clowder system, BETY and other systems will allow users to easily find data by browsing or searching the extracted information.

  20. A rare case of massive hepatosplenomegaly due to acute ...

    African Journals Online (AJOL)

    massive hepatosplenomegaly include chronic lymphoproliferative malignancies, infections (malaria, leishmaniasis) and glycogen storage diseases (Gaucher's disease).[4] In our case the probable causes of the massive hepatosplenomegaly were a combination of late presentation after symptom onset, leukaemic infiltration.

  1. Reappraising the concept of massive transfusion in trauma

    DEFF Research Database (Denmark)

    Stanworth, Simon J; Morris, Timothy P; Gaarder, Christine

    2010-01-01

    ABSTRACT : INTRODUCTION : The massive-transfusion concept was introduced to recognize the dilutional complications resulting from large volumes of packed red blood cells (PRBCs). Definitions of massive transfusion vary and lack supporting clinical evidence. Damage-control resuscitation regimens o...

  2. Massive vulval oedema in multiple pregnancies at Bugando Medical ...

    African Journals Online (AJOL)

    In this report we describe two cases of massive vulval oedema seen in two ... passage of yellow-whitish discharge per vagina (Figure 1). Examination revealed massive oedema, and digital vaginal examination was difficult due to tenderness.

  3. Massively Parallel Algorithms for Solution of Schrodinger Equation

    Science.gov (United States)

    Fijany, Amir; Barhen, Jacob; Toomerian, Nikzad

    1994-01-01

    In this paper, massively parallel algorithms for the solution of the Schrödinger equation are developed. Our results clearly indicate that the Crank-Nicolson method, in addition to its excellent numerical properties, is also highly suitable for massively parallel computation.
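    For reference, here is a serial, single-process sketch of the Crank-Nicolson scheme for the 1D time-dependent Schrödinger equation; the grid, time step, and wave packet are arbitrary illustrative choices, and the paper's actual contribution, the massively parallel decomposition, is not shown:

```python
import numpy as np

# Hypothetical setup: free particle on [0, 1] with Dirichlet walls, hbar = m = 1.
N, dt = 200, 1e-4
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]

# Tridiagonal Hamiltonian H = -(1/2) d^2/dx^2 via central differences.
main = np.full(N, 1.0 / dx**2)
off = np.full(N - 1, -0.5 / dx**2)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# Crank-Nicolson step: (I + i dt H / 2) psi_new = (I - i dt H / 2) psi_old.
eye = np.eye(N)
A = eye + 0.5j * dt * H
B = eye - 0.5j * dt * H

# Gaussian wave packet with momentum k0 = 50, normalized on the grid.
psi = np.exp(-((x - 0.3) ** 2) / (2 * 0.05**2) + 1j * 50.0 * x)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

norm0 = np.sum(np.abs(psi) ** 2) * dx
for _ in range(50):
    psi = np.linalg.solve(A, B @ psi)
norm1 = np.sum(np.abs(psi) ** 2) * dx

# Crank-Nicolson is unitary for Hermitian H, so the norm is conserved
# to rounding error.
print(abs(norm1 - norm0) < 1e-8)
```

    The Cayley form (I + iΔtH/2)⁻¹(I − iΔtH/2) is unitary whenever H is Hermitian, which is the "excellent numerical property" the abstract alludes to: probability is conserved exactly in exact arithmetic, and the scheme is unconditionally stable.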

  4. Leveraging CubeSat Technology to Address Nighttime Imagery Requirements over the Arctic

    Science.gov (United States)

    Pereira, J. J.; Mamula, D.; Caulfield, M.; Gallagher, F. W., III; Spencer, D.; Petrescu, E. M.; Ostroy, J.; Pack, D. W.; LaRosa, A.

    2017-12-01

    The National Oceanic and Atmospheric Administration (NOAA) has begun planning for the future operational environmental satellite system by conducting the NOAA Satellite Observing System Architecture (NSOSA) study. In support of the NSOSA study, NOAA is exploring how CubeSat technology funded by NASA can be used to demonstrate the ability to measure three-dimensional profiles of global temperature and water vapor. These measurements are critical for the National Weather Service's (NWS) weather prediction mission. NOAA is conducting design studies on Earth Observing Nanosatellites (EON) for microwave (EON-MW) and infrared (EON-IR) soundings, with MIT Lincoln Laboratory and NASA JPL, respectively. The next step is to explore the technology required for a CubeSat mission to address NWS nighttime imagery requirements over the Arctic. The concept is called EON-Day/Night Band (DNB). The DNB is a 0.5-0.9 micron channel currently on the operational Visible Infrared Imaging Radiometer Suite (VIIRS) instrument, which is part of the Suomi-National Polar-orbiting Partnership and Joint Polar Satellite System satellites. NWS has found DNB very useful during the long periods of darkness that occur during the Alaskan cold season. The DNB enables nighttime imagery products of fog, clouds, and sea ice. EON-DNB will leverage experiments carried out by The Aerospace Corporation's CUbesat MULtispectral Observation System (CUMULOS) sensor and other related work. CUMULOS is a DoD-funded demonstration of COTS camera technology integrated as a secondary mission on the JPL Integrated Solar Array and Reflectarray Antenna mission. CUMULOS is demonstrating a staring visible Si CMOS camera. The EON-DNB project will leverage proven, advanced compact visible lens and focal plane camera technologies to meet NWS user needs for nighttime visible imagery. Expanding this technology to an operational demonstration carries several areas of risk that need to be addressed prior to an operational mission

  5. The value of tax shields with a fixed book-value leverage ratio

    OpenAIRE

    Fernandez, Pablo

    2005-01-01

    The value of tax shields depends only on the nature of the stochastic process of the net increases of debt. The value of tax shields in a world with no leverage cost is the tax rate times the current debt plus the present value of the net increases of debt. We develop valuation formulae for a company that maintains a fixed book-value leverage ratio and show that this is more realistic than assuming, as Miles-Ezzell (1980) do, a fixed market-value leverage ratio. We also show that Miles-Ezzell ...
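    The valuation rule quoted in this abstract lends itself to a short computation. A minimal numerical sketch, assuming (as an illustration only; the paper derives the appropriate discounting) a single flat discount rate for the expected net debt increases; all figures and the function name are hypothetical:

```python
def value_of_tax_shields(tax_rate, current_debt, net_debt_increases, discount_rate):
    """VTS = T * D + present value of the tax-rate-weighted net increases of debt.

    Hypothetical sketch of the formula stated in the abstract; the flat
    discount rate for the increases is an assumption, not the paper's
    derivation.
    """
    pv_increases = sum(
        tax_rate * delta / (1 + discount_rate) ** t
        for t, delta in enumerate(net_debt_increases, start=1)
    )
    return tax_rate * current_debt + pv_increases

# Toy numbers: 30% tax rate, 1000 of debt today, debt expected to grow
# by 50 per year for three years, discounted at 8%.
vts = value_of_tax_shields(0.30, 1000.0, [50.0, 50.0, 50.0], 0.08)
```

    With no further debt increases the formula reduces to the tax rate times the current debt, which is the first term of the expression in the abstract.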

  6. Struktur Kepemilikan, Kebijakan Dividen, dan Leverage sebagai Determinan atas Nilai Perusahaan

    Directory of Open Access Journals (Sweden)

    Indah Eva Ambarwati

    2014-08-01

    Full Text Available This study aims to examine and provide empirical evidence of the effect of Ownership Structure, Dividend Policy, and Leverage, both partially and simultaneously, on firm value, using the multiple linear regression method. The results of the analysis show that Ownership Structure, Dividend Policy, and Leverage simultaneously affect firm value. Partially, the variables that affect firm value are the Leverage measures (Debt to Equity Ratio, Debt to Capital Asset Ratio, and Long Term Debt Ratio).
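    The three leverage proxies named in this abstract are conventional balance-sheet ratios. A small sketch with hypothetical figures; the textbook definitions are assumed here, and the paper may use slight variants:

```python
def leverage_ratios(total_debt, long_term_debt, total_equity, total_assets):
    """Compute the three leverage measures named in the study.

    Conventional definitions are assumed: DAR is taken as total debt
    over total assets, LTDR as long-term debt over total assets.
    """
    return {
        "DER": total_debt / total_equity,       # Debt to Equity Ratio
        "DAR": total_debt / total_assets,       # Debt to (Capital) Asset Ratio
        "LTDR": long_term_debt / total_assets,  # Long Term Debt Ratio
    }

# Hypothetical firm: 400 of debt (250 of it long-term), 600 equity, 1000 assets.
ratios = leverage_ratios(400.0, 250.0, 600.0, 1000.0)
```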

  7. Is Leverage Effective in Increasing Performance Under Managerial Moral Hazard?

    NARCIS (Netherlands)

    Calcagno, R.

    2000-01-01

    We consider a model in which the principal-agent relation between inside shareholders and the management affects the firm value. We study the effect of financing the project with risky debt in changing the incentive for a risk-neutral shareholder (the principal) to implement the project-value ...

  8. Beyond traditional advertisements: leveraging Facebook's social structures for research recruitment.

    Science.gov (United States)

    Valdez, Rupa S; Guterbock, Thomas M; Thompson, Morgan J; Reilly, Jeremiah D; Menefee, Hannah K; Bennici, Maria S; Williams, Ishan C; Rexrode, Deborah L

    2014-10-27

    Obtaining access to a demographically and geographically diverse sample for health-related research can be costly and time consuming. Previous studies have reported mixed results regarding the potential of using social media-based advertisements to overcome these challenges. Our aim was to develop and assess the feasibility, benefits, and challenges of recruiting for research studies related to consumer health information technology (IT) by leveraging the social structures embedded in the social networking platform, Facebook. Two recruitment strategies that involved direct communication with existing Facebook groups and pages were developed and implemented in two distinct populations. The first recruitment strategy involved posting a survey link directly to consenting groups and pages and was used to recruit Filipino-Americans to a study assessing the perceptions, use of, and preferences for consumer health IT. This study took place between August and December 2013. The second recruitment strategy targeted individuals with type 2 diabetes and involved creating a study-related Facebook group and asking administrators of other groups and pages to publicize our group to their members. Group members were then directly invited to participate in an online pre-study survey. This portion of a larger study to understand existing health management practices as a foundation for consumer health IT design took place between May and June 2014. In executing both recruitment strategies, efforts were made to establish trust and transparency. Recruitment rate, cost, content of interaction, and characteristics of the sample obtained were used to assess the recruitment methods. The two recruitment methods yielded 87 and 79 complete responses, respectively. The first recruitment method yielded a rate of study completion proportionate to the rate of posts made, whereas the recruitment successes of the second method seemed to follow directly from the actions of a subset ...

  9. Massively Parallel Computing: A Sandia Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Dosanjh, Sudip S.; Greenberg, David S.; Hendrickson, Bruce; Heroux, Michael A.; Plimpton, Steve J.; Tomkins, James L.; Womble, David E.

    1999-05-06

    The computing power available to scientists and engineers has increased dramatically in the past decade, due in part to progress in making massively parallel computing practical and available. The expectation for these machines has been great. The reality is that progress has been slower than expected. Nevertheless, massively parallel computing is beginning to realize its potential for enabling significant breakthroughs in science and engineering. This paper provides a perspective on the state of the field, colored by the authors' experiences using large scale parallel machines at Sandia National Laboratories. We address trends in hardware, system software and algorithms, and we also offer our view of the forces shaping the parallel computing industry.

  10. Massive ovarian edema, due to adjacent appendicitis.

    Science.gov (United States)

    Callen, Andrew L; Illangasekare, Tushani; Poder, Liina

    2017-04-01

    Massive ovarian edema is a benign clinical entity, the imaging findings of which can mimic an adnexal mass or ovarian torsion. In the setting of acute abdominal pain, identifying massive ovarian edema is key to avoiding potentially fertility-threatening surgery in young women. In addition, it is important to consider other contributing pathology when ovarian edema is secondary to another process. We present a case of a young woman presenting with subacute abdominal pain, whose initial workup revealed a markedly enlarged right ovary. Further imaging, diagnostic tests, and eventually diagnostic laparoscopy revealed that the ovarian enlargement was secondary to subacute appendicitis, rather than a primary adnexal process. We review the classic ultrasound and MRI findings and pitfalls that relate to this diagnosis.

  11. Stochastic spin-one massive field

    International Nuclear Information System (INIS)

    Lim, S.C.

    1984-01-01

    Stochastic quantization schemes of Nelson and of Parisi and Wu are applied to a spin-one massive field. Unlike the scalar case, Nelson's stochastic spin-one massive field cannot be identified with the corresponding euclidean field even if the fourth component of the euclidean coordinate is taken as equal to the real physical time. In the Parisi-Wu quantization scheme the stochastic Proca vector field has a similar property to the scalar field: it has an asymptotically stationary part and a transient part. The large equal-time limit of the expectation values of the stochastic Proca field is equal to the expectation values of the corresponding euclidean field. In the Stueckelberg formalism the Parisi-Wu scheme gives rise to a stochastic vector field which differs from the massless gauge field in that the gauge cannot be fixed by the choice of boundary condition. (orig.)

  12. Frontiers of massively parallel scientific computation

    International Nuclear Information System (INIS)

    Fischer, J.R.

    1987-07-01

    Practical applications using massively parallel computer hardware first appeared during the 1980s. Their development was motivated by the need for computing power orders of magnitude beyond that available today for tasks such as numerical simulation of complex physical and biological processes, generation of interactive visual displays, satellite image analysis, and knowledge based systems. Representative of the first generation of this new class of computers is the Massively Parallel Processor (MPP). A team of scientists was provided the opportunity to test and implement their algorithms on the MPP. The first results are presented. The research spans a broad variety of applications including Earth sciences, physics, signal and image processing, computer science, and graphics. The performance of the MPP was very good. Results obtained using the Connection Machine and the Distributed Array Processor (DAP) are also presented.

  13. M2M massive wireless access

    DEFF Research Database (Denmark)

    Zanella, Andrea; Zorzi, Michele; Santos, André F.

    2013-01-01

    In order to make the Internet of Things a reality, ubiquitous coverage and low-complexity connectivity are required. Cellular networks are hence the most straightforward and realistic solution to enable a massive deployment of always connected Machines around the globe. Nevertheless, a paradigm shift in the conception and design of future cellular networks is called for. Massive access attempts, low-complexity and cheap machines, sporadic transmission and correlated signals are among the main properties of this new reality, whose main consequence is the disruption of the development ... Access Reservation, Coded Random Access and the exploitation of multiuser detection in random access. Additionally, we will show how the properties of machine originated signals, such as sparsity and spatial/time correlation, can be exploited. The end goal of this paper is to provide motivation ...

  14. Massive Predictive Modeling using Oracle R Enterprise

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...
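    The per-entity modeling pattern described here (one model per customer, zip code, or simulation) is independent of the Oracle stack. A generic Python sketch, with hypothetical data and a simple least-squares line per group standing in for Oracle R Enterprise's embedded R execution, shows the shape of the computation:

```python
from collections import defaultdict

def fit_per_entity(rows):
    """Fit a least-squares line y = a + b*x per entity.

    rows: iterable of (entity_id, x, y) tuples. This is a generic
    sketch of the per-entity modeling pattern, not Oracle R
    Enterprise code; names and data are hypothetical.
    """
    groups = defaultdict(list)
    for entity, x, y in rows:
        groups[entity].append((x, y))

    models = {}
    for entity, pts in groups.items():
        n = len(pts)
        sx = sum(x for x, _ in pts)
        sy = sum(y for _, y in pts)
        sxx = sum(x * x for x, _ in pts)
        sxy = sum(x * y for x, y in pts)
        b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        a = (sy - b * sx) / n
        models[entity] = (a, b)  # per-entity intercept and slope
    return models

# Two hypothetical customers with different behavior over time.
data = [("cust1", 1, 2), ("cust1", 2, 4), ("cust1", 3, 6),
        ("cust2", 1, 5), ("cust2", 2, 4), ("cust2", 3, 3)]
models = fit_per_entity(data)
```

    In a production setting each per-entity fit would be pushed to where the data lives rather than pulled into a single process, which is the gap the embedded-execution technologies described in the talk are meant to close.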

  15. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  16. Massive scalar field evolution in de Sitter

    Energy Technology Data Exchange (ETDEWEB)

    Markkanen, Tommi [Department of Physics, King’s College London,Strand, London WC2R 2LS (United Kingdom); Rajantie, Arttu [Department of Physics, Imperial College London,London SW7 2AZ (United Kingdom)

    2017-01-30

    The behaviour of a massive, non-interacting and non-minimally coupled quantised scalar field in an expanding de Sitter background is investigated by solving the field evolution for an arbitrary initial state. In this approach there is no need to choose a vacuum in order to provide a definition for particle states, nor to introduce an explicit ultraviolet regularization. We conclude that the expanding de Sitter space is a stable equilibrium configuration under small perturbations of the initial conditions. Depending on the initial state, the energy density can approach its asymptotic value from above or below, the latter of which implies a violation of the weak energy condition. The backreaction of the quantum corrections can therefore lead to a phase of super-acceleration also in the non-interacting massive case.

  17. How Massive Single Stars End Their Life

    Science.gov (United States)

    Heger, A.; Fryer, C. L.; Woosley, S. E.; Langer, N.; Hartmann, D. H.

    2003-01-01

    How massive stars die, that is, what sort of explosion and remnant each produces, depends chiefly on the masses of their helium cores and hydrogen envelopes at death. For single stars, stellar winds are the only means of mass loss, and these are a function of the metallicity of the star. We discuss how metallicity, and a simplified prescription for its effect on mass loss, affects the evolution and final fate of massive stars. We map, as a function of mass and metallicity, where black holes and neutron stars are likely to form and where different types of supernovae are produced. Integrating over an initial mass function, we derive the relative populations as a function of metallicity. Provided that single stars rotate rapidly enough at death, we speculate on stellar populations that might produce gamma-ray bursts and jet-driven supernovae.

  18. Electromagnetic form factors of a massive neutrino

    International Nuclear Information System (INIS)

    Dvornikov, M.S.; Studenikin, A.I.

    2004-01-01

    Electromagnetic form factors of a massive neutrino are studied in a minimally extended standard model in an arbitrary R_ξ gauge, taking into account the dependence on the masses of all interacting particles. The contributions from all Feynman diagrams to the electric, magnetic, and anapole form factors, in which the dependence on the masses of all particles as well as on the gauge parameters is accounted for exactly, are obtained for the first time in explicit form. The asymptotic behavior of the magnetic form factor for large negative squares of the momentum of an external photon is analyzed and the expression for the anapole moment of a massive neutrino is derived. The results are generalized to the case of mixing between various flavors of the neutrino. Explicit expressions are obtained for the electric, magnetic, and electric dipole and anapole transitional form factors as well as for the transitional electric dipole moment.

  19. HII regions in collapsing massive molecular clouds

    International Nuclear Information System (INIS)

    Yorke, H.W.; Bodenheimer, P.; Tenorio-Tagle, G.

    1982-01-01

    Results of two-dimensional numerical calculations of the evolution of HII regions associated with self-gravitating, massive molecular clouds are presented. Depending on the location of the exciting star, a champagne flow can occur concurrently with the central collapse of a nonrotating cloud. Partial evaporation of the cloud at a rate of about 0.005 solar masses/yr results. When 100 O-stars are placed at the center of a freely falling cloud of 3×10^5 solar masses, no evaporation takes place. Rotating clouds collapse to disks and the champagne flow can evaporate the cloud at a higher rate (0.01 solar masses/yr). It is concluded that massive clouds containing OB-stars have lifetimes of no more than 10^7 yr. (Auth.)

  20. Environmental partnerships: Leveraging resources to meet environmental challenges

    International Nuclear Information System (INIS)

    Sink, C.; Berg, T.; Booth, F.; Easley, K.

    1992-01-01

    Over 40 years of defense production activities have left behind a serious environmental legacy. Federal and State mandates require the remediation of defense production sites. To ensure an appropriate and timely response to these enormous environmental restoration and waste management challenges, the Secretary of Energy, Admiral James D. Watkins, authorized the establishment of the Office of Environmental Restoration and Waste Management (EM). EM is actively seeking collaborative opportunities with other government agencies and the private sector to identify, adapt, and develop new and consistent site restoration and waste management practices throughout the DOE Complex. The Technology Integration Division (TID) of the EM Office of Technology Development (TD) is charged with promoting the movement of innovative technology and 'lessons learned' into, out of, and across the Complex to enhance public, private, domestic, and international cleanup capabilities and bolster U.S. competitiveness. Secretary Watkins recently set a new course for DOE in technology transfer, and TID is responding to this new mission requirement by expanding and enhancing cooperative work with public and private sector partners. Consistent with this new philosophy of operations, TID acts as a facilitator to ensure that other government agencies, industry, and universities work in partnership with EM to find more efficient and cost-effective technological solutions to mutual environmental management problems. In addition, TID leverages the technical and financial resources of public and private participants to share the costs associated with technology research, development, demonstration, testing, and evaluation (RDDT&E). This paper provides an overview of the OTD technology integration effort, the importance of public participation, and a discussion of technology integration models currently being developed in conjunction with TID support and oversight. (author)

  1. Leveraging multi-generational workforce values in interactive information societies

    Directory of Open Access Journals (Sweden)

    Sophie van der Walt

    2010-08-01

    Full Text Available Background: The success of organisations relies on various factors, including the ability of a multi-generational workforce to collaborate within the interactive information society. By developing an awareness of the different values of a diverse workforce, organisations may benefit from diversity. Various diversity factors, such as ethnicity, age and gender, impact on the way people interact, especially in the interactive information society. Objectives: This article advocates the need for generational awareness and addresses how this awareness presents benefits to companies, such as increased productivity, improved succession planning policies and strategies to recruit and retain a diverse workforce. The research problem is directed at how diversity management influences Traditionalists, Baby Boomers, Generation X and Generation Y in terms of their work performance and co-worker relationships. Method: The research design combines Critical Theory and Generational Theory within the mixed-method paradigm. The sequential exploratory design was decided upon as it studies the unknown relationships between different generations of employees. The literature review was followed by a quantitative empirical research component, and data was collected by means of a questionnaire. Results: The findings highlight specific differences between generations regarding their perspectives on work values and co-worker relationships, rewards, work-life balance and retirement. Conclusion: The article concludes with recommendations on the role diversity management plays in terms of work performance and co-worker relationships. By leveraging generational awareness in the interactive information society, organisations with a multi-generational workforce will succeed in the competitive business environment.

  2. Massively parallel evolutionary computation on GPGPUs

    CERN Document Server

    Tsutsui, Shigeyoshi

    2013-01-01

    Evolutionary algorithms (EAs) are metaheuristics that learn from natural collective behavior and are applied to solve optimization problems in domains such as scheduling, engineering, bioinformatics, and finance. Such applications demand acceptable solutions with high-speed execution using finite computational resources. Therefore, there have been many attempts to develop platforms for running parallel EAs using multicore machines, massively parallel cluster machines, or grid computing environments. Recent advances in general-purpose computing on graphics processing units (GPGPU) have opened up ...

  3. FMFT. Fully massive four-loop tadpoles

    Energy Technology Data Exchange (ETDEWEB)

    Pikelner, Andrey [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik

    2017-07-15

    We present FMFT - a package written in FORM that evaluates four-loop fully massive tadpole Feynman diagrams. It is a successor of the MATAD package that has been successfully used to calculate many renormalization group functions at three-loop order in a wide range of quantum field theories especially in the Standard Model. We describe an internal structure of the package and provide some examples of its usage.

  4. Towards Massive Machine Type Cellular Communications

    OpenAIRE

    Dawy, Zaher; Saad, Walid; Ghosh, Arunabha; Andrews, Jeffrey G.; Yaacoub, Elias

    2015-01-01

    Cellular networks have been engineered and optimized to carry ever-increasing amounts of mobile data, but over the last few years, a new class of applications based on machine-centric communications has begun to emerge. Automated devices such as sensors, tracking devices, and meters, often referred to as machine-to-machine (M2M) or machine-type communications (MTC), introduce an attractive revenue stream for mobile network operators, if a massive number of them can be efficiently supported ...

  5. Massive Schwinger model at finite θ

    Science.gov (United States)

    Azcoiti, Vicente; Follana, Eduardo; Royo-Amondarain, Eduardo; Di Carlo, Giuseppe; Vaquero Avilés-Casco, Alejandro

    2018-01-01

    Using the approach developed by V. Azcoiti et al. [Phys. Lett. B 563, 117 (2003), 10.1016/S0370-2693(03)00601-4], we are able to reconstruct the behavior of the massive one-flavor Schwinger model with a θ term and a quantized topological charge. We calculate the full dependence of the order parameter on θ. Our results at θ = π are compatible with Coleman's conjecture on the phase diagram of this model.

  6. Harmonic polylogarithms for massive Bhabha scattering

    International Nuclear Information System (INIS)

    Czakon, M.; Riemann, T.

    2005-08-01

    One- and two-dimensional harmonic polylogarithms, HPLs and GPLs, appear in calculations of multi-loop integrals. We discuss them in the context of analytical solutions for two-loop master integrals in the case of massive Bhabha scattering in QED. For the GPLs we discuss analytical representations, conformal transformations, and also their transformations corresponding to relations between master integrals in the s- and t-channel. (orig.)

  7. Massive Open Online Courses and economic sustainability

    OpenAIRE

    Liyanagunawardena, Tharindu R.; Lundqvist, Karsten O.; Williams, Shirley A.

    2015-01-01

    Millions of users around the world have registered on Massive Open Online Courses (MOOCs) offered by hundreds of universities (and other organizations) worldwide. Creating and offering these courses costs thousands of pounds. However, at present, revenue generated by MOOCs is not sufficient to offset these costs. The sustainability of MOOCs is a pressing concern as they incur not only upfront creation costs but also maintenance costs to keep content relevant, as well as on-going facilitation ...

  8. Weakly interacting massive particles and stellar structure

    International Nuclear Information System (INIS)

    Bouquet, A.

    1988-01-01

    The existence of weakly interacting massive particles (WIMPs) may solve both the dark matter problem and the solar neutrino problem. Such particles affect the energy transport in stellar cores and change the stellar structure. We present the results of an analytic approximation to compute these effects in a self-consistent way. These results can be applied to many different stars, but we focus on the decrease of the ⁸B neutrino flux in the case of the Sun.

  9. Non Pauli-Fierz Massive Gravitons

    CERN Document Server

    Dvali, Gia; Redi, Michele

    2008-01-01

    We study general Lorentz invariant theories of massive gravitons. We show that, contrary to the standard lore, there exist consistent theories where the graviton mass term violates Pauli-Fierz structure. For theories where the graviton is a resonance this does not imply the existence of a scalar ghost if the deviation from Pauli-Fierz becomes sufficiently small at high energies. These types of mass terms are required by any consistent realization of the DGP model in higher dimension.

  10. FMFT: fully massive four-loop tadpoles

    Science.gov (United States)

    Pikelner, Andrey

    2018-03-01

    We present FMFT - a package written in FORM that evaluates four-loop fully massive tadpole Feynman diagrams. It is a successor of the MATAD package that has been successfully used to calculate many renormalization group functions at three-loop order in a wide range of quantum field theories especially in the Standard Model. We describe an internal structure of the package and provide some examples of its usage.

  11. On 3D Minimal Massive Gravity

    CERN Document Server

    Alishahiha, Mohsen; Naseh, Ali; Shirzad, Ahmad

    2014-12-03

    We study the linearized equations of motion of the newly proposed three-dimensional gravity, known as minimal massive gravity, using its metric formulation. We observe that the resultant linearized equations are exactly the same as those of TMG, by making use of a redefinition of the parameters of the model. In particular, the model admits logarithmic modes at the critical points. We also study several vacuum solutions of the model, especially in a certain limit where the contribution of the Chern-Simons term vanishes.

  12. Magnetic fields and massive star formation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Qizhou; Keto, Eric; Ho, Paul T. P.; Ching, Tao-Chung; Chen, How-Huan [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Qiu, Keping [School of Astronomy and Space Science, Nanjing University, 22 Hankou Road, Nanjing 210093 (China); Girart, Josep M.; Juárez, Carmen [Institut de Ciències de l' Espai, (CSIC-IEEC), Campus UAB, Facultat de Ciències, C5p 2, E-08193 Bellaterra, Catalonia (Spain); Liu, Hauyu; Tang, Ya-Wen; Koch, Patrick M.; Rao, Ramprasad; Lai, Shih-Ping [Academia Sinica Institute of Astronomy and Astrophysics, P.O. Box 23-141, Taipei 106, Taiwan (China); Li, Zhi-Yun [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904 (United States); Frau, Pau [Observatorio Astronómico Nacional, Alfonso XII, 3 E-28014 Madrid (Spain); Li, Hua-Bai [Department of Physics, The Chinese University of Hong Kong, Hong Kong (China); Padovani, Marco [Laboratoire de Radioastronomie Millimétrique, UMR 8112 du CNRS, École Normale Supérieure et Observatoire de Paris, 24 rue Lhomond, F-75231 Paris Cedex 05 (France); Bontemps, Sylvain [OASU/LAB-UMR5804, CNRS, Université Bordeaux 1, F-33270 Floirac (France); Csengeri, Timea, E-mail: qzhang@cfa.harvard.edu [Max Planck Institute for Radioastronomy, Auf dem Hügel 69, D-53121 Bonn (Germany)

    2014-09-10

    Massive stars (M > 8 M_☉) typically form in parsec-scale molecular clumps that collapse and fragment, leading to the birth of a cluster of stellar objects. We investigate the role of magnetic fields in this process through dust polarization at 870 μm obtained with the Submillimeter Array (SMA). The SMA observations reveal polarization at scales of ≲0.1 pc. The polarization pattern in these objects ranges from ordered hour-glass configurations to more chaotic distributions. By comparing the SMA data with the single dish data at parsec scales, we found that magnetic fields at dense core scales are either aligned within 40° of or perpendicular to the parsec-scale magnetic fields. This finding indicates that magnetic fields play an important role during the collapse and fragmentation of massive molecular clumps and the formation of dense cores. We further compare magnetic fields in dense cores with the major axis of molecular outflows. Despite a limited number of outflows, we found that the outflow axis appears to be randomly oriented with respect to the magnetic field in the core. This result suggests that at the scale of accretion disks (≲10^3 AU), angular momentum and dynamic interactions possibly due to close binary or multiple systems dominate over magnetic fields. With this unprecedentedly large sample of massive clumps, we argue on a statistical basis that magnetic fields play an important role during the formation of dense cores at spatial scales of 0.01-0.1 pc in the context of massive star and cluster star formation.

  13. Comment on ''Topologically Massive Gauge Theories''

    International Nuclear Information System (INIS)

    Bezerra de Mello, E.R.

    1988-01-01

    In a recent paper by R. Pisarski and S. Rao concerning topologically massive quantum Yang-Mills theory, the expression for the P-even part of the non-Abelian gauge field self-energy at one-loop order is shown to obey a consistency condition, which is not fulfilled by the formula originally presented by S. Deser, R. Jackiw, and S. Templeton. In this comment, I present a recalculation which agrees with Pisarski and Rao. copyright 1988 Academic Press, Inc

  14. SUPERDENSE MASSIVE GALAXIES IN WINGS LOCAL CLUSTERS

    International Nuclear Information System (INIS)

    Valentinuzzi, T.; D'Onofrio, M.; Fritz, J.; Poggianti, B. M.; Bettoni, D.; Fasano, G.; Moretti, A.; Omizzolo, A.; Varela, J.; Cava, A.; Couch, W. J.; Dressler, A.; Moles, M.; Kjaergaard, P.; Vanzella, E.

    2010-01-01

    Massive quiescent galaxies at z > 1 have been found to have small physical sizes, and hence to be superdense. Several mechanisms, including minor mergers, have been proposed for increasing galaxy sizes from high- to low-z. We search for superdense massive galaxies in the WIde-field Nearby Galaxy-cluster Survey (WINGS) of X-ray selected galaxy clusters at 0.04 < z < 0.07. These galaxies, with masses above 10^10 M_☉, are mostly S0 galaxies, have a median effective radius R_e = 1.61 ± 0.29 kpc, a median Sersic index n = 3.0 ± 0.6, and very old stellar populations with a median mass-weighted age of 12.1 ± 1.3 Gyr. We calculate a number density of 2.9 × 10^-2 Mpc^-3 for superdense galaxies in local clusters, and a hard lower limit of 1.3 × 10^-5 Mpc^-3 in the whole comoving volume between z = 0.04 and z = 0.07. We find a relation between mass, effective radius, and luminosity-weighted age in our cluster galaxies, which can mimic the claimed evolution of the radius with redshift, if not properly taken into account. We compare our data with spectroscopic high-z surveys and find that, when stellar masses are considered, there is consistency with the local WINGS galaxy sizes out to z ∼ 2, while a discrepancy of a factor of 3 exists with the only spectroscopic z > 2 study. In contrast, there is strong evidence for a large evolution in radius for the most massive galaxies with M_* > 4 × 10^11 M_☉ compared to similarly massive galaxies in WINGS, i.e., the brightest cluster galaxies.

  15. EVOLUTION OF MASSIVE PROTOSTARS VIA DISK ACCRETION

    International Nuclear Information System (INIS)

    Hosokawa, Takashi; Omukai, Kazuyuki; Yorke, Harold W.

    2010-01-01

    Mass accretion onto (proto-)stars at high accretion rates Ṁ_* > 10^-4 M_☉ yr^-1 is expected in massive star formation. We study the evolution of massive protostars at such high rates by numerically solving the stellar structure equations. In this paper, we examine the evolution via disk accretion. We consider a limiting case of 'cold' disk accretion, whereby most of the stellar photosphere can radiate freely with negligible backwarming from the accretion flow, and the accreting material settles onto the star with the same specific entropy as the photosphere. We compare our results to the calculated evolution via spherically symmetric accretion, the opposite limit, whereby the material accreting onto the star contains the entropy produced in the accretion shock front. We examine how different accretion geometries affect the evolution of massive protostars. For cold disk accretion at 10^-3 M_☉ yr^-1, the radius of a protostar is initially small, R_* ≅ a few R_☉. After several solar masses have accreted, the protostar begins to bloat up and for M_* ≅ 10 M_☉ the stellar radius attains its maximum of 30-400 R_☉. The large radius ∼100 R_☉ is also a feature of spherically symmetric accretion at the same accreted mass and accretion rate. Hence, expansion to a large radius is a robust feature of accreting massive protostars. At later times, the protostar eventually begins to contract and reaches the zero-age main sequence (ZAMS) for M_* ≅ 30 M_☉, independent of the accretion geometry. For accretion rates exceeding several × 10^-3 M_☉ yr^-1, the protostar never contracts to the ZAMS. The very large radius of several hundred R_☉ results in the low effective temperature and low UV luminosity of the protostar. Such bloated protostars could well explain the existence of bright high-mass protostellar objects, which lack detectable H II regions.

  16. Extensive tumor reconstruction with massive allograft

    International Nuclear Information System (INIS)

    Zulmi Wan

    1999-01-01

Massive deep-frozen bone allografts were implanted in four patients after wide tumor resection. Two cases were solitary proximal femur metastases, secondary to thyroid cancer and breast cancer respectively, while the other two cases were primary in nature, i.e. chondrosarcoma of the proximal humerus and osteosarcoma of the proximal femur. All were treated with a cemented alloprosthesis except in the upper limb, where shoulder fusion was performed. These techniques were augmented with a free vascularised fibular composite graft segment to the proximal femur of the breast-cancer metastasis and to the proximal humerus chondrosarcoma; coverage of the wound of the latter was also provided by a latissimus dorsi flap. The present investigation demonstrated that the massive bone allografts were intimately anchored by host bone and that there was no evidence of aseptic loosening at the graft-cement interface. This study showed that, with effective tumor control, reconstructive surgery with massive allografts represents a good alternative to prosthetic implants in tumors of the limbs. No infection was seen in any of the four cases.

  17. Cosmology in general massive gravity theories

    International Nuclear Information System (INIS)

    Comelli, D.; Nesti, F.; Pilo, L.

    2014-01-01

We study the flat FRW cosmological solutions generated in general massive gravity theories. Such models are obtained by adding to the Einstein General Relativity action a non-derivative potential, a function of the metric components, that induces the propagation of five gravitational degrees of freedom. This large class of theories includes both the case with a residual Lorentz invariance and the case with rotational invariance only. It turns out that the Lorentz-breaking case is selected as the only possibility. Moreover, perturbations around strict Minkowski or dS space turn out to be strongly coupled. The upshot is that even though dark energy can be simply accounted for by massive gravity modifications, its equation of state w_eff has to deviate from -1. Indeed, there is an explicit relation between the strong-coupling scale of perturbations and the deviation of w_eff from -1. Taking into account current limits on w_eff and submillimeter tests of Newton's law as a limit on the possible strong-coupling scale, we find that it is still possible to have a weakly coupled theory in a quasi-dS background. Future experimental improvements on short-distance tests of Newton's law may be used to tighten the deviation of w_eff from -1 in a weakly coupled massive gravity theory.

  18. Massive transfusion protocols: current best practice

    Directory of Open Access Journals (Sweden)

    Hsu YM

    2016-03-01

Full Text Available Yen-Michael S Hsu,1 Thorsten Haas,2 Melissa M Cushing1 1Department of Pathology and Laboratory Medicine, Weill Cornell Medical College, New York, NY, USA; 2Department of Anesthesia, University Children's Hospital Zurich, Zurich, Switzerland Abstract: Massive transfusion protocols (MTPs) are established to provide rapid blood replacement in a setting of severe hemorrhage. Early optimal blood transfusion is essential to sustain organ perfusion and oxygenation. There are many variables to consider when establishing an MTP, and studies have prospectively evaluated different scenarios and patient populations to establish the best practices to attain improved patient outcomes. The establishment and utilization of an optimal MTP are challenging given the ever-changing patient status during resuscitation efforts. Much of the MTP literature comes from the trauma population, because massive hemorrhage is the leading cause of preventable trauma-related death. As we come to further understand the positive and negative clinical impacts of transfusion-related factors, massive transfusion practice can be further refined. This article will first discuss specific MTPs targeting different patient populations and current relevant international guidelines. Then, we will examine a wide selection of therapeutic products to support MTPs, including newly available products and the most suitable of the traditional products. Lastly, we will discuss the best design for an MTP, including ratio-based MTPs and MTPs based on the use of point-of-care coagulation diagnostic tools. Keywords: hemorrhage, MTP, antifibrinolytics, coagulopathy, trauma, ratio, logistics, guidelines, hemostatic

  19. Galaxy bispectrum from massive spinning particles

    Science.gov (United States)

    Moradinezhad Dizgah, Azadeh; Lee, Hayden; Muñoz, Julian B.; Dvorkin, Cora

    2018-05-01

Massive spinning particles, if present during inflation, lead to a distinctive bispectrum of primordial perturbations, the shape and amplitude of which depend on the masses and spins of the extra particles. This signal, in turn, leaves an imprint in the statistical distribution of galaxies; in particular, as a non-vanishing galaxy bispectrum, which can be used to probe the masses and spins of these particles. In this paper, we present for the first time a new theoretical template for the bispectrum generated by massive spinning particles, valid for a general triangle configuration. We then proceed to perform a Fisher-matrix forecast to assess the potential of two next-generation spectroscopic galaxy surveys, EUCLID and DESI, to constrain the primordial non-Gaussianity sourced by these extra particles. We model the galaxy bispectrum using tree-level perturbation theory, accounting for redshift-space distortions and the Alcock-Paczynski effect, and forecast constraints on the primordial non-Gaussianity parameters marginalizing over all relevant biases and cosmological parameters. Our results suggest that these surveys would potentially be sensitive to any primordial non-Gaussianity with an amplitude larger than f_NL ≈ 1, for massive particles with spins 2, 3, and 4. Interestingly, if non-Gaussianities are present at that level, these surveys will be able to infer the masses of these spinning particles to within tens of percent. If detected, this would provide a very clear window into the particle content of our Universe during inflation.
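The Fisher-matrix forecasting procedure described above can be illustrated in miniature. The sketch below uses an invented two-parameter model, template shapes, and noise levels (a real bispectrum forecast marginalizes over many more biases and cosmological parameters); it only shows how marginalized errors follow from derivative templates and a noise model:

```python
import numpy as np

# Toy Fisher-matrix forecast. The model, templates, and noise level
# are invented for illustration; they are not the paper's likelihood.

def fisher_matrix(model_grad, sigma):
    """F_ij = sum_k (dmu_k/dtheta_i)(dmu_k/dtheta_j) / sigma_k^2."""
    G = model_grad / sigma[:, None]     # (n_data, n_params)
    return G.T @ G

# Hypothetical observable: mu_k = fNL * s_k + b * t_k (two parameters).
k = np.linspace(0.01, 0.2, 50)
s = 1.0 / k                             # toy non-Gaussian template
t = k**2                                # toy bias-dependent term
grad = np.column_stack([s, t])          # derivatives w.r.t. (fNL, b)
sigma = 5.0 / k                         # toy noise per data point

F = fisher_matrix(grad, sigma)
cov = np.linalg.inv(F)                  # marginalized parameter covariance
sigma_fnl = np.sqrt(cov[0, 0])          # marginalized 1-sigma error on fNL
print(round(sigma_fnl, 3))
```

Inverting the full Fisher matrix (rather than taking 1/sqrt(F_00)) is what implements the marginalization over the nuisance parameter.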

  20. Effects of massive transfusion on oxygen availability

    Directory of Open Access Journals (Sweden)

    José Otávio Costa Auler Jr

Full Text Available OBJECTIVE: To determine oxygen-derived parameters, hemodynamic and biochemical laboratory data (2,3-diphosphoglycerate, lactate and blood gas analysis in patients after cardiac surgery who received massive blood replacement. DESIGN: Prospective study. SETTING: Heart Institute (Instituto do Coração, Hospital das Clínicas, Faculdade de Medicina, Universidade de São Paulo, Brazil. PARTICIPANTS: Twelve patients after cardiac surgery who received massive transfusion replacement; six of them had a fatal outcome within the three-day postoperative follow-up. MEASUREMENTS AND MAIN RESULTS: The non-survivors group (n=6 presented high lactate levels and low P50 levels when compared to the survivors group (p<0.05. Both groups presented an increase in oxygen consumption and O2 extraction, and there were no significant differences between them regarding these parameters. The 2,3-DPG levels were slightly reduced in both groups. CONCLUSIONS: This study shows that patients who are massively transfused following cardiovascular surgery present cell oxygenation disturbances, probably as a result of inadequate O2 transport.

  1. Emergent universe with wormholes in massive gravity

    Science.gov (United States)

    Paul, B. C.; Majumdar, A. S.

    2018-03-01

An emergent universe (EU) scenario is proposed to obtain a universe free from the big-bang singularity. In this framework the present universe emerged from a static Einstein universe phase in the infinite past. A flat EU scenario is found to exist in Einstein’s gravity with a non-linear equation of state (EoS). It has been shown subsequently that a physically realistic EU model can be obtained by considering a cosmic fluid composed of interacting fluids with a non-linear equation of state. This yields a viable cosmological model accommodating both the early inflationary and present accelerating phases. In the present paper, the origin of the initial static Einstein universe needed in the EU model is explored in a massive gravity theory; this static phase subsequently emerges as a dynamically evolving universe. A new gravitational instanton solution in a flat universe is obtained in the massive gravity theory, which is a dynamical wormhole that might play an important role in realizing the origin of the initial state of the emergent universe. The emergence of a Lorentzian universe from Euclidean gravity is understood by a Wick rotation τ = i t. A universe with radiation at the beginning finally transits into the present observed universe with a non-linear EoS as the interactions among the fluids set in. Thus a viable flat EU scenario, in which the universe stretches infinitely back in time with no big bang, is permitted in massive gravity.

  2. Transcatheter embolization therapy of massive colonic bleeding

    International Nuclear Information System (INIS)

    Shin, G. H.; Oh, J. H.; Yoon, Y.

    1996-01-01

To evaluate the efficacy and safety of emergent superselective transcatheter embolization for controlling massive colonic bleeding. Six of the seven patients who had symptoms of massive gastrointestinal bleeding underwent emergent transcatheter embolization for control of the bleeding. Gastrointestinal bleeding in these patients originated from various colonic diseases: rectal cancer (n=1), proctitis (n=1), benign ulcer (n=1), mucosal injury by ventriculoperitoneal shunt (n=1), and unknown (n=2). All patients except one with rectal cancer were critically ill. Superselective embolization was performed using Gelfoam particles and/or coils. The vessels embolized were the ileocolic artery (n=1), superior rectal artery (n=2), inferior rectal artery (n=1), and middle and inferior rectal arteries (n=1). Hemostasis was immediately successful in all patients. Two underwent surgery because of recurrent bleeding that developed 3 days after the procedure (n=1) or in association with underlying rectal cancer (n=1). On the surgical specimens of these two cases, there was no mucosal ischemic change. Transcatheter embolization is a safe and effective treatment method for the control of massive colonic bleeding.

  3. Entity Linking Leveraging the GeoDeepDive Cyberinfrastructure and Managing Uncertainty with Provenance.

    Science.gov (United States)

    Maio, R.; Arko, R. A.; Lehnert, K.; Ji, P.

    2017-12-01

    Unlocking the full, rich, network of links between the scientific literature and the real world entities to which data correspond - such as field expeditions (cruises) on oceanographic research vessels and physical samples collected during those expeditions - remains a challenge for the geoscience community. Doing so would enable data reuse and integration on a broad scale; making it possible to inspect the network and discover, for example, all rock samples reported in the scientific literature found within 10 kilometers of an undersea volcano, and associated geochemical analyses. Such a capability could facilitate new scientific discoveries. The GeoDeepDive project provides negotiated access to 4.2+ million documents from scientific publishers, enabling text and document mining via a public API and cyberinfrastructure. We mined this corpus using entity linking techniques, which are inherently uncertain, and recorded provenance information about each link. This opens the entity linking methodology to scrutiny, and enables downstream applications to make informed assessments about the suitability of an entity link for consumption. A major challenge is how to model and disseminate the provenance information. We present results from entity linking between journal articles, research vessels and cruises, and physical samples from the Petrological Database (PetDB), and incorporate Linked Data resources such as cruises in the Rolling Deck to Repository (R2R) catalog where possible. Our work demonstrates the value and potential of the GeoDeepDive cyberinfrastructure in combination with Linked Data infrastructure provided by the EarthCube GeoLink project. We present a research workflow to capture provenance information that leverages the World Wide Web Consortium (W3C) recommendation PROV Ontology.
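A hypothetical sketch of what such a provenance-carrying entity-link record could look like, loosely borrowing PROV-O class names (prov:Entity, prov:Activity, prov:Agent). All identifiers, the confidence field, and the linker name below are invented for illustration; this is not the project's actual schema:

```python
import json
import datetime

# Sketch: bundle an uncertain entity link with the provenance needed
# to audit it downstream (what derived it, when, and how confident).
# Every identifier here (DOI, cruise ID, software name) is invented.

def make_link_record(doc_id, entity_id, mention, score, method):
    """Return a PROV-style dict describing one document-to-entity link."""
    return {
        "prov:Entity": {"link": {"document": doc_id, "target": entity_id}},
        "prov:Activity": {
            "entityLinking": {
                "mention": mention,        # text span found in the document
                "method": method,          # how the link was derived
                "confidence": score,       # downstream apps can filter on this
            }
        },
        "prov:Agent": {"software": "example-linker/0.1"},
        "prov:generatedAtTime": datetime.datetime(2017, 12, 1).isoformat(),
    }

rec = make_link_record(
    doc_id="doi:10.0000/example",
    entity_id="r2r:cruise/EX1234",       # invented R2R-style identifier
    mention="cruise EX1234",
    score=0.87,
    method="dictionary-match",
)
print(json.dumps(rec, indent=2)[:80])
```

Keeping the confidence score and method inside the record is what lets downstream applications decide whether a given link is suitable for consumption.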

  4. A Case Study: Leadership Style and Practice Leveraging Knowledge Management in Multigenerational Professional Learning Communities

    Science.gov (United States)

    Giles-Weeks, Veda

    2014-01-01

Age-related demographic changes within public school organizations are resulting in leadership challenges in leveraging organizational knowledge across four unique generational cohorts. Competitive success within schools has linkages to organizational cohesiveness and knowledge management (KM). Generational cohorts maintain values affecting…

  5. EUCLID: Leveraging IPM for sustainable production of fruit and vegetable crops in partnership with China

    OpenAIRE

    Nicot , Philippe C.; Bardin , Marc; Leyronas , Christel; Desneux , Nicolas

    2016-01-01

EUCLID: Leveraging IPM for sustainable production of fruit and vegetable crops in partnership with China. 13th IOBC-WPRS Meeting of the working group "Biological control of fungal and bacterial plant pathogens".

  6. Leveraging DMO's Hi-Tech Simulation Against the F-16 Flying Training Gap

    National Research Council Canada - National Science Library

    McGrath, Shaun R

    2005-01-01

.... The purpose of this research is to examine leveraging hi-tech simulation assets against the ever-growing gap in training caused by a systematic reduction in the average fighter pilot's flying hours...

  7. Leveraging Indigenous Knowledge to Create Jobs for Women in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Researchers will identify indigenous knowledge, technology, and traditional enterprise practices in which ... The project will build research skills and knowledge to support Rwanda and Tanzania's ... Giving girls and women the power to decide.

  8. Leveraging technology to reduce health inequities in Kenya | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-11-22

    Nov 22, 2016 ... The three-year study, titled Addressing health inequities in Kenya: Potential and ... through applied research capacity building in e-health (SEARCH) program. ... For example, MoH was involved from project formulation and ...

  9. The effects of government bond purchases on leverage constraints of banks and non-financial firms

    OpenAIRE

    Kühl, Michael

    2016-01-01

    This paper investigates how government bond purchases affect leverage-constrained banks and non-financial firms by utilising a stochastic general equilibrium model. My results indicate that government bond purchases not only reduce non-financial firms' borrowing costs, amplified through a reduction in expected defaults, but also lower banks' profit margins. In an economy in which loans priced at par dominate in banks' balance sheets - as a reflection of the euro area's structure - the leverag...

  10. PENGARUH PERPUTARAN KAS, PERPUTARAN PIUTANG USAHA, PERPUTARAN PERSEDIAAN DAN LEVERAGE TERHADAP KINERJA KEUANGAN PERUSAHAAN

    OpenAIRE

    Suprihatin, Neneng Sri; Nasser, Etty

    2017-01-01

The purpose of this study is to determine the effect of cash turnover, accounts receivable turnover, inventory turnover and leverage on financial performance (liquidity and rentability). The independent variables are cash turnover, accounts receivable turnover, inventory turnover and leverage. The dependent variables are financial performance (liquidity and rentability). The sample used in the study consists of manufacturing companies in the food and beverage sector for the period 2009-2012. The result sh...

  11. Leverage, Growth Opportunities and Firm Investment: The Case of Manufacturing Firms in China

    OpenAIRE

    Di Sheng; Shuyang Hou

    2014-01-01

This paper examined the impact of financial leverage on the investment decisions of firms using panel data of publicly traded Chinese firms. We collected data for 511 manufacturing companies during the period from 2005 to 2013 to do the research. The data show that financial leverage is negatively correlated with a firm’s investment. Moreover, after we categorized the data into two types, 1) high-growth firms and 2) low-growth firms, it demonstrated that such negative correlation is significant fo...

  12. On predictability of rare events leveraging social media: a machine learning perspective

    OpenAIRE

    Le, Lei; Ferrara, Emilio; Flammini, Alessandro

    2015-01-01

Information extracted from social media streams has been leveraged to forecast the outcome of a large number of real-world events, from political elections to stock market fluctuations. An increasing amount of studies demonstrates how the analysis of social media conversations provides cheap access to the wisdom of the crowd. However, the extents and contexts in which such forecasting power can be effectively leveraged have not been verified in a systematic way. It is also unclear how soci...

  13. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    OpenAIRE

    Yanhui Xi; Hui Peng; Yemei Qin

    2016-01-01

    The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumed a negative correlation in the errors between the price/return innovation and the volatility innovation....

  14. Leverage effect, economic policy uncertainty and realized volatility with regime switching

    Science.gov (United States)

    Duan, Yinying; Chen, Wang; Zeng, Qing; Liu, Zhicao

    2018-03-01

    In this study, we first investigate the impacts of leverage effect and economic policy uncertainty (EPU) on future volatility in the framework of regime switching. Out-of-sample results show that the HAR-RV including the leverage effect and economic policy uncertainty with regimes can achieve higher forecast accuracy than RV-type and GARCH-class models. Our robustness results further imply that these factors in the framework of regime switching can substantially improve the HAR-RV's forecast performance.
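As a rough illustration of the HAR-RV-with-leverage regression underlying the models compared above: the sketch below is single-regime, and the series, coefficients, and the simple lagged negative-return leverage term are all invented; the paper's actual specification additionally includes EPU and regime switching.

```python
import numpy as np

# Illustrative (single-regime) HAR-RV regression with a leverage term.
# All data here are synthetic; this only sketches the regressor design.

rng = np.random.default_rng(0)
T = 600
r = rng.normal(0, 1, T)               # hypothetical daily returns
rv = 0.5 + 0.1 * rng.random(T)        # hypothetical realized-variance series
for t in range(1, T):                 # induce mild persistence
    rv[t] = 0.2 + 0.6 * rv[t - 1] + 0.05 * rng.random()

def har_design(rv, r, lags=(1, 5, 22)):
    """Build HAR regressors (daily, weekly, monthly RV averages) plus a
    leverage term min(r, 0) lagged one day."""
    start = max(lags)
    rows = []
    for t in range(start, len(rv)):
        daily = rv[t - lags[0]]
        weekly = rv[t - lags[1]:t].mean()
        monthly = rv[t - lags[2]:t].mean()
        leverage = min(r[t - 1], 0.0)  # negative returns raise future volatility
        rows.append([1.0, daily, weekly, monthly, leverage])
    return np.array(rows), rv[start:]

X, y = har_design(rv, r)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit of the HAR equation
forecast = X[-1] @ beta                        # one-step-ahead in-sample fit
print(beta.shape, float(forecast) > 0)
```

The regime-switching extension would estimate a separate coefficient vector per regime and mix the forecasts by regime probabilities.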

  15. LEVERAGING SUSTAINABILITY AS BUDGETARY RESOURCES THROUGH FINANCIAL LAW INSTRUMENTS

    Directory of Open Access Journals (Sweden)

    IONEL BOSTAN

    2016-02-01

Full Text Available Through this approach we aim to examine the basic legal norms in the field of taxation - the Tax Code and the Fiscal Procedure Code - in terms of their potential to confer sustainability on public financial resources. After presenting some considerations regarding the sustainability of fiscal resources, highlighting the taxation-development relationship, we turn to the first unification of tax laws in the context of Romania's market economy (2003), marked by the adoption of the Tax Code by law and the legislation of tax procedure by government ordinance, and then to the issues of public finance sustainability. We focus on the massive renewal of the provisions of the Tax Code, which took place in 2015, and treat the related problems of sustainability, prudence, predictability and efficiency as imperatives contained in the Fiscal Responsibility Law. Creating the premises for the predictability of the tax system and the continuation of sustainable fiscal consolidation, through the rewriting of the Tax Code and the re-systematization of the rules of fiscal procedure, is prominently presented in this paper (Part Two), which finally reveals the economic impact of rewriting the codes in the tax area.

  16. Reappraising the concept of massive transfusion in trauma

    DEFF Research Database (Denmark)

    Stanworth, Simon J; Morris, Timothy P; Gaarder, Christine

    2010-01-01

ABSTRACT: INTRODUCTION: The massive-transfusion concept was introduced to recognize the dilutional complications resulting from large volumes of packed red blood cells (PRBCs). Definitions of massive transfusion vary and lack supporting clinical evidence. Damage-control resuscitation regimens...... of modern trauma care are targeted to the early correction of acute traumatic coagulopathy. The aim of this study was to identify a clinically relevant definition of trauma massive transfusion based on clinical outcomes. We also examined whether the concept was useful in that early prediction of massive...... transfusion as a concept in trauma has limited utility, and emphasis should be placed on identifying patients with massive hemorrhage and acute traumatic coagulopathy....

  17. Thermodynamics inducing massive particles' tunneling and cosmic censorship

    International Nuclear Information System (INIS)

    Zhang, Baocheng; Cai, Qing-yu; Zhan, Ming-sheng

    2010-01-01

By calculating the change of entropy, we prove that the first law of black hole thermodynamics leads to the tunneling probability of massive particles through the horizon, including the tunneling probability of massive charged particles from the Reissner-Nordstroem black hole and the Kerr-Newman black hole. Notably, we find that the trajectories of massive particles are close to those of massless particles near the horizon, although the trajectories of massive charged particles may be affected by electromagnetic forces. We show that Hawking radiation as massive particles tunneling does not lead to violation of the weak cosmic-censorship conjecture. (orig.)
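In the Parikh-Wilczek tunneling picture underlying this kind of result, the connection between the emission probability and the entropy change takes the standard form below, where ω, q, and j denote the energy, charge, and angular momentum carried off by the tunneling particle:

```latex
% Tunneling probability from the first law / entropy change
\Gamma \sim e^{\Delta S_{\mathrm{BH}}}, \qquad
\Delta S_{\mathrm{BH}} = S_{\mathrm{BH}}(M-\omega,\, Q-q,\, J-j)
                       - S_{\mathrm{BH}}(M,\, Q,\, J)
```

Since ΔS_BH < 0 for emission, the probability is exponentially suppressed, and demanding ΔS_BH remain well defined is what ties the tunneling rate to the cosmic-censorship discussion above.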

  18. Thermodynamics inducing massive particles' tunneling and cosmic censorship

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Baocheng [Chinese Academy of Sciences, State Key Laboratory of Magnetic Resonances and Atomic and Molecular Physics, Wuhan Institute of Physics and Mathematics, Wuhan (China); Graduate University of Chinese Academy of Sciences, Beijing (China); Cai, Qing-yu [Chinese Academy of Sciences, State Key Laboratory of Magnetic Resonances and Atomic and Molecular Physics, Wuhan Institute of Physics and Mathematics, Wuhan (China); Zhan, Ming-sheng [Chinese Academy of Sciences, State Key Laboratory of Magnetic Resonances and Atomic and Molecular Physics, Wuhan Institute of Physics and Mathematics, Wuhan (China); Chinese Academy of Sciences, Center for Cold Atom Physics, Wuhan (China)

    2010-08-15

By calculating the change of entropy, we prove that the first law of black hole thermodynamics leads to the tunneling probability of massive particles through the horizon, including the tunneling probability of massive charged particles from the Reissner-Nordstroem black hole and the Kerr-Newman black hole. Notably, we find that the trajectories of massive particles are close to those of massless particles near the horizon, although the trajectories of massive charged particles may be affected by electromagnetic forces. We show that Hawking radiation as massive particles tunneling does not lead to violation of the weak cosmic-censorship conjecture. (orig.)

  19. INTERDEPENDENT ANALYSIS OF LEVERAGE, DIVIDEND, AND MANAGERIAL OWNERSHIP POLICIES: Agencies Perspectives

    Directory of Open Access Journals (Sweden)

    Wibisono Hardjopranoto

    2006-06-01

Full Text Available This paper attempts to investigate the interdependent mechanism among leverage, dividend, and managerial ownership policies. It considers firm size and economic conditions to control for their effects on the relationships among the three policies. The interrelationships between leverage, dividend, and managerial ownership policies are tested using two-stage least squares. Five exogenous variables are employed in the simultaneous equations: current assets and asset structure as leverage determinants, book-to-market and return on investment as dividend determinants, and relative return to risk as a managerial ownership determinant. The research employs data from 1994-2004, with 1,717 firm-years. The research findings can be summarised as follows. First, there is a negative relationship between managerial ownership and leverage policies, as suggested by agency theory. Second, there is a relationship between managerial ownership and dividend policies, but the relationship between leverage and dividend is insignificant. Third, the relationship between leverage and dividend is insensitive to economic conditions and firm size. Fourth, all exogenous variables have significant effects on the endogenous variables, except relative return. Fifth, the effects of the exogenous variables are not sensitive to the control variables. Sixth, we find that managers show self-interested behaviour by reducing managerial ownership when economic conditions worsen.
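A minimal sketch of the two-stage least squares estimator named above, run on synthetic data. The variable names, instruments, and coefficients are invented and do not reproduce the paper's five-equation system; the sketch only shows why 2SLS corrects the endogeneity bias that plain OLS suffers:

```python
import numpy as np

# Two-stage least squares (2SLS) on synthetic data. A common shock u
# makes the regressor endogenous, so OLS is biased; instrumenting with
# the exogenous z1, z2 recovers the structural coefficient (-0.8 here).

rng = np.random.default_rng(42)
n = 1000
z1, z2 = rng.normal(size=n), rng.normal(size=n)   # exogenous instruments
u = rng.normal(size=n)                            # common shock -> endogeneity
ownership = 0.5 * z1 + 0.3 * z2 + u + rng.normal(size=n)
leverage = 1.0 - 0.8 * ownership + 2.0 * u + rng.normal(size=n)

def ols(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Stage 1: project the endogenous regressor onto the instruments.
Z = np.column_stack([np.ones(n), z1, z2])
ownership_hat = Z @ ols(Z, ownership)

# Stage 2: regress the outcome on the fitted values from stage 1.
X2 = np.column_stack([np.ones(n), ownership_hat])
beta_2sls = ols(X2, leverage)

# Naive OLS for comparison: biased upward by the common shock u.
beta_ols = ols(np.column_stack([np.ones(n), ownership]), leverage)
print(beta_2sls[1], beta_ols[1])
```

In the paper's setting each endogenous policy variable gets its own first-stage equation, with the five exogenous determinants listed above serving as the instruments.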

  20. Leveraging Health Information Technology to Improve Quality in Federal Healthcare.

    Science.gov (United States)

    Weigel, Fred K; Switaj, Timothy L; Hamilton, Jessica

    2015-01-01

Healthcare delivery in America is extremely complex because it comprises a fragmented and nonsystematic mix of stakeholders, components, and processes. Within the US healthcare structure, the federal healthcare system is poised to lead American medicine in leveraging health information technology to improve the quality of healthcare. We posit that through developing, adopting, and refining health information technology, the federal healthcare system has the potential to transform federal healthcare quality by managing the complexities associated with healthcare delivery. Although federal mandates have spurred the widespread use of electronic health records, other beneficial technologies have yet to be adopted in federal healthcare settings. The use of health information technology is fundamental to providing the highest-quality, safest healthcare possible. In addition, health information technology is valuable in achieving the Agency for Healthcare Research and Quality's implementation goals. We conducted a comprehensive literature search using the Google Scholar, PubMed, and Cochrane databases to identify an initial list of articles. Through a thorough review of the titles and abstracts, we identified 42 articles as having relevance to health information technology and quality. Through our exclusion criteria of currency of the article, citation frequency, applicability to the federal health system, and quality of research supporting conclusions, we refined the list to 11 references from which we performed our analysis. The literature shows that the use of computerized physician order entry has significantly increased accurate medication dosing and decreased medication errors. The use of clinical decision support systems has significantly increased physician adherence to guidelines, although there is little evidence of any significant correlation with patient outcomes. Research shows that interoperability and usability are continuing challenges for

  1. Perspective: Memcomputing: Leveraging memory and physics to compute efficiently

    Science.gov (United States)

    Di Ventra, Massimiliano; Traversa, Fabio L.

    2018-05-01

    It is well known that physical phenomena may be of great help in computing some difficult problems efficiently. A typical example is prime factorization that may be solved in polynomial time by exploiting quantum entanglement on a quantum computer. There are, however, other types of (non-quantum) physical properties that one may leverage to compute efficiently a wide range of hard problems. In this perspective, we discuss how to employ one such property, memory (time non-locality), in a novel physics-based approach to computation: Memcomputing. In particular, we focus on digital memcomputing machines (DMMs) that are scalable. DMMs can be realized with non-linear dynamical systems with memory. The latter property allows the realization of a new type of Boolean logic, one that is self-organizing. Self-organizing logic gates are "terminal-agnostic," namely, they do not distinguish between the input and output terminals. When appropriately assembled to represent a given combinatorial/optimization problem, the corresponding self-organizing circuit converges to the equilibrium points that express the solutions of the problem at hand. In doing so, DMMs take advantage of the long-range order that develops during the transient dynamics. This collective dynamical behavior, reminiscent of a phase transition, or even the "edge of chaos," is mediated by families of classical trajectories (instantons) that connect critical points of increasing stability in the system's phase space. The topological character of the solution search renders DMMs robust against noise and structural disorder. Since DMMs are non-quantum systems described by ordinary differential equations, not only can they be built in hardware with the available technology, they can also be simulated efficiently on modern classical computers. As an example, we will show the polynomial-time solution of the subset-sum problem for the worst cases, and point to other types of hard problems where simulations of DMMs

  2. The MASSIVE survey. I. A volume-limited integral-field spectroscopic study of the most massive early-type galaxies within 108 Mpc

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Chung-Pei [Department of Astronomy, University of California, Berkeley, CA 94720 (United States); Greene, Jenny E.; Murphy, Jeremy D. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); McConnell, Nicholas [Institute for Astronomy, University of Hawaii at Manoa, Honolulu, HI 96822 (United States); Janish, Ryan [Department of Physics, University of California, Berkeley, CA 94720 (United States); Blakeslee, John P. [Dominion Astrophysical Observatory, NRC Herzberg Institute of Astrophysics, Victoria, BC V9E 2E7 (Canada); Thomas, Jens, E-mail: cpma@berkeley.edu [Max Planck-Institute for Extraterrestrial Physics, Giessenbachstr. 1, D-85741 Garching (Germany)

    2014-11-10

Massive early-type galaxies represent the modern day remnants of the earliest major star formation episodes in the history of the universe. These galaxies are central to our understanding of the evolution of cosmic structure, stellar populations, and supermassive black holes, but the details of their complex formation histories remain uncertain. To address this situation, we have initiated the MASSIVE Survey, a volume-limited, multi-wavelength, integral-field spectroscopic (IFS) and photometric survey of the structure and dynamics of the ∼100 most massive early-type galaxies within a distance of 108 Mpc. This survey probes a stellar mass range M* ≳ 10^11.5 M☉ and diverse galaxy environments that have not been systematically studied to date. Our wide-field IFS data cover about two effective radii of individual galaxies, and for a subset of them, we are acquiring additional IFS observations on sub-arcsecond scales with adaptive optics. We are also acquiring deep K-band imaging to trace the extended halos of the galaxies and measure accurate total magnitudes. Dynamical orbit modeling of the combined data will allow us to simultaneously determine the stellar, black hole, and dark matter halo masses. The primary goals of the project are to constrain the black hole scaling relations at high masses, investigate systematically the stellar initial mass function and dark matter distribution in massive galaxies, and probe the late-time assembly of ellipticals through stellar population and kinematical gradients. In this paper, we describe the MASSIVE sample selection, discuss the distinct demographics and structural and environmental properties of the selected galaxies, and provide an overview of our basic observational program, science goals and early survey results.

  3. Revealing evolved massive stars with Spitzer

    Science.gov (United States)

    Gvaramadze, V. V.; Kniazev, A. Y.; Fabrika, S.

    2010-06-01

    Massive evolved stars lose a large fraction of their mass via copious stellar wind or instant outbursts. During certain evolutionary phases, they can be identified by the presence of their circumstellar nebulae. In this paper, we present the results of a search for compact nebulae (reminiscent of circumstellar nebulae around evolved massive stars) using archival 24-μm data obtained with the Multiband Imaging Photometer for Spitzer. We have discovered 115 nebulae, most of which bear a striking resemblance to the circumstellar nebulae associated with luminous blue variables (LBVs) and late WN-type (WNL) Wolf-Rayet (WR) stars in the Milky Way and the Large Magellanic Cloud (LMC). We interpret this similarity as an indication that the central stars of detected nebulae are either LBVs or related evolved massive stars. Our interpretation is supported by follow-up spectroscopy of two dozen of these central stars, most of which turn out to be either candidate LBVs (cLBVs), blue supergiants or WNL stars. We expect that the forthcoming spectroscopy of the remaining objects from our list, accompanied by the spectrophotometric monitoring of the already discovered cLBVs, will further increase the known population of Galactic LBVs. This, in turn, will have profound consequences for better understanding the LBV phenomenon and its role in the transition between hydrogen-burning O stars and helium-burning WR stars. We also report on the detection of an arc-like structure attached to the cLBV HD 326823 and an arc associated with the LBV R99 (HD 269445) in the LMC. Partially based on observations collected at the German-Spanish Astronomical Centre, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC). E-mail: vgvaram@mx.iki.rssi.ru (VVG); akniazev@saao.ac.za (AYK); fabrika@sao.ru (SF)

  4. A Massively Parallel Code for Polarization Calculations

    Science.gov (United States)

    Akiyama, Shizuka; Höflich, Peter

    2001-03-01

    We present an implementation of our Monte Carlo radiation transport method for rapidly expanding NLTE atmospheres on massively parallel computers, which utilizes both the distributed and shared memory models. This allows us to take full advantage of the fast communication and low latency inherent to nodes with multiple CPUs, and to stretch the limits of scalability with the number of nodes compared to a version based solely on the shared memory model. Test calculations on a local 20-node Beowulf cluster with dual CPUs showed an improved scalability of about 40%.
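The split-and-reduce pattern behind such parallel Monte Carlo codes can be sketched in miniature. The example below is a toy analogue only (estimating π rather than transporting photon packets), and uses Python threads as a stand-in for the hybrid distributed/shared-memory layout the abstract describes; a production code would use MPI across nodes plus threads within each node. All names here are illustrative.

```python
import math
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def worker(args):
    """One worker: process its own batch of random samples."""
    seed, n = args
    rng = np.random.default_rng(seed)        # independent stream per worker
    x, y = rng.random(n), rng.random(n)
    return int(np.count_nonzero(x * x + y * y < 1.0))

def parallel_mc_pi(n_workers=4, samples_per_worker=100_000):
    """Split the Monte Carlo workload across workers, then reduce."""
    tasks = [(seed, samples_per_worker) for seed in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        hits = sum(pool.map(worker, tasks))
    return 4.0 * hits / (n_workers * samples_per_worker)
```

Seeding each worker separately keeps the random streams independent and the reduced result reproducible, which is the same bookkeeping a distributed transport code must do per node.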

  5. Deflection of massive neutrinos by gravitational fields

    International Nuclear Information System (INIS)

    Fargion, D.

    1981-01-01

    The curvature undergone by massive neutrino trajectories, passing by a mass M at a distance b from the center of a body, is examined. Calculations led to the following angle of deflection: δ = (2GM / (b β∞² c²)) (1 + β∞²), where β∞ is the dimensionless velocity of the particle at infinity. The ultrarelativistic limit (β∞ = 1) coincides with the usual massless deflection. Physical consequences are considered. (author)
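As a quick numerical sanity check on the formula as reconstructed above, a few lines of Python confirm its two limits: β∞ = 1 recovers the familiar massless light-bending value 4GM/(b c²), and β∞ ≪ 1 approaches the Newtonian result 2GM/(b β∞² c²). Symbol and function names are mine, not the paper's.

```python
def deflection(G, M, b, beta_inf, c=1.0):
    """Deflection angle: delta = (2 G M / (b beta^2 c^2)) * (1 + beta^2)."""
    return (2.0 * G * M / (b * beta_inf**2 * c**2)) * (1.0 + beta_inf**2)
```

In geometrized units (G = M = b = c = 1), the ultrarelativistic case gives exactly 4, the standard light deflection, while a slow particle's deflection is dominated by the Newtonian 2GM/(b v²) term.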

  6. Body contouring following massive weight loss

    Directory of Open Access Journals (Sweden)

    Vijay Langer

    2011-01-01

    Obesity is a global disease of epidemic proportions. Bariatric surgery or modified lifestyles go a long way toward mitigating the vast weight gain. Patients following these interventions usually undergo massive weight loss. This results in redundant tissues in various parts of the body. Loose skin causes increased morbidity and psychological trauma. This demands various body contouring procedures, which are usually excisional. These procedures are complex and part of a painstaking process that needs a committed patient and an industrious plastic surgeon. As complications in these patients can be quite frequent, both the patient and the surgeon need to be aware of them and willing to deal with them.

  7. Non-Pauli-Fierz Massive Gravitons

    International Nuclear Information System (INIS)

    Dvali, Gia; Pujolas, Oriol; Redi, Michele

    2008-01-01

    We study general Lorentz invariant theories of massive gravitons. We show that, contrary to the standard lore, there exist consistent theories where the graviton mass term violates the Pauli-Fierz structure. For theories where the graviton is a resonance, this does not imply the existence of a scalar ghost if the deviation from the Pauli-Fierz structure becomes sufficiently small at high energies. Mass terms of this type are required by any consistent realization of the Dvali-Gabadadze-Porrati model in higher dimensions.

  8. Massive Preperitoneal Hematoma after a Subcutaneous Injection

    Directory of Open Access Journals (Sweden)

    Hideki Katagiri

    2016-01-01

    Preperitoneal hematomas are rare and can develop after surgery or trauma. A 74-year-old woman, receiving systemic anticoagulation, developed a massive preperitoneal hematoma after a subcutaneous injection of teriparatide using a 32-gauge, 4 mm needle. In this patient, two factors, the subcutaneous injection of teriparatide and systemic anticoagulation, were associated with development of the hematoma. These two factors are especially significant because they are widely used clinically. Although extremely rare, physicians must consider this potentially life-threatening complication after subcutaneous injections, especially in patients receiving anticoagulation.

  9. Hadroproduction of massive lepton pairs and QCD

    International Nuclear Information System (INIS)

    Berger, E.L.

    1979-04-01

    A survey is presented of some current issues of interest in attempts to describe the production of massive lepton pairs in hadronic collisions at high energies. I concentrate on the interpretation of data in terms of the parton model and on predictions derived from quantum-chromodynamics (QCD), their reliability and their confrontation with experiment. Among topics treated are the connection with deep-inelastic lepton scattering, universality of structure functions, and the behavior of cross-sections as a function of transverse momentum

  10. Discovery of massive neutral vector mesons

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    Personal accounts of the discovery of massive neutral vector mesons (psi particles) are given by researchers S. Ting, G. Goldhaber, and B. Richter. The double-arm spectrometer and the Cherenkov effect are explained in a technical note, and the solenoidal magnetic detector is discussed in an explanatory note for nonspecialists. Reprints of three papers in Physical Review Letters which announced the discovery of the particles are given: Experimental observation of a heavy particle J, Discovery of a narrow resonance in e⁺e⁻ annihilation, and Discovery of a second narrow resonance in e⁺e⁻ annihilation. A discussion of subsequent developments and scientific biographies of the three authors are also presented. 25 figures

  11. Monopole Solutions in Topologically Massive Gauge Theory

    International Nuclear Information System (INIS)

    Teh, Rosy; Wong, Khai-Ming; Koh, Pin-Wai

    2010-01-01

    Monopoles in topologically massive SU(2) Yang-Mills-Higgs gauge theory in 2+1 dimensions with a Chern-Simons mass term were studied by Pisarski some years ago. He argued that there is a monopole solution that is regular everywhere, but found that it does not possess finite action; no exact or numerical solutions were presented. Hence it is our purpose to investigate this solution in more detail. We obtained regular numerical solutions that smoothly interpolate between the behavior at small and large distances, for different values of the Chern-Simons term strength and for several fixed values of the Higgs field strength.

  12. Massively parallel Fokker-Planck code ALLAp

    International Nuclear Information System (INIS)

    Batishcheva, A.A.; Krasheninnikov, S.I.; Craddock, G.G.; Djordjevic, V.

    1996-01-01

    The Fokker-Planck code ALLA, recently developed for workstations, simulates the temporal evolution of 1V, 2V and 1D2V collisional edge plasmas. In this work we present the results of parallelizing the code on the CRI T3D massively parallel platform (the ALLAp version). We also benchmark the 1D2V parallel version against an analytic self-similar solution of the collisional kinetic equation. This test is not trivial, as it demands a very strong spatial temperature and density variation within the simulation domain. (orig.)

  13. Massive Asynchronous Parallelization of Sparse Matrix Factorizations

    Energy Technology Data Exchange (ETDEWEB)

    Chow, Edmond [Georgia Inst. of Technology, Atlanta, GA (United States)

    2018-01-08

    Solving sparse problems is at the core of many DOE computational science applications. We focus on the challenge of developing sparse algorithms that can fully exploit the parallelism in extreme-scale computing systems, in particular systems with massive numbers of cores per node. Our approach is to express a sparse matrix factorization as a large number of bilinear constraint equations, and then to solve these equations via an asynchronous iterative method. The unknowns in these equations are the entries of the desired factorization.
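A minimal serial sketch of this idea, patterned on the fine-grained iterative ILU work associated with Chow's group: each entry of L and U carries one bilinear constraint a_ij = Σ_k l_ik u_kj, and all constraints are relaxed in Jacobi-style sweeps that read only the previous iterate, so every update within a sweep is independent (hence parallelizable or asynchronous). This is an illustrative reconstruction, not the project's actual code, and it uses a dense nonzero pattern for brevity.

```python
import numpy as np

def fixed_point_lu(A, sweeps=10):
    """Relax the bilinear constraints a_ij = sum_k l_ik u_kj toward A = L U.

    Each L/U entry is updated from the previous iterate only, mimicking
    asynchronous fine-grained updates; a sparse code would loop over the
    nonzero pattern instead of all (i, j).
    """
    n = A.shape[0]
    L = np.tril(A, -1) / np.diag(A)      # initial guess l_ij ~ a_ij / a_jj
    np.fill_diagonal(L, 1.0)
    U = np.triu(A).astype(float)
    for _ in range(sweeps):
        L_old, U_old = L.copy(), U.copy()
        for i in range(n):
            for j in range(n):
                m = min(i, j)
                s = L_old[i, :m] @ U_old[:m, j]
                if i > j:                            # strictly lower: solve for l_ij
                    L[i, j] = (A[i, j] - s) / U_old[j, j]
                else:                                # upper incl. diagonal: u_ij
                    U[i, j] = A[i, j] - s
    return L, U
```

Because each unknown depends only on unknowns with smaller summation index k, the sweeps converge level by level; on a well-conditioned (e.g. diagonally dominant) matrix a handful of sweeps reproduces the factorization.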

  14. The Black Hole Radiation in Massive Gravity

    Directory of Open Access Journals (Sweden)

    Ivan Arraut

    2018-02-01

    We apply the Bogoliubov transformations in order to connect two different vacua, one located at past infinity and the other located at future infinity, around a black hole in the scenario of the nonlinear theory of massive gravity. The presence of the extra degrees of freedom changes the behavior of the logarithmic singularity and, as a consequence, the relation between the two Bogoliubov coefficients. This has an effect on the number of particles or, equivalently, on the black hole temperature perceived by observers defining the time arbitrarily.

  15. Massive runaway stars in the Large Magellanic Cloud

    Science.gov (United States)

    Gvaramadze, V. V.; Kroupa, P.; Pflamm-Altenburg, J.

    2010-09-01

    The origin of massive field stars in the Large Magellanic Cloud (LMC) has long been an enigma. The recent measurements of large offsets (~100 km s⁻¹) between the heliocentric radial velocities of some very massive (O2-type) field stars and the systemic LMC velocity provide a possible explanation of this enigma and suggest that the field stars are runaway stars ejected from their birthplaces at the very beginning of their parent cluster's dynamical evolution. A straightforward way to prove this explanation is to measure the proper motions of the field stars and to show that they are moving away from one of the nearby star clusters or OB associations. This approach is, however, complicated by the long distance to the LMC, which makes accurate proper motion measurements difficult. We used an alternative approach for solving the problem (first applied to Galactic field stars), based on the search for bow shocks produced by runaway stars. The geometry of detected bow shocks would allow us to infer the direction of stellar motion, thereby determining their possible parent clusters. In this paper we present the results of a search for bow shocks around six massive field stars that have been proposed as candidate runaway stars. Using archival Spitzer Space Telescope data, we found a bow shock associated with one of our programme stars, the O2 V((f*)) star BI 237, which is the first-ever detection of bow shocks in the LMC. The orientation of the bow shock suggests that BI 237 was ejected from the OB association LH 82 (located at ≃120 pc in projection from the star). A by-product of our search is the detection of bow shocks generated by four OB stars in the field of the LMC and an arc-like structure attached to the candidate luminous blue variable R81 (HD 269128). The geometry of two of these bow shocks is consistent with the possibility that their associated stars were ejected from the 30 Doradus star-forming complex. We discuss implications of our findings for the

  16. Point of no return. The massive climate threats we must avoid

    Energy Technology Data Exchange (ETDEWEB)

    Voorhar, R.; Myllyvirta, L.

    2013-01-15

    The world is quickly reaching a point of no return for preventing the worst impacts of climate change. With total disregard for this unfolding global disaster, the fossil fuel industry is planning 14 massive coal, oil and gas projects that would produce as much new carbon dioxide (CO2) emissions in 2020 as the entire US, and delay action on climate change for more than a decade.

  17. Focused on the prize: Characteristics of experts in massive multiplayer online games

    OpenAIRE

    Wang, Jing; Huffaker, David A.; Treem, Jeffrey W.; Fullerton, Lindsay; Ahmad, Muhammad A.; Williams, Dmitri; Poole, Marshall Scott; Contractor, Noshir

    2011-01-01

    This study is the first large-scale multi-method attempt to empirically examine the characteristics leading to development of expertise in EverQuest II, a popular massively multi-player online role-playing game (MMO). Benefiting from the unprecedented opportunity of obtaining game log data matched with survey data, the project investigated the relationship between player motivations and in-game behavior, personality characteristics, and demographic attributes wi...

  18. Massive stars and miniature robots: today's research and tomorrow's technologies

    Science.gov (United States)

    Taylor, William David

    2013-03-01

    This thesis documents the reduction of the VLT-FLAMES Tarantula Survey (VFTS) data set, whilst also describing the analysis of one of the serendipitous discoveries: the massive binary R139. This high-mass binary will provide an excellent future calibration point for stellar models, in part as it seems to defy certain expectations about its evolution. Outwith the VFTS, a search for binary companions around a trio of B-type supergiants is presented. These stars are surrounded by nebulae that closely resemble the triple-ring structure associated with the poorly-understood SN1987A. Do these stars share a similar evolutionary fate? While strong evidence is found for periodic pulsations in one of the stars, there appears to be no indication of the short-period binary companion suggested in the literature. Gathering observations from a wide range of environments builds a fuller picture of massive stars, but the samples remain somewhat limited. The coming generation of extremely large telescopes will open new regions for studies like the VFTS. Fully utilising these remarkable telescopes will require many new technologies, and this thesis presents one such development project. For adaptive-optics corrected, multi-object instruments it will be necessary to position small pick-off mirrors in the telescope's focal plane to select the sub-fields on the sky. This could be most efficiently achieved if the mirrors were self-propelled, which has led to a miniature robot project called MAPS - the Micro Autonomous Positioning System. A number of robots have been built with a footprint of only 30 × 30 mm. These wirelessly-controlled robots draw their power from the floor on which they operate and have shown the potential to be positioned to an accuracy of tens of microns. This thesis details much of the early design work and testing of the robots, and also the development of the camera imaging system used to determine the position of the robots. The MAPS project is ongoing and a

  19. The MPPC project

    International Nuclear Information System (INIS)

    Rohrbach, F.

    1993-01-01

    We report on the work done in massively parallel processing with a view to studying possible solutions for extracting interesting high-energy physics particle events at future high-luminosity hadronic colliders operating in the TeV energy domain. We concentrate on a special Single Instruction Multiple Data (SIMD) architecture: the Associative String Processor (ASP). The Massively Parallel Processing Collaboration (MPPC) Project, grouping nine European institutes, was launched by CERN to carry out this RandD programme. This report, written by partners of the MPPC collaboration, describes the main results achieved at the end of the project: construction of ASP machines, parallel software development and application studies in high-energy physics and in other fields of science. A final, positive assessment of the ASP concept has been made by the Collaboration. (orig.)

  20. Leveraging International Cooperation Acquisition Opportunities for the Department of Defense

    Science.gov (United States)

    2014-09-01

    Comparative cost/benefit and risk analysis and SWOT. Research needed: examples of U.S. DOD domestic projects and programs that have included ... for International Cooperation and how that will benefit or deter the domestic program offices' system acquisition efforts. Our research and analysis ...

  1. A NASA Strategy for Leveraging Emerging Launch Vehicles for Routine, Small Payload Missions

    Science.gov (United States)

    Underwood, Bruce E.

    2005-01-01

    Orbital flight opportunities for small payloads have always been few and far between, and then on February 1, 2002, the situation got worse. In the wake of the loss of the Columbia during STS-107, changing NASA missions and priorities led to the termination of the Shuttle Small Payloads Projects, including Get-Away Special, Hitchhiker, and Space Experiment Module. In spite of the limited opportunities, long queue, and restrictions associated with flying experiments on a man-rated transportation system, the carriers provided sustained, high-quality experiment services for education, science, and technology payloads, and were one of the few games in town. Attempts to establish routine opportunities aboard existing ELVs have been unsuccessful, as the cost-per-pound on small ELVs and conflicts with primary spacecraft on larger vehicles have proven prohibitive. This has led to a backlog of existing NASA-sponsored payloads and no prospects or plans for future opportunities within the NASA community. The prospects for breaking out of this paradigm appear promising as a result of NASA's partnership with DARPA in pursuit of low-cost, responsive small ELVs under the Falcon Program. Through this partnership several new small ELVs, providing 1000 lbs. to LEO, will be demonstrated in less than two years, with costs reasonable enough that NASA, DoD, and other sponsors can once again invest in small payload opportunities. Within NASA, planning has already begun. NASA will be populating one or more of the Falcon demonstration flights with small payloads that are already under development. To accommodate these experiments, Goddard's Wallops Flight Facility has been tasked to develop a multi-payload ejector (MPE) to accommodate the needs of these payloads. The MPE capabilities and design are described in detail in a separately submitted abstract.
    Beyond use of the demonstration flights, however, Goddard has already begun developing strategies to leverage these new ELVs

  2. Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.

    2011-12-01

    Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. Because lidar is a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), the open source libLAS and LASlib libraries provide OpenTopography's data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL).
OpenTopography has also developed our own algorithm for high-performance gridding of lidar point cloud data
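The core of a point-cloud gridding step like the one mentioned above can be sketched with plain NumPy: bin scattered (x, y, z) returns into regular cells and take the mean elevation per cell. This is a simplified stand-in, not OpenTopography's actual high-performance algorithm, and the function name is mine.

```python
import numpy as np

def grid_mean_elevation(x, y, z, cell=1.0):
    """Bin lidar returns into a regular grid; each cell holds the mean z."""
    ix = np.floor((x - x.min()) / cell).astype(int)
    iy = np.floor((y - y.min()) / cell).astype(int)
    grid_sum = np.zeros((iy.max() + 1, ix.max() + 1))
    grid_cnt = np.zeros_like(grid_sum)
    np.add.at(grid_sum, (iy, ix), z)   # unbuffered scatter-add per cell
    np.add.at(grid_cnt, (iy, ix), 1)
    with np.errstate(invalid="ignore"):
        return grid_sum / grid_cnt     # NaN marks cells with no returns
```

A production gridder would additionally weight returns by distance to the cell center and interpolate across empty cells, but the scatter-add pattern is the same.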

  3. Discovering Strategies to Improve Business Value in Outsourcing Projects

    NARCIS (Netherlands)

    Ponisio, Laura; van Eck, Pascal; Vruggink, P.

    2008-01-01

    This paper deals with the problem of leveraging client business value in a software development outsourcing relationship. We have observed software development projects from two different Dutch IT outsourcing companies and studied the approach they apply in their (successful) projects. The results

  4. Condensing Massive Satellite Datasets For Rapid Interactive Analysis

    Science.gov (United States)

    Grant, G.; Gallaher, D. W.; Lv, Q.; Campbell, G. G.; Fowler, C.; LIU, Q.; Chen, C.; Klucik, R.; McAllister, R. A.

    2015-12-01

    Our goal is to enable users to interactively analyze massive satellite datasets, identifying anomalous data or values that fall outside of thresholds. To achieve this, the project seeks to create a derived database containing only the most relevant information, accelerating the analysis process. The database is designed to be an ancillary tool for the researcher, not an archival database to replace the original data. This approach is aimed at improving performance by reducing the overall size by way of condensing the data. The primary challenges of the project include:
    - The nature of the research question(s) may not be known ahead of time.
    - The thresholds for determining anomalies may be uncertain.
    - Problems associated with processing cloudy, missing, or noisy satellite imagery.
    - The contents and method of creation of the condensed dataset must be easily explainable to users.
    The architecture of the database will reorganize spatially-oriented satellite imagery into temporally-oriented columns of data (a.k.a., "data rods") to facilitate time-series analysis. The database itself is an open-source parallel database, designed to make full use of clustered server technologies. A demonstration of the system capabilities will be shown. Applications for this technology include quick-look views of the data, as well as the potential for on-board satellite processing of essential information, with the goal of reducing data latency.
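The "data rods" reorganization described above can be sketched in a few lines: a (time, y, x) image stack becomes a matrix with one time-series row per pixel, on which per-series threshold tests are cheap. Function names are illustrative, not from the project.

```python
import numpy as np

def build_data_rods(cube):
    """Turn a (time, y, x) image stack into one time-series row per pixel."""
    t, ny, nx = cube.shape
    return cube.reshape(t, ny * nx).T    # row p is the series for pixel p

def flag_anomalous_rods(rods, lo, hi):
    """True for every pixel whose series ever leaves [lo, hi]."""
    return np.any((rods < lo) | (rods > hi), axis=1)
```

Storing the rods contiguously is what makes interactive time-series queries fast: a threshold scan touches each pixel's history as one sequential read instead of one read per image.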

  5. Enterprise SRS: Leveraging Ongoing Operations to Advance National Programs - 13108

    International Nuclear Information System (INIS)

    Marra, J.E.; Murray, A.M.; McGuire, P.W.; Wheeler, V.B.

    2013-01-01

    The SRS is re-purposing its vast array of assets to solve future national issues regarding environmental stewardship, national security, and clean energy. The vehicle for this transformation is Enterprise SRS, which presents a new, strategic view of SRS as a united endeavor for 'all things nuclear' as opposed to a group of distinct and separate entities with individual missions and organizations. Key among the Enterprise SRS strategic initiatives is the integration of research into facilities in conjunction with ongoing missions to provide researchers from other national laboratories, academic institutions, and commercial entities the opportunity to demonstrate their technologies in a relevant environment and scale prior to deployment. To manage that integration of research demonstrations into site facilities, the DOE Savannah River Operations Office, Savannah River Nuclear Solutions, and the Savannah River National Laboratory (SRNL) have established the Center for Applied Nuclear Materials Processing and Engineering Research (CANMPER). The key objective of this initiative is to bridge the gap between promising transformational nuclear materials management advancements and large-scale deployment of the technology by leveraging SRS assets (e.g. facilities, staff, and property) for those critical engineering-scale demonstrations necessary to assure the successful deployment of new technologies. CANMPER will coordinate the demonstration of R&D technologies and serve as the interface between the engineering-scale demonstration and the R&D programs, essentially providing cradle-to-grave support to the R&D team during the demonstration. While the initial focus of CANMPER will be on the effective use of SRS assets for these demonstrations, CANMPER also will work with research teams to identify opportunities to perform R&D demonstrations at other facilities.
    Unique to this approach is the fact that these SRS assets will continue to accomplish DOE's critical

  6. Leveraging High Performance Computing for Managing Large and Evolving Data Collections

    Directory of Open Access Journals (Sweden)

    Ritu Arora

    2014-10-01

    The process of developing a digital collection in the context of a research project often involves a pipeline pattern during which data growth, data types, and data authenticity need to be assessed iteratively in relation to the different research steps and in the interest of archiving. Throughout a project’s lifecycle curators organize newly generated data while cleaning and integrating legacy data when it exists, and deciding what data will be preserved for the long term. Although these actions should be part of a well-oiled data management workflow, there are practical challenges in doing so if the collection is very large and heterogeneous, or is accessed by several researchers contemporaneously. There is a need for data management solutions that can help curators with efficient and on-demand analyses of their collection so that they remain well-informed about its evolving characteristics. In this paper, we describe our efforts towards developing a workflow to leverage open science High Performance Computing (HPC) resources for routinely and efficiently conducting data management tasks on large collections. We demonstrate that HPC resources and techniques can significantly reduce the time for accomplishing critical data management tasks, and enable a dynamic archiving throughout the research process. We use a large archaeological data collection with a long and complex formation history as our test case. We share our experiences in adopting open science HPC resources for large-scale data management, which entails understanding usage of the open source HPC environment and training users. These experiences can be generalized to meet the needs of other data curators working with large collections.

  7. Leveraging biospecimen resources for discovery or validation of markers for early cancer detection.

    Science.gov (United States)

    Schully, Sheri D; Carrick, Danielle M; Mechanic, Leah E; Srivastava, Sudhir; Anderson, Garnet L; Baron, John A; Berg, Christine D; Cullen, Jennifer; Diamandis, Eleftherios P; Doria-Rose, V Paul; Goddard, Katrina A B; Hankinson, Susan E; Kushi, Lawrence H; Larson, Eric B; McShane, Lisa M; Schilsky, Richard L; Shak, Steven; Skates, Steven J; Urban, Nicole; Kramer, Barnett S; Khoury, Muin J; Ransohoff, David F

    2015-04-01

    Validation of early detection cancer biomarkers has proven to be disappointing when initial promising claims have often not been reproducible in diagnostic samples or did not extend to prediagnostic samples. The previously reported lack of rigorous internal validity (systematic differences between compared groups) and external validity (lack of generalizability beyond compared groups) may be effectively addressed by utilizing blood specimens and data collected within well-conducted cohort studies. Cohort studies with prediagnostic specimens (e.g., blood specimens collected prior to development of clinical symptoms) and clinical data have recently been used to assess the validity of some early detection biomarkers. With this background, the Division of Cancer Control and Population Sciences (DCCPS) and the Division of Cancer Prevention (DCP) of the National Cancer Institute (NCI) held a joint workshop in August 2013. The goal was to advance early detection cancer research by considering how the infrastructure of cohort studies that already exist or are being developed might be leveraged to include appropriate blood specimens, including prediagnostic specimens, ideally collected at periodic intervals, along with clinical data about symptom status and cancer diagnosis. Three overarching recommendations emerged from the discussions: 1) facilitate sharing of existing specimens and data, 2) encourage collaboration among scientists developing biomarkers and those conducting observational cohort studies or managing healthcare systems with cohorts followed over time, and 3) conduct pilot projects that identify and address key logistic and feasibility issues regarding how appropriate specimens and clinical data might be collected at reasonable effort and cost within existing or future cohorts. © Published by Oxford University Press 2015.

  8. Testing the Larson relations in massive clumps

    Science.gov (United States)

    Traficante, A.; Duarte-Cabral, A.; Elia, D.; Fuller, G. A.; Merello, M.; Molinari, S.; Peretto, N.; Schisano, E.; Di Giorgio, A.

    2018-06-01

    We tested the validity of the three Larson relations in a sample of 213 massive clumps selected from the Herschel infrared Galactic Plane (Hi-GAL) survey, also using data from the Millimetre Astronomy Legacy Team 90 GHz (MALT90) survey of 3-mm emission lines. The clumps are divided into five evolutionary stages so that we can also discuss the Larson relations as a function of evolution. We show that this ensemble does not follow the three Larson relations, regardless of the clump's evolutionary phase. A consequence of this breakdown is that the dependence of the virial parameter αvir on mass (and radius) is only a function of the gravitational energy, independent of the kinetic energy of the system; thus, αvir is not a good descriptor of clump dynamics. Our results suggest that clumps with clear signatures of infall motions are statistically indistinguishable from clumps with no such signatures. The observed non-thermal motions are not necessarily ascribed to turbulence acting to sustain the gravity, but they might be a result of the gravitational collapse at the clump scales. This seems to be particularly true for the most massive (M ≥ 1000 M⊙) clumps in the sample, where exceptionally high magnetic fields might not be enough to stabilize the collapse.
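For reference, the virial parameter discussed above is conventionally defined as αvir = 5σ²R/(GM) (Bertoldi & McKee 1992); the small helper below shows the arithmetic in the units typical of clump studies. The constant value and function are my own illustration, not taken from the paper.

```python
G_ASTRO = 4.30091e-3  # gravitational constant in pc (km/s)^2 / M_sun

def alpha_vir(mass_msun, radius_pc, sigma_kms):
    """Virial parameter alpha_vir = 5 sigma^2 R / (G M)."""
    return 5.0 * sigma_kms**2 * radius_pc / (G_ASTRO * mass_msun)
```

A clump of 1000 M⊙ with R = 1 pc and σ = 1 km/s gives αvir ≈ 1.2, i.e. roughly virialized; values well below 2 are commonly read as gravity-dominated, which is the diagnostic the abstract argues breaks down.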

  9. Planckian Interacting Massive Particles as Dark Matter.

    Science.gov (United States)

    Garny, Mathias; Sandora, McCullen; Sloth, Martin S

    2016-03-11

    The standard model could be self-consistent up to the Planck scale according to the present measurements of the Higgs boson mass and top quark Yukawa coupling. It is therefore possible that new physics is only coupled to the standard model through Planck suppressed higher dimensional operators. In this case the weakly interacting massive particle miracle is a mirage, and instead minimality as dictated by Occam's razor would indicate that dark matter is related to the Planck scale, where quantum gravity is anyway expected to manifest itself. Assuming within this framework that dark matter is a Planckian interacting massive particle, we show that the most natural mass larger than 0.01M_{p} is already ruled out by the absence of tensor modes in the cosmic microwave background (CMB). This also indicates that we expect tensor modes in the CMB to be observed soon for this type of minimal dark matter model. Finally, we touch upon the Kaluza-Klein graviton mode as a possible realization of this scenario within UV complete models, as well as further potential signatures and peculiar properties of this type of dark matter candidate. This paradigm therefore leads to a subtle connection between quantum gravity, the physics of primordial inflation, and the nature of dark matter.

  10. Massive neutrinos in almost-commutative geometry

    International Nuclear Information System (INIS)

    Stephan, Christoph A.

    2007-01-01

    In the noncommutative formulation of the standard model of particle physics by Chamseddine and Connes [Commun. Math. Phys. 182, 155 (1996), e-print hep-th/9606001], one of the three generations of fermions has to possess a massless neutrino. [C. P. Martin et al., Phys. Rep. 29, 363 (1998), e-print hep-th-9605001]. This formulation is consistent with neutrino oscillation experiments and the known bounds of the Pontecorvo-Maki-Nakagawa-Sakata matrix (PMNS matrix). But future experiments which may be able to detect neutrino masses directly and high-precision measurements of the PMNS matrix might need massive neutrinos in all three generations. In this paper we present an almost-commutative geometry which allows for a standard model with massive neutrinos in all three generations. This model does not follow in a straightforward way from the version of Chamseddine and Connes since it requires an internal algebra with four summands of matrix algebras, instead of three summands for the model with one massless neutrino

  11. MASSIVE PLEURAL EFFUSION: A CASE REPORT

    Directory of Open Access Journals (Sweden)

    Putu Bayu Dian Tresna Dewi

    2013-03-01

Pleural effusion is an abnormal accumulation of fluid, either transudate or exudate, within the pleural cavity between the parietal and visceral pleura. A 47-year-old female presented with dyspnea, cough, and decreased appetite. She had a history of a right lung tumor. Physical examination revealed asymmetric chest movement with the right side lagging during breathing, decreased vocal fremitus over the right chest, dullness to percussion on the right, decreased vesicular breath sounds on the right, enlargement of the supraclavicular and right cervical lymph nodes, and hepatomegaly. A complete blood count showed leukocytosis. Clinical chemistry showed hypoalbuminemia and decreased liver function. Blood gas analysis showed hypoxemia. Pleural fluid analysis showed an exudate: a murky red fluid with abundant erythrocytes and a high cell count. Cytological examination showed a non-small cell carcinoma, favoring the adeno type. Chest X-ray showed a massive right pleural effusion. Based on the history, physical examination, and investigations, she was diagnosed with massive pleural effusion due to suspected malignancy. She underwent pleural fluid evacuation and was treated with analgesics and antibiotics.

  12. Massive clot formation after tooth extraction

    Directory of Open Access Journals (Sweden)

    Santosh Hunasgi

    2015-01-01

Oral surgical procedures, mainly tooth extraction, can be associated with prolonged hemorrhage owing to the nature of the procedure, which leaves an "open wound." The aim of this paper is to present a case of massive postoperative clot formation after tooth extraction and to highlight the oral complications of surgical procedures. A 32-year-old male patient reported to the dental clinic for evaluation and extraction of a grossly decayed tooth 46. Clinical evaluation of 46 revealed root stumps. Extraction of the root stumps was performed and was uneventful. Hemostasis was achieved and postsurgical instructions were given to the patient. The patient returned to the clinic the very next morning with a complaint of bleeding at the extraction site. On clinical examination, bleeding was noted from the socket in relation to 46. To control the bleeding, the oral hemostatic drug Revici-E (ethamsylate 500 mg) was prescribed, and the bleeding stopped in 2 h. However, a massive clot had formed at the extraction site; it resolved on its own within 1 week. Although dental extraction is considered a minor surgical procedure, some cases may present with life-threatening complications, including hemorrhage. Careful and thorough history taking and physical and dental examinations prior to dental procedures are a must to avoid intraoperative and postoperative complications.

  13. One-loop calculations with massive particles

    International Nuclear Information System (INIS)

    Oldenborgh, G.J. van.

    1990-01-01

In this thesis some techniques for performing one-loop calculations with massive particles are presented. Numerical techniques necessary for evaluating the one-loop integrals that occur in one-loop calculations of photon-photon scattering are presented. The algorithms have been coded in FORTRAN (to evaluate the scalar integrals) and in the algebraic language FORM (to reduce the tensor integrals to scalar integrals). Applications are made to the theory of the strong interaction, QCD, i.e. to handling one-loop integrals with massive particles, using mass parameters to regulate the infinities encountered in this theory. Although this simplifies the computation considerably, the description of the proton structure functions has to be renormalized in order to obtain physical results. This renormalization differs from the published results for the gluon and thus has to be redone. The first physics results obtained with these new methods are presented. These concern heavy quark production in semi-leptonic interactions, for instance neutrino charm production and top production at the electron-proton (ep) collider HERA and the proposed LEP/LHC combination. Total and differential cross-sections for one-loop corrections to top production at the HERA and proposed LEP/LHC ep colliders are given, and structure functions for charmed quark production are compared with previously published results. (author). 58 refs.; 18 figs.; 5 tabs

  14. Dipolar dark matter with massive bigravity

    International Nuclear Information System (INIS)

    Blanchet, Luc; Heisenberg, Lavinia

    2015-01-01

    Massive gravity theories have been developed as viable IR modifications of gravity motivated by dark energy and the problem of the cosmological constant. On the other hand, modified gravity and modified dark matter theories were developed with the aim of solving the problems of standard cold dark matter at galactic scales. Here we propose to adapt the framework of ghost-free massive bigravity theories to reformulate the problem of dark matter at galactic scales. We investigate a promising alternative to dark matter called dipolar dark matter (DDM) in which two different species of dark matter are separately coupled to the two metrics of bigravity and are linked together by an internal vector field. We show that this model successfully reproduces the phenomenology of dark matter at galactic scales (i.e. MOND) as a result of a mechanism of gravitational polarisation. The model is safe in the gravitational sector, but because of the particular couplings of the matter fields and vector field to the metrics, a ghost in the decoupling limit is present in the dark matter sector. However, it might be possible to push the mass of the ghost beyond the strong coupling scale by an appropriate choice of the parameters of the model. Crucial questions to address in future work are the exact mass of the ghost, and the cosmological implications of the model

  15. Evolution of massive close binary stars

    International Nuclear Information System (INIS)

    Masevich, A.G.; Tutukov, A.V.

    1982-01-01

Some problems of the evolution of massive close binary stars are discussed. Most of them are unevolved stars with components of comparable mass. After the Roche lobe is filled and matter is exchanged between the components, a Wolf-Rayet star is formed. As a result of a supernova explosion, a neutron star or a black hole is formed in the system. The system does not disintegrate but acquires a high space velocity owing to the loss of the supernova envelope. The companion of the neutron star or black hole, a star of the O or B spectral class, loses about 10⁻⁶ solar masses per year. A disc of this matter forms around the compact component, and its accretion onto the compact star gives rise to X-ray emission. The neutron star cannot absorb all the matter shed by the expanding component, and the binary system becomes submerged in a common envelope. As a result of the evolution of massive close binary systems, single neutron stars can appear which, after some time, become radio pulsars. Radio pulsars with such high space velocities have been found in our Galaxy.

  16. The formation of massive molecular filaments and massive stars triggered by a magnetohydrodynamic shock wave

    Science.gov (United States)

    Inoue, Tsuyoshi; Hennebelle, Patrick; Fukui, Yasuo; Matsumoto, Tomoaki; Iwasaki, Kazunari; Inutsuka, Shu-ichiro

    2018-05-01

Recent observations suggest that an intense molecular cloud collision can trigger massive star/cluster formation. The most important physical process caused by the collision is shock compression. In this paper, the influence of a shock wave on the evolution of a molecular cloud is studied numerically using isothermal magnetohydrodynamics simulations with the effect of self-gravity. Adaptive mesh refinement and sink particle techniques are used to follow the long-time evolution of the shocked cloud. We find that the shock compression of a turbulent inhomogeneous molecular cloud creates massive filaments, which lie perpendicular to the background magnetic field, as we pointed out in a previous paper. The massive filament undergoes global collapse along its length, which feeds a sink particle located at the collapse center. We observe a high accretion rate Ṁ_acc > 10⁻⁴ M⊙ yr⁻¹, high enough to allow the formation of even O-type stars. The most massive sink particle reaches M > 50 M⊙ within a few times 10⁵ yr after the onset of the filament collapse.

  17. MASSIVE+: The Growth Histories of MASSIVE Survey Galaxies from their Globular Cluster Colors

    Science.gov (United States)

    Blakeslee, John

    2017-08-01

    The MASSIVE survey is targeting the 100 most massive galaxies within 108 Mpc that are visible in the northern sky. These most massive galaxies in the present-day universe reside in a surprisingly wide variety of environments, from rich clusters to fossil groups to near isolation. We propose to use WFC3/UVIS and ACS to carry out a deep imaging study of the globular cluster populations around a selected subset of the MASSIVE targets. Though much is known about GC systems of bright galaxies in rich clusters, we know surprisingly little about the effects of environment on these systems. The MASSIVE sample provides a golden opportunity to learn about the systematics of GC systems and what they can tell us about environmental drivers on the evolution of the highest mass galaxies. The most pressing questions to be addressed include: (1) Do isolated giants have the same constant mass fraction of GCs to total halo mass as BCGs of similar luminosity? (2) Do their GC systems show the same color (metallicity) distribution, which is an outcome of the mass spectrum of gas-rich halos during hierarchical growth? (3) Do the GCs in isolated high-mass galaxies follow the same radial distribution versus metallicity as in rich environments (a test of the relative importance of growth by accretion)? (4) Do the GCs of galaxies in sparse environments follow the same mass function? Our proposed second-band imaging will enable us to secure answers to these questions and add enormously to the legacy value of existing HST imaging of the highest mass galaxies in the universe.

  18. Leveraging design thinking to build sustainable mobile health systems.

    Science.gov (United States)

    Eckman, Molly; Gorski, Irena; Mehta, Khanjan

Mobile health, or mHealth, technology has the potential to improve health care access in the developing world. However, the majority of mHealth projects do not expand beyond the pilot stage, largely because they do not account for the individual needs and wants of those involved. A collaborative approach is needed to integrate the perspectives of all stakeholders into the design and operation of mHealth endeavours. Design thinking is a methodology used to develop and evaluate novel concepts for systems. With roots in participatory processes and self-determined pathways, design thinking provides a compelling framework for applying the needs of diverse stakeholders to mHealth project development through a highly iterative process. The methodology presented in this article provides a structured approach to applying design thinking principles to assess the feasibility of novel mHealth endeavours during early conceptualisation.

  19. Identifying Enterprise Leverage Points in Defense Acquisition Program Performance

    Science.gov (United States)

    2009-09-01

differentiated. [108] Table 1: Table of Validation and Approval Authority. Beyond the major categories used for programs as noted above, there is also a ... impossible to identify which “uber-portfolio” a system should belong to, as many “portfolios” claim a system as an integral part of the larger portfolio ... to differentiate between programs. DOD 5002, Enclosure E states “A technology project or acquisition program shall be categorized based on its

  20. Developing a Data Discovery Tool for Interdisciplinary Science: Leveraging a Web-based Mapping Application and Geosemantic Searching

    Science.gov (United States)

    Albeke, S. E.; Perkins, D. G.; Ewers, S. L.; Ewers, B. E.; Holbrook, W. S.; Miller, S. N.

    2015-12-01

The sharing of data and results is paramount for advancing scientific research. The Wyoming Center for Environmental Hydrology and Geophysics (WyCEHG) is a multidisciplinary group that is driving scientific breakthroughs to help manage water resources in the Western United States. WyCEHG is mandated by the National Science Foundation (NSF) to share its data. However, the infrastructure from which to share such diverse, complex and massive amounts of data did not exist within the University of Wyoming. We developed an innovative framework to meet the data organization, sharing, and discovery requirements of WyCEHG by integrating both open and closed source software, embedded metadata tags, semantic web technologies, and a web-mapping application. The infrastructure uses a Relational Database Management System as the foundation, providing a versatile platform to store, organize, and query myriad datasets, taking advantage of both structured and unstructured formats. Detailed metadata are fundamental to the utility of datasets. We tag data with Uniform Resource Identifiers (URIs) to specify concepts with formal descriptions (i.e. semantic ontologies), thus allowing users to search metadata based on the intended context rather than relying on conventional keyword searches. Additionally, WyCEHG data are geographically referenced. Using the ArcGIS API for JavaScript, we developed a web mapping application leveraging database-linked spatial data services, providing a means to visualize and spatially query available data in an intuitive map environment. Using server-side scripting (PHP), the mapping application, in conjunction with semantic search modules, dynamically communicates with the database and file system, providing access to available datasets. Our approach provides a flexible, comprehensive infrastructure from which to store and serve WyCEHG's highly diverse research-based data. This framework has not only allowed WyCEHG to meet its data stewardship
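The concept-based search the abstract describes can be reduced to a simple idea: records carry ontology concept URIs as tags and are retrieved by concept rather than by keyword match. A minimal sketch of that idea (all titles, field names, and URIs below are hypothetical, not WyCEHG's actual schema):

```python
# Illustrative sketch of concept-URI tagging: records are retrieved by the
# formal concept they are tagged with, not by free-text keyword matching.
# Record titles and ontology URIs are invented for this example.

DATASETS = [
    {"title": "Snowpack depth transects",
     "concepts": {"http://example.org/ont/SnowHydrology",
                  "http://example.org/ont/FieldSurvey"}},
    {"title": "Borehole seismic profiles",
     "concepts": {"http://example.org/ont/Geophysics"}},
]

def find_by_concept(records, concept_uri):
    """Return titles of records tagged with the given ontology concept URI."""
    return [r["title"] for r in records if concept_uri in r["concepts"]]

print(find_by_concept(DATASETS, "http://example.org/ont/Geophysics"))
# ['Borehole seismic profiles']
```

In a production system the tag lookup would be a database join against a concept table rather than an in-memory scan, but the retrieval semantics are the same.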

  1. Financial Leverage and Corporate Performance: Does Financial Crisis Owe an Explanation?

    Directory of Open Access Journals (Sweden)

    Syed Jawad Hussain Shahzad

    2015-04-01

The objective of this study is to investigate the impact of financial leverage on the corporate financial performance of Pakistan's textile sector from 1999 to 2012 using panel data. The leverage-performance relationship is examined with a special focus on the Global Financial Crisis of 2007-2008. Both accounting-based (Return on Assets, ROA) and market-based (Tobin's Q) measures of corporate financial performance are used. Regression analysis is performed with and without inclusion of a financial crisis dummy. Total Debt to Total Assets (TDTA), Long Term Debt to Total Assets (LDTA), Short Term Debt to Total Assets (SDTA) and Debt to Equity (DE) ratios are used as proxies for financial leverage, whereas firm size and firm efficiency are used as control variables. The results indicate that financial leverage has a negative impact on corporate performance when measured with ROA, whereas in the case of Tobin's Q the SDTA coefficient is positive. It can be concluded that since the cost of borrowing is high in Pakistan and debt capital markets are less developed, firms are forced to resort to banks as their source of debt finance and thus have to repay large amounts of principal and interest, which takes a heavy toll on their financial health. In addition, the financial crisis was found to have a negative impact on corporate performance and also to affect the leverage-performance relationship.
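The study's specification boils down to regressing a performance measure on a leverage ratio plus a crisis dummy. A toy sketch of that setup on synthetic data (the coefficients and data below are invented for illustration and bear no relation to the study's estimates):

```python
# Illustrative OLS sketch: ROA regressed on a leverage ratio (TDTA) and a
# financial-crisis dummy. Data are synthetic; effect sizes are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 200
tdta = rng.uniform(0.1, 0.9, n)                       # total debt / total assets
crisis = (rng.uniform(size=n) < 0.25).astype(float)   # crisis-year dummy
# Generate ROA with a built-in negative leverage effect and a crisis penalty:
roa = 0.12 - 0.08 * tdta - 0.03 * crisis + rng.normal(0, 0.01, n)

# Design matrix: intercept, leverage, crisis dummy
X = np.column_stack([np.ones(n), tdta, crisis])
beta, *_ = np.linalg.lstsq(X, roa, rcond=None)
intercept, b_leverage, b_crisis = beta
print(b_leverage < 0 and b_crisis < 0)  # True: both depress performance here
```

With panel data one would add firm and year fixed effects (or use a dedicated panel estimator); the dummy-variable mechanics shown above carry over unchanged.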

  2. Making the case: leveraging resources toward public health system improvement in Turning Point states.

    Science.gov (United States)

    Bekemeier, Betty; Riley, Catharine M; Padgett, Stephen M; Berkowitz, Bobbie

    2007-01-01

    Leveraging funds to sustain the efforts of a grant-funded initiative is often an explicit, or implicit, expectation in philanthropy. However, the magnitude of funds leveraged and the factors that facilitate success in leveraging are rarely researched. An example of one of these grant-funded initiatives is the National Turning Point Initiative. Twenty-one states received funding from The Robert Wood Johnson Foundation as part of this initiative to establish and implement strategic goals for achieving significant statewide public health system improvement through diverse, cross-sector partnerships. Leaders from 17 of these 21 states participated in a two-phased study regarding the leveraging of additional funds for their public health infrastructure improvement activities. This article reports on the second phase of the study. In this phase, key informant interviews were conducted to examine how leveraging of resources occurred as part of this large national initiative. Findings indicate that the combination of a comprehensive planning process and a broad-based partnership was crucial in securing resources to expand their efforts. The ability to strategically respond to unexpected events and opportunities also helped states use their plans and partnerships to "make the case" for additional resources to improve their public health infrastructure.

  3. Leverage principle of retardation signal in titration of double protein via chip moving reaction boundary electrophoresis.

    Science.gov (United States)

    Zhang, Liu-Xia; Cao, Yi-Ren; Xiao, Hua; Liu, Xiao-Ping; Liu, Shao-Rong; Meng, Qing-Hua; Fan, Liu-Yin; Cao, Cheng-Xi

    2016-03-15

In the present work we describe a simple, rapid and quantitative analytical method for the detection of different proteins present in biological samples. For this, we propose the model of titration of double protein (TDP) and its associated leverage theory, based on the retardation signal of chip moving reaction boundary electrophoresis (MRBE). The leverage principle states that the product of the first protein's content and its absolute retardation signal is equal to that of the second protein's content and its absolute retardation signal. To validate the model, we first demonstrated the leverage principle theoretically. Relevant experiments were then conducted on the TDP-MRBE chip. The results revealed that (i) there was a leverage principle of retardation signal within the TDP of two pure proteins, and (ii) a lever also existed within two complex protein samples, clearly demonstrating the validity of the TDP model and leverage theory in the MRBE chip. We also showed that the proposed technique can provide rapid and simple quantitative analysis of two protein samples in a mixture. Finally, we successfully applied the developed technique to the quantification of soymilk in adulterated infant formula. TDP-MRBE opens up a new window for detecting the adulteration ratio of a low-quality food (milk) blended into a high-quality one. Copyright © 2015 Elsevier B.V. All rights reserved.
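The stated lever rule, c₁·|s₁| = c₂·|s₂| (content times absolute retardation signal balanced between the two proteins), can be inverted to recover the two content fractions from measured signals. A minimal sketch, with hypothetical signal values:

```python
# Illustrative sketch of the lever rule c1*|s1| = c2*|s2| from the abstract:
# given the two retardation signals, solve for the two content fractions.
# Signal values used below are hypothetical.

def mixing_fractions(signal_1, signal_2):
    """Infer content fractions (c1, c2) from the balance c1*|s1| = c2*|s2|."""
    s1, s2 = abs(signal_1), abs(signal_2)
    c1 = s2 / (s1 + s2)   # component 1 fraction
    c2 = s1 / (s1 + s2)   # component 2 fraction
    return c1, c2

# Equal and opposite signals imply a 50:50 mixture:
print(mixing_fractions(-0.4, 0.4))  # (0.5, 0.5)
```

This is the same algebra as the lever rule in phase diagrams: the component with the smaller retardation signal sits farther from the "fulcrum" and therefore carries the larger fraction.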

  4. Neutron stars structure in the context of massive gravity

    Energy Technology Data Exchange (ETDEWEB)

    Hendi, S.H.; Bordbar, G.H.; Panah, B. Eslam; Panahiyan, S., E-mail: hendi@shirazu.ac.ir, E-mail: ghbordbar@shirazu.ac.ir, E-mail: behzad.eslampanah@gmail.com, E-mail: sh.panahiyan@gmail.com [Physics Department and Biruni Observatory, College of Sciences, Shiraz University, Shiraz 71454 (Iran, Islamic Republic of)

    2017-07-01

    Motivated by the recent interests in spin−2 massive gravitons, we study the structure of neutron star in the context of massive gravity. The modifications of TOV equation in the presence of massive gravity are explored in 4 and higher dimensions. Next, by considering the modern equation of state for the neutron star matter (which is extracted by the lowest order constrained variational (LOCV) method with the AV18 potential), different physical properties of the neutron star (such as Le Chatelier's principle, stability and energy conditions) are investigated. It is shown that consideration of the massive gravity has specific contributions into the structure of neutron star and introduces new prescriptions for the massive astrophysical objects. The mass-radius relation is examined and the effects of massive gravity on the Schwarzschild radius, average density, compactness, gravitational redshift and dynamical stability are studied. Finally, a relation between mass and radius of neutron star versus the Planck mass is extracted.

  5. Neutron stars structure in the context of massive gravity

    Science.gov (United States)

    Hendi, S. H.; Bordbar, G. H.; Eslam Panah, B.; Panahiyan, S.

    2017-07-01

    Motivated by the recent interests in spin-2 massive gravitons, we study the structure of neutron star in the context of massive gravity. The modifications of TOV equation in the presence of massive gravity are explored in 4 and higher dimensions. Next, by considering the modern equation of state for the neutron star matter (which is extracted by the lowest order constrained variational (LOCV) method with the AV18 potential), different physical properties of the neutron star (such as Le Chatelier's principle, stability and energy conditions) are investigated. It is shown that consideration of the massive gravity has specific contributions into the structure of neutron star and introduces new prescriptions for the massive astrophysical objects. The mass-radius relation is examined and the effects of massive gravity on the Schwarzschild radius, average density, compactness, gravitational redshift and dynamical stability are studied. Finally, a relation between mass and radius of neutron star versus the Planck mass is extracted.

  6. Neutron stars structure in the context of massive gravity

    International Nuclear Information System (INIS)

    Hendi, S.H.; Bordbar, G.H.; Panah, B. Eslam; Panahiyan, S.

    2017-01-01

    Motivated by the recent interests in spin−2 massive gravitons, we study the structure of neutron star in the context of massive gravity. The modifications of TOV equation in the presence of massive gravity are explored in 4 and higher dimensions. Next, by considering the modern equation of state for the neutron star matter (which is extracted by the lowest order constrained variational (LOCV) method with the AV18 potential), different physical properties of the neutron star (such as Le Chatelier's principle, stability and energy conditions) are investigated. It is shown that consideration of the massive gravity has specific contributions into the structure of neutron star and introduces new prescriptions for the massive astrophysical objects. The mass-radius relation is examined and the effects of massive gravity on the Schwarzschild radius, average density, compactness, gravitational redshift and dynamical stability are studied. Finally, a relation between mass and radius of neutron star versus the Planck mass is extracted.

  7. Massive supermultiplets in four-dimensional superstring theory

    International Nuclear Information System (INIS)

    Feng Wanzhe; Lüst, Dieter; Schlotterer, Oliver

    2012-01-01

We extend the discussion of Feng et al. (2011) on massive Regge excitations at the first mass level of four-dimensional superstring theory. For the lightest massive modes of the open string sector, universal supermultiplets common to all four-dimensional compactifications with N=1, 2 and N=4 spacetime supersymmetry are constructed, including both their vertex operators and their supersymmetry variations. Massive spinor helicity methods shed light on the interplay between individual polarization states.

  8. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. 
This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes: the EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories; the USGS consistently provides timely, high quality geodetic data to stakeholders; significant earthquakes are better characterized by incorporating geodetic data into USGS

  9. Exact Solutions in 3D New Massive Gravity

    Science.gov (United States)

    Ahmedov, Haji; Aliev, Alikram N.

    2011-01-01

We show that the field equations of new massive gravity (NMG) consist of a massive (tensorial) Klein-Gordon-type equation with a curvature-squared source term and a constraint equation. We also show that, for algebraic type D and N spacetimes, the field equations of topologically massive gravity (TMG) can be thought of as the “square root” of the massive Klein-Gordon-type equation. Using this fact, we establish a simple framework for mapping all type D and N solutions of TMG into NMG. Finally, we present new examples of type D and N solutions to NMG.

  10. Holographic heat engine within the framework of massive gravity

    Science.gov (United States)

    Mo, Jie-Xiong; Li, Gu-Qiang

    2018-05-01

Heat engine models are constructed within the framework of massive gravity in this paper. For the four-dimensional charged black holes in massive gravity, it is shown that the existence of graviton mass improves the heat engine efficiency significantly. The situation is more complicated for the five-dimensional neutral black holes, since the constant corresponding to the third massive potential also contributes to the efficiency. It is shown that the existence of graviton mass can again improve the heat engine efficiency. Moreover, we probe how massive gravity influences the behavior of the heat engine efficiency as it approaches the Carnot efficiency.
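The Carnot efficiency invoked here as the benchmark is the textbook bound η_C = 1 − T_cold/T_hot, against which any engine efficiency η = W/Q_in is compared. A minimal numeric sketch with illustrative values (nothing below is taken from the paper's black hole cycles):

```python
# Illustrative sketch: any heat engine efficiency W/Q_in is bounded above by
# the Carnot efficiency 1 - T_cold/T_hot. Numbers are invented examples.

def carnot_efficiency(t_cold, t_hot):
    """Carnot bound for reservoirs at temperatures t_cold < t_hot."""
    return 1.0 - t_cold / t_hot

def engine_efficiency(work_out, heat_in):
    """Actual efficiency: net work extracted per unit heat absorbed."""
    return work_out / heat_in

eta = engine_efficiency(work_out=30.0, heat_in=100.0)   # eta = 0.3
eta_c = carnot_efficiency(t_cold=300.0, t_hot=500.0)    # eta_c = 0.4
print(eta <= eta_c)  # True: the engine cannot beat the Carnot bound
```

In the holographic setting the roles of heat and work are played by quantities read off from the black hole's extended thermodynamics, but the efficiency comparison itself is this same ratio test.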

  11. Very massive runaway stars from three-body encounters

    Science.gov (United States)

    Gvaramadze, Vasilii V.; Gualandris, Alessia

    2011-01-01

Very massive stars preferentially reside in the cores of their parent clusters and form binary or multiple systems. We study the role of tight very massive binaries in the origin of the field population of very massive stars. We performed numerical simulations of dynamical encounters between single (massive) stars and a very massive binary with parameters similar to those of the most massive known Galactic binaries, WR 20a and NGC 3603-A1. We found that these three-body encounters could be responsible for the origin of high peculiar velocities (≥70 km s⁻¹) observed for some very massive (≥60-70 M⊙) runaway stars in the Milky Way and the Large Magellanic Cloud (e.g. λ Cep, BD+43°3654, Sk -67°22, BI 237, 30 Dor 016), which can hardly be explained within the framework of the binary-supernova scenario. The production of high-velocity massive stars via three-body encounters is accompanied by the recoil of the binary in the opposite direction to the ejected star. We show that the relative position of the very massive binary R145 and the runaway early B-type star Sk-69°206 on the sky is consistent with the possibility that both objects were ejected from the central cluster, R136, of the star-forming region 30 Doradus via the same dynamical event - a three-body encounter.

  12. Massively parallel Fokker-Planck calculations

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1990-01-01

    This paper reports that the Fokker-Planck package FPPAC, which solves the complete nonlinear multispecies Fokker-Planck collision operator for a plasma in two-dimensional velocity space, has been rewritten for the Connection Machine 2. This has involved allocation of variables either to the front end or the CM2, minimization of data flow, and replacement of Cray-optimized algorithms with ones suitable for a massively parallel architecture. Calculations have been carried out on various Connection Machines throughout the country. Results and timings on these machines have been compared to each other and to those on the static memory Cray-2. For large problem size, the Connection Machine 2 is found to be cost-efficient

  13. Large-group psychodynamics and massive violence

    Directory of Open Access Journals (Sweden)

    Vamik D. Volkan

    2006-06-01

    Full Text Available Beginning with Freud, psychoanalytic theories concerning large groups have mainly focused on individuals' perceptions of what their large groups psychologically mean to them. This chapter examines some aspects of large-group psychology in its own right and studies the psychodynamics of ethnic, national, religious or ideological groups, the membership of which originates in childhood. I will compare the mourning process in individuals with the mourning process in large groups to illustrate why we need to study large-group psychology as a subject in itself. As part of this discussion I will also describe signs and symptoms of large-group regression. When there is a threat against a large group's identity, massive violence may be initiated, and this violence, in turn, has an obvious impact on public health.

  14. Massive cortical reorganization in sighted Braille readers.

    Science.gov (United States)

    Siuda-Krzywicka, Katarzyna; Bola, Łukasz; Paplińska, Małgorzata; Sumera, Ewa; Jednoróg, Katarzyna; Marchewka, Artur; Śliwińska, Magdalena W; Amedi, Amir; Szwed, Marcin

    2016-03-15

    The brain is capable of large-scale reorganization in blindness or after massive injury. Such reorganization crosses the division into separate sensory cortices (visual, somatosensory...). As its result, the visual cortex of the blind becomes active during tactile Braille reading. Although the possibility of such reorganization in the normal, adult brain has been raised, definitive evidence has been lacking. Here, we demonstrate such extensive reorganization in normal, sighted adults who learned Braille while their brain activity was investigated with fMRI and transcranial magnetic stimulation (TMS). Subjects showed enhanced activity for tactile reading in the visual cortex, including the visual word form area (VWFA) that was modulated by their Braille reading speed and strengthened resting-state connectivity between visual and somatosensory cortices. Moreover, TMS disruption of VWFA activity decreased their tactile reading accuracy. Our results indicate that large-scale reorganization is a viable mechanism recruited when learning complex skills.

  15. Signatures of massive sgoldstinos at hadron colliders

    International Nuclear Information System (INIS)

    Perazzi, Elena; Ridolfi, Giovanni; Zwirner, Fabio

    2000-01-01

    In supersymmetric extensions of the Standard Model with a very light gravitino, the effective theory at the weak scale should contain not only the goldstino G-tilde, but also its supersymmetric partners, the sgoldstinos. In the simplest case, the goldstino is a gauge-singlet and its superpartners are two neutral spin-0 particles, S and P. We study possible signals of massive sgoldstinos at hadron colliders, focusing on those that are most relevant for the Tevatron. We show that inclusive production of sgoldstinos, followed by their decay into two photons, can lead to observable signals or to stringent combined bounds on the gravitino and sgoldstino masses. Sgoldstino decays into two gluon jets may provide a useful complementary signature

  16. Scalable Strategies for Computing with Massive Data

    Directory of Open Access Journals (Sweden)

    Michael Kane

    2013-11-01

    Full Text Available This paper presents two complementary statistical computing frameworks that address challenges in parallel processing and the analysis of massive data. First, the foreach package allows users of the R programming environment to define parallel loops that may be run sequentially on a single machine, in parallel on a symmetric multiprocessing (SMP) machine, or in cluster environments without platform-specific code. Second, the bigmemory package implements memory- and file-mapped data structures that provide (a) access to arbitrarily large data while retaining a look and feel that is familiar to R users and (b) data structures that are shared across processor cores in order to support efficient parallel computing techniques. Although these packages may be used independently, this paper shows how they can be used in combination to address challenges that have effectively been beyond the reach of researchers who lack specialized software development skills or expensive hardware.
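    The two ideas the abstract combines — a parallel loop over chunks of a shared, file-backed array — can be sketched outside R as well. A minimal Python analogue of the foreach + bigmemory pattern (the file name, sizes, and worker count are illustrative choices, not the paper's):

    ```python
    import os
    import tempfile
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    # A file-backed array standing in for bigmemory's file-mapped matrices.
    path = os.path.join(tempfile.mkdtemp(), "big.dat")
    n = 1_000_000
    arr = np.memmap(path, dtype=np.float64, mode="w+", shape=(n,))
    arr[:] = 1.0
    arr.flush()

    def chunk_sum(start, stop):
        # Workers re-open the mapping rather than receiving a copy,
        # so arbitrarily large arrays never enter private memory.
        view = np.memmap(path, dtype=np.float64, mode="r", shape=(n,))
        return float(view[start:stop].sum())

    # The foreach-style parallel loop: one task per chunk, results combined.
    step = n // 4
    with ThreadPoolExecutor(max_workers=4) as ex:
        parts = list(ex.map(chunk_sum, range(0, n, step),
                            range(step, n + step, step)))
    total = sum(parts)
    print(total)  # 1000000.0
    ```

    Threads are used here only for brevity; the same chunked mapping works with worker processes, since each worker re-opens the file rather than sharing in-process state.
    
    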

  17. Computational chaos in massively parallel neural networks

    Science.gov (United States)

    Barhen, Jacob; Gulati, Sandeep

    1989-01-01

    A fundamental issue which directly impacts the scalability of current theoretical neural network models to massively parallel embodiments, in both software as well as hardware, is the inherent and unavoidable concurrent asynchronicity of emerging fine-grained computational ensembles and the possible emergence of chaotic manifestations. Previous analyses attributed dynamical instability to the topology of the interconnection matrix, to parasitic components or to propagation delays. However, researchers have observed the existence of emergent computational chaos in a concurrently asynchronous framework, independent of the network topology. Researcher present a methodology enabling the effective asynchronous operation of large-scale neural networks. Necessary and sufficient conditions guaranteeing concurrent asynchronous convergence are established in terms of contracting operators. Lyapunov exponents are computed formally to characterize the underlying nonlinear dynamics. Simulation results are presented to illustrate network convergence to the correct results, even in the presence of large delays.

  18. Substructure of Highly Boosted Massive Jets

    Energy Technology Data Exchange (ETDEWEB)

    Alon, Raz [Weizmann Inst. of Science, Rehovot (Israel)]

    2012-10-01

    Modern particle accelerators enable researchers to study new high energy frontiers which have never been explored before. This realm opens possibilities to further examine known fields such as Quantum Chromodynamics. In addition, it allows searching for new physics and setting new limits on the existence of such. This study examined the substructure of highly boosted massive jets measured by the CDF II detector. Events from 1.96 TeV proton-antiproton collisions at the Fermilab Tevatron Collider were collected out of a total integrated luminosity of 5.95 fb$^{-1}$. They were selected to have at least one jet with transverse momentum above 400 GeV/c. The jet mass, angularity, and planar flow were measured and compared with predictions of perturbative Quantum Chromodynamics, and were found to be consistent with the theory. A search for boosted top quarks was conducted and resulted in an upper limit on the production cross section of such top quarks.

  19. The Search for Stable, Massive, Elementary Particles

    International Nuclear Information System (INIS)

    Kim, Peter C.

    2001-01-01

    In this paper we review the experimental and observational searches for stable, massive, elementary particles other than the electron and proton. The particles may be neutral, may have unit charge or may have fractional charge. They may interact through the strong, electromagnetic, weak or gravitational forces or through some unknown force. The purpose of this review is to provide a guide for future searches--what is known, what is not known, and what appear to be the most fruitful areas for new searches. A variety of experimental and observational methods such as accelerator experiments, cosmic ray studies, searches for exotic particles in bulk matter and searches using astrophysical observations is included in this review

  20. Hadronic production of massive lepton pairs

    International Nuclear Information System (INIS)

    Berger, E.L.

    1982-12-01

    A review is presented of recent experimental and theoretical progress in studies of the production of massive lepton pairs in hadronic collisions. I begin with the classical Drell-Yan annihilation model and its predictions. Subsequently, I discuss deviations from scaling, the status of the proofs of factorization in the parton model, higher-order terms in the perturbative QCD expansion, the discrepancy between measured and predicted yields (the K factor), high-twist terms, soft gluon effects, transverse-momentum distributions, implications for weak vector boson (W± and Z⁰) yields and production properties, nuclear A-dependence effects, correlations of the lepton pair with hadrons in the final state, and angular distributions in the lepton-pair rest frame
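    The classical Drell-Yan prediction such reviews start from is the leading-order parton-model cross section, quoted here in its standard textbook form (for orientation; not transcribed from the paper). For a pair of invariant mass M produced in a hadron collision at center-of-mass energy squared s,

    ```latex
    \frac{d\sigma}{dM^2} \;=\; \frac{4\pi\alpha^2}{9\,M^2 s}\,
    \sum_q e_q^2 \int_0^1 dx_1\, dx_2\;
    \delta\!\left(x_1 x_2 - \frac{M^2}{s}\right)
    \left[\,q(x_1)\,\bar{q}(x_2) + \bar{q}(x_1)\,q(x_2)\,\right]
    ```

    so that M⁴ dσ/dM² depends only on the ratio τ = M²/s — the scaling whose observed violations the review discusses.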

  1. Planckian Interacting Massive Particles as Dark Matter

    DEFF Research Database (Denmark)

    Garny, Mathias; Sandora, McCullen; Sloth, Martin S.

    2016-01-01

    . In this case the WIMP miracle is a mirage, and instead minimality as dictated by Occam's razor would indicate that dark matter is related to the Planck scale, where quantum gravity is anyway expected to manifest itself. Assuming within this framework that dark matter is a Planckian Interacting Massive Particle......, we show that the most natural mass larger than $0.01\\,\\textrm{M}_p$ is already ruled out by the absence of tensor modes in the CMB. This also indicates that we expect tensor modes in the CMB to be observed soon for this type of minimal dark matter model. Finally, we touch upon the KK graviton mode...... as a possible realization of this scenario within UV complete models, as well as further potential signatures and peculiar properties of this type of dark matter candidate. This paradigm therefore leads to a subtle connection between quantum gravity, the physics of primordial inflation, and the nature of dark...

  2. Effect of massive disks on bulge isophotes

    International Nuclear Information System (INIS)

    Monet, D.G.; Richstone, D.O.; Schechter, P.L.

    1981-01-01

    Massive disks produce flattened equipotentials. Unless the stars in a galaxy bulge are preferentially hotter in the z direction than in the plane, the isophotes will be at least as flat as the equipotentials. The comparison of two galaxy models having flat rotation curves with the available surface photometry for five external galaxies does not restrict the mass fraction which might reside in the disk. However, star counts in our own Galaxy indicate that unless the disk terminates close to the solar circle, no more than half the mass within that circle lies in the disk. The remaining half must lie either in the bulge or, more probably, in a third dark, round, dynamically distinct component

  3. Neural nets for massively parallel optimization

    Science.gov (United States)

    Dixon, Laurence C. W.; Mills, David

    1992-07-01

    To apply massively parallel processing systems to the solution of large scale optimization problems it is desirable to be able to evaluate any function f(z), z ∈ ℝⁿ, in a parallel manner. The theorem of Cybenko, Hecht-Nielsen, Hornik, Stinchcombe and White, and Funahashi shows that this can be achieved by a neural network with one hidden layer. In this paper we address the problem of the number of nodes required in the layer to achieve a given accuracy in the function and gradient values at all points within a given n-dimensional interval. The type of activation function needed to obtain nonsingular Hessian matrices is described and a strategy for obtaining accurate minimal networks presented.
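    The parallel evaluation the abstract relies on reduces, for one hidden layer, to two dense matrix products, which map naturally onto parallel hardware: many evaluation points are processed at once as rows of a matrix. A sketch with arbitrary random weights (sizes and names are illustrative only):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, hidden = 3, 16                      # input dimension, hidden nodes
    W1 = rng.normal(size=(hidden, n))      # hidden-layer weights
    b1 = rng.normal(size=hidden)           # hidden-layer biases
    w2 = rng.normal(size=hidden)           # output weights

    def f_hat(Z):
        """Evaluate the one-hidden-layer network at all rows of Z at once:
        two matrix products around an elementwise activation."""
        return np.tanh(Z @ W1.T + b1) @ w2

    Z = rng.normal(size=(1000, n))         # 1000 points in R^n, one per row
    vals = f_hat(Z)
    print(vals.shape)  # (1000,)
    ```

    Increasing the number of hidden nodes widens the matrices without changing this two-product structure, which is why the node count, not the evaluation scheme, is the paper's central question.
    
    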

  4. Climate models on massively parallel computers

    International Nuclear Information System (INIS)

    Vitart, F.; Rouvillois, P.

    1993-01-01

    First results obtained on massively parallel computers (Multiple Instruction Multiple Data and Single Instruction Multiple Data) make it possible to consider building coupled models with high resolutions. This would make possible the simulation of thermohaline circulation and other interaction phenomena between atmosphere and ocean. Increasing computer power, and with it improved resolution, will lead us to revise our approximations. The hydrostatic approximation (in ocean circulation) will no longer be valid when the grid mesh is smaller than a few kilometers; we shall have to find other models. The expertise in numerical analysis acquired at the Center of Limeil-Valenton (CEL-V) will be reused to devise global models taking into account atmosphere, ocean, sea ice and biosphere, allowing climate simulation down to a regional scale

  5. Innovative insurance plan promises to leverage green power

    International Nuclear Information System (INIS)

    Edge, Gordon

    1999-01-01

    This article explains the gap between customers of green power, who sign short-term (1-2 year) contracts, and the banks, which want power purchase agreements of ten or more years before lending on new projects. Details are given of a new initiative from the US green power industry: an insurance product that would underwrite the green premium earned by green power marketers, taking on some of the risk and bridging the gap. Examples of coverage under the green power insurance proposal are discussed, and the funding and implementation of the scheme and the effect of the insurance are considered

  6. Massive Outflows Associated with ATLASGAL Clumps

    Science.gov (United States)

    Yang, A. Y.; Thompson, M. A.; Urquhart, J. S.; Tian, W. W.

    2018-03-01

    We have undertaken the largest survey for outflows within the Galactic plane using simultaneously observed ¹³CO and C¹⁸O data. Out of a total of 919 ATLASGAL clumps, 325 have data suitable to identify outflows, and 225 (69% ± 3%) show high-velocity outflows. The clumps with detected outflows show significantly higher clump masses (M_clump), bolometric luminosities (L_bol), luminosity-to-mass ratios (L_bol/M_clump), and peak H₂ column densities (N_H₂) compared to those without outflows. Outflow activity has been detected within the youngest quiescent clump (i.e., 70 μm weak) in this sample, and we find that the outflow detection rate increases with M_clump, L_bol, L_bol/M_clump, and N_H₂, approaching 90% in some cases (UC H II regions = 93% ± 3%; masers = 86% ± 4%; HC H II regions = 100%). This high detection rate suggests that outflows are ubiquitous phenomena of massive star formation (MSF). The mean outflow mass entrainment rate implies a mean accretion rate of ∼10⁻⁴ M⊙ yr⁻¹, in full agreement with the accretion rate predicted by theoretical models of MSF. Outflow properties are tightly correlated with M_clump, L_bol, and L_bol/M_clump and show the strongest relation with the bolometric clump luminosity. This suggests that outflows might be driven by the most massive and luminous source within the clump. The correlations are similar for both low-mass and high-mass outflows over 7 orders of magnitude, indicating that they may share a similar outflow mechanism. Outflow energy is comparable to the turbulent energy within the clump; however, we find no evidence that outflows increase the level of clump turbulence as the clumps evolve. This implies that the origin of turbulence within clumps is fixed before the onset of star formation.

  7. Modular action on the massive algebra

    International Nuclear Information System (INIS)

    Saffary, T.

    2005-12-01

    The subject of this thesis is the modular group of automorphisms (σ_t^m)_{t∈ℝ}, m > 0, acting on the massive algebra of local observables M_m(O) having their support in O ⊂ ℝ⁴. After a compact introduction to micro-local analysis and the theory of one-parameter groups of automorphisms, which are used extensively throughout the investigation, we are concerned with modular theory and its consequences in mathematics, e.g., Connes' cocycle theorem and classification of type III factors and Jones' index theory, as well as in physics, e.g., the determination of local von Neumann algebras to be hyperfinite factors of type III₁, the formulation of thermodynamic equilibrium states for infinite-dimensional quantum systems (KMS states) and the discovery of modular action as geometric transformations. However, our main focus is its applications in physics, in particular the modular action as Lorentz boosts on the Rindler wedge, as dilations on the forward light cone and as conformal mappings on the double cone. Subsequently, their most important implications in local quantum physics are discussed. The purpose of this thesis is to shed more light on the transition from the known massless modular action to the wanted massive one in the case of double cones. First of all the infinitesimal generator δ_m of the group (σ_t^m)_{t∈ℝ} is investigated; in particular, some assumptions on its structure are verified explicitly for the first time for two concrete examples. Then, two strategies for the calculation of σ_t^m itself are discussed. Some formalisms and results from operator theory and the method of second quantisation used in this thesis are made available in the appendix. (orig.)

  8. METHYL CYANIDE OBSERVATIONS TOWARD MASSIVE PROTOSTARS

    Energy Technology Data Exchange (ETDEWEB)

    Rosero, V.; Hofner, P. [Physics Department, New Mexico Tech, 801 Leroy Place, Socorro, NM 87801 (United States)]; Kurtz, S. [Centro de Radioastronomia y Astrofisica, Universidad Nacional Autonoma de Mexico, Morelia 58090 (Mexico)]; Bieging, J. [Department of Astronomy and Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States)]; Araya, E. D. [Physics Department, Western Illinois University, 1 University Circle, Macomb, IL 61455 (United States)]

    2013-07-01

    We report the results of a survey in the CH₃CN J = 12 → 11 transition toward a sample of massive protostellar candidates. The observations were carried out with the 10 m Submillimeter Telescope on Mount Graham, AZ. We detected this molecular line in 9 out of 21 observed sources. In six cases this is the first detection of this transition. We also obtained full beam sampled cross-scans for five sources which show that the lower K-components can be extended on the arcminute angular scale. The higher K-components, however, are always found to be compact with respect to our 36'' beam. A Boltzmann population diagram analysis of the central spectra indicates CH₃CN column densities of about 10¹⁴ cm⁻², and rotational temperatures above 50 K, which confirms these sources as hot molecular cores. Independent fits to line velocity and width for the individual K-components resulted in the detection of an increasing blueshift with increasing line excitation for four sources. Comparison with mid-infrared (mid-IR) images from the SPITZER GLIMPSE/IRAC archive for six sources shows that the CH₃CN emission is generally coincident with a bright mid-IR source. Our data clearly show that the CH₃CN J = 12 → 11 transition is a good probe of the hot molecular gas near massive protostars, and provide the basis for future interferometric studies.
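    The Boltzmann population diagram analysis mentioned above rests on the standard rotation-diagram relation (textbook form, not quoted from the paper): for optically thin emission in LTE, the column density N_u in the upper level of each K-component obeys

    ```latex
    \ln\frac{N_u}{g_u} \;=\; \ln\frac{N_{\mathrm{tot}}}{Q(T_{\mathrm{rot}})} \;-\; \frac{E_u}{k\,T_{\mathrm{rot}}}
    ```

    so a straight-line fit of ln(N_u/g_u) against the upper-level energy E_u gives the rotational temperature T_rot from the slope and the total column density N_tot from the intercept via the partition function Q(T_rot).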

  9. The effect of high leverage points on the logistic ridge regression estimator having multicollinearity

    Science.gov (United States)

    Ariffin, Syaiba Balqish; Midi, Habshah

    2014-06-01

    This article is concerned with the performance of the logistic ridge regression estimation technique in the presence of multicollinearity and high leverage points. In logistic regression, multicollinearity exists among predictors and in the information matrix. The maximum likelihood estimator suffers a huge setback in the presence of multicollinearity, which causes regression estimates to have unduly large standard errors. To remedy this problem, a logistic ridge regression estimator is put forward. It is evident that the logistic ridge regression estimator outperforms the maximum likelihood approach for handling multicollinearity. The effect of high leverage points is then investigated on the performance of the logistic ridge regression estimator through a real data set and a simulation study. The findings signify that the logistic ridge regression estimator fails to provide better parameter estimates in the presence of both high leverage points and multicollinearity.
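    A minimal sketch of the estimator under discussion — ridge-penalized logistic regression fitted by Newton's method — on synthetic near-collinear data with one injected high leverage point. All names, data sizes, and penalty values below are our own illustrations, not the article's:

    ```python
    import numpy as np

    def logistic_ridge(X, y, lam=1.0, iters=50):
        """Ridge-penalized logistic regression via Newton's method;
        lam near zero approximates the maximum likelihood estimator."""
        beta = np.zeros(X.shape[1])
        for _ in range(iters):
            prob = 1.0 / (1.0 + np.exp(-np.clip(X @ beta, -30, 30)))
            w = prob * (1.0 - prob)                    # IRLS weights
            hess = X.T @ (X * w[:, None]) + lam * np.eye(X.shape[1])
            grad = X.T @ (y - prob) - lam * beta
            beta = beta + np.linalg.solve(hess, grad)
        return beta

    rng = np.random.default_rng(0)
    x1 = rng.normal(size=200)
    x2 = x1 + rng.normal(scale=0.05, size=200)   # nearly collinear with x1
    X = np.column_stack([x1, x2])
    y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-(x1 + x2)))).astype(float)

    X_lev = X.copy()
    X_lev[0] = [10.0, -10.0]                     # a single high leverage point

    b_ridge = logistic_ridge(X, y, lam=5.0)
    b_lev = logistic_ridge(X_lev, y, lam=5.0)
    print(b_ridge, b_lev)   # one contaminated row shifts the ridge estimates
    ```

    The penalty term lam*I stabilizes the near-singular information matrix (the multicollinearity fix), while the shift between `b_ridge` and `b_lev` illustrates the article's point that ridge shrinkage alone does not protect against high leverage points.
    
    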

  10. The impact of financial crises on the risk-return tradeoff and the leverage effect

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; Nielsen, Morten Ørregaard; Zhu, Jie

    50% in magnitude during …financial crises. No such changes are observed during NBER recessions, so in this sense …financial crises are special. Applications to a number of major developed and emerging international stock markets confirm the increase in the leverage effect, whereas the international......We investigate the impact of financial crises on two fundamental features of stock returns, namely, the risk-return tradeoff and the leverage effect. We apply the fractionally integrated exponential GARCH-in-mean (FIEGARCH-M) model for daily stock return data, which includes both features...... and allows the co-existence of long memory in volatility and short memory in returns. We extend this model to allow the financial parameters governing the volatility-in-mean effect and the leverage effect to change during financial crises. An application to the daily U.S. stock index return series from 1926...

  11. The impact of financial crises on the risk-return tradeoff and the leverage effect

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; Nielsen, Morten Ørregaard; Zhu, Jie

    2015-01-01

    % in magnitude during financial crises. No such changes are observed during NBER recessions, so in this sense financial crises are special. Applications to a number of major developed and emerging international stock markets confirm the increase in the leverage effect, whereas the international evidence......We investigate the impact of financial crises on two fundamental features of stock returns, namely, the risk-return tradeoff and the leverage effect. We apply the fractionally integrated exponential GARCH-in-mean (FIEGARCH-M) model for daily stock return data, which includes both features...... and allows the co-existence of long memory in volatility and short memory in returns. We extend this model to allow the financial parameters governing the volatility-in-mean effect and the leverage effect to change during financial crises. An application to the daily U.S. stock index return series from 1926...

  12. MASSIVE INFANT STARS ROCK THEIR CRADLE

    Science.gov (United States)

    2002-01-01

    Extremely intense radiation from newly born, ultra-bright stars has blown a glowing spherical bubble in the nebula N83B, also known as NGC 1748. A new NASA Hubble Space Telescope image has helped to decipher the complex interplay of gas and radiation of a star-forming region in a nearby galaxy. The image graphically illustrates just how these massive stars sculpt their environment by generating powerful winds that alter the shape of the parent gaseous nebula. These processes are also seen in our Milky Way in regions like the Orion Nebula. The Hubble telescope is famous for its contribution to our knowledge about star formation in very distant galaxies. Although most of the stars in the Universe were born several billions of years ago, when the Universe was young, star formation still continues today. This new Hubble image shows a very compact star-forming region in a small part of one of our neighboring galaxies - the Large Magellanic Cloud. This galaxy lies only 165,000 light-years from our Milky Way and can easily be seen with the naked eye from the Southern Hemisphere. Young, massive, ultra-bright stars are seen here just as they are born and emerge from the shelter of their pre-natal molecular cloud. Catching these hefty stars at their birthplace is not as easy as it may seem. Their high mass means that the young stars evolve very rapidly and are hard to find at this critical stage. Furthermore, they spend a good fraction of their youth hidden from view, shrouded by large quantities of dust in a molecular cloud. The only chance is to observe them just as they start to emerge from their cocoon - and then only with very high-resolution telescopes. Astronomers from France, the U.S., and Germany have used Hubble to study the fascinating interplay between gas, dust, and radiation from the newly born stars in this nebula. Its peculiar and turbulent structure has been revealed for the first time. 
This high-resolution study has also uncovered several individual stars

  13. Forecasting stock return volatility: A comparison between the roles of short-term and long-term leverage effects

    Science.gov (United States)

    Pan, Zhiyuan; Liu, Li

    2018-02-01

    In this paper, we extend the GARCH-MIDAS model proposed by Engle et al. (2013) to account for the leverage effect in short-term and long-term volatility components. Our in-sample evidence suggests that both short-term and long-term negative returns can cause higher future volatility than positive returns. Out-of-sample results show that the predictive ability of GARCH-MIDAS is significantly improved after taking the leverage effect into account. The leverage effect for the short-term volatility component plays a more important role than the leverage effect for the long-term volatility component in affecting out-of-sample forecasting performance.
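    The leverage effect these models build in can be illustrated with the simplest asymmetric volatility recursion — a one-component GJR-style update rather than the paper's two-component GARCH-MIDAS; the parameter values are arbitrary:

    ```python
    import numpy as np

    def gjr_variance(returns, omega=0.02, alpha=0.05, gamma=0.10, beta=0.85):
        """Conditional variance with a leverage term: the gamma contribution
        switches on only after negative returns."""
        sigma2 = np.empty(len(returns) + 1)
        # Unconditional level (requires alpha + gamma/2 + beta < 1).
        sigma2[0] = omega / (1 - alpha - gamma / 2 - beta)
        for t, r in enumerate(returns):
            neg = 1.0 if r < 0 else 0.0
            sigma2[t + 1] = omega + (alpha + gamma * neg) * r**2 + beta * sigma2[t]
        return sigma2

    # A negative shock raises next-period variance more than an equally
    # sized positive shock -- the leverage effect.
    up = gjr_variance(np.array([0.0, +2.0]))[-1]
    down = gjr_variance(np.array([0.0, -2.0]))[-1]
    print(down > up)  # True
    ```

    The paper's extension applies this kind of sign asymmetry separately to the short-term component and to the MIDAS-filtered long-term component, which is what lets the two leverage effects be compared.
    
    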

  14. Environmental geochemical study of Red Mountain--an undisturbed volcanogenic massive sulfide deposit in the Bonnifield District, Alaska range, east-central Alaska: Chapter I in Recent U.S. Geological Survey studies in the Tintina Gold Province, Alaska, United States, and Yukon, Canada--results of a 5-year project

    Science.gov (United States)

    Eppinger, Robert G.; Briggs, Paul H.; Dusel-Bacon, Cynthia; Giles, Stuart A.; Gough, Larry P.; Hammarstrom, Jane M.; Hubbard, Bernard E.

    2007-01-01

    The Red Mountain volcanogenic massive sulfide (VMS) deposit exhibits well-constrained examples of acid-generating, metal-leaching, metal-precipitation, and self-mitigation (via co-precipitation, dilution, and neutralization) processes that occur in an undisturbed natural setting, a rare occurrence in North America. The unmined pyrite-rich deposit displays a remarkable environmental footprint of natural acid generation, high metal concentrations, and exceedingly high rare-earth-element (REE) concentrations in surface waters. Dissolution of pyrite and associated secondary reactions under near-surface, oxidizing conditions are the primary causes for the acid generation and metal leaching. The deposit is hosted in Devonian to Mississippian felsic metavolcanic rocks of the Mystic Creek Member of the Totatlanika Schist.

  15. Leveraging management information in improving call centre productivity

    Directory of Open Access Journals (Sweden)

    Manthisana Mosese

    2016-04-01

    Objectives: This research explored the use of management information and its impact on two fundamental functions, namely improving productivity without compromising the quality of service, in the call centre of a well-known South African fashion retailer, Edcon. Following the implementation of the call centre technology project the research set out to determine how Edcon can transform their call centre to improve productivity and customer service through effective utilisation of their management information. Method: Internal documents and reports were analysed to provide the basis of evaluation between the measures of productivity prior to and post the implementation of a technology project at Edcon’s call centre. Semi-structured in-depth and group interviews were conducted to establish the importance and use of management information in improving productivity and customer service. Results: The results indicated that the availability of management information has indeed contributed to improved efficiency at the Edcon call centre. Although literature claims that there is a correlation between a call centre technology upgrade and improvement in performance, evident in the return on investment being realised within a year or two of implementation, it fell beyond the scope of this study to investigate the return on investment for Edcon’s call centre. Conclusion: Although Edcon has begun realising benefits in improved productivity in their call centre from their available management information, information will continue to play a crucial role in supporting management with informed decisions that will improve the call centre operations.

  16. Cosmological leverage from the matter power spectrum in the presence of baryon and nonlinear effects

    International Nuclear Information System (INIS)

    Bielefeld, Jannis; Huterer, Dragan; Linder, Eric V.

    2015-01-01

    We investigate how the use of higher wavenumbers (smaller scales) in the galaxy clustering power spectrum influences cosmological constraints. We take into account uncertainties from nonlinear density fluctuations, (scale-dependent) galaxy bias, and baryonic effects. Allowing for substantially model-independent uncertainties through separate fit parameters in each wavenumber bin that also allow for redshift evolution, we quantify strong gains in dark energy and neutrino mass leverage with increasing maximum wavenumber, despite marginalizing over numerous (up to 125) extra fit parameters. The leverage is due not only to an increased number of modes but, more significantly, to the breaking of degeneracies beyond the linear regime
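    The marginalization step described above has a standard Fisher-matrix form: holding a nuisance parameter fixed gives an error of 1/√F_ii, while marginalizing over it reads the error off the inverse of the full matrix, which can only be larger. A toy two-parameter illustration (the numbers are invented, not the paper's forecast):

    ```python
    import numpy as np

    # Rows/columns: (cosmological parameter, nuisance bin parameter).
    F = np.array([[10.0, 3.0],
                  [ 3.0, 2.0]])

    err_fixed = 1.0 / np.sqrt(F[0, 0])          # nuisance held fixed
    err_marg = np.sqrt(np.linalg.inv(F)[0, 0])  # nuisance marginalized over
    print(err_fixed, err_marg)                   # marginalized error is larger
    ```

    With up to 125 nuisance parameters, F is correspondingly larger, and the paper's result is that the marginalized errors still shrink as the maximum wavenumber grows.
    
    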

  17. AN EXAMINATION OF THE LEVERAGE EFFECT IN THE ISE WITH STOCHASTIC VOLATILITY MODEL

    Directory of Open Access Journals (Sweden)

    YELİZ YALÇIN

    2013-06-01

    Full Text Available The purpose of this paper is to assess the leverage effect of the Istanbul Stock Exchange within the Stochastic Volatility framework in the period 01.01.1990 – 11.08.2006. The relationship between risk and return is a well established phenomenon in Financial Econometrics. Both positive and negative relationships have been reported in the empirical literature that uses the conditional variance. In contrast, the empirical evidence provided in this paper from the Stochastic Volatility model points to a negative feedback effect and a statistically insignificant leverage effect.

  18. Leveraging biology interest to broaden participation in the geosciences

    Science.gov (United States)

    Perin, S.; Conner, L.; Oxtoby, L.

    2017-12-01

    It has been well documented that female participation in the geoscience workforce is low. By contrast, the biology workforce has largely reached gender parity. These trends are rooted in patterns of interest among youth. Specifically, girls tend to like biology and value social and societal connections to science (Brotman & Moore 2008). Our NSF-funded project, "BRIGHT Girls," offers two-week summer academies to high school-aged girls, in which the connections between the geosciences and biology are made explicit. We are conducting qualitative research to trace the girls' identity work during this intervention. Using team-based video interaction analysis, we are finding that the fabric of the academy allows girls to "try on" new possible selves in science. Our results imply that real-world, interdisciplinary programs that include opportunities for agency and authentic science practice may be a fruitful approach for broadening participation in the geosciences.

  19. Leveraging the fullest potential of scientific collections through digitisation.

    Directory of Open Access Journals (Sweden)

    Roger Charles Baird

    2010-10-01

    Full Text Available Access to digitised specimen data is a vital means to distribute information and in turn create knowledge. Pooling the accessibility of specimen and observation data under common standards and harnessing the power of distributed datasets places more and more information at the disposal of a globally dispersed work force, which would otherwise carry on its work in relative isolation, and with limited profile and impact. Citing a number of higher profile national and international projects, it is argued that a globally coordinated approach to the digitisation of a critical mass of scientific specimens and specimen-related data is highly desirable and required, to maximize the value of these collections to civil society and to support the advancement of our scientific knowledge globally.

  20. Leveraging business intelligence to make better decisions: Part II.

    Science.gov (United States)

    Reimers, Mona

    2014-01-01

    This article is the second in a series about business intelligence (BI) in a medical practice. The first article reviewed the evolution of data reporting within the industry and provided some examples of how BI concepts differ from the reports available in the menus of our software systems, or the dashboards and scorecards practices have implemented. This article will discuss how to begin a BI initiative for front-end medical practice staffers that will create tools they can use to reduce errors and increase efficiency throughout their workday. This type of BI rollout can allow practices to get started with very little financial investment, gain enthusiasm from end users, and achieve a quick return on investment. More examples of successful BI projects in medical practices are discussed to help illustrate BI concepts.