WorldWideScience

Sample records for strong empirical basis

  1. Empirical projection-based basis-component decomposition method

    Science.gov (United States)

    Brendel, Bernhard; Roessl, Ewald; Schlomka, Jens-Peter; Proksa, Roland

    2009-02-01

    Advances in the development of semiconductor based, photon-counting x-ray detectors stimulate research in the domain of energy-resolving pre-clinical and clinical computed tomography (CT). For counting detectors acquiring x-ray attenuation in at least three different energy windows, an extended basis component decomposition can be performed in which, in addition to the conventional approach of Alvarez and Macovski, a third basis component is introduced, e.g., a gadolinium based CT contrast material. After the decomposition of the measured projection data into the basis component projections, conventional filtered-backprojection reconstruction is performed to obtain the basis-component images. In recent work, this basis component decomposition was obtained by maximizing the likelihood function of the measurements. This procedure is time consuming and often unstable for excessively noisy data or low intrinsic energy resolution of the detector. Therefore, alternative procedures are of interest. Here, we introduce a generalization of the idea of empirical dual-energy processing published by Stenner et al. to multi-energy, photon-counting CT raw data. Instead of working in the image domain, we use prior spectral knowledge about the acquisition system (tube spectra, bin sensitivities) to parameterize the line integrals of the basis component decomposition directly in the projection domain. We compare this empirical approach with the maximum-likelihood (ML) approach considering image noise and image bias (artifacts) and find that only a moderate noise increase is to be expected, for small bias, in the empirical approach. Given the drastic reduction of pre-processing time, the empirical approach is considered a viable alternative to the ML approach.
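    The projection-domain idea can be illustrated with a minimal numpy sketch: parameterize the basis line integrals as low-order polynomials of the bin signals and calibrate the coefficients by least squares. The forward model, mixing matrix, and all numbers below are invented stand-ins for illustration, not the authors' actual spectra or parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration set: known line integrals of three basis materials
# (e.g. photoelectric, Compton, gadolinium) along 200 calibration rays.
A_true = rng.uniform(0.0, 2.0, size=(200, 3))

# Stand-in spectral forward model: log bin signals as a mildly nonlinear
# ("beam-hardened") mixture of the line integrals.  The real mapping would
# come from prior knowledge of tube spectra and bin sensitivities.
W = np.array([[1.0, 0.8, 0.3],
              [0.5, 1.2, 0.7],
              [0.2, 0.6, 1.5]])
Y = A_true @ W
L = Y + 0.01 * Y ** 2

def features(L):
    """Degree-2 polynomial features of the three log bin signals."""
    l1, l2, l3 = L.T
    return np.column_stack([np.ones(len(L)), l1, l2, l3,
                            l1 * l1, l2 * l2, l3 * l3,
                            l1 * l2, l1 * l3, l2 * l3])

# Calibrate one coefficient column per basis material by linear least squares.
coef, *_ = np.linalg.lstsq(features(L), A_true, rcond=None)

# Decompose new projection data directly in the projection domain.
A_est = features(L[:5]) @ coef
```

    Because the inverse mapping is fitted once, decomposition of new rays is a single matrix product, which is where the drastic pre-processing speed-up over iterative ML comes from.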

  2. Riesz basis for strongly continuous groups.

    NARCIS (Netherlands)

    Zwart, Heiko J.

    Given a Hilbert space and the generator of a strongly continuous group on this Hilbert space, if the eigenvalues of the generator have a uniform gap and the span of the corresponding eigenvectors is dense, then these eigenvectors form a Riesz basis (or unconditional basis) of the Hilbert space.

  3. Educational texts as empirical basis in qualitative research in Physical Education

    DEFF Research Database (Denmark)

    Svendsen, Annemari Munk

    This presentation will focus attention on educational texts as empirical basis in qualitative research in Physical Education (PE). Educational texts may be defined as all kinds of texts used in a pedagogical setting, including textbooks, popular articles, webpages and political reports (Selander …). This makes them fundamental sites for illuminating what counts as knowledge in an educational setting (Selander & Skjeldbred, 2004). This presentation will introduce a qualitative research study obtained with discourse analysis of educational texts in Physical Education Teacher Education (PETE) in Denmark … (Svendsen & Svendsen, 2014). It will present the theoretical and methodological considerations that are tied to the analysis of educational texts and discuss the qualities and challenges related to educational texts as empirical basis in qualitative research in PE. References: Apple, M. W. & Christian …

  4. Poppers, Kaposi's sarcoma, and HIV infection: empirical example of a strong confounding effect?

    Science.gov (United States)

    Morabia, A

    1995-01-01

    Are there empirical examples of strong confounding effects? Textbooks usually show examples of weak confounding or use hypothetical examples of strong confounding to illustrate the paradoxical consequences of not separating out the effect of the studied exposure from that of a second factor acting as a confounder. HIV infection is a candidate strong confounder of the spuriously high association reported between consumption of poppers, a sexual stimulant, and risk of Kaposi's sarcoma in the early phase of the AIDS epidemic. To examine this hypothesis, assumptions must be made on the prevalence of HIV infection among cases of Kaposi's sarcoma and on the prevalence of heavy popper consumption according to HIV infection in cases and controls. Results show that HIV infection may have confounded the poppers-Kaposi's sarcoma association. However, it cannot be ruled out that HIV did not qualify as a confounder because it was either an intermediate variable or an effect modifier of the association between popper inhalation and Kaposi's sarcoma. This example provides a basis to discuss the mechanism by which confounding occurs as well as the practical importance of confounding in epidemiologic research.
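    The confounding mechanism described above is easy to reproduce in a toy simulation: give the confounder (HIV) a causal effect on the outcome and an association with the exposure, and a strong crude odds ratio appears even though the exposure has no effect. All prevalences below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

hiv = rng.random(n) < 0.05
# Assumed prevalences: heavy popper use far more common among HIV-positive subjects.
poppers = rng.random(n) < np.where(hiv, 0.6, 0.1)
# KS risk is driven by HIV alone -- poppers have NO causal effect in this model.
ks = rng.random(n) < np.where(hiv, 0.2, 0.0005)

def odds_ratio(exposed, outcome):
    a = np.sum(exposed & outcome)     # exposed cases
    b = np.sum(exposed & ~outcome)    # exposed non-cases
    c = np.sum(~exposed & outcome)    # unexposed cases
    d = np.sum(~exposed & ~outcome)   # unexposed non-cases
    return (a * d) / (b * c)

crude_or = odds_ratio(poppers, ks)               # strongly confounded by HIV
or_hiv_pos = odds_ratio(poppers[hiv], ks[hiv])   # stratified: near the true 1.0
```

    Stratifying on (or adjusting for) HIV collapses the spurious association, which is exactly the behavior expected of a confounder rather than an intermediate variable.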

  5. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence

    DEFF Research Database (Denmark)

    Mørcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-01-01

    … and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects … greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three … components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins …

  6. A semi-empirical analysis of strong-motion peaks in terms of seismic source, propagation path, and local site conditions

    Science.gov (United States)

    Kamiyama, M.; Orourke, M. J.; Flores-Berrones, R.

    1992-09-01

    A new type of semi-empirical expression for scaling strong-motion peaks in terms of seismic source, propagation path, and local site conditions is derived. Peak acceleration, peak velocity, and peak displacement are analyzed in a similar fashion because they are interrelated. However, emphasis is placed on the peak velocity, which is a key ground motion parameter for lifeline earthquake engineering studies. With the help of seismic source theories, the semi-empirical model is derived using strong motions obtained in Japan. In the derivation, statistical considerations are used in the selection of the model itself and the model parameters. Earthquake magnitude M and hypocentral distance r are selected as independent variables, and dummy variables are introduced to identify the amplification factor due to individual local site conditions. The resulting semi-empirical expressions for the peak acceleration, velocity, and displacement are then compared with strong-motion data observed during three earthquakes in the U.S. and Mexico.
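    The structure of such a scaling model can be sketched as an ordinary least-squares fit of ln(peak velocity) on magnitude, log distance, and site-class dummy variables. The functional form, coefficients, and data below are synthetic illustrations, not the authors' regression:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

M = rng.uniform(4.0, 7.5, n)             # earthquake magnitude
r = rng.uniform(10.0, 200.0, n)          # hypocentral distance, km
site = rng.integers(0, 3, n)             # three hypothetical site classes
site_amp = np.array([0.0, 0.4, 0.9])     # assumed log amplification per class

b0, b1, b2 = -2.0, 1.1, 1.3              # assumed "true" model coefficients
ln_pgv = (b0 + b1 * M - b2 * np.log(r)
          + site_amp[site] + rng.normal(0.0, 0.3, n))

# Design matrix: intercept, magnitude, -ln(distance), and dummy variables
# for site classes 1 and 2 (class 0 is the reference condition).
X = np.column_stack([np.ones(n), M, -np.log(r),
                     (site == 1).astype(float), (site == 2).astype(float)])
coef, *_ = np.linalg.lstsq(X, ln_pgv, rcond=None)
```

    The fitted dummy-variable coefficients play the role of the per-site amplification factors in the abstract's model.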

  7. Two Studies of the Empirical Basis of Two Learning Resource-Oriented Motivational Strategies for Gifted Educators

    Science.gov (United States)

    Vladut, Anamaria; Vialle, Wilma; Ziegler, Albert

    2016-01-01

    Two learning resource-oriented motivational strategies for gifted educators are introduced: a homeostatic orientation that aims for balance and an allostatic orientation that aims at growth. In order to establish the empirical basis of these motivational strategies, two studies were conducted with samples of students from a specialized…

  8. Strong Generative Capacity and the Empirical Base of Linguistic Theory

    Directory of Open Access Journals (Sweden)

    Dennis Ott

    2017-09-01

    This Perspective traces the evolution of certain central notions in the theory of Generative Grammar (GG). The founding documents of the field suggested a relation between the grammar, construed as recursively enumerating an infinite set of sentences, and the idealized native speaker that was essentially equivalent to the relation between a formal language (a set of well-formed formulas) and an automaton that recognizes strings as belonging to the language or not. But this early view was later abandoned, when the focus of the field shifted to the grammar's strong generative capacity as recursive generation of hierarchically structured objects as opposed to strings. The grammar is now no longer seen as specifying a set of well-formed expressions and in fact necessarily constructs expressions of any degree of intuitive “acceptability.” The field of GG, however, has not sufficiently acknowledged the significance of this shift in perspective, as evidenced by the fact that (informal and experimentally controlled) observations about string acceptability continue to be treated as bona fide data and generalizations for the theory of GG. The focus on strong generative capacity, it is argued, requires a new discussion of what constitutes valid empirical evidence for GG beyond observations pertaining to weak generation.

  9. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties, and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data, compiled from a variety of databases, are presented via GIS maps and corresponding tables to facilitate use by other investigators.
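    The standard 30-m time-averaged velocity underlying such site characterizations is Vs30 = 30 / Σ(h_i / v_i). A small sketch with an invented layer profile (the function is generic, not tied to the Northridge dataset):

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity over the top 30 m: 30 / sum(h_i / v_i),
    with layers listed from the surface down and the last layer clipped at 30 m."""
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_mps):
        h = min(h, 30.0 - depth)     # only the part of the layer above 30 m counts
        travel_time += h / v         # vertical S-wave travel time through the layer
        depth += h
        if depth >= 30.0:
            break
    return 30.0 / travel_time

# Invented profile: 5 m of soft soil, 10 m of stiff soil, then weathered rock.
v30 = vs30([5.0, 10.0, 25.0], [180.0, 300.0, 500.0])
```

    Travel time, not thickness, is what is averaged, so a thin soft surface layer pulls Vs30 down disproportionately; this is why it is the quantity used for code site classes.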

  10. Economic growth and emissions reconsidering the empirical basis of environmental Kuznets curves

    International Nuclear Information System (INIS)

    De Bruyn, S.M.; Van den Bergh, J.C.J.M.; Opschoor, J.B.

    1998-01-01

    Recent empirical research indicates that certain types of emissions follow an inverted-U or environmental Kuznets curve (EKC) as income grows. This regularity has been interpreted as a possible de-linking of economic growth and patterns of certain pollutants for developed economies. In this paper the empirical basis of this result is investigated, by considering some statistical particularities of the various EKC studies performed. It is argued that the inverted-U relationship between income and emissions estimated from panel data need not hold for specific individual countries over time. Based on insights from 'intensity-of-use' analysis in resource economics an alternative growth model is specified and estimated for three types of emissions (CO2, NOx and SO2) in four countries (Netherlands, UK, USA and Western Germany). It is found that the time patterns of these emissions correlate positively with economic growth and that emission reductions may have been achieved as a result of structural and technological changes in the economy. 'Sustainable growth' is defined as the rate of economic growth that does not lead to growth in emissions. Its rate is calculated for each type of emission and country, based on estimated parameter values. The resulting indicators reflect a balance between the positive influence of growth and the negative influence of structural change and technological progress on emission levels.
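    The 'sustainable growth' rate defined above follows from simple arithmetic: if emission growth is an income elasticity times GDP growth plus an autonomous drift from structural and technological change, then emissions stop growing at minus the drift divided by the elasticity. The parameter values below are invented for illustration, not the paper's estimates:

```python
eta = 0.8     # assumed income elasticity of emissions (% per % of GDP growth)
tau = -1.5    # assumed autonomous drift from structural/technological change, %/yr

def emission_growth(g):
    """Emission growth rate as elasticity times GDP growth plus autonomous drift."""
    return eta * g + tau

g_star = -tau / eta    # 'sustainable growth': the GDP growth rate with flat emissions
```

    With these numbers, growth below 1.875 %/yr shrinks emissions and growth above it raises them, which is the balance the abstract's indicators express.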

  11. Actual factors to determine cross-currency basis swaps: An empirical study on US dollar/Japanese yen basis swap rates from the late 1990s

    OpenAIRE

    Shinada, Naoki

    2005-01-01

    Cross-currency basis swap rates that exchange US-dollar (USD) and Japanese-yen (JPY) LIBORs have fluctuated since the late 1990s. It is increasingly important for market participants to figure out such swap rates, but there have not been many empirical studies about actual markets. This study addresses factors of USD/JPY swap rates from the late 1990s to the present, and demonstrates that differences in credit risk premiums, forward exchange rates and assets swaps of foreign investors from JP...

  12. Accurate and balanced anisotropic Gaussian type orbital basis sets for atoms in strong magnetic fields

    Science.gov (United States)

    Zhu, Wuming; Trickey, S. B.

    2017-12-01

    In high magnetic field calculations, anisotropic Gaussian type orbital (AGTO) basis functions are capable of reconciling the competing demands of the spherically symmetric Coulombic interaction and cylindrical magnetic (B field) confinement. However, the best available a priori procedure for composing highly accurate AGTO sets for atoms in a strong B field [W. Zhu et al., Phys. Rev. A 90, 022504 (2014)] yields very large basis sets. Their size is problematic for use in any calculation with unfavorable computational cost scaling. Here we provide an alternative constructive procedure. It is based upon analysis of the underlying physics of atoms in B fields that allows identification of several principles for the construction of AGTO basis sets. Aided by numerical optimization and parameter fitting, followed by fine tuning of fitting parameters, we devise formulae for generating accurate AGTO basis sets in an arbitrary B field. For the hydrogen iso-electronic sequence, a set depends on B field strength, nuclear charge, and orbital quantum numbers. For multi-electron systems, the basis set formulae also include adjustment to account for orbital occupations. Tests of the new basis sets for atoms H through C (1 ≤ Z ≤ 6) and ions Li+, Be+, and B+, in a wide B field range (0 ≤ B ≤ 2000 a.u.), show an accuracy better than a few μhartree for single-electron systems and a few hundredths to a few millihartree for multi-electron atoms. The relative errors are similar for different atoms and ions in a large B field range, from a few to a couple of tens of parts per million, thereby confirming rather uniform accuracy across the nuclear charge Z and B field strength values. Residual basis set errors are two to three orders of magnitude smaller than the electronic correlation energies in multi-electron atoms, a signal of the usefulness of the new AGTO basis sets in correlated wavefunction or density functional calculations for atomic and molecular systems in an external strong B field.
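    The defining feature of an AGTO, separate transverse and longitudinal Gaussian exponents for a field along z, can be written down in a few lines. The check below confirms the analytic result ⟨ρ²⟩ = 1/(2α) for the m = n = 0 function, so larger transverse exponents model the tighter cyclotron confinement at high B. This is a generic illustration of the basis-function form, not the paper's basis-generation formulae:

```python
import numpy as np

def agto(rho, z, alpha, beta, m=0, n=0):
    """Anisotropic Gaussian-type orbital (unnormalized), B field along z:
    independent transverse (alpha) and longitudinal (beta) exponents."""
    return rho ** abs(m) * z ** n * np.exp(-alpha * rho ** 2 - beta * z ** 2)

rho = np.linspace(0.0, 10.0, 2001)
drho = rho[1] - rho[0]

def mean_rho2(alpha):
    """<rho^2> of the m = n = 0 AGTO in the z = 0 plane (measure rho * drho)."""
    w = agto(rho, 0.0, alpha, beta=1.0) ** 2 * rho
    return np.sum(w * rho ** 2) / np.sum(w)
```

    Decoupling α from β is what lets a single function be compact across the field lines yet extended along them, which an isotropic Gaussian cannot do.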

  13. Accurate and balanced anisotropic Gaussian type orbital basis sets for atoms in strong magnetic fields.

    Science.gov (United States)

    Zhu, Wuming; Trickey, S B

    2017-12-28

    In high magnetic field calculations, anisotropic Gaussian type orbital (AGTO) basis functions are capable of reconciling the competing demands of the spherically symmetric Coulombic interaction and cylindrical magnetic (B field) confinement. However, the best available a priori procedure for composing highly accurate AGTO sets for atoms in a strong B field [W. Zhu et al., Phys. Rev. A 90, 022504 (2014)] yields very large basis sets. Their size is problematic for use in any calculation with unfavorable computational cost scaling. Here we provide an alternative constructive procedure. It is based upon analysis of the underlying physics of atoms in B fields that allows identification of several principles for the construction of AGTO basis sets. Aided by numerical optimization and parameter fitting, followed by fine tuning of fitting parameters, we devise formulae for generating accurate AGTO basis sets in an arbitrary B field. For the hydrogen iso-electronic sequence, a set depends on B field strength, nuclear charge, and orbital quantum numbers. For multi-electron systems, the basis set formulae also include adjustment to account for orbital occupations. Tests of the new basis sets for atoms H through C (1 ≤ Z ≤ 6) and ions Li+, Be+, and B+, in a wide B field range (0 ≤ B ≤ 2000 a.u.), show an accuracy better than a few μhartree for single-electron systems and a few hundredths to a few millihartree for multi-electron atoms. The relative errors are similar for different atoms and ions in a large B field range, from a few to a couple of tens of parts per million, thereby confirming rather uniform accuracy across the nuclear charge Z and B field strength values. Residual basis set errors are two to three orders of magnitude smaller than the electronic correlation energies in multi-electron atoms, a signal of the usefulness of the new AGTO basis sets in correlated wavefunction or density functional calculations for atomic and molecular systems in an external strong B field.

  14. Recent Development of the Empirical Basis for Prediction of Vortex Induced Vibrations

    Directory of Open Access Journals (Sweden)

    Carl M. Larsen

    2016-02-01

    This paper describes the research activity related to VIV that has taken place at NTNU and MARINTEK in Trondheim in recent years. The overall aim of the work has been to increase understanding of the VIV phenomenon and to improve the empirical basis for predicting VIV. The work has included experiments with flexible beams in sheared and uniform flow and forced motions of short, rigid cylinders. Key results in terms of hydrodynamic coefficients and analysis procedures have been implemented in the computer program VIVANA, which has resulted in new analysis options and improved hydrodynamic coefficients. Some examples of results are presented, but the main focus of the paper is to give an overview of the work and point out how the new results can be used in order to improve VIV analyses.

  15. Volkov basis for simulation of interaction of strong laser pulses and solids

    Science.gov (United States)

    Kidd, Daniel; Covington, Cody; Li, Yonghui; Varga, Kálmán

    2018-01-01

    An efficient and accurate basis comprised of Volkov states is implemented and tested for time-dependent simulations of interactions between strong laser pulses and crystalline solids. The Volkov states are eigenstates of the free electron Hamiltonian in an electromagnetic field and analytically represent the rapidly oscillating time-dependence of the orbitals, allowing significantly faster time propagation than conventional approaches. The Volkov approach can be readily implemented in plane-wave codes by multiplying the potential energy matrix elements with a simple time-dependent phase factor.
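    The mechanics of that phase factor can be sketched in 1D: each plane wave k accumulates the Volkov phase θ_k(t) = ½ ∫ (k + A(t'))² dt', so a free electron's coefficients are constant in the Volkov basis, and potential matrix elements V_{kk'} pick up the slowly varying factor exp(i(θ_k − θ_{k'})). A minimal numpy illustration with an invented pulse (not the paper's implementation):

```python
import numpy as np

k = np.linspace(-5.0, 5.0, 101)     # plane-wave momenta (1D sketch, atomic units)
dt, nt = 0.01, 400
t = np.arange(nt) * dt
A = 0.5 * np.sin(1.5 * t) * np.exp(-((t - 2.0) / 0.8) ** 2)   # model pulse A(t)

# Accumulated Volkov phase theta_k(t) = 1/2 * integral of (k + A(t'))^2 dt'.
theta = 0.5 * np.cumsum((k[None, :] + A[:, None]) ** 2, axis=0) * dt

# A free electron's coefficients are CONSTANT in the Volkov basis; mapping back
# to bare plane waves only multiplies them by exp(-i * theta_k):
c0 = np.exp(-(k ** 2))                    # some initial momentum-space amplitudes
c_t = c0 * np.exp(-1j * theta[-1])        # free propagation, exact for any step

# In a solid, the potential matrix elements V_{kk'} would be multiplied by the
# phase factor exp(1j * (theta_k - theta_kprime)) at each time step.
```

    Because the rapid field-driven oscillation is carried analytically by θ_k, the remaining coefficients vary slowly and allow the larger time steps the abstract refers to.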

  16. Reduced nicotine product standards for combustible tobacco: building an empirical basis for effective regulation.

    Science.gov (United States)

    Donny, Eric C; Hatsukami, Dorothy K; Benowitz, Neal L; Sved, Alan F; Tidey, Jennifer W; Cassidy, Rachel N

    2014-11-01

    Both the Tobacco Control Act in the U.S. and Article 9 of the Framework Convention on Tobacco Control enable governments to directly address the addictiveness of combustible tobacco by reducing nicotine through product standards. Although nicotine may have some harmful effects, the detrimental health effects of smoked tobacco are primarily due to non-nicotine constituents. Hence, the health effects of nicotine reduction would likely be determined by changes in behavior that result in changes in smoke exposure. Herein, we review the current evidence on nicotine reduction and discuss some of the challenges in establishing the empirical basis for regulatory decisions. To date, research suggests that very low nicotine content cigarettes produce a desirable set of outcomes, including reduced exposure to nicotine, reduced smoking, and reduced dependence, without significant safety concerns. However, much is still unknown, including the effects of gradual versus abrupt changes in nicotine content, effects in vulnerable populations, and impact on youth. A coordinated effort must be made to provide the best possible scientific basis for regulatory decisions. The outcome of this effort may provide the foundation for a novel approach to tobacco control that dramatically reduces the devastating health consequences of smoked tobacco. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Basis material decomposition in spectral CT using a semi-empirical, polychromatic adaption of the Beer-Lambert model

    Science.gov (United States)

    Ehn, S.; Sellerer, T.; Mechlem, K.; Fehringer, A.; Epple, M.; Herzen, J.; Pfeiffer, F.; Noël, P. B.

    2017-01-01

    Following the development of energy-sensitive photon-counting detectors using high-Z sensor materials, application of spectral x-ray imaging methods to clinical practice comes into reach. However, these detectors require extensive calibration efforts in order to perform spectral imaging tasks like basis material decomposition. In this paper, we report a novel approach to basis material decomposition that utilizes a semi-empirical estimator for the number of photons registered in distinct energy bins in the presence of beam-hardening effects which can be termed as a polychromatic Beer-Lambert model. A maximum-likelihood estimator is applied to the model in order to obtain estimates of the underlying sample composition. Using a Monte-Carlo simulation of a typical clinical CT acquisition, the performance of the proposed estimator was evaluated. The estimator is shown to be unbiased and efficient according to the Cramér-Rao lower bound. In particular, the estimator is capable of operating with a minimum number of calibration measurements. Good results were obtained after calibration using less than 10 samples of known composition in a two-material attenuation basis. This opens up the possibility for fast re-calibration in the clinical routine which is considered an advantage of the proposed method over other implementations reported in the literature.
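    The polychromatic Beer-Lambert model has the form N_i = Σ_E S_i(E) exp(−Σ_j μ_j(E) A_j); with as many energy bins as basis materials, the Poisson maximum-likelihood estimate reduces to matching modeled to measured counts, solvable by Newton iteration. The spectra and attenuation curves below are crude inventions for illustration, not the paper's calibrated estimator:

```python
import numpy as np

E = np.linspace(20.0, 120.0, 11)              # keV grid (coarse, illustrative)
mu = np.array([(E / 50.0) ** -3 + 0.02,       # invented "photoelectric-like" material
               0.20 * np.ones_like(E)])       # invented "Compton-like" material
S = np.array([np.where(E < 70, 1e7, 0.0),     # two crude energy bins with
              np.where(E >= 70, 1e7, 0.0)])   # flat effective sensitivities

def expected_counts(A):
    """Polychromatic Beer-Lambert model: mean bin counts for line integrals A."""
    return S @ np.exp(-(mu.T @ A))

A_true = np.array([1.2, 2.0])
y = np.random.default_rng(2).poisson(expected_counts(A_true)).astype(float)

# For two bins and two materials, Poisson ML solves expected_counts(A) = y;
# Newton iteration on log-counts, which are nearly linear in A:
def g(A):
    return np.log(expected_counts(A)) - np.log(y)

A_hat = np.array([0.5, 0.5])
for _ in range(20):
    J = np.empty((2, 2))
    for j in range(2):
        d = np.zeros(2); d[j] = 1e-6          # numerical Jacobian of g
        J[:, j] = (g(A_hat + d) - g(A_hat - d)) / 2e-6
    A_hat = A_hat - np.linalg.solve(J, g(A_hat))
```

    In the paper's setting the spectral terms are not assumed known but estimated semi-empirically from a handful of calibration samples, which is the point of the method.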

  18. Empirical STORM-E Model. [I. Theoretical and Observational Basis

    Science.gov (United States)

    Mertens, Christopher J.; Xu, Xiaojing; Bilitza, Dieter; Mlynczak, Martin G.; Russell, James M., III

    2013-01-01

    Auroral nighttime infrared emission observed by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite is used to develop an empirical model of geomagnetic storm enhancements to E-region peak electron densities. The empirical model is called STORM-E and will be incorporated into the 2012 release of the International Reference Ionosphere (IRI). The proxy for characterizing the E-region response to geomagnetic forcing is NO+(v) volume emission rates (VER) derived from the TIMED/SABER 4.3 µm channel limb radiance measurements. The storm-time response of the NO+(v) 4.3 µm VER is sensitive to auroral particle precipitation. A statistical database of storm-time to climatological quiet-time ratios of SABER-observed NO+(v) 4.3 µm VER are fit to widely available geomagnetic indices using the theoretical framework of linear impulse-response theory. The STORM-E model provides a dynamic storm-time correction factor to adjust a known quiescent E-region electron density peak concentration for geomagnetic enhancements due to auroral particle precipitation. Part II of this series describes the explicit development of the empirical storm-time correction factor for E-region peak electron densities, and shows comparisons of E-region electron densities between STORM-E predictions and incoherent scatter radar measurements. In this paper, Part I of the series, the efficacy of using SABER-derived NO+(v) VER as a proxy for the E-region response to solar-geomagnetic disturbances is presented. Furthermore, a detailed description of the algorithms and methodologies used to derive NO+(v) VER from SABER 4.3 µm limb emission measurements is given. Finally, an assessment of key uncertainties in retrieving NO+(v) VER is presented.
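    The linear impulse-response framework mentioned above amounts to modeling the storm-to-quiet ratio as one plus a convolution of a geomagnetic index with a response kernel. The kernel shape, index series, and all numbers below are invented for illustration, not the STORM-E fit:

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 1.0                                   # hours
t = np.arange(0.0, 240.0, dt)
# Synthetic non-negative geomagnetic index time series (stand-in for e.g. ap):
ap = np.maximum(0.0, 20.0 * np.sin(t / 30.0) + rng.normal(0.0, 5.0, t.size))

# Assumed impulse response: prompt enhancement relaxing over ~12 hours.
tau = np.arange(0.0, 48.0, dt)
h = 0.01 * np.exp(-tau / 12.0)

# Storm-to-quiet ratio modeled as 1 + (h convolved with ap)(t):
ratio = 1.0 + np.convolve(ap, h)[: t.size] * dt
```

    Once the kernel is fitted to the observed VER ratios, applying the model to any index history is a single convolution, which is what makes the correction factor "dynamic".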

  19. Let the Weakest Link Go! Empirical Explorations on the Relative Importance of Weak and Strong Ties on Social Networking Sites

    OpenAIRE

    Nicole C. Krämer; Leonie Rösner; Sabrina C. Eimler; Stephan Winter; German Neubaum

    2014-01-01

    Theoretical approaches as well as empirical results in the area of social capital accumulation on social networking sites suggest that weak ties/bridging versus strong ties/bonding social capital should be distinguished and that while bonding social capital is connected to emotional support, bridging social capital entails the provision of information. Additionally, recent studies imply the notion that weak ties/bridging social capital are gaining increasing importance in today’s social media environments.

  20. Let the Weakest Link Go! Empirical Explorations on the Relative Importance of Weak and Strong Ties on Social Networking Sites

    Directory of Open Access Journals (Sweden)

    Nicole C. Krämer

    2014-12-01

    Theoretical approaches as well as empirical results in the area of social capital accumulation on social networking sites suggest that weak ties/bridging versus strong ties/bonding social capital should be distinguished and that while bonding social capital is connected to emotional support, bridging social capital entails the provision of information. Additionally, recent studies imply the notion that weak ties/bridging social capital are gaining increasing importance in today’s social media environments. By means of a survey (N = 317) we challenged these presuppositions by assessing the social support functions that are ascribed to three different types of contacts from participants’ network (weak, medium, or strong tie). In contrast to theoretical assumptions, we do not find that weak ties are experienced as supplying informational support whereas strong ties first and foremost provide emotional support. Instead we find that within social networking sites, strong ties are perceived to provide both emotional and informational support and weak ties are perceived as less important than recent literature assumes.

  1. Basis for selecting optimum antibiotic regimens for secondary peritonitis.

    Science.gov (United States)

    Maseda, Emilio; Gimenez, Maria-Jose; Gilsanz, Fernando; Aguilar, Lorenzo

    2016-01-01

    Adequate management of severely ill patients with secondary peritonitis requires supportive therapy of organ dysfunction, source control of infection and antimicrobial therapy. Since secondary peritonitis is polymicrobial, appropriate empiric therapy requires combination therapy in order to achieve the needed coverage for both common and more unusual organisms. This article reviews etiological agents, resistance mechanisms and their prevalence, how and when to cover them and guidelines for treatment in the literature. Local surveillance data are the basis for selecting compounds in antibiotic regimens, which should be further adapted to the increasing number of patients with risk factors for resistance (clinical setting, comorbidities, previous antibiotic treatments, previous colonization, severity…). Inadequate antimicrobial regimens are strongly associated with unfavorable outcomes. Awareness of resistance epidemiology and of the clinical consequences of inadequate therapy against resistant bacteria is crucial for clinicians treating secondary peritonitis, with a delicate balance between optimization of empirical therapy (improving outcomes) and antimicrobial overuse (increasing resistance emergence).

  2. Pharmacological basis for the empirical use of Eugenia uniflora L. (Myrtaceae) as antihypertensive.

    Science.gov (United States)

    Consolini, A E; Baldini, O A; Amat, A G

    1999-07-01

    The rational basis for the use of Eugenia uniflora L. (Myrtaceae) as antihypertensive in Northeastern Argentina was assessed in normotensive rats. Intraperitoneal administration of the aqueous crude extract (ACE) decreased blood pressure (BP) of normotensive rats dose-dependently until 47.1 +/- 8.2% of control. The median effective dose (ED50) was 3.1 +/- 0.4 mg dried leaves/kg (d.l./kg) (ACE yield: 17% w/w). To determine the origin of the hypotensive activity, alpha-adrenergic antagonist and vasorelaxant activities of the ACE were tested. The dose-response curve for phenylephrine on BP was inhibited non-competitively until 80% of its maximal effect (at 8 mg d.l. ACE/kg). Perfusion pressure (PP) of rat hindquarters (previously vasoconstricted by high-K+) was decreased by ACE in a concentration-dependent manner until -32.3 +/- 11.5% of tonic contraction at 1.2 g d.l. ACE/100 ml. In addition, the ACE demonstrated diuretic activity at a dose (120 mg d.l./kg) higher than the hypotensive one. It was almost as potent as amiloride, but while amiloride induced loss of Na+ and saving of K+, the ACE induced a decrease in Na+ excretion. The results suggest that the empirical use of Eugenia uniflora L. (Myrtaceae) is mostly due to a hypotensive effect mediated by a direct vasodilating activity, and to a weak diuretic effect that could be related to an increase in renal blood flow.

  3. Building Strong Customer Relationships through Brand Orientation in Small Service Firms: An Empirical Investigation

    OpenAIRE

    Chovancová, Miloslava; Osakwe, Christian Nedu; Ogbonna, Benson U.

    2015-01-01

    The purpose of this paper is to empirically examine the relationship between the adoption of a brand orientation strategy and customer relationship performance in a small service firm setting. More specifically, in addition to investigating the direct link between brand orientation and customer relationship performance, we further examine the moderating effects of entrepreneurial orientation and perceived competitive intensity on the empirical link between brand orientation and customer relat...

  4. Bubble, weak and strong hyperinflation: Theory and empirical evidence

    Directory of Open Access Journals (Sweden)

    Fernando de Holanda Barbosa

    2015-05-01

    This paper presents a theoretical framework that allows a taxonomy of hyperinflation, namely: (i) bubble, (ii) weak and (iii) strong hyperinflation. The inflation tax revenue curve is used to characterize each type of hyperinflation, and we use this curve to test them. The bubble and strong hyperinflation hypotheses are rejected using Brazilian data. The weak hyperinflation hypothesis is not rejected, and the economy could have been on the ‘wrong’ side of the Laffer curve during hyperinflation. This outcome, contrary to conventional wisdom, is predicted by this hypothesis, which presents a solution to an old puzzle of the hyperinflation literature.
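    The 'wrong side of the Laffer curve' idea can be illustrated with the standard Cagan money-demand form, where the inflation tax is the inflation rate times an exponentially shrinking real money base: revenue peaks at π = 1/α and falls beyond it. The semi-elasticity below is an arbitrary illustrative value, not the paper's estimate:

```python
import numpy as np

alpha = 0.5                    # assumed Cagan semi-elasticity of money demand

def seigniorage(pi):
    """Inflation tax revenue: the inflation rate times the real money base
    it erodes, with Cagan demand m(pi) = exp(-alpha * pi)."""
    return pi * np.exp(-alpha * pi)

pi = np.linspace(0.0, 10.0, 10001)
pi_peak = pi[np.argmax(seigniorage(pi))]   # revenue-maximizing inflation = 1/alpha
# Beyond pi_peak (the 'wrong' side of the curve), faster inflation yields LESS revenue.
```

    On the wrong side, the base shrinks faster than the rate rises, which is the behavior the weak-hyperinflation hypothesis allows.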

  5. The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD

    Science.gov (United States)

    Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael

    2012-01-01

    In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…

  6. Federalism and regional health care expenditures: an empirical analysis for the Swiss cantons.

    Science.gov (United States)

    Crivelli, Luca; Filippini, Massimo; Mosca, Ilaria

    2006-05-01

    Switzerland (7.2 million inhabitants) is a federal state composed of 26 cantons. The autonomy of cantons and a particular health insurance system create strong heterogeneity in terms of regulation and organisation of health care services. In this study we use a single-equation approach to model the per capita cantonal expenditures on health care services and postulate that per capita health expenditures depend on some economic, demographic and structural factors. The empirical analysis demonstrates that a larger share of old people tends to increase health costs and that physicians paid on a fee-for-service basis swell expenditures, thus highlighting a possible phenomenon of supply-induced demand.

  7. The conceptual and empirical relationship between gambling, investing, and speculation.

    Science.gov (United States)

    Arthur, Jennifer N; Williams, Robert J; Delfabbro, Paul H

    2016-12-01

    Background and aims To review the conceptual and empirical relationship between gambling, investing, and speculation. Methods An analysis of the attributes differentiating these constructs as well as identification of all articles speaking to their empirical relationship. Results Gambling differs from investment on many different attributes and should be seen as conceptually distinct. On the other hand, speculation is conceptually intermediate between gambling and investment, with a few of its attributes being investment-like, some being gambling-like, and several being neither clearly gambling-like nor investment-like. Empirically, gamblers, investors, and speculators have similar cognitive, motivational, and personality attributes, with this relationship being particularly strong for gambling and speculation. Population levels of gambling activity also tend to be correlated with population levels of financial speculation. At an individual level, speculation has a particularly strong empirical relationship to gambling, as speculators appear to be heavily involved in traditional forms of gambling, and problematic speculation is strongly correlated with problematic gambling. Discussion and conclusions Investment is distinct from gambling, but speculation and gambling have conceptual overlap and a strong empirical relationship. It is recommended that financial speculation be routinely included when assessing gambling involvement, and there needs to be greater recognition and study of financial speculation both as a contributor to problem gambling and as an additional form of behavioral addiction in its own right.

  8. Multiscale empirical interpolation for solving nonlinear PDEs

    KAUST Repository

    Calo, Victor M.

    2014-12-01

    In this paper, we propose a multiscale empirical interpolation method for solving nonlinear multiscale partial differential equations. The proposed method combines empirical interpolation techniques and local multiscale methods, such as the Generalized Multiscale Finite Element Method (GMsFEM). To solve nonlinear equations, the GMsFEM is used to represent the solution on a coarse grid with multiscale basis functions computed offline. Computing the GMsFEM solution involves calculating the system residuals and Jacobians on the fine grid. We use empirical interpolation concepts to evaluate these residuals and Jacobians of the multiscale system with a computational cost proportional to the size of the coarse-scale problem rather than the fully resolved fine-scale one. The empirical interpolation method uses basis functions which are built by sampling the nonlinear function we want to approximate a limited number of times. The coefficients needed for this approximation are computed in the offline stage by inverting an inexpensive linear system. The proposed multiscale empirical interpolation techniques: (1) divide computing the nonlinear function into coarse regions; (2) evaluate contributions of nonlinear functions in each coarse region taking advantage of a reduced-order representation of the solution; and (3) introduce multiscale proper-orthogonal-decomposition techniques to find appropriate interpolation vectors. We demonstrate the effectiveness of the proposed methods on several nonlinear multiscale PDEs that are solved with Newton's method and fully implicit time-marching schemes. Our numerical results show that the proposed methods provide a robust framework for solving nonlinear multiscale PDEs on a coarse grid with bounded error and significant computational cost reduction.
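The offline sampling-and-interpolation step described above can be sketched with a generic discrete empirical interpolation procedure. This is a stand-alone illustration on a made-up parametric function, not the GMsFEM code of the paper: snapshots of the nonlinear function are compressed into a POD basis, interpolation points are picked greedily, and a new snapshot is then recovered from only a handful of point evaluations.

```python
import numpy as np

def deim_indices(U):
    """Greedy selection of interpolation points from a POD basis U (n x m)."""
    n, m = U.shape
    idx = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, m):
        # interpolate the j-th basis vector at the points chosen so far
        c = np.linalg.solve(U[np.ix_(idx, list(range(j)))], U[idx, j])
        r = U[:, j] - U[:, :j] @ c      # the residual picks the next point
        idx.append(int(np.argmax(np.abs(r))))
    return np.array(idx)

# offline: snapshots of a nonlinear function f(x; mu) for several parameters mu
x = np.linspace(0.0, 1.0, 200)
snapshots = np.column_stack([np.exp(-mu * x) * np.sin(np.pi * x)
                             for mu in np.linspace(0.5, 5.0, 20)])
U = np.linalg.svd(snapshots, full_matrices=False)[0][:, :5]  # 5-mode POD basis
idx = deim_indices(U)

# online: approximate a new snapshot from only 5 point evaluations
f_new = np.exp(-2.3 * x) * np.sin(np.pi * x)
f_approx = U @ np.linalg.solve(U[idx, :], f_new[idx])
rel_err = np.linalg.norm(f_new - f_approx) / np.linalg.norm(f_new)
```

The online cost scales with the number of interpolation points (5 here), not the fine-grid size (200), which is the cost reduction the abstract refers to.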

  9. The empirical equilibrium structure of diacetylene

    OpenAIRE

    Thorwirth, S.; Harding, M. E.; Muders, D.; Gauss, J.

    2008-01-01

    High-level quantum-chemical calculations are reported at the MP2 and CCSD(T) levels of theory for the equilibrium structure and the harmonic and anharmonic force fields of diacetylene, HCCCCH. The calculations were performed employing Dunning's hierarchy of correlation-consistent basis sets cc-pVXZ, cc-pCVXZ, and cc-pwCVXZ, as well as the ANO2 basis set of Almlöf and Taylor. An empirical equilibrium structure based on experimental rotational constants for thirteen isotopic species of diacety...

  10. Prediction of strong ground motion based on scaling law of earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1991-01-01

    In order to predict strong ground motion more practically, it is important to study how to use a semi-empirical method when no appropriate observation records of actual small events are available as empirical Green's functions. We propose a prediction procedure that uses artificially simulated small ground motions as a substitute for the actual motions. First, we simulate small-event motion by means of the stochastic simulation method proposed by Boore (1983), empirically taking into account path effects in the target region, such as attenuation and the broadening of the waveform envelope. Finally, we attempt to predict the strong ground motion due to a future large earthquake (M 7, Δ = 13 km) using the same summation procedure as the empirical Green's function method. We obtained the result that the characteristics of the synthetic motion using the M 5 motion were in good agreement with those obtained by the empirical Green's function method. (author)
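The stochastic simulation step (Boore, 1983) amounts to shaping windowed Gaussian noise by an ω²-type source spectrum with path attenuation. The following is a toy sketch; the corner frequency, Q, κ, distance, and envelope parameters are illustrative assumptions for demonstration, not values from this study.

```python
import numpy as np

def stochastic_motion(f0=2.0, kappa=0.04, q=100.0, beta=3.5, r=13.0,
                      dt=0.005, n=4096, seed=0):
    """Toy Boore (1983)-style simulation: windowed Gaussian noise shaped
    to an omega-squared acceleration spectrum with path attenuation.
    All parameter defaults are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    t = np.arange(n) * dt
    noise = rng.standard_normal(n) * (t * np.exp(-t / 1.5))   # time envelope
    spec = np.fft.rfft(noise)
    f = np.fft.rfftfreq(n, dt)
    f[0] = f[1]                                   # avoid division by zero at DC
    shape = (2 * np.pi * f) ** 2 / (1.0 + (f / f0) ** 2)      # omega-squared source
    shape *= np.exp(-np.pi * f * r / (q * beta) - np.pi * kappa * f)  # Q and kappa
    acc = np.fft.irfft(spec * shape / shape.max(), n)         # shaped acceleration
    return t, acc

t, acc = stochastic_motion()
```

In the semi-empirical scheme of the abstract, many such simulated small-event motions would then be summed, with appropriate delays, in place of recorded empirical Green's functions.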

  11. Empirical Analysis of Closed-Loop Duopoly Advertising Strategies

    OpenAIRE

    Gary M. Erickson

    1992-01-01

    Closed-loop (perfect) equilibria in a Lanchester duopoly differential game of advertising competition are used as the basis for empirical investigation. Two systems of simultaneous nonlinear equations are formed, one from a general Lanchester model and one from a constrained model. Two empirical applications are conducted. In one involving Coca-Cola and Pepsi-Cola, a formal statistical testing procedure is used to detect whether closed-loop equilibrium advertising strategies are used by the c...

  12. New strong motion network in Georgia: basis for specifying seismic hazard

    Science.gov (United States)

    Kvavadze, N.; Tsereteli, N. S.

    2017-12-01

    Risk created by hazardous natural events is closely related to the sustainable development of society. Global observations have confirmed a tendency of growing losses resulting from natural disasters, among the most dangerous and destructive of which are earthquakes. Georgia is located in a seismically active region, so it is imperative to evaluate probabilistic seismic hazard and seismic risk with proper accuracy. The national network of Georgia includes 35 stations, all of which are seismometers. There are significant gaps in strong-motion recordings, which are essential for seismic hazard assessment. To gather more accelerometer recordings, we have built a strong-motion network distributed over the territory of Georgia. The network includes 6 stations for now, each with a Basalt 4x datalogger and an Episensor ES-T strong-motion sensor. For each site, Vs30 and soil resonance frequencies have been measured. Since all but one station (Tabakhmela, near Tbilisi) are located far from power and internet lines, a special system was created for instrument operation. Solar power is used to supply the system with electricity, and GSM/LTE modems provide internet access. A VPN tunnel was set up using a Raspberry Pi for two-way communication with the stations. The Tabakhmela station is located on the grounds of the Ionosphere Observatory, TSU, and is used as a hub for the network. This location also includes a broadband seismometer and a VLF electromagnetic-wave observation antenna for possible earthquake-precursor studies. On a server located in Tabakhmela, continuous data are collected from all the stations for later use. The recordings will later be used in different seismological and engineering problems, namely selecting and creating a GMPE model for the Caucasus and evaluating probabilistic seismic hazard and seismic risk. These stations are a start, and expansion of the strong-motion network is planned in the future. Along with this, electromagnetic-wave observations will continue and additional antennas will be implemented.

  13. Integrated empirical ethics: loss of normativity?

    Science.gov (United States)

    van der Scheer, Lieke; Widdershoven, Guy

    2004-01-01

    An important discussion in contemporary ethics concerns the relevance of empirical research for ethics. Specifically, two crucial questions pertain, respectively, to the possibility of inferring normative statements from descriptive statements, and to the danger of a loss of normativity if normative statements should be based on empirical research. Here we take part in the debate and defend integrated empirical ethical research: research in which normative guidelines are established on the basis of empirical research and in which the guidelines are empirically evaluated by focusing on observable consequences. We argue that in our concrete example normative statements are not derived from descriptive statements, but are developed within a process of reflection and dialogue that goes on within a specific praxis. Moreover, we show that the distinction in experience between the desirable and the undesirable precludes relativism. The normative guidelines so developed are both critical and normative: they help in choosing the right action and in evaluating that action. Finally, following Aristotle, we plead for a return to the view that morality and ethics are inherently related to one another, and for an acknowledgment of the fact that moral judgments have their origin in experience which is always related to historical and cultural circumstances.

  14. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    Science.gov (United States)

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  15. Empirical equations for the prediction of PGA and pseudo spectral accelerations using Iranian strong-motion data

    Science.gov (United States)

    Zafarani, H.; Luzi, Lucia; Lanzano, Giovanni; Soghrat, M. R.

    2018-01-01

    A recently compiled, comprehensive, and good-quality strong-motion database of Iranian earthquakes has been used to develop local empirical equations for the prediction of peak ground acceleration (PGA) and 5%-damped pseudo-spectral accelerations (PSA) up to 4.0 s. The equations account for style of faulting and four site classes, and use the horizontal distance from the surface projection of the rupture plane as the distance measure. The model predicts the geometric mean of the horizontal components and the vertical-to-horizontal ratio. A total of 1551 free-field acceleration time histories, recorded at distances of up to 200 km from 200 shallow earthquakes (depth < 30 km) with moment magnitudes ranging from Mw 4.0 to 7.3, are used to perform regression analysis using the random-effects algorithm of Abrahamson and Youngs (Bull Seism Soc Am 82:505-510, 1992), which considers between-event as well as within-event errors. Because of the limited data used in the development of previous Iranian ground motion prediction equations (GMPEs) and strong trade-offs between their terms, the previously determined models are likely to have less precise coefficients than those of the current study. The richer database of the current study allows improving on prior works by considering additional variables that could not previously be adequately constrained. Here, a functional form used by Boore and Atkinson (Earthquake Spect 24:99-138, 2008) and Bindi et al. (Bull Seism Soc Am 9:1899-1920, 2011) has been adopted that allows accounting for the saturation of ground motions at close distances. A regression has also been performed for the V/H ratio in order to retrieve vertical components by scaling horizontal spectra. In order to take epistemic uncertainty into account, the new model can be used along with other appropriate GMPEs through a logic-tree framework for seismic hazard assessment in Iran and the Middle East region.
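The "functional form that allows accounting for saturation" referred to above is, schematically, of the Bindi et al. (2011) type; the symbols below are generic placeholders sketched for orientation, not the fitted coefficients of this study:

```latex
\log_{10} Y = e_1 + F_M(M_w)
  + \left[c_1 + c_2\,(M_w - M_{\mathrm{ref}})\right]
    \log_{10}\frac{\sqrt{R^2 + h^2}}{R_{\mathrm{ref}}}
  + F_S + F_{\mathrm{sof}}
```

Here $F_M$ is the magnitude-scaling term, $F_S$ the site term, and $F_{\mathrm{sof}}$ the style-of-faulting term; the pseudo-depth $h$ keeps the predicted motion bounded as $R \to 0$, which is what produces the near-source saturation.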

  16. Empirical prediction of ash deposition propensities in coal-fired utilities

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, F.

    1997-01-01

    This report contains an outline of some of the ash chemistry indices utilized in the EPREDEPO (Empirical PREdiction of DEPOsition) PC program, version 1.0 (DEPO10), developed by Flemming Frandsen of the CHEC Research Programme at the Department of Chemical Engineering, Technical University of Denmark. DEPO10 is a 1st-generation FTN77 Fortran PC program designed to empirically predict ash deposition propensities in coal-fired utility boilers. In this study, expectational data (the empirical basis) from an EPRI-sponsored survey of ash deposition experiences at coal-fired utility boilers, performed by Battelle, have been tested for use on Danish coal chemistry and boiler operational conditions. (au) 31 refs.

  17. Site classification for National Strong Motion Observation Network System (NSMONS) stations in China using an empirical H/V spectral ratio method

    Science.gov (United States)

    Ji, Kun; Ren, Yefei; Wen, Ruizhi

    2017-10-01

    Reliable site classes have not yet been assigned to the stations of the China National Strong Motion Observation Network System (NSMONS) because of a lack of borehole data. This study used an empirical horizontal-to-vertical (H/V) spectral ratio (hereafter, HVSR) site classification method to overcome this problem. First, according to their borehole data, stations selected from KiK-net in Japan were individually assigned a site class (CL-I, CL-II, or CL-III), as defined in the Chinese seismic code. Then, the mean HVSR curve for each site class was computed using strong motion recordings captured during the period 1996-2012. These curves were compared with those proposed by Zhao et al. (2006a) for the four site classes (SC-I, SC-II, SC-III, and SC-IV) defined in the Japanese seismic code (JRA, 1980). It was found that an approximate range of the predominant period Tg could be identified from the predominant peak of the HVSR curve for the CL-I and SC-I sites, the CL-II and SC-II sites, and the CL-III and SC-III + SC-IV sites. Second, an empirical site classification method was proposed based on comprehensive consideration of the peak period, amplitude, and shape of the HVSR curve. The selected stations from KiK-net were classified using the proposed method. The results showed that the success rates of the proposed method in identifying CL-I, CL-II, and CL-III sites were 63%, 64%, and 58%, respectively. Finally, the HVSRs of 178 NSMONS stations were computed based on recordings from 2007 to 2015 and the sites were classified using the proposed method. The mean HVSR curves were re-calculated for the three site classes and compared with those from KiK-net data. It was found that both the peak period and the amplitude were similar for the mean HVSR curves derived from NSMONS classification results and KiK-net borehole data, implying the effectiveness of the proposed method in identifying different site classes. The classification results have good agreement with site classes
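The HVSR computation itself is straightforward; a minimal sketch follows, in which plain moving-average smoothing stands in for the Konno-Ohmachi smoothing commonly used in practice, and the three-component record is synthetic, invented for the demonstration:

```python
import numpy as np

def hvsr(ns, ew, ud, dt, smooth=9):
    """H/V spectral ratio from three-component traces sampled at interval dt."""
    def amp(x):
        spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
        return np.convolve(spec, np.ones(smooth) / smooth, mode="same")
    h = np.sqrt(amp(ns) * amp(ew))              # geometric mean of horizontals
    v = np.maximum(amp(ud), 1e-12)              # guard against division by zero
    return np.fft.rfftfreq(len(ns), dt), h / v

# synthetic record whose horizontal components resonate near 2 Hz
dt, n = 0.01, 4096
t = np.arange(n) * dt
rng = np.random.default_rng(1)
ns = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(n)
ew = np.cos(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(n)
ud = 0.1 * rng.standard_normal(n)
f, ratio = hvsr(ns, ew, ud, dt)
mask = f > 0.5
peak_period = 1.0 / f[mask][np.argmax(ratio[mask])]   # predominant period Tg
```

For this synthetic record the HVSR peak recovers the 0.5 s (2 Hz) resonance; a classification scheme of the kind proposed in the abstract would then bin `peak_period`, the peak amplitude, and the curve shape into site classes.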

  18. The impact of SOA for achieving healthcare interoperability. An empirical investigation based on a hypothetical adoption.

    Science.gov (United States)

    Daskalakis, S; Mantas, J

    2009-01-01

    The evaluation of a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and, in part, a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.

  19. Synthetic strong ground motions for engineering design utilizing empirical Green's functions

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L.J.; Jarpe, S.P.; Kasameyer, P.W.; Foxall, W.

    1996-04-11

    We present a methodology for developing realistic synthetic strong ground motions for specific sites from specific earthquakes. We analyzed the possible ground motion resulting from a M = 7.25 earthquake that ruptures 82 km of the Hayward fault for a site 1.4 km from the fault in the eastern San Francisco Bay area. We developed a suite of 100 rupture scenarios for the Hayward fault earthquake and computed the corresponding strong ground motion time histories. We synthesized strong ground motion with physics-based solutions of earthquake rupture and applied physical bounds on rupture parameters. By having a suite of rupture scenarios of hazardous earthquakes for a fixed magnitude and identifying the hazard to the site from the statistical distribution of engineering parameters, we introduce a probabilistic component into the deterministic hazard calculation. Engineering parameters of synthesized ground motions agree with those recorded from the 1995 Kobe, Japan and the 1992 Landers, California earthquakes at similar distances and site geologies.

  20. Beyond Clinical Case Studies in Psychoanalysis: A Review of Psychoanalytic Empirical Single Case Studies Published in ISI-Ranked Journals

    Science.gov (United States)

    Meganck, Reitske; Inslegers, Ruth; Krivzov, Juri; Notaerts, Liza

    2017-01-01

    Single case studies are at the origin of both theory development and research in the field of psychoanalysis and psychotherapy. While clinical case studies are the hallmark of psychoanalytic theory and practice, their scientific value has been strongly criticized. To address problems with the subjective bias of retrospective therapist reports and the uncontrollability of clinical case studies, systematic approaches to investigate psychotherapy process and outcome at the level of the single case have been developed. Such empirical case studies are also able to bridge the famous gap between academic research and clinical practice, as they provide clinically relevant insights into how psychotherapy works. This study presents a review of psychoanalytic empirical case studies published in ISI-ranked journals and maps the characteristics of the studies, therapists, patients, and therapies investigated. Empirical case studies increased in quantity and quality (amount of information and systematization) over time. While future studies could pay more attention to providing contextual information on therapist characteristics and informed-consent considerations, the available literature provides a basis for conducting meta-studies of single cases and as such contributes to knowledge aggregation. PMID:29046660

  1. Empirical evaluation and justification of methodologies in psychological science.

    Science.gov (United States)

    Proctor, R W; Capaldi, E J

    2001-11-01

    The purpose of this article is to describe a relatively new movement in the history and philosophy of science, naturalism, a form of pragmatism emphasizing that methodological principles are empirical statements. Thus, methodological principles must be evaluated and justified on the same basis as other empirical statements. On this view, methodological statements may be less secure than the specific scientific theories to which they give rise. The authors examined the feasibility of a naturalistic approach to methodology using logical and historical analysis and by contrasting theories that predict new facts versus theories that explain already known facts. They provide examples of how differences over methodological issues in psychology and in science generally may be resolved using a naturalistic, or empirical, approach.

  2. Empirical Descriptions of Criminal Sentencing Decision-Making

    Directory of Open Access Journals (Sweden)

    Rasmus H. Wandall

    2014-05-01

    Full Text Available The article addresses the widespread use of statistical causal modelling to describe criminal sentencing decision-making empirically in Scandinavia. The article describes the characteristics of this model and, on this basis, discusses three aspects of sentencing decision-making that the model does not capture: (1) the role of law and legal structures in sentencing, (2) the processes of constructing law and facts as they occur in the handling of criminal cases, and (3) newer organisational changes to sentencing decision-making. The article argues for a more strongly empirically based design of sentencing models and for a more balanced use of different social-scientific methodologies and models of sentencing decision-making.

  3. Building Strong Customer Relationships through Brand Orientation in Small Service Firms: An Empirical Investigation

    Directory of Open Access Journals (Sweden)

    Miloslava Chovancová

    2015-06-01

    Full Text Available The purpose of this paper is to empirically examine the relationship between the adoption of a brand orientation strategy and customer relationship performance in a small service firm setting. More specifically, in addition to investigating the direct link between brand orientation and customer relationship performance, we further examine the moderating effects of entrepreneurial orientation and perceived competitive intensity on the empirical link between brand orientation and customer relationship performance. To test the hypothesized relationships in the conceptual framework, 105 usable structured questionnaires were collected from small service firms and the data were further analyzed using hierarchical moderated regression analysis. The results affirm the positive link between brand orientation and customer relationship performance. Moreover, entrepreneurial orientation is found to strengthen the brand orientation-customer relationship performance link. However, our results show that competitive intensity does not significantly moderate the brand orientation-customer relationship performance link. Nonetheless, the results strongly suggest that perceived competitive intensity is a direct predictor of customer relationship performance. In terms of the practical significance of the overall research model, the effect size is fairly large (Cohen’s f² = 0.33). The research implications and directions for future research are further highlighted in the penultimate section of the paper.
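The hierarchical moderated regression used here can be sketched on simulated data. Everything below is invented for illustration (only the sample size matches the study): step 1 fits the main effects, step 2 adds the BO × EO interaction, the ΔR² of step 2 quantifies moderation, and one common effect-size definition is Cohen's f² = ΔR² / (1 − R²_full).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 105                                # sample size as in the study; data simulated
bo = rng.normal(size=n)                # brand orientation (mean-centered)
eo = rng.normal(size=n)                # entrepreneurial orientation (moderator)
# assumed "true" effects for the simulation only: positive BO main effect
# and a positive BO x EO interaction (moderation)
y = 0.5 * bo + 0.2 * eo + 0.3 * bo * eo + rng.normal(scale=0.5, size=n)

def r_squared(X, y):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean())), beta

X1 = np.column_stack([np.ones(n), bo, eo])     # step 1: main effects only
X2 = np.column_stack([X1, bo * eo])            # step 2: add the interaction
r2_1, _ = r_squared(X1, y)
r2_2, beta2 = r_squared(X2, y)
delta_r2 = r2_2 - r2_1                         # increment attributable to moderation
f2 = delta_r2 / (1.0 - r2_2)                   # Cohen's f^2 for the increment
```

A positive, sizeable `beta2[3]` (the interaction coefficient) together with a non-trivial `delta_r2` is what "entrepreneurial orientation strengthens the link" means in regression terms.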

  4. An empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
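A stripped-down version of such a regression-based probabilistic forecast can be run on synthetic data; the CO2 ramp, the stand-in ENSO index, and all coefficients below are invented for illustration, not taken from the system described:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1961, 2014)            # the hindcast period of the abstract
n = years.size
co2 = 320.0 + 1.5 * (years - 1961)       # stand-in CO2-equivalent ramp
enso = rng.normal(size=n)                # stand-in ENSO index (e.g. Nino3.4)
# synthetic seasonal-mean temperature: trend + teleconnection + noise
temp = 0.01 * co2 + 0.3 * enso + rng.normal(scale=0.3, size=n)

# multiple linear regression on the chosen predictors
X = np.column_stack([np.ones(n), co2, enso])
beta = np.linalg.lstsq(X, temp, rcond=None)[0]
resid_sd = np.std(temp - X @ beta, ddof=X.shape[1])

# probabilistic forecast for a new season: a Gaussian centred on the
# regression prediction, with spread taken from the hindcast residuals
x_new = np.array([1.0, 320.0 + 1.5 * (2014 - 1961), 1.2])
forecast_mean, forecast_sd = x_new @ beta, resid_sd
```

The Gaussian (`forecast_mean`, `forecast_sd`) is what probabilistic verification metrics such as the CRPS mentioned in the abstract would then be scored against.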

  5. Physician assistant job satisfaction : a narrative review of empirical research

    NARCIS (Netherlands)

    Hooker, Roderick S; Kuilman, Luppo; Everett, Christine M

    2015-01-01

    PURPOSE: To examine physician assistant (PA) job satisfaction, identify factors predicting job satisfaction, and identify areas of needed research. With a global PA movement underway and a half-century in development, the empirical basis for informing employers of approaches to improve job

  6. Internal Variations in Empirical Oxygen Abundances for Giant H II Regions in the Galaxy NGC 2403

    Science.gov (United States)

    Mao, Ye-Wei; Lin, Lin; Kong, Xu

    2018-02-01

    This paper presents a spectroscopic investigation of 11 H II regions in the nearby galaxy NGC 2403. The H II regions are observed with a long-slit spectrograph mounted on the 2.16 m telescope at XingLong station of the National Astronomical Observatories of China. For each of the H II regions, spectra are extracted at different nebular radii along the slit coverage. Oxygen abundances are empirically estimated from the strong-line indices R23, N2O2, O3N2, and N2 for each spectrophotometric unit, with both observation- and model-based calibrations adopted in the derivation. Radial profiles of these diversely estimated abundances are drawn for each nebula. In the results, the oxygen abundances separately estimated with the prescriptions on the basis of observations and models, albeit from the same spectral index, systematically deviate from each other; at the same time, the spectral indices R23 and N2O2 are distributed with flat profiles, whereas N2 and O3N2 exhibit apparent gradients with the nebular radius. Because our study naturally samples various ionization levels, which inherently decline at larger radii within individual H II regions, the radial distributions indicate not only the robustness of R23 and N2O2 against ionization variations but also the sensitivity of N2 and O3N2 to the ionization parameter. The results in this paper provide observational corroboration of the theoretical prediction about the deviation in the empirical abundance diagnostics. Our future work is planned to investigate metal-poor H II regions with measurable T_e, in an attempt to recalibrate the strong-line indices and consequently disclose the cause of the discrepancies between the empirical oxygen abundances.
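For orientation, the strong-line indices named above have the following standard definitions in terms of emission-line flux ratios (textbook conventions, not taken from this paper):

```latex
R_{23} = \frac{[\mathrm{O\,II}]\lambda3727 + [\mathrm{O\,III}]\lambda\lambda4959,5007}{\mathrm{H}\beta}, \qquad
N2O2 = \log\frac{[\mathrm{N\,II}]\lambda6584}{[\mathrm{O\,II}]\lambda3727},
```

```latex
N2 = \log\frac{[\mathrm{N\,II}]\lambda6584}{\mathrm{H}\alpha}, \qquad
O3N2 = \log\left(\frac{[\mathrm{O\,III}]\lambda5007/\mathrm{H}\beta}{[\mathrm{N\,II}]\lambda6584/\mathrm{H}\alpha}\right)
```

The ratios built from lines of widely separated ionization stages (N2, O3N2) are the ones the paper finds to be ionization-sensitive, while R23 and N2O2 combine or cancel that dependence.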

  7. Sparsity guided empirical wavelet transform for fault diagnosis of rolling element bearings

    Science.gov (United States)

    Wang, Dong; Zhao, Yang; Yi, Cai; Tsui, Kwok-Leung; Lin, Jianhui

    2018-02-01

    Rolling element bearings are widely used in various industrial machines, such as electric motors, generators, pumps, gearboxes, railway axles, turbines, and helicopter transmissions. Fault diagnosis of rolling element bearings is beneficial to preventing unexpected accidents and reducing economic loss. In past years, many bearing fault detection methods have been developed. Recently, a new adaptive signal processing method called the empirical wavelet transform has attracted much attention from readers and engineers, and its applications to bearing fault diagnosis have been reported. The main problem of the empirical wavelet transform is that the Fourier segments it requires depend strongly on the local maxima of the amplitudes of the Fourier spectrum of a signal, which means that the Fourier segments are not always reliable and effective if the Fourier spectrum of the signal is complicated and overwhelmed by heavy noise and other strong vibration components. In this paper, sparsity-guided empirical wavelet transform is proposed to automatically establish the Fourier segments required in the empirical wavelet transform for fault diagnosis of rolling element bearings. Industrial bearing fault signals caused by single and multiple railway axle bearing defects are used to verify the effectiveness of the proposed sparsity-guided empirical wavelet transform. Results show that the proposed method can automatically discover the Fourier segments required in the empirical wavelet transform and reveal single and multiple railway axle bearing defects. Besides, comparisons with three popular signal processing methods, including ensemble empirical mode decomposition, the fast kurtogram, and the fast spectral correlation, are conducted to highlight the superiority of the proposed method.
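The Fourier-segmentation step that the proposed method seeks to stabilize can be sketched as follows: detect local maxima of the amplitude spectrum, keep the largest ones, and place segment boundaries midway between adjacent retained peaks. This is a deliberately naive illustration on a clean two-tone signal; on noisy bearing signals this is exactly the step that fails without the sparsity guidance.

```python
import numpy as np

def fourier_segments(x, dt, n_bands=3):
    """Boundaries between Fourier segments, placed at midpoints between the
    largest local maxima of the amplitude spectrum (naive EWT-style step)."""
    spec = np.abs(np.fft.rfft(x))
    f = np.fft.rfftfreq(len(x), dt)
    # interior local maxima of the spectrum
    peaks = np.where((spec[1:-1] > spec[:-2]) & (spec[1:-1] > spec[2:]))[0] + 1
    # keep the n_bands largest peaks, then sort them by frequency
    top = np.sort(peaks[np.argsort(spec[peaks])[-n_bands:]])
    bounds = [(f[top[i]] + f[top[i + 1]]) / 2 for i in range(len(top) - 1)]
    return f[top], np.array(bounds)

# two tones: the single boundary should fall between 5 Hz and 20 Hz
dt, n = 0.001, 8192
t = np.arange(n) * dt
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
peak_f, bounds = fourier_segments(x, dt, n_bands=2)
```

Each resulting band would then be assigned an empirical wavelet filter; the sparsity criterion of the paper replaces the raw local-maxima ranking used here.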

  8. An empirical investigation of Australian Stock Exchange data

    Science.gov (United States)

    Bertram, William K.

    2004-10-01

    We present an empirical study of high frequency Australian equity data examining the behaviour of distribution tails and the existence of long memory. A method is presented allowing us to deal with Australian Stock Exchange data by splitting it into two separate data series representing an intraday and overnight component. Power-law exponents for the empirical density functions are estimated and compared with results from other studies. Using the autocorrelation and variance plots we find there to be a strong indication of long-memory type behaviour in the absolute return, volume and transaction frequency.
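Tail-exponent estimation of the sort described can be sketched with the Hill estimator on synthetic Pareto-tailed "returns"; the tail index α = 3 and the number of order statistics k are illustrative choices, not values from the study:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the power-law tail index from the k largest |x|."""
    tail = np.sort(np.abs(x))[-k:]          # k largest magnitudes, ascending
    return k / np.sum(np.log(tail / tail[0]))

# synthetic heavy-tailed returns with known tail index alpha = 3
rng = np.random.default_rng(7)
alpha = 3.0
u = rng.uniform(size=100_000)
signs = rng.choice([-1.0, 1.0], size=u.size)
returns = signs * u ** (-1.0 / alpha)       # Pareto(alpha) magnitudes
alpha_hat = hill_estimator(returns, k=2000)
```

The estimate converges to the true index as the sample grows; in empirical work the main difficulty is choosing k, i.e. where the power-law tail begins.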

  9. Rating environmental noise on the basis of noise maps

    NARCIS (Netherlands)

    Miedema, H.M.E.; Borst, H.C.

    2006-01-01

    A system that rates noise on the basis of noise maps has been developed which is based on empirical exposure-response relationships, so that effects in the community will be lower if the system gives a better rating. It is consistent with noise metrics and effect endpoint chosen in the EU, i.e., it

  10. A global empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies, and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally demanding, and prone to biases. Empirical forecast approaches, built on statistical models that represent physical processes, offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
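
    The regression core of such a system can be sketched with ordinary least squares: a CO2-like trend plus an ENSO-like index predict seasonal temperature, and the residual spread supplies the width of a Gaussian probabilistic forecast. All data below are synthetic stand-ins, not the system's actual predictors:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1961, 2014)
co2 = 320.0 + 1.5 * (years - 1961)        # synthetic CO2-equivalent trend
enso = rng.standard_normal(years.size)    # synthetic ENSO-like index
temp = 0.01 * co2 + 0.3 * enso + 0.2 * rng.standard_normal(years.size)

# ordinary least squares: intercept, CO2 trend, ENSO index
X = np.column_stack([np.ones_like(co2), co2, enso])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid_sd = np.std(temp - X @ beta, ddof=X.shape[1])

# probabilistic forecast for a new season: Gaussian(mean, resid_sd)
x_new = np.array([1.0, 320.0 + 1.5 * (2014 - 1961), 0.5])
print(f"forecast: {x_new @ beta:.2f} +/- {resid_sd:.2f}")
```

    In hindcast validation, the Gaussian predictive distribution produced this way is what the continuous ranked probability score evaluates against the observed outcome.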

  11. Internet governance and global self regulation: theoretical and empirical building blocks for a general theory of self regulation

    NARCIS (Netherlands)

    Vey Mestdagh, C.; Rijgersberg, R.

    2010-01-01

    The following exposition sets out to identify the basic theoretical and empirical building blocks for a general theory of self-regulation. It uses the Internet as an empirical basis since its global reach and technical characteristics create interdependencies between actors that transcend national

  12. First-principle modelling of forsterite surface properties: Accuracy of methods and basis sets.

    Science.gov (United States)

    Demichelis, Raffaella; Bruno, Marco; Massaro, Francesco R; Prencipe, Mauro; De La Pierre, Marco; Nestola, Fabrizio

    2015-07-15

    The seven main crystal surfaces of forsterite (Mg2SiO4) were modeled using various Gaussian-type basis sets and several formulations of the exchange-correlation functional within density functional theory (DFT). The recently developed pob-TZVP basis set provides the best results for all properties that are strongly dependent on the accuracy of the wavefunction. Convergence on the structure and on the basis set superposition error-corrected surface energy can be reached also with poorer basis sets. The effect of adopting different DFT functionals was assessed. All functionals give the same stability order for the various surfaces. Surfaces do not exhibit any major structural differences when optimized with different functionals, except for higher-energy orientations, where major rearrangements occur around the Mg sites at the surface or subsurface. When dispersion is not accounted for, all functionals provide similar surface energies. The inclusion of empirical dispersion raises the energy of all surfaces by a nearly systematic value proportional to the scaling factor s of the dispersion formulation. An estimate of the surface energy is provided by adopting C6 coefficients that are more suitable than the standard ones for describing O-O interactions in minerals. A 2 × 2 supercell of the most stable surface, (010), was optimized. No surface reconstruction was observed. The resulting structure and surface energy show no difference with respect to those obtained using the primitive cell. This result validates the (010) surface model adopted here, which will serve as a reference for future studies on adsorption and reactivity of water and carbon dioxide at this interface. © 2015 Wiley Periodicals, Inc.

  13. Estimation of strong ground motion

    International Nuclear Information System (INIS)

    Watabe, Makoto

    1993-01-01

    A fault model has been developed to estimate strong ground motion, taking into consideration the characteristics of the seismic source and the propagation path of seismic waves. There are two different approaches within the model: the first is theoretical, while the second is semi-empirical. Though the latter is more practical than the former for estimating input motions, it requires at least the small-event records, the value of the seismic moment of the small event, and the fault model of the large event

  14. Behavioural processes in marketing channel relationships: Review and integration of empirical evidence

    DEFF Research Database (Denmark)

    Jensen, Nils Bøgelund; Skytte, Hans

    1997-01-01

    This paper reviews the empirical research on behavioural processes in marketing channel relationships. Systematically examining nine international journals, we find 49 papers on behavioural processes. On the basis of the hypothesis tests in the papers, we discuss the results and integrate...

  15. Strong anticipation and long-range cross-correlation: Application of detrended cross-correlation analysis to human behavioral data

    Science.gov (United States)

    Delignières, Didier; Marmelat, Vivien

    2014-01-01

    In this paper, we analyze empirical data, accounting for coordination processes between complex systems (bimanual coordination, interpersonal coordination, and synchronization with a fractal metronome), by using a recently proposed method: detrended cross-correlation analysis (DCCA). This work is motivated by the strong anticipation hypothesis, which supposes that coordination between complex systems is not achieved on the basis of local adaptations (i.e., corrections, predictions), but results from a more global matching of complexity properties. Indeed, recent experiments have evidenced a very close correlation between the scaling properties of the series produced by two coordinated systems, despite a quite weak local synchronization. We hypothesized that strong anticipation should result in the presence of long-range cross-correlations between the series produced by the two systems. Results allow a detailed analysis of the effects of coordination on the fluctuations of the series produced by the two systems. In the long term, series tend to present similar scaling properties, with clear evidence of long-range cross-correlation. Short-term results strongly depend on the nature of the task. Simulation studies allow disentangling the respective effects of noise and short-term coupling processes on DCCA results, and suggest that the matching of long-term fluctuations could be the result of short-term coupling processes.
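
    The core of DCCA can be sketched as follows: integrate both series, detrend them linearly in non-overlapping windows of size n, average the covariance of the detrended residuals, and read the scaling exponent off the slope of log F against log n. A basic sketch on synthetic coupled noise (not the authors' implementation; the window scheme and test series are illustrative):

```python
import numpy as np

def dcca(x, y, scales):
    """Detrended cross-correlation fluctuation F_DCCA(n) (basic sketch)."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    out = []
    for n in scales:
        covs = []
        for start in range(0, len(X) - n + 1, n):
            seg = slice(start, start + n)
            t = np.arange(n)
            # linear detrending in each non-overlapping window
            rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
            ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
            covs.append(np.mean(rx * ry))
        out.append(np.sqrt(abs(np.mean(covs))))
    return np.array(out)

rng = np.random.default_rng(2)
common = rng.standard_normal(4096)
x = common + 0.5 * rng.standard_normal(4096)
y = common + 0.5 * rng.standard_normal(4096)
scales = np.array([8, 16, 32, 64, 128])
F = dcca(x, y, scales)
# slope of log F vs log n estimates the cross-correlation scaling exponent
print(np.polyfit(np.log(scales), np.log(F), 1)[0])
```

    For two series sharing an uncorrelated (white) common component, the exponent is near 0.5; long-range cross-correlation shows up as an exponent well above 0.5.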

  16. Horizontal and Vertical Intra-Industry Trade: Is the Empirical Classification Usable?

    DEFF Research Database (Denmark)

    Nielsen, Jørgen Ulff-Møller; Lüthje, Teit

    2002-01-01

    On the basis of OECD trade statistics at SITC 5 digit level for the period 1961-1999 this paper shows the classification of international trade in (1) inter-industry trade; (2) horizontal intra-industry; and (3) vertical intra-industry trade used in the empirical trade literature to be unstable...

  17. Threshold model of cascades in empirical temporal networks

    Science.gov (United States)

    Karimi, Fariba; Holme, Petter

    2013-08-01

    Threshold models try to explain the consequences of social influence, such as the spread of fads and opinions. Along with models of epidemics, they constitute a major theoretical framework for social spreading processes. In threshold models on static networks, an individual changes her state if a certain fraction of her neighbors has done the same. When there are strong correlations in the temporal aspects of contact patterns, it is useful to represent the system as a temporal network, in which not only the contacts but also their times are represented explicitly. In many cases, bursty temporal patterns slow down disease spreading. However, as we will see, this is not a universal truth for threshold models. In this work we propose an extension of Watts's classic threshold model to temporal networks. We do this by assuming that an agent is influenced by contacts which lie a certain time into the past; that is, individuals are affected only by contacts within a time window. In addition to thresholds on the fraction of contacts, we also investigate the number of contacts within the time window as a basis for influence. To elucidate the model's behavior, we run the model on real and randomized empirical contact datasets.
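
    The windowed threshold rule described above can be sketched as follows; this is a simplified fractional-threshold variant run on synthetic contacts, with all parameter values chosen for illustration:

```python
import random
from collections import defaultdict, deque

def temporal_threshold(contacts, threshold=0.25, window=10.0, seeds=(0,)):
    """Watts-style fractional threshold model on a temporal contact list
    (a simplified sketch of the model class described above): a node adopts
    when, among its contacts inside the trailing time window, the fraction
    made with already-adopted partners reaches the threshold."""
    adopted = set(seeds)
    recent = defaultdict(deque)            # node -> deque of (time, partner)
    for t, i, j in sorted(contacts):
        for a, b in ((i, j), (j, i)):
            q = recent[a]
            q.append((t, b))
            while q and q[0][0] < t - window:   # forget contacts outside window
                q.popleft()
            if a not in adopted:
                frac = sum(p in adopted for _, p in q) / len(q)
                if frac >= threshold:
                    adopted.add(a)
    return adopted

random.seed(3)
contacts = [(random.uniform(0, 100), *random.sample(range(20), 2))
            for _ in range(2000)]
cascade = temporal_threshold(contacts, threshold=0.25, seeds=(0, 1, 2, 3, 4))
print(len(cascade))
```

    For the count-based rule also investigated in the paper, the adoption test would instead compare `sum(p in adopted for _, p in q)` against an absolute number of contacts.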

  18. On calculation of difference in specific heats at constant pressure and constant volume using an empiric Nernst-Lindeman equation

    International Nuclear Information System (INIS)

    Leont'ev, K.L.

    1981-01-01

    Known theoretical and empirical formulae for the difference between the specific heats at constant pressure and constant volume are considered. On the basis of the Grüneisen law relating specific heat to thermal expansion, and of a correlation proposed by the author between this ratio and the average velocity of elastic waves, a new expression for the difference in specific heats is obtained, and the conditions under which the empirical Nernst-Lindemann equation can be considered exact are determined. Results of calculations for metals with fcc lattices are presented
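
    In standard notation, the exact thermodynamic identity for the specific-heat difference and the empirical Nernst-Lindemann form discussed above can be written as follows (alpha is the volumetric thermal-expansion coefficient, kappa_T the isothermal compressibility, A an empirical constant, and T_m the melting temperature; this is a reference sketch, not the paper's own derivation):

```latex
\begin{align*}
C_p - C_V &= \frac{\alpha^2 V T}{\kappa_T} && \text{(exact thermodynamic identity)} \\
C_p - C_V &= A\, C_p^{2}\, \frac{T}{T_m} && \text{(empirical Nernst--Lindemann form)}
\end{align*}
```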

  19. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ansuategi, A. [Ekonomi Analisiaren Oinarriak I Saila, Ekonomi Zientzien Fakultatea, Lehendakari Agirre Etorbidea, 83, 48015 Bilbao (Spain)

    2003-10-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe.

  20. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    International Nuclear Information System (INIS)

    Ansuategi, A.

    2003-01-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe

  1. Uniform convergence of the empirical spectral distribution function

    NARCIS (Netherlands)

    Mikosch, T; Norvaisa, R

    1997-01-01

    Let X be a linear process having a finite fourth moment. Assume F is a class of square-integrable functions. We consider the empirical spectral distribution function J(n,X) based on X and indexed by F. If F is totally bounded then J(n,X) satisfies a uniform strong law of large numbers. If, in

  2. Factors influencing the potential for strong brand relationships with consumer product brands: An overview and research agenda

    DEFF Research Database (Denmark)

    Bech-Larsen, Tino; Bergkvist, Lars; Francis, Julie

    Based on the premise that consumer product brands differ in their potential to form strong long-term relationships with consumers, this paper aims to identify factors that influence brands' potential for strong long-term relationships and to suggest how these can be empirically investigated. The paper reviews brand-centric and consumer-centric research and identifies twelve brand variables that may influence the relationship potential of consumer product brands. A research agenda is suggested, and a number of issues that need to be resolved before empirical research can be carried out are discussed. The paper concludes by speculating on possible outcomes of future empirical studies, and it is suggested that multiple brand variables will have to be employed to evaluate the relationship potential of brands.

  3. Coulomb interaction in the supermultiplet basis

    International Nuclear Information System (INIS)

    Ruzha, Ya.Kh.; Guseva, T.V.; Tamberg, Yu.Ya.; Vanagas, V.V.

    1989-01-01

    An approximate expression for the matrix elements of the Coulomb interaction operator in the supermultiplet basis has been derived, taking into account the orbitally non-symmetric terms. From the general expression, a simplified formula for the Coulomb interaction energy has been proposed. On the basis of the expression obtained, the contribution of the Coulomb interaction within the framework of a strongly restricted dynamic model has been studied in the light (4≤A≤40) and heavy (158≤A≤196) nuclei regions. 19 refs.; 4 tabs.

  4. Understanding the Organizational Nature of Student Persistence: Empirically-based Recommendations for Practice.

    Science.gov (United States)

    Berger, Joseph B.

    2002-01-01

    Builds on the assumption that colleges and universities are organizations and subsequently that the organizational perspective provides important insights for improving retention on campuses. A review of existing organizational studies of undergraduate persistence serves as the basis for ten empirically-based recommendations for practice that are…

  5. The empirical equilibrium structure of diacetylene

    Science.gov (United States)

    Thorwirth, Sven; Harding, Michael E.; Muders, Dirk; Gauss, Jürgen

    2008-09-01

    High-level quantum-chemical calculations are reported at the MP2 and CCSD(T) levels of theory for the equilibrium structure and the harmonic and anharmonic force fields of diacetylene, H-C≡C-C≡C-H. The calculations were performed employing Dunning's hierarchy of correlation-consistent basis sets cc-pVXZ, cc-pCVXZ, and cc-pwCVXZ, as well as the ANO2 basis set of Almlöf and Taylor. An empirical equilibrium structure based on experimental rotational constants for 13 isotopic species of diacetylene and computed zero-point vibrational corrections is determined (r_e^emp: r(C-H) = 1.0615 Å, r(C≡C) = 1.2085 Å, r(C-C) = 1.3727 Å) and is in good agreement with the best theoretical structure (CCSD(T)/cc-pCV5Z: r(C-H) = 1.0617 Å, r(C≡C) = 1.2083 Å, r(C-C) = 1.3737 Å). In addition, the computed fundamental vibrational frequencies are compared with the available experimental data and found to be in satisfactory agreement.

  6. Assessment of empirical formulae for local response of concrete structures to hard projectile impact

    International Nuclear Information System (INIS)

    Buzaud, E.; Cazaubon, Ch.; Chauvel, D.

    2007-01-01

    The outcome of the impact of a hard projectile on a reinforced concrete structure is affected by different parameters, such as the configuration of the interaction; the projectile geometry, mass, and velocity; and the target geometry, reinforcement, and concrete mechanical properties. These parameters have been investigated experimentally over the last 30 years, providing a basis for simplified mathematical models such as empirical formulae. The aim of the authors is to assess the relative performance of classical and more recent empirical formulae. (authors)

  7. Intuition in Decision Making –Theoretical and Empirical Aspects

    Directory of Open Access Journals (Sweden)

    Kamila Malewska

    2015-11-01

    In an economy dominated by information and knowledge, analysis ceases to be the sole and sufficient source of knowledge. Managers seek alternative ways of obtaining and interpreting information and knowledge, and here managerial intuitive potential begins to play an important role. The aim of this paper is to present the issue of intuition in decision making in both theoretical and empirical terms. The first part presents the essence of intuition and its role in management, especially in decision making. The empirical part then attempts to identify the intuitive potential of managers and the extent of its use in practical decision making. The case study method was used to achieve this goal. The analysis involved a Polish food company, "Fawor", which employs more than 300 workers. These literature and empirical studies in the area of intuition were conducted within the research project "The impact of managerial intuitive potential on the effectiveness of decision making processes", financed by the National Science Centre, Poland (funds allocated on the basis of decision No. DEC-2014/13/D/HS4/01750

  8. Empirical continuation of the differential cross section

    International Nuclear Information System (INIS)

    Borbely, I.

    1978-12-01

    The theoretical basis as well as the practical methods of empirical continuation of the differential cross section into the nonphysical region of the cos theta variable are discussed. The equivalence of the different methods is proved. A physical applicability condition is given and the published applications are reviewed. In many cases the correctly applied procedure turns out to provide nonsignificant or even incorrect structure information which points to the necessity for careful and statistically complete analysis of the experimental data with a physical understanding of the analysed process. (author)

  9. Theoretical and Empirical Descriptions of Thermospheric Density

    Science.gov (United States)

    Solomon, S. C.; Qian, L.

    2004-12-01

    The longest-term and most accurate overall description of the density of the upper thermosphere is provided by analysis of changes in the ephemerides of Earth-orbiting satellites. Empirical models of the thermosphere developed in part from these measurements can do a reasonable job of describing thermospheric properties on a climatological basis, but the promise of first-principles global general circulation models of the coupled thermosphere/ionosphere system is that a true high-resolution, predictive capability may ultimately be developed for thermospheric density. However, several issues are encountered when attempting to tune such models so that they accurately represent absolute densities as a function of altitude, and their changes on solar-rotational and solar-cycle time scales. Among these are the crucial ones of getting the heating rates (from both solar and auroral sources) right, getting the cooling rates right, and establishing the appropriate boundary conditions. There are several ancillary issues as well, such as the problem of registering a pressure-coordinate model onto an altitude scale, and dealing with possible departures from hydrostatic equilibrium in empirical models. Thus, tuning a theoretical model to match empirical climatology may be difficult, even in the absence of high temporal or spatial variation of the energy sources. We will discuss some of the challenges involved, and show comparisons of simulations using the NCAR Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM) to empirical model estimates of neutral thermosphere density and temperature. We will also show some recent simulations using measured solar irradiance from the TIMED/SEE instrument as input to the TIE-GCM.

  10. Strong plasma shock structures based on the Navier--Stokes equations

    International Nuclear Information System (INIS)

    Abe, K.

    1975-01-01

    The structure of a plasma collisional shock wave is examined on the basis of the Navier--Stokes equations and, simultaneously, on the basis of the Fokker--Planck equation. The resultant structures are compared to check the validity of the Navier--Stokes equations as applied to the structures of strong shock waves. The Navier--Stokes equations give quite accurate structures for weak shock waves. For strong shock waves, the detailed structures obtained from the Navier--Stokes equations differ from the results of the Fokker--Planck equation, but the shock thicknesses of the two shock waves are in relatively close agreement

  11. Empirical reality, empirical causality, and the measurement problem

    International Nuclear Information System (INIS)

    d'Espagnat, B.

    1987-01-01

    Does physics describe anything that can meaningfully be called independent reality, or is it merely operational? Most physicists implicitly favor an intermediate standpoint, which takes quantum physics into account but nevertheless holds fast to quite strictly realistic ideas about apparently obvious facts concerning macro-objects. Part 1 of this article, which is a survey of recent measurement theories, shows that, when made explicit, the standpoint in question cannot be upheld. Part 2 brings forward a proposal for making minimal changes to this standpoint so as to remove such objections. The empirical reality thus constructed is a notion that, to some extent, does ultimately refer to the human means of apprehension and of data processing. It nevertheless cannot be said to reduce to a mere name labelling a set of recipes that never fail. It is shown that the usual notion of macroscopic causality must be endowed with similar features

  12. Strong motions observed by K-NET and KiK-net during the 2016 Kumamoto earthquake sequence

    Science.gov (United States)

    Suzuki, Wataru; Aoi, Shin; Kunugi, Takashi; Kubo, Hisahiko; Morikawa, Nobuyuki; Nakamura, Hiromitsu; Kimura, Takeshi; Fujiwara, Hiroyuki

    2017-01-01

    The nationwide strong-motion seismograph networks K-NET and KiK-net in Japan successfully recorded the strong ground motions of the 2016 Kumamoto earthquake sequence, which show several notable characteristics. For the first large earthquake, with a JMA magnitude of 6.5 (21:26, April 14, 2016, JST), the large strong motions are concentrated near the epicenter, and the strong-motion attenuation is well predicted by the empirical relation for crustal earthquakes with a moment magnitude of 6.1. For the largest earthquake of the sequence, with a JMA magnitude of 7.3 (01:25, April 16, 2016, JST), the large peak ground accelerations and velocities extend from the epicentral area to the northeast. The attenuation of peak ground accelerations generally follows the empirical relation, whereas that of velocities deviates from it for stations at epicentral distances greater than 200 km, which can be attributed to a large Love wave with a dominant period around 10 s. Large accelerations were observed even at stations in the Oita region, more than 70 km northeast of the epicenter. They are attributed to a locally induced earthquake in the Oita region, whose moment magnitude is estimated to be 5.5 by matching the amplitudes of the corresponding phases with the empirical attenuation relation. Real-time strong-motion observation has the potential to contribute to the mitigation of ongoing earthquake disasters. We test a methodology to forecast, in real time, the regions to be exposed to large shaking, developed on the basis that neighboring stations have already been shaken, for the largest event of the Kumamoto earthquakes, and demonstrate that it is simple but effective for quickly issuing warnings. We also show that real-time interpolation of the strong motions is feasible, which will be utilized for real-time forecasting of ground motions based on observed shaking.
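
    The real-time interpolation mentioned above can be illustrated with simple inverse-distance weighting of observed peak ground accelerations; the station coordinates and values below are invented, and the operational method is more sophisticated than this toy stand-in:

```python
import numpy as np

def idw_pga(stations, target, power=2.0):
    """Inverse-distance-weighted interpolation of peak ground acceleration
    at an unobserved site (a toy stand-in for real-time interpolation)."""
    xy = np.array([s[:2] for s in stations], dtype=float)
    pga = np.array([s[2] for s in stations], dtype=float)
    d = np.linalg.norm(xy - np.asarray(target, dtype=float), axis=1)
    if np.any(d == 0):                     # target coincides with a station
        return float(pga[d == 0][0])
    w = d ** -power
    return float(np.sum(w * pga) / np.sum(w))

# hypothetical stations: (x_km, y_km, observed PGA in gal)
obs = [(0.0, 0.0, 400.0), (30.0, 0.0, 150.0), (0.0, 40.0, 90.0)]
print(round(idw_pga(obs, (10.0, 10.0)), 1))
```

    The interpolated value at an unmonitored site can then be checked against a warning threshold, mirroring the neighboring-station logic described in the abstract.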

  13. Stewardship and risk: An empirically grounded theory of organic fish farming in Scotland

    NARCIS (Netherlands)

    Georgakopoulos, G.; Ciancanelli, P.; Coulson, A.; Kaldis, P.E.

    2008-01-01

    It has long been assumed ownership gives farmers incentives to act as stewards of the land. On this basis, quasi-property rights are granted to fish farmers to encourage them to manage risks to the aquatic environment. This paper offers an empirically grounded theorisation of fish farmers’

  14. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    Science.gov (United States)

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and especially nursing homes are increasingly being confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to make a critical analysis of whether these policies can be considered organizational-ethical instruments that support healthcare institutions in taking institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we conducted a process of reinterpretation of the results of former Belgian empirical studies on written institutional ethics policies on euthanasia, in dialogue with the existing international literature. The study findings revealed that legal regulations and ethical and care-oriented aspects strongly affected the development, content, and impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care, and ethics) constituted the basis for the empirical-based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency, or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered a dynamic process. Administrators and ethics committees must take responsibility for actively creating an ethical climate that supports care providers who have to deal with ethical dilemmas in their practice.

  15. CULTURAL IDENTITY AS BASIS OF FORMATION OF THE STATE

    OpenAIRE

    Sukhanov Vyacheslav Vladimirovich

    2012-01-01

    The ethnic question in Russia was on the agenda both in the Soviet Union and in the Russian Empire. In modern Russia, which is experiencing enduring domestic unrest and increasing world influence, the question of cultural identity arises very sharply. The creation of a civil nation, on the basis of a uniform culture and a system of value reference points, still remains no more than an idea. Reconstruction or design of essentially new institutions for regulating relations with ethnic minorities -temporary measu...

  16. Why Does Emissions Trading under the EU ETS Not Affect Firms' Competitiveness? Empirical Findings from the Literature

    OpenAIRE

    Joltreau, Eugénie; Sommerfeld, Katrin

    2017-01-01

    Environmental policies may have important consequences for firms' competitiveness or profitability. However, the empirical literature shows that hardly any statistically significant effects on firms can be detected for the European Union Emissions Trading Scheme (EU ETS). On the basis of the existing literature, we focus on potential explanations for why the empirical literature finds hardly any significant competitiveness effects on firms, at least not during the first two phases of the scheme (...

  17. The empirical potential of live streaming beyond cognitive psychology

    Directory of Open Access Journals (Sweden)

    Alexander Nicolai Wendt

    2017-03-01

    Empirical methods of self-description, think-aloud protocols, and introspection have been extensively criticized or neglected in behaviorist and cognitivist psychology. Their methodological value has been fundamentally questioned, since there is apparently no sufficient proof of their validity. However, the major arguments against self-description can be critically reviewed by theoretical psychology; in this way, the empirical value of these methods can be redeemed. Furthermore, self-descriptive methods can be updated by the use of contemporary media technology. In order to support the promising perspectives for future empirical research in the field of cognitive psychology, Live Streaming is proposed as a viable data source. Introducing this new paradigm, this paper presents some of the formal constituents and accessible contents of Live Streaming, and relates them to established forms of empirical research. By its structure and established usage, Live Streaming bears remarkable resemblances to the traditional methods of self-description, yet it also adds fruitful new features of use. On the basis of its qualities, the possible benefits that appear feasible in comparison with the traditional methods of self-description are elaborated, such as Live Streaming's ecological validity. Ultimately, controversial theoretical concepts, such as those in phenomenology and cultural-historical psychology, are adopted to sketch further potential benefits of the utility of Live Streaming in current psychology debates.

  18. Effective interactions in strongly-coupled quantum systems

    International Nuclear Information System (INIS)

    Chen, J.M.C.

    1986-01-01

    In this thesis, the author studies the role of effective interactions in strongly-coupled Fermi systems, where short-range correlations introduce difficulties requiring special treatment. The correlated basis function method provides the means to incorporate the short-range correlations and generate the matrix elements of the Hamiltonian and identity operators in a nonorthogonal basis of states, which are so important to these studies. In the first half of the thesis, the particle-hole channel is examined to elucidate the effects of collective excitations. Proceeding from a least-action principle, a generalization of the random-phase approximation is developed, capable of describing such strongly-interacting Fermi systems as nuclei, nuclear matter, neutron-star matter, and liquid 3He. A linear response of the dynamically correlated system to a weak external perturbation is also derived within the same framework. In the second half of the thesis, the particle-particle channel is examined to elucidate the effects of pairing in nuclear and neutron-star matter

  19. Empirical data and moral theory. A plea for integrated empirical ethics.

    Science.gov (United States)

    Molewijk, Bert; Stiggelbout, Anne M; Otten, Wilma; Dupuis, Heleen M; Kievit, Job

    2004-01-01

Ethicists differ considerably in their reasons for using empirical data. This paper presents a brief overview of four traditional approaches to the use of empirical data: "the prescriptive applied ethicists," "the theorists," "the critical applied ethicists," and "the particularists." The main aim of this paper is to introduce a fifth approach of more recent date (i.e. "integrated empirical ethics") and to offer some methodological directives for research in integrated empirical ethics. All five approaches are presented in a table for heuristic purposes. The table consists of eight columns: "view on distinction descriptive-prescriptive sciences," "location of moral authority," "central goal(s)," "types of normativity," "use of empirical data," "method," "interaction empirical data and moral theory," and "cooperation with descriptive sciences." Ethicists can use the table in order to identify their own approach. Reflection on these issues prior to starting research in empirical ethics should lead to harmonization of the different scientific disciplines and effective planning of the final research design. Integrated empirical ethics (IEE) refers to studies in which ethicists and descriptive scientists cooperate continuously and intensively. Both disciplines try to integrate moral theory and empirical data in order to reach a normative conclusion with respect to a specific social practice. IEE is not wholly prescriptive or wholly descriptive since it assumes an interdependence between facts and values and between the empirical and the normative. The paper ends with three suggestions for consideration on some of the future challenges of integrated empirical ethics.

  20. Weak and strong coupling equilibration in nonabelian gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Keegan, Liam [Physics Department, Theory Unit, CERN,CH-1211 Genève 23 (Switzerland); Kurkela, Aleksi [Physics Department, Theory Unit, CERN,CH-1211 Genève 23 (Switzerland); Faculty of Science and Technology, University of Stavanger,4036 Stavanger (Norway); Romatschke, Paul [Department of Physics, 390 UCB, University of Colorado at Boulder,Boulder, CO (United States); Center for Theory of Quantum Matter, University of Colorado,Boulder, Colorado 80309 (United States); Schee, Wilke van der [Center for Theoretical Physics, MIT,Cambridge, MA 02139 (United States); Zhu, Yan [Department of Physics, University of Jyväskyla, P.O. Box 35, FI-40014 University of Jyväskylä (Finland); Helsinki Institute of Physics,P.O. Box 64, 00014 University of Helsinki (Finland)

    2016-04-06

    We present a direct comparison studying equilibration through kinetic theory at weak coupling and through holography at strong coupling in the same set-up. The set-up starts with a homogeneous thermal state, which then smoothly transitions through an out-of-equilibrium phase to an expanding system undergoing boost-invariant flow. This first apples-to-apples comparison of equilibration provides a benchmark for similar equilibration processes in heavy-ion collisions, where the equilibration mechanism is still under debate. We find that results at weak and strong coupling can be smoothly connected by simple, empirical power-laws for the viscosity, equilibration time and entropy production of the system.

  1. Weak and strong coupling equilibration in nonabelian gauge theories

    International Nuclear Information System (INIS)

    Keegan, Liam; Kurkela, Aleksi; Romatschke, Paul; Schee, Wilke van der; Zhu, Yan

    2016-01-01

    We present a direct comparison studying equilibration through kinetic theory at weak coupling and through holography at strong coupling in the same set-up. The set-up starts with a homogeneous thermal state, which then smoothly transitions through an out-of-equilibrium phase to an expanding system undergoing boost-invariant flow. This first apples-to-apples comparison of equilibration provides a benchmark for similar equilibration processes in heavy-ion collisions, where the equilibration mechanism is still under debate. We find that results at weak and strong coupling can be smoothly connected by simple, empirical power-laws for the viscosity, equilibration time and entropy production of the system.

  2. Empirical links between natural mortality and recovery in marine fishes.

    Science.gov (United States)

    Hutchings, Jeffrey A; Kuparinen, Anna

    2017-06-14

Probability of species recovery is thought to be correlated with specific aspects of organismal life history, such as age at maturity and longevity, and how these affect rates of natural mortality (M) and maximum per capita population growth (r_max). Despite strong theoretical underpinnings, these correlates have been based on predicted rather than realized population trajectories following threat mitigation. Here, we examine the level of empirical support for postulated links between a suite of life-history traits (related to maturity, age, size and growth) and recovery in marine fishes. Following threat mitigation (median time since cessation of overfishing = 20 years), 71% of 55 temperate populations had fully recovered, the remainder exhibiting, on average, negligible change (impaired recovery). Singly, life-history traits did not influence recovery status. In combination, however, those that jointly reflect length-based mortality at maturity, M_α, revealed that recovered populations have higher M_α, which we hypothesize to reflect local adaptations associated with greater r_max. But, within populations, the smaller sizes at maturity generated by overfishing are predicted to increase M_α, slowing recovery and increasing its uncertainty. We conclude that recovery potential is greater for populations adapted to high M but that temporal increases in M concomitant with smaller size at maturity will have the opposite effect. The recovery metric documented here (M_α) has a sound theoretical basis, is significantly correlated with direct estimates of M that directly reflect r_max, is not reliant on data-intensive time series, can be readily estimated, and offers an empirically defensible correlate of recovery, given its clear links to the positive and impaired responses to threat mitigation that have been observed in fish populations over the past three decades. © 2017 The Author(s).

  3. EMPIRICAL TESTING OF MODIFIED BLACK-SCHOLES OPTION PRICING MODEL FORMULA ON NSE DERIVATIVE MARKET IN INDIA

    Directory of Open Access Journals (Sweden)

    Ambrish Gupta

    2013-01-01

The main objectives of this paper are to incorporate a modification into the Black-Scholes option pricing formula by adding some new variables on the basis of given assumptions related to the risk-free interest rate, and to show the calculation process for the new risk-free interest rate on the basis of the modified variables. The paper also identifies the various situations that arise in empirical testing of the modified and original Black-Scholes formulas with respect to market value, on the basis of the assumed and calculated risk-free interest rates.
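For reference, the unmodified Black-Scholes call formula that the paper takes as its starting point can be sketched as follows. This is the standard textbook formula only; the paper's added variables and its modified risk-free rate are not reproduced in the abstract, so they are not implemented here.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Original (unmodified) Black-Scholes European call price.

    S: spot price, K: strike, T: years to expiry,
    r: risk-free interest rate, sigma: volatility.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
```

An empirical test of the kind the paper describes would compare such model prices, computed under the assumed and the recalculated risk-free rates, against observed NSE market premiums.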

  4. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

Vocational Teachers and Professionalism - A Model Based on Empirical Analyses Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized...... emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and to develop a model concerning teachers...... at vocational colleges based on empirical data in a specific context, the vocational teacher-training course in Denmark. By offering a basis and concepts for the analysis of practice, such a model is meant to support the development of vocational teachers’ professionalism at courses and in organizational contexts...

  5. Application of a net-based baseline correction scheme to strong-motion records of the 2011 Mw 9.0 Tohoku earthquake

    Science.gov (United States)

    Tu, Rui; Wang, Rongjiang; Zhang, Yong; Walter, Thomas R.

    2014-06-01

    The description of static displacements associated with earthquakes is traditionally achieved using GPS, EDM or InSAR data. In addition, displacement histories can be derived from strong-motion records, allowing an improvement of geodetic networks at a high sampling rate and a better physical understanding of earthquake processes. Strong-motion records require a correction procedure appropriate for baseline shifts that may be caused by rotational motion, tilting and other instrumental effects. Common methods use an empirical bilinear correction on the velocity seismograms integrated from the strong-motion records. In this study, we overcome the weaknesses of an empirically based bilinear baseline correction scheme by using a net-based criterion to select the timing parameters. This idea is based on the physical principle that low-frequency seismic waveforms at neighbouring stations are coherent if the interstation distance is much smaller than the distance to the seismic source. For a dense strong-motion network, it is plausible to select the timing parameters so that the correlation coefficient between the velocity seismograms of two neighbouring stations is maximized after the baseline correction. We applied this new concept to the KiK-Net and K-Net strong-motion data available for the 2011 Mw 9.0 Tohoku earthquake. We compared the derived coseismic static displacement with high-quality GPS data, and with the results obtained using empirical methods. The results show that the proposed net-based approach is feasible and more robust than the individual empirical approaches. The outliers caused by unknown problems in the measurement system can be easily detected and quantified.
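The net-based selection criterion described above can be illustrated with a toy sketch. This is a hypothetical simplification (a single linear drift removed from one candidate timing parameter onward, rather than the paper's full bilinear scheme), but it shows the core idea: pick the timing parameter that maximizes the correlation coefficient between the corrected velocity seismograms of two neighbouring stations.

```python
import numpy as np

def detrend_from(vel, t2):
    """Remove a linear drift fitted to the record from sample t2 onward
    (simplified stand-in for the bilinear baseline correction; the level
    at t2 is preserved so the pre-event portion is untouched)."""
    out = np.asarray(vel, dtype=float).copy()
    x = np.arange(t2, len(out))
    slope = np.polyfit(x, out[t2:], 1)[0]
    out[t2:] -= slope * (x - t2)
    return out

def best_timing(vel_a, vel_b, candidates):
    """Net-based criterion: among candidate timing parameters, choose
    the one maximizing the correlation coefficient between the corrected
    seismograms of two neighbouring stations."""
    def corr(t2):
        return np.corrcoef(detrend_from(vel_a, t2),
                           detrend_from(vel_b, t2))[0, 1]
    return max(candidates, key=corr)
```

Because the low-frequency ground motion is coherent between nearby stations while baseline drifts are station-specific, the correct onset of the drift stands out as the correlation maximum.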

  6. The Cost of Failure: An Empirical Look at the Financial Effect of Business Failure on the Self-Employed

    Science.gov (United States)

    2005-08-01

wealth measurements. Chapter Two also addresses the relevant work in labor economics that provides the theoretical basis and empirical support for how...lead to asset growth are typically not addressed in their own right. It is also worth noting that much of the labor economics literature devoted to...earnings profile, it is hard to ignore. Additionally, empirical labor economics literature shows the existence of a quadratic component to the shape of

  7. Empirical psychology, common sense, and Kant's empirical markers for moral responsibility.

    Science.gov (United States)

    Frierson, Patrick

    2008-12-01

This paper explains the empirical markers by which Kant thinks that one can identify moral responsibility. After explaining the problem of discerning such markers within a Kantian framework I briefly explain Kant's empirical psychology. I then argue that Kant's empirical markers for moral responsibility--linked to higher faculties of cognition--are not sufficient conditions for moral responsibility, primarily because they are empirical characteristics subject to natural laws. Next, I argue that these markers are not necessary conditions of moral responsibility. Given Kant's transcendental idealism, even an entity that lacks these markers could be free and morally responsible, although as a matter of fact Kant thinks that none are. Given that they are neither necessary nor sufficient conditions, I discuss the status of Kant's claim that higher faculties are empirical markers of moral responsibility. Drawing on connections between Kant's ethical theory and 'common rational cognition' (4:393), I suggest that Kant's theory of empirical markers can be traced to ordinary common sense beliefs about responsibility. This suggestion helps explain both why empirical markers are important and what the limits of empirical psychology are within Kant's account of moral responsibility.

  8. The effects of performance measurement and compensation on motivation: An empirical study

    NARCIS (Netherlands)

    van Herpen, M.; van Praag, C.M.; Cools, K.

    2003-01-01

    The design and implementation of a performance measurement and compensation system can strongly affect the motivation of employees. Building on economic and psychological theory this study develops a conceptual model that is used to empirically test this effect. Our survey results demonstrate a

  9. Segmentation-free empirical beam hardening correction for CT

    Energy Technology Data Exchange (ETDEWEB)

    Schüller, Sören; Sawall, Stefan [German Cancer Research Center (DKFZ), Im Neuenheimer Feld 280, Heidelberg 69120 (Germany); Stannigel, Kai; Hülsbusch, Markus; Ulrici, Johannes; Hell, Erich [Sirona Dental Systems GmbH, Fabrikstraße 31, 64625 Bensheim (Germany); Kachelrieß, Marc, E-mail: marc.kachelriess@dkfz.de [German Cancer Research Center (DKFZ), Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)

    2015-02-15

proposed algorithm to be segmentation-free (sf). This deformation leads to a nonlinear accentuation of higher CT-values. The original volume and the gray value deformed volume are monochromatically forward projected. The two projection sets are then monomially combined and reconstructed to generate sets of basis volumes which are used for correction. This is done by maximizing the image flatness obtained when a weighted sum of these basis images is added. sfEBHC is evaluated on polychromatic simulations, phantom measurements, and patient data. The raw data sets were acquired by a dual source spiral CT scanner, a digital volume tomograph, and a dual source micro CT. Different phantom and patient data were used to illustrate the performance and wide range of usability of sfEBHC across different scanning scenarios. The artifact correction capabilities are compared to EBHC. Results: All investigated cases show equal or improved image quality compared to the standard EBHC approach. The artifact correction is capable of correcting beam hardening artifacts for different scan parameters and scan scenarios. Conclusions: sfEBHC generates beam hardening-reduced images and is furthermore capable of dealing with images which are affected by high noise and strong artifacts. The algorithm can be used to recover structures which are hardly visible inside the beam hardening-affected regions.

  10. Segmentation-free empirical beam hardening correction for CT.

    Science.gov (United States)

    Schüller, Sören; Sawall, Stefan; Stannigel, Kai; Hülsbusch, Markus; Ulrici, Johannes; Hell, Erich; Kachelrieß, Marc

    2015-02-01

algorithm to be segmentation-free (sf). This deformation leads to a nonlinear accentuation of higher CT-values. The original volume and the gray value deformed volume are monochromatically forward projected. The two projection sets are then monomially combined and reconstructed to generate sets of basis volumes which are used for correction. This is done by maximizing the image flatness obtained when a weighted sum of these basis images is added. sfEBHC is evaluated on polychromatic simulations, phantom measurements, and patient data. The raw data sets were acquired by a dual source spiral CT scanner, a digital volume tomograph, and a dual source micro CT. Different phantom and patient data were used to illustrate the performance and wide range of usability of sfEBHC across different scanning scenarios. The artifact correction capabilities are compared to EBHC. All investigated cases show equal or improved image quality compared to the standard EBHC approach. The artifact correction is capable of correcting beam hardening artifacts for different scan parameters and scan scenarios. sfEBHC generates beam hardening-reduced images and is furthermore capable of dealing with images which are affected by high noise and strong artifacts. The algorithm can be used to recover structures which are hardly visible inside the beam hardening-affected regions.

  11. Empirical data and optimal monitoring policies: the case of four Russian sea harbours

    Energy Technology Data Exchange (ETDEWEB)

    Deissenberg, C. [CEFI-CNRS, Les Milles (France); Gurman, V.; Shevchuk, E. [RAS, Program Systems Inst., Pereslavl-Zalessky (Russian Federation); Ryumina, E. [Russian Academy of Sciences, Moscow (Russian Federation). Inst. of Economic Market Problems; Shevlyagin, K. [State Committee of the Environment Protection of the Russian Federation, Moscow (Russian Federation). Marine Environment Dept.

    2001-07-01

    In this paper, we describe the present state of empirical information about oil spills and oil monitoring activities in Russian harbours. We explain how we gathered, organized, and estimated the data needed to run the monitoring efforts optimization model of Deissenberg et al. (2001). We present, analyse, and discuss the results of the optimizations carried out with this model on the basis of the empirical data. These results show, in particular, that the economic efficiency of the monitoring activities decreases rapidly as the corresponding budget increases. This suggests that, rather urgently, measures other than monitoring should be initiated to control sea harbour pollution. (Author)

  12. On a possible empirical meaning of meets and joins for quantum propositions

    Science.gov (United States)

    Herbut, Fedor

    1984-09-01

    Following up an idea of Jauch, an empirical basis is worked out for the meets (and joins) of quantum propositions through approximate measurement to an arbitrary degree of accuracy of special yes-no effects constructed from chains of yes-no measurements. This is done with a view to interpreting the meets (and joins) as Dedekind's cuts in analogy with irrationals.

  13. On a possible empirical meaning of meets and joins for quantum propositions

    International Nuclear Information System (INIS)

    Herbut, F.

    1984-01-01

    Following up an idea of Jauch, an empirical basis is worked out for the meets (and joins) of quantum propositions through approximate measurement to an arbitrary degree of accuracy of special yes-no effects constructed from chains of yes-no measurements. This is done with a view to interpreting the meets (and joins) as Dedekind's cuts in analogy with irrationals. (orig.)

  14. Empirical relations between instrumental and seismic parameters of some strong earthquakes of Colombia

    International Nuclear Information System (INIS)

    Marin Arias, Juan Pablo; Salcedo Hurtado, Elkin de Jesus; Castillo Gonzalez, Hardany

    2008-01-01

In order to establish relationships between macroseismic and instrumental parameters, the macroseismic fields of 28 historical earthquakes that produced great effects in Colombian territory were studied. The parameters were integrated using the methodologies of Kaussel and Ramirez (1992) for great Chilean earthquakes, and of Kanamori and Anderson (1975) and Wells and Coppersmith (1994) for worldwide earthquakes. Once the macroseismic and instrumental parameters had been determined, the source model of each earthquake was established and the database of these parameters was completed. For each earthquake, the parameters related to the local and normal macroseismic epicenter were compiled: the depth of the local and normal center, the horizontal extension of both centers, the vertical extension of the normal center, the source model, and the rupture area. The empirical relations obtained from linear equations show behavior very similar to that found by other authors for other regions of the world and at the global scale. The results of this work establish that a certain mutual incompatibility exists between the rupture area and rupture length determined by macroseismic methods and the parameters found from instrumental data, such as the seismic moment, Ms magnitude and Mw magnitude.

  15. Evidence-based Nursing Education - a Systematic Review of Empirical Research

    Science.gov (United States)

    Reiber, Karin

    2011-01-01

The project „Evidence-based Nursing Education – Preparatory Stage“, funded by the Landesstiftung Baden-Württemberg within the programme Impulsfinanzierung Forschung (Funding to Stimulate Research), aims to collect information on current research concerned with nursing education and to process existing data. The results of empirical research which has already been carried out were systematically evaluated with the aim of identifying further topics, fields and matters of interest for empirical research in nursing education. In the course of the project, the available empirical studies on nursing education were scientifically analysed and systematised. The over-arching aim of the evidence-based training approach – which extends beyond the aims of this project – is the conception, organisation and evaluation of vocational training and educational processes in the caring professions on the basis of empirical data. The following contribution first provides a systematic, theoretical link to the over-arching reference framework, as the evidence-based approach is adapted from thematically related specialist fields. The research design of the project is oriented towards criteria introduced from a selection of studies and carries out a two-stage systematic review of the selected studies. As a result, the current status of research in nursing education, as well as its organisation and structure, and questions relating to specialist training and comparative education are introduced and discussed. Finally, the empirical research on nursing training is critically appraised as a complementary element in educational theory/psychology of learning and in the ethical tradition of research. This contribution aims, on the one hand, to derive and describe the methods used, and to introduce the steps followed in gathering and evaluating the data. On the other hand, it is intended to give a systematic overview of empirical research work in nursing education. In order to preserve a

  16. A Place for Sexual Dysfunctions in an Empirical Taxonomy of Psychopathology

    Science.gov (United States)

    Forbes, Miriam K.; Baillie, Andrew J.; Eaton, Nicholas R.; Krueger, Robert F.

    2017-01-01

    Sexual dysfunctions commonly co-occur with various depressive and anxiety disorders. An emerging framework for understanding the classification of mental disorders suggests that such comorbidity is a manifestation of underlying dimensions of psychopathology (or “spectra”). In this review, we synthesize the evidence that sexual dysfunctions should be included in the empirical taxonomy of psychopathology as part of the internalizing spectrum, which accounts for comorbidity among the depressive and anxiety disorders. The review has four parts. Part 1 summarizes the empirical basis and utility of the empirical taxonomy of psychopathology. Part 2 reviews the prima facie evidence for the hypothesis that sexual dysfunctions are part of the internalizing spectrum (i.e., high rates of comorbidity; shared cognitive, affective, and temperament characteristics; common neural substrates and biomarkers; shared course and treatment response; and the lack of causal relationships between them). Part 3 critically analyzes and integrates the results of the eight studies that have addressed this hypothesis. Finally, Part 4 examines the implications of reconceptualizing sexual dysfunctions as part of the internalizing spectrum, and explores avenues for future research. PMID:28121167

  17. A Conceptual Basis for Developing Common Curricula in Teacher Education Programs for Occupational Education. Graduate Studies in Education, Number 2, Volume 3.

    Science.gov (United States)

    Courtney, E. Wayne

The purpose of this document was to generate a rationale and a design for planning a conceptual basis for developing common curricula in vocational teacher education training programs. A review of the literature discusses heuristic approaches to teacher education, the rational basis for common programs, empirical studies in teacher education,…

  18. Inglorious Empire

    DEFF Research Database (Denmark)

    Khair, Tabish

    2017-01-01

Review of 'Inglorious Empire: What the British did to India' by Shashi Tharoor, London, Hurst Publishers, 2017, 296 pp., £20.00.

  19. An empirical model for the melt viscosity of polymer blends

    International Nuclear Information System (INIS)

    Dobrescu, V.

    1981-01-01

On the basis of experimental data for blends of polyethylene with different polymers, an empirical equation is proposed to describe the dependence of the melt viscosity of blends on the component viscosities and the composition. The model ensures the continuity of viscosity vs. composition curves throughout the whole composition range, permits extremum values higher or lower than the viscosities of the components, and allows the calculation of flow curves of blends from the flow curves of the components and their volume fractions. (orig.)
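The abstract does not reproduce the equation itself. As an illustration of the general form such empirical mixing rules take (this is not the paper's equation), a log-additive rule with an interaction parameter shows how a blend can exhibit extremum values above or below both component viscosities:

```python
import math

def blend_viscosity(eta1, eta2, phi1, k=0.0):
    """Illustrative log-additive mixing rule (a generic stand-in, not
    the paper's proposed equation):

        ln(eta) = phi1*ln(eta1) + phi2*ln(eta2) + k*phi1*phi2

    where phi1, phi2 = 1 - phi1 are volume fractions and k is an
    empirical interaction parameter. k > 0 can produce a maximum above
    both component viscosities, k < 0 a minimum below both, and k = 0
    gives a monotone log-linear blend."""
    phi2 = 1.0 - phi1
    return math.exp(phi1 * math.log(eta1) + phi2 * math.log(eta2)
                    + k * phi1 * phi2)
```

By construction the rule is continuous over the whole composition range and reduces to the pure-component viscosities at phi1 = 0 and phi1 = 1, mirroring the properties the abstract claims for the proposed model.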

  20. Budget deficit, money growth and inflation: Empirical evidence from Vietnam

    OpenAIRE

    Khieu Van, Hoang

    2014-01-01

This study empirically examines the nexus among budget deficit, money supply and inflation by using a monthly data set from January 1995 to December 2012 and a SVAR model with five endogenous variables: inflation, money growth, budget deficit growth, real GDP growth and the interest rate. Since real GDP and the budget deficit are unavailable on a monthly basis, we interpolate those series from their annual counterparts using Chow and Lin’s (1971) temporal disaggregation approach. Overall, we found that money growth h...

  1. Benchmarking FeCr empirical potentials against density functional theory data

    International Nuclear Information System (INIS)

    Klaver, T P C; Bonny, G; Terentyev, D; Olsson, P

    2010-01-01

    Three semi-empirical force field FeCr potentials, two within the formalism of the two-band model and one within the formalism of the concentration dependent model, have been benchmarked against a wide variety of density functional theory (DFT) structures. The benchmarking allows an assessment of how reliable empirical potential results are in different areas relevant to radiation damage modelling. The DFT data consist of defect-free structures, structures with single interstitials and structures with small di- and tri-interstitial clusters. All three potentials reproduce the general trend of the heat of formation (h.o.f.) quite well. The most important shortcomings of the original two-band model potential are the low or even negative h.o.f. for Cr-rich structures and the lack of a strong repulsion when moving two solute Cr atoms from being second-nearest neighbours to nearest neighbours. The newer two-band model potential partly solves the first problem. The most important shortcoming in the concentration dependent model potential is the magnitude of the Cr–Cr repulsion, being too strong at short distances and mostly absent at longer distances. Both two-band model potentials do reproduce long-range Cr–Cr repulsion. For interstitials the two-band model potentials reproduce a number of Cr–interstitial binding energies surprisingly well, in contrast to the concentration dependent model potential. For Cr interacting with clusters, the result can sometimes be directly extrapolated from Cr interacting with single interstitials, both according to DFT and the three empirical potentials
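The heat-of-formation benchmark underlying this comparison is a simple energy balance: the alloy cell's total energy relative to the composition-weighted pure-element references. A sketch (the numbers in the test are hypothetical, not DFT data from the paper):

```python
def heat_of_formation(e_alloy_total, n_atoms, n_cr, e_fe, e_cr):
    """Heat of formation per atom of an Fe(1-x)Cr(x) cell relative to
    the pure element references:

        dH = E_tot/N - (1 - x)*E_Fe - x*E_Cr,  x = N_Cr / N.

    e_alloy_total is the total cell energy; e_fe and e_cr are per-atom
    reference energies of the pure phases (same units throughout)."""
    x = n_cr / n_atoms
    return e_alloy_total / n_atoms - (1.0 - x) * e_fe - x * e_cr
```

A benchmarking script would evaluate this quantity for each DFT structure and for each empirical potential, then compare the resulting dH-vs-composition curves; the shortcoming noted above is that one potential yields low or even negative dH on the Cr-rich side.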

  2. Catching the tail: Empirical identification of the distribution of the value of travel time

    DEFF Research Database (Denmark)

    Börjesson, Maria; Fosgerau, Mogens; Algers, Staffan

    2012-01-01

    Recent methodological advances in discrete choice analysis in combination with certain stated choice experiments have allowed researchers to check empirically the identification of the distribution of latent variables such as the value of travel time (VTT). Lack of identification is likely...... to be common and the consequences are severe. E.g., the Danish value of time study found the 15% right tail of the VTT distribution to be unidentified, making it impossible to estimate the mean VTT without resorting to strong assumptions with equally strong impact on the resulting estimate. This paper analyses...... data generated from a similar choice experiment undertaken in Sweden during 2007–2008 in which the range of tradeoff values between time and money was significantly increased relative to the Danish experiment. The results show that this change allowed empirical identification of effectively the entire...

  3. Psychodynamic therapy from the perspective of self-organization. a concept of change and a methodological approach for empirical examination.

    Science.gov (United States)

    Gumz, Antje; Geyer, Michael; Brähler, Elmar

    2014-01-01

Observations from therapeutic practice and a series of empirical findings, for example those on discontinuous change in psychotherapeutic processes, suggest modelling the therapeutic process as a self-organizing system with stable and critically unstable phases and abrupt transitions. Here, a concept of psychotherapeutic change is presented that applies self-organization theory to psychodynamic principles. The authors explain the observations and considerations that form the basis of the concept and present some connections with existing findings and concepts. On the basis of this model, they generated two hypotheses regarding the co-occurrence of instability and discontinuous change and the degree of synchrony between therapist and patient. A study design to test these hypotheses was developed and applied to a single case (psychodynamic therapy). After each session, patient and therapist rated their interaction. A measure of instability was calculated across the resulting time series. Sequences of destabilization were observed. On the basis of points of extreme instability, the process was divided into phases. Local instability maxima were accompanied by significant discontinuous change. Destabilization was highly synchronous in therapist and patient ratings. The authors discuss the concept and the methodological procedure. The approach makes it possible to operationalize crises and to empirically assess the significance of critical phases and developments within the therapeutic relationship. We present a concept of change that applies self-organization theory to psychodynamic therapy. We empirically tested the hypotheses formulated in the concept based on an extract of 125 long-term psychodynamic therapy sessions. We continuously monitored the therapeutic interaction and calculated a measure of the instability of the assessments. We identified several sequences of stable and unstable episodes. Episodes of high instability were accompanied by discontinuous
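The idea of an instability measure over a session-rating time series can be illustrated with a toy sliding-window statistic. This is a hypothetical stand-in for the authors' measure (their exact statistic is not given in the abstract): local variability of the ratings, whose peaks mark candidate critically unstable episodes.

```python
import numpy as np

def instability(series, window=5):
    """Toy instability measure: the standard deviation of the session
    ratings inside a trailing window of fixed length. High values flag
    destabilized episodes; local maxima suggest phase boundaries."""
    s = np.asarray(series, dtype=float)
    return np.array([s[max(0, i - window + 1):i + 1].std()
                     for i in range(len(s))])
```

Applied separately to therapist and patient ratings, the synchrony of destabilization could then be assessed by correlating the two instability curves.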

  4. Strong motion duration and earthquake magnitude relationships

    International Nuclear Information System (INIS)

    Salmon, M.W.; Short, S.A.; Kennedy, R.P.

    1992-06-01

Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or other such parameters significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report will concentrate on energy-based strong motion duration definitions.
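One widely used energy-based definition (significant duration D5-95, the interval over which the cumulative Arias-intensity integral grows from 5% to 95% of its final value; not necessarily the specific definition adopted in this report) can be computed directly from an accelerogram:

```python
import numpy as np

def significant_duration(acc, dt, lo=0.05, hi=0.95):
    """Significant duration D5-95: the time for the cumulative integral
    of acceleration squared (proportional to Arias intensity) to grow
    from 5% to 95% of its final value.

    acc: acceleration samples, dt: sampling interval in seconds."""
    energy = np.cumsum(np.asarray(acc, dtype=float) ** 2) * dt
    energy /= energy[-1]  # normalize to the [0, 1] Husid curve
    t_lo = np.searchsorted(energy, lo) * dt
    t_hi = np.searchsorted(energy, hi) * dt
    return t_hi - t_lo
```

Because the integrand is acceleration squared, low-amplitude coda and pre-event portions contribute little, so the measure isolates the strongly shaking part of the record as the surrounding text describes.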

  5. Diverse Delivery Methods and Strong Psychological Benefits: A Review of Online Formative Assessment

    Science.gov (United States)

    McLaughlin, T.; Yan, Z.

    2017-01-01

    This article is a review of literature on online formative assessment (OFA). It includes a narrative summary that synthesizes the research on the diverse delivery methods of OFA, as well as the empirical literature regarding the strong psychological benefits and limitations. Online formative assessment can be delivered using many traditional…

  6. Prediction of mandibular rotation: an empirical test of clinician performance.

    Science.gov (United States)

    Baumrind, S; Korn, E L; West, E E

    1984-11-01

    An experiment was conducted in an attempt to determine empirically how effective a number of expert clinicians were at differentiating "backward rotators" from "forward rotators" on the basis of head-film information which might reasonably have been available to them prior to instituting treatment for the correction of Class II malocclusion. As a result of a previously reported ongoing study, pre- and posttreatment head films were available for 188 patients treated in the mixed dentition for the correction of Class II malocclusion and for 50 untreated Class II subjects. These subjects were divided into 14 groups (average size of group, 17; range, 6 to 23) solely on the basis of type of treatment and the clinician from whose clinic the records had originated. From within each group, we selected the two or three subjects who had exhibited the most extreme backward rotation and the two or three subjects who had exhibited the most extreme forward rotation of the mandible during the interval between films. The sole criterion for classification was magnitude of change in the mandibular plane angle of Downs between the pre- and posttreatment films of each patient. The resulting sample contained 32 backward-rotator subjects and 32 forward-rotator subjects. Five expert judges (mean clinical experience, 28 years) were asked to identify the backward-rotator subjects by examination of the pretreatment films. The findings may be summarized as follows: (1) No judge performed significantly better than chance. (2) There was strong evidence that the judges used a shared, though relatively ineffective, set of rules in making their discriminations between forward and backward rotators. (3) Statistical analysis of the predictive power of a set of standard cephalometric measurements which had previously been made for this set of subjects indicated that the numerical data also failed to identify potential backward rotators at a rate significantly better than chance. We infer from these

  7. Reflective equilibrium and empirical data: third person moral experiences in empirical medical ethics.

    Science.gov (United States)

    De Vries, Martine; Van Leeuwen, Evert

    2010-11-01

    In ethics, the use of empirical data has become more and more popular, leading to a distinct form of applied ethics, namely empirical ethics. This 'empirical turn' is especially visible in bioethics. There are various ways of combining empirical research and ethical reflection. In this paper we discuss the use of empirical data in a special form of Reflective Equilibrium (RE), namely the Network Model with Third Person Moral Experiences. In this model, the empirical data consist of the moral experiences of people in a practice. Although inclusion of these moral experiences in this specific model of RE can be well defended, their use in the application of the model still raises important questions. What precisely are moral experiences? How to determine relevance of experiences, in other words: should there be a selection of the moral experiences that are eventually used in the RE? How much weight should the empirical data have in the RE? And the key question: can the use of RE by empirical ethicists really produce answers to practical moral questions? In this paper we start to answer the above questions by giving examples taken from our research project on understanding the norm of informed consent in the field of pediatric oncology. We especially emphasize that incorporation of empirical data in a network model can reduce the risk of self-justification and bias and can increase the credibility of the RE reached. © 2009 Blackwell Publishing Ltd.

  8. Countertransference when working with narcissistic personality disorder: An empirical investigation.

    Science.gov (United States)

    Tanzilli, Annalisa; Muzi, Laura; Ronningstam, Elsa; Lingiardi, Vittorio

    2017-06-01

    Narcissistic personality disorder (NPD) is one of the most challenging clinical syndromes to treat in psychotherapy, especially due to the difficulties of establishing a good enough therapist-patient relationship. Countertransference responses to NPD can be particularly intense, frustrating, and difficult to manage, as is often reported in the clinical literature though not clearly supported empirically. The aims of this study were to (a) investigate the relationship between patients' NPD and therapists' responses; (b) examine the associations between patient, clinician, therapy variables and clinicians' reactions during treatment of NPD patients; and (c) provide an empirically derived portrait of countertransference with NPD. A sample of psychiatrists and clinical psychologists (N = 67) completed the Therapist Response Questionnaire to identify patterns of countertransference, the Shedler-Westen Assessment Procedure-200, and the Global Assessment of Functioning Scale to assess the personality pathology and psychosocial functioning of a patient in their care. The results showed that NPD was positively associated with hostile/angry, criticized/devalued, helpless/inadequate, and disengaged countertransference and negatively associated with therapists' positive response, regardless of patients' personality and psychosocial functioning. NPD patients with stronger traits of cluster B personality pathology tended to elicit more negative and heterogeneous countertransference reactions than NPD patients without these features. The countertransference patterns with NPD patients were not strongly influenced by the variables of clinicians and therapy, with the exception of clinical experience. Overall, the portrait of therapists' reactions to NPD provided a clinically nuanced and empirically founded description strongly resembling theoretical-clinical accounts. The therapeutic implications of these findings were discussed. (PsycINFO Database Record (c) 2017 APA, all rights

  9. Empirical research in medical ethics: How conceptual accounts on normative-empirical collaboration may improve research practice

    Science.gov (United States)

    2012-01-01

    Background: The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. Discussion: A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such a linkage between the empirical research and the normative analysis. In the first part of this paper, we outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) the complete lack of normative analysis, and (2) cryptonormativity and a missing account of the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration are presented, and we demonstrate how these concepts may help to improve the linkage between normative and empirical aspects of empirical research in medical ethics. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. Summary: High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis. PMID:22500496

  10. Long-term predictability of regions and dates of strong earthquakes

    Science.gov (United States)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    Results on the long-term predictability of strong earthquakes are discussed. It is shown that dates of earthquakes with M>5.5 could be determined several months in advance of the event. The magnitude and the region of an approaching earthquake could be specified within a month before the event. The number of M6+ earthquakes expected to occur during the analyzed year is determined using a special sequence diagram of seismic activity for the century time frame. Date analysis could be performed 15-20 years in advance. The data are verified by a monthly sequence diagram of seismic activity. The number of strong earthquakes expected to occur in the analyzed month is determined by several methods having different prediction horizons. Determination of days of potential earthquakes with M5.5+ is performed using astronomical data. Earthquakes occur on days of oppositions of Solar System planets (arranged in a single line). Moreover, the strongest earthquakes occur under the location of the vector "Sun-Solar System barycenter" in the ecliptic plane. Details of this astronomical multivariate indicator still require further research, but its practical significance is confirmed by practice. Another empirical indicator of an approaching M6+ earthquake is a synchronous variation of meteorological parameters: an abrupt decrease of the minimal daily temperature, an increase of relative humidity, and an abrupt change of atmospheric pressure (RAMES method). The time difference between the predicted and actual date is no more than one day. This indicator is registered 104 days before the earthquake, so it was called Harmonic 104 or H-104. This fact looks paradoxical, but the works of A. Sytinskiy and V. Bokov on the correlation of global atmospheric circulation and seismic events give a physical basis for this empirical fact. Also, 104 days is a quarter of a Chandler period, so this fact gives insight into the correlation between the anomalies of Earth orientation

  11. A systematic study of the strong interaction with PANDA

    NARCIS (Netherlands)

    Messchendorp, J. G.; Hosaka, A; Khemchandani, K; Nagahiro, H; Nawa, K

    2011-01-01

    The theory of Quantum Chromo Dynamics (QCD) reproduces the strong interaction at distances much shorter than the size of the nucleon. At larger distance scales, the generation of hadron masses and confinement cannot yet be derived from first principles on the basis of QCD. The PANDA experiment at FAIR

  12. Localization in random bipartite graphs: Numerical and empirical study

    Science.gov (United States)

    Slanina, František

    2017-05-01

    We investigate adjacency matrices of bipartite graphs with a power-law degree distribution. The motivation for this study is twofold: first, vibrational states in granular matter and jammed sphere packings; second, graphs encoding social interaction, especially electronic commerce. We establish the position of the mobility edge and show that it strongly depends on the power in the degree distribution and on the ratio of the sizes of the two parts of the bipartite graph. At the jamming threshold, where the two parts have the same size, localization vanishes. We find that the multifractal spectrum is nontrivial in the delocalized phase close to the mobility edge. We also study an empirical bipartite graph, namely the Amazon reviewer-item network. We find that in this specific graph the mobility edge disappears, and we draw a conclusion from this fact regarding earlier empirical studies of the Amazon network.
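    A toy numerical probe of this kind of localization study can be built by diagonalizing the adjacency matrix of a small random bipartite graph and computing the inverse participation ratio (IPR) of its eigenvectors: extended states have IPR near 1/N, localized states have IPR of order 1. The sizes and the Pareto stand-in for the power-law degree distribution below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n_a, n_b = 60, 40                       # sizes of the two parts

# Heavy-tailed target degrees for part A (Pareto draw, standing in
# for the paper's power-law degree distribution).
deg = np.clip(rng.pareto(2.0, n_a) + 1, 1, n_b).astype(int)

# Biadjacency matrix: each A-node links to `deg` random B-nodes.
B = np.zeros((n_a, n_b))
for i, d in enumerate(deg):
    B[i, rng.choice(n_b, size=d, replace=False)] = 1.0

# Full symmetric adjacency of the bipartite graph.
A = np.block([[np.zeros((n_a, n_a)), B],
              [B.T, np.zeros((n_b, n_b))]])

vals, vecs = np.linalg.eigh(A)

# IPR of each normalized eigenvector: ~1/N if extended, O(1) if localized.
ipr = np.sum(vecs ** 4, axis=0)
```

    As a sanity check on the bipartite structure, the spectrum of A is symmetric about zero (eigenvalues come in ± pairs), and every IPR lies between 1/N and 1.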

  13. Improved Wind Speed Prediction Using Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    ZHANG, Y.

    2018-05-01

    The wind power industry plays an important role in promoting low-carbon economic development and energy transformation in the world. However, the randomness and volatility of wind speed series restrict the healthy development of the wind power industry. Accurate wind speed prediction is the key to realizing the stability of wind power integration and to guaranteeing the safe operation of the power system. In this paper, combining Empirical Mode Decomposition (EMD), the Radial Basis Function Neural Network (RBF), and the Least Squares Support Vector Machine (LS-SVM), an improved wind speed prediction model (EMD-RBF-LS-SVM) is proposed. The prediction results indicate that, compared with the traditional prediction models (RBF, LS-SVM), the EMD-RBF-LS-SVM model can weaken random fluctuation to a certain extent and significantly improve the short-term accuracy of wind speed prediction. In a word, this research will significantly reduce the impact of wind power instability on the power grid, ensure the balance of power grid supply and demand, reduce operating costs in grid-connected systems, and enhance the market competitiveness of wind power.
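    The EMD stage of such a hybrid model can be sketched as follows. This is a deliberately simplified sift (fixed number of passes, endpoints pinned as pseudo-extrema, no formal stopping criterion; production work would use a dedicated library such as PyEMD), but by construction the extracted components plus the residue sum exactly back to the input series:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_imf(x, n_sift=10):
    """Extract one intrinsic mode function by repeated sifting:
    subtract the mean of the cubic-spline envelopes of the extrema."""
    h = x.copy()
    for _ in range(n_sift):
        mx = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        mn = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(mx) < 2 or len(mn) < 2:
            break                      # too few extrema to sift further
        t = np.arange(len(h))
        upper = CubicSpline(np.r_[0, mx, len(h) - 1],
                            np.r_[h[0], h[mx], h[-1]])(t)
        lower = CubicSpline(np.r_[0, mn, len(h) - 1],
                            np.r_[h[0], h[mn], h[-1]])(t)
        h = h - (upper + lower) / 2.0
    return h

def emd(x, n_imf=3):
    """Decompose x into IMFs plus a residue; x == sum(imfs) + residue."""
    imfs, residue = [], x.copy()
    for _ in range(n_imf):
        imf = sift_imf(residue)
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue

# Toy "wind speed" series: fast oscillation riding on a slow trend.
t = np.linspace(0.0, 10.0, 500)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * t
imfs, residue = emd(x, n_imf=2)
```

    In the hybrid scheme described above, each IMF (and the residue) would then be forecast separately by RBF or LS-SVM models and the forecasts summed.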

  14. Atomic excitation and acceleration in strong laser fields

    International Nuclear Information System (INIS)

    Zimmermann, H; Eichmann, U

    2016-01-01

    Atomic excitation in the tunneling regime of a strong-field laser–matter interaction has been recently observed. It is conveniently explained by the concept of frustrated tunneling ionization (FTI), which naturally evolves from the well-established tunneling picture followed by classical dynamics of the electron in the combined laser field and Coulomb field of the ionic core. Important predictions of the FTI model such as the n distribution of Rydberg states after strong-field excitation and the dependence on the laser polarization have been confirmed in experiments. The model also establishes a sound basis to understand strong-field acceleration of neutral atoms in strong laser fields. The experimental observation has become possible recently and initiated a variety of experiments such as atomic acceleration in an intense standing wave and the survival of Rydberg states in strong laser fields. Furthermore, the experimental investigations on strong-field dissociation of molecules, where neutral excited fragments after the Coulomb explosion of simple molecules have been observed, can be explained. In this review, we introduce the subject and give an overview over relevant experiments supplemented by new results. (paper)

  15. Strongly correlated systems experimental techniques

    CERN Document Server

    Mancini, Ferdinando

    2015-01-01

    The continuous evolution and development of experimental techniques is at the basis of any fundamental achievement in modern physics. Strongly correlated systems (SCS), more than any other, need to be investigated through the greatest variety of experimental techniques in order to unveil and crosscheck the numerous and puzzling anomalous behaviors characterizing them. The study of SCS fostered the improvement of many old experimental techniques, but also the advent of many new ones just invented in order to analyze the complex behaviors of these systems. Many novel materials, with functional properties emerging from macroscopic quantum behaviors at the frontier of modern research in physics, chemistry and materials science, belong to this class of systems. The volume presents a representative collection of the modern experimental techniques specifically tailored for the analysis of strongly correlated systems. Any technique is presented in great detail by its own inventor or by one of the world-wide recognize...

  16. Compactly Supported Basis Functions as Support Vector Kernels for Classification.

    Science.gov (United States)

    Wittek, Peter; Tan, Chew Lim

    2011-10-01

    Wavelet kernels have been introduced for both support vector regression and classification. Most of these wavelet kernels do not use the inner product of the embedding space, but use wavelets in a similar fashion to radial basis function kernels. Wavelet analysis is typically carried out on data with a temporal or spatial relation between consecutive data points. We argue that it is possible to order the features of a general data set so that consecutive features are statistically related to each other, thus enabling us to interpret the vector representation of an object as a series of equally or randomly spaced observations of a hypothetical continuous signal. By approximating the signal with compactly supported basis functions and employing the inner product of the embedding L2 space, we gain a new family of wavelet kernels. Empirical results show a clear advantage in favor of these kernels.
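    The idea of building a kernel from the L2 inner product of compactly supported basis expansions can be illustrated with the simplest such basis, piecewise-linear hat functions (a stand-in for the wavelet bases discussed in the paper; the uniform grid and random data are illustrative assumptions). The kernel is then a quadratic form in the Gram matrix of the basis:

```python
import numpy as np

def hat_gram(n, h=1.0):
    """Gram matrix G[i, j] = integral of phi_i(t) * phi_j(t) dt for n
    hat (triangle) basis functions on a uniform grid with spacing h."""
    G = np.zeros((n, n))
    np.fill_diagonal(G, 2 * h / 3)
    G[0, 0] = G[-1, -1] = h / 3          # boundary hats are half-triangles
    idx = np.arange(n - 1)
    G[idx, idx + 1] = G[idx + 1, idx] = h / 6
    return G

def cs_kernel(X, Y, h=1.0):
    """k(x, y) = <f_x, f_y> in L2, where f_x is the piecewise-linear
    signal with nodal values x expanded in the hat-function basis."""
    G = hat_gram(X.shape[1], h)
    return X @ G @ Y.T

rng = np.random.default_rng(1)
X = rng.standard_normal((8, 5))          # 8 objects, 5 ordered features
K = cs_kernel(X, X)
```

    Because the kernel is a Gram quadratic form, the resulting kernel matrix is symmetric positive semidefinite, i.e. a valid support vector kernel (Mercer condition).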

  17. Scalar strong interaction hadron theory

    CERN Document Server

    Hoh, Fang Chao

    2015-01-01

    The scalar strong interaction hadron theory (SSI) is a first-principles, nonlocal theory at the quantum mechanical level that provides an alternative to low-energy QCD and the Higgs-related part of the standard model. The quark-quark interaction is scalar rather than color-vectorial. A set of equations of motion for mesons and another set for baryons have been constructed. This book provides an account of the present state of a theory still at an early stage of development. This work will help researchers interested in entering this field and serve as a basis for possible future development of the theory.

  18. Environmental regulation of households. An empirical review of economic and psychological factors

    International Nuclear Information System (INIS)

    Van den Bergh, Jeroen C.J.M.

    2008-01-01

    The literature on sustainable consumption and environmental regulation of household behavior is dominated by conceptual and normative approaches. As a result, many suggestions lack a firm empirical basis. To overcome this deficiency, econometric studies in three areas of environmentally relevant activities of households are reviewed: residential use of energy, generation of solid waste and recycling, and residential use of water. Next to price and income elasticities, attention is devoted to individual socio-economic features and psychological factors, such as attitudes, knowledge, perceptions and values. Potential psychological determinants and related insights are further examined by discussing a range of representative and illustrative statistical-psychological studies of environmental behavior. One important general finding is that there are very few empirical studies that systematically combine socio-economic and psychological determinants. A range of insights for environmental policy is derived, and research recommendations are offered. (author)

  19. Negative binomial multiplicity distributions, a new empirical law for high energy collisions

    International Nuclear Information System (INIS)

    Van Hove, L.; Giovannini, A.

    1987-01-01

    For a variety of high energy hadron production reactions, recent experiments have confirmed the findings of the UA5 Collaboration that charged particle multiplicities in central (pseudo)rapidity intervals and in full phase space obey negative binomial (NB) distributions. The authors discuss the meaning of this new empirical law on the basis of new data and show that the data support the interpretation of the NB distributions in terms of a cascading mechanism of hadron production.

  20. Empirical Philosophy of Science

    DEFF Research Database (Denmark)

    Mansnerus, Erika; Wagenknecht, Susann

    2015-01-01

    Empirical insights are proven fruitful for the advancement of Philosophy of Science, but the integration of philosophical concepts and empirical data poses considerable methodological challenges. Debates in Integrated History and Philosophy of Science suggest that the advancement of philosophical knowledge takes place through the integration of empirical or historical research into philosophical studies, as Chang, Nersessian, Thagard and Schickore argue in their work. Building upon their contributions, we develop a blueprint for an Empirical Philosophy of Science that draws upon qualitative methods from the social sciences in order to advance our philosophical understanding of science in practice. We regard the relationship between philosophical conceptualization and empirical data as an iterative dialogue between theory and data, which is guided by a particular 'feeling with'…

  1. Development of a Reparametrized Semi-Empirical Force Field to Compute the Rovibrational Structure of Large PAHs

    Science.gov (United States)

    Fortenberry, Ryan

    The Spitzer Space Telescope observation of spectra most likely attributable to diverse and abundant populations of polycyclic aromatic hydrocarbons (PAHs) in space has led to tremendous interest in these molecules as tracers of the physical conditions in different astrophysical regions. A major challenge in using PAHs as molecular tracers is the complexity of the spectral features in the 3-20 μm region. The large number and vibrational similarity of the putative PAHs responsible for these spectra necessitate the determination of the most accurate basis spectra possible for comparison. It is essential that these spectra be established in order for the regions explored with the newest generation of observatories, such as SOFIA and JWST, to be understood. Current strategies to develop these spectra for individual PAHs involve either matrix-isolation IR measurements or quantum chemical calculations of harmonic vibrational frequencies. These strategies have been employed to develop the successful PAH IR spectral database as a repository of basis functions used to fit astronomically observed spectra, but they are limited in important ways. Both techniques provide an adequate description of the molecules in their electronic, vibrational, and rotational ground state, but these conditions do not represent energetically hot regions for PAHs near the strong radiation fields of stars and are not direct representations of the gas phase. Some non-negligible matrix effects are known in condensed-phase studies, and the inclusion of anharmonicity in quantum chemical calculations is essential to generate physically relevant results, especially for hot bands. While scaling factors in either case can be useful, they are agnostic to the system studied and are not robustly predictive. One strategy that has emerged to calculate the molecular vibrational structure uses vibrational perturbation theory along with a quartic force field (QFF) to account for higher-order derivatives of the potential

  2. Identification of zones of strong wind events in South Africa

    CSIR Research Space (South Africa)

    Goliger, Adam M

    2002-11-01

    This paper summarises the initial stage of development of a wind damage/disaster risk model for South Africa. The aim is to identify the generic zones of various types of strong wind events. The extent of these zones will form the basis...

  3. What 'empirical turn in bioethics'?

    Science.gov (United States)

    Hurst, Samia

    2010-10-01

    Uncertainty as to how we should articulate empirical data and normative reasoning seems to underlie most difficulties regarding the 'empirical turn' in bioethics. This article examines three different ways in which we could understand 'empirical turn'. Using real facts in normative reasoning is trivial and would not represent a 'turn'. Becoming an empirical discipline through a shift to the social and neurosciences would be a turn away from normative thinking, which we should not take. Conducting empirical research to inform normative reasoning is the usual meaning given to the term 'empirical turn'. In this sense, however, the turn is incomplete. Bioethics has imported methodological tools from empirical disciplines, but too often it has not imported the standards to which researchers in these disciplines are held. Integrating empirical and normative approaches also represents true added difficulties. Addressing these issues from the standpoint of debates on the fact-value distinction can cloud very real methodological concerns by displacing the debate to a level of abstraction where they need not be apparent. Ideally, empirical research in bioethics should meet standards for empirical and normative validity similar to those used in the source disciplines for these methods, and articulate these aspects clearly and appropriately. More modestly, criteria to ensure that none of these standards are completely left aside would improve the quality of empirical bioethics research and partly clear the air of critiques addressing its theoretical justification, when its rigour in the particularly difficult context of interdisciplinarity is what should be at stake.

  4. ORGANIZATIONAL VALUES AND MORAL VIRTUES OF ENTREPRENEUR: AN EMPIRICAL STUDY OF SLOVENIAN ENTREPRENEURS

    OpenAIRE

    Vasilij Mate; Dejan Jelovac; Anita Kralj

    2013-01-01

    This article examines the self-reflection of Slovenian entrepreneurs on their own business activity, with a focus on their core values and virtues, which would consequently affect the performance, growth and development of entrepreneurship in Slovenia. The article starts with a theoretical understanding of the organizational values and moral virtues of entrepreneurs and a review of recent empirical studies as the basis on which it is possible to achieve an explanation of the attitude of Sloveni...

  5. Child's Attachment to Mother as the Basis of Mental Development Typology

    Directory of Open Access Journals (Sweden)

    Galina V. Burmenskaya

    2009-01-01

    The article shows the role of the attachment system (child-mother interaction) in the development of a wide spectrum of individual personality characteristics. Emotional attachment of the child to the mother is considered as a complicated system of internal regulation and a basis for a typology of mental development. Results of a series of empirical studies show the connection between the type of attachment formed at the early stages of child development and characteristics of his/her autonomy, consciousness (self-concept and self-esteem) and empathy in preschool and middle childhood.

  6. Size and local democracy

    DEFF Research Database (Denmark)

    Mouritzen, Poul Erik; Rose, Lawrence

    2009-01-01

    The issue of the appropriate scale for local government has regularly appeared on the agenda of public sector reformers. In the empirical work devoted to this issue, the principal focus has been on the implications of size for efficiency in local service provision. Relatively less emphasis has be...

  7. Linear response calculation using the canonical-basis TDHFB with a schematic pairing functional

    International Nuclear Information System (INIS)

    Ebata, Shuichiro; Nakatsukasa, Takashi; Yabana, Kazuhiro

    2011-01-01

    A canonical-basis formulation of the time-dependent Hartree-Fock-Bogoliubov (TDHFB) theory is obtained with an approximation that the pair potential is assumed to be diagonal in the time-dependent canonical basis. The canonical-basis formulation significantly reduces the computational cost. We apply the method to linear-response calculations for even-even nuclei. E1 strength distributions for proton-rich Mg isotopes are systematically calculated. The calculation suggests strong Landau damping of giant dipole resonance for drip-line nuclei.

  8. An update on the "empirical turn" in bioethics: analysis of empirical research in nine bioethics journals.

    Science.gov (United States)

    Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S

    2018-02-07

    A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend is continuing as noted in the previous study, and second, how it is changing, that is, what are the characteristics of the empirical works published in these nine bioethics journals. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. The data obtained were analysed descriptively and using a non-parametric chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ2 = 2.857; p = 0.091) between the proportion of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals, with the proportion of empirical papers increasing from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data that is available about the nine bioethics journals studied in two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics. The growing volume is mainly attributable to two

  9. Improvement of electrocardiogram by empirical wavelet transform

    Science.gov (United States)

    Chanchang, Vikanda; Kumchaiseemak, Nakorn; Sutthiopad, Malee; Luengviriya, Chaiya

    2017-09-01

    Electrocardiogram (ECG) is a crucial tool in the detection of cardiac arrhythmia. It is also often used in routine physical exams, especially for elderly people. This graphical representation of the electrical activity of the heart is obtained by a measurement of voltage at the skin; therefore, the signal is always contaminated by noise from various sources. For a proper interpretation, the quality of the ECG should be improved by noise reduction. In this article, we present a study of noise filtration in the ECG using the empirical wavelet transform (EWT). Unlike the traditional wavelet method, the EWT is adaptive, since the frequency spectrum of the ECG is taken into account in the construction of the wavelet basis. We show that the signal-to-noise ratio increases after noise filtration for different noise artefacts.
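    The adaptive character of the EWT can be illustrated with a heavily simplified sketch: segment the Fourier spectrum at midpoints between the strongest spectral peaks and reconstruct one band-limited component per segment. Ideal band-pass filters stand in here for the Meyer filters of the actual EWT, and the two-sine test signal is an illustrative assumption rather than real ECG data. Because the segments partition the spectrum, the components sum back exactly to the input:

```python
import numpy as np

def adaptive_bands(x, n_bands):
    """Band edges placed at midpoints between the n_bands strongest
    spectral peaks (simplified stand-in for EWT segmentation)."""
    spec = np.abs(np.fft.rfft(x))
    peaks = np.sort(np.argsort(spec)[-n_bands:])   # strongest bins, by frequency
    mids = (peaks[:-1] + peaks[1:]) // 2
    return np.r_[0, mids, len(spec)]

def decompose(x, n_bands):
    """Split x into band-limited components that sum back to x."""
    X = np.fft.rfft(x)
    edges = adaptive_bands(x, n_bands)
    comps = []
    for a, b in zip(edges[:-1], edges[1:]):
        Xi = np.zeros_like(X)
        Xi[a:b] = X[a:b]                 # keep only this spectral segment
        comps.append(np.fft.irfft(Xi, n=len(x)))
    return comps

# Toy signal: slow 5 Hz wave + fast 50 Hz wave + mild noise.
t = np.linspace(0.0, 1.0, 512, endpoint=False)
rng = np.random.default_rng(3)
x = (np.sin(2 * np.pi * 5 * t)
     + 0.5 * np.sin(2 * np.pi * 50 * t)
     + 0.05 * rng.standard_normal(t.size))
comps = decompose(x, n_bands=2)
```

    A denoised signal would keep only the components whose bands contain the physiologically relevant content; here the first component is dominated by the 5 Hz wave, the second by the 50 Hz wave.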

  10. Online gaming addiction in children and adolescents: A review of empirical research.

    Science.gov (United States)

    Kuss, Daria J; Griffiths, Mark D

    2012-03-01

    Research suggests that excessive online gaming may lead to symptoms commonly experienced by substance addicts. Since games are particularly appealing to children and adolescents, these individuals may be more at risk than other groups of developing gaming addiction. Given these potential concerns, a literature review was undertaken in order (i) to present the classification basis of online gaming addiction using official mental disorder frameworks, (ii) to identify empirical studies that assess online gaming addiction in children and adolescents, and (iii) to present and evaluate the findings against the background of related and established mental disorder criteria. Empirical evidence comprising 30 studies indicates that for some adolescents, gaming addiction exists and that as the addiction develops, online gaming addicts spend increasing amounts of time preparing for, organizing, and actually gaming. Evidence suggests that problematic online gaming can be conceptualized as a behavioral addiction rather than a disorder of impulse control.

  11. Coherence of evidence from systematic reviews as a basis for evidence strength - a case study in support of an epistemological proposition

    Directory of Open Access Journals (Sweden)

    Mickenautsch Steffen

    2012-01-01

    Full Text Available Abstract Background This article aims to offer, on the basis of Coherence theory, the epistemological proposition that mutually supportive evidence from multiple systematic reviews may successfully refute radical, philosophical scepticism. Methods A case study including seven systematic reviews is presented with the objective of refuting radical philosophical scepticism towards the belief that glass-ionomer cements (GIC) are beneficial in tooth caries therapy. The case study illustrates how principles of logical and empirical coherence may be applied as evidence in support of specific beliefs in healthcare. Results The results show that radical scepticism may epistemologically be refuted on the basis of logical and empirical coherence. For success, several systematic reviews covering interconnected beliefs are needed. In praxis, these systematic reviews would also need to be of high quality and their conclusions based on reviewed high-quality trials. Conclusions A refutation of radical philosophical scepticism to clinical evidence may be achieved, if and only if such evidence is based on the logical and empirical coherence of multiple systematic review results. Practical application also requires focus on the quality of the systematic reviews and reviewed trials.

  12. Learning Mixtures of Truncated Basis Functions from Data

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Pérez-Bernabé, Inmaculada

    2014-01-01

    In this paper we investigate methods for learning hybrid Bayesian networks from data. First we utilize a kernel density estimate of the data in order to translate the data into a mixture of truncated basis functions (MoTBF) representation using a convex optimization technique. We also propose an alternative learning method that relies on the cumulative distribution function of the data. Empirical results demonstrate the usefulness of the approaches: even though the methods produce estimators that are slightly poorer than the state of the art (in terms of log-likelihood), they are significantly faster, and therefore indicate that the MoTBF framework can be used for inference and learning in reasonably sized domains. Furthermore, we show how a particular sub-class of MoTBF potentials (learnable by the proposed methods) can be exploited to significantly reduce complexity during inference.

  13. Empirical analyses of price formation in the German electricity market - the devil is in the details; Empirische Analysen der Preisbildung am deutschen Elektrizitaetsmarkt - der Teufel steckt im Detail.

    Energy Technology Data Exchange (ETDEWEB)

    Ellersdorfer, I.; Hundt, M.; Sun Ninghong; Voss, A. [Stuttgart Univ. (DE). Inst. fuer Energiewirtschaft und Rationelle Energieanwendung (IER)

    2008-05-15

    In view of the dramatic rise in wholesale prices over the past years, model-based empirical analyses of price formation in the electricity markets have become an important basis for the discussion on competition policy in Germany and Europe. Empirical analyses are usually performed on the basis of optimising fundamental models which describe the power supply system of a country in greater or lesser detail, thus making it possible to determine how power plants must be deployed so as to cover the electricity demand at the lowest possible cost. The task of determining the difference between market price and incremental cost, a parameter frequently used in competition analyses, is beset with many difficulties of a methodological or empirical nature. The present study undertakes the first ever systematic quantification of the influence of existing uncertainties on the results of the model calculations.

  14. Atomic and Free Electrons in a Strong Light Field

    International Nuclear Information System (INIS)

    Fedorov, Mikhail V.

    1998-02-01

    This book presents and describes a series of unusual and striking strong-field phenomena concerning atoms and free electrons. Some of these phenomena are: multiphoton stimulated Bremsstrahlung, free-electron lasers, wave-packet physics, above-threshold ionization, and strong-field stabilization in Rydberg atoms. The theoretical foundations and causes of the phenomena are described in detail, with all the approximations and derivations discussed. All the known and relevant experiments are described too, and their results are compared with those of the existing theoretical models. An extensive general theoretical introduction gives a good basis for subsequent parts of the book and is an independent and self-sufficient description of the most efficient theoretical methods of strong-field and multiphoton physics. This book can serve as a textbook for graduate students.

  15. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...

  16. Probabilistic empirical prediction of seasonal climate: evaluation and potential applications

    Science.gov (United States)

    Dieppois, B.; Eden, J.; van Oldenborgh, G. J.

    2017-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a new evaluation of an established empirical system used to predict seasonal climate across the globe. Forecasts for surface air temperature, precipitation and sea level pressure are produced by the KNMI Probabilistic Empirical Prediction (K-PREP) system every month and disseminated via the KNMI Climate Explorer (climexp.knmi.nl). K-PREP is based on multiple linear regression and built on physical principles to the fullest extent with predictive information taken from the global CO2-equivalent concentration, large-scale modes of variability in the climate system and regional-scale information. K-PREP seasonal forecasts for the period 1981-2016 will be compared with corresponding dynamically generated forecasts produced by operational forecast systems. While there are many regions of the world where empirical forecast skill is extremely limited, several areas are identified where K-PREP offers comparable skill to dynamical systems. We discuss two key points in the future development and application of the K-PREP system: (a) the potential for K-PREP to provide a more useful basis for reference forecasts than those based on persistence or climatology, and (b) the added value of including K-PREP forecast information in multi-model forecast products, at least for known regions of good skill. We also discuss the potential development of

  17. Empirical Test Case Specification

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    This document includes the empirical specification on the IEA task of evaluating building energy simulation computer programs for Double Skin Facade (DSF) constructions. There are two approaches involved in this procedure: one is the comparative approach and the other is the empirical one. In the comparative approach the outcomes of different software tools are compared, while in the empirical approach the modelling results are compared with the results of experimental test cases.

  18. A critical perspective on the measurement of performance in the empirical multinationality and performance literature

    DEFF Research Database (Denmark)

    Richter, Nicole Franziska; Schmidt, Robert; Ladwig, Tina J.

    2017-01-01

    This paper contributes to the core research in international business (IB), namely the relation between multinationality and performance, and is concerned with the quality of past empirical research designs. On the basis of 49 studies, we critically evaluate the match between performance measures used in empirical studies and the underlying theoretical streams that explain the effects on benefits and costs of multinationality. Our findings indicate that authors still largely rely on overall financial performance measures. Theoretical arguments, in contrast, refer to specific benefit and cost positions that are better reflected in operational performance indicators. In our view, the idiosyncratic choice of performance measures contributes to the varying results in past studies. We offer suggestions for improving future research designs.

  19. Empirical Support for Perceptual Conceptualism

    Directory of Open Access Journals (Sweden)

    Nicolás Alejandro Serrano

    2018-03-01

    Full Text Available The main objective of this paper is to show that perceptual conceptualism can be understood as an empirically meaningful position and, furthermore, that there is some degree of empirical support for its main theses. In order to do this, I will start by offering an empirical reading of the conceptualist position, and making three predictions from it. Then, I will consider recent experimental results from cognitive sciences that seem to point towards those predictions. I will conclude that, while the evidence offered by those experiments is far from decisive, it is enough not only to show that conceptualism is an empirically meaningful position but also that there is empirical support for it.

  20. Resonances of the helium atom in a strong magnetic field

    DEFF Research Database (Denmark)

    Lühr, Armin Christian; Al-Hujaj, Omar-Alexander; Schmelcher, Peter

    2007-01-01

    We present an investigation of the resonances of a doubly excited helium atom in a strong magnetic field covering the regime B=0–100  a.u. A full-interaction approach which is based on an anisotropic Gaussian basis set of one-particle functions being nonlinearly optimized for each field strength...

  1. Empirical research on the experience of the New Homiletic in South Korea

    Directory of Open Access Journals (Sweden)

    Hyun W. Park

    2016-03-01

    Full Text Available The purpose of this article is to present empirical research to reveal the reality of the New Homiletic in South Korea. This research was conducted by means of semistructured interviews with seven pastors and eight laypeople of the evangelical faith, residing in Seoul and its metropolitan areas, within the age limits of 20-59 years. The aim was to uncover the experience of the sermons by both the preachers and the hearers of the sermons. The researcher chose Pieterse's methodology of analysing the data, which is an inductive analysis called open coding. Six main categories from the pastors' group and five categories from the laypeople emerged from the data. The categories were rearranged into four themes, which is a valuable finding for current-day Korean preaching in order to enhance the homiletical praxis. Intradisciplinary and/or interdisciplinary implications: This article presents empirical research on the reality of the New Homiletic in South Korea. The results indicate similarity between South Korea and the USA. The conclusion is that traditional discourse should give way to the New Homiletic. This research can become the basis for finding new strategies for evangelical preaching. Keywords: Preaching; South Korean church; Empirical research; New Homiletic

  2. Analyzing the locomotory gaitprint of Caenorhabditis elegans on the basis of empirical mode decomposition.

    Directory of Open Access Journals (Sweden)

    Li-Chun Lin

    Full Text Available The locomotory gait analysis of the microswimmer, Caenorhabditis elegans, is a commonly adopted approach for strain recognition and examination of phenotypic defects. Gait is also a visible behavioral expression of worms under external stimuli. This study developed an adaptive data analysis method based on empirical mode decomposition (EMD) to reveal the biological cues behind intricate motion. The method was used to classify the strains of worms according to their gaitprints (i.e., phenotypic traits of locomotion). First, a norm of the locomotory pattern was created from the worm of interest. The body curvature of the worm was decomposed into four intrinsic mode functions (IMFs). A radar chart showing correlations between the predefined database and the measured worm was then obtained by dividing each IMF into three parts, namely, head, mid-body, and tail. A comprehensive resemblance score was estimated after k-means clustering. Simulated data that use sinusoidal waves were generated to assess the feasibility of the algorithm. Results suggested that temporal frequency is the major factor in the process. In practice, five worm strains, including wild-type N2, TJ356 (zIs356, CL2070 (dvIs70, CB0061 (dpy-5, and CL2120 (dvIs14, were investigated. The overall classification accuracy of the gaitprint analyses of all the strains reached nearly 89%. The method can also be extended to classify some motor neuron-related locomotory defects of C. elegans in the same fashion.
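The scoring step described above, splitting each IMF into head, mid-body and tail segments and correlating a test worm against a reference, can be sketched as follows. The function name and the per-segment correlation are illustrative assumptions; the paper's full pipeline also includes the EMD decomposition itself and k-means clustering, both omitted here.

```python
import numpy as np

def gaitprint_scores(imfs_ref, imfs_test):
    """For each IMF, split the curvature signal into head, mid-body and
    tail thirds and correlate reference vs. test segment by segment.
    With 4 IMFs this yields the 12 axes of a radar chart."""
    scores = []
    for ref, test in zip(imfs_ref, imfs_test):
        seg = len(ref) // 3
        for s in range(3):  # head, mid-body, tail
            a = ref[s * seg:(s + 1) * seg]
            b = test[s * seg:(s + 1) * seg]
            scores.append(np.corrcoef(a, b)[0, 1])
    return np.array(scores)
```

Comparing a worm's gaitprint against itself should score 1 on every axis, which makes a convenient sanity check.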

  3. The hydrodynamic basis of the vacuum cleaner effect in continuous-flow PCNL instruments: an empiric approach and mathematical model.

    Science.gov (United States)

    Mager, R; Balzereit, C; Gust, K; Hüsch, T; Herrmann, T; Nagele, U; Haferkamp, A; Schilling, D

    2016-05-01

    Passive removal of stone fragments in the irrigation stream is one of the characteristics of continuous-flow PCNL instruments. So far, the physical principle of this so-called vacuum cleaner effect has not yet been fully understood. The aim of the study was to empirically prove the existence of the vacuum cleaner effect and to develop a physical hypothesis and generate a mathematical model for this phenomenon. In an empiric approach, common low-pressure PCNL instruments and conventional PCNL sheaths were tested using an in vitro model. Flow characteristics were visualized by coloring of the irrigation fluid. The influence of irrigation pressure, sheath diameter, sheath design, nephroscope design and position of the nephroscope was assessed. Experiments were digitally recorded for further slow-motion analysis to deduce a physical model. In each tested nephroscope design, we could observe the vacuum cleaner effect. Increases in irrigation pressure and reductions in sheath cross-section sustained the effect. Slow-motion analysis of colored flow revealed a synergism of two effects causing suction and transportation of the stone. For the first time, our model showed a flow reversal in the sheath as an integral part of the origin of the stone transportation during the vacuum cleaner effect. The application of Bernoulli's equation provided the explanation of these effects and confirmed our experimental results. We widen the understanding of PCNL with a conclusive physical model, which explains the fluid mechanics of the vacuum cleaner effect.

  4. Noise reduction in digital speckle pattern interferometry using bidimensional empirical mode decomposition

    International Nuclear Information System (INIS)

    Bernini, Maria Belen; Federico, Alejandro; Kaufmann, Guillermo H.

    2008-01-01

    We propose a bidimensional empirical mode decomposition (BEMD) method to reduce speckle noise in digital speckle pattern interferometry (DSPI) fringes. The BEMD method is based on a sifting process that decomposes the DSPI fringes in a finite set of subimages represented by high and low frequency oscillations, which are named modes. The sifting process assigns the high frequency information to the first modes, so that it is possible to discriminate speckle noise from fringe information, which is contained in the remaining modes. The proposed method is a fully data-driven technique, therefore neither fixed basis functions nor operator intervention are required. The performance of the BEMD method to denoise DSPI fringes is analyzed using computer-simulated data, and the results are also compared with those obtained by means of a previously developed one-dimensional empirical mode decomposition approach. An application of the proposed BEMD method to denoise experimental fringes is also presented
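The core of any EMD variant is the sifting process the abstract describes: repeatedly subtracting the mean of the upper and lower extremum envelopes until the fastest oscillation is isolated. The sketch below shows this in one dimension only (the paper's BEMD is two-dimensional and uses surface envelopes); the stopping rule and iteration count are simplifications.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_one_mode(x, n_sift=10):
    """Extract the first (highest-frequency) mode of a 1-D signal by
    sifting: spline envelopes through local maxima/minima, subtract
    their mean, repeat. In DSPI denoising this first mode would carry
    the speckle-like noise."""
    h = np.asarray(x, dtype=float).copy()
    t = np.arange(len(h))
    for _ in range(n_sift):
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 4 or len(minima) < 4:
            break  # too few extrema to build stable spline envelopes
        upper = CubicSpline(maxima, h[maxima])(t)
        lower = CubicSpline(minima, h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h
```

On a slow sine plus a fast sine, the extracted mode oscillates much faster than the residual, which is the property the denoising scheme relies on.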

  5. Knowledge-oriented strategies in the metal industry (empirical studies)

    Directory of Open Access Journals (Sweden)

    A. Krawczyk-Sołtys

    2016-07-01

    Full Text Available The aim of this article is an attempt to determine which knowledge-oriented strategies can give metal industry enterprises the best results in achieving and maintaining a competitive advantage. To determine which of the knowledge-oriented strategies discussed in the literature and implemented in various organizations may prove the most effective in the metal industry, empirical research has begun. A chosen strategy of knowledge management and supporting strategies are the basis for the choice of methods and means of intended implementation. The choice of a specific knowledge management strategy may also result in the need for changes in an organization, particularly in its information system, internal communication, work organization and human resource management.

  6. THE DYNAMIC INTER-RELATIONSHIP BETWEEN OBESITY AND SCHOOL PERFORMANCE: NEW EMPIRICAL EVIDENCE FROM AUSTRALIA.

    Science.gov (United States)

    Nghiem, Son; Hoang, Viet-Ngu; Vu, Xuan-Binh; Wilson, Clevo

    2017-12-04

    This paper proposes a new empirical model for examining the relationship between obesity and school performance using the simultaneous equation modelling approach. The lagged effects of both learning and health outcomes were included to capture both the dynamic and inter-relational aspects of the relationship between obesity and school performance. The empirical application of this study used comprehensive data from the first five waves of the Longitudinal Study of Australian Children (LSAC), which commenced in 2004 (wave 1) and was repeated every two years until 2018. The study sample included 10,000 children, equally divided between two cohorts (infants and children) across Australia. The empirical results show that past learning and obesity status are strongly associated with most indicators of school outcomes, including reading, writing, spelling, grammar and numeracy national tests, and scores from the internationally standardized Peabody Picture Vocabulary Test and the Matrix Reasoning Test. The main findings of this study are robust to the choice of obesity indicator and estimation method.

  7. Empirical Formulas for the Calculations of the Hardness of Steels Cooled From the Austenitizing Temperature

    Directory of Open Access Journals (Sweden)

    Trzaska J.

    2016-09-01

    Full Text Available In this paper, equations are presented for calculating the hardness of continuously cooled structural steels on the basis of the austenitizing temperature. The independent variables of the hardness model were: the mass concentrations of elements, the austenitizing temperature and the cooling rate. The equations were developed with the application of the following methods: multiple regression and logistic regression. In this paper, attention was paid to preparing data for the purpose of the calculations, to the methodology of the calculations, and also to the assessment of the quality of the developed formulas. The collection of empirical data was prepared on the basis of more than 500 CCT diagrams.
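A multiple-regression hardness model of the kind described can be sketched with scikit-learn. Everything below is synthetic and illustrative: the variable names, ranges and generating coefficients are invented stand-ins for data read off CCT diagrams, not the paper's fitted formulas.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 400
carbon = rng.uniform(0.1, 0.6, n)        # C mass fraction (%)
manganese = rng.uniform(0.2, 1.8, n)     # Mn mass fraction (%)
t_aust = rng.uniform(850, 1050, n)       # austenitizing temperature (deg C)
cool_rate = rng.uniform(0.1, 100.0, n)   # cooling rate (deg C/s)
# hypothetical ground-truth relation used only to generate targets
hv = (150 + 600 * carbon + 40 * manganese + 0.05 * t_aust
      + 2.0 * np.log(cool_rate) + rng.normal(0, 5, n))
# cooling rate enters through its logarithm, a common choice for CCT data
X = np.column_stack([carbon, manganese, t_aust, np.log(cool_rate)])
model = LinearRegression().fit(X, hv)
```

With low noise the regression recovers the generating coefficients closely, which is the kind of quality assessment the paper applies to its developed formulas.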

  8. Review of the Empirical and Clinical Support for Group Therapy Specific to Sexual Abusers.

    Science.gov (United States)

    Jennings, Jerry L; Deming, Adam

    2017-12-01

    This review compiles 48 empirical studies and 55 clinical/practice articles specific to group therapy with sex offenders. Historically, group therapy has always been the predominant modality in sex offender-specific treatment. In the first decades of the field, treatment applied a psychoanalytic methodology that, although not empirically supported, fully appreciated the primary therapeutic importance of the group modality. Conversely, since the early 1980s, treatment has applied a cognitive behavioral method, but the field has largely neglected the therapeutic value of interpersonal group dynamics. The past decade has seen a growing re-appreciation of general therapeutic processes and more holistic approaches in sex offender treatment, and there is an emerging body of empirical research which, although often indirectly concerned with group, has yielded three definitive conclusions. First, the therapeutic qualities of the group therapist (specifically warmth, empathy, encouragement, and guidance) can strongly affect outcomes. Second, the quality of group cohesion can profoundly affect the effectiveness of treatment. Third, confrontational approaches in group therapy are ineffective, if not counter-therapeutic, and overwhelmingly rated as not helpful by sex offenders themselves. Additional conclusions are less strongly supported, but include compelling evidence that sex offenders generally prefer group therapy over individual therapy, that group therapy appears equally effective to individual therapy, and that mixing or separating groups by offense type is not important to therapeutic climate. Other group techniques and approaches specific to sexual abuse treatment are also summarized.

  9. Minimizing the trend effect on detrended cross-correlation analysis with empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhao Xiaojun; Shang Pengjian; Zhao Chuang; Wang Jing; Tao Rui

    2012-01-01

    Highlights: ► Investigate the effects of linear, exponential and periodic trends on DCCA. ► Apply empirical mode decomposition to extract trend term. ► Strong and monotonic trends are successfully eliminated. ► Get the cross-correlation exponent in a persistent behavior without crossover. - Abstract: Detrended cross-correlation analysis (DCCA) is a scaling method commonly used to estimate long-range power-law cross-correlation in non-stationary signals. However, the susceptibility of DCCA to trends makes the scaling results difficult to analyze due to spurious crossovers. We artificially generate long-range cross-correlated signals and systematically investigate the effect of linear, exponential and periodic trends. To address the crossovers caused by trends, we apply the empirical mode decomposition method, which decomposes underlying signals into several intrinsic mode functions (IMF) and a residual trend. After the removal of the residual term, strong and monotonic trends such as linear and exponential trends are successfully eliminated. Periodic trends, however, cannot be separated out according to the IMF criterion; they can instead be eliminated by Fourier transform. As a special case of DCCA, detrended fluctuation analysis presents similar results.
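The DCCA fluctuation function itself is short enough to sketch: integrate both series, detrend each box with a linear fit, and average the covariance of the residuals. This plain sketch uses non-overlapping boxes and omits the EMD pre-detrending step that is the paper's contribution.

```python
import numpy as np

def dcca_fluctuation(x, y, n):
    """F_DCCA(n) for box size n: covariance of locally detrended
    profiles, averaged over boxes. With x == y this reduces to
    detrended fluctuation analysis (DFA), as noted in the abstract."""
    X = np.cumsum(x - np.mean(x))   # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(n)
    cov = []
    for b in range(len(X) // n):
        xs, ys = X[b * n:(b + 1) * n], Y[b * n:(b + 1) * n]
        # remove the local linear trend in each box
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov.append(np.mean(rx * ry))
    return float(np.sqrt(abs(np.mean(cov))))
```

The scaling exponent is the slope of log F against log n; for uncorrelated white noise it should sit near 0.5.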

  10. Climate Prediction for Brazil's Nordeste: Performance of Empirical and Numerical Modeling Methods.

    Science.gov (United States)

    Moura, Antonio Divino; Hastenrath, Stefan

    2004-07-01

    Comparisons of performance of climate forecast methods require consistency in the predictand and a long common reference period. For Brazil's Nordeste, empirical methods developed at the University of Wisconsin use preseason (October-January) rainfall and January indices of the fields of meridional wind component and sea surface temperature (SST) in the tropical Atlantic and the equatorial Pacific as input to stepwise multiple regression and neural networking. These are used to predict the March-June rainfall at a network of 27 stations. An experiment at the International Research Institute for Climate Prediction, Columbia University, with a numerical model (ECHAM4.5) used global SST information through February to predict the March-June rainfall at three grid points in the Nordeste. The predictands for the empirical and numerical model forecasts are correlated at +0.96, and the period common to the independent portion of record of the empirical prediction and the numerical modeling is 1968-99. Over this period, predicted versus observed rainfall are evaluated in terms of correlation, root-mean-square error, absolute error, and bias. Performance is high for both approaches. Numerical modeling produces a correlation of +0.68, moderate errors, and strong negative bias. For the empirical methods, errors and bias are small, and correlations of +0.73 and +0.82 are reached between predicted and observed rainfall.
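The verification metrics used in this comparison (correlation, root-mean-square error, absolute error and bias) are standard and compact enough to show directly; the function name is an illustrative assumption.

```python
import numpy as np

def verify(pred, obs):
    """Standard forecast-verification scores for paired series."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return {
        "corr": float(np.corrcoef(pred, obs)[0, 1]),
        "rmse": float(np.sqrt(np.mean((pred - obs) ** 2))),
        "mae":  float(np.mean(np.abs(pred - obs))),
        "bias": float(np.mean(pred - obs)),
    }
```

A perfect forecast scores correlation 1 with zero error; a uniformly shifted forecast keeps correlation 1 but shows up in the bias term, which is exactly how the "strong negative bias" of the numerical model above would be detected.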

  11. Modeling multivariate time series on manifolds with skew radial basis functions.

    Science.gov (United States)

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

    We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters--in particular, the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several illustrative problems, including modeling data on manifolds and the prediction of chaotic time series.
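The iterative model-building loop described above can be caricatured in a few lines: place a new basis function where the current residual is largest, then refit all weights by least squares. This greedy sketch uses plain Gaussian RBFs with a fixed scale; the paper's skew RBFs, the statistical placement test and the autocorrelation-based scale selection are all omitted.

```python
import numpy as np

def gaussian_rbf(r, scale):
    return np.exp(-(r / scale) ** 2)

def fit_rbf_greedy(X, y, n_funcs=5, scale=0.5):
    """Greedy RBF model: at each step add a centre at the point of
    largest absolute residual and refit weights by least squares."""
    centres = []
    residual = y.copy()
    w = np.zeros(0)
    for _ in range(n_funcs):
        centres.append(X[np.argmax(np.abs(residual))])
        Phi = np.column_stack(
            [gaussian_rbf(np.linalg.norm(X - c, axis=1), scale)
             for c in centres])
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        residual = y - Phi @ w
    return centres, w, residual
```

Each refit solves a least-squares problem over all current basis functions, so the residual norm can only decrease as functions are added.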

  12. Uniform risk spectra of strong earthquake ground motion: NEQRISK

    International Nuclear Information System (INIS)

    Lee, V.W.; Trifunac, M.D.

    1987-01-01

    The concept of uniform risk spectra of Anderson and Trifunac (1977) has been generalized to include (1) a more refined description of earthquake source zones, (2) the uncertainties in estimating the seismicity parameters a and b in log10 N = a - bM, (3) the uncertainties in estimating the maximum earthquake size in each source zone, and (4) the most recent results on empirical scaling of strong-motion amplitudes at a site. Examples of using the new NEQRISK program are presented and compared with the corresponding case studies of Anderson and Trifunac (1977). The organization of the computer program NEQRISK is also briefly described.
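The recurrence law log10 N = a - bM at the heart of item (2) can be fitted to a catalogue by regressing the logarithm of the cumulative event counts on magnitude. This is a plain least-squares sketch; NEQRISK's actual treatment propagates the uncertainty in a and b rather than taking point estimates.

```python
import numpy as np

def gutenberg_richter_fit(magnitudes, m_min=3.0, dm=0.1, min_count=30):
    """Fit log10 N = a - b*M to cumulative counts N(>=M) of a catalogue.
    Sparse tail bins are skipped to avoid fitting Poisson noise."""
    ms = np.arange(m_min, magnitudes.max(), dm)
    counts = np.array([np.sum(magnitudes >= m) for m in ms])
    keep = counts >= min_count
    slope, intercept = np.polyfit(ms[keep], np.log10(counts[keep]), 1)
    return intercept, -slope  # a, b (slope is -b in the recurrence law)
```

On a synthetic catalogue drawn with a known b-value (magnitudes exponentially distributed above m_min), the fit recovers b closely.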

  13. Life Writing After Empire

    DEFF Research Database (Denmark)

    A watershed moment of the twentieth century, the end of empire saw upheavals to global power structures and national identities. However, decolonisation profoundly affected individual subjectivities too. Life Writing After Empire examines how people around the globe have made sense of the post-imperial era in order to understand how individual life writing reflects broader societal changes. From far-flung corners of the former British Empire, people have turned to life writing to manage painful or nostalgic memories, as well as to think about the past and future of the nation anew through the personal.

  14. A review of the nurtured heart approach to parenting: evaluation of its theoretical and empirical foundations.

    Science.gov (United States)

    Hektner, Joel M; Brennan, Alison L; Brotherson, Sean E

    2013-09-01

    The Nurtured Heart Approach to parenting (NHA; Glasser & Easley, 2008) is summarized and evaluated in terms of its alignment with current theoretical perspectives and empirical evidence in family studies and developmental science. Originally conceived and promoted as a behavior management approach for parents of difficult children (i.e., with behavior disorders), NHA is increasingly offered as a valuable strategy for parents of any children, despite a lack of published empirical support. Parents using NHA are trained to minimize attention to undesired behaviors, provide positive attention and praise for compliance with rules, help children be successful by scaffolding and shaping desired behavior, and establish a set of clear rules and consequences. Many elements of the approach have strong support in the theoretical and empirical literature; however, some of the assumptions are more questionable, such as that negative child behavior can always be attributed to unintentional positive reinforcement by parents responding with negative attention. On balance, NHA appears to promote effective and validated parenting practices, but its effectiveness now needs to be tested empirically. © FPI, Inc.

  15. Application of GIS to Empirical Windthrow Risk Model in Mountain Forested Landscapes

    Directory of Open Access Journals (Sweden)

    Lukas Krejci

    2018-02-01

    Full Text Available Norway spruce dominates mountain forests in Europe. Natural variations in the mountainous coniferous forests are strongly influenced by all the main components of forest and landscape dynamics: species diversity, the structure of forest stands, nutrient cycling, carbon storage, and other ecosystem services. This paper deals with an empirical windthrow risk model based on the integration of logistic regression into GIS to assess forest vulnerability to wind-disturbance in the mountain spruce forests of Šumava National Park (Czech Republic. It is an area where forest management has been the focus of international discussions by conservationists, forest managers, and stakeholders. The authors developed the empirical windthrow risk model, which involves designing an optimized data structure containing dependent and independent variables entering logistic regression. The results from the model, visualized in the form of map outputs, outline the probability of risk to forest stands from wind in the examined territory of the national park. Such an application of the empirical windthrow risk model could be used as a decision support tool for the mountain spruce forests in a study area. Future development of these models could be useful for other protected European mountain forests dominated by Norway spruce.
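A windthrow-risk model of the kind integrated into GIS here is, at its core, a logistic regression whose fitted probabilities are mapped per stand. The sketch below uses entirely synthetic data; the predictor names (stand height, slope, soil wetness) and the generating coefficients are illustrative assumptions, not the layers or parameters used for Šumava National Park.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
height = rng.uniform(10, 40, n)   # stand height (m)
slope = rng.uniform(0, 35, n)     # terrain slope (degrees)
wetness = rng.uniform(0, 1, n)    # soil-wetness index
# hypothetical generating process: taller stands on wetter soils fail more
logit = -6.0 + 0.15 * height + 0.02 * slope + 2.0 * wetness
blown_down = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([height, slope, wetness])
model = LogisticRegression(max_iter=1000).fit(X, blown_down)
risk = model.predict_proba(X)[:, 1]  # per-stand probability, mappable in GIS
```

The fitted probabilities are exactly what would be joined back to stand polygons to produce the map outputs the abstract describes.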

  16. The extended reciprocity: Strong belief outperforms persistence.

    Science.gov (United States)

    Kurokawa, Shun

    2017-05-21

    The existence of cooperation is a mysterious phenomenon and demands explanation, and direct reciprocity is one key potential explanation for the evolution of cooperation. Direct reciprocity allows cooperation to evolve for cooperators who switch their behavior on the basis of information about the opponent's behavior. Relevant to direct reciprocity, however, is information deficiency: when the opponent's last move is unknown, how should players behave? One possibility is to choose cooperation with some default probability, without using any further information; our previous paper (Kurokawa, 2016a) examined this strategy. However, there might be beneficial information other than the opponent's last move. A subsequent study of ours (Kurokawa, 2017) examined the strategy which uses one's own last move when the opponent's last move is unknown, and revealed that referring to one's own move and trying to imitate it when information is absent is beneficial. Is there any other beneficial information? What about strong belief (i.e., having infinite memory and believing that the opponent's behavior is unchanged)? Here, we examine the evolution of strategies with strong belief. Analyzing the repeated prisoner's dilemma game and using evolutionarily stable strategy (ESS) analysis against an invasion by unconditional defectors, we find that the strategy with strong belief is more likely to evolve than the strategy which uses no information other than the opponent's last move, and more likely to evolve than the strategy which uses not only the opponent's last move but also one's own last move. Strong belief produces the extended reciprocity and facilitates the evolution of cooperation. Additionally, we consider the two-strategy game between strategies with strong belief and any strategy, and the four-strategy game in which unconditional cooperators, unconditional defectors, pessimistic reciprocators with strong belief, and optimistic reciprocators with
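    A minimal simulation of the "strong belief" idea, under assumed payoffs and an assumed per-round observation-failure probability (neither taken from the paper): when the opponent's current move is unobserved, the player with strong belief falls back on the last move it ever observed, while the baseline reciprocator reverts to a default of cooperation.

    ```python
    import random

    # Standard prisoner's dilemma payoffs (R, S, T, P).
    PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

    def total_payoff(strong_belief, opponent_moves, p_miss, rng):
        """Total payoff of a reciprocator against a fixed opponent sequence."""
        last_seen = None   # most recently observed opponent move, ever
        prev_opp = None
        total = 0
        for opp in opponent_moves:
            observed = prev_opp is not None and rng.random() > p_miss
            if observed:
                last_seen = prev_opp
                my = prev_opp              # ordinary reciprocation
            elif strong_belief and last_seen is not None:
                my = last_seen             # believe the opponent is unchanged
            else:
                my = "C"                   # default when uninformed
            total += PAYOFF[(my, opp)]
            prev_opp = opp
        return total

    defector = ["D"] * 50
    strong = total_payoff(True, defector, 0.5, random.Random(1))
    naive = total_payoff(False, defector, 0.5, random.Random(1))
    print(strong, naive)  # strong belief loses less against all-defect
    ```

    Against unconditional defectors, the strong-belief player keeps defecting through observation gaps instead of drifting back to cooperation, which is the intuition behind its ESS advantage in the abstract.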

  17. Empirical ethics, context-sensitivity, and contextualism.

    Science.gov (United States)

    Musschenga, Albert W

    2005-10-01

    In medical ethics, business ethics, and some branches of political philosophy (multi-culturalism, issues of just allocation, and equitable distribution) the literature increasingly combines insights from ethics and the social sciences. Some authors in medical ethics even speak of a new phase in the history of ethics, hailing "empirical ethics" as a logical next step in the development of practical ethics after the turn to "applied ethics." The name empirical ethics is ill-chosen because of its associations with "descriptive ethics." Unlike descriptive ethics, however, empirical ethics aims to be both descriptive and normative. The first question on which I focus is what kind of empirical research is used by empirical ethics and for which purposes. I argue that the ultimate aim of all empirical ethics is to improve the context-sensitivity of ethics. The second question is whether empirical ethics is essentially connected with specific positions in meta-ethics. I show that in some kinds of meta-ethical theories, which I categorize as broad contextualist theories, there is an intrinsic need for connecting normative ethics with empirical social research. But context-sensitivity is a goal that can be aimed for from any meta-ethical position.

  18. Empirical Hamiltonians

    International Nuclear Information System (INIS)

    Peggs, S.; Talman, R.

    1987-01-01

    As proton accelerators get larger, and include more magnets, the conventional tracking programs which simulate them run slower. The purpose of this paper is to describe a method, still under development, in which element-by-element tracking around one turn is replaced by a single map, which can be processed far faster. It is assumed for this method that a conventional program exists which can perform faithful tracking in the lattice under study for some hundreds of turns, with all lattice parameters held constant. An empirical map is then generated by comparison with the tracking program. A procedure has been outlined for determining an empirical Hamiltonian, which can represent motion through many nonlinear kicks, by taking data from a conventional tracking program. Though derived by an approximate method, this Hamiltonian is analytic in form and can be subjected to further analysis of varying degrees of mathematical rigor. Even though the empirical procedure has only been described in one transverse dimension, there is good reason to hope that it can be extended to include two transverse dimensions, so that it can become a more practical tool in realistic cases.

  19. Empirical Models of Social Learning in a Large, Evolving Network.

    Directory of Open Access Journals (Sweden)

    Ayşe Başar Bener

    Full Text Available This paper advances theories of social learning through an empirical examination of how social networks change over time. Social networks are important for learning because they constrain individuals' access to information about the behaviors and cognitions of other people. Using data on a large social network of mobile device users over a one-month time period, we test three hypotheses: (1) attraction homophily causes individuals to form ties on the basis of attribute similarity, (2) aversion homophily causes individuals to delete existing ties on the basis of attribute dissimilarity, and (3) social influence causes individuals to adopt the attributes of others they share direct ties with. Statistical models offer varied degrees of support for all three hypotheses and show that these mechanisms are more complex than assumed in prior work. Although homophily is normally thought of as a process of attraction, people also avoid relationships with others who are different. These mechanisms have distinct effects on network structure. While social influence does help explain behavior, people tend to follow global trends more than they follow their friends.

  20. Empire as a Geopolitical Figure

    DEFF Research Database (Denmark)

    Parker, Noel

    2010-01-01

    This article analyses the ingredients of empire as a pattern of order with geopolitical effects. Noting the imperial form's proclivity for expansion from a critical reading of historical sociology, the article argues that the principal manifestation of earlier geopolitics lay not in the nation...... but in empire. That in turn has been driven by a view of the world as disorderly and open to the ordering will of empires (emanating, at the time of geopolitics' inception, from Europe). One implication is that empires are likely to figure in the geopolitical ordering of the globe at all times, in particular...... after all that has happened in the late twentieth century to undermine nationalism and the national state. Empire is indeed a probable, even for some an attractive form of regime for extending order over the disorder produced by globalisation. Geopolitics articulated in imperial expansion is likely...

  1. A comparison of social accounting between local public healthcare services:An empirical research

    Directory of Open Access Journals (Sweden)

    Paolo Ursillo

    2010-03-01

    Full Text Available

    Introduction: Social accounting in healthcare is a quantitative-qualitative accounting tool which marks the bond between a business and its social background. It presents healthcare business results and information to the stakeholders. At present its use is not widespread in Italy, although such reports are often published in the United States and other countries.

    Methods: This work is based upon empirical research studying the social accounting of the Local Health Units (LHU, Italian ASL) of Adria, Brindisi, Firenze, and the Umbria region, published between 2006 and 2008. These documents have been analyzed by studying the business structure, healthcare services, social and economic conditions, financial status, performance indexes, and much more data about most company activities.

    Results: Accountability in Italy has been studied carefully through longitudinal and cross-sectional analysis, observing models and contents and elaborating a concrete proposal for social accounting.

    Discussion: Social accounting in healthcare can provide important information to non-expert users and expert technicians alike, allowing the former to take more conscious decisions and the latter to study business aspects more deeply. This is made possible by the consideration of extended economic data available in other accountability forms (such as the annual financial statement) and of further performance indexes which give the end user valuable data about social impact, efficiency, and effectiveness.

  2. Comparison of two interpolation methods for empirical mode decomposition based evaluation of radiographic femur bone images.

    Science.gov (United States)

    Udhayakumar, Ganesan; Sujatha, Chinnaswamy Manoharan; Ramakrishnan, Swaminathan

    2013-01-01

    Analysis of bone strength in radiographic images is an important component of estimation of bone quality in diseases such as osteoporosis. Conventional radiographic femur bone images are used to analyze their architecture using the bi-dimensional empirical mode decomposition method. Surface interpolation of the local maxima and minima points of an image is a crucial part of bi-dimensional empirical mode decomposition, and the choice of appropriate interpolation depends on the specific structure of the problem. In this work, two interpolation methods for bi-dimensional empirical mode decomposition are analyzed to characterize the trabecular femur bone architecture of radiographic images. The trabecular bone regions of normal and osteoporotic femur bone images (N = 40) recorded under standard conditions are used for this study. The compressive and tensile strength regions of the images are delineated using pre-processing procedures. The delineated images are decomposed into their corresponding intrinsic mode functions using interpolation methods such as radial basis function multiquadric and hierarchical B-spline techniques. Results show that bi-dimensional empirical mode decomposition analyses using both interpolations are able to represent architectural variations of femur bone radiographic images. As the strength of the bone depends on architectural variation in addition to bone mass, this study seems to be clinically useful.
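    The multiquadric radial-basis-function surface fitting used for the envelope-interpolation step of bi-dimensional EMD can be sketched as follows; the sample points and the shape parameter `eps` are invented for illustration.

    ```python
    import numpy as np

    def multiquadric_interpolate(xy, values, xy_query, eps=1.0):
        """Interpolate scattered 2-D data with phi(r) = sqrt(r^2 + eps^2)."""
        # Pairwise distances between the scattered sample points.
        dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        # Solve for the RBF weights so the surface passes through the samples.
        weights = np.linalg.solve(np.sqrt(dist**2 + eps**2), values)
        # Evaluate the fitted surface at the query points.
        dq = np.linalg.norm(xy_query[:, None, :] - xy[None, :, :], axis=-1)
        return np.sqrt(dq**2 + eps**2) @ weights

    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    vals = np.array([0.0, 1.0, 1.0, 2.0])  # samples of f(x, y) = x + y
    center = multiquadric_interpolate(pts, vals, np.array([[0.5, 0.5]]))
    print(center)  # ≈ [1.0] by symmetry
    ```

    In bi-dimensional EMD, this kind of fit is applied to the image's local maxima (upper envelope) and local minima (lower envelope) at each sifting iteration.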

  3. Gender and Subject Choice: An Empirical Study on Undergraduate Students' Majors in Phnom Penh

    Science.gov (United States)

    Dom, Vannak; Yi, Gihong

    2018-01-01

    The empirical study on 610 undergraduate students between the age of 16 to 25 in Phnom Penh, Cambodia, was set to examine the relationship of gender and subject choice. The findings have revealed that women were overrepresented in non-science subjects and their gender identity has strong connection with subject choice (*** p < 0.001). The study…

  4. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    Energy Technology Data Exchange (ETDEWEB)

    Spackman, Peter R.; Karton, Amir, E-mail: amir.karton@uwa.edu.au [School of Chemistry and Biochemistry, The University of Western Australia, Perth, WA 6009 (Australia)

    2015-05-15

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/L^α two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol^–1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol^–1.
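    The two-point extrapolation E(L) = E_CBS + B/L^α can be solved in closed form from two basis-set levels; the DZ/TZ energies and the exponent α = 3 below are illustrative values, not results from the paper.

    ```python
    def extrapolate_cbs(e_lo, e_hi, l_lo, l_hi, alpha=3.0):
        """Solve E(L) = E_CBS + B/L^alpha at two cardinal numbers for E_CBS."""
        return (e_hi * l_hi**alpha - e_lo * l_lo**alpha) / (l_hi**alpha - l_lo**alpha)

    # Hypothetical CCSD correlation energies (hartree) at DZ (L=2) and TZ (L=3):
    e_dz, e_tz = -0.2500, -0.2800
    e_cbs = extrapolate_cbs(e_dz, e_tz, 2, 3)
    print(e_cbs)  # lies below E_TZ, toward the basis-set limit
    ```

    A system-dependent scheme of the kind described would replace the fixed α with an exponent fitted from cheaper MP2 energies for the same pair of basis sets.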

  5. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    International Nuclear Information System (INIS)

    Spackman, Peter R.; Karton, Amir

    2015-01-01

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/L^α two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol^–1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol^–1.

  6. Strong eukaryotic IRESs have weak secondary structure.

    Directory of Open Access Journals (Sweden)

    Xuhua Xia

    Full Text Available BACKGROUND: The objective of this work was to investigate the hypothesis that eukaryotic Internal Ribosome Entry Sites (IRESs) lack secondary structure, and to examine the generality of the hypothesis. METHODOLOGY/PRINCIPAL FINDINGS: IRESs of the yeast and the fruit fly are located in the 5'UTR immediately upstream of the initiation codon. The minimum folding energy (MFE) of 60 nt RNA segments immediately upstream of the initiation codons was calculated as a proxy of secondary structure stability. The MFE of the reverse complements of these 60 nt segments was also calculated. The relationship between MFE and empirically determined IRES activity was investigated to test the hypothesis that strong IRES activity is associated with weak secondary structure. We show that IRES activity in the yeast and the fruit fly correlates strongly with structural stability, with the highest IRES activity found in RNA segments that exhibit the weakest secondary structure. CONCLUSIONS: We found that a subset of eukaryotic IRESs exhibits very low secondary structure in the 5'-UTR sequences immediately upstream of the initiation codon. The consistency in results between the yeast and the fruit fly suggests a possible shared mechanism of cap-independent translation initiation that relies on an unstructured RNA segment.

  7. [Normative-empirical determination of personnel requirements in psychosomatic medicine and psychotherapy].

    Science.gov (United States)

    Heuft, Gereon; Hochlehnert, Achim; Barufka, Steffi; Nikendei, Christoph; Kruse, Johannes; Zipfel, Stephan; Hofmann, Tobias; Hildenbrand, Gerhard; Cuntz, Ulrich; Herzog, Wolfgang; Heller, Michael

    2015-01-01

    There is a high degree of misallocated medical care for patients with somatoform disorders and patients with concomitant mental diseases. This complex of problems could be reduced remarkably by integrating psychosomatic departments into hospitals with maximum medical care. Admitting a few big psychosomatic specialist clinics into the calculation basis decreased the Day-Mix Index (DMI). The massive reduction of the calculated costs per day leads to a gap in funding resulting in a loss of the necessary personnel requirements - at least in university psychosomatic departments. The objective of this article is therefore to empirically verify the reference numbers of personnel resources calculated on the basis of the new German lump-sum reimbursement system in psychiatry and psychosomatics (PEPP). The minute values of the reference numbers of Heuft (1999) are contrasted with the minute values of the PEPP reimbursement system in the years 2013 and 2014, as calculated by the Institute for Payment Systems in Hospitals (InEK). The minute values derived from the PEPP data show a remarkable convergence with the minute values of Heuft's reference numbers (1999). A pure pricing system like the PEPP reimbursement system as designed so far threatens the empirically verifiable and qualified personnel requirements of psychosomatic departments. In order to ensure the necessary therapy dosage and display it in minute values according to the valid OPS procedure codes, the minimum limit of the reference numbers is mandatory to maintain the substance of psychosomatic care. Based on the present calculation, a base rate of at least 285 € has to be politically demanded. Future developments in personnel costs have to be refinanced at 100 %.

  8. Theological reflections on empire

    Directory of Open Access Journals (Sweden)

    Allan A. Boesak

    2009-11-01

    Full Text Available Since the meeting of the World Alliance of Reformed Churches in Accra, Ghana (2004), and the adoption of the Accra Declaration, a debate has been raging in the churches about globalisation, socio-economic justice, ecological responsibility, political and cultural domination and globalised war. Central to this debate is the concept of empire and the way the United States is increasingly becoming its embodiment. Is the United States a global empire? This article argues that the United States has indeed become the expression of a modern empire and that this reality has considerable consequences, not just for global economics and politics but for theological reflection as well.

  9. Density matrix of strongly coupled quantum dot - microcavity system

    International Nuclear Information System (INIS)

    Nguyen Van Hop

    2009-01-01

    Any two-level quantum system can be used as a quantum bit (qubit) - the basic element of all devices and systems for quantum information and quantum computation. Recently it was proposed to study the strongly coupled system consisting of a two-level quantum dot and a monoenergetic photon gas in a microcavity (the strongly coupled quantum dot-microcavity (QD-MC) system for short), with the Jaynes-Cummings total Hamiltonian, for application in quantum information processing. Different approximations were applied in the theoretical study of this system. In this work, on the basis of the exact solution of the Schrodinger equation for this system without dissipation, we derive the exact formulae for its density matrix. The realization of a qubit in this system is discussed. The solution of the system of rate equations for the strongly coupled QD-MC system in the presence of interaction with the environment was also established in the first-order approximation with respect to this interaction.

  10. Kindness in Australia: an empirical critique of moral decline sociology.

    Science.gov (United States)

    Habibis, Daphne; Hookway, Nicholas; Vreugdenhil, Anthea

    2016-09-01

    A new sociological agenda is emerging that interrogates how morality can be established in the absence of the moral certainties of the past, but there is a shortage of empirical work on this topic. This article establishes a theoretical framework for the empirical analysis of everyday morality, drawing on the work of theorists including Ahmed, Bauman and Taylor. It uses the Australian Survey of Social Attitudes to assess the state and shape of contemporary moralities by asking how kind Australians are, how the expression of kindness is socially distributed, and what the motivations for kindness are. The findings demonstrate that Australians exhibit a strong attachment and commitment to kindness as a moral value that is primarily motivated by interiorized sources of moral authority. We argue these findings support the work of theorists such as Ahmed and Taylor who argue that authenticity and embodied emotion are legitimate sources of morality in today's secular societies. The research also provides new evidence that generational changes are shaping understandings and practices of kindness in unexpected ways. © London School of Economics and Political Science 2016.

  11. Strong coupling in a gauge invariant field theory

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, K. [Physics Department, Massachusetts Institute of Technology, Cambridge, MA (United States)

    1963-01-15

    I would like to discuss some approximations which may be significant in the domain of strong coupling in a field system analogous to quantum electrodynamics. The motivation of this work is the idea that the strong couplings and elementary particle spectrum may be the consequence of the dynamics of a system whose underlying description is in terms of a set of Fermi fields gauge invariantly coupled to a single ("bare") massless neutral vector field. The basis of this gauge invariance would of course be the exact conservation law of baryons or "nucleonic charge". It seems to me that a coupling scheme based on an invariance principle is most attractive if that invariance is an exact one. It would then be nice to try to account for the approximate invariance principles in the same way one would describe "accidental degeneracies" in any quantum system.

  12. REFORMASI SISTEM AKUNTANSI CASH BASIS MENUJU SISTEM AKUNTANSI ACCRUAL BASIS

    Directory of Open Access Journals (Sweden)

    Yuri Rahayu

    2016-03-01

    Full Text Available Abstract – The accounting reform movement was born with the aim of restructuring toward improvement. This movement is marked by the enactment of the 2003 Law and Law No. 1 of 2004, which became the basis of Government Regulation No. 24 of 2005 on Government Accounting Standards (SAP). In general, accounting is based on two systems: the cash basis and the accrual basis. In practice, students are still confused by the differences between the two methods, which results in a lack of understanding of the recording treatment. The purpose of this research is to provide a reference for students learning basic accounting, so that it can give more meaningful information about and understanding of the cash-basis and accrual-basis accounting methods. This research was conducted through a normative approach, by examining reference documents in a literature study that combines trusted sources, from both books and the internet, processed with a foundation of the author's knowledge and experience. The conclusion can be drawn that, to understand the difference between the cash-basis and accrual-basis systems, students basically require an understanding of the treatment under both methods. To gain this ability and understanding of both systems, reading exercises and reference sources are required. Keywords: reform, cash basis, accrual basis. Abstract (translated from Indonesian) – The accounting reform movement was born with the aim of restructuring toward improvement. This movement was marked by the issuance of the 2003 Law and Law No. 1 of 2004, which became the basis for Government Regulation No. 24 of 2005 on Government Accounting Standards (SAP). In general, accounting records are based on two systems: the cash basis and the accrual basis. In practice, students have so far been confused by the difference between the two methods, so that
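    The cash-basis versus accrual-basis distinction discussed above can be illustrated with a toy example: a service invoiced in March but paid in April. The month names and the amount are invented for the example.

    ```python
    def revenue_by_month(basis, invoiced, paid, amount):
        """Recognize revenue in the payment month (cash) or invoice month (accrual)."""
        month = paid if basis == "cash" else invoiced
        return {month: amount}

    print(revenue_by_month("cash", "March", "April", 100))     # {'April': 100}
    print(revenue_by_month("accrual", "March", "April", 100))  # {'March': 100}
    ```

    The same transaction thus lands in different reporting periods under the two systems, which is exactly the point of confusion the abstract describes.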

  13. The dynamics of the rise and fall of empires

    Science.gov (United States)

    Gündüz, Güngör

    2016-05-01

    The rise of empires can be elucidated by treating them as living organisms, and the celebrated Verhulst or Lotka-Volterra dynamics can be used to understand the growth mechanisms of empires. Fast growth can be expressed by an exponential function, as in the case of the Macedonian empire of Alexander the Great, whereas sigmoidal growth can be expressed by a power-law equation, as in the case of the Roman and Ottoman empires. The superpowers Russia and the USA follow somewhat different mechanisms: Russia displays two different exponential growth behaviors, whereas the USA follows two different power-law behaviors. They did not disturb and mobilize their social capacity much during the course of their rise. The decline and collapse of an empire occur through a kind of fragmentation process, and the consequently formed small states become rather free in their behavior. The lands of the new states formed exhibit a hierarchical pattern, and the number of states having an area smaller than the largest one can be given either by an exponential or a power-law function. The exponential distribution pattern occurs when the states are quite free in their pursuits, but power-law behavior occurs when they are under the pressure of an empire or a strong state in the region. The geological and geographical conditions also affect whether exponential or power-law behavior occurs. The new unions formed, such as the European Union and the Shanghai Cooperation, increase the power-law exponent, implying that they increase the stress in international affairs. The viscoelastic behavior of the empires can be found from the scattering diagrams, and the storage modulus (G′) and loss modulus (G″), and the associated work-like and heat-like terms, can be determined in the sense of thermodynamics. The G′ of the Ottomans was larger than that of the Romans, implying that they confronted severe resistance during their expansion. The G′ of Russia is also larger than that of the USA; in fact the
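    The two growth laws contrasted above can be sketched numerically: unchecked exponential expansion versus Verhulst (logistic) growth that saturates at a carrying capacity K. All parameters here are illustrative, not fitted to any empire.

    ```python
    def exponential(a0, r, steps):
        """Unchecked growth: a multiplies by (1 + r) each step."""
        series = [a0]
        for _ in range(steps):
            series.append(series[-1] * (1 + r))
        return series

    def verhulst(a0, r, k, steps):
        """Discrete logistic growth: the gain shrinks as a approaches k."""
        series = [a0]
        for _ in range(steps):
            a = series[-1]
            series.append(a + r * a * (1 - a / k))
        return series

    exp_curve = exponential(1.0, 0.3, 30)
    log_curve = verhulst(1.0, 0.3, 100.0, 30)
    print(round(exp_curve[-1], 1), round(log_curve[-1], 1))
    ```

    The exponential curve grows without bound while the Verhulst curve flattens near K, reproducing the fast-growth versus sigmoidal-growth contrast in the abstract.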

  14. The neural basis of testable and non-testable beliefs.

    Directory of Open Access Journals (Sweden)

    Jonathon R Howlett

    Full Text Available Beliefs about the state of the world are an important influence on both normal behavior and psychopathology. However, understanding of the neural basis of belief processing remains incomplete, and several aspects of belief processing have only recently been explored. Specifically, different types of beliefs may involve fundamentally different inferential processes and thus recruit distinct brain regions. Additionally, neural processing of truth and falsity may differ from processing of certainty and uncertainty. The purpose of this study was to investigate the neural underpinnings of assessment of testable and non-testable propositions in terms of truth or falsity and the level of certainty in a belief. Functional magnetic resonance imaging (fMRI) was used to study 14 adults while they rated propositions as true or false and also rated the level of certainty in their judgments. Each proposition was classified as testable or non-testable. Testable propositions activated the DLPFC and posterior cingulate cortex, while non-testable statements activated areas including the inferior frontal gyrus, superior temporal gyrus, and an anterior region of the superior frontal gyrus. No areas were more active when a proposition was accepted, while the dorsal anterior cingulate was activated when a proposition was rejected. Regardless of whether a proposition was testable or not, certainty that the proposition was true or false activated a common network of regions including the medial prefrontal cortex, caudate, posterior cingulate, and a region of the middle temporal gyrus near the temporo-parietal junction. Certainty in the truth or falsity of a non-testable proposition (a strong belief without empirical evidence) activated the insula. The results suggest that different brain regions contribute to the assessment of propositions based on the type of content, while a common network may mediate the influence of beliefs on motivation and behavior based on the level of

  15. Does better information about hospital quality affect patients’ choice? Empirical findings from Germany

    OpenAIRE

    Wübker, Ansgar; Sauerland, Dirk; Wübker, Achim

    2008-01-01

    Background: Economic theory strongly suggests that better information about the quality of care affects patients' choice of health service providers. However, we have little empirical evidence about the impact of the information provided on patients' choice of provider in Germany. Problem: In Germany, publicly available information about hospital quality has only recently become available. For example, 50 percent of the hospitals in the Rhine-Ruhr area now publish their quality data voluntarily in a comprehensive, underst...

  16. Why borrowers pay premiums to larger lenders: Empirical evidence from sovereign syndicated loans

    OpenAIRE

    Hallak, Issam

    2002-01-01

    All other terms being equal (e.g. seniority), syndicated loan contracts provide larger lending compensations (in percentage points) to institutions funding larger amounts. This paper explores empirically the motivation for such a price design on a sample of sovereign syndicated loans in the period 1990-1997. I find strong evidence that a larger premium is associated with higher renegotiation probability and information asymmetries. It hardly has any impact on the number of lenders though. Thi...

  17. Daylight Influence on Colour Design : Empirical Study on Perceived Colour and Colour Experience Indoors

    OpenAIRE

    Hårleman, Maud

    2007-01-01

    It is known that one and the same interior colouring will appear different in rooms with windows facing north or facing south, but it is not known how natural daylight from these two compass points affects perceived colour and the ways in which colour is experienced. The objective is to describe the perceived colours to be expected in rooms with sunlight and diffused light, and thus develop a tool for colour design. Two empirical investigations provide the basis for six attached papers. The m...

  18. Advanced Test Reactor Safety Basis Upgrade Lessons Learned Relative to Design Basis Verification and Safety Basis Management

    International Nuclear Information System (INIS)

    G. L. Sharp; R. T. McCracken

    2004-01-01

    The Advanced Test Reactor (ATR) is a pressurized light-water reactor with a design thermal power of 250 MW. The principal function of the ATR is to provide a high neutron flux for testing reactor fuels and other materials. The reactor also provides other irradiation services such as radioisotope production. The ATR and its support facilities are located at the Test Reactor Area of the Idaho National Engineering and Environmental Laboratory (INEEL). An audit conducted by the Department of Energy's Office of Independent Oversight and Performance Assurance (DOE OA) raised concerns that design conditions at the ATR were not adequately analyzed in the safety analysis and that legacy design basis management practices had the potential to further impact safe operation of the facility. The concerns identified by the audit team, and issues raised during additional reviews performed by ATR safety analysts, were evaluated through the unreviewed safety question process, resulting in shutdown of the ATR for more than three months while these concerns were resolved. Past management of the ATR safety basis, relative to facility design basis management and change control, led to concerns that discrepancies in the safety basis may have developed. Although not required by DOE orders or regulations, not performing design basis verification in conjunction with development of the 10 CFR 830 Subpart B upgraded safety basis allowed these potential weaknesses to be carried forward. Configuration management and a clear definition of the existing facility design basis have a direct relation to developing and maintaining a high-quality safety basis which properly identifies and mitigates all hazards and postulated accident conditions. These relations and the impact of past safety basis management practices have been reviewed in order to identify lessons learned from the safety basis upgrade process and appropriate actions to resolve possible concerns with respect to the current ATR safety

  19. The inherited basis of human radiosensitivity

    International Nuclear Information System (INIS)

    Gatti, R.A.

    2001-01-01

    Certain individuals cannot tolerate 'conventional' doses of radiation therapy. This is known to be true of patients with ataxia-telangiectasia and ligase IV deficiency. Although in vitro testing may not correlate completely with clinical radiosensitivity, fibroblasts and lymphoblasts from patients with both of these disorders have been clearly shown to be radiosensitive. Using a colony survival assay (CSA) to test lymphoblastoid cells after irradiation with 1 Gy, a variety of other genetic disorders have been identified as strong candidates for clinical radiosensitivity, such as Nijmegen breakage syndrome, Mre11 deficiency, and Fanconi's anemia. These data are presented and considered as a starting-point for the inherited basis of human radiosensitivity

  20. Birds of the Mongol Empire

    OpenAIRE

    Eugene N. Anderson

    2016-01-01

    The Mongol Empire, the largest contiguous empire the world has ever known, had, among other things, a goodly number of falconers, poultry raisers, birdcatchers, cooks, and other experts on various aspects of birding. We have records of this, largely in the Yinshan Zhengyao, the court nutrition manual of the Mongol empire in China (the Yuan Dynasty). It discusses in some detail 22 bird taxa, from swans to chickens. The Huihui Yaofang, a medical encyclopedia, lists ten taxa used medicinally. Ma...

  1. Wireless and empire geopolitics radio industry and ionosphere in the British Empire 1918-1939

    CERN Document Server

    Anduaga, Aitor

    2009-01-01

    Although the product of consensus politics, the British Empire was based on communications supremacy and the knowledge of the atmosphere. Focusing on science, industry, government, the military, and education, this book studies the relationship between wireless and Empire throughout the interwar period.

  2. An Empirical Spectroscopic Database for Acetylene in the Regions of 5850-9415 CM^{-1}

    Science.gov (United States)

    Campargue, Alain; Lyulin, Oleg

    2017-06-01

    Six studies have been recently devoted to a systematic analysis of the high-resolution near infrared absorption spectrum of acetylene recorded by Cavity Ring Down Spectroscopy (CRDS) in Grenoble and by Fourier-transform spectroscopy (FTS) in Brussels and Hefei. On the basis of these works, in the present contribution, we construct an empirical database for acetylene in the 5850-9415 cm^{-1} region, excluding the 6341-7000 cm^{-1} interval corresponding to the very strong ν1 + ν3 manifold. The database gathers and extends information included in our CRDS and FTS studies. In particular, the intensities of about 1700 lines measured by CRDS in the 7244-7920 cm^{-1} region are reported for the first time, together with those of several bands of ^{12}C^{13}CH_{2} present in natural isotopic abundance in the acetylene sample. The Herman-Wallis coefficients of most of the bands are derived from a fit of the measured intensity values. A recommended line list is provided with positions calculated using empirical spectroscopic parameters of the lower and upper vibrational energy levels and intensities calculated using the derived Herman-Wallis coefficients. This approach allows completing the experimental list by adding missing lines and improving poorly determined positions and intensities. As a result, the constructed line list includes a total of 10973 lines belonging to 146 bands of ^{12}C_{2}H_{2} and 29 bands of ^{12}C^{13}CH_{2}. For comparison, the HITRAN2012 database in the same region includes 869 lines of 14 bands, all belonging to ^{12}C_{2}H_{2}. Our weakest lines have an intensity on the order of 10^{-29} cm/molecule, about three orders of magnitude smaller than the HITRAN intensity cut-off. Line profile parameters are added to the line list, which is provided in HITRAN format. The comparison to the HITRAN2012 line list or to results obtained using the global effective operator approach is discussed in terms of completeness and accuracy.
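    The Herman-Wallis fit mentioned above can be sketched numerically. A commonly used form of the factor for a linear molecule is F(m) = (1 + A1·m + A2·m²)², so that I(m) = Sv·F(m); taking square roots makes the fit linear in the coefficients. The band strength and coefficient values below are invented for illustration, not taken from the acetylene data.

```python
import numpy as np

# Sketch of a Herman-Wallis fit. Assumed factor form for a linear molecule:
# F(m) = (1 + A1*m + A2*m**2)**2, with line intensity I(m) = Sv * F(m),
# where m is the rotational running index (m = -J for P lines, J+1 for R lines).
def fit_herman_wallis(m, intensity):
    """Return (Sv, A1, A2) from a square-root-linearised least-squares fit."""
    y = np.sqrt(intensity)        # sqrt(I) = sqrt(Sv) * (1 + A1*m + A2*m**2)
    X = np.vstack([np.ones_like(m), m, m ** 2]).T
    c, *_ = np.linalg.lstsq(X, y, rcond=None)
    return c[0] ** 2, c[1] / c[0], c[2] / c[0]

# Synthetic band with invented values: Sv = 2.5e-23 cm/molecule,
# A1 = 0.01, A2 = -4e-4 (illustration only, not the acetylene data).
m = np.array([k for k in range(-20, 21) if k != 0], dtype=float)
I = 2.5e-23 * (1 + 0.01 * m - 4e-4 * m ** 2) ** 2
Sv, A1, A2 = fit_herman_wallis(m, I)
```

    The square-root linearisation is only valid while (1 + A1·m + A2·m²) stays positive over the fitted m range, which holds for weak Herman-Wallis corrections such as these.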

  3. Empirical Music Aesthetics

    DEFF Research Database (Denmark)

    Grund, Cynthia M.

    The toolbox for empirically exploring the ways that artistic endeavors convey and activate meaning on the part of performers and audiences continues to expand. Current work employing methods at the intersection of performance studies, philosophy, motion capture and neuroscience to better understand...... musical performance and reception is inspired by traditional approaches within aesthetics, but it also challenges some of the presuppositions inherent in them. As an example of such work I present a research project in empirical music aesthetics begun last year and of which I am a team member....

  4. Towards the Genomic Basis of Local Adaptation in Landraces

    Directory of Open Access Journals (Sweden)

    Giandomenico Corrado

    2017-11-01

    Landraces are key elements of agricultural biodiversity that have long been considered a source of useful traits. Their importance goes beyond subsistence agriculture and the essential need to preserve genetic diversity, because landraces are farmer-developed populations that are often adapted to environmental conditions of significance to tackle environmental concerns. It is therefore increasingly important to identify adaptive traits in crop landraces and understand their molecular basis. This knowledge is potentially useful for promoting more sustainable agricultural techniques, reducing the environmental impact of high-input cropping systems, and diminishing the vulnerability of agriculture to global climate change. In this review, we present an overview of the opportunities and limitations offered by landraces’ genomics. We discuss how rapid advances in DNA sequencing techniques, plant phenotyping, and recombinant DNA-based biotechnology encourage both the identification and the validation of the genomic signature of local adaptation in crop landraces. The integration of ‘omics’ sciences, molecular population genetics, and field studies can provide information inaccessible with earlier technological tools. Although empirical knowledge on the genetic and genomic basis of local adaptation is still fragmented, it is predicted that genomic scans for adaptation will unlock an intraspecific molecular diversity that may be different from that of modern varieties.

  5. Empirical philosophy of science

    DEFF Research Database (Denmark)

    Wagenknecht, Susann; Nersessian, Nancy J.; Andersen, Hanne

    2015-01-01

    A growing number of philosophers of science make use of qualitative empirical data, a development that may reconfigure the relations between philosophy and sociology of science and that is reminiscent of efforts to integrate history and philosophy of science. Therefore, the first part...... of this introduction to the volume Empirical Philosophy of Science outlines the history of relations between philosophy and sociology of science on the one hand, and philosophy and history of science on the other. The second part of this introduction offers an overview of the papers in the volume, each of which...... is giving its own answer to questions such as: Why does the use of qualitative empirical methods benefit philosophical accounts of science? And how should these methods be used by the philosopher?...

  6. Dynamic Modeling of a Reformed Methanol Fuel Cell System using Empirical Data and Adaptive Neuro-Fuzzy Inference System Models

    DEFF Research Database (Denmark)

    Justesen, Kristian Kjær; Andreasen, Søren Juhl; Shaker, Hamid Reza

    2013-01-01

    In this work, a dynamic MATLAB Simulink model of a H3-350 Reformed Methanol Fuel Cell (RMFC) stand-alone battery charger produced by Serenergy is developed on the basis of theoretical and empirical methods. The advantage of RMFC systems is that they use liquid methanol as a fuel instead of gaseous...... of the reforming process are implemented. Models of the cooling flow of the blowers for the fuel cell and the burner which supplies process heat for the reformer are made. The two blowers have a common exhaust, which means that the two blowers influence each other’s output. The models take this into account using...... an empirical approach. Fin efficiency models for the cooling effect of the air are also developed using empirical methods. A fuel cell model is also implemented based on a standard model which is adapted to fit the measured performance of the H3-350 module. All the individual parts of the model are verified...

  7. Dynamic Modeling of a Reformed Methanol Fuel Cell System using Empirical Data and Adaptive Neuro-Fuzzy Inference System Models

    DEFF Research Database (Denmark)

    Justesen, Kristian Kjær; Andreasen, Søren Juhl; Shaker, Hamid Reza

    2014-01-01

    In this work, a dynamic MATLAB Simulink model of a H3-350 Reformed Methanol Fuel Cell (RMFC) stand-alone battery charger produced by Serenergy is developed on the basis of theoretical and empirical methods. The advantage of RMFC systems is that they use liquid methanol as a fuel instead of gaseous...... of the reforming process are implemented. Models of the cooling flow of the blowers for the fuel cell and the burner which supplies process heat for the reformer are made. The two blowers have a common exhaust, which means that the two blowers influence each other’s output. The models take this into account using...... an empirical approach. Fin efficiency models for the cooling effect of the air are also developed using empirical methods. A fuel cell model is also implemented based on a standard model which is adapted to fit the measured performance of the H3-350 module. All the individual parts of the model are verified...

  8. The evolutionary basis of human social learning.

    Science.gov (United States)

    Morgan, T J H; Rendell, L E; Ehn, M; Hoppitt, W; Laland, K N

    2012-02-22

    Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules.

  9. Recognizing of stereotypic patterns in epileptic EEG using empirical modes and wavelets

    Science.gov (United States)

    Grubov, V. V.; Sitnikova, E.; Pavlov, A. N.; Koronovskii, A. A.; Hramov, A. E.

    2017-11-01

    Epileptic activity in the form of spike-wave discharges (SWD) appears in the electroencephalogram (EEG) during absence seizures. This paper evaluates two approaches for detecting stereotypic rhythmic activities in EEG: the continuous wavelet transform (CWT) and the empirical mode decomposition (EMD). The CWT is a well-known method of time-frequency analysis of EEG, whereas EMD is a relatively novel approach for extracting a signal's waveforms. A new method for pattern recognition based on a combination of CWT and EMD is proposed. It was found that this combined approach resulted in a sensitivity of 86.5% and a specificity of 92.9% for sleep spindles, and 97.6% and 93.2% for SWD, respectively. Considering the strong within- and between-subject variability of sleep spindles, the obtained detection efficiency was high in comparison with other CWT-based methods. It is concluded that the combination of a wavelet-based approach and empirical modes increases the quality of automatic detection of stereotypic patterns in rat EEG.
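    A minimal illustration of the CWT side of such a detector, assuming a complex Morlet wavelet and a synthetic signal with a 10 Hz rhythmic burst (the EMD stage and the authors' actual parameters are not reproduced here):

```python
import numpy as np

# Minimal sketch of CWT-based detection of a rhythmic EEG-like burst.
# Assumptions: a complex Morlet wavelet and a synthetic test signal with a
# 10 Hz burst; this is not the authors' exact pipeline (their method also
# combines an EMD stage, omitted here).
def morlet_cwt(x, fs, freqs, w0=6.0):
    """Return one row of complex CWT coefficients per analysis frequency."""
    n = len(x)
    t = (np.arange(n) - n // 2) / fs
    out = np.empty((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        s = w0 / (2.0 * np.pi * f)                 # scale for this frequency
        psi = np.exp(1j * w0 * t / s) * np.exp(-t ** 2 / (2.0 * s ** 2))
        psi /= np.sqrt(s)
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return out

fs = 200.0
t = np.arange(0, 10, 1 / fs)
x = 0.1 * np.random.default_rng(0).standard_normal(len(t))
burst = (t >= 4) & (t < 6)
x[burst] += np.sin(2 * np.pi * 10 * t[burst])      # 10 Hz rhythmic burst

energy = np.abs(morlet_cwt(x, fs, freqs=np.array([10.0]))[0]) ** 2
detected = energy > 5 * np.median(energy)          # simple adaptive threshold
```

    Thresholding the wavelet energy against a multiple of its median is one simple way to flag rhythmic epochs; published detectors typically tune the threshold and frequency band per pattern type.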

  10. EMPIRICAL DISTRIBUTION OF STOCK RETURNS OF SOUTHEAST EUROPEAN EMERGING MARKETS

    Directory of Open Access Journals (Sweden)

    Aleksandar Naumoski

    2017-06-01

    The assumption that equity returns follow the normal distribution, most commonly made in financial economics theory and applications, is strongly rejected by the empirical evidence presented in this paper. As in many other studies, we confirm that stock returns follow a leptokurtic distribution, with skewness that is negative in most of the Southeast European (SEE) markets. This paper further investigates whether any distribution may be considered an optimal fit for stock returns in the SEE region. Using daily, weekly, and monthly data samples covering a period of five years from ten Southeast European emerging countries, we applied the Anderson-Darling goodness-of-fit test. We strongly rejected the aforementioned assumption of normality for all considered data samples and found that daily stock returns are best fitted by the Johnson SU distribution, whereas for weekly and monthly stock returns there was no single predominant distribution but several that can be considered a best fit.
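    The Anderson-Darling statistic used above can be computed directly from its definition; the sketch below tests normality on two synthetic samples (a Gaussian one and a deliberately heavy-tailed one), with mean and standard deviation estimated from the data as is usual in such studies:

```python
import math
import random

# Hedged sketch: the Anderson-Darling statistic against a normal distribution,
# computed from its definition with estimated parameters. Large values of A^2
# reject normality; both samples below are synthetic.
def anderson_darling_normal(sample):
    n = len(sample)
    mu = sum(sample) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in sample) / (n - 1))
    z = sorted((x - mu) / sd for x in sample)
    # Standard normal CDF via erf, clamped away from 0 and 1 for the logs
    cdf = [min(max(0.5 * (1 + math.erf(v / math.sqrt(2))), 1e-12), 1 - 1e-12)
           for v in z]
    return -n - sum(
        (2 * i + 1) * (math.log(cdf[i]) + math.log(1 - cdf[n - 1 - i]))
        for i in range(n)
    ) / n

rng = random.Random(1)
normal_sample = [rng.gauss(0, 1) for _ in range(2000)]
# Heavy-tailed (Cauchy) sample via the inverse CDF of the Cauchy distribution
heavy_sample = [math.tan(math.pi * (rng.random() - 0.5)) for _ in range(2000)]
```

    The Gaussian sample yields a small statistic while the Cauchy sample yields a very large one, mirroring the rejection of normality reported for the SEE return series.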

  11. Supervised neural network modeling: an empirical investigation into learning from imbalanced data with labeling errors.

    Science.gov (United States)

    Khoshgoftaar, Taghi M; Van Hulse, Jason; Napolitano, Amri

    2010-05-01

    Neural network algorithms such as multilayer perceptrons (MLPs) and radial basis function networks (RBFNets) have been used to construct learners which exhibit strong predictive performance. Two data-related issues that can have a detrimental impact on supervised learning initiatives are class imbalance and labeling errors (or class noise). Imbalanced data can make it more difficult for the neural network learning algorithms to distinguish between examples of the various classes, and class noise can lead to the formulation of incorrect hypotheses. Both class imbalance and labeling errors are pervasive problems encountered in a wide variety of application domains. Many studies have been performed to investigate these problems in isolation, but few have focused on their combined effects. This study presents a comprehensive empirical investigation using neural network algorithms to learn from imbalanced data with labeling errors. In particular, the first component of our study investigates the impact of class noise and class imbalance on two common neural network learning algorithms, while the second component considers the ability of data sampling (which is commonly used to address the issue of class imbalance) to improve their performance. Our results, for which over two million models were trained and evaluated, show that conclusions drawn using the more commonly studied C4.5 classifier may not apply when using neural networks.
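    One commonly used data-sampling remedy for class imbalance is random oversampling of minority classes; a minimal sketch (not the paper's exact experimental protocol) might look like:

```python
import numpy as np

# Illustrative random oversampling, one common data-sampling remedy for class
# imbalance (a sketch, not the paper's exact experimental setup).
def random_oversample(X, y, seed=0):
    """Duplicate minority-class rows until every class matches the majority."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = []
    for c, cnt in zip(classes, counts):
        rows = np.flatnonzero(y == c)
        idx.append(rows)
        if cnt < target:                  # draw extra rows with replacement
            idx.append(rng.choice(rows, size=target - cnt, replace=True))
    idx = np.concatenate(idx)
    return X[idx], y[idx]

X = np.arange(20).reshape(10, 2)
y = np.array([0] * 8 + [1] * 2)           # 8:2 class imbalance
Xb, yb = random_oversample(X, y)          # both classes now have 8 rows
```

    Oversampling leaves the feature values untouched and only duplicates rows, which is why it is often paired with noise-handling steps when labeling errors are also present.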

  12. How to build a strong global brand for an SME : Case: Company A

    OpenAIRE

    Huikko, Nicole

    2014-01-01

    This study falls into the category of product-oriented theses and examines brands, brand management, and brand building in an international B2B context. Through the application of theoretical concepts, analysis of empirical data, and practical actions, a global B2B branding plan for case company A is developed. The case company is an SME with the vision of becoming a globally recognized brand. The guiding research question is how to build a strong global B2B brand for an SME. The objective of ...

  13. Final Empirical Test Case Specification

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    This document includes the empirical specification for the IEA task of evaluating building energy simulation computer programs for Double Skin Facade (DSF) constructions. Two approaches are involved in this procedure: one is the comparative approach and the other is the empirical one....

  14. Empire vs. Federation

    DEFF Research Database (Denmark)

    Gravier, Magali

    2011-01-01

    The article discusses the concepts of federation and empire in the context of the European Union (EU). Even if these two concepts are not usually contrasted with one another, the article shows that they refer to related types of polities. Furthermore, they can be used together because they shed light...... on different and complementary aspects of the European integration process. The article concludes that the EU is at the crossroads between federation and empire and may remain an ‘imperial federation’ for several decades. This could mean that the EU is on the verge of transforming itself into another type...

  15. Empirical analysis of online human dynamics

    Science.gov (United States)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

    Patterns of human activities have attracted increasing academic interest, since a quantitative understanding of human behavior is helpful to uncover the origins of many socioeconomic phenomena. This paper focuses on the behaviors of Internet users. Six large-scale systems are studied in our experiments, including movie-watching in Netflix and MovieLens, transactions in Ebay, bookmark-collecting in Delicious, and posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) The total number of a user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions. (2) There exists a strong positive correlation between a user's activity and the total number of the user's actions, and a significantly negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that this method can to some extent eliminate the differences in statistics among users caused by their different activities, yet its effectiveness depends on the data sets.

  16. After-sales service to manufactured goods on technological basis

    Directory of Open Access Journals (Sweden)

    Miriam Borchardt

    2008-07-01

    This theoretical and exploratory paper aims to build a critical analysis of after-sales services, mainly with regard to technology-based manufactured goods. The purpose of the research is to achieve a better understanding of the essential elements that are to be taken into account in conceiving such a service, drawing on different approaches. After-sales service is part of the service package and can influence customer satisfaction. The studied issues can be integrated into policies guiding firms in designing after-sales services. They are: definition of the service itself; strategic issues; the facilities and premises; and operations management. We intend this theoretical research to be a prerequisite for launching further empirical research, mainly in the field of inter-organizational relationships. Keywords: service management; after-sales service; service operations; goods associated with services; inter-organizational relationships.

  17. On the strong metric dimension of generalized butterfly graph, starbarbell graph, and C_m ⊙ P_n graph

    Science.gov (United States)

    Yunia Mayasari, Ratih; Atmojo Kusmayadi, Tri

    2018-04-01

    Let G be a connected graph with vertex set V(G) and edge set E(G). For every pair of vertices u, v ∈ V(G), the interval I[u, v] between u and v is the collection of all vertices that belong to some shortest u - v path. A vertex s ∈ V(G) strongly resolves two vertices u and v if u belongs to a shortest v - s path or v belongs to a shortest u - s path. A vertex set S of G is a strong resolving set of G if every two distinct vertices of G are strongly resolved by some vertex of S. The strong metric basis of G is a strong resolving set with minimal cardinality. The strong metric dimension sdim(G) of a graph G is defined as the cardinality of a strong metric basis. In this paper we determine the strong metric dimension of the generalized butterfly graph, the starbarbell graph, and the C_m ⊙ P_n graph. We obtain that the strong metric dimension of the generalized butterfly graph is sdim(BF_n) = 2n - 2, the strong metric dimension of the starbarbell graph is sdim(SB_{m_1, m_2, ..., m_n}) = Σ_{i=1}^{n} (m_i - 1) - 1, and the strong metric dimension of the C_m ⊙ P_n graph is sdim(C_m ⊙ P_n) = 2m - 1 for m > 3 and n = 2, and sdim(C_m ⊙ P_n) = 2m - 2 for m > 3 and n > 2.
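    The strong-resolving definitions above can be checked by brute force on small graphs; the sketch below computes sdim(G) directly from the definition (for illustration only, since the paper's results are analytic). The test graphs P4 and K4, with known values sdim(P_n) = 1 and sdim(K_n) = n - 1, are our own choices.

```python
import itertools
from collections import deque

# Brute-force computation of the strong metric dimension from the definition,
# feasible for small graphs only (illustrative; the paper's results are analytic).
def bfs_dist(adj, s):
    """All shortest-path distances from s in an unweighted graph."""
    d = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in d:
                d[v] = d[u] + 1
                q.append(v)
    return d

def strong_metric_dimension(adj):
    V = sorted(adj)
    dist = {u: bfs_dist(adj, u) for u in V}

    def resolves(w, u, v):
        # w strongly resolves {u, v} iff u lies on a shortest v-w path
        # or v lies on a shortest u-w path
        return (dist[v][w] == dist[v][u] + dist[u][w]
                or dist[u][w] == dist[u][v] + dist[v][w])

    pairs = list(itertools.combinations(V, 2))
    for k in range(1, len(V) + 1):
        for S in itertools.combinations(V, k):
            if all(any(resolves(w, u, v) for w in S) for u, v in pairs):
                return k

# Test graphs chosen for their known values: sdim(P_n) = 1, sdim(K_n) = n - 1
p4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
k4 = {u: [v for v in range(4) if v != u] for u in range(4)}
```

    The search over subsets grows exponentially in |V(G)|, which is precisely why closed-form results like those in the paper are valuable.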

  18. The inherited basis of human radiosensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Gatti, R.A. [Univ. of California, School of Medicine, Los Angeles, CA (United States). Experimental Pathology

    2001-11-01

    Certain individuals cannot tolerate 'conventional' doses of radiation therapy. This is known to be true of patients with ataxia-telangiectasia and ligase IV deficiency. Although in vitro testing may not correlate completely with clinical radiosensitivity, fibroblasts and lymphoblasts from patients with both of these disorders have been clearly shown to be radiosensitive. Using a colony survival assay (CSA) to test lymphoblastoid cells after irradiation with 1 Gy, a variety of other genetic disorders have been identified as strong candidates for clinical radiosensitivity, such as Nijmegen breakage syndrome, Mre11 deficiency, and Fanconi's anemia. These data are presented and considered as a starting-point for the inherited basis of human radiosensitivity.

  19. The Apocalyptic Empire of America L’Empire apocalyptique américain

    Directory of Open Access Journals (Sweden)

    Akça Ataç

    2009-10-01

    In general, studies of the American “Empire” tend to seek to understand it in concrete terms such as the frontier, military intervention, and international trade. Yet empires are above all the product of deep, intangible intellectual traditions that encourage and justify the actions undertaken within the framework of imperial policies. In the American case, the intellectual foundations of the new imperial ideal are rooted in the apocalyptic vision that the first Puritan settlers brought with them. Without taking this apocalyptic grounding into account, one cannot fully grasp the fundamental principles of the American “Empire”. Terms that resonate with imperial discourse, such as “mission” and “destiny”, as well as the explicit commitment in presidential rhetoric to “improving” the world at any price, should be examined from the standpoint of this enduring apocalyptic belief. This article attempts to elucidate the origin and essence of the American apocalyptic vision, paying particular attention to its influence on the genesis of the concept of American Empire.

  20. Basis for calculations in the topological expansion

    International Nuclear Information System (INIS)

    Levinson, M.A.

    1982-12-01

    Investigations aimed at putting the topological theory of particles on a more quantitative basis are described. First, the incorporation of spin into the topological structure is discussed and shown to successfully reproduce the observed lowest mass hadron spectrum. The absence of parity-doubled states represents a significant improvement over previous efforts in similar directions. This theory is applied to the lowest order calculation of elementary hadron coupling constant ratios. SU(6)/sub W/ symmetry is maintained and extended via the notions of topological supersymmetry and universality. Finally, efforts to discover a perturbative basis for the topological expansion are described. This has led to the formulation of off-shell Feynman-like rules which provide a calculational scheme for the strong interaction components of the topological expansion once the zero-entropy connected parts are known. These rules are shown to imply a topological asymptotic freedom. Even though the nonlinear zero-entropy problem cannot itself be treated perturbatively, plausible general assumptions about zero-entropy amplitudes allow immediate qualitative inferences concerning physical hadrons. In particular, scenarios for mass splittings beyond the supersymmetric level are described

  1. The pointer basis and the feedback stabilization of quantum systems

    International Nuclear Information System (INIS)

    Li, L; Chia, A; Wiseman, H M

    2014-01-01

    The dynamics for an open quantum system can be ‘unravelled’ in infinitely many ways, depending on how the environment is monitored, yielding different sorts of conditioned states, evolving stochastically. In the case of ideal monitoring these states are pure, and the set of states for a given monitoring forms a basis (which is overcomplete in general) for the system. It has been argued elsewhere (Atkins et al 2005 Europhys. Lett. 69 163) that the ‘pointer basis’ as introduced by Zurek et al (1993 Phys. Rev. Lett. 70 1187), should be identified with the unravelling-induced basis which decoheres most slowly. Here we show the applicability of this concept of pointer basis to the problem of state stabilization for quantum systems. In particular we prove that for linear Gaussian quantum systems, if the feedback control is assumed to be strong compared to the decoherence of the pointer basis, then the system can be stabilized in one of the pointer basis states with a fidelity close to one (the infidelity varies inversely with the control strength). Moreover, if the aim of the feedback is to maximize the fidelity of the unconditioned system state with a pure state that is one of its conditioned states, then the optimal unravelling for stabilizing the system in this way is that which induces the pointer basis for the conditioned states. We illustrate these results with a model system: quantum Brownian motion. We show that even if the feedback control strength is comparable to the decoherence, the optimal unravelling still induces a basis very close to the pointer basis. However if the feedback control is weak compared to the decoherence, this is not the case. (paper)

  2. Review essay: empires, ancient and modern.

    Science.gov (United States)

    Hall, John A

    2011-09-01

    This essay draws attention to two books on empires by historians which deserve the attention of sociologists. Bang's model of the workings of the Roman economy powerfully demonstrates the tributary nature of pre-industrial empires. Darwin's analysis concentrates on modern overseas empires, wholly different in character as they involved the transportation of consumption items for the many rather than luxury goods for the few. Darwin is especially good at describing the conditions of existence of late nineteenth-century empires, noting that their demise was caused most of all by the failure of balance-of-power politics in Europe. Concluding thoughts are offered about the USA. © London School of Economics and Political Science 2011.

  3. PROBLEMS WITH WIREDU'S EMPIRICALISM Martin Odei Ajei1 ...

    African Journals Online (AJOL)

    In his “Empiricalism: The Empirical Character of an African Philosophy”, Kwasi Wiredu sets out ... others, that an empirical metaphysical system contains both empirical ..... realms which multiple categories of existents inhabit and conduct their being in .... to a mode of reasoning that conceives categories polarized by formal.

  4. Strong Nuclear Gravitational Constant and the Origin of Nuclear Planck Scale

    Directory of Open Access Journals (Sweden)

    Seshavatharam U. V. S.

    2010-07-01

    Whether it may be real or an equivalent, the existence of a strong nuclear gravitational constant G_S is assumed. Its value is obtained from Fermi's weak coupling constant as G_S = 6.9427284 × 10^31 m^3/(kg·sec^2), and thus a “nuclear Planck scale” is defined. For the strong interaction, the existence of a new integer-charged “confined fermion” of mass 105.383 MeV is assumed. The strong coupling constant is the ratio of the nuclear Planck energy = 11.97 MeV and the assumed 105.383 MeV, and 1/α_s = X_s is defined as the strong interaction mass generator. With the 105.383 MeV fermion, various nuclear unit radii are fitted. Fermi's weak coupling constant, the strong interaction upper limit, and the Bohr radius are fitted at a fundamental level. Considering Fermi's weak coupling constant and the nuclear Planck length, a new number X_e = 294.8183 is defined for fitting the electron, muon and tau rest masses. Using X_s, X_e and 105.32/137 ≈ 0.769 MeV as the Coulombic energy constant E_c, the energy coefficients of the semi-empirical mass formula are estimated as E_v = 16.32 MeV, E_s = 19.37 MeV, E_a = 23.86 MeV and E_p = 11.97 MeV, where the Coulombic energy term contains [Z]^2. Starting from Z = 2, nuclear binding energy is fitted with two terms along with only one energy constant = 0.769 MeV. Finally the nucleon mass and its excited levels are fitted.
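    Plugging the quoted coefficients into the usual shape of the semi-empirical mass formula (with a Z² Coulomb term, as the abstract states; the paper's exact functional form may differ) gives a quick consistency check:

```python
import math

# Quick consistency check: the abstract's fitted coefficients inserted into
# the standard shape of the semi-empirical mass formula, with a Z^2 Coulomb
# term as stated. The paper's exact functional form may differ.
EV, ES, EC, EA, EP = 16.32, 19.37, 0.769, 23.86, 11.97   # MeV

def binding_energy(Z, A):
    """Binding energy in MeV; pairing term +/- EP/sqrt(A) for even/odd nuclei."""
    N = A - Z
    if Z % 2 == 0 and N % 2 == 0:
        pairing = EP / math.sqrt(A)
    elif Z % 2 == 1 and N % 2 == 1:
        pairing = -EP / math.sqrt(A)
    else:
        pairing = 0.0
    return (EV * A - ES * A ** (2 / 3) - EC * Z ** 2 / A ** (1 / 3)
            - EA * (A - 2 * Z) ** 2 / A + pairing)

b_fe56 = binding_energy(26, 56)   # iron-56: expect roughly 8-9 MeV per nucleon
```

    With these coefficients, iron-56 comes out near the familiar 8.8 MeV per nucleon, which is the kind of agreement the fitted constants are meant to reproduce.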

  5. Selection of the strategy of placing outdoor advertising of trading enterprises on the basis of the matrix model

    OpenAIRE

    Mel'nykovych Olena M.; Krepak Anna S.

    2013-01-01

    The article analyses the essence and significance of strategic planning of advertising activity and studies theoretical approaches to the definition of the "advertising strategy" notion. On the basis of a generalisation of theoretical material and empirical experience, the authors form their own view on the definition of the "strategy of placing outdoor advertising" term, which is interpreted as a subordinate part of the general advertising strategy of an enterprise and is a concept of placing outdoor ad...

  6. Reducing Production Basis Risk through Rainfall Intensity Frequency (RIF) Indexes: Global Sensitivity Analysis' Implication on Policy Design

    Science.gov (United States)

    Muneepeerakul, Chitsomanus; Huffaker, Ray; Munoz-Carpena, Rafael

    2016-04-01

    Weather index insurance promises financial resilience to farmers struck by harsh weather, with swift compensation at an affordable premium thanks to its minimal adverse selection and moral hazard. Despite these advantages, the very nature of indexing creates "production basis risk": the selected weather indexes and their thresholds may not correspond to actual damages. To reduce basis risk without additional data-collection cost, we propose the use of rain intensity and frequency as indexes, as they could offer better protection at a lower premium by avoiding the trade-off between basis risk and strike level inherent in the total rainfall index. We present empirical evidence and modeling results showing that, even under similar cumulative rainfall and temperature conditions, yields can differ significantly, especially for drought-sensitive crops. We further show that deriving the trigger level and payoff function from a regression between historical yield and total rainfall data may pose significant basis risk owing to their non-unique relationship in the insured range of rainfall. Lastly, we discuss the design of index insurance in terms of contract specifications based on the results of a global sensitivity analysis.
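
    For readers unfamiliar with how an index contract maps a weather index to compensation, a generic linear payout function looks like the following. This is an illustrative sketch, not the RIF design studied in the abstract; strike, tick, and max_payout are hypothetical contract parameters.

```python
def index_payout(index_value, strike, tick, max_payout):
    """Generic linear index-insurance payout: compensation grows with the
    shortfall of the weather index below the strike, capped at max_payout.
    Basis risk arises whenever this payout diverges from the actual loss."""
    shortfall = max(0.0, strike - index_value)
    return min(max_payout, shortfall * tick)

# e.g. a seasonal-rainfall index with a 300 mm strike paying $2 per missing mm
payouts = [index_payout(r, strike=300.0, tick=2.0, max_payout=400.0)
           for r in (350.0, 250.0, 50.0)]  # wet, moderately dry, severe drought
```

    A wet season pays nothing, a moderate shortfall pays proportionally, and a severe drought hits the cap; the farmer whose field-level loss differs from what the index implies bears the basis risk.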

  7. The Intergenerational Transmission of Automobile Brand Preferences: Empirical Evidence and Implications for Firm Strategy

    OpenAIRE

    Soren T. Anderson; Ryan Kellogg; Ashley Langer; James M. Sallee

    2013-01-01

    We document a strong correlation in the brand of automobile chosen by parents and their adult children, using data from the Panel Study of Income Dynamics. This correlation could represent transmission of brand preferences across generations, or it could result from correlation in family characteristics that determine brand choice. We present a variety of empirical specifications that lend support to the former interpretation and to a mechanism that relies at least in part on state dependence...

  8. The Role of Empirical Research in Bioethics

    Science.gov (United States)

    Kon, Alexander A.

    2010-01-01

    There has long been tension between bioethicists whose work focuses on classical philosophical inquiry and those who perform empirical studies on bioethical issues. While many have argued that empirical research merely illuminates current practices and cannot inform normative ethics, others assert that research-based work has significant implications for refining our ethical norms. In this essay, I present a novel construct for classifying empirical research in bioethics into four hierarchical categories: Lay of the Land, Ideal Versus Reality, Improving Care, and Changing Ethical Norms. Through explaining these four categories and providing examples of publications in each stratum, I define how empirical research informs normative ethics. I conclude by demonstrating how philosophical inquiry and empirical research can work cooperatively to further normative ethics. PMID:19998120

  9. Communication: A Jastrow factor coupled cluster theory for weak and strong electron correlation

    International Nuclear Information System (INIS)

    Neuscamman, Eric

    2013-01-01

    We present a Jastrow-factor-inspired variant of coupled cluster theory that accurately describes both weak and strong electron correlation. Compatibility with quantum Monte Carlo allows for variational energy evaluations and an antisymmetric geminal power reference, two features not present in traditional coupled cluster that facilitate a nearly exact description of the strong electron correlations in minimal-basis N2 bond breaking. In double-ζ treatments of the HF and H2O bond dissociations, where both weak and strong correlations are important, this polynomial-cost method proves more accurate than either traditional coupled cluster or complete active space perturbation theory. These preliminary successes suggest a deep connection between the ways in which cluster operators and Jastrow factors encode correlation.

  10. Diffraction scattering of strongly bound system

    International Nuclear Information System (INIS)

    Kuzmichev, V.E.

    1982-04-01

    The scattering of a hadron on a strongly bound system of two hadrons (dihadron) is considered in the high-energy limit for the relative hadron-dihadron motion. The dihadron scatterer motion and the internal interaction are included in our consideration. It is shown that only small values of the internal transfer momentum of dihadron particles bring the principal contribution to the three-particle propagator in eikonal approximation. On the basis of the exact analytical solution of the integral equation for the total Green function the scattering amplitude is derived. It is shown that the scattering amplitude contains only single, double, and triple scattering terms. The three new terms to the Glauber formula for the total cross section are obtained. These terms decrease both the true total hadron-hadron cross section and the screening correction. (orig.)

  11. Birds of the Mongol Empire

    Directory of Open Access Journals (Sweden)

    Eugene N. Anderson

    2016-09-01

    Full Text Available The Mongol Empire, the largest contiguous empire the world has ever known, had, among other things, a goodly number of falconers, poultry raisers, birdcatchers, cooks, and other experts on various aspects of birding. We have records of this, largely in the Yinshan Zhengyao, the court nutrition manual of the Mongol empire in China (the Yuan Dynasty). It discusses in some detail 22 bird taxa, from swans to chickens. The Huihui Yaofang, a medical encyclopedia, lists ten taxa used medicinally. Marco Polo also made notes on Mongol bird use. There are a few other records. This allows us to draw conclusions about Mongol ornithology, which apparently was sophisticated and detailed.

  12. System requirements and design description for the document basis database interface (DocBasis)

    International Nuclear Information System (INIS)

    Lehman, W.J.

    1997-01-01

    This document describes system requirements and the design description for the Document Basis Database Interface (DocBasis). The DocBasis application is used to manage procedures used within the tank farms. The application maintains information in a small database to track the document basis for a procedure, as well as the current version/modification level and the basis for the procedure. The basis for each procedure is substantiated by Administrative, Technical, Procedural, and Regulatory requirements. The DocBasis user interface was developed by Science Applications International Corporation (SAIC)

  13. Rejection of unfair offers in the ultimatum game is no evidence of strong reciprocity

    Science.gov (United States)

    Yamagishi, Toshio; Horita, Yutaka; Mifune, Nobuhiro; Hashimoto, Hirofumi; Li, Yang; Shinada, Mizuho; Miura, Arisa; Inukai, Keigo; Takagishi, Haruto; Simunovic, Dora

    2012-01-01

    The strong reciprocity model of the evolution of human cooperation has gained some acceptance, partly on the basis of support from experimental findings. The observation that unfair offers in the ultimatum game are frequently rejected constitutes an important piece of the experimental evidence for strong reciprocity. In the present study, we have challenged the idea that the rejection response in the ultimatum game provides evidence of the assumption held by strong reciprocity theorists that negative reciprocity observed in the ultimatum game is inseparably related to positive reciprocity as the two sides of a preference for fairness. The prediction of an inseparable relationship between positive and negative reciprocity was rejected on the basis of the results of a series of experiments that we conducted using the ultimatum game, the dictator game, the trust game, and the prisoner’s dilemma game. We did not find any correlation between the participants’ tendencies to reject unfair offers in the ultimatum game and their tendencies to exhibit various prosocial behaviors in the other games, including their inclinations to positively reciprocate in the trust game. The participants’ responses to postexperimental questions add support to the view that the rejection of unfair offers in the ultimatum game is a tacit strategy for avoiding the imposition of an inferior status. PMID:23188801

  14. Rejection of unfair offers in the ultimatum game is no evidence of strong reciprocity.

    Science.gov (United States)

    Yamagishi, Toshio; Horita, Yutaka; Mifune, Nobuhiro; Hashimoto, Hirofumi; Li, Yang; Shinada, Mizuho; Miura, Arisa; Inukai, Keigo; Takagishi, Haruto; Simunovic, Dora

    2012-12-11

    The strong reciprocity model of the evolution of human cooperation has gained some acceptance, partly on the basis of support from experimental findings. The observation that unfair offers in the ultimatum game are frequently rejected constitutes an important piece of the experimental evidence for strong reciprocity. In the present study, we have challenged the idea that the rejection response in the ultimatum game provides evidence of the assumption held by strong reciprocity theorists that negative reciprocity observed in the ultimatum game is inseparably related to positive reciprocity as the two sides of a preference for fairness. The prediction of an inseparable relationship between positive and negative reciprocity was rejected on the basis of the results of a series of experiments that we conducted using the ultimatum game, the dictator game, the trust game, and the prisoner's dilemma game. We did not find any correlation between the participants' tendencies to reject unfair offers in the ultimatum game and their tendencies to exhibit various prosocial behaviors in the other games, including their inclinations to positively reciprocate in the trust game. The participants' responses to postexperimental questions add support to the view that the rejection of unfair offers in the ultimatum game is a tacit strategy for avoiding the imposition of an inferior status.

  15. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions

    International Nuclear Information System (INIS)

    1993-01-01

    This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much-needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. Therefore, empirically based approaches that are used for other regions, such as Western North America, are not appropriate for Eastern North America. Moreover, recent advances in science and technology have now made it possible to combine theoretical and empirical methods to develop new procedures and models for estimating ground motion. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. Specifically considered are magnitudes M from 5 to 8, distances from 0 to 500 km, and frequencies from 1 to 35 Hz.

  16. Mindfulness Meditation Training for Attention-Deficit/Hyperactivity Disorder in Adulthood: Current Empirical Support, Treatment Overview, and Future Directions

    Science.gov (United States)

    Mitchell, John T.; Zylowska, Lidia; Kollins, Scott H.

    2015-01-01

    Research examining nonpharmacological interventions for adults diagnosed with attention-deficit/hyperactivity disorder (ADHD) has expanded in recent years and provides patients with more treatment options. Mindfulness-based training is an example of an intervention that is gaining promising preliminary empirical support and is increasingly administered in clinical settings. The aim of this review is to provide a rationale for the application of mindfulness to individuals diagnosed with ADHD, describe the current state of the empirical basis for mindfulness training in ADHD, and summarize a treatment approach specific to adults diagnosed with ADHD: the Mindful Awareness Practices (MAPs) for ADHD Program. Two case study examples are provided to demonstrate relevant clinical issues for practitioners interested in this approach. Directions for future research, including mindfulness meditation as a standalone treatment and as a complementary approach to cognitive-behavioral therapy, are provided. PMID:25908900

  17. The Belgian Investments in Mass Transit of the Cities in Russian Empire at the End of the XIXth and at the Beginning of the XXth Centuries

    Directory of Open Access Journals (Sweden)

    Ilya V. Shpakov

    2015-12-01

    Full Text Available The article deals with the process of realization of long-term investment in the city transport projects of the Russian Empire at the end of the XIX and the beginning of the XX centuries. The authors analyze the reasons for the attractiveness of the Russian transport sector for Belgian companies and the main stages of their integration into the economy of the Russian Empire, and examine the legal and operational features of the transport activities of joint stock companies with foreign borrowed capital. The article also analyzes the volumes of the invested funds and describes the size of the authorized capital and the transport companies' securities. On the basis of a comparison of annual profit rates, the article explains the relatively small presence of companies with Russian capital in the empire's urban transport market. Analyzing the annual financial results of the joint stock Belgian transport companies, the authors identify a number of loss-making companies in the provincial towns and attempt to explain this phenomenon through the problems of urban transportation in several cities. Examining the peculiarities of doing transport business in the cities, the article shows the aspiration of the owners of the corporations to maximize their profits at the expense of the quality of transport service. By the beginning of the XX century, such corporate policy had led the Russian Empire's state authorities to a negative assessment of solving the transport problems of cities with foreign capital. The authors come to the conclusion that, in searching for the most appropriate solution, the empire's authorities proposed that city transport enterprises build up their own resources by attracting loans from the state banks and by using the funds from bond issues.

  18. 26 CFR 1.1014-4 - Uniformity of basis; adjustment to basis.

    Science.gov (United States)

    2010-04-01

    ...) INCOME TAX (CONTINUED) INCOME TAXES Basis Rules of General Application § 1.1014-4 Uniformity of basis... to property acquired by bequest, devise, or inheritance relate back to the death of the decedent... prescribing a general uniform basis rule for property acquired from a decedent is, on the one hand, to tax the...

  19. Essays on empirical likelihood in economics

    NARCIS (Netherlands)

    Gao, Z.

    2012-01-01

    This thesis intends to exploit the roots of empirical likelihood and its related methods in mathematical programming and computation. The roots will be connected and the connections will induce new solutions for the problems of estimation, computation, and generalization of empirical likelihood.

  20. Many-body calculations of molecular electric polarizabilities in asymptotically complete basis sets

    Science.gov (United States)

    Monten, Ruben; Hajgató, Balázs; Deleuze, Michael S.

    2011-10-01

    The static dipole polarizabilities of Ne, CO, N2, F2, HF, H2O, HCN, and C2H2 (acetylene) have been determined close to the Full-CI limit along with an asymptotically complete basis set (CBS), according to the principles of a Focal Point Analysis. For this purpose the results of Finite Field calculations up to the level of Coupled Cluster theory including Single, Double, Triple, Quadruple and perturbative Pentuple excitations [CCSDTQ(P)] were used, in conjunction with suited extrapolations of energies obtained using augmented and doubly-augmented Dunning correlation-consistent polarized valence basis sets of improving quality. The polarizability characteristics of C2H4 (ethylene) and C2H6 (ethane) have been determined on the same grounds at the CCSDTQ level in the CBS limit. Comparison is made with results obtained using lower levels of electronic correlation, or taking into account the relaxation of the molecular structure due to an adiabatic polarization process. Vibrational corrections to electronic polarizabilities have been empirically estimated according to Born-Oppenheimer Molecular Dynamical simulations employing Density Functional Theory. Confrontation with experiment ultimately indicates relative accuracies of the order of 1 to 2%.
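
    The "suited extrapolations" of correlation energies toward the CBS limit are commonly performed with a two-point inverse-cube formula in the basis-set cardinal number X. The scheme below is the standard one for correlation-consistent basis sets and is shown here as an assumption; the paper may use a different extrapolation form.

```python
def cbs_two_point(e_small, e_large, x_small, x_large):
    """Two-point CBS extrapolation assuming E(X) = E_CBS + A / X**3, where X
    is the cardinal number of the correlation-consistent basis
    (e.g. X = 3 for aug-cc-pVTZ, X = 4 for aug-cc-pVQZ)."""
    x3_large, x3_small = x_large**3, x_small**3
    return (x3_large * e_large - x3_small * e_small) / (x3_large - x3_small)

# sanity check with synthetic correlation energies obeying the model exactly
e_cbs_true, a = -1.00, 0.50
e_tz = e_cbs_true + a / 3**3   # hypothetical triple-zeta energy
e_qz = e_cbs_true + a / 4**3   # hypothetical quadruple-zeta energy
e_cbs = cbs_two_point(e_tz, e_qz, 3, 4)
```

    By construction the extrapolation recovers the exact limit here; with real energies the quality of the estimate depends on how well the inverse-cube model holds at the chosen cardinal numbers.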

  1. Traditional Arabic & Islamic medicine: validation and empirical assessment of a conceptual model in Qatar.

    Science.gov (United States)

    AlRawi, Sara N; Khidir, Amal; Elnashar, Maha S; Abdelrahim, Huda A; Killawi, Amal K; Hammoud, Maya M; Fetters, Michael D

    2017-03-14

    Evidence indicates traditional medicine is no longer only used for the healthcare of the poor; its prevalence is also increasing in countries where allopathic medicine is predominant in the healthcare system. While these healing practices have been utilized for thousands of years in the Arabian Gulf, only recently has a theoretical model been developed illustrating the linkages and components of such practices articulated as Traditional Arabic & Islamic Medicine (TAIM). Despite previous theoretical work presenting development of the TAIM model, empirical support has been lacking. The objective of this research is to provide empirical support for the TAIM model and illustrate real world applicability. Using an ethnographic approach, we recruited 84 individuals (43 women and 41 men) who were speakers of one of four common languages in Qatar: Arabic, English, Hindi, and Urdu. Through in-depth interviews, we sought confirming and disconfirming evidence of the model components, namely, health practices, beliefs and philosophy to treat, diagnose, and prevent illnesses and/or maintain well-being, as well as patterns of communication about their TAIM practices with their allopathic providers. Based on our analysis, we find empirical support for all elements of the TAIM model. Participants in this research, visitors to major healthcare centers, mentioned using all elements of the TAIM model: herbal medicines, spiritual therapies, dietary practices, mind-body methods, and manual techniques, applied singularly or in combination. Participants had varying levels of comfort sharing information about TAIM practices with allopathic practitioners. These findings confirm an empirical basis for the elements of the TAIM model. Three elements, namely, spiritual healing, herbal medicine, and dietary practices, were most commonly found. Future research should examine the prevalence of TAIM element use, how it differs among various populations, and its impact on health.

  2. Pluvials, Droughts, Energetics, and the Mongol Empire

    Science.gov (United States)

    Hessl, A. E.; Pederson, N.; Baatarbileg, N.

    2012-12-01

    The success of the Mongol Empire, the largest contiguous land empire the world has ever known, is a historical enigma. At its peak in the late 13th century, the empire influenced areas from Hungary to southern Asia and Persia. Powered by domesticated herbivores, the Mongol Empire grew at the expense of agriculturalists in Eastern Europe, Persia, and China. What environmental factors contributed to the rise of the Mongols? What factors influenced the disintegration of the empire by 1300 CE? Until now, little high resolution environmental data have been available to address these questions. We use tree-ring records of past temperature and water to illuminate the role of energy and water in the evolution of the Mongol Empire. The study of energetics has long been applied to biological and ecological systems but has only recently become a theme in understanding modern coupled natural and human systems (CNH). Because water and energy are tightly linked in human and natural systems, studying their synergies and interactions makes it possible to integrate knowledge across disciplines and human history, yielding important lessons for modern societies. We focus on the role of energy and water in the trajectory of an empire, including its rise, development, and demise. Our research is focused on the Orkhon Valley, seat of the Mongol Empire, where recent paleoenvironmental and archeological discoveries allow high resolution reconstructions of past human and environmental conditions for the first time. Our preliminary records indicate that the period 1210-1230 CE, the height of Chinggis Khan's reign, is one of the longest and most consistent pluvials in our tree ring reconstruction of interannual drought. Reconstructed temperature derived from five millennium-long records from subalpine forests in Mongolia document warm temperatures beginning in the early 1200's and ending with a plunge into cold temperatures in 1260. Abrupt cooling in central Mongolia at this time is

  3. Two- and four-quasiparticle states in the interacting boson model: Strong-coupling and decoupled band patterns in the SU(3) limit

    International Nuclear Information System (INIS)

    Vretenar, D.; Paar, V.; Bonsignori, G.; Savoia, M.

    1990-01-01

    An extension of the interacting boson approximation model is proposed by allowing for two- and four-quasiparticle excitations out of the boson space. The formation of band patterns based on two- and four-quasiparticle states is investigated in the SU(3) limit of the model. For hole-type (particle-type) fermions coupled to the SU(3) prolate (oblate) core, it is shown that the algebraic K-representation basis, which is the analog of the strong-coupling basis of the geometrical model, provides an appropriate description of the low-lying two-quasiparticle bands. In the case of particle-type (hole-type) fermions coupled to the SU(3) prolate (oblate) core, a new algebraic decoupling basis is derived that is equivalent in the geometrical limit to Stephens' rotation-aligned basis. Comparing the wave functions that are obtained by diagonalization of the model Hamiltonian to the decoupling basis, several low-lying two-quasiparticle bands are identified. The effects of an interaction that conserves only the total nucleon number, mixing states with different number of fermions, are investigated in both the strong-coupling and decoupling limits. All calculations are performed for an SU(3) boson core and the h11/2 fermion orbital

  4. Matrix-product-state method with local basis optimization for nonequilibrium electron-phonon systems

    Science.gov (United States)

    Heidrich-Meisner, Fabian; Brockt, Christoph; Dorfner, Florian; Vidmar, Lev; Jeckelmann, Eric

    We present a method for simulating the time evolution of quasi-one-dimensional correlated systems with strongly fluctuating bosonic degrees of freedom (e.g., phonons) using matrix product states. For this purpose we combine the time-evolving block decimation (TEBD) algorithm with a local basis optimization (LBO) approach. We discuss the performance of our approach in comparison to TEBD with a bare boson basis, exact diagonalization, and diagonalization in a limited functional space. TEBD with LBO can reduce the computational cost by orders of magnitude when boson fluctuations are large and thus it allows one to investigate problems that are out of reach of other approaches. First, we test our method on the non-equilibrium dynamics of a Holstein polaron and show that it allows us to study the regime of strong electron-phonon coupling. Second, the method is applied to the scattering of an electronic wave packet off a region with electron-phonon coupling. Our study reveals a rich physics including transient self-trapping and dissipation. Supported by Deutsche Forschungsgemeinschaft (DFG) via FOR 1807.
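
    The core of the local basis optimization step can be sketched in a few lines of numpy: the optimal local basis for a site consists of the dominant eigenvectors of that site's reduced density matrix, and the discarded eigenvalue weight measures the truncation error. This toy version operates on a random bipartite state rather than inside an actual TEBD sweep; the dimensions and the cut k are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d_boson, d_rest = 16, 8
# random normalized state |psi> on (boson site) x (rest of chain)
psi = rng.normal(size=(d_boson, d_rest)) + 1j * rng.normal(size=(d_boson, d_rest))
psi /= np.linalg.norm(psi)

# reduced density matrix of the boson site in the bare occupation basis
rho = psi @ psi.conj().T
w, v = np.linalg.eigh(rho)          # eigenpairs in ascending order
order = np.argsort(w)[::-1]         # reorder by occupation, largest first
w, v = w[order], v[:, order]

k = 8                               # keep the k most occupied optimal modes
basis = v[:, :k]                    # columns span the optimized local basis
truncation_error = w[k:].sum()      # discarded density-matrix weight
```

    Because the rest of the chain here has dimension 8, the reduced density matrix has rank at most 8, so keeping k = 8 optimal modes out of 16 bare boson states loses essentially nothing; in a real simulation k is chosen to keep this discarded weight below a tolerance at every time step.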

  5. Safety Basis Report

    International Nuclear Information System (INIS)

    R.J. Garrett

    2002-01-01

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities

  6. Safety Basis Report

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  7. Empirical knowledge engine of local governance Senegalese artisanal fisheries Empirical knowledge engine of local governance Senegalese artisanal fisheries

    Science.gov (United States)

    Mbaye, A.

    2016-02-01

    Fishery resources have long been under an administrative management confronted with the supposed irrationality of artisanal fishermen, and the state has always held a monopoly over such management. Well-established state rules, synonymous with a denial of local populations' knowledge of management and with the expropriation of their fishing territories, came into conflict with the existing rules, thus weakening the traditional management system. However, aware of the threats to their survival posed by the limitations of state rules and by a technicist perception of management, some populations of fishermen tried to organize and implement management measures. These measures are implemented on the basis of their own knowledge of the environments. This is the case in Kayar, Nianing, and Bétenty, where local management initiatives began to bear fruit despite some difficulties. These examples of successful local management have prompted the Senegalese administration to give more consideration to the knowledge and know-how of fishermen and to be open to co-management of the fisheries resource. This communication shows how this new co-management approach is implemented in the governance of the Senegalese artisanal fisheries through the consideration of the empirical knowledge of fishermen.

  8. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...

  9. Empirical Legality and Effective Reality

    Directory of Open Access Journals (Sweden)

    Hernán Pringe

    2015-08-01

    Full Text Available The conditions that Kant's doctrine establishes for the predication of the effective reality of certain empirical objects are examined. It is maintained that (a) for such a predication, it is necessary to have not only perception but also a certain homogeneity of sensible data, and (b) the knowledge of the existence of certain empirical objects depends on the application of regulative principles of experience.

  10. Empirical comparison of theories

    International Nuclear Information System (INIS)

    Opp, K.D.; Wippler, R.

    1990-01-01

    The book represents the first, comprehensive attempt to take an empirical approach for comparative assessment of theories in sociology. The aims, problems, and advantages of the empirical approach are discussed in detail, and the three theories selected for the purpose of this work are explained. Their comparative assessment is performed within the framework of several research projects, which among other subjects also investigate the social aspects of the protest against nuclear power plants. The theories analysed in this context are the theory of mental incongruities and that of the benefit, and their efficiency in explaining protest behaviour is compared. (orig./HSCH) [de

  11. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2017-01-01

    Empirical software engineering aims at making software engineering claims measurable, i.e., to analyze and understand phenomena in software engineering and to evaluate software engineering approaches and solutions. Due to the involvement of humans and the multitude of fields for which software...... is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...... an extra specific expertise that they offer as service to other teams, thus, fostering cross-team collaboration. The paper outlines the general course setup, topics addressed, and it provides initial lessons learned....

  12. Source analysis using regional empirical Green's functions: The 2008 Wells, Nevada, earthquake

    Science.gov (United States)

    Mendoza, C.; Hartzell, S.

    2009-01-01

    We invert three-component, regional broadband waveforms recorded for the 21 February 2008 Wells, Nevada, earthquake using a finite-fault methodology that prescribes subfault responses using eight MW∼4 aftershocks as empirical Green's functions (EGFs) distributed within a 20-km by 21.6-km fault area. The inversion identifies a seismic moment of 6.2 x 10^24 dyne-cm (MW 5.8) with slip concentrated in a compact 6.5-km by 4-km region updip from the hypocenter. The peak slip within this localized area is 88 cm and the stress drop is 72 bars, which is higher than expected for Basin and Range normal faults in the western United States. The EGF approach yields excellent fits to the complex regional waveforms, accounting for strong variations in wave propagation and site effects. This suggests that the procedure is useful for studying moderate-size earthquakes with limited teleseismic or strong-motion data and for examining uncertainties in slip models obtained using theoretical Green's functions.
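The superposition at the heart of the EGF approach, scaling each small-event record by a subfault slip weight and lagging it by its rupture delay, can be sketched as below. This is a schematic illustration with synthetic wavelets, not the authors' inversion code; the function name and all parameters are invented for the example.

```python
import numpy as np

def synthesize_egf(egfs, slips, delays, dt, npts):
    """Superpose empirical Green's functions (EGFs): each subfault's
    record is scaled by its slip weight and shifted by its rupture delay."""
    out = np.zeros(npts)
    for egf, slip, delay in zip(egfs, slips, delays):
        k = int(round(delay / dt))          # sample offset for the rupture-time delay
        n = min(len(egf), npts - k)
        if n > 0:
            out[k:k + n] += slip * egf[:n]  # slip-weighted superposition
    return out

# toy example: two "aftershock" records represented by a Gaussian wavelet
dt = 0.1
t = np.arange(0, 10, dt)
wavelet = np.exp(-((t - 1.0) / 0.3) ** 2)
rec = synthesize_egf([wavelet, wavelet], slips=[0.5, 1.0],
                     delays=[0.0, 2.0], dt=dt, npts=len(t))
```

In a real inversion the slips and delays are the unknowns, solved for so that the superposed EGFs match the observed waveforms; here they are fixed to show the forward synthesis only.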

  13. Deposit Insurance and Risk Shifting in a Strong Regulatory Environment

    DEFF Research Database (Denmark)

    Bartholdy, Jan; Justesen, Lene Gilje

    This study provides empirical evidence on the moral hazard implications of introducing deposit insurance into a strong regulatory environment. Denmark offers a unique setting because commercial banks and savings banks have different ownership structures, but are subject to the same set...... of regulations. The ownership structure in savings banks implies that they have no incentive to increase risk after the implementation of a deposit insurance scheme whereas commercial banks have. Also, at the time of introduction, Denmark had high capital requirements and a strict closure policy. Using...... a difference-in-difference framework we show that commercial banks did not increase their risk compared to savings banks when deposit insurance was introduced. The results also hold for large commercial banks, indicating that the systemic risk did not increase either. Thus for a system with high capital...

  14. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions

    International Nuclear Information System (INIS)

    1993-01-01

    This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude, and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. This document, Volume II, contains Appendices 2, 3, 5, 6, and 7 covering the following topics: Eastern North American Empirical Ground Motion Data; Examination of Variance of Seismographic Network Data; Soil Amplification and Vertical-to-Horizontal Ratios from Analysis of Strong Motion Data From Active Tectonic Regions; Revision and Calibration of Ou and Herrmann Method; Generalized Ray Procedure for Modeling Ground Motion Attenuation; Crustal Models for Velocity Regionalization; Depth Distribution Models; Development of Generic Site Effects Model; Validation and Comparison of One-Dimensional Site Response Methodologies; Plots of Amplification Factors; Assessment of Coupling Between Vertical ampersand Horizontal Motions in Nonlinear Site Response Analysis; and Modeling of Dynamic Soil Properties

  15. Umayyad Relations with Byzantium Empire

    Directory of Open Access Journals (Sweden)

    Mansoor Haidari

    2017-06-01

    Full Text Available This research investigates the political and military relations between the Umayyad caliphate and the Byzantine Empire. The aim of this research is to clarify the Umayyad caliphate’s relations with the Byzantine Empire. We know that these relations mostly involved war and fighting. Because there were always intense conflicts between Muslims and the Byzantine Empire, the two powers had to maintain an active, continuous diplomacy to call truces and settle disputes. Thus, based on the general policy of the Umayyad caliphs, Christians were severely ignored and segregated within Islamic territories. This segregation of the Christians was highly affected by political relationships. It is worth mentioning that the Umayyad caliphs brought the governing style of the Sassanid kings and the Roman Caesars into the Islamic caliphate system, but they did not establish civil institutions and administrative organizations.

  16. Empirical training for conditional random fields

    NARCIS (Netherlands)

    Zhu, Zhemin; Hiemstra, Djoerd; Apers, Peter M.G.; Wombacher, Andreas

    2013-01-01

    In this paper (Zhu et al., 2013), we present a practically scalable training method for CRFs called Empirical Training (EP). We show that the standard training with unregularized log likelihood can have many maximum likelihood estimations (MLEs). Empirical training has a unique closed-form MLE.

  17. Empirical Phenomenology: A Qualitative Research Approach (The ...

    African Journals Online (AJOL)

    Empirical Phenomenology: A Qualitative Research Approach (The Cologne Seminars) ... and practical application of empirical phenomenology in social research. ... and considers its implications for qualitative methods such as interviewing ...

  18. Empirical logic and quantum mechanics

    International Nuclear Information System (INIS)

    Foulis, D.J.; Randall, C.H.

    1976-01-01

    This article discusses some of the basic notions of quantum physics within the more general framework of operational statistics and empirical logic (as developed in Foulis and Randall, 1972, and Randall and Foulis, 1973). Empirical logic is a formal mathematical system in which the notion of an operation is primitive and undefined; all other concepts are rigorously defined in terms of such operations (which are presumed to correspond to actual physical procedures). (Auth.)

  19. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Sissay, Adonay [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J. [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Lopata, Kenneth, E-mail: klopata@lsu.edu [Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)

    2016-09-07

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first-principles can be challenging due to the large spatial extent of the wavefunction which stresses the accuracy of basis sets, and the intense fields which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.
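The key numerical device in this abstract, a complex absorbing potential (CAP) that removes ionized charge so the lost norm measures ionization, can be illustrated on a one-electron, one-dimensional grid. The sketch below is not the paper's TDDFT/GTO machinery: it propagates a momentum-kicked wavepacket with a standard split-operator method, and the grid, soft-core potential, CAP shape, and kick strength are all illustrative assumptions.

```python
import numpy as np

# grid and time step (atomic units, illustrative values)
N, L = 512, 100.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dt = 0.05

# soft-core binding potential plus a quartic complex absorbing
# potential (CAP) switched on over the outer 20% of the grid
V = -1.0 / np.sqrt(x**2 + 2.0)
cap = np.zeros(N)
edge = np.abs(x) > 0.8 * (L / 2)
cap[edge] = 0.5 * ((np.abs(x[edge]) - 0.8 * (L / 2)) / (0.2 * (L / 2)))**4

# initial state: Gaussian kicked toward the boundary, standing in for
# an ionizing wavepacket launched by a strong field
psi = np.exp(-x**2) * np.exp(1j * 1.5 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

expV = np.exp(-1j * (V - 1j * cap) * dt / 2)  # half-step in x; CAP makes it non-unitary
expT = np.exp(-1j * (k**2 / 2) * dt)          # full kinetic step in k-space

norms = []
for _ in range(2000):
    psi = expV * psi
    psi = np.fft.ifft(expT * np.fft.fft(psi))
    psi = expV * psi
    norms.append(np.sum(np.abs(psi)**2) * dx)

ionized = 1.0 - norms[-1]  # norm removed by the CAP ~ ionized fraction
```

The norm decays monotonically as the outgoing part of the wavepacket reaches the CAP region, and the accumulated norm loss plays the role of the ionization yield; in the paper the same idea is applied with the CAP projected onto an atom-centered Gaussian basis.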

  1. Empire and the Ambiguities of Love

    Directory of Open Access Journals (Sweden)

    Linnell Secomb

    2013-09-01

    Full Text Available Colonialism is not only enforced through violence but facilitated also by economic, religious and social strategies and inducements. Amongst these, love has been exploited as a tool of empire to construct alliances, procure compliance and disguise the conquest of peoples and territories. Love has also, however, been the basis for an ethics and politics that contests imperialism. Friendship, affinity and amorous relations between coloniser and colonised enables a resistance to the ambitions of colonial occupation and rule. In this article, the work of the London-based artist, Yinka Shonibare, is used to examine the operation of love in the colonial context. Focusing especially on his 2007 installation Jardin d’Amour the bifurcation of love into an imperialist strategy on the one hand and an anti-colonial ethics on the other is challenged. Instead, love is conceived as a process of exposure and hybridisation that transforms lover and beloved by introducing otherness into the heart of the subject. Drawing on French philosopher Jean-Luc Nancy’s analysis of shattered love, it is suggested that each instance of love involves both violence and caress: each performance of love is an intrusion of otherness that inaugurates the subject as an always multiple and heterogeneous being.

  2. Empirical Moral Philosophy and Teacher Education

    Science.gov (United States)

    Schjetne, Espen; Afdal, Hilde Wågsås; Anker, Trine; Johannesen, Nina; Afdal, Geir

    2016-01-01

    In this paper, we explore the possible contributions of empirical moral philosophy to professional ethics in teacher education. We argue that it is both possible and desirable to connect knowledge of how teachers empirically do and understand professional ethics with normative theories of teachers' professional ethics. Our argument is made in…

  3. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions

    International Nuclear Information System (INIS)

    1993-01-01

    This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude, and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. Therefore, empirically based approaches that are used for other regions, such as Western North America, are not appropriate for Eastern North America. Moreover, recent advances in science and technology have now made it possible to combine theoretical and empirical methods to develop new procedures and models for estimating ground motion. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. Specifically considered are magnitudes M from 5 to 8, distances from 0 to 500 km, and frequencies from 1 to 35 Hz. This document, Volume IV, provides Appendix 8.B, Laboratory Investigations of Dynamic Properties of Reference Sites

  4. Autobiography After Empire

    DEFF Research Database (Denmark)

    Rasch, Astrid

    Decolonisation was a major event of the twentieth century, redrawing maps and impacting on identity narratives around the globe. As new nations defined their place in the world, the national and imperial past was retold in new cultural memories. These developments have been studied at the level of the collective, but insufficient attention has been paid to how individuals respond to such narrative changes. This dissertation examines the relationship between individual and collective memory at the end of empire through analysis of 13 end of empire autobiographies by public intellectuals from Australia, the Anglophone Caribbean and Zimbabwe. I conceive of memory as reconstructive and social, with individual memory striving to make sense of the past in the present in dialogue with surrounding narratives. By examining recurring tropes in the autobiographies, like colonial education, journeys to the imperial...

  5. Multiscale finite element methods for high-contrast problems using local spectral basis functions

    KAUST Repository

    Efendiev, Yalchin

    2011-02-01

    In this paper we study multiscale finite element methods (MsFEMs) using spectral multiscale basis functions that are designed for high-contrast problems. Multiscale basis functions are constructed using eigenvectors of a carefully selected local spectral problem. This local spectral problem strongly depends on the choice of initial partition of unity functions. The resulting space enriches the initial multiscale space using eigenvectors of the local spectral problem. The eigenvectors corresponding to small, asymptotically vanishing, eigenvalues detect important features of the solutions that are not captured by initial multiscale basis functions. Multiscale basis functions are constructed such that they span these eigenfunctions that correspond to small, asymptotically vanishing, eigenvalues. We present a convergence study that shows that the convergence rate (in energy norm) is proportional to (H/Λ*)^(1/2), where Λ* is proportional to the minimum of the eigenvalues whose corresponding eigenvectors are not included in the coarse space. Thus, we would like to reach a larger eigenvalue with a smaller coarse space. This is accomplished with a careful choice of initial multiscale basis functions and the setup of the eigenvalue problems. Numerical results are presented to back-up our theoretical results and to show higher accuracy of MsFEMs with spectral multiscale basis functions. We also present a hierarchical construction of the eigenvectors that provides CPU savings. © 2010.
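The construction described above, solving a local generalized eigenvalue problem and keeping the eigenvectors with small eigenvalues to enrich the coarse space, can be sketched in one dimension. This is an illustrative stand-in, not the authors' formulation: the paper's spectral problem involves a carefully weighted mass matrix tied to the partition of unity, whereas here a simple coefficient-weighted P1 mass matrix and a crude mean-eigenvalue threshold are assumed.

```python
import numpy as np

# 1-D high-contrast coefficient on a local patch (illustrative values)
n = 50
h = 1.0 / n
kappa = np.ones(n)      # coefficient per cell
kappa[20:25] = 1e4      # a high-contrast inclusion

# stiffness matrix A and kappa-weighted mass matrix M (P1 elements,
# homogeneous Dirichlet boundary, so only interior nodes appear)
A = np.zeros((n - 1, n - 1))
M = np.zeros((n - 1, n - 1))
for e in range(n):      # assemble element by element
    ke = kappa[e] / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
    me = kappa[e] * h / 6 * np.array([[2.0, 1.0], [1.0, 2.0]])
    for a_ in range(2):
        for b_ in range(2):
            i, j = e - 1 + a_, e - 1 + b_
            if 0 <= i < n - 1 and 0 <= j < n - 1:
                A[i, j] += ke[a_, b_]
                M[i, j] += me[a_, b_]

# local spectral problem A v = lam M v, solved via a Cholesky
# transform of M so that plain symmetric eigensolvers apply
Lc = np.linalg.cholesky(M)
Linv = np.linalg.inv(Lc)
lam, w = np.linalg.eigh(Linv @ A @ Linv.T)   # lam sorted ascending
vecs = Linv.T @ w

# keep eigenvectors with small eigenvalues to enrich the coarse space
n_keep = int(np.sum(lam < lam.mean()))       # crude threshold, illustration only
basis = vecs[:, :n_keep]
```

In the method proper, these retained eigenvectors are multiplied by the partition of unity functions on each coarse neighborhood to form the spectral multiscale basis; the threshold would be chosen from the eigenvalue gap rather than the mean.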

  6. YOUTH STUDIES – A SPECIFIC GENRE OF THE EMPIRICAL PARADIGM IN SOCIAL SCIENCES

    Directory of Open Access Journals (Sweden)

    Agnė Dorelaitienė

    2017-09-01

    Full Text Available The article presents the situation of youth in contemporary society. The neoliberal economy, an ageing society, rapid globalisation, technological change and increasing social risk have prompted specific, historically unfamiliar, and fairly difficult to forecast social changes. Social adaptation and the construction of one's own identity are becoming challenging for youth as a specific social group in this period of great uncertainty, risk and opportunity. Youth studies are referred to as one of the means to help understand the youth phenomenon and form the respective policy. The aim of the article is to reveal the role of youth studies as a specific interdisciplinary genre of the empirical-analytic paradigm in the social sciences. Research objectives: (1) to identify the traditions of youth studies and the differences between them; (2) to reveal the specific character of youth studies as an empirical paradigm in the contemporary context. Analysis of scientific sources and document analysis are used to achieve the goal and objectives. Since the 20th century, youth studies have been developing as an independent research discipline and tradition. Perception of the notion of a young person has changed along with the development of paradigmatic and methodological research traditions. Modernity has doubtless contributed to placing the young person among other age groups and to the emphasis on the importance of youth as a specific social group. Recently, youth has been viewed as both a risk group and an opportunity group. Although qualitative research prevails in the contemporary research tradition, in particular where youth emancipation is the aspiration, the empirical-analytic paradigm has not lost its relevance. The research demonstrates that the empirical-analytic paradigm is a specific genre of youth studies characterised by a quantitative approach and a strong link to politics and to the practical situation of the phenomenon.

  7. An in silico study of the molecular basis of B-RAF activation and conformational stability

    DEFF Research Database (Denmark)

    Fratev, Filip Filipov; Jonsdottir, Svava Osk

    2009-01-01

    B-RAF kinase plays an important role both in tumour induction and maintenance in several cancers and it is an attractive new drug target. However, the structural basis of the B-RAF activation is still not well understood. RESULTS: In this study we suggest a novel molecular basis of B-RAF activation...... based on molecular dynamics (MD) simulations of B-RAFWT and the B-RAFV600E, B-RAFK601E and B-RAFD594V mutants. A strong hydrogen bond network was identified in B-RAFWT in which the interactions between Lys601 and the well known catalytic residues Lys483, Glu501 and Asp594 play an important role...... the A-loop and the alphaC-helix in the activating mutants, which presumably contribute to the flipping of the activation segment to an active form. Conversely, in the B-RAFD594V mutant that has impaired kinase activity, and in B-RAFWT these interactions were strong and stabilized the kinase inactive...

  8. Technical Note: A comparison of model and empirical measures of catchment-scale effective energy and mass transfer

    Directory of Open Access Journals (Sweden)

    C. Rasmussen

    2013-09-01

    catchment woody plant cover decreases. In summary, the data indicated strong correspondence between model and empirical measures of EEMT with limited bias that agree well with other empirical measures of catchment energy and water partitioning and plant cover.

  9. Problems of reforming of the Russian Empire local government at the beginning of the 1830s

    Directory of Open Access Journals (Sweden)

    Nataliya L. Semyonova

    2016-03-01

    Full Text Available The article deals with the problems of local government reform in the high ruling circles of the Russian Empire at the beginning of the 1830s, on the basis of published and unpublished archival documents. The author analyses the ideas of the Minister of Internal Affairs A.A. Zakrevsky, taken under consideration by the Emperor Nicholas I in 1831. The project presupposed a reform of local government based on the principles of strengthening the governor's influence, restoring the unity of the province's governmental bodies, and increasing the number of officials in the provincial government and their salaries. The author concludes that A.A. Zakrevsky's suggestions were of a purely declarative character. The discussion of the Minister's ideas in the high governmental circles showed that the representatives of the supreme power found it necessary to keep the 1775 "Institute of the Russian Empire Province Ruling" and were reluctant to consider any changes to it. The author comes to the conclusion that the most important criteria for administrative change at that time were the tasks of creating an apparatus of government that would be cheap and efficient and could provide the best correlation between the authority of the local and central bodies. The decisive factor was the position of the Minister of Finance, who had to provide the financial grounds for any changes and reforms.

  10. Empirical studies on usability of mHealth apps: a systematic literature review.

    Science.gov (United States)

    Zapata, Belén Cruz; Fernández-Alemán, José Luis; Idri, Ali; Toval, Ambrosio

    2015-02-01

    The release of smartphones and tablets, which offer more advanced communication and computing capabilities, has led to the strong emergence of mHealth on the market. mHealth systems are being used to improve patients' lives and their health, in addition to facilitating communication between doctors and patients. Researchers are now proposing mHealth applications for many health conditions such as dementia, autism, dysarthria, Parkinson's disease, and so on. Usability becomes a key factor in the adoption of these applications, which are often used by people who have problems when using mobile devices and who have a limited experience of technology. The aim of this paper is to investigate the empirical usability evaluation processes described in a total of 22 selected studies related to mHealth applications by means of a Systematic Literature Review. Our results show that the empirical evaluation methods employed as regards usability could be improved by the adoption of automated mechanisms. The evaluation processes should also be revised to combine more than one method. This paper will help researchers and developers to create more usable applications. Our study demonstrates the importance of adapting health applications to users' need.

  11. Mini-project

    DEFF Research Database (Denmark)

    Katajainen, Jyrki

    2008-01-01

    In this project the goal is to develop the safe * family of containers for the CPH STL. The containers to be developed should be safer and more reliable than any of the existing implementations. A special focus should be put on strong exception safety since none of the existing prototypes available...

  12. Empirical Research In Engineering Design

    DEFF Research Database (Denmark)

    Ahmed, Saeema

    2007-01-01

    Increasingly engineering design research involves the use of empirical studies that are conducted within an industrial environment [Ahmed, 2001; Court 1995; Hales 1987]. Research into the use of information by designers or understanding how engineers build up experience are examples of research...... of research issues. This paper describes case studies of empirical research carried out within industry in engineering design focusing upon information, knowledge and experience in engineering design. The paper describes the research methods employed, their suitability for the particular research aims...

  13. Construction of Fine Particles Source Spectrum Bank in Typical Region and Empirical Research of Matching Diagnosis

    Science.gov (United States)

    Wang, Xing; Sun, Wenliang; Guo, Min; Li, Minjiao; Li, Wan

    2018-01-01

    The research object of this paper is fine particles in a typical region. The construction of the component spectrum bank is based on online source apportionment technology; the result of the apportionment is then used to verify the effectiveness of the fine-particle component spectrum bank, which also serves as the matching basis for online source apportionment receptor samples. Next, the particle sources of air pollution are subjected to matching-diagnosis empirical research using online source apportionment technology, to provide technical support for the cause analysis and treatment of heavy-pollution weather.

  14. Spectral properties of minimal-basis-set orbitals: Implications for molecular electronic continuum states

    Science.gov (United States)

    Langhoff, P. W.; Winstead, C. L.

    Early studies of the electronically excited states of molecules by John A. Pople and coworkers employing ab initio single-excitation configuration interaction (SECI) calculations helped to stimulate related applications of these methods to the partial-channel photoionization cross sections of polyatomic molecules. The Gaussian representations of molecular orbitals adopted by Pople and coworkers can describe SECI continuum states when sufficiently large basis sets are employed. Minimal-basis virtual Fock orbitals stabilized in the continuous portions of such SECI spectra are generally associated with strong photoionization resonances. The spectral attributes of these resonance orbitals are illustrated here by revisiting previously reported experimental and theoretical studies of molecular formaldehyde (H2CO) in combination with recently calculated continuum orbital amplitudes.

  15. On the Sophistication of Naïve Empirical Reasoning: Factors Influencing Mathematicians' Persuasion Ratings of Empirical Arguments

    Science.gov (United States)

    Weber, Keith

    2013-01-01

    This paper presents the results of an experiment in which mathematicians were asked to rate how persuasive they found two empirical arguments. There were three key results from this study: (a) Participants judged an empirical argument as more persuasive if it verified that integers possessed an infrequent property than if it verified that integers…

  16. Two concepts of empirical ethics.

    Science.gov (United States)

    Parker, Malcolm

    2009-05-01

    The turn to empirical ethics answers two calls. The first is for a richer account of morality than that afforded by bioethical principlism, which is cast as excessively abstract and thin on the facts. The second is for the facts in question to be those of human experience and not some other, unworldly realm. Empirical ethics therefore promises a richer naturalistic ethics, but in fulfilling the second call it often fails to heed the metaethical requirements related to the first. Empirical ethics risks losing the normative edge which necessarily characterizes the ethical, by failing to account for the nature and the logic of moral norms. I sketch a naturalistic theory, teleological expressivism (TE), which negotiates the naturalistic fallacy by providing a more satisfactory means of taking into account facts and research data with ethical implications. The examples of informed consent and the euthanasia debate are used to illustrate the superiority of this approach, and the problems consequent on including the facts in the wrong kind of way.

  17. ORGANIZATIONAL VALUES AND MORAL VIRTUES OF ENTREPRENEUR: AN EMPIRICAL STUDY OF SLOVENIAN ENTREPRENEURS

    Directory of Open Access Journals (Sweden)

    Vasilij Mate

    2013-05-01

    Full Text Available This article examines the self-reflection of Slovenian entrepreneurs on their own business activity, with a focus on their core values and virtues, which consequently affect the performance, growth and development of entrepreneurship in Slovenia. The article starts with a theoretical understanding of the organizational values and moral virtues of entrepreneurs and a review of recent empirical studies as the basis on which the attitude of Slovenian entrepreneurs towards entrepreneurship can be explained. We conducted our own empirical quantitative study on a representative sample of Slovenian entrepreneurs (n = 114). Using the obtained results, we tried to verify six hypotheses. We were particularly interested in those hypotheses presupposing that an entrepreneur who highly appreciates and respects the values and virtues of an ethical businessperson in practice will be more economically successful. Based on the results of our research, we found that Slovenian entrepreneurs are largely aware of the relevant organizational values and moral virtues, although this is not always evident in their actions in everyday business practice. The article concludes with an interpretation of the results and a discussion of the prospects and challenges for further exploration of the topics covered.

  18. Rainfall Intensity and Frequency Explain Production Basis Risk in Cumulative Rain Index Insurance

    Science.gov (United States)

    Muneepeerakul, Chitsomanus P.; Muneepeerakul, Rachata; Huffaker, Ray G.

    2017-12-01

    With minimal moral hazard and adverse selection, weather index insurance promises financial resilience to farmers struck by harsh weather through swift compensation at an affordable premium. Despite these advantages, the very nature of indexing gives rise to production basis risk, as the selected weather indexes do not sufficiently correspond to actual damages. To address this problem, we develop a stochastic yield model built upon a stochastic soil moisture model driven by marked Poisson rainfall. Our analysis shows that even under similar temperature and rainfall amounts, yields can differ significantly; this was empirically supported by a 2-year field experiment in which rain-fed maize was grown under very similar total rainfall. Here, the year with more intense, less frequent rainfall produced a better yield, a rare piece of counter-evidence to most climate change projections. Through the stochastic yield model, we demonstrate the crucial roles of rainfall intensity and frequency in determining yield. Importantly, the model allows us to compute the rainfall-pattern-related basis risk inherent in cumulative rain index insurance. The model results and a case study herein clearly show that total rainfall is a poor indicator of yield, imposing unnecessary production basis risk on farmers and false-positive payouts on insurers. Incorporating rainfall intensity and frequency in the design of rain index insurance can offer farmers better protection, while maintaining the attractive features of weather index insurance and thus fulfilling its promise of financial resilience.
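The distinction the abstract draws, same seasonal total but different intensity and frequency, can be illustrated with a minimal marked Poisson rainfall simulation. The rates and mean depths below are invented for illustration, not the paper's parameters.

```python
import numpy as np

def simulate_rainfall(rate, mean_depth, days, rng):
    """Marked Poisson rainfall: storm events arrive at `rate` per day,
    each carrying an exponentially distributed depth with mean `mean_depth` (mm)."""
    n_events = rng.poisson(rate * days)
    return rng.exponential(mean_depth, n_events)

rng = np.random.default_rng(42)
days = 120  # one growing season

# Two regimes with identical *expected* seasonal totals (rate * mean_depth * days = 480 mm):
frequent_light = simulate_rainfall(rate=0.5, mean_depth=8.0, days=days, rng=rng)
intense_rare = simulate_rainfall(rate=0.1, mean_depth=40.0, days=days, rng=rng)

print(f"totals: {frequent_light.sum():.0f} vs {intense_rare.sum():.0f} mm")
print(f"events: {len(frequent_light)} vs {len(intense_rare)}")
```

A cumulative rain index sees only the similar totals, while soil moisture dynamics, and hence yield, respond to the very different event counts and depths; that gap is the rainfall-pattern basis risk the model quantifies.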

  19. Design of subjects training on reactor simulator and feasibility study - toward the empirical evaluation of interface design concept

    International Nuclear Information System (INIS)

    Yamaguchi, Y.; Furukawa, H.; Tanabe, F.

    1998-01-01

    The ongoing JAERI project for empirical evaluation of the ecological interface design concept is first described. The empirical evaluation proceeds through three consecutive steps: design and implementation of the interface on a reactor simulator, verification of the created interface, and validation by simulator experiment. Conducting the project requires three experimental resources: a data analysis method for identifying operators' strategies, an experimental facility including a reactor simulator, and experimental subjects together with a subjects training method. Among these, the subjects training method was recently designed, and a simulator experiment was carried out to examine the feasibility of the designed training method. From the experiment and the analysis of the experimental records, we conclude that experimental subjects with an appropriate technical background can gain sufficient competence for evaluating the interface design concept using the training method designed. (author)

  20. Ravenna from imperial residence to episcopal city: processes of centrality across empires

    Directory of Open Access Journals (Sweden)

    Salvatore Cosentino

    2015-01-01

    Full Text Available From Late Antiquity to the early Middle Ages, two basic factors shaped Ravenna's ability to influence a much more extensive space than its natural hinterland. The first was its establishment as an imperial residence; the second was its location within the northern Adriatic basin, which had since Antiquity been a crossroads for peoples, trade and cultures. On the basis of the support it received from the imperial power, its episcopate was elevated to one of the most important sees of Italy. By means of the large international harbour of Classe, from the 5th to the 7th centuries the city imported products from around the entire Mediterranean. With the arrival of the Byzantine government, the ties between the port of Classe and the other Mediterranean export centres shifted from West to East. Moreover, the relationship with Constantinople reaffirmed the political and ecclesiastical importance of Ravenna. As long as these ties remained strong, Ravenna retained vital contact with the other maritime Mediterranean trade centres. The twilight of Byzantine rule did not cause the decline of the city, but rather a progressive turn of its ruling class toward the political scenario of the medieval West. By virtue of being the management centre of the patrimonium beati Apollinaris, the city remained wealthy and influential well beyond the 9th century, owing both to the economic power of its archbishops and to their alliance with the Ottonians and later with the Salian and Swabian emperors. The trajectories of the political centrality of Ravenna from Late Antiquity to the Middle Ages were therefore deeply influenced by the dynamics of successive empires, which, in one form or another, were all connected or attempted to reconnect to the memory of its Roman past.

  1. Institutions and growth: theoretical foundations and empirical evidence

    International Nuclear Information System (INIS)

    Wagner, A.F.

    2000-09-01

    Institutions and growth rates are strongly linked both theoretically and empirically. Institutions act through the 'efficiency of governance', as institutions of conflict management, and as devices for intertemporal optimization. Contrary to most of the literature, a non-linear (inversely U-shaped) influence of institutions on growth is also formally derived as a general hypothesis; this reflects the widely neglected notion that institutions also bring opportunity costs with them. Systematic econometric evaluations for a world-wide cross-sectional sample and a European Union panel show mixed results: the rule of law, property rights, and contract and law enforcement are consistently positively related to growth. In Europe, a non-linear relationship is often found for these and other institutions. Corporatism and trust are good for growth in Europe; both bring significant rent-seeking costs with them, though. Comparing the results, one notes that no easy transfer of knowledge is possible from one sample to the other. This is an important policy conclusion in its own right. (author)
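An inversely U-shaped influence is typically tested by adding a squared institutions term to a growth regression and checking the sign of its coefficient. A sketch on synthetic data (all coefficients invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic cross-section in which growth peaks at an interior level of
# institutional quality (inverted U), reflecting opportunity costs.
inst = rng.uniform(0.0, 1.0, 200)                     # institutional quality index
growth = 2.0 + 3.0 * inst - 2.5 * inst**2 + rng.normal(0.0, 0.2, 200)

# Fit growth = b0 + b1*inst + b2*inst^2; an inverted U shows up as b2 < 0,
# with the growth-maximizing institutional level at -b1 / (2*b2).
b2, b1, b0 = np.polyfit(inst, growth, deg=2)
peak = -b1 / (2.0 * b2)
print(b2 < 0, round(peak, 2))   # peak should land near the true 3.0/(2*2.5) = 0.6
```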

  2. A subleading power operator basis for the scalar quark current

    Science.gov (United States)

    Chang, Cyuan-Han; Stewart, Iain W.; Vita, Gherardo

    2018-04-01

    Factorization theorems play a crucial role in our understanding of the strong interaction. For collider processes they are typically formulated at leading power, and much less is known about power corrections in the λ ≪ 1 expansion. Here we present a complete basis of power-suppressed operators for a scalar quark current at O(λ²) in the amplitude-level power expansion in the Soft Collinear Effective Theory, demonstrating that helicity selection rules significantly simplify the construction. This basis applies for the production of any color-singlet scalar in qq̄ annihilation (such as bb̄ → H). We also classify all operators which contribute to the cross section at O(λ²) and perform matching calculations to determine their tree-level Wilson coefficients. These results can be exploited to study power corrections in both resummed and fixed-order perturbation theory, and for analyzing the factorization properties of gauge theory amplitudes and cross sections at subleading power.

  3. The Role of Aboriginal Literacy in Improving English Literacy in Remote Aboriginal Communities: An Empirical Systems Analysis with the Interplay Wellbeing Framework

    Science.gov (United States)

    Wilson, Byron; Quinn, Stephen J.; Abbott, Tammy; Cairney, Sheree

    2018-01-01

    Indigenous language endangerment is critical in Australia, with only 120 of 250 known languages remaining, and only 13 considered strong. A related issue is the gap in formal education outcomes for Aboriginal and Torres Strait Islander people compared with other Australians, with the gap wider in remote regions. Little empirical research exists in…

  4. Evapotranspiration Calculation on the Basis of the Riparian Zone Water Balance

    Directory of Open Access Journals (Sweden)

    SZILÁGYI, József

    2008-01-01

    Full Text Available Riparian forests have a strong influence on groundwater levels and groundwater-sustained stream baseflow. An empirical and a hydraulic version of a new method were developed to calculate evapotranspiration values from riparian-zone groundwater levels. The new technique was tested on the hydrometeorological data set of the Hidegvíz Valley experimental catchment (located in the Sopron Hills at the eastern foothills of the Alps). Evapotranspiration values of this new method were compared to the Penman-Monteith evapotranspiration values on a half-hourly scale and to the White method evapotranspiration values on a daily scale. Sensitivity analysis showed that the more reliable hydraulic version of our ET estimation technique is most sensitive (i.e., linearly) to the values of the saturated hydraulic conductivity and specific yield taken from the riparian zone.
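The White method used here as the daily-scale benchmark estimates riparian ET from diurnal water-table fluctuations. A sketch of its usual form (sign conventions for the storage term vary between formulations, and all numbers below are invented):

```python
def white_et(sy, r, delta_s):
    """White-method daily groundwater evapotranspiration, ET = Sy * (24*r + delta_s):
      sy      specific yield of the riparian aquifer [-]
      r       rate of nighttime water-table recovery [mm/h]
      delta_s net water-table decline over the 24-h day [mm]
    """
    return sy * (24.0 * r + delta_s)

# Hypothetical riparian-zone day:
print(white_et(sy=0.05, r=1.2, delta_s=10.0), "mm/day")  # ~1.94 mm/day
```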

  5. An empirical study on entrepreneurs' personal characteristics

    Directory of Open Access Journals (Sweden)

    Ahmad Ahmadkhani

    2012-04-01

    Full Text Available The personality of an entrepreneur is one of the most important factors in reaching success by creating jobs and opportunities. In this paper, we present an empirical study on the personal characteristics of students who are expected to act as entrepreneurs and create jobs in seven fields: accounting, computer science, mechanical engineering, civil engineering, metallurgy engineering, electrical engineering and drawing. Our study measures seven aspects of entrepreneurship: accepting reasonable risk, locus of control, the need for success, mental health conditions, being pragmatic, tolerating ambiguity, and dreaming and the sense of challenge. We uniformly distributed 133 questionnaires among undergraduate students in all seven groups and analyzed the results with Student's t-test. Our investigation indicates that the students accept a reasonable amount of risk, maintain a sufficient locus of control and are eager for success. In addition, our tests indicate that the students believe they maintain a sufficient level of mental health, with a strong sense of being pragmatic, and that they can handle ambiguity and challenges.

  6. Trade and Empire

    DEFF Research Database (Denmark)

    Bang, Peter Fibiger

    2007-01-01

    This article seeks to establish a new set of organizing concepts for the analysis of the Roman imperial economy from the Republic to late antiquity: tributary empire, portfolio capitalism and protection costs. Together these concepts explain economic developments in the Roman world better than the

  7. Measuring stakeholder participation in evaluation: an empirical validation of the Participatory Evaluation Measurement Instrument (PEMI).

    Science.gov (United States)

    Daigneault, Pierre-Marc; Jacob, Steve; Tremblay, Joël

    2012-08-01

    Stakeholder participation is an important trend in the field of program evaluation. Although a few measurement instruments have been proposed, they either have not been empirically validated or do not cover the full content of the concept. This study consists of a first empirical validation of a measurement instrument that fully covers the content of participation, namely the Participatory Evaluation Measurement Instrument (PEMI). It specifically examines (1) the intercoder reliability of scores derived by two research assistants on published evaluation cases; (2) the convergence between the scores of coders and those of key respondents (i.e., authors); and (3) the convergence between the authors' scores on the PEMI and the Evaluation Involvement Scale (EIS). A purposive sample of 40 cases drawn from the evaluation literature was used to assess reliability. One author per case in this sample was then invited to participate in a survey; 25 fully usable questionnaires were received. Stakeholder participation was measured on nominal and ordinal scales. Cohen's κ, the intraclass correlation coefficient, and Spearman's ρ were used to assess reliability and convergence. Reliability results ranged from fair to excellent. Convergence between coders' and authors' scores ranged from poor to good. Scores derived from the PEMI and the EIS were moderately associated. Evidence from this study is strong in the case of intercoder reliability and ranges from weak to strong in the case of convergent validation. Globally, this suggests that the PEMI can produce scores that are both reliable and valid.
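The reliability and convergence statistics named above can be reproduced on toy data. The ratings below are invented for illustration, Cohen's κ is computed by hand, and the intraclass correlation coefficient is omitted for brevity.

```python
from collections import Counter
from scipy.stats import spearmanr

def cohen_kappa(a, b):
    """Chance-corrected agreement between two raters' nominal codes."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n         # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in ca) / n ** 2       # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical nominal codes assigned by two coders to 10 evaluation cases
coder_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
coder_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
print(f"kappa = {cohen_kappa(coder_a, coder_b):.2f}")   # 0.58

# Hypothetical ordinal scores: PEMI vs EIS from the same authors
pemi = [2, 3, 1, 4, 2, 5, 3, 4, 1, 2]
eis = [1, 3, 2, 4, 2, 4, 3, 5, 1, 3]
rho, p = spearmanr(pemi, eis)
print(f"Spearman rho = {rho:.2f}")
```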

  8. Quantitative analyses of empirical fitness landscapes

    International Nuclear Information System (INIS)

    Szendro, Ivan G; Franke, Jasper; Krug, Joachim; Schenk, Martijn F; De Visser, J Arjan G M

    2013-01-01

    The concept of a fitness landscape is a powerful metaphor that offers insight into various aspects of evolutionary processes and guidance for the study of evolution. Until recently, empirical evidence on the ruggedness of these landscapes was lacking, but since it became feasible to construct all possible genotypes containing combinations of a limited set of mutations, the number of studies has grown to a point where a classification of landscapes becomes possible. The aim of this review is to identify measures of epistasis that allow a meaningful comparison of fitness landscapes and then apply them to the empirical landscapes in order to discern factors that affect ruggedness. The various measures of epistasis that have been proposed in the literature appear to be equivalent. Our comparison shows that the ruggedness of the empirical landscape is affected by whether the included mutations are beneficial or deleterious and by whether intragenic or intergenic epistasis is involved. Finally, the empirical landscapes are compared to landscapes generated with the rough Mt Fuji model. Despite the simplicity of this model, it captures the features of the experimental landscapes remarkably well. (paper)
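The rough Mt Fuji model the review benchmarks against is simple enough to sketch: an additive slope toward a reference genotype plus i.i.d. Gaussian noise, with the noise-to-slope ratio controlling ruggedness. Parameters below are arbitrary illustrations.

```python
import itertools
import numpy as np

def rough_mt_fuji(L, c, sigma, rng):
    """Rough Mt Fuji landscape over all 2^L genotypes of L biallelic loci:
    fitness = -c * (Hamming distance from the all-zero reference) + Gaussian noise."""
    return {g: -c * sum(g) + rng.normal(0.0, sigma)
            for g in itertools.product([0, 1], repeat=L)}

def n_local_maxima(landscape, L):
    """Count genotypes fitter than all single-mutant neighbours,
    one simple measure of landscape ruggedness."""
    count = 0
    for g, f in landscape.items():
        neighbours = (g[:i] + (1 - g[i],) + g[i + 1:] for i in range(L))
        count += all(f > landscape[nb] for nb in neighbours)
    return count

rng = np.random.default_rng(1)
smooth = rough_mt_fuji(L=4, c=1.0, sigma=0.0, rng=rng)   # purely additive
rugged = rough_mt_fuji(L=4, c=1.0, sigma=3.0, rng=rng)   # house-of-cards-like
# The additive landscape has exactly one peak; noise creates extra local optima.
print(n_local_maxima(smooth, 4), n_local_maxima(rugged, 4))
```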

  9. On the effects of basis set truncation and electron correlation in conformers of 2-hydroxy-acetamide

    Science.gov (United States)

    Szarecka, A.; Day, G.; Grout, P. J.; Wilson, S.

    Ab initio quantum chemical calculations have been used to study the differences in energy between two gas-phase conformers of the 2-hydroxy-acetamide molecule that possess intramolecular hydrogen bonding. In particular, rotation around the central C-C bond has been considered as a factor determining the structure of the hydrogen bond and the stabilization of the conformer. Energy calculations include full geometry optimization using both the restricted matrix Hartree-Fock model and second-order many-body perturbation theory with a number of commonly used basis sets, ranging from the minimal STO-3G set to 'split-valence' sets up to 6-31G. The effects of polarization functions were also studied. The results display a strong basis set dependence.

  10. Empirical pseudo-potential studies on electronic structure

    Indian Academy of Sciences (India)

    Theoretical investigations of electronic structure of quantum dots is of current interest in nanophase materials. Empirical theories such as effective mass approximation, tight binding methods and empirical pseudo-potential method are capable of explaining the experimentally observed optical properties. We employ the ...

  11. Criterion for traffic phases in single vehicle data and empirical test of a microscopic three-phase traffic theory

    International Nuclear Information System (INIS)

    Kerner, Boris S; Klenov, Sergey L; Hiller, Andreas

    2006-01-01

    Based on empirical and numerical microscopic analyses, the physical nature of a qualitatively different behaviour of the wide moving jam phase in comparison with the synchronized flow phase (microscopic traffic flow interruption within the wide moving jam phase) is found. A microscopic criterion for distinguishing the synchronized flow and wide moving jam phases in single-vehicle data measured at a single freeway location is presented. Based on this criterion, an empirical microscopic classification of different local congested traffic states is performed. Simulations show that the microscopic criterion and the macroscopic spatiotemporal objective criteria lead to the same identification of the synchronized flow and wide moving jam phases in congested traffic. Microscopic models in the context of three-phase traffic theory have been tested against the microscopic criterion for the phases in congested traffic. It is found that microscopic three-phase traffic models can explain both microscopic and macroscopic empirical congested pattern features. Microscopic frequency distributions for vehicle speed difference, as well as fundamental diagrams and speed correlation functions, can depend considerably on the spatial coordinate. It turns out that microscopic optimal velocity (OV) functions and time headway distributions are not necessarily qualitatively different, even if local congested traffic states are qualitatively different. The reason is that important spatiotemporal features of congested traffic patterns are lost in these, as in many other macroscopic and microscopic traffic characteristics, which are widely used as the empirical basis for testing traffic flow models, specifically cellular automata traffic flow models

  12. An Empirical Taxonomy of Crowdfunding Intermediaries

    OpenAIRE

    Haas, Philipp; Blohm, Ivo; Leimeister, Jan Marco

    2014-01-01

    Due to the recent popularity of crowdfunding, a broad range of crowdfunding intermediaries has emerged, while research on crowdfunding intermediaries has been largely neglected. As a consequence, existing classifications of crowdfunding intermediaries are conceptual, lack theoretical grounding, and are not empirically validated. Thus, we develop an empirical taxonomy of crowdfunding intermediaries, which is grounded in the theories of two-sided markets and financial intermediation. Integr...

  13. A sensitivity analysis of centrifugal compressors' empirical models

    International Nuclear Information System (INIS)

    Yoon, Sung Ho; Baek, Je Hyun

    2001-01-01

    The mean-line method using empirical models is the most practical method of predicting off-design performance. To gain insight into the empirical models, the influence of the empirical models on the performance prediction results is investigated. We found that, in the two-zone model, the secondary flow mass fraction has a considerable effect on the performance prediction curves at high mass flow rates. In the TEIS model, the first element changes the slope of the performance curves as well as the stable operating range, while the second element moves the performance curves up and down as it increases or decreases. It is also found that the slip factor affects the pressure ratio but has little effect on efficiency. Finally, this study reveals that the skin friction coefficient has a significant effect on both the pressure ratio curve and the efficiency curve. These results show the limitations of the present empirical models; more reasonable empirical models are needed

  14. Reduced multiple empirical kernel learning machine.

    Science.gov (United States)

    Wang, Zhe; Lu, MingZhe; Gao, Daqi

    2015-02-01

    Multiple kernel learning (MKL) is demonstrated to be flexible and effective in depicting heterogeneous data sources, since MKL can introduce multiple kernels rather than a single fixed kernel into applications. However, MKL has high time and space complexity in contrast to single kernel learning, which is not desirable in real-world applications. Meanwhile, the kernel mappings of MKL generally take two forms, implicit kernel mapping and empirical kernel mapping (EKM), of which the latter has attracted less attention. In this paper, we focus on MKL with EKM and propose a reduced multiple empirical kernel learning machine, named RMEKLM for short. To the best of our knowledge, this is the first work to reduce both the time and space complexity of MKL with EKM. Different from existing MKL, the proposed RMEKLM adopts Gauss elimination to extract a set of feature vectors, and we validate that doing so loses little information of the original feature space. RMEKLM then uses the extracted feature vectors to span a reduced orthonormal subspace of the feature space, which is visualized in terms of its geometric structure. It can be demonstrated that the spanned subspace is isomorphic to the original feature space, meaning that the dot product of two vectors in the original feature space equals that of the two corresponding vectors in the generated orthonormal subspace. More importantly, the proposed RMEKLM brings simpler computation and needs less storage space, especially in testing. Finally, the experimental results show that RMEKLM is both efficient and effective in terms of complexity and classification.
The contributions of this paper are as follows: (1) by mapping the input space into an orthonormal subspace, the geometry of the generated subspace is visualized; (2) this paper first reduces both the time and space complexity of EKM-based MKL; (3
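The isomorphism claim above (pairwise dot products preserved in the reduced orthonormal subspace) is easy to verify numerically. The sketch below uses an SVD in place of the paper's Gauss-elimination step and a linear kernel so that the empirical feature matrix is genuinely rank-deficient; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))       # 50 samples, 5 raw features

# Empirical kernel mapping with a linear kernel: sample i is represented
# by the vector (k(x_i, x_1), ..., k(x_i, x_50)), i.e. row i of K = X X^T.
Phi = X @ X.T                      # 50-dimensional mapped features, rank 5

# Reduction: an orthonormal basis of the span of the mapped features
# (SVD here stands in for the paper's Gauss-elimination extraction).
U, s, Vt = np.linalg.svd(Phi)
r = int((s > 1e-8 * s[0]).sum())   # numerical rank
Q = Vt[:r].T                       # orthonormal basis, shape (50, r)
Phi_red = Phi @ Q                  # coordinates in the r-dimensional subspace

# Isomorphism check: all pairwise dot products are preserved.
print(r)                                               # 5
print(np.allclose(Phi @ Phi.T, Phi_red @ Phi_red.T))   # True
```

Testing then works in r dimensions instead of n, which is where the storage and computation savings come from.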

  15. Cultural active approach to the issue of emotion regulation: theoretical explanation and empirical verification of a conceptual model

    Directory of Open Access Journals (Sweden)

    Elena I. Pervichko

    2016-06-01

    Full Text Available The paper gives a theoretical explanation and empirical verification of a conceptual emotion-regulation model developed in the theoretical and methodological context of the cultural-active paradigm. A general hypothesis concerning emotion regulation as a system including psychological and physiological levels has been verified empirically. The psychological level may be subdivided into a motivational-thinking level and an operational-technical one, governed by psychological mechanisms such as reflection and symbolic mediation. It was found that motivational peculiarities determine the manifestation of the other analyzed components of the emotion-regulation system. This is true not only for healthy subjects but also for patients with mitral valve prolapse (MVP). The significance of reflection and symbolic mediation in the cultural-active paradigm and in emotion regulation has been determined. It has been proved that emotion regulation among patients with MVP differs from that of healthy people, marked by a very strong conflict between goal-achieving and failure-avoiding motives, a lack of personal reflection, distortion of symbolic mediation, and very limited emotion-regulation resources. It has been shown that patients with MVP differ from the control group in suffering far stronger emotional stress, which exerts an overall negative impact, reducing their ability to use emotion-regulation resources effectively in emotionally meaningful situations.

  16. Advancing Empirical Approaches to the Concept of Resilience: A Critical Examination of Panarchy, Ecological Information, and Statistical Evidence

    Directory of Open Access Journals (Sweden)

    Ali Kharrazi

    2016-09-01

    Full Text Available Despite its ambiguities, the concept of resilience is of critical importance to researchers, practitioners, and policy-makers in dealing with dynamic socio-ecological systems. In this paper, we critically examine the three empirical approaches of (i) panarchy, (ii) ecological information-based network analysis, and (iii) statistical evidence of resilience against three criteria determined for achieving a comprehensive understanding and application of this concept. These criteria are the ability (1) to reflect a system's adaptability to shocks, (2) to integrate social and environmental dimensions, and (3) to evaluate system-level trade-offs. Our findings show that none of the three currently applied approaches is strong in handling all three criteria. Panarchy is strong on the first two criteria but has difficulty with normative trade-offs. The ecological information-based approach is strongest in evaluating trade-offs but relies on common dimensions that lead to over-simplifications in integrating the social and environmental dimensions. Statistical evidence provides suggestions that are the simplest and easiest to act upon but are generally weak on all three criteria. This analysis confirms the value of these approaches in specific instances but also the need for further research in advancing empirical approaches to the concept of resilience.

  17. THE INFLUENCE OF PERCEIVED RISK ON CONSUMERS’ INTENTION TO BUY ONLINE: A META-ANALYSIS OF EMPIRICAL RESULTS

    OpenAIRE

    Iconaru Claudia; Perju Alexandra; Macovei Octav Ionut

    2012-01-01

    When buying online, consumers fear for the security of their financial data and the privacy of their personal information. Together, these two fears constitute the perceived risk of an online transaction. The influence of perceived risk on consumers’ intention to buy online has been studied in various models, with results ranging from an insignificant influence to a strong and direct one. Faced with these confusing results from previous empirical research, we wonder why there a...

  18. Empirical models of structure of personal qualities of heads: affective type of social action by M. Weber (results of applied researches

    Directory of Open Access Journals (Sweden)

    A. A. Oseev

    2016-01-01

    Full Text Available The article is devoted to the methodological foundations of research on leaders' personal qualities. It differs from previous work, which examined the personal qualities of heads, including civil officers, in the works of Plato, Aristotle and M. Weber, and which presented empirical models of the structure of heads' personal qualities for the instrumental-rational and value-rational types of social action. This publication presents empirical models of the structure of heads' personal qualities for the affective type of M. Weber's social action. Thanks to this, M. Weber's concept of social action receives one more avenue of verification in practice. The following directions of social research are distinguished. The first direction: when an emotional component dominates the structure of personal qualities ("emotional unbalance") in comparison with intellectual, moral, volitional and other personal qualities (diplomacy, social experience, and so forth). Those whose indicators of emotional unbalance lie at the extreme, maximum borders are classified as psychopaths and are an object of clinical psychology and medicine. The second direction: when emotional unbalance in the structure of personal qualities competes on equal terms with intellectual, moral and volitional qualities, showing an equally marked deviation from average values. The third direction: when intellectual, moral, volitional and other personal qualities dominate over affective traits in the structure of personal qualities.

  19. Gazprom: the new empire

    International Nuclear Information System (INIS)

    Guillemoles, A.; Lazareva, A.

    2008-01-01

    Gazprom is conquering the world. The Russian industrial giant owns the largest gas reserves and enjoys considerable power. Gazprom publishes journals, owns hospitals and airplanes, and has even built cities where most of the inhabitants work for it. With 400,000 workers, Gazprom represents 8% of Russia's GDP. This inquiry describes the history and operation of this empire and shows how it has become a centerpiece of the government's strategy to restore Russian influence on the world scale. Is it going to be a winning game? Are the corruption affairs and the expected depletion of resources going to weaken the empire? The authors shed light on the political and diplomatic strategies played around the crucial dossier of energy supply. (J.S.)

  20. Intermodal connectivity in Europe, an empirical exploration

    NARCIS (Netherlands)

    de Langen, P.W.; Lases Figueroa, D.M.; van Donselaar, K.H.; Bozuwa, J.

    2017-01-01

    In this paper we analyse the intermodal connectivity in Europe. The empirical analysis is to our knowledge the first empirical analysis of intermodal connections, and is based on a comprehensive database of intermodal connections in Europe. The paper focuses on rail and barge services, as they are

  1. The emerging empirics of evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2011-01-01

    Following last decade’s programmatic papers on Evolutionary Economic Geography, we report on recent empirical advances and how this empirical work can be positioned vis-a`-vis other strands of research in economic geography. First, we review studies on the path dependent nature of clustering, and

  2. The emerging empirics of evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2010-01-01

    Following last decade’s programmatic papers on Evolutionary Economic Geography, we report on recent empirical advances and how this empirical work can be positioned vis-à-vis other strands of research in economic geography. First, we review studies on the path dependent nature of clustering, and how

  3. The emerging empirics of evolutionary economic geography.

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2011-01-01

    Following last decade’s programmatic papers on Evolutionary Economic Geography, we report on recent empirical advances and how this empirical work can be positioned vis-a`-vis other strands of research in economic geography. First, we review studies on the path dependent nature of clustering, and

  4. Innovation in clean coal technologies. Empirical evidence from firm-level patent data

    Energy Technology Data Exchange (ETDEWEB)

    Kruse, Juergen [Koeln Univ. (Germany). Dept. of Economics; Koeln Univ. (Germany). Energiewirtschaftliches Inst.; Wetzel, Heike [Kassel Univ. (Germany). Inst. of Economics

    2016-02-15

    This article empirically analyzes supply-side and demand-side factors expected to affect innovation in clean coal technologies. Patent data from 93 national and international patent offices is used to construct new firm-level panel data on 3,648 clean coal innovators over the time period 1978 to 2009. The results indicate that on the supply side a firm's history in clean coal patenting and overall propensity to patent positively affect clean coal innovation. On the demand side we find strong evidence that environmental regulation of emissions, that is CO₂, NOₓ and SO₂, induces innovation in both efficiency improving combustion and after pollution control technologies.

  5. VMF3/GPT3: refined discrete and empirical troposphere mapping functions

    Science.gov (United States)

    Landskron, Daniel; Böhm, Johannes

    2018-04-01

    Incorrect modeling of troposphere delays is one of the major error sources for space geodetic techniques such as Global Navigation Satellite Systems (GNSS) or Very Long Baseline Interferometry (VLBI). Over the years, many approaches have been devised which aim at mapping the delay of radio waves from zenith direction down to the observed elevation angle, so-called mapping functions. This paper contains a new approach intended to refine the currently most important discrete mapping function, the Vienna Mapping Functions 1 (VMF1), which is successively referred to as Vienna Mapping Functions 3 (VMF3). It is designed in such a way as to eliminate shortcomings in the empirical coefficients b and c and in the tuning for the specific elevation angle of 3°. Ray-traced delays of the ray-tracer RADIATE serve as the basis for the calculation of new mapping function coefficients. Comparisons of modeled slant delays demonstrate the ability of VMF3 to approximate the underlying ray-traced delays more accurately than VMF1 does, in particular at low elevation angles. In other words, when requiring highest precision, VMF3 is to be preferable to VMF1. Aside from revising the discrete form of mapping functions, we also present a new empirical model named Global Pressure and Temperature 3 (GPT3) on a 5°× 5° as well as a 1°× 1° global grid, which is generally based on the same data. Its main components are hydrostatic and wet empirical mapping function coefficients derived from special averaging techniques of the respective (discrete) VMF3 data. In addition, GPT3 also contains a set of meteorological quantities which are adopted as they stand from their predecessor, Global Pressure and Temperature 2 wet. Thus, GPT3 represents a very comprehensive troposphere model which can be used for a series of geodetic as well as meteorological and climatological purposes and is fully consistent with VMF3.
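Both VMF1 and VMF3 rest on the standard three-term continued-fraction form, in which only the coefficients a, b, c differ; the values below are placeholders for illustration, not tabulated VMF3 coefficients.

```python
import math

def mapping_function(elevation_deg, a, b, c):
    """Three-term continued-fraction mapping function (the form shared by
    VMF1 and VMF3): ratio of the slant delay to the zenith delay."""
    s = math.sin(math.radians(elevation_deg))
    numerator = 1 + a / (1 + b / (1 + c))
    denominator = s + a / (s + b / (s + c))
    return numerator / denominator

# Placeholder hydrostatic-like coefficients (NOT tabulated VMF3 values):
a, b, c = 1.2e-3, 2.9e-3, 62.5e-3
print(mapping_function(90.0, a, b, c))  # 1 at zenith (numerator equals denominator)
print(mapping_function(5.0, a, b, c))   # ~10: delays grow steeply at low elevation
```

This makes visible why the refinement targets b, c and low elevation angles: near the horizon the denominator is dominated by the coefficients rather than by sin e, so small coefficient errors map into large slant-delay errors.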

  6. Is Investment in Maize Research Balanced and Justified? An Empirical Study

    Directory of Open Access Journals (Sweden)

    Hari Krishna Shrestha

    2016-12-01

    Full Text Available The objective of this study was to investigate whether investment in maize research was adequate and balanced in the Nepalese context. Resource use in maize research was studied empirically with standard congruency analysis, using the Full Time Equivalent (FTE) of researchers as a proxy measure of investment. The number of researchers involved in maize was 61, but only 21.25 on an FTE basis, indicating that full-time researchers were very few compared to the cultivated area of maize in the country. Statistical analysis revealed that investment in maize research was higher in the Tarai and lower in the Hills. The congruency index on an actual production basis was low across the eco-zones and even lower across the geographical regions, indicating that the investment in maize research was mismatched and not justified. When adjusted with the equity factor and the research progress factor, no substantial difference was found in the congruency index. This study recommends a substantial increase in investment in maize research, allocated in a balanced and justified manner across the eco-zones and geographical regions. The Hills need special attention to increased investment, as maize output value is higher in this eco-zone. The eastern and western regions also need increased investment in maize in line with their contribution to the output value.

  7. Comparison of an automated classification system with an empirical classification of circulation patterns over the Pannonian basin, Central Europe

    Science.gov (United States)

    Maheras, Panagiotis; Tolika, Konstantia; Tegoulias, Ioannis; Anagnostopoulou, Christina; Szpirosz, Klicász; Károssy, Csaba; Makra, László

    2018-04-01

    The aim of the study is to compare the performance of two classification methods based on the atmospheric circulation types over the Pannonian basin in Central Europe. Moreover, relationships including seasonal occurrences and correlation coefficients, as well as comparative diagrams of the seasonal occurrences of the circulation types of the two classification systems, are presented. When comparing the automated (objective) and empirical (subjective) classification methods, it was found that the frequency of the empirical anticyclonic (cyclonic) types is much higher (lower) than that of the automated anticyclonic (cyclonic) types, on both an annual and a seasonal basis. The highest and statistically significant correlations between the circulation types of the two classification systems, as well as those between the cumulated seasonal anticyclonic and cyclonic types, occur in winter for both classifications, since the weather-influencing effect of the atmospheric circulation is most prevalent in this season. Precipitation amounts in Budapest display a decreasing trend, in accordance with the decrease in the occurrence of the automated cyclonic types. In contrast, the occurrence of the empirical cyclonic types displays an increasing trend. Some types in a given classification are usually accompanied by high ratios of certain types in the other classification.

  8. An empirical spectroscopic database for acetylene in the regions of 5850-6341 cm-1 and 7000-9415 cm-1

    Science.gov (United States)

    Lyulin, O. M.; Campargue, A.

    2017-12-01

    Six studies have recently been devoted to a systematic analysis of the high-resolution near-infrared absorption spectrum of acetylene recorded by Cavity Ring Down Spectroscopy (CRDS) in Grenoble and by Fourier-transform spectroscopy (FTS) in Brussels and Hefei. On the basis of these works, in the present contribution we construct an empirical database for acetylene in the 5850-9415 cm-1 region, excluding the 6341-7000 cm-1 interval corresponding to the very strong ν1+ν3 manifold. Our database gathers and extends the information included in our CRDS and FTS studies. In particular, the intensities of about 1700 lines measured by CRDS in the 7244-7920 cm-1 region are reported for the first time, together with those of several bands of 12C13CH2 present in natural isotopic abundance in the acetylene sample. The Herman-Wallis coefficients of most of the bands are derived from a fit of the measured intensity values. A recommended line list is provided, with positions calculated using empirical spectroscopic parameters of the lower and upper vibrational energy levels and intensities calculated using the derived Herman-Wallis coefficients. This approach allows us to complete the experimental list by adding missing lines and improving poorly determined positions and intensities. As a result, the constructed line list includes a total of 11113 transitions belonging to 150 bands of 12C2H2 and 29 bands of 12C13CH2. For comparison, the HITRAN database in the same region includes 869 transitions of 14 bands, all belonging to 12C2H2. Our weakest lines have an intensity on the order of 10-29 cm/molecule, about three orders of magnitude below the HITRAN intensity cut-off. Line profile parameters are added to the line list, which is provided in HITRAN format. The comparison of the acetylene database with the HITRAN2012 line list and with results obtained using the global effective operator approach is discussed in terms of completeness and accuracy.

  9. [The current conception of the unconscious - empirical results of neurobiology, cognitive sciences, social psychology and emotion research].

    Science.gov (United States)

    Schüssler, Gerhard

    2002-01-01

    The influence of the unconscious on psychosomatic medicine and psychotherapy: a comprehensive concept of unconscious processes based on empirical evidence. The theory of the Unconscious constitutes the basis of psychoanalysis and of psychodynamic therapy. The traditional description of the Unconscious given by Freud is of historical significance; it not only gained widespread acceptance but also attracted much criticism. The most important findings of neurobiology, the cognitive sciences, social psychology and emotion research in relation to the Unconscious are compared with this traditional definition. Empirical observations on defence mechanisms are of particular interest in this context. A comprehensive concept of unconscious processes emerges: the fundamental processing of the brain is unconscious. Parts of the symbolic-declarative and emotional-procedural processing of the brain are permanently unconscious. Other parts of these processes are conscious, can be brought into consciousness, or can be excluded from it. Unconscious processes exert a decisive influence on experience and behaviour; for this reason, every form of psychotherapy should take such unconscious processes into account.

  10. Prediction of Physicochemical Properties of Organic Molecules Using Semi-Empirical Methods

    International Nuclear Information System (INIS)

    Kim, Chan Kyung; Kim, Chang Kon; Kim, Miri; Lee, Hai Whang; Cho, Soo Gyeong

    2013-01-01

    Prediction of the physicochemical properties of organic molecules is an important process in chemistry and chemical engineering. The MSEP approach developed in our lab calculates the molecular surface electrostatic potential (ESP) on the van der Waals (vdW) surfaces of molecules. This approach includes geometry optimization and frequency calculation using hybrid density functional theory, B3LYP, with the 6-31G(d) basis set to find minima on the potential energy surface, and is known to give satisfactory QSPR results for various properties of organic molecules. However, the MSEP method is not suitable for screening large databases, because geometry optimization and frequency calculation require considerable computing time. To develop a fast yet reliable approach, we have re-examined our previous work on organic molecules using two semi-empirical methods, AM1 and PM3. This new approach can be an efficient protocol for designing new molecules with improved properties.

  11. Methods for Calculating Empires in Quasicrystals

    Directory of Open Access Journals (Sweden)

    Fang Fang

    2017-10-01

    Full Text Available This paper reviews the empire problem for quasiperiodic tilings and the existing methods for generating the empires of the vertex configurations in quasicrystals, while introducing a new and more efficient method based on the cut-and-project technique. Using Penrose tiling as an example, this method finds the forced tiles with the restrictions in the high dimensional lattice (the mother lattice that can be cut-and-projected into the lower dimensional quasicrystal. We compare our method to the two existing methods, namely one method that uses the algorithm of the Fibonacci chain to force the Ammann bars in order to find the forced tiles of an empire and the method that follows the work of N.G. de Bruijn on constructing a Penrose tiling as the dual to a pentagrid. This new method is not only conceptually simple and clear, but it also allows us to calculate the empires of the vertex configurations in a defected quasicrystal by reversing the configuration of the quasicrystal to its higher dimensional lattice, where we then apply the restrictions. These advantages may provide a key guiding principle for phason dynamics and an important tool for self error-correction in quasicrystal growth.

  12. Structural basis for PPARγ transactivation by endocrine-disrupting organotin compounds

    Science.gov (United States)

    Harada, Shusaku; Hiromori, Youhei; Nakamura, Shota; Kawahara, Kazuki; Fukakusa, Shunsuke; Maruno, Takahiro; Noda, Masanori; Uchiyama, Susumu; Fukui, Kiichi; Nishikawa, Jun-Ichi; Nagase, Hisamitsu; Kobayashi, Yuji; Yoshida, Takuya; Ohkubo, Tadayasu; Nakanishi, Tsuyoshi

    2015-02-01

    Organotin compounds such as triphenyltin (TPT) and tributyltin (TBT) act as endocrine disruptors through the peroxisome proliferator-activated receptor γ (PPARγ) signaling pathway. We recently found that TPT is a particularly strong agonist of PPARγ. To elucidate the mechanism underlying organotin-dependent PPARγ activation, we analyzed the interactions of the PPARγ ligand-binding domain (LBD) with TPT and TBT by using X-ray crystallography and mass spectrometry in conjunction with cell-based activity assays. Crystal structures of the PPARγ-LBD/TBT and PPARγ-LBD/TPT complexes were determined at resolutions of 1.95 Å and 1.89 Å, respectively. Specific binding of organotins is achieved through non-covalent ionic interactions between the sulfur atom of Cys285 and the tin atom. Comparisons of the determined structures suggest that the strong activity of TPT arises through interactions with helix 12 of the LBD, primarily via π-π interactions. Our findings elucidate the structural basis of PPARγ activation by TPT.

  13. An empirical analysis of Diaspora bonds

    OpenAIRE

    AKKOYUNLU, Şule; STERN, Max

    2018-01-01

    Abstract. This study is the first to investigate, theoretically and empirically, the determinants of Diaspora Bonds for eight developing countries (Bangladesh, Ethiopia, Ghana, India, Lebanon, Pakistan, the Philippines, and Sri Lanka) and one developed country (Israel) for the period 1951 to 2008. Empirical results are consistent with the predictions of the theoretical model. The most robust variables are the closeness indicator and the sovereign rating, both on the demand side. The spread is ...

  14. How rational should bioethics be? The value of empirical approaches.

    Science.gov (United States)

    Alvarez, A A

    2001-10-01

    Rational justification of claims with empirical content calls for empirical, and not only normative philosophical, investigation. Empirical approaches to bioethics are epistemically valuable, i.e., such methods may be necessary for providing and verifying basic knowledge about cultural values and norms. Our assumptions in moral reasoning can be verified or corrected using these methods, and moral arguments can be initiated or adjudicated by data drawn from empirical investigation. One may argue, for example, that individualistic informed consent is not compatible with the Asian communitarian orientation. But this normative claim rests on an empirical assumption that may be contrary to the fact that some Asians do value and argue for informed consent. Is it necessary and factual to neatly characterize some cultures as individualistic and some as communitarian? Empirical investigation can provide a reasonable way to inform such generalizations. In a multi-cultural context, such as in the Philippines, there is a need to investigate the nature of the local ethos before making any appeal to authenticity; otherwise we may succumb to the same ethical imperialism we are trying hard to resist. Normative claims that involve empirical premises cannot be reasonably verified or evaluated without utilizing empirical methods along with philosophical reflection. The integration of empirical methods into the standard normative approach to moral reasoning should be guided by the epistemic demands of claims arising from cross-cultural discourse in bioethics.

  15. Merging expert and empirical data for rare event frequency estimation: Pool homogenisation for empirical Bayes models

    International Nuclear Information System (INIS)

    Quigley, John; Hardman, Gavin; Bedford, Tim; Walls, Lesley

    2011-01-01

    Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequencies of an event and a pool of events. The pool will draw upon, for example, events with similar precursors. The higher the degree of homogeneity of the pool, the more accurate the Empirical Bayes estimator will be. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a Homogeneous Poisson Process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically, indicating that reduced error is associated with greater pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.
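    The idea of pooling with homogenisation factors can be sketched as follows. This is a hedged illustration, not the paper's exact estimator: the credibility weight is taken here as a free parameter rather than derived from the Empirical Bayes machinery, and the placement of the elicited factor h on the pooled counts is an assumption for the example.

    ```python
    def eb_rate_estimate(n_event, t_event, pool, weight):
        """Sketch of an Empirical Bayes rate estimate for a rare event.

        n_event, t_event : observed count and exposure time of the target event
        pool             : list of (count, exposure, h) tuples, where h is an
                           elicited homogenisation factor scaling that event's
                           frequency onto the target event's scale
        weight           : credibility weight in [0, 1] given to the target
                           event's own data (illustrative free parameter)
        """
        own_rate = n_event / t_event
        # Homogenised pooled rate: scale each event's count by its factor h,
        # then treat the aligned pool as one Homogeneous Poisson Process.
        pooled_rate = sum(h * n for n, t, h in pool) / sum(t for n, t, h in pool)
        return weight * own_rate + (1 - weight) * pooled_rate

    # One observed event in 10 years, pooled with two similar-precursor events
    # whose frequencies are aligned by factors 0.5 and 1.0:
    est = eb_rate_estimate(1, 10, [(4, 20, 0.5), (2, 20, 1.0)], weight=0.5)
    print(est)  # 0.1 events per year
    ```

    A well-chosen h pulls the pooled rate onto the target event's scale, so shrinkage toward the pool reduces variance without introducing the bias an unhomogenised pool would.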

  16. The genetic basis of hair whorl, handedness, and other phenotypes

    Science.gov (United States)

    Hatfield, J.S.

    2006-01-01

    Evidence is presented that RHO, RHCE, and other RH genes may be interesting candidates to consider when searching for the genetic basis of hair whorl rotation (i.e., clockwise or counterclockwise), handedness (i.e., right handed, left handed or ambidextrous), speech laterality (i.e., right brained or left brained), speech dyslexia (e.g., stuttering), sexual orientation (i.e., heterosexual, homosexual, bisexual, or transsexual), schizophrenia, bipolar disorder, and autism spectrum disorder. Such evidence involves the need for a genetic model that includes maternal immunization to explain some of the empirical results reported in the literature. The complex polymorphisms present among the maternally immunizing RH genes can then be used to explain other empirical results. Easily tested hypotheses are suggested, based upon genotypic (but not phenotypic) frequencies of the RH genes. In particular, homozygous dominant individuals are expected to be less common or lacking entirely among the alternative phenotypes. If it is proven that RH genes are involved in brain architecture, it will have a profound effect upon our understanding of the development and organization of the asymmetrical vertebrate brain and may eventually lead to a better understanding of the developmental processes which produce the various alternative phenotypes discussed here. In addition, if RH genes are shown to be involved in the production of these phenotypes, then evolutionary studies can be performed to demonstrate the beneficial effect of the recessive alleles of RHO and RHCE, and to explain why human evolution appears to be selecting for the recessive alleles even though an increase in their frequency may imply lower average fecundity among some individuals possessing them.

  17. Basis for criteria for exemption of decommissioning waste

    International Nuclear Information System (INIS)

    Elert, M.; Wiborgh, M.; Bengtsson, A.

    1992-02-01

    The purpose of this study was to provide the Swedish Radiation Protection Institute (SSI) with technical background material to be used as a basis for future decisions concerning exemption levels for decommissioning waste. Simple models have been developed for evaluating the individual doses that may arise from steel waste, concrete waste and non-burnable waste exempted from regulatory control. Two alternatives were studied for the exempted wastes: recycling and disposal in different types of near-surface repositories. The example calculations for the recycling scenarios show that the individual dose obtained is strongly dependent on the exposure time. Thus, the storage of the waste at a scrap yard will give rise to a higher dose than the melting, due to the longer storage time. (28 refs.)

  18. An empirically tractable model of optimal oil spills prevention in Russian sea harbours

    Energy Technology Data Exchange (ETDEWEB)

    Deissenberg, C. [CEFI-CNRS, Les Milles (France); Gurman, V.; Tsirlin, A. [RAS, Program Systems Inst., Pereslavl-Zalessky (Russian Federation); Ryumina, E. [Russian Academy of Sciences, Moscow (Russian Federation). Inst. of Economic Market Problems

    2001-07-01

    Based on previous theoretical work by Gottinger (1997, 1998), we propose a simple model of the optimal monitoring of oil-related activities in harbour areas that is suitable for empirical estimation within the Russian-Ukrainian context, in spite of the poor availability of data in these countries. Specifically, the model indicates how best to allocate a given monitoring budget between different monitoring activities at the steady state. An approximate analytical solution to the optimization problem is derived, and a simple procedure for estimating the model on the basis of the actually available data is suggested. An application using data obtained for several harbours of the Black and Baltic Seas is given. It suggests that current Russian monitoring practice could be much improved by better allocating the available monitoring resources. (Author)

  19. Offshore Wind Turbines Situated in Areas with Strong Currents

    DEFF Research Database (Denmark)

    Jensen, Morten S.; Juul Larsen, Brian; Frigaard, Peter

    Prediction of local scour caused by offshore wind turbine foundations using empirical formulae or numerical models.

  20. Providing the physical basis of SCS curve number method and its proportionality relationship from Richards' equation

    Science.gov (United States)

    Hooshyar, M.; Wang, D.

    2016-12-01

    The empirical proportionality relationship, which states that the ratios of cumulative surface runoff and infiltration to their corresponding potentials are equal, is the basis of the extensively used Soil Conservation Service Curve Number (SCS-CN) method. The objective of this paper is to provide the physical basis of the SCS-CN method and its proportionality hypothesis from the infiltration-excess runoff generation perspective. To this end, an analytical solution of Richards' equation is derived for ponded infiltration in a shallow water table environment under the following boundary conditions: 1) the soil is saturated at the land surface; and 2) there is a no-flux boundary which moves downward. The solution is established under the assumptions of a negligible gravitational effect, constant soil water diffusivity, and a hydrostatic soil moisture profile between the no-flux boundary and the water table. Based on the derived analytical solution, the proportionality hypothesis is a reasonable approximation for rainfall partitioning at the early stage of ponded infiltration in areas with a shallow water table and coarse-textured soils.
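    The proportionality hypothesis Q/(P - Ia) = F/S, with F = P - Ia - Q, rearranges into the familiar curve number runoff equation. A short sketch of that standard form (SI units, retention S in mm, and the common assumption Ia = 0.2 S):

    ```python
    def scs_runoff(P, CN, ia_ratio=0.2):
        """Cumulative direct runoff Q from the SCS-CN method (depths in mm).

        Derived from the proportionality hypothesis F/S = Q/(P - Ia),
        where F = P - Ia - Q is cumulative infiltration, S the potential
        maximum retention, and Ia the initial abstraction.
        """
        S = 25400.0 / CN - 254.0     # potential maximum retention, mm
        Ia = ia_ratio * S            # initial abstraction
        if P <= Ia:
            return 0.0               # all rainfall abstracted, no runoff
        return (P - Ia) ** 2 / (P - Ia + S)

    # 100 mm storm on a CN = 80 catchment:
    print(scs_runoff(100.0, 80.0))   # ~50.5 mm of runoff
    ```

    Higher CN (lower retention S) shifts more of the storm depth into runoff, which is why CN encodes land use and soil group in the method's lookup tables.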

  1. Analysis of Basis Sectors in Relation to Labor Absorption in Batang Hari Regency

    Directory of Open Access Journals (Sweden)

    Syaiful Syaiful

    2014-11-01

    Full Text Available The purpose of the study is to (1) identify the economic sectors that are basis sectors, and their growth rates, in Batang Hari Regency, (2) examine the correlation between the development of the basis sectors and employment, and (3) analyze local government policy on developing those basis sectors. The study applies a bibliographic method to secondary PDRB (regional GDP) data at constant 2000 prices for Batang Hari Regency and Jambi Province from 2003 to 2012, together with employment data for the same period and regency. The analysis uses the Location Quotient (LQ) model and the Dynamic Location Quotient (DLQ) model, while the correlation between the development of the basis sectors and employment uses Pearson's coefficient of correlation. The LQ analysis shows that four economic sectors are basis sectors (LQ > 1) in Batang Hari: agriculture; industry and manufacturing; trade, hotels and restaurants; and other services. The DLQ analysis identifies four sectors that can become basis sectors in the future (DLQ > 1): mining and excavation; electricity, gas and fresh water; transportation and communications; and the services sector. Only the services sector is identified as a basis sector both now and in the future. The PDRB data for Batang Hari from 2003 to 2012 show that the average growth rates of agriculture, industry and manufacturing, and trade, hotels and restaurants are relatively smaller than those of the other sectors in Batang Hari, with the exception of the services sector, which is higher. Pearson's coefficient of correlation shows that the growth of only two basis sectors has a strong, positive correlation with employment in Batang Hari: services, and trade with hotels and restaurants; the PDRB growth of these sectors moves in the same direction as the employment rates. For the agriculture and manufacturing industry sectors, the correlation with employment is low and very weak.
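    The Location Quotient used above compares a sector's share of the regional economy with its share of the reference (here, provincial) economy; LQ > 1 marks a basis (export) sector. A minimal sketch with illustrative numbers, not the study's data:

    ```python
    def location_quotient(sector_local, total_local, sector_ref, total_ref):
        """LQ = (sector share of the regional economy) /
                (sector share of the reference economy).
        LQ > 1 indicates a basis sector: the region is relatively
        specialized in it and presumably exports part of its output."""
        return (sector_local / total_local) / (sector_ref / total_ref)

    # Hypothetical values: a sector producing 40 of a regional total of 100,
    # against a reference economy where it produces 20 of a total of 100.
    print(location_quotient(40, 100, 20, 100))  # 2.0 -> basis sector
    ```

    The Dynamic LQ applies the same share-of-shares logic to growth rates rather than levels, which is how the study flags sectors that may become basis sectors in the future.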

  2. Physician Assistant Job Satisfaction: A Narrative Review of Empirical Research.

    Science.gov (United States)

    Hooker, Roderick S; Kuilman, Luppo; Everett, Christine M

    2015-12-01

    To examine physician assistant (PA) job satisfaction, identify factors predicting job satisfaction, and identify areas of needed research. With a global PA movement underway and a half-century in development, the empirical basis for informing employers of approaches to improve job satisfaction has not received a careful review. A narrative review of empirical research was undertaken to inform stakeholders about PA employment, with a goal of improved management. The a priori criteria included published studies that asked PAs about job satisfaction. Articles addressing PA job satisfaction, written in English, were reviewed and categorized according to the Job Characteristics Model. Of 68 publications reviewed, 29 met the criteria and were categorized in a Job Characteristics Model. Most studies report a high degree of job satisfaction when autonomy, income, patient responsibility, physician support, and career advancement opportunities are surveyed. Age, sex, specialty, and occupational background are needed to understand their effect on job satisfaction. The quality of the studies varies widely. Employers may want to examine their relationships with PAs periodically. The factors of job satisfaction may assist policymakers and health administrators in creating welcoming professional employment environments. The main limitation: no study comprehensively evaluated all the antecedents of job satisfaction. PAs seem to experience job satisfaction, supported by low attrition rates and competitive wages. Contributing factors are autonomy, experienced responsibility, pay, and a supportive supervising physician. A number of intrinsic rewards derived from the performance of the job within the social environment, along with extrinsic rewards, may contribute to overall job satisfaction. PA job satisfaction research is underdeveloped; investigations should include longitudinal studies, cohort analyses, and economic determinants.

  3. Empirical Evidence from Kenya

    African Journals Online (AJOL)

    FIRST LADY

    2011-01-18

    Jan 18, 2011 ... Empirical results reveal that consumption of sugar in Kenya varies ... experiences in trade in different regions of the world. Some studies ... To assess the relationship between domestic sugar retail prices and sugar sales in ...

  4. Empirical analysis of uranium spot prices

    International Nuclear Information System (INIS)

    Morman, M.R.

    1988-01-01

    The objective is to empirically test a market model of the uranium industry incorporating the notion that, if the resource is viewed as an asset by economic agents, then its own rate of return, along with the own rate of return of a competing asset, is a major factor in the formation of the resource's price. The model tested is based on a market model of supply and demand. The supply model incorporates the notion that the decision criterion used by uranium mine owners is to select the extraction rate that maximizes the net present value of their extraction receipts. The demand model uses a concept that allows explicit recognition of the prospect of arbitrage between a natural-resource market and the market for other capital goods. The empirical approach used for estimation was a recursive, or causal, model. The empirical results were consistent with the theoretical models: the coefficients of the demand and supply equations had the appropriate signs. Tests for causality were conducted to validate the use of the causal model, and the results obtained were favorable. The implications of the findings for future studies of exhaustible resources are: (1) in some cases causal models are the appropriate specification for empirical analysis; (2) supply models should incorporate a measure to capture depletion effects.

  5. Critical Realism and Empirical Bioethics: A Methodological Exposition.

    Science.gov (United States)

    McKeown, Alex

    2017-09-01

    This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within 'empirical bioethics'. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social scientific, empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however each has shortcomings. As a mature interdisciplinary approach critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.

  6. Evaluation of one-dimensional and two-dimensional volatility basis sets in simulating the aging of secondary organic aerosol with smog-chamber experiments.

    Science.gov (United States)

    Zhao, Bin; Wang, Shuxiao; Donahue, Neil M; Chuang, Wayne; Hildebrandt Ruiz, Lea; Ng, Nga L; Wang, Yangjun; Hao, Jiming

    2015-02-17

    We evaluate the one-dimensional volatility basis set (1D-VBS) and two-dimensional volatility basis set (2D-VBS) in simulating the aging of SOA derived from toluene and α-pinene against smog-chamber experiments. If we simulate the first-generation products with empirical chamber fits and the subsequent aging chemistry with a 1D-VBS or a 2D-VBS, the models mostly overestimate the SOA concentrations in the toluene oxidation experiments. This is because the empirical chamber fits include both first-generation oxidation and aging; simulating aging in addition to this results in double counting of the initial aging effects. If the first-generation oxidation is treated explicitly, the base-case 2D-VBS underestimates the SOA concentrations and O:C increase of the toluene oxidation experiments; it generally underestimates the SOA concentrations and overestimates the O:C increase of the α-pinene experiments. With the first-generation oxidation treated explicitly, we could modify the 2D-VBS configuration individually for toluene and α-pinene to achieve good model-measurement agreement. However, we are unable to simulate the oxidation of both toluene and α-pinene with the same 2D-VBS configuration. We suggest that future models should implement parallel layers for anthropogenic (aromatic) and biogenic precursors, and that more modeling studies and laboratory research be done to optimize the "best-guess" parameters for each layer.
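    The volatility basis set rests on equilibrium absorptive partitioning: the particle-phase fraction of a volatility bin with saturation concentration C* is 1/(1 + C*/C_OA), where C_OA, the total organic aerosol concentration, itself depends on the partitioned mass and must be solved self-consistently. A hedged sketch using fixed-point iteration (the bin values are illustrative, not fitted chamber parameters):

    ```python
    def vbs_partition(total_mass, c_star, c_oa_init=1e-3, n_iter=50):
        """Equilibrium partitioning in a 1D volatility basis set.

        total_mass : total (gas + particle) mass in each bin, ug/m3
        c_star     : saturation concentration C* of each bin, ug/m3
        Returns the particle-phase mass per bin. Because the partitioning
        fraction 1/(1 + C*/C_OA) depends on C_OA = sum of particle mass,
        C_OA is found by fixed-point iteration.
        """
        c_oa = c_oa_init
        for _ in range(n_iter):
            particle = [m / (1.0 + c / c_oa) for m, c in zip(total_mass, c_star)]
            c_oa = max(sum(particle), 1e-12)  # avoid division by zero
        return particle

    # Two bins, 10 ug/m3 each, with C* of 1 and 100 ug/m3:
    p = vbs_partition([10.0, 10.0], [1.0, 100.0])
    print(p, sum(p))  # low-volatility bin mostly condensed; C_OA ~ 10 ug/m3
    ```

    Aging chemistry in the 1D-VBS then moves mass down this C* axis (and, in the 2D-VBS, along an O:C axis as well), which is why double counting arises when aging is layered on top of chamber fits that already embed it.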

  7. Refined discrete and empirical horizontal gradients in VLBI analysis

    Science.gov (United States)

    Landskron, Daniel; Böhm, Johannes

    2018-02-01

    Missing or incorrect consideration of the azimuthal asymmetry of troposphere delays is a considerable error source in space geodetic techniques such as Global Navigation Satellite Systems (GNSS) or Very Long Baseline Interferometry (VLBI). So-called horizontal troposphere gradients are generally utilized for modeling such azimuthal variations and are particularly required for observations at low elevation angles. Apart from estimating the gradients within the data analysis, which has become common practice in space geodetic techniques, there is also the possibility of determining the gradients beforehand from data sources other than the actual observations. Using ray-tracing through Numerical Weather Models (NWMs), we determined discrete gradient values, referred to as GRAD, for VLBI observations, based on the standard gradient model by Chen and Herring (J Geophys Res 102(B9):20489-20502, 1997. https://doi.org/10.1029/97JB01739) and also on new, higher-order gradient models. These gradients are produced on the same data basis as the Vienna Mapping Functions 3 (VMF3) (Landskron and Böhm in J Geod, 2017. https://doi.org/10.1007/s00190-017-1066-2), so they can also be regarded as the VMF3 gradients, as they are fully consistent with each other. VLBI analyses with the Vienna VLBI and Satellite Software (VieVS) make it evident that baseline length repeatabilities (BLRs) improve on average by 5% when using the a priori gradients GRAD instead of estimating the gradients. The reason for this improvement is that gradient estimation yields poor results for VLBI sessions with a small number of observations, while the GRAD a priori gradients are unaffected by this. We also developed a new empirical gradient model applicable to any time and location on Earth, which is included in the Global Pressure and Temperature 3 (GPT3) model. Although being able to describe only the systematic component of azimuthal asymmetry and no short-term variations at all, even these
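    The standard Chen and Herring (1997) model referenced above writes the azimuthal delay correction as m_g(e) · [G_N cos(a) + G_E sin(a)], with the gradient mapping function m_g(e) = 1/(sin e · tan e + C). A small sketch of that form; C = 0.0032 is the value commonly quoted for the hydrostatic part, and the gradient values in the example are invented:

    ```python
    import math

    def gradient_delay(azimuth_deg, elevation_deg, G_N, G_E, C=0.0032):
        """Azimuthal asymmetry of the slant delay after Chen & Herring (1997):
        m_g(e) * (G_N * cos(a) + G_E * sin(a)), in the units of G_N and G_E."""
        a = math.radians(azimuth_deg)
        e = math.radians(elevation_deg)
        m_g = 1.0 / (math.sin(e) * math.tan(e) + C)
        return m_g * (G_N * math.cos(a) + G_E * math.sin(a))

    # Hypothetical 1 mm north gradient, observed due north at 5 degrees elevation:
    print(gradient_delay(0.0, 5.0, G_N=1.0, G_E=0.0))  # ~92 mm slant contribution
    ```

    The steep growth of m_g toward the horizon is why the abstract stresses that gradients matter most for low-elevation observations; the higher-order GRAD models extend this first-order cos/sin expansion with additional azimuthal terms.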

  8. Empirical evaluation methods in computer vision

    CERN Document Server

    Christensen, Henrik I

    2002-01-01

    This book provides comprehensive coverage of methods for the empirical evaluation of computer vision techniques. The practical use of computer vision requires empirical evaluation to ensure that the overall system has a guaranteed performance. The book contains articles that cover the design of experiments for evaluation, range image segmentation, the evaluation of face recognition and diffusion methods, image matching using correlation methods, and the performance of medical image processing algorithms.

  9. Empirical direction in design and analysis

    CERN Document Server

    Anderson, Norman H

    2001-01-01

    The goal of Norman H. Anderson's new book is to help students develop skills of scientific inference. To accomplish this he organized the book around the "Experimental Pyramid"--six levels that represent a hierarchy of considerations in empirical investigation--conceptual framework, phenomena, behavior, measurement, design, and statistical inference. To facilitate conceptual and empirical understanding, Anderson de-emphasizes computational formulas and null hypothesis testing. Other features include: *emphasis on visual inspection as a basic skill in experimental analysis to help student

  10. Relative performance of empirical and physical models in assessing the seasonal and annual glacier surface mass balance of Saint-Sorlin Glacier (French Alps)

    Science.gov (United States)

    Réveillet, Marion; Six, Delphine; Vincent, Christian; Rabatel, Antoine; Dumont, Marie; Lafaysse, Matthieu; Morin, Samuel; Vionnet, Vincent; Litt, Maxime

    2018-04-01

    This study focuses on simulations of the seasonal and annual surface mass balance (SMB) of Saint-Sorlin Glacier (French Alps) for the period 1996-2015 using the detailed SURFEX/ISBA-Crocus snowpack model. The model is forced by SAFRAN meteorological reanalysis data, adjusted with automatic weather station (AWS) measurements to ensure that simulations of all the energy balance components, in particular turbulent fluxes, are accurately represented with respect to the measured energy balance. Results indicate good model performance for the simulation of summer SMB when using meteorological forcing adjusted with in situ measurements. Model performance, however, strongly decreases without in situ meteorological measurements. The sensitivity of the model to meteorological forcing indicates a strong sensitivity to wind speed, higher than that to ice albedo. Compared to an empirical approach, the model exhibited better performance for simulations of snow and firn melting in the accumulation area and similar performance in the ablation area when forced with meteorological data adjusted with nearby AWS measurements. When such measurements were not available close to the glacier, the empirical model performed better. Our results suggest that simulations of the evolution of future mass balance using an energy balance model require very accurate meteorological data. Given the uncertainties in the temporal evolution of the relevant meteorological variables and glacier surface properties in the future, empirical approaches based on temperature and precipitation could be more appropriate for simulations of glaciers in the future.
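
The "empirical approaches based on temperature and precipitation" that this record contrasts with full energy-balance modelling are typically temperature-index (positive degree-day) schemes. A minimal sketch, with a purely illustrative degree-day factor (not a value from the Saint-Sorlin study):

```python
def degree_day_melt(daily_temps_c, ddf_mm_per_deg_day=6.0, threshold_c=0.0):
    """Classical temperature-index (positive degree-day) melt model:
    melt [mm w.e.] = DDF * sum of daily temperatures above a threshold.
    The DDF of 6.0 mm/(deg C day) is illustrative only."""
    pdd = sum(max(t - threshold_c, 0.0) for t in daily_temps_c)
    return ddf_mm_per_deg_day * pdd

# Four days of air temperature, 5.0 positive degree-days in total
melt = degree_day_melt([-2.0, 1.5, 3.0, 0.5])
```

Such a model needs only air temperature as forcing, which is why the authors consider empirical approaches more robust when the accurate meteorological data required by an energy-balance model are unavailable.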

  11. Relative performance of empirical and physical models in assessing the seasonal and annual glacier surface mass balance of Saint-Sorlin Glacier (French Alps

    Directory of Open Access Journals (Sweden)

    M. Réveillet

    2018-04-01

    Full Text Available This study focuses on simulations of the seasonal and annual surface mass balance (SMB) of Saint-Sorlin Glacier (French Alps) for the period 1996–2015 using the detailed SURFEX/ISBA-Crocus snowpack model. The model is forced by SAFRAN meteorological reanalysis data, adjusted with automatic weather station (AWS) measurements to ensure that simulations of all the energy balance components, in particular turbulent fluxes, are accurately represented with respect to the measured energy balance. Results indicate good model performance for the simulation of summer SMB when using meteorological forcing adjusted with in situ measurements. Model performance, however, strongly decreases without in situ meteorological measurements. The sensitivity of the model to meteorological forcing indicates a strong sensitivity to wind speed, higher than the sensitivity to ice albedo. Compared to an empirical approach, the model exhibited better performance for simulations of snow and firn melting in the accumulation area and similar performance in the ablation area when forced with meteorological data adjusted with nearby AWS measurements. When such measurements were not available close to the glacier, the empirical model performed better. Our results suggest that simulations of the evolution of future mass balance using an energy balance model require very accurate meteorological data. Given the uncertainties in the temporal evolution of the relevant meteorological variables and glacier surface properties in the future, empirical approaches based on temperature and precipitation could be more appropriate for simulations of glaciers in the future.

  12. The Impact of an Epidemic Outbreak on Consumer Expenditures: An Empirical Assessment for MERS Korea

    Directory of Open Access Journals (Sweden)

    Hojin Jung

    2016-05-01

    Full Text Available In this paper, we investigate the effect of an epidemic outbreak on consumer expenditures. Drawing on scanner panel data on consumers’ debit and credit card transactions, we present empirical evidence that outbreaks cause considerable disruption in total consumer expenditures, with significant heterogeneity across categories. Our findings strongly imply that customers alter their behaviors to reduce the risk of infection. The estimated effect of an epidemic outbreak is qualitatively different from that of other macroeconomic factors. The implications of this research provide important guidance for policy interventions and marketing decisions aimed at sustaining economic growth.

  13. Subtypes of batterers in treatment: empirical support for a distinction between type I, type II and type III.

    Directory of Open Access Journals (Sweden)

    José Luis Graña

    Full Text Available This study explores the existence of different types of batterers in a sample of 266 men who had been court-referred for intimate partner violence. The data collected in the assessment, which were used to perform a hierarchical and a two-step cluster analysis, fall into three areas: aggression towards the partner, general aggression, and presence of psychopathology and personality traits; more specifically, alcohol use, borderline and antisocial personality traits, psychopathy traits, state and trait anger, anger expression and control, anger, hostility, and, finally, impulsivity. The results show a typology consisting of three types of batterers on the basis of violence level and psychopathology: low (65%), moderate (27.8%), and high (7.1%). This study provides empirical support for the development of batterer typologies. These typologies will help achieve early detection of the different types of batterers, allowing us to tailor interventions to the needs of each type.

  14. Symbiotic empirical ethics: a practical methodology.

    Science.gov (United States)

    Frith, Lucy

    2012-05-01

    Like any discipline, bioethics is a developing field of academic inquiry; and recent trends in scholarship have been towards more engagement with empirical research. This 'empirical turn' has provoked extensive debate over how such 'descriptive' research carried out in the social sciences contributes to the distinctively normative aspect of bioethics. This paper will address this issue by developing a practical research methodology for the inclusion of data from social science studies into ethical deliberation. This methodology will be based on a naturalistic conception of ethical theory that sees practice as informing theory just as theory informs practice - the two are symbiotically related. From this engagement with practice, the ways that such theories need to be extended and developed can be determined. This is a practical methodology for integrating theory and practice that can be used in empirical studies, one that uses ethical theory both to explore the data and to draw normative conclusions. © 2010 Blackwell Publishing Ltd.

  15. Reframing Serial Murder Within Empirical Research.

    Science.gov (United States)

    Gurian, Elizabeth A

    2017-04-01

    Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.

  16. Optimized Basis Sets for the Environment in the Domain-Specific Basis Set Approach of the Incremental Scheme.

    Science.gov (United States)

    Anacker, Tony; Hill, J Grant; Friedrich, Joachim

    2016-04-21

    Minimal basis sets, denoted DSBSenv, based on the segmented basis sets of Ahlrichs and co-workers have been developed for use as environmental basis sets for the domain-specific basis set (DSBS) incremental scheme with the aim of decreasing the CPU requirements of the incremental scheme. The use of these minimal basis sets within explicitly correlated (F12) methods has been enabled by the optimization of matching auxiliary basis sets for use in density fitting of two-electron integrals and resolution of the identity. The accuracy of these auxiliary sets has been validated by calculations on a test set containing small- to medium-sized molecules. The errors due to density fitting are about 2-4 orders of magnitude smaller than the basis set incompleteness error of the DSBSenv orbital basis sets. Additional reductions in computational cost have been tested with the reduced DSBSenv basis sets, in which the highest angular momentum functions of the DSBSenv auxiliary basis sets have been removed. The optimized and reduced basis sets are used in the framework of the domain-specific basis set of the incremental scheme to decrease the computation time without significant loss of accuracy. The computation times and accuracy of the previously used environmental basis and that optimized in this work have been validated with a test set of medium- to large-sized systems. The optimized and reduced DSBSenv basis sets decrease the CPU time by about 15.4% and 19.4% compared with the old environmental basis and retain the accuracy in the absolute energy with standard deviations of 0.99 and 1.06 kJ/mol, respectively.

  17. Empirical knowledge in legislation and regulation : A decision making perspective

    NARCIS (Netherlands)

    Trautmann, S.T.

    2013-01-01

    This commentary considers the pros and cons of the empirical approach to legislation from the vantage point of empirical decision making research. It focuses on methodological aspects that are typically not considered by legal scholars. It points out weaknesses in the empirical approach that are

  18. Mechanics of magnetic fluid column in strong magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Polunin, V.M.; Ryapolov, P.A., E-mail: r-piter@yandex.ru; Platonov, V.B.

    2017-06-01

    Elastic and magnetic properties of magnetic fluid confined by ponderomotive force in a tube fixed in horizontal position are considered. The system is placed in a strong magnetic field under the influence of external static and dynamic perturbations. An experimental setup has been developed. A theoretical basis of the processes of magnetic colloid elastic deformation has been proposed. The values of the static ponderomotive elasticity coefficient and the elasticity coefficient under dynamic action are experimentally determined. The calculations of the saturation magnetization for two magnetic fluid samples, carried out according to the equation containing the dynamic elasticity coefficient, are in good agreement with the experimental magnetization curve. The described method is of interest when studying magnetophoresis and aggregation of nanoparticles in magnetic colloids.

  19. Essays in empirical microeconomics

    NARCIS (Netherlands)

    Péter, A.N.

    2016-01-01

    The empirical studies in this thesis investigate various factors that could affect individuals' labor market, family formation and educational outcomes. Chapter 2 focuses on scheduling as a potential determinant of individuals' productivity. Chapter 3 looks at the role of a family factor on

  20. A theoretical basis for the analysis of multiversion software subject to coincident errors

    Science.gov (United States)

    Eckhardt, D. E., Jr.; Lee, L. D.

    1985-01-01

    Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
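
Why coincident errors matter for redundant software can be illustrated with a toy common-cause failure model for 3-version majority voting. This is a deliberate simplification, not the intensity-function framework of Eckhardt and Lee; the mixture parameter q (assumed q <= p) is hypothetical:

```python
from math import comb

def majority_fail_independent(p, n=3):
    """P(majority of n independent versions fail), each with failure prob p."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

def majority_fail_common_cause(p, q, n=3):
    """Toy common-cause mixture: with prob q an input triggers a design fault
    shared by all versions (a coincident error); otherwise versions fail
    independently with residual prob (p - q)/(1 - q), keeping the marginal
    per-version failure probability equal to p. Assumes 0 <= q <= p < 1."""
    r = (p - q) / (1 - q)
    return q + (1 - q) * majority_fail_independent(r, n)
```

With p = 1% per version, independence predicts a 3-version majority-vote failure probability near 3e-4, while even a small common-cause component (q = 0.5%) pushes the system failure probability above 5e-3 — the kind of effect the intensity of coincident errors is introduced to describe.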

  1. Strong ground motion prediction applying dynamic rupture simulations for Beppu-Haneyama Active Fault Zone, southwestern Japan

    Science.gov (United States)

    Yoshimi, M.; Matsushima, S.; Ando, R.; Miyake, H.; Imanishi, K.; Hayashida, T.; Takenaka, H.; Suzuki, H.; Matsuyama, H.

    2017-12-01

    We conducted strong ground motion prediction for the active Beppu-Haneyama Fault Zone (BHFZ), Kyushu island, southwestern Japan. Since the BHFZ runs through Oita and Beppu cities, strong ground motion as well as fault displacement may severely affect the cities. We constructed a 3-D velocity structure of a sedimentary basin, the Beppu Bay basin, where the fault zone runs through and Oita and Beppu cities are located. The minimum shear wave velocity of the 3-D model is 500 m/s. Additional 1-D structures are modeled for sites with softer sediment in the Holocene plain area. We observed, collected, and compiled data from microtremor surveys, ground motion observations, boreholes, etc., including phase velocities and H/V ratios. The finer structure of the Oita Plain is modeled as a 250-m-mesh model with an empirical relation among N-value, lithology, depth, and Vs using borehole data, and then validated with the phase velocity data obtained by the dense microtremor array observation (Yoshimi et al., 2016). Synthetic ground motion has been calculated with a hybrid technique composed of a stochastic Green's function method (for HF waves), a 3-D finite difference method (for LF waves), and a 1-D amplification calculation. The fault geometry has been determined based on reflection surveys and the active fault map. The rake angles are calculated with a dynamic rupture simulation considering three fault segments under a stress field estimated from source mechanisms of earthquakes around the faults (Ando et al., JpGU-AGU2017). Fault parameters such as the average stress drop and the size of asperities are determined based on an empirical relation proposed by Irikura and Miyake (2001). As a result, strong ground motion exceeding 100 cm/s is predicted on the hanging wall side of the Oita Plain. This work is supported by the Comprehensive Research on the Beppu-Haneyama Fault Zone funded by the Ministry of Education, Culture, Sports, Science, and Technology (MEXT), Japan.

  2. Empires, Exceptions, and Anglo-Saxons: Race and Rule between the British and United States Empires, 1880-1910. Teaching the JAH.

    Science.gov (United States)

    OAH Magazine of History, 2002

    2002-01-01

    Summarizes a teaching document that is part of "Teaching the JAH" (Journal of American History) which corresponds to the article, "Empires, Exceptions, and Anglo-Saxons: Race and Rule between the British and United States Empires, 1880-1910" (Paul A. Kramer). Provides the Web site address for the complete installment. (CMK)

  3. Three-dimensional photodissociation in strong laser fields: Memory-kernel effective-mode expansion

    International Nuclear Information System (INIS)

    Li Xuan; Thanopulos, Ioannis; Shapiro, Moshe

    2011-01-01

    We introduce a method for the efficient computation of non-Markovian quantum dynamics for strong (and time-dependent) system-bath interactions. The past history of the system dynamics is incorporated by expanding the memory kernel in exponential functions, thereby transforming in an exact fashion the non-Markovian integrodifferential equations into a (larger) set of "effective modes" differential equations (EMDE). We have devised a method which easily diagonalizes the EMDE, thereby allowing for the efficient construction of an adiabatic basis and the fast propagation of the EMDE in time. We have applied this method to the three-dimensional photodissociation of the H2+ molecule by strong laser fields. Our calculations properly include resonance-Raman scattering via the continuum, resulting in extensive rotational and vibrational excitations. The calculated final kinetic and angular distributions of the photofragments are in overall excellent agreement with experiments, both when transform-limited pulses and when chirped pulses are used.

  4. Advancing Empirical Scholarship to Further Develop Evaluation Theory and Practice

    Science.gov (United States)

    Christie, Christina A.

    2011-01-01

    Good theory development is grounded in empirical inquiry. In the context of educational evaluation, the development of empirically grounded theory has important benefits for the field and the practitioner. In particular, a shift to empirically derived theory will assist in advancing more systematic and contextually relevant evaluation practice, as…

  5. Empirical scholarship in contract law: possibilities and pitfalls

    Directory of Open Access Journals (Sweden)

    Russell Korobkin

    2015-01-01

    Full Text Available Professor Korobkin examines and analyzes empirical contract law scholarship over the last fifteen years in an attempt to guide scholars concerning how empiricism can be used in and enhance the study of contract law. After defining the parameters of the study, Professor Korobkin categorizes empirical contract law scholarship by both the source of data and main purpose of the investigation. He then describes and analyzes three types of criticisms that can be made of empirical scholarship, explains how these criticisms pertain to contract law scholarship, and considers what steps researchers can take to minimize the force of such criticisms.

  6. An empirical assessment of near-source strong ground motion for a 6.6 mb (7.5 MS) earthquake in the Eastern United States

    International Nuclear Information System (INIS)

    Campbell, Kenneth W.

    1984-06-01

    To help assess the impact of the current U.S. Geological Survey position on the seismic safety of nuclear power plants in the Eastern United States (EUS), several techniques for estimating near-source strong ground motion for a Charleston-size earthquake were evaluated. The techniques assessed for estimating the near-source strong ground motion for a 6.6 mb (7.5 MS) earthquake in the Eastern United States are methods based on site-specific analyses, semi-theoretical scaling techniques, and intensity-based estimates. The first involves the statistical analysis of ground motion records from earthquakes and recording stations having the same general characteristics (earthquakes with magnitudes of 7.5 MS or larger, epicentral distances of 25 km or less, and sites of either soil or rock). Some recommendations for scaling of the bias, resulting primarily from an inadequate sample of near-source recordings from earthquakes of large magnitude, are discussed. The second technique evaluated requires that semi-theoretical estimates of peak ground motion parameters for a 6.6 mb (7.5 MS) earthquake be obtained from scaling relations. Each relation uses a theoretical expression between peak acceleration, magnitude, and distance together with available strong motion data (the majority coming from California) to develop a scaling relation appropriate for the Eastern United States. None of the existing ground motion models for the EUS include the potential effects of source or site characteristics. Adjustments to account for fault mechanisms, site topography, site geology, and the size and embedment of buildings are discussed. The final approach used relations between strong ground motion parameters and Modified Mercalli Intensity in conjunction with two methods to estimate peak parameters for a 6.6 mb (7.5 MS) earthquake. As with the other techniques, adjustment of peak acceleration estimates is discussed. Each method approaches the problem differently.
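
Scaling relations of the kind this record describes typically take a functional form such as ln(PGA) = c1 + c2·M - c3·ln(R + c4·exp(c5·M)), with coefficients fitted by regression against strong-motion data. A sketch with placeholder coefficients (purely illustrative, not Campbell's EUS relations):

```python
import math

def pga_attenuation(mag, dist_km, c=(-4.0, 0.9, 1.7, 0.06, 0.7)):
    """Generic near-source attenuation form
        ln(PGA) = c1 + c2*M - c3*ln(R + c4*exp(c5*M)),
    returning PGA in g. The coefficients c are illustrative placeholders
    only; real relations are regressed from strong-motion records."""
    c1, c2, c3, c4, c5 = c
    return math.exp(c1 + c2 * mag - c3 * math.log(dist_km + c4 * math.exp(c5 * mag)))

# Illustrative: a magnitude-7.5 event observed at 25 km
pga = pga_attenuation(7.5, 25.0)
```

Adjustments for fault mechanism, site topography and geology, or building embedment, as discussed in the record, enter such relations as additional terms or multiplicative factors on top of this magnitude-distance core.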

  7. Cellular and molecular basis of chronic constipation: Taking the functional/idiopathic label out

    OpenAIRE

    Bassotti, Gabrio; Villanacci, Vincenzo; Crețoiu, Dragos; Crețoiu, Sanda Maria; Becheanu, Gabriel

    2013-01-01

    In recent years, the improvement of technology and the increase in knowledge have shifted several strongly held paradigms. This is particularly true in gastroenterology, and specifically in the field of the so-called “functional” or “idiopathic” disease, where conditions thought for decades to be based mainly on alterations of visceral perception or aberrant psychosomatic mechanisms have, in fact, been traced back to an organic basis (or, at the very least, have shown one or more demonstrable a...

  8. Mixed and Complex Mixed Migration during Armed Conflict: Multidimensional Empirical Evidence from Nepal.

    Science.gov (United States)

    Williams, Nathalie E

    Historically, legal, policy, and academic communities largely ascribed to a dichotomy between forced and voluntary migration, creating a black and white vision that was convenient for legal and policy purposes. More recently, discussions have begun addressing the possibility of mixed migration, acknowledging that there is likely a wide continuum between forced and voluntary, and most migrants likely move with some amount of compulsion and some volition, even during armed conflict. While the mixed migration hypothesis is well-received, empirical evidence is disparate and somewhat blunt at this point. In this article, I contribute a direct theoretical and causal pathway discussion of mixed migration. I also propose the complex mixed migration hypothesis, which argues that not only do non-conflict related factors influence migration during conflict, but they do so differently than during periods of relative peace. I empirically test both hypotheses in the context of the recent armed conflict in Nepal. Using detailed survey data and event history models, results provide strong evidence for both mixed migration and complex mixed migration during conflict hypotheses. These hypotheses and evidence suggest that armed conflict might have substantial impacts on long-term population growth and change, with significant relevance in both academic and policy spheres.

  9. Design Load Basis for Offshore Wind turbines

    DEFF Research Database (Denmark)

    Natarajan, Anand; Hansen, Morten Hartvig; Wang, Shaofeng

    2016-01-01

    DTU Wind Energy does not design and manufacture wind turbines and therefore does not need a Design Load Basis (DLB) that is accepted by a certification body. However, to assess the load consequences of innovative features and devices added to existing offshore turbine concepts or new offshore...... turbine concept developed in our research, it is useful to have a full DLB that follows the current design standard and is representative of a general DLB used by the industry. It will set a standard for the offshore wind turbine design load evaluations performed at DTU Wind Energy, which is aligned...... with the challenges faced by the industry and therefore ensures that our research continues to have a strong foundation in this interaction. Furthermore, the use of a full DLB that follows the current standard can improve and increase the feedback from the research at DTU Wind Energy to the international

  10. Consistent constitutive modeling of metallic target penetration using empirical, analytical, and numerical penetration models

    Directory of Open Access Journals (Sweden)

    John (Jack) P. Riegel III

    2016-04-01

    Full Text Available Historically, there has been little correlation between the material properties used in (1) empirical formulae, (2) analytical formulations, and (3) numerical models. The various regressions and models may each provide excellent agreement for the depth of penetration into semi-infinite targets. But the input parameters for the empirically based procedures may have little in common with either the analytical model or the numerical model. This paper builds on previous work by Riegel and Anderson (2014) to show how the Effective Flow Stress (EFS) strength model, based on empirical data, can be used as the average flow stress in the analytical Walker–Anderson Penetration model (WAPEN) (Anderson and Walker, 1991) and how the same value may be utilized as an effective von Mises yield strength in numerical hydrocode simulations to predict the depth of penetration for eroding projectiles at impact velocities in the mechanical response regime of the materials. The method has the benefit of allowing the three techniques (empirical, analytical, and numerical) to work in tandem. The empirical method can be used for many shot line calculations, but more advanced analytical or numerical models can be employed when necessary to address specific geometries such as edge effects or layering that are not treated by the simpler methods. Developing complete constitutive relationships for a material can be costly. If the only concern is depth of penetration, such a level of detail may not be required. The effective flow stress can be determined from a small set of depth of penetration experiments in many cases, especially for long penetrators such as the L/D = 10 ones considered here, making it a very practical approach. In the process of performing this effort, the authors considered numerical simulations by other researchers based on the same set of experimental data that the authors used for their empirical and analytical assessment. The goals were to establish a
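
How a single strength value enters an analytical penetration model can be illustrated with the steady-state modified Bernoulli balance used in Tate-style eroding-rod models. This is a simplified stand-in, not the Walker–Anderson (WAPEN) formulation, and the material values below are illustrative, not from this paper:

```python
import math

def tate_interface_velocity(v, rho_p, rho_t, y_p, r_t):
    """Steady-state penetration (interface) velocity u from the modified
    Bernoulli balance of the Tate model:
        0.5*rho_p*(v - u)**2 + Y_p = 0.5*rho_t*u**2 + R_t
    v: rod velocity [m/s]; rho_p, rho_t: penetrator/target densities [kg/m^3];
    y_p, r_t: penetrator strength and target resistance [Pa]."""
    if abs(rho_p - rho_t) < 1e-12:
        # Equal densities: the quadratic terms in u cancel, leaving a linear equation
        return (0.5 * rho_p * v**2 + y_p - r_t) / (rho_p * v)
    a = 0.5 * (rho_p - rho_t)
    b = -rho_p * v
    c = 0.5 * rho_p * v**2 + y_p - r_t
    disc = math.sqrt(b * b - 4 * a * c)
    roots = [(-b - disc) / (2 * a), (-b + disc) / (2 * a)]
    return min(r for r in roots if 0.0 < r < v)  # physical root lies in (0, v)

# Illustrative: tungsten-alloy rod into steel at 1500 m/s (strengths are placeholders)
u = tate_interface_velocity(1500.0, 19300.0, 7850.0, 1.2e9, 4.5e9)
```

The single flow-stress-like quantities Y_p and R_t are the levers here: setting both to zero recovers the purely hydrodynamic interface velocity v / (1 + sqrt(rho_t/rho_p)), and substituting an empirically determined effective flow stress changes u and hence the predicted penetration depth, which is the general role the EFS value plays across the empirical, analytical, and numerical routes.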

  11. Empirical research through design

    NARCIS (Netherlands)

    Keyson, D.V.; Bruns, M.

    2009-01-01

    This paper describes the empirical research through design method (ERDM), which differs from current approaches to research through design by enforcing the need for the designer, after a series of pilot prototype based studies, to a-priori develop a number of testable interaction design hypothesis

  12. Empirically sampling Universal Dependencies

    DEFF Research Database (Denmark)

    Schluter, Natalie; Agic, Zeljko

    2017-01-01

    Universal Dependencies incur a high cost in computation for unbiased system development. We propose a 100% empirically chosen small subset of UD languages for efficient parsing system development. The technique used is based on measurements of model capacity globally. We show that the diversity o...

  13. Evaluation of empirical atmospheric diffusion data

    International Nuclear Information System (INIS)

    Horst, T.W.; Doran, J.C.; Nickola, P.W.

    1979-10-01

    A study has been made of atmospheric diffusion over level, homogeneous terrain of contaminants released from non-buoyant point sources up to 100 m in height. Current theories of diffusion are compared to empirical diffusion data, and specific dispersion estimation techniques are recommended which can be implemented with the on-site meteorological instrumentation required by the Nuclear Regulatory Commission. A comparison of both the recommended diffusion model and the NRC diffusion model with the empirical data demonstrates that the predictions of the recommended model have both smaller scatter and less bias, particularly for ground-level sources
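
Dispersion estimation techniques of the kind evaluated in this record are generally built on the Gaussian plume solution for a continuous point source. A minimal sketch in its standard textbook form, with illustrative numbers (not the report's recommended model):

```python
import math

def plume_concentration(q, u, sigma_y, sigma_z, y, z, h):
    """Ground-reflecting Gaussian plume concentration [g/m^3].
    q: emission rate [g/s]; u: wind speed [m/s]; h: effective release height [m];
    sigma_y, sigma_z: dispersion parameters [m] at the downwind distance of
    interest; y: crosswind offset [m]; z: receptor height [m]."""
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2.0 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2.0 * sigma_z**2)))  # ground reflection
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative: centerline ground-level concentration downwind of a 50 m source
c = plume_concentration(q=10.0, u=5.0, sigma_y=80.0, sigma_z=40.0, y=0.0, z=0.0, h=50.0)
```

The competing techniques compared in such studies mainly differ in how sigma_y and sigma_z are tied to on-site meteorological measurements (e.g., via stability classification), not in this underlying plume form.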

  14. Who supported the Deutsche Bundesbank? An empirical investigation

    NARCIS (Netherlands)

    Maier, P; Knaap, T

    2002-01-01

    The relevance of public support for monetary policy has largely been overlooked in the empirical Central Bank literature. We have constructed a new indicator for the support of the German Bundesbank and present descriptive and empirical evidence. We find that major German interest groups were quite

  15. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions. Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-18

    This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much-needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. This document, Volume II, contains Appendices 2, 3, 5, 6, and 7 covering the following topics: Eastern North American Empirical Ground Motion Data; Examination of Variance of Seismographic Network Data; Soil Amplification and Vertical-to-Horizontal Ratios from Analysis of Strong Motion Data From Active Tectonic Regions; Revision and Calibration of Ou and Herrmann Method; Generalized Ray Procedure for Modeling Ground Motion Attenuation; Crustal Models for Velocity Regionalization; Depth Distribution Models; Development of Generic Site Effects Model; Validation and Comparison of One-Dimensional Site Response Methodologies; Plots of Amplification Factors; Assessment of Coupling Between Vertical & Horizontal Motions in Nonlinear Site Response Analysis; and Modeling of Dynamic Soil Properties.

  16. Inelastic electron scattering influence on the strong coupling oxide superconductors

    International Nuclear Information System (INIS)

    Gabovich, A.M.; Voitenko, A.I.

    1995-01-01

The superconducting order parameter Δ and energy gap Δ_g are calculated taking into account the pair-breaking inelastic quasiparticle scattering by thermal Bose excitations, e.g., phonons. The treatment is self-consistent because the scattering amplitude depends on Δ. The superconducting transition for any strength of the inelastic scattering is a first-order phase transition, and the dependences Δ(T) and Δ_g(T) tend to a rectangular curve, in good agreement with experiment for high-Tc oxides. On the basis of the developed theory, the nuclear spin-lattice relaxation rate R_s in the superconducting state is calculated. The Hebel-Slichter peak in R_s(T) is shown to disappear for strong enough inelastic scattering.

  17. Goodness! The empirical turn in health care ethics

    NARCIS (Netherlands)

    Willems, D.; Pols, J.

    2010-01-01

    This paper is intended to encourage scholars to submit papers for a symposium and the next special issue of Medische Antropologie which will be on empirical studies of normative questions. We describe the ‘empirical turn’ in medical ethics. Medical ethics and bioethics in general have witnessed a

  18. Learning to Read Empirical Articles in General Psychology

    Science.gov (United States)

    Sego, Sandra A.; Stuart, Anne E.

    2016-01-01

    Many students, particularly underprepared students, struggle to identify the essential information in empirical articles. We describe a set of assignments for instructing general psychology students to dissect the structure of such articles. Students in General Psychology I read empirical articles and answered a set of general, factual questions…

  19. Principles Involving Marketing Policies: An Empirical Assessment

    OpenAIRE

    JS Armstrong; Randall L. Schultz

    2005-01-01

We examined nine marketing textbooks, published since 1927, to see if they contained useful marketing principles. Four doctoral students found 566 normative statements about pricing, product, place, or promotion in these texts. None of these statements were supported by empirical evidence. Four raters agreed on only twenty of these 566 statements as providing meaningful principles. Twenty marketing professors rated whether the twenty meaningful principles were correct, supported by empirical...

  20. Empirical microeconomics action functionals

    Science.gov (United States)

    Baaquie, Belal E.; Du, Xin; Tanputraman, Winson

    2015-06-01

A statistical generalization of microeconomics has been made in Baaquie (2013), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is modeled by an action functional, and the focus of this paper is to empirically determine the action functionals for different commodities. The correlation functions of the model are defined using a Feynman path integral. The model is calibrated using the unequal time correlation of the market commodity prices as well as their cubic and quartic moments using a perturbation expansion. The consistency of the perturbation expansion is verified by a numerical evaluation of the path integral. Nine commodities drawn from the energy, metal and grain sectors are studied and their market behavior is described by the model to an accuracy of over 90% using only six parameters. The paper empirically establishes the existence of the action functional for commodity prices that was postulated to exist in Baaquie (2013).

  1. Development and evaluation of an empirical diurnal sea surface temperature model

    Science.gov (United States)

    Weihs, R. R.; Bourassa, M. A.

    2013-12-01

An innovative method is developed to determine the diurnal heating amplitude of sea surface temperatures (SSTs) using observations of high-quality satellite SST measurements and NWP atmospheric meteorological data. The diurnal cycle results from heating that develops at the surface of the ocean from low mechanical or shear-produced turbulence and large solar radiation absorption. During these typically calm weather conditions, the absorption of solar radiation causes heating of the upper few meters of the ocean, which become buoyantly stable; this heating causes a temperature differential between the surface and the mixed [or bulk] layer on the order of a few degrees. It has been shown that capturing the diurnal cycle is important for a variety of applications, including surface heat flux estimates, which have been shown to be underestimated when neglecting diurnal warming, and satellite and buoy calibrations, which can be complicated because of the heating differential. An empirical algorithm using a pre-dawn sea surface temperature, peak solar radiation, and accumulated wind stress is used to estimate the cycle. The empirical algorithm is derived from a multistep process in which SSTs from MSG's SEVIRI SST experimental hourly data set are combined with hourly wind stress fields derived from a bulk flux algorithm. Inputs for the flux model are taken from NASA's MERRA reanalysis product. NWP inputs are necessary because the inputs need to incorporate diurnal and air-sea interactive processes, which are vital to the ocean surface dynamics, with a high enough temporal resolution. The MERRA winds are adjusted with CCMP winds to obtain more realistic spatial and variance characteristics, and the other atmospheric inputs (air temperature, specific humidity) are further corrected on the basis of in situ comparisons. The SSTs are fitted to a Gaussian curve (using one or two peaks), forming a set of coefficients used to fit the data.
The coefficient data are combined with
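
The Gaussian fitting step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: a single-peak Gaussian dSST(t) = A·exp(-(t - t_peak)² / (2w²)) is fitted to hourly diurnal SST anomalies, with t_peak and w swept over a small grid (the grid ranges are assumptions) and the amplitude A obtained in closed form by least squares.

```python
import math

def fit_gaussian_peak(hours, dsst):
    """Return (A, t_peak, w) minimizing the sum of squared residuals."""
    best = None
    for tp10 in range(120, 180, 5):                 # peak hour 12.0..17.5 (local)
        for w10 in range(10, 50, 5):                # peak width 1.0..4.5 h
            t_peak, w = tp10 / 10.0, w10 / 10.0
            g = [math.exp(-(t - t_peak) ** 2 / (2.0 * w * w)) for t in hours]
            gg = sum(x * x for x in g)
            a = sum(x * y for x, y in zip(g, dsst)) / gg   # closed-form amplitude
            sse = sum((y - a * x) ** 2 for x, y in zip(g, dsst))
            if best is None or sse < best[0]:
                best = (sse, a, t_peak, w)
    return best[1], best[2], best[3]

# Synthetic check: a 1.2 K warming peaking at 15:00 local time is recovered
hours = list(range(24))
truth = [1.2 * math.exp(-(t - 15.0) ** 2 / (2.0 * 2.5 ** 2)) for t in hours]
a, t_peak, w = fit_gaussian_peak(hours, truth)
```

A two-peak variant, as mentioned in the abstract, would simply add a second Gaussian term and sweep its parameters the same way.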

  2. Investigating Low-Carbon City: Empirical Study of Shanghai

    Directory of Open Access Journals (Sweden)

    Xuan Yang

    2018-04-01

Full Text Available A low-carbon economy is an inevitable choice for achieving economic and ecological sustainable development. It is of significant importance to analyze a city's low-carbon economy development level scientifically and reasonably. To achieve this goal, we propose an urban low-carbon economic development level evaluation model based on the matter-element extension method. First, we select indicators from the existing indicator system based on past research and experience. Then, a matter-element model is established, with indicator weights, to evaluate a city's low-carbon level; the critical value of each index is determined through the classical domain and the section domain, and the correlation degrees of each single index and of a comprehensive index are calculated. Finally, we analyze the low-carbon economy development status and future development trends according to the analysis results. In this study, we select Shanghai as an empirical case: the results show that Shanghai is a city with a low-carbon level and that there is a trend of further improvement in Shanghai's low-carbon economy, but its low-carbon construction and low-carbon technology investment are relatively low. In summary, this method can provide another angle for evaluating a city's low-carbon economy.
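
The correlation-degree computation at the heart of the matter-element extension method can be sketched as below. This is a hedged illustration of the standard formulas, not the paper's implementation: the domain bounds, weights, and the in-domain normalization by the classical-domain width are common textbook conventions, and the numeric values are invented for demonstration.

```python
def rho(x, a, b):
    """Extension distance of value x from the interval <a, b>."""
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def correlation(x, classical, section):
    """Correlation degree of x with a grade defined by a classical domain
    nested inside a wider section domain."""
    a0, b0 = classical
    a, b = section
    r0, r = rho(x, a0, b0), rho(x, a, b)
    if a0 <= x <= b0:                 # inside the classical domain
        return -r0 / (b0 - a0)
    return r0 / (r - r0)              # outside: graded (negative) membership

def comprehensive(values, classicals, sections, weights):
    """Weighted sum of single-index correlation degrees."""
    return sum(w * correlation(x, c, s)
               for x, c, s, w in zip(values, classicals, sections, weights))

# Invented example: index value 20 inside classical domain (0, 40),
# index value 60 outside it but inside the section domain (0, 100)
k_in = correlation(20, (0, 40), (0, 100))
k_out = correlation(60, (0, 40), (0, 100))
score = comprehensive([20, 60], [(0, 40)] * 2, [(0, 100)] * 2, [0.5, 0.5])
```

A positive correlation degree indicates membership in the grade; the sign flip outside the classical domain is what lets the comprehensive index rank a city among grades.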

  3. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Camara Vincent A. R.

    1998-01-01

Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changes in the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results. It is shown that empirical Bayes reliability functions are in general sensitive to the choice of the loss function, and that the squared error loss does not always yield the best empirical Bayes reliability estimate.
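
The loss-function sensitivity can be illustrated with a toy computation (not taken from the paper): under squared-error loss the Bayes estimate of a reliability R is the posterior mean, while under a logarithmic loss L(a, R) = (ln a − ln R)² it is exp(E[ln R]), the posterior geometric mean. The Beta posterior below is an arbitrary example chosen only to show that the two estimates differ.

```python
import math
import random

random.seed(0)
# Monte Carlo draws from an assumed Beta(8, 2) posterior for reliability R
posterior = [random.betavariate(8, 2) for _ in range(100_000)]

# Bayes estimate under squared-error loss: posterior mean
sq_err_estimate = sum(posterior) / len(posterior)

# Bayes estimate under logarithmic loss: posterior geometric mean
log_loss_estimate = math.exp(sum(map(math.log, posterior)) / len(posterior))
```

Since ln is concave, the geometric mean is always below the arithmetic mean, so the logarithmic loss yields a more conservative reliability estimate here.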

  4. The physiological basis and application of renal radionuclide studies

    International Nuclear Information System (INIS)

    Britton, K.E.

    1983-01-01

    A knowledge of the basic physiology of the kidney is essential for an understanding of the application of radionuclide studies in clinical practice. A knowledge of the physiology of the kidney allows one to develop physiological models that are isomorphic and then apply the appropriate type of data analysis in relationship to these models. In this way mistakes in the type of analysis can be avoided and a strong basis for the interpretation of renal radionuclide studies in health and disease is thereby provided. Methods for measuring total renal function, the contribution of each kidney to total renal function, the presence or absence of obstructive nephropathy and the determination of the relative flows to the cortical and juxtamedullary nephrons are given as examples of this approach. (author)

  5. Population stochasticity, random determination of handedness, and the genetic basis of antisymmetry.

    Science.gov (United States)

    Kamimura, Yoshitaka

    2011-12-07

    Conspicuous lateral asymmetries of organisms are classified into two major categories: antisymmetry (AS), characterized by almost equal frequencies of dextral and sinistral morphs, and directional asymmetry (DA), in which one morph dominates. I compared and characterized two types of genes, both with existing examples, in their roles in the evolutionary transitions between AS and DA for the first time. Handedness genes (HGs) determine the chirality in a strict sense, while randomization genes (RGs) randomize the chirality. A theory predicts that, in an AS population maintained by HGs under negative frequency-dependent selection, RGs harness fluctuation of the morph frequencies as their driving force and thus increase their frequency until half of the population flips the phenotype. These predictions were confirmed by simulations. Consequently, RGs mask the genetic effects of HGs, which provides a possible explanation for the apparent lack of a genetic basis for AS in empirical AS studies. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Sources of Currency Crisis: An Empirical Analysis

    OpenAIRE

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  7. 10 CFR 830.202 - Safety basis.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Safety basis. 830.202 Section 830.202 Energy DEPARTMENT OF ENERGY NUCLEAR SAFETY MANAGEMENT Safety Basis Requirements § 830.202 Safety basis. (a) The contractor responsible for a hazard category 1, 2, or 3 DOE nuclear facility must establish and maintain the safety basis...

  8. Alternation vs. Allomorphic Variation in Old English Word-Formation: Evidence from the Derivational Paradigm of Strong Verbs

    Directory of Open Access Journals (Sweden)

    Urraca Carmen Novo

    2015-01-01

    Full Text Available This article addresses the question of Old English alternations with a view to identifying instances of allomorphic variation attributable to the loss of motivation and the subsequent morphologization of alternations. The focus is on the strong verb and its derivatives, in such a way that the alternations in which the strong verb partakes can be predicted on the basis of phonological principles, whereas allomorphic variation with respect to the strong verb base is unpredictable. Out of 304 derivational paradigms based on strong verbs and comprising 4,853 derivatives, 478 instances have been found of phonologically motivated vocalic alternations. The conclusion is reached that the most frequent alternations are those that have /a/ as source and those with /y/ as target, because /a/ is the point of departure of i-mutation and /y/ its point of arrival. Sixteen instances of allomorphic variation have also been found, of which /e/ ~ /eo/, /e/ ~ /ea/ and /i/ ~ /e/ are relatively frequent.

  9. Note on the hydrodynamic description of thin nematic films: Strong anchoring model

    KAUST Repository

    Lin, Te-Sheng; Cummings, Linda J.; Archer, Andrew J.; Kondic, Lou; Thiele, Uwe

    2013-01-01

    We discuss the long-wave hydrodynamic model for a thin film of nematic liquid crystal in the limit of strong anchoring at the free surface and at the substrate. We rigorously clarify how the elastic energy enters the evolution equation for the film thickness in order to provide a solid basis for further investigation: several conflicting models exist in the literature that predict qualitatively different behaviour. We consolidate the various approaches and show that the long-wave model derived through an asymptotic expansion of the full nemato-hydrodynamic equations with consistent boundary conditions agrees with the model one obtains by employing a thermodynamically motivated gradient dynamics formulation based on an underlying free energy functional. As a result, we find that in the case of strong anchoring the elastic distortion energy is always stabilising. To support the discussion in the main part of the paper, an appendix gives the full derivation of the evolution equation for the film thickness via asymptotic expansion. © 2013 AIP Publishing LLC.
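
The gradient-dynamics formulation referred to above can be sketched generically; this is the standard long-wave form with a mobility Q(h), not the paper's specific functional:

```latex
\partial_t h \;=\; \nabla\cdot\!\left[\,Q(h)\,\nabla\,\frac{\delta F[h]}{\delta h}\right],
\qquad Q(h)=\frac{h^{3}}{3\eta},
```

where h is the film thickness, η the viscosity, and F[h] the underlying free energy functional; in the strong-anchoring limit the elastic distortion energy enters as an additional, stabilizing contribution to F[h].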

  10. Evaluation of empirical atmospheric diffusion data

    Energy Technology Data Exchange (ETDEWEB)

    Horst, T.W.; Doran, J.C.; Nickola, P.W.

    1979-10-01

    A study has been made of atmospheric diffusion over level, homogeneous terrain of contaminants released from non-buoyant point sources up to 100 m in height. Current theories of diffusion are compared to empirical diffusion data, and specific dispersion estimation techniques are recommended which can be implemented with the on-site meteorological instrumentation required by the Nuclear Regulatory Commission. A comparison of both the recommended diffusion model and the NRC diffusion model with the empirical data demonstrates that the predictions of the recommended model have both smaller scatter and less bias, particularly for ground-level sources.
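
A point-source dispersion estimate of the kind compared in the report can be sketched with the textbook Gaussian plume formula for ground-level concentration (with ground reflection). The power-law sigma coefficients below are illustrative placeholders, not the report's recommended stability-dependent values.

```python
import math

def ground_concentration(q, u, x, y, h,
                         ay=0.22, by=0.90, az=0.20, bz=0.76):
    """Ground-level concentration (g/m^3) at downwind distance x (m) and
    crosswind offset y (m), for release rate q (g/s), mean wind speed u (m/s)
    and effective release height h (m). Dispersion widths grow as power laws
    of downwind distance (coefficients are illustrative)."""
    sig_y = ay * x ** by
    sig_z = az * x ** bz
    return (q / (math.pi * u * sig_y * sig_z)
            * math.exp(-y ** 2 / (2.0 * sig_y ** 2))
            * math.exp(-h ** 2 / (2.0 * sig_z ** 2)))

# Illustrative comparison at 1 km downwind: centerline vs offset vs elevated
c_center = ground_concentration(1.0, 5.0, 1000.0, 0.0, 0.0)
c_offset = ground_concentration(1.0, 5.0, 1000.0, 100.0, 0.0)
c_elevated = ground_concentration(1.0, 5.0, 1000.0, 0.0, 50.0)
```

The comparison mirrors the report's setting: concentrations fall off with crosswind offset, and an elevated release (up to 100 m in the study) reduces the ground-level centerline value.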

  11. The effect of empirical potential functions on modeling of amorphous carbon using molecular dynamics method

    International Nuclear Information System (INIS)

    Li, Longqiu; Xu, Ming; Song, Wenping; Ovcharenko, Andrey; Zhang, Guangyu; Jia, Ding

    2013-01-01

Empirical potentials have a strong effect on the hybridization and structure of amorphous carbon and are of great importance in molecular dynamics (MD) simulations. In this work, amorphous carbon at densities ranging from 2.0 to 3.2 g/cm3 was modeled by a liquid quenching method using Tersoff, 2nd REBO, and ReaxFF empirical potentials. The hybridization, structure and radial distribution function G(r) of carbon atoms were analyzed as a function of the three potentials mentioned above. The ReaxFF potential is capable of modeling the change of the structure of amorphous carbon, and MD results are in good agreement with experimental results and density functional theory (DFT) at low densities of 2.6 g/cm3 and below. The 2nd REBO potential can be used when amorphous carbon has a very low density of 2.4 g/cm3 and below. Considering the computational efficiency, the Tersoff potential is recommended to model amorphous carbon at a high density of 2.6 g/cm3 and above. In addition, the influence of the quenching time on the hybridization content obtained with the three potentials is discussed.

  12. Empirical approach to endorsement marketing and consumer fanaticism of telecom firms in Nigeria’s Rivers State

    Directory of Open Access Journals (Sweden)

    Joy E. Akahome

    2017-10-01

    Full Text Available The paper is an empirical investigation of the relationship between endorsement marketing and consumer fanaticism of telecom firms in Rivers State. A sample of 200 customers of selected telecom firms was surveyed and 196 copies of the questionnaire were returned and valid after data collation and cleaning for analysis using Pearson’s Product Moment Correlation with the aid of SPSS version 21.0. Based on findings, the paper concludes that celebrity-product-fit has a strong relationship with consumer fanaticism of telecom firms in Rivers State. Amongst the recommendations is that more investment should be made on endorsement marketing activities as it enhances consumers’ brand recognition.

  13. Liturgy as Experience - the Psychology of Worship.
 A Theoretical and Empirical Lacuna

    Directory of Open Access Journals (Sweden)

    Owe Wikström

    1993-01-01

Full Text Available This article has three aims: (1) to plead for an approach to the study of the liturgy based on the psychology of religion, (2) to draw up a preliminary theoretical model for how the liturgy can be interpreted, and (3) to narrow down the field for further interdisciplinary development and empirical analysis. People undergo more or less strong experiences during and in conjunction with church services. Perhaps people are moved, experience holiness, reverence, fellowship or closeness to the risen Christ. The problem is what factors during the service strengthen such a religious experience. What is the role played by the music, symbols, the place or building where the service is held, the number of participants and the liturgical event?

  14. Ionic charge transport in strongly structured molten salts

    International Nuclear Information System (INIS)

    Tatlipinar, H.; Amoruso, M.; Tosi, M.P.

    1999-08-01

Data on the d.c. ionic conductivity for strongly structured molten halides of divalent and trivalent metals near freezing are interpreted as mainly reflecting charge transport by the halogen ions. On this assumption the Nernst-Einstein relation allows an estimate of the translational diffusion coefficient D_tr of the halogen. In at least one case (molten ZnCl2) D_tr is much smaller than the measured diffusion coefficient, pointing to substantial diffusion via neutral units. The values of D_tr estimated from the Nernst-Einstein relation are analyzed on the basis of a model involving two parameters, i.e. a bond-stretching frequency ω and an average waiting time τ. With the help of Raman scattering data for ω, the values of τ are evaluated and found to mostly lie in the range 0.02 - 0.3 ps for a vast class of materials. (author)
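
The Nernst-Einstein estimate used above is a one-line calculation: D_tr = σ k_B T / (n q²), where σ is the d.c. conductivity, n the number density of the carriers assumed to transport the charge (here the halogen ions), and q = z·e their charge. The sketch below uses invented round numbers, not data from the paper.

```python
K_B = 1.380649e-23   # Boltzmann constant, J/K
E_CHARGE = 1.602177e-19   # elementary charge, C

def nernst_einstein_d(sigma, temperature, n_carriers, z=1):
    """Translational diffusion coefficient (m^2/s) from d.c. conductivity
    sigma (S/m), temperature (K), carrier number density (1/m^3) and
    carrier charge number z, assuming a single mobile species."""
    return sigma * K_B * temperature / (n_carriers * (z * E_CHARGE) ** 2)

# Invented illustrative values: sigma = 1 S/m, T = 600 K, n = 1e28 m^-3
d_tr = nernst_einstein_d(1.0, 600.0, 1e28)
```

A measured diffusion coefficient much larger than this D_tr, as in molten ZnCl2, is the signature of additional transport by neutral units that carry mass but no net charge.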

  15. An empirical model to predict infield thin layer drying rate of cut switchgrass

    International Nuclear Information System (INIS)

    Khanchi, A.; Jones, C.L.; Sharma, B.; Huhnke, R.L.; Weckler, P.; Maness, N.O.

    2013-01-01

A series of 62 thin layer drying experiments were conducted to evaluate the effect of solar radiation, vapor pressure deficit and wind speed on drying rate of switchgrass. An environmental chamber was fabricated that can simulate field drying conditions. An empirical drying model based on maturity stage of switchgrass was also developed during the study. It was observed that solar radiation was the most significant factor in improving the drying rate of switchgrass at the seed shattering and seed shattered maturity stages. Therefore, drying switchgrass in wide swath to intercept the maximum amount of radiation at these stages of maturity is recommended. Moreover, it was observed that under low radiation intensity conditions, wind speed helps to improve the drying rate of switchgrass. Field operations such as raking or turning of the windrows are recommended to improve air circulation within a swath on cloudy days. Additionally, it was found that the effect of individual weather parameters on the drying rate of switchgrass was dependent on maturity stage. Vapor pressure deficit was strongly correlated with the drying rate during the seed development stage, whereas it was weakly correlated during the seed shattering and seed shattered stages. These findings suggest the importance of using separate drying rate models for each maturity stage of switchgrass. The empirical models developed in this study can predict the drying time of switchgrass based on forecasted weather conditions so that appropriate decisions can be made. -- Highlights: • An environmental chamber was developed in the present study to simulate field drying conditions. • An empirical model was developed that can estimate drying rate of switchgrass based on forecasted weather conditions. • Separate equations were developed based on maturity stage of switchgrass. • Designed environmental chamber can be used to evaluate the effect of other parameters that affect drying of crops.
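
A stage-dependent thin-layer drying model of the kind described can be sketched as follows. Both the exponential form MR(t) = exp(-k·t) and every coefficient below are assumptions for demonstration, not the study's fitted equations; the only property carried over from the abstract is that vapor pressure deficit dominates at seed development while solar radiation dominates after shattering.

```python
import math

# (intercept, solar, vpd, wind) coefficients per maturity stage; all values
# are hypothetical placeholders, chosen so that VPD matters most at seed
# development and solar radiation most after seed shattering.
COEFF = {
    "seed_development": (0.02, 0.0004, 0.15, 0.010),
    "seed_shattered":   (0.02, 0.0008, 0.03, 0.015),
}

def moisture_ratio(t_hours, stage, solar_wm2, vpd_kpa, wind_ms):
    """Moisture ratio after t_hours of drying under the given weather."""
    b0, b1, b2, b3 = COEFF[stage]
    k = b0 + b1 * solar_wm2 + b2 * vpd_kpa + b3 * wind_ms  # drying rate, 1/h
    return math.exp(-k * t_hours)

# After shattering, stronger sun dries the swath faster
mr_sunny = moisture_ratio(5.0, "seed_shattered", 800.0, 1.0, 2.0)
mr_cloudy = moisture_ratio(5.0, "seed_shattered", 200.0, 1.0, 2.0)
```

Feeding forecasted weather into such a model gives the predicted drying time that the abstract says supports harvest decisions.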

  16. Towards a strong virtue ethics for nursing practice.

    Science.gov (United States)

    Armstrong, Alan E

    2006-07-01

    Illness creates a range of negative emotions in patients including anxiety, fear, powerlessness, and vulnerability. There is much debate on the 'therapeutic' or 'helping' nurse-patient relationship. However, despite the current agenda regarding patient-centred care, the literature concerning the development of good interpersonal responses and the view that a satisfactory nursing ethics should focus on persons and character traits rather than actions, nursing ethics is dominated by the traditional obligation, act-centred theories such as consequentialism and deontology. I critically examine these theories and the role of duty-based notions in both general ethics and nursing practice. Because of well-established flaws, I conclude that obligation-based moral theories are incomplete and inadequate for nursing practice. I examine the work of Hursthouse on virtue ethics' action guidance and the v-rules. I argue that the moral virtues and a strong (action-guiding) version of virtue ethics provide a plausible and viable alternative for nursing practice. I develop an account of a virtue-based helping relationship and a virtue-based approach to nursing. The latter is characterized by three features: (1) exercising the moral virtues such as compassion; (2) using judgement; and (3) using moral wisdom, understood to include at least moral perception, moral sensitivity, and moral imagination. Merits and problems of the virtue-based approach are examined. I relate the work of MacIntyre to nursing and I conceive nursing as a practice: nurses who exercise the virtues and seek the internal goods help to sustain the practice of nursing and thus prevent the marginalization of the virtues. The strong practice-based version of virtue ethics proposed is context-dependent, particularist, and relational. Several areas for future philosophical inquiry and empirical nursing research are suggested to develop this account yet further.

  17. Strong Flows of Bottom Water in Abyssal Channels of the Atlantic

    Science.gov (United States)

    Morozov, E. G.

    Analysis of bottom water transport through the abyssal channels of the Atlantic Ocean is presented. The study is based on recent observations in the Russian expeditions and historical data. A strong flow of Antarctic Bottom Water from the Argentine Basin to the Brazil Basin through the Vema Channel is observed on the basis of lowered profilers and anchored buoys with current meters. The further flow of bottom water in the Brazil Basin splits in the northern part of the basin. Part of the bottom water flows to the East Atlantic through the Romanche and Chain fracture zones. The other part follows the bottom topography and flows to the northwest into the North American Basin. Part of the northwesterly flow propagates through the Vema Fracture Zone into the Northeastern Atlantic. This flow generally fills the bottom layer in the Northeastern Atlantic basins. The flows of bottom waters through the Romanche and Chain fracture zones do not spread to the Northeast Atlantic due to strong mixing in the equatorial zone and enhanced transformation of bottom water properties.

  18. Prediction of strong earthquake motions on rock surface using evolutionary process models

    International Nuclear Information System (INIS)

    Kameda, H.; Sugito, M.

    1984-01-01

    Stochastic process models are developed for prediction of strong earthquake motions for engineering design purposes. Earthquake motions with nonstationary frequency content are modeled by using the concept of evolutionary processes. Discussion is focused on the earthquake motions on bed rocks which are important for construction of nuclear power plants in seismic regions. On this basis, two earthquake motion prediction models are developed, one (EMP-IB Model) for prediction with given magnitude and epicentral distance, and the other (EMP-IIB Model) to account for the successive fault ruptures and the site location relative to the fault of great earthquakes. (Author) [pt
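
A generic evolutionary-process simulation of the kind used for such ground-motion models can be sketched as stationary noise shaped by a deterministic modulating envelope, so that the amplitude (and, with a time-varying filter, the frequency content) evolves over the record. The envelope form and all parameters below are illustrative, not the EMP-IB/EMP-IIB models.

```python
import math
import random

def envelope(t, t_rise=2.0, t_decay=8.0):
    """Unit-peak modulating envelope: quadratic rise, exponential decay."""
    if t < t_rise:
        return (t / t_rise) ** 2
    return math.exp(-(t - t_rise) / t_decay)

def simulate(duration=20.0, dt=0.01, seed=1):
    """One sample record of the amplitude-modulated (evolutionary) process."""
    rng = random.Random(seed)
    return [envelope(i * dt) * rng.gauss(0.0, 1.0)
            for i in range(int(duration / dt))]

acc = simulate()
```

Averaging many such realizations through a structural model is how stochastic ground-motion models feed engineering design checks.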

  19. An empirically based steady state friction law and implications for fault stability.

    Science.gov (United States)

    Spagnuolo, E; Nielsen, S; Violay, M; Di Toro, G

    2016-04-16

    Empirically based rate-and-state friction laws (RSFLs) have been proposed to model the dependence of friction forces on slip and time. The relevance of the RSFL for earthquake mechanics is that a few constitutive parameters define critical conditions for fault stability (i.e., critical stiffness and frictional fault behavior). However, the RSFLs were determined from experiments conducted at subseismic slip rates, and their extrapolation to seismic slip rates (V > 0.1 m/s) remains questionable on the basis of the experimental evidence of (1) large dynamic weakening and (2) activation of particular fault lubrication processes at seismic slip rates. Here we propose a modified RSFL (MFL) based on the review of a large published and unpublished data set of rock friction experiments performed with different testing machines. The MFL, valid at steady state conditions from subseismic to seismic slip rates (0.1 µm/s and above), allows assessment of fault frictional stability, with implications for slip event styles and relevance for models of seismic rupture nucleation, propagation, and arrest.

  20. Agency Theory and Franchising: Some Empirical Results

    OpenAIRE

    Francine Lafontaine

    1992-01-01

    This article provides an empirical assessment of various agency-theoretic explanations for franchising, including risk sharing, one-sided moral hazard, and two-sided moral hazard. The empirical models use proxies for factors such as risk, moral hazard, and franchisors' need for capital to explain both franchisors' decisions about the terms of their contracts (royalty rates and up-front franchise fees) and the extent to which they use franchising. In this article, I exploit several new sources...

  1. Gun Laws and Crime: An Empirical Assessment

    OpenAIRE

    Matti Viren

    2012-01-01

    This paper deals with the effect of gun laws on crime. Several empirical analyses are carried out to investigate the relationship between five different crime rates and alternative law variables. The tests are based on cross-section data from US states. Three different law variables are used in the analysis, together with a set of control variables for income, poverty, unemployment and ethnic background of the population. Empirical analysis does not lend support to the notion that crime laws would...

  2. Hartree-Fock calculations for strongly deformed and highly excited nuclei using the Skyrme force

    International Nuclear Information System (INIS)

    Zint, P.G.

    1975-01-01

    It has been shown that in CHF calculations the Skyrme force is useful for describing strongly deformed nuclei with even proton and neutron numbers up to separation. Thereby the eigenfunctions of the two-centre Hamiltonian form an adequate basis. With this procedure, we obtain the correct deformation of the 32S system. Including the spurious energy of relative motion between the 16O fragments, the energy curve is a good approximation to the real potential extracted from scattering experiments. (orig./WL) [de]

  3. Many-body-localization: strong disorder perturbative approach for the local integrals of motion

    Science.gov (United States)

    Monthus, Cécile

    2018-05-01

    For random quantum spin models, the strong disorder perturbative expansion of the local integrals of motion around the real-spin operators is revisited. The emphasis is on the links with other properties of the many-body-localized phase, in particular the memory in the dynamics of the local magnetizations and the statistics of matrix elements of local operators in the eigenstate basis. Finally, this approach is applied to analyze the many-body-localization transition in a toy model studied previously from the point of view of the entanglement entropy.

  4. PWR surveillance based on correspondence between empirical models and physical models

    International Nuclear Information System (INIS)

    Zwingelstein, G.; Upadhyaya, B.R.; Kerlin, T.W.

    1976-01-01

    An on-line surveillance method based on the correspondence between empirical models and physical models is proposed for pressurized water reactors. Two types of empirical models are considered, as well as the mathematical models defining the correspondence between the physical and empirical parameters. The efficiency of this method is illustrated for the surveillance of the Doppler coefficient for Oconee I (an 886 MWe PWR) [fr]

  5. 75 FR 75896 - Basis Reporting by Securities Brokers and Basis Determination for Stock

    Science.gov (United States)

    2010-12-07

    ... DEPARTMENT OF THE TREASURY Internal Revenue Service 26 CFR 1 [TD 9504] RIN 1545-BI66 Basis Reporting by Securities Brokers and Basis Determination for Stock Correction In rule document 2010-25504 beginning on page 64072 in the issue of Monday, October 18, 2010, make the following corrections: Sec. 1...

  6. The strong coupling constant: its theoretical derivation from a geometric approach to hadron structure

    International Nuclear Information System (INIS)

    Recami, E.; Tonin-Zanchin, V.

    1991-01-01

    For more than a decade, a bi-scale, unified approach to strong and gravitational interactions has been proposed that uses the geometrical methods of general relativity and has yielded results similar to those of strong gravity theory. We fix our attention, in this note, on hadron structure, and show that the strong interaction strength α_s, ordinarily called the (perturbative) squared coupling constant, can also be evaluated within our theory and is found to decrease (increase) as the distance r decreases (increases). This yields both the confinement of the hadron constituents for large values of r and their asymptotic freedom for small values of r inside the hadron, in qualitative agreement with the experimental evidence. In other words, our approach leads us, on purely theoretical grounds, to a dependence of α_s on r which had previously been found only on phenomenological and heuristic grounds. We expect the above agreement to be also quantitative, on the basis of a few checks performed in this paper and of our further work on calculating meson mass spectra. (author)

  7. Issues with Strong Compression of Plasma Target by Stabilized Imploding Liner

    Science.gov (United States)

    Turchi, Peter; Frese, Sherry; Frese, Michael

    2017-10-01

    Simulations of strong compression (10:1 in radius) of an FRC by imploding liquid-metal liners, stabilized against Rayleigh-Taylor modes and using different loss scalings based on Bohm vs. 100X classical diffusion rates, predict useful compressions with implosion times half the initial energy lifetime. The elongation (length-to-diameter ratio) near peak compression needed to satisfy the empirical stability criterion and also retain alpha particles is about ten. The present paper extends these considerations to issues of the initial FRC, including stability conditions (S*/E) and allowable angular speeds. Furthermore, efficient recovery of the implosion energy and alpha-particle work, in order to reduce the nuclear gain necessary for an economical power reactor, is seen as an important element of the stabilized liner implosion concept for fusion. We describe recent progress in the design and construction of the high-energy-density prototype of a Stabilized Liner Compressor (SLC), leading to repetitive laboratory experiments to develop the plasma target. Supported by ARPA-E ALPHA Program.

  8. Strong-back safety latch

    International Nuclear Information System (INIS)

    DeSantis, G.N.

    1995-01-01

    The calculation assesses the integrity of the safety latch that will hold the strong-back to the pump during lifting. The safety latch will be welded to the strong-back and will latch to a 1.5-in. dia cantilever rod welded to the pump baseplate. The static and dynamic analysis shows that the safety latch will hold the strong-back to the pump if the friction clamps fail and the pump becomes free from the strong-back. Thus, the safety latch will meet the requirements of the Lifting and Rigging Manual for under-the-hook lifting for static loading; it can withstand shock loads from the strong-back falling 0.25 inch.

  9. Designing internet-based payment system: guidelines and empirical basis

    NARCIS (Netherlands)

    Abrazhevich, D.; Markopoulos, P.; Rauterberg, G.W.M.

    2009-01-01

    This article describes research into online electronic payment systems, focusing on the aspects of payment systems that are critical for their acceptance by end users. Based on our earlier research and a diary study of payments with an online payment system and with online banking systems of a

  10. Determining optimal interconnection capacity on the basis of hourly demand and supply functions of electricity

    International Nuclear Information System (INIS)

    Keppler, Jan Horst; Meunier, William; Coquentin, Alexandre

    2017-01-01

    Interconnections for cross-border electricity flows are at the heart of the project to create a common European electricity market. At the same time, increases in production from variable renewables, clustered during a limited number of hours, reduce the availability of existing transport infrastructure. This calls for higher levels of optimal interconnection capacity than in the past. In complement to existing scenario-building exercises such as the TYNDP that respond to the challenge of determining optimal levels of infrastructure provision, the present paper proposes a new empirically based methodology for performing cost-benefit analysis to determine optimal interconnection capacity, using the French-German cross-border trade as an example. Using a very fine dataset of hourly supply and demand curves (aggregated auction curves) for the year 2014 from the EPEX Spot market, it constructs linearized net export curves (NEC) and net import demand curves (NIDC) for both countries. This allows assessing, hour by hour, the welfare impacts of incremental increases in interconnection capacity. Summing these welfare increases over the 8 760 hours of the year provides the annual total for each step increase of interconnection capacity. Confronting welfare benefits with the annual cost of augmenting interconnection capacity indicates the socially optimal increase in interconnection capacity between France and Germany on the basis of empirical market micro-data. (authors)
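
    The hour-by-hour welfare summation described above can be sketched numerically. The sketch below is a minimal illustration, not the paper's actual dataset or curves: the linearized hourly curves, their intercepts and slopes, and the annualized cost per MW are all hypothetical values invented for the example.

```python
import numpy as np

def annual_welfare_gain_per_mw(a_exp, b_exp, a_imp, b_imp, capacity):
    """Annual welfare gain of one additional MW of interconnection
    capacity at a given installed capacity (MW).

    Per hour, the marginal MW earns the price spread between the
    importer's (falling) net import demand curve, a_imp - b_imp*q,
    and the exporter's (rising) net export supply curve, a_exp + b_exp*q,
    floored at zero once prices have converged.
    """
    spread = (a_imp - b_imp * capacity) - (a_exp + b_exp * capacity)
    return float(np.maximum(spread, 0.0).sum())

# Hypothetical linearized hourly curves for one year (8760 hours)
rng = np.random.default_rng(0)
n_hours = 8760
a_exp = rng.uniform(20.0, 40.0, n_hours)    # exporter intercepts (EUR/MWh)
b_exp = rng.uniform(0.001, 0.01, n_hours)   # slopes (EUR/MWh per MW)
a_imp = rng.uniform(30.0, 60.0, n_hours)    # importer intercepts (EUR/MWh)
b_imp = rng.uniform(0.001, 0.01, n_hours)

# Optimal capacity: expand while the marginal annual benefit of an
# extra MW still exceeds its (hypothetical) annualized cost.
annual_cost_per_mw = 20_000.0  # EUR per MW per year, assumed
optimal = 0
for cap in range(0, 6000, 100):
    if annual_welfare_gain_per_mw(a_exp, b_exp, a_imp, b_imp, cap) < annual_cost_per_mw:
        break
    optimal = cap
```

    The marginal benefit falls as capacity grows (prices converge in more hours), so the stopping rule above finds the capacity step at which welfare gains no longer cover the cost of expansion.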

  11. An improved empirical dynamic control system model of global mean sea level rise and surface temperature change

    Science.gov (United States)

    Wu, Qing; Luu, Quang-Hung; Tkalich, Pavel; Chen, Ge

    2018-04-01

    Having great impacts on human lives, global warming and associated sea level rise are believed to be strongly linked to anthropogenic causes. Statistical approach offers a simple and yet conceptually verifiable combination of remotely connected climate variables and indices, including sea level and surface temperature. We propose an improved statistical reconstruction model based on the empirical dynamic control system by taking into account the climate variability and deriving parameters from Monte Carlo cross-validation random experiments. For the historic data from 1880 to 2001, we yielded higher correlation results compared to those from other dynamic empirical models. The averaged root mean square errors are reduced in both reconstructed fields, namely, the global mean surface temperature (by 24-37%) and the global mean sea level (by 5-25%). Our model is also more robust as it notably diminished the unstable problem associated with varying initial values. Such results suggest that the model not only enhances significantly the global mean reconstructions of temperature and sea level but also may have a potential to improve future projections.
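
    The Monte Carlo cross-validation idea used for parameter selection can be illustrated in a few lines. This is a generic sketch, not the authors' model: the synthetic 1880-2001 series, its quadratic trend, the noise level, and the candidate polynomial degrees are all assumed for the example.

```python
import numpy as np

def mc_cv_rmse(x, y, degree, n_splits=200, test_frac=0.3, seed=1):
    """Mean test RMSE of a polynomial fit of a given degree over repeated
    random train/test splits (Monte Carlo cross-validation)."""
    rng = np.random.default_rng(seed)
    n, n_test = len(x), int(len(x) * test_frac)
    rmses = []
    for _ in range(n_splits):
        idx = rng.permutation(n)              # fresh random split each round
        test, train = idx[:n_test], idx[n_test:]
        coef = np.polyfit(x[train], y[train], degree)
        resid = y[test] - np.polyval(coef, x[test])
        rmses.append(np.sqrt(np.mean(resid ** 2)))
    return float(np.mean(rmses))

# Hypothetical annual sea-level series, 1880-2001: quadratic trend + noise
years = np.arange(1880, 2002, dtype=float)
x = years - years.mean()                      # center for numerical conditioning
rng = np.random.default_rng(0)
gmsl = 0.01 * (years - 1880) ** 2 + rng.normal(0.0, 5.0, len(years))

# Select model complexity by cross-validated error rather than in-sample fit
scores = {d: mc_cv_rmse(x, gmsl, d) for d in (1, 2, 4)}
```

    Averaging the test error over many random splits stabilizes the choice of parameters, which is the same motivation the abstract gives for deriving model parameters from Monte Carlo cross-validation experiments.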

  12. An empirical analysis of the importance of controlling for unobserved heterogeneity when estimating the income-mortality gradient

    Directory of Open Access Journals (Sweden)

    Adriaan Kalwij

    2014-10-01

    Full Text Available Background: Statistical theory predicts that failing to control for unobserved heterogeneity in a Gompertz mortality risk model attenuates the estimated income-mortality gradient toward zero. Objective: I assess the empirical importance of controlling for unobserved heterogeneity in a Gompertz mortality risk model when estimating the income-mortality gradient. The analysis is carried out using individual-level administrative data from the Netherlands over the period 1996-2012. Methods: I estimate a Gompertz mortality risk model in which unobserved heterogeneity has a gamma distribution and left-truncation of life durations is explicitly taken into account. Results: I find that, despite a strong and significant presence of unobserved heterogeneity in both the male and female samples, failure to control for unobserved heterogeneity yields only a small and insignificant attenuation bias in the negative income-mortality gradient. Conclusions: The main finding, a small and insignificant attenuation bias in the negative income-mortality gradient when failing to control for unobserved heterogeneity, is positive news for the many empirical studies whose estimates of the income-mortality gradient ignore unobserved heterogeneity.
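
    The Gompertz-gamma frailty structure behind this record has a closed form worth sketching. With an individual hazard z·a·exp(bt) and frailty z ~ Gamma(mean 1, variance θ), the marginal survival is S(t) = (1 + θH(t))^(-1/θ), where H(t) = (a/b)(exp(bt) - 1). The parameter values below are hypothetical, chosen only to show the selection effect (the population-average hazard falling below the individual hazard) that drives the attenuation discussed in the abstract.

```python
import numpy as np

def gompertz_cumhaz(t, a, b):
    """Baseline Gompertz cumulative hazard H(t) = (a/b) * (exp(bt) - 1)."""
    return (a / b) * np.expm1(b * t)

def marginal_survival(t, a, b, theta):
    """Population survival when individual frailty z ~ Gamma(mean 1, var theta)
    multiplies the Gompertz hazard: S(t) = (1 + theta*H(t))^(-1/theta)."""
    return (1.0 + theta * gompertz_cumhaz(t, a, b)) ** (-1.0 / theta)

def marginal_hazard(t, a, b, theta):
    """Observed (population-average) hazard. Selective survival of
    low-frailty individuals attenuates it below a*exp(bt)."""
    return a * np.exp(b * t) / (1.0 + theta * gompertz_cumhaz(t, a, b))

t = np.linspace(0.0, 60.0, 7)        # e.g. years past some baseline age
a, b, theta = 1e-4, 0.1, 0.5         # hypothetical parameters
indiv = a * np.exp(b * t)            # hazard of a frailty-1 individual
pop = marginal_hazard(t, a, b, theta)
# At late durations the population hazard lies visibly below the
# individual hazard: ignoring frailty flattens the estimated gradient.
```

    The gap between `indiv` and `pop` at high t is exactly the mechanism by which omitted heterogeneity can bias estimated gradients toward zero.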

  13. Comparison of precipitating electron energy flux on March 22, 1979 with an empirical model: CDAW-6

    International Nuclear Information System (INIS)

    Simons, S.L. Jr.; Reiff, P.H.; Spiro, R.W.; Hardy, D.A.; Kroehl, H.W.

    1985-01-01

    Data recorded by Defense Meteorological Satellite Program, TIROS and P-78-1 satellites for the CDAW 6 event on March 22, 1979, have been compared with a statistical model of precipitating electron fluxes. Comparisons have been made both on an orbit-by-orbit basis and on a global basis by sorting and binning the data by AE index, invariant latitude and magnetic local time, in a manner similar to that in which the model was generated. We conclude that the model flux agrees with the data to within a factor of two, although small features and the exact locations of features are not consistently reproduced. In addition, the latitude of highest electron precipitation usually occurs about 3° more poleward in the model than in the data. We attribute this discrepancy to ring current inflation of the storm-time magnetosphere (as evidenced by negative Dst values). We suggest that a similar empirical model based on AL instead of AE, including some indicator of the history of the event, would provide an even better comparison. Alternatively, in situ data such as electrojet location should be used routinely to normalize the latitude of the auroral precipitation

  14. 75 FR 6166 - Basis Reporting by Securities Brokers and Basis Determination for Stock

    Science.gov (United States)

    2010-02-08

    ... DEPARTMENT OF THE TREASURY Internal Revenue Service 26 CFR Parts, 1, 31, and 301 [REG-101896-09] RIN 1545-Bl66 Basis Reporting by Securities Brokers and Basis Determination for Stock Correction In proposed rule document E9-29855 beginning on page 67010 in the issue of Thursday, December 17, 2009, make...

  15. [Concise history of toxicology - from empiric knowledge to science].

    Science.gov (United States)

    Tompa, Anna; Balázs, Péter

    2018-01-01

    Toxicology is the science of poisoning by xenobiotics and of the associated endogenous physiological changes. Its empirical roots may be traced back to the emergence of the human race, because the most important prerequisite of our predecessors' survival was distinguishing edible from poisonous plants and animals. In the course of social evolution, there were three main fields of poison use: 1) hunting and warfare; 2) settling social tensions while avoiding open military conflict, through the covert elimination of enemies with toxic substances; 3) medicine, where poisons were applied first as antidotes and later as potent substances to defeat diseases, though paradoxically active euthanasia is also part of the story. The industrial revolution of the 19th century turned sporadic occupational diseases into mass conditions. Later, the chemical industry and the subsequent mass production of synthetic materials grew into a global environmental catastrophe. This latest change initiated the emergence of ecological toxicology, the next chapter in the history of this ancient science. Orv Hetil. 2018; 159(3): 83-90.

  16. Psycho-neural Identity as the Basis for Empirical Research and Theorization in Psychology: An Interview with Mario A. Bunge

    Science.gov (United States)

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Martin, Toby L.; Julio, Flávia

    2012-10-01

    Mario Bunge is one of the most prolific philosophers of our time. Over the past sixty years he has written extensively about semantics, ontology, epistemology, philosophy of science and ethics. Bunge has been interested in the philosophical and methodological implications of modern psychology and more specifically in the philosophies of the relation between the neural and psychological realms. According to Bunge, functionalism, the philosophical stance of current psychology, has limited explanatory power in that neural processes are not explicitly acknowledged as components or factors of psychological phenomena. In Matter and Mind (2010), Bunge has elaborated in great detail the philosophies of the mind-brain dilemma and the basis of the psychoneural identity hypothesis, which suggests that all psychological processes can be analysed in terms of neural and physical phenomena. This article is the result of a long interview with Dr. Bunge on psychoneural identity and brain-behaviour relations.

  17. CO2 capture in amine solutions: modelling and simulations with non-empirical methods

    Science.gov (United States)

    Andreoni, Wanda; Pietrucci, Fabio

    2016-12-01

    Absorption in aqueous amine solutions is the most advanced technology for the capture of CO2, although it suffers from drawbacks that prevent deployment on a large scale. The search for optimum solvents has been pursued with empirical methods and has also motivated a number of computational approaches over the last decade. However, a deeper level of understanding of the relevant chemical reactions in solution is required to contribute to this effort. We present here a brief critical overview of the most recent applications of computer simulations using ab initio methods. Comparison of their outcomes shows a strong dependence on the structural models employed to represent the molecular systems in solution and on the strategy used to simulate the reactions. In particular, the results of very recent ab initio molecular dynamics augmented with metadynamics are summarized, showing the crucial role of water, which has so far been strongly underestimated both in the calculations and in the interpretation of experimental data. Indications are given for the advances in computational approaches that are necessary if they are to contribute to the rational design of new solvents.

  18. CO2 capture in amine solutions: modelling and simulations with non-empirical methods

    International Nuclear Information System (INIS)

    Andreoni, Wanda; Pietrucci, Fabio

    2016-01-01

    Absorption in aqueous amine solutions is the most advanced technology for the capture of CO 2 , although it suffers from drawbacks that prevent deployment on a large scale. The search for optimum solvents has been pursued with empirical methods and has also motivated a number of computational approaches over the last decade. However, a deeper level of understanding of the relevant chemical reactions in solution is required to contribute to this effort. We present here a brief critical overview of the most recent applications of computer simulations using ab initio methods. Comparison of their outcomes shows a strong dependence on the structural models employed to represent the molecular systems in solution and on the strategy used to simulate the reactions. In particular, the results of very recent ab initio molecular dynamics augmented with metadynamics are summarized, showing the crucial role of water, which has so far been strongly underestimated both in the calculations and in the interpretation of experimental data. Indications are given for the advances in computational approaches that are necessary if they are to contribute to the rational design of new solvents. (topical review)

  19. Managerial Career Patterns: A Review of the Empirical Evidence

    NARCIS (Netherlands)

    Vinkenburg, C.J.; Weber, T.

    2012-01-01

    Despite the ubiquitous presence of the term "career patterns" in the discourse about careers, the existing empirical evidence on (managerial) career patterns is rather limited. From this literature review of 33 published empirical studies of managerial and similar professional career patterns found

  20. An in silico study of the molecular basis of B-RAF activation and conformational stability

    Directory of Open Access Journals (Sweden)

    Jónsdóttir Svava

    2009-07-01

    Full Text Available Abstract Background B-RAF kinase plays an important role both in tumour induction and maintenance in several cancers and it is an attractive new drug target. However, the structural basis of the B-RAF activation is still not well understood. Results In this study we suggest a novel molecular basis of B-RAF activation based on molecular dynamics (MD simulations of B-RAFWT and the B-RAFV600E, B-RAFK601E and B-RAFD594V mutants. A strong hydrogen bond network was identified in B-RAFWT in which the interactions between Lys601 and the well known catalytic residues Lys483, Glu501 and Asp594 play an important role. It was found that several mutations, which directly or indirectly destabilized the interactions between these residues within this network, contributed to the changes in B-RAF activity. Conclusion Our results showed that the above mechanisms lead to the disruption of the electrostatic interactions between the A-loop and the αC-helix in the activating mutants, which presumably contribute to the flipping of the activation segment to an active form. Conversely, in the B-RAFD594V mutant that has impaired kinase activity, and in B-RAFWT these interactions were strong and stabilized the kinase inactive form.

  1. The performance of selected semi-empirical and DFT methods in studying C60 fullerene derivatives

    Science.gov (United States)

    Sikorska, Celina; Puzyn, Tomasz

    2015-11-01

    The capability of reproducing the open circuit voltages (V oc) of 15 representative C60 fullerene derivatives was tested using the selected quantum mechanical methods (B3LYP, PM6, and PM7) together with the two one-electron basis sets. Certain theoretical treatments (e.g. PM6) were found to be satisfactory for preliminary estimates of the open circuit voltages (V oc), whereas the use of the B3LYP/6-31G(d) approach has been proven to assure highly accurate results. We also examined the structural similarity of 19 fullerene derivatives by employing principal component analysis (PCA). In order to express the structural features of the studied compounds we used molecular descriptors calculated with semi-empirical (PM6 and PM7) and density functional (B3LYP/6-31G(d)) methods separately. In performing PCA, we noticed that semi-empirical methods (i.e. PM6 and PM7) seem satisfactory for molecules, in which one can distinguish the aromatic and the aliphatic parts in the cyclopropane ring of PCBM (phenyl-C61-butyric acid methyl ester) and they significantly overestimate the energy of the highest occupied molecular orbital (E HOMO). The use of the B3LYP functional, however, is recommended for studying methanofullerenes, which closely resemble the structure of PCBM, and for their modifications.
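
    PCA on a matrix of molecular descriptors, as used in this record, reduces to an eigendecomposition of the covariance matrix of standardized descriptor columns. The sketch below is generic: the 19×4 descriptor matrix and its columns (imagined as, e.g., E HOMO, E LUMO, polarizability, molecular weight) are synthetic placeholders, not the study's actual descriptors.

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via eigendecomposition of the
    covariance matrix of standardized descriptor columns."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    cov = np.cov(Xs, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)       # eigh returns ascending order
    order = np.argsort(eigval)[::-1]           # sort descending by variance
    eigval, eigvec = eigval[order], eigvec[:, order]
    scores = Xs @ eigvec[:, :n_components]     # projections of each compound
    explained = eigval[:n_components] / eigval.sum()
    return scores, explained

# Synthetic descriptor matrix: 19 "derivatives" x 4 "descriptors",
# with three strongly correlated columns and one independent column
rng = np.random.default_rng(3)
base = rng.normal(size=(19, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(19, 1)) for _ in range(3)]
              + [rng.normal(size=(19, 1))])

scores, explained = pca(X, n_components=2)
# The three correlated descriptors collapse onto the first component,
# so a 2-D score plot captures most of the structural variation.
```

    Plotting the two columns of `scores` against each other gives the kind of 2-D similarity map on which structurally related derivatives (e.g. PCBM-like methanofullerenes) cluster together.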

  2. The performance of selected semi-empirical and DFT methods in studying C60 fullerene derivatives

    International Nuclear Information System (INIS)

    Sikorska, Celina; Puzyn, Tomasz

    2015-01-01

    The capability of reproducing the open circuit voltages (V oc ) of 15 representative C 60 fullerene derivatives was tested using the selected quantum mechanical methods (B3LYP, PM6, and PM7) together with the two one-electron basis sets. Certain theoretical treatments (e.g. PM6) were found to be satisfactory for preliminary estimates of the open circuit voltages (V oc ), whereas the use of the B3LYP/6-31G(d) approach has been proven to assure highly accurate results. We also examined the structural similarity of 19 fullerene derivatives by employing principal component analysis (PCA). In order to express the structural features of the studied compounds we used molecular descriptors calculated with semi-empirical (PM6 and PM7) and density functional (B3LYP/6-31G(d)) methods separately. In performing PCA, we noticed that semi-empirical methods (i.e. PM6 and PM7) seem satisfactory for molecules, in which one can distinguish the aromatic and the aliphatic parts in the cyclopropane ring of PCBM (phenyl-C 61 -butyric acid methyl ester) and they significantly overestimate the energy of the highest occupied molecular orbital (E HOMO ). The use of the B3LYP functional, however, is recommended for studying methanofullerenes, which closely resemble the structure of PCBM, and for their modifications. (paper)

  3. The performance of selected semi-empirical and DFT methods in studying C₆₀ fullerene derivatives.

    Science.gov (United States)

    Sikorska, Celina; Puzyn, Tomasz

    2015-11-13

    The capability of reproducing the open circuit voltages (V(oc)) of 15 representative C60 fullerene derivatives was tested using the selected quantum mechanical methods (B3LYP, PM6, and PM7) together with the two one-electron basis sets. Certain theoretical treatments (e.g. PM6) were found to be satisfactory for preliminary estimates of the open circuit voltages (V(oc)), whereas the use of the B3LYP/6-31G(d) approach has been proven to assure highly accurate results. We also examined the structural similarity of 19 fullerene derivatives by employing principal component analysis (PCA). In order to express the structural features of the studied compounds we used molecular descriptors calculated with semi-empirical (PM6 and PM7) and density functional (B3LYP/6-31G(d)) methods separately. In performing PCA, we noticed that semi-empirical methods (i.e. PM6 and PM7) seem satisfactory for molecules, in which one can distinguish the aromatic and the aliphatic parts in the cyclopropane ring of PCBM (phenyl-C61-butyric acid methyl ester) and they significantly overestimate the energy of the highest occupied molecular orbital (E(HOMO)). The use of the B3LYP functional, however, is recommended for studying methanofullerenes, which closely resemble the structure of PCBM, and for their modifications.

  4. Family dynamics and housing: Conceptual issues and empirical findings

    Directory of Open Access Journals (Sweden)

    Clara Mulder

    2013-09-01

    Full Text Available BACKGROUND In this reflection I discuss my conceptual ideas and the latest empirical findings regarding the connections between leaving the parental home, marriage, parenthood, and separation on the one hand, and housing on the other. I also discuss the limitations of the research and directions for future research. CONCLUSIONS Parental housing of good quality keeps specific categories of potential nest-leavers in the parental home, but is also positively associated with the likelihood of young adults starting their housing careers as homeowners. The connections between housing and marriage and between housing and parenthood can be characterized using the concepts of housing space, quality, and safety or security - all three of which married couples and families need more than singles - and flexibility, which couples and families need less. These four needs are strongly subject to social norms. There is a strong tendency for married couples and prospective families to move into home ownership and higher quality homes. Separation tends to lead ex-partners with lower moving costs and fewer resources to move from the joint home, and tends to lead to a longer lasting decrease in housing quality, particularly for women. Future research could focus on the impact of housing on the transformation of dating partnerships into co-residential partnerships, the impact of housing quality and home ownership on the quality of partner relationships, partnership and housing histories rather than single events and short-term effects, unraveling the causal connections between family and housing, and incorporating the impact of the socio-spatial context in the research.

  5. Empirical isotropic chemical shift surfaces

    International Nuclear Information System (INIS)

    Czinki, Eszter; Csaszar, Attila G.

    2007-01-01

    A list of proteins is given for which spatial structures, with a resolution better than 2.5 Å, are known from entries in the Protein Data Bank (PDB) and isotropic chemical shift (ICS) values are known from the RefDB database related to the Biological Magnetic Resonance Bank (BMRB) database. The structures chosen provide, with unknown uncertainties, dihedral angles φ and ψ characterizing the backbone structure of the residues. The joint use of experimental ICSs of the same residues within the proteins, again with mostly unknown uncertainties, and ab initio ICS(φ,ψ) surfaces obtained for the model peptides For-(L-Ala)n-NH2, with n = 1, 3, and 5, resulted in so-called empirical ICS(φ,ψ) surfaces for all major nuclei of the 20 naturally occurring α-amino acids. Of the many empirical surfaces determined, it is the 13Cα ICS(φ,ψ) surface which seems most promising for identifying major secondary structure types: α-helix, β-strand, left-handed helix (αD), and polyproline-II. Detailed tests suggest that Ala is a good model for many naturally occurring α-amino acids. Two-dimensional empirical 13Cα-1Hα ICS(φ,ψ) correlation plots, obtained so far only from computations on small peptide models, suggest the utility of the experimental information contained therein, and thus they should provide useful constraints for structure determinations of proteins

  6. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Elllen B.

    2015-01-01

    Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend not only to give answers to how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirically Testing Thematic Analysis, a step-by-step approach to thematic text analysis, and to discuss its strengths and weaknesses, so that others might assess its potential as an approach that they might utilize/develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.

  7. Addressing governance challenges in the provision of animal health services: A review of the literature and an empirical application of transaction cost theory.

    Science.gov (United States)

    Ilukor, John; Birner, Regina; Nielsen, Thea

    2015-11-01

    Providing adequate animal health services to smallholder farmers in developing countries has remained a challenge, in spite of various reform efforts during the past decades. Past reforms focused on market failures in deciding what the public sector, the private sector, and the "third sector" (the community-based sector) should do with regard to providing animal health services. However, such frameworks have paid limited attention to the governance challenges inherent in the provision of animal health services. This paper presents a framework for analyzing institutional arrangements for providing animal health services that focuses not only on market failures but also on governance challenges, such as elite capture and absenteeism of staff. As an analytical basis, Williamson's discriminating alignment hypothesis is applied to assess the cost-effectiveness of different institutional arrangements for animal health services in view of both market failures and governance challenges. This framework is used to generate testable hypotheses on the appropriateness of different institutional arrangements for providing animal health services, depending on context-specific circumstances. Data from Uganda and Kenya on clinical veterinary services are used to provide an empirical test of these hypotheses and to demonstrate the application of Williamson's transaction cost theory to veterinary service delivery. The paper concludes that strong public sector involvement, especially in building and strengthening a synergistic relation-based referral arrangement between paraprofessionals and veterinarians, is imperative for improving animal health service delivery in developing countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    Science.gov (United States)

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  9. 75 FR 3666 - Basis Reporting by Securities Brokers and Basis Determination for Stock; Correction

    Science.gov (United States)

    2010-01-22

    ... Basis Reporting by Securities Brokers and Basis Determination for Stock; Correction AGENCY: Internal... on Thursday, December 17, 2009, relating to reporting sales of securities by brokers and determining... 3, in the preamble, under paragraph heading ``a. Form and Manner of New Broker Reporting...

  10. Database for earthquake strong motion studies in Italy

    Science.gov (United States)

    Scasserra, G.; Stewart, J.P.; Kayen, R.E.; Lanzo, G.

    2009-01-01

    We describe an Italian database of strong ground motion recordings and databanks delineating conditions at the instrument sites and characteristics of the seismic sources. The strong motion database consists of 247 corrected recordings from 89 earthquakes and 101 recording stations. Uncorrected recordings were drawn from public web sites and processed on a record-by-record basis using a procedure utilized in the Next-Generation Attenuation (NGA) project to remove instrument resonances, minimize noise effects through low- and high-pass filtering, and baseline correction. The number of available uncorrected recordings was reduced by 52% (mostly because of s-triggers) to arrive at the 247 recordings in the database. The site databank includes for every recording site the surface geology, a measurement or estimate of average shear wave velocity in the upper 30 m (Vs30), and information on instrument housing. Of the 89 sites, 39 have on-site velocity measurements (17 of which were performed as part of this study using SASW techniques). For remaining sites, we estimate Vs30 based on measurements on similar geologic conditions where available. Where no local velocity measurements are available, correlations with surface geology are used. Source parameters are drawn from databanks maintained (and recently updated) by Istituto Nazionale di Geofisica e Vulcanologia and include hypocenter location and magnitude for small events (M ≲ 5.5) and finite source parameters for larger events. © 2009 A.S. Elnashai & N.N. Ambraseys.
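
    The baseline-correction step mentioned in this record can be illustrated with a crude numpy sketch. This is not the NGA processing procedure itself (which also involves record-specific low- and high-pass filtering); it only shows the core idea on a synthetic record with an artificial baseline drift, both of which are invented for the example.

```python
import numpy as np

def baseline_correct(acc, dt):
    """Remove a best-fit linear trend (offset + drift) from an
    acceleration record: a crude stand-in for the baseline-correction
    step in strong-motion processing."""
    t = np.arange(len(acc)) * dt
    coef = np.polyfit(t, acc, 1)            # least-squares linear trend
    return acc - np.polyval(coef, t)

# Synthetic record: damped sinusoid plus an artificial baseline drift
dt = 0.01                                   # 100 samples per second
t = np.arange(0.0, 20.0, dt)
acc = np.exp(-0.2 * t) * np.sin(2 * np.pi * 2.0 * t) + 0.05 + 0.002 * t
corr = baseline_correct(acc, dt)

# Without correction the integrated velocity drifts away from zero;
# after correction it returns to (numerically) zero at the record's end.
vel_raw = np.cumsum(acc) * dt
vel_cor = np.cumsum(corr) * dt
```

    In real processing the drift comes from instrument offsets and digitization artifacts rather than an injected linear term, but the diagnostic is the same: an uncorrected record integrates to a physically implausible residual velocity.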

  11. Empirical evidences of owners’ managerial behaviour - the case of small companies

    Science.gov (United States)

    Lobontiu, G.; Banica, M.; Ravai-Nagy, S.

    2017-05-01

    In a small firm, the founder or owner-manager often leaves his or her own personal "stamp" on the way things are done, finding solutions for the multitude of problems the firm faces and maintaining control over the firm's operations. The paper investigates the degree to which owner-managers control the operations of their firm on a day-to-day basis or even get involved in the management of the functional areas. Our empirical research, conducted on a sample of 200 small and medium-sized enterprises (SME) from North-Western Romania, Maramures (NUTS3 level - RO114), shows that owner-managers tend to be all-powerful, making decisions based on their experience. Furthermore, the survey highlights the focus of owner-managers on two functional areas, namely production, and sales and marketing. Finally, the correlation analysis shows that in small firms the owner-manager is more involved in managing the functional areas of the firm than in medium-sized ones.

  12. Guidelines for using empirical studies in software engineering education

    Directory of Open Access Journals (Sweden)

    Fabian Fagerholm

    2017-09-01

    Full Text Available Software engineering education is under constant pressure to provide students with industry-relevant knowledge and skills. Educators must address issues beyond exercises and theories that can be directly rehearsed in small settings. Industry training has similar requirements of relevance as companies seek to keep their workforce up to date with technological advances. Real-life software development often deals with large, software-intensive systems and is influenced by the complex effects of teamwork and distributed software development, which are hard to demonstrate in an educational environment. A way to experience such effects and to increase the relevance of software engineering education is to apply empirical studies in teaching. In this paper, we show how different types of empirical studies can be used for educational purposes in software engineering. We give examples illustrating how to utilize empirical studies, discuss challenges, and derive an initial guideline that supports teachers to include empirical studies in software engineering courses. Furthermore, we give examples that show how empirical studies contribute to high-quality learning outcomes, to student motivation, and to the awareness of the advantages of applying software engineering principles. Having awareness, experience, and understanding of the actions required, students are more likely to apply such principles under real-life constraints in their working life.

  13. Haplotype structure around Bru1 reveals a narrow genetic basis for brown rust resistance in modern sugarcane cultivars.

    Science.gov (United States)

    Costet, L; Le Cunff, L; Royaert, S; Raboin, L-M; Hervouet, C; Toubi, L; Telismart, H; Garsmeur, O; Rousselle, Y; Pauquet, J; Nibouche, S; Glaszmann, J-C; Hoarau, J-Y; D'Hont, A

    2012-09-01

    Modern sugarcane cultivars (Saccharum spp., 2n = 100-130) are highly polyploid, aneuploid and of interspecific origin. A major gene (Bru1) conferring resistance to brown rust, caused by the fungus Puccinia melanocephala, has been identified in cultivar R570. We analyzed 380 modern cultivars and breeding materials covering the worldwide diversity with 22 molecular markers genetically linked to Bru1 in R570 within an 8.2 cM segment. Our results revealed strong LD in the Bru1 region and strong associations between most of the markers and rust resistance. Two PCR markers that flank the Bru1-bearing segment were found completely associated with one another and only in resistant clones, providing an efficient molecular diagnostic for Bru1. On this basis, Bru1 was inferred in 86% of the 194 resistant sugarcane accessions, revealing that it constitutes the main source of brown rust resistance in modern cultivars. Bru1 PCR diagnostic markers should be particularly useful for identifying cultivars with potentially alternative sources of resistance, to diversify the basis of brown rust resistance in breeding programs.

  14. Simple, empirical approach to predict neutron capture cross sections from nuclear masses

    Science.gov (United States)

    Couture, A.; Casten, R. F.; Cakirli, R. B.

    2017-12-01

    Background: Neutron capture cross sections are essential to understanding the astrophysical s and r processes, to the modeling of nuclear reactor design and performance, and to a wide variety of nuclear forensics applications. Often, cross sections are needed for nuclei where experimental measurements are difficult. Enormous effort, over many decades, has gone into attempting to develop sophisticated statistical reaction models to predict these cross sections. Such work has met with some success but is often unable to reproduce measured cross sections to better than 40%, and has limited predictive power, with predictions from different models rapidly diverging by an order of magnitude a few nucleons from the last measurement. Purpose: To develop a new approach to predicting neutron capture cross sections over broad ranges of nuclei that accounts for their values where known and has reliable predictive power with small uncertainties for many nuclei where they are unknown. Methods: Experimental neutron capture cross sections were compared to empirical mass observables in regions of similar structure. Results: We present an extremely simple method, based solely on empirical mass observables, that correlates neutron capture cross sections in the critical energy range from a few keV to a couple of hundred keV. We show that regional cross sections in medium and heavy mass nuclei are compactly correlated with the two-neutron separation energy. These correlations are easily amenable to predicting unknown cross sections, often converting the usual extrapolations into more reliable interpolations. The method almost always reproduces existing data to within 25%, and estimated uncertainties are below about 40% up to 10 nucleons beyond known data. Conclusions: Neutron capture cross sections display a surprisingly strong connection to the two-neutron separation energy, a nuclear structure property. The simple, empirical correlations uncovered provide model-independent predictions of
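    The core idea, correlating regional capture cross sections with the two-neutron separation energy S2n, can be sketched as a least-squares fit of log cross section against S2n that is then used to interpolate an "unknown" value. All numbers below are hypothetical illustrations, not values from the paper:

```python
import math

# Hypothetical (S2n in MeV, Maxwellian-averaged capture cross section in mb)
# for nuclei assumed to lie in one region of similar structure.
data = [(12.0, 950.0), (11.2, 610.0), (10.5, 400.0), (9.8, 260.0)]

# Closed-form least-squares line: log(sigma) = a + b * S2n.
n = len(data)
sx = sum(s for s, _ in data)
sy = sum(math.log(c) for _, c in data)
sxx = sum(s * s for s, _ in data)
sxy = sum(s * math.log(c) for s, c in data)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

def predict(s2n):
    """Interpolate a capture cross section (mb) from S2n (MeV)."""
    return math.exp(a + b * s2n)

print(round(predict(10.0), 1))  # cross section interpolated inside the fitted region
```

    Because S2n is known from mass measurements for many more nuclei than capture cross sections are, such a regional fit turns an extrapolation problem into an interpolation one.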

  15. Pluvials, droughts, the Mongol Empire, and modern Mongolia

    Science.gov (United States)

    Pederson, Neil; Hessl, Amy E.; Baatarbileg, Nachin; Anchukaitis, Kevin J.; Di Cosmo, Nicola

    2014-01-01

    Although many studies have associated the demise of complex societies with deteriorating climate, few have investigated the connection between an ameliorating environment, surplus resources, energy, and the rise of empires. The 13th-century Mongol Empire was the largest contiguous land empire in world history. Although drought has been proposed as one factor that spurred these conquests, no high-resolution moisture data are available during the rapid development of the Mongol Empire. Here we present a 1,112-y tree-ring reconstruction of warm-season water balance derived from Siberian pine (Pinus sibirica) trees in central Mongolia. Our reconstruction accounts for 56% of the variability in the regional water balance and is significantly correlated with steppe productivity across central Mongolia. In combination with a gridded temperature reconstruction, our results indicate that the regional climate during the conquests of Chinggis Khan’s (Genghis Khan’s) 13th-century Mongol Empire was warm and persistently wet. This period, characterized by 15 consecutive years of above-average moisture in central Mongolia and coinciding with the rise of Chinggis Khan, is unprecedented over the last 1,112 y. We propose that these climate conditions promoted high grassland productivity and favored the formation of Mongol political and military power. Tree-ring and meteorological data also suggest that the early 21st-century drought in central Mongolia was the hottest drought in the last 1,112 y, consistent with projections of warming over Inner Asia. Future warming may overwhelm increases in precipitation leading to similar heat droughts, with potentially severe consequences for modern Mongolia. PMID:24616521

  18. An empirical correction for moderate multiple scattering in super-heterodyne light scattering.

    Science.gov (United States)

    Botin, Denis; Mapa, Ludmila Marotta; Schweinfurth, Holger; Sieber, Bastian; Wittenberg, Christopher; Palberg, Thomas

    2017-05-28

    Frequency domain super-heterodyne laser light scattering is utilized in a low angle integral measurement configuration to determine flow and diffusion in charged sphere suspensions showing moderate to strong multiple scattering. We introduce an empirical correction to subtract the multiple scattering background and isolate the singly scattered light. We demonstrate the excellent feasibility of this simple approach for turbid suspensions of transmittance T ≥ 0.4. We study the particle concentration dependence of the electro-kinetic mobility in low salt aqueous suspension over an extended concentration regime and observe a maximum at intermediate concentrations. We further use our scheme for measurements of the self-diffusion coefficients in the fluid samples in the absence or presence of shear, as well as in polycrystalline samples during crystallization and coarsening. We discuss the scope and limits of our approach as well as possible future applications.

  19. The Exploration Ethic: Its Historical-Intellectual Basis. Outlook for Space (1980 - 2000)

    Science.gov (United States)

    Priscoli, J. D.; Marney, M.

    1975-01-01

    Principal components of the exploration ethic are discussed, with attempts to justify the concept on both historical and intellectual grounds. Intellectual justification was noted to be strongly grounded on: (1) the complementarity of objective and normative inquiry as to method, and (2) an interdisciplinary alliance of the ethics of adaptive systems with the contemporary decision sciences as a theoretical basis. Historical justification of exploration was associated with: (1) periods of civilizational transition, (2) changes in the process of exploration that cause changes in the types of rationales used, the sponsors involved, and the explorers' interests, and (3) the tendency of prior cost/benefit calculations to prove incorrect.

  20. Poisson and Gaussian approximation of weighted local empirical processes

    NARCIS (Netherlands)

    Einmahl, J.H.J.

    1995-01-01

    We consider the local empirical process indexed by sets, a greatly generalized version of the well-studied uniform tail empirical process. We show that the weak limit of weighted versions of this process is Poisson under certain conditions, whereas it is Gaussian in other situations. Our main

  1. Empiric potassium supplementation and increased survival in users of loop diuretics.

    Directory of Open Access Journals (Sweden)

    Charles E Leonard

    Full Text Available The effectiveness of the clinical strategy of empiric potassium supplementation in reducing the frequency of adverse clinical outcomes in patients receiving loop diuretics is unknown. We sought to examine the association between empiric potassium supplementation and (1) all-cause death and (2) outpatient-originating sudden cardiac death (SD) and ventricular arrhythmia (VA) among new starters of loop diuretics, stratified on initial loop diuretic dose. We conducted a one-to-one propensity score-matched cohort study using 1999-2007 US Medicaid claims from five states. Empiric potassium supplementation was defined as a potassium prescription on the day of or the day after the initial loop diuretic prescription. Death, the primary outcome, was ascertained from the Social Security Administration Death Master File; SD/VA, the secondary outcome, from incident, first-listed emergency department or principal inpatient SD/VA discharge diagnoses (positive predictive value = 85%). We identified 654,060 persons who met eligibility criteria and initiated therapy with a loop diuretic, 27% of whom received empiric potassium supplementation (N = 179,436) and 73% of whom did not (N = 474,624). The matched hazard ratio for empiric potassium supplementation was 0.93 (95% confidence interval, 0.89-0.98; p = 0.003) for all-cause death. Stratifying on initial furosemide dose, hazard ratios for empiric potassium supplementation with furosemide < 40 and ≥ 40 milligrams/day were 0.93 (0.86-1.00, p = 0.050) and 0.84 (0.79-0.89, p < 0.0001). The matched hazard ratio for empiric potassium supplementation was 1.02 (0.83-1.24, p = 0.879) for SD/VA. Empiric potassium supplementation upon initiation of a loop diuretic appears to be associated with improved survival, with a greater apparent benefit seen with higher diuretic dose. If confirmed, these findings support the use of empiric potassium supplementation upon initiation of a loop diuretic.
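    The one-to-one propensity-score matching step can be illustrated with a minimal greedy nearest-neighbour matcher with a caliper. This is a generic sketch with toy data; the study does not specify this exact algorithm, and all subject IDs and scores below are hypothetical:

```python
def greedy_match(treated, control, caliper=0.05):
    """One-to-one greedy nearest-neighbour match on propensity score.

    treated/control: dicts mapping subject id -> estimated propensity score.
    Each control is used at most once; candidate pairs farther apart than
    the caliper are discarded.
    """
    available = dict(control)
    pairs = []
    for tid, score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        cid, cscore = min(available.items(), key=lambda kv: abs(kv[1] - score))
        if abs(cscore - score) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs

# Toy example: three treated and four control subjects.
treated = {"t1": 0.30, "t2": 0.55, "t3": 0.80}
control = {"c1": 0.28, "c2": 0.57, "c3": 0.90, "c4": 0.10}
print(greedy_match(treated, control))
```

    Here "t3" goes unmatched because its nearest control lies outside the caliper, which mirrors how matched cohort studies trade sample size for comparability.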

  2. Predicting acid dew point with a semi-empirical model

    International Nuclear Information System (INIS)

    Xiang, Baixiang; Tang, Bin; Wu, Yuxin; Yang, Hairui; Zhang, Man; Lu, Junfu

    2016-01-01

    Highlights: • The previous semi-empirical models are systematically studied. • An improved thermodynamic correlation is derived. • A semi-empirical prediction model is proposed. • The proposed semi-empirical model is validated. - Abstract: Decreasing the exhaust flue gas temperature of a boiler is one of the most effective ways to improve its thermal efficiency and electrostatic precipitator efficiency and to decrease the water consumption of the desulfurization tower. However, when this temperature falls below the acid dew point, fouling and corrosion occur on the heating surfaces in the second pass of the boiler, so accurately predicting the acid dew point is essential. By investigating previous models for acid dew point prediction, an improved thermodynamic correlation between the acid dew point and its influencing factors is first derived. A semi-empirical prediction model is then proposed, validated against both field-test and experimental data, and compared with the previous models.
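    As an illustration of the general form such semi-empirical correlations take (not the improved model proposed in this paper), the widely quoted Verhoff-Banchero correlation for the sulfuric acid dew point can be coded directly. Coefficients are as commonly quoted in the flue-gas literature and should be verified against the original source before any engineering use:

```python
import math

def acid_dew_point_K(p_h2o_mmHg, p_so3_mmHg):
    """Sulfuric-acid dew point (K) from H2O and SO3 partial pressures (mmHg),
    using the Verhoff-Banchero form:
    1000/T = A - B*ln(pW) - C*ln(pS) + D*ln(pW)*ln(pS)."""
    lw, ls = math.log(p_h2o_mmHg), math.log(p_so3_mmHg)
    inv_t = 2.276e-3 - 2.943e-5 * lw - 8.58e-5 * ls + 6.2e-6 * lw * ls
    return 1.0 / inv_t

# Example flue gas: 10 vol% H2O (76 mmHg) and 10 ppm SO3 (0.0076 mmHg).
t_dew = acid_dew_point_K(76.0, 0.0076)
print(round(t_dew - 273.15, 1), "deg C")
```

    More SO3 or moisture raises the dew point, which is why the exhaust temperature cannot be lowered arbitrarily without risking corrosion on the back-pass heating surfaces.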

  3. Theorising big IT programmes in healthcare: strong structuration theory meets actor-network theory.

    Science.gov (United States)

    Greenhalgh, Trisha; Stones, Rob

    2010-05-01

    The UK National Health Service is grappling with various large and controversial IT programmes. We sought to develop a sharper theoretical perspective on the question "What happens - at macro-, meso- and micro-level - when government tries to modernise a health service with the help of big IT?" Using examples from data fragments at the micro-level of clinical work, we considered how structuration theory and actor-network theory (ANT) might be combined to inform empirical investigation. Giddens (1984) argued that social structures and human agency are recursively linked and co-evolve. ANT studies the relationships that link people and technologies in dynamic networks. It considers how discourses become inscribed in data structures and decision models of software, making certain network relations irreversible. Stones' (2005) strong structuration theory (SST) is a refinement of Giddens' work, systematically concerned with empirical research. It views human agents as linked in dynamic networks of position-practices. A quadripartite approach considers [a] external social structures (conditions for action); [b] internal social structures (agents' capabilities and what they 'know' about the social world); [c] active agency and actions and [d] outcomes as they feed back on the position-practice network. In contrast to early structuration theory and ANT, SST insists on disciplined conceptual methodology and linking this with empirical evidence. In this paper, we adapt SST for the study of technology programmes, integrating elements from material interactionism and ANT. We argue, for example, that the position-practice network can be a socio-technical one in which technologies in conjunction with humans can be studied as 'actants'. Human agents, with their complex socio-cultural frames, are required to instantiate technology in social practices. Structurally relevant properties inscribed and embedded in technological artefacts constrain and enable human agency.
The fortunes

  4. Support Vector Regression Model Based on Empirical Mode Decomposition and Auto Regression for Electric Load Forecasting

    Directory of Open Access Journals (Sweden)

    Hong-Juan Li

    2013-04-01

    Full Text Available Electric load forecasting is an important issue for a power utility, associated with the management of daily operations such as energy transfer scheduling, unit commitment, and load dispatch. Inspired by the strong non-linear learning capability of support vector regression (SVR), this paper presents an SVR model hybridized with the empirical mode decomposition (EMD) method and auto regression (AR) for electric load forecasting. The electric load data of the New South Wales (Australia) market are employed for comparing the forecasting performances of different forecasting models. The results confirm the validity of the idea that the proposed model can simultaneously provide forecasting with good accuracy and interpretability.
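    To make the AR component of such a hybrid concrete, here is a minimal least-squares AR(1) fit and multi-step forecast in pure Python. This is illustrative only: the paper's model additionally applies EMD to split the load series into components and SVR to model the non-linear ones, and the load values below are hypothetical:

```python
def fit_ar1(series):
    """Least-squares AR(1) coefficient for x[t] ~ phi * x[t-1]."""
    num = sum(a * b for a, b in zip(series[1:], series[:-1]))
    den = sum(a * a for a in series[:-1])
    return num / den

def forecast(series, phi, steps=3):
    """Iterate the fitted AR(1) model forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return out

# Hypothetical de-meaned load component (one EMD component, say).
load = [0.9, 0.82, 0.75, 0.70, 0.64]
phi = fit_ar1(load)
print(round(phi, 3), [round(v, 3) for v in forecast(load, phi)])
```

    In the hybrid scheme, each decomposed component gets its own forecaster (AR for near-linear components, SVR for strongly non-linear ones) and the component forecasts are summed back into a load forecast.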

  5. Residential PV system users' perception of profitability, reliability, and failure risk: An empirical survey in a local Japanese municipality

    International Nuclear Information System (INIS)

    Mukai, Toshihiro; Kawamoto, Shishin; Ueda, Yuzuru; Saijo, Miki; Abe, Naoya

    2011-01-01

    Although previous studies have addressed the reliability of residential PV systems in order to improve the dissemination of the systems among individual users and societies, few have examined users' perception of their own PV systems, which might offer solutions for firmly establishing the technology in society. First, the present paper examined the extent to which residential PV system users understand the specification, reliability, and failure risk of their own systems. Second, causal factors affecting users' satisfaction with PV systems were examined. By analyzing data collected in Kakegawa City, this paper revealed that users did not appropriately understand the basic specifications of their residential PV systems, and in particular the fact that the systems sometimes fail and therefore need proper maintenance. Furthermore, a strong causal relationship between users' expectations of financial return from the system and their level of satisfaction was confirmed empirically. These results suggested that excessive focus on profitability and relatively low interest in the systems' reliability and failure risk should be addressed, to avoid problems that could potentially hamper the establishment of this technology in society. - Highlights: → We examined PV users' perception of its specification, reliability, and failure risk. → Data for analysis were collected by questionnaire survey in a Japanese local municipality. → We revealed users did not appropriately understand the basic specifications. → A strong causal relationship between users' expectations of financial return and their level of satisfaction was confirmed empirically.

  6. NOx PREDICTION FOR FBC BOILERS USING EMPIRICAL MODELS

    Directory of Open Access Journals (Sweden)

    Jiří Štefanica

    2014-02-01

    Full Text Available Reliable prediction of NOx emissions can provide useful information for boiler design and fuel selection. Recently used kinetic prediction models for FBC boilers are overly complex and require large computing capacity. Even so, there are many uncertainties in the case of FBC boilers. An empirical modeling approach for NOx prediction has been used exclusively for PCC boilers. No reference is available for modifying this method for FBC conditions. This paper presents possible advantages of empirical modeling based prediction of NOx emissions for FBC boilers, together with a discussion of its limitations. Empirical models are reviewed, and are applied to operation data from FBC boilers used for combusting Czech lignite coal or coal-biomass mixtures. Modifications to the model are proposed in accordance with theoretical knowledge and prediction accuracy.

  7. Have Market-oriented Reforms Decoupled China’s CO2 Emissions from Total Electricity Generation? An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Wei Shang

    2016-05-01

    Full Text Available Achieving the decoupling of electric CO2 emissions from total electricity generation is important in ensuring the sustainable socioeconomic development of China. To realize this, China implemented market-oriented reforms to its electric power industry at the beginning of 2003. This study used the Tapio decoupling index, the Laspeyres decomposition algorithm, and decoupling-related data from 1993 to 2012 to evaluate the effect of these reforms. Several conclusions can be drawn from the empirical analysis. (1) The reforms changed the developmental trend of the decoupling index and facilitated its progress towards strong decoupling. (2) Results forecast by fitting a curve to the decoupling index indicate that strong decoupling would be realized by 2030. (3) Limiting manufacturing development and upgrading the generation equipment of thermal power plants are essential for China to achieve strong decoupling at an early date. (4) China should enhance regulatory pressure and guidance for appropriate investment in thermal power plants to ensure the stable development of the decoupling index. (5) Transactions between multiple participants and electricity price bidding play active roles in the stable development of the decoupling index.
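    The Tapio decoupling index used in the study is the elasticity of emissions with respect to generation: the percentage change in CO2 divided by the percentage change in electricity output. A minimal sketch with hypothetical numbers (only a subset of Tapio's eight states, the growing-driver cases, is classified here):

```python
def tapio_elasticity(co2_0, co2_1, gen_0, gen_1):
    """Tapio decoupling elasticity: %change in CO2 per %change in generation."""
    d_co2 = (co2_1 - co2_0) / co2_0
    d_gen = (gen_1 - gen_0) / gen_0
    return d_co2 / d_gen

def classify(e):
    """Simplified Tapio categories, assuming the driver (generation) grows."""
    if e < 0:
        return "strong decoupling"    # emissions fall while generation grows
    if e < 0.8:
        return "weak decoupling"      # emissions grow slower than generation
    if e <= 1.2:
        return "expansive coupling"   # both grow in step
    return "expansive negative decoupling"

# Hypothetical period: generation grows 8%, emissions grow 3%.
e = tapio_elasticity(100.0, 103.0, 1000.0, 1080.0)
print(round(e, 3), classify(e))
```

    Strong decoupling, the study's target state for 2030, corresponds to a negative elasticity under a growing driver.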

  8. The biological basis of radiotherapy

    International Nuclear Information System (INIS)

    Steel, G.G.; Adams, G.E.; Horwich, A.

    1989-01-01

    The focus of this book is the biological basis of radiotherapy. The papers presented include: Temporal stages of radiation action: free radical processes; The molecular basis of radiosensitivity; and Radiation damage to early-reacting normal tissue.

  9. Resource specialization, customer orientation, and firm performance: an empirical investigation of valuable resources

    DEFF Research Database (Denmark)

    Sørensen, Hans Eibe

    2011-01-01

    This study contributes to strategic marketing research by empirically investigating the role of customer orientation in explaining how firms leverage their specialized but vulnerable resources. The aim is thus to explore a subset of the means by which resources become valuable to the firm – the first criterion for a strategic resource. Hypotheses are developed and tested using CEO questionnaire responses from a sample of manufacturing firms and census accounting data. The results show that there is a strong link between industry-specific resources and return on assets for firms with high levels of customer orientation. We also report that firm-specific resources are unrelated to firm performance and that a customer orientation, investigated in isolation, may be detrimental to firm performance. Research and managerial implications are discussed.

  10. Managerial Career Patterns: A Review of the Empirical Evidence

    Science.gov (United States)

    Vinkenburg, Claartje J.; Weber, Torsten

    2012-01-01

    Despite the ubiquitous presence of the term "career patterns" in the discourse about careers, the existing empirical evidence on (managerial) career patterns is rather limited. From this literature review of 33 published empirical studies of managerial and similar professional career patterns found in electronic bibliographic databases, it is…

  11. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...
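    The mechanics of EMD's sifting step can be shown with a heavily simplified sketch that uses piecewise-linear envelopes instead of the cubic splines of the full algorithm (illustrative only; real EMD also iterates to extract several IMFs and uses proper stopping criteria):

```python
import math

def local_extrema(x):
    """Indices of strict local maxima and minima of a sequence."""
    maxima, minima = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:
            maxima.append(i)
        elif x[i] < x[i - 1] and x[i] < x[i + 1]:
            minima.append(i)
    return maxima, minima

def envelope(idx, x):
    """Piecewise-linear envelope through x[idx], extended flat at the ends."""
    n = len(x)
    pts = [(0, x[idx[0]])] + [(i, x[i]) for i in idx] + [(n - 1, x[idx[-1]])]
    env = []
    for j in range(n):
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= j <= x1:
                t = 0.0 if x1 == x0 else (j - x0) / (x1 - x0)
                env.append(y0 + t * (y1 - y0))
                break
    return env

def sift(x, n_iter=10):
    """Extract one intrinsic-mode-function candidate by repeated sifting:
    subtract the mean of the upper and lower extrema envelopes."""
    h = list(x)
    for _ in range(n_iter):
        maxima, minima = local_extrema(h)
        if len(maxima) < 2 or len(minima) < 2:
            break
        upper, lower = envelope(maxima, h), envelope(minima, h)
        h = [v - (u + l) / 2 for v, u, l in zip(h, upper, lower)]
    return h

# Fast oscillation riding on a slow ramp: the IMF keeps the oscillation,
# the residue (signal minus IMF) keeps the ramp.
sig = [math.sin(2 * math.pi * 5 * t / 200) + 0.002 * t for t in range(200)]
imf = sift(sig)
residue = [s - h for s, h in zip(sig, imf)]
```

    The key property sifting enforces is that the local mean of the extracted mode is driven towards zero, which is what makes the instantaneous frequency of each IMF meaningful for nonstationary signals.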

  12. Is deintegrated electric generation efficient? A proposed empirical research framework

    Energy Technology Data Exchange (ETDEWEB)

    Fox-Penner, P S [Charles River Associates Incorporated, Boston, Massachusetts (US)]

    1990-01-01

    Partial or complete deregulation of public utility industries, which until recently were regulated even in market-driven economies such as the U.S., is examined, and the causes and cures of this phenomenon are discussed. The purpose is to present a framework for empirically examining the net benefits to consumers of one particular form of public utility ''deregulation'': separating, or deintegrating, electricity generators from transmitters and deregulating the former. The framework has been developed for this application because U.S. public utility law has permitted this particular form of ''deregulation'' to exist alongside traditional utilities for the past ten years. Hence, data are now available for comparing regulated and deregulated power generators and their interaction with the transmission system. More generally, however, the purpose of this article is to argue that the benefits of regulatory change in public utility industries must be examined on a comprehensive basis. It is not sufficient to examine one stage of the vertical production chain and conclude that deregulation will improve or hinder its performance. Rather, the long-range benefits of various industry regimes should be carefully examined from the standpoint of the consumer at the downstream end of the production process. (author) 41 refs.

  13. Strong Langmuir turbulence

    International Nuclear Information System (INIS)

    Goldman, M.V.

    1984-01-01

    After a brief discussion of beam-excited Langmuir turbulence in the solar wind, we explain the criteria for wave-particle, three-wave and strong turbulence interactions. We then present the results of a numerical integration of the Zakharov equations, which describe the strong turbulence saturation of a weak (low-density) high energy, bump-on-tail beam instability. (author)

  14. Empirical Model Building Data, Models, and Reality

    CERN Document Server

    Thompson, James R

    2011-01-01

    Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m

  15. Semi-empirical and empirical L X-ray production cross sections for elements with 50 ≤ Z ≤ 92 for protons of 0.5-3.0 MeV

    International Nuclear Information System (INIS)

    Nekab, M.; Kahoul, A.

    2006-01-01

    We present in this contribution semi-empirical production cross sections of the main L X-ray lines Lα, Lβ and Lγ for elements from Sn to U and for protons with energies from 0.5 to 3.0 MeV. The theoretical X-ray production cross sections are first calculated from the theoretical ionization cross sections of the Li (i = 1, 2, 3) subshells within the ECPSSR theory. The semi-empirical Lα, Lβ and Lγ cross sections are then deduced by fitting the available experimental data, normalized to their corresponding theoretical values, and give a better representation of the experimental data in some cases. On the other hand, the experimental data are directly fitted to deduce the empirical L X-ray production cross sections. A comparison is made between the semi-empirical cross sections and the empirical cross sections reported in this work, the empirical ones reported by Reis and Jesus [M.A. Reis, A.P. Jesus, Atom. Data Nucl. Data Tables 63 (1996) 1], and those of Strivay and Weber [Strivay, G. Weber, Nucl. Instr. and Meth. B 190 (2002) 112].
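    The "fit normalized data" step can be sketched as follows: divide experimental cross sections by their ECPSSR values and fit the resulting ratio as a low-order function of proton energy (linear here; real parameterizations may be higher order, and all data points below are hypothetical):

```python
def fit_line(xs, ys):
    """Closed-form least-squares line y = a + b*x."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - b * sx) / n, b

# Hypothetical proton energies (MeV) and experiment/ECPSSR ratios
# for one element and one L line.
energies = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ratios   = [1.30, 1.18, 1.10, 1.05, 1.02, 1.00]

a, b = fit_line(energies, ratios)

def semi_empirical(sigma_theory, energy):
    """Scale an ECPSSR cross section by the fitted energy-dependent ratio."""
    return sigma_theory * (a + b * energy)
```

    Normalizing to theory before fitting removes most of the strong Z and energy dependence, so the fitted correction stays smooth and can be interpolated safely across the tabulated range.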

  16. Strong Motion Earthquake Data Values of Digitized Strong-Motion Accelerograms, 1933-1994

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Strong Motion Earthquake Data Values of Digitized Strong-Motion Accelerograms is a database of over 15,000 digitized and processed accelerograph records from...

  17. Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Peng, E-mail: peng@ices.utexas.edu [The Institute for Computational Engineering and Sciences, The University of Texas at Austin, 201 East 24th Street, Stop C0200, Austin, TX 78712-1229 (United States); Schwab, Christoph, E-mail: christoph.schwab@sam.math.ethz.ch [Seminar für Angewandte Mathematik, Eidgenössische Technische Hochschule, Römistrasse 101, CH-8092 Zürich (Switzerland)

    2016-07-01

    We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov–Galerkin high-fidelity (“HiFi”) discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution are collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates that depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. The parsimonious surrogates can then be employed for online data

  18. Phenomenology and the Empirical Turn

    NARCIS (Netherlands)

    Zwier, Jochem; Blok, Vincent; Lemmens, Pieter

    2016-01-01

    This paper provides a phenomenological analysis of postphenomenological philosophy of technology. While acknowledging that the results of its analyses are to be recognized as original, insightful, and valuable, we will argue that in its execution of the empirical turn, postphenomenology forfeits

  19. Empirical ethics as dialogical practice

    NARCIS (Netherlands)

    Widdershoven, G.A.M.; Abma, T.A.; Molewijk, A.C.

    2009-01-01

    In this article, we present a dialogical approach to empirical ethics, based upon hermeneutic ethics and responsive evaluation. Hermeneutic ethics regards experience as the concrete source of moral wisdom. In order to gain a good understanding of moral issues, concrete detailed experiences and

  20. Empirical processes: theory and applications

    OpenAIRE

    Venturini Sergio

    2005-01-01

    Proceedings of the 2003 Summer School in Statistics and Probability in Torgnon (Aosta, Italy) held by Prof. Jon A. Wellner and Prof. M. Banerjee. The topic presented was the theory of empirical processes with applications to statistics (m-estimation, bootstrap, semiparametric theory).

  1. Understanding and capturing NSSS design basis

    International Nuclear Information System (INIS)

    Palo, W.J.; Miller, B.

    1993-01-01

    Changes to, and technical evaluations of, nuclear generating station designs are often warranted. Comprehensive documentation and understanding of the NSSS Design Basis are essential to support these activities. Effective configuration management tools are also needed to maintain the plant within design basis limits. Efficient design basis reconstitution can be realized via: in-depth understanding of the design process; utilization of effective data collection methodology; state-of-the-art database tools. A database can be created to generate a Design Basis Manual (DBM). This database can communicate electronically with other plant databases. The goal is a living document rather than a static snapshot of the plant design. A design basis database can serve as the cornerstone for a global electronic information control system

  2. BWR NSSS design basis documentation

    International Nuclear Information System (INIS)

    Vij, R.S.; Bates, R.E.

    2004-01-01

    In 1985 an incident at Toledo Edison's Davis-Besse plant caused the U.S. Nuclear Regulatory Commission (NRC) to re-evaluate the technical information that the utilities had readily available to support the design of their plants. The Design Basis programs, currently ongoing in most U.S. utilities, have been the nuclear industry's response to the needs identified by this re-evaluation. In order to understand the Design Basis programs which have been implemented by the U.S. nuclear utilities, it is necessary to understand the problem as it was perceived by the nuclear industry (the utilities, the original NSSS designers and the regulators) after the Davis-Besse incident, the subsequent programs undertaken by the industry under the leadership of INPO and NUMARC, the NRC's actions, and the overall evolution of the industry's vision in relation to this problem. This paper presents the history of the design basis efforts from the first recognition of the problem by the NRC after the Davis-Besse incident, describes the actions taken by the NRC, INPO, NUMARC, the U.S. utilities and the NSSS designers, and brings the problem statement up-to-date in relation to the vision presently held by the U.S. nuclear industry. It then presents a technical discussion to develop a detailed definition of design basis information to support the problem statement. The information originally supplied by the NSSS designers during the plant design and construction is discussed as well as its relationship to the previously defined design basis information. This section of the paper concludes by defining the additional information needed by nuclear utilities to satisfy the requirements developed from the problem statement. Having developed a definition of the additional information (i.e., information not originally supplied during design and construction) required to solve the design basis problem as it is presently perceived by the U.S. nuclear industry, the paper then discusses design basis

  3. Worship, Reflection, Empirical Research

    OpenAIRE

    Ding Dong,

    2012-01-01

    In my youth, I was a worshipper of Mao Zedong. From the latter stage of the Mao Era to the early years of Reform and Opening, I began to reflect on Mao and the Communist Revolution he launched. In recent years I’ve devoted myself to empirical historical research on Mao, seeking the truth about Mao and China’s modern history.

  4. An analytical solution of Richards' equation providing the physical basis of SCS curve number method and its proportionality relationship

    Science.gov (United States)

    Hooshyar, Milad; Wang, Dingbao

    2016-08-01

    The empirical proportionality relationship, which indicates that the ratios of cumulative surface runoff and infiltration to their corresponding potentials are equal, is the basis of the extensively used Soil Conservation Service Curve Number (SCS-CN) method. The objective of this paper is to provide the physical basis of the SCS-CN method and its proportionality hypothesis from the infiltration excess runoff generation perspective. To achieve this purpose, an analytical solution of Richards' equation is derived for ponded infiltration in a shallow water table environment under the following boundary conditions: (1) the soil is saturated at the land surface; and (2) there is a no-flux boundary which moves downward. The solution is established based on the assumptions of negligible gravitational effect, constant soil water diffusivity, and a hydrostatic soil moisture profile between the no-flux boundary and the water table. Based on the derived analytical solution, the proportionality hypothesis is a reasonable approximation for rainfall partitioning at the early stage of ponded infiltration in areas with a shallow water table for coarse-textured soils.
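    The proportionality hypothesis in this record can be made concrete with a short sketch of the SCS-CN water balance; the curve number, rainfall depth, and the common Ia = 0.2·S initial-abstraction assumption below are illustrative, not values from the paper:

```python
# Sketch of the SCS-CN runoff equation implied by the proportionality
# hypothesis F/S = Q/(P - Ia): P is rainfall depth, Ia the initial
# abstraction, S the potential maximum retention, F cumulative infiltration,
# and Q cumulative surface runoff (all in mm). Illustrative values only.

def scs_runoff(P, CN, ia_ratio=0.2):
    """Cumulative runoff Q from rainfall P for a given curve number CN."""
    S = 25400.0 / CN - 254.0          # potential retention (mm), metric form
    Ia = ia_ratio * S                 # initial abstraction
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

P, CN = 80.0, 75.0
S = 25400.0 / CN - 254.0
Ia = 0.2 * S
Q = scs_runoff(P, CN)
F = (P - Ia) - Q                      # water balance: P - Ia = Q + F
```

    Algebraically, Q(P − Ia + S) = (P − Ia)² rearranges to QS = (P − Ia)(P − Ia − Q), i.e. Q/(P − Ia) = F/S, which is exactly the proportionality relationship the paper derives from Richards' equation.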

  5. Empirical Reduced-Order Modeling for Boundary Feedback Flow Control

    Directory of Open Access Journals (Sweden)

    Seddik M. Djouadi

    2008-01-01

    Full Text Available This paper deals with the practical and theoretical implications of model reduction for aerodynamic flow-based control problems. Various aspects of model reduction are discussed that apply to partial differential equation (PDE) based models in general. Specifically, the proper orthogonal decomposition (POD) of a high dimension system as well as frequency domain identification methods are discussed for initial model construction. Projections on the POD basis give a nonlinear Galerkin model. Then, a model reduction method based on empirical balanced truncation is developed and applied to the Galerkin model. The rationale for doing so is that linear subspace approximations to exact submanifolds associated with nonlinear controllability and observability require only standard matrix manipulations utilizing simulation/experimental data. The proposed method uses a chirp signal as input to produce the output in the eigensystem realization algorithm (ERA). This method estimates the system's Markov parameters that accurately reproduce the output. Balanced truncation is used to show that model reduction is still effective on ERA produced approximated systems. The method is applied to a prototype convective flow on obstacle geometry. An H∞ feedback flow controller is designed based on the reduced model to achieve tracking and then applied to the full-order model with excellent performance.
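    The ERA step mentioned above can be sketched in a few lines: from impulse-response Markov parameters one builds shifted Hankel matrices and recovers a reduced realization from an SVD. The 2-state test system below is an illustrative assumption, not the flow model from the paper:

```python
import numpy as np

# Minimal sketch of the eigensystem realization algorithm (ERA): given
# Markov parameters h[k] = C A^k B of a discrete-time system, a reduced
# realization (A_r, B_r, C_r) is recovered from the SVD of a Hankel matrix.

def era(markov, r, m=10):
    """Reduced-order realization of order r from Markov parameters."""
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    Ur, Vr = U[:, :r], Vt[:r, :]
    Sr = np.diag(np.sqrt(s[:r]))               # balanced square-root split
    Sinv = np.diag(1.0 / np.sqrt(s[:r]))
    Ar = Sinv @ Ur.T @ H1 @ Vr.T @ Sinv
    Br = Sr @ Vr[:, :1]                        # first column of controllability
    Cr = Ur[:1, :] @ Sr                        # first row of observability
    return Ar, Br, Cr

# Illustrative stable 2-state system.
A = np.array([[0.9, 0.1], [0.0, 0.5]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 0.0]])
h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(25)]
Ar, Br, Cr = era(h, r=2)
h_era = [(Cr @ np.linalg.matrix_power(Ar, k) @ Br).item() for k in range(25)]
```

    For this exact rank-2 data the reduced model reproduces the Markov parameters essentially to machine precision; on noisy experimental data the truncation order r would instead be chosen from the decay of the Hankel singular values.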

  6. Tests of Parameters Instability: Theoretical Study and Empirical Applications on Two Types of Models (ARMA Model and Market Model)

    Directory of Open Access Journals (Sweden)

    Sahbi FARHANI

    2012-01-01

    Full Text Available This paper considers tests of parameters instability and structural change with known, unknown or multiple breakpoints. The results apply to a wide class of parametric models that are suitable for estimation by strong rules for detecting the number of breaks in a time series. For that, we use Chow, CUSUM, CUSUM of squares, Wald, likelihood ratio and Lagrange multiplier tests. Each test implicitly uses an estimate of a change point. We conclude with an empirical analysis on two different models (ARMA model and simple linear regression model).
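    Of the tests listed, the Chow test with a known breakpoint is the simplest to sketch: an F statistic compares the pooled residual sum of squares of a single regression with the summed RSS of the two subsample fits. The synthetic data, breakpoint, and k = 2 parameters per regime below are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of the Chow breakpoint test for y = a + b*x with a known
# break at index bp. Large F suggests parameter instability.

def rss(x, y):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return float(e @ e)

def chow_F(x, y, bp, k=2):
    n = len(x)
    rss_pooled = rss(x, y)
    rss_split = rss(x[:bp], y[:bp]) + rss(x[bp:], y[bp:])
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))

rng = np.random.default_rng(0)
x = np.arange(40, dtype=float)
# Series with a clear structural break at t = 20:
y = np.where(x < 20, 1.0 + 0.5 * x, 12.0 + 1.5 * x) + 0.1 * rng.standard_normal(40)
F_break = chow_F(x, y, bp=20)                    # large F: unstable parameters
# Stable series with the same noise level:
y_stable = 1.0 + 0.5 * x + 0.1 * rng.standard_normal(40)
F_stable = chow_F(x, y_stable, bp=20)            # small F: no structural change
```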

  7. Finnish Factor in the History of the Northern Frontier of the Russian Empire 1809–1855

    Directory of Open Access Journals (Sweden)

    Konstantin S. Zaikov

    2016-09-01

    Full Text Available The article is devoted to the little-known pages of the "Northern Frontier" history – the Russian-Norwegian border zone, namely the role of the Grand Duchy of Finland in the border policy of the Russian Empire and the Swedish-Norwegian state in the border area in 1809–1855. The authors demonstrate that in 1809 the entry of Finland into the Russian Empire strengthened the Empire's ability to defend national interests in the far north of Europe. At the same time, the growing influence of the Grand Duchy on Russian home and foreign policy contributed to the total indoctrination of the Russian-Swedish/Norwegian border and the image of the "Russian threat", which was distributed among the political elite of the Swedish-Norwegian state in the 1820–1850s. The spread of Russophobic sentiments in the United Kingdom of Sweden and Norway accelerated the urgency of the Northern Frontier's formal delimitation for Russian-Swedish diplomatic relations in the first half of the 1820s. The "Russian threat" also served as the ideological basis for the gradual securitization and politicization of the Russian-Norwegian border area. Thus, the United Kingdom of Sweden and Norway viewed the whole range of cross-border relations between the population of Finnmark (Sweden-Norway), Uleåborg province (Grand Duchy of Finland) and the Arkhangelsk province (Russian Empire) as one of the potential threats to national security in the second half of the 19th century. The closure of the Finnish-Norwegian section of the Russian-Swedish/Norwegian border in 1852 and the joining of the anti-Russian coalition with Britain and France, formally enshrined in the so-called November Treaty of 1855, became the culmination of this process.

  8. Shear-wave velocity characterization of the USGS Hawaiian strong-motion network on the Island of Hawaii and development of an NEHRP site-class map

    Science.gov (United States)

    Wong, Ivan G.; Stokoe, Kenneth; Cox, Brady R.; Yuan, Jiabei; Knudsen, Keith L.; Terra, Fabia; Okubo, Paul G.; Lin, Yin-Cheng

    2011-01-01

    To assess the level and nature of ground shaking in Hawaii for the purposes of earthquake hazard mitigation and seismic design, empirical ground-motion prediction models are desired. To develop such empirical relationships, knowledge of the subsurface site conditions beneath strong-motion stations is critical. Thus, as a first step to develop ground-motion prediction models for Hawaii, spectral-analysis-of-surface-waves (SASW) profiling was performed at the 22 free-field U.S. Geological Survey (USGS) strong-motion sites on the Big Island to obtain shear-wave velocity (VS) data. Nineteen of these stations recorded the 2006 Kiholo Bay moment magnitude (M) 6.7 earthquake, and 17 stations recorded the triggered M 6.0 Mahukona earthquake. VS profiling was performed to reach depths of more than 100 ft. Most of the USGS stations are situated on sites underlain by basalt, based on surficial geologic maps. However, the sites have varying degrees of weathering and soil development. The remaining strong-motion stations are located on alluvium or volcanic ash. VS30 (average VS in the top 30 m) values for the stations on basalt ranged from 906 to 1908 ft/s [National Earthquake Hazards Reduction Program (NEHRP) site classes C and D], because most sites were covered with soil of variable thickness. Based on these data, an NEHRP site-class map was developed for the Big Island. These new VS data will be a significant input into an update of the USGS statewide hazard maps and to the operation of ShakeMap on the island of Hawaii.
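    The VS30 quantity used above for site classification is simply a time-averaged shear-wave velocity over the top 30 m. A minimal sketch in m/s, with an illustrative layered profile (not an actual SASW result from the Hawaiian network) and NEHRP class boundaries as commonly tabulated:

```python
# VS30 = 30 / sum(h_i / v_i): total depth divided by total travel time
# through the layers in the top 30 m. Profile below is illustrative.

def vs30(thicknesses_m, velocities_ms):
    """Time-averaged VS over the top 30 m from (thickness, velocity) layers."""
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_ms):
        use = min(h, 30.0 - depth)        # clip the profile at 30 m depth
        travel_time += use / v
        depth += use
        if depth >= 30.0:
            break
    return 30.0 / travel_time

def nehrp_class(v):
    """NEHRP site class from VS30 in m/s (common boundary conventions)."""
    if v > 1500: return "A"
    if v > 760:  return "B"
    if v > 360:  return "C"
    if v > 180:  return "D"
    return "E"

# Soil over weathered basalt: 5 m at 200 m/s, 10 m at 400 m/s, rest at 800 m/s.
v = vs30([5.0, 10.0, 40.0], [200.0, 400.0, 800.0])
site = nehrp_class(v)
```

    Note the harmonic (travel-time) averaging: a slow surface layer pulls VS30 down far more than an arithmetic mean would, which is why thin soil cover over basalt can still yield class C or D, as observed at the Hawaiian stations.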

  9. Multisensor Distributed Track Fusion Algorithm Based on Strong Tracking Filter and Feedback Integration

    Institute of Scientific and Technical Information of China (English)

    YANG Guo-Sheng; WEN Cheng-Lin; TAN Min

    2004-01-01

    A new multisensor distributed track fusion algorithm is put forward based on combining the feedback integration with the strong tracking Kalman filter. Firstly, an effective tracking gate is constructed by taking the intersection of the tracking gates formed before and after feedback. Secondly, on the basis of the constructed effective tracking gate, probabilistic data association and the strong tracking Kalman filter are combined to form the new multisensor distributed track fusion algorithm. At last, simulation is performed on the original algorithm and the algorithm presented.

  10. Forecasting outpatient visits using empirical mode decomposition coupled with back-propagation artificial neural networks optimized by particle swarm optimization.

    Science.gov (United States)

    Huang, Daizheng; Wu, Zhihui

    2017-01-01

    Accurately predicting the trend of outpatient visits by mathematical modeling can help policy makers manage hospitals effectively, reasonably organize schedules for human resources and finances, and appropriately distribute hospital material resources. In this study, a hybrid method based on empirical mode decomposition and back-propagation artificial neural networks optimized by particle swarm optimization is developed to forecast outpatient visits on the basis of monthly numbers. The outpatient visit data from January 2005 to December 2013 are first taken as the original time series. Second, the original time series is decomposed into a finite and often small number of intrinsic mode functions by the empirical mode decomposition technique. Third, a three-layer back-propagation artificial neural network is constructed to forecast each intrinsic mode function. To improve network performance and avoid falling into a local minimum, particle swarm optimization is employed to optimize the weights and thresholds of the back-propagation artificial neural networks. Finally, the superposition of the forecasting results of the intrinsic mode functions is regarded as the ultimate forecasting value. Simulation indicates that the proposed method attains a better performance index than the other four methods.
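    The PSO step of such a hybrid scheme can be sketched in isolation. As a stand-in for training the back-propagation network, the swarm below tunes the two weights of a simple autoregressive predictor on one synthetic component series; the inertia and acceleration coefficients are common textbook values, not the paper's settings:

```python
import numpy as np

# Hedged sketch of particle swarm optimization (PSO) minimizing a
# forecasting loss. Each particle is a candidate weight vector; velocities
# are pulled toward personal bests (pbest) and the global best (gbest).

def pso(loss, dim, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_particles, dim))       # positions (weights)
    v = np.zeros_like(x)                             # velocities
    pbest, pbest_val = x.copy(), np.array([loss(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        vals = np.array([loss(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, loss(gbest)

# Synthetic oscillatory component; fit y[t] ~ w0*y[t-1] + w1*y[t-2].
t = np.arange(100)
y = np.sin(0.3 * t)

def mse(w):
    pred = w[0] * y[1:-1] + w[1] * y[:-2]
    return float(np.mean((y[2:] - pred) ** 2))

w_best, err = pso(mse, dim=2)
```

    In the paper's pipeline this loss would instead be the training error of a three-layer network on one intrinsic mode function, with the final forecast obtained by summing the per-component predictions.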

  11. Normalization of time-series satellite reflectance data to a standard sun-target-sensor geometry using a semi-empirical model

    Science.gov (United States)

    Zhao, Yongguang; Li, Chuanrong; Ma, Lingling; Tang, Lingli; Wang, Ning; Zhou, Chuncheng; Qian, Yonggang

    2017-10-01

    Time series of satellite reflectance data have been widely used to characterize environmental phenomena, describe trends in vegetation dynamics and study climate change. However, several sensors with wide spatial coverage and high observation frequency are usually designed to have a large field of view (FOV), which causes variations in the sun-target-sensor geometry in time-series reflectance data. In this study, on the basis of the semi-empirical kernel-driven BRDF model, a new semi-empirical model was proposed to normalize the sun-target-sensor geometry of remote sensing images. To evaluate the proposed model, bidirectional reflectances under different canopy growth conditions simulated by the Discrete Anisotropic Radiative Transfer (DART) model were used. The semi-empirical model was first fitted using all simulated bidirectional reflectances. Experimental results showed a good fit between the bidirectional reflectance estimated by the proposed model and the simulated values. Then, MODIS time-series reflectance data were normalized to a common sun-target-sensor geometry by the proposed model. The experimental results showed that the proposed model yielded good fits between the observed and estimated values. The noise-like fluctuations in time-series reflectance data were also reduced after the sun-target-sensor normalization process.
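    A kernel-driven BRDF model of the kind referred to here is linear in its coefficients, so both the fit and the subsequent geometry normalization reduce to least squares. The kernel values, coefficients, and reference geometry below are synthetic placeholders; in practice K_vol and K_geo are computed from the actual sun-target-sensor angles:

```python
import numpy as np

# Sketch of fitting a kernel-driven (Ross-Li style) BRDF model
#   R = f_iso + f_vol * K_vol + f_geo * K_geo
# by linear least squares, then normalizing to a reference geometry by
# evaluating the fitted model at that geometry's kernel values.

rng = np.random.default_rng(1)
n = 50
K_vol = rng.uniform(-0.3, 0.6, n)            # assumed precomputed kernels
K_geo = rng.uniform(-1.5, 0.0, n)
f_true = np.array([0.25, 0.10, 0.05])        # iso, vol, geo coefficients
X = np.column_stack([np.ones(n), K_vol, K_geo])
R = X @ f_true + 0.002 * rng.standard_normal(n)   # "observed" reflectance

f_hat, *_ = np.linalg.lstsq(X, R, rcond=None)     # fitted coefficients

# Normalize all observations to one reference geometry (assumed kernel values):
K_vol_ref, K_geo_ref = -0.05, -0.7
R_norm = f_hat[0] + f_hat[1] * K_vol_ref + f_hat[2] * K_geo_ref
```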

  12. Localification of variable-basis topological systems | Solovyov ...

    African Journals Online (AJOL)

    The paper provides another approach to the notion of variable-basis topological system generalizing the fixed-basis concept of S. Vickers, considers functorial relationships between the categories of modified variable-basis topological systems and variable-basis fuzzy topological spaces in the sense of S.E. Rodabaugh ...

  13. Analysis of the Contribution of Wind Drift Factor to Oil Slick Movement under Strong Tidal Condition: Hebei Spirit Oil Spill Case

    OpenAIRE

    Kim, Tae-Ho; Yang, Chan-Su; Oh, Jeong-Hwan; Ouchi, Kazuo

    2014-01-01

    The purpose of this study is to investigate the effects of the wind drift factor under strong tidal conditions in the western coastal area of Korea on the movement of oil slicks caused by the Hebei Spirit oil spill accident in 2007. The movement of oil slicks was computed using a simple simulation model based on the empirical formula as a function of surface current, wind speed, and the wind drift factor. For the simulation, the Environmental Fluid Dynamics Code (EFDC) model and Automatic Wea...
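    The empirical advection formula referred to in this record is typically of the form v_slick = v_current + k·v_wind, with a drift factor k near 3%. The sketch below uses that common value and illustrative inputs, not figures from the Hebei Spirit study:

```python
# Minimal sketch of empirical oil-slick advection: slick velocity is the
# surface current plus a wind drift factor times the wind vector (both 2-D,
# in m/s). The 3% drift factor and the inputs are illustrative assumptions.

def slick_velocity(current, wind, drift_factor=0.03):
    """2-D slick velocity (m/s) from current and wind vectors."""
    return (current[0] + drift_factor * wind[0],
            current[1] + drift_factor * wind[1])

def advect(pos, current, wind, dt_s, drift_factor=0.03):
    """New position (m) after advecting for dt_s seconds."""
    vx, vy = slick_velocity(current, wind, drift_factor)
    return (pos[0] + vx * dt_s, pos[1] + vy * dt_s)

# Strong tidal current (0.8 m/s east) plus a 10 m/s northerly wind, 1 hour:
p = advect((0.0, 0.0), current=(0.8, 0.0), wind=(0.0, -10.0), dt_s=3600.0)
```

    In a study like the one above, the drift factor itself is the quantity being estimated, by comparing simulated trajectories against observed slick positions under the strong tidal forcing supplied by the hydrodynamic model.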

  14. Empirical evidence for site coefficients in building code provisions

    Science.gov (United States)

    Borcherdt, R.D.

    2002-01-01

    Site-response coefficients, Fa and Fv, used in U.S. building code provisions are based on empirical data for motions up to 0.1 g. For larger motions they are based on theoretical and laboratory results. The Northridge earthquake of 17 January 1994 provided a significant new set of empirical data up to 0.5 g. These data together with recent site characterizations based on shear-wave velocity measurements provide empirical estimates of the site coefficients at base accelerations up to 0.5 g for Site Classes C and D. These empirical estimates of Fa and Fv, as well as their decrease with increasing base acceleration level, are consistent at the 95 percent confidence level with those in present building code provisions, with the exception of estimates for Fa at levels of 0.1 and 0.2 g, which are less than the lower confidence bound by amounts up to 13 percent. The site-coefficient estimates are consistent at the 95 percent confidence level with those of several other investigators for base accelerations greater than 0.3 g. These consistencies and present code procedures indicate that changes in the site coefficients are not warranted. Empirical results for base accelerations greater than 0.2 g confirm the need for both a short- and a mid- or long-period site coefficient to characterize site response for purposes of estimating site-specific design spectra.

  15. Empirical research on international environmental migration: a systematic review.

    Science.gov (United States)

    Obokata, Reiko; Veronis, Luisa; McLeman, Robert

    2014-01-01

    This paper presents the findings of a systematic review of scholarly publications that report empirical findings from studies of environmentally-related international migration. There exists a small, but growing accumulation of empirical studies that consider environmentally-linked migration that spans international borders. These studies provide useful evidence for scholars and policymakers in understanding how environmental factors interact with political, economic and social factors to influence migration behavior and outcomes that are specific to international movements of people, in highlighting promising future research directions, and in raising important considerations for international policymaking. Our review identifies countries of migrant origin and destination that have so far been the subject of empirical research, the environmental factors believed to have influenced these migrations, the interactions of environmental and non-environmental factors as well as the role of context in influencing migration behavior, and the types of methods used by researchers. In reporting our findings, we identify the strengths and challenges associated with the main empirical approaches, highlight significant gaps and future opportunities for empirical work, and contribute to advancing understanding of environmental influences on international migration more generally. Specifically, we propose an exploratory framework to take into account the role of context in shaping environmental migration across borders, including the dynamic and complex interactions between environmental and non-environmental factors at a range of scales.

  16. Calculation of the surface energy of hcp-metals with the empirical electron theory

    International Nuclear Information System (INIS)

    Fu Baoqin; Liu Wei; Li Zhilin

    2009-01-01

    A brief introduction of the surface model based on the empirical electron theory (EET) and the dangling bond analysis method (DBAM) is presented in this paper. The anisotropy of the spatial distribution of covalent bonds of hexagonal close-packed (hcp) metals such as Be, Mg, Sc, Ti, Co, Zn, Y, Zr, Tc, Cd, Hf, and Re has been analyzed. Under the first-order approximation, the calculated surface energy values for low index surfaces of these hcp-metals are in agreement with experimental and other theoretical values. Correlated analysis showed that the anisotropy of surface energy of hcp-metals was related to the ratio of lattice constants (c/a). The calculation method for the research of surface energy provides a good basis for models of surface science phenomena, and the model may be extended to the surface energy estimation of more metals, alloys, ceramics, and so on, since abundant information about the valence electronic structure (VES) is generated from EET.

  17. Identification and Structural Basis of Binding to Host Lung Glycogen by Streptococcal Virulence Factors

    Energy Technology Data Exchange (ETDEWEB)

    Lammerts van Bueren,A.; Higgins, M.; Wang, D.; Burke, R.; Boraston, A.

    2007-01-01

    The ability of pathogenic bacteria to recognize host glycans is often essential to their virulence. Here we report structure-function studies of previously uncharacterized glycogen-binding modules in the surface-anchored pullulanases from Streptococcus pneumoniae (SpuA) and Streptococcus pyogenes (PulA). Multivalent binding to glycogen leads to a strong interaction with alveolar type II cells in mouse lung tissue. X-ray crystal structures of the binding modules reveal a novel fusion of tandem modules into single, bivalent functional domains. In addition to indicating a structural basis for multivalent attachment, the structure of the SpuA modules in complex with carbohydrate provides insight into the molecular basis for glycogen specificity. This report provides the first evidence that intracellular lung glycogen may be a novel target of pathogenic streptococci and thus provides a rationale for the identification of the streptococcal α-glucan-metabolizing machinery as virulence factors.

  18. Empirical Productivity Indices and Indicators

    NARCIS (Netherlands)

    B.M. Balk (Bert)

    2016-01-01

    textabstractThe empirical measurement of productivity change (or difference) by means of indices and indicators starts with the ex post profit/loss accounts of a production unit. Key concepts are profit, leading to indicators, and profitability, leading to indices. The main task for the productivity

  19. Empirical analysis of consumer behavior

    NARCIS (Netherlands)

    Huang, Yufeng

    2015-01-01

    This thesis consists of three essays in quantitative marketing, focusing on structural empirical analysis of consumer behavior. In the first essay, he investigates the role of a consumer's skill of product usage, and its imperfect transferability across brands, in her product choice. It shows that

  20. Appropriate methodologies for empirical bioethics: it's all relative.

    Science.gov (United States)

    Ives, Jonathan; Draper, Heather

    2009-05-01

    In this article we distinguish between philosophical bioethics (PB), descriptive policy oriented bioethics (DPOB) and normative policy oriented bioethics (NPOB). We argue that finding an appropriate methodology for combining empirical data and moral theory depends on what the aims of the research endeavour are, and that, for the most part, this combination is only required for NPOB. After briefly discussing the debate around the is/ought problem, and suggesting that both sides of this debate are misunderstanding one another (i.e. one side treats it as a conceptual problem, whilst the other treats it as an empirical claim), we outline and defend a methodological approach to NPOB based on work we have carried out on a project exploring the normative foundations of paternal rights and responsibilities. We suggest that given the prominent role already played by moral intuition in moral theory, one appropriate way to integrate empirical data and philosophical bioethics is to utilize empirically gathered lay intuition as the foundation for ethical reasoning in NPOB. The method we propose involves a modification of a long-established tradition of non-intervention in qualitative data gathering, combined with a form of reflective equilibrium where the demands of theory and data are given equal weight and a pragmatic compromise reached.

  1. Effects of arousal on cognitive control: empirical tests of the conflict-modulated Hebbian-learning hypothesis.

    Science.gov (United States)

    Brown, Stephen B R E; van Steenbergen, Henk; Kedar, Tomer; Nieuwenhuis, Sander

    2014-01-01

    An increasing number of empirical phenomena that were previously interpreted as a result of cognitive control turn out to reflect (in part) simple associative-learning effects. A prime example is the proportion congruency effect, the finding that interference effects (such as the Stroop effect) decrease as the proportion of incongruent stimuli increases. While this was previously regarded as strong evidence for a global conflict monitoring-cognitive control loop, recent evidence has shown that the proportion congruency effect is largely item-specific and hence must be due to associative learning. The goal of our research was to test a recent hypothesis about the mechanism underlying such associative-learning effects, the conflict-modulated Hebbian-learning hypothesis, which proposes that the effect of conflict on associative learning is mediated by phasic arousal responses. In Experiment 1, we examined in detail the relationship between the item-specific proportion congruency effect and an autonomic measure of phasic arousal: task-evoked pupillary responses. In Experiment 2, we used a task-irrelevant phasic arousal manipulation and examined the effect on item-specific learning of incongruent stimulus-response associations. The results provide little evidence for the conflict-modulated Hebbian-learning hypothesis, which requires additional empirical support to remain tenable.

  2. Effects of arousal on cognitive control: Empirical tests of the conflict-modulated Hebbian-learning hypothesis

    Directory of Open Access Journals (Sweden)

    Stephen B.R.E. Brown

    2014-01-01

    Full Text Available An increasing number of empirical phenomena that were previously interpreted as a result of cognitive control turn out to reflect (in part) simple associative-learning effects. A prime example is the proportion congruency effect, the finding that interference effects (such as the Stroop effect) decrease as the proportion of incongruent stimuli increases. While this was previously regarded as strong evidence for a global conflict monitoring-cognitive control loop, recent evidence has shown that the proportion congruency effect is largely item-specific and hence must be due to associative learning. The goal of our research was to test a recent hypothesis about the mechanism underlying such associative-learning effects, the conflict-modulated Hebbian-learning hypothesis, which proposes that the effect of conflict on associative learning is mediated by phasic arousal responses. In Experiment 1, we examined in detail the relationship between the item-specific proportion congruency effect and an autonomic measure of phasic arousal: task-evoked pupillary responses. In Experiment 2, we used a task-irrelevant phasic arousal manipulation and examined the effect on item-specific learning of incongruent stimulus-response associations. The results provide little evidence for the conflict-modulated Hebbian-learning hypothesis, which requires additional empirical support to remain tenable.

  3. The frontiers of empirical science: A Thomist-inspired critique of ...

    African Journals Online (AJOL)

    2016-07-08

    ... of scientism, is, however, self-destructive of scientism because contrary to its ... The theory that only empirical facts have epistemic meaning is supported by the ... (2002:1436). The cyclic model lacks empirical verification.

  4. An Empirical Study Of User Acceptance Of Online Social Networks Marketing

    Directory of Open Access Journals (Sweden)

    Olumayowa Mulero

    2013-07-01

    Full Text Available The explosion of Internet usage has drawn the attention of researchers towards online Social Networks Marketing (SNM). Research has shown that a number of Internet users are distrustful and indecisive when it comes to the use of social networks marketing systems. Therefore, there is a need for researchers to identify some of the factors that determine users’ acceptance of social networks marketing using the Technology Acceptance Model (TAM). This study extended the Technology Acceptance Model theoretical framework to predict consumer acceptance of social networks marketing within the Western Cape Province of South Africa. The research model was tested using data collected from 470 questionnaires and analysed using linear regression. The results showed that user intentions to use SNM are strongly and positively correlated with user acceptance of SNM systems. Empirical results confirmed that perceived credibility and perceived usefulness are the strongest determinants in predicting user intentions to use SNM systems.

  5. Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study

    Science.gov (United States)

    Nikolakopoulou, Adriani; Mavridis, Dimitris; Furukawa, Toshi A; Cipriani, Andrea; Tricco, Andrea C; Straus, Sharon E; Siontis, George C M; Egger, Matthias

    2018-01-01

    Abstract Objective To examine whether the continuous updating of networks of prospectively planned randomised controlled trials (RCTs) (“living” network meta-analysis) provides strong evidence against the null hypothesis in comparative effectiveness of medical interventions earlier than the updating of conventional, pairwise meta-analysis. Design Empirical study of the accumulating evidence about the comparative effectiveness of clinical interventions. Data sources Database of network meta-analyses of RCTs identified through searches of Medline, Embase, and the Cochrane Database of Systematic Reviews until 14 April 2015. Eligibility criteria for study selection Network meta-analyses published after January 2012 that compared at least five treatments and included at least 20 RCTs. Clinical experts were asked to identify in each network the treatment comparison of greatest clinical interest. Comparisons were excluded for which direct and indirect evidence disagreed, based on the side, or node, splitting test. The frequency of and time to strong evidence against the null hypothesis were compared between pairwise and network meta-analyses. Results 49 comparisons of interest from 44 networks were included; most (n=39, 80%) were between active drugs, mainly from the specialties of cardiology, endocrinology, psychiatry, and rheumatology. 29 comparisons were informed by both direct and indirect evidence (59%), 13 by indirect evidence (27%), and 7 by direct evidence (14%). Both network and pairwise meta-analysis provided strong evidence against the null hypothesis for seven comparisons, but for an additional 10 comparisons only network meta-analysis provided strong evidence against the null hypothesis (P=0.002). The median time to strong evidence against the null hypothesis was 19 years with living network meta-analysis and 23 years with living pairwise meta-analysis (hazard ratio 2.78, 95% confidence interval 1.00 to 7.72, P=0.05). Studies directly comparing

  6. A unified model of the strong and electroweak interactions based on the gauge group SU(18)L x SU(18)R

    International Nuclear Information System (INIS)

    Kim Il Kang

    1986-01-01

    On the basis of the semi-simple gauge group G = SU(18)L x SU(18)R, a unified theory of the strong, weak and electromagnetic fields is constructed, and it is shown that the Weinberg angle and the unification energy are in good agreement with the experimental values. (author)

  7. Iterative solutions of nonlinear equations with strongly accretive or strongly pseudocontractive maps

    International Nuclear Information System (INIS)

    Chidume, C.E.

    1994-03-01

    Let E be a real q-uniformly smooth Banach space. Suppose T is a strongly pseudo-contractive map with open domain D(T) in E. Suppose further that T has a fixed point in D(T). Under various continuity assumptions on T, it is proved that both the Mann iteration process and the Ishikawa iteration method converge strongly to the unique fixed point of T. Related results deal with iterative solutions of nonlinear operator equations involving strongly accretive maps. Explicit error estimates are also provided. (author). 38 refs

  8. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes' formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered a specific model of the uncertainty about the Poisson intensity. By the empirical Bayesian method, each individual model is then assigned a posterior probability
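
    The first stage of such an approach can be sketched as a conjugate gamma-Poisson update; the paper's contaminated gamma family would apply this update componentwise. The function name and the example numbers below are illustrative, not taken from the paper:

```python
# Sketch of the first-stage Bayes update for a homogeneous Poisson
# intensity with a Gamma(alpha, beta) prior (shape/rate convention).
# The conjugacy makes the posterior another gamma distribution.

def gamma_poisson_update(alpha, beta, n_events, exposure_time):
    """Return the posterior (alpha', beta') for the Poisson intensity
    lambda after observing n_events over exposure_time."""
    return alpha + n_events, beta + exposure_time

# Example (illustrative numbers): a vague Gamma(0.5, 0.0) prior updated
# with 3 events observed over 10 years of exposure.
a_post, b_post = gamma_poisson_update(0.5, 0.0, 3, 10.0)
post_mean = a_post / b_post  # posterior mean intensity per year
print(a_post, b_post, post_mean)  # 3.5 10.0 0.35
```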

  9. ACTUALIZATION OF THE PERSON OF STUDENTS AS BEARERS OF INNOVATION CULTURE IN HIGHER EDUCATION: EMPIRICAL STUDIES

    Directory of Open Access Journals (Sweden)

    Tatyana B. Zagorulya

    2015-01-01

    Full Text Available The aim of the study is to address the problem of empirical research aimed at identifying the personality traits of students as bearers of innovation culture: innovative susceptibility, assertiveness, autonomy in decision-making, initiative and responsibility. Methods. In accordance with the object and purpose of the empirical research, empirical methods are used: an instructional design model, observation, note-taking and formative experiments, testing, questionnaires, interviews, and qualitative analysis of empirical data; a block of complementary techniques: «Research of features of response to conflict» (by K. Thomas), «Assertiveness», «Leader. Qualities of a Leader», and «Leading representative system». Scientific novelty and results. The scientific novelty consists in the justification of psycho-pedagogical tools for diagnosing the level of development of the innovation culture of university students. It is found that students who successfully realize their potential and are able to organize their own life, educational activities and communication on the basis of consciously chosen goals and values, including assertiveness, are receptive to innovations taking place in society and in the world. The use of innovative technologies in the educational process, especially creative projects, case solving, debates, simulations and role-playing games, creates conditions for the development of the assertive behaviour required for the successful adaptation and integration of students into the educational environment of the university and for becoming competitive in society. It is concluded that students possess considerable potential for the development of an innovation culture, in particular leading representative systems of different kinds, which allow them to develop communication skills and engage in constructive dialogue. Practical significance. Appropriate methods and appropriate tools for diagnosing the level

  10. Improved MODIS aerosol retrieval in urban areas using a land classification approach and empirical orthogonal functions

    Science.gov (United States)

    Levitan, Nathaniel; Gross, Barry

    2016-10-01

    New, high-resolution aerosol products are required in urban areas to improve the spatial coverage of the products, in terms of both resolution and retrieval frequency. These new products will improve our understanding of the spatial variability of aerosols in urban areas and will be useful in the detection of localized aerosol emissions. Urban aerosol retrieval is challenging for existing algorithms because of the high spatial variability of the surface reflectance, indicating the need for improved urban surface reflectance models. This problem can be stated in the language of novelty detection as the problem of selecting aerosol parameters whose effective surface reflectance spectrum is not an outlier in some space. In this paper, empirical orthogonal functions, a reconstruction-based novelty detection technique, are used to perform single-pixel aerosol retrieval using the single angular and temporal sample provided by the MODIS sensor. The empirical orthogonal basis functions are trained for different land classes using the MODIS BRDF MCD43 product. Existing land classification products are used in training and aerosol retrieval. The retrieval is compared against the existing operational MODIS 3 km Dark Target (DT) aerosol product and co-located AERONET data. Based on the comparison, our method allows for a significant increase in retrieval frequency and a moderate decrease in the known biases of MODIS urban aerosol retrievals.
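
    The reconstruction idea can be sketched as follows: train EOFs on reflectance samples, then score a candidate spectrum by its residual after projection onto the leading EOFs. Everything here (the synthetic training data, the number of retained EOFs, the error function) is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

# Reconstruction-based novelty detection with empirical orthogonal
# functions (EOFs): spectra poorly reconstructed by the leading EOFs
# of a training set are treated as outliers.

rng = np.random.default_rng(0)
train = rng.normal(size=(500, 7))   # 500 synthetic spectra, 7 bands
train[:, 1] = 0.8 * train[:, 0]     # built-in inter-band correlation

mean = train.mean(axis=0)
# EOFs = right singular vectors of the centered training matrix
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
k = 3                               # number of EOFs retained
basis = vt[:k]                      # shape (k, n_bands)

def reconstruction_error(spectrum):
    """Norm of the residual after projecting onto the EOF basis."""
    centered = spectrum - mean
    recon = basis.T @ (basis @ centered)
    return np.linalg.norm(centered - recon)

# A spectrum consistent with the training correlations reconstructs
# well; one violating them leaves a much larger residual.
typical = train[0]
novel = typical.copy()
novel[1] = typical[1] + 10.0        # break the band-1/band-0 relation
print(reconstruction_error(typical) < reconstruction_error(novel))  # True
```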

  11. EMPIRICAL RESEARCH AND CONGREGATIONAL ANALYSIS ...

    African Journals Online (AJOL)

    empirical research has made to the process of congregational analysis. 1 Part of this ... contextual congegrational analysis – meeting social and divine desires”) at the IAPT .... methodology of a congregational analysis should be regarded as a process. ... essential to create space for a qualitative and quantitative approach.

  12. US biofuels subsidies and CO2 emissions: An empirical test for a weak and a strong green paradox

    International Nuclear Information System (INIS)

    Grafton, R. Quentin; Kompas, Tom; Long, Ngo Van; To, Hang

    2014-01-01

    Using energy data over the period 1981–2011, we find that US biofuels subsidies and production have provided a perverse incentive for US fossil fuel producers to increase their rate of extraction, generating a weak green paradox. Further, in the short run, if the reduction in CO2 emissions from a one-to-one substitution between biofuels and fossil fuels is less than 26 percent, or less than 57 percent if the long-run effect is taken into account, then US biofuels production is likely to have resulted in a strong green paradox. These results indicate that subsidies for first-generation biofuels, which yield a low level of per unit CO2 emission reduction compared to fossil fuels, might have contributed to additional net CO2 emissions over the study period. - Highlights: • US biofuels subsidies increased fossil fuel extraction from 1981 to 2011. • US biofuels subsidies likely increased carbon emissions from 1981 to 2011. • Governments must consider effects of biofuel subsidies on fossil fuel extraction

  13. Strong intrinsic motivation

    OpenAIRE

    Dessi, Roberta; Rustichini, Aldo

    2015-01-01

    A large literature in psychology, and more recently in economics, has argued that monetary rewards can reduce intrinsic motivation. We investigate whether the negative impact persists when intrinsic motivation is strong, and test this hypothesis experimentally focusing on the motivation to undertake interesting and challenging tasks, informative about individual ability. We find that this type of task can generate strong intrinsic motivation that is impervious to the effect of monetary incen...

  14. Selection Bias in Educational Transition Models: Theory and Empirical Evidence

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads

    variables. This paper, first, explains theoretically how selection on unobserved variables leads to waning coefficients and, second, illustrates empirically how selection leads to biased estimates of the effect of family background on educational transitions. Our empirical analysis using data from...

  15. Single-particle model of a strongly driven, dense, nanoscale quantum ensemble

    Science.gov (United States)

    DiLoreto, C. S.; Rangan, C.

    2018-01-01

    We study the effects of interatomic interactions on the quantum dynamics of a dense, nanoscale, atomic ensemble driven by a strong electromagnetic field. We use a self-consistent, mean-field technique based on the pseudospectral time-domain method and a full, three-directional basis to solve the coupled Maxwell-Liouville equations. We find that interatomic interactions generate a decoherence in the state of an ensemble on a much faster time scale than the excited-state lifetime of individual atoms. We present a single-particle model of the driven, dense ensemble by incorporating interactions into a dephasing rate. This single-particle model reproduces the essential physics of the full simulation and is an efficient way of rapidly estimating the collective dynamics of a dense ensemble.

  16. A risk-informed framework for establishing a beyond design basis safety basis for external hazards

    Energy Technology Data Exchange (ETDEWEB)

    Amico, P. [Hughes Associates, Inc, Baltimore, MD (United States); Anoba, R. [Hughes Associates, Inc, Raleigh, NC (United States); Najafi, B. [Hughes Associates, Inc., Los Gatos, CA (United States)

    2014-07-01

    The events at Fukushima Daiichi taught us that meeting a deterministic design basis requirement for external hazards does not assure that the risk is low. As observed at the plant, the two primary reasons for this are failure cliffs above the design basis event and the fact that combined hazard effects are not considered in design. Because the possible combinations of design basis exceedances and external hazards are very numerous and complex, an approach focusing only on the most important ones is needed. For this reason, a risk-informed approach, discussed in this paper, is the most effective. (author)

  17. A Multicenter Evaluation of Prolonged Empiric Antibiotic Therapy in Adult ICUs in the United States.

    Science.gov (United States)

    Thomas, Zachariah; Bandali, Farooq; Sankaranarayanan, Jayashri; Reardon, Tom; Olsen, Keith M

    2015-12-01

    The purpose of this study is to determine the rate of prolonged empiric antibiotic therapy in adult ICUs in the United States. Our secondary objective is to examine the relationship between the prolonged empiric antibiotic therapy rate and certain ICU characteristics. Multicenter, prospective, observational, 72-hour snapshot study. Sixty-seven ICUs from 32 hospitals in the United States. Nine hundred ninety-eight patients admitted to the ICU between midnight on June 20, 2011, and June 21, 2011, were included in the study. None. Antibiotic orders were categorized as prophylactic, definitive, empiric, or prolonged empiric antibiotic therapy. Prolonged empiric antibiotic therapy was defined as empiric antibiotics that continued for at least 72 hours in the absence of adjudicated infection. Standard definitions from the Centers for Disease Control and Prevention were used to determine infection. Prolonged empiric antibiotic therapy rate was determined as the ratio of the total number of empiric antibiotics continued for at least 72 hours divided by the total number of empiric antibiotics. Univariate analysis of factors associated with the ICU prolonged empiric antibiotic therapy rate was conducted using Student t test. A total of 660 unique antibiotics were prescribed as empiric therapy to 364 patients. Of the empiric antibiotics, 333 of 660 (50%) were continued for at least 72 hours in instances where Centers for Disease Control and Prevention infection criteria were not met. Suspected pneumonia accounted for approximately 60% of empiric antibiotic use. The most frequently prescribed empiric antibiotics were vancomycin and piperacillin/tazobactam. ICUs that utilized invasive techniques for the diagnosis of ventilator-associated pneumonia had lower rates of prolonged empiric antibiotic therapy than those that did not, 45.1% versus 59.5% (p = 0.03). No other institutional factor was significantly associated with prolonged empiric antibiotic therapy rate. Half of all
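
    The headline rate can be reproduced from the abstract's own counts; this sketch is illustrative (the function name is ours, and the counts are the totals reported above):

```python
# Prolonged-empiric-antibiotic-therapy (PET) rate as defined in the
# study: empiric antibiotics continued for at least 72 hours without
# an adjudicated infection, divided by all empiric antibiotics.

def pet_rate(prolonged, total_empiric):
    return prolonged / total_empiric

# Totals reported in the abstract: 333 of 660 empiric antibiotics.
rate = pet_rate(333, 660)
print(round(rate, 2))  # 0.5, i.e. the 50% reported in the abstract
```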

  18. Review of US ESCO industry market trends: an empirical analysis of project data

    International Nuclear Information System (INIS)

    Goldman, Charles A.; Hopper, Nicole C.; Osborn, Julie G.

    2005-01-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ∼1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m2/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers

  19. Review of US ESCO industry market trends: an empirical analysis of project data

    International Nuclear Information System (INIS)

    Goldman, C.A.; Hopper, N.C.; Osborn, J.G.

    2005-01-01

    This comprehensive empirical analysis of US energy service company (ESCO) industry trends and performance employs two parallel analytical approaches: a survey of firms to estimate total industry size, and a database of ∼1500 ESCO projects, from which we report target markets and typical project characteristics, energy savings and customer economics. We estimate that industry investment for energy-efficiency related services reached US$2 billion in 2000 following a decade of strong growth. ESCO activity is concentrated in states with high economic activity and strong policy support. Typical projects save 150-200 MJ/m2/year and are cost-effective with median benefit/cost ratios of 1.6 and 2.1 for institutional and private sector projects. The median simple payback time (SPT) is 7 years among institutional customers; 3 years is typical in the private sector. Reliance on DSM incentives has decreased since 1995. Preliminary evidence suggests that state enabling policies have boosted the industry in medium-sized states. ESCOs have proven resilient in the face of restructuring and will probably shift toward selling 'energy solutions', with energy efficiency part of a package. We conclude that appropriate policy support - both financial and non-financial - can 'jump-start' a viable private-sector energy-efficiency services industry that targets large institutional and commercial/industrial customers. (author)

  20. Personality, risk aversion and speeding: an empirical investigation.

    Science.gov (United States)

    Greaves, Stephen P; Ellison, Adrian B

    2011-09-01

    Evidence suggests that in addition to demographics, there are strong relationships between facets of drivers' personality (e.g., aggression, thrill-seeking, altruism), aversion to risk and driving behaviour, particularly speeding. However, evidence is muted by the reliance on self-reported driving behaviour, which is thought to not accurately reflect actual driving behaviour. This paper reports on a study of 133 drivers in Sydney, who were asked to complete a short survey to develop their personality and risk aversion profiles and self-reported speeding behaviour. A Global Positioning System (GPS) device was then installed in their vehicle for several weeks as part of a major investigation of driving behaviour from which empirical measures of speeding are derived. Among the most pertinent findings are: (1) the tendency for drivers to both under and over-estimate their propensity to speed, (2) significant heterogeneity in speeding with a small, but notable number of drivers exceeding the limit for more than 20 percent of the distance driven, (3) weak relationships between the personality/risk-aversion measures and actual speeding, and (4) the suggestion that different personality traits appear to influence behaviour in different situations both from self-reported and actual speeding behaviour. Copyright © 2011 Elsevier Ltd. All rights reserved.
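
    One of the empirical speeding measures described (the share of distance driven above the limit) can be sketched as follows; the segment representation and the numbers are assumptions for illustration, not the study's actual GPS data format:

```python
# Share of distance driven above the posted speed limit, computed
# from per-segment GPS-derived records (distance in km, observed
# speed and posted limit in km/h).

def share_distance_speeding(segments):
    """segments: iterable of (distance, speed, limit) tuples."""
    total = sum(d for d, _, _ in segments)
    over = sum(d for d, v, lim in segments if v > lim)
    return over / total

# Illustrative trip: 4 km total, of which 1 km driven over the limit.
trip = [(1.0, 55, 60), (0.5, 65, 60), (2.0, 58, 60), (0.5, 72, 60)]
print(share_distance_speeding(trip))  # 0.25
```

    A driver like the small but notable group described above would show a value over 0.20 on this measure.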

  1. Amino acid empirical contact energy definitions for fold recognition in the space of contact maps

    Directory of Open Access Journals (Sweden)

    Fogolari Federico

    2003-02-01

    Full Text Available Abstract Background Contradictory evidence has been presented in the literature concerning the effectiveness of empirical contact energies for fold recognition. Empirical contact energies are calculated on the basis of information available from selected protein structures, with respect to a defined reference state, according to the quasi-chemical approximation. Protein-solvent interactions are estimated from residue solvent accessibility. Results In the approach presented here, contact energies are derived from the potential of mean force theory, several definitions of contact are examined and their performance in fold recognition is evaluated on sets of decoy structures. The best definition of contact is tested, in a more realistic scenario, on all predictions including sidechains accepted in the CASP4 experiment. In 30 out of 35 cases the native structure is correctly recognized and the best predictions are usually found among the 10 lowest energy predictions. Conclusion The definition of contact based on van der Waals radii of alpha carbon and side chain heavy atoms is seen to perform better than other definitions involving only alpha carbons, only beta carbons, all heavy atoms or only backbone atoms. An important prerequisite for the applicability of the approach is that the protein structure under study should not exhibit anomalous solvent accessibility, compared to soluble proteins whose structure is deposited in the Protein Data Bank. The combined evaluation of a solvent accessibility parameter and contact energy allows for an effective gross screening of predictive models.

  2. Dynamical basis set

    International Nuclear Information System (INIS)

    Blanco, M.; Heller, E.J.

    1985-01-01

    A new Cartesian basis set is defined that is suitable for the representation of molecular vibration-rotation bound states. The Cartesian basis functions are superpositions of semiclassical states generated through the use of classical trajectories that conform to the intrinsic dynamics of the molecule. Although semiclassical input is employed, the method becomes ab initio through the standard matrix diagonalization variational method. Special attention is given to classical-quantum correspondences for angular momentum. In particular, it is shown that the use of semiclassical information preferentially leads to angular momentum eigenstates with magnetic quantum number |M| equal to the total angular momentum J. The present method offers a reliable technique for representing highly excited vibrational-rotational states where perturbation techniques are no longer applicable

  3. Testing strong interaction theories

    International Nuclear Information System (INIS)

    Ellis, J.

    1979-01-01

    The author discusses possible tests of the current theories of the strong interaction, in particular, quantum chromodynamics. High energy e+e− interactions should provide an excellent means of studying the strong force. (W.D.L.)

  4. Empirical laws, regularity and necessity

    NARCIS (Netherlands)

    Koningsveld, H.

    1973-01-01

    In this book I have tried to develop an analysis of the concept of an empirical law, an analysis that differs in many ways from the alternative analyses found in contemporary literature dealing with the subject.

    I am referring especially to two well-known views, viz. the regularity and

  5. Psychological Models of Art Reception must be Empirically Grounded

    DEFF Research Database (Denmark)

    Nadal, Marcos; Vartanian, Oshin; Skov, Martin

    2017-01-01

    We commend Menninghaus et al. for tackling the role of negative emotions in art reception. However, their model suffers from shortcomings that reduce its applicability to empirical studies of the arts: poor use of evidence, lack of integration with other models, and limited derivation of testable hypotheses. We argue that theories about art experiences should be based on empirical evidence.

  6. Demonstration and information center on the basis of the research reactor IR-50

    International Nuclear Information System (INIS)

    Krupenina, F.

    2001-01-01

    Many problems exist in the nuclear field, but the most significant one is the public's mistrust of nuclear energy. A sharp decline in radiological culture affects public perception, the main paradox being the situation after Chernobyl. The task of creating a Demonstration and Information Center (Minatom RF) on the basis of the research reactor IR-50 is being carried out by the Research and Development Institute of Power Engineering (ENTEK). The IR-50 is situated on the grounds of the institute. It will be unique for a functional reactor to be situated in the center of the city (about 5 km from the Kremlin). (author)

  7. Maternal environment affects the genetic basis of seed dormancy in Arabidopsis thaliana.

    Science.gov (United States)

    Postma, Froukje M; Ågren, Jon

    2015-02-01

    The genetic basis of seed dormancy, a key life history trait important for adaptive evolution in plant populations, has so far been studied only using seeds produced under controlled conditions in greenhouse environments. However, dormancy is strongly affected by maternal environmental conditions, and interactions between seed genotype and maternal environment have been reported. Consequently, the genetic basis of dormancy of seeds produced under natural field conditions remains unclear. We examined the effect of maternal environment on the genetic architecture of seed dormancy using a recombinant inbred line (RIL) population derived from a cross between two locally adapted populations of Arabidopsis thaliana from Italy and Sweden. We mapped quantitative trait loci (QTL) for dormancy of seeds produced in the greenhouse and at the native field sites of the parental genotypes. The Italian genotype produced seeds with stronger dormancy at fruit maturation than did the Swedish genotype in all three environments, and the maternal field environments induced higher dormancy levels compared to the greenhouse environment in both genotypes. Across the three maternal environments, a total of nine dormancy QTL were detected, three of which were only detected among seeds matured in the field, and six of which showed significant QTL × maternal environment interactions. One QTL had a large effect on dormancy across all three environments and colocalized with the candidate gene DOG1. Our results demonstrate the importance of studying the genetic basis of putatively adaptive traits under relevant conditions. © 2015 John Wiley & Sons Ltd.

  8. Computing as Empirical Science – Evolution of a Concept

    Directory of Open Access Journals (Sweden)

    Polak Paweł

    2016-12-01

    Full Text Available This article presents the evolution of philosophical and methodological considerations concerning empiricism in computer/computing science. In this study, we trace the most important events in the history of reflection on computing. The forerunners of Artificial Intelligence, H.A. Simon and A. Newell, initiated these considerations in their paper Computer Science As Empirical Inquiry (1975). The concept of empirical computer science was later developed by S.S. Shapiro, P. Wegner, A.H. Eden and P.J. Denning, who showed various empirical aspects of computing. This led to a view of the science of computing (or the science of information processing) as a science of general scope. Some interesting contemporary paths towards a generalized perspective on computation were also shown (e.g. natural computing).

  9. On the Empirical Evidence of Mutual Fund Strategic Risk Taking

    NARCIS (Netherlands)

    Goriaev, A.P.; Nijman, T.E.; Werker, B.J.M.

    2001-01-01

    We reexamine empirical evidence on strategic risk-taking behavior by mutual fund managers.Several studies suggest that fund performance in the first semester of a year influences risk-taking in the second semester.However, we show that previous empirical studies implicitly assume that idiosyncratic

  10. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.

  11. Understanding the cognitive basis for human-wildlife relationships as a key to successful protected-area management

    DEFF Research Database (Denmark)

    Teel, Tara L.; Manfredo, Michael J.; Jensen, Frank Søndergaard

    2010-01-01

    or value orientations form the basis for more specific cognitions that in turn drive individual action. We extend this cognitive hierarchy framework to account for the role of societal forces that give rise to cultural values and their orientations over time. Using empirical data from two cases, we survey this micro-macro approach and explore its implications for protected-area management. First, data from a nineteen-state study conducted in 2004 via mail survey in the United States show how two contrasting orientations - domination and mutualism - produce different attitudes and behaviors toward wildlife. Hierarchical linear modeling of these data supports a societal-level shift from domination to mutualism in response to modernization. Second, a 2007-8 exploratory application of the approach in ten European countries provides further evidence of the role of value orientations in shaping individual response

  12. 'Nobody tosses a dwarf!' The relation between the empirical and the normative reexamined.

    Science.gov (United States)

    Leget, Carlo; Borry, Pascal; de Vries, Raymond

    2009-05-01

    This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as a point of departure because it challenges the reader to look with fresh eyes upon several central bioethical themes, including human dignity, autonomy, and the protection of vulnerable people. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theoretical ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. Against efforts fully to integrate the normative and the empirical into one synthesis, we propose that the two should stand in tension and relation to one another. The approach we endorse acknowledges that a social practice can and should be judged both by the gathering of empirical data and by normative ethics. Critical applied ethics uses a five stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical point of view. We conclude by applying our five-stage critical applied ethics to the example of dwarf tossing.

  13. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Full Text Available Empirical likelihood is a very popular method and has been widely used in the fields of artificial intelligence (AI) and data mining as tablets, mobile applications and social media dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously when the model is defined by moment restrictions, some of which may be misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and at the same time its estimator is as efficient as the empirical likelihood estimator obtained with all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to construct confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. A real-life example is also given to illustrate the new methodology.

  14. Gazprom the new russian empire

    International Nuclear Information System (INIS)

    Cosnard, D.

    2004-01-01

    The author analyzes the economic and political impact of the great Gazprom group, leader of the Russian energy sector. Already number one in the world gas industry, the group is becoming the right hand of the Kremlin. The author therefore examines the transparency and limits of this empire. (A.L.B.)

  15. Collective Labour Supply, Taxes, and Intrahousehold Allocation: An Empirical Approach

    NARCIS (Netherlands)

    Bloemen, H.G.

    2017-01-01

    Most empirical studies of the impact of labour income taxation on the labour supply behaviour of households use a unitary modelling approach. In this paper we empirically analyze income taxation and the choice of working hours by combining the collective approach for household behaviour and the

  16. Bias-dependent hybrid PKI empirical-neural model of microwave FETs

    Science.gov (United States)

    Marinković, Zlatica; Pronić-Rančić, Olivera; Marković, Vera

    2011-10-01

    Empirical models of microwave transistors based on an equivalent circuit are valid for only one bias point. Bias-dependent analysis requires repeated extractions of the model parameters for each bias point. In order to make model bias-dependent, a new hybrid empirical-neural model of microwave field-effect transistors is proposed in this article. The model is a combination of an equivalent circuit model including noise developed for one bias point and two prior knowledge input artificial neural networks (PKI ANNs) aimed at introducing bias dependency of scattering (S) and noise parameters, respectively. The prior knowledge of the proposed ANNs involves the values of the S- and noise parameters obtained by the empirical model. The proposed hybrid model is valid in the whole range of bias conditions. Moreover, the proposed model provides better accuracy than the empirical model, which is illustrated by an appropriate modelling example of a pseudomorphic high-electron mobility transistor device.

  17. Design basis 2

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, G.; Soerensen, P. [Risoe National Lab., Roskilde (Denmark)

    1996-09-01

    Design Basis Program 2 (DBP2) is a comprehensive, fully coupled code capable of operating in the time domain as well as in the frequency domain. The code was developed during the period 1991-93 and succeeds Design Basis 1, a one-blade model that assumes a stiff tower, transmission system and hub. The package is designed for use on a personal computer and offers a user-friendly environment based on menu-driven editing and control facilities, with graphics used extensively for data presentation. Moreover, input data as well as results are written to files in ASCII format. The input data are organized in a database with a structure that easily allows for arbitrary combinations of defined structural components and load cases. (au)

  18. NMR and IR Investigations of Strong Intramolecular Hydrogen Bonds

    Directory of Open Access Journals (Sweden)

    Poul Erik Hansen

    2017-03-01

    Full Text Available For the purpose of this review, strong hydrogen bonds have been defined on the basis of experimental data, such as OH stretching wavenumbers, νOH, and OH chemical shifts, δOH (in the latter case, after correction for ring current effects. Limits for O–H···Y systems are taken as 2800 > νOH > 1800 cm−1, and 19 ppm > δOH > 15 ppm. Recent results as well as an account of theoretical advances are presented for a series of important classes of compounds such as β-diketone enols, β-thioxoketone enols, Mannich bases, proton sponges, quinoline N-oxides and diacid anions. The O···O distance has long been used as a parameter for hydrogen bond strength in O–H···O systems. On a broad scale, a correlation between OH stretching wavenumbers and O···O distances is observed, as demonstrated experimentally as well as theoretically, but for substituted β-diketone enols this correlation is relatively weak.
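
The spectroscopic windows quoted above can be expressed as a small check. The function below is purely illustrative and simply encodes the limits stated in the abstract:

```python
def is_strong_oh_bond(nu_oh_cm1: float, delta_oh_ppm: float) -> bool:
    """Return True if both observables fall inside the abstract's windows
    for strong O-H...Y hydrogen bonds: 2800 > nu(OH) > 1800 cm^-1 and
    19 ppm > delta(OH) > 15 ppm (delta corrected for ring-current effects)."""
    return 1800 < nu_oh_cm1 < 2800 and 15 < delta_oh_ppm < 19

print(is_strong_oh_bond(2300, 17))   # inside both windows -> True
print(is_strong_oh_bond(3200, 12))   # an ordinary, weaker O-H bond -> False
```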

  19. Institutionalization of conflict capability in the management of natural resources : theoretical perspectives and empirical experience in Indonesia

    OpenAIRE

    Yasmi, Y.

    2007-01-01

    Keywords: natural resource conflict, conflict capability, impairment, escalation. This study concerns natural resource management (NRM) conflict, particularly conflict in the forestry sector, and how such conflict can be addressed effectively. It consists of two major parts. The first deals with a theoretical review of the conflict literature. It shows how conflict can be conceptualized distinctively and how such distinctive conceptualization can be used as a strong basis for understanding and addressing...

  20. Empirical and theoretical challenges in aboveground-belowground ecology

    DEFF Research Database (Denmark)

    W.H. van der Putten,; R.D. Bardgett; P.C. de Ruiter

    2009-01-01

    of the current conceptual succession models into more predictive models can help targeting empirical studies and generalising their results. Then, we discuss how understanding succession may help to enhance managing arable crops, grasslands and invasive plants, as well as provide insights into the effects...... and environmental settings, we explore where and how they can be supported by theoretical approaches to develop testable predictions and to generalise empirical results. We review four key areas where a combined aboveground-belowground approach offers perspectives for enhancing ecological understanding, namely...

  1. Unveiling the checkered fortunes of the Ottoman Empire

    OpenAIRE

    Dimitrova-Grajzl, Valentina

    2013-01-01

    The Ottoman Empire has been predominantly viewed as the "Sick Man of Europe." The question arises, however, how this perceived inefficiency can be reconciled with the long existence and prosperity of the Empire. I argue that the Ottoman system could have been efficient subject to constraints. More specifically, I explore the role of the technology of predation and the adherence to the law in determining relative changes in the social order and the power of the Sultan, which in turn led to the...

  2. Teaching "Empire of the Sun."

    Science.gov (United States)

    Riet, Fred H. van

    1990-01-01

    A Dutch teacher presents reading, film viewing, and writing activities for "Empire of the Sun," J. G. Ballard's autobiographical account of life as a boy in Shanghai and in a Japanese internment camp during World War II (the subject of Steven Spielberg's film of the same name). Includes objectives, procedures, and several literature,…

  3. Empirical Specification of Utility Functions.

    Science.gov (United States)

    Mellenbergh, Gideon J.

    Decision theory can be applied to four types of decision situations in education and psychology: (1) selection; (2) placement; (3) classification; and (4) mastery. For the application of the theory, a utility function must be specified. Usually the utility function is chosen on a priori grounds. In this paper methods for the empirical assessment…

  4. Basis set construction for molecular electronic structure theory: natural orbital and Gauss-Slater basis for smooth pseudopotentials.

    Science.gov (United States)

    Petruzielo, F R; Toulouse, Julien; Umrigar, C J

    2011-02-14

    A simple yet general method for constructing basis sets for molecular electronic structure calculations is presented. These basis sets consist of atomic natural orbitals from a multiconfigurational self-consistent field calculation supplemented with primitive functions, chosen such that the asymptotics are appropriate for the potential of the system. Primitives are optimized for the homonuclear diatomic molecule to produce a balanced basis set. Two general features that facilitate this basis construction are demonstrated. First, weak coupling exists between the optimal exponents of primitives with different angular momenta. Second, the optimal primitive exponents for a chosen system depend weakly on the particular level of theory employed for optimization. The explicit case considered here is a basis set appropriate for the Burkatzki-Filippi-Dolg pseudopotentials. Since these pseudopotentials are finite at nuclei and have a Coulomb tail, the recently proposed Gauss-Slater functions are the appropriate primitives. Double- and triple-zeta bases are developed for elements hydrogen through argon. These new bases offer significant gains over the corresponding Burkatzki-Filippi-Dolg bases at various levels of theory. Using a Gaussian expansion of the basis functions, these bases can be employed in any electronic structure method. Quantum Monte Carlo provides an added benefit: expansions are unnecessary since the integrals are evaluated numerically.

  5. Pluvials, Droughts, the Mongol Empire, and Modern Mongolia

    Science.gov (United States)

    Hessl, A. E.; Pederson, N.; Baatarbileg, N.; Anchukaitis, K. J.

    2013-12-01

    Understanding the connections between climate, ecosystems, and society during historical and modern climatic transitions requires annual resolution records with high fidelity climate signals. Many studies link the demise of complex societies with deteriorating climate conditions, but few have investigated the connection between climate, surplus energy, and the rise of empires. Inner Asia in the 13th century underwent a major political transformation requiring enormous energetic inputs that altered human history. The Mongol Empire, centered on the city of Karakorum, became the largest contiguous land empire in world history (Fig. 1 inset). Powered by domesticated grazing animals, the empire grew at the expense of sedentary agriculturalists across Asia, the Middle East, and Eastern Europe. Although some scholars and conventional wisdom agree that dry conditions spurred the Mongol conquests, little paleoenvironmental data at annual resolution are available to evaluate the role of climate in the development of the Mongol Empire. Here we present a 2600 year tree-ring reconstruction of warm-season, self-calibrating Palmer Drought Severity Index (scPDSI), a measure of water balance, derived from 107 live and dead Siberian pine (Pinus sibirica) trees growing on a Holocene lava flow in central Mongolia. Trees growing on the Khorgo lava flow today are stunted and widely spaced, occurring on microsites with little to no soil development. These trees are extremely water-stressed and their radial growth is well-correlated with both drought (scPDSI) and grassland productivity (Normalized Difference Vegetation Index (NDVI)). Our reconstruction, calibrated and validated on instrumental June-September scPDSI (1959-2009) accounts for 55.8% of the variability in the regional scPDSI when 73% of the annual rainfall occurs. Our scPDSI reconstruction places historic and modern social change in Mongolia in the context of the range of climatic variability during the Common Era. 
Our record

  6. Porphyry of Russian Empires in Paris

    Science.gov (United States)

    Bulakh, Andrey

    2014-05-01

    Porphyry of Russian Empires in Paris. A. G. Bulakh (St Petersburg State University, Russia). The so-called "Schokhan porphyry" from Lake Onega, Russia, surely belongs to the stones of World cultural heritage. One can see this "porphyry" in the facades of the lovely palace of Pavel I and in the pedestal of the monument to Nicholas I in St Petersburg. There are many other cases of use of this stone in Russia. In Paris, the sarcophagus of Napoleon I Bonaparte is constructed of blocks of this stone. In reality, it is a Proterozoic quartzite. The geological situation and the petrographic and mineralogical characteristics will be reported as well. A comparison with antique porphyry from the Egyptian province of the Roman Empire is given. References: 1) A.G.Bulakh, N.B.Abakumova, J.V.Romanovsky. St Petersburg: a History in Stone. 2010. Print House of St Petersburg State University. 173 p.

  7. Sequential nonadiabatic excitation of large molecules and ions driven by strong laser fields

    International Nuclear Information System (INIS)

    Markevitch, Alexei N.; Levis, Robert J.; Romanov, Dmitri A.; Smith, Stanley M.; Schlegel, H. Bernhard; Ivanov, Misha Yu.

    2004-01-01

    Electronic processes leading to dissociative ionization of polyatomic molecules in strong laser fields are investigated experimentally, theoretically, and numerically. Using time-of-flight ion mass spectroscopy, we study the dependence of fragmentation on laser intensity for a series of related molecules and report regular trends in this dependence on the size, symmetry, and electronic structure of a molecule. Based on these data, we develop a model of dissociative ionization of polyatomic molecules in intense laser fields. The model is built on three elements: (i) nonadiabatic population transfer from the ground electronic state to the excited-state manifold via a doorway (charge-transfer) transition; (ii) exponential enhancement of this transition by collective dynamic polarization of all electrons, and (iii) sequential energy deposition in both neutral molecules and resulting molecular ions. The sequential nonadiabatic excitation is accelerated by a counterintuitive increase of a large molecule's polarizability following its ionization. The generic theory of sequential nonadiabatic excitation forms a basis for quantitative description of various nonlinear processes in polyatomic molecules and ions in strong laser fields

  8. A strongly quasiconvex PAC-Bayesian bound

    DEFF Research Database (Denmark)

    Thiemann, Niklas; Igel, Christian; Wintenberger, Olivier

    2017-01-01

    We propose a new PAC-Bayesian bound and a way of constructing a hypothesis space, so that the bound is convex in the posterior distribution and also convex in a trade-off parameter between empirical performance of the posterior distribution and its complexity. The complexity is measured by the Ku...

  9. Operation and profits of energy boards. A study of the basis of municipal business activities and the equitableness of the profits of municipal energy boards

    International Nuclear Information System (INIS)

    Karhu, V.; Nissinen, T.; Valkama, P.

    1999-01-01

    The objective of the empirical part of the study (Chapter 6) is to evaluate the equitableness of profits on capital invested of the 16 municipal energy boards selected for this study and, at the same time, to create a general evaluation basis for equity decisions made by the authorities case by case. In this part of the study, answers are sought for the following questions: (1) how has the economic situation of the energy boards studied been recently developing based on various economic parameters? (2) have there been differences in the returns and profitability of energy boards operating as public utilities or energy boards operating in company form? (3) what kind of a price level the energy boards studied have maintained in relation to the national averages of this field? (4) is a city in a weaker economic position more tempted to require higher profits on capital invested than a city with a sound economic basis? (5) how high profits on capital invested can be considered reasonable for the whole energy board and particularly for a network business holding a monopoly? The structure of the study is as follows. Chapter 2 contains a brief description of the energy boards selected for this study and of the economic situation of the cities owning them. The theoretical part of the study is included in Chapter 3 'Municipal Self-Government and Business'. It analyses rather deeply the terminology of the municipal business, norm basis, steering of actions, restructuring of companies into business profit centres and privatisation, as well as application of the Act on Restrictions on Competition from the standpoint of a municipal self-government. Chapter 4 deals with the establishment of energy board activity, the legal basis and the criteria for pricing electricity, network services and district heat. Chapter 5 examines the Act on Restrictions on Competition as a regulator of the energy board activities. 
After this, there are the presentations of the research results of the

  10. Emotional intelligence as a basis for self-esteem in young adults.

    Science.gov (United States)

    Cheung, Chau-Kiu; Cheung, Hoi Yan; Hue, Ming-Tak

    2015-01-01

    As self-esteem is likely to build on favorable social experiences, such as those derived from achievement (i.e., GPA) and social competence, emotional intelligence is likely to be pivotal in fostering social experiences conducive to self-esteem. Accordingly, emotional intelligence is likely to underlie social competence and mediate the contribution of achievement to self-esteem. This uncharted role is the focus of this study, which surveyed 405 undergraduates in Hong Kong, China. Results demonstrated the pivotal role of emotional intelligence. Essentially, emotional intelligence appeared to be a strong determinant of self-esteem and explain away the positive effect of social competence on self-esteem. The results imply the value of raising emotional intelligence in order to consolidate the basis for the young adult's self-esteem.

  11. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Vincent A. R. Camara

    1999-01-01

    Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to changing of the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results.
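
As a toy numerical illustration of how the choice of loss function moves a Bayes-type reliability estimate (the specific Higgins–Tsokos and Harris losses are not reproduced here; the logarithmic-type loss below is an assumed stand-in, and the Beta posterior is assumed purely for illustration), one can minimize the expected posterior loss over a grid:

```python
import numpy as np

# Posterior for a reliability R approximated by Monte Carlo samples
# (a Beta(2, 2) posterior is assumed purely for illustration).
rng = np.random.default_rng(1)
post = rng.beta(2, 2, size=20000)

grid = np.linspace(0.01, 0.99, 491)

# Squared-error loss: the Bayes estimate is the posterior mean.
r_sq = grid[np.argmin([np.mean((post - r) ** 2) for r in grid])]

# A logarithmic-type loss (assumed form): L(R, r) = (log R - log r)^2.
# Its minimizer is exp(E[log R]), which differs from the posterior mean.
r_log = grid[np.argmin([np.mean((np.log(post) - np.log(r)) ** 2) for r in grid])]

print(r_sq, r_log)   # the two losses yield visibly different estimates
```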

  12. Resources in academic discourse: An empirical investigation of management journals

    Directory of Open Access Journals (Sweden)

    Marko Seppänen

    2010-06-01

    Full Text Available Commonly shared conceptualizations of resources are scant in academic management research, which is somewhat peculiar, since resources and their allocation have long been recognised as being at the heart of the competitive advantage and performance of a firm. The research literature treating resources as a basis for competitive advantage has also faced contemporary criticism for the vagueness of the fundamental definition of the resource concept. This paper therefore empirically studies the representation of the resource concept in the academic management research literature. The paper reports results on the state of conceptualisations of organisations' resources found in two distinct sources of research literature, namely ScienceDirect's database and ISI's top management journals, resulting in two data sets totalling 457 articles. The findings illustrate a two-dimensional conceptual farrago in the conceptualisations: in the definitions of the resource concept itself, and in the internal structure and level of analysis at which the concept is considered. In addition, the paper sheds light on the temporal evolution of the discourse explicitly considering resources. Finally, the paper considers several remedies for these deficiencies, both to aid future theory development in management studies and to help increase the practical impact of the research in assisting managerial decision-making.

  13. Empirical Scientific Research and Legal Studies Research--A Missing Link

    Science.gov (United States)

    Landry, Robert J., III

    2016-01-01

    This article begins with an overview of what is meant by empirical scientific research in the context of legal studies. With that backdrop, the argument is presented that without engaging in normative, theoretical, and doctrinal research in tandem with empirical scientific research, the role of legal studies scholarship in making meaningful…

  14. Recent developments in matter of strong motions data bank creation held by ENEA (Rome), Imperial College (London) and CEA/IPSN (Paris)

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, G.; Bommer, J.

    1988-03-01

    A pooling of strong-motion data held by ENEA (Rome), Imperial College (London) and CEA/IPSN (Paris) will, in the future, give rise to a unified set of data, accessible from any one of the three centers, composed of a data bank of uncorrected accelerograms associated with an accessory database containing as ample information as possible concerning the earthquake itself and the recording conditions. All three centers are equipped with VAX computers, and a DECNET link is currently under consideration. The data thus structured are destined to form the basis of a European strong-motion data bank [fr

  15. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
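
A minimal numpy-only sketch of the phenomenon this abstract describes, deliberately simplified: instead of SCAD on SNP data, it uses the (non-oracle) lasso in an orthogonal normal-means setting, where the lasso estimate reduces to soft-thresholding and the tuning parameter is chosen by m-fold cross-validation over replicates. Re-running CV on the same data with new random fold splits shows how the number of selected variables can vary:

```python
import numpy as np

def soft_threshold(x, lam):
    # Lasso estimate in the orthogonal (normal-means) model
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def cv_choose_lambda(Y, lambdas, m_folds, rng):
    # Y has shape (n_coords, n_reps); CV folds split the replicates.
    # Threshold the training mean, score squared error on the held-out mean.
    idx = rng.permutation(Y.shape[1])
    errs = np.zeros(len(lambdas))
    for fold in np.array_split(idx, m_folds):
        train = np.setdiff1d(idx, fold)
        mu_train = Y[:, train].mean(axis=1)
        mu_test = Y[:, fold].mean(axis=1)
        for j, lam in enumerate(lambdas):
            errs[j] += np.mean((soft_threshold(mu_train, lam) - mu_test) ** 2)
    return lambdas[int(np.argmin(errs))]

rng = np.random.default_rng(0)
n, reps, k, signal = 200, 20, 10, 0.5          # sparse truth, small signals
mu = np.zeros(n)
mu[:k] = signal
Y = mu[:, None] + rng.standard_normal((n, reps))

lambdas = np.linspace(0.0, 1.0, 41)
counts = []
for _ in range(20):                            # same data, new fold splits
    lam = cv_choose_lambda(Y, lambdas, m_folds=10, rng=rng)
    counts.append(int(np.sum(soft_threshold(Y.mean(axis=1), lam) != 0)))

print(min(counts), max(counts))                # spread across CV re-runs
```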

  16. Empirical P-L-C relations for delta Scuti stars

    International Nuclear Information System (INIS)

    Gupta, S.K.

    1978-01-01

    Separate P-L-C relations have been empirically derived by sampling the delta Scuti stars according to their pulsation modes. The results based on these relations have been compared with those estimated from the model based P-L-C relations and the other existing empirical P-L-C relations. It is found that a separate P-L-C relation for each pulsation mode provides a better correspondence with observations. (Auth.)

  17. Economic reasons behind the decline of the Ottoman empire

    OpenAIRE

    Duranoglu, Erkut; Okutucu, Guzide

    2009-01-01

    This study addresses the economic reasons for the decline and fall of the Ottoman Empire. In contrast to previous research, by considering both global and domestic developments, the paper examines the decline of the empire from an economic perspective. Although international developments such as industrialization in European countries, pressure on the Ottomans in terms of integrating with the world economy, global economic factors like depressions and war...

  18. Empirical distribution function under heteroscedasticity

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2011-01-01

    Roč. 45, č. 5 (2011), s. 497-508 ISSN 0233-1888 Grant - others:GA UK(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10750506 Keywords: Robustness * Convergence * Empirical distribution * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.724, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-0365534.pdf

  19. Contributions to the European workshop on investigation of strong motion processing procedures

    International Nuclear Information System (INIS)

    Mohammadioun, B.; Goula, X.; Hamaide, D.

    1985-11-01

    The first paper is a contribution to a joint study program on the numerical processing of accelerograms from strong earthquakes. A method is proposed for generating an analytic signal having characteristics similar to those of an actual ground displacement. From this signal, a simulated accelerogram is obtained analytically. Various numerical processing techniques are to be tested using this signal: the ground displacements they yield will be compared with the original analytic signal. The second contribution deals with a high-performance digitization complex, custom-designed to stringent technical criteria by CISI Petrole Services, which has recently been put into service at the Bureau d'Evaluation des Risques Sismiques pour la Surete des Installations Nucleaires. Specially tailored to cope with the problems raised by the sampling of strong-motion photographic recordings, it offers considerable flexibility, owing to its self-teaching design, constant monitoring of the ongoing work, and numerous preprocessing options. In the third contribution, a critical examination of several processing techniques applicable to photographic recordings from SMA-1 type accelerometers is conducted. The basis for comparison was a set of two accelerograms derived from synthetic signals whose characteristics were already well known

  20. Abortion: Strong's counterexamples fail

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2009-01-01

    This paper shows that the counterexamples proposed by Strong in 2008 in the Journal of Medical Ethics to Marquis's argument against abortion fail. Strong's basic idea is that there are cases--for example, terminally ill patients--where killing an adult human being is prima facie seriously morally...